- cross-posted to:
- web_design@lemmy.ml
I would like to know more about what was causing the slowness than those couple of bullet points explain. I've only made a few websites over the years, and they always served semantic HTML and CSS; the few scripts there were added interactivity and weren't necessary to view the page. So is this a matter of people turning mostly-static websites into React monstrosities, or is it something else?
Besides respecting sane HTML practices, the point about trying to get info that is displayed on a map is interesting. There is no reason I can think of that that kind of public information shouldn't be freely available as an API, so that anyone can build a frontend for it - including a plain HTML <ul> frontend. That kind of thing should really be mandatory for public services, because it maximises the utility of said service. Ideally it would also be mandatory to include some more-or-less raw display of the API data so that no third party needs to build a frontend, but start small.
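A plain-HTML frontend over a public API really can be that small. The sketch below assumes a hypothetical endpoint (https://example.org/api/locations) returning a JSON array of { name, address } objects; the endpoint and its shape are made up for illustration, but the pattern - fetch JSON, write out a <ul> - is the whole frontend.

```typescript
// Minimal sketch: render a public API's JSON as a plain <ul>.
// The endpoint and its { name, address } shape are hypothetical.
interface Location {
  name: string;
  address: string;
}

async function renderLocations(): Promise<void> {
  const response = await fetch("https://example.org/api/locations");
  const locations: Location[] = await response.json();

  const list = document.createElement("ul");
  for (const loc of locations) {
    const item = document.createElement("li");
    item.textContent = `${loc.name} - ${loc.address}`;
    list.appendChild(item);
  }
  document.body.appendChild(list);
}

renderLocations().catch(console.error);
```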
So is this a matter of people turning mostly-static websites into React monstrosities or is it something else?

Yep, they replaced simple HTML with JSON and client-side templating, realised it was inherently slower, so they re-invented server-side generation (now called SSR, server-side rendering, because everything needs a fancy name), and then merged it all together on the client (rehydration).
All this for content that is 99% static and doesn’t need that level of interactivity, even the linked site is doing it for some reason, and they don’t even have comments or something that would explain it. They’re using it purely for navigation where a plain link would suffice.
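For content that is 99% static, "server-side generation" can just mean building the complete HTML before it leaves the server, with navigation as ordinary links and nothing to rehydrate. A rough sketch using Node's built-in http module; the article store and routes are hard-coded purely for illustration.

```typescript
// Minimal sketch of server-side generation: the page is complete HTML when it
// leaves the server, so the browser has nothing to rehydrate.
import * as http from "http";

// Stand-in for whatever actually stores the articles (hypothetical data).
const articles: Record<string, { title: string; body: string }> = {
  "hello-world": { title: "Hello, world", body: "Entirely static content." },
};

function renderArticle(slug: string): string | undefined {
  const article = articles[slug];
  if (!article) return undefined;
  // Navigation is a plain link; no client-side router involved.
  return `<!doctype html>
<html>
  <body>
    <nav><a href="/hello-world">Hello, world</a></nav>
    <h1>${article.title}</h1>
    <p>${article.body}</p>
  </body>
</html>`;
}

http
  .createServer((req, res) => {
    const slug = (req.url ?? "/").replace(/^\//, "") || "hello-world";
    const html = renderArticle(slug);
    if (html) {
      res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
      res.end(html);
    } else {
      res.writeHead(404, { "Content-Type": "text/plain" });
      res.end("Not found");
    }
  })
  .listen(8080);
```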
You can totally just set up an apache/nginx and serve article.txt and browsers will render it (there's a minimal config sketch at the end of this thread).
You can but the point is that people/companies/public services generally don’t.
I suppose that’s true. Accessibility and all that.
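For the "just serve the file" route mentioned above, the whole setup can be a handful of nginx lines. This is a sketch only; the domain and paths are placeholders.

```nginx
# Minimal sketch: nginx serving static files (article.txt, article.html, ...)
# straight from disk. Domain and paths are placeholders.
server {
    listen 80;
    server_name example.org;

    root /var/www/articles;
    index index.html;

    location / {
        # Serve the requested file as-is; 404 if it does not exist.
        try_files $uri $uri/ =404;
    }
}
```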