i absolutely hate how the modern web just fails to load if one has javascript turned off. i, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on. it’s not a hard concept, people.
but you ask candidates to explain “graceful degradation” and they’ll sit and look at you with a blank stare.
I wrote my CV site in React and Next.js configured for SSG (Static Site Generation), which means the whole site loads perfectly without JavaScript, but if you do have JS enabled you’ll get a theme switcher and a print button.
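That’s the classic progressive-enhancement pattern: the extra controls are created by the script itself, so a no-JS visitor never sees a dead button. A minimal sketch of the idea (not my actual code; the `header` selector and `dark` class are just placeholders):

```ts
// Progressive enhancement: these controls only exist once JS runs,
// so visitors with JS disabled never see buttons that do nothing.
function addEnhancements(): void {
  const themeButton = document.createElement("button");
  themeButton.textContent = "Toggle theme";
  themeButton.addEventListener("click", () => {
    document.documentElement.classList.toggle("dark"); // placeholder class
  });

  const printButton = document.createElement("button");
  printButton.textContent = "Print";
  printButton.addEventListener("click", () => window.print());

  document.querySelector("header")?.append(themeButton, printButton);
}

addEnhancements();
```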
That said, requiring JS makes sense on some sites, namely those that act more like web apps and let you do stuff (like WhatsApp or Photopea). Not for articles, blogs, etc., though.
I mean yes, but WhatsApp is a bad example. It could easily use no JavaScript. In the end it’s the same as Lemmy or any other forum: you post a message and get a new page with the message on it; switching chats is loading a new page. Of course JavaScript enhances the experience, makes it more fluid, etc., but messengers could work perfectly fine without JavaScript.
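To make it concrete, here’s a rough sketch of that forum-style flow using Express (the routes and in-memory store are hypothetical, purely to show the shape of it; real code would also escape the message text):

```ts
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false }));

// Hypothetical in-memory store, just to illustrate the flow.
const messages: string[] = [];

// A plain GET renders the whole chat as HTML; works with JS disabled.
app.get("/chat", (_req, res) => {
  res.send(`
    <ul>${messages.map((m) => `<li>${m}</li>`).join("")}</ul>
    <form method="post" action="/chat">
      <input name="text" autofocus>
      <button type="submit">Send</button>
    </form>
  `);
});

// A plain form POST, then a redirect back to the chat page:
// the classic POST/Redirect/GET pattern. No JS involved.
app.post("/chat", (req, res) => {
  messages.push(String(req.body.text ?? ""));
  res.redirect("/chat");
});

app.listen(3000);
```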
Maybe I’m out of the loop because I do mostly backend, but how do you update the chat window when new chats come in, without JavaScript?
You don’t; I’m saying it would still mostly work. Getting messages as they arrive is nice but not necessary. For example, I personally have all notifications off and only see messages when I specifically look for them; no one can reach me instantly. Everyone seems to be missing that we’re talking about degradation here: it degrades, it gets worse with JS disabled. But it shouldn’t straight-up not work.
A good example of something that does not work without JS would have been a drawing application, like they said, or games. There are plenty of things that literally do not work without JS, but messaging is not one of them. Instant messaging would be, of course.
Did you just propose degrading instant messengers back into email? 😂
How exactly do you propose people actually chat with such a system? Continuously hammering F5 while being actively engaged with another person? 😂
I also feel like everyone seems to be missing that we’re talking about degradation, which isn’t usually “no JS at all”; it’s some subset that isn’t supported. People use feature detection to find out if some feature is supported in the browser, and if it’s not, they don’t enable the functionality that depends on it.
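In code that’s roughly this shape (the `enableLiveUpdates` function here is hypothetical, just to show the pattern):

```ts
// Hypothetical enhancement that needs WebSocket support.
function enableLiveUpdates(): void {
  // open a socket, patch the DOM as messages arrive, etc.
}

// Feature detection: only turn the enhancement on if the browser supports it.
if (typeof WebSocket !== "undefined") {
  enableLiveUpdates();
}
// Otherwise do nothing: the plain server-rendered page keeps working as-is.
```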
For the chat example, you could argue that a chat can degrade into a bulletin board, but I’d argue that people use chat for realtime messaging, so JS is needed for the base use case.
If your webpage primarily just displays static information, then I agree that it should work without js or css. Like Wikipedia, or a blog, or news, or a product marketing page, or a forum/BBS.
But there is a huge part of the web that this simply doesn’t apply to, and it’s not realistic to have them put in huge effort to support what can only be a broken experience for a fraction of a percent of users.
How would a page fetch new messages for you without JS?
You don’t. That’s the graceful degradation part. You can still read your chat history and send new messages, but receiving messages as they come in requires a page reload or enabling JS.
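There’s even an old trick for “receiving” with JS fully disabled: serve markup that asks the browser to reload the page on its own. A sketch of what the server could emit (the 30-second interval is arbitrary):

```ts
// Included in the chat page the server sends. With JS disabled, the browser
// itself reloads the page every 30 seconds, so new messages still appear.
// With JS enabled, the <noscript> block is ignored and events take over.
const noJsFallback: string = `
  <noscript>
    <meta http-equiv="refresh" content="30">
  </noscript>
`;
```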
My only issue with this ideology (requiring a page load) is that this setup would essentially require a whole new processing system: instead of updates being sent via events, they would need to be rendered and sent server-side. This also forces the server to load everything at once instead of dynamically like it currently does, which increases strain/load on the server node serving the page. It also removes the potential for service isolation between the parts of the page, meaning that if one component goes down (such as chat history), the entire page handler goes down, while also worsening page response and load times. That’s the downside of those old legacy-style pages: they are a pain in the ass to maintain, run slower, and don’t have much failover ability.
It’s basically asking the provider to spend more in order to make the service slower, remove features from the site (both information- and functionality-wise), and take on a more complex setup when scaling, all to increase compatibility for a small fraction of the machines and users out there.
This is, of course, also ignoring the increased request load, since you are now having to resend entire web pages to get data instead of just the new messages/updates.
The web interface can already be reloaded at any time and has to do all of this. You seem to be missing that we’re talking about degradation here; remember the definition of the word: it means the experience isn’t as good as when JS is enabled. The point is that it should still work somehow.
Just to make sure we’re on the same page then, because I don’t see the issue with my post.
I am using the term “Graceful Degradation” to mean fault tolerance in a tech stack: the ability to have a critical component removed and keep working.
The critical component people are talking about here is JavaScript, which is used for all dynamically loaded content and for failover protection, so that one service going down doesn’t take the entire page down with it (also an example of fault tolerance).
The proposed solution would remove that fault tolerance for the reasons I gave in my original reply, while degrading the user experience through increased page load time (users reloading the page inconsistently vs. consistently to get new information) and increasing maintenance costs and overhead for the provider.
Additionally, the processing system you say already exists generally doesn’t, because websites mostly use a dynamic loading style nowadays, not static pages (as in, the client doesn’t change them), which is what this kind of system would require.
note: edits were for phrasing, and a typo
How would you solve end-to-end encryption without JavaScript?