Conversation:
Notices
-
#JavaScript has a lot of !security implications and it currently can't be understood by most search engines (I don't know of even one that can). So if you want to drive good (and a lot of) content into the search engines, you cannot bypass "static" pages (pages generated with #PHP, #Python, #Perl, #JSF and so on aren't static, but you can use #mod_rewrite to obfuscate your script names).
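For anyone wondering what that #mod_rewrite trick looks like, here's a minimal sketch of an Apache rewrite rule; the script name (article.php) and the id parameter are made-up examples, not anyone's real setup:

    # .htaccess sketch -- assumes mod_rewrite is enabled and overrides are allowed
    RewriteEngine On
    # Serve a "static-looking" URL like /articles/42 from a PHP script internally;
    # visitors and search engines only ever see the clean URL, never article.php
    RewriteRule ^articles/([0-9]+)/?$ article.php?id=$1 [L,QSA]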
-
♻ @kat: "Also, surely asking the client to run the code to build the pages is better design than making everything happen server side." That may be true if you're running an underpowered server that can't generate pages, but not for the user with a lightweight browser. Using #Javascript as the sole means to present content breaks the model of Content={text;video;audio}, Semantic…
-
@kat And then there's the whole security issue that @roland touched on -- I don't want some arbitrary web site running executable code on my computer. And when that #Javascript gets pulled in from multiple sources it's easy (but wrong) to just allow the entire page to run all the scripts it wants, leaving the user vulnerable to all the Javascript that comes from shady repositories or that's embedded in malicious ads.
-
And before I forget, this entire thread needs to be tagged with #Javascrippled
-
@maiyannah Local repositories for #Javascript are a good start, but it's Javascript dependency for delivering content that really makes me sad. It's partly an accessibility issue, partly a search engine visibility issue, partly an archiving issue. Developing a web site that's still functional without Javascript isn't that much more difficult than developing one that's intensely Java…
-
@bobjonkman No, it's that without loading content you can gain some milliseconds and make sites load "faster" (because there's a dynamically loaded spinning wheel instead of a progress bar!).
-
@mk Yes, sure. My point is, JavaScript hate is really "you broke the internet and this could've been done w/ progressive enhancement" hate.
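For anyone following along, a minimal sketch of what progressive enhancement means here (the URL, class name and element id are hypothetical): the link is ordinary HTML that works as a normal page load with no JavaScript at all, and a small script, when it does run, upgrades it to load the same content in place.

    <!-- Works as a plain page load when JavaScript is off or blocked -->
    <a href="/articles/42" class="js-inline">Read article 42</a>
    <div id="article-target"></div>

    <script>
    // Only runs when JavaScript is available; otherwise the link behaves normally
    document.querySelectorAll('a.js-inline').forEach(function (link) {
      link.addEventListener('click', function (event) {
        event.preventDefault();                    // intercept only when JS is running
        fetch(link.href)                           // same URL the non-JS visitor gets
          .then(function (response) { return response.text(); })
          .then(function (html) {
            document.getElementById('article-target').innerHTML = html;
          })
          .catch(function () { window.location = link.href; }); // fall back to a full load
      });
    });
    </script>

Search engines, archivers, text browsers and NoScript users all get the plain link; only browsers that run the script get the extra behaviour.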