- cross-posted to:
- technology@beehaw.org
As someone who never did much web development, I was… surprised… at the amount of tooling that exists to paper over this issue. The headaches that stood out for me were JavaScript bundling (first you have to choose which tool to use: webpack, but that’s slow, so then you switch to esbuild) and minification (but minified code is hard to debug, so now you need source maps to re-reverse the situation).
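To be fair, the config for all of that has at least gotten small. A minimal sketch of the bundle + minify + source map combo using esbuild’s build API (the entry point and output paths here are made up):

```typescript
import * as esbuild from "esbuild";

// One call covers the whole pipeline: resolve imports into a single file,
// minify it, and emit a source map so stack traces point back at the
// original TypeScript.
await esbuild.build({
  entryPoints: ["src/app.ts"], // hypothetical entry point
  bundle: true,                // inline all imports into one output file
  minify: true,                // strip whitespace, shorten identifiers
  sourcemap: true,             // write dist/app.js.map for debugging
  outfile: "dist/app.js",      // hypothetical output path
});
```

(Top-level await works here as long as the file is an ES module.)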
Of course the same kind of work needs to be done when developing programs in other languages. But something about developing in JS felt so noisy. Imagine if to compile Java or Rust you needed to first choose and configure your own compiler, each presenting their own websites with fancy logos adorning persuasive marketing copy.
My hypothesis is that the JavaScript ecosystem is something like 50:5:1 of junior : intermediate : expert contributors.
That’s why there’s so much “fuck it, I’ll make a new project from scratch!” and “screw conventions I’m making a breaking change”. Those are frankly very junior attitudes, and they feel more common in JavaScript land.
I don’t have any data to back this, but it’s my guess.
That or the ecosystem is just so polluted with bad ideas, people aren’t learning good practices.
I switched jobs from one using a mostly C++ stack to one using a TypeScript/JavaScript stack for a large application. I was absolutely shocked at how slow and generally shitty the tooling for JavaScript is, and coming from C++ land the bar was already very low.
Yeah, I hated web development whenever I had to do it. Web development as a whole feels half undercooked and half overcooked.
So-called “backend” I was OK with. HTTP is well-specified. It’s too general a protocol for what it’s being used for, though, so you’re stuck implementing the same stuff over and over again. When using SMTP or NNTP you realise how much work a protocol can do for you when building systems on top of it.
But “frontend”… Jesus, talk about abusing something that was never designed to be used the way it is. Total nightmare in my opinion! UIs which are totally inconsistent in appearance and behaviour have somehow become the norm!
I don’t know about that. Frameworks like React or CSS toolkits have made things more consistent across browsers. Being rid of Internet Explorer helped, too. Things were way worse 15 years ago.
Now, making the browser into a quasi operating system might not have been a good idea, but that’s a different argument.
I read this article a few weeks ago and it sent me on a rabbit hole of web performance articles.
I think a good budget for basic websites (articles, landing pages, and small to medium functionality web apps) is what I call the “GZ250”, or 250 KB of gzipped JavaScript, which is more than plenty. I picked this amount such that yesterday’s budget phones will be able to load the website in a few seconds at 1 Mbps (and the name references my motorcycle).
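A quick sanity check on that “few seconds” claim (pure back-of-envelope, assuming the 1 Mbps link is fully available):

```typescript
// Time for 250 KB of gzipped JS to arrive over a 1 Mbps connection.
const budgetBytes = 250 * 1024;      // the GZ250 budget
const linkBitsPerSec = 1_000_000;    // 1 Mbps budget-phone link
const seconds = (budgetBytes * 8) / linkBitsPerSec;
console.log(seconds.toFixed(1));     // ≈ 2.0 s to download, before the
                                     // phone decompresses and parses it
```

And on a budget phone, parsing and executing that much JS can easily take longer than the download itself.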
For comparison, my full-on games take way less than that. The Unscaled Incremental and Elemental Incremental are 52 KB and 19 KB of compressed JS respectively, and v1.0 of my new deckbuilding game is about 27 KB. The unreleased v1.1 is massive by comparison but will still be around 50-60 KB of compressed JS.
I don’t understand how an article uses 60x the script that my games do, but cutting back to even 6x would be a win for accessibility and efficiency.
Did you see this article by Dan Luu? https://danluu.com/slow-device/
Super interesting. It’s a discussion from a point of view I hadn’t considered before: how bandwidth has improved much faster than the CPU performance available to run web apps. I felt this in a way, as my main computer until recently was a mini PC with an Intel i5-5250U processor. Despite my Internet connection going from a 10 Mbps link to a 300 Mbps link, and pings dropping from 25 ms to <5 ms, browsing the web on the device became unbearable.
Interesting, it kinda feels like the opposite is true for me, at least on mobile. In 4 years, I’ve gone from a 1.4 GHz A53 SD425 to a 2.2 GHz A78 SD695 SoC, roughly a 6x increase in single-thread performance. I also got a powerful laptop with a Ryzen 9 5900HX CPU during that time.
Meanwhile, it’s still not unusual to see my Internet speeds drop below 1 Mbps, often hovering around 100-300 Kbps, on mobile data or crappy university WiFi, which sometimes has a ping of, no joke, 20,000+ ms on my laptop when running Ubuntu. I can sometimes reach throughput of up to 100 Mbps, but when I can’t, my Internet chugs.
How big is 10 MB anyway?
To be honest, after typing all these numbers, 10 MB doesn’t even feel that big or special. Seems like shipping 10 MB of code is normal now.
If we assume that the average code line is about 65 characters, that would mean we are shipping ~150,000 lines of code. With every website! Sometimes just to show static content!
And that code is minified already, so the original source is more like 300K+ LoC, just for one website.
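The arithmetic behind those figures, with the 65-character average line and a rough 2x minification factor as the assumptions:

```typescript
// Rough lines-of-code estimate for a 10 MB JS payload.
const payloadBytes = 10_000_000;          // 10 MB of shipped JS
const charsPerLine = 65;                  // assumed average line length
console.log(payloadBytes / charsPerLine); // ≈ 153,846 lines as shipped
// Minification strips whitespace and shortens names, so the original
// source is plausibly 2x that or more: 300K+ lines for a single page.
```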
An important takeaway, as I feel byte sizes can be hard for people to intuitively visualize. And for those who didn’t read the article: many of the sites tested sent significantly more than 10 megs of JS, even sites containing nothing more than simple input boxes that ought to be doing their processing server-side.
I want to see the difference with ad-block enabled. Analytics and tracking are certainly complex enough to account for a lot of that payload. Same with an addon like Decentraleyes to see how much is bloated frameworks that could easily be cached locally.
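For anyone who wants to eyeball this themselves, the Resource Timing API gives a rough per-page number to compare with the blocker on and off. A sketch (note that transferSize reports 0 for cross-origin scripts unless the server sends a Timing-Allow-Origin header, so treat the result as a lower bound):

```typescript
// Paste into the DevTools console on a loaded page, then repeat the
// measurement with uBlock Origin / Decentraleyes toggled.
const entries = performance.getEntriesByType(
  "resource",
) as PerformanceResourceTiming[];
const scripts = entries.filter((e) => e.initiatorType === "script");
const kb = scripts.reduce((sum, e) => sum + e.transferSize, 0) / 1024;
console.log(`${scripts.length} scripts, ~${kb.toFixed(0)} KB transferred`);
```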
now let’s enable uBlock Origin
And all of it burns CO2 in multiple places. Not just a computer crime!
Damn, I love Flask. I can do somewhat complex stuff without even touching JS.
It’s just JavaScript - the bloat part is implicitly there when talking about JS/TS.