Hard drive space is the cheapest thing you’ve got on a computer.
I hate this “storage is cheap” mentality; it’s a cop-out for being wasteful without a reason. “Gas is cheap” was common up to the early 1970s, until it wasn’t anymore. “Freshwater is cheap”, until it isn’t anymore.
Resources are cheap and getting cheaper all the time.
Developers are expensive and getting more expensive all the time.
It’s no wonder everything is a sprawling mess.
Are you willing to give up 1080p screens and 16-bit/44.1kHz sampled music? Or how about languages that can’t be represented in ASCII, much less Latin-1? Because handling those takes up way more space than code.
Let me quote myself:
And there’s almost always a reason. Code size tends to be modest compared to supporting data around it.
I see you’ve never dealt with a real-life project that requires god knows how many different libraries off nodejs because 🤷♂️
Dependency hell takes a lot of space.
I have. Still small compared to the images and such that are used in a user-facing application.
Edit: just to bring in real numbers, I have an old TypeScript project that results in a 109M node_modules dir. Which I agree is absurd. I also have an old anime video, 21 minutes long, at only 560x432 resolution, 24fps, which takes 171M. And that’s my point: even in really bad cases, code size tends to be swamped out by everything else in user-facing applications. If there’s any kind of images, music, or video, the code size will be a small part of the complete picture.
As a point of comparison, in the last place I worked, the main project had over 600MB of javascript dependencies it pulled from node. Plus 300MB of python libraries for Django and whatever else.
At my current job, preparing a development environment for one “isolated” php system needs at least 3GB of dependencies. Even the main programmer behind it has no clue how it happened or why.
I’d rather have the audio compressed and decompressed in real time instead of a game taking up 100GB. Or maybe give me the option to only download the languages I can use.
It’s almost always compressed in some way. Still takes up a lot of space. You can fit a lot of compiled code in the space of a 1 minute, 128kbps mp3.
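Rough arithmetic, just to put a number on that: 128 kbps is 16 KB per second, so one minute at that bitrate is about 16 KB × 60 ≈ 960 KB, roughly a megabyte for a single short track, before you even count artwork or video.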
You present a false dichotomy. Yes, things like uncompressed audio and HD video take up more storage space, but that does not negate the fact that modern commercial software is very inefficient with how it uses resources. You could improve the efficiency of the system while keeping HD video; it is not a mutually exclusive choice.
For example, booting up Windows and doing nothing takes up 4GB of RAM, while doing the same with a lean Linux installation would take up a quarter of that, despite both operating systems having identical functionality (run a web browser, open applications, edit documents, play games, etc.).
Does Windows take up that much memory at boot because of code, or because of data that code is loading?
4GB is what Windows idles at for me, after everything has loaded.
How much of that is cached state based on the percentage of RAM available?
Windows takes a percentage of your available RAM; you can boot it on 4GB of RAM and it will use 1GB or so.
Sure, and I’m sure there are some performance gains to be made from it, but when my OS is doing that and my web browser is doing that and my browser-based chat client and my browser-based text editor are all doing that, it gets pretty sluggish.
This is why Linux is a godsend for older machines: even running the exact same applications (Firefox, Discord, and vscode) on the exact same hardware, everything still feels more responsive on Linux because there is less overhead from the OS itself.
It’s an invented problem. A program takes what a program takes. Everyone cares way more about the code being legible, the code being fast enough, and the code not using a ton of memory (and even that last one is kind of shrugged off depending on context).
Applications taking 3MB take 3MB because they do next to nothing, or they do it with a bunch of shared libraries … which is a whole other dependency-management mess and still wastes a few MB on the drive.
There’s also a huge difference between being wasteful with something that pollutes the planet en masse and is not renewable, like gasoline (which is the only reason you’d be upset about that now), and wasting a few MB on a drive.
Your complaint about 3MB vs 200MB is like complaining about a person taking a trip to the grocery store… It’s insignificant and often necessary.
You can say that the program does way more than you need, but … nobody is catering to “only what you specifically need”, and the larger program almost certainly covers your needs.
Furthermore, like I already said, making things smaller often makes them slower… Since CPU performance is more expensive to improve than storage, of course things are bigger; that’s what more people care about. Some video games take that to an extreme with uncompressed files and 250GB install footprints … but 200MB?
Everyone cares way more about the code being legible, the code being fast enough, and the code not using a ton of memory (and even that last one is kind of shrugged off depending on context).
And then you look at real life and notice that code everywhere is slow, bloated and inefficient. But hey, it’s “legible”! To one or two devs, hopefully.
Your complaint about 3MB vs 200MB is like complaining about a person taking a trip to the grocery store
Terrible analogy. A better equivalent is someone renting a garage to store stuff inside, and now, because they have so much space, there’s that urge to fill it, whether it makes sense to or not.
making things smaller often makes them slower
It’s usually the other way around. As a rule of thumb, less code = smaller size = faster execution. In theory, 1k lines of code will require less computation, less processing, than 10k.
And then you look at real life and notice that code everywhere is slow, bloated and inefficient.
That’s not true in practice. I mean, that code does exist. However, the vast majority of code is reasonably performant.
Not everyone is an expert at optimization and that’s fine … we’d have a lot less software in general if only the best of the best were allowed to author it.
It would be great if more things went back to native (or at least not “I need an entire web browser for my app to function”); that, to me, is wasteful… But a few hundred MBs for a program as large, complicated, and feature-rich as LibreOffice is not.
Terrible analogy. A better equivalent is someone renting a garage to store stuff inside, and now, because they have so much space, there’s that urge to fill it, whether it makes sense to or not.
No, that’s … just wrong. It’s not like people are just writing code and leaving it there to do nothing except increase code size, or are actively trying to fill the drive.
It’s usually the other way around. As a rule of thumb, less code = smaller size = faster execution. In theory, 1k lines of code will require less computation, less processing, than 10k.
That’s not inherently true, though it is a common misconception/oversimplification. When you do things like code inlining, you increase code size (because you’re taking that function’s code and having your compiler copy it around to a bunch of places), but the increased locality speeds things up. There’s a reason -Os and -O3 are not the same option.
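To make the tradeoff concrete, here’s a minimal C sketch (scale() is just a made-up example function, and the exact outcome depends on your compiler): built with -O3, GCC or Clang will typically inline the helper into its call sites, duplicating its body and growing the binary to avoid call overhead; built with -Os, the compiler leans toward keeping a single shared out-of-line copy to minimize size.

    #include <stdio.h>

    /* A tiny helper the compiler may or may not inline. */
    static int scale(int x) {
        return x * 3 + 1;
    }

    int main(void) {
        int total = 0;
        /* With -O3 the compiler will typically paste the body of scale()
           into this loop (bigger code, no call overhead); with -Os it
           leans toward one shared copy to keep the binary small. */
        for (int i = 0; i < 1000; i++) {
            total += scale(i);
        }
        printf("%d\n", total);
        return 0;
    }

Same source both times; the size/speed balance is picked by a flag, not by how few lines you wrote.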
Now sure, executing fewer instructions is better than executing more localized code (though even that can be wrong given processor caches and relative instruction speeds). Lots of programs have added features that you might not use, but that doesn’t really “hurt you”; it’s not the source of your program’s or your computer’s slowness, it’s just some bytes on the drive.
We’re a long way from the Unix-style “everything is a small program that gets piped into other programs to do interesting things” days. That paradigm just doesn’t work for GUI software. Nobody does that because … normal folks would rather have one office program than go shopping for 275 programs so that they can have separate programs to edit the document, print the document, convert the document to PDF, update calculations in their spreadsheet, run macros, etc. (which, if you use all or most of them, would likely be more expensive in terms of disk space anyway).