- cross-posted to:
- hackernews@lemmy.bestiver.se
This was a fun one. Here’s my newest post on how to dramatically reduce Godot’s build size.
Some sacrifices were made… But the end result is a Godot project that works exactly the same, albeit with slightly worse performance. Hope this can help others in achieving tiny build sizes!
Very interesting. I was thinking about doing this, but went the Defold route instead. Defold is much smaller out of the box than even these results, but the sacrifices in functionality are pretty severe and the editor and engine are, shall I say, cantankerous. I may have to circle back to trying this at some point for my web games. Thanks for the post!
Some sacrifices were made… But the end result is a Godot project that works exactly the same, albeit with slightly worse performance.
The write up has lots of hard numbers for the executable size but only describes the performance impact in general terms.
It would be good to see some before and after performance numbers.
Performance testing is a whole can of worms. It’s hard to get an idea of how performance changes because it’ll depend a lot on the nodes and scripts being used. There could be huge regressions in specific cases and functions and no difference in others. Usually you’ll need a suite of tests to see what changed.
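To make the "suite of tests" point concrete, here is a minimal Python benchmarking sketch. The two function bodies are hypothetical stand-ins; a real comparison would time representative Godot scenes and scripts in both the default and the size-optimized build.

```python
import timeit

# Hypothetical stand-ins for a "before" and "after" code path.
# In practice you would benchmark the same workload on both builds.
def default_build():
    return sum(i * i for i in range(1000))

def size_optimized_build():
    return sum(i * i for i in range(1000))

for name, fn in (("default", default_build), ("optimized", size_optimized_build)):
    # min() of several repeats filters out scheduler noise.
    t = min(timeit.repeat(fn, number=1000, repeat=5))
    print(f"{name}: {t * 1e6 / 1000:.1f} µs/iteration")
```

Running a set of such micro-benchmarks per subsystem (rendering, physics, scripting) is what it takes to see whether a regression is broad or confined to specific functions.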
But, why?
100 MB is already negligible. I want a reasonable size (less than 20 GB) and okay performance.
Wasting time on metrics that don’t matter is waste. I get and agree with the mobile/web constraints; I hadn’t considered them.
@Gladaed @popcar2 because every bloated 100 GB game is a mix of a thousand different assets/executables/libraries where the developer thought “why optimize, 100 MB is nothing”.
Often, with little more than symbol stripping, removing unneeded files, and basic LZ compression, you can cut most game sizes in half, reduce load times, and free up terabytes of space on your system.
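As a small illustration of the LZ-compression half of that claim, here is a Python sketch using the standard `zlib` module (which implements DEFLATE, an LZ77 variant). The sample data is hypothetical; text-heavy assets such as scene files or JSON compress especially well, while already-compressed textures and audio do not.

```python
import zlib

# Hypothetical repetitive, text-like asset data (e.g. scene/JSON files).
data = b'{"node": "Sprite2D", "texture": "res://icon.png"}\n' * 10_000

# Level 9 = best compression; slower, but done once at packaging time.
compressed = zlib.compress(data, level=9)
ratio = len(compressed) / len(data)
print(f"{len(data)} -> {len(compressed)} bytes ({ratio:.1%} of original)")
```

Results vary wildly with the input: highly structured data may shrink to a few percent of its original size, whereas PNG or Ogg files are near-incompressible and should simply be excluded from a second compression pass.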
because every bloated 100 GB game is a mix of a thousand different assets/executables/libraries
Also needlessly uncompressed audio or poorly compressed video (that likely could instead be in-engine) particularly when multiple resolutions are needed. Sometimes one of these might be the bigger issue, particularly with remasters.
@Gladaed I have less-than-stellar internet* (~6 Mb/s, shared with others), so 20 GB is definitely not a reasonable size for me. 100 MiB is fine, but not negligible. Consider storage space too, particularly because users will run games from slower (and therefore cheaper) drives when they deem games too big, which for me already starts at 1 GiB+, at least in terms of keeping a library, because that adds up.
* and I live in the US, not even in the woods! The price isn’t even great, either.
As I see it, data size is an inverse multiplier for viability. The smaller a download is the more likely that users will be able to get and store it (long-term even) without issue. The difference between a day and an hour is huge, the difference between an hour and 5 minutes is still worthwhile (especially if this impacts household internet speed). Less than that (nearing instant download) is peak.
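The day/hour/minutes framing is easy to check with arithmetic. Here is a short Python sketch using the ~6 Mb/s connection mentioned above and a few illustrative sizes (the 42 MiB figure is the web-export default discussed below; the others are hypothetical round numbers):

```python
def download_seconds(size_mib: float, speed_mbit_s: float) -> float:
    """Seconds to download size_mib mebibytes at speed_mbit_s megabits/second."""
    bits = size_mib * 1024 * 1024 * 8
    return bits / (speed_mbit_s * 1_000_000)

for size_mib in (42, 1024, 20 * 1024):  # web export, 1 GiB, 20 GiB
    secs = download_seconds(size_mib, 6)
    print(f"{size_mib:>6} MiB at 6 Mb/s: {secs:>8.0f} s ({secs / 3600:.1f} h)")
```

At that speed the 42 MiB export takes about a minute, a 1 GiB game roughly 24 minutes, and a 20 GiB game around 8 hours, which is exactly the day-versus-minutes gap described above.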
The article begins by talking about web exports. The default there is 42 MB, which is kind of a lot for the web.
Edit: Of course, compressed it is ~9 MB, which isn’t so bad, I suppose.
For mobile this is a legit difference imo
Also what raptor said.