• 2 Posts
  • 163 Comments
Joined 2 years ago
Cake day: June 12th, 2023


  • Also, the overwhelming majority of USB plugs have the logo on the side away from the plastic bit, and sockets have their plastic bits towards the top of the device. You want the plastic bits on opposite sides (as physical objects don’t like to overlap), so that means that if you can feel the logo with your thumb, that side goes up when you plug it in, and you don’t even have to look.



  • Returns and refunds happened because the EU warned Valve that if they didn’t offer them, it would create legislation so they’d have to, as they were already in a grey area under EU law. EA had a similar refund policy for games bought through Origin before Steam did.

    Other than that, nearly everything you listed was done because it made business sense and would lead to more profit. Decoupling PC gaming from Windows by working on Linux means they’re not at the mercy of Microsoft’s whims, and was started at a time when it looked like Microsoft might make a version of Windows that could only install third-party software through the Windows Store. Discouraging kernel-level anti-cheat removes one of the last hurdles to Linux being able to play all Windows games. Supporting VR lets them keep selling VR games through Steam.

    Not being publicly traded lets them prioritise their long-term profits over their short-term ones, so they won’t do things that tarnish their reputation nearly as often as their competitors, and they can take on multi-year projects. They look good because their competitors are bad rather than because what they do is altruistic.

    Edit: I just did some maths. The Steam Deck has sold somewhere around 5 million units and a Windows licence costs somewhere around $50 to an OEM with a volume licensing deal. Both these figures are approximate as I couldn’t find precise numbers, but they’re enough for a ballpark figure. This means that Valve have saved around $250 million by shipping the Deck with SteamOS instead of Windows. Even if my figures are way off, it’s still a huge amount of money and goes a long way towards making all their work on Linux pay for itself.
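    As a sanity check on that ballpark, here’s the same arithmetic written out as a tiny Python sketch; the unit count and per-licence cost are the same rough estimates as above, not official figures.

    ```python
    # Rough ballpark of the licensing cost Valve avoids by shipping SteamOS
    # instead of Windows on the Steam Deck. Both inputs are the approximate
    # figures quoted above, not official numbers.
    units_sold = 5_000_000       # approximate Steam Deck sales
    oem_licence_usd = 50         # approximate per-unit OEM Windows licence cost
    savings = units_sold * oem_licence_usd
    print(f"~${savings:,}")      # ~$250,000,000
    ```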


  • Arch is at least more likely to update to a fixed version sooner, and someone installing something with pacman is going to be used to the idea of it breaking because of bleeding-edge dependencies. The difference with the Flatpak is that most users believe they’re getting something straight from the developers, so they’re not going to report problems to the right people if Fedora puts a different source of Flatpaks in the default lists and overrides working packages with ones so broken as to be useless.


  • People fall off rooftops fitting solar panels, burn to death repairing wind turbines that they can’t climb down fast enough to escape, and dams burst and wash away towns. Renewable energy is much less killy than fossil fuels, but per megawatt hour, it’s comparable to nuclear, despite a few large incidents killing quite a lot of people each. At the moment, over their history, hydro is four times deadlier than nuclear, wind’s a little worse than nuclear, and solar’s a little better. Fission power is actually really safe.

    The article’s talking about fusion power, though. Fission reactions are dangerous because if you’ve got enough fuel to get a reaction at all, you’ve got enough fuel to get a bigger reaction than you want, so it has to be controlled carefully to avoid getting too hot; if it overheats, the steam in the reactor can burst out and carry chunks of partially-used fuel with it, which are very deadly. That problem doesn’t exist with fusion. It’s so hard to make the reaction happen in the first place that any problem just makes the reaction stop immediately. If you somehow blew a hole in the side of the reactor, you’d just get some very hot hydrogen and very hot helium, which would be harmless in a few minutes once they’d cooled down. It’s impossible for fusion power, once it’s working, not to be the safest way to generate energy in history, because it inherently avoids the big problems with what is already one of the safest ways.



  • That’s misleading in the other direction, though, as PhysX is really two things: a regular, boring CPU-side physics library (just like Havok, Jolt and Bullet), and a GPU-accelerated physics library that only does a few things, but does them faster. Most things that use PhysX just use the CPU-side part and won’t notice or care if the GPU changes. A few things use the GPU-accelerated part, but the overwhelming majority of those use it for optional extra features that only work on Nvidia cards; instead of running the same effects on the CPU when there’s no Nvidia card available, they just skip them, so it’s not the end of the world that they’re disabled on the 5000-series.


  • It does also get pushed by organisations that profit from fossil fuels as an excuse never to decarbonise, on the basis that they can hypothetically just capture it all again later, which is dumb and impractical for a variety of reasons, including the one alluded to above. Some kind of carbon sink will need to be part of the long-term solution, but the groups pushing most strongly want it to be the whole solution and have someone else pay for it so they can keep doing the same things that caused the problem in the first place.



  • The Free Software movement was generally a leftist objection to the limitations on computer use that capitalism was causing, and the open source movement was a pro-corporate offshoot to try and make the obvious benefits more compatible with capitalism (which it’s been pretty successful at, even if it has reintroduced some of the problems Free Software was trying to stop in the first place). Anyone who’s making a distinction between the two is, at minimum, recognising that capitalism is why we can’t have certain specific nice things, so it’s not a huge leap to blame it for other problems, too.

    As for a sensible middle ground, the Free Software movement designed its licences to work in the capitalist societies they operated in, so the incompatibility with corporate use has never been as big a deal as it’s been made out to be. Corporations can use copyleft-licenced software just fine as long as they’re not unreasonable about it. It’s totally fine for a corporation to use a GPL tool internally and even have an internal fork, as long as they put the source code for their internal fork on the company’s file share so the employees using the tool can improve it if they get the urge. They can even sell products that depend on LGPL or MPL libraries if they make the source of the builds of those libraries they used available on their website or otherwise accessible to their customers (and use a DLL/.so/.dylib build of the library if it’s LGPL). These restrictions are all less of a pain than making an MIT-licenced clone of an existing project, but companies have opted to make clones instead. The only bonus this gives them is that they can make it proprietary again later, and it has the added risk that one of their competitors could make a proprietary fork with a killer feature they can charge for, which isn’t a nice risk. There are other benefits to investing in making your own clone of something, but they don’t depend on the licence it uses.


  • If you write cross-platform software, the easiest solution is usually to pretend everything’s Unix. You’ll hit some problems (e.g. assuming all filesystem APIs always use UTF-8 will bite you on Windows, which switched to UCS-2 before UTF-8 or UTF-16 were invented, so it now uses UTF-16 for its Unicode-aware functions as that’s the encoding that’s ABI-compatible with UCS-2, and passing UTF-8 to the eight-bit-char functions requires you to opt into that mode explicitly), but mostly everything will just work. There’s no XDG_CONFIG_HOME on Windows telling you to put config files anywhere in particular, as Windows is Windows, so most things use ~ as a fallback, which gets expanded to %USERPROFILE%.
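    A minimal sketch of that fallback, assuming Python and a hypothetical app name ("myapp"); expanduser() is what maps ~ to %USERPROFILE% on Windows:

    ```python
    import os

    def config_dir(app_name: str = "myapp") -> str:
        # Respect XDG_CONFIG_HOME where it exists (typical Linux desktops),
        # otherwise pretend everything's Unix and fall back to ~/.config;
        # os.path.expanduser() turns ~ into %USERPROFILE% on Windows.
        base = os.environ.get("XDG_CONFIG_HOME") or os.path.expanduser("~/.config")
        return os.path.join(base, app_name)

    print(config_dir())  # e.g. /home/you/.config/myapp or C:\Users\you\.config\myapp
    ```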


  • AnyOldName3@lemmy.world to Microblog Memes@lemmy.world · Consent machine · 1 month ago

    They had a bazillion complaints (and still get them) that they report the figures at all and that they don’t treat Hamas being a terrorist organisation as a statement of fact. For a couple of weeks after the October 7th attack, the reporting was more neutral, the whole rest of the British press was up in arms about the BBC being antisemitic, and the current situation was the compromise that calmed it down. In a world where Israel having done nothing wrong ever is somehow part of the Overton window, this is what counts as impartial. Impartiality is a bad thing when it’s forced to apply to viewpoints divorced from reality.





  • There are already shutters, so the only hole you can get a fork into is the earth, unless you’ve already got something convincingly shaped like an earth pin in the earth hole to open the shutters over the live and neutral. If you’re going to that much effort to zap yourself, the switch isn’t going to be much of a hurdle.

    I’d suspect it’s largely because it’s more convenient to have a switch than to unplug things and plug them back in again, especially as our plugs are such a nightmare to step on that Americans complaining about stepping on Lego seems comical to anyone who’s stepped on both Lego and a plug.



  • Eating Tide Pods wasn’t even a bit real. A few children tried to make videos where they pretended to eat them, but simply biting one and spitting it straight out can kill you, as they’re so reactive with the tissue in your mouth that they liquefy your tongue, flow to your throat, liquefy that, flow to your lungs, and liquefy enough of those that you suffocate. The media reported this as children eating Tide Pods for TikTok, so more children tried making fake videos off the back of that, not realising that they were doing the dangerous thing rather than a danger-adjacent, safe-ish thing.


  • You can jam the Windows UI by spawning loads of processes with equivalent or higher priority to explorer.exe (which runs the desktop), as they’ll compete with it for CPU time. The same will happen if you do the equivalent under Linux. However, if you have one process that does lots of small allocations, then under Windows, once the memory and page file are exhausted, an allocation will eventually fail, and if the application’s not set up to handle that, it’ll die and you’ll have free memory again. Doing the same under every desktop Linux distro I’ve tried (which have mostly been Ubuntu-based, so others may handle it better) will just freeze the whole machine. I don’t know the details, but I’d guess the process gets suspended until its request can be fulfilled, so as long as there’s memory, it gets it eventually, but it never gets told to stop or murdered, so there’s no memory left for things like the desktop environment to use.
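    If you want to see the Windows behaviour described above, a sketch along these lines (purely illustrative, and not something to run on a machine you care about) keeps making small allocations until the OS refuses:

    ```python
    # Illustrative only: keep making small allocations until one fails.
    # On Windows this tends to end with a MemoryError once RAM and the page
    # file are exhausted; on many desktop Linux setups the machine may thrash
    # or freeze before an allocation ever fails.
    chunks = []
    try:
        while True:
            chunks.append(bytearray(1024 * 1024))  # 1 MiB per allocation
    except MemoryError:
        print(f"Allocation failed after roughly {len(chunks)} MiB")
    ```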