• kamen@lemmy.world · 7 months ago

    Don’t be a fan of one or the other; just get whichever is more appropriate at the time you’re buying.

  • Lizardking27@lemmy.world · 7 months ago

    Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.

    I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.

    I knew I should’ve been an engineer; how easy must it be to sit around and make shit that doesn’t work?

    Fucking despicable. Do better or die, manufacturers.

    • Doombot1@lemmy.one · 7 months ago

      Most of the time the product itself comes out of engineering just fine, and then it gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes - in my mind, it’s more about how they’re handled by the company (oftentimes poorly).

      One of the products my team worked on a few years ago required us to spin up our own ASIC. We spun one up (roughly $20-30M USD) and, a few months later, found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were about to release the product, we discovered a bad flaw in the new ASIC too. The products worked for the most part, but of course not always, since the bug would sometimes get hit. My company did the right thing and never released the product, though.

      • /home/pineapplelover@lemm.ee · 7 months ago

        It’s almost never the engineers’ fault. That whole NASA spacecraft that exploded was due to bureaucracy and pushing the mission forward.

      • Allonzee@lemmy.world · 7 months ago

        Capitalism: “Growth or die!”

        Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥

        It’s kind of gallows-hilarious that, for all the world’s religions worshipping ridiculous campfire ghost stories, we do have a creator: a remarkable macro-organism mother hosting millions of species, with a story of life going back 3.8 billion years, most of it living in homeostasis with its ecosystem.

        But our actual creator, Earth (not some ridiculous work of lazy fiction), we literally choose to treat like property to loot, rape, and pillage thoughtlessly, and we keep acting as a cancer upon her with our eyes wide open. We as a species are so fucking weird, and not the good kind.

  • w2tpmf@lemmy.world · 7 months ago

    This keeps getting slightly misrepresented.

    There is no fix for CPUs that are already damaged.

    There is a fix now to prevent it from happening to a good CPU.

    • exanime@lemmy.world · 7 months ago

      But isn’t the fix basically underclocking those CPUs?

      Meaning the “solution” (not even out yet) is crippling those units before the flaw cripples them?

      • Kazumara@discuss.tchncs.de · 7 months ago

        They said the cause was a bug in the microcode making the CPU request unsafe voltages:

        “Our analysis of returned processors confirms that the elevated operating voltage is stemming from a microcode algorithm resulting in incorrect voltage requests to the processor.”

        If the buggy voltage behaviour contributed to higher boost clocks, then the fix will cost some performance. But if the clocks were steered separately from the voltage, and the boost clock is still reached without the overly high voltage, then it might be performance-neutral.

        I think we will know for sure soon; multiple reviewers have announced they are planning to test the impact. If you want to eyeball your own system in the meantime, a rough sketch follows.
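
        This isn’t any official tooling, just a sketch of my own: assuming a Linux box that exposes the cpufreq sysfs interface, it prints the currently loaded microcode revision and samples peak core clocks, so you can compare boost behaviour before and after applying the update. The file paths and the sampling loop are my assumptions, not anything Intel or the reviewers published.

        ```python
        # Rough sketch (mine, not Intel's tooling): report the loaded microcode
        # revision and sample per-core clocks to compare boost behaviour before
        # and after the microcode update. Assumes Linux with cpufreq sysfs.

        import glob
        import time


        def microcode_revision() -> str:
            """Return the microcode revision string from /proc/cpuinfo."""
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("microcode"):
                        return line.split(":", 1)[1].strip()
            return "unknown"


        def core_clocks_khz() -> list[int]:
            """Return the instantaneous frequency (kHz) of every core via cpufreq."""
            pattern = "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"
            clocks = []
            for path in sorted(glob.glob(pattern)):
                with open(path) as f:
                    clocks.append(int(f.read().strip()))
            return clocks


        if __name__ == "__main__":
            print("microcode revision:", microcode_revision())
            # Run your usual load in another terminal and watch the peak clock.
            for _ in range(10):
                print(f"peak core clock: {max(core_clocks_khz()) / 1e6:.2f} GHz")
                time.sleep(1)
        ```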

      • w2tpmf@lemmy.world · 7 months ago

        That was the first “Intel Baseline Profile” they rolled out to mobo manufacturers earlier in the year. They’ve rolled out a new fix now.

  • linkhidalgogato@lemmy.ml · 7 months ago

    I’m a fan of no corporation, especially not fucking AMD, but they have been so much better than Intel recently that I’m struggling to understand why anyone still buys Intel.

  • arefx@lemmy.ml · 7 months ago

    Ryzen gang

    My 7800X3D is incredible; I won’t be going back to Intel any time soon.

    • Scrubbles@poptalk.scrubbles.tech · 7 months ago

      Honestly, even with GPUs now too. I was forced to Team Green for a few years because AMD was so far behind. Now, though, unless you absolutely need a 4090 for some reason, you can get basically the same performance from AMD for 70% of the cost.

      • anivia@lemmy.ml · 7 months ago

        I disagree. Processing power may be similar, but Nvidia still outperforms with ray tracing and, more importantly, DLSS.

        What’s the point of having the same processing power when Nvidia still gets more than double the FPS in any game that supports DLSS?

        • reliv3@lemmy.world · 7 months ago

          FSR exists, and FSR 3 actually looks very good when compared with DLSS. These arguments about raytracing and DLSS are getting weaker and weaker.

          There are still strong arguments for Nvidia GPUs in the prosumer market because of CUDA support in some software suites, but for gaming, Nvidia is just overcharging because they still hold the mindshare.

        • Scrubbles@poptalk.scrubbles.tech · 7 months ago

          I had the 3090 and then the 6900xtx. The differences were minimal, if even noticeable. Ray tracing is about a generation behind going from Nvidia to AMD, but they’re catching up.

          As the other commenter said, FSR is the same as DLSS too. For me, I actually got a better frame rate with FSR playing Cyberpunk and Satisfactory than I did with DLSS!
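
          If anyone wants to compare captures a bit more rigorously than eyeballing an overlay, here’s a small sketch of my own (nothing official, and the log format is an assumption): it takes two plain-text files with one frame time in milliseconds per line, one from an FSR run and one from a DLSS run, and prints average FPS and 1% lows for each.

          ```python
          # Hypothetical comparison helper (mine, not from any benchmarking tool).
          # Each input file is assumed to hold one frame time in milliseconds per
          # line; adapt the loader to whatever your capture tool actually exports.

          import statistics
          import sys


          def load_frametimes_ms(path: str) -> list[float]:
              """Read frame times (ms), one per line, skipping blank lines."""
              with open(path) as f:
                  return [float(line) for line in f if line.strip()]


          def summarize(frametimes_ms: list[float]) -> tuple[float, float]:
              """Return (average FPS, 1% low FPS) for one run."""
              avg_fps = 1000.0 / statistics.mean(frametimes_ms)
              # 1% low: FPS implied by the ~99th-percentile (slowest) frame times.
              worst = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
              return avg_fps, 1000.0 / worst


          if __name__ == "__main__":
              # Usage: python compare_runs.py fsr_capture.txt dlss_capture.txt
              for label, path in zip(("FSR", "DLSS"), sys.argv[1:3]):
                  avg, low = summarize(load_frametimes_ms(path))
                  print(f"{label}: {avg:.1f} FPS avg, {low:.1f} FPS 1% low")
          ```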

      • sparkle@lemm.ee · 7 months ago

        Are you just posting this under every comment? This isn’t even a fraction as bad as the Intel CPU issue. Something tells me you have Intel hardware…

  • littletranspunk@lemmus.org · 7 months ago

    Glad my first self-built PC is full AMD (built about a year ago).

    Screw Intel and Nvidia

    It was built with a 7700X.

    • g0nz0li0@lemmy.world · 7 months ago

      I’m not up to speed on the discovery you linked. It appears to be a vulnerability that can’t be exploited remotely? If so, how is this the same as Intel chips causing widespread system instability?

    • punkfungus@sh.itjust.works · 7 months ago

      This isn’t the first time such a vulnerability has been found; have you forgotten Spectre/Meltdown? Though this one is arguably not nearly as impactful as those, because it requires physical access to the machine.

      Your fervour in trying to paint this as an equivalent problem to Intel’s 13th- and 14th-gen defects, and your implication that everyone else is being a fanboy, is just telling on yourself, mate. Normal people don’t go to bat like that for massive corpos; only Kool-Aid drinkers do.