• punkfungus@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

This isn’t the first time such a vulnerability has been found; have you forgotten Spectre/Meltdown? Though this is arguably not nearly as impactful as those, because it requires physical access to the machine.

Your fervour in trying to paint this as an equivalent problem to Intel’s 13th and 14th gen defects, and your implication that everyone else is a fanboy, is just telling on yourself, mate. Normal people don’t go to bat like that for massive corpos, only Kool-Aid drinkers.

    • g0nz0li0@lemmy.world
      3 months ago

      I’m not up to speed on the discovery you linked. It appears to be a vulnerability that can’t be exploited remotely? If so, how is this the same as Intel chips causing widespread system instability?

    • linkhidalgogato@lemmy.ml
      3 months ago

not gonna lie u look a lot like a fanboy urself idk ur just giving off “my beloved intel looks so bad here that i can’t directly say its better so ill just both sides with some dumb thing” energy

    • zaphodb2002@sh.itjust.works
      3 months ago

I loved my FX CPU, but I lived in a desert and the heat coming off that thing in the summer would make my room 100°F or more. First machine I built a custom water loop for. Didn’t help with the heat in the room, but did stop it from shutting down randomly, so I could continue to sit in the sweltering heat in my underpants and play video games until dawn. Better times.

      • rotopenguin@infosec.pub
        3 months ago

        You might want to go through the trouble of extending that radiator loop all the way out through a window.

      • Bytemeister@lemmy.world
        3 months ago

I had the FX-8350 Black Edition, and that thing would keep my room at 70°F… In the winter… With a window open.

        Summer gaming was BSOD city. I miss it so much.

      • helpmepickaname@lemmy.world
        3 months ago

Of course it didn’t help the heat in the room; the heat from the CPU still has to go somewhere. Better coolers aren’t for the room, they’re for the CPU. In fact, a better cooler could make the room hotter, because it removes heat from the CPU at a higher rate and dumps it into the room.

  • w2tpmf@lemmy.world
    3 months ago

    This keeps getting slightly misrepresented.

    There is no fix for CPUs that are already damaged.

    There is a fix now to prevent it from happening to a good CPU.

    • exanime@lemmy.world
      3 months ago

But isn’t the fix basically underclocking those CPUs?

Meaning the “solution” (not even out yet) is crippling those units before the flaw cripples them?

      • Kazumara@discuss.tchncs.de
        3 months ago

        They said the cause was a bug in the microcode making the CPU request unsafe voltages:

        Our analysis of returned processors confirms that the elevated operating voltage is stemming from a microcode algorithm resulting in incorrect voltage requests to the processor.

        If the buggy behaviour of the voltage contributed to higher boosts, then the fix will cost some performance. But if the clocks were steered separately from the voltage, and the boost clock is still achieved without the overly high voltage, then it might be performance neutral.

        I think we will know for sure soon, multiple reviewers announced they were planning to test the impact.

      • w2tpmf@lemmy.world
        3 months ago

That was the first “Intel Baseline Profile” they rolled out to mobo manufacturers earlier in the year. They’ve rolled out a new fix now.

  • lolcatnip@reddthat.com
    3 months ago

I switched to AMD largely for better battery performance, but this makes me feel like I dodged a bullet.

    • papalonian@lemmy.world
      3 months ago

Just out of curiosity, when you say better battery performance, what kind of battery are we talking about? Is this in a laptop, or a desktop on some sort of remote/backup power system?

        • papalonian@lemmy.world
          3 months ago

          I see, so is it a known thing that AMD CPU laptops generally have better battery life? I always see arguments for one CPU/GPU over another because of better power consumption, but I’ve never been in a position where I needed to worry much about it, so I’ve never looked much into the claims.

          • lolcatnip@reddthat.com
            3 months ago

            Seemed that way when I was shopping last, but that was over a year ago so I can’t cite sources. Supposedly their low mode uses less power and runs faster than Intel’s. I can’t confirm the faster part but it definitely lasts longer on battery power than any of the Intel laptops I’ve owned.

  • Lizardking27@lemmy.world
    3 months ago

    Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.

    I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.

    I knew I should’ve been an engineer, how easy must it be to sit around and make shit that doesn’t work?

    Fucking despicable. Do better or die, manufacturers.

    • InputZero@lemmy.ml
      3 months ago

So this doesn’t apply to the Intel situation, but a good lesson to learn is that the bleeding edge cuts both ways. Meaning that for anyone buying the absolute latest technology, there’s going to be some friction with usability at first. It should never amount to broken hardware like the Intel CPUs, but buggy drivers for a few weeks/months are kinda normal. There’s no way of knowing what’s going to happen when a brand new product is released. The producer must do their due diligence and test for anything catastrophic, but weird things happen in the wild that no one can predict. Like I said at the top, this doesn’t apply to Intel’s situation because it was a catastrophic failure, but if you’re ever on the bleeding edge, assume eventually you’re going to get cut.

    • 31337@sh.itjust.works
      3 months ago

I’ve put together two computers in the last couple of years, one Intel (12th gen, fortunately) and one AMD. Both had stability issues, and I had to mess with the BIOS settings to get them stable. I actually had to underclock the RAM on the AMD (probably had something to do with maxing out the RAM capacity, but I still shouldn’t need to underclock, IMO). I think I’m going to get workstation-grade components the next time I need to build a computer.

      • Allonzee@lemmy.world
        3 months ago

        Capitalism: “Growth or die!”

        Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥

        It’s kind of gallows hilarious that for all the world’s religions worshipping ridiculous campfire ghost stories, we have a creator, we have a remarkable macro-organism mother consisting of millions of species, her story of hosting life going back 3.8 billion years, most living in homeostasis with their ecosystem.

But to our actual creator, Earth, not some fucking ridiculous work of lazy fiction, we literally choose, eyes wide open, to treat her like our property to loot, rape, and pillage thoughtlessly, and continue to act as a cancer upon her. We as a species are so fucking weird, and not the good kind.

      • volodya_ilich@lemm.ee
        3 months ago

Not really, and I say this as a communist myself. Capitalism just requires extracting the maximum profit from the capital investment; sometimes that leads to what you said, sometimes it leads to the opposite (e.g. little difference between a 1st-gen i5 and an 8th-gen i5).

    • Doombot1@lemmy.one
      3 months ago

      Most of the time, the product itself comes out of engineering just fine and then it gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes - in my mind, it’s more of how they’re handled by the company (oftentimes poorly). One of the products my team worked on a few years ago was one that required us to spin up our own ASIC. We spun one up (in the neighborhood of ~20-30 million dollars USD), and a few months later, found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were nearly going to release the product, we discovered a bad flaw in the new ASIC. The products worked for the most part, but of course not always, as the bug would sometimes get hit. My company did the right thing and never released the product, though.

      • /home/pineapplelover@lemm.ee
        3 months ago

It’s almost never the engineers’ fault. That whole NASA spacecraft that exploded was due to bureaucracy and pushing the mission forward.

    • Buddahriffic@lemmy.world
      3 months ago

It’s not easy to make shit that doesn’t work if you care about what you’re doing. I bet there are angry debates between engineers and business majors behind many of these enshittifications.

      Though, for these Intel ones, they might have been less angry and more “are you sure these risks are worth taking?” because they probably felt like they had to push them to the extreme to compete. The angry conversations probably happened 5-10 years ago before AMD brought the pressure when Intel was happy to assume they had no competition and didn’t have to improve things that much to keep making a killing. At this point, it’s just a scramble to make up for those decisions and catch up. Which their recent massive layoffs won’t help with.

  • littletranspunk@lemmus.org
    3 months ago

    Glad my first self-built PC is full AMD (built about a year ago).

    Screw Intel and Nvidia

    7700X is what it was built with

  • linkhidalgogato@lemmy.ml
    3 months ago

    im a fan of no corporation especially not fucking amd, but they have been so much better than intel recently that im struggling to understand why anyone still buys intel

  • gmtom@lemmy.world
    3 months ago

Can we talk about how utterly useless that default cooler is? For a relatively high-end gaming CPU, it really shouldn’t be legal to ship with something so useless.

  • angrystego@lemmy.world
    3 months ago

I thought the point would be a depressed and self-deprecating “I’m something of an Intel CPU myself”.

  • kamen@lemmy.world
    3 months ago

    Don’t be a fan of one or the other, just get what’s more appropriate at the time of buying.