• M500@lemmy.ml · 23 days ago

    It’s because AI needs a lot of RAM. I think Apple did not expect or plan for AI, which shows in the fact that only the latest Pro phone can run Apple Intelligence. It’s because that phone has enough RAM.

    Now they will boost RAM across the board because Apple Intelligence will not run well without it.

    Depending on pricing, I may actually buy a MacBook in 2025.

    I’ve wanted one since the M1, but I’ve held out until 16 GB was the starting amount of RAM.

    • cm0002@lemmy.world · 23 days ago

      Or you could just get just about any other non-Mac system that lets you upgrade the RAM easily when you need to…

      Just stop supporting Apple’s soldered-in BS.

      • M500@lemmy.ml · 23 days ago

        I know what you mean, but I’m tired of Windows’ bullshit too.

        I’d keep PC hardware if my work could happen on Linux, but sadly that’s not an option at the moment.

      • TheGrandNagus@lemmy.world · edited · 23 days ago

        Bad news: literally all laptops with current-gen CPUs use soldered RAM.

        All of them. Every single one. No exceptions.

        Hopefully that’ll change, but as it stands right now, if you want the newest gen, you cannot get replaceable RAM.

        And even before current gen, the vast majority of Windows laptops were soldered too.

        • cm0002@lemmy.world · edited · 23 days ago

          I looked into it; yeah, current-gen chips aren’t compatible with SODIMM.

          That’s because they’re compatible with the brand-new removable RAM standard, CAMM2. It is confusing, though: everywhere I’ve looked, both soldered RAM and CAMM2 were listed as LPDDR5, which is what makes you think it’s just soldered RAM. So far it looks like if a spec sheet lists LPDDR5X, it should be CAMM2.

          CAMM2 is also very, very new, so I’m sure a few manufacturers, in their rush to get the new/current-gen chips out the door, just used soldered RAM.

          CAMM2 is very exciting: it eats into all of Apple’s listed pros for having soldered RAM as close to the CPU as possible (performance, efficiency, etc.) while still being user-removable.

        • AGuyAcrossTheInternet@fedia.io · 23 days ago

          I really don’t know where you’re looking, because I only see that in business-class laptops, and even then not all of them have soldered RAM.

          And I’m already counting the ones with one expansion slot among the soldered bunch.

          Of course, if you paid attention only to HP, Dell, and Lenovo, then I’d see why you’d think so. But beyond those brands, you don’t have that soldered nonsense everywhere. At the very least, you have the likes of Clevo and Framework to sell you laptops without soldered RAM.

          I bet there are even websites that let you filter out laptop models with soldered RAM. Personally, I only know of Germany-based sites like that, though.

          • TheGrandNagus@lemmy.world · edited · 23 days ago

            You are looking at previous-gen platforms.

            E.g. for Framework, you’re looking at APUs like the 7840U, which is not current gen; it’s two generations old (7840U/Phoenix > 8840U/Hawk Point > AI 9 365 (awful naming, btw, AMD)/Strix Point).

            Like I said, laptops with current-gen CPUs cannot use SODIMM.

            And let me be clear here: I’m not exaggerating for effect. I do not mean most of them. I do not mean the vast majority of them. I do not mean practically all of them. I literally mean all of them. 100% of them. Every single one that exists.

            AMD, Intel, and Qualcomm do not currently support SODIMM on their newest-gen mobile CPUs.

            I hope that changes, and I expect it eventually will, but as it stands right now, no, you cannot have SODIMM modules if you are buying a laptop with a newest-gen CPU.

            • AGuyAcrossTheInternet@fedia.io · 23 days ago

              Well fudge me sideways. Every day is a school day.

              They’ve all got LPDDR5, so yeah, you’re unfortunately right. It feels kinda weird having to consider the 7000- and 8000-series last gen already; true as it is, though.

              • cm0002@lemmy.world · 23 days ago

                Don’t worry, the latest chips were just built to handle only CAMM2, a new removable RAM standard that replaces SODIMM.

                It’s a bit confusing, though, because both soldered RAM and CAMM2 are listed as LPDDR5 on spec sheets. From what I’ve seen, if there’s an X at the end of the LPDDR5, it should be CAMM2.

                It’s also brand, BRAND new, so I’m sure quite a few manufacturers rushed the new chips out the door and just soldered on the RAM because they couldn’t get CAMM2 in in time for whatever reason.

      • bamboo@lemm.ee · 23 days ago

        I hate to be the bearer of bad news, but most thin-and-light laptops have had soldered RAM for many years now. There are exceptions, but they’re few and far between.

        • cm0002@lemmy.world · 23 days ago

          What? Lol, nah, plenty of laptops have removable RAM. Soldered RAM tended to show up on the “ultralight” tier, but outside of that and Chromebooks it’s been by no means the norm.

          • T156@lemmy.world · 22 days ago

            It has kind of come with newer laptops being driven to be thinner, and, for newer devices, with the old SODIMM format no longer being capable of the throughput/latency needed for higher-speed memory.

            From memory, 2.1 GHz is where SODIMM DDR5 caps out. Anything faster, like 2.8 GHz, either requires the memory to be soldered or needs one of the new formats, like the one Dell has started using.
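
            To put those clock figures in context, here’s a back-of-envelope sketch of standard DDR bandwidth math (two transfers per I/O clock, 64-bit bus per regular module); the SODIMM cut-off speeds themselves are just as quoted above, not something this verifies:

            ```python
            # DDR transfers data twice per I/O clock, and a standard (non-LP)
            # DDR5 module has a 64-bit (8-byte) data bus.
            def ddr5_bandwidth_gb_s(io_clock_ghz: float) -> float:
                transfers_per_s = io_clock_ghz * 1e9 * 2  # double data rate
                return transfers_per_s * 8 / 1e9          # bytes/s -> GB/s

            for clock_ghz in (2.1, 2.8):
                label = clock_ghz * 2 * 1000  # the "DDR5-xxxx" marketing number
                print(f"{clock_ghz} GHz I/O clock -> DDR5-{label:.0f}, "
                      f"~{ddr5_bandwidth_gb_s(clock_ghz):.1f} GB/s per module")
            ```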

            • boonhet@lemm.ee · 22 days ago

              The replacement you’re talking about is called [CAMM](https://en.wikipedia.org/wiki/CAMM_(memory_module)), and personally I’m excited about it. Not only does it support faster speeds than SODIMM, it takes up less physical space. And I believe you can’t even put LPDDR on a SODIMM, so CAMM should also use less power?

    • Echo Dot@feddit.uk · 23 days ago

      It’s a good comparison, actually, because Apple keeps saying that their RAM is faster because it’s soldered (which is true, but only if you squint). I don’t really think it makes a difference, because if you run out of space you still run out of space; the fact that you can access the limited space more quickly doesn’t really help.

      And phone RAM also tends to be soldered onto the board, so it’s a pretty good comparison.

    • stoy@lemmy.zip · 23 days ago

      I remember back in the early 2000s when I saw a PDA with a 232 MHz CPU and 64 MB of RAM, and I realized how far technology had come since I got my computer with a 233 MHz CPU and 64 MB of RAM…

      Obviously different architectures, but damn, that felt strange…

          • Kbobabob@lemmy.world · edited · 23 days ago

            I don’t think any of those are the base model. Anything with Pro or Ultra in the name should have more than 8 GB of RAM, in my opinion. The list also seems dominated by OnePlus, as the others listed are not really players in the larger market. You could possibly argue that Xiaomi is, but I’ve never even seen one of their phones in the real world. In fact, it looks like most of these are only available as China variants.

            • bruhduh@lemmy.world · 23 days ago

              I am writing this on my Xiaomi Poco X3 Pro; although it has 8 GB of RAM and 256 GB of storage, it also has a headphone jack and a microSD slot.

    • narc0tic_bird@lemm.ee · 23 days ago

      Yup. The current iPhone 15 Pro is the only model with 8 GB of RAM; the regular iPhone 15 has 6 GB. All iPhone 16 models (launching next month) will still only have 8 GB according to rumors, which happens to be the bare minimum required to run Apple Intelligence.

      Giving the new models only 8 GB seems a bit shortsighted and will likely mean that more complex AI models in future iOS versions won’t run on these devices. It could also mean that these devices won’t be able to keep many apps ready in the background while running an AI model in between.

      16 GB is proper future-proofing on Google’s part (unless they lock new software features behind newer models anyway down the road), and Apple will likely only gradually increase memory on their devices.

      • tankplanker@lemmy.world · 23 days ago

        If you were being cynical, you could say it’s planned obsolescence: when the new AI feature set rolls out, you have to get the new phone to use it.

        • nous@programming.dev · 23 days ago

          I would say it’s more so they can advertise a lower price, but then they expect you to get the more expensive ones, as the bare minimum is just not enough.

          • tankplanker@lemmy.world · 23 days ago

            For the base model, yeah, but Apple loves charging a packet for more memory, so I don’t see it for the top-of-the-range models. It would be typical of them to offer 16 GB only together with increased storage, just to bump the price up.

        • narc0tic_bird@lemm.ee · 23 days ago

          I think they got caught with their pants down when everybody started doing AI and they were like “hey, we have this cool VR headset”. Otherwise they would’ve at least prepared the regular iPhone 15 (6 GB) to be ready for Apple Intelligence. Every Apple Silicon device with 8 GB or more gets Apple Intelligence, so M1 iPads from 2021 get it as well, for example, even though the M1’s NPU is much weaker than some of the NPUs in unsupported devices with less RAM.

          They are launching their AI (or at least everything under the “Apple Intelligence” umbrella) with iOS 18.1, which won’t even release alongside the new iPhones, and it’ll be US-only (or at least English-only), with several of the features announced at WWDC still missing/coming later. It’s also unclear how they’ll proceed in the EU.

          • tankplanker@lemmy.world · 23 days ago

            Given how polished Apple’s AI on mobile was at launch compared to Gemini on Android at launch, where it could not even do basics like timers, I suspect Apple had it in the works for far longer, so it would not have been a total surprise.

            Also, you are describing the situation at launch for new hardware; the software will evolve every year going forward, and the requirements will likely increase every year. If I am buying a flagship phone right now, I want it to last at least 3 years of updates, if not 5. The phone has to be able to cope with what is a very basic requirement: enough RAM.

            This isn’t some NPU thing; it’s just basic common sense that more RAM is better for this, something the flagship iPhones could have benefited from for a while now.

            • narc0tic_bird@lemm.ee · 23 days ago

              I’m not sure if you’re agreeing or disagreeing with me here. Either way, hardware has a substantially longer turnaround time compared to software. The iPhone 15 would’ve been in development years before release (I’m assuming they’re developing multiple generations in parallel, which is very likely the case) and keep in mind that the internals are basically identical to the iPhone 14 Pro, featuring the same SoC.

              AI and maybe AAA games like Resident Evil aside, 6 GB seems to work very well on iPhones. If I had a Pixel 6/7/8 Pro with 12 GB and an iPhone 12/13/14 Pro (or 15) with 6 GB, I likely wouldn’t notice the difference unless I specifically counted the number of recent applications I could reopen without them reloading. 6 GB keeps plenty of recent apps in memory on iOS.

              But going with 8 GB in the new models, knowing that AI is a thing and that the minimum requirement for their first series of models is already 8 GB, isn’t too reassuring. I’m sure these devices will get 5-8 years of software updates, but new AI features might be reduced or not present at all on these models by then.

              When talking about “AI” in this context I’m talking about everything new under the “Apple Intelligence” umbrella, like LLMs and image generators. They’ve done what you’d call “AI” nowadays for many years on their devices, like photo analysis, computational photography, voice isolation, “Siri” recommendations etc.

              • tankplanker@lemmy.world · 22 days ago

                I was under the impression that iOS used sleight of hand to reduce the memory footprint of inactive apps, rather than managing them the way Android manages its recent apps list? Is it still requiring special permissions to run non-Apple apps in the background as active tasks? AI will need to run in the background and will need a decent chunk of RAM to do so.

                I completely agree that changing the processor or revising the NPU or similar is too much to do late stage; I reject that for increasing RAM or storage, as both can be changed closer than 12 months from release, and I would also reject the idea that Apple had the AI changes planned much less than 12 months out. It just feels like a big fuck-you to anybody buying a flagship from Apple this year, as it won’t last the length of time it should for normal consumers, who would expect all of the latest AI features to roll out during the supported window.

      • filister@lemmy.world · 23 days ago

        Pretty much what NVIDIA is doing with their GPUs: refusing to provide an adequate, future-proof amount of VRAM on their cards. That’s planned obsolescence in action.

        • TheGrandNagus@lemmy.world · 23 days ago

          And like Apple, Nvidia has no shortage of fanboys who insist the pitiful amounts of (V)RAM are enough. The marketing sway those two companies have is incredible.

          It’s a complete joke that Sapphire had an 8 GB version of the R9 290X, what, 11 years ago or something? And yet Nvidia is still selling 8 GB cards now, for exorbitant prices.

          • Petter1@lemm.ee · 23 days ago

            This is what happens when you sell your hardware as a DRM key for your software (i(Pad)OS, macOS, etc., and CUDA).

          • CheeseNoodle@lemmy.world · 23 days ago

            The current GPU situation actually has me curious about AMD’s upcoming Halo APU chips. They’re likely going to be pretty expensive relative to their GPU-equivalent performance, but if they work out similar to the combined price of a CPU and GPU, they might be worthwhile, as they use onboard RAM as their VRAM. Probably a crazy idea, but one I look forward to theory-building around in spring when they release.

      • Rai@lemmy.dbzer0.com · 22 days ago

        I don’t use Apple computers, but if we’re going into phones, iOS is extremely memory-efficient. I’m on a six-year-old XS Max with 4 GB, and it works like the day I got it, running circles around Android phones half its age.

  • padge@lemmy.zip · 22 days ago

    My sister just bought a MacBook Air for college, and I had to beg her to spend the extra money on 16 GB of memory. It feels like a scam that the “starting at” price makes the machine look cheap when nobody should actually go with those “starting at” specs.

    • Echo Dot@feddit.uk · 22 days ago

      Yeah, it’s about future-proofing. 8 GB might be okay for basic browsing and text editing now, but in the future that might not be the case. Also, in my experience, people who only want to do basic browsing and word processing inevitably end up wanting to do more complex things, without understanding that their device is not capable of them.

      • padge@lemmy.zip · 22 days ago

        Exactly. I told her that 8 GB might be fine for a year or two, but if she wants this thousand-plus-dollar laptop to last four years, she needs to invest the extra money now. Especially once she told me she might want to play Minecraft or Shadow of the Tomb Raider on it.

    • areyouevenreal@lemm.ee · 23 days ago

      The annoying thing is I have had people claim that 8 GB or 16 GB is fine on Apple and works better than on PC laptops. To the point that one redditor point-blank refused to believe I owned an Apple laptop. I literally had to take a photograph of said laptop and show it to them before they would believe me about the RAM capacity.

      • HauntedCupcake@lemmy.world · 23 days ago

        I own an 8 GB MacBook Pro for work. It’s definitely better than a PC with 8 GB of RAM, but not better than, or even close to, a PC with 16 GB. The amount of stutters/freezes while the swap file churns is insane.

        • areyouevenreal@lemm.ee · 23 days ago

          Maybe that’s true if you use Windows. If you use Linux on your PC versus macOS on a MacBook, you will probably find the PC performs comparably, if not better.

          • HauntedCupcake@lemmy.world · edited · 23 days ago

            Oh totally, Linux is in the same ballpark as, if not better than, Macs when it comes to RAM usage. Windows is just a hog

            • areyouevenreal@lemm.ee · 23 days ago

              We are talking about PC vs. Mac. Both have the same problem when it comes to Chromium-based things.

            • Echo Dot@feddit.uk · 23 days ago

              A Windows application and a Mac application will use pretty much the same amount of memory regardless of operating system.

              The real issue is how much memory the OS itself uses. Windows is a massive waste of RAM, but not enough to make any difference, certainly not between 8 GB and 16 GB. You’re still better off on a PC, then.

        • Echo Dot@feddit.uk · 23 days ago

          Obviously it depends on the situation, but sometimes it is worth talking to idiots, not because you have any chance of changing their mind, but just to demonstrate to everyone else in the thread that they are in fact an idiot. Just in case somebody thinks they have a point.

    • lengau@midwest.social · 23 days ago

      My Linux machine has 64 GiB of RAM, which is like 128 GiB of Mac RAM. It’s still not enough

      • areyouevenreal@lemm.ee · 23 days ago

        Serious question: what are you using all that RAM for? I am having a hard time justifying upgrading one of my laptops to 32 GiB, never mind 64 GiB.

        • tal@lemmy.today · edited · 23 days ago

          Any memory that’s going unused by apps is going to be used by the OS for caching disk contents. That’s not as significant with SSD as with rotational drives, but it’s still providing a benefit, albeit one with diminishing returns as the size of the cache increases.

          That being said, if this is a laptop and if you shut down or hibernate your laptop on a regular basis, then you’re going to be flushing the memory cache all the time, and it may buy you less.

          IIRC, Apple’s default mode of operation on their laptops these days is to just have them sleep, not hibernate, so a Mac user would probably benefit from that cache.
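
          As a rough illustration of the page-cache point, here’s a minimal sketch (Linux-only, since it parses /proc/meminfo; macOS exposes the same idea through different tools):

          ```python
          # Show how much RAM Linux is currently using as page cache.
          # /proc/meminfo lines look like "MemTotal:  16384256 kB".
          def meminfo_kb() -> dict:
              fields = {}
              with open("/proc/meminfo") as f:
                  for line in f:
                      key, value = line.split(":", 1)
                      fields[key] = int(value.split()[0])
              return fields

          m = meminfo_kb()
          gib = 1024 * 1024  # kB per GiB
          print(f"Total RAM:    {m['MemTotal'] / gib:.1f} GiB")
          print(f"Page cache:   {m['Cached'] / gib:.1f} GiB")
          print(f"MemAvailable: {m['MemAvailable'] / gib:.1f} GiB (counts reclaimable cache)")
          ```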

        • lengau@midwest.social · edited · 23 days ago

          For me in particular, I’m a software developer who works on developer tools, so I have a lot of tests running in VMs so I can test on different operating systems. I just finished running a test suite that used over 50 gigs of RAM across a dozen VMs.

          • InvertedParallax@lemm.ee · 22 days ago

            Same, 48c/96t with 192 GB of RAM.

            make -j is fun; htop triggers epilepsy.

            Few VMs, but tons of LXC containers. It’s like having one machine that runs 20 systems in parallel, and really fast.

            I have containers for dev, for browsing, for Wine: the dream finally made manifest.

        • Mistic@lemmy.world · 23 days ago

          For games, modding uses a lot. It can get to the point of needing more than 32 GB, but rarely.

          Usually, you’d want 64 GB or more for things like video editing, 3D modeling, running simulations, LLMs, or virtual machines.

          • areyouevenreal@lemm.ee · 23 days ago

            I use virtual machines and run local LLMs. LLMs need VRAM rather than CPU RAM; you shouldn’t be running them on a laptop without a serious NPU or GPU, if at all. I don’t know if I will be using VMs heavily on this machine or not, but that would be a good reason to have more RAM. Even so, 32 GiB should be enough for a few VMs running concurrently.

            • tal@lemmy.today · 23 days ago

              and run local LLMs.

              Honestly, I think that for many people, if they’re using a laptop or phone, doing LLM stuff remotely makes way more sense. It’s just too power-intensive to do a lot of that on battery. That doesn’t mean not controlling the hardware: I keep a machine with a beefy GPU connected to the network and can use it remotely. Something like Stable Diffusion normally requires only pretty limited bandwidth to use remotely.

              If people really need to do a bunch of local LLM work, say because they have a hefty source of power but lack connectivity, or because they’re running software that needs to move a lot of data back and forth to the LLM hardware, I think I might consider lugging around a small headless LLM box with a beefy GPU alongside a laptop, plugging the box into the laptop via Ethernet or whatnot, and doing the LLM work on the headless box. Laptops are just not a fantastic form factor for heavy crunching; they’ve got limited ability to dissipate heat and tight space constraints to work with.
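
              For what it’s worth, the client side of that remote setup can be tiny. A minimal sketch, assuming some self-hosted inference server on the LAN; the address, endpoint path, and JSON shape are hypothetical and depend entirely on whatever you actually run on the GPU box:

              ```python
              # Send prompts to a headless GPU box instead of inferencing on the
              # laptop; only the prompt and the result cross the network.
              import requests

              GPU_BOX = "http://192.168.1.50:8080/generate"  # hypothetical LAN address

              def remote_generate(prompt: str) -> str:
                  resp = requests.post(GPU_BOX, json={"prompt": prompt}, timeout=120)
                  resp.raise_for_status()
                  return resp.json()["text"]  # hypothetical response field

              # Usage: print(remote_generate("Hello from the laptop"))
              ```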

    • tb_@lemmy.world · 23 days ago

      Does it?

      Previous benchmarks have shown the 8 GB models falling seriously behind in performance.

      • Echo Dot@feddit.uk · 23 days ago

        Yeah, I think the joke just flew over your head.

        Apple keeps saying that their RAM is somehow magic and therefore better than Windows RAM, which is a claim that obviously makes no sense.

        • barsquid@lemmy.world · 23 days ago

          I think they are able to share it with the GPU or something? It is maybe slightly better but it sure as fuck is not 2x better.

          8 GB, even if it is “magic RAM,” is a joke amount and has been for a long time.

          • HauntedCupcake@lemmy.world · 23 days ago

            That’s just an APU; see consoles and laptops. The unified memory is basically just that, but Apple also claims that, with Apple Silicon having the storage controller on board, swap is magically faster 🤷

            Also, macOS and Linux use less RAM than Windows, which certainly helps.

            8 GB is “fine™” on a MacBook Air, but it’s criminal for a Pro machine, and it certainly should not cost £200 for an extra 8 GB. That’s genuinely insane pricing.

            • barsquid@lemmy.world · 23 days ago

              That’s the real issue, isn’t it? The upgrade prices are disconnected from reality by a lot. If they were within the realm of sanity, nobody would care much that the base is 8 GB.

              • Echo Dot@feddit.uk · 23 days ago

                I was saying this to my girlfriend when they first came out: the whole thing is completely out of spec for everyone, regardless of your use case.

                She really only wants it for playing The Sims, but you’ll run into RAM limitations there, and as you say, it’s not worth paying so much more just to get a device that’s actually functional.

                If you want to use it for basic word processing, then you really don’t need that level of latency and you really don’t need a CPU of that level of performance. You’re just paying for stuff you’re never going to use.

                If you want it for gaming, there isn’t enough memory to make it worthwhile.

                If you want it for intensive graphics editing work, then there really isn’t enough memory for that to work.

                If you want it for advanced computation, then you’re probably not going for a laptop anyway. The M2 chip is obsessed with preserving battery life, which is fine in a laptop, but for high-performance applications you just want it to use more power.

                If, for some bizarre reason, you wanted to do AI research on a laptop, it’s not too bad, but you’d still need the Pro version, and there are better things on the market; it wouldn’t be the worst, I guess.

                So outside of one very niche scenario, it’s practically a pointless device for 99% of the user base.

                In the end we got a Framework laptop, which is more than capable of doing what we wanted and didn’t cost anywhere near as much. Plus it basically looks like a MacBook, so even build quality wasn’t a concern. I got one too, for no particular reason, and it still ended up cheaper.

        • cheddar@programming.dev · 23 days ago

          Yeah I think the joke just flew over your head.

          I realize this should be a joke, but I am still unsure if it is.

          • Echo Dot@feddit.uk · edited · 23 days ago

            Memory is memory. If an application requires a lot of memory, then it really doesn’t matter what speed that memory is; it’s more important that there’s enough of it.

            There are plenty of applications that could theoretically run on the M2 MacBook in terms of processing capacity but can’t, because there isn’t enough RAM available. Or they run by swapping, which is super bad, because (a) it’s incredibly slow, and (b) it’s bad for the drive.
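
            A quick way to check whether a machine is leaning on swap like that, assuming the third-party psutil package (pip install psutil):

            ```python
            # Report RAM vs. swap usage; heavy swap use is the slow,
            # drive-wearing mode described above.
            import psutil

            ram = psutil.virtual_memory()
            swap = psutil.swap_memory()

            print(f"RAM:  {ram.used / 2**30:.1f} / {ram.total / 2**30:.1f} GiB ({ram.percent}%)")
            print(f"Swap: {swap.used / 2**30:.1f} / {swap.total / 2**30:.1f} GiB ({swap.percent}%)")
            ```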

          • HauntedCupcake@lemmy.world · edited · 23 days ago

            It is 100% a joke. Other than Windows being slightly more RAM-hungry, there’s not a huge difference between its RAM and a Mac’s.

    • T156@lemmy.world · 22 days ago

      Ironically, it’s the other way around, since Apple has to share their RAM between the GPU and the CPU, where other computers typically have them separate.

      So in normal usage with 8 GB, you’re automatically down to 7, since at least 1 GB will be taken by the graphics hardware. More if you’re doing anything reasonably graphics-heavy with it.

  • NicePool@lemmy.world · 23 days ago

    Isn’t Apple the company that charges $5k+ for 16 GB? All while intentionally deprecating the hardware within 2 years. /s

    I’ve had to support their products on a professional level for over a decade. I will NEVER buy an Apple product.

    • cm0002@lemmy.world · 23 days ago

      I’ve had to support their products on a professional level for over a decade.

      Their enterprise stuff…can only be described as a quintessential example of an ill-conceived, horrendously executed fiasco, so utterly devoid of utility and coherence that it defies all logic and reasonable expectation. It stands as a paragon of dysfunction, a conflagration of conceptual failures so intense and egregious that it resembles a blazing inferno of pure, unadulterated refuse. It is, in every conceivable sense, a searing, molten heap of garbage—hot, steaming, and reeking with the unmistakable stench of profound ineptitude and sheer impracticality.

  • Nomecks@lemmy.ca · 23 days ago

    Golly, thanks Apple. It’s not like I can go buy a 256 GB DIMM right now. 16 GB, what a joke.

      • T156@lemmy.world · 22 days ago

        These days, the CPU probably runs Linux on itself.

        Storage drive controller boards are basically small computers in their own right now.

        • boonhet@lemm.ee · 22 days ago

          Linux is a bit heavy for embedded stuff.

          Intel’s ME, for example, uses MINIX.

      • cmnybo@discuss.tchncs.de · 23 days ago

        It does make some things better, but there are a number of downsides too. The biggest downside is that it’s not practical to make the memory socketed because of the speed that’s required.

    • floofloof@lemmy.ca · 23 days ago

      It’s OK - for an extra $400 they’ll sell you one with an extra $50 worth of RAM.

        • ripcord@lemmy.world · 23 days ago

          I think they meant what the end user would NORMALLY pay, which is the better comparison.

          • JohnDClay@sh.itjust.works · 23 days ago

            But Apple isn’t buying consumer RAM; they’re spending $8 to put on a different chip instead. If other laptop manufacturers are charging $50, it’s because they think they can get away with it, like Apple.

                • sugar_in_your_tea@sh.itjust.works · 21 days ago

                  It’s really not. Other companies with socketed RAM also upsell, they are just limited in how much they can ask because the customer has the option to DIY adding more RAM. So the cost these companies charge is roughly the price to the customer of upgrading their own RAM, plus a bit extra for the convenience of not having to do that.

                  For example, Framework upcharges by something like 20-50% for RAM and SSDs compared to equivalent parts. It’s not just Apple; all OEMs do it, but Apple can charge much more because the user can’t easily replace either on their own.
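
                  Plugging in the ballpark numbers quoted in this thread (illustrative only, not exact pricing):

                  ```python
                  # Compare RAM-upgrade markups using figures cited upthread.
                  diy_retail = 50          # ~$ for the RAM if the user buys it themselves
                  framework_markup = 0.35  # midpoint of the ~20-50% upcharge mentioned above
                  apple_upcharge = 400     # the "$400 for $50 worth of RAM" example upthread

                  print(f"DIY:       ${diy_retail}")
                  print(f"Framework: ~${diy_retail * (1 + framework_markup):.0f}")
                  print(f"Apple:     ${apple_upcharge} ({apple_upcharge / diy_retail:.0f}x the DIY price)")
                  ```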

  • masterspace@lemmy.ca · edited · 23 days ago

    Let me know how many thousands of dollars it’s going to cost for a Max variant of the chip that can run three external monitors like it’s 2008.

      • masterspace@lemmy.ca · edited · 23 days ago

        Nope. All base Mx-series Macs can only support a single external monitor in addition to their internal one.

        Pro-series chips are professional enough that Apple deems your work worthy of using two (2) external monitors.

        Max-series chips are the only ones that have proved Maximum enough to Apple to be allowed three monitors.

        It’s honestly absurd. And none of them support DisplayPort’s alt mode, so they can’t daisy-chain between monitors, and they max out at 3, whereas an equivalent Windows or Linux machine could do 6 over the same Thunderbolt 3 connection.

        Windows and Linux machines also support sub-pixel text rendering, so text looks far better on 1080p and 1440p monitors.

        I have to use macOS for work, and while I’ve come to accept many parts of it and even like some, their external monitor support is just mind-numbingly bad.

        • brbposting@sh.itjust.works · 23 days ago

          sub pixel text rendering, so text looks far better on 1080p and 1440p monitors.

          Why would you need that? Buy an Ultra Pro Retina Max Display and please get the stand if you don’t want Apple to go out of business.

        • narc0tic_bird@lemm.ee · 23 days ago

          What you’re describing as “DisplayPort alt mode” is DisplayPort Multi-Stream Transport (MST). Alt mode is the ability to pass native DisplayPort stream(s) via USB-C, which all M chip Macs are capable of. MST is indeed unsupported by M chip hardware, and it’s not supported in macOS either way - even the Intel Macs don’t support it even though the hardware is capable of it.

          MST is nice for a dual WQHD setup or something (or dual UHD@60 with DisplayPort 1.4), but attempt to drive multiple (very) high resolution and refresh rate displays and you’ll be starved for bandwidth very quickly. Daisy-chaining 6 displays might technically be possible with MST, but each of them would need to be set to a fairly low resolution for today’s standards. Macs that support more than one external display can support two independent/full DisplayPort 1.4 signals per Thunderbolt port (as per the Thunderbolt 4 spec), so with a proper Thunderbolt hub you can connect two high resolution displays via one port no problem.

          I agree that even base M chips should support at least 3 simultaneous displays (one internal and two external, or 3 external in clamshell mode), and they should add MST support for the convenience to be able to connect to USB-C hubs using MST with two (lower-resolution) monitors, and support proper sub-pixel font anti-aliasing on these low-DPI displays (which macOS was perfectly capable of in the past, but they removed it). Just for the convenience of being able to use any random hub you stumble across and it “just works”, not because it’s necessarily ideal.

          But your comparison is blown way out of proportion. “Max” Macs support the internal display at full resolution and refresh rate (120 Hz), 3 external 6K 60Hz displays and an additional display via HDMI (4K 144 Hz on recent models). Whatever bandwidth is left per display when daisy-chaining 6 displays to a single Thunderbolt port on a Windows machine, it won’t be anywhere near enough to drive all of them at these resolutions.

          • masterspace@lemmy.ca · edited · 23 days ago

            Agreed. I typed quickly before bed and meant MST, not alt mode.

            But otherwise you’re just arguing that it’s not a big deal because ‘you don’t need any of these fancy features if you throw out your monitor every three years and buy new thousand-dollar ones’.

            For everyone who doesn’t want to contribute to massive piles of e-waste, we still have 1080p and 1440p 60 Hz monitors kicking around, and there is no excuse for a Mac to only be able to drive one of them, with crappy-looking text at that. It could easily drive 6 within the bandwidth of a 4K 120 Hz signal. Hell, it could drive 8 or more if you drop the refresh rate to 30.
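
            That arithmetic holds up as a rough sketch (uncompressed 8-bit-per-channel video, ignoring blanking overhead, which adds a little on top):

            ```python
            # Uncompressed video bandwidth: width * height * refresh * 24 bits/px.
            def gbit_s(w: int, h: int, hz: int) -> float:
                return w * h * hz * 24 / 1e9

            print(f"4K @ 120 Hz:      ~{gbit_s(3840, 2160, 120):.1f} Gbit/s")
            print(f"6x 1080p @ 60 Hz: ~{6 * gbit_s(1920, 1080, 60):.1f} Gbit/s")
            ```

            That works out to roughly 24 Gbit/s for one 4K 120 Hz stream versus about 18 Gbit/s for six 1080p60 streams, so the six older monitors are indeed the lighter load.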

            • narc0tic_bird@lemm.ee · 23 days ago

              I’m not generally arguing it’s not a big deal. I’m actually saying the regular M chips should be upgraded to M “Pro” levels of display support. But beyond two external displays, yes, I’m arguing it’s not a big deal, simply because >99% of users don’t want to use more than two external displays (no matter the resolution). Even if I had 6 old displays lying around I would hardly use more than two of them for a single computer. And as long as I’m not replacing all 6 displays with 6 new displays it doesn’t make a difference in terms of e-waste. On the contrary I’d use way more energy driving 6 displays simultaneously.

              I’m 100% with you that MST should be supported, but not because driving six displays (per stream) is something I expect many people to do, but because existing docking solutions often use MST to provide multiple (2) DisplayPort outputs. My workplace has seats with a USB-C docking station connected to two WQHD displays via MST, and they’d all need replacing should we ever switch to MacBooks.

              And sure, they should bring back proper font rendering on lower resolution displays. I personally haven’t found it to be too bad, but better would be … better, obviously. And as it already was a feature many moons ago, it’s kind of a no-brainer.

        • tal@lemmy.today · 23 days ago

          I guess you could get an eGPU. Probably not cheaper than just giving Apple their pound of flesh, though.

          • masterspace@lemmy.ca · 23 days ago

            You need to reread my comment, where I point out that it’s only the Max chips that can drive more than two external monitors.

            And bro, a cursory Google search would bring up this page from Apple, which confirms everything I wrote. A base M3 Mac can only drive two monitors if the internal display is closed, i.e. it can only drive one external monitor and one internal.

      • carleeno@reddthat.com · 23 days ago

        My last job issued me an M2 Air that could only power one external monitor. It was annoying as hell.