• Monkey With A Shell@lemmy.socdojo.com

        They were on Nexus for a while before that too. Now if we could get one that doesn't constantly scan virtually all activities for 'an improved user experience and personalization of the web' out of the box, that'd be a pretty awesome next step.

      • foggenbooty@lemmy.world

        The earlier phones just used off-the-shelf Snapdragons. It's only the 6 and up that use the Tensor chip.

        • *Tagger*@lemmy.world

          Oh yeah, I get that - I was just pointing out they don’t kill all of their projects after two years.

  • GenEcon@lemm.ee

    Weird article. They present a 3+ year roadmap for upcoming chips but at the same time call that 'life support'. Something doesn't add up.

    • ReallyActuallyFrankenstein@lemmynsfw.com

      They're claiming that after the Tensor 4 design, Google will be moving to a fully custom 3nm TSMC design.

      So the “life support” comment apparently applies to just their current Exynos-derived designs that will need to continue shipping until then, despite the designs being dated (i.e., on “life support”).

  • Max_Power@feddit.de

    Google makes 3 things: Android (including the Play Store), Google Mail, and Google Search.

    That’s it. Everything else from Google has its head already on the chopping block.

  • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world

    I think the problem is the conflicting goals that Google has with that chip. They want the chip to be able to run AI stuff locally with the Edge TPU ASIC that it includes, but at the same time Google also wants to make money by having Pixel devices offload AI tasks to the cloud. Google can’t reconcile these two goals.
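
    To make the "run AI stuff locally" part concrete, here's roughly what on-device inference through an Edge TPU delegate looks like. This is just a sketch using the Coral-style tflite_runtime Python API; the model file name is a placeholder, and an actual Pixel app would normally go through Android's ML/NNAPI stack instead.

    ```python
    # Rough sketch: local inference via an Edge TPU delegate (tflite_runtime).
    # The model file below is a placeholder, not a real shipped model.
    import numpy as np
    import tflite_runtime.interpreter as tflite

    interpreter = tflite.Interpreter(
        model_path="some_quantized_edgetpu_model.tflite",
        experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
    )
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Dummy input just to show the flow; a real app would feed camera/audio/sensor data.
    interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
    interpreter.invoke()
    print("top class:", int(np.argmax(interpreter.get_tensor(out["index"]))))
    ```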

    • Kushan@lemmy.world

      I don't think they're opposing goals. Google does not make more money from a task running in its cloud than on its devices; if anything, that costs them more money.

      • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world

        I think it's realistic to assume that Google is going to impose quotas on those "free" AI features that are running on the cloud right now and have people pay for more quota. It makes no economic sense for Google to keep offering those compute services for free. Remember Google Colab? It started completely free with V100 and A100 GPUs; now you have to pay just to keep using a simple T4 GPU without interruption.

        • Kushan@lemmy.world

          Google makes money from ads that they’re going to serve you no matter where they process your data.

          Google is going to pull all that metadata from your device regardless of where it was processed.

          Servers cost Google money to run. It costs them nothing to run something on your device. They clearly have a vested interest in running it on your device if they can.

    • nottheengineer@feddit.de

      There’s a solution: Charge the customer once for the hardware and then add a monthly fee to be able to use all of it. Sony and Microsoft have great success with that.

          • 520@kbin.social

            Lol you wut?

            Do you know how expensive conventional AI setups are? An unlocked AI chip on a phone would quickly replace Nvidia cards in the AI scene for low-level researchers, especially those dealing with sensitive data for whom cloud access is not viable.

            My laptop is $1500, and is just about viable for this kind of stuff. It took it three days non-stop to create a trading model for ~22 stocks, processing 10 years worth of data for each.

            Now maybe it doesn’t mean much for the consumer, that’s true. It means a hell of a lot for small time developers though, including those developing the apps consumers use.
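
            For scale, the kind of per-stock model I mean boils down to something like this (a heavily simplified scikit-learn sketch; the CSV file, ticker and features are made up, and the real run over ~22 stocks and 10 years of data each is what actually took three days):

            ```python
            # Heavily simplified sketch of a next-day direction model for one stock.
            # "AAPL_daily_10y.csv" and the features are purely illustrative.
            import pandas as pd
            from sklearn.ensemble import GradientBoostingClassifier
            from sklearn.model_selection import train_test_split

            df = pd.read_csv("AAPL_daily_10y.csv", parse_dates=["date"])
            df["ret"] = df["close"].pct_change()
            df["ma5"] = df["close"].rolling(5).mean() / df["close"] - 1
            df["ma20"] = df["close"].rolling(20).mean() / df["close"] - 1
            df["target"] = (df["ret"].shift(-1) > 0).astype(int)  # up or down tomorrow?
            df = df.dropna()

            X, y = df[["ret", "ma5", "ma20"]], df["target"]
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)

            model = GradientBoostingClassifier().fit(X_tr, y_tr)
            print("held-out accuracy:", model.score(X_te, y_te))
            ```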

  • soulfirethewolf@lemdro.id

    It is frustrating how the different parts of Google overlap with each other in ways that basically make them counteract each other.

      • HubertManne@kbin.social

        I worked for Abbott Labs and we made certain chemicals, but we did not discount our product for internal sales, so we usually bought the competitor's, which we could get a bit cheaper. Makes no sense to me that they would not sell internally at wholesale.

    • smoothbrain coldtakes@lemmy.ca

      All big tech should just be broken into parts. Glass-Steagall the fuckers. Google can have their adtech empire, strip everything else away from them though. Same with Microsoft. Fuck having like 18 departments under a massive main brand, break down the departments into their own corps.

  • drwankingstein@lemmy.dbzer0.com

    I think that in general, having NPUs on devices (which is what Google's TPU would be classified as) is such an underrated feature. There's actually a lot you can use them for. The main thing for me personally is definitely voice detection stuff. Although I have to admit, FUTO's voice detection, which uses Whisper, is really great. This reply is being crafted entirely using it.
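
    For anyone curious, local transcription with the openai-whisper package really is just a few lines. This is only a rough sketch; the model size and audio file name are placeholders, and an on-device keyboard obviously does this differently than a desktop Python script.

    ```python
    # Rough sketch of local speech-to-text with Whisper (pip install openai-whisper).
    # "voice_note.wav" is a placeholder recording.
    import whisper

    model = whisper.load_model("base")           # small enough for modest hardware
    result = model.transcribe("voice_note.wav")  # returns a dict with the full text
    print(result["text"])
    ```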

    Background noise removal definitely helps with speech detection, so it's really nice to have that too. And of course there are things like video processing and everything else Google is bragging about.

    I don't think these devices are incapable of doing what Google wants them to. I think it's a mixture of them simply not being fast enough to do it in real time, which Google needs for that premium feel, and Google just not wanting to invest the time in it.

  • ExtremeDullard@lemmy.sdf.org

    What a surprise… What isn’t run in the cloud these days? The cloud makes data collection and forced subscriptions easy.

  • Deftdrummer@lemmy.world

    It is interesting that the price has already dropped on the P8 and is expected to continue to drop through Black Friday.

    This is precisely why I never buy new anymore, having been burned by two previous Pixel releases in the past.

    This chip situation doesn’t bode well for my continued Pixel use however.

    Software is important, but it isn't everything, and like the article said, raw horsepower does matter. For me, the most important things are battery life, display brightness, and cellular connectivity, and my Pixel 6 Pro objectively fails on all three fronts.

    Combine that with all of the data theft that Google software utilizes, and I think I’m pretty much done.