Google can’t make a move in 2025 without veering into the realm of generative AI, and the release of the Pixel 9a is no exception. Curiously, the AI experience on this phone may not match what you’ve seen from the company’s high-end smartphones. Google has confirmed to Ars that the phone’s lower memory prevented it from implementing the full suite of Pixel AI features. You can still talk to Gemini by holding the power button or opening the Gemini app, but the on-device Gemini Nano model has seen a downgrade on the 9a.

  • base@lemmy.world · 22 points · 23 hours ago

    The stagnant RAM/storage situation weirds me out. Don't get me wrong, 6/8 GB of RAM and 128 GB of storage served, serves, and will serve me well. But damn, I bought a phone in 2019 for 300€ with the same specs, and 6 years later I'm expected to pay almost double the price for the same. Doesn't make any sense to me from a consumer perspective. Yeah, the SoC is faster, the cameras are a bit better, and the screen gets brighter. But there is no reason for me to upgrade unless support ends for my current phone or it just breaks.

    The only compelling feature I see in the foreseeable future is the whole desktop mode and virtualization thing. If done competently, I would pay for that, since I've dabbled with the idea a lot in the last few years; it was just shit from the software-stack standpoint. But then give me 12-16 GB of RAM and at least 512 GB of storage for, let's say, 600€ or less.

    Oh yeah, this article was about Gemini AI. Well, I don't give a shit.

    • Onomatopoeia@lemmy.cafe · 3 points · edited · 11 hours ago

      I want proper use of swap/cache by default, not something I have to root to get.

      Why is my browser reloading pages because I switched apps, on a 6 GB phone?

      I could prevent this on a 2 GB phone in 2017, with root, by configuring it properly.
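
      As a minimal sketch of what that kind of root-only tuning can look like: the sysctl knob and zram device below are standard Linux interfaces, but the exact paths, whether zram is compiled in, and whether `su` is available all depend on the device, so treat this as an outline rather than a recipe. It assumes a rooted phone connected over adb and that zram0 is not already in use.

```python
import subprocess

def adb_su(cmd: str) -> str:
    """Run a shell command as root on the connected device via adb."""
    result = subprocess.run(
        ["adb", "shell", "su", "-c", cmd],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Prefer compressing/swapping idle pages over killing cached apps.
adb_su("echo 100 > /proc/sys/vm/swappiness")

# Give the device a 1 GiB zram swap device. /dev/block/zram0 is common on
# Android kernels, but the path and zram availability vary by vendor, and
# the disksize write fails if zram0 is already active.
adb_su("echo 1073741824 > /sys/block/zram0/disksize")
adb_su("mkswap /dev/block/zram0")
adb_su("swapon /dev/block/zram0")

# Confirm the swap device shows up.
print(adb_su("free -m"))
```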

    • ChaoticNeutralCzech@feddit.org · 9 points · edited · 1 day ago

      The client app is more lightweight than a web browser, so getting it that low is just not practical.

      Also, in what universe is it easier to change a phone’s RAM than to disable apps or get a custom ROM?

    • henfredemars@infosec.pub (OP) · 7 points · 1 day ago

      Totally get your vibe. I don’t want AI BS, and it’s a huge waste of my memory and battery. Keep it simple and stupid, please.

    • skuzz@discuss.tchncs.de · 3 points · 23 hours ago

      Probably around 3 GB. I have some cheaper Motorolas with that much; usable, but a lot of app reloading. Also, stick to older, slower phones to avoid on-device AI. Otherwise: rooting, GrapheneOS, etc.

    • CallateCoyote@lemmy.world · 2 points · 1 day ago

      Seriously, it's trash. I heard they're replacing Google Assistant with it, but it can't even handle "play music" and "navigate home" when I'm driving.

      I have no issues opening up GPT or DeepSeek if I need to ask AI a question about the weird thing on my peepee sack. It’s just a skin tag.

  • FiveMacs@lemmy.ca · 39 points · 1 day ago

    You'd think these companies forcing all this AI bullshit would also ensure their hardware could actually run all the data-harvesting crap without making their devices trash.

    • DaGeek247@fedia.io · 2 points · 1 day ago

      Oh man, have you seen the state of new -INSERT MODERN DEVICE HERE- lately? Apparently nobody got the memo.

  • Zenodyne@lemmy.world · 11 points · 1 day ago

    Makes me more likely to consider it, honestly. I don't need much RAM on my phone, and I would be turning off/avoiding Gemini anyway.

  • RegalPotoo@lemmy.world · 9 points · 1 day ago

    So this whole Gemini thing is a tactic to push people to upgrade their phones again, right? They gave up on the whole "your phone is 6 months old and therefore won't be getting security updates anymore, so you need to buy a new phone with identical specs, otherwise hackers are going to break into your bank account and set your dog on fire" because regulators were starting to get twitchy. So now it's "your phone is brand new but you didn't spend enough money on it, so you'd better buy a new phone or you won't be able to have a sentient assistant to help you do your job and manage your life, and you'll be passed over for promotion by a 16-year-old AI Native and never get a date, and your family will be angry at you because Aunt Mildred doesn't like fish and you booked family dinner at the wrong restaurant."

  • Ulrich@feddit.org · 10 up / 1 down · 1 day ago

    You'd think they'd just spend the extra few bucks on RAM instead of spending probably countless hours making this new tiny model.

    • jcarax@beehaw.org · 3 points · 21 hours ago

      Well, they probably want a leaner version for lower-end phones anyway, along the lines of the Go versions of many of their apps. Luckily I won't have to worry about this shit, running GrapheneOS with no intention of running an LLM, so 8 GB would be fine if I had any need to move on from my Pixel 8 prematurely.

      Hey, maybe it’ll cause some fairly quick, large discounts. My Pixel 5 backup with a rather shattered screen could use a replacement.

    • QuadratureSurfer@lemmy.world · 2 points · 1 day ago

      Yeah, they definitely should focus on adding more RAM to their flagship phones, but smaller models are ideal for smartphones.

      Smaller models are quicker to run and use less battery. Besides, if they're going to ship AI models, I'd rather they run locally than have my data uploaded to some server somewhere (big assumption that they wouldn't do that anyway, I know). Or at the very least, I can still run it even if the network goes down.
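
      A rough back-of-the-envelope look at why a smaller model matters on an 8 GB phone; the parameter counts and quantization widths below are illustrative assumptions, not published figures for Gemini Nano:

```python
def model_ram_gib(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough resident-memory estimate for an on-device LLM.

    params_billion:  model size in billions of parameters (assumed values below)
    bits_per_weight: quantization width, e.g. 4 or 8
    overhead:        fudge factor for KV cache, activations, and runtime buffers
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

# Hypothetical sizes: a "full" Nano-class model vs. a trimmed-down variant,
# both 4-bit quantized.
for name, params in [("larger on-device model", 3.25), ("smaller on-device model", 1.8)]:
    print(f"{name}: ~{model_ram_gib(params, 4):.1f} GiB resident")
```

      Under these assumptions the gap is roughly a gigabyte of resident memory, which matters when the OS, the foreground app, and cached apps all have to share the same 8 GB; that is presumably why a leaner variant is worth the engineering time.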

      • Ulrich@feddit.org · 3 points · 24 hours ago

        I understand that, but they're making an entire model just for this one device?