I’m considering a stock Pixel 9, but I was wondering if anyone has specific critiques of Gemini’s privacy.

I use LLM bots frequently, but am a bit hesitant to let one take over my phone.

  • GenderNeutralBro@lemmy.sdf.org · 7 points · 4 months ago

    We’ve always safeguarded your data with an integrated stack of world-class secure infrastructure and technology, delivering end-to-end protection in a way that only Google can

    This is weaselly even by marketing standards. Most of Google’s services are still not end-to-end encrypted, and none of them were until recently. Oh, but they said “end-to-end protection”, which means absolutely nothing, so I guess it’s not technically a bald-faced lie.

    Anyway, aside from their attempt to rewrite history, it sounds a lot like Apple’s promise with their secure and verifiable AI servers. Google’s blog post references Project Oak, which I’m not intimately familiar with.

    I’m still skeptical of these supposedly-private cloud platforms, but I have a lot of reading to do if I want to develop an informed opinion.

  • boredsquirrel@slrpnk.net · 4 points · 4 months ago

    If it is private, then it is FOSS and should run on AOSP, and therefore also on GrapheneOS. In that case, fine. If not, then it is not private, because it is a black box.

  • Cris16228@lemmy.today · 2 points · 4 months ago (edited)

    Google/Gemini and privacy? Interesting… I don’t own a Pixel and I want to try one (with a custom ROM, because fuck Google), but I wouldn’t trust them lol

    On Android, we are seamlessly integrating the latest artificial intelligence (AI) capabilities, like Gemini as a trusted assistant – capable of handling life’s essential tasks. As such, ensuring your privacy and security on Android is paramount.

    Yeah sure, I can trust them now

  • boredsquirrel@slrpnk.net · 1 point · 4 months ago

    Is this the local AI that runs on the Tensor NPU?

    That is nowhere near what a full LLM needs, which is more than 6 GB of VRAM to run even remotely well.

    So I am curious what they will really do with that; I suspect it may be a marketing trick to make people believe everything runs locally, when it absolutely does not.
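
    For a rough sense of scale, here is a back-of-the-envelope sketch in Python; the parameter counts and quantization levels are illustrative assumptions, not specs of any actual Gemini model:

    def weight_memory_gb(params_billions, bytes_per_param):
        # Weights only; ignores the KV cache and activation overhead.
        return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

    for params in (3, 7, 13):                         # assumed model sizes, in billions of parameters
        for bits, label in ((16, "fp16"), (4, "int4")):
            gb = weight_memory_gb(params, bits / 8)
            print(f"{params}B {label}: ~{gb:.1f} GB for weights alone")

    Even a 7B model at fp16 is past 6 GB before you count the KV cache, which is presumably why anything that truly runs on-device has to be much smaller and heavily quantized.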

  • Avid Amoeba@lemmy.ca · 1 point · 4 months ago (edited)

    If you’re planning to use a cloud LLM, this is probably a better option than, say, Apple’s, which outsources to another provider (OpenAI). At least with Google, your data stays within the company that made and sold you the device. I don’t trust any of them, but one of them already has my data, and sharing it with another only increases the probability of privacy loss.
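
    A toy way to see that last point (the per-provider leak probability below is a made-up number, purely illustrative):

    p = 0.05                            # assumed chance any single provider mishandles your data
    one_provider = p                    # exposure if only Google holds it
    two_providers = 1 - (1 - p) ** 2    # exposure if a second provider also gets a copy
    print(one_provider, two_providers)  # 0.05 vs. 0.0975, strictly higher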