Popular YouTuber @Mrwhosetheboss has singled out the Google Tensor G3 as a major failing of the Pixel 8 Pro. According to the YouTuber, all of the new generative AI features found on the Pixel 8 Pro cannot be processed onboard the device, but need to be off-loaded to the cloud for processing, despit...
And a lot of those require models that are multiple gigabytes in size, which then need to be loaded into memory and processed on a high-end video card that would generate enough heat to ruin your phone's battery if you could somehow shrink it to fit inside a phone. This just isn't feasible on phones yet. Is it technically possible today? Yes, absolutely. Are the tradeoffs worth it? Not for the average person.
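The "multiple gigabytes" point is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming an illustrative 7-billion-parameter model (the parameter count and precisions here are assumptions for the sake of the math, not figures for any specific Google model):

```python
# Back-of-envelope: RAM needed just to hold a model's weights.
# The 7B parameter count below is an illustrative assumption.

def weights_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Size of the weights alone, in GiB, at a given precision (2 = fp16)."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# A ~7B-parameter model at fp16:
print(f"{weights_gb(7):.1f} GiB")          # prints 13.0 GiB
# Even aggressively quantized to 4 bits per parameter:
print(f"{weights_gb(7, 1) / 2:.1f} GiB")   # prints 3.3 GiB
```

Either figure is before activations, the OS, and every other app — which is why weights of this size get run on servers rather than in a phone's shared RAM.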
He's inaccurate, but he's not wrong. It takes a lot of power, relatively speaking, and phones are very power-constrained devices.
I think the biggest misunderstanding here is that people are reading the headline as if all AI tasks are offloaded to the cloud, when it's just the heavy ones. Still a privacy issue for sure, but there's plenty of nuance there that people won't understand.
You can, for example, run some upscaling models on your phone just fine (I mentioned the SuperImage app in the photography tips megathread). Yes, the most powerful and memory-hungry models need more RAM than your phone can offer, but it's a bit misleading if Google doesn't say which ones are being run in the cloud.
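To illustrate what an on-device upscaler is doing: real apps like SuperImage run a learned model, but the simplest baseline such models are meant to beat is plain nearest-neighbour interpolation, which any phone can do trivially. A toy sketch (the pixel values and function are purely illustrative):

```python
# Minimal illustration of the kind of work an image upscaler performs.
# Learned models (e.g. what SuperImage ships) predict plausible detail;
# this is just the dumb nearest-neighbour baseline they compete against.

def upscale_nearest(img, factor=2):
    """img is a 2D list of pixel values; returns it scaled up by `factor`."""
    return [
        [img[y // factor][x // factor]
         for x in range(len(img[0]) * factor)]
        for y in range(len(img) * factor)
    ]

tiny = [[0, 255],
        [255, 0]]
for row in upscale_nearest(tiny):
    print(row)
```

The gap between this and an ML upscaler is model size: the interpolation above needs no weights at all, while a learned model needs its weights in RAM — which is exactly the on-device constraint being discussed.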
Isn't that kinda the dream? We'd have devices that remote the OS, so we get a super powerful machine that keeps getting updated and upgraded. We just need a receiver?
Isn't that what we want? It cuts down the bulk on devices: just a slab with a battery, a screen, a modem, and an SoC that can power the remote application?
Sometimes that's what people dream about. On the other hand that hybrid cloud model is giving up the last remnants of computing autonomy and control over devices we own.
It would be, if Google weren't constantly killing things that don't do well enough, especially given how expensive generative AI can be to run remotely. Just look at what happened with Stadia.
Also, it just feels disappointing. Ever since ChatGPT, they've been pouring a near-infinite budget into stuff like this, hiring the top talent and working them into the late hours of the night. And the best they could come up with for the Pixel 8 is feeding it data from the cloud.
And I can't even really believe the whole "consumer hardware isn't powerful enough" thing, given that there are quite a few ARM chip makers, Apple especially, that have been able to build consumer hardware capable of running generative AI (I've personally been able to run Stable Diffusion and Whisper on my M1 MacBook). Maybe not at the scale or quality of the cloud, but still capable of doing so regardless.
I mean, it sucks for offline situations or for actions that need very low latency.
But the phone's battery would be happier if the processing is not done locally.
For some things I prefer to do the work locally, for other things in the cloud.
Some cloud-vs-local examples I'm thinking about:
For example (not related to the Pixel): if I'm generating an image with Stable Diffusion at home, I prefer to use my RX 6800 on my own local Linux box rather than a cloud machine with a subscription.
But if I had to do the same on a mobile phone with tiny processing power and battery capacity I'd prefer to do it on the cloud.
(Another, non-AI example): for gaming, I prefer to run things natively, at least until it's possible to stream a game without added latency. Obviously I don't play games on a phone.
Another example (Google Photos): I would prefer to connect Google Photos to a local NAS at home, "own the storage on premise", and then pay a lower fee to Google Photos for the extra services it brings: browsing, face and object recognition, etc. But this is not an option. With it, I'd be able to invest in my own storage, even if I had dozens of gigabytes of 4K60FPS videos.
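The cloud-vs-local tradeoff running through the examples above can be put into rough numbers. A minimal sketch, where every constant is an illustrative assumption (not a measurement of any real phone or service):

```python
# Rough cloud-vs-local tradeoff for one generative-AI request on a phone.
# All wattages, durations, and the RTT below are made-up assumptions.

def local_cost(compute_s: float, soc_watts: float):
    """Latency (s) and battery energy (J) for on-device inference."""
    return compute_s, compute_s * soc_watts

def cloud_cost(compute_s: float, rtt_s: float, radio_watts: float):
    """Latency includes the network round trip; the battery only
    pays for the radio while the request is in flight."""
    latency = rtt_s + compute_s
    return latency, latency * radio_watts

# Assumed: 10 s on a 5 W phone SoC, vs 1 s on a server GPU behind a
# 0.1 s round trip with the radio drawing 1 W.
print(local_cost(10, 5))      # (10, 50): slower, 50 J from the battery
print(cloud_cost(1, 0.1, 1))  # faster and far cheaper on battery
```

Under these assumptions the cloud wins on both latency and battery, which matches the comment above about the battery being "happier" — but flip the assumptions (no connectivity, or a chatty radio on a weak link) and local wins, which is the whole nuance of the thread.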
Funny you should say that. I'm basing this on GeForce, which I use exclusively. It runs games from the cloud; I play Valhalla on my TV. I live rural and don't have fibre.
Yes, you read that right. No fibre.
I recently got starlink and have been loving life. Living on a farm playing GeForce games. I loved stadia and bought into the whole thing. Raged that they cancelled it. But I got my money's worth.
Now, it's not perfect, but I live in rural NZ, so I'm streaming from Aus to use GeForce, since GeForce isn't available natively in NZ.
It's insane how good this cloud gaming is.
So fuck Google, but the first company to get a phone running on remote compute, I will be buying from.
Stadia was great. People were just attacking it; bad marketing from the get-go. Have you tried it? Did you just follow what everyone else says, or actually try things for yourself?
This just strengthens the argument to install privacy/security-first operating systems like CalyxOS and GrapheneOS. I don't want a phone that's more a service subscription than it is hardware. I have the Pixel 8 and didn't get the Pro, due to the offloading to Google servers for some "features".
Just waiting for GrapheneOS to be released for the 8... Until then, I'm sitting uncomfortably, knowing my phone is uploading telemetry to Google servers...
I've bitten the bullet and replaced my Redmi 13 with a Pixel 7 + GrapheneOS, because the MIUI spyware and Google spyware are just too much...
I still don't get why, as the owner of a phone, I'm not the owner of the phone.
Yeah, I was tempted for a second to get it, but there was a comment saying it "should" be possible to upgrade to the released version when available, and I don't want to have to mess around in the event that it's not. So I'm just going to wait a week or so.
When some company praises the ground-breaking AI capability of a new SoC they have built, you might get the idea that it's doing these tasks on said SoC.
Why would you think otherwise?
A list of what this phone does offline and what not would be great.
HA. So that means most Pixel features could be Android features if Google wanted. Which means the reason Google went with a new Tensor chip was not "state of the art AI" but so they could use their own silicon and not have to pay another company.