I don’t understand why Google is pushing cloud AI so much after promoting their Tensor AI chips for so long. iPhones have indexed pictures offline for years, but Google insists on doing this tagging in the cloud.
I have a copy of Photoprism running that does some (not great) indexing of my pictures, and it runs on half a CPU core on an old laptop. Surely all of these fancy Tensor processors and thousand-dollar phones can do that stuff while they’re charging overnight?
Money. The latest Pixel devices all have the same processor, but the features are gated in software depending on which model you bought.
That’s not entirely true. They may have the same processor, but they don’t have the same amount of RAM, which is actually critical for on-device AI tasks. Google recently brought all the features to the Pixel 8, but it took additional time to optimize for its 8 GB of RAM versus the 12 GB in the Pro.
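For a rough sense of why the RAM gap matters, here’s a back-of-envelope sketch. All of the numbers below (model size, quantization, overhead) are illustrative assumptions, not Google’s actual figures:

```python
# Rough back-of-envelope for why RAM matters for on-device models.
# All numbers are illustrative assumptions, not actual Pixel/Gemini figures.

def model_footprint_gb(params_billions: float, bytes_per_weight: float) -> float:
    """Approximate memory needed just to hold the model weights."""
    return params_billions * 1e9 * bytes_per_weight / 1024**3

# Example: a ~3B-parameter model quantized to 4-bit weights (0.5 bytes each).
weights = model_footprint_gb(3.25, 0.5)   # ~1.5 GB of weights
runtime_overhead = 1.0                    # KV cache, activations, buffers (assumed)
os_and_apps = 4.0                         # Android, system services, foreground apps (assumed)

print(f"weights:      {weights:.1f} GB")
print(f"total needed: {weights + runtime_overhead + os_and_apps:.1f} GB")
# On a 12 GB phone this leaves comfortable headroom; on 8 GB it's tight,
# which is consistent with the Pixel 8 needing extra optimization work.
```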