Sunday, May 21, 2017

Google, A.I. and the rise of the super-sensor

Originally shared by Mike Elgan

(Read my column: https://goo.gl/Xz61sO)

Google dazzled developers this week with a new feature called Google Lens.

Appearing first in Google Assistant and Google Photos, Google Lens uses artificial intelligence (A.I.) to identify exactly what appears in the frame of a smartphone camera.

Google Lens is shiny and fun. But judging from the media commentary that followed, its real implications were largely lost.

The common reaction was: "Oooh, look! Another toy for our smartphones! Isn't A.I. amazing!" In reality, Google showed us a glimpse of the future of general-purpose sensing. Thanks to machine learning, it's now possible to create a million different sensors in software using only one actual sensor -- the camera.

In Google's demo, it's clear that the camera functions as a "super-sensor." Instead of a dedicated flower-identification sensor, a bar-code reader and a retail-business identifier, Google Lens is one all-purpose super-sensor whose software-based, A.I.-fueled "virtual sensors" run either locally or in the cloud.
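To make that concrete, here is a minimal sketch (in Python) of the pattern: one physical camera feeding any number of software-defined "virtual sensors," each of which is just a different query against the same shared vision model. The model stub, labels and thresholds below are hypothetical illustrations of the idea, not Google Lens's actual code or API.

# One physical sensor (a camera), many software-defined "virtual sensors".
# The classify() stub, labels and thresholds are hypothetical stand-ins,
# not Google Lens internals.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Detection:
    label: str
    confidence: float

def classify(frame: bytes) -> List[Detection]:
    # Stand-in for an A.I. vision model running locally or in the cloud.
    # Stubbed out so the sketch runs end to end; a real system would return
    # every labeled object the model recognizes in the frame.
    return [Detection("flower", 0.93)]

def make_virtual_sensor(target: str, threshold: float = 0.8) -> Callable[[bytes], bool]:
    # Build a "virtual sensor" in software: it fires whenever the shared
    # model sees the target object in a frame with enough confidence.
    def sensor(frame: bytes) -> bool:
        return any(d.label == target and d.confidence >= threshold
                   for d in classify(frame))
    return sensor

# Each new "sensor" is just another software query on the same camera.
flower_sensor = make_virtual_sensor("flower")
barcode_sensor = make_virtual_sensor("barcode")
storefront_sensor = make_virtual_sensor("retail storefront")

frame = b"...raw camera bytes..."
print(flower_sensor(frame))   # True: the flower "sensor" fired
print(barcode_sensor(frame))  # False: no barcode seen in this frame

The point of the sketch is the economics: adding another "sensor" is one more line of software against the same camera, not another piece of hardware.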

And that's not the only super-sensor Google is involved with.

This is a new world:

http://www.computerworld.com/article/3197685/internet-of-things/google-a-i-and-the-rise-of-the-super-sensor.html

#supersensor


In 1976 (yes, 1976), I heard my professor, one Don Norman, say pretty much the same thing. https://www.fastcompany.com/90202172/why-bad-tech...