Something strange, scary and sublime is happening to cameras, and it’s going to complicate everything you knew about pictures. Cameras are getting brains.
Until the past few years, just about all cameras — whether smartphones, point-and-shoots or CCTV surveillance systems — were like eyes disconnected from any intelligence.
They captured anything you put in front of them, but they didn’t understand a whit about what they were seeing. Even basic facts about the world eluded them. It’s crazy, for instance, that in 2018, your smartphone doesn’t automatically detect when you’ve taken naked pictures of yourself and offer to house them under an extra-special layer of security.
But all this is changing. There’s a new generation of cameras that understand what they see. They’re eyes connected to brains, machines that no longer just see what you put in front of them, but can act on it — creating intriguing and sometimes eerie possibilities.
At first, these cameras will promise to let us take better pictures, to capture moments that might not have been possible with any of the dumb cameras that came before. That’s the pitch Google is making with Clips, a new camera that went on sale on Tuesday. It uses so-called machine learning to automatically take snapshots of people, pets and other things it finds interesting.