Google's New Street View Camera is Creepy

GoogleStreetView.jpg
(Photo from Google)

Google has been capturing images of neighborhoods, roads, and unwilling joggers since 2007 with its Street View cars. Ten years later, Street View cars have captured more than 80 billion photos in thousands of cities across 85 countries. In July 2017, Google upgraded those cameras for the first time in eight years. Think about how far digital camera technology has advanced in that time. The images on Google Maps are about to get a lot clearer. Here's a before and after of the same location.

LaurelStreetBefore.jpg
(Laurel Street before on Google Maps)

LaurelStreetAfter.jpg
(Laurel Street after on Google Maps)

According to an article in Wired, written by Tom Simonite: 

...Google's new hardware wasn't designed with just human eyes in mind. The car-top rig includes two cameras that capture still HD images looking out to either side of the vehicle. They're there to feed clearer, closer shots of buildings and street signs into Google's image recognition algorithms. Those algorithms can pore over millions of signs and storefronts without getting tired. By hoovering up vast amounts of information visible on the world's streets--signs, business names, perhaps even opening hours posted in the window of your corner deli--Google hopes to improve its already formidable digital mapping database. The company, built on the back of algorithms that indexed the web, is using the same strategy on the real world.

Here's a link to a video describing the new technology. 

Is that something we really want? I realize the information is already out there, but it isn't categorized and stored that way yet. According to the article:

"Jen Fitzpatrick, the vice president who heads the company's maps division, blames that on us. "People are coming to us every day with harder and deeper questions," she says. The first time you searched Google Maps or Street View you probably typed in a street address--perhaps your own. Fitzpatrick says the company now gets tougher queries that require a fresher, more detailed digital model of the world, like "What's a Thai place open now that does delivery to my address?" She wants her service to handle queries that assume knowledge of what the world looks like: "What's the name of the pink store next to the church on the corner?" Google's push to get us talking with its Siri-style virtual assistant encourages us to be more conversational in our demands. "These are questions we can only answer if we have richer and deeper information," Fitzpatrick says.

I use Google Maps all the time. I even use it when I'm going to a city or place I've never been before, so I can virtually drive up and down the street and read the signs for parking. Those are useful things. But having it all connected to an algorithm so that when I scan past a restaurant the menu pops up is a little too Minority Report for me. This is the genesis of a future where we wear Amazon Glasses and, while watching something on Amazon Prime, can look at a product in the program, nod our heads, and have it delivered to our door. It's too convenient and too commercial. I guess I'm in the minority, however, as Google has begun certifying some 360 cameras as "Street View ready," allowing you to upload your own panoramas through the Street View mobile app to live on the company's service. So you can help erode your own privacy by volunteering more data to the system of your own volition.

Google has entire pages dedicated to Street View privacy, but if you glance over them they amount to "if you're on camera and don't want to be, you can ask to be removed." But how do you know you're on camera in the first place? I guess you could always use the facial recognition in Google Photos, which already groups pictures of people like you're in a CTU command center.

There's nowhere to hide. Better get used to it.


Source: WIRED