New research published by the Georgia Institute of Technology reveals that people will follow robots in an emergency, even when the robots lead them into dangerous situations. The research may be important, but it is hard to call it surprising. After all, one incident this week served as a painful reminder of the price we pay for putting our trust in technology and the companies that stand behind it.
This is not only about the soldiers who mistakenly entered the Qalandiyah refugee camp and miraculously made it out alive. The soldiers claimed that they were relying on the traffic app Waze, and that the application misled them. This could have happened in many ways: bad GPS reception, a misunderstanding of the application's instructions, or the algorithm deciding that the best route passed through Qalandiyah. All of these, however, are matters between man and application. Technology, just like people, is imperfect, and the connection between the two even less so. A small glitch in the matrix, and the results can be catastrophic.
The moment we were reminded of the great danger in blind reliance on technology, and especially in those behind it, came when the company responded to journalists — and blamed the soldiers for not following the app's instructions.
"[Waze] includes a specific default setting that prevents routes through areas which are marked as dangerous or prohibited for Israelis to drive through,” the company said. “In this case, the setting was disabled. In addition, the driver deviated from the suggested route and, as a result, entered the prohibited area.”
What's amazing is that Waze answered a question that was never asked. In the email I sent, I relied on a Facebook post by journalist Haim Har-Zahav, which referenced earlier reports, such as those on Ynet, that Waze had changed the navigation settings for Israeli users following the rise in tensions in East Jerusalem. Har-Zahav blamed Waze for reverting those settings after pressure from right-wing politicians such as Jerusalem Mayor Nir Barkat. Either way, it is clear from his text that the default settings are supposed to prevent entry into Palestinian areas.
The problem is that in its haste to defend itself against accusations that it had endangered the soldiers, Waze did not hesitate to reveal private information about its users, in contradiction of its own user agreement and of countless declarations by the company and its parent, Google.
Attorney Yonatan Klinger examined whether the user agreement's privacy clause covers the company in this case, and concluded that it only allows transferring information to law enforcement authorities.
Blogger and journalist Ido Kenan, who also wrote about the incident on his "Room 404" blog, remarked that the details of the response show that Waze takes the least effective approach to protecting user privacy: it retains users' identifiable information (as its user agreement notes) rather than deleting it, or at least encrypting it and leaving the decryption keys in the users' hands.
In other words, the next time you see a screen opening with the words "Waze Mobile Limited respects your privacy," or the next time Google (or its umbrella group Alphabet) blames the American government for risking its users' privacy, remember that its subsidiary did not hesitate for a second to throw its users' privacy out the window the moment it was convenient for it to do so. And the next time you wonder what you have to hide, remember that even a trivial detail like the route of a journey on Waze can be used against you if you become a problem for a giant company.
Waze was asked yesterday who permitted it to release private information about the soldiers’ use of the application. It has yet to respond.