How I Became a Dreaded Zionist Robotic Spy, or – Why We Need a Privacy Standard for Robots

It all began in a horribly innocent fashion, as such things often do. The Center for Middle East Studies at Brown University, near my home, held a “public discussion” about the future of Palestinians in Israel. Naturally, as an Israeli living in the States, I’m still very much interested in this area, so I took a look at the panelist list and discovered immediately that they all came from the same background and shared the same point of view: Israel was the colonialist oppressor, and that was pretty much all there was to it.


Quite frankly, this seemed bizarre to me: how can you have a discussion about the future of a people in a region without understanding the complexities of their geopolitical situation? How can you talk about the future in a war-torn region like the Middle East when nobody speaks about security issues, or conveys the state of mind of Israeli citizens or their government? In short, how can you have a discussion when all the panelists say exactly the same thing?

So I decided to do something about it, and therein lies my downfall.

I am the proud co-founder of TeleBuddy – a robotics services start-up company that operates telepresence robots worldwide. If you want to reach somewhere far away – Israel, California, or even China – we can place a robot there so that instead of wasting time and health flying, you can just log into the robot and be there immediately. We mainly use Double Robotics’ robots, and since I had one free for use, I immediately thought we could use the robots to bring a representative of the Israeli point of view to the panel – in a robotic body.

Things began moving in a blur from that point. I obtained permission from Prof. Beshara Doumani, who organized the panel, to bring a robot to the event. StandWithUs – an organization that disseminates information about Israel in the United States – graciously agreed to send a representative by the name of Shahar Azani to log into the robot, and so it happened that I came to the event with possibly the first-ever robotic diplomat.


Things went very well at the event itself. While my robotic friend was not allowed to speak from the stage, he talked with people in the venue before the event began and had plenty of fun. Some of the people at the event seemed excited about the robot. Others were reluctant to approach him, so he talked with other people instead. The entire thing was very civil, as other participants in the panel later remarked. I really thought we had found a good use for the robot, and even suggested to the organizers that next time they could use TeleBuddy’s robots to ‘teleport’ a different representative – maybe a Palestinian – to their event. I went home happily, feeling I had made just a little bit of a difference in the world and contributed to an actual discussion between the two sides in a conflict.

A few days later, Open Hillel published a statement about the event, as follows –

“In a dystopian twist, the latest development in the attack on open discourse by right-wing pro-Israel groups appears to be the use of robots to police academic discourse. At a March 3, 2016 event about Palestinian citizens of Israel sponsored by Middle East Studies at Brown University, a robot attended and accosted students. The robot used an iPad to display a man from StandWithUs, which receives funding from Israel’s government.

Before the event began, students say, the robot approached students and harassed them about why they were attending the event. Students declined to engage with this bizarre form of intimidation and ignored the robot. At the event itself, the robot and the StandWithUs affiliate remained in the back. During the question and answer session, the man briefly left the robot’s side to ask a question.

It is not yet known whether this was the first use of a robot to monitor Israel-Palestine discourse on campus. … Open Hillel opposes the attempts of groups like StandWithUs to monitor students and faculty. As a student-led grassroots campaign supported by young alumni, professors, and rabbis, Open Hillel rejects any attempt to stifle or target student or faculty activists. The use of robots for purposes of surveillance endangers the ability of students and faculty to learn and discuss this issue. We call upon outside groups such as StandWithUs to conduct themselves in accordance with the academic principles of open discourse and debate.”


I later met, by chance, some of the students who had been at the event, and asked them why they believed the robot was used for surveillance, or to harass students. In return, they accused me of being a spy for the Israeli government. Why? Obviously, because I had operated a “surveillance drone” on American soil. Perfect circular logic.


Lessons

There are lessons aplenty to be drawn from this bizarre incident, but the one that strikes me in particular is that you can’t easily ignore existing cultural sentiments and paradigms without taking a hit in the process. The robot was obviously not a surveillance drone, nor meant for surveillance of any kind, but Open Hillel managed to rebrand it by relying on fears that have deep roots in the American public. They did it to promote their own goal of getting some PR, and they did it so skillfully that I can’t help but applaud them for it. Quite frankly, I wish their PR people were working for me.

That said, there are issues here that need to be dealt with if telepresence robots are ever to become part of critical discussions. The fear that the robot may be recording or taking pictures at an event is justified – a tech-savvy person controlling the robot could certainly find a way to do that. However, I can’t help but feel that there are far simpler ways to accomplish the same thing, such as using one’s smartphone or a covert lifelogging camera like the Memoto. If you fear being recorded in public, you should know that telepresence robots are probably the least of your concerns.


Conclusions

The honest truth is that this is a brand new field for everyone involved. How should robots behave at conferences? Nobody knows. How should they talk with human beings at panels or public events? Nobody can tell yet. How can we make human beings feel more comfortable when they share a room with a suit-wearing robot that can potentially record everything it sees? Nobody has any clue whatsoever.

These issues should be taken into consideration in any venture to involve robots in the public sphere.

It seems to me that we need some kind of standard, developed in collaboration between ethicists, social scientists and roboticists, which will ensure strong data encryption for telepresence robots and guarantee that any data the robot collects is deleted on the spot.
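To make the idea a little more concrete, here is a minimal sketch of what such a standard might look like in code. Everything in it – the class names, the policy fields, the placeholder encryption – is hypothetical and is not part of any existing standard or of Double Robotics’ software; it only illustrates the two requirements above: encrypt the stream in transit, and retain nothing on the robot.

```python
from dataclasses import dataclass


@dataclass
class PrivacyPolicy:
    """Illustrative policy terms a telepresence-robot standard might require."""
    encrypt_in_transit: bool = True      # video/audio must be encrypted end to end
    allow_local_recording: bool = False  # frames may not be persisted on the robot
    retention_seconds: int = 0           # "deleted on the spot"


class TelepresenceSession:
    """Hypothetical per-session wrapper that enforces the policy on every frame."""

    def __init__(self, policy: PrivacyPolicy):
        self.policy = policy

    def relay_frame(self, frame: bytes) -> None:
        """Forward one camera frame to the remote operator, then discard it."""
        payload = self._encrypt(frame) if self.policy.encrypt_in_transit else frame
        self._send_to_operator(payload)
        # Nothing is kept after this point: retention_seconds == 0.

    def record_frame(self, frame: bytes) -> None:
        """Persisting a frame is only possible if the policy explicitly allows it."""
        if not self.policy.allow_local_recording:
            raise PermissionError("policy forbids storing frames on the robot")
        # A real implementation would write to encrypted, time-limited storage here.

    def _encrypt(self, frame: bytes) -> bytes:
        # Placeholder only; a real standard would mandate a vetted cipher (e.g. TLS).
        return bytes(b ^ 0xFF for b in frame)

    def _send_to_operator(self, payload: bytes) -> None:
        # Placeholder for the network transport to the remote operator.
        pass


session = TelepresenceSession(PrivacyPolicy())
session.relay_frame(b"\x00\x01\x02")   # relayed encrypted, never stored
```

The design choice worth arguing over is the default: recording off and zero retention unless the venue explicitly opts in, rather than the other way around.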

We need, in short, to develop proper robotic etiquette.

And if we fail to do that, then it shouldn’t really surprise anyone when telepresence robots are branded as “surveillance drones” used by Zionist spies.

Did Tesla Break into Cars? or – Are We Witnessing a Decline in Private Ownership?

Jason Hughes is a white hat hacker – a ‘good’ hacker, working diligently to discover and identify ways in which existing systems can be hacked into. During one of his most recent forays, as described on TeslaRati, he analyzed a “series of alphanumeric characters found embedded within Tesla’s most recent firmware 7.1”. According to Hughes, the update included the badges for Tesla’s upcoming new model, the P100D. Hughes tweeted about this discovery to Tesla and to the public, and went happily to sleep.
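For readers curious what digging alphanumeric strings out of a firmware image can look like in practice, the sketch below scans a binary file for runs of printable ASCII, much like the Unix strings utility does. This is only an illustration of the general technique, not Hughes’s actual workflow, and the file name in the example is made up.

```python
import re
import sys


def extract_strings(path: str, min_len: int = 5):
    """Yield printable-ASCII runs of at least `min_len` characters from a binary file."""
    with open(path, "rb") as f:
        data = f.read()
    # Printable ASCII (space through tilde), repeated min_len or more times.
    for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
        yield match.group().decode("ascii")


if __name__ == "__main__":
    # e.g. python extract_strings.py firmware_7.1.img
    for text in extract_strings(sys.argv[1]):
        if "P100" in text:   # look for badge labels hinting at unreleased models
            print(text)
```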

And then things got weird.

According to Hughes, Tesla then attempted to access his car’s computer and significantly downgrade the firmware, presumably in order to delete the information about the new model. Hughes managed to stop the incursion in the nick of time, and tweeted angrily about the event. Elon Musk, CEO of Tesla, tweeted back that he had nothing to do with it, and seemingly that was the end of the story. Hughes is now cool with Musk, and everybody is happy again.


But what can this incident tell us about the future of private ownership?


A Decline in Private Ownership?

One of Paul Saffo’s rules for effective forecasting is to “embrace the things that don’t fit”. Curious stories and anecdotes from the present can give us clues about the shape of the future. The above story seems to be a rather important clue about the shape of things to come, and about a future where personal ownership of any networked device conflicts with the interests of the original manufacturer.

Tesla may or may not have a legal justification to alter the firmware installed in Hughes’ car. If you want to be generous, you can even assume that the system asked Hughes for permission to ‘update’ (actually downgrade) his firmware. Hughes was tech-savvy enough to understand the full meaning of such an update, but how many of us possess that kind of knowledge? In effect, and if Hughes is telling the truth, it turns out that Tesla attempted to alter the properties and functions of Hughes’ car in order to prevent damage to the company itself.
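Here is a minimal sketch of what an owner-consent check for remotely pushed firmware could look like. None of it reflects Tesla’s actual update mechanism; the version numbers and function names are invented for illustration. The point is simply that a downgrade should be presented as such and should never be applied without the owner’s explicit approval.

```python
from dataclasses import dataclass

# Hypothetical currently installed firmware version.
INSTALLED = (7, 1)


@dataclass
class FirmwareOffer:
    version: tuple   # e.g. (7, 0) for an older image pushed by the manufacturer
    notes: str


def describe(offer: FirmwareOffer) -> str:
    kind = "DOWNGRADE" if offer.version < INSTALLED else "upgrade"
    return f"{kind} to {offer.version} (installed: {INSTALLED}): {offer.notes}"


def apply_offer(offer: FirmwareOffer, owner_accepted: bool) -> bool:
    """Apply a remotely pushed firmware image only with informed owner consent."""
    if not owner_accepted:
        print("Rejected:", describe(offer))
        return False
    print("Installing:", describe(offer))
    return True


# The owner sees exactly what is being pushed and can refuse, as Hughes did.
apply_offer(FirmwareOffer((7, 0), "removes references to unreleased models"),
            owner_accepted=False)
```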

Of course, this is not the first incident of its kind. Seven years ago, Amazon chose to reach remotely into many Kindle devices held and owned by private citizens and delete some digital books from those devices. The books that were deleted? In a bizarre twist of fate, they were George Orwell’s 1984 and Animal Farm – the first of which describes a dystopian society in which the citizen has almost no power over his own life. In 1984, the government has all the power. In 2016, it is starting to seem that much of this power belongs to the big IT companies that can remotely reprogram the devices they sell us.

Image originally from Engadget.


The Legal Side

I’m not saying that remote updates are bad for you. On the contrary: remote updates and upgrades of systems are one of the reasons for the increasing rate of technological progress. Because of virtual upgrades, smartphones, computers and even cars no longer need to be brought physically to service stations to be upgraded. However, these two episodes are a good reminder that by giving IT companies leeway into our devices, we are opening ourselves up to their needs – which may not always align with our own.

I have not been able to find any legal analysis of Hughes’ and Tesla’s case, but I suspect that if the case is ever brought to court, Tesla might have to answer some difficult questions. The most important question would probably be whether the company even bothered to ask Hughes for permission to make a change in his property. If Tesla did not even do that, let it be penalized harshly, to deter other companies from following in its footsteps.

Obviously, this is not a trend yet. I can’t just take two separate cases and cluster them together. However, the mechanism behind both incidents is virtually the same: because of ever-present connectivity, the original manufacturers retain some control over the devices owned by end-users. Connectivity is only going to proliferate in the near future, and therefore we should keep a watchful eye out for similar cases.


Conclusions

This is new ground we’re treading and testing. Never before could upgrades to physical, user-owned devices be implemented so easily – to the benefit of most users, but possibly also to the detriment of some. We need to draw clear rules for how firms can access our devices and under what circumstances. These rules, restrictions and laws will become clearer as we move into the future, and it’s up to the public to keep close scrutiny on lawmakers and make sure that the industry does not take over the private ownership of end-user devices.

Oh, and Microsoft? Please stop repeatedly asking me to upgrade to Windows 10. For the 74th time, I still don’t want to. And yes, I counted. Get the hint, won’t ya?