Tuesday, July 5, 2011

Robot Apocalypse

A few years ago, the computer science department at our university held a “computer science day” to recruit high school students. At the time, I was assisting a professor in the department who had received a grant for five robots to develop a multi-agent system. My job was to help program these robots so that they could communicate with each other to avoid obstacles, navigate around a room autonomously, and be controlled remotely by an operator. These were simple tasks to accomplish, and they were the early stages of a much larger project.


P3-DX Robots

The robots look like the machines pictured in this post. Human appearance was not reflected in the design; they were just machines that cruised around on wheels. Each robot contained six sonar sensors. With a little bit of programming, the sensors allowed the robots to determine the distance between themselves and any obstacles in their path. This helped the robots communicate with one another to avoid collisions when navigating autonomously. If a human wished to intervene, we designed a touch-screen tablet that an operator could use to control the robots remotely, and the operator could see what the robots “see” through a webcam mounted on each one. This allowed the operator to navigate the machines even from another room.
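The avoidance logic itself was about as simple as it sounds. Here is a minimal sketch of the idea in Python; the thresholds, the sensor grouping, and the command format are all illustrative assumptions, not the actual robot API we used:

```python
# Hypothetical sketch of sonar-based obstacle avoidance: slow down and
# steer away from the side with the closest sonar return. The six
# readings are assumed to arrive left-to-right, in millimeters.

STOP_MM = 300   # closer than this: stop entirely (assumed threshold)
SLOW_MM = 800   # closer than this: slow down and steer away (assumed)

def avoidance_command(sonar_mm):
    """sonar_mm: six range readings in millimeters, left-to-right.

    Returns a drive command: speed in mm/s, turn rate in deg/s.
    """
    nearest = min(sonar_mm)
    if nearest < STOP_MM:
        return {"speed": 0, "turn": 0}        # too close: halt

    left = min(sonar_mm[:3])                  # closest return on the left
    right = min(sonar_mm[3:])                 # closest return on the right
    if nearest < SLOW_MM:
        # turn toward whichever side has more clearance
        turn = 30 if left < right else -30
        return {"speed": 100, "turn": turn}

    return {"speed": 300, "turn": 0}          # path is clear: cruise
```

A robot running a loop like this keeps its distance from walls and from the other robots, as long as the sonar readings are trustworthy; the wall-ramming incidents described below happened precisely when they weren't.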

We gave this technology to the high school students during computer science day, because the robots were fun to use and we thought the students would find them entertaining. During the demonstration, the robots' sonar pings would sometimes travel through a wall and reflect off the studs, throwing off the distance the robots calculated between themselves and the wall. As a result, the robots sometimes rammed into walls at full speed and made a few (additional) holes in Faner Hall.

The emotional impact was different for everyone. The high school students and we winced when the robots slammed into the wall, but for different reasons. Unlike the high school students, we didn't want the robots damaged primarily because they were expensive. The robots also had value to us because we had spent a lot of time working with them. Nothing more. The robots were simply machines. It wasn't the same “feeling” of being intensely connected with non-living objects that many individuals described in Sherry Turkle's book Alone Together. The robot was programmed to conduct simple tasks, and it just needed to work at the end of the day.

Image attribution: University of Cincinnati's Cooperative Distributed Systems Lab

The high school students in attendance felt a bit differently. The ability to control the robots was exciting, and they didn't want to lose a source of entertainment. Some students probably saw a robot slamming into a wall as serious excitement, especially when it created a new hole. When our robots collided with something, the unintended disruption made many students want to take control of the robots. The connection that developed was among people competing over who could operate the robots most effectively, not necessarily between humans and the machines themselves. In this case, the technology helped facilitate bonding and built friendships through competition. It was healthy. To the high school students, I suspect watching the robots accidentally slam into the walls was a healthy and safe way to relieve some aggression indirectly, similar to why people watch boxing or other aggressive sports. I also suspect that if Sherry Turkle were reading this post, she would express her legitimate concern to me and disagree completely, claiming these actions are destructive to society.

Later, when the robots were navigating autonomously, we programmed them to avoid obstacles and each other. Students often took this as an opportunity to walk into a group of robots operating autonomously, curious how the machines would react. As expected, the robots tried to move quickly out of the way to avoid the students and each other, but the students also had to move to avoid them in the chaos. The students and the robots each influenced the other's actions in response to a disturbance. The high school students seemed to enjoy this the most. Perhaps it was the mystery of the robot that they found intriguing. It makes me question whether the “connection” that Sherry Turkle mentions between humans and robots would remain once the novelty diminished. Much like a human relationship, it's likely to get boring if it remains predictable. As a programmer, I knew how the machine would react, so perhaps my perception of the robot was different from what the high school students felt.

Image attribution: Random Robotics

We also programmed the robots to follow people who came within a certain distance. The robots paid attention to the high school students and responded to their behavior and interactions by following them. When the occasional passerby came too close to our demonstration, the robots would stop following the high school students and begin following the passerby instead. At first it was amusing because it was completely unexpected: innocent bystanders were suddenly in control of our robots. Some bystanders were anxious because they had accidentally disrupted the demonstration. Others enjoyed being the center of attention. Realizing this, students began to compete to see who could get the most robots to follow them. It was a competition, and a connection, between people... not between humans and machines.
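The follow behavior can be sketched in a few lines. This is a hypothetical reconstruction, not our actual code: the target names, distances, and follow radius are illustrative. It also shows exactly why a passerby could “steal” a robot; the logic has no memory of whom it was following before, only of who is nearest right now:

```python
# Hypothetical sketch of the follow behavior: lock onto whichever
# detected person is nearest and inside the follow radius.

FOLLOW_MM = 1500  # only follow targets closer than this (assumed radius)

def pick_follow_target(targets_mm):
    """targets_mm: mapping of person label -> distance in millimeters.

    Returns the label of the nearest in-range person, or None.
    """
    in_range = {name: d for name, d in targets_mm.items() if d < FOLLOW_MM}
    if not in_range:
        return None                       # nobody close enough: stay put
    return min(in_range, key=in_range.get)  # nearest person wins
```

With rules like these, whoever stepped closest instantly became the robot's new target, which is how bystanders kept hijacking the demonstration.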

This robot demonstration was on my mind when reading Sherry Turkle's book Alone Together. As programmers, when the robots hit a wall, we sometimes just felt bad because of the potential loss of value in the robot and the time put into it. It was like a car... we work hard to pay for our vehicles and feel terrible when they get rear-ended in a parking lot. We felt the same when the robots had a collision, which is why I found it so difficult to relate to Turkle's stories. When students had the attention of the robot, there was a feeling of satisfaction because of the human interactions that took place. These interactions were facilitated by the use of technology, and it was healthy, even when things went wrong. When that attention was lost, there was disappointment. Communication, even with objects, can play with our emotions in many unexpected ways. The outcome isn't always terrible, either.
