Categories
Conference Notes Politics Technology

Live from ETech: iRobot…

For the most part the ETCon keynotes are pretty much high-concept fluff. They’re fundamentally high-profile, high-glamour bits of hardcore tech that (often) are completely outside the practical experience of the so-called alpha geeks who attend these events. But they have their value – they’re designed, I imagine, to be brain-openers rather than brain-developers; they’re there to extend the aspirations, intentions and creativity of the people who attend rather than to be of direct use to them. Nonetheless, if you’re not blown away by the technology or awed by the future tech on display, they can seem like something of a waste of time. Bring on the stuff I can actually use…

Last year the troubling session of this kind was K. Eric Drexler’s on nanotechnology, which most people had already read about at great length but on which there hadn’t been much apparent progress. The geeks in the room were interested in the theory but wanted results, or something they could participate in. Intrigue fought with frustration and in the end – I think – frustration won. This year that tension was most in evidence in the second keynote of the morning: Robots: Saving Time, Money and Lives.

Helen Greiner from iRobot Corporation came on stage and seemed surprisingly nervous. She started talking about the Roomba automatic robotic hoover and did so at considerable length. The immediate interest (“I want one”) faded quite rapidly as people tired of the technological challenges of sensing walls, picking up dust and getting in close to the edges of a room. Watching something of technological interest but distant from the activities of most of the people in the room gradually ceased to be fascinating. But all that changed when she moved on to the military applications, and particularly the Packbot [see the brochure].

The first reaction to the Packbots is fascination and a certain amount of awe. Comments like “I’ve seen this movie!” and “I want one” mix with awed responses to the robustness of the devices concerned. A video is shown in which a Packbot is thrown through a window, lands with a thump, bounces a bit, rights itself, looks around and wanders off. One zooms up a staircase. One falls from a second-storey window and survives intact. Murmurs of delight at the new toy on offer reverberate through the room.

But gradually the mood changes and anxieties start to appear. Questions about the applicability and potential uses of the technology start to collide with the natural utopianism of the geek audience. What will these robots be used for? Who will control them? Where are the controls? It’s not immediately clear exactly where the anxiety is coming from – we all appreciate that weapons have to be built, that there is a need for the armed forces. But there seems to be something different about using robotics. Thinking about it, I come to the conclusion that maybe it’s a sense of automated killing – an absence of human presence that makes the whole thing resonate with the increasingly mechanised processes of death that echoed through the last century. Is keeping people further out of the equation actually a good idea? Does it discourage or encourage conflict if your side can eradicate another country without suffering any losses at all? Those human horrors of shell-shock and war-weariness – the insanity caused by human-upon-human violence – suddenly seem to me almost preferable: deterrents designed to stop us arbitrarily exterminating people and going to war.

I’m not going to judge the people involved – I don’t have that right. We all know that warfare and the technologies of warfare must evolve and adapt. The arms race still exists, and will continue to do so as long as one state feels under threat from other states or from terror attacks. It’s just that I didn’t expect such an early brain-opening session to ring such alarm bells or to give me such concern for the future… On occasion, this country I’m visiting feels like it believes itself to be under siege – like some kind of gated community surrounded by paramilitary, robotic guards…

17 replies on “Live from ETech: iRobot…”

But gradually the mood changes and anxieties start to appear. Questions about the applicability and potential uses of the technology start to collide with the natural utopianism of the geek audience.
I think this is a really healthy response — not too long ago the attitude of scientists/engineers was that technology was separate from moral concerns. “It’s not the gun that kills people,” etc. We could use a bit more of this.

More siege mentality with the hard sell from the shouty spooksman; yes, there’s an obvious attempt to harness these techs for the defence industry. And what’s scary is that the Pentagon is probably the most generous and ‘open’ funder right now, in the absence of VCs…

You could use them as portable mines – just drive them into the building, roll them up next to the ‘bad guys’ and press Detonate. There could be a whole battalion of them, known as the “Creeping Death”.

Consumer – I had responses ranging from “not technical enough” to “way too detailed technically”. Sigh. What we do concerns technology at a consumer price point. This is exciting to me because I have always dreamed of building a robot that anyone can own.
Military – In case it wasn’t clear, our robots deployed today are completely teleoperated – meaning every action is controlled by a soldier with a joystick. Very thoughtful comments and questions about the future. I stayed an extra day at O’Reilly because I heard (and read) concerns. Yet hardly anyone came over to talk to me about it.
Maybe I shoulda just talked social networks or blogs… Ah well, at least people didn’t leave like in other keynotes :)

Geek Conferences: Nothing to Fear but Fear Itself
Is the O’Reilly Emerging Technologies Conference elitist? This question seems to be stirring up the blogosphere, and causing lots of good people who I read and like to throw verbal bricks at each other. I thought that as someone who is clearly not a m…

“Better a robot than a man on point.” That is the most frequent comment I get when I discuss this issue with the men on the pointy end of the spear who are defending our ability to even debate this issue.
Personally, I much prefer to send the robot down range first. It costs a lot less than the pay off on my life insurance policy. It is just another tool to help the soldier do his job. If “fairness” were the objective in war, it would be a chess match not armed conflict.
If anything, having robots on point allows the rules of engagement to become more restrictive. It allows that squad working through an urban environment to actually ask questions before shooting. That translates directly into fewer civilian casualties. Robbie the point robot _can_ take that round and still fight, where the 18-year-old _has_ to shoot first if he wants to come home. By using that robot to open the door or look around the corner we can be much more discriminating in the use of lethal force, and that means less death and mayhem. Further, there is no obligation for the robot to use lethal weaponry. It can _safely_ employ non-lethal weapons because it does not face the conundrum of having to kill or risk being killed.
My biggest concern is that I can keep these ‘bots up and working in a war zone. If we are going to put them on point they will be taking a lot of hits, and that means we need the ability to keep them up and working. Otherwise they become a very brittle tool that is too expensive in both preparation time and logistics tail. The closer that repair facility is to the end user, the more the ‘bots will be used. That is my job: getting that repair facility close enough to my users that we can keep these systems on-line.
Let the philosophers debate the morality of using crossbows, gun powder, or robots. I just want the tools that make defending that freedom safer for those of us who have volunteered to defend it.

re: “we all appreciate that weapons have to be built, that there is a need for the armed forces.”
do we? i don’t think this is, or should be, a given. i realize it’s extremely difficult to envision a world in which we didn’t need weapons. on the other hand – if we *don’t* envision it, we sure as hell will never get there.
as far as developing robots to kill people in war instead of converting human beings into mechanized killing robots – is that really progress?! how many resources – person-hours, creative energy, intellectual capital – were spent on developing this ever more sophisticated war technology, and would any of those resources perhaps have been better spent in pursuit of streamlining the process of *peace*?
yeah yeah, just call me a hippie and pass the bong, already. ;>

i, for one, fear our new robot overlords
The iRobot marketing assault at eTech was the scariest conference keynote ever. Props to Burtonator for asking the ethics question. (thanks to g10 for locating the original strip used in the presentation, in which the original dialogue had been…

Comments are closed.