It's a busy time at Peter Coppin's Carnegie Mellon University-based Remote Experience and Learning Lab: For the past three weeks, eight guest scientists have joined the lab's staff to direct a robot rover named Zoë, who hails from CMU's Robotics Institute. On Sept. 3, Zoë began ranging about the vast, barren Atacama Desert in northern Chile. Like her predecessor robots Nomad and Hyperion, Zoë has been outfitted with an array of data-collection tools, and her mission is to look for life -- if any is to be found, it's probably microscopic. NASA, the 3-year-old project's funder, hopes that the data-gathering techniques developed in the Atacama will help it hunt for life on other planets. Coppin's project is called EventScope, and it's part of CMU's STUDIO for Creative Inquiry.
So what's going on this week?
We're hosting scientists from SETI (Search for Extraterrestrial Intelligence), NASA, UCLA, University of Iowa, University of Tennessee and University of Arizona. Our science lead is Nathalie Cabrol, who's from SETI by way of NASA.
Is the Atacama rover the same one that visited Mars, which your lab also worked with?
A different one. It's designed to explore the Atacama Desert and search for life there. So it's making genuine discoveries in the Atacama while also developing techniques that could be applied on other planets.
Why the Atacama?
The Atacama is one of the driest deserts in the world. It's pretty hard to find life there, especially if it's microscopic. So CMU, in partnership with the visiting science team, has developed a technique to find microbes, using dyes and fluorescent light to spot chlorophyll. There are also panoramic cameras, a spectrometer, a little weather station and all this stuff.
How does the rover rove?
From the data, software that we've developed creates images for the science team. The scientists command the rover by placing virtual "pins" into a virtual map -- based on an orbital image of the Atacama -- which tells the rover where to go. Placing "pins" on a virtual map is very intuitive: you look at an orbital image and put two pins on the map, but you're not directly driving the robot. It has its own navigation systems. It's automatically avoiding local hazards -- falling into a ditch, things like that.
Basically, we're that bridge to the robot. We designed the computer interface and we host the scientists in the room, so it's like a mini-mission control center.
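The pin-based commanding Coppin describes -- scientists drop two pins on a map, and the rover finds its own safe route between them -- can be sketched as a toy path planner. This is purely illustrative: the function and variable names (`plan_route`, `grid`) are hypothetical, the terrain is a simple grid with ditch cells marked as obstacles, and the rover's real navigation software is of course far more sophisticated.

```python
from heapq import heappush, heappop

def plan_route(grid, start, goal):
    """Toy A* planner: `start` and `goal` are the two scientist-placed
    pins (row, col); cells marked 1 in `grid` are hazards (ditches)
    that the route must avoid. Returns a list of cells, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic toward the goal pin
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    visited = set()
    while frontier:
        _, cost, pos, path = heappop(frontier)
        if pos == goal:
            return path
        if pos in visited:
            continue
        visited.add(pos)
        r, c = pos
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            # Only step onto in-bounds, hazard-free cells
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heappush(frontier,
                         (cost + 1 + h((nr, nc)), cost + 1,
                          (nr, nc), path + [(nr, nc)]))
    return None  # no safe route exists between the pins

# A ditch (row of 1s) blocks the direct line between the two pins,
# so the planned route detours around it automatically.
terrain = [[0, 0, 0],
           [1, 1, 0],
           [0, 0, 0]]
route = plan_route(terrain, (0, 0), (2, 0))
print(route)
```

The division of labor in the interview maps onto this sketch: the scientists supply only `start` and `goal`, while everything inside `plan_route` stands in for the onboard autonomy that keeps the rover out of ditches.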
Who's on the team?
There are designers and artists working very closely with software engineers and roboticists and educators. Our robotics lead at CMU is David Wettergreen, working with [roboticist] William "Red" Whittaker. So it's an eclectic group of people who've learned to work together in what's basically a new medium.
How did the lab get started?
Before becoming a research fellow at the STUDIO for Creative Inquiry, I worked in the Robotics Institute, where I did some early projects with this concept of remote experience. Before that I was an art grad student, and my job then was working in the Robotics Institute as a machinist, helping build the Nomad rover.
One of the lab's goals is bringing science and art together. How does that work?
This idea of the remote experience as a guided tour through a 3-D world is very artistic, like a movie or narrative. Yet it also uses telepresence technology and robotics technology. George Lucas has teams of software people, and he's also got artists. To tell a story, even if it's factual, requires that type of interdisciplinary team.
The other thing that's pretty neat is that we have the same kind of software in a museum in Chicago, and we'll have presentations at the Carnegie Museum of Natural History. We can immediately convert the very recent scientific data into something the public can see. Because only scientists are controlling the robot, how do you make it exciting for the public? The solution is these virtual environments with guided tours.