A robot may soon represent you at meetings. The technique is still clunky and expensive, but it is already an improvement over videoconferencing. Given reported advances in robot vision and gait, your surrogate may eventually move more smoothly among remote colleagues and, with sharper eyes, give you a better picture of what is going on.
Speaking of pictures, here are some taken at this year’s Robodex expo in Japan.
Surrogate You
Hewlett-Packard researchers have created a wheeled robotic surrogate enabling people to attend meetings remotely. It’s an advanced version of videoconferencing. Instead of a grainy and jerky video image, the remote participant is “present” at the meeting in the form of a robot. The robot has cameras and microphones that transmit a 360-degree audio and video view of its surroundings to the remote party, while displaying that person’s head on four flat panel displays for the benefit of people actually in the room with the robot.
The remote participant can move and manipulate the robot via a joystick. The researchers say the 360-degree view of the remote location and the “near-perfect” 360-degree sound reproduction give a real sense of presence, of being there.
It seems that the system requires expensive equipment at both ends of the link, not to mention substantial bandwidth for the high-fidelity audio and video. But something like it will probably replace today’s often poor-quality and unsatisfying videoconferencing. The availability of such a system at reasonable cost would almost certainly give a boost to telemedicine. It could also be used to upgrade a robot surrogate system already being used to let hospitalized kids attend school remotely (see Robot Surrogate for Sick Kids in the April issue of HFD).
Reference: Firth, Simon (2003). “Face-to-face while far away: HP fellow’s technology lets you be in two places at the same time.” Hewlett-Packard press release, March.
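The press release does not describe how the joystick commands are translated into motion, but a wheeled base of this kind is commonly teleoperated with a differential-drive mapping. A minimal sketch (the function name, axis conventions, and scaling below are our own assumptions, not HP's design):

```python
def joystick_to_wheels(jx, jy, max_speed=1.0):
    """Map a joystick deflection to left/right wheel speeds for a
    differential-drive base. jx is turn (-1 left .. +1 right) and
    jy is forward/back (-1 .. +1); both are assumed conventions."""
    left = jy + jx
    right = jy - jx
    # Normalize so neither wheel is commanded beyond max_speed.
    scale = max(1.0, abs(left), abs(right))
    return (max_speed * left / scale, max_speed * right / scale)
```

Pushing the stick straight forward drives both wheels equally; any sideways deflection speeds one wheel relative to the other, turning the robot while it moves.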
Robot Gait
This is a slightly old story we came across only recently, but one worth mentioning: Butch, a robot dinosaur designed to move “as fast as a galloping dog – a steady 10 to 15 mph,” is under development by an MIT Leg Lab researcher. Unlike Sony’s Aibo and other “traditional” robots, whose limbs move until their joints reach a predefined angle, Butch’s limbs move until they sense resistance, giving it a much smoother, less jerky gait. Humanoid robots built with such joints will seem far more natural.
Reference: Davis, Joshua (2003). “See Bot Run.” Wired, November 6.
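The distinction between the two control styles can be sketched in a few lines of Python (a hypothetical illustration only; the Leg Lab’s actual controller is not described in the article, and the names, units, and thresholds below are invented):

```python
# Hypothetical sketch of two joint-control strategies; not the MIT Leg Lab's
# actual controller. Names, units, and thresholds are invented.

def angle_based_step(angle, target_angle, step=1.0):
    """'Traditional' control: advance the joint until it reaches a
    predefined target angle, whatever the limb is touching."""
    return min(angle + step, target_angle)

def resistance_based_step(angle, sensed_torque, torque_limit=5.0, step=1.0):
    """Compliant control: advance the joint until it senses resistance,
    letting the limb yield smoothly wherever contact actually occurs."""
    return angle + step if sensed_torque < torque_limit else angle

# An angle-based joint always stops at exactly the target, regardless of terrain.
a = 0.0
for _ in range(40):
    a = angle_based_step(a, 30.0)

# A resistance-based joint stops wherever the ground pushes back
# (here: simulated contact after 20 steps of free motion).
b = 0.0
for torque in [0.0] * 20 + [6.0] * 20:
    b = resistance_based_step(b, torque)
```

The compliant joint never slams into a fixed endpoint; it settles wherever contact happens, which is why the resulting gait looks less jerky.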
Robot Vision
A shape-recognition system called Foveola is claimed to mimic the human visual system and to recognize a broad range of objects. It extracts shapes from a visual scene and assigns each a “mathematical signature.” Once trained to recognize a shape (from a single viewing), it can recognize that shape again even if it is distorted. Potential applications include helping robots read signs and recognize faces, and improving the accuracy of handwriting recognition. Details of the invention are sketchy, however, and skepticism remains the order of the day.
Reference: Kahney, Leander (2003). “Giving Robots the Gift of Sight.” Wired News, May 15.
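Foveola’s internals are unpublished, so as a stand-in, here is how a classical distortion-tolerant “mathematical signature” can be computed with Hu image moments (a standard computer-vision technique, not necessarily what Foveola uses; all names below are our own):

```python
# Illustrative only: Foveola's actual method is unpublished. Hu moments are
# a classical translation/scale/rotation-invariant shape signature.

def hu_signature(img):
    """Return the first two Hu moment invariants of a binary image,
    given as a list of rows of 0/1 pixels."""
    def raw(p, q):  # raw image moments
        return sum(x**p * y**q * v
                   for y, row in enumerate(img) for x, v in enumerate(row))
    m00 = raw(0, 0)
    xc, yc = raw(1, 0) / m00, raw(0, 1) / m00  # centroid
    def mu(p, q):  # central moments: translation-invariant
        return sum((x - xc)**p * (y - yc)**q * v
                   for y, row in enumerate(img) for x, v in enumerate(row))
    def eta(p, q):  # normalized central moments: also scale-invariant
        return mu(p, q) / m00 ** (1 + (p + q) / 2)
    h1 = eta(2, 0) + eta(0, 2)                       # rotation-invariant
    h2 = (eta(2, 0) - eta(0, 2))**2 + 4 * eta(1, 1)**2
    return (h1, h2)

# The same L-shape drawn at two different positions yields the same signature.
shape = [[0, 1, 0, 0],
         [0, 1, 0, 0],
         [0, 1, 1, 0]]
shifted = [[0, 0, 0, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 1, 0]]
```

Because the signature is built from centroid-relative, normalized moments, moving or rescaling the shape leaves it unchanged, which is the general flavor of the invariance Foveola claims.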
Pictures from an Exhibition
Robots are photogenic, but in the interests of speed HFD does not carry photographs. For anyone who wants to see the robots mentioned in HFD in recent months, “blogkeeper” John Wiseman has a sterling site providing lots of pictures from Robodex 2003.
Reference: Wiseman, John (2003). “Robodex 2003 Guide.” Lemonodor, May.