Robots: not human, but getting closer
It may seem uncomfortably close to science fiction, but robots are
moving ever nearer to having humanlike abilities to smell, feel and see
their surroundings, allowing them to operate more independently and
perform some of the dangerous, dirty and dull jobs people don't want to
do.
They can "smell" gas leaks, conduct underwater surveillance
and even sort boxes by shape and colour and toss them into the
appropriate warehouse bin. Advances in sensor technology and software
allow these machines to make split-second decisions without human
masters overseeing them about how to follow a scent trail or where to go
to next.
"They are gaining human capabilities, whether it's smell, or
touch or recognising our voices," said Daniel H. Wilson, a PhD in
robotics and the author of Robopocalypse, a techno-thriller
about what happens when robots go wrong. "If they are going to solve
human problems, they will have to have human abilities. Those are things
that robots will have to understand if they play a role in our lives."
[Photo: A Swedish 'gasbot' uses lasers to detect methane leaking from landfills.]
Until now, robots have had to navigate with small infrared
sensors that keep them from bumping into things. Some have relied on
video cameras that send images to human operators. But a new generation
of robots is gaining the ability to understand voices, see objects with
the same depth perception as humans and use grasping arms that have
dexterity close to that of humans.
Of course, none of them is yet as lifelike as "Sonny," the android of the movie I, Robot (inspired by Isaac Asimov's stories),
who feels, thinks for himself, moves on his own and, in a limited
way, emotes. Most robots with advanced sensing abilities are still in
the experimental stage. More than toys but not yet tools, they work well
in the laboratory but can't yet handle real-world situations.
Take, for example, the robot that sorts boxes, picking
them up and tossing them into the right bin. It uses
two-dimensional and three-dimensional video cameras and software to
"look" at the size and shape of each box and then decide where it should
go.
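The article doesn't describe Industrial Perception's software, but the decision step it describes (measure a box, then route it to a bin) can be illustrated in a few lines of Python. Here the box dimensions are assumed to have already been estimated from the 3-D camera's point cloud; the function name, bin labels and size thresholds are all hypothetical:

```python
# Illustrative sketch only: route a box to a bin using dimensions
# (in metres) estimated from a 3-D camera's point cloud. The bin
# labels and size thresholds are hypothetical, not the company's.

def choose_bin(width, height, depth):
    """Pick a bin by size; assumes a roughly rectangular box,
    as the real robot also requires."""
    volume = width * height * depth
    longest = max(width, height, depth)
    if longest > 1.0:      # oversized items go to manual handling
        return "manual"
    if volume < 0.01:      # small parcels
        return "small"
    if volume < 0.05:
        return "medium"
    return "large"

print(choose_bin(0.3, 0.2, 0.15))   # prints "small"
```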
The robot works well, as long as the boxes are roughly
rectangular and aren't moving, says Stanford University computer
science professor Gary Bradski, co-founder of Industrial Perception, the
start-up that invented the robot. But it isn't quite ready to replace
human workers in the mailroom or on the factory floor.
"It's easy to get 80 or 90 per cent of the way there," he
said. " But it's getting the speed and reliability to make it economic.
You can't fail very often; otherwise, you're not saving any labour."
Getting robots to smell is one of the bigger challenges. A
recent project out of the University of Tokyo takes a step in that
direction. Scientists there recently unveiled a tiny robot that is
driven by a male silkworm moth responding to a female moth's seductive
pheromone aroma.
The researchers built a motorised wheeled car that moves when
a moth, spurred by the smell, launches into a mating dance of repeated
zigzags on top of a trackball, similar to the ones used inside a
computer mouse. As the moth does its dance, sensors transmit its motions
to the robot's motors, allowing it to follow the path chosen by the
male.
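The control loop described above is simple to sketch in Python. The read_trackball() and set_wheel_speeds() interfaces below are hypothetical stand-ins for the real hardware:

```python
# Sketch of the moth-driven control loop: the moth's walking spins a
# trackball, and the ball's motion is replayed on the robot's wheels.
# read_trackball() and set_wheel_speeds() are hypothetical interfaces.

def drive_from_trackball(read_trackball, set_wheel_speeds, gain=1.0):
    """Map trackball displacement (dx = sideways, dy = forward) onto
    a differential drive: forward motion drives both wheels equally,
    sideways motion steers by speeding up one wheel."""
    while True:
        dx, dy = read_trackball()    # displacement since last read
        forward = gain * dy
        turn = gain * dx
        set_wheel_speeds(left=forward + turn, right=forward - turn)
```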
The researchers said the "odour-tracking behaviours of
animals" could eventually be "applied to other autonomous robots so they
can track down smells, and subsequent sources, of environmental spills
and leaks when fitted with highly sensitive sensors".
Noriyasu Ando, an associate professor at the University of
Tokyo Research Centre for Advanced Science and Technology, who worked on
the moth robot, said in an e-mail that the challenge was to develop a
robot that could "behave alone, free from external wired connections
because the silk moth turns quickly and rotates very often".
Ando said the ultimate goal is to develop a robot with its
own smelling capabilities, one that can follow a trail just like the
moth on the trackball.
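The article doesn't give Ando's controller, but the plume-tracking behaviour silkworm moths exhibit is a well-documented "surge and cast" pattern: push upwind while the pheromone is present, then sweep side to side in widening arcs when it is lost. A hedged Python sketch of that classic strategy, with hypothetical sensing and steering interfaces:

```python
# Sketch of the moth-like "surge and cast" strategy: surge upwind
# while odour is detected; when the plume is lost, cast side to side
# in widening sweeps until it is picked up again. smells_pheromone(),
# surge() and turn() are hypothetical interfaces.

def track_plume(smells_pheromone, surge, turn):
    cast_direction = 1      # 1 = right, -1 = left
    cast_width = 1          # sweeps widen while the plume stays lost
    while True:
        if smells_pheromone():
            surge()             # odour present: push upwind
            cast_width = 1      # reset the search pattern
        else:
            turn(cast_direction, steps=cast_width)
            cast_direction = -cast_direction
            cast_width += 1     # widen the next sweep
```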
The team is now trying to build an artificial brain they've
named Kei; the moth-driven robot, which navigates by smell, is one
step towards that goal, he said.
Achim Lilienthal, who directs the mobile robotics and
olfaction lab at Örebro University in Sweden, said smell is more
complicated for robots than vision. Cameras can see an object as long as
there is enough light, while odours exist as plumes and patches in the
air and are not consistent in strength, which makes finding the source
difficult.
Lilienthal gives the example of methane emanating from an old
landfill. The town managing the landfill had set up devices to capture
the gas produced by the landfill's decay and burn it to heat the local
hospital. But over time, as the plastic lining beneath the landfill
developed cracks, more than half of the methane was escaping the capture
system. The town hired someone to walk around the landfill and sniff
for leaks, but that didn't work very well because the human nose is not
very efficient.
Enter Lilienthal's "gasbot", which looks like a lawnmower
with a big metal eyeball perched on top of a metal pole. This mini
all-terrain vehicle picks up smells using two laser beams: one is tuned
to a wavelength that methane absorbs, which reveals the gas's
concentration in the air; the second helps provide a three-dimensional map of the gas
plume. The advantage of the gasbot is that the lasers detect the gas
remotely, without machine or human having direct contact with the plume.
"For most gas sensors [such as smoke detectors], you need to
[physically] encounter the smell," whereas the gasbot uses its lasers to
detect gas at a distance, Lilienthal said.
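The article confirms only that the first laser measures methane concentration at a distance; remote laser gas sensing of this kind conventionally relies on the Beer-Lambert law, under which light at a wavelength the gas absorbs is attenuated in proportion to how much gas lies along the beam. A minimal Python sketch under that assumption, with illustrative parameters:

```python
import math

# Beer-Lambert sketch: I = I0 * exp(-a * c * L), solved here for c.
# This is the conventional physics of remote laser gas sensing,
# assumed here; the article does not give the gasbot's actual maths.

def path_avg_concentration(i_emitted, i_received, path_length_m,
                           absorption_coeff):
    """Average gas concentration along the beam, from the emitted
    and received intensities, the beam's path length, and the gas's
    absorption coefficient at the laser's wavelength."""
    absorbance = math.log(i_emitted / i_received)
    return absorbance / (absorption_coeff * path_length_m)
```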
Scientists are working as well to create effective underwater
robots. This task is very challenging because there is often not enough
light for cameras to work well, while swirling currents and eddies play
havoc with smells and chemical plumes.
To deal with this, a European group has built a robot that
uses something called lateral line sensing. The lateral line is a series
of nerve cells in fish that runs from
just below the head to the tail
and allows them to sense the speed and direction of currents, helping
them catch food and swim in schools without bumping into each other.
More than 30,000 fish species use lateral line sensing,
according to Maarja Kruusmaa, professor of biorobotics at the Tallinn
University of Technology in Estonia. She and colleagues set out to
create an electronic equivalent that would allow underwater robots to
navigate more efficiently through currents. After four years of work
they came up with FILOSE (Robotic FIsh LOcomotion and SEnsing), a robot
that's shaped like a rainbow trout. The researchers developed tiny
sensors to monitor pressure differences in the water flowing around the
robot. This allows the robot to follow in the wake of an object to cut
energy use, according to Kruusmaa.
"It is similar to reducing your effort in the tailwind of
another cyclist or reducing the fuel consumption of your car by driving
behind a truck," she said.
The robot is driven by a small electric motor and can be
outfitted with a video camera for surveillance or with chemical sensors
to detect pollution.
The next step is to take FILOSE for a swim outside the lab to see how it does in the real world.
Meanwhile, Kanna Rajan, principal researcher for autonomy at
the Monterey Bay Aquarium Research Institute, is designing underwater
robots that are programmed to make their own decisions. They use sensors
to determine where to hunt for oil spills, for example, or to swim
towards a place in the ocean that scientists want to study, such as a
feeding ground for fish or a range of underwater volcanoes that might
erupt.
Rajan, a former NASA researcher who helped develop the rovers
that landed on Mars almost 10 years ago, says it's harder to build
smart robots that work underwater than ones that function in outer
space. That's because communications for the latter travel through the
relatively quiet vacuum of space. Underwater communications, however,
are often blocked by layers of warm and cold water, slow-moving
underwater storms and the sounds of passing ships and wildlife. The
ocean's salinity also tends to degrade robots much more than the cold
temperatures of space, Rajan said.
"The ocean is a lot more harsh," he said. "If you go in deep
space, there's not much going on, you are very careful as to what you
do. Even on Mars you can talk to your robot."
So what are the prospects for a seeing, smelling, sensing
robot that works in the real world? That probably won't materialise
anytime soon, said Jelle Atema, a professor of biology at Boston
University and the Woods Hole Oceanographic Institution.
"When it comes to a broad, robust exposure in the natural
environment and still being able to perform the task," Atema said,
"animals have it all over human engineering.
"