How Robots Are Getting Smarter; New Models Bring Greater Skills to the Factory Floor, From Navigating on Their Own to Learning by Doing
June 5, 2014
June 1, 2014 4:48 p.m. ET
ALL EYES | Baxter’s face shows his status (from left): on standby, confused and needing guidance, surprised, sad and awaiting instruction, and ready for training. Workers can train Baxter by moving his arms rather than writing digital prompts.
HANDS ON | The KR Quantec can change its own tools to suit its tasks.
THIS WAY | The Lynx maps the layout where it’s working and finds its way around obstacles it “sees” with a laser and ultrasonic detectors. It can also alert its fellow robots to any detours.
An evolving breed of smarter, safer robots is making its way to the factory floor, armed with skills and capabilities that far surpass those of earlier generations. They are bringing more sensitive touch, expressions to convey understanding and intent, and even the ability to change mechanical “hands.”
Here’s a look at some of the latest advances.
Mobility and Vision
Drop a box in front of Lynx, from Adept Technology Inc. in Pleasanton, Calif. The small, squarish robot on wheels can sense the object in its path and plot a new course around it—and then communicate the change in landscape to the other units in its fleet.
Robots have been mobile for many years, but unlike previous generations, Lynx doesn’t need to follow tape on the floor and isn’t restricted to a grid. That’s because Adept, the largest U.S.-based manufacturer of industrial robots, designed Lynx to be able to work autonomously while moving objects around a chaotic factory floor, where the ability to navigate around unpredictable obstacles such as humans and pallets is essential.
To do this, Lynx stores its own internal map of the layout of the location where it’s working. To sense the terrain, Lynx has ultrasonic detectors that scan to see if anything is on the floor in its vicinity, and a laser in front to measure the distance to objects such as walls and moving people. As Lynx cruises the factory floor, it compares the actual terrain with its stored map and chooses the best path, taking into account any new obstacles.
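The replanning idea described above can be sketched in a few lines. This is a toy illustration, not Adept's software: the stored map is a small occupancy grid, a "sensor reading" marks a newly blocked cell, and a breadth-first search finds a fresh shortest route around it.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid.

    grid[r][c] is True where an obstacle blocks the cell.
    Returns a list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:        # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not grid[nr][nc] and nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None

# An empty 3x4 floor: the direct route along the top row is shortest.
grid = [[False] * 4 for _ in range(3)]
before = plan_path(grid, (0, 0), (0, 3))

# The "laser" detects a dropped box; mark the cell and replan around it.
grid[0][1] = True
after = plan_path(grid, (0, 0), (0, 3))
```

A real robot would run this continuously against sensor data and share the updated map with its fleet, but the compare-map-to-terrain-and-replan loop is the same shape.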
Brain
The Robot Operating System is a set of code libraries and tools for robot software development. Used in the Robonaut 2 aboard the International Space Station, ROS won’t control a robot all by itself; developers need to write software on top of it. But it provides the building blocks to work with. It includes algorithms related to “perception,” for example, that an engineer can use to create a robot’s navigation system.
“Our goal with ROS is to take care of all the mundane details of controlling a robot,” says Brian Gerkey, CEO of the Open Source Robotics Foundation, Mountain View, Calif., which oversees the development of the operating system. “Then when somebody has a great idea they’re free to implement it in just the right way.” A team of about 15 engineers at the foundation maintains core parts of the operating system, and engineers in industrial and university labs all over the world contribute to the code.
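One of the core building blocks ROS provides is publish/subscribe messaging between a robot's software modules. The sketch below is not the real ROS API — it is a minimal stand-in showing the pattern: a perception module publishes to a named topic, and a navigation module subscribes without either knowing about the other directly.

```python
class MessageBus:
    """Toy publish/subscribe hub illustrating the pattern ROS builds
    on (this is NOT the actual ROS API, just the idea behind it)."""

    def __init__(self):
        self._subscribers = {}   # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        """Register a callback to run on every message for a topic."""
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of a topic."""
        for callback in self._subscribers.get(topic, []):
            callback(message)

# A "perception" module reports an obstacle; a "navigation" module
# collects the reports to plan detours. Neither imports the other.
bus = MessageBus()
detours = []
bus.subscribe("obstacles", lambda msg: detours.append(msg))
bus.publish("obstacles", {"x": 2.0, "y": 1.5})
```

Decoupling modules this way is what lets engineers all over the world contribute pieces — a new perception algorithm only has to publish to the agreed topics.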
Communicating With People
To work safely shoulder-to-shoulder with humans, robots need to be able to communicate their intent. Baxter, an industrial robot built by Boston-based Rethink Robotics Inc., has a monitor for a face with features like eyes and eyebrows that show his status.
His eyes aren’t able to “see” anything, but one eyebrow raised means he is confused and needs more guidance; droopy eyelids indicate he is sad and on hold awaiting further instruction; two raised eyebrows mean he is surprised because a person has unexpectedly entered his work area.
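The status-to-expression mapping the article describes amounts to a small lookup table. The one below is purely illustrative — the state names are invented for this sketch, not Rethink's software.

```python
# Hypothetical status-to-expression table based on the states the
# article describes; names are illustrative, not Rethink's interface.
EXPRESSIONS = {
    "standby": "neutral eyes",
    "confused": "one eyebrow raised",
    "on_hold": "droopy eyelids",
    "surprised": "both eyebrows raised",
}

def show_status(status):
    """Return the facial cue a co-worker would see for a status."""
    return EXPRESSIONS.get(status, "neutral eyes")
```

The point of such a table is predictability: a worker glancing at the screen can read the robot's state at a distance, the way they would read a colleague's face.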
Nerves and Learning
What happens if a robot hits a human? Rethink’s Baxter has “nerves” that can sense the force of a collision and slow down, stop or reverse direction. Technically called “series elastic actuators,” they sit underneath his outer casing and primarily serve as force detectors, giving Baxter feedback about any contact he makes.
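A series elastic actuator puts a spring between the motor and the joint, so force can be read from how far the spring bends (Hooke's law: torque equals stiffness times deflection). The numbers and the safety rule below are invented for illustration, not Baxter's actual parameters.

```python
def sea_torque(motor_angle, joint_angle, stiffness):
    """Estimate joint torque from spring deflection in a series
    elastic actuator: torque = stiffness * (motor - joint) angle.
    Angles in radians, stiffness in N*m/rad; values illustrative."""
    return stiffness * (motor_angle - joint_angle)

def react(torque, limit=5.0):
    """Toy safety policy: keep moving under the limit, else stop."""
    return "stop" if abs(torque) > limit else "continue"

# In free motion the spring barely deflects; unexpected contact with
# a person or object bends it much further.
free_motion = sea_torque(0.50, 0.49, stiffness=100.0)
contact = sea_torque(0.50, 0.40, stiffness=100.0)
```

A real controller would do more than stop — it might reverse along the recorded path — but the core signal is the same: an unexpectedly large deflection means an unexpected force.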
Developed by Matt Williamson, technology director at Rethink, and his Massachusetts Institute of Technology professor at the time, Gill Pratt, the “nerves” play a role when it comes time for a human worker to train Baxter: Workers are able to physically move his limbs to show him what to do—he nods when he understands the task—rather than write lines of code to instruct Baxter.
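Teaching by moving the arm — often called kinesthetic teaching or programming by demonstration — boils down to recording joint poses as the worker guides the limb, then replaying them. This sketch is a bare-bones version of that idea, with invented names; it is not Rethink's training software.

```python
class DemoTrainer:
    """Sketch of teaching-by-demonstration: record joint poses while
    a worker moves the arm, then replay them. Names are illustrative."""

    def __init__(self):
        self.waypoints = []

    def record(self, joint_angles):
        """Store a snapshot of the arm's joint angles (radians)."""
        self.waypoints.append(tuple(joint_angles))

    def replay(self):
        """Return the recorded poses, in order, for playback."""
        return list(self.waypoints)

trainer = DemoTrainer()
trainer.record([0.0, 0.5, 1.0])   # worker positions the arm over a bin
trainer.record([0.2, 0.4, 0.9])   # ...then over the conveyor
```

Because the compliant "nerves" let the arm be pushed around safely, the robot can be taught this way on the shop floor, with no code written at all.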
Hands
Lift a large object. Change tools. Clean the surface. Attach the drill bit. The KR Quantec, made by Kuka AG of Augsburg, Germany, is able to switch “hands”—that is, the tool on the end of its limb—depending on the task. It is used on automobile production lines by Tesla, Audi, Mercedes-Benz and General Motors.
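Matching the tool to the task is, at its simplest, a lookup followed by a change only when the mounted tool doesn't fit. The task and tool names below are invented for this sketch and have nothing to do with Kuka's actual interface.

```python
# Illustrative task-to-tool lookup for an automatic tool changer;
# names are invented for this sketch, not Kuka's control software.
TOOL_FOR_TASK = {
    "lift": "gripper",
    "clean": "brush",
    "drill": "drill chuck",
}

def pick_tool(task, current_tool):
    """Return the tool to mount, keeping the current one if it fits."""
    needed = TOOL_FOR_TASK.get(task)
    if needed is None:
        raise ValueError(f"no tool registered for task: {task}")
    return current_tool if current_tool == needed else needed
```

Skipping the swap when the right tool is already mounted matters on a production line, where every tool change costs cycle time.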
—Georgia