Washington Engineer - October 2005
It’s a bird. It’s a plane. It’s a UW exoskeleton
- Jacob Rosen is developing a powered exoskeleton that could assist the disabled in dealing with the tasks of everyday life.
Strapped into the sleek superstructure of his robotic exoskeleton arm, Jacob Rosen looks like a recruit in a Robert Heinlein novel, a soldier of the future who dons a mechanized suit to augment his strength and prowess for fighting otherworldly foes.
But Rosen, a research assistant professor of electrical engineering at the University of Washington, has more down-to-earth goals.
“This research will provide a tool and a fundamental understanding for the development of assistive technology for improving the quality of life for individuals in the disabled community,” said Rosen, who, with graduate students Joel Perry and Levi Miller, both in mechanical engineering, and Bobby Davis in electrical engineering, recently finished the right arm of a powered exoskeleton. The left arm is currently being built.
The project, funded by the National Science Foundation, has been three years in the works. The intent, according to Rosen, is to establish a “bio-port,” or a human/machine interface, and design the robotic exoskeleton so that it functions as a natural extension of the human body. The end effect will be to combine human and machine to enhance the user’s abilities – mostly muscular strength and control.
Humans possess naturally developed algorithms to precisely control and integrate movement, but are limited in strength, Rosen explained. Robotic devices, on the other hand, can perform tasks requiring phenomenal raw power, but can’t adapt to complicated and often unforeseen conditions in the same way that people can.
“The solution would seem to be combining these two entities, the human and robot, into one integrated system under the control of the human,” Rosen said.
The exoskeleton robot would be worn by the user, like a mechanical suit. The joints and links correspond to those of the human body. The actuators share the external load with the operator.
One of the key innovations of Rosen’s system, and a major focus of his research, lies in using the body’s own neural signals as the primary command signals that control the exoskeleton. Those signals would be detected through electrodes placed on the user’s skin.
The neural signals are processed by artificial muscles, called myoprocessors, that are at the core of the system. The myoprocessors, serving as mechanical counterparts to human muscles, take advantage of the small delay between detection of the signals and the actual contraction of the user’s muscles. They also take joint position, angular velocity and joint torque into account, and feed this information to the exoskeleton system.
By the time the operator’s muscles begin to contract, the exoskeleton is ready to perform and can move in tandem with the operator.
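The flavor of that pipeline can be sketched in a few lines of Python. The sketch below is purely illustrative: a simplified Hill-type muscle model stands in for the lab’s myoprocessors, and every function name, gain and constant in it is a hypothetical placeholder rather than something drawn from the UW system. It filters a burst of surface EMG into an activation level, predicts a joint torque from that activation plus joint position and velocity, and hands the actuator a share of the predicted load.

import numpy as np

# Illustrative EMG-driven assist loop. A simplified Hill-type muscle model
# stands in for the myoprocessors; all names, gains and constants here are
# assumptions for the sketch, not values from the UW exoskeleton.

def emg_to_activation(emg, dt=0.001, tau=0.05):
    """Rectify and low-pass filter raw surface EMG into a 0..1 activation level."""
    rectified = np.abs(emg) / (np.max(np.abs(emg)) + 1e-9)   # normalize
    activation = np.zeros_like(rectified)
    alpha = dt / (tau + dt)                                   # first-order filter coefficient
    for i in range(1, len(rectified)):
        activation[i] = activation[i - 1] + alpha * (rectified[i] - activation[i - 1])
    return activation

def myoprocessor_torque(activation, joint_angle, joint_velocity, max_torque=40.0):
    """Toy muscle model: torque from activation, joint position and angular velocity."""
    angle_factor = np.cos(joint_angle - np.pi / 2)                    # weaker away from mid-range
    velocity_factor = np.clip(1.0 - 0.3 * joint_velocity, 0.0, 1.0)   # weaker when shortening fast
    return max_torque * activation * angle_factor * velocity_factor

def exoskeleton_command(predicted_torque, assist_ratio=0.7):
    """The actuator carries a share of the predicted load; the wearer carries the rest."""
    return assist_ratio * predicted_torque

# Example: simulated EMG that switches on at 0.2 s. In practice the electrical
# signal precedes the muscle's mechanical contraction, giving the controller a head start.
t = np.arange(0.0, 1.0, 0.001)
emg = np.random.randn(len(t)) * (t > 0.2)
activation = emg_to_activation(emg)
torque = myoprocessor_torque(activation, joint_angle=np.pi / 2, joint_velocity=0.0)
motor = exoskeleton_command(torque)
print(f"peak predicted torque: {torque.max():.1f} N m, motor share: {motor.max():.1f} N m")

Because the electrical signal arrives before the muscle’s mechanical contraction, a controller of this shape gets a small head start in which to ready the actuator.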
“That will allow natural control of the device as an extension of the body,” Rosen said.
The long-term idea, Rosen said, is to develop a device that can help people who have suffered physical setbacks deal with the tasks involved in everyday living.
“This could help people who have suffered various neurological disabilities, like stroke, spinal cord injury, muscular dystrophy and other neurodegenerative disorders,” Rosen said.
Smarter shoes? It’s all in the flex
- Wei-Chih Wang shows his flexible photonic sensors, which could be used to create a new generation of highly adaptable smart devices, from shoes to prosthetic limbs.
Imagine taking your morning jaunt in running shoes smart enough to measure the exact forces on your feet, then recommend how to counteract those stresses so you can avoid injury.
Or beds that can monitor normal and shear stresses on the bodies of bedridden patients, then automatically adjust to avoid bed sores. Or a prosthetic limb that keeps close tabs on the contact area between patient and device, changing as needed to prevent skin ulcers.
Such futuristic devices may be just around the corner, according to Wei-Chih Wang, a research assistant professor in the Department of Mechanical Engineering who is creating sensors that can mold to their environment and give a detailed overall picture of stress areas.
The key is flexibility, Wang said. Not just in the creative process, but in the material itself.
“We are using a nano-fabricated material,” Wang said, holding aloft two palm-size circles that droop and flex as if they were made of latex rubber. “They are completely flexible. Nobody else is doing flexible sensors.”
Wang and his colleagues’ most recent research on the flexible sensors was presented earlier this year at the 2005 SPIE NDE Health Monitoring and Diagnostics conference.
The sensors, which can detect both pressure and shear forces, consist of arrays of optical waveguides, or conduits for light. The waveguides are set in perpendicular rows and columns, separated by elastomeric pads. When pressure is applied, the waveguides bend. By sensing how they bend in relation to one another, scientists can construct a map of the normal and shear stresses that come into play.
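To make the geometry concrete, here is a toy Python reconstruction built on stated assumptions: each row and column waveguide is assumed to report a single transmission loss that grows with the pressure at the crossings it passes over, the per-crossing normal pressure is estimated by back-projecting those readings, and in-plane shear is approximated by the gradient of that map. The function names and the forward model are hypothetical; this is not the reconstruction method used in Wang’s sensors.

import numpy as np

# Toy reconstruction for a crossed-waveguide touch sensor. The forward model
# (each waveguide reports one loss value proportional to the pressure summed
# along its length) and the reconstruction are assumptions for illustration,
# not the method used in the actual flexible sensors.

def simulate_readings(pressure_map):
    """Forward model: each row/column waveguide integrates pressure along its length."""
    row_loss = pressure_map.sum(axis=1)   # one reading per row waveguide
    col_loss = pressure_map.sum(axis=0)   # one reading per column waveguide
    return row_loss, col_loss

def reconstruct(row_loss, col_loss):
    """Estimate per-crossing normal pressure; approximate shear by its in-plane gradient."""
    total = row_loss.sum() + 1e-9
    normal = np.outer(row_loss, col_loss) / total   # back-projection estimate
    shear_y, shear_x = np.gradient(normal)          # crude shear proxy
    return normal, shear_x, shear_y

# Example: a 4x4 patch of the sensor with a concentrated load near the middle.
truth = np.zeros((4, 4))
truth[1, 2] = 5.0
truth[1, 1] = 2.0
rows, cols = simulate_readings(truth)
normal, shear_x, shear_y = reconstruct(rows, cols)
print(np.round(normal, 2))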
And since the waveguides are optical – they use light instead of electricity – there is no danger of electrical interference.
Next steps include exploring various ways to adapt the sensors for medical diagnostics and interventions, creating smart devices that help protect their users.
“The type of information we can get from these sensors has the potential to drive actuators and smart materials to respond to varying pressures on an object,” Wang said. “That would create truly intelligent designs.”