The Embedded Systems Conference, along with chowder (pronounced chow-dah) and Fenway Park, is one of the better reasons for visiting Boston. Traditionally, the first day of the show is reserved for half-day tutorials, while the main block of conference sessions, lectures and keynote addresses kicks off on Day Two.
The lead-off keynoter this year was Dr. Hugh Herr, an associate professor at MIT. When he was 17, Herr suffered a rock-climbing accident that resulted in the amputation of both legs below the knee. His dissatisfaction with conventional prostheses led him to a career developing technology-enhanced artificial limbs.
Today Herr directs the Biomechatronics group at the MIT Media Lab. His keynote focused on the science of biomechatronics, which combines human function with sensing, computation and mechanical activation, and promises to accelerate the merging of body and machine. Biomechatronics research encompasses the disciplines of biology, mechanics, materials science and “even art and design,” Herr told the embedded systems design audience.
Having heard Prof. Herr deliver a talk earlier this year at the Freescale Technology Forum (see my previous blog), his mid-presentation show-and-tell moment, in which he reveals the titanium prosthetic lower legs containing a battery, five motors, three internal microprocessors and 12 sensors to closely simulate natural biological movement, lacked for me the original “wow” factor of seeing it for the first time. Nonetheless, watching Dr. Herr walk around the stage with a natural gait and none of the forced, jerky movements typical of a mechanical prosthetic is inspiring even after multiple viewings.
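For readers who think in code, here is a rough sketch of the kind of sense-compute-actuate loop such a powered prosthesis implies: sample the joint, infer the gait phase, command motor torque. Every function name, threshold and gait rule below is a hypothetical stand-in, not iWalk’s actual firmware.

```python
# Illustrative only: a highly simplified control loop of the kind an actively
# powered ankle prosthesis might run. Sensor models, thresholds and the torque
# rule are assumptions made for the sketch, not real product behavior.

import math
import random


def read_sensors(t):
    """Pretend to sample the ankle joint; real hardware would read many sensors."""
    angle = 10.0 * math.sin(2.0 * math.pi * t)           # ankle angle, degrees
    load = max(0.0, math.sin(2.0 * math.pi * t)) * 800   # heel load, newtons
    return angle + random.gauss(0, 0.1), load


def gait_phase(load):
    """Crude stance/swing detection from heel load alone (assumed threshold)."""
    return "stance" if load > 50.0 else "swing"


def motor_torque(phase, angle):
    """Push off during stance, stay lightly spring-like in swing (assumption)."""
    if phase == "stance":
        return -8.0 * angle   # proportional push-off torque, N*m
    return -0.5 * angle       # light assist while the foot is in the air


if __name__ == "__main__":
    for step in range(5):
        t = step * 0.02       # 50 Hz control loop
        angle, load = read_sensors(t)
        phase = gait_phase(load)
        print(f"t={t:.2f}s phase={phase:6s} torque={motor_torque(phase, angle):6.1f} N*m")
```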
Herr puts the market for active prosthetics, which rely on mechanical parts controlled by computers, at approximately $1 billion. His company, iWalk, expects to market the biomechatronic legs next year, priced in the $30K-$50K range.
In his ESC Boston keynote address the researcher also offered examples of other biomechatronic projects, including two taking place at MIT.
One of the conditions associated with autism (an estimated 1 million to 1.5 million Americans have the disorder) is sometimes called “mind blindness”: the inability to read how other people are reacting and to understand their emotions. As a result, people with autism often fail to notice when they are being repetitive, boring or confusing to their listeners.
PeopleSense, a project under the direction of Prof. Rosalind Picard and Rana el Kaliouby of the MIT Media Lab, develops technology that can recognize facial expressions and emotions and uses that information to augment human-to-human interaction. The result is a wearable device that gives people with autism, or anyone else who has trouble reading emotions, the opportunity to go out into the real world and learn about the emotions and facial expressions of the people they usually interact with.
The "emotional social intelligence prosthetic" device consists of a camera small enough to be pinned to the side of a pair of glasses, connected to a hand-held computer running image recognition software. PeopleSense will alert its user if the person they are talking to starts showing signs of getting bored or annoyed.
Nexi, the first of a new class of robots also being developed at MIT’s Media Lab, is a small mobile humanoid that possesses a novel combination of mobility, dexterity, and human-centric communication and interaction abilities. Nexi will eventually be able to move around on wheels and pick up objects, but its most striking feature is its humanlike face, which can express a wide range of emotions.
To do so, Nexi’s head moves at speeds comparable to a human’s and is capable of human head gestures such as nodding and shaking. The 15-degree-of-freedom face includes gaze, eyebrows, eyelids and an articulated mandible to support a diverse range of facial expressions and expressive posturing. The robot’s neck mechanism has four degrees of freedom, supporting a lower bend at the base of the neck as well as pan, tilt and yaw of the head.
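To make the degrees-of-freedom description concrete, a head like this might be modeled in software along the lines of the sketch below, with a four-joint neck and a handful of facial joints. The names, ranges and nod trajectory are illustrative assumptions, not the Media Lab’s actual control interface.

```python
# Sketch of representing head and face degrees of freedom as a pose record,
# plus a toy keyframe generator for a nod gesture. All values are hypothetical.

from dataclasses import dataclass, replace


@dataclass
class HeadPose:
    # 4-DOF neck: a lower bend at the base plus pan-tilt-yaw of the head (degrees)
    neck_bend: float = 0.0
    pan: float = 0.0
    tilt: float = 0.0
    yaw: float = 0.0
    # a few of the facial degrees of freedom, as examples
    eyebrows: float = 0.0
    eyelids: float = 1.0   # 1.0 = fully open
    jaw: float = 0.0       # articulated mandible opening


def nod(pose: HeadPose, amplitude: float = 15.0):
    """Yield tilt keyframes for a simple nod gesture (hypothetical trajectory)."""
    for target in (amplitude, -amplitude / 2, 0.0):
        yield replace(pose, tilt=target)


if __name__ == "__main__":
    for keyframe in nod(HeadPose()):
        print(keyframe)
```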
Immediately following Prof. Herr’s presentation was the first of a series of industry talks at ESC Boston, given by Kevin Dallas, General Manager of Microsoft Windows Embedded. Dallas covered the impact of the cloud on the embedded device market, specifically describing how Windows Embedded software and services platforms enable engineers to take advantage of the new revenue opportunities it presents.
As for what the cloud is, Dallas defined it as an approach to computing that operates at internet scale and offers connectivity to a variety of devices and endpoints. He noted that the server side can take several forms: it can be embedded into or near a product, made available to an enterprise through its IT department, or located entirely off-site as a shared (public) or dedicated (private) server.
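Expressed as code, the deployment spectrum Dallas described might look like the simple enumeration below; the class and option names are illustrative and not part of any Windows Embedded API.

```python
# Sketch of the server deployment options mentioned in the talk, written as a
# plain enumeration a connected device's configuration might choose from.

from enum import Enum


class ServerDeployment(Enum):
    ON_DEVICE = "embedded into or near the product"
    ENTERPRISE = "run by the customer's own IT department"
    PUBLIC_CLOUD = "off-site, shared (public) infrastructure"
    PRIVATE_CLOUD = "off-site, dedicated (private) infrastructure"


def describe(deployment: ServerDeployment) -> str:
    return f"{deployment.name}: {deployment.value}"


if __name__ == "__main__":
    for option in ServerDeployment:
        print(describe(option))
```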