Engineering students spend summer teaching FLUFFEMS how to see
© Oct. 23, 2009, Norwich University Office of Communications
Electronics engineering student Will Biasotti compared the challenge of creating a robot that follows directions reliably to walking: a single motion requires a complex interaction of many factors, all of which must execute flawlessly.
“Everything looks simple,” he said. “It’s actually really not.”
Biasotti and mechanical engineering student Rob Burnham spent the better part of the summer of 2009 making a simple robot perform a very complex task: recognizing and responding to visual cues. Rather than following a set of programmed commands, their robot reacts based on the information it receives through a simple USB camera purchased at Radio Shack. It’s a project with challenges similar to those of guiding NASA robots that operate autonomously on distant planets.
“The fact that it actually uses the camera to see makes [the project] unique,” said Burnham, a senior from Pepperell, Mass.
Funded through Norwich University’s summer research program, their efforts were the next step in a series of projects to help engineering students learn skills by building robots. To reach their goal of having a mobile robot navigate a simple course by sight, Burnham developed a program that allows the computer to recognize images and patterns. Biasotti was more involved on the electronics end, installing circuit boards and sensors and “teaching” the vehicle to react to commands from the computer.
Their system consisted of a small laptop computer resting on a mobile chassis built from a Vex Robotic Design System kit, similar to an erector set. For the final test, they programmed the robot, called the Flat Land Unmanned Flare-Finding Electronic Machine System, or “FLUFFEMS,” to think its way through a course of blue tape laid on the floor, recognizing containers and a red “flare” target representing the final goal.
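The article doesn’t show the students’ actual code, but the color-cue idea behind the course can be sketched in pure Python: classify each camera pixel by its dominant color channel, then find the horizontal center of the blue “tape” pixels so a steering routine could aim the robot at it. The function names, thresholds, and toy frame below are illustrative assumptions, not the project’s implementation.

```python
# Hypothetical sketch of color-cue detection: label pixels by dominant
# channel, then locate the mean column of the blue "tape" pixels.
# Names and thresholds are assumptions, not the students' actual code.

def classify_pixel(r, g, b, margin=40):
    """Label a pixel 'blue', 'red', or 'other' by channel dominance."""
    if b > r + margin and b > g + margin:
        return "blue"
    if r > g + margin and r > b + margin:
        return "red"
    return "other"

def tape_center(frame):
    """Return the mean column index of blue pixels, or None if no tape seen.

    `frame` is a list of rows; each row is a list of (r, g, b) tuples.
    """
    cols = [
        x
        for row in frame
        for x, (r, g, b) in enumerate(row)
        if classify_pixel(r, g, b) == "blue"
    ]
    return sum(cols) / len(cols) if cols else None

# Tiny synthetic 3x5 frame: blue tape around columns 1-2, a red "flare"
# at column 4, gray floor elsewhere.
BLUE, RED, GRAY = (10, 10, 200), (200, 10, 10), (90, 90, 90)
frame = [
    [GRAY, BLUE, BLUE, GRAY, RED],
    [GRAY, BLUE, BLUE, GRAY, GRAY],
    [GRAY, GRAY, BLUE, GRAY, GRAY],
]
print(tape_center(frame))  # mean of columns [1, 2, 1, 2, 2] -> 1.6
```

A real steering loop would compare this center against the middle of the frame and turn toward whichever side the tape drifts to.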
Both said they expected the programming to be the biggest challenge, and were surprised to spend most of their time on mechanical issues and on gearing the robot’s motors to react the way they wanted. The hardest part of the programming, according to Burnham, was taking the matrix of digital pixels the camera captures and comparing it to the contours of lines, shapes and edges of objects represented in flashcard pictures he scanned into a database.
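One simple way to frame the matching problem Burnham describes, reducing the camera’s pixel matrix to shapes and comparing it against a stored database, is a template score: binarize both images and count agreeing cells. This is only a minimal sketch under that assumption; the toy templates and function names are illustrative, not the project’s actual database.

```python
# Hypothetical sketch of template matching: score a binarized camera
# image against stored "flashcard" silhouettes by the fraction of
# agreeing cells. Templates and names are illustrative assumptions.

def match_score(image, template):
    """Fraction of cells where a binary image and template agree."""
    rows, cols = len(image), len(image[0])
    hits = sum(
        1
        for y in range(rows)
        for x in range(cols)
        if image[y][x] == template[y][x]
    )
    return hits / (rows * cols)

def best_match(image, templates):
    """Return the name of the stored shape most similar to `image`."""
    return max(templates, key=lambda name: match_score(image, templates[name]))

# Two toy 4x4 silhouettes: a hollow "container" box and an "L" shape.
templates = {
    "container": [[1, 1, 1, 1],
                  [1, 0, 0, 1],
                  [1, 0, 0, 1],
                  [1, 1, 1, 1]],
    "L-shape":   [[1, 0, 0, 0],
                  [1, 0, 0, 0],
                  [1, 0, 0, 0],
                  [1, 1, 1, 1]],
}

# A noisy camera view that is closer to the box than to the L.
seen = [[1, 1, 1, 1],
        [1, 0, 0, 1],
        [1, 0, 1, 1],
        [1, 1, 1, 1]]
print(best_match(seen, templates))  # -> container
```

Real systems are far more robust than this (they tolerate rotation, scale, and lighting changes), which is exactly why getting a computer to “recognize a chair” is hard.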
“How do you get a computer to recognize a chair?” he said. “A lot of it is just thinking outside the box.”
Burnham and Biasotti will continue to take their sight recognition research to new places during their senior year, with a robot that’s speedier, more reliable, and programmed to recognize hand signals. Ron Lessard, Norwich computer and electrical engineering professor, said he thought this was a largely untraveled path for communicating with a robot. There are many robots able to take in visual information, he said, but it’s a much more complex task for a computer to perceive and analyze this data.
“I don’t know anyone who’s doing it,” he said.
FLUFFEMS has been drafted for use in Lessard’s embedded systems course, in which engineering students learn to write programs for the robot to follow. In addition, he said, Norwich’s students are working on a robotic mountain dulcimer, a simple, three-stringed instrument that will play the signature song of Norwich, the country’s oldest private military college. Students also regularly compete in an international contest to produce the best unmanned underwater robot.
Following his 2010 graduation, Burnham would like to work for iRobot, a Massachusetts company that pioneered the robotic vacuum cleaner. Biasotti said he’s interested in working for Applied Research Associates, a technology engineering company with a facility in Randolph, Vt., not far from Norwich’s campus. Both said it was a privilege to spend their summer working on a project like this one.
“It was frustrating at times, but it was definitely satisfying,” said Biasotti. “It definitely piqued my interest to learn more.”