Use Of A Low Cost Camera Based Positioning System In A First Year Engineering Cornerstone Design Project
Author(s) -
Michael Vernier,
Craig Morin,
Patrick M. Wensing,
Ryan Hartlage,
Barbara Carruthers,
Richard Freuler
Publication year - 2009
Publication title -
2009 Annual Conference and Exposition Proceedings
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--5632
Subject(s) - computer science , robot , global positioning system , cornerstone , autonomy , human–computer interaction , systems design , beacon , positioning system , systems engineering , simulation , artificial intelligence , engineering , real time computing , software engineering , telecommunications
Although the concept of autonomous robot design projects has existed in engineering education for years as a tool for giving engineering students hands-on experience, in practice the autonomy of these projects has been limited by cost. Student programmers participating in these projects often have limited ways for their robots to sense the environment, relying on low-cost sensors such as touch sensors rather than a high-cost camera-based positioning system. This not only limits the autonomy of the robot but also deprives students of valuable design and programming experience, since professional engineers often have access to tools such as the Global Positioning System (GPS). Thus, it was desirable to build a low-cost positioning system that could track the movements of student-built robots and transmit position and orientation information to them. Such a system would not only increase the autonomy of the student-built robots but also give students hands-on experience interacting with a positioning system, experience that would transfer to work on autonomous vehicles in industry. The system was built using Nintendo Wii remotes as infrared cameras to track high-intensity infrared beacons mounted on the student robots. Each Wii remote tracked up to four LED locations in its field of view and conveyed that information to a C application running on a Linux machine. This application packaged the information and transmitted it to a Microsoft Visual Studio C# library, which grouped LED locations into robot positions. A National Instruments LabVIEW application then transmitted the resulting position and orientation information to the students' autonomous robots via radio frequency communications. The system performed well and significantly improved the design experience for the students involved.
Accelerating Research
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom