Abstract:
eyeRobot is a prototype device that guides blind and visually impaired users through cluttered and populated environments. Built on a Roomba base, it combines the simplicity of the traditional white cane with the instincts of a seeing-eye dog. The user indicates his or her desired motion by intuitively pushing and twisting the handle. The robot takes this information and finds a clear path down a hallway or across a room, using sonar to steer the user in an appropriate direction around static and dynamic obstacles. The user then follows behind the robot as it guides them in the desired direction by the force felt through the handle. This robotic cane requires little training: push to go, pull to stop, twist to turn. The foresight the rangefinders provide is similar to that of a guide dog, and is a major advantage over the constant trial and error that marks use of the white cane. Yet eyeRobot remains a far cheaper alternative than guide dogs, which cost over $12,000 and are useful for only 5 years, whereas this prototype was built for well under $400. It is also a relatively simple machine, requiring a few inexpensive sensors, assorted potentiometers, some hardware, and, of course, a Roomba, which keeps it all the more cost effective.
Step 1: Operation overview
User Control:
The operation of eyeRobot is designed to be as intuitive as possible to greatly reduce or eliminate training. To begin moving, the user simply starts walking forward; a linear sensor at the base of the stick picks up this motion and starts the robot moving forward. Using this linear sensor, the robot then matches its speed to the desired speed of the user; eyeRobot will move as fast as the user wants to go. To indicate that a turn is desired, the user simply twists the handle, and if a turn is possible, the robot responds accordingly.
Robot Navigation:
When traveling in open space, eyeRobot will attempt to keep a straight path, detecting any obstacle that may impede the user, and guiding the user around that object and back onto the original path. In practice the user can naturally follow behind the robot with little conscious thought.
To navigate a hallway, the user should push the robot toward one of the walls on either side; upon acquiring a wall, the robot will begin to follow it, guiding the user down the hallway. When an intersection is reached, the user will feel the robot begin to turn and can choose, by twisting the handle, whether to turn down the new offshoot or continue on a straight path. In this way the robot is very much like the white cane: the user can feel the environment through the robot and use this information for global navigation.
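To make the control flow concrete, here is a minimal sketch in Python (the actual firmware is ZBasic, attached in Step 9) of how the cane inputs might be turned into wheel speeds. The function names, scaling constants, and speed limits are illustrative assumptions, not values from the real code.

# A hypothetical sketch of the eyeRobot cane-to-wheel mapping, in Python
# rather than the actual ZBasic firmware.  Constants are illustrative.

def speed_from_push(push_mm, max_speed=300):
    # Map cane displacement along the slide (mm) to a forward speed (mm/s).
    return max(0, min(max_speed, push_mm * 10))

def steering_from_twist(twist_deg, gain=2.0):
    # Map handle twist (degrees, positive = right) to a wheel-speed difference.
    return gain * twist_deg

def wheel_speeds(push_mm, twist_deg):
    speed = speed_from_push(push_mm)
    steer = steering_from_twist(twist_deg)
    # Positive steer turns right: speed up the left wheel, slow the right.
    return speed + steer, speed - steer   # (left, right) in mm/s

# Example: the user pushes the cane 20 mm forward and twists 5 degrees right.
print(wheel_speeds(20, 5))   # -> (210.0, 190.0)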
Step 2: Range Sensors
Ultrasonics:
The eyeRobot carries four ultrasonic rangefinders (MaxSonar EZ1). The ultrasonic sensors are positioned in an arc at the front of the robot to provide information about objects in front of and to the sides of the robot. They tell the robot the range to an object and help it find an open route around that object and back onto its original path.
IR Rangefinders:
The eyeRobot also carries two IR rangefinders (Sharp GP2Y0A02YK). They are positioned to face out 90 degrees to the right and left to aid the robot in wall following. They can also alert the robot to objects too close to its sides that the user might walk into.
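As a rough illustration of how these readings might be turned into distances, here is a Python sketch. The scale factors are ballpark figures inferred from the sensors' general behavior (the EZ1's analog output is roughly Vcc/512 volts per inch, and the Sharp output falls off roughly as the inverse of distance), not the calibration used in the attached code.

# Rough sensor-to-distance conversions (illustrative constants, not the
# calibration used in the actual eyeRobot code).

def sonar_inches(adc_volts, vcc=5.0):
    # The MaxSonar EZ1 analog output is roughly (Vcc / 512) volts per inch.
    return adc_volts / (vcc / 512.0)

def ir_cm(adc_volts):
    # The Sharp GP2Y0A02YK output falls off roughly as 1/distance over its
    # 20-150 cm range; the constant 60 is a ballpark fit, not a datasheet value.
    if adc_volts < 0.4:          # very low readings are out of range / unreliable
        return 150.0
    return min(150.0, max(20.0, 60.0 / adc_volts))

print(sonar_inches(0.49))   # about 50 inches
print(ir_cm(1.0))           # about 60 cm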
Step 3: Cane position sensors
Linear Sensor:
In order for the eyeRobot to match its speed to that of the user, it senses whether the user is pushing forward on the cane or holding it back. This is achieved by sliding the base of the cane along a track while a potentiometer senses the cane's position; the eyeRobot uses this input to regulate the robot's speed. The idea of adapting to the user's speed through a linear sensor was actually inspired by the family lawnmower.
The base of the cane is connected to a guide block moving along a rail. Attached to the guide block is a slide potentiometer that reads the position of the guide block and reports it to the processor. In order to allow the stick to rotate relative to the robot there is a rod running up through a block of wood, forming a rotating bearing. This bearing is then attached to a hinge to allow the stick to adjust to the height of the user.
Twist Sensor:
The twist sensor allows the user to twist on the handle to turn the robot. A potentiometer is attached to the end of one wooden shaft and the knob is inserted and glued into the upper part of the handle. The wires run down the dowel and feed the twist information into the processor.
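A minimal sketch of how the two potentiometer readings might be centered, scaled, and dead-banded before being used as motion commands. The ADC resolution, neutral readings, and dead band below are placeholder assumptions; the real values come from calibrating the actual hardware.

# Hypothetical scaling of the slide and twist potentiometers.  The 10-bit ADC
# range and the neutral readings below are placeholders for calibrated values.

ADC_MAX = 1023          # 10-bit ADC full scale
SLIDE_NEUTRAL = 300     # reading with the cane pulled all the way back
TWIST_NEUTRAL = 512     # reading with the handle untwisted
TWIST_DEADBAND = 30     # ignore small accidental twists

def slide_position(adc):
    # 0.0 = cane fully back (stop), 1.0 = cane fully forward (full speed).
    return max(0.0, min(1.0, (adc - SLIDE_NEUTRAL) / (ADC_MAX - SLIDE_NEUTRAL)))

def twist_command(adc):
    # Negative = turn left, positive = turn right, 0 inside the dead band.
    offset = adc - TWIST_NEUTRAL
    if abs(offset) < TWIST_DEADBAND:
        return 0.0
    return offset / (ADC_MAX / 2)

print(slide_position(660))   # about 0.5: the user is pushing at half travel
print(twist_command(700))    # about 0.37: a firm twist to the right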
Step 4: Processor
Processor:
The robot is controlled by a ZBasic ZX-24a sitting on a Robodyssey Advanced Motherboard II. The processor was chosen for its speed, ease of use, affordable cost, and 8 analog inputs. It is connected to a large prototyping breadboard to allow quick and easy changes. All power for the robot comes from the power supply on the motherboard. The ZBasic communicates with the Roomba through the cargo bay port and has full control over the Roomba's sensors and motors.
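The cargo bay connector exposes the Create's Open Interface over a serial link, so driving the robot comes down to sending a handful of opcode bytes. Here is a minimal PC-side sketch using Python and pyserial to illustrate the idea; the ZX-24a does the equivalent from ZBasic over its UART, and the port name and baud rate below are assumptions for a desktop test.

# Minimal PC-side sketch of driving the Create over its Open Interface with
# pyserial; the robot itself receives the same bytes from the ZBasic ZX-24a.
# The port name and baud rate are assumptions for a desktop test.
import serial
import struct
import time

ser = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)

ser.write(bytes([128]))   # START: open the Open Interface (passive mode)
ser.write(bytes([132]))   # FULL: take complete control of the actuators

def drive_direct(right_mm_s, left_mm_s):
    # Open Interface opcode 145: set each wheel velocity as a signed 16-bit value.
    ser.write(bytes([145]) + struct.pack(">hh", right_mm_s, left_mm_s))

drive_direct(200, 200)    # straight ahead at 200 mm/s
time.sleep(2)
drive_direct(0, 0)        # stop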
Step 5: Code Overview
Obstacle avoidance:
For obstacle avoidance the eyeRobot uses a method where objects near the robot exert a virtual force on the robot moving it away from the object. In other words, objects push the robot away from themselves. In my implementation, the virtual force exerted by an object is inversely proportional to distance squared, so the strength of the push increases as the object gets closer and creates a nonlinear response curve:
PushForce = ResponseMagnitudeConstant / Distance^2
The pushes coming from each sensor are added together (sensors on the left side push right, and vice versa) to get a vector for the robot's travel. Wheel speeds are then changed so the robot turns toward this vector. To ensure that objects dead ahead do not produce no response at all (because the forces on both sides balance), objects dead ahead push the robot toward the more open side. When the robot has passed the object, it uses the Roomba's encoders to correct for the change and get back onto the original vector.
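Here is a simplified Python sketch of that virtual-force calculation; the sensor bearings and the response constant are invented for illustration, and the dead-ahead tie-breaking and encoder correction described above are left out.

# Illustrative virtual-force obstacle avoidance.  Each sensor pushes the robot
# away from what it sees with strength proportional to 1/distance^2; sensors on
# the left push the steering right (positive) and sensors on the right push left.

RESPONSE_K = 5000.0   # the ResponseMagnitudeConstant (arbitrary for this sketch)

# Sonar bearings in degrees from straight ahead; negative = left of center.
SENSOR_BEARINGS = [-60, -20, 20, 60]

def steering_from_sonar(distances_cm):
    steer = 0.0
    for bearing, dist in zip(SENSOR_BEARINGS, distances_cm):
        dist = max(dist, 5.0)                 # guard against divide-by-near-zero
        push = RESPONSE_K / (dist * dist)     # inverse-square response
        # An obstacle on the left (negative bearing) pushes the robot right.
        steer += push if bearing < 0 else -push
    return steer

# An obstacle 40 cm away on the front-left, everything else clear at 200 cm:
print(steering_from_sonar([200, 40, 200, 200]))   # positive -> steer right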
Wall Following:
The principle of wall following is to maintain a desired distance from, and a parallel angle to, a wall. Issues arise when the robot is turned relative to the wall, because a single sensor then yields ambiguous range readings: readings are affected as much by the robot's angle to the wall as by its actual distance from the wall. To determine the angle and thus eliminate this variable, the robot needs two points of reference that can be compared. Because the eyeRobot has only one side-facing IR rangefinder, it obtains these two points by comparing the rangefinder's readings over time as the robot moves; it determines its angle from the difference between the two readings taken along the wall, and uses this information to correct for improper positioning. The robot goes into wall-following mode whenever it has a wall alongside it for a certain amount of time, and exits whenever there is an obstacle in its path that pushes it off course or the user twists the handle to bring the robot away from the wall.
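The angle estimate amounts to a small piece of trigonometry on two side-range readings taken a known travel distance apart. A hedged Python sketch follows; the gains and target distance are illustrative, and the filtering in the real code is omitted.

# Estimate the robot's angle to the wall from two successive side-IR readings
# taken a known travel distance apart, then steer to hold a target distance.
# Gains and the target distance are illustrative.
import math

def wall_angle_deg(range_prev_cm, range_now_cm, travel_cm):
    # Positive result means the robot is angling away from the wall.
    return math.degrees(math.atan2(range_now_cm - range_prev_cm, travel_cm))

def wall_follow_steer(range_now_cm, angle_deg, target_cm=40.0,
                      k_dist=0.5, k_angle=1.0):
    # Steer toward the target distance while trying to stay parallel to the wall.
    return k_dist * (target_cm - range_now_cm) - k_angle * angle_deg

angle = wall_angle_deg(40.0, 43.0, 10.0)        # drifted 3 cm away over 10 cm
print(round(angle, 1))                          # ~16.7 degrees away from the wall
print(round(wall_follow_steer(43.0, angle), 1)) # negative: correct back toward it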
Step 6: Parts List
Parts Required:
1x) Roomba Create
1x) Large sheet of acrylic
2x) Sharp GP2Y0A02YK IR rangefinder
4x) Maxsonar EZ1 ultrasonic rangefinders
1x) ZX-24a microprocessor
1x) Robodyssey Advanced Motherboard II
1x) Slide potentiometer
1x) Single turn potentiometer
1x) Linear bearing
1x) Solderless breadboard
Assorted hinges, dowels, screws, nuts, brackets, and wires
Step 7: Motivation and Improvement
Motivation:
This robot was designed to fill the obvious gap between the capable but expensive guide dog and the inexpensive but limited white cane. In the development of a marketable and more capable Robotic White Cane, the Roomba Create was the perfect vehicle for designing a quick prototype to see if the concept worked. In addition, the prizes would provide economic backing for the considerable expense of building a more capable robot.
Improvement:
The amount I learned building this robot was substantial, and here I will attempt to lay out what I have learned as I move on to build a second-generation robot:
1) Obstacle avoidance - I have learned a lot about real-time obstacle avoidance. In the process of building this robot I went through two completely different obstacle avoidance schemes, starting with the original object-force idea, then moving to the principle of finding and seeking the most open vector, and then moving back to the object-force idea with the key realization that the object response should be non-linear. In the future I will correct my mistake of not doing any online research on previously used methods before embarking on a project, as I'm now learning that a quick Google search would have yielded numerous great papers on the subject.
2) Design of the stick sensors - Beginning this project I thought my only option for a linear sensor was to use a slide pot and some sort of linear bearing. I now realize that a much simpler option would have been to simply attach the top of the rod to a joystick, such that pushing the stick forward would also push the joystick forward. In addition, a simple universal joint would allow the twist of the stick to be translated into the twist axis of many modern joysticks. This implementation would have been much simpler than the one I currently use.
3) Free-turning wheels - Although this would have been impossible with the Roomba, it now seems obvious that a robot with free-turning wheels would be ideal for this task. A robot that rolls passively would require no motors and a smaller battery, and thus be lighter. In addition, such a system requires no linear sensor to detect the user's push; the robot would simply roll at the user's speed. The robot could be turned by steering the wheels like a car, and if the user needed to be stopped, brakes could be added. For the next-generation eyeRobot I will certainly use this very different approach.
4) Two spaced sensors for wall following - As discussed earlier, problems arose when trying to wall follow with only one side-facing sensor, so it was necessary to move the robot between readings to obtain two different points of reference. Two sensors with a known distance between them would simplify wall following greatly (a small sketch of the math follows this list).
5) More sensors - Although this would have cost more money, it was difficult trying to code this robot with so few windows on the world outside the processor. A more complete sonar array would have made the navigation code much more powerful (but of course sensors cost money, which I didn't have at the time).
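For comparison with the single-sensor approach, here is how the wall angle falls directly out of two side-facing sensors mounted a fixed distance apart, with no motion needed between readings; the sensor spacing is a made-up figure.

# Wall angle from two side-facing rangefinders mounted a fixed distance apart
# along the robot's side; no motion between readings is needed.  The spacing
# below is a made-up figure.
import math

SENSOR_SPACING_CM = 15.0   # distance between the front and rear side sensors

def wall_angle_deg(front_cm, rear_cm):
    # Positive when the robot's nose is angled away from the wall.
    return math.degrees(math.atan2(front_cm - rear_cm, SENSOR_SPACING_CM))

print(round(wall_angle_deg(45.0, 40.0), 1))   # about 18.4 degrees off parallel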
Step 8: Conclusion
Conclusion:
The iRobot Create proved an ideal prototyping platform for experimenting with the concept of a robotic white cane. From the results of this prototype it is apparent that a robot of this type is indeed viable, and I hope to develop a second-generation robot from the lessons I have learned using the Roomba Create. In future versions of eyeRobot I envision a device capable of doing more than just guiding a person down a hallway: a robot that can be put in the hands of the blind for use in everyday life. With this robot, the user would simply speak their destination and the robot would guide them there without conscious effort on their part. This robot would be light and compact enough to be easily carried up stairs and tucked away in a closet. It would be able to do global navigation in addition to local, guiding the user from start to destination without the user's prior knowledge or experience. This capability would go well beyond even the guide dog, with GPS and more advanced sensors allowing the blind to freely navigate the world.
Nathaniel Barshay,
(Entered by Stephen Barshay)
(Special thanks to Jack Hitt for the Roomba Create)
Step 9: Construction and Coding
A few extraneous words on construction:
The deck is made from a piece of acrylic cut into a circle with an opening at the back to allow electronics access, and is screwed into the mounting holes beside the cargo bay. The prototyping board is screwed into the screw hole at the bottom of the bay. The ZBasic is mounted with an L bracket using the same screws as the deck. Each sonar is screwed into a piece of acrylic, which is in turn attached to an L bracket attached to the deck (the L brackets are bent back 10 degrees to give a better view). The track for the linear sensor is screwed right into the deck, and the slide pot is mounted with L brackets beside it. A more technical description of the construction of the linear sensor and control rod can be found in Step 3.
Code:
I have attached the full version of the robot's code. Over the course of an hour I have attempted to clean it up from the three or four generations of code that were in the file; it should be easy enough to follow now. If you have the ZBasic IDE it should be easy to view; if not, use Notepad, starting with the file main.bas and going through the other .bas files.
Download: Roomba code.zip (23 KB)