Mechanical four-bar and self-guiding robot
Robotics engineering 2001 and 2002
My robotics engineering major requires project-based courses that teach students how to build an efficient four-bar mechanism robot and a robot that can guide itself around a field using sensors.
One of the most important aspects of these courses is working with a group to design, test, and demo each robot. Teamwork is essential to bring projects like these to life, along with proper time management, collaboration, and skill.
When I work in a group, I tend to step into a leadership role, keeping the project on track and making sure each design step is done properly.
In addition, I used my SolidWorks skills to help create the four-bar mechanism with my teammate Jakub Jandus. I also had a heavy hand in writing the code and building the electronics for the self-guiding robot.
Here is the GitHub repo:
https://github.com/RBE-2002/final-project-team-17/releases/tag/group17-finalcode
The four-bar mechanism robot
To create the four-bar mechanism, my team used SolidWorks drawings to sketch the lines of action for the three main poses we wanted the four-bar to traverse. These informed our pivot points and four-bar lever arm lengths.
We designed the gearing at the correct ratio for our blue motor so we could achieve a desirable speed and torque to lift an aluminum plate, and we used a servo for the clamping mechanism. Every part was either 3D printed or laser cut from acrylic.
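As a rough illustration of the gearing math (the numbers here are placeholders, not our actual measurements), the required reduction can be estimated from the load torque, the motor's rated torque, and a safety factor:

```cpp
#include <cassert>

// Estimate the gear reduction needed to lift a load while keeping the
// motor well below stall. All values are illustrative placeholders.
double requiredGearRatio(double loadTorqueNm, double motorTorqueNm,
                         double safetyFactor) {
    return (loadTorqueNm * safetyFactor) / motorTorqueNm;
}
```

For example, a 1 Nm load on a 0.25 Nm motor with a 2.5x safety factor would call for a 10:1 reduction.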
The remainder of the project consisted of performing torque calculations and simulations and completing the code, which relied heavily on state machines to reach landmarks on the playing field. The goal was to remove and place aluminum plates on angled roofs.
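The state-machine pattern can be sketched roughly like this; the state names below are illustrative, and our actual code had more states and sensor-driven transitions:

```cpp
#include <cassert>

// Simplified sketch of a plate-handling state machine: each landmark or
// action on the field maps to a state. In the real code, transitions
// fired on sensor events rather than unconditionally.
enum class State {
    DriveToRoof, RaiseArm, GripPlate, LowerArm,
    DriveToDepot, ReleasePlate, Done
};

State next(State s) {
    switch (s) {
        case State::DriveToRoof:  return State::RaiseArm;
        case State::RaiseArm:     return State::GripPlate;
        case State::GripPlate:    return State::LowerArm;
        case State::LowerArm:     return State::DriveToDepot;
        case State::DriveToDepot: return State::ReleasePlate;
        default:                  return State::Done;
    }
}
```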
The self-guiding robot
The goal of the self-guiding robot project was to use a camera, sensors, and an IMU to traverse a playing field. Each robot would publish information to a Wi-Fi broker, and another robot would read from the broker and use that data for its own task.
In total, there were three robots on the playing field. Robot one would wall-follow to a button fitted with an IR emitter that the robot could sense, then publish its final position to the Wi-Fi broker. Once the button was pressed, a hidden AprilTag would be revealed to robot two.
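Robot one's position publish could look something like the sketch below; the comma-separated payload format is an assumption, not the project's actual wire format:

```cpp
#include <cassert>
#include <string>

// Hypothetical serialization of robot one's final (x, y) position for
// the Wi-Fi broker. The real message format in our code may differ.
std::string positionPayload(int x, int y) {
    return std::to_string(x) + "," + std::to_string(y);
}
```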
Robot two would ascend a ramp and stop when there was a visible change in the accelerometer reading. It would read the AprilTag using an OpenMV camera and publish the tag ID to the Wi-Fi broker through an ESP32.
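Detecting that change at the top of the ramp might be sketched as a simple pitch-delta check between samples; the threshold and units here are assumptions, not our tuned values:

```cpp
#include <cassert>
#include <cmath>

// Returns true when the pitch reading changes sharply between samples,
// e.g. when the robot crests the ramp. The 10-degree threshold is an
// illustrative guess.
bool pitchChanged(double prevPitchDeg, double pitchDeg,
                  double thresholdDeg = 10.0) {
    return std::fabs(pitchDeg - prevPitchDeg) > thresholdDeg;
}
```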
Robot three would start moving once it had received robot one's position data and robot two's AprilTag data. It would wall-follow and position-find using forward and inverse kinematics to reach the coordinates provided by robot two. Once there, it would emit an IR code provided by robot one to a door. When the door opened, the robot would travel through and stop once the accelerometer detected that it had crashed.
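The forward-kinematics half of the position finding can be sketched for a differential drive; the pose conventions and wheelbase value below are assumptions for illustration:

```cpp
#include <cassert>
#include <cmath>

// Differential-drive forward kinematics: update the pose estimate from
// the distance each wheel traveled since the last update. The midpoint
// heading is a first-order approximation of the arc the robot drove.
struct Pose { double x, y, theta; };

Pose updatePose(Pose p, double dLeftM, double dRightM, double wheelbaseM) {
    double d      = (dLeftM + dRightM) / 2.0;        // center distance
    double dTheta = (dRightM - dLeftM) / wheelbaseM; // heading change
    p.x += d * std::cos(p.theta + dTheta / 2.0);
    p.y += d * std::sin(p.theta + dTheta / 2.0);
    p.theta += dTheta;
    return p;
}
```

Inverse kinematics runs the same relations the other way, turning a desired motion into left and right wheel distances.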