Teams - Hack & Build 2024
This year, we had 13 teams participating in the hackathon, with a wide range of projects and ideas. Some of them agreed to answer our questions during the hackathon.
Team 1
What does this project do?
Meet Blind, a robotic dog on wheels that’s not blind at all, but in fact equipped with a camera (supplied by a team member’s roommate) that allows it to perform localization tasks and map out its surroundings. The team sent Blind to do laps inside the Makerspace, and it was able to create a rendering of the room using RealityCapture.
What’s next for this project?
Incorporate optimization algorithms so Blind can go around faster and faster as it gains more knowledge of its surroundings. It currently also needs guidance as it navigates unfamiliar environments for the first time—some exploration algorithm would be useful.
Team 2
What motivated this project?
When elderly people fall, it quite often leads to serious repercussions, especially if help doesn’t arrive in time. To address that, Team 2 designed and built a fall detection robot.
How does your project help?
Using ultrasonic sensors, it follows the user and alerts people (currently by switching on a red light) when a fall is detected. Specifically, one ultrasonic sensor points diagonally upward while another points straight ahead; if the user falls over, the large difference between the two sensors’ distance readings triggers the alarm. Neat and effective.
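The fall-detection logic boils down to comparing the two sensors’ readings. Here’s a minimal sketch of that comparison; the threshold value and function names are our own assumptions, not the team’s actual code:

```python
def fall_detected(upper_cm, forward_cm, threshold_cm=60.0):
    """Return True when the diagonal-up sensor reads much farther than
    the forward sensor, suggesting the person being followed is no
    longer upright. threshold_cm is an assumed tuning value."""
    # Person upright: both sensors see them at roughly the same distance.
    # Person fallen: the upward sensor now sees past them.
    return (upper_cm - forward_cm) > threshold_cm
```

In practice you’d also want to debounce the readings (ultrasonic sensors are noisy), but the core idea really is this simple.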
Team 3
What does this project do?
It’s a speech-to-Braille converter! Here’s the sequence of steps that makes it happen: speech is recorded by a microphone and sent to a web-based word-processing unit, where it’s converted into words (powered by OpenAI’s Whisper); the words then trigger the corresponding motors in an array of vibration motors in what the team members have dubbed “a sock,” and the resulting vibrations convey the corresponding Braille characters to the user. What’s better? Everything is sent over WiFi—no connection cables necessary (aside from those needed for power, of course).
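The last step of the pipeline (turning a character into vibrations) can be sketched as a lookup into standard 6-dot Braille cells. The motor ordering and function names below are assumptions for illustration:

```python
# Standard 6-dot Braille cells for the first few letters (dots are
# numbered 1-3 down the left column, 4-6 down the right).
BRAILLE_DOTS = {
    "a": (1,),
    "b": (1, 2),
    "c": (1, 4),
    "d": (1, 4, 5),
}

def motor_states(letter):
    """Return six booleans, one per vibration motor in the "sock",
    True where the letter's Braille cell has a raised dot."""
    dots = BRAILLE_DOTS.get(letter.lower(), ())
    return [i + 1 in dots for i in range(6)]
```

For example, motor_states("b") activates only the first two motors, matching Braille’s two-dot “b” cell.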
Team 4
What does this project do?
It’s a cross between an expandable table and a self-balancing reaction wheel. The same mechanism that goes into an expandable table is leveraged to create the change in moment of inertia necessary to keep the wheel balanced. Math alert: a sizable number of geometric calculations went into this one.
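A back-of-the-envelope way to see why expanding the mechanism matters (a simplification of the team’s actual geometry, treating each sliding section as a point mass $m_i$ at radius $r_i$ from the axis):

```latex
I = \sum_i m_i r_i^2, \qquad L = I\omega = \text{const} \;\Rightarrow\; \omega' = \frac{I}{I'}\,\omega
```

Sliding the sections outward increases every $r_i$ and hence $I$; with angular momentum conserved, the spin rate changes in inverse proportion, which gives the controller a handle for balancing.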
What’s next for this project?
Upgrade to a more powerful motor and incorporate friction-reducing features in the expanding mechanism to ensure better self-balancing performance. That’s part of the engineering process, isn’t it?
Team 5
What does this project do?
It's a robotic hand controlled by a muscle sensor! As the muscle sensor worn on the user’s wrist detects muscle tension, a binary signal is sent to the hand, which then goes from an open state to a closed state, and as the user relaxes their muscles, the hand will then return to its original state.
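The open/close logic is essentially one bit of state plus a threshold. A minimal sketch, assuming an analog tension reading and made-up threshold values (the team’s sensor may already emit a binary signal):

```python
def update_hand(emg_value, closed, close_above=600, open_below=400):
    """One step of the hand controller. Hysteresis (two separate
    thresholds) keeps the hand from chattering when the muscle-tension
    reading hovers near a single cutoff. All values are assumptions."""
    if not closed and emg_value > close_above:
        return True   # tension detected: close the hand
    if closed and emg_value < open_below:
        return False  # muscles relaxed: reopen
    return closed     # otherwise hold the current state
```

Called in a loop with each new sensor reading, this returns the hand’s next state.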
What challenges did you face?
The muscle sensor proved finicky to work with: the team even tried creating their own electrodes using aluminum strips purchased from Morton Williams. One of our judges, Professor Kymissis, even offered to provide copper strips as an alternative, showcasing the iterative nature of engineering.
Team 6
What does this project do?
It’s a Strandbeest-inspired walking robot. What else needs to be said, other than that it’s awesome? It can go forward, backward, and turn according to the good old WASD inputs. To tackle the difficulties of 3D printing the complicated geometric shapes and joints, the team ran loads of printing simulations to find the perfect parameters, and what’s more, designed all the joints to be printed in place—no assembly needed for each individual leg!
Team 8
What does this project do?
It’s a human-tracking eyeball surrounded by LEDs that flash in sync with whatever music is played near it! To implement the human tracking, the team utilized OpenCV along with MediaPipe for the software, and installed a universal joint as well as a servo-powered linkage system for the hardware.
Team 9
What does this project do?
It’s a theremin! As the user holds out their hand (or a ruler from the Makerspace) above the ultrasonic sensor and changes the height difference, different sounds are played. The more musical expo attendees had a blast playing out their favorite tracks. What’s more, the device supports three different modes—discrete, discrete with hold, and continuous—to accommodate different music styles. Want it to sound more like the violin? Use the continuous mode.
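The mode switch described above amounts to whether the distance-to-pitch mapping is quantized. A hedged sketch (base frequency, scaling, and names are assumptions; the hold variant would additionally latch the last note when the hand leaves the beam):

```python
def distance_to_freq(distance_cm, mode="continuous",
                     base_hz=220.0, cm_per_semitone=2.0):
    """Map an ultrasonic distance reading to a frequency in Hz.
    'continuous' glides smoothly (violin-like); 'discrete' snaps to
    the nearest equal-temperament semitone."""
    semitones = distance_cm / cm_per_semitone
    if mode == "discrete":
        semitones = round(semitones)
    return base_hz * 2 ** (semitones / 12)
```

Raising a hand 24 cm in discrete mode, for instance, lands exactly one octave (12 semitones) above the base note.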
Team 10
What does this project do?
A tongue that dances to music, which a user can control with hand gestures! Using computer vision, the team built a program that recognizes which gesture (open or closed) is displayed by which hand, thereby controlling what beats should be played. Accordingly, the tongue (3D-printed in one go), driven by artificial tendons and leveraging the flexibility of the 3D print filament, dances along to the music.
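The gesture-to-beat mapping can be as simple as a lookup on the two hands’ open/closed states. The specific beat names here are invented for illustration; the computer-vision side (deciding open vs. closed per hand) is what the team’s program actually handles:

```python
# Each hand contributes one bit: (left_open, right_open) -> beat to play.
BEAT_MAP = {
    (False, False): "silence",
    (True,  False): "kick",
    (False, True):  "snare",
    (True,  True):  "kick+snare",
}

def select_beat(left_open, right_open):
    """Pick the beat for the tongue to dance to, given the gesture
    recognized for each hand."""
    return BEAT_MAP[(left_open, right_open)]
```
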
What challenges did you face?
The team chose TPU as the filament, which turned out not to be stiff enough for the purposes of the project—that’s part of the engineering process, and now we’ve learned something new.
Team 11
What does this project do?
It’s a 3D scanner! A camera mounted on a rotating arc captures images from multiple angles as the target object spins on a platform. These images are stitched together to create a detailed 3D rendering.
What’s next for this project?
To improve the quality of the rendering by creating a better lighting environment, the team duct-taped (as is quite often the case in prototyping) a Bank of America selfie ring light and a black screen to either end of the moving arc. Needless to say, the next step is to design and implement a more robust method of attachment.
Team 12
What does this project do?
This elegant device solves mazes by tilting the platform based on input from a gyroscope-equipped joystick. Three servo arms enable tilting in all directions. You simply can’t find anything more elegant to watch or more entertaining to play with.
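Tilting a platform on three servo arms comes down to intersecting the desired tilt plane with each arm’s position. A small-angle sketch, with all geometry values assumed rather than taken from the team’s build:

```python
import math

def arm_offsets(pitch_deg, roll_deg, arm_radius=5.0):
    """Vertical offset (same units as arm_radius) for each of three
    arms spaced 120 degrees apart, so the platform plane tilts by the
    requested pitch (about the y-axis) and roll (about the x-axis)."""
    tp = math.tan(math.radians(pitch_deg))
    tr = math.tan(math.radians(roll_deg))
    offsets = []
    for k in range(3):
        ang = math.radians(120 * k)
        x = arm_radius * math.cos(ang)
        y = arm_radius * math.sin(ang)
        offsets.append(x * tp + y * tr)  # z-height of the plane at (x, y)
    return offsets
```

Feeding the joystick’s pitch/roll into a mapping like this, then converting each offset to a servo angle, is one plausible way to drive the three arms.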