Presenting my Space Applications departmental project at International Space University SSP19
Hi! My name is Erin, and I’m an eccentric robotics builder. Online I’m known as “RobotGrrl”. My passion for robotics has led me to launch Robot Missions with a robot that collects plastic from beaches, attend International Space University, lead a Hackaday Dream Team spanning 3 continents to build an ocean ghost gear robot, learn electronics kit manufacturing at Evil Mad Science, and build a submarine in a 48-hour hackathon during Fab11 with an international team at MIT. Presently, alongside my part-time role as an electronics designer, I’m designing a wind-propelled robot for exploration on Mars and developing an augmented reality game for outdoor sensor nodes. I’m endlessly fascinated by space, robots, and the ways the two could improve Earth’s environment when combined.
Founded Robot Missions to solve environmental challenges using robots. Led the technical development of Bowie, the plastic-collecting robot for beaches. Deployed the robot on a beach in Ottawa as part of a pilot, and tested it at the Canadian Space Agency. Attracted the resources to bring the robot to its beta v1.0 stage, including government funding, teams, and sponsorship. I developed the robot’s design, electronics, firmware, autonomous navigation, and image classification AI.
VR for Space Applications
A combination of satellite imagery and on-the-ground sensor data in WebVR, using the Mapbox 3D API, aframe.io, and Bowie the robot’s sensor data. Developed for the Space Applications department project at ISU SSP19. It originated from the question: what would happen if we could step into the nature around us and see what is not visible to our eyes? I also made a demo of exploring different planets in VR, using Enceladus (a moon of Saturn). I developed the code, building on top of aframe.io and accessing Mapbox’s API.
Fast Transit to Mars AR
Robot WebVR Control
Situational awareness for robots can be tricky when they are driven farther away and operating in a dynamic environment. This project explored what it would be like to see a camera view from the robot in WebVR. I developed the systems architecture and the code, built on aframe.io, using Python scripts on a Raspberry Pi to send camera images to a server, with the microcontroller communicating with the Pi to send its data through MQTT.
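The Pi-side of that pipeline can be sketched roughly like this. This is a minimal illustration, not the project’s actual code: the `bowie/camera` topic name and the `encode_frame`/`decode_frame` helpers are my assumptions, and the real script would use a camera capture library plus an MQTT client such as paho-mqtt.

```python
# Hypothetical sketch: wrap a JPEG camera frame as a JSON payload
# so it can be published over MQTT and unpacked on the server side.
import base64
import json
import time

TOPIC = "bowie/camera"  # illustrative topic name, not from the project

def encode_frame(jpeg_bytes: bytes) -> str:
    """Base64-encode a JPEG frame and attach a timestamp."""
    return json.dumps({
        "ts": time.time(),
        "img": base64.b64encode(jpeg_bytes).decode("ascii"),
    })

def decode_frame(payload: str) -> bytes:
    """Reverse of encode_frame, for the server / WebVR side."""
    return base64.b64decode(json.loads(payload)["img"])

# With paho-mqtt installed, publishing would look something like:
#   client = mqtt.Client()
#   client.connect("broker.local")
#   client.publish(TOPIC, encode_frame(frame))
```

Base64-in-JSON keeps the payload text-safe for MQTT and easy to consume from browser JavaScript in the WebVR scene.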
RDAS and Tele-op Headband
Rapidly Deployable Automation System (RDAS) is a robot the size of a 1U CubeSat. It unfolds and deploys its wheeled locomotion with the goal of automating a task. The robot is controlled wirelessly via a tele-operational headband, which detects head movement. I developed this as my final project at Fab Academy; it included a custom PCB, an Arduino-derivative design using SMD components, and the designed pieces were fabricated at Evil Mad Scientist.
Team lead for Eja during the Hackaday Prize Dream Team in 2020. Working with collaborators in Venezuela and Nigeria, we prototyped a ropeless-fishing ghost gear robot, and I learned a lot about working together during a pandemic to develop a hardware-based solution. I worked on the design and prototyping of the water-bottle-sized robot. Our project was featured in Supplyframe’s highlights of 2020.
Present work-in-progress project. I’m developing a game that uses augmented reality to display realtime data from environmental sensor nodes. Gameplay revolves around sending elements to the nodes in return for points; the objective is to increase your level by completing as many ‘nodeports’ as possible. This increases the surface area of data logged, and effectively builds a better map of nonpoint source pollution.
During the Studio Y fellowship at MaRS Discovery District, a group of us applied systems thinking to understand how consumers might change their electricity behaviours, in partnership with HydroOne. I developed the design and electronics for Lumenbot, the lightswitch robot, which was deployed in multiple locations for use.
Botbait and the Space Fish
Have you ever imagined what jellyfish would do in space? This fluid dynamics simulation, combined with a robotic tentacle and virtual fish, shows the art of the possible when hardware and software are combined. Developed in Processing using the MSAFluid library. The serial component that I wrote automatically detects and connects to the robot, and transfers data between the two so that the fish react to the robot.
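The auto-detect step works by scanning the available serial ports for one that looks like the robot’s USB-serial adapter. The original component is written in Processing; this is a small Python sketch of the same idea, where `pick_robot_port` and the match keywords are my illustrative assumptions, not the project’s code.

```python
# Hypothetical sketch of serial-port auto-detection: given the
# (name, description) pairs a port scan returns, pick the first
# port that looks like the robot's USB-serial adapter.
def pick_robot_port(ports):
    """Return the first matching port name, or None if no port matches."""
    keywords = ("arduino", "usbmodem", "usbserial")
    for name, description in ports:
        haystack = (name + " " + description).lower()
        if any(keyword in haystack for keyword in keywords):
            return name
    return None
```

With pySerial, the `ports` list would come from `serial.tools.list_ports.comports()`; once a port is chosen, the component opens it and shuttles bytes between the simulation and the robot.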
This big wheel print explored what the Mars Science Laboratory Curiosity wheel could be like if it were a combination of materials, both rigid and flexible. The design was done in Autodesk Fusion 360, and the flexible portions are inlaid with the rigid portions, ensuring that the two materials are captured and attached securely. The print took 5.5 days and used rigid PLA and flexible TPU.
Mars Wind Tumbler Robot
Present work-in-progress project. Exploratory robots on Mars face the problems of wheels getting stuck and missions needing to avoid sand storms. This design addresses those problems by being primarily wind propelled. It is applicable to prospecting land for human exploration on Mars, when colonies of habitats are developed in a nomadic fashion.
Presentation of Robots
A recent presentation of my work which showcases all 23 iterations of Bowie the robot and the story behind it, presented at ISU SSP19. It’s a brief summary of some of my past projects.