Shared Overview Effect in Augmented Reality

Hi Planet Labs! Welcome to my portfolio, which also includes a demo of the idea I’m proposing for the Artist in Residence position. For my featured portfolio projects, please scroll down.

My idea addresses the disconnect that many are feeling right now due to isolation during the pandemic. Imagine being able to see perspectives of Earth all around us. With Planet Labs imagery and Augmented Reality, this is possible.

The experience it aims to instill is similar to the Overview Effect, which astronauts have described as changing their perspective of our planet and of humanity.

I hope you enjoy the sections below, which offer more information, as well as my prior portfolio pieces.

SŒAR Demo

Check out the live demo! To see the images, point your camera at these 4 markers. Built on the open-source WebVR framework aframe.io.

Compatible with Safari on iOS, Chrome on Android, and Firefox on desktop.

Project Concept and Motivation

Astronauts have expressed the profound impact the Overview Effect has had on them by shifting the way they view the world. It alters their awareness of our planet and their perspective on humanity. Their observations are infused with lessons of cooperation, environmental protection, and the sheer beauty of our planet.

Many of us have been more isolated than ever before throughout 2020 and 2021. There is a subtle similarity in this: the spaces we reside in serve as our own “spacecraft” as we travel through this period, knowing that after some duration of time we will reach our destination.


However, an unintended consequence of this time period may be a profound disconnect between people and the natural world for years to come. With Earth’s climate on a precipice, we need to encourage a stronger connection with the natural world for the planet’s future longevity.

 Canal in Italy during the lockdown. March 18, 2020. Image © 2020 Planet Labs.

These reflections lead to three guiding questions:

How could the Overview Effect change our perspective on the environment?

What if we could experience the Overview Effect from within our own rooms?

Would there be ways to feel connected to people at random, despite being far apart, while looking at Earth?

 Canal in Italy before the lockdown. Oct. 20, 2019. Image © 2019 Planet Labs.

Planet Labs’ mission of imaging the entire Earth every day is key to making this project come to life. By bringing this awareness of Earth in an immersive way, my hope is that participants will find a newfound sense of wonder about the natural beauty of the planet.


One possible outcome could be fostering a sense of personal ownership over environmental protection in local communities. Sharing the experience with someone at a completely different location will make large distances seem closer, as each participant sees the other’s view. No longer would the Overview Effect be reserved for astronauts; it would be shared between Earth’s inhabitants.

Karen Nyberg, a NASA astronaut, took in the view from the space station’s cupola in 2013. Image credit: NASA

SŒAR Project Components

The project would create portals using augmented reality for a shared experience of the Overview Effect. Participants can place markers around their room where the portals should appear – as if they are constructing their own personal ISS Cupola module. Using a web app on any device, participants scan the markers and see Earth from the vantage point of space. The imagery will not be solely from their location, but rather from someone they are randomly connected with. Planet Basemaps and Planet Monitoring would supply the imagery data. The AR engine would be the open-source aframe.io, originally developed at Mozilla and built on the browser WebXR APIs, allowing the broadest cross-device functionality.
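As a minimal sketch of the portal mechanic – assuming AR.js running on top of aframe.io for the marker tracking (a common pairing for marker-based web AR), and with an illustrative Basemaps tile URL whose exact endpoint, mosaic name, and API key would come from the Planet docs:

```ts
// Minimal sketch: an A-Frame component that textures a plane with a Planet
// basemap tile and reveals it when an AR.js marker is detected.
// The tile URL below is a placeholder pattern, not a confirmed endpoint.
declare const AFRAME: any; // provided globally by the aframe.io script tag

const TILE_URL =
  'https://tiles.planet.com/basemaps/v1/planet-tiles/MOSAIC_NAME/gmap/12/1205/1539.png?api_key=YOUR_KEY';

AFRAME.registerComponent('planet-portal', {
  init(this: any) {
    // The "portal": a plane carrying the satellite imagery.
    const portal = document.createElement('a-plane');
    portal.setAttribute('material', `src: url(${TILE_URL}); shader: flat`);
    portal.setAttribute('visible', 'false');
    this.el.appendChild(portal);

    // AR.js emits these events on <a-marker> entities.
    this.el.addEventListener('markerFound', () => portal.setAttribute('visible', 'true'));
    this.el.addEventListener('markerLost', () => portal.setAttribute('visible', 'false'));
  },
});

// Usage in the page's HTML – one <a-marker> per portal placed around the room:
//   <a-scene embedded arjs>
//     <a-marker preset="hiro" planet-portal></a-marker>
//     <a-entity camera></a-entity>
//   </a-scene>
```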

Screenshot from SŒAR demo

Connectivity

The two connected participants can communicate by tapping on the screen, producing auditory pings – quite the juxtaposition with the soundless vacuum of space. The position and tonality of the sounds will likely be discovered through experimentation during development; one idea is to relate them to some of the Planet analytic feeds. Poetic combinations of sound could emerge when playing together with the connected participant. On the technical side, this will be an interesting experiment in latency when using MQTTS to send and deliver the messages.
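A rough sketch of how the pings could travel, assuming the browser build of the MQTT.js library, a hypothetical broker URL, and a hypothetical topic shared by the two connected participants – a tap publishes a pitch value, and incoming messages are sonified with the Web Audio API:

```ts
// Sketch of tap-to-ping over MQTT (via MQTT.js); broker URL and topic are
// hypothetical. Each tap publishes a frequency; each received message plays one.
import mqtt from 'mqtt';

const client = mqtt.connect('wss://broker.example.com:8884/mqtt'); // MQTT over TLS WebSocket
const topic = 'soear/pings/session-42'; // hypothetical per-pair session topic
const audioCtx = new AudioContext();

// Play a short sine "ping" that fades out over ~0.4 s.
function ping(frequency: number): void {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = frequency;
  gain.gain.setValueAtTime(0.3, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 0.4);
  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.4);
}

client.on('connect', () => client.subscribe(topic));
client.on('message', (_topic, payload) => ping(Number(payload.toString())));

document.addEventListener('pointerdown', (e) => {
  audioCtx.resume(); // audio can only start after a user gesture
  // Map vertical tap position to pitch (220–880 Hz) so position shapes tonality.
  const freq = 220 + (1 - e.clientY / window.innerHeight) * 660;
  ping(freq);
  client.publish(topic, String(freq));
});
```

Measuring the round-trip time of these small messages would be a natural first latency experiment.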

Beyond

The project would have uses beyond its initial launch period. Some of the most impactful applications of augmented reality will be for training and hands-on work. Combined with satellite imagery, it can add extra layers of understanding to maps of infrastructure that is typically hidden from view (such as water mains, electricity wires, and more). This can serve as realistic training beforehand, and as a visualization guide during hands-on work. There are even more exciting possibilities out there, and using satellite imagery with AR in an artistic way can help those ideas flow.

Participants will find delight in the subtle visual effects, such as imagery moving and fading to reveal additional imagery, stars animating, and more. Over time, the experience will start to feel more familiar. This relational aspect is another inspiration for the name SŒAR beyond the abbreviation: shifting the emphasis of the syllables can make it sound like ‘sœur’, French for ‘sister’ – making this an interesting companion alongside ‘Mother Nature’.

Screenshot from SŒAR demo

Presenting my Space Applications departmental project at International Space University SSP19

Portfolio

Hi! My name is Erin, and I’m an eccentric robotics builder. Online I’m known as “RobotGrrl”. My passion for robotics has led me to launch Robot Missions with a robot that collects plastic from beaches, attend International Space University, lead a Hackaday Dream Team spanning 3 continents to build an ocean ghost-gear robot, learn electronics kit manufacturing at Evil Mad Science, and build a submarine in a 48-hour hackathon during Fab11 with an international team at MIT. Presently, alongside my part-time role as an electronics designer, I’m designing a wind-propelled robot for exploration on Mars and developing an augmented reality game for outdoor sensor nodes.


I am endlessly fascinated by space, robots, and the ways the two could improve Earth’s environment when combined. The Planet Labs Artist in Residence opportunity lends itself perfectly to these interests: exploring captivating ways to connect people together and help them see the world from a different perspective.

Robot Missions

Founded Robot Missions to solve environmental challenges using robots. Led the technical development of Bowie, the plastic-collecting robot for beaches. Deployed the robot on a beach in Ottawa as part of a pilot. Tested the robot at the Canadian Space Agency. Attracted resources to bring the robot to its beta v1.0 stage, including government funding, teams, and sponsorship. I developed the design, electronics, firmware, autonomous navigation, and image classification AI for the robot.

VR for Space Applications

A combination of satellite imagery data and on-the-ground sensors in WebVR, using the Mapbox 3D API, aframe.io, and Bowie the robot’s sensor data. Developed for the Space Applications department project at ISU SSP19. It originated from the question: what would happen if we could step into the nature around us and see what is not visible to our eyes? I also made a demo of exploring other worlds in VR using Enceladus (a moon of Saturn). I developed the code, building on top of aframe.io and accessing Mapbox’s API; a rough sketch of the core idea appears below.
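As a rough sketch of that core idea – texture a ground plane with a satellite tile and float a sensor reading above it – following Mapbox’s raster tile URL pattern, with the token, tile coordinates, and sensor value as placeholders:

```ts
// Sketch: satellite imagery as the ground the participant "steps into",
// with one of Bowie's sensor readings floating above it (placeholder value).
declare const AFRAME: any; // provided globally by the aframe.io script tag

const TOKEN = 'YOUR_MAPBOX_TOKEN';
const tileUrl = (z: number, x: number, y: number): string =>
  `https://api.mapbox.com/v4/mapbox.satellite/${z}/${x}/${y}@2x.jpg90?access_token=${TOKEN}`;

AFRAME.registerComponent('sensor-ground', {
  init(this: any) {
    // Ground plane textured with a satellite tile (placeholder coordinates).
    const ground = document.createElement('a-plane');
    ground.setAttribute('material', `src: url(${tileUrl(15, 9647, 12320)}); shader: flat`);
    ground.setAttribute('rotation', '-90 0 0'); // lay flat under the camera
    ground.setAttribute('width', '20');
    ground.setAttribute('height', '20');

    // A floating label for a ground-truth sensor reading.
    const label = document.createElement('a-text');
    label.setAttribute('value', 'soil moisture: 41%');
    label.setAttribute('position', '0 1.6 -2');
    label.setAttribute('align', 'center');

    this.el.appendChild(ground);
    this.el.appendChild(label);
  },
});
// Attach with <a-entity sensor-ground></a-entity> inside the <a-scene>.
```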

Fast Transit to Mars AR

This is the result of collaborating with the Fast Transit to Mars team project at ISU SSP19 – an AR viewer where you can move and reconfigure tiles to build your own spacecraft model. In the future, people will buy their own custom-configured spacecraft to bring them from destination to destination in the solar system. This was presented during the team project’s final presentation, where the audience joined in a mini interactive game to assemble the spacecraft with the people they were sitting next to. It worked! I developed the code (building off of aframe.io), which involved Javascript for keeping the modules in order, and tested the AR markers from different distances. As word has it, there is still an AR marker placed in Pioneer’s Hall at the ISU Strasbourg campus!

Robot WebVR Control

Situational awareness for robots can be tricky when they are driven farther away and operating in a dynamic environment. This project explored what it would be like to see a camera view from the robot in WebVR. I developed the systems architecture and the code, built on aframe.io, with Python scripts on a Raspberry Pi sending camera images to a server and the microcontroller communicating with the Pi to send the data through MQTT. A sketch of the browser side appears below.
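This sketch of the browser side assumes MQTT.js over WebSockets and a hypothetical topic carrying raw JPEG bytes from the Raspberry Pi:

```ts
// Sketch: receive JPEG frames over MQTT and paint them onto an A-Frame plane
// (e.g. <a-plane id="robot-view"> inside the scene). Broker URL and topic
// are hypothetical.
import mqtt from 'mqtt';

const client = mqtt.connect('wss://broker.example.com:8884/mqtt');
client.on('connect', () => client.subscribe('bowie/camera/frame'));

let previousUrl: string | null = null;

client.on('message', (_topic, payload) => {
  // payload holds the raw JPEG bytes published by the Pi-side script.
  const url = URL.createObjectURL(new Blob([payload], { type: 'image/jpeg' }));
  const screen = document.querySelector('#robot-view');
  if (screen) {
    screen.setAttribute('material', `src: url(${url}); shader: flat`);
  }
  if (previousUrl) URL.revokeObjectURL(previousUrl); // avoid leaking old frames
  previousUrl = url;
});
```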

RDAS and Tele-op Headband

Rapidly Deployable Automation System (RDAS) is a robot the size of a 1U CubeSat. It unfolds and deploys its wheeled locomotion with the goal of automating a task. The robot is controlled wirelessly via a tele-operational headband, which detects movement. I developed this as my final project at Fab Academy; it included a custom PCB Arduino-derivative design using SMD components, and the piece designs were fabricated at Evil Mad Scientist.

EJA

Working with collaborators in Venezuela and Nigeria, we prototyped a ropeless fishing ghost-gear robot. I was team lead for Eja during the Hackaday Prize Dream Team in 2020, and learned a lot about working together during a pandemic to develop a hardware-based solution. I worked on the design and prototyping of the water-bottle-sized robot. Our project was featured in Supplyframe’s highlights of 2020.

Codename Terrapulse

A present work-in-progress project: developing a game that uses augmented reality to display realtime data from environmental sensor nodes. The gameplay will involve sending elements to the nodes in return for points. The objective of the game is to increase your level by completing as many ‘nodeports’ as possible. This increases the area over which data is logged, effectively making a better map of nonpoint source pollution.

Lumenbot

During the Studio Y fellowship at MaRS Discovery District, a group of us applied systems thinking to understand how consumers might change their electricity behaviours, in partnership with HydroOne. I developed the Lumenbot light-switch robot’s design and electronics, which was deployed in multiple locations.

Botbait and the Space Fish

Have you ever imagined what jellyfish in space would do? This fluid dynamics simulation, combined with a robotic tentacle and virtual fish, shows the art of the possible when combining hardware and software. Developed in Processing using the MSAFluid library. The serial component that I wrote automatically detects and connects to the robot, and transfers data between the two so the fish react to the robot.

Big Wheel

This big wheel print looked at what the Mars Science Laboratory Curiosity wheel could be like if it were a combination of materials – both rigid and flexible. The design was done in Autodesk Fusion 360, and the flexible portions are inlaid with the rigid portions, making sure that the two materials would be captured and attached securely. The print took 5.5 days, and used rigid PLA and flexible TPU.

Mars Wind Tumbler Robot

A present work-in-progress project. Exploratory robots on Mars encounter the problems of wheels getting stuck and of missions needing to stay clear of dust storms. This design addresses those problems by being primarily wind-propelled. It is applicable to prospecting land for human exploration on Mars, when colonies of habitats are developed in a nomadic fashion.

Presentation of Robots

A recent presentation of my work, showcasing all 23 iterations of Bowie the robot and the story behind them – presented at ISU SSP19. It also serves as a good brief summary of some of my past projects.

What I bring to the table

My unbridled enthusiasm for applying space to help Earth’s environment can hardly be contained. It pushes me through late nights, code bugs, and final details. As seen in my portfolio, I have the necessary technical skills, gained from work on robotics and space projects, and am always eager to learn and advance my understanding. My geeky and creative nature would thrive in the Planet Labs community. Ideas that I love to think about are sustainable space exploration, making humanity a multi-planetary species, and art in the new space era.

“Look everyone! We’re on Enceladus in VR!” – during Space Applications departmental project presentation at International Space University SSP19

If selected for this opportunity, I would harness the massive progress in Earth observation that Planet Labs has achieved to create a piece that instills the same sense of wonder about the spectacle of space, and share it with others so they too can gaze upon it and find glimmers of resolve during these bleak times.

If your team wishes to discuss further, you can reach me at erin at robotmissions dot org and on LinkedIn. I hope we will have the fortune to work together, and I look forward to hearing any thoughts on the project idea. Thank you for the open opportunity to share this idea with your team.

-Erin K

Presenting during the Team Project – Fast Transit to Mars at International Space University SSP19

Timeline

Here is a breakdown of the major milestones over the 3-month timeframe. While this seems structured at first glance, many of these elements will rely on unstructured experimentation and play. Bouncing ideas back and forth with the Planet Labs team will help new ideas emerge.

Week 1 – Design Brief

  • Orienting, meeting people (virtually), getting set up
  • Learning an overview of the Planet API
  • Make a 1-page concept design brief based on this proposal, with any modifications

Week 2 & 3 – Planet API

  • Dive into the Planet API
  • Get the images to display in the AR view with aframe.io
  • Experiment with transformations and lighting

Week 4 – Design

  • Experiment with AR image tracking / markers
  • Design the AR environment with interactive visual effects
  • Experiment with shaders

Week 5 & 6 – Planet API

  • Learn the analysis side of the Planet API
  • Explore its capabilities
  • Emit sounds relating to this data

Week 7 – Interactivity

  • Location API
  • Get touch points on screen
  • Animations on the images

Week 8 & 9 – Connectivity & Integration

  • Add MQTTS functionality
  • Bring together all the elements from previous weeks
  • Final touches

Week 10 – Testing

  • Deploy the web app to a server
  • Beta test with 25 participants
  • Make any bug fixes

Week 11 – Launch

  • Preparations for launch, e.g. screenshots and video clips
  • Launch
  • Watch the analytics for any immediate failures

Week 12 – Conclusion

  • Project wrap-up
  • Final documentation
  • Thank-yous


Resources Requested

Resources that would assist this project include:

  • Planet API access
  • Web server / AWS cloud credits for the web app
  • MQTTS broker on a server
  • (Possibly) an Oculus Quest 2 VR system
  • Assistance in writing the app’s privacy policy, since it accesses the camera and location
  • A small contribution to the developers of aframe.io, as it is the open-source framework used for this project
  • (Stretch goal) Leveraging this opportunity to gain early access to the Niantic API – combining their AR map scanning with Planet Labs’ data would be interesting
  • (Stretch goal) Having an astronaut beta test the app and share their thoughts

Learning Experience

What I hope to gain from this experience is learning more about the intersection of Earth observation and environmental challenges that could be solved with insights from processing that data. With several startups already addressing some of these ideas, the exciting stretch ideas lie beyond the low-hanging fruit – and this is why being an artist who can push the boundaries of how we see, interpret, feel, and understand this data is instrumental. One hope is that what is learned through this experience could eventually lead to an increase in environmental remediation efforts. Another is that if we can engage and communicate climate data to people in a different way, maybe they will feel more connected and involved with the problem.

A key component of the project will be using the Planet API. Learning more during this residency will grow my understanding of working with Earth observation imagery data. For example, one of the things I am wondering about is how a measurement scale is calculated for the images, especially those taken at an oblique angle. Additionally, I aim to grow my understanding of augmented reality and 3D shaders using the aframe.io framework.

Riffing on ideas together with people at Planet Labs would be phenomenal, given their level of knowledge. I hope to have virtual coffee chats, learn from their work, and hear where they see the new space era going. The inspiration from these discussions will spur new ideas for the project. When the project is complete, it would be an honour to see the resulting work as a showpiece. Perhaps it could be featured at a future Explore conference.

Planet Labs is an inspiring example of space entrepreneurship. As someone who has started and led various projects, it would be wonderful to hear more about how they have grown the company to this success and the risks taken along the way, and to learn about building hardware at scale and testing for harsh conditions. One of my own new-space entrepreneurship ideas has to do with energy storage in space and wireless power transfer; it would be amazing to geek out with people who are also interested in this. As you can see, the starting point of this opportunity is art, and the lessons learned would create ripples for my future adventures in space.