Image of the 2017 total solar eclipse captured on my iPhone 6 using the Eclipse Megamovie Mobile application.

The Eclipse Megamovie Project

On August 21st, 2017, a total solar eclipse traveled across the United States from coast to coast. I led a project at Ideum to develop a smartphone app to help citizen scientists capture images of the eclipse and upload them to a database for scientific study. The project was a collaboration with the Space Sciences Laboratory at UC Berkeley, and was also part of a larger citizen science effort spearheaded by Google, called the Eclipse Megamovie Project. Our goal was to answer the question: can you turn a smartphone into a scientific tool for studying a solar eclipse?

Roles: Software development and supervision (Android/iOS), producer, UX design, field testing. 

A photo of the application just after the end of the 2017 solar eclipse.

Because of my background in physics, astronomy, and math, it was a natural fit for me to act as producer for the project in addition to developing the software. I collaborated closely with astronomer Hugh Hudson (of UC Berkeley and the University of Glasgow) and Mark Bender, a filmmaker and die-hard eclipse chaser. A key part of my role was to understand their scientific goals and communicate them to the team at Ideum. For example, early on in the project I created a basic wireframe for the application, which served as a basis for the final design.

The ultimate goal was for the app to make it as easy as possible for people to capture scientifically useful images of the eclipse. In particular, the app had to calculate the exact timing of the eclipse based on GPS location data, then automatically activate the camera at the proper time. Furthermore, it needed to precisely control camera settings such as exposure duration and focus. The scientific team decided the app would gather the most useful information if it cycled through a sequence of exposure durations, precisely synced with the different phases of the eclipse.
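The scheduling logic described above can be sketched roughly as follows. This is my illustration, not the app's actual code: the function name, the one-second interval, and the exposure ladder values are all hypothetical, and the real app derived the contact times from GPS rather than taking them as inputs.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical exposure ladder (seconds); the app's real values differed.
EXPOSURES = [1 / 1000, 1 / 250, 1 / 60, 1 / 15]

def capture_schedule(totality_start, totality_end, interval=1.0):
    """Build a list of (trigger_time, exposure_duration) pairs that
    cycles through the exposure ladder for the duration of totality."""
    schedule = []
    t = totality_start
    i = 0
    while t < totality_end:
        schedule.append((t, EXPOSURES[i % len(EXPOSURES)]))
        t += timedelta(seconds=interval)
        i += 1
    return schedule
```

A timer (or on iOS, a scheduled capture callback) would then fire each entry at its trigger time with the requested exposure duration.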

A sequence of exposures of the 2017 eclipse captured automatically on my iPhone during the eclipse.

The client wanted to support a range of users, including dedicated citizen scientists willing to use a tripod and external zoom lens. Using a zoom lens made it possible to get considerably better images, but it also presented some challenges. To capture images of the eclipse, the phone would have to be pointing in the right direction with an accuracy of about 1 degree. And as I quickly learned myself, it is no easy matter to aim so precisely at a moving target.

Field testing the app by taking photos of the sun. I learned how hard it is to precisely point a smartphone lens, especially when it's aimed directly at the sun.

To help with this problem, I developed a pointing aid that utilized the phone's compass, accelerometer, and other sensors to determine the phone's spatial orientation. The application then calculated the position of the sun in the sky at the time of the eclipse (based on the user's GPS location), and an onscreen UI helped the user point the phone in the correct direction.
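The solar-position half of that calculation is standard astronomy. The sketch below is my illustration of the kind of computation involved (a low-precision almanac formula, accurate to roughly 0.1 degree, plenty for an on-screen pointing aid); the function name and structure are hypothetical, not the app's actual code.

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar altitude and azimuth (both in degrees,
    azimuth measured clockwise from north) for a GPS position
    and UTC time, using a low-precision almanac formula."""
    # Days since the J2000 epoch (2000-01-01 12:00 UTC).
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    n = (when_utc - j2000).total_seconds() / 86400.0

    # Sun's ecliptic longitude from its mean longitude and mean anomaly.
    mean_long = (280.460 + 0.9856474 * n) % 360
    g = math.radians((357.528 + 0.9856003 * n) % 360)
    lam = math.radians(mean_long + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g))

    # Convert to right ascension and declination.
    eps = math.radians(23.439 - 0.0000004 * n)  # obliquity of the ecliptic
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    dec = math.asin(math.sin(eps) * math.sin(lam))

    # Local hour angle via Greenwich mean sidereal time.
    gmst_hours = (18.697374558 + 24.06570982441908 * n) % 24
    h = math.radians((gmst_hours * 15 + lon_deg) % 360) - ra

    # Equatorial -> horizontal coordinates.
    lat = math.radians(lat_deg)
    alt = math.asin(math.sin(lat) * math.sin(dec) +
                    math.cos(lat) * math.cos(dec) * math.cos(h))
    az = math.atan2(math.sin(h),
                    math.cos(h) * math.sin(lat) - math.tan(dec) * math.cos(lat))
    return math.degrees(alt), (math.degrees(az) + 180) % 360
```

The pointing aid then only needed to compare this target direction against the phone's sensor-derived orientation and draw arrows guiding the user toward the sun.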

A sketch I made to help explain the pointing aid feature.

The app contained a number of novel features, such as the automatic camera control based on GPS and the pointing aid. Implementing and fine-tuning these required a lot of prototyping and testing. To mention a few of the challenges we encountered during field testing: trying not to accidentally stare at the sun, keeping an iPhone from overheating in 100-degree heat, compensating for unreliable smartphone compasses, and getting the application to work not only on iPhones but on hundreds of different Android devices as well.
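Compensating for a noisy compass largely comes down to filtering. One common approach is exponential smoothing of the heading, with care taken at the 359°-to-0° wraparound; the sketch below is illustrative of that general technique, not the filter the app actually shipped with.

```python
def smooth_heading(prev_deg, new_deg, alpha=0.1):
    """Exponentially smooth a compass heading in degrees.

    The naive average of 350 and 10 is 180 (pointing the wrong way);
    wrapping the difference into [-180, 180) first keeps the filter
    moving through north correctly.
    """
    diff = (new_deg - prev_deg + 180) % 360 - 180
    return (prev_deg + alpha * diff) % 360
```

A small `alpha` trades responsiveness for stability, which suits a pointing UI where jitter is more distracting than a slight lag.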

Testing the app with colleagues at Ideum

Testing the pointing feature to take pictures of the moon.

After the hard work of developing the app, the team decided we had to view the eclipse ourselves and use the app in the field. So we packed up our smartphones, lenses, and tripods and drove 11 hours to the path of totality in Nebraska. Witnessing the eclipse in person was a truly awe-inspiring experience.

The final hardware setup before the eclipse. My iPhone is attached to a 50x external lens mounted to a tripod.

Setting up to view the eclipse in Nebraska

Thousands of people ended up downloading the apps. Ultimately, over 60,000 images of the eclipse from across the country were captured using the app and uploaded to the scientific database.

Images uploaded to the database by an iPhone user without an external lens.

Another sequence uploaded to the database.

The distribution of users who used the app and uploaded images to the database.
