Computers Will Stitch Together First-Ever Megamovie of Solar Eclipse
"Nothing like the Megamovie was ever possible before!"
This Monday, while you’re processing the incredible solar eclipse you just saw, a Google algorithm will be busy stitching together thousands of photos taken by citizen scientists into a three-minute “megamovie” to be published later that evening.
Scientists from UC Berkeley and Google engineers teamed up for the project, called the “Eclipse Megamovie.” They’re actively seeking volunteers to photograph the eclipse along its path across the country; as participants upload their pictures to the Megamovie website, the images will transfer to Google’s servers, where its computers will immediately get to work.
Google’s algorithm is based on one developed by Dr. Larisza Krista: Solar Eclipse Image Standardisation and Sequencing (SEISS), which she created to stitch together images of the 2012 total solar eclipse in Queensland, Australia. As described in a 2015 paper, SEISS can “process multi-source total solar eclipse images by adjusting them to the same standard of size, resolution, and orientation.” It can also “determine the relative time between the observations and order them to create a movie of the observed total solar eclipse sequence.”
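For a rough sense of what that standardization involves, here is a minimal Python sketch of the two steps the paper describes: rescaling photos from different cameras to a common size and ordering them in time. The EclipsePhoto fields and the fixed 1024 x 1024 target are illustrative assumptions, not details of SEISS itself.

```python
from dataclasses import dataclass
from datetime import datetime

from PIL import Image  # Pillow, for basic image loading and resizing


@dataclass
class EclipsePhoto:
    path: str            # location of the image file
    taken_at: datetime   # capture time, parsed from EXIF in a real pipeline


def standardize(photo: EclipsePhoto, size=(1024, 1024)) -> Image.Image:
    """Rescale one photo to a common resolution and color mode."""
    img = Image.open(photo.path).convert("RGB")
    return img.resize(size, Image.LANCZOS)


def order_sequence(photos: list[EclipsePhoto]) -> list[EclipsePhoto]:
    """Order observations by capture time so they play back as a movie."""
    return sorted(photos, key=lambda p: p.taken_at)
```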
But Google made some changes to Dr. Krista’s algorithm for this project. The Megamovie focuses on the eclipse’s “totality,” the minutes in which the moon blocks out all of the sun’s light. SEISS included code for detecting when the eclipse was in its partial phases, but Google will instead use each photo’s timestamp and GPS location to determine whether or not it was taken during totality, and set the rest aside.
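As a back-of-the-envelope illustration of that filter, the sketch below keeps a photo only if its timestamp falls inside the totality window at its GPS coordinates. The totality_window() lookup is hypothetical; in practice those times would come from eclipse ephemeris data, and the fixed 2.5-minute window here is just a placeholder.

```python
from datetime import datetime, timedelta


def totality_window(lat: float, lon: float) -> tuple[datetime, datetime]:
    """Hypothetical lookup for when totality starts and ends at a location."""
    # A real implementation would consult eclipse path/ephemeris data;
    # this placeholder returns a fixed ~2.5-minute window for illustration.
    start = datetime(2017, 8, 21, 17, 20, 0)  # UTC, illustrative only
    return start, start + timedelta(seconds=150)


def taken_during_totality(timestamp: datetime, lat: float, lon: float) -> bool:
    """Keep a photo only if its timestamp falls inside local totality."""
    start, end = totality_window(lat, lon)
    return start <= timestamp <= end
```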
“After identifying the solar disk, we scale, translate, and rotate the images so that they all share the same reference frame. Images taken without an equatorial mount are rotated by the parallactic angle,” Google’s Joshua Cruz tells Inverse. “The entire eclipse path is broken into 3600 equal segments; for each segment, the ‘best’ photo is identified. Then, the movie is stitched (3600 total frames at 25 FPS), with any gap segments filled with cross-fades of the surrounding photos.”
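Here is a rough sketch of that stitching step, under a few assumptions: photos have already been assigned to one of the 3,600 path segments and given a quality score (both of which would come from earlier stages of the pipeline), and gap frames are filled with a simple pixel-level cross-fade between the nearest filled neighbors.

```python
import numpy as np

N_SEGMENTS = 3600  # one frame per path segment, played back at 25 fps


def best_per_segment(photos):
    """photos: list of (segment_index, quality_score, image_array) tuples.

    Keeps the highest-scoring photo for each segment.
    """
    best = {}
    for seg, score, img in photos:
        if seg not in best or score > best[seg][0]:
            best[seg] = (score, img)
    return {seg: img for seg, (score, img) in best.items()}


def build_frames(best):
    """Assemble frames, cross-fading across segments with no photo."""
    frames = []
    filled = sorted(best)
    for seg in range(N_SEGMENTS):
        if seg in best:
            frames.append(best[seg])
            continue
        prev = max((s for s in filled if s < seg), default=None)
        nxt = min((s for s in filled if s > seg), default=None)
        if prev is None or nxt is None:
            continue  # no neighbors on both sides; skip this segment
        t = (seg - prev) / (nxt - prev)  # fade weight between the two photos
        frames.append(((1 - t) * best[prev] + t * best[nxt]).astype(np.uint8))
    return frames
```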
Besides creating a high-definition movie that looks really cool, what’s the purpose of this project? The Megamovie will provide scientists with observational data about the sun’s atmosphere that’s both extremely valuable and rare; the United States hasn’t had a coast-to-coast total solar eclipse in almost 100 years. By recruiting thousands of citizen scientists from all over the country, researchers will gain access to a much more detailed data set than they could have otherwise acquired.
“It’s really an experiment in using crowdsourcing to do solar science, which will hopefully pave the way for much future work,” says Dr. Juan Carlos Martínez Oliveros of the science team at UC Berkeley’s Space Sciences Laboratory.
“Nothing like the Megamovie has ever been attempted before because nothing like the Megamovie was ever possible before!” says eclipse documentary filmmaker Mark Liston Bender. “This process will revolutionize our observation of the total eclipse of the sun and the results will no doubt change the way we look at everything!”
The project has also released recruitment posters.
To participate in the Megamovie, you’ll need an interchangeable-lens digital camera (like a DSLR or mirrorless camera), not just a cellphone, and you’ll also need to be in the path of the eclipse’s totality. Apply on the project website: create a profile and enter the position from which you’ll be taking photos. It’s a good idea to download the Eclipse Megamovie Mobile app, which will automatically take photos during the appropriate moments of the eclipse and send the images to the project scientists, so you can sit back and relax. Get more info in the website’s FAQ section.
To watch the movie, go to the project’s website the evening of the eclipse. It will be posted sometime after 5 p.m. Eastern, August 21, 2017.