Apollo is a Game-Changing Photo App for Dual-Camera iPhones

(Apollo App in action)

When Apple introduced its dual-camera iPhones a few generations ago, pictures got better. Having two lenses allowed Apple to create "Portrait mode," which emulates shooting with a longer lens and produces a soft background. It's a cheat, for sure: a lot of that blurring comes from software that takes data from the two cameras, keeps the foreground in focus, and blurs anything behind it.

So far, however, Apple has been the only one to use that data from the two cameras... until now.

Apollo is a new photo app so advanced that it looks and acts more like a 3D modeling tool such as Blender than a traditional photo app, and that's because its developers tapped directly into the iPhone's camera data. Taking advantage of the depth information embedded in a photo when it's shot in Portrait mode, the app builds a 3D space from that data, letting you do some crazy things like in the video above, or in this one.
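
Under the hood, a Portrait-mode photo stores a disparity (depth) map as auxiliary image data, and that is what an app like Apollo has to read before it can do anything else. Here's a minimal sketch of that step using Apple's ImageIO and AVFoundation frameworks; the function name is just for illustration, and this is not Apollo's actual code:

```swift
import ImageIO
import AVFoundation

// Read the disparity (depth) map that a Portrait-mode photo carries as
// auxiliary data. Returns nil if the photo has no depth information.
func loadDepthData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}
```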


I mean... come on. That's ridiculously cool. Apollo uses a Phong-style lighting model (sketched in code after this list) that allows you to:

  • add up to 20 light sources
  • position each one in 3D space
  • change the light intensity
  • alter the light spread
  • change the color
  • add a light BEHIND an object for silhouettes
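
For anyone curious what a "Phong-style model" means in practice: each pixel's color is the sum of ambient, diffuse, and specular terms computed from a surface normal and the positions of the lights. The sketch below is an assumption about how such shading could work with a normal estimated from the depth map, not a description of Apollo's actual implementation, and the names and constants are illustrative:

```swift
import simd
import Foundation

// Illustrative Phong-style shading: every user-placed light adds a diffuse
// and a specular contribution on top of a small ambient term.
struct Light {
    var position: SIMD3<Float>   // placed anywhere in the reconstructed 3D space
    var color: SIMD3<Float>      // hue scaled by intensity
    var shininess: Float         // rough stand-in for the app's "spread" control
}

func shade(point: SIMD3<Float>, normal: SIMD3<Float>, eye: SIMD3<Float>,
           albedo: SIMD3<Float>, lights: [Light]) -> SIMD3<Float> {
    let n = simd_normalize(normal)
    let v = simd_normalize(eye - point)
    var result = 0.05 * albedo                           // ambient term
    for light in lights {                                // the app allows up to 20
        let l = simd_normalize(light.position - point)
        let diffuse = max(simd_dot(n, l), 0)             // Lambertian falloff
        let r = simd_reflect(-l, n)                      // mirror reflection of the light
        let specular = powf(max(simd_dot(r, v), 0), light.shininess)
        result += light.color * (diffuse * albedo + specular)
    }
    return simd_min(result, SIMD3<Float>(repeating: 1))  // clamp to displayable range
}
```

Placing a light behind the subject fits naturally into this scheme: a light at a greater depth than the subject leaves the camera-facing surface dark while illuminating what's behind it, which is where the silhouette look comes from.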


In the developer story on their website, they state:

Apollo is the first application to use the depth data of portrait mode photographs, to realistically add light sources to a scene. Development of the app began as an experiment back in November 2017, when we first got our hands on a brand new iPhone 8+. We wanted to see what could be achieved by taking advantage of the depth information of portrait photos. Our hypothesis looked simple: if depth information can be superimposed on a 2D photo, it should be possible to re-illuminate objects with custom light sources.
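
The last sentence of that quote is really a bit of camera geometry: given the camera's focal length and principal point, a pixel coordinate plus its depth value can be lifted into a 3D point, and those points form the scene the custom lights are evaluated against. Here is a hedged sketch of that unprojection step under a simple pinhole camera model (all parameter names are assumptions for illustration, not Apollo's API):

```swift
import simd

// Lift a pixel into 3D using a pinhole camera model:
//   x = (u - cx) * z / f,  y = (v - cy) * z / f,  z = depth
func unproject(pixel: SIMD2<Float>, depth: Float,
               focalLength: Float, principalPoint: SIMD2<Float>) -> SIMD3<Float> {
    let x = (pixel.x - principalPoint.x) * depth / focalLength
    let y = (pixel.y - principalPoint.y) * depth / focalLength
    return SIMD3<Float>(x, y, depth)
}
```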


If you haven't upgraded to a dual-camera iPhone with Portrait mode yet, this $1.99 app might make you pull the trigger; it's that cool.