How to Take a Depth Photo with the iPhone 7 Plus

Koichi Donovan Nakai (Don)

Did you know that you can capture depth data on your iPhone 7 Plus? Depth data is essentially a black-and-white image that encodes distance: the lighter the value, the closer the object; the darker the value, the farther back it is.

Depth map: lighter values mean the object is closer; darker values mean it is farther back.

This depth map is often used by CG artists and in the visual effects industry. We call this Z-depth: data along the Z axis.
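To make that light-equals-near convention concrete, here is a minimal Python sketch that encodes a list of distances as 8-bit gray values. This is only an illustration of the idea, not Apple's actual format (Apple's own captures typically store disparity rather than metric depth):

```python
# Encode metric depths (in meters) as 8-bit gray values, following the
# convention above: lighter = closer, darker = farther.
def depth_to_gray(depths, near, far):
    """Map each depth in [near, far] to a gray value in [0, 255]."""
    gray = []
    for d in depths:
        d = min(max(d, near), far)            # clamp into the valid range
        t = (d - near) / (far - near)         # 0.0 at near, 1.0 at far
        gray.append(round(255 * (1.0 - t)))   # invert: near -> 255 (light)
    return gray

# A close object (1 m) encodes near white; a distant one (10 m) as black.
print(depth_to_gray([1.0, 5.5, 10.0], near=1.0, far=10.0))  # [255, 128, 0]
```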

Z-depth: data along the Z axis.

With depth data you can do things like change the focus, add a foggy background, and achieve parallax effects.

With Z-depth data you can achieve blur, fog, and parallax effects.


The iPhone 7 Plus has two cameras on the back. These two cameras work much like our two eyes: by comparing the two images and measuring the difference in parallax, the processor can calculate depth.
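That parallax-to-depth calculation can be sketched with the standard pinhole stereo formula, depth = focal length x baseline / disparity. The numbers below are hypothetical, purely for illustration:

```python
# Stereo triangulation: with two cameras a baseline apart, a point's apparent
# horizontal shift between the two images (the disparity) encodes its depth.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """depth = focal_length * baseline / disparity (pinhole stereo model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 1000 px focal length, 1 cm baseline.
# A large disparity (near object) yields a small depth, and vice versa.
print(depth_from_disparity(20.0, focal_px=1000.0, baseline_m=0.01))  # 0.5 m
print(depth_from_disparity(5.0,  focal_px=1000.0, baseline_m=0.01))  # 2.0 m
```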

In portrait mode, the iPhone uses this depth data to blur out the background.
The cool thing is that with iOS 11 you can now capture raw depth data as an image on the iPhone 7 Plus. At the moment I couldn't find any apps in the store, but you can get sample code from Apple's developer program.
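Here is a toy sketch of the portrait-mode idea (just the concept, not Apple's implementation): threshold the depth map to find background pixels, then blur only those. Images here are plain lists of grayscale rows:

```python
# Use the depth map to decide which pixels belong to the background, then
# blur only those. A real implementation would run on the GPU.
def portrait_blur(image, depth_map, threshold=128):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if depth_map[y][x] < threshold:    # dark = far = background
                # 3x3 box blur, clamped at the borders
                acc, n = 0, 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < h and 0 <= xx < w:
                            acc += image[yy][xx]
                            n += 1
                out[y][x] = acc // n
    return out
```

Foreground pixels (light in the depth map) pass through untouched; only the dark, far pixels get averaged with their neighbors.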

Here is What You Will Need

  1. A Mac running the latest macOS
  2. Xcode 9 or later
  3. An iPhone 7 Plus with a USB Lightning cable

Steps

  1. First, install Xcode.
  2. Download the three AVCam developer sample apps. You're going to get AVCam, Wiggle Me (Using Depth Data), and AVCamPhotoFilter.
  3. Load the project file in Xcode.
  4. Plug in your phone.
  5. Choose your user.
  6. And push play.

Do this for all three developer apps.
 

Now the apps should be on your phone.

Let's start with AVCam:

  1. Turn on depth data delivery.
  2. Turn on Live Photo mode.
  3. Now take a picture.

This looks like a normal photo. Don't be fooled, though: embedded inside this picture is the depth data. With a depth-data app like Wiggle Me, we can make use of it.

Open the Wiggle Me app and load the picture.

Whoa, 3D...
You can pinch to zoom and move your phone around to see what happens. That, my friends, is the power of depth data.

Now I am going to use AVCamPhotoFilter to capture raw depth data and see what I can do with the image in After Effects and ZBrush.

  1. Load AVCamPhotoFilter and turn on depth.
  2. Slide the bottom slider all the way up. At first it's going to look glitchy, so choose "smooth" and, voila, you have depth data.
  3. With a tripod, snap a depth photo.
  4. Next, slide the bottom slider all the way down.
  5. And snap a normal photo.

Now we will use these two images to create a 3D version of this photo.

In After Effects I will be using VoluMax Pro, which can be found on videohive.net. It is an awesome After Effects template made by Cream Motion.
 

Essentially, it takes a 2D image and turns it into 3D using this depth map.
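The core displacement idea can be sketched in a few lines of Python (a big simplification of what VoluMax actually does): shift each pixel sideways in proportion to its depth value, so near pixels move more than far ones as the virtual camera pans:

```python
# A 2.5D parallax effect in miniature: shift each pixel horizontally by an
# amount proportional to its depth-map value, so near (light) pixels move
# more than far (dark) ones.
def parallax_shift(image, depth_map, max_shift):
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            shift = round(max_shift * depth_map[y][x] / 255)  # near moves most
            src = min(max(x - shift, 0), w - 1)               # clamp at edges
            out[y][x] = image[y][src]
    return out
```

Animating `max_shift` over time produces the panning, pseudo-3D motion the template renders.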


Just open the project and put your image in "Your Picture Comp," making sure it covers the entire canvas. Then copy the picture and paste it into "2 Your Displacement Map."

 


Alt-click your depth file and drag it over your layer so that we have the exact scale and placement.

Open up the Main Comp 4K, and now you can move and animate this image in pseudo-3D using depth data!

Let's take it one step further and actually make a real 3D model from this depth data.

First, let's crop these photos into nice squares. We are going to use 4096 x 4096 pixels. Render those guys out.
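If you'd rather script the square crop than do it by hand, the idea is simple; here is a minimal Python sketch that treats an image as a grid of rows:

```python
# Center-crop an image (a list of pixel rows) to its largest square, the
# same prep step as squaring up the photo and depth map before rendering.
def center_crop_square(image):
    h, w = len(image), len(image[0])
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    return [row[left:left + side] for row in image[top:top + side]]
```

Run the same crop on both the color photo and the depth map so they stay aligned.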

 

Next we're going to open ZBrush and make a plane:

  1. Hit T on the keyboard and select "Make PolyMesh3D."
  2. Under "Texture," load the color image and apply the texture to the plane.
  3. In the alpha selector, load your Z-depth map image.
  4. Under "Masking," choose "Mask By Alpha."

Now if we turn off our texture, we can see that the geometry has been masked based on the Z-depth data.

Let's deform the geometry using the depth data as our stencil. We will use the "Offset" deformation along the Z axis.
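Numerically, the masked offset amounts to something like this Python sketch (a simplification of ZBrush's actual behavior): treat the depth map as a per-vertex mask and push each vertex of the plane along Z in proportion to it:

```python
# A masked "offset" deformation in miniature: each vertex of a flat plane
# is pushed along Z by the offset amount, scaled by the depth map, so light
# (near) areas rise the most and black areas stay flat.
def offset_deform(depth_map, offset):
    """Return per-vertex Z heights for a plane matching the depth-map grid."""
    return [[offset * v / 255 for v in row] for row in depth_map]
```

A pure-white pixel gets the full offset, mid-gray gets half, and black stays at zero, which is exactly the relief you see on the plane.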
 

There, now we have this 3D image.


But you'll notice these artifacts. That's because there is not enough bit depth in this image.

Until this capture technology improves to include more depth data, we will simply have to blur the mask.

 

So let’s blur the mask, lower our poly count, and deform again.
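Why blurring helps can be shown with a tiny box-blur sketch: an 8-bit map jumps in whole-number steps, and averaging neighbors fills in the in-between values those steps are missing:

```python
# An 8-bit depth map has only 256 levels, so the deformed surface shows
# stair-step artifacts. A small box blur spreads each step across its
# neighbors, producing smoother in-between values the capture lacked.
def blur_depth(depth_map, radius=1):
    h, w = len(depth_map), len(depth_map[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += depth_map[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

# A hard 1-level step becomes a gentle ramp after blurring.
print(blur_depth([[100, 100, 101, 101]])[0])
```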

 

There, much smoother.

Now we have full 3d control over our image.

Although it is not perfect, there is definitely room for development.

 

This depth-map capture technology is good enough to do some basic parallax and blur effects. But once we can capture more depth information, we will be able to manipulate things like lighting and shadows, and even capture our real environments into 3D virtual space.

Think about it... what if we were able to capture depth over the course of time? We would be able to recreate space and time... then we could jump back to that space in time. Yes, that is correct... I am talking about time travel!

But until then... I hope you enjoyed this video. Please like, share, and subscribe! I would love to see how deep people can take depth data! Thanks for watching the Dusty Gamebox.