Luma AI Releases iOS App
The app uses Neural Radiance Fields to create immersive 3D environments from real-world photography and videography.
Luma AI has released an iOS version of its software, which uses Neural Radiance Fields (NeRF) to build immersive 3D environments from real-world photographs and video.
NeRF uses a neural network to learn a 3D volumetric representation of a scene from a set of input images. Luma's app simplifies that process by guiding the user through photo capture and then processing the captured images quickly in the cloud. The software can generate photorealistic 3D reconstructions from a single 360-degree panoramic view. These reconstructions can serve as backgrounds in LED-volume environments and in post workflows where a high-quality, bespoke environment is desired.
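To make the NeRF idea concrete, the sketch below shows the basic mechanism in PyTorch: a small MLP maps a 3D sample point to a density and a color, and colors sampled along each camera ray are composited by volume rendering. This is a minimal illustration only; the network size, sample counts, and positional-encoding settings are assumptions for the example, view-direction conditioning is omitted for brevity, and none of it reflects Luma AI's actual implementation.

```python
# Minimal NeRF-style sketch (illustrative assumptions, not Luma AI's code).
import torch
import torch.nn as nn


def positional_encoding(x, num_freqs=6):
    """Map coordinates to sin/cos features so the MLP can fit fine detail."""
    feats = [x]
    for i in range(num_freqs):
        feats.append(torch.sin((2.0 ** i) * x))
        feats.append(torch.cos((2.0 ** i) * x))
    return torch.cat(feats, dim=-1)


class TinyNeRF(nn.Module):
    """MLP: encoded (x, y, z) -> volume density sigma and RGB color."""

    def __init__(self, num_freqs=6, hidden=128):
        super().__init__()
        in_dim = 3 * (1 + 2 * num_freqs)
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # (sigma, r, g, b)
        )

    def forward(self, points):
        out = self.mlp(positional_encoding(points))
        sigma = torch.relu(out[..., 0])    # density must be non-negative
        rgb = torch.sigmoid(out[..., 1:])  # colors in [0, 1]
        return sigma, rgb


def render_rays(model, origins, directions, near=0.0, far=4.0, n_samples=64):
    """Sample points along each ray and alpha-composite the predicted colors."""
    t = torch.linspace(near, far, n_samples)                                   # (S,)
    points = origins[:, None, :] + t[None, :, None] * directions[:, None, :]   # (R, S, 3)
    sigma, rgb = model(points)                                                 # (R, S), (R, S, 3)

    delta = t[1] - t[0]                                 # uniform step size along the ray
    alpha = 1.0 - torch.exp(-sigma * delta)             # per-sample opacity
    trans = torch.cumprod(1.0 - alpha + 1e-10, dim=-1)  # accumulated transmittance
    trans = torch.cat([torch.ones_like(trans[:, :1]), trans[:, :-1]], dim=-1)
    weights = alpha * trans
    return (weights[..., None] * rgb).sum(dim=1)        # (R, 3) rendered pixel colors


if __name__ == "__main__":
    model = TinyNeRF()
    origins = torch.zeros(8, 3)  # 8 example rays cast from the origin
    directions = torch.nn.functional.normalize(torch.randn(8, 3), dim=-1)
    colors = render_rays(model, origins, directions)
    print(colors.shape)  # torch.Size([8, 3])
    # Training would compare rendered colors against the captured photos' pixels
    # and minimize the photometric error by gradient descent.
```

In practice, the captured photos supply the ground-truth pixel colors for rays through each camera, and the cloud processing Luma describes corresponds to running this kind of optimization at much larger scale.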
Follow Luma AI on Twitter and Instagram.