FP Trending | Oct 15, 2020 15:17 IST
Snap Inc. recently announced Lens Studio 3.2, which adds support for the LiDAR scanner in Apple’s new iPhone 12 Pro and iPad Pro. The update lets augmented reality (AR) creators and developers build Lenses for the new Apple devices that blend convincingly with the real world. According to a Snapchat release, the LiDAR scanner on the iPhone 12 Pro and iPhone 12 Pro Max enables immersive AR experiences that overlay more seamlessly onto the real world.
With Lens Studio 3.2, Snapchat’s camera can see a metric-scale mesh of a scene, allowing it to understand the geometry and meaning of surfaces and objects.
This improved scene understanding lets AR content interact realistically with the world around the user. Drawing on the A14 Bionic chip and ARKit, Snapchat can render thousands of AR objects in real time. These immersive experiences need not remain isolated, either: creators can share them with the entire Snapchat community to explore.
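Snap has not published its implementation, but the ARKit capability the article refers to can be sketched in Swift. This is a minimal illustration of enabling LiDAR-based scene reconstruction, not Snap’s actual code:

```swift
import ARKit

// Sketch: turning on LiDAR scene reconstruction in ARKit.
// Requires a LiDAR-equipped device such as the iPhone 12 Pro or iPad Pro.
let config = ARWorldTrackingConfiguration()

// Scene reconstruction is only supported on LiDAR hardware, so check first.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // Ask ARKit for a metric-scale triangle mesh of the surroundings.
    config.sceneReconstruction = .mesh
}
config.planeDetection = [.horizontal, .vertical]

let session = ARSession()
session.run(config)
// ARKit then delivers ARMeshAnchor instances to the session's delegate;
// their geometry lets virtual objects occlude against and collide with
// real-world surfaces.
```

The resulting mesh anchors are what allow AR content to sit behind, on top of, or around real objects rather than simply floating over the camera feed.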
Eitan Pilipski, Snap’s SVP of Camera Platform, said, “The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality”.
He said that the firm was excited to collaborate with Apple to bring this sophisticated technology to Snap’s Lens Creator community.
Beyond enabling lifelike, immersive AR experiences, Lens Studio 3.2 also lets creators build Lenses and preview them in the world before they even have an iPhone 12 Pro in hand. This is made possible by a new interactive preview mode.
Alternatively, the feature can be tested by opening the Snapchat app on Apple’s latest iPad Pro. Since Lens Studio 3.2 is already available, developers can get to work creating augmented reality experiences using a free template offered by Snapchat.