Augmented reality is advancing even faster than predicted. With the introduction of the Lidar sensor in the iPhone 12 Pro, it seems like AR will finally reach mainstream users.
Of course, it’s not the only big advancement on the AR market. We’ve seen AR glasses and chips released by Microsoft and Google, and Apple itself has backed multiple AR initiatives. Still, the idea of scanning 3D objects with a Lidar-equipped iPhone is revolutionary - and it has a lot of implications.
As a software development team focused on AR and VR, we are absolutely thrilled to see such rapid advancement. Practically speaking, this is an actionable opportunity for small and medium businesses to leverage AR’s potential. What has changed and where to start? Let’s take a look.
3D scanning is not a new concept at all. Lidar technology, now built into Apple’s smartphones, has been used for decades in physics, geodesy, geography, meteorology, and other sciences. The significant difference is accessibility - the recently released scanner apps offer an entirely different level of access compared to traditional devices.
A 3D scanner aims to build a 3D model from a set of points that define the geometric shape of the subject. The scanner collects information about each point and then reconstructs the shape of the original object.
The technology behind 3D scanners is similar to that of cameras. Both need light to collect information, but while cameras mostly capture an object’s colors, 3D scanners determine the distance to each point on its surface.
Laser 3D scanning used in applications
A 3D laser scanner app defines an object’s shape by creating a point cloud that describes its dimensions in 3D. The points are then joined to form a 3D model of the entire object. In other words, 3D scanners recreate a precise image of a physical object on a digital device.
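To make the idea concrete, here is a minimal Swift sketch of how a single depth reading can be turned into one point of such a cloud using standard pinhole-camera math. The helper names are ours, purely for illustration, not part of any particular SDK:

```swift
import simd

// One entry in a point cloud: a 3D position measured by the scanner.
struct CloudPoint {
    let position: SIMD3<Float>
}

// Turn a depth reading at pixel (u, v) into a 3D point in camera space.
// `depth` is the measured distance to the surface in meters; `intrinsics` is the
// camera matrix (focal lengths fx, fy and principal point cx, cy).
func unproject(u: Float, v: Float, depth: Float, intrinsics: simd_float3x3) -> CloudPoint {
    let fx = intrinsics[0][0], fy = intrinsics[1][1]
    let cx = intrinsics[2][0], cy = intrinsics[2][1]
    let x = (u - cx) / fx * depth
    let y = (v - cy) / fy * depth
    return CloudPoint(position: SIMD3<Float>(x, y, depth))
}

// Repeating this for every pixel of a depth map produces the point cloud
// that is later joined into a mesh.
```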
3D scanning devices and apps help users make buying choices (makeup, clothes, furniture), create virtual and augmented reality experiences (by scanning the user’s surroundings, apps can build realistic digital environments), and much more.
In laser scanners, a laser beam is projected onto the object to be scanned, and the reflected light provides data on the precise position of every point in 3D. Because laser scanners operate at the speed of light, they can collect data on millions of points and return it to the device almost instantly.
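The underlying math is simple: the sensor measures how long the light takes to travel to the surface and back, then halves the round trip. A rough sketch of the principle - real sensors add calibration and filtering on top:

```swift
// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// Convert a measured round-trip time (seconds) into a distance (meters).
// The pulse travels to the surface and back, so the one-way distance is half the trip.
func distance(fromRoundTripTime t: Double) -> Double {
    speedOfLight * t / 2
}

// A pulse returning after 10 nanoseconds corresponds to a surface roughly 1.5 m away.
let example = distance(fromRoundTripTime: 10e-9)  // ≈ 1.5 m
```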
The data is processed in the application
The scanning itself is performed by hardware - on a mobile phone it can be done with an embedded laser scanner, as on the iPhone 12 Pro. The final aggregation of the data, however, happens in software.
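On iOS, that software side usually goes through ARKit. A minimal sketch of how an app could ask a Lidar-equipped device for both the reconstructed mesh and per-pixel depth might look like this (assuming ARKit’s scene reconstruction and scene depth options; the exact setup depends on the app):

```swift
import ARKit

func startLidarSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // Ask for the Lidar-based mesh of the surroundings, if the device supports it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Ask for a per-pixel depth map alongside every camera frame.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    session.run(configuration)
}
```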
Scanning apps for iOS and Android existed long before the official introduction of Lidar: users could connect third-party scanners to their devices and use them to collect the point data.
Object reconstruction
The final step is to recreate the scanned object in software. The application uses the point data to reverse-engineer the object’s shape. With the resulting CAD model (CAD stands for computer-aided design and refers to any design performed with computer systems), users can modify the image of a physical object, transfer it to a virtual environment, export it, share it, and so on.
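To give a feel for this step, here is a hedged sketch based on ARKit’s mesh anchors: the scanner’s output arrives as geometry buffers, and the raw vertices can be dumped into the simplest possible OBJ text that CAD tools can open. A real exporter would also write faces, normals, and the anchor’s transform.

```swift
import ARKit

// Read the raw vertex positions out of a Lidar mesh anchor's geometry buffer.
func vertexPositions(of meshAnchor: ARMeshAnchor) -> [SIMD3<Float>] {
    let source = meshAnchor.geometry.vertices
    var positions: [SIMD3<Float>] = []
    positions.reserveCapacity(source.count)

    for index in 0..<source.count {
        // Each vertex is three packed floats at a fixed stride inside the Metal buffer.
        let offset = source.offset + index * source.stride
        let pointer = source.buffer.contents().advanced(by: offset)
        let (x, y, z) = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
        positions.append(SIMD3<Float>(x, y, z))
    }
    return positions
}

// Write the vertices as "v x y z" lines - the most basic form of the OBJ format.
func objText(for positions: [SIMD3<Float>]) -> String {
    positions.map { "v \($0.x) \($0.y) \($0.z)" }.joined(separator: "\n")
}
```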
Here’s an interesting demonstration of what Lidar and scanning apps are capable of on the iPhone, performed by CNET journalists. Scanning furniture has always been one of the most common applications of 3D technology - knowing the exact dimensions and testing the pieces in an interior makes buying and transportation a lot easier. Lidar improved scanning precision, and users now get high-quality images of virtual furniture.
To be clear, the idea of 3D scanning with a mobile app isn’t new. Most existing 3D scanning apps for the iPhone date back to the iPhone 6, 7, and 8: users relied on the built-in cameras and recorded the object from many angles to get a precise image. So how is Lidar scanning better - if it is at all?
Camera scanning has its advantages: it provides color information at high resolution. The main selling point, though, is availability: cameras are cheap and present in almost any device. The downside is that they require balanced lighting - at night you won’t be able to scan much, and the same goes for shooting against direct sunlight.
Lidar 3D scanning, on the other hand, doesn’t capture RGB colors, so Lidar alone is inconvenient for many 3D scanning applications. When it comes to taking better pictures, modeling furniture, or getting a richer representation of reality, Lidar by itself won’t do much good. Without a camera, Lidar determines an object’s position and creates exact maps - applications widely used in scientific research and the military. For a typical user, though, making a map is hardly the end goal.
The combination of Lidar and a camera gives the fullest representation of reality in 3D scanning. The camera captures colors at high resolution, while Lidar collects information about dimensions and distances. Each tool does what it does best - and that’s how users get a fully colored 3D reconstruction.
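In practice, on an iPhone both streams arrive together: each ARKit frame carries the camera image and the Lidar depth map, so an app can pair every depth sample with its color pixel. A minimal sketch, assuming the scene depth option shown earlier is enabled:

```swift
import ARKit

// Pull the paired color image and Lidar depth map out of a single ARKit frame.
// Matching their pixel grids (the depth map is lower resolution) is what lets
// an app build a fully colored reconstruction.
func colorAndDepth(from frame: ARFrame) -> (color: CVPixelBuffer, depth: CVPixelBuffer)? {
    guard let sceneDepth = frame.sceneDepth else { return nil }
    return (frame.capturedImage, sceneDepth.depthMap)
}
```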
The release of Lidar in the iPhone has caused a commotion among app creators and researchers. The technology, previously available only to a few, has now become mainstream. Experts keep exploring the scanner’s potential - from ambitious self-driving car research to gimmicky party tricks.
For software development teams and businesses, it’s nothing short of a revolution. Many business models that used to be discarded because of the low quality of camera-based scanning are now resurfacing. So if you are thinking about building an application or launching an ambitious initiative, Lidar scanning is one of the directions worth considering.
High-quality photography
The immediate consequence of Lidar’s introduction is a boost in photo and video quality. Lidar provides information about an object’s position in space, which improves focusing. In complete darkness, though, Lidar alone won’t save the shot - the camera still needs some light.
The apps that will benefit immediately are social media apps that use photo and video filters. Active use of Lidar will help users take better Instagram photos, Snapchat snaps, and TikToks. Of course, even existing apps will take time to embrace the technology - which might be a chance for new applications to shine.
A team that builds an application leveraging Lidar’s technical potential could open up immense opportunities for partnerships and acquisitions.
Augmented reality will become much better
Journalists have tried Lidar out on existing AR games for iOS (like the Hot Lava arcade), and as expected, the user experience improved significantly. Applications scan surroundings a lot faster, the image looks better, and more interactions become available. Users can now move even more objects around - and still have them accurately represented in the in-game world.
Most importantly, Lidar’s introduction will boost the popularity of real-time AR streaming and video. Previously, getting a high-quality image of a moving object was impossible - cameras simply couldn’t keep up with the shifts in position.
For Lidar, detecting a change in a point’s position instantly has never been an issue - after all, it’s used to track the movements of self-driving vehicles and military objects. So it’s likely we’ll see a lot more action-based AR gaming, live streams, and dynamic filters. Augmented reality is becoming ever more immersive - and widely available.
Creating even better maps
Mapping out objects has always been Lidar’s main application. Now that it works together with the camera, those maps will also come with high-resolution color imagery. The practical applications are numerous.
Mapping living and business spaces - interior design, ergonomic planning, engineering - will get a lot better even outside professional companies. An average user will be able to create a professional model of a remodeled room or predict how a dress will look before trying it on. Lidar’s pro-level technology is going to be used for everyday tasks.
Accessible Lidar also brings us one step closer to self-driving technology. Users will see the benefits of Lidar’s precision on their own devices and become much more familiar with the technology currently employed in many self-driving systems. It’s a long game, but one that should raise tech awareness - and increase the acceptance of self-driving tech.
Entirely new AR-based business ideas
Prior to Lidar’s introduction, businesses were limited in their creativity. A lot of good ideas had to be put aside because the execution fell short. Building small 3D scanning apps didn’t make sense - the process took a lot of time and wasn’t widely accepted by the public. That’s why, up till now, 3D scanning has mostly been featured in ambitious projects with a clear practical payoff.
However, the increased speed and quality of 3D scanning make the process easy and fun. We’ll likely see far less sophisticated applications make use of Lidar. For instance, there could be an app that determines a person’s height, or fitness applications could use Lidar to scan a user’s measurements after workout sessions and track their progress.
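Both ideas are hypothetical, but the building block they would need already exists: measuring a real-world distance between two chosen points. A hedged sketch using ARKit raycasting - the function name and flow are ours, purely for illustration:

```swift
import ARKit
import simd

// Hypothetical helper: the straight-line distance (in meters) between two points
// in the camera image (ARFrame expects normalized image coordinates). Each point
// is raycast into the Lidar-backed scene to find where it lands in the real world.
func measuredDistance(between pointA: CGPoint, and pointB: CGPoint,
                      in frame: ARFrame, session: ARSession) -> Float? {
    func worldPosition(of imagePoint: CGPoint) -> SIMD3<Float>? {
        let query = frame.raycastQuery(from: imagePoint,
                                       allowing: .estimatedPlane,
                                       alignment: .any)
        guard let hit = session.raycast(query).first else { return nil }
        let column = hit.worldTransform.columns.3
        return SIMD3<Float>(column.x, column.y, column.z)
    }
    guard let a = worldPosition(of: pointA),
          let b = worldPosition(of: pointB) else { return nil }
    return simd_distance(a, b)
}
```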
3D scanning will enter industries that didn’t even consider adopting it before. The same shift will happen among users - rather than being a guilty pleasure of early tech adopters, 3D scanning will become an everyday possibility.
Image-based 3D human shape estimation
When Apple announced Lidar, many users pointed out that a similar technology already existed in the iPhone. That’s true: the TrueDepth camera also collects more than 30,000 points to capture the exact dimensions of a face. However, it’s still a camera - which means precision at longer distances isn’t possible.
Lidar uses the same concept but takes the execution to a new level. The TrueDepth camera can work in darkness, but only when the user’s face is inches away from it. Lidar offers the same precision for faraway objects as well.
Naturally, users are already enthusiastic about embracing the scanner. Videos testing Lidar and discussing new applications are gaining 500-600k views on YouTube. Here’s our take on emerging Lidar trends - based on the way users see them.
Using Lidar to scan cars
A YouTube video about cloning a car with Lidar and the iPhone 12 has already gained 134k views despite being released only recently. The blogger used the scanner to create an exact digital copy of his car.
Scanning cars, furniture, and other objects would make 3D printing much easier. Engineers can now work with highly precise digital models of real-life objects and deconstruct them. As users hypothesize in the comments, scanning entire locations could make it possible to build a real-life GTA (theoretically, of course - there are ethical implications to consider as well).
In-home arcades
We already mentioned using Lidar with AR arcades like Hot Lava. In this video, a user does just that. The first step is scanning the room with Lidar; the software then takes several minutes to process the scan - and the real-life surroundings are uploaded to the game with almost 100% precision.
Users can easily add effects, filters, and augmented elements on top of their real-life surroundings. Of course, use cases like the ones shown in the video apply outside entertainment as well - architects and designers will particularly benefit.
Comparing image quality with and without Lidar
Photographers, bloggers, and videomakers are also excited about the news. In particular, they are curious whether the camera can produce high-quality images at night. In this video, a blogger tests how much Lidar actually influences photo quality.
Long story short, there isn’t much difference for daylight shots, but the quality of photos taken in the dark noticeably improves. Photographers also remark that we might see some fun AR photography apps in the future.
Demonstrating the tech characteristics of the sensor
If you’d like to see what Lidar can and cannot do, take a look at this video. It’s a practical overview of the sensor’s range, sensitivity, and limitations. To put the innovation to the test, the blogger scans different objects - from furniture to a cat.
For many users, the presence of Lidar is the deciding factor in favor of the iPhone 12 Pro over the standard model. Even people who never worked with 3D scanning before now recognize its advantages.
Seeing inside a cave with Lidar
Here’s a less obvious application of the iPhone 12 Pro: cave explorers use a night vision app and the iPhone’s Lidar to navigate a dark cave. It’s a homemade demonstration, but the results are impressive.
There’s one definite prediction we can make about the future of 3D scanning: it will become a leading technology over the coming years and will likely turn into a standard feature for applications in almost any industry. This is why we, as a development team, are highly focused on being among the early adopters of Lidar. We are passionate about 3D scanning development - and now, we are also very optimistic.
If you are thinking about incorporating 3D scanning features into your application or building one from scratch, we would be happy to share our experience and expertise. Our team will find the best ways to leverage Lidar’s potential.