For Apple (AAPL), the future is about far more than our physical reality. The company is building tools for augmented reality (AR) that could come into common use on over 1 billion devices around the world, and virtual reality (VR) hardware has been rumored to be on the way as well.

What's become clear in the last few quarters is that Apple is building the foundation of its AR strategy right before our eyes. The lidar scanner included in today's iPhones and iPads increases the accuracy and fidelity of AR on those devices, and Apple is already creating an ecosystem of apps and tools for developers. At its Worldwide Developers Conference last week, Apple said it is bringing AR to Maps and adding 3D capture tools for third-party apps, parts of an AR strategy that could keep this tech stock growing for the next decade. 
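
To put the lidar point in developer terms, here is a minimal sketch, assuming an app that already runs an ARKit session, of how a developer opts into the lidar-backed features Apple exposes today. It illustrates the public API, not anything Apple has said about its roadmap:

```swift
import ARKit

// Minimal sketch: opt into lidar-backed features in an ARKit session.
// Assumes the app already runs an ARSession (for example, behind an ARView).
func makeLidarConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a 3D mesh of the surroundings and is only
    // supported on devices with a lidar scanner, so check before opting in.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Lidar also enables per-frame depth data, which improves occlusion and placement.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    configuration.planeDetection = [.horizontal, .vertical]
    return configuration
}
```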

iPhones with augmented reality features. Image source: Apple.

Augmented reality is coming to maps

The most notable AR addition announced last week was AR for maps. In a news release, Apple described its mapping AR technology by saying, "With iOS 15, users can simply hold up iPhone, and Maps generates a highly accurate position to deliver detailed walking directions in augmented reality."

Apple Maps. Image source: Apple.

For now, this technology will only be available on the iPhone, but it's unlikely that's the end game. Makers of AR glasses such as Magic Leap have long envisioned maps as a high-value use for augmented reality. A Magic Leap app called Holomaps says you can "see 3D maps with live data, traffic, weather, and Twitter updates." If and when Apple announces AR glasses, it could offer the same tools.

The combination of knowing a user's location and being able to scan the surrounding area opens up a world of possibilities, especially if users are wearing Apple AR devices. And if Apple can use that scan data to improve its maps, the payoff could extend beyond Maps itself to new technologies like self-driving vehicles. 
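
Maps' AR walking directions are Apple's own feature, but ARKit already exposes the underlying idea, anchoring content to real-world coordinates, to third-party developers through geotracking. Here's a minimal sketch under that assumption; the coordinate is a hypothetical point of interest, not anything from Apple's announcement:

```swift
import ARKit
import CoreLocation

// Sketch: anchor AR content to a real-world location with ARKit geotracking.
// The coordinate below is a hypothetical placeholder.
func startGeoTrackedSession(on session: ARSession) {
    // Geotracking is only available in regions Apple has mapped in high detail.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geotracking unavailable: \(error?.localizedDescription ?? "unsupported location")")
            return
        }

        DispatchQueue.main.async {
            session.run(ARGeoTrackingConfiguration())

            // Pin virtual content (say, a walking-direction marker) to a coordinate.
            let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
            session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        }
    }
}
```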

Capturing AR assets just got a lot easier

Another notable addition to Apple's developer toolkit is Object Capture. One of the challenges with building cost-effective AR experiences is capturing 3D assets, and now that can be done with just a camera. Here's what Apple said about Object Capture and RealityKit 2, its framework for rendering AR content, in a press release:

RealityKit 2 introduces Object Capture, a simple and powerful API [application programming interface] on macOS Monterey [Apple's latest Mac operating system] that enables developers -- like Wayfair, Etsy, and more -- to create high-quality, photo-realistic 3D models of real-world objects in minutes by taking photos shot on iPhone, iPad, or DSLR and transforming them into 3D models optimized for AR. These models can be viewed in AR Quick Look or added to AR scenes in [applications like] Reality Composer or Xcode, making it easier than ever to build amazing AR apps.
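
Under the hood, Object Capture is exposed to developers through RealityKit's PhotogrammetrySession API on macOS Monterey. Here's a minimal sketch of the workflow Apple describes, a folder of photos in, a 3D model file out; the file paths are hypothetical placeholders:

```swift
import Foundation
import RealityKit

// Minimal Object Capture sketch (RealityKit 2, macOS Monterey).
// Paths are hypothetical placeholders; point them at a real photo folder.
@main
struct ObjectCaptureSketch {
    static func main() async throws {
        let photos = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)
        let model = URL(fileURLWithPath: "/path/to/output.usdz")

        // Hand the photos to a photogrammetry session and ask for a 3D model file.
        let session = try PhotogrammetrySession(input: photos)
        try session.process(requests: [.modelFile(url: model, detail: .reduced)])

        // The session streams progress and results back as an async sequence.
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .requestComplete(_, _):
                print("Model written to \(model.path)")
            case .processingComplete:
                return
            case .requestError(_, let error):
                print("Capture failed: \(error)")
                return
            default:
                break
            }
        }
    }
}
```

That photos-to-model loop is essentially the whole pipeline, which is why Apple name-checks retailers like Wayfair and Etsy: producing 3D assets has been the expensive part of AR commerce.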

Easier asset capture lowers the barrier for developers and companies to include 3D content in their apps. And more assets mean more app possibilities, both for current iOS devices and for next-generation hardware like AR or VR glasses and headsets. 

AR is core to Apple's future

Apple highlighted that there are over 1 billion AR-enabled devices in the world, and the company has slowly but surely been building its AR foundation for years. It has hardware with AR technology integrated, tools for developers to build with, and billions of users already in the ecosystem. If Apple introduces AR glasses in the next few years, as rumored, it could expand its product lineup even further and carry its growth into the next generation of technology devices. Don't sleep on the importance of AR to Apple's future.