Apple is widely expected to introduce its long-rumored mixed reality headset at WWDC 2023. This should come as a surprise to few, in part because Apple has been championing AR since at least WWDC 2017. That's when Apple started laying the groundwork for the headset's technology with developer tools on the iPhone and iPad, first introducing ARKit, the augmented reality framework that helps developers create immersive experiences on iPhones and iPads.
ARKit has been a focus for Apple in the years since, with the company devoting significant keynote time to introducing and demonstrating new AR capabilities. Who could forget the wooden tabletops strewn about the stage that served as building surfaces for virtual LEGO sets?
By emphasizing these tools, Apple has conveyed the importance of augmented reality technology as part of the future of its platforms.
Software isn't the only part of the iPhone and iPad designed with a mixed reality future in mind. The hardware, too, has become increasingly equipped to act as a portable window into the world of augmented reality.
Starting with Face ID and Animoji (and later Memoji), Apple set out to tune the iPhone for augmented reality. Internally, the iPhone's Neural Engine was designed to handle augmented reality workloads effortlessly.
The iPhone's main camera system has even gained a dedicated LiDAR sensor, the same kind of technology lunar spacecraft use to navigate the surface of the moon and self-driving cars use to read their surroundings.
There was even an iPad Pro update focused almost entirely on adding a LiDAR scanner to the rear camera.
Why? Sure, it helped with focus and depth sensing for portrait mode photos, but there were also iPad apps dedicated to decorating your room with virtual furniture or trying on glasses without the actual frames in hand.
What was clear from the start was that ARKit wasn't meant solely for immersive experiences on the iPhone and iPad. The phone's screen is too small to be truly immersive, and the tablet is too heavy to hold up for extended periods of use.
Augmented reality absolutely has its uses on iPhones and iPads. Catching pocket monsters in the real world in Pokémon GO is far more novel than doing it in an all-digital environment. Dissecting a virtual specimen in the classroom can be more approachable than handling the actual guts.
However, the most immersive experiences, the ones that really trick your brain into believing you're physically surrounded by digital content, require goggles.
Does this mean everyone will be interested enough in augmented and virtual reality to make a headset a hit? The feedback on AR for the iPhone and iPad has, at times, been that Apple is offering a solution in search of a problem.
However, there are some downright delightful augmented reality experiences out there.
Want to see the true dimensions of an announced but not-yet-released product? Augmented reality is likely how a lot of people first experienced the Mac Pro and Pro Display XDR.
Dropping a virtual space rocket at 1:1 scale into your living room will also give you a good idea of just how big these machines are. Experiencing a virtual rocket launch that lets you look down at the ground as if you were a passenger can be exhilarating too.
Augmented reality was also the best way to introduce my kids to dinosaurs without the risk of time travel bringing a T-Rex back to the present day.
As for ARKit, there are a number of ways Apple has publicly built the tools that will power the headset experience starting next month.
For starters, the framework gave developers the tools, APIs, and libraries needed to build AR apps in the first place. Motion tracking, scene detection, light estimation, and camera integration are all essential to delivering AR applications.
Real-world tracking is another important factor. ARKit provides the tools to use hardware sensors such as the camera, gyroscope, and accelerometer to accurately track the position of virtual objects in a real environment on Apple devices.
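In practice, a developer opts into this sensor fusion with just a few lines. Here's a minimal sketch of starting an ARKit world-tracking session (this requires an ARKit-capable iOS device; the plane-detection and light-estimation options shown are illustrative, not required):

```swift
import ARKit

// Minimal sketch: start a world-tracking AR session.
// ARWorldTrackingConfiguration fuses camera, gyroscope, and
// accelerometer data to track the device's position in the room.
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical] // find tabletops and walls
configuration.isLightEstimationEnabled = true           // match virtual lighting to the room
session.run(configuration)
```

Anchors placed in this session stay pinned to real-world positions as the device moves, which is what keeps a virtual object "sitting" on your table.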
Then there's face tracking. ARKit gives developers access to the same facial tracking capabilities Apple uses to power Animoji and Memoji, mirroring your facial expressions onto virtual characters.
AR Quick Look is another technology worth mentioning. It's what augmented reality experiences use to place virtual objects, like products, in the real environment around you. Accurately measuring those objects and remembering their position relative to your device is what helps sell the illusion.
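Apps can hand a USDZ model to the system and get the whole AR Quick Look experience for free. A sketch, assuming a hypothetical "chair.usdz" model bundled with the app:

```swift
import ARKit
import QuickLook
import UIKit

// Sketch: present a bundled USDZ product model with AR Quick Look.
class ProductViewController: UIViewController, QLPreviewControllerDataSource {
    func showModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "chair.usdz" is a placeholder asset name for this example.
        let url = Bundle.main.url(forResource: "chair", withExtension: "usdz")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```

This same mechanism works on the web: Safari recognizes links to USDZ files and launches AR Quick Look directly, which is how Apple's own product pages let you drop a Mac Pro on your desk.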
Newer versions of ARKit have focused on supporting shared AR experiences, persisting virtual content between sessions, detecting objects in your environment, and occluding virtual content behind people in the scene. Performance has also been steadily tuned over the years, so the underlying technology supporting VR and AR experiences in the headset should be pretty powerful.
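These newer capabilities are mostly opt-in flags on the same configuration object. A sketch of turning on collaboration and people occlusion, assuming a device with an A12 chip or later:

```swift
import ARKit

// Sketch: newer ARKit features are flags on the tracking configuration.
let configuration = ARWorldTrackingConfiguration()

// Shared experiences: peers exchange ARCollaborationData over a
// network transport of the app's choosing (e.g. MultipeerConnectivity).
configuration.isCollaborationEnabled = true

// People occlusion: render virtual content behind people in the camera
// feed, rather than unrealistically pasting it on top of them.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

ARSession().run(configuration)
```

Persistence works along similar lines: an app can serialize the session's `ARWorldMap` and reload it later so virtual objects reappear exactly where they were left.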
We're expecting our first official glimpse of the Apple headset on Monday, June 5, when Apple kicks off its next major event. 9to5Mac will be at the special event, so stay tuned for comprehensive coverage and close-ups. Best of luck to the HTC Vives and Meta headsets out there.