Apple’s New AR Headset Could Be Unveiled This Year
Apple’s internal disagreements shaped and repeatedly changed its goals for the headset, according to people familiar with the project. One major conflict pitted senior designer Jony Ive against the leader of the company’s AR team.
The headset could use a laser scanning system called LiDAR to detect objects in the environment and build a map of the surrounding space. Virtual items could then be anchored to locations within that map.
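The geometry behind that kind of placement is well understood: a depth reading at a given pixel can be "unprojected" into a 3D point using the camera's intrinsics, and a virtual object can then be anchored at that point. A minimal sketch in Python, with illustrative focal lengths and principal point (real values would come from the device's own calibration, not these numbers):

```python
def unproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a pixel (u, v) plus a LiDAR depth reading (in metres)
    into a 3D point in the camera's coordinate frame, using the
    standard pinhole camera model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example: a 2 m depth sample at the image centre maps straight ahead.
point = unproject(u=320, v=240, depth_m=2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(point)  # (0.0, 0.0, 2.0)
```

A real AR pipeline fuses many such points into a mesh of the room, but per-pixel unprojection like this is the basic step that turns a depth scan into a 3D map.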
Adaptation of iPad Apps
Apple is reportedly working on a huge number of apps to make the new headset feel like an extension of the iPad. A report by Mark Gurman for Bloomberg lists loads of features that could attract headset buyers, including reading books in virtual reality, FaceTime calling and Apple TV watching, meditating with audio/visual sessions, workouts and snapping photos. The headset is said to be able to run most existing iPad apps, though they may need to be tweaked to work in a 3D interface.
Rumored features include a dedicated FaceTime experience that supports Memoji-like avatars and virtual meeting rooms, as well as a version of Freeform for working on collaborative projects with others. There will also reportedly be a Books app for reading in virtual reality and a Camera app for capturing images straight from the lenses. The headset is expected to display multiple apps at once and to remember where each app was placed in the room as you switch between them.
The headset will be able to use Apple’s homegrown productivity software suite, too, with adapted versions of Pages, Numbers, Keynote and iMovie. Apple is also said to be working on a number of games for the headset, and it’s believed that one of the major selling points will be the ability to watch sports in immersive ways.
Mac-Level M2 Processors
Apple is reportedly putting two processors inside the headset: a Mac-level M2 chip and a second chip built on TSMC’s low-power 4nm process. The M2 will handle the bulk of the headset’s processing, while the second chip handles sensor-related tasks. This should allow for a more immersive experience with less latency. Having the headset do the processing itself, rather than offloading it to an iPhone or Mac, could also help keep the weight down and preserve battery life.
The AR headset will also have an external display, expected to be a standard OLED screen from LG Display. The internal displays, however, will be micro OLED panels, which are built directly onto chip wafers, allowing thinner and smaller pixels than traditional OLED screens. Micro OLED panels also have much faster, microsecond-level response times, making them ideal for VR and AR.
The headset’s cameras will reportedly be used for facial recognition to scan the user’s expressions and features and match those to 3D Animoji or Memoji avatars, which will then appear in the headset for a lifelike chatting experience. The headset will also have a FaceTime experience that’ll let users see a realistic avatar version of themselves for one-on-one chats. Other apps rumored for the headset include a fitness app with a virtual trainer and a new version of Freeform for creating and working on 3D content.
Eye Tracking
A key component of AR and VR is eye tracking. It lets the system know where in a virtual scene your eyes are looking and then adjust the display for maximum clarity. It also lets the system measure your IPD (interpupillary distance, the distance between your pupils), a critical factor in making sure the headset fits correctly and delivers the best possible image quality.
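IPD itself is simply the distance between the two tracked pupil centres. A minimal sketch, assuming (hypothetically) that the eye tracker reports pupil positions as 3D coordinates in millimetres:

```python
import math

def interpupillary_distance(left_pupil, right_pupil):
    """Euclidean distance between the two tracked pupil centres, in mm.
    A headset can use this value to set its lens separation so each
    eye looks through the optical centre of its lens."""
    return math.dist(left_pupil, right_pupil)

ipd = interpupillary_distance((0.0, 0.0, 0.0), (63.0, 0.0, 0.0))
print(ipd)  # 63.0 — a typical adult IPD in millimetres
```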
This is just one of the capabilities that Apple’s rumored AR headset will have, according to analyst Ming-Chi Kuo. The device is reportedly going to use multiple processors, including a new M2 chip, to run the various functions. The headset may also have cameras that feed real-time video of your surroundings into the screens inside the glasses, similar to how Microsoft HoloLens works, though early renders of the hardware make it look more like futuristic ski goggles than an iPhone.
Adding eye tracking to the mix can improve the user experience in many ways. It enables features like foveated rendering, which renders at full resolution only the part of the image your eyes are focused on and lowers detail in the periphery, where your vision is less sharp. That makes for a smoother, more natural viewing experience than rendering every frame at uniform resolution. Eye tracking can also let you control the device with your gaze alone, which could be a very handy feature for field service technicians who need to quickly scan and troubleshoot equipment in the wild.
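One common way to structure foveated rendering is to bucket screen tiles into resolution tiers by how far they sit from the gaze point in degrees of visual angle. A simplified sketch; the tier names and angular thresholds here are illustrative, not Apple's:

```python
def foveation_level(tile_center, gaze_point, inner_deg=10.0, mid_deg=25.0):
    """Pick a render-resolution tier for a screen tile based on its
    angular distance (in degrees) from the current gaze point.
    Both points are (horizontal, vertical) angles in the view."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    angle = (dx * dx + dy * dy) ** 0.5
    if angle <= inner_deg:
        return "full"     # fovea: render at native resolution
    if angle <= mid_deg:
        return "half"     # near periphery: half resolution
    return "quarter"      # far periphery: quarter resolution

print(foveation_level((5.0, 0.0), (0.0, 0.0)))   # full
print(foveation_level((30.0, 0.0), (0.0, 0.0)))  # quarter
```

The savings come from the geometry of vision: only a few degrees around the gaze point are seen sharply, so most of each frame can be rendered cheaply without the user noticing.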
In-Air Virtual Keyboard
Apple has been building more advanced AR tools into iPhones and iPads for years, telegraphing its intent to move beyond its existing headset with a more powerful device. And with a new M2 chip, that headset may finally be coming this year.
It will reportedly include a virtual keyboard that appears on the surface where the user is typing. This will allow for a number of different uses, including entering text in games that require a mouse or keyboard and providing a fallback option to correct speech recognition errors. Users could also use it to create a keyboard for use with another computer or mobile device, or as a way to interact with the virtual world of their choice.
The virtual keyboard will be able to track the movement of individual fingers, making it possible to play complex PC games that wouldn’t work well with a controller or touch input. It will also supposedly support a range of different key combinations, allowing a custom setup that’s right for each user.
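Conceptually, turning a tracked fingertip into a keypress is a lookup on the virtual keyboard's plane. A toy sketch, assuming the fingertip position has already been projected onto that plane in metres (the key size and row layout here are illustrative, not from any Apple design):

```python
def key_at(fingertip, origin, key_w=0.019, key_h=0.019,
           rows=("qwertyuiop", "asdfghjkl", "zxcvbnm")):
    """Map a fingertip position (x, y in metres on the keyboard plane)
    to the key it lands on, or None if it misses the layout.
    `origin` is the plane position of the top-left key's corner."""
    col = int((fingertip[0] - origin[0]) // key_w)
    row = int((fingertip[1] - origin[1]) // key_h)
    if 0 <= row < len(rows) and 0 <= col < len(rows[row]):
        return rows[row][col]
    return None

print(key_at((0.0095, 0.0095), origin=(0.0, 0.0)))  # q
```

A production system would add per-finger touch-down detection and layout staggering, but the core mapping from tracked position to key is this kind of grid lookup.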
It’s likely the headset will come with its own specialized software that will let users scan real-world objects and convert them into digital assets. And a report from The Information suggests that it will even allow user-created apps to appear in the App Store, though they’ll have to pass Apple’s strict approval process.