A collection of new enterprise APIs will let developers create custom apps that bring mixed reality to staffers in manufacturing, warehouse, and production line roles.

With a heavy focus on artificial intelligence during the WWDC keynote on Monday, Apple executives spent just a short time outlining features coming to the company’s Vision Pro “spatial computing” headset. There was little in the visionOS 2.0 update to boost the device’s office productivity credentials, save for a larger virtual monitor.

There were, however, signs of how the device could become more useful for frontline workers with the addition of “enterprise APIs” for visionOS 2.0. These APIs give developers greater control over the Vision Pro’s sensors and other systems, enabling businesses to create custom visionOS apps for those in frontline roles, such as warehouse or production line workers, Apple said in a video on its developer site this week. That includes the ability to build see-what-I-see remote assistance apps, for example.

The enterprise APIs address “known functionality requirements” for the Vision Pro and show Apple’s intention to meet its customers “where they’re at,” said Tuong Nguyen, director analyst at Gartner. “It’s noteworthy that Apple has been this responsive in terms of opening up access/functionality to further empower developers and encourage enterprise adoption. I expect this will be a theme across future announcements regarding Vision Pro.”

With lagging sales of virtual and mixed reality headsets, and question marks over key use cases, frontline workers present an early route to adoption for Apple and others. “The near- and mid-term benefit [of AR/VR devices] will primarily be for frontline workers — usually in asset intensive industries,” said Nguyen.
“The value for information/knowledge workers, as well as consumers, will come much later.”

In the developer video, Apple outlined ways developers can create a wider range of visionOS apps for frontline staff, including three APIs aimed at improving access to Vision Pro sensors.

One API provides access to the main forward-facing camera. This could be used in conjunction with a computer vision algorithm to create an anomaly detection app that helps a production worker spot faulty components, Apple said. Another enables recording of a user’s entire passthrough video feed — previously it was only possible to record the apps a user was looking at — which could be used to provide remote assistance to field technicians. There’s also a QR code scanning API that lets a custom app detect a QR code and receive its content. It could be used, for example, by a warehouse worker to scan package barcodes and verify they have the correct item without the need for a handheld scanner, Apple said.

Three other APIs are focused more on background processes. One lets machine learning tasks run on the Vision Pro’s neural engine in addition to the CPU. Another, an “object tracking” API, can track multiple dynamic and static objects that appear in a user’s field of view. This could be used to track tools and parts in a complex repair environment — providing a technician with guidance on how to fix a machine, for instance, Apple said. There’s also an API that lets apps increase demands on computing resources beyond the Vision Pro’s default limits, essentially trading battery life and fan noise for performance in a demanding scenario such as rendering a high-fidelity, mixed-reality display of a race car.

These “behind the scenes” capabilities add flexibility to the Vision Pro and could be useful in a variety of scenarios, said Ramon Llamas, research director with IDC’s devices and displays team.
“That’s absolutely key, especially if you’re in the business of looking at a lot of objects in a quick amount of time, because the computing power for that may sometimes go beyond what the Vision Pro can offer out-of-the-box,” he said. “Giving developers and enterprise users the power to spin up or down [compute resources] can be the difference between the Vision Pro being nice-to-have and must-have.”

Llamas said the new APIs enable Apple to catch up with others in the market when it comes to enterprise functions. “That’s where the market is right now and it’s important for Apple to have these kinds of functionalities built in so that they are part of the enterprise solution conversation,” said Llamas.

The additional workplace functionality reflects broad potential use cases for mixed reality, said Nguyen, and should help Apple maximize adoption of the Vision Pro, “because this early in the market, no one — including Apple — will get massive adoption volume off a single use case, or functionality.

“Similar to the smartphone era, there’s no killer app,” he said. “It’s a collection of applications and use cases that will make Vision Pro (and other head-mounted displays) a valuable device.”

The enterprise APIs are currently in beta, according to Apple.
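To make the main-camera API described above concrete, here is a minimal Swift sketch of how a visionOS 2 app might stream frames from the forward-facing camera via ARKit’s `CameraFrameProvider`. This assumes the app carries Apple’s enterprise main-camera-access entitlement and license file; exact type and method names should be checked against Apple’s current documentation.

```swift
import ARKit

// Sketch: stream frames from the Vision Pro's main forward-facing camera.
// Assumes the enterprise main-camera-access entitlement is granted;
// without it, running the provider will fail.
func streamMainCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()
    try await session.run([provider])

    // Pick a supported video format for the left main camera.
    guard let format = CameraVideoFormat
        .supportedVideoFormats(for: .main, cameraPositions: [.left])
        .first,
        let updates = provider.cameraFrameUpdates(for: format)
    else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer is a CVPixelBuffer that could be fed
            // into a computer vision model for anomaly detection, as in
            // Apple's faulty-component example.
            _ = sample.pixelBuffer
        }
    }
}
```

Since this depends on visionOS hardware and an enterprise entitlement, it is a sketch of the flow rather than a drop-in implementation.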
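The QR code scanning API follows a similar provider pattern. Below is a hedged sketch using ARKit’s `BarcodeDetectionProvider`, again assuming the corresponding enterprise entitlement; the `payloadString` property and update-event names reflect the announced API surface but should be verified against Apple’s docs.

```swift
import ARKit

// Sketch: detect QR codes in the user's surroundings — e.g., a warehouse
// app verifying package labels without a handheld scanner.
// Assumes the enterprise barcode/QR scanning entitlement is granted.
func watchForQRCodes() async throws {
    let session = ARKitSession()
    let provider = BarcodeDetectionProvider(symbologies: [.qr])
    try await session.run([provider])

    for await update in provider.anchorUpdates where update.event == .added {
        // payloadString carries the decoded QR content, which the app
        // could check against an order manifest or inventory record.
        if let payload = update.anchor.payloadString {
            print("Scanned: \(payload)")
        }
    }
}
```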
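For the neural engine capability, the Core ML side is the standard `computeUnits` configuration; what the enterprise entitlement changes, per Apple’s description, is whether visionOS actually schedules work onto the neural engine rather than restricting it to the CPU. A brief sketch (the model name `AnomalyDetector` is hypothetical):

```swift
import CoreML

// Sketch: opt a Core ML model into all available compute units.
// On visionOS, neural engine execution for enterprise apps is gated by
// an entitlement; without it, inference stays on CPU/GPU.
func loadDetector(at url: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    // .all lets Core ML schedule across CPU, GPU, and neural engine.
    config.computeUnits = .all
    // url points at a compiled model, e.g. AnomalyDetector.mlmodelc
    // (hypothetical name for the faulty-component detector above).
    return try MLModel(contentsOf: url, configuration: config)
}
```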