Wearing a device that slips on as easily as a hat, PACCAR employees are able to access step-by-step holographic instructions to guide them through unfamiliar tasks like assembling a truck door. In Dynamics 365 Guides, lighted arrows create a path from each instruction card to the precise hole where a wire needs to be threaded or to the location of the correct tool on the factory floor.

Holographic drawings superimposed on the actual door show how to perform that task and light up structures behind the steel panel that normally can't be seen without superpowers like X-ray vision.

In industries with aging workforces, there's also an urgent need to impart workplace wisdom that employees have accumulated through years of apprenticeship or decades on the job to the next generation of workers.

At Alaska Airlines, for instance, it can take roughly two years for a new mechanic to get fully trained and up to speed. The hope is that mixed reality tools might reduce that learning curve significantly. The immersive training environment also resonates with employees who have grown up with video games and nearly instantaneous access to digital information.

'It brings the paper to life,' said Mike Lorengo, director of Architecture and Strategy at Alaska Airlines. 'Rather than seeing a flat piece of paper, I'm seeing 3D projected onto an engine.'

Mixed reality is even more powerful when it takes advantage of the different capabilities of the intelligent edge and the intelligent cloud.

In some scenarios, you want to quickly process information on the intelligent edge without sending that data to the cloud, such as in cameras that can alert you to imminent safety risks or algorithms that control braking systems. On a factory floor, not all the data from each sensor on each piece of equipment is relevant at any given time. So running less-complicated AI services on the edge can help filter out irrelevant information or perform tasks that don't require the power of the cloud.
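As a rough illustration of that split, the sketch below shows a hypothetical edge-side filter in Python. The sensor fields, thresholds and the send_to_cloud callback are all invented for this example rather than taken from any particular Azure or partner service; the point is simply that uninteresting readings never leave the edge.

    # Hypothetical edge-side filter: forward only readings that hint at a fault.
    from dataclasses import dataclass
    from typing import Callable, Iterable, List, Tuple

    @dataclass
    class SensorReading:
        sensor_id: str
        vibration_mm_s: float    # vibration velocity in mm/s
        temperature_c: float     # housing temperature in degrees Celsius

    # Illustrative limits; real alarm thresholds depend on the equipment.
    VIBRATION_LIMIT_MM_S = 7.1
    TEMPERATURE_LIMIT_C = 85.0

    def is_relevant(reading: SensorReading) -> bool:
        """Keep only readings that suggest a developing problem."""
        return (reading.vibration_mm_s > VIBRATION_LIMIT_MM_S
                or reading.temperature_c > TEMPERATURE_LIMIT_C)

    def process_on_edge(readings: Iterable[SensorReading],
                        send_to_cloud: Callable[[List[SensorReading]], None]) -> Tuple[int, int]:
        """Filter a batch locally and forward only the relevant subset."""
        batch = list(readings)
        relevant = [r for r in batch if is_relevant(r)]
        if relevant:
            send_to_cloud(relevant)   # everything else stays on the edge
        return len(relevant), len(batch)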

If you need a hologram to help potential customers envision how a new car or a planned remodel will look with different options, a single HoloLens 2 using on-board capabilities in a showroom or living room will offer plenty of computing power and resolution.

But connecting that device to the new Azure Remote Rendering mixed reality cloud service can quickly produce intricate, three-dimensional digital models that begin to rival the sculpted clay or detailed architectural models that a company might spend days or months building today. That simply wouldn't be possible without the graphics processing power of the cloud.

'Suddenly mixed reality goes from something that's a novel way to augment what you're already doing to being able to replace an entire business process - for example, using full digital construction in a way that just couldn't happen before,' said White.

PTC's IoT and mixed reality tools help companies minimize downtime by empowering on-site workers to quickly diagnose and repair machines that are critical to their operations.

PTC, one of Microsoft's partners, has developed integrated systems that combine IoT edge solutions, the Azure cloud and mixed reality tools to digitally transform businesses of all kinds, from aerospace and defense contractors to clothing brands and life science companies.

Think about a lab technician who comes into work one morning and finds a critical machine that processes blood samples isn't working, said Jim Heppelmann, president and CEO of PTC.

Several years ago, a blinking light or vague error message might be the only clue to what's wrong. She'd probably call the manufacturer, who might or might not be able to diagnose the problem over the phone. Most likely, they'd have to dispatch a repair person for that specialized machine who might or might not work in that city. It could take hours or days of downtime to get it back up and running. Meanwhile, patients worried about their blood results would be left in the dark.

Today, with the ThingWorx for Azure service, she could put on a HoloLens 2 device and see a holographic dashboard with each component's health and status mapped onto the physical machine. The data collected by tiny IoT sensors and sent to the Azure cloud might diagnose a problem with one of the cartridges. The lab tech could access step-by-step holographic instructions showing her how to open the cover, which lever to flip, how to insert the new part. If she can't figure it out, a repair expert sitting in the manufacturer's office in Nebraska could look at a screen, see exactly what she sees through HoloLens 2 and walk her through the job.
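A minimal sketch of the cloud-side idea is below, with the caveat that it is not the ThingWorx for Azure or Dynamics 365 Guides API: the component limits, telemetry fields and guide steps are invented to show how per-component health and repair instructions could be derived from incoming sensor data.

    # Hypothetical diagnostic rule: map telemetry to component health and
    # attach repair steps for anything out of range. All names and ranges
    # here are illustrative, not from the real product.
    COMPONENT_LIMITS = {                      # invented healthy ranges
        "cartridge_1": {"pressure_kpa": (90.0, 110.0)},
        "pump":        {"flow_ml_min": (40.0, 60.0)},
    }

    REPAIR_GUIDES = {                         # invented guide content
        "cartridge_1": ["Open the cover",
                        "Flip the release lever",
                        "Insert the replacement cartridge"],
    }

    def diagnose(telemetry: dict) -> dict:
        """Return status and repair steps for each monitored component."""
        report = {}
        for component, limits in COMPONENT_LIMITS.items():
            metrics = telemetry.get(component, {})
            # Missing data counts as healthy in this toy example.
            healthy = all(lo <= metrics.get(name, lo) <= hi
                          for name, (lo, hi) in limits.items())
            report[component] = {
                "status": "ok" if healthy else "fault",
                "steps": [] if healthy else REPAIR_GUIDES.get(component, []),
            }
        return report

    # An out-of-range cartridge pressure yields a fault plus the steps a
    # headset could render as holographic instructions.
    print(diagnose({"cartridge_1": {"pressure_kpa": 130.0},
                    "pump": {"flow_ml_min": 50.0}}))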

'It's a closed loop between humans and things,' Heppelmann said. 'The IoT devices tell me what's wrong, and the mixed reality solutions allow me to repurpose that blood test technician into someone who's able to fix a simple problem that saves time and money on airplane tickets and rental cars.'

For any first-line worker who might wear a mixed reality headset for a good portion of the day, the improved comfort and larger field of view in HoloLens 2 - which allows people to see multiple holograms, read text and view intricate details in 3D - will be transformative, Heppelmann said.

'Those two things take HoloLens from a device that's interesting to play with and prototype with to one that could be put into widespread production in factories, in hospitals, in construction sites today. This is a big step forward,' he said.

Those features are also important to Bentley Systems, another Microsoft partner that develops software for engineers, architects and construction firms building massively complicated infrastructure projects.

When overhauling an urban train station or building a new soccer stadium with lots of moving parts and heavy equipment, looking down to access information on a phone or tablet can be dangerous, said Noah Eckhouse, Bentley senior vice president for project delivery. HoloLens headsets allow workers to access digital information while remaining aware of their physical surroundings.

Through HoloLens, the company's SYNCHRO software allows workers to zoom in on a particular location on the construction site and access important digital information, like safety guidelines or installation instructions, for that particular job or area. Managers can see in three dimensions what the project is expected to look like two days or three weeks from now - based on constantly changing realities and projections - and anticipate any scheduling conflicts.

'A construction site is like a giant ballet - it's a very highly choreographed operation with movements of materials and people that all have to exist within a certain space,' he said. 'And the plan changes the first day you're on the job.'

While it might be possible to store and update plans for a two-bedroom bungalow on a single device, it would be impossible to track all the moving parts on a massive infrastructure project without the cloud, Eckhouse said.

By connecting each HoloLens device on a job site to a master model that's constantly updating in Azure, SYNCHRO ensures that everyone works from the same shared reality, with the latest information to sequence jobs, plan crane movements, track progress and keep workers safe.
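In outline, that shared model behaves like a versioned cache on each device. The Python sketch below assumes hypothetical fetch_version and fetch_model callables standing in for whatever synchronization protocol SYNCHRO actually uses; it is illustrative only.

    # Illustrative shared-model cache, not Bentley's actual SYNCHRO service.
    class SharedSiteModel:
        def __init__(self, fetch_version, fetch_model):
            self._fetch_version = fetch_version   # () -> int, latest version in the cloud
            self._fetch_model = fetch_model       # (int) -> dict, model at that version
            self.version = -1
            self.model = {}

        def refresh(self) -> bool:
            """Pull the latest model if the cloud copy has moved on."""
            latest = self._fetch_version()
            if latest > self.version:
                self.model = self._fetch_model(latest)
                self.version = latest
                return True                       # caller can re-render holograms
            return False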

'The cloud connectivity is critical because in these large projects the amount of information going back and forth between the field and the engineers and designers is continual,' Eckhouse said. 'And the consequences of working on infrastructure projects in the physical world are very real.'

Bringing powerful perception tools to the edge

Two defining achievements in computer vision and AI contribute to HoloLens 2's immersive experience. The ability to interpret physical spaces with semantic understanding allows the device to differentiate between walls and windows or a couch and coffee table. Natural hand-tracking now allows people to grasp, rotate and expand the holograms more instinctively, rather than having to learn gestures that mimic mouse movements.

Those advances are enabled by the fourth generation of Kinect, combined with AI tools that operate on the edge. That depth- and motion-sensing technology was originally developed nearly a decade ago to create a gesture-recognition accessory for Xbox. But the ability to sense depth accurately and pinpoint how human bodies are moving in space turned out to have far broader applications than gaming.

Ocuvera, for instance, is working with Azure Kinect in a system that aims to help prevent the roughly 1 million falls that occur in U.S. hospitals each year, and many more worldwide. It can sense when a patient who needs help walking is trying to get out of bed unassisted, with enough advance warning to alert a nurse to go help.

Using a depth-sensing camera and AI algorithms, the system recognizes patterns of movements before a patient gets out of bed, like sitting up or swinging their legs around. Initial results from pilot studies at 11 clinical sites found that unassisted and unobserved bed exits decreased by more than 90 percent after the technology was implemented.
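A toy version of that kind of rule, applied to a short history of pose estimates, might look like the following. The field names and thresholds are invented for explanation and are not taken from Ocuvera's system, which uses a depth camera and AI algorithms rather than hand-written thresholds.

    # Toy bed-exit rule over a stream of pose estimates; all fields invented.
    from collections import deque

    def bed_exit_imminent(pose_history: deque) -> bool:
        """Flag a likely unassisted bed exit from the last few pose frames."""
        if len(pose_history) < 2:
            return False
        earlier, latest = pose_history[0], pose_history[-1]
        sat_up = (earlier["torso_angle_deg"] < 30 and
                  latest["torso_angle_deg"] > 60)      # lying -> upright
        legs_over_edge = latest["feet_past_bed_edge"]   # from the depth map
        return sat_up and legs_over_edge

    # Example frames a depth pipeline might produce a few seconds apart.
    frames = deque([
        {"torso_angle_deg": 10, "feet_past_bed_edge": False},
        {"torso_angle_deg": 75, "feet_past_bed_edge": True},
    ], maxlen=30)
    print(bed_exit_imminent(frames))   # True -> alert a nurse with lead time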

CEO Steve Kiene said Ocuvera's team has investigated every depth-sensing camera in the world and even tried to build its own. When it comes to distinguishing whether a patient is moving forward or just rolling over, or to detecting the first wiggle of a foot, none has come close to the accuracy and resolution of Azure Kinect.

'It's like looking for tells when you're playing poker,' he said. 'Only Azure Kinect gives us the data to really see what's going on with a patient in a hospital bed and predict their intent with enough accuracy. When we do a pilot with a hospital, they often tell us that's just not possible, but then they find out it does work, and they're amazed. It's kind of like magic.'
