A human-machine interface (HMI) is what its name says: a way for humans to interact with machines. In the industrial world, these usually take the form of fixed buttons, 2D touchscreens, or computer terminals where a factory operator can monitor a machine's health, analyze its performance, and make remediating changes.

While these HMIs have iteratively improved thanks to advances in computing power and digital visualization, the way the frontline worker interacts with them to do their job hasn't changed much. What has changed is the machine's ability to carry out simple, repetitive tasks autonomously, decreasing the need for human involvement in some circumstances.

This siloed ecosystem, in which industrial robots complete those simple tasks inside safety cages while humans handle complex tasks like maintenance and quality inspections across these systems, is not a sound setting for the collaborative future of work.

McKinsey estimates that in 60% of occupations, 30% of the tasks involved are automatable. That means work will increasingly involve tradeoffs in responsibilities and handoffs between machines and humans, which in turn requires an immersive interface between the two to achieve higher levels of productivity.

A next-generation human-machine interface is clearly needed to facilitate these task exchanges in this collaborative era of work, and spatial computing is the leading candidate to power it.

Spatial computing plus augmented reality gives frontline workers next-gen digital powers

Critical information resides beneath the physical shells of industrial equipment, but there hasn't been an HMI that puts it into its physical context. Workers look at a screen to understand what is wrong with a machine, yet that interface is always separate from the physical machine that needs attention. Augmented reality is the frontline-worker-friendly digital platform that taps into this dynamic physical world and enables synchronized tasks.

Spatial data fuels AR monitoring and discovery

Spatial computing brings digital information from machines, people, and their surrounding environment into view through the lens of AR, unlocking several capabilities. With digital information overlaid on physical reality, frontline workers can discover and monitor meaningful relationships between systems in context. This can include a machine's real-time performance data viewed against its service history or within the entire production line. Equipped with this Industrial Internet of Things (IIoT) data, the frontline worker can identify bottlenecks and tweak processes to drive higher levels of overall equipment effectiveness (OEE) and throughput.
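To make that roll-up concrete, here is a minimal sketch in Python of how IIoT telemetry might be reduced to an OEE figure that an AR overlay could surface next to the machine. The field names and sample values are illustrative assumptions, not a specific PTC or ThingWorx API.

```python
# Minimal sketch: rolling hypothetical IIoT telemetry up into OEE
# (overall equipment effectiveness = availability x performance x quality).
# Field names and sample values are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class MachineTelemetry:
    planned_minutes: float      # scheduled production time
    downtime_minutes: float     # unplanned stops reported by the machine
    ideal_cycle_seconds: float  # design cycle time per unit
    units_produced: int         # total units in the period
    units_rejected: int         # units failing quality inspection

def oee(t: MachineTelemetry) -> float:
    run_minutes = t.planned_minutes - t.downtime_minutes
    availability = run_minutes / t.planned_minutes
    performance = (t.ideal_cycle_seconds * t.units_produced) / (run_minutes * 60)
    quality = (t.units_produced - t.units_rejected) / t.units_produced
    return availability * performance * quality

# Example: a machine an AR overlay might flag as a bottleneck.
sample = MachineTelemetry(planned_minutes=480, downtime_minutes=45,
                          ideal_cycle_seconds=30, units_produced=700,
                          units_rejected=21)
print(f"OEE: {oee(sample):.1%}")  # roughly 71% in this made-up example
```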

With enhancements in computer vision and artificial intelligence, workers can also detect the unpredictable events that are common in the industrial world. Take a futuristic safety example, where a worker is visually and audibly alerted to an impending collision with an industrial vehicle, or warned that a machine is at risk of malfunction. Digitizing these spatial events, running AI-driven inference models on them, and surfacing the results through the immersive AR lens could make unpredictable environments less so, reducing the thousands of annual injuries suffered by machine operators and elevating industrial companies' health and safety programs.
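As a rough sketch of how such an AR safety layer might decide to raise an alert, the snippet below checks whether a tracked worker and vehicle come within a threshold distance over a short look-ahead window. The thresholds and the assumption that positions and velocities come from a computer-vision tracker are illustrative, not safety-rated values or a real product interface.

```python
# Minimal sketch: flagging an impending worker/vehicle collision from
# tracked 2D positions and velocities (e.g. output of a computer-vision
# tracker). Thresholds are illustrative assumptions, not safety-rated values.

import numpy as np

def collision_alert(worker_pos, worker_vel, vehicle_pos, vehicle_vel,
                    alert_distance=1.5, horizon_s=3.0):
    """Return True if the pair comes within alert_distance metres
    during the next horizon_s seconds, assuming constant velocities."""
    rel_pos = np.asarray(vehicle_pos, dtype=float) - np.asarray(worker_pos, dtype=float)
    rel_vel = np.asarray(vehicle_vel, dtype=float) - np.asarray(worker_vel, dtype=float)
    speed_sq = rel_vel.dot(rel_vel)
    # Time of closest approach, clamped to the look-ahead window.
    t_closest = 0.0 if speed_sq == 0 else -rel_pos.dot(rel_vel) / speed_sq
    t_closest = min(max(t_closest, 0.0), horizon_s)
    closest_gap = np.linalg.norm(rel_pos + rel_vel * t_closest)
    return closest_gap < alert_distance

# Example: a forklift heading toward a stationary worker 6 m away at 2 m/s.
if collision_alert((0, 0), (0, 0), (6, 0), (-2, 0)):
    print("AR overlay: visual and audible proximity warning")
```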

Spatial computing fosters human-machine interactions and collaboration

Another way spatial computing serves as the next-generation HMI through AR is by changing the actual interaction between human and machine. With AR, the traditional physical HMI is transposed into the digital realm, replacing 2D touchscreens. Instead of using an out-of-context 2D screen to program a robot's actions, the worker uses a spatially native interface to guide the robot's sequential movements with waypoints and logic flows.

This in-situ spatial programming of collaborative robots (cobots) enables quicker reactions to market conditions and greater flexibility on the production line.

Spatial programming and 'Kinetic AR' let workers with minimal engineering or technical knowledge instruct the robot in their work cell through a headset or tablet, rather than taking costly downtime to have it reprogrammed by specialized personnel.
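To give a feel for what "waypoints and logic flows" could reduce to once the headset or tablet has captured the worker's taps in the work cell, here is a minimal sketch. The Waypoint and SpatialProgram classes and the execute() stub are hypothetical illustrations under that assumption, not the Kinetic AR or MiR programming interface.

```python
# Minimal sketch: a spatially authored robot program reduced to data.
# Each waypoint is a pose the worker placed in AR plus a simple action;
# the classes and execute() stub are hypothetical illustrations, not the
# Kinetic AR or MiR programming interface.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Waypoint:
    x: float            # metres, in the work cell's coordinate frame
    y: float
    heading_deg: float  # orientation the robot should face on arrival
    action: str = "move"        # e.g. "move", "pick", "place", "wait"
    condition: str = "always"   # simple logic-flow gate, e.g. "pallet_present"

@dataclass
class SpatialProgram:
    robot_id: str
    waypoints: List[Waypoint] = field(default_factory=list)

    def execute(self, sensors: dict) -> None:
        """Walk the sequence, skipping steps whose condition isn't met."""
        for wp in self.waypoints:
            if wp.condition != "always" and not sensors.get(wp.condition, False):
                continue  # logic flow: gate this step on a sensed condition
            print(f"{self.robot_id}: {wp.action} at ({wp.x}, {wp.y}), "
                  f"heading {wp.heading_deg} deg")

# Example: a route a worker might tap out in AR across the work cell.
program = SpatialProgram("mir-100", [
    Waypoint(0.0, 0.0, 0, "move"),
    Waypoint(3.5, 1.2, 90, "pick", condition="pallet_present"),
    Waypoint(7.0, 0.5, 180, "place"),
])
program.execute(sensors={"pallet_present": True})
```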

PTC's Reality Lab similarly uses Kinetic AR to program motion paths across a space for Frida, its MiR 100 mobile robot. This futuristic collaborative interaction could be useful for dynamically moving heavy payloads or other materials across a factory.

Expanding spatial technology's purview to remote work

The construct of remote work has been forever changed by COVID-19, and businesses will turn to technology, now and in the next normal, to accommodate this digitally driven future of work. Spatial computing expands the human-machine interface into this scenario with Remote Operator, which virtualizes spaces, along with the people and objects interacting within them, in real time.

In an example from PTC's Reality Lab, a remote expert can tap into a distant factory or facility for real-time monitoring, collaborate with onsite personnel and overlay annotations, analyze in-context asset performance data, and even control the machinery. Capturing and analyzing volumetric data also unlocks additional use cases for optimizing the space and its processes.

Final thoughts

The way we work is changing before our eyes as macroeconomic and technology forces make digital transformation pervasive across the value chain. Legacy systems are not purpose-built for this digital age, and the HMI is one example where the next step in its evolution is required to stay competitive. With the rise of spatial computing, and augmented reality as its human-friendly lens, the HMI revolution is quickly approaching.

Digitizing the Future of Work

Read PTC's vision for spatial computing and how it will play a pivotal role in manufacturing.

Tags:
  • Augmented Reality
  • Digital Transformation
About the Author

David Immerman

David Immerman is a Senior Research Analyst on PTC's Corporate Marketing team providing thought leadership on technologies, trends, markets, and more. Previously David was an industry analyst in 451 Research's Internet of Things channel primarily covering the smart transportation space and automotive technology markets, including fleet telematics, connected cars, and autonomous vehicles. He also spent time researching IoT-enabling technologies and other industry verticals including industrial. Prior to 451 Research, David conducted market research at IDC.

