Combating the dark side of extending reality

Published on 23/05/2019 | Written by Jonathan Cotton

It’s all fun and games at the moment, but the risks of VR, AR and all the rest are real, says a new report…

Extended reality – that’s VR, AR, haptics and holograms to you and me – seems to be having its moment, cracking the mainstream and surprising many with real-deal emerging business-use cases.

Take Lenovo’s latest, the ThinkReality A6. The unit (pictured), which offers a real world view (via see-through lenses a la the HoloLens) onto which 3D images are projected, is squarely aimed at the enterprise market.

“The ThinkReality platform enables users to pin, interact and collaborate with 3D digital information in the real world, improving their contextual awareness and efficiency,” goes the PR spiel.

“It is designed to help enterprise workers use AR applications to receive assistance, reduce repair times, eliminate errors, streamline complex workflows, improve training quality, collaborate and save money.”

It’s cool stuff with significant potential business and economic benefits, but as always, it’s only a matter of time before someone figures out how to exploit the system. So what’s the downside to our emerging ability to alter, augment and extend our very perception of reality?

Enter the latest Accenture report, Waking Up to a New Reality: Building a Responsible Future for Immersive Technologies, which, as well as chronicling the market’s recent soaring growth, examines how XR poses personal and societal risks that are not yet well understood. Those risks include the creation of ‘fake experiences’ and anti-social behaviour, as well as a terrifying new twist on traditional security and privacy challenges.

“The blurring of physical and virtual boundaries unearths urgent new questions around reality, trust and mental health,” says Laurence Morvan, chief of staff and chief corporate social responsibility officer at Accenture.

“Our intimate feelings, behaviors and judgments may be captured as data for new uses – or misuses. The potential physical, mental and social costs of mistakes are too significant to try to fix retrospectively.”

Those threats, though still somewhat vague, have ramifications for business that need to be addressed today, the report says.

“Companies already know from current technology challenges that retrospective responsibility costs dearly.

“With immersive experiences, the stakes are so high, we cannot afford to wait until large-scale business models are making commercial returns. Firms are obliged to take pre-emptive, preventative action.”

Okay, so let’s assume the threat is real. What’s a well-meaning XR business to do about it, right here, right now?

The first step is to act early against possible exploitation, preferably at the design stage, says Accenture.

“We can reduce the frequency of bad outcomes by being more conscious about the design of our risk assessment questions,” advises the report.

“The risk assessment lens needs to allow for a broader view of impact, incorporating individual and societal wellbeing. And the lens must reach further into the future, including systemic second- and third-order risks. For example, anonymous data collected about our brain waves today may be transformed into identifiable insights about our thoughts by someone else, somewhere else, years later.

“We must be honest and explicit about such future risks, and continuously revisit questions to build foresight and detect shifting trends early.”

Next, collaborate with the experts, says Accenture, as they can offer a wider perspective around the dangers and opportunities of the platform.

“Connecting with the right people is a good first step. But businesses should also actively participate in the exploration, discussions and research of key issues to ensure that the outcomes are relevant to their realities. Contributing to the development of principles, rules and solutions makes you better prepared to act rapidly in shaping the future of your business.”

Finally, when it comes to XR deployment, be careful with vulnerable employees. There are even economic incentives to do so.

“Workers who spend a significant amount of their time performing routine or repetitive tasks are at risk of losing their job to automation. Yet, many of these same roles involve activities and tasks that could be augmented through XR, making them more valuable to the workforce. Assembly-line and warehouse workers, for example, could use AR to improve their judgment, decision-making, speed and accuracy.”

Of course it’s hard to predict just what the future holds, and apocalyptic thinking does seem to be a favourite pastime of the species. Nevertheless, if extended reality does pose a risk, it’s likely that the nature of that threat will be unlike anything we’ve encountered before.

Time, therefore, is of the essence, warns Accenture.

“Through the responsible design of safeguards, incentives and collaboratively-defined principles, business, government and society can unleash economic and social possibilities that have so far remained in our collective imagination,” says Morvan. “This work needs to begin now.”

Read the full report here: Waking Up to a New Reality: Building a Responsible Future for Immersive Technologies
