Growth Of Metaverse Increases Trust And Safety Risks: Report
The report also said that T&S services are among the fastest-growing segments of the business process services market, which is expected to reach $15 billion to $20 billion by 2024
IT research firm Everest Group has released a report, 'Growth of metaverse increases trust and safety (T&S) risks to enterprises, users', which states that the metaverse is expected to grow rapidly into a $679-billion industry by 2030, but that this growth has implications for T&S, such as threats to user security, increased abuse, the proliferation of objectionable content and financial fraud.
"The metaverse is attracting large investments from technology giants such as Google, Meta, Microsoft and Nvidia to make the virtual world a reality, and the applications hold unlimited economic and social potential for both good and bad. As organizations develop their business strategies for the metaverse, trust and safety issues need to be among their foremost considerations. Enterprises may be able to adapt some best practices of today, but they will also need to address scenarios and use cases that are unique to the metaverse. Solving for those novel challenges will require a collaborative approach among enterprises, policymakers, academia, and T&S service providers to realize the full potential of metaverse as an immersive yet safe place for users," said Rajesh Ranjan, partner at Everest Group.
While exploring the impact of the metaverse on the third-party T&S market, the report also said that T&S services are among the fastest-growing segments of the business process services market, which is expected to reach $15 billion to $20 billion by 2024. The market is expected to grow 35 to 38 per cent through 2024 and accelerate to 60 to 68 per cent growth beyond 2024 as the technology and infrastructure advance beyond the nascent stage.
Everest Group's report also proposes mitigation strategies for metaverse T&S risks, which include the abuse of virtual avatars, comprising invasion of personal space, impersonation, harassment, assault, bullying, stalking and spying. The report further noted concerns around data privacy and user safety, the safety of virtual assets from financial crimes and identity theft, the well-being of content moderators who may face physical and mental health hazards from prolonged exposure to VR headsets and content, as well as regulatory ambiguity.