
The future of the metaverse is only as bright as its privacy policies

Mass adoption of the metaverse and blockchain technology at large requires that brands meet consumer data privacy expectations

Cecilia Tran leads communications and public relations at Long Dash. She previously founded a content strategy consultancy for mission-driven companies.

In the rush to architect, or merely enter, the metaverse, brands may be overlooking a fundamental piece that will determine the success or failure of their entire endeavor: data privacy. Mass adoption depends on how much the product improves people’s lives and how trusted they are. While the quality of the technology and experience can be iterative—look no further than dial-up internet—trust must be embedded from the start. 

Data privacy shortcomings have eroded brand trust

Companies, incentivized by the forces of programmatic advertising, have created a massive trust barrier by failing to protect consumers' personal data. From unwanted emails to creepy targeted ads to personal data breaches, most people are acutely aware of how their data can be exploited and left vulnerable by brands. A 2018 Statista survey revealed that one third of Americans have had their email or social media account hacked. That was the same year as the Cambridge Analytica-Facebook scandal, in which a whistleblower revealed that the political consulting firm had been harvesting personal data from Facebook users without their knowledge.

The long-term price of broken trust is steep: 87 percent of consumers said they would not do business with a company if they had concerns about its security practices. A large majority of Americans (87 percent) consider privacy a human right.

As companies eagerly make their debut in the metaverse, they must recognize that, without the right protections, these new immersive technologies will leave an alarming amount of personal data vulnerable to misuse and hacking. A 20-minute session in virtual reality can generate approximately 2 million biometric data points from unique recordings of body language, including eye-tracking. These data could be used to infer positive or negative reactions to products or experiences, identify a person's gait in real life, and reveal health conditions unknown even to the users themselves.

The potential for misuse demands far greater transparency, trust, and accountability than most companies are accustomed to providing. Failure to deliver could result in litigation, or in a drain on resources and time as policies and products are retrofitted to address privacy concerns. The biggest hit, though, would be to brand trust and brand health. Roblox is coming to understand this as it becomes the first of what will likely be many tech and gaming companies to take a proactive stance on children's online safety and privacy legislation.

The opportunity for a better digital future

These questions of privacy are not simple to answer or solve. Rather than being deterred by the challenges, however, brands should look at the opportunity unlocked by getting it right. Data privacy as a brand value is genuine white space in the corporate landscape. Consumers overwhelmingly see data privacy as a corporate responsibility, with 96 percent saying that corporations should do more to protect personal data. Brands that truly integrate privacy into their products and experiences will be rewarded with consumer attention and loyalty.


Data privacy must be a core pillar of brands' strategy to earn consumer trust. In practice, this means developing a clearly articulated corporate data responsibility policy and incorporating it into the company's ESG criteria. Brands can already use privacy-first frameworks such as those from the nonprofit XR Safety Initiative, which has also developed frameworks for child-safe product development, and follow best practices offered by the Electronic Frontier Foundation. Before launching metaverse campaigns, companies should do their due diligence by consulting data privacy lawyers to understand both current regulations and those on the horizon.

Still, self-regulation is critical while policy catches up with technology. Partnerships with trusted brands can strengthen companies' expertise as well as their competitive edge. Epic Games and Lego, for instance, have paired up to design metaverse experiences that are both age appropriate and privacy-forward in their design. Defaulting participants to the most stringent privacy settings, rather than requiring them to opt out of less secure experiences, is another policy more brands should adopt. The privacy-forward browser Brave was built to model exactly this, ensuring that customers opt in to providing their data rather than having to opt out.

The reality is that brands will need to contend with these challenges sooner rather than later. They should keep in mind that blockchain technology, the foundation of Web3 and the metaverse, was designed with the core purpose of limiting corporations' power to profit from personal data. As such, brands should seek out the passionate Web3 architects who are actively working to build a more privacy-focused digital future.

Tomorrow's technology is most powerful if we lay the groundwork to trust it today. Brands that go the extra mile to ensure safety and security in these products will usher in a new era—one that sets hard boundaries around privacy and earns trust.
