Coined in Neal Stephenson’s 1992 best-selling novel Snow Crash, the term “metaverse” has recently reentered the general public’s lexicon to denote a technology hailed by some as the successor to the mobile internet and the next step in humankind’s technological evolution. Though there is no consensus on the term’s precise contours, the metaverse has generally been described as an embodied internet where, instead of passively viewing content in two-dimensional space, users are in the content, experiencing it with others.
The recent enthusiasm for, and commercial promise of, this more immersive digital experience has led companies at all stages to consider a metaverse strategy, from early-stage startups offering metaverse fashion items to mature financial institutions buying virtual land to open metaverse-based bank branches. And yet in any vision of the metaverse, real-world privacy issues are magnified, as I/O devices can capture qualitatively novel, more intimate data. For example, virtual reality headsets could leverage built-in sensors to capture facial movements, supporting even more deeply personal inferences about individuals, such as their medical conditions or emotions. The following are some key privacy considerations for companies that are planning to venture into the metaverse:
- Design your offering with user privacy in mind. Although there is no comprehensive federal privacy law with clear rules of the road on the collection, use, and sharing of personal data, several states have enacted privacy laws that would apply to personal information that is reasonably linked to a consumer or device. But irrespective of the legal framework, companies venturing into the metaverse have other reasons to consider privacy: it is an important building block for consumer trust. Research has shown that, if consumers trust that a company will use their data in ways that benefit them, they are willing to share more data. To that end, companies should embed privacy and data security into products and services from the outset. This means understanding what personal data they need, collecting such data only if they have a business need for it, getting rid of it when that need no longer exists, and securing the personal data in their possession. Some state laws codify these requirements.
- Have a compliance strategy to implement consumer data rights. State data privacy laws like the California Consumer Privacy Act, the Virginia Consumer Data Protection Act, and the Colorado Privacy Act provide consumers in some circumstances the right to access, correct, or delete their personal data. Many metaverse evangelists foresee blockchain technologies playing a significant role in the technology’s future; the blockchain’s immutability may in some cases make compliance with the consumer right to delete more complicated. (See a related Wilson Sonsini advisory addressing the potential application of consumer data rights to NFTs.) Companies operating in the metaverse should have in place processes to comply with these requests, as the extensive amount of data available on consumers in the metaverse may increase consumer interest in exercising these rights.
- Adhere to biometric privacy laws. New I/O devices that allow users entry into the metaverse are capable of collecting biometric data, from conscious physical movements to eye flickers to emotional data. Several states have enacted laws seeking to protect this data. For example, Illinois’ Biometric Information Privacy Act (BIPA) requires that private entities using biometric information have a public written policy establishing a retention schedule and guidelines for permanently destroying such information. BIPA also imposes other obligations on private entities collecting biometric information, such as requiring notice and opt-in consent before collection. BIPA provides for a private right of action, and penalties for violating its provisions are harsh. In 2021, for example, Facebook settled multi-year litigation over its photo-tagging feature for $650 million.
- Be especially careful if your offering appeals to children. Politicians have rallied around protecting children from the perceived harms of technology. President Biden specifically flagged shielding children from online advertisements and the pernicious effects of social media in his State of the Union address; and Senator Edward Markey (D-MA), along with Congresswomen Kathy Castor (D-FL) and Lori Trahan (D-MA), recently sent a letter to Federal Trade Commission (FTC) Chair Lina Khan encouraging the FTC to monitor children’s increasing use of virtual reality and exercise its authority under the Children’s Online Privacy Protection Act (COPPA) and the FTC Act “to protect children in the metaverse.” As this letter flagged, two-thirds of parents with virtual reality (VR) devices report that their children asked them to buy the device, and approximately three-quarters of children between the ages of 8 and 15 who responded to a 2017 survey expressed significant interest in VR. Companies with metaverse offerings that appeal to minors, including children under 13, will in many circumstances need to offer COPPA-compliant experiences, which may include obtaining parental consent prior to collecting, using, or disclosing any child’s personal information, or limiting the types of personal information collected and the ways that information is used. Failure to do so could result in regulatory action and substantial fines.
As you’re developing your metaverse offerings, if you have privacy questions, please contact Wilson Sonsini attorneys Dan Chase, Maneesha Mithal, Tracy Shapiro, or Libby Weingarten, or another member of the privacy and cybersecurity practice.