THE WALLS HAVE EARS: Privacy and Risk in the Clubhouse

Clubhouse’s novelty and exclusivity are a potent combination for growth. The platform is already reshaping the social media landscape and the ways people around the world engage with one another, but it falls short on individual privacy protections.

The general premise of Clubhouse is that it allows users to create rooms where they can engage in conversations with other users. There is no video or text, just voice. Each room is hosted within the Clubhouse platform, and, assuming it’s not a closed room, users can enter to “drop in” on a conversation as a listener, and potentially a participant, much as if they were wandering through a private club, hence the name. The experience feels a bit like scanning talk radio for an interesting topic, except that the listener does not have to call in to the station and wait on hold to add their voice to the conversation. (They may still have to be called on by a moderator through the app’s user interface.)

Clubhouse’s privacy issues are not unique, but are surprising considering the modern global emphasis on, and demand for, better privacy controls in consumer technologies.

It is interesting to consider Clubhouse’s present growth surge in contrast to what Facebook experienced in its early years. Facebook’s mass adoption was driven at least in part by its strong user privacy controls. Though Facebook’s policies and practices with regard to its collection, processing, and use of personal data have since changed, at the time Facebook was able to point to its privacy controls as a differentiator from its competitors. One of Facebook’s early competitive advantages was that it offered its users a level of control over their profile data they were not getting elsewhere.

Global privacy considerations given short shrift.

At this point, the bulk of Clubhouse’s privacy shortcomings are well documented, so they are not explored in depth here beyond a brief discussion of the most commonly highlighted issues. The two most readily apparent privacy concerns are these: first, Clubhouse’s public-facing privacy disclosures do not appear to be directed to individuals, or to account for privacy laws, outside the United States; and second, Clubhouse’s privacy disclosures and other practices suggest that it may be collecting data and building profiles about individuals who have not signed up for Clubhouse, i.e., shadow profiles.

Certain Clubhouse features may result in GDPR violations.

As to the former, a quick scan of Clubhouse’s privacy policy[3] reveals some degree of tailoring specific for U.S. law (and likely to satisfy any requirements for placement within Apple’s App Store), most readily apparent by a California Privacy Disclosures section that nods to the California Consumer Privacy Act (“CCPA”). But conspicuously absent is any similar section addressing rights or requirements under the European Union’s General Data Protection Regulation (“GDPR”).

Uploaded contact data could be used to build shadow profiles.

The second concern, that Clubhouse is collecting data to create shadow profiles of individuals, is based in large part on the process of joining, and inviting others to join, Clubhouse. As noted above, Clubhouse is at present invitation only. If you have already joined Clubhouse, you are provided a limited number of invitations to dole out to contacts. (This limitation on available invites is part of the exclusivity that is driving demand.) Clubhouse continues to increase the number of invites allotted to members. But there’s a catch to inviting other users to the platform: in order to send an invitation, you can’t simply enter the intended recipient’s email address or phone number; rather, you first have to allow Clubhouse to access your contacts from your device before you can select a recipient. In other words, you have to provide Clubhouse with access to the personal data of all your contacts stored in your device in order to invite any of them to the platform.
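To make the shadow-profile concern concrete, the sketch below shows how an uploaded address book could be split into existing members and non-members, with the non-members’ details retained and linked to the inviter. This is purely illustrative; the function, data shapes, and phone numbers are assumptions for the example, not Clubhouse’s actual implementation.

```python
# Hypothetical sketch of shadow-profile creation from an uploaded
# contact list. All names, numbers, and structures are illustrative.

registered_users = {"+15550001", "+15550002"}  # numbers with accounts

def ingest_contacts(inviter: str, contacts: list[dict]) -> dict:
    """Split an uploaded address book into known users and non-users."""
    shadow_profiles = {}
    for contact in contacts:
        phone = contact["phone"]
        if phone in registered_users:
            continue  # already a member; link to an existing profile instead
        # A non-user's details are now stored and associated with the
        # inviter, even though that person never consented.
        profile = shadow_profiles.setdefault(
            phone, {"name": contact["name"], "known_by": []}
        )
        profile["known_by"].append(inviter)
    return shadow_profiles

profiles = ingest_contacts(
    "+15559999",
    [
        {"name": "Alice", "phone": "+15550001"},  # existing member
        {"name": "Bob", "phone": "+15551234"},    # never joined
    ],
)
print(profiles)
# {'+15551234': {'name': 'Bob', 'known_by': ['+15559999']}}
```

The point of the sketch is that each additional upload enriches the record: if several members’ contact lists contain the same non-user, the platform can infer that person’s social graph without their ever touching the app.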

Conversations on Clubhouse may be recorded, conversation content may be collected in real time, and audio from conversations may be vulnerable to government eavesdropping.

Less obvious but perhaps more dire issues include observations that conversations on Clubhouse are neither as private nor as secure as described in Clubhouse’s documentation. Specific criticisms point out that Clubhouse records audio, and therefore conversations, absent meaningful consent, and that portions of the Clubhouse app’s back-end infrastructure, which potentially stores user data and recordings, may be hosted on servers easily accessible by foreign state actors.

Conversations may be recorded without adequate consent.

In its privacy policy, Clubhouse discloses that it temporarily records audio from a room while the room is live, and, unless a violation of Clubhouse’s Trust and Safety program is reported while the room is active, it deletes the temporary recording when the room ends. If Clubhouse receives a report of a Trust and Safety violation while a room is active, it will retain the temporary audio recording from that room until Clubhouse completes its investigation of the incident.

Data related to conversations may be collected and used for profiling.

First, despite the specific, limited purpose for which it states it uses recordings, Clubhouse could still be using details about the circumstances of recordings to develop individual profiles. Clubhouse discloses in its privacy policy that one of the categories of personal data it may collect is “usage” data, which may include “information about how you use [Clubhouse], such as the types of conversations you engage in, … actions you take, [and] people or accounts you interact with …” So even if the content of a recorded conversation is used only for Trust and Safety investigations, the other data related to the recording (such as the people or accounts in the room and their connections, the type or topic of conversation, and whether a Trust and Safety violation was reported during the room) can still be tied to individuals and used for profile development.
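The mechanics of that kind of metadata profiling can be sketched in a few lines. The records and fields below are invented for illustration; nothing here reflects Clubhouse’s actual data model, only the general technique of deriving interests and a social graph from room metadata alone.

```python
# Illustrative only: even if audio content is discarded, metadata about
# each room can be aggregated into a profile of one user.
from collections import Counter

room_events = [  # hypothetical usage records tied to one user
    {"topic": "crypto", "participants": ["@ann", "@raj"], "ts_flagged": False},
    {"topic": "crypto", "participants": ["@ann"], "ts_flagged": True},
    {"topic": "politics", "participants": ["@lee"], "ts_flagged": False},
]

def build_profile(events):
    """Derive interests and frequent contacts purely from room metadata."""
    topics = Counter(e["topic"] for e in events)
    contacts = Counter(p for e in events for p in e["participants"])
    flags = sum(e["ts_flagged"] for e in events)
    return {
        "top_interest": topics.most_common(1)[0][0],
        "frequent_contacts": [p for p, _ in contacts.most_common(2)],
        "trust_safety_flags": flags,
    }

print(build_profile(room_events))
```

No conversation content appears anywhere in the sketch, yet the output already reveals what the user talks about, with whom, and how often they were flagged.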

Natural language processing can be used to collect real-time conversation data.

Regardless of whether Clubhouse records conversations temporarily, indefinitely, or not at all, it is likely still collecting data from conversations in real time through natural language processing (“NLP”) technologies. In basic terms, NLP is the use of a computer to process human-language data. When you use your voice to give commands to a connected device, such as through a smart assistant on your phone or any of the internet-of-things (“IoT”) devices that many people have in their homes, the device’s ability to interpret and respond to the request is NLP in action (among other underlying technologies).
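A toy version of that idea: once speech has been transcribed, even trivial processing can pull topical signals out of a conversation without retaining the audio. Real NLP pipelines use speech-to-text plus statistical models; this sketch just counts non-stopword tokens in an invented transcript.

```python
# Minimal illustration of NLP-style processing: extracting topical
# keywords from a (hypothetical) live transcript, no audio stored.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it",
             "that", "we", "i", "you", "on", "for", "this", "about"}

def extract_keywords(transcript: str, top_n: int = 3) -> list[str]:
    tokens = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

live_transcript = (
    "I think the startup market is hot right now, and the startup "
    "funding rounds we see in the market prove it."
)
print(extract_keywords(live_transcript))
# ['startup', 'market', 'think']
```

Run against each room as it happens, output like this could feed the “types of conversations you engage in” category of usage data described above, with no recording ever persisted.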

One instance of the potential for government eavesdropping on conversations was already identified.

That the temporary recordings of conversations on Clubhouse are encrypted also may not provide the degree of security that users probably expect. In particular, in an entry posted to its blog on February 12, 2021, the Stanford Internet Observatory (“SIO”) detailed how a provider of back-end infrastructure to Clubhouse likely could access users’ raw audio data, even where encrypted by Clubhouse.[5]
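The SIO’s observation turns on a general point about encryption at rest: if the party running the servers also holds the key, “encrypted” offers no protection against that party. The toy cipher below (XOR, for illustration only; real systems use AES or similar) shows the principle.

```python
# Sketch of why "encrypted" may not mean private: whoever holds the
# key can decrypt at will. Toy XOR cipher for illustration only.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: the same call encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

provider_key = b"server-side-key"          # stored on the back end
ciphertext = xor_cipher(b"room audio frames", provider_key)

# The infrastructure provider (or anyone with server access) can
# simply apply the key it already holds:
print(xor_cipher(ciphertext, provider_key))
# b'room audio frames'
```

By contrast, end-to-end encryption, where only the participants hold keys, would leave the back-end provider with nothing but ciphertext; that is the design choice the criticism implies Clubhouse did not make.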

The failure to implement meaningful privacy controls by design presents a real risk of harm to both individuals and businesses, even those who do not use Clubhouse.

As suggested above, that Clubhouse’s popularity is surging despite obvious privacy flaws may say more about the state of global social media consumers than it does about Clubhouse. That is not a reason to excuse Clubhouse’s failure to account for and implement stronger privacy controls from inception. The developers and investors behind Clubhouse are too smart and experienced to have been ignorant of their responsibility in this regard, or of the harm that could ensue if the app were to reach its tipping point without meaningful privacy controls in place, a point it appears to be approaching now.

Development and use of profiles of individuals without their knowledge, input, or ability to influence the data collected about them can perpetuate and institutionalize historical prejudices.

While it may be an extreme example, the Cambridge Analytica scandal is illustrative of the potential harms that the exploitation of personal data can result in. Individual consumers — the users of the technologies that collect their data — should not bear the burden of having to understand at a granular level how their personal data may be harvested, compiled, and subsequently used or shared by the platforms they participate on.

Reduced privacy protections coupled with explosive growth will harm businesses and individuals, and probably even Clubhouse.

The harms described above are far from the only risks presented by Clubhouse’s data and privacy practices. As noted earlier, European privacy regulators already have vocalized their concerns about Clubhouse’s practices, and most certainly will pursue regulatory enforcement against Clubhouse, which could result in massive fines under the GDPR.

Clubhouse should have been better about privacy; it’s not too late for it to improve.

Pointing out the privacy shortcomings of Clubhouse and other technology platforms and services does not have to be an indictment, but rather can serve as a call-to-action to the developers of such technologies. There is both utility in and demand for innovative platforms for human engagement. But trust and accountability need to be present too. We do not need to look far back in our history to see the harm that can ensue as a result of reducing privacy protections. As such, it is fair for us as consumers to expect emerging technologies such as Clubhouse to incorporate strong privacy controls and responsible data practices by design and default, and to express our disappointment when they fall short.

Xavier Clark

Technology and privacy attorney in Portland, Oregon.