Research ethics provides a model for data privacy

Does privacy have commercial value, and if so, can personal data be traded for goods, services, or content? It is an important question because the data generated collectively in 2020 stood at 2.5 quintillion bytes, a figure as staggering as it is unsettling. 

The digital economy has an enormous appetite for user data, which is largely gathered, stored, and traded without the consumer’s knowledge, let alone their permission. These consumers are also not privy to the decisions made about their data; indeed, they do not know who or what makes those decisions, or for what purpose. 

If the Industrial Age created a host of wealthy industrialists, the Information Age has been considerably more generous. It has spawned many more beneficiaries of this information glut with twenty percent of the Forbes 400 having their fortunes in technology. And these 80 billionaires have a combined wealth of a whopping $1.6 trillion. 

It is therefore plausible that the privacy issue has reared its head because these very users have a worm’s eye view of the action. They’ve watched as tech magnates launched into the sky, roaring to a wealth stratum that no other generation has ever reached before. Perhaps the enlightened consumer desires a share of those fortunes – their data has facilitated this rise, after all.

It doesn’t help that the valuation of social media platforms incorporates user counts, impressions and engagement levels which are eventually converted into hard cash. And digital media attracts advertising dollars because of its ability to keep users locked in, voraciously consuming content at one end and releasing data at the other.

The new media companies have performed particularly well while pulling the advertising expenditure rug from under the feet of traditional media. The ill wind of COVID-19 acted as a catalyst to this redirection of attention, and subsequently cash. In 2020, global digital advertising spend stood at $350 billion, and it is expected to reach $786 billion by 2026.

Significant chunks of this expenditure are a direct result of unrestricted access to, and use of, personal data. Whereas traditional media has established a code of practice that protects audiences from invasion of privacy, new media with all its machinations has dazzled and blinded us for the better part. From inception it went mostly unregulated, until it emerged that tech companies were taking liberties that were not altogether compliant with privacy norms.

There is also something to learn from the market and social research industry, the forebear of user data collection. Commercial research adheres to a code of ethics, a codification of scientific morality in practice. The rules are based on the general ethics of science, just as general ethics is based on the morality of society at large.

The privacy issue, therefore, most likely stems from the fact that privacy is a qualified, fundamental human right and it serves as a foundation upon which many other human rights are built. 

There are schools of thought that claim that personal privacy can be used as a currency for digital content. In this privacy-for-content scheme, you trade your personal data for free internet content. The idea borrows from traditional advertising, where TV channels and radio stations offer audiences entertainment and news free of charge. Media owners then finance their operations from advertising fees earned, while newspapers and magazines heavily discount their cover prices in the same way.

However, receiving free content in exchange for advertising exposure is different from giving up your privacy, or having your data sold to third parties for marketing communication, sales activities, and propaganda. The Cambridge Analytica scandal escalated the need to protect personal information: the company is reported to have acquired the data of 87 million Facebook users and used it to manipulate the outcomes of national elections.

Clearly, tech companies must apply good governance and assume responsibility for the implications of their business models, because in the absence of self-regulation governments are forced to act. Privacy laws formulated so far include the GDPR (General Data Protection Regulation) in Europe, the California Consumer Privacy Act in the US, and the Data Protection Act in Kenya, among others. These policies aim to enhance individuals’ control and rights over their personal data. 

The research industry has already demonstrated that you can collect accurate and actionable consumer data without violating anyone’s rights. This scientific approach is an indication that the restrictions placed on the digital media industry will not destroy the sector. It will instead enhance it, as media planners rely less on black boxes that are difficult to hold accountable, and more on transparent research which can withstand the highest degree of scrutiny.
