
Individual data privacy a thorn in our flesh


In this era of data analytics, every organisation is in a rush to better understand the customer. Virtually all corporates dealing directly with customers strive to get as much personal data as possible in order to be more effective in their marketing strategies.

Very few global organisations, Apple among them, defend blanket individual privacy. But that could be changing. A senior Apple vice-president has admitted that collecting user information is important to developing good software products, especially in this age of machine learning and big data analytics.

He proposed a paradoxical solution that he referred to as “differential privacy.”

By this he meant the statistical science of learning as much as possible about a group while learning as little as possible about any individual in it.
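Apple's deployed system is far more elaborate, but the underlying idea can be illustrated with the classic randomised-response technique. The hypothetical Python sketch below (an illustration, not Apple's actual method) deliberately noises each person's answer so that no single response reveals anything, yet the group-level rate can still be recovered.

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.75) -> bool:
    """Report the true answer with probability p_honest; otherwise flip a coin.
    Any individual report is therefore plausibly deniable."""
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_group_rate(reports, p_honest: float = 0.75) -> float:
    """Undo the noise in aggregate:
    E[reported yes] = p_honest * true_rate + (1 - p_honest) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_honest) * 0.5) / p_honest

# Simulate 100,000 users, 30 per cent of whom have some sensitive attribute.
population = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(answer) for answer in population]
print(round(estimate_group_rate(reports), 3))  # close to 0.30
```

No one inspecting a single report can tell whether it is genuine, yet the aggregate estimate remains accurate.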

Whatever new definition of privacy there is, the bottom line is that individual privacy in this age of digital transformation is under siege.

People are increasingly vulnerable to this emerging phenomenon and largely incapable of protecting themselves from it.

Apple, for example, argues that such data gathering is consented to by users who decide to opt in. In my view, this is a frivolous explanation, considering that the majority of mobile users have no idea how many apps could be snooping on them. Mobile handset manufacturers assume that every user of their products has a complete understanding of all the features. The opposite is true. Every user yearns for the conveniences that come with these gadgets, but they hardly understand the consequential effects on their individual liberties.

While opting in to an app on some devices is easy, opting out is often so complex and annoying that it takes a very sophisticated user to execute. In the absence of comprehensive legal frameworks in many emerging economies, it is perhaps our collective responsibility to become moral advocates of personal privacy.

Unabated gathering and use of personal data in business development, without acceptable ethical conduct, may hand an undue advantage to advanced countries, which have the capacity to store, analyse and exploit the data for competitive gain.


Most of this data is passively gathered, with absolutely no benefit to the individuals whose data is being used to develop new enterprises.

Ethically, a standard moral code would place responsibility on the provider to ensure that the customer is given a chance to consent to, or repudiate, proposals to collect data from their devices. Failing that, there must be a legal requirement to disclose how data collected from any specific country is used, and a guarantee that it will not be misused.

I am well aware that some of this data is of great importance in research, especially medical research. In such cases it would be of great use if the data were shared with the research community in the country of domicile. Telecommunication companies, for example, hold some of the most valuable data, data that is critical to economic development.

This data does not belong solely to these companies; it also belongs to the users. It would be prudent to provide it philanthropically for the national good. Perhaps it is too early to criticise Apple’s mathematical approach to individual privacy, for the simple reason that there is no known watertight method to ensure privacy. Anonymisation, which once looked promising, is proving unable to guarantee it.

Aaron Roth, a University of Pennsylvania computer science professor who co-authored the book ‘‘Algorithmic Foundations of Differential Privacy’’ with Microsoft researcher Cynthia Dwork, suggests that “[d]ifferential privacy seeks to mathematically prove that a certain form of data analysis can’t reveal anything about an individual—that the output of an algorithm remains identical with and without the input containing any given person’s private data.”
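To make that definition concrete, here is a minimal, hypothetical sketch (not drawn from the book or from Apple's system) of the standard Laplace mechanism applied to a simple count query. Because adding or removing any one person changes the true count by at most one, noise of scale 1/epsilon keeps the answer's distribution almost the same with or without that person.

```python
import numpy as np

def private_count(records, epsilon: float = 0.5) -> float:
    """Answer 'how many records match?' under epsilon-differential privacy.
    The count has sensitivity 1 (one person changes it by at most 1), so
    Laplace noise of scale 1/epsilon bounds how much any individual's
    presence or absence can shift the output distribution."""
    true_count = float(sum(records))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

with_alice = [1, 0, 1, 1, 0, 1]      # 1 = record matches the query
without_alice = with_alice[:-1]      # the same dataset minus one person
print(private_count(with_alice), private_count(without_alice))
# The two noisy answers are statistically hard to tell apart.
```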

As data becomes an ever more critical resource, extracted from the very people who generate it, individual privacy comes increasingly under threat. Developing nations must build their own data analytics capability so they understand how such data is applied and used in decision-making.
