
The Time for Data Protection Legislation Is Now

“The refrain that the poor need no civil and political rights and are concerned only with economic well-being has been utilised through history to wreak the most egregious violations of human rights. …The pursuit of happiness is founded upon autonomy and dignity. Both are essential attributes of privacy which makes no distinction between the birth marks of individuals.” — Justice Dhananjaya Chandrachud

Justice Chandrachud issued this powerful statement in a landmark case last year, in which the Indian Supreme Court recognized a constitutional right to privacy. A recent survey in India — where Aadhaar has stirred debates about the appropriate use of data — found that respondents across education levels recognize privacy as a fundamental right. As the digital realm claims an ever-greater share of low-income people’s economic and social activity worldwide, it’s becoming increasingly clear that stronger legal protections are needed to ensure financial services providers and others respect data privacy rights.

A weaver at her spinning wheel in India.
Photo: Joydeep Mukerjee, 2016 CGAP Photo Contest

In many ways, financial services providers’ use of people’s data trails has been a boon for financial inclusion, as it can expand access to those who are excluded or underserved. Data-based business models have enabled financial services providers to better understand low-income customers, who often lack formal financial records or credit histories, and to offer them more services. This can mean lower prices, greater competition and choice, and more useful, customized services for the poor.

For example, Tala, a firm that offers microloans in Kenya using a smartphone app to evaluate applicants’ credit risk, gathers various types of data, including where loan applicants spend their time, how many people they communicate with every day, how often they call their parents (by searching call logs for the word “mama”) and, less surprisingly, whether they pay their bills on time. It turns out that, collectively, these behaviors can be more revealing than credit reports, especially for people with little or no prior credit experience. Similarly, Branch, another lender operating in Africa, uses information stored on smartphones, including contact lists, call logs, SMS logs, Facebook friends and contacts from other social media accounts, photos, videos and other digital content, to make lending decisions.
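To make the idea concrete, here is a minimal, hypothetical sketch of the kind of behavioral feature extraction described above. It is not Tala’s or Branch’s actual code or scoring model; the field names, features and weights are illustrative assumptions only.

```python
# Illustrative only: a toy feature extractor for smartphone-based credit scoring.
# The field names, features and weights below are hypothetical assumptions,
# not the actual models used by Tala, Branch or any other lender.
from collections import Counter
from datetime import date


def extract_features(call_log, bill_payments):
    """Turn raw phone records into simple behavioral features."""
    calls_per_day = Counter(call["date"] for call in call_log)
    calls_to_mama = sum(1 for call in call_log if "mama" in call["contact_name"].lower())
    on_time_bills = sum(1 for bill in bill_payments if bill["paid_on_time"])
    return {
        "avg_calls_per_day": sum(calls_per_day.values()) / max(len(calls_per_day), 1),
        "calls_to_mama": calls_to_mama,
        "on_time_bill_ratio": on_time_bills / max(len(bill_payments), 1),
    }


def toy_score(features):
    """A deliberately simple linear score; real lenders use far richer models."""
    return (
        0.4 * features["on_time_bill_ratio"]
        + 0.3 * min(features["avg_calls_per_day"] / 10, 1.0)
        + 0.3 * min(features["calls_to_mama"] / 5, 1.0)
    )


if __name__ == "__main__":
    call_log = [
        {"date": date(2018, 5, 1), "contact_name": "Mama"},
        {"date": date(2018, 5, 1), "contact_name": "Work"},
        {"date": date(2018, 5, 2), "contact_name": "Mama"},
    ]
    bill_payments = [{"paid_on_time": True}, {"paid_on_time": False}]
    print(toy_score(extract_features(call_log, bill_payments)))
```

Even this toy version shows why such models raise the policy questions discussed below: the inputs are intimate behavioral traces, and the weights encode judgments about whose behavior looks “creditworthy.”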

Despite the tremendous potential of individuals’ data, however, the proliferation of data-based models carries considerable risks if markets operate without a legal and regulatory approach that safeguards data and empowers citizens, while laying the foundation for responsible business use of that data.

You may remember the recent Equifax data breach, in which 143 million U.S. accounts, including customers’ social security numbers and, in some cases, credit card numbers, were compromised by hackers. Over the past few years, data breaches have become bigger and more frequent around the world. In 2016, the card data of 3.2 million Indian customers was stolen from a network of Yes Bank ATMs, and the breach remained undetected for three months. The risk was introduced by inadequate checks and controls between the bank and a third-party service provider that managed the ATMs. In 2014, a malware-based fraud ring infiltrated Boleto, one of Brazil’s most popular payment methods. An estimated 500,000 Boleto transactions were compromised, and hackers stole an estimated $3.75 billion from 30 Brazilian banks before the fraud was discovered.

Even without concerted efforts to attack citizens, violations of privacy can be dangerous. A public map released by the fitness app Strava, for example, inadvertently became a national security risk because it revealed the location and movements of people on secured military bases. In another example, the Australian government released what it believed were anonymized medical records of 2.9 million citizens, only for researchers to show that different sets of anonymized data could be cross-referenced to reveal unique identities, effectively exposing those citizens’ medical histories without their consent.
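The re-identification technique behind cases like the Australian one is simple to illustrate. The sketch below uses invented data, not the actual health records, and shows how “anonymized” records can be linked back to names by matching quasi-identifiers that appear in both a de-identified dataset and a public, identified one.

```python
# Illustrative only: how "anonymized" records can be re-identified by linking
# quasi-identifiers across datasets. The data below is invented; it is not the
# Australian health dataset referenced above.

anonymized_health = [
    {"birth_year": 1974, "postcode": "2000", "sex": "F", "diagnosis": "diabetes"},
    {"birth_year": 1988, "postcode": "3053", "sex": "M", "diagnosis": "asthma"},
]

public_records = [
    {"name": "A. Citizen", "birth_year": 1974, "postcode": "2000", "sex": "F"},
    {"name": "B. Resident", "birth_year": 1991, "postcode": "4000", "sex": "M"},
]

QUASI_IDENTIFIERS = ("birth_year", "postcode", "sex")


def reidentify(health_records, identified_records):
    """Join the two datasets on the quasi-identifiers they share."""
    matches = []
    for health in health_records:
        for person in identified_records:
            if all(health[k] == person[k] for k in QUASI_IDENTIFIERS):
                matches.append((person["name"], health["diagnosis"]))
    return matches


print(reidentify(anonymized_health, public_records))
# [('A. Citizen', 'diabetes')] -- a unique combination of quasi-identifiers is
# enough to link a "de-identified" medical record back to a name.
```

Removing names alone is therefore not enough; a combination of a few ordinary attributes is often unique to one person.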

Many countries, including India and Brazil, are considering or have recently implemented data protection policy that is either broad-based or focused on the financial sector. The challenge is how to develop data protection laws that protect individuals and foster trust in financial services firms’ use of data on the one hand, while promoting technology-led marketplace innovation on the other. This effort is complicated by ever-changing business models that rely on new and innovative ways to use data, and by the risks that result. Policies developed today will need to keep up with, and even anticipate, the models of the future, while establishing citizens’ protection and autonomy over their data.

Here are some specific challenges that should be on policy makers’ radars:

  • Data silos. Firms that collect data from customers often have unfettered authority to define the terms of use of that data, and those terms may not give customers the right to access their own data or to control who can see it. As a result, these company “data silos” can limit competition and consumer autonomy. On the other hand, firms need data to innovate and develop new financial inclusion products and services, and undue regulatory burdens at early stages can stifle that innovation.

  • Automation biases. Algorithms and artificial intelligence (AI) are being used to make important decisions about customers, such as whether a customer gets a loan or whether her insurance claim is paid. The promise of these models is that they make decisions more consistently, accurately and at scale. But there is a risk that bias or unfairness in such models will entrench socioeconomic disadvantage, also at scale. Another risk is that they erroneously reject qualified applicants because a model is less predictive than it purports to be, or because it cannot weigh context a human would consider, such as unexpected health costs or a job loss.

  • Security breaches. Attempts to hack financial firms, including those providing financial services to the poor, can lead to devastating losses for vulnerable populations and undermine trust and confidence. CGAP recently published a blog and slide deck that answer many of the questions policy makers, regulators and providers have about cybersecurity for mobile financial services.

  • Narrow jurisdiction. While there is a logic to focusing regulations on the financial sector alone, FinTechs have so blurred the lines between traditional firms and new services that broad-based regulation may be the only way to capture all relevant providers.

  • Local relevance. Any new legal requirements should take into account each country’s legal system, culture, economy and available technology.

Given the significant risks of consumer harm and data breaches that the collection and use of individuals’ data creates, governments run a risk by operating without data protection laws. The time is right for countries that have not yet done so to adopt a broad-based, overarching data protection policy that protects citizens while creating the regulatory rails for responsible and inclusive innovation.

Resources

Reading Deck

This deck answers key questions regulators, supervisors, mobile network operators, and digital financial services providers have about vulnerabilities in mobile financial services and countermeasures that can be taken to address data security.
