Saturday 16 October 2021

Technology and Human Bias (1) -- Robo-advising reduces cultural biases!

In this new post-corona era, there is rising interest in technology's impact on biases in human decisions. Do fewer (or no) human interactions reduce, or even eliminate, cultural, racial, and other taste-based biases? In this series of blog posts, I hope to provide more understanding of this question by discussing several recent research papers.

The first paper to discuss is D'Acunto et al. (2020), which finds that automated robo-advising tools can reduce cultural biases in the peer-to-peer lending process. Moreover, the reduction of cultural biases actually provides sizable economic returns to investors! Discriminating lenders face 32% higher default rates and earn about 11% lower returns on the loans they issue.

As their study uses Faircent, a leading P2P lending platform in India, let's first get a brief idea of cultural biases in India:

https://www.culturalsurvival.org/publications/cultural-survival-quarterly/ethnic-and-religious-conflicts-india

https://www.pewresearch.org/fact-tank/2021/06/29/key-findings-about-religion-in-india/

https://en.wikipedia.org/wiki/Shudra


Their main findings are twofold.

Before investors use robo-advising, the authors detect two striking patterns that indicate culture-based discrimination in the lending process:

  • Both Hindu and Muslim lenders tend to favor borrowers of their own religion relative to borrowers of the other group.
  • Lenders are less likely to lend to borrowers who are likely to be perceived as Shudra.

After adopting robo-advising, these patterns of cultural discrimination are largely reduced. The shares of borrowers of the two religions become equal for Hindu and Muslim lenders. Similarly, the share of Shudra borrowers increases substantially.
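
To make the first pattern concrete, here is a minimal sketch of how one might measure a lender's in-group favoritism before and after robo-advising adoption. This is purely illustrative: the toy data, the column names, and the post_robo flag are my own assumptions, not Faircent's schema or the authors' code.

```python
import pandas as pd

# Hypothetical loan-level data; columns are illustrative only.
loans = pd.DataFrame({
    "lender_id":         [1, 1, 1, 1, 2, 2, 2, 2],
    "lender_religion":   ["Hindu", "Hindu", "Hindu", "Hindu",
                          "Muslim", "Muslim", "Muslim", "Muslim"],
    "borrower_religion": ["Hindu", "Hindu", "Muslim", "Hindu",
                          "Muslim", "Hindu", "Muslim", "Muslim"],
    "post_robo":         [0, 0, 1, 1, 0, 0, 1, 1],  # 1 = after robo-advising adoption
})

# Flag loans where lender and borrower share a religion.
loans["same_religion"] = (
    loans["lender_religion"] == loans["borrower_religion"]
).astype(int)

# Share of same-religion loans per lender, before vs. after adoption.
shares = (loans
          .groupby(["lender_id", "post_robo"])["same_religion"]
          .mean()
          .unstack("post_robo")
          .rename(columns={0: "pre_adoption", 1: "post_adoption"}))
print(shares)
```

In this kind of comparison, in-group favoritism shows up as a pre-adoption share well above the in-group's share of the borrower pool, converging toward it after adoption.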

You might wonder whether a more equalized lending process is necessarily a good thing. Maybe the original pattern is not cultural discrimination, but a rational choice that yields higher returns? In economic theory, this is called "statistical discrimination," which dates back at least to Arrow (1971) and Phelps (1972). However, the study shows that this is not the case, at least in their setting. Before the adoption of robo-advising, borrowers of the lender's own religion are more likely to default. And Shudra borrowers' loans do not perform worse than other loans; if anything, they are less likely to default.
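
A simple way to picture this test (again just a sketch with made-up data and illustrative column names, not the authors' actual analysis) is to compare default rates across favored and disfavored groups among pre-adoption loans:

```python
import pandas as pd

# Hypothetical pre-adoption loan outcomes; columns are illustrative only.
pre = pd.DataFrame({
    "same_religion": [1, 1, 1, 0, 0, 1, 0, 0],  # borrower shares lender's religion
    "shudra":        [0, 0, 1, 1, 0, 0, 1, 0],  # borrower perceived as Shudra
    "defaulted":     [1, 0, 0, 0, 1, 1, 0, 0],
})

# If favoritism were statistical discrimination, favored (same-religion) loans
# should default less often. The paper reports the opposite pattern.
print(pre.groupby("same_religion")["defaulted"].mean())

# Same check for Shudra vs. non-Shudra borrowers: their loans should not
# default more often if lenders' reluctance were information-based.
print(pre.groupby("shudra")["defaulted"].mean())
```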


References
Arrow, K. J. (1971). The theory of discrimination. Working Paper 403, Princeton University, Department of Economics, Industrial Relations Section.

D'Acunto, F., Ghosh, P., Jain, R., & Rossi, A. G. (2020). How costly are cultural biases? Available at SSRN 3736117.

Phelps, E. S. (1972). The statistical theory of racism and sexism. The American Economic Review, 62(4), 659-661.