Digital tools for development policy

A range of digital technologies promises to make evidence-based decision-making in developing countries easier to implement: predicting harvests with satellite imagery, detecting tax fraud with machine learning algorithms, or tracking poverty at street level using image processing and deep learning. As this column explains, international organizations are well placed to act as designers and curators of these digital solutions. They have an opportunity to redefine the role of global statistical capacity, becoming providers not only of statistics and policy reports as a public good but also of algorithms and related tools as ‘digital public goods’.

‘It is the rapid and unpredictable changes in food prices that wreak havoc on markets, politics, and social stability’, noted Homi Kharas of the Brookings Institution nearly a decade ago. Over the past 20 years, numerous countries have faced revolts and unrest caused by their inability to respond to sudden food price surges.

Better anticipation of harvests through satellite monitoring could help developing countries mitigate price increases or address food shortages more effectively. Current technologies offer the potential to temper these risks.

For example, Canada now combines satellite imagery with deep-learning-based image recognition to monitor yields, forecast harvests, and integrate the results more quickly into national statistics. This reduces uncertainty, increases the granularity and speed of information, and allows decision-makers to make smarter choices and adapt their policies to local conditions.

In fact, agriculture is one of many areas where developing countries share common issues for which technological solutions exist. Tracking poverty at a granular level, predicting harvests, identifying tax fraud, and anticipating migratory patterns are other examples where ‘big data’ and analytical tools are useful. Several countries have deployed these digital tools successfully, but they have not been adopted at scale.

So why are developing countries not adopting them? The answer lies in the high barriers to entry, especially at the development phase. Algorithms need to be trained and updated by skilled data scientists; machines running artificial intelligence require substantial computing power and a steady electricity supply; satellite databases need large information technology (IT) infrastructures and carry high maintenance costs. Developing countries often cannot afford these costs.

Because it is also more cost-efficient to centralize the development of such tools, international organizations are well placed to act as designers and curators of these digital solutions. With talented economists and analysts, they have the development policy experience needed to select carefully which tools would be most beneficial. Unlike governments in developing countries, international organizations can attract talent from leading universities, offering intellectual challenges and competitive salaries.

Some international organizations already do this for global statistics: the World Bank selects, gathers, cleans, and centralizes the World Development Indicators, while the International Monetary Fund (IMF) does the same for the World Economic Outlook Database.

Providing algorithms and other digital tools as global public goods could be the next logical step. Just as with their statistics and policy reports, international organizations could offer these tools freely, allowing public officials and researchers to use them for their own purposes. Nor would organizations like the World Bank need to develop all the digital tools in-house: as with statistics, they could collect them from others and ensure their robustness, consistency, and accuracy.

Initiatives along these lines have already emerged, but they remain rare and scattered. The IMF’s growth-at-risk tool is hosted on GitHub, while the digital tools presented at the United Nations’ Fifth International Conference on Big Data for Official Statistics were hosted on a variety of platforms. Knowing that there is a central repository of curated digital tools would help public servants, researchers, and other communities.

The United Nations’ move to create a repository for algorithms through its big data platform is perhaps the closest to this approach. It allows international organizations to deploy algorithms as web services and offer them to users. Further steps in curation, maintenance, and testing may be required to make the services ‘bulletproof’ and visible to the public.

While the barriers to creating such tools are high, the barriers to using them are far lower. Because international organizations would host these digital public goods, calculation and processing would be done on their servers. Emerging economies would not need to invest heavily in IT infrastructure, and the impact of unstable internet connections would be minimal.

Nor would this require highly specialized skills: public servants and statisticians would need only basic programming skills to use the digital tools. With a little coding knowledge and an internet connection, a public servant could select an application programming interface (API) from the World Bank and feed it with domestic data. Common programming languages and tools are open source (including TensorFlow, R, and Python), and the internet abounds with free IT lectures and tutorials.
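As a rough sketch, such an API call could be only a few lines of standard Python. The endpoint URL, payload schema, and bearer-token authentication below are hypothetical (no such World Bank service exists today); the point is how little client-side machinery would be needed:

```python
import json
import urllib.request

# Hypothetical endpoint: illustrative only, not a real World Bank service.
API_URL = "https://api.worldbank.org/v2/tools/fraud-detection"

def build_request(records, api_key):
    """Package anonymized domestic records as a JSON POST request."""
    body = json.dumps({"records": records}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Sending the request and reading the results back would then be one call:
# with urllib.request.urlopen(build_request(records, key)) as resp:
#     results = json.load(resp)
```

All the heavy computation would happen on the host’s servers; the client only serializes data and reads back a result.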

For example, a public servant could select the ‘World Bank fraud detection in tax filings’ service, upload anonymized tax filings, run the World Bank’s algorithms, and obtain the detected outliers, reducing the cost of detecting potential tax fraud.
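Under the hood, ‘obtaining detected outliers’ could be as simple as a statistical screen. The function below is a deliberately minimal stand-in for whatever model such a service might actually run: it flags filings whose declared income lies more than three standard deviations from the mean, for human review.

```python
from statistics import mean, stdev

def flag_outliers(filings, threshold=3.0):
    """Return filings whose declared income deviates sharply from the rest.

    A minimal stand-in for a hosted fraud-detection model: any filing more
    than `threshold` standard deviations from the mean declared income is
    flagged for review (an outlier is not proof of fraud).
    """
    incomes = [f["declared_income"] for f in filings]
    mu, sigma = mean(incomes), stdev(incomes)
    if sigma == 0:
        return []  # all filings identical: nothing stands out
    return [f for f in filings
            if abs(f["declared_income"] - mu) / sigma > threshold]
```

A real service would use far richer features and models, but the contract is the same: structured filings in, a short list of cases to review out.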

To date, international organizations have mainly structured their knowledge production around statistics and policy reports. Digital tools could be their next public good. Developing and centralizing digital public goods, and offering them free of charge, could empower local decision-makers, allow for more knowledge creation, and enable them to make better, smarter decisions.


Author:

Arnaud Pincet is a Data Scientist at the OECD.