
How Governments Around The World Are Turning To Data

We look at how different countries are solving their biggest problems with data analytics

10 May

According to recent research by Gartner, a quarter of government CIOs believe that a lack of digital skills is a barrier to achieving their goals, with data science skills chief among those missing.

The survey asked 377 government CIOs from 38 countries a range of questions about technology, from how much of their budget is dedicated to IT to where they see the biggest potential for technology use in government and what barriers stand in their way. It found that data analytics and data science skills were the most lacking within their organisations, with 30% of respondents saying they were vulnerable in that domain; security and risk ranked second, with 23% indicating this was a concern.

This is a situation that will have to improve rapidly. The global urban population has risen to 54% of the total and is predicted to reach 66% by 2050. For cities to function with this level of migration, billions of data points and Internet of Things technology will have to be incorporated. Essentially, we are going to have to create data-driven smart cities, which use intelligent transport systems and are administered by integrated urban command centers.

Many governments around the world are already some way along the road to achieving this, with initiatives in place that seek to leverage big data to improve their infrastructure, processes, or systems. We’ve looked at six governments that are currently using data to solve the everyday problems of their citizens, make their lives easier, and better their economic situation, and at whether they’ve been successful.

Indonesia & Singapore

Three and a half million people a day commute into Jakarta from the wider metropolitan area of Greater Jakarta, mainly by car. Unfortunately, while they may come for the work, they usually end up staying for the traffic, and not by choice. In a 2015 index of the world’s worst cities for traffic jams from motor oil company Castrol, Jakarta ranked worst in the world, with satellite navigation data showing the average driver starting and stopping more than 33,000 times a year. An estimated 70% of the city’s air pollution comes from vehicles.

This has a tremendous knock-on effect on public transportation, which the Jakarta Government is looking to big data analytics to solve. The Jakarta Government’s Smart City team is now working alongside Pulse Lab Jakarta to leverage real-time bus location data, service demand data, and real-time traffic information. The project is currently in its first phase of implementation, focused on two aspects: mapping locations with abnormal traffic behaviours and understanding how customer demand responds to traffic dynamics. This should allow them to schedule buses so as not to add to the congestion, and to better optimize capacity.

Similar projects are underway in countries like Singapore, whose Land Transport Authority (LTA) is using real-time data from ground sensors and Radio-Frequency Identification (RFID) chips to improve public transport and citizen mobility. Bus arrival times are tracked using sensors installed in over 5,000 vehicles, which transmit real-time location data to a central command centre where predictive analytics is applied to better match supply to demand. There are also 500 signal crossings with the ‘green man plus’ feature, 77 ERP gantries, 1,600 electronic car parking systems, 176 MRT stations, and 114,000 lampposts that could potentially be equipped with sensors - all of which adds up to a significant amount of data, and the Singapore government is looking to analyze every bit of it. According to Dr Daniel Lim Yew Mao, Consultant with the Data Science Division, Government Digital Services (GDS), Infocomm Development Authority of Singapore (IDA), the ultimate goal is to ‘transform transport so that “walk, cycle, ride” is how we travel for a greener and healthier Singapore,’ highlighting just how important government involvement in data initiatives is for the environment.
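Both the Jakarta and Singapore projects rest on the same analytical core: comparing live observations against historical baselines. As a rough illustration, and not either government’s actual code, here is a minimal Python sketch of the first Jakarta focus area, flagging road segments with abnormal traffic behaviour from real-time bus location data; the column names and the z-score threshold are assumptions:

```python
# Illustrative sketch only: flag road segments whose latest bus travel time
# deviates sharply from that segment's historical average.
import pandas as pd

def flag_abnormal_segments(pings: pd.DataFrame, z_threshold: float = 3.0) -> pd.DataFrame:
    """pings: one row per bus report, with columns
    ['segment_id', 'timestamp', 'travel_time_s'] (assumed schema)."""
    # Historical baseline per road segment
    stats = pings.groupby("segment_id")["travel_time_s"].agg(["mean", "std"])
    # Most recent observation per segment
    latest = (pings.sort_values("timestamp")
                   .groupby("segment_id")
                   .tail(1)
                   .set_index("segment_id"))
    # Standardize the latest travel time against the segment's history
    latest["z"] = (latest["travel_time_s"] - stats["mean"]) / stats["std"]
    # Segments running far slower than usual are flagged as abnormal
    return latest[latest["z"] > z_threshold]
```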

US

The Obama presidency, whatever its failings, was always undeniably committed to data. From his very first election campaign, when he tied John McCain in knots with his data-driven ground game, Obama never seems to have lost faith, with numerous data-driven initiatives implemented since to solve a number of the country’s problems. He was the first president to appoint a Chief Data Scientist in DJ Patil, and issued an executive order making open and machine-readable data the new default for government information, ensuring that information about all government operations is readily available to anyone who needs it - something vital in ensuring an efficient and transparent government, while also enabling significant opportunities for innovation.

Among his other initiatives was a new program to try and reduce the local jail population with the ‘Data-Driven Justice Initiative’ (DDJI). The US operates the largest prison system in the world. With just 5% of the global population, it accounts for almost 25% of all incarcerated individuals. The DDJI will see the government partner with seven states and 60 municipalities to use algorithms to get any nonviolent offenders who don’t pose a risk to the community out of the overcrowded prison system.

Data is also being used heavily at a more local level. Tom Schenk was appointed Chicago's Chief Data Officer in 2015, having previously served as the City's director of analytics and its open data portal. He is currently overseeing a radical adoption of analytics across every facet of city management. In 2014, Chicago’s Department of Innovation and Technology (DoIT) also began constructing the SmartData Platform, an open-source predictive analytics platform funded with a $1,000,000 award from Bloomberg Philanthropies’ Mayors Challenge.

Schenk has been able to use predictive analytics to leverage the data at his disposal in a number of innovative ways. For example, the city determines where to place rat bait by predicting which dumpsters are most likely to be overflowing. According to Schenk, this has made the city 20% more efficient in controlling rats.
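Chicago’s actual rat model is more elaborate and is reported to draw on 311 service-request data; purely as an illustration of the shape of the logic, with invented field names, a minimal sketch:

```python
# Illustrative sketch: rank city blocks by recent overflowing-dumpster
# complaints and bait the worst offenders first.
from collections import Counter

def rank_bait_sites(complaints, top_n=10):
    """complaints: iterable of (block_id, complaint_type) pairs,
    e.g. drawn from a 311 feed (assumed format)."""
    overflow = Counter(block for block, ctype in complaints
                       if ctype == "overflowing_dumpster")
    # Blocks with the most overflow complaints are the likeliest rat habitat
    return [block for block, _ in overflow.most_common(top_n)]
```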

One of the most notable ways that predictive analytics has been used in Chicago is in food inspection. Under Schenk’s leadership, DoIT has collaborated with a number of City departments, including the Department of Public Health (CDPH), to put predictive analytics to good use. Chicago, with a population nearing 3 million, has fewer than three dozen inspectors to oversee the annual checking of the city’s 15,000 food establishments. To make the best use of these inspectors, DoIT and CDPH collaborated to build an app that uses predictive analytics to score food establishments on how likely they are to face a critical violation. The score is based on factors believed to correlate with violations, such as a prior history of critical violations, possession of a tobacco and/or incidental alcohol consumption license, the length of time an establishment has been operating, and nearby burglaries, among others. Inspectors can then use it to prioritize the establishments that most urgently need visiting. By using this analytics-based procedure, Chicago has been able to discover critical violations an average of seven days earlier than with the traditional inspection method.
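A minimal sketch of how such a risk score might be built from the factors listed above, assuming a historical inspections table with one row per past inspection; the column names here are invented, and Chicago’s real model, which DoIT released as open source, differs in detail:

```python
# Illustrative sketch: train a violation-risk model on past inspections,
# then score establishments so inspectors can visit the riskiest first.
import pandas as pd
from sklearn.linear_model import LogisticRegression

FEATURES = [
    "prior_critical_violations",        # count from earlier inspections
    "has_tobacco_license",              # 0/1
    "has_incidental_alcohol_license",   # 0/1
    "days_in_operation",
    "nearby_burglaries_90d",
]

def score_establishments(history: pd.DataFrame,
                         establishments: pd.DataFrame) -> pd.DataFrame:
    """history must contain FEATURES plus a 0/1 'critical_violation' outcome."""
    model = LogisticRegression(max_iter=1000)
    model.fit(history[FEATURES], history["critical_violation"])
    scored = establishments.copy()
    scored["risk"] = model.predict_proba(scored[FEATURES])[:, 1]
    # Highest predicted risk first: these are visited soonest
    return scored.sort_values("risk", ascending=False)
```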

It is difficult to say what the future holds for government data in the US under Donald Trump, although the apparent success it brought his campaign suggests he may at least be aware of its advantages. Hopefully, we will see Obama’s work continued and more data opened up to the public so the US can continue to lead the world in data analytics.

Canada

The Canadian government has come in for criticism over the years for the way it treats its wireless spectrum, the radio waves responsible for everything from TV and radio broadcasting to data and phone service for smartphones. According to Gregory Taylor, principal investigator for Canadian Spectrum Policy Research and a professor at the University of Calgary, it has been auctioning off airwaves based on data provided by the big wireless carriers, who he notes have a clear vested financial interest.

However, the Government of Canada is changing this. It is now using big data to make smarter use of wireless spectrum, conducting research at the Big Data Analytics Centre to help predict where on the spectrum there are unused radio waves that can be put to work, ensuring that the wireless networks Canadians depend on remain reliable and accessible regardless of traffic load.

According to the Honourable Navdeep Bains, Minister of Innovation, Science and Economic Development Canada, ‘To power their smartphones, tablets, TVs and radios, Canadians rely on the judicious use of wireless spectrum. We can't make more wireless spectrum, but we can make better use of it. To do that, we need to understand exactly how it's being used and where. We need to know, in real time, where there are unused radio waves that could be put to work. Big data is the key to understanding that. It gives us the power to turn data into useful insights that allow us to predict where the surplus capacity will be at any given time. The research being conducted at the Big Data Analytics Centre has the potential to transform not only the telecommunications sector but all sectors of the economy.’
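The prediction problem the minister describes can be pictured simply: from historical occupancy measurements, estimate which bands are likely to be idle at a given time so spare capacity can be reused. A hedged sketch, with an invented schema and threshold:

```python
# Illustrative sketch: find spectrum bands that are historically quiet
# at a given hour and could carry extra traffic.
import pandas as pd

def likely_idle_bands(readings: pd.DataFrame, hour: int,
                      occupancy_threshold: float = 0.2) -> list:
    """readings: columns ['band_mhz', 'hour', 'occupancy'],
    with occupancy in [0, 1] (assumed schema)."""
    at_hour = readings[readings["hour"] == hour]
    avg = at_hour.groupby("band_mhz")["occupancy"].mean()
    # Bands well below the threshold are candidates for reuse
    return avg[avg < occupancy_threshold].sort_values().index.tolist()
```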

Switzerland

One of the most exciting, and important, applications of big data has been in healthcare, where it is being used to track diseases like malaria and dengue fever, as well as in drug discovery. According to a recent paper, ‘Use of machine learning approaches for novel drug discovery’, machine learning techniques can now be applied at several steps of the drug discovery methodology, including ‘the prediction of target structure, prediction of biological activity of new ligands through model construction, discovery or optimization of hits, and construction of models that predict the pharmacokinetic and toxicological (ADMET) profile of compounds.’
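To make one of those steps concrete, here is a minimal sketch of ‘prediction of biological activity of new ligands through model construction’, using Morgan fingerprints and a random forest. The molecules and labels are placeholders, and this illustrates the general technique rather than the paper’s own method:

```python
# Illustrative sketch: fingerprint known ligands, fit a classifier, and
# score a new candidate's probability of being biologically active.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def fingerprint(smiles: str, n_bits: int = 2048) -> np.ndarray:
    """Encode a molecule as a Morgan (circular) fingerprint bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    return np.array(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits))

# Placeholder training set: ligands with measured activity (1 = active)
train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]
train_active = [0, 0, 1]

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit([fingerprint(s) for s in train_smiles], train_active)

# Predicted probability that a new candidate ligand is active
print(model.predict_proba([fingerprint("CC(=O)Nc1ccc(O)cc1")])[0, 1])
```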

The Swiss government is now leading the way among governments researching its potential, having launched a new initiative to create a national infrastructure for sharing clinical data, including genomics data, among university hospitals, institutes, and others involved in personalized healthcare research in Switzerland. The initiative is called the Swiss Personalized Health Network (SPHN) and is expected to take until 2020 to complete, at a cost of CHF 40 million just to support IT and clinical data interoperability.
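Clinical data interoperability of this kind is, at its simplest, a schema-mapping problem: each hospital stores records in its own format, and research queries need one common representation. A minimal sketch with invented field names; real efforts typically build on standards such as HL7 FHIR or OMOP:

```python
# Illustrative sketch: map each hospital's local record format onto one
# shared research schema. All field names here are invented.
from typing import Callable, Dict

# Shared schema: patient_id, diagnosis_code, sampled_at
ADAPTERS: Dict[str, Callable[[dict], dict]] = {
    "hospital_a": lambda r: {
        "patient_id": r["pid"],
        "diagnosis_code": r["icd10"],
        "sampled_at": r["date"],
    },
    "hospital_b": lambda r: {
        "patient_id": r["patientNumber"],
        "diagnosis_code": r["diagnosis"]["code"],
        "sampled_at": r["collectedOn"],
    },
}

def harmonize(source: str, record: dict) -> dict:
    """Translate a hospital-local record into the shared schema."""
    return ADAPTERS[source](record)
```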

Torsten Schwede, an investigator at Biozentrum, the Center for Molecular Life Sciences at the University of Basel, noted that ‘The SPHN initiative is unique worldwide as it brings together all university hospitals and associated academic research institutions in the country in a joint effort to transform the way biomedical information is used for research to benefit personalized healthcare.’ Hopefully, it is not unique for much longer.

China

China has a population of 1.4 billion, 700 million of whom are internet users. A population of this size is a massive resource when it comes to collecting and analyzing data, and the Chinese government has some ambitious plans.

According to a high-level policy document, China’s Communist Party is attempting to develop a far-reaching social credit system under the guise of building a culture of ‘sincerity’ and a ‘harmonious socialist society’ where ‘keeping trust is glorious.’ They are seeking, primarily, to better control China’s vast, anarchic, and poorly regulated market economy and stem the flow of corruption. However, many in the Western media have drawn comparisons with 1984, arguing that it is really just a state surveillance system that will leave the government free to manipulate the people. This is something the Chinese government has undoubtedly been guilty of before, but the criticism is somewhat ironic given that many of their data tactics have been taken from American companies.

The Communist Party’s ambition is to collate all available online information about China’s companies and citizens in one place, and then assign each of them a score based on their political, commercial, social and legal ‘credit’. Any person or company deemed to have fallen short faces an array of penalties, which range in severity depending on the nature of the indiscretion.
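Mechanically, what the reports describe is easy to caricature: heterogeneous signals weighted into a single composite score, with penalty tiers keyed to thresholds. The sketch below is purely illustrative; every weight, signal, and threshold is invented:

```python
# Purely illustrative sketch of a composite "credit" score with penalty tiers.
WEIGHTS = {"legal": 0.4, "commercial": 0.3, "social": 0.2, "political": 0.1}

def credit_score(signals: dict) -> float:
    """signals: per-domain scores in [0, 100]; returns a weighted composite."""
    return sum(w * signals.get(domain, 0.0) for domain, w in WEIGHTS.items())

def penalty_tier(score: float) -> str:
    if score >= 70:
        return "none"
    return "minor" if score >= 50 else "severe"
```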

This system is already in the testing phase, although the scale presents a number of challenges that could prove insurmountable. With a population of 1.4 billion, it’s near impossible to centralize all the data, clean it, analyze it, and action it. According to a report by The World Privacy Forum, a non-profit organization, credit scores are based on hundreds of data points with no standards of accuracy, transparency, or completeness. Wang Zhicheng of Peking University’s Guanghua School of Management similarly told the Financial Times that ‘China has a long way to go before it actually assigns everyone a score. If it wants to do that, it needs to work on the accuracy of the data. At the moment it’s “garbage in, garbage out.”’
