If there is one transformation that illustrates the need for Big Data applications and a wired infrastructure, it is the stunning pace of urbanisation around the globe. In 1950 just 30 per cent of people lived in cities; by 2050 three quarters of the world's population will be city-dwellers.
Alongside the inevitable environmental consequences, this shift threatens to amplify the impact of other factors, including Europe’s ‘Lost Generation’ of unemployed 15- to 24-year-olds (currently 7.5 million) and the frailties of an ageing population.
“This will challenge all of us working on Big Data: what are we trying to build?” said Dan Pelino, General Manager, Global Public Sector at IBM. The potential exists to build resilient citizen-based services, to improve healthcare, to create jobs and to spark economic revival, Pelino told the Science|Business conference, ‘Smarter Data for Europe’.
The scale of the changes that can be put in train by Big Data is exemplified in projects supported by IBM’s Smarter Cities Challenge, in which the company has worked in concert with local governments to apply Big Data to do more with less, by integrating information and operational silos, engaging citizens and identifying efficient investments in infrastructure.
“Look to these successes for the way forward and follow good examples from around the world,” Pelino said.
Changes in real time
One field where Big Data has a long track record of driving efficiencies and constant improvement is Formula 1 motor racing. “F1 as a business is underpinned by Big Data. In essence that’s what the industry is – looking at what’s happening to data and how it’s changing in real time, so we can influence outcomes,” said Peter van Manen, Managing Director of McLaren Electronic Systems. “Of course F1 is not the only place where decisions based on data have big outcomes,” he added.
McLaren’s skills in monitoring and analysing multiple inputs in real time have been transferred to the intensive care unit at Birmingham Children’s Hospital, where outputs from clinical monitoring systems are analysed and synthesised into intelligence that can be used proactively to prevent cardiac arrests. This makes it possible to see life-threatening events coming earlier, by identifying when the pattern of readings starts to change. Whereas previously staff would be alerted to a cardiac arrest only when it happened, now there is a warning beforehand and it is possible to take action to avert it.
“What we are doing is taking the philosophy of looking at Big Data as it comes through, before it is stored, to decide if something is about to happen and be able to respond,” said van Manen, noting, “There’s a better chance of a good outcome if you react sooner.” In the case of the Birmingham intensive care unit there has been a 25 per cent reduction in life-threatening events.
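McLaren has not published the detail of the Birmingham system, but the underlying idea, scoring each new reading against a recent baseline as it streams in rather than waiting for a fixed alarm threshold to be crossed, can be illustrated with a minimal sketch. The window size, z-score cut-off and two-readings-in-a-row rule below are illustrative assumptions, not the hospital's actual model.

```python
from collections import deque
from statistics import mean, stdev
import random

class EarlyWarningMonitor:
    """Illustrative streaming check: score each new vital-sign reading
    against a rolling baseline and warn when readings start to drift,
    before any fixed alarm threshold is crossed. The window size,
    z-score cut-off and two-in-a-row rule are assumptions for the sketch."""

    def __init__(self, window=120, z_threshold=3.0):
        self.baseline = deque(maxlen=window)  # most recent readings only
        self.z_threshold = z_threshold
        self.consecutive = 0

    def update(self, reading):
        """Process one reading as it arrives; return True if the pattern
        of readings appears to be changing."""
        warn = False
        if len(self.baseline) >= 30:  # wait for enough history first
            mu, sigma = mean(self.baseline), stdev(self.baseline)
            z = abs(reading - mu) / sigma if sigma > 0 else 0.0
            self.consecutive = self.consecutive + 1 if z > self.z_threshold else 0
            warn = self.consecutive >= 2  # two deviant readings in a row
        self.baseline.append(reading)
        return warn

# Simulated feed: a steady heart rate followed by a gradual upward drift
readings = [80 + random.gauss(0, 1) for _ in range(200)]
readings += [80 + 0.5 * i + random.gauss(0, 1) for i in range(60)]

monitor = EarlyWarningMonitor()
for t, hr in enumerate(readings):
    if monitor.update(hr):
        print(f"Warning at t={t}s: heart rate {hr:.0f} is drifting from baseline")
        break
```

Real clinical systems combine many signals and clinically validated thresholds; the point of the sketch is simply that the scoring happens as the data streams through, not after it has been stored.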
Van Manen estimates such improved decision-making translates into a saving of 10 – 15 per cent of the economic cost. “You can change data into decisions and that’s where the real value lies: Big Data gives us a great opportunity to make better decisions.”
Impact on patient pathways
While it may be relatively straightforward to bring Big Data to bear in F1, it is more demanding to smooth its adoption in healthcare. “It calls for different ways of [deploying] doctors and nurses, and has an impact on the patient pathway, so there are challenges,” said van Manen.
Perhaps bigger issues in applying Big Data in healthcare are those of data sharing, data privacy and data quality, suggested Andrew Morris, Chief Scientific Officer for Scotland. “We need to sort out data issues before implementing Big Data. There must be proper approval for data sharing or it will conspire against us,” he said.
Healthcare is the last major sector not to be transformed by data. “When you look at healthcare systems you see they are good at episodic care, but surveillance of chronic disease is not good because it relies on integration.” The lack of integration and failure to share data between the different links in the chain causes “waste, duplication and harm,” Morris said.
Scotland has the unfortunate reputation of being the ‘sick man of Europe’ with around two million of its five million population suffering from at least one chronic disease. Morris has been instrumental over the last twenty years in setting up a national system for monitoring patients with Type II diabetes. This improved surveillance has had a dramatic effect in reducing some of the most insidious effects of the disease, with 40 per cent fewer limb amputations and a 45 per cent reduction in the incidence of blindness caused by Type II diabetes.
Use Big Data to drive health policy
Big Data can also be used to drive healthcare policy, Morris believes. For example, using historical datasets it was possible to assess the health impact of banning smoking in public places. After the ban, which came into effect in Scotland in 2006, an 18 per cent reduction in the incidence of acute asthma was recorded.
As chairman and co-founder of the UK’s Open Data Institute, Nigel Shadbolt, too, has devoted considerable effort to opening up public sector information and making it amenable to Big Data analytics. It has been a “heck of a journey”, on which it transpired that much important information is locked up in inaccessible departmental spreadsheets. “These are often the Rosetta stones for all other data assets,” Shadbolt said.
The range of different forms of data needs to be reflected in the tools and methods that are devised to manipulate it. One positive element of the current landscape in Europe is a significant investment in the development of new algorithms.
Curating Big Data stores to make them accessible for use and to maintain their integrity calls for the development of a new profession of stewards and custodians, who will sit somewhere between librarianship and information science, Shadbolt said.
Big Data as a source of jobs
There is also direct evidence that unlocking Big Data is a source of new jobs, with a number of start-ups forming around the Open Data Institute, which is based in east London. One example is OpenCorporates, a start-up that began by harvesting data from Companies House in London and enriching it by adding details of the beneficial ownership of companies. It has now extended this to include data sources beyond the UK and to date covers 54.2 million companies worldwide, with the information available for free.
Another example involves analysing all local authority spending in England and Wales and packaging it up as business intelligence for procurement officers, to increase control of spending. Making such public sector data available has allowed third parties to work with it, and as a result the government has gained free applications. “The power of information is in its widespread use, not its monopoly use,” Shadbolt said.
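Neither the ODI nor the start-ups involved have published their pipelines here, but the general pattern behind this kind of open-data product is straightforward: pull in the raw transaction files that councils publish, normalise supplier names so the same company is not counted several times, and aggregate spend into a view a buyer can act on. A minimal sketch of that pattern, using pandas with a tiny made-up dataset and hypothetical column names rather than any real council's files, might look like this:

```python
import pandas as pd

# Toy stand-in for published council spending files; the real releases vary
# in format, so the column names and figures here are illustrative only.
payments = pd.DataFrame({
    "authority":  ["Council A", "Council A", "Council B", "Council B"],
    "supplier":   ["Acme Ltd", "ACME LIMITED", "Acme Ltd", "Widget Co"],
    "amount_gbp": [12500.00, 8300.00, 4100.00, 9900.00],
})

def canonical(name):
    """Normalise supplier names so the same company is not counted twice."""
    return name.upper().replace("LIMITED", "LTD").strip()

payments["supplier"] = payments["supplier"].map(canonical)

# Aggregate spend per supplier across authorities: the kind of view a
# procurement officer could use to benchmark and control spending.
spend_by_supplier = (payments
                     .groupby("supplier")["amount_gbp"]
                     .agg(total="sum", payments="count")
                     .sort_values("total", ascending=False))
print(spend_by_supplier)
```

The normalisation step is deliberately crude here; matching suppliers across thousands of real spending files is considerably messier, and that cleaning work is where much of the value of such services lies.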
Armed with these examples, Shadbolt concluded that three things are now needed to pull down barriers and push the Big Data revolution forward: improving the supply of high-quality open data, stimulating demand and providing the necessary skills.