Last updated: 13 March 2023
Published on: 14 May 2018
New framework and blockchain-powered platform will enable the safe and secure sharing of data, which can help propel AI and beyond.
By Janice Lin
Though vast amounts of data are generated yearly, only a handful of large organisations have access to this tsunami of data.
In 2016 alone, a massive 16 zettabytes (that’s one trillion gigabytes) worth of data was created worldwide, yet only a scant one per cent of it was analysed and used, according to market research firm IDC.
For data owners, concerns over trust and security hinder the mass sharing of this information, despite the benefits that can be gained from leveraging large data sets, such as optimising business processes or accelerating the development of artificial intelligence (AI). Most existing data exchanges also operate on a centralised model, which does not engender trust, as data owners would have to give up control of their data to use the platform.
It is with the purpose of addressing these issues that Singapore-based start-up DEX and PwC Singapore have come together to work on a framework and technology for the safe and secure exchange of data.
This collaboration will see the launch of a new model of data exchange: a blockchain-based platform called Ocean Protocol, on which data owners and buyers can transact to share data. The platform, due to be up and running by end-2018, will allow for the secure, auditable and transparent exchange of data that is not only compliant with existing regulations but also scalable.
Companies like consumer goods giant Unilever and biotech firm Roche Diagnostics have come on board to be among the first to test the new platform.
“Big companies, multinationals and governments are creating huge volumes of data and facing increasing pressure to monetise their information, while AI start-ups and researchers have their algorithms but little data. What we’re doing essentially is building a bridge between the two,” said DEX’s CEO, Mr Chirdeep Singh Chhabra, at the launch on 10 April 2018.
Setting the standard for the global data economy
As a decentralised platform, Ocean Protocol will ensure that control of the data remains in the hands of its owners, helping to create trust. At the same time, it records who has touched the data, what was done to it and for which purposes, creating a transparent and secure environment in which users can transact.
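The article does not describe Ocean Protocol's internals, but the provenance property it mentions — an auditable record of who touched the data, what was done and why — is the same tamper-evidence guarantee a blockchain ledger provides. A minimal illustrative sketch (not Ocean Protocol's actual implementation; all names here are hypothetical) shows the core idea with a hash-chained access log, where altering any historical record invalidates every hash that follows it:

```python
import hashlib
import json

def record_access(ledger, accessor, action, purpose):
    """Append a tamper-evident record of a data access.

    Each entry embeds the hash of the previous entry, so altering
    any historical record changes every hash that follows it --
    the property a blockchain-style audit trail relies on.
    """
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "accessor": accessor,
        "action": action,
        "purpose": purpose,
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)
    return entry

def verify(ledger):
    """Recompute every hash; return True only if the chain is intact."""
    prev_hash = "0" * 64
    for entry in ledger:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if recomputed != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
record_access(ledger, "hospital-a", "read", "remote patient monitoring")
record_access(ledger, "ai-startup-b", "train", "model development")
print(verify(ledger))           # chain intact: True
ledger[0]["purpose"] = "resale"  # tamper with history
print(verify(ledger))           # tampering detected: False
```

In a real deployment the ledger would be replicated across independent nodes rather than held in one list, which is what removes the need for a trusted central operator.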
The hope is that this new model will set the standard for the global data economy, with plans to open source Ocean Protocol’s technology and provide the tools and services to allow for the replication of this model and for more solutions to be built upon it.
“It's about creating these avenues and enabling organisations and large companies to come on board, so if any organisation wants to start a marketplace on top of Ocean Protocol, it should be able to do it, by just copying the code,” said Mr Chhabra.
The framework and Ocean Protocol are being built with support from the Info-Communications Media Development Authority (IMDA), which will provide regulatory oversight, policy guidance and feedback to ensure the proper sharing and usage of data.
Mr Yeong Zee Kin, IMDA’s assistant chief executive of data innovation and protection, said: “We want to be an engaged regulator, to be part of what’s happening and to learn and understand data along with the industry. Our role as an industry developer will be enhanced if we work hand in glove with private sector initiatives and the industry, to ensure consistency and that this market can grow.”
Development sprints with industries
Six industry-led pilots have been launched to support the development of this project, with participation from firms in sectors such as healthcare, retail, finance, utility, mobility and built environment. These “sprints” will help the project gain deeper insight into the challenges faced by each industry in sharing and accessing data.
Roche Diagnostics is among the first to jump on board, with plans to collaborate on a pilot project to look at ways to improve the transmission of data sent by heart patients undergoing blood-thinning therapy at home. By receiving blood results in real time from patients’ home-monitoring devices, clinicians can better monitor and manage their patients’ conditions and thus prevent complications such as stroke.
Speaking at the media briefing, Roche Diagnostics’ Asia Pacific managing director, Mr Lance Little, said clinicians currently receive these results only when patients return to the hospital for check-ups, which could be weeks or months after the results were first taken.
“We need trusted and effective tools to bring our data together,” he said. “Accessing insights from this data in real time will enable better decision-making by hospitals and, ultimately, a better patient outcome.”
Other companies that are currently participating in the healthcare stream include insurance firm Aviva and health tech company ConnectedLife, which will look at applying data analytics and AI to enhance care for the elderly and support independent living.
And in the retail stream, Unilever aims to use the framework to gain new insights on shoppers and help smallholder farmers in Southeast Asia gain access to information and services that will help them adopt sustainable production practices and thus improve their livelihoods.
Summing up this collaborative project, Mr Yeong said: “From an industry development perspective, what I'd like to see is for us to get learnings from this, and see whether we can share them with others, so that around this region, companies will be keen to come to Singapore to do this kind of work, by adapting our framework and technology.”
For further reading, check out this blog post by DEX.
Images courtesy of DEX.