Data Protection in Singapore: From Compliance to Accountability
Ladies and Gentlemen,
- I am very honoured to be invited to speak at the Privacy Protection and Data Governance Summit today. I look forward to exchanging views and sharing experiences with such a respected audience, which includes the deans of the Peking University Law School & Development Academy and my friend Mr Stephen Wong, Privacy Commissioner for Personal Data for Hong Kong SAR. This is a great opportunity for me to draw on Singapore’s humble experience and speak about our approach towards data protection. I describe it as an Asian approach to data protection, and there is no better place to share our perspectives than right here in the heart of Asia, Beijing, the capital of the People’s Republic of China.
- The advent of the Internet of Things (IoT) has greatly increased the velocity at which data is generated and collected. Big Data has allowed us to aggregate and begin managing this explosive growth of data. And Artificial Intelligence (AI) is a key enabling technology that allows us to harness data to power new functions such as recommendation engines, image recognition, translation and prediction. Data has brought countries, companies and society opportunities that were unimaginable a decade ago. Both China and Singapore are beneficiaries. For instance, China’s AI-driven efforts to enhance facial recognition technology bring about great possibilities in its applications. Back in Singapore, we have implemented this technology, and I no longer have to carry my office pass with me because facial recognition technology literally opens doors for me.
- Personal data forms a large part of the data generated and collected by IoT devices. When Singapore’s Personal Data Protection Act was introduced, user-provided personal data formed the majority. Today, IoT has changed the way we generate and collect personal data. The volume of user activity and observable data generated by IoT devices has increased exponentially, eclipsing declared or user-provided data. IoT devices collect personal data whenever a person steps within range of their sensors. This challenges the assumptions underlying the consent-based foundation of data protection law in two ways. There is no easy way to ensure that the purpose for collection is effectively brought to the attention of the individual. And it is difficult to obtain a clear indication of consent from the individual.
- The way our personal data is used has also changed drastically. As data analytics and AI offer new possibilities to make better use of existing data, re-purposing or secondary use of data becomes possible. This places additional stress on the traditional consent-based regime. Consent obtained in the past for a fixed list of purposes cannot possibly anticipate the new possibilities that technological advances offer in the future. From a pragmatic point of view, contacting each individual to obtain fresh consent is often not viable. Equally, leaving past data sitting in storage and not making use of it when technological advances offer the opportunity to do so is economically wasteful.
- Although Singapore’s Personal Data Protection Act has been in operation for barely five years, we have come to realise that it needs to be updated. When we started, we took a lot of guidance from the OECD guidelines and Canadian data protection law. Today, as we review our law, we have decided that we cannot follow Western examples alone; we have to create a bespoke set of rules that will allow us to find a new balance between consumer protection, business efficiency and technological innovation. These are not competing interests but complementary ones. Businesses will deploy new technologies only as fast and as far as their customers are willing to adopt them. To spur the adoption of emerging technologies, it is essential to establish a conducive and consistent regulatory environment that encourages innovation by building public trust. With a high level of public trust, consumers will be more willing to participate in the digital economy. Done right, these form a reinforcing loop that will propel society and the economy forward.
- I wish now to share Singapore’s approach to building business and consumer trust in the digital economy, so as to create a trusted environment that is conducive to innovation. Two years ago, we embarked on a three-stage process to help organisations in their transformation journey. This entails a mindset shift from compliance to accountability. It is about being able to demonstrate to customers that measures have been put in place to pre-emptively identify and address the risks that come with data use. We have achieved several milestones, which I will elaborate on.
- The history of accountability can be traced to the very beginnings of data protection. It began as a way to ensure that an organisation transfers personal data only to other organisations that have equivalent data protection practices in place. The principle has evolved, but one key aspect of accountability today is still ensuring that personal data is transferred only to receiving organisations with comparable practices. Today, this principle operates at different levels. Some countries mutually recognise each other’s data protection laws as comparable, making life easy for businesses that transact between them. Other regimes require companies to put in place contracts ensuring that their counterparts have data protection practices in place. A new area that we think will grow is the recognition of data protection marks as a visible indication that data protection practices are in place.
- In a world where transborder commerce takes place all the time, we need to present options to the business community so that consumers at home have access to worldwide choice and reap the benefits of e-commerce. The principle of accountability will enable us to construct secure bridges between countries with comparable data protection practices, so that data can flow in support of transborder trade and e-commerce. I believe that conferences like this, where we share data protection practices, will increase our knowledge of each other’s data protection systems. Increased knowledge and familiarity will enable us to find the best way to facilitate the cross-border flow of data in support of modern digital trade, with the ultimate goal of benefiting our consumers at home.
- Just last week, Singapore officially launched the Data Protection Trustmark certification scheme. The Trustmark enhances and promotes consistency in data protection standards across sectors by establishing and recognising robust data governance standards. It helps organisations increase their competitive advantage and build trust with their clients. The framework was developed by aligning with the PDPA and incorporating elements of international benchmarks, such as the APEC CBPR and PRP systems, as well as global best practices.
- As the certification entails regular independent review of work processes, the Trustmark serves as a visible badge of recognition that an organisation demonstrates accountability and responsibility in its data protection policies and practices. It also allows business partners and consumers to easily identify certified organisations, giving them the assurance that they need.
- We think this is an important development. The Commission’s 2018 surveys showed that 2 in 3 consumers would rather buy from a brand that is certified to protect their personal data, and 4 in 5 organisations would prefer to work with businesses that properly manage personal data. With this, we look forward to more organisations coming on board.
- To ensure that the regulatory environment keeps pace with evolving technology in enabling innovation, the Commission is reviewing the Act and has to date conducted two rounds of public consultations to gather feedback on our proposed changes.
- As I have stated before, we cannot rely on consent as the only control on how personal data is used. We need to enhance our consent regime by introducing parallel bases for processing personal data. We have identified two new enhancements. First, we will introduce a system allowing organisations to notify consumers of their intention to use personal data collected in the past in a new way. The notification can take place by whatever means is most effective for the organisation’s customer base. The notification will have to provide a period for opting out. At the end of the period, anyone who has chosen not to opt out will be deemed to have consented to the new use, and the organisation can proceed. This provides a pragmatic way for organisations to make better use of the data they already have as technological advances open up new opportunities.
- There may be other times when the larger interest of systemic benefits overrides individual preferences. One clear example is monitoring payment transactions for fraudulent activities or money laundering attempts. The need to maintain systemic integrity and trust will trump individuals’ preferences. I should also add that oftentimes, it is the crooks who will withhold consent if they know that someone is watching! We are therefore introducing legitimate interests as a basis allowing organisations to make use of data without having to obtain consent when there is a larger benefit to society.
- We have introduced a regulatory sandbox to pilot these concepts before we amend our law. There has been strong interest from the private sector, and we look forward to working with them to fine-tune the details before we introduce amendments.
- Consumers can also look forward to greater accountability from organisations. A new mandatory data breach notification regime will be introduced, requiring organisations to notify the Commission and affected individuals when a data breach occurs. We do not introduce this lightly, as we expect the Commission’s workload to increase. But we think it is necessary for two reasons. Notifying individuals early allows them to take steps to protect themselves, for example by changing passwords or cancelling credit cards, depending on the context. Notifying the Commission gives us the opportunity to guide organisations in managing the breach and containing its potential harm.
- I will spend the remainder of my allotted time sharing how we are putting accountability into practice in the area of AI governance. The diffusion of AI technology into the marketplace is a classic example of how we are using accountability as the guard rails for the broad adoption of AI in Singapore.
- To create a trusted ecosystem for AI adoption, we announced three initiatives last year to engage key stakeholders in shaping the AI ecosystem: the formation of an Advisory Council on the Ethical Use of AI and Data, the establishment of a Research Programme on the Governance of AI and Data Use, and the publication of a discussion paper on the responsible development and adoption of AI. It is the discussion paper that I will focus on. Issued in June 2018, it has now been converted into a model framework which we plan to publish very soon.
- Spearheading Singapore’s discussions on the legal, ethical and governance issues that may arise from the use of AI and other data-driven technologies, we developed a Model AI Governance Framework. It is an accountability-based framework for the responsible development and adoption of AI. It sets out the sorts of questions we think any company should ask before embarking on AI or implementing it in their products.
- By adopting the model framework, companies will be putting into practice fair, transparent and accountable AI that is human-centric. The framework seeks to do three things:
- First, it seeks to identify how ethics and risks associated with AI can be integrated into and managed by existing corporate governance structures. When AI is used to augment human decisions or even to make autonomous decisions, the framework seeks to provide guidance on how to select the most appropriate decision-making model. By doing so, it hopes to ensure that decisions are taken at the right levels, with proper oversight by the Board, without introducing new structures.
- Second, the model framework examines operational considerations when data from different internal or external sources are prepared and used to train machine learning models. By identifying the people involved in ensuring data quality, model training, model selection, and monitoring the output of selected models, companies will be able to allocate the right responsibilities to the right departments.
- Finally, how does a company communicate with its customers when it makes use of AI in its services or products? The model framework aims to provide guidance on the level of disclosure, identifying effective means of communication and providing customers with the means to reach out.
- We believe in promoting responsible data use and sharing to engender a high level of consumer trust. We want companies to build in good data accountability practices, to reassure consumers that decisions or suggestions made through AI are working in their best interests. To build public understanding of and trust in AI, it is important for companies and consumers alike to understand the benefits and challenges of these technologies, such as those relating to ethics and the law.
- The ultimate goal of our shift from compliance to accountability is to establish a high level of consumer trust as the bedrock of our data protection regime, thereby enabling data innovation in Singapore’s Digital Economy.
- Let us think critically, share experiences and learn best practices for meeting this new wave of challenges, and work together to continue building a robust, trusted data ecosystem that will drive the digital economy.
- With this, I would like to thank Peking University Law School & Development Academy for the opportunity to speak and I wish you a very happy Lunar New Year ahead.