
AI: Are we there yet?

Last updated 05 April 2018

From ever-faster hardware to Go champion-beating programs, AI has made giant strides in recent years. But much more work lies ahead.

EmTech Asia 2018

By Janice Lin


The rapid development of computer hardware is fuelling an explosion in artificial intelligence (AI) and changing the way we work and play. Optimistic speakers at this year’s EmTech Asia conference described today's advances as the “golden age of AI”.

EmTech Asia is the Massachusetts Institute of Technology’s (MIT) Southeast Asian stop for its annual conference on emerging technologies. Held on 31 January at the Marina Bay Sands Expo and Convention Centre and co-hosted by the Info-communications Media Development Authority, the conference brought together industry experts and companies to discuss the progress made on this front.

Dr Bill Daly, NVIDIA's chief scientist.

Besides AI, other topics that were discussed at this year’s edition included aerospace innovation, quantum computing, virtual reality and future cities.

Hardware advances drive AI revolution

“Things are finally coming together and we're starting to apply AI to almost every aspect of human life,” said Dr Bill Daly, chief scientist at NVIDIA. “It's revolutionising all sorts of tasks and all sorts of business processes, the way we live, the way we communicate.”

Dr Daly believes this revolution is driven by advances in hardware development. “The arrival of the GPU (graphics processing unit) on the scene in 2012 was what really sparked the AI revolution. And since then, our evolution of faster hardware has been pacing the revolution.”

One example he cited was computer vision, where most of the algorithms and convolutional neural networks used to analyse visual imagery have existed since the 1960s. Yet the AI revolution took off only once hardware fast enough to run these networks and train them in a reasonable amount of time – like GPUs – became available.
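For readers unfamiliar with the mechanics, the core operation in the convolutional networks mentioned above is simple; what made GPUs transformative is that millions of such operations can run in parallel. Below is a minimal, illustrative sketch in plain Python (the image and filter values are made up for demonstration; real networks learn their filters from data):

```python
def conv2d(image, kernel):
    """Slide a kernel over an image and sum the elementwise products.

    This is the basic building block of a convolutional neural network;
    frameworks run vast numbers of these operations in parallel on GPUs.
    """
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            total = 0.0
            for di in range(kh):
                for dj in range(kw):
                    total += image[i + di][j + dj] * kernel[di][dj]
            row.append(total)
        out.append(row)
    return out

# A simple vertical-edge detector applied to a tiny image
# containing one bright column.
image = [
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 1, 0],
]
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
]
print(conv2d(image, kernel))  # responds strongly where the bright edge is
```

The output is large where the filter lines up with the bright column and zero elsewhere, which is how stacked layers of such filters come to detect edges, textures and eventually whole objects.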

Going forward, hardware will continue to pace advancements in AI, said Dr Daly.

“As we move from one endeavour to another, say from vision to speech, and then finally to machine translation … the task becomes computationally (more) demanding, and our progress on it is limited by our progress in building machines that are fast enough to meet those demands.”

Dr Wu Shuang, research scientist at YITU Technology.

Meanwhile, Dr Wu Shuang, research scientist at YITU Technology, which opened its Singapore research and development office in mid-January, pointed to the growth in academic research as contributing to AI’s steady rise.

He stressed that because of this, it is important for AI companies to “stay very close to the frontier of research, and at the same time, looking for the right application”.

Dr Wu also shared that YITU's face recognition technology is now at a stage where it can scan one billion faces in under one second.

Machine learning takes a huge step forward

Among the various groundbreaking AI technologies discussed at EmTech Asia, one that stood out was Google DeepMind’s AlphaGo Zero. This is the latest version of AlphaGo, which in October 2015 became the first computer program to beat a professional player at Go, a game long considered among the most challenging for an AI to master.

Oriol Vinyals, Google research scientist.

AlphaGo achieved this using an algorithm that searched possible moves, drawing on knowledge gained from thousands of games played with humans.

However, it “still needed training data from humans to know what are good moves and who is ahead”, said Google research scientist Oriol Vinyals.

On the other hand, AlphaGo Zero was designed to learn the game from scratch by playing multiple games against itself, teaching itself to predict moves with each new round. Within three days, it surpassed the abilities of the AlphaGo version that defeated a Go champion.
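The self-play idea can be illustrated with a much smaller game than Go. The toy program below (a sketch only, not DeepMind's actual method, which combines deep networks with tree search) starts with no knowledge of the game of Nim and improves purely by playing against itself, reinforcing moves that led to wins:

```python
import random

# Toy self-play learner for Nim: players alternately take 1-3 sticks,
# and whoever takes the last stick wins. The agent begins with no
# knowledge and learns from the outcomes of games against itself.

N_STICKS = 10
value = {}  # (sticks_remaining, move) -> estimated win rate for the mover

def choose(sticks, explore=0.1):
    """Pick a move: mostly the best-known one, occasionally a random one."""
    moves = [m for m in (1, 2, 3) if m <= sticks]
    if random.random() < explore:
        return random.choice(moves)
    return max(moves, key=lambda m: value.get((sticks, m), 0.5))

def play_and_learn(games=20000, lr=0.05):
    for _ in range(games):
        sticks, history = N_STICKS, []
        while sticks > 0:
            m = choose(sticks)
            history.append((sticks, m))
            sticks -= m
        # The player who made the last move won. Walking backwards
        # through the game, credit alternates between the two players.
        reward = 1.0
        for state in reversed(history):
            old = value.get(state, 0.5)
            value[state] = old + lr * (reward - old)
            reward = 1.0 - reward  # switch to the opponent's perspective

random.seed(0)
play_and_learn()
# With enough self-play the agent tends to prefer taking 2 from 10,
# leaving 8 sticks (a losing position for the opponent in this game).
for m in (1, 2, 3):
    print(m, round(value[(N_STICKS, m)], 2))
```

No human games are needed: as with AlphaGo Zero, the program's only teacher is the outcome of its own play, though Go demands a neural network and search where a lookup table suffices here.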

The technology behind AlphaGo Zero is far more powerful because of its ability to learn from itself and not be constrained by the limits of human knowledge.

This has far-reaching implications for the development of AI, opening up the possibility of machines that do not require human input to function and can exceed human capabilities.

Professor Tomaso Poggio from MIT.

How to think like us

Despite these advances, experts believe that for AI to take the next step, it has to be able to emulate "human-like intelligence".

This would require understanding how the brain works, said Professor Tomaso Poggio from the Brain and Cognitive Sciences Department at MIT.

Machines learn from large data sets in order to perform tasks, but human learning does not require a person to, for example, look at multiple images just to grasp what an object is, explained Prof Poggio.

“There must be the ability to synthesise programs on the fly based upon a set of small routines,” he said.

He suggested that the next breakthrough in AI would likely come from the neurosciences – because in order to build better computer brains, we would first have to understand our own.


We have to talk about AI

In order to build a healthy ecosystem for AI technology, YITU’s Dr Wu stressed that the industry, governments and the public must come together to talk about policy issues relating to the technology.

“We need to talk about what AI is capable of and not oversell AI as a technology. It is on us to make good progress without making a big mistake,” he said.

Talking openly about AI with relevant stakeholders will also help to correct some of the misconceptions about the technology.

“AI doesn't threaten humans and it doesn't replace humans. It empowers them, frees humans from menial tasks,” said NVIDIA’s Dr Daly.

Likening AI to “power tools for the brain”, in the way that power saws and drills let a carpenter work more efficiently than a chisel or handsaw would, he added: “Ultimately, it's going to free humans for more endeavours in the arts and sciences … With AI, we can be much more productive in our intellectual pursuits.”