Cognitive Business: Baidu AI Lab's Outlook Into 2017 and Beyond

This post was published on the now-closed HuffPost Contributor platform. Contributors control their own work and posted freely to our site.

In this Cognitive Business interview, we get Baidu AI Lab's outlook into 2017 and beyond through the lens of Adam Coates, Director at Baidu Silicon Valley AI Lab.


Source: Baidu Research

Before we dive into the interview, here is a little background on the accomplished Adam Coates and his team. Coates holds a PhD from Stanford, where his graduate work investigated deep learning methods and led to large-scale deep learning using distributed GPU clusters. At Baidu, his team uses large-scale deep learning technology to train networks with billions of connections for state-of-the-art speech systems. In short, the team's goal is to make devices as easy to interact with as a person.

Ready for our Q&A? You’re in for a treat.

What’s the upside enterprises are seeing when they embrace AI, today?

AI is helping enterprises become more efficient and also serve consumers with better product experiences. For example, Baidu uses machine learning to forecast things like food delivery times for Baidu Takeout Delivery, or the failure of hard disks in data centers. And new consumer apps like Baidu’s “Melody” assistant can have a conversation with a user to learn about medical symptoms they’re experiencing and help connect them with a doctor.

What can we expect enterprise AI adoption to look like in the next year?

I think we will see enterprises turning to cloud computing services to quickly bring in AI capabilities. It’s difficult to build your own team of machine learning experts, but platforms like Baidu’s Open Cloud make it possible for enterprises to leverage the AI strengths of companies like Baidu that have great AI teams.

In the enterprise, how will AI interactions be led by voice technology?

There are lots of places where voice-powered interfaces will be a big help. AI interfaces powered by much better speech recognition will make it easier for customers to get in touch and find solutions to problems. We all have some experience with this when we call a customer service line, but these interactions will become more natural and reliable than ever before.

It also turns out that with current speech technology it is much faster, and often more accurate, to talk than to type. This is going to make all kinds of note-taking and dictation more efficient, from transcribing meetings to filling out medical forms.

Can you tell me about the battle for AI talent among large enterprises?

Machine learning skills, and especially “deep learning” skills that are at the heart of recent progress in AI, are still relatively rare even though the number of applications we can see is enormous. Deep learning became popular very suddenly, and so the number of people with years of experience in the topic is tiny. Not surprisingly, there’s a battle to hire people that can build AI systems and make them work.

Even though there’s a lot of competition to hire experts, the deep learning research community has remained very open and I’m glad that we continue to see rapid progress.

What computing breakthroughs (hardware and techniques) will allow for never before possible AI technology?

Massively parallel hardware will keep getting better, and this will be a big boon to deep learning, which scales very well with parallel hardware. There are a lot of thrilling software techniques that were not possible even one to two years ago that could become mainstream thanks to these advances. For example, the AI Lab has pioneered techniques that store large chunks of a neural network directly on a chip to perform computation an order of magnitude faster. This only works on the latest, largest chips; in the future, we'll have even more options.

We will definitely see more hardware vendors releasing chip designs that are tailored for deep learning and AI. For example, deep learning applications can benefit from lower-precision arithmetic, and chipmakers are adding exactly these features. The AI Lab recently released the DeepBench benchmark specifically to help hardware designers understand what researchers are running and how they can improve performance.
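To make the lower-precision point concrete, here is a minimal sketch (not Baidu's implementation; the matrix sizes and NumPy-based setup are illustrative assumptions) showing why half precision is attractive: storing the same weights in 16 bits instead of 32 halves the memory footprint, while the numerical error for a typical matrix-vector product stays small.

```python
import numpy as np

# Hypothetical example: the same weight matrix stored at two precisions.
# Lower-precision storage halves memory traffic, which is often the
# bottleneck for deep learning workloads on modern chips.
rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal((1024, 1024)).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

# Half precision uses half the bytes for the same number of parameters.
assert weights_fp16.nbytes * 2 == weights_fp32.nbytes

# Compare a matrix-vector product computed from fp32 vs fp16-stored weights
# (here we up-cast back to fp32 for the arithmetic, isolating storage error).
x = rng.standard_normal(1024).astype(np.float32)
y32 = weights_fp32 @ x
y16 = weights_fp16.astype(np.float32) @ x

# Relative error of the result, measured in the 2-norm.
rel_err = np.linalg.norm(y32 - y16) / np.linalg.norm(y32)
print(f"relative error from fp16 storage: {rel_err:.2e}")
```

For many inference workloads this small loss of precision is an acceptable trade for the memory and bandwidth savings, which is why dedicated low-precision units have become a standard feature of AI-oriented chips.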

What can’t enterprise AI do today that it will be able to do in the next five to ten years?

Natural language is still a big challenge. I don’t think we will be able to have open-ended conversations with computers for a while, but there’s a good chance that more narrow conversations will become very natural over the next few years.

“Cognitive Business” is an interview series featuring awesome people in the Artificial Intelligence (AI) world, written by Lolita Taub for business people.