In this Cognitive Business interview, we explore SAS’s position and outlook on enterprise artificial intelligence with Oliver Schabenberger, SAS Executive Vice President and Chief Technology Officer.
Oliver Schabenberger, SAS Executive VP and CTO | Source: SAS
You should know that Schabenberger explains things thoroughly, though in a manner that’s easy to understand and apply. This may be a side effect of being “a recovering academician,” as Schabenberger puts it himself.
IDC named SAS “the leader in the advanced and predictive analytics market” in its Worldwide Business Intelligence and Analytics Tools Software Market Shares report. What does it mean to be the largest global market shareholder of advanced analytics, and why is this important for the enterprise?
Advanced analytics innovation has been at the heart of what SAS has done for 40 years. The past four decades saw a lot of innovation in analytics, and technology and methodology have greatly matured. But the field is not standing still; it is developing and evolving. Modern machine learning, deep learning and cognitive analytics are ushering in a new period of advanced analytics where the data take center stage. Edge analytics and streaming data analytics are challenging us to move analytic capabilities inside networks and inside devices.
As the largest global market shareholder of advanced analytics, we cater to all segments of the market. We help large customers and small ones. We solve the biggest and most complex of analytical challenges and the simple, straightforward ones.
Businesses of all sizes are moving toward data-driven decisions, and analytics are key. Today, enterprises have much greater data awareness and data literacy. But data without analytics is value not yet realized.
We generally distinguish descriptive analytics from predictive analytics. The former uses inward-looking methods based on historical data to answer the question “what has happened.” The latter uses forward-looking prediction methods to address “what might happen” and “when might it happen.” Enterprises that start their analytics journey with descriptive insights based on historical data move toward more forward-looking predictive methods to help guide the business. Advanced analytics empowers predictive methods and the optimization of business processes.
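The distinction can be made concrete with a minimal sketch. The sales figures below are hypothetical, and the least-squares trend line is only one simple stand-in for the far richer predictive methods discussed here: summarizing history is descriptive; extrapolating forward is predictive.

```python
# Hypothetical monthly sales figures (illustrative data, not from SAS)
sales = [100, 104, 110, 115, 121, 128]

# Descriptive analytics: "what has happened" -- summarize the history
mean_sales = sum(sales) / len(sales)

# Predictive analytics: "what might happen" -- fit a least-squares
# trend line to the history and extrapolate one period ahead
n = len(sales)
xs = range(n)
x_bar = sum(xs) / n
y_bar = mean_sales
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, sales))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar
forecast = intercept + slope * n  # next period's projected value

print(round(mean_sales, 1), round(forecast, 1))  # → 113.0 132.6
```

In practice the predictive step would use richer models and far more data, but the shift in question, from “what was the average” to “what comes next,” is the same.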
My mission is to provide analytic solutions for the continuum of data challenges our customers encounter and to help guide customers along their analytic journey. For example, cloud computing has democratized access to compute resources, but, in my opinion, has only begun to democratize analytic insight. The “advanced” in advanced analytics should not mean “advanced programming skills required.” If everyone has access to data and access to compute resources, then we need to enable ubiquity of data with ambient advanced analytics.
To lead the market in advanced analytics requires more than crafty implementations of complex algorithms that can solve difficult optimization problems at enterprise scale. You clearly need that. You also need to understand and support the analytic life cycle in the enterprise: the flow from data exploration to model development and to deployment and model management. For example, a model developed to determine whether a debit card transaction was fraudulent is not static. It needs to be monitored and adjusted as conditions change.
What’s the story behind SAS’s expansion from machine learning to deep learning and cognitive computing offerings and how will this help the enterprise customer?
SAS has offered its customers advanced modeling and machine learning capabilities for decades, through our programming language and through our solutions. We also have strong offerings in analytics for unstructured data, specifically text mining and contextual analytics. Capabilities such as content categorization, entity extraction, sentiment analysis and question-answer systems are not new to SAS.
Machine learning is a data-driven approach for classification, prediction and pattern recognition. These approaches have gained ground on traditional modeling techniques in recent years, aided by access to more data, increasing interest in predictive analytics, and the rise of data science as a discipline. SAS has covered this space for decades.
The combination of big data and big compute has helped propel modern machine learning approaches based on deep neural networks, also known as deep learning. The excitement and fascination of deep learning lie in using the data in a different way. Data continues to drive the approach, but rather than fitting a prescribed model, the algorithm uses the data to acquire a skill. Rather than training a decision tree on a set of data so that the tree can classify the data well, deep neural networks learn to classify by abstracting the regularities in the data in layers.
The expansion for SAS with respect to deep learning and cognitive analytics has several facets: (1) To add deep learning based analytic approaches to our classical approaches in order to provide choices, to support greater automation and problem solving with less domain knowledge. (2) To provide deep learning tools to our customers that enable them to train modern machine learning models on their data. (3) To embed cognitive computing and deep learning into SAS products, in support of a more human-like interaction between user and software, and in support of greater personalization and more dynamic behavior.
Enterprise customers are looking towards artificial intelligence, deep learning, and cognitive computing because these techniques promise accuracy and automation, without the prerequisite of deep domain knowledge. The keys to building accurate learning systems are large volumes of quality data, compute power, and tools. We are providing the tools and analytic infrastructure to derive insight, and this insight will increasingly be the result of AI techniques applied to enterprise customer data.
Enterprise customers also need to drive analytics and data-driven decisions through their organizations. Analytics are increasingly requested, interpreted and consumed by those not trained in data science and statistics. Visualization was one driver toward self-service analytics. Cognitive technologies are taking it to the next level, a natural interaction between user and software: request analytics in natural language (keyed or spoken), retrieve the analytic insight at the appropriate level of complexity depending on the user and with situational awareness.
We are committed to making natural human interactions with SAS software a reality.
Can you tell me about the newest high-speed in-memory analytics platform you’re working on?
Our newest platform is SAS® Viya™, our third in-memory, scalable, high-performance architecture. But SAS Viya is more than an architecture; it is truly a platform that allows customers to build the analytic enterprise. It is a paradigm shift for SAS.
We built SAS Viya the same way we approach all our software solutions – innovating to meet our customers’ needs and to respond to market forces. Our customers asked for speed and scale, for flexibility, resilience, and elasticity. And they asked for a service-oriented, open and accessible platform that is easy to deploy, manage, update and monitor. At the same time, we saw an opportunity to tackle some technical debt and to simplify to a common SAS platform for data and analytics.
We boiled this down to four pillars or business requirements for SAS Viya: the need for an open, unified, powerful platform that is built for the cloud.
Our journey into high-performance analytics began 40 years ago. SAS has always handled big data problems and provided the best, highest-performing algorithms. In 2009, the pace accelerated with the move to distributed computing on multi-core commodity hardware. In the years that followed, we learned how to move advanced analytics and data management to environments that scale from single computers to many machines. SAS Viya is the culmination of that effort, and we are very proud of it.
What challenges do you see for enterprise AI?
The artificial intelligence (AI) applications we are discussing these days belong to the class of weak AI, where we try to solve a specific, narrowly defined human task. By contrast, “strong AI” aims to develop machines that can think at the level of a human, or above.
Current advances in weak AI are based on applying deep neural network algorithms that learn a skill such as predicting an outcome, classifying an object, or recognizing a pattern. That is accomplished almost entirely in a supervised fashion, where labeled data trains an appropriately constructed neural network. For example, the network learns to identify objects in images because it was passed a large number of images on which objects were already identified (labeled).
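A toy sketch makes the supervised setup tangible. A single-neuron perceptron (a drastic simplification of the deep networks described above) learns the logical AND function purely from labeled examples; the labels supply the supervision signal that adjusts the weights:

```python
# Supervised learning in miniature: labeled data "programs" the model.
# A single-neuron perceptron learns logical AND from labeled examples.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # weights, adjusted by the supervision signal
b = 0.0         # bias
lr = 0.1        # learning rate

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                  # a few passes over the labeled data
    for x, label in examples:
        error = label - predict(x)   # label minus prediction: supervision
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in examples])  # → [0, 0, 0, 1]
```

Real deep networks have millions of weights and many layers, and are trained by gradient descent rather than the perceptron rule, but the principle is identical: labeled data, not hand-written rules, determines the behavior.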
Once we understand how these systems are built, some of the challenges for enterprise AI are apparent.
Data quality is paramount because data does the programming. To reach acceptable levels of accuracy on new observations, you need large amounts of training data. The systems do not learn beyond the regularities of the training data, however. They cannot generalize to conditions and situations beyond those the system was exposed to during training. A system trained to perform automated trading might behave poorly when market conditions change in a way not reflected in the training data. The systems do not learn by automatically adapting to new situations; the student is only as good as its teacher.
Many enterprises are changing from a product-centric model to a customer-centric model. AI systems can help to better understand your customer, and to automate interactions and recommendations. If those decisions are made by systems that learned from data alone, then you need a holistic view of the customer. Many enterprises still find their data siloed in different parts of the organization.
The second important piece in developing a deep learning AI system is the deep neural network itself. It consists of connected layers that abstract the derivation of the output (the sentiment of a customer, for example) from the input (which might be an image of the customer or a chat transcript). While it is well understood which type of network applies to which type of input data – convolutional networks for images, recurrent networks for text and time series, for example – designing and tuning the network remains difficult. A neural network that works well for one set of data might perform poorly on a different set. The battle for talent in data science has just intensified; the latest skirmish is about AI programmers.
AI systems based on deep learning are black boxes and difficult to explain. You cannot describe how the predictions will change when you change the input in a certain way. That will affect their acceptance in highly regulated industries and where transparency of decisions is key. You might not have to justify how email is summarized, but you might have to explain why a loan application was denied. “Well, it works” is not necessarily a satisfactory answer.
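One common workaround for this opacity is perturbation: vary one input at a time and observe how the output moves. The sketch below uses a hypothetical opaque scoring function as a stand-in for a trained deep network; the feature names and coefficients are invented for illustration only:

```python
# Probing a black-box model by perturbation: we cannot inspect the
# internals, but we can vary one input at a time and watch the output.
def black_box_score(income, debt_ratio, age):
    # Stand-in for an opaque trained model (hypothetical logistic score)
    z = 0.00005 * income - 0.3 * debt_ratio + 0.01 * age - 1
    return 1 / (1 + 2.718281828 ** -z)

base = black_box_score(50_000, 2.0, 40)

# Nudge each feature and record how the score responds
sensitivity = {
    "income": black_box_score(51_000, 2.0, 40) - base,
    "debt_ratio": black_box_score(50_000, 2.1, 40) - base,
    "age": black_box_score(50_000, 2.0, 41) - base,
}
for name, delta in sensitivity.items():
    print(f"{name}: {delta:+.4f}")
```

Such local probing gives a partial, input-specific explanation (“raising income raises the score; raising the debt ratio lowers it”), which may satisfy a regulator for one decision but is far from a full description of the model, which is exactly the gap Schabenberger points to.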
What insider advice do you have for enterprises looking to use AI?
Do not buy into the hype that the robots and algorithms are coming for the jobs. Yes, artificial intelligence is on the rise, and it will affect every industry. But this effect will be mostly in knowledge augmentation, not knowledge replacement. AI and machine learning are very good at performing narrowly defined tasks. These can be highly complex tasks like driving a vehicle, playing Go, or translating text from one language into another. But the algorithms lack creativity, innovation and common sense.
If you are interested in AI applications in your enterprise, look for high-frequency, repetitive tasks that do not require context understanding, creativity or judgment. These are the tasks where knowledge augmentation through AI is useful. So rather than replace the customer service representative with a bot, maybe you want to use AI to augment the representative’s interactions with the customer, making them more valuable.
You need to create boundaries around AI algorithms to make sure they do not step out of line – they can learn bad things, and they can behave poorly in unfamiliar situations. You need to rethink your framework for testing and validation of data-driven processes.
It is unlikely that AI will replace the processes and business rules you have built. Do not throw them away. Enterprises that bring AI successfully into their processes will understand how to combine AI with technologies and approaches that are built on traditional models. The ensemble might be the winning combination.
Look for areas where AI and cognitive computing can be applied to business problems. Ask yourself, “Where do I have a lot of data? Where could we benefit from more automated decisions? Where do I need more personalized interactions and fewer business rules?” Look for activities and systems that can be automated and simplified using data, and remember that the biggest opportunities might come from assisting your employees.
Oliver Schabenberger is Executive Vice President and Chief Technology Officer at SAS. He spent six years teaching statistics at Michigan State and Virginia Tech. Schabenberger earned a Master of Science in Statistics and a PhD in Forestry. He is the principal architect of SAS’s newest high-speed, in-memory analytics platform. Schabenberger is also a musician, a wine lover and a Formula 1 fan.
“Cognitive Business” is an interview series featuring awesome people in the Artificial Intelligence (AI) world. Written by Lolita Taub and written for business people.