In anticipation of his upcoming conference presentation, "LSTM Neural Networks for Time Series Analysis," at Deep Learning World in Las Vegas, June 3-7, 2018, we asked James McCaffrey, Senior Scientist Engineer at Microsoft, a few questions about his work in deep learning. James will also co-instruct the pre-conference workshop "Deep Learning in Practice: A Hands-On Introduction."
Q: In your work with deep learning, what do you model (i.e., what is the dependent variable, the behavior or outcome your models predict)?
A: In addition to traditional research efforts, the Microsoft Research Deep Learning Group works directly with many of the groups at Microsoft that create products and services. Examples of problems tackled successfully include anomaly and fraud detection, image analysis, conversational systems, knowledge representation, recommendation systems, bandit problems, and many other areas.
Q: How does deep learning deliver value at your organization – what is one specific way in which model outputs actively drive decisions or operations?
A: Over the past 14 months, deep learning systems have been successfully integrated into dozens of products and services. These systems have generated increased revenue (improved recommendation systems, legal document analysis, etc.) and have reduced costs (fraudulent account identification, infrastructure energy optimization, etc.).
Q: Can you describe a quantitative result, such as the performance of your model or the ROI of the model deployment initiative?
A: The success of deep learning systems for practical applications has exceeded our most optimistic expectations. A recent analysis of 10 projects showed a direct, immediate impact of $900 million. These 10 projects, each mentored by two Microsoft Research experts, were accomplished within 12 weeks of intensive effort by teams of three to six people.
Q: What surprising discovery or insight have you unearthed in your data?
A: There have been many surprises, but perhaps the biggest has been the number of problem scenarios that are "low-hanging fruit," as the saying goes. Time and time again, we found that when product and service groups were educated about deep learning techniques and tools, ideas emerged quickly. For example, one group had been using a standard but primitive logistic regression model for a type of anomaly detection. When educated about deep neural models, they were able to quickly improve both precision and recall by over 25%, resulting in an estimated cost impact of over $10 million.
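The gap between logistic regression and even a small neural model comes down to decision boundaries: logistic regression draws a single linear boundary, while one hidden layer can carve out nonlinear regions. The sketch below (a toy illustration, not the Microsoft system described above) shows a hand-weighted two-unit network solving the XOR pattern, which no single linear boundary can separate:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic(x1, x2, w1, w2, bias):
    """Plain logistic regression: one linear boundary passed through a sigmoid."""
    return sigmoid(w1 * x1 + w2 * x2 + bias)

def tiny_net(x1, x2):
    """One hidden layer with hand-set weights computing XOR, a pattern
    that logistic regression cannot represent with any choice of weights."""
    h1 = sigmoid(10 * x1 + 10 * x2 - 5)    # hidden unit 1: fires when x1 OR x2
    h2 = sigmoid(10 * x1 + 10 * x2 - 15)   # hidden unit 2: fires when x1 AND x2
    return sigmoid(10 * h1 - 10 * h2 - 5)  # OR but not AND = XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(tiny_net(a, b)))  # prints 0, 1, 1, 0 in turn
```

In practice the hidden weights are learned from data rather than set by hand, but the representational point is the same: adding nonlinear hidden units is what lets a deep model recover anomalies that a linear model misses.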
Q: What excites you most about the field of deep learning today?
A: Important breakthroughs in the architecture-algorithm-hardware triad are being made every few months. These breakthroughs have pushed deep learning from the realm of research to the realm of applicability. I’m especially excited about recent advances in deep neural architectures (LSTMs), algorithms (Adam optimization), and hardware (tensor processing units).
Q: Sneak preview: Please tell us a take-away that you will provide during your talk at Deep Learning World.
A: My presentation will describe techniques to analyze time series regression problems using LSTM recurrent neural networks. Attendees will learn exactly how LSTM cells work, and may be inspired to find ways to use LSTMs in their domain of interest.
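The mechanics behind an LSTM cell can be sketched compactly. Each time step combines the current input with the previous hidden state through four gates: forget, input, and output gates (sigmoids) plus a tanh candidate, which together update a persistent cell state. The code below is a minimal, scalar-state illustration with toy weights (not from the talk itself); real LSTMs use weight matrices over vector states:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_cell(x, h_prev, c_prev, W, b):
    """One LSTM time step for a scalar input and scalar state (illustrative).

    W maps each gate name to (weight on h_prev, weight on x); b holds biases.
    """
    f = sigmoid(W["f"][0] * h_prev + W["f"][1] * x + b["f"])    # forget gate
    i = sigmoid(W["i"][0] * h_prev + W["i"][1] * x + b["i"])    # input gate
    o = sigmoid(W["o"][0] * h_prev + W["o"][1] * x + b["o"])    # output gate
    g = math.tanh(W["g"][0] * h_prev + W["g"][1] * x + b["g"])  # candidate cell value
    c = f * c_prev + i * g   # new cell state: keep part of the old, admit part of the new
    h = o * math.tanh(c)     # new hidden state: the cell's output at this step
    return h, c

# Feed a short series through the cell with fixed toy weights.
W = {k: (0.5, 0.5) for k in "fiog"}
b = {k: 0.0 for k in "fiog"}
h, c = 0.0, 0.0
for x in [0.1, 0.5, -0.3, 0.8]:
    h, c = lstm_cell(x, h, c, W, b)
```

The cell state `c` is the mechanism that lets the network carry information across many time steps, which is why LSTMs suit time series regression: `h` at each step can feed a final linear layer that predicts the next value in the series.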
Don’t miss James’ conference presentation, "LSTM Neural Networks for Time Series Analysis," on Tuesday, June 5, 2018 from 3:55 to 4:40 pm at Deep Learning World in Las Vegas, June 3-7, 2018. Click here to register to attend. Use Code PATIMES for 15% off current prices (excludes workshops).
By: Luba Gloukhova, Founding Chair, Deep Learning World
Luba Gloukhova facilitates and accelerates advanced research projects at a major Silicon Valley R&D hub. She supports Stanford GSB faculty by conceiving and generating innovative solutions that drive their cutting-edge research. Luba also serves as the founding chair of Deep Learning World, the premier conference covering the commercial deployment of deep learning.