SQL for Deep Learning (SQL-DL)

Introduction

Big things are coming from Teradata in deep learning. This post is the first in a series on our AI engineering efforts. On the solution side, I would urge you to watch our talk with Danske Bank. We will also be presenting more at the O'Reilly Artificial Intelligence Conference and at Teradata PARTNERS 2017. In the meantime, I would like to introduce you to SQL-DL.

What is it?

SQL-DL is a way to bridge the gap between massive (petabyte+) data and deep learning models. Teradata's Unified Data Architecture is the perfect place for both data storage and analytics: transactional/relational data lives in the Enterprise Data Warehouse, while real-time streaming data can arrive through Listener and land in Hadoop. It is awkward (and a waste of time) to move all of the data spread across these technologies into one place before starting a deep learning pipeline, whether for training or inference. SQL-DL makes it possible to combine the data stored in these different technologies directly, as in the sketch below.
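
For illustration, here is a minimal sketch of a query that mixes the two sources, assuming a QueryGrid-style link to Hadoop named hadoop_server and hypothetical table and column names; the exact syntax depends on your UDA configuration:

    -- Join relational transactions in the EDW with streaming
    -- events landed in Hadoop, reached through a QueryGrid link.
    SELECT t.txn_id,
           t.amount,
           e.device_id
    FROM   cc_data t                          -- table in the Enterprise Data Warehouse
    JOIN   streaming_events@hadoop_server e   -- Hive table behind the QueryGrid link
      ON   t.txn_id = e.txn_id;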

Demo

Imagine a bank whose customer data (credit card transactions, demographics, accounts, and so on) is all stored in a relational database. This bank is concerned about transaction fraud and wants to predict the likelihood of fraud in real-time transactions. They develop a deep learning model in Keras based on the petabytes of data they already have stored in their Enterprise Data Warehouse and want to score incoming data. With a few lines of code, that's possible:
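
A minimal sketch of such a call follows; the endpoint, model name, and column names are placeholders, and the exact table-operator syntax may differ by release:

    -- Score each row of cc_data against a deployed Keras model.
    SELECT *
    FROM   deep_learning_scorer (
             ON cc_data
             URL ('http://model-server:8000/api')   -- endpoint of the API server
             ModelName ('cc_fraud')                 -- which deployed model to use
             ModelVersion ('1')                     -- which version of that model
             RequestType ('predict')                -- inference, not training
             Columns ('amount', 'merchant_id', 'txn_hour')  -- features fed to the model
           ) AS scored;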

In this call, the deep_learning_scorer function analyzes the data in the cc_data table. The function takes the URL endpoint of our new API server, the name of the model we want to use to score the data, the version of the model, and the type of request. Finally, it takes a list of columns to feed into the model as features.

What we get back is the list of features along with a score from the model representing the likelihood of fraud:
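
With the placeholder columns above, the output has the same shape as the input plus a score column; the rows below are purely illustrative:

    txn_id   amount   merchant_id   txn_hour   score
    ------   ------   -----------   --------   -----
     10001    42.50          7731         14    0.02
     10002   980.00          4412          3    0.91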

This shows how easily we were able to score relational data using a deep learning model.

To Be Continued

Look for more posts as we continue to unveil our AI efforts at Teradata.

