Connecting to an (Azure) database from within Databricks

“Connecting to an (Azure) database from within Databricks using revoscalepy by deploying Microsoft’s Machine Learning Server”. I am sure that I am not the first guy typing this line into a search engine, and I’m very sure that I won’t be the last either. The real power of Python resides in the flexibility of the language, the power of its data frames, and the ease with which you can play with data.

I made a very useful ELT program in Python and wanted it to run inside a Databricks cluster. Databricks on Azure fully supports Python 3, so I thought I was up for a walk in the park. Trying to import the database connection classes already gave a small hint of the troubles ahead.

Spinning up the cluster

The easy part was creating the workspace. Once you have created the Databricks environment, it’s easy to get started by clicking the Launch Workspace button:

Because I am creating my workspace for a rather simple purpose, I create a very simple cluster with limited resources.

For the purpose of this demo, an F4s will largely be sufficient, as it has 4 cores and 8 GB of RAM. Autoscaling will not be needed, and the cluster can easily destroy itself after 50 minutes of inactivity. (This is a demo anyway.)

Spinning up a simple cluster like this also has the advantage that it goes fast, and I don’t need to drink 5 cups of coffee before I get my resources (I only needed 1 <Evil Grin>)

Once the cluster is available, we can create a new notebook. To create the notebook, click on the Azure Databricks button.

Here you can select New Notebook:

Make sure you connect to your newly created cluster, and that the language is set to Python.

This opens your notebook, and now we are in our Python workspace.

It is here that I started to get some worries: almost everything that I could find about working with an Azure database involved JDBC, and somehow I couldn’t get it working as I wanted.
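For completeness, this is roughly what that JDBC route looks like. The server, database, table, and credentials below are placeholders I made up for illustration, and the actual read is left commented out because `spark` only exists inside a Databricks notebook:

```python
# Hypothetical Azure SQL connection details (replace with your own).
jdbc_hostname = "myserver.database.windows.net"
jdbc_port = 1433
jdbc_database = "mydb"

# Standard SQL Server JDBC URL format.
jdbc_url = (
    f"jdbc:sqlserver://{jdbc_hostname}:{jdbc_port};"
    f"database={jdbc_database};encrypt=true;loginTimeout=30;"
)

connection_properties = {
    "user": "myuser",          # placeholder
    "password": "mypassword",  # placeholder
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# Inside the notebook you would then read a table into a Spark DataFrame:
# df = spark.read.jdbc(url=jdbc_url, table="dbo.MyTable",
#                      properties=connection_properties)
```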

Importing the libraries

Importing the base libraries works like a charm: they load as we expect and work just great.

But once I started to load my favorite SQL libraries, revoscalepy for example, they simply wouldn’t load.

My first (and obvious) guess was to look for instructions on how to load revoscalepy on Databricks.

For reasons unbeknownst to me, these instructions don’t exist, neither on the Databricks pages, nor on the revoscalepy pages.

Ok bummer…

But this never stopped me before, so why would it stop me now…

Gathering information

The first thing that we can now (ab)use is knowing what platform Databricks is built on. This is well documented on docs.databricks.com.

Because we created a Databricks 5.0 environment, we now know that we are essentially running with the following important actors:

  • Pip 18.0
  • Apache Spark 2.4.0

But most important of all is this piece of information:

OK great, we are running on Ubuntu 16.04. Time to dust off those Linux shell skills (I always knew they would come in handy sometime), and there is an installation manual for Ubuntu and revoscalepy. Yay!
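You can double-check this from the notebook itself by reading /etc/os-release, the standard Linux file describing the distribution. The helper below is a small sketch of that check; the sample string stands in for what Ubuntu 16.04 actually reports:

```python
def parse_os_release(text):
    """Parse /etc/os-release style KEY=value lines into a dict."""
    info = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, _, value = line.partition("=")
        info[key] = value.strip('"')
    return info

# On the cluster you would feed it the real file; on Databricks 5.0
# the VERSION_ID should come back as "16.04":
# with open("/etc/os-release") as f:
#     print(parse_os_release(f.read())["VERSION_ID"])

sample = 'NAME="Ubuntu"\nVERSION_ID="16.04"\nUBUNTU_CODENAME=xenial\n'
print(parse_os_release(sample)["VERSION_ID"])
```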

Altering the Linux Environment

Getting into shell mode

The easiest way to test your shell scripts before turning them into an initialization script is to run them from inside your notebook.
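Once the commands work interactively, turning them into a cluster init script is mostly a matter of writing them to DBFS. The sketch below only assembles the script as a string; the commands listed are illustrative, and the dbutils.fs.put call and its path are assumptions, left commented out because dbutils only exists inside Databricks:

```python
# Assemble an init script from the commands tested in the notebook.
init_script = "\n".join([
    "#!/bin/bash",
    "set -e",
    "sudo apt-get update",
    "sudo apt-get install -y apt-transport-https",
])

# Inside a Databricks notebook you would then persist it to DBFS, e.g.:
# dbutils.fs.put("dbfs:/databricks/init/mlserver-install.sh",
#                init_script, overwrite=True)
print(init_script)
```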

Adding the %sh command puts your notebook in shell mode; it now acts as a Linux shell.

OK, now let’s get cooking. Ubuntu’s default package manager is apt, so let’s start.

Update the package manager

Let us first update the package manager and its repositories.

Woot, this works!

Now let us try Microsoft’s installation manual. I just made some minor adjustments so that it could run in this shell environment:

%sh
# Optionally, if your system does not have the https apt transport option
sudo apt-get install apt-transport-https
# Add the azure-cli repo to your apt sources list
sudo AZ_REPO=$(lsb_release -cs)
sudo echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $AZ_REPO main" | sudo tee /etc/apt/sources.list.d/azure-cli.list
# Set the location of the package repo; the "prod" directory contains the distribution.
# This example specifies 16.04. Replace with 14.04 if you want that version.
sudo wget https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb
# Register the repo
sudo dpkg -i packages-microsoft-prod.deb
# Verify whether the "microsoft-prod.list" configuration file exists
sudo ls -la /etc/apt/sources.list.d/
# Add the Microsoft public signing key for Secure APT
sudo apt-key adv --keyserver packages.microsoft.com --recv-keys 52E16F86FEE04B979B07E28DB02C46DF417A0893
# Update packages on your system
sudo apt-get update

 

Installing the package:

%sh
# Install the server
sudo apt-get update
sudo apt-get -y install microsoft-mlserver-all-9.3.0

 

And then… Bummer

Running the following command shows us that the file has been created, though:

Changing Microsoft’s script to:

sudo echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ xenial main" | sudo tee /etc/apt/sources.list.d/azure-cli.list

This fixes that issue, allowing you to rerun apt-get update.

Installing Revoscalepy

Be aware that this is a rather large install (almost 10 GB) and will take several minutes to finish, but after approximately 10 minutes we got the following message:

Finishing the install

The complete script:

%sh
# Updating the environment
sudo apt-get update
sudo apt-get -y upgrade
# Optionally, if your system does not have the https apt transport option
sudo apt-get install apt-transport-https
# Add the azure-cli repo to your apt sources list
sudo echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ xenial main" | sudo tee /etc/apt/sources.list.d/azure-cli.list
# Set the location of the package repo; the "prod" directory contains the distribution.
sudo wget https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb
# Register the repo
sudo dpkg -i packages-microsoft-prod.deb
# Verify whether the "microsoft-prod.list" configuration file exists
sudo ls -la /etc/apt/sources.list.d/
# Add the Microsoft public signing key for Secure APT
sudo apt-key adv --keyserver packages.microsoft.com --recv-keys 52E16F86FEE04B979B07E28DB02C46DF417A0893
# Update packages on your system
sudo apt-get update
# Install the server
sudo apt-get -y install microsoft-mlserver-all-9.3.0
# Activate the server
sudo /opt/microsoft/mlserver/9.3.0/bin/R/activate.sh
# List installed packages as a verification step
sudo apt list --installed | grep microsoft
# Choose a package name and obtain verbose version information
sudo dpkg --status microsoft-mlserver-packages-r-9.3.0

 

Done!
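With the install in place, a quick smoke test is to import revoscalepy and pull a few rows from the Azure database. The server, database, credentials, and table below are placeholders of my own; the revoscalepy calls are left commented out because they only succeed on a cluster where Machine Learning Server has actually been installed as above:

```python
# Hypothetical ODBC-style connection string for Azure SQL Database.
connection_string = (
    "Driver=SQL Server;"
    "Server=myserver.database.windows.net;"
    "Database=mydb;"
    "Uid=myuser;Pwd=mypassword"
)

query = "SELECT TOP 10 * FROM dbo.MyTable"  # placeholder table

# On the cluster, after the install above, this should now work:
# from revoscalepy import RxSqlServerData, rx_import
# data_source = RxSqlServerData(sql_query=query,
#                               connection_string=connection_string)
# frame = rx_import(input_data=data_source)
# print(frame.head())
```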
