The Databricks lifecycle and how it impacts your maintenance


Since its announcement as a first-party service on Microsoft Azure at the end of 2017, Databricks has seen remarkable growth in usage. However, the service and its success were around long before Microsoft came into play. Given that you are reading this blog at this very moment, I’ll assume that you have used Databricks before, or at least have heard that Databricks is in the data business. And cousin, business is a-boomin’! One aspect that may have left users of the service frazzled is the steady stream of updates it receives. This raises two questions: why are there updates so often? And what is the impact on the maintenance of your dearest Databricks projects?

Releases out the wazoo

Databricks was first released in limited availability in November 2014. With minor updates happening more or less monthly and major releases every 6 to 12 months, we’re now on version 8.2. So let’s first take a look at what these minor and major updates generally change.

Databricks is a web-based platform built on top of the Apache Spark framework. In fact, the company was founded by the original creators of Apache Spark. This means that the updates not only incorporate changes to the Databricks platform itself, they also upgrade the Spark engine it is built on. This is reflected in the type of release: a minor release contains only updates to Databricks, while a major release also incorporates updates to Spark.

Why so many releases?

So why are there so many updates? While Databricks itself isn’t open source, many of the systems it uses are – most notably the Spark engine. Given the engine’s popularity, this makes for a very dynamic development process, resulting in a lot of updates. And as mentioned above, changes to Spark end up triggering a major release for Databricks too.

Furthermore, old habits die hard. While the Databricks platform isn’t open source, the developers have kept their habit of being deeply involved in their online communities and incorporating the feedback they get there. Coupled with the fact that Databricks is used in several rapidly developing fields, such as data science and machine learning, this makes for an explosive mix that triggers frequent minor and major releases.

The Databricks release lifecycle

Let’s quickly check in on the exact Databricks Runtime version lifecycle, since it impacts the maintenance of your Databricks projects.

Releases start in beta, which has three modes:

  • Private preview: this is invite-only and not necessarily very stable. The features are not publicly documented.
  • Public preview: publicly available and relatively stable, with documented features.
  • Limited availability: a rarely used mix of the two above: invite-only but relatively stable, with features not publicly documented.

After beta there is a full support release:

  • Major stability and security fixes are backported to active full-support releases.
  • Most releases have full support for 6 months. The exception is Long Term Support (LTS) versions, which have full support for 2 years. The LTS versions so far:
    • 3.5, released Dec 21, 2017
    • 5.5, released Jul 10, 2019
    • 7.3, released Sep 24, 2020

The next phase is End of Support (EOS):

  • There is no support from the Databricks team for platforms running these versions.
  • Fixes are not backported to these.

Finally, there is End of Life (EOL):

  • A release version that is EOS can be removed from the API at any time after support ends, without prior notice. It then becomes End of Life.

Two years might not be what a lot of us envision when we hear the phrase ‘Long Term Support’. Not only are the timeframes shorter compared to conventional platforms such as SQL Server, which keep churning away even after support ends, but your Databricks project on an abandoned version can also fully stop working at any time. All the more reason to keep close tabs.
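
As a quick illustration, here is a minimal sketch that asks the REST API which runtime versions your workspace still serves. It assumes a workspace URL and a personal access token stored in DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; adjust to your own setup.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. "https://adb-....azuredatabricks.net" (assumed setup)
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# The Clusters API exposes every runtime version the workspace currently offers.
resp = requests.get(f"{host}/api/2.0/clusters/spark-versions", headers=headers)
resp.raise_for_status()

for version in sorted(v["key"] for v in resp.json()["versions"]):
    print(version)    # e.g. "7.3.x-scala2.12"
```

If the runtime one of your clusters is pinned to no longer shows up in that list, you are already past End of Life.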

Impact on the timing of maintenance

So finally, we arrive at the big question: how does all of this impact us when it comes to maintaining our flourishing Databricks projects? The answer largely depends on the role the Databricks platform plays for you. If you are a data scientist, chances are you are very actively working and reworking your notebooks on a day-to-day basis. If, on the other hand, you are a data engineer, like most of us at Kohera, you might be looking more into building a reliable ELT platform.

In the first case, updates can quickly be incorporated into your work and it’s worth keeping close tabs on the newest release. In the latter case, incremental updates are welcome, but realistically it’s more beneficial to limit the amount of time spent on maintenance.

When to check for updates as a data engineer?

So as a data engineer, it’s best to check in and update your clusters at least every 3 to 6 months. If you are particularly enthusiastic, you can of course always stay up to date. However, setting a hard limit of 6 months, with a bit of leeway built in beforehand, keeps you safe from your environments suddenly grinding to a halt. Additionally, we would suggest sticking mostly with full-support releases. If you do opt for beta releases as a data engineer, it is advisable to give these releases at least a few months to mature before jumping in.
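
To make that periodic check routine, here is a rough sketch that lists every cluster in the workspace together with the runtime it is pinned to, flagging anything outside a hand-maintained approved set. Same assumed environment variables as above; the approved runtime keys are examples, not a recommendation.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Runtimes we currently consider acceptable -- update this set by hand
# after each maintenance round. The keys below are examples only.
APPROVED_RUNTIMES = {"7.3.x-scala2.12", "8.2.x-scala2.12"}

# List every cluster in the workspace and flag the ones on an unapproved runtime.
resp = requests.get(f"{host}/api/2.0/clusters/list", headers=headers)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    runtime = cluster["spark_version"]
    marker = "OK" if runtime in APPROVED_RUNTIMES else "REVIEW"
    print(f"[{marker}] {cluster['cluster_name']}: {runtime}")
```

Run on a schedule, a report like this is usually enough to catch a cluster drifting towards End of Support.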

How to do the maintenance?

“Alright, cool, good to know, but what does the actual maintenance entail then?”, I hear you say. Well, so far Databricks has shown the tendency to mostly add new features or improve existing ones. This is great news of course, as it allows us to upgrade our platforms while still being relatively sure that our notebooks will keep working. Still, when you do push an upgrade, it remains prudent to check the release notes and think about which changes might impact your notebooks. The easiest way to check, though, is to upgrade your dev platform first and run some sample notebooks covering different scenarios. The proof is in the pudding, as they say. I haven’t really encountered breaking code on minor releases so far – knock on wood – but I have on major releases, especially when a Spark upgrade was incorporated. Keep an eye out for those.
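
If you script that dev upgrade, the Clusters API can do the version bump for you. A rough sketch, under the same assumptions as before; the cluster ID and target runtime key are placeholders, and an autoscaling cluster would need its autoscale block instead of num_workers.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

DEV_CLUSTER_ID = "0123-456789-abcdef"   # placeholder: your dev cluster's ID
TARGET_RUNTIME = "8.2.x-scala2.12"      # placeholder: the runtime you want to test

# Fetch the current cluster definition...
spec = requests.get(
    f"{host}/api/2.0/clusters/get",
    headers=headers,
    params={"cluster_id": DEV_CLUSTER_ID},
).json()

# ...and resubmit only the fields the edit call needs, with the new runtime.
# Note: a cluster using autoscale needs spec["autoscale"] here instead of num_workers.
edit = {
    "cluster_id": DEV_CLUSTER_ID,
    "cluster_name": spec["cluster_name"],
    "node_type_id": spec["node_type_id"],
    "num_workers": spec.get("num_workers", 1),
    "spark_version": TARGET_RUNTIME,
}
requests.post(f"{host}/api/2.0/clusters/edit", headers=headers, json=edit).raise_for_status()
```

The cluster restarts on the new runtime; once it is up, run your sample notebooks against it before touching production.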

Keep an eye out for third-party libraries

Important to note: pay extra attention if you use third-party libraries. These can add a lot of functionality to your clusters, but the downside is that they add a potential weak spot to your notebooks whenever Spark receives an upgrade: the library might not work with the newer version of Spark.

This can usually be resolved, of course, as long as the library is still supported and under development. However, it does put you under time pressure and makes you reliant on the third party providing updates. I’ve had good experiences with this in the past – shoutout to Cobrix for very quick responses to fellow Koherians in such cases – but it’s best to always remain vigilant regarding this.
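
One way to stay vigilant is to pin exact library versions on your clusters, so a runtime upgrade never silently pulls in a different library build. Here is a sketch using the Libraries API under the same assumptions as before; the Maven coordinates and version below are illustrative, so check the library’s own documentation for the real ones.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

CLUSTER_ID = "0123-456789-abcdef"   # placeholder: the cluster to install on

# Pin an exact version so runtime upgrades never change the library implicitly.
# Coordinates and version are illustrative -- verify against the library's docs.
libraries = [
    {"maven": {"coordinates": "za.co.absa.cobrix:spark-cobol_2.12:2.6.0"}},
]

requests.post(
    f"{host}/api/2.0/libraries/install",
    headers=headers,
    json={"cluster_id": CLUSTER_ID, "libraries": libraries},
).raise_for_status()
```

Pinned versions don’t remove the risk of incompatibility, but they make the upgrade moment explicit and testable on your dev cluster first.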

Our guide to the Databricks resources treasure

Finally, you may be thinking: “Alright, yeah, most interesting blog I’ve ever read in my entire life. If only they’d also tell me where to look for release notes, more interesting blogs regarding big changes, or maybe something in the genre of a lively vlog on the topic.” I get your need for a guide to the Databricks resources treasure! So I have provided links to other Kohera blogs regarding Databricks, as well as documentation links, below. Also, I would be remiss not to mention Simon Whiteley’s videos.

Kohera blogs

Other Databricks resources
