Azure and DevOps have become all the rage in the information technology world. As companies push to build and deploy code faster, and as the need for scalability grows, the DevOps philosophy gets more attention as a path to accomplishing these goals. However, applying DevOps in a cloud environment means a shift in the way IT pros have been working over the years.
DevOps, as a concept, is easier to embrace if you look at it from a developer's side. Concepts such as frequent (partial) software releases, treating assets and resources as disposable, and continuous deployment will, however, make any DBA shiver with fear. In their dictionary, these terms are synonymous with bad queries and machines going berserk, resulting in potential data loss, instability, downtime and, invariably, bad performance, driving the DBA in question to the dark side of administration: sleepless nights combined with huge caffeine intakes. It's a database administrator's goal to maintain stability in their environments by means of rigorous resource management, fine-tuning instances and cautiously examining code to prevent the ultimate disaster… data loss.
This makes database deployment automation an extremely difficult process. Remember, though, that the goal should be to reduce the risk of new releases and achieve higher quality and fewer bugs; faster deployment should be considered a by-product, not the goal. It is also essential that application and database releases can be done independently. Of course there will be dependencies, but it should always be possible to make adjustments on either side when needed.
This gives the current DBA generation some serious challenges. DevOps teams are getting stuck in manual database delivery purgatory, because databases are a different beast altogether. DevOps for database deployment has taken a back seat, leaving database automation as a horror scenario (for both parties). The main reason is that most DBAs don't trust deployment automation the way code developers do, and most of the time they have very good reasons not to.
DBAs, more than any other Ops team member, know the value of, and the risks involved in, handling the data they are responsible for. DBAs are forged in the heat of the fire, fighting breakdowns and conflicts. This leads them to trust a deployment only if they can take matters into their own hands and ensure the release is done right. One of the main reasons is that any rollback scenario can quickly become a nightmare, requiring a lot of resources and time.
In my opinion, this needs to change. Otherwise, database delivery and technology might get stuck in the past, forcing developers to take refuge in more flexible technologies or to force their methods onto existing structures. They end up generating database structures themselves, leaving DBAs to burn even more midnight oil trying to fix queries from hell.
A fitting solution is to start seeing the database as an application in its own right, not as a component of the business application or, even worse, a simple data store. Like any other application, the database application uses APIs to communicate with other applications. These APIs form an abstraction layer between the data and the business layer, enabling better integration in a continuous deployment scenario. For DBAs this makes sense: they have known and advocated these APIs for years, they just call them something different. Where developers say APIs, DBAs speak of stored procedures, functions and views. Code developers could start to see these as a way for DBAs to participate actively as database developers, alongside the rest of the development team, which would facilitate the continuous deployment process. Ironically, this does mean that code developers would have to hand over part of their work to data developers, something that meets a lot of resistance. (Just google reasons not to use stored procedures, and you'll see what I mean.) It shouldn't be that way: data developers are part of the development team and should take their place in development cycles and participate in scrum meetings, not be treated as a second, asynchronous process fixing mayhem afterwards.
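To make the idea concrete, here is a minimal sketch of a database publishing its own API. The schema and names are hypothetical, and SQLite is used only because it ships with Python; in SQL Server or PostgreSQL the same role would typically be played by stored procedures and functions as well as views.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "Internal" schema, owned by the database developer / DBA.
conn.executescript("""
    CREATE TABLE customer_raw (
        id         INTEGER PRIMARY KEY,
        first_name TEXT,
        last_name  TEXT,
        active     INTEGER
    );
    -- The published API: a view. The DBA can refactor customer_raw
    -- (split it, rename columns, add indexes) without breaking callers,
    -- as long as the view keeps its contract.
    CREATE VIEW active_customers AS
        SELECT id, first_name || ' ' || last_name AS full_name
        FROM customer_raw
        WHERE active = 1;
""")
conn.execute("INSERT INTO customer_raw VALUES (1, 'Ada', 'Lovelace', 1)")
conn.execute("INSERT INTO customer_raw VALUES (2, 'Charles', 'Babbage', 0)")

# The business layer only ever touches the view, never the raw table.
rows = conn.execute("SELECT full_name FROM active_customers").fetchall()
print(rows)  # [('Ada Lovelace',)]
```

Because the business layer depends only on the view, the table behind it can be redeployed independently, which is exactly what independent application and database releases require.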
The underlying thought is that, just as a database application shouldn't contain business logic, a data access layer (DAL) shouldn't contain data logic. It's the database application's task to know what to store, where to store it and how to retrieve it when needed. These rigorous best practices, combined with version control, need to be enforced. After all, they can ensure safe continuous delivery of databases, whether in the cloud or on premises, leading to better and easier database deployments.
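The split described above can be sketched as follows. This is an illustrative example with hypothetical names, again using Python's built-in SQLite: the DAL contains no joins, filters or storage details, only calls to the database's published entry points.

```python
import sqlite3

class CustomerDAL:
    """Thin data access layer: no data logic, only calls to the
    database's published API (here, a view)."""

    def __init__(self, conn):
        self._conn = conn

    def active_customers(self):
        # The WHERE clause and the table layout live in the database
        # application, not in this layer.
        return self._conn.execute(
            "SELECT full_name FROM active_customers").fetchall()

# Set up a toy database whose "API" is the active_customers view.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer_raw (id INTEGER PRIMARY KEY,
                               full_name TEXT, active INTEGER);
    CREATE VIEW active_customers AS
        SELECT full_name FROM customer_raw WHERE active = 1;
""")
conn.execute("INSERT INTO customer_raw VALUES (1, 'Ada Lovelace', 1)")

dal = CustomerDAL(conn)
print(dal.active_customers())  # [('Ada Lovelace',)]
```

If the data logic ever needs to change, only the view (under the database's own version control) is redeployed; the DAL and the business layer above it stay untouched.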