
Azure SQL Database autotuning is not just a wizard

Azure SQL Database provides custom recommendations through the SQL Database Advisor, helping you get maximal performance. The SQL Database Advisor makes recommendations for creating and dropping indexes, parameterizing queries, and fixing schema issues.

Automated deployment of SSRS reports (PowerShell)

It all began with a client who wanted to distribute reports with minimal effort and make them usable by non-technical users. The minimal requirements were the following: support for SQL Server 2008 or higher, no deployment or installation of the solution, and minimal explanation needed for users.

Why invest in data science?

In recent years we have been generating data like never before. Google CEO Eric Schmidt said the following six years ago: “We now create as much information every two days as we did from the dawn of civilisation up until 2003, … That’s something like five exabytes of data”. I wouldn’t be shocked if we can do it in one day now. Most of this data is user-generated content, like pictures, instant messages and tweets. But we cannot ignore that in our local businesses, we keep track of far more data than before.

This was Microsoft BI in 2016

With the latest version of Microsoft SQL Server, we – BI folks – have a lot of new features to play with. Reporting Services (SSRS) has brought the biggest changes in years. Also, Analysis Services and Integration Services deliver some major updates. It’s not just SQL Server that gave us a lot of new BI features. Power BI, the all-in-one clean-up and reporting tool/portal from Microsoft, has also made enormous progress in 2016.

Power BI – Ranking with ‘Other’ group

Not so long ago a client asked me to rebuild one of their reports in Power BI. It contained the top 5 countries ordered by number of visitors, followed by an ‘Other’ group at the end. Like this, for example:
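To make the shape of the ranking concrete, here is a minimal Python sketch of the “top N plus Other” grouping logic (the country names and visitor numbers are invented for illustration; the actual report would express this in Power BI/DAX):

```python
# Illustrative sketch only: a "top 5 + Other" ranking in plain Python.
# The visitor data below is made up.

visitors = {
    "Belgium": 5200, "Netherlands": 4100, "France": 3800,
    "Germany": 2900, "UK": 2400, "Spain": 900, "Italy": 700, "Poland": 300,
}

def top_n_with_other(counts, n=5):
    """Return the n largest entries plus an aggregated 'Other' row."""
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    top, rest = ranked[:n], ranked[n:]
    if rest:
        # Everything outside the top n collapses into one 'Other' group.
        top.append(("Other", sum(v for _, v in rest)))
    return top

for country, count in top_n_with_other(visitors):
    print(f"{country}: {count}")
```

The key point is that ‘Other’ is not a real category in the data; it is computed by aggregating everything that falls outside the top N.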

Power BI – From data to information

We live in a world where an enormous amount of data is available. But having access to the data is not the same as having insight. If we combine this data access with the right tools and a little bit of effort, you might be able to transform your data into information in no time. I encountered a good example of this at a client, where the results of a nightly ETL load are sent by email on a daily basis. The email merely contains the status (success or error) and the start/end times (the status is also part of the email subject), for both their test and production environments. All these emails are dropped into an Exchange folder of my choice. This is an example email:
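Since the status lives in the email subject, the first transformation step is simply parsing it out. A hypothetical Python sketch, assuming a subject layout like “ETL load PROD - Success” (the real client’s subject format may differ):

```python
# Hypothetical sketch: extracting the environment and status from an
# ETL notification email subject. The subject format below is an
# assumption for illustration, not the client's actual layout.
import re

SUBJECT_PATTERN = re.compile(
    r"ETL load (?P<env>TEST|PROD) - (?P<status>Success|Error)"
)

def parse_subject(subject):
    """Return (environment, status) parsed from a subject line, or None."""
    match = SUBJECT_PATTERN.search(subject)
    if match is None:
        return None
    return match.group("env"), match.group("status")

print(parse_subject("ETL load PROD - Success"))  # ('PROD', 'Success')
```

Once each email is reduced to a small (environment, status, start, end) record, feeding it into a reporting tool such as Power BI becomes trivial.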

SSIS – Boosting Range Lookups with a script component

Did you ever need a lookup in SSIS that was not the standard equi-join, with no possibility to do the lookup directly in the source? A situation you might have encountered before: you get a date and need to find a value in a reference table based on that date, but there is no one-to-one match, because the reference table contains start and end dates. A match is therefore defined as a date that lies between these start and end dates.
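The core idea of a fast range lookup can be sketched in a few lines: keep the reference rows sorted by start date and binary-search for the last start date not after the lookup date. The blog itself implements this in an SSIS script component; the reference table below is invented for illustration:

```python
# Sketch of the range-lookup idea (the blog uses an SSIS script
# component; the reference rows here are invented).
from bisect import bisect_right
from datetime import date

# Reference table: (start_date, end_date, value), sorted by start_date.
reference = [
    (date(2016, 1, 1), date(2016, 3, 31), "Q1"),
    (date(2016, 4, 1), date(2016, 6, 30), "Q2"),
    (date(2016, 7, 1), date(2016, 9, 30), "Q3"),
]
starts = [row[0] for row in reference]

def range_lookup(lookup_date):
    """Find the value whose [start, end] interval contains lookup_date."""
    i = bisect_right(starts, lookup_date) - 1   # last start <= lookup_date
    if i >= 0 and lookup_date <= reference[i][1]:
        return reference[i][2]
    return None  # date falls in a gap before/after all intervals

print(range_lookup(date(2016, 5, 15)))  # Q2
```

Binary search makes each probe O(log n) instead of scanning every row, which is exactly why this beats a per-row non-equi-join for large reference tables.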

Power BI on-premise

Although Microsoft had always said that bringing Power BI on-premises was on the road map, no clear time frame could be given. That was the case at least up until recently, when Microsoft announced the Technical Preview of Power BI on-premises at SQLPass and Ignite.

Dynamic bulk copy staging package

The most commonly created packages are data copy packages that move data from a source database to a staging target database. Instead of creating multiple packages (with or without BIML), a metadata-driven solution can be built using the .NET SqlBulkCopy class. You can find the full description of this process in this blog. I’ll give an example of a staging process created with one single, easy package based on this architecture.
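The pattern is a single generic routine that reads a list of table names and copies each one, rather than one hand-built package per table. As a conceptual, runnable stand-in for the SSIS/.NET SqlBulkCopy solution, here is a Python sketch using sqlite3 (the table and data are invented):

```python
# Conceptual sketch of a dynamic, metadata-driven table copy. The real
# solution uses SSIS with the .NET SqlBulkCopy class; sqlite3 stands in
# here so the pattern is runnable end to end.
import sqlite3

def copy_tables(source, target, tables):
    """Copy each listed table from source to target, creating it first."""
    for table in tables:
        # Recreate the table structure on the target.
        ddl = source.execute(
            "SELECT sql FROM sqlite_master WHERE name = ?", (table,)
        ).fetchone()[0]
        target.execute(ddl)
        # Bulk-insert all rows (SqlBulkCopy's role in the real package).
        rows = source.execute(f"SELECT * FROM {table}").fetchall()
        if rows:
            placeholders = ", ".join("?" * len(rows[0]))
            target.executemany(
                f"INSERT INTO {table} VALUES ({placeholders})", rows
            )

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
source.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Ann"), (2, "Bert")])

target = sqlite3.connect(":memory:")
copy_tables(source, target, ["customers"])
print(target.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # 2
```

Because the table list is just data, adding a table to the staging load means adding one row of metadata, not designing a new package.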

Dynamic bulk copy SSIS package

Using the .NET SqlBulkCopy class as described in the blog ‘META driven SSIS and Bulk copy SSIS packages’, you can easily create a parameterized package to copy a bunch of tables at once.