Monitoring and Auditing your Azure Analysis Services

I have been using SSAS for years, starting long before I joined Microsoft, but most of my real-world experience is with SSAS on premises. Back in the day there was only Profiler, but now we are in a whole new world called Azure.

Last week I got a question on how to monitor and audit an SSAS server in Azure, and I decided to investigate. It turns out to be amazing 🙂 so I decided to blog about it.

So the question was: how can I see who connected to my Azure AS database and what queries were sent? Initially I thought of the ways I used to do this in the on-premises world: capture Profiler traces or XEvents by writing code and then store them somewhere for processing. It looks like I was not alone in this; even the AS team itself has published a way to capture XEvents and store them: https://azure.microsoft.com/en-us/blog/using-xevents-with-azure-analysis-services/

But it turns out to be much smoother, simpler and more elegant to leverage Azure’s own products, in this case Azure Log Analytics. It is already covered in the official documentation here.

Setting up logging

Setting it up is very straightforward as the integration is built into the product. Go to your Azure AS server, select Diagnostics and click on “Add diagnostic setting”:

This allows you to configure which events you want to capture and what to do with them. You can find all available events described here. You can choose to just put the files on a storage account, stream them to an event hub, or send them directly to Log Analytics.

I would recommend using Log Analytics; I’ll show the benefits below when we configure it. Before you can stream the events to Log Analytics you need to set up a Log Analytics workspace first, which I won’t cover here.

Here we stream all the events to my Log Analytics workspace:

Using the logs

After this has been configured, all events are logged to Log Analytics. I created a report in Power BI Desktop pointing to the AS model and started generating queries. When I now go to my Log Analytics workspace I should be able to see them.

After logging into Log Analytics I select “Logs” and then double-click the “AzureDiagnostics” table (for lack of a better word, as it is not really a table).

Double-clicking adds it to the query pane, which we can now execute to see all the events from the last 24 hours (the time range is configurable).
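For reference, the hand-written equivalent of that generated query looks roughly like this (a minimal sketch; the ResourceProvider filter is my assumption to limit the results to Analysis Services events, and the time filter mirrors the 24-hour default):

AzureDiagnostics
// only look at the last 24 hours, matching the default time range
| where TimeGenerated > ago(24h)
// assumption: keep only events emitted by Azure Analysis Services
| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES"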

This shows us all the events, which might be too much, so we want to filter it down. I am changing the query and adding a where clause to show me just the “QueryBegin” operations:

AzureDiagnostics
| where OperationName == "QueryBegin"

The query language (Kusto Query Language) is a bit different, but there is good documentation here and there are plenty of examples around.
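For instance, a quick way to get a feel for what is being logged is to count the captured events per operation; a small sketch, assuming the default AzureDiagnostics schema:

AzureDiagnostics
| where TimeGenerated > ago(24h)
// count how many events of each type were captured
| summarize EventCount = count() by OperationName
| order by EventCount desc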

If we run it we get the results:

It contains all the information we would also expect to show up in Profiler, but it is immediately queryable, how cool is that? This makes it easy to look back at what has happened; see the sketch below. Even cooler is that there is also a way to visualize this data in Power BI :), which I will show right after the query.
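As a sketch of such a lookback query: the one below pulls the Profiler-style details out of the QueryEnd events (the duration is only populated on the end event). The suffixed column names (EffectiveUsername_s, Duration_s, TextData_s) are how these fields typically surface in AzureDiagnostics, but treat them as assumptions and verify them against the schema in your own workspace:

AzureDiagnostics
| where OperationName == "QueryEnd"
// assumption: Profiler-style fields arrive as suffixed custom columns
| project TimeGenerated, EffectiveUsername_s, Duration_s, TextData_s
| order by TimeGenerated desc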

At the top right you can export to Power BI:

This gives us an M script that we can paste into Power BI Desktop.

Finally, following the instructions in the file, we are able to load the data into Power BI Desktop:

This now allows us to visualize how many queries are sent per hour of the day (yeah, not great data, I know 🙂 ).
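If you prefer, the same hourly aggregation can also be done in Log Analytics itself before exporting to Power BI; a sketch along these lines:

AzureDiagnostics
| where OperationName == "QueryBegin"
// bucket the query events into one-hour bins and count them
| summarize QueryCount = count() by bin(TimeGenerated, 1h)
| order by TimeGenerated asc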

Here is another blog post I found on the topic with some great examples: https://techcommunity.microsoft.com/t5/Premier-Field-Engineering/Monitoring-Azure-Analysis-Services-with-Log-Analytics-and-Power/ba-p/371441

Hope that gave you some idea of what is available and how you can use these tools to get better insight into your server load.

2 Replies to “Monitoring and Auditing your Azure Analysis Services”

  1. Using Azure Monitor is an easy option, but those engine queries are custom metrics, i.e. you pay more for logging them and you pay for querying them. I will be happy to be proved wrong though, but that is my understanding.
