How it is built: the SSAS memory report

So recently I created a new SSAS memory report that you can use to analyze the memory usage of your SSAS servers; you can read more about the report itself here. In this blog post we’ll look at how the report was built.

First, let’s look at what we need to get this information for tabular models with compatibility level 1200 or higher. The most detailed information about a tabular model comes from querying the DMV DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS. This gives us plenty of data to work with, such as:

  • Database name
  • Server name
  • Table (actually the dimension name)
  • Column
  • Partition
  • Record count
  • Size
  • Compression type

And more, but with the above we should be able to make a great report. Unfortunately there is one issue: this query can only be sent per database, not for the entire server. This used to be a problem with the previous memory reports, but with Power Query we can solve it. What we will do is run a second discover, “DBSCHEMA_CATALOGS”, that gives us all the databases on the server. There we also filter out any database that doesn’t have compatibility level 1200 or higher. We then use the results of that query to send the DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS query once per database.
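Put together, the approach can be sketched in M like this (a minimal sketch: the server name and the database used to open the first connection are placeholders, and the real report splits this into separate queries and a function, as the rest of this post walks through):

```powerquery
let
    // Discover 1: list all databases on the server with compatibility level 1200 or higher
    Catalogs = AnalysisServices.Database("localhost\tabular", "SomeDatabase",
        [Query="select * from $system.DBSCHEMA_CATALOGS where compatibility_level >= '1200'"]),
    // Discover 2: send the segment DMV query once per database found above
    WithSegments = Table.AddColumn(Catalogs, "Segments", each
        AnalysisServices.Database("localhost\tabular", [CATALOG_NAME],
            [Query="select * from $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"]))
in
    WithSegments
```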


So to build this, the first thing I did was create a query in Power BI Desktop that gives a list of all the databases on my server using “DBSCHEMA_CATALOGS”. Unfortunately, you have to specify a database on the server for each connection to SSAS, while this really should be an optional argument of AnalysisServices.Database. I have asked the M folks to put it on the backlog, but I can live with this for now; the scenario is not that common.


This gives me the list of databases:

[image]
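As a side note (also pointed out in the comments below), the mandatory database argument can be sidestepped with AnalysisServices.Databases, which takes only the server name and returns a navigation table of catalogs. A sketch, assuming the navigation table’s usual Name column:

```powerquery
let
    // Navigation table of all catalogs on the server; no database argument required
    Source = AnalysisServices.Databases(Server),
    // Keep only the database names
    Names = Table.SelectColumns(Source, {"Name"})
in
    Names
```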

Next, I want the user to be able to type in the server name and database through a prompt instead of changing the M script, so I created two parameters:

[image]
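Under the covers a parameter is just a query that returns a value; the two parameters end up as M expressions roughly like this (the default values here are placeholders):

```powerquery
// "Server" parameter query (placeholder default value)
"localhost\tabular" meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]

// "Database" parameter query, defined the same way as its own query (placeholder default value)
"MyTabularDatabase" meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]
```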

Next I use these two parameters in my M expression:

let
    Source = AnalysisServices.Database(Server, Database, [Query="select * from $system.DBSCHEMA_CATALOGS#(lf)where compatibility_level >= '1200'"]),
    #"Renamed Columns" = Table.RenameColumns(Source,{{"CATALOG_NAME", "DatabaseName"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"DATE_QUERIED", "CURRENTLY_USED", "POPULARITY"}),
    #"Added Custom" = Table.AddColumn(#"Removed Columns", "Server", each Server)
in
    #"Added Custom"

Done 🙂

Now that was the easy part. Next I need to use the list of databases to get the information per database. I first do one manually, so I again import directly against my SSAS server using “DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS”.

This gives me the information I care about for a single database on the server (I renamed and removed some columns):

[image]

This is the query I used, again leveraging the parameters I defined earlier:

let
    Source = AnalysisServices.Database(Server, Database, [Query="select * from $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"]),
    #"Renamed Columns" = Table.RenameColumns(Source,{{"DATABASE_NAME", "Cube"}, {"MEASURE_GROUP_NAME", "Table"}, {"PARTITION_NAME", "Partition"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"DIMENSION_NAME"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"TABLE_ID", "TableId"}, {"COLUMN_ID", "ColumnId"}, {"SEGMENT_NUMBER", "SegmentNr"}, {"TABLE_PARTITION_NUMBER", "TablePartitionNr"}, {"RECORDS_COUNT", "RecordCount"}, {"ALLOCATED_SIZE", "AllocatedSize"}, {"USED_SIZE", "UsedSize"}, {"COMPRESSION_TYPE", "CompressionType"}, {"BITS_COUNT", "BitsCount"}, {"BOOKMARK_BITS_COUNT", "BookmarkBits"}, {"VERTIPAQ_STATE", "VertiPaqState"}})
in
    #"Renamed Columns1"

Now for the interesting part: we need to call the code above, which gets the information per database, once for each database on the server. To do this I turn the M script above into a function:

let
    Source = (Database as text, Server as text) =>
        let
            Source = AnalysisServices.Database(Server, Database, [Query="select * from $system.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS"]),
            #"Renamed Columns" = Table.RenameColumns(Source,{{"DATABASE_NAME", "Cube"}, {"MEASURE_GROUP_NAME", "Table"}, {"PARTITION_NAME", "Partition"}}),
            #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"DIMENSION_NAME"}),
            #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns",{{"TABLE_ID", "TableId"}, {"COLUMN_ID", "ColumnId"}, {"SEGMENT_NUMBER", "SegmentNr"}, {"TABLE_PARTITION_NUMBER", "TablePartitionNr"}, {"RECORDS_COUNT", "RecordCount"}, {"ALLOCATED_SIZE", "AllocatedSize"}, {"USED_SIZE", "UsedSize"}, {"COMPRESSION_TYPE", "CompressionType"}, {"BITS_COUNT", "BitsCount"}, {"BOOKMARK_BITS_COUNT", "BookmarkBits"}, {"VERTIPAQ_STATE", "VertiPaqState"}})
        in
            #"Renamed Columns1"
in
    Source

This allows me to pass in Database and Server and run the code. The function is called “fnDatabaseSize”:

[image]


Now we need to pass in that information automatically from the data. I do this using the following M script:

let
    Source = Databases,
    #"Removed Columns" = Table.RemoveColumns(Source,{"DESCRIPTION", "ROLES", "DATE_MODIFIED", "COMPATIBILITY_LEVEL", "TYPE", "VERSION", "DATABASE_ID"}),
    #"Invoked Custom Function" = Table.AddColumn(#"Removed Columns", "fnDatabaseSize", each fnDatabaseSize([DatabaseName], [Server])),
    #"Expanded fnDatabaseSize" = Table.ExpandTableColumn(#"Invoked Custom Function", "fnDatabaseSize", {"AllocatedSize", "BitsCount", "BookmarkBits", "ColumnId", "CompressionType", "Cube", "CUBE_NAME", "Partition", "RecordCount", "SegmentNr", "Table", "TableId", "TablePartitionNr", "UsedSize", "VertiPaqState"}, {"AllocatedSize", "BitsCount", "BookmarkBits", "ColumnId", "CompressionType", "Cube", "CUBE_NAME", "Partition", "RecordCount", "SegmentNr", "Table", "TableId", "TablePartitionNr", "UsedSize", "VertiPaqState"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded fnDatabaseSize",{{"AllocatedSize", Int64.Type}, {"BitsCount", Int64.Type}, {"BookmarkBits", Int64.Type}, {"RecordCount", Int64.Type}, {"SegmentNr", Int64.Type}, {"TablePartitionNr", Int64.Type}, {"UsedSize", Int64.Type}})
in
    #"Changed Type"

I get to these steps by doing the following in the UI. First I base my query on the results of the Databases query, and then I remove all columns except the database name and the server:

[image]

And then for the magic part: in the next step I invoke a custom function:

[image]

Fortunately, Power BI Desktop automatically recognizes the parameters of my function and asks me which columns it should use to supply the arguments.

This results in a new table being added for each database:

[image]

I can now expand and clean this up to get the full results.

[image]

This gives me all the information I need for all the databases on my server. Unfortunately, you will probably have gotten an error message about data privacy or the firewall; Power Query is trying to prevent a form of data injection. Based on what we are doing here, I could also have read information from your SSAS server and passed it over to some web service to steal data, so Power Query prevents that. To solve this you can go to Options, Privacy, “Always ignore Privacy Level settings”. This allows these queries to call each other.
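For reference, the firewall error typically reads along these lines (the exact wording varies by version, and the query and step names here are just examples):

```text
Formula.Firewall: Query 'DatabaseSize' (step 'Invoked Custom Function') references
other queries or steps, so it may not directly access a data source.
Please rebuild this data combination.
```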

I ended up with 6 queries and 2 parameters. The DBSize query is the base for my function; for this one I disabled data loading, as I am not planning to use it directly.

[image]


Now I can start creating the reports as described in my other blog post here. As a last step I saved the report as a template, so other users can start clean without my data; you can do this through File, Export, Power BI template.


Hope this gave you some interesting new ideas about what can be done with Power BI Desktop 🙂

3 Replies to “How it is built: the SSAS memory report”

  1. To get rid of the inappropriate Database parameter one could use AnalysisServices.Databases(Server).

    Why do you restrict the compatibility level of the databases? DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS does work with lower levels as well.

  2. Hi Kasper. Comparing with VertiPaq Analyzer, from Marco and Alberto, I noticed your report gives values similar to their Dictionary Size. Have you compared both? What is the difference between using DISCOVER_OBJECT_MEMORY_USAGE and DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS?
