What is using all that memory on my Analysis server instance?

With the release of SQL Server 2012 tabular models, identifying the memory usage of the objects on your server instance has become more important than ever. Since everything is in memory, being able to tune the model and remove columns that take up lots of memory can be very valuable. So can using your development database to extrapolate the memory usage you will see on your production machine.

I have created a PowerPivot workbook that will allow you to investigate the memory usage on your server instance (this report could also be used on a MOLAP instance).

The report contains two worksheets. The first worksheet contains two dashboards; the first dashboard shows the top 10 tables on the server instance by memory usage.

The second worksheet allows a more detailed investigation of all the objects on the server using a PivotTable. I have created a hierarchy of all the objects that lets you start at the top and drill down to the details you want to investigate.

Of course, since this is a PivotTable, you can create your own insights if you want:


You can download the workbook and try it for yourself here: BISMServerMemoryReport. The Excel 2013 version is now also available here.

To get this working on your tabular or multidimensional server, all you have to do is change the connection:
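For example, in the PowerPivot window you open the existing connection and change the data source to point at your own instance. The server and instance names below are placeholders, not the workbook's actual values; treat this as a sketch:

    Provider=MSOLAP;Data Source=YourServer\YourTabularInstance

Test the connection before saving to make sure the instance is reachable.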

After this, refresh the table and then refresh the data in the workbook from Excel.

The data itself is retrieved into a single table by a DMV query:
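The DMV behind the report is DISCOVER_OBJECT_MEMORY_USAGE; a minimal version of the query (the table is called MemUsage in the workbook) is simply:

    SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE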

The incoming data is then shaped by several DAX calculated columns, using new SQL Server 2012 DAX functions such as the parent-child functions, and hierarchies were created on top of them.
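As a rough sketch of that pattern: the workbook has a FullObjectPath column and Level1 through Level12 columns, and each level column pulls one node out of the path with PATHITEM. The exact expressions in the workbook are more elaborate (they handle missing levels, for example), so treat this as illustrative only:

    Level1 = PATHITEM ( [FullObjectPath], 1 )
    Level2 = PATHITEM ( [FullObjectPath], 2 )
    Level3 = PATHITEM ( [FullObjectPath], 3 )

The level columns are then combined into a drill-down hierarchy in the model.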

I hope you can use this workbook, let me know if you have any comments or feedback on the report. I am always looking for new insights and ways to improve the workbook.

36 Replies to “What is using all that memory on my Analysis server instance?”

  1. I’m using my phone to read this (hence with tiny writing) and at first glance I read “Sales Country – model” as “Sean Connery – model”. Being awake with jetlag at 4 in the morning isn’t helping none either 🙂

    Nice post Kasper

  2. Kasper,

    this is really nice!
    I discovered a database I have to analyze logs from IIS… it should be optimized using a best practice I wrote in the book but haven’t applied! 🙂
    Very nice!


  3. Nice post!

    You have given me a bunch of homework assignments. Now I have to figure out why global allocators are eating 57% of my memory. Also I have to figure out why taskmgr reports that my instance is using 48 MB RAM but the DMVs for object memory usage are reporting only 15 MB used.

    Maybe I should give you some homework on how to explain the difference between memory usage reported by the various DMVs and the OS, and also how to free up the global memory used by AS. That should keep you busy for a while, if you have nothing else to do 😉

  4. Hi @Cathy Dumas,
    The difference between taskmgr and the DMVs is explained on the intro page:
    1) The memory report does not take the size of the Analysis Services executables into account (hence the memory size in Task Manager is slightly larger). This is approx. 150 MB.

    The global allocators are memory allocations that are not assigned directly to a model object, but there might be some objects related to the model that are not placed correctly. Unfortunately there are some issues with object_memory_usage, so this report will not give you a 100% accurate picture of all your memory. It does give you an idea of the size of your objects and what is eating up most of the space.


  5. Great difficulty in trying to download this… I remember in the late 90s when I said it’s going to be so nice in the next century, when it will be so much easier to download things. NOT!!!

  6. As a follow-up: the errors in the Levelx calculations were due to the expression used for the RIGHT() function evaluating to a negative number in our environment. I added an IF statement to all of the Levelx calculated columns to catch those instances, and the model started working perfectly.

    Thanks for the great tool Kasper!

    1. Hi William. I got the same error level output in my Power Pivot for Excel, but I cannot find the solution that you wrote some time ago… Thanks for your help.
      Jairo Suarez

  7. Kasper, I started using the spreadsheet this week and it is awesome. The only question I have is that the spreadsheet is consistently reporting 2x the size of what SSMS says the Tabular database size is. I have one database that is 515 MB and the spreadsheet analyzer says it is 1000 MB. Any ideas why this would be happening? Thanks, Mike.

    1. Hi Mike,

      The memory reporting query is sometimes not as accurate as you want. Check the file sizes on disk for the most accurate numbers.


  8. Hi Kasper, unfortunately I failed to set up the custom connection. I use Excel 2013 64-bit and PowerPivot. I added the server name (11.0.2830.24), but testing the connection leads to the message “The name is valid, but data of the requested type isn’t found.”
    Any idea what I should change?

    Best regards from Hamburg/Germany

  9. This is what I was looking for. Great stuff!!
    Will this work for Power-Pivot for SharePoint SSAS mode as well?

  10. Hi there, I downloaded your workbook and opened it in Excel 2016, and it complained about errors in the Level1-12 columns in the Power Pivot model. I did hack a quick fix, but I do not think that it is 100% correct.

    I can email you the workbook if you would like to test and see what I did?

  11. I’m using Excel 2016, and to remove the error on Level2 and above I use the syntax =MID(PATHITEM([FullObjectPath];2;0);LEN(PATHITEM([FullObjectPath];1))+2;10000), changing for Level3 the first PATHITEM level to 3, the second PATHITEM to level 2, and so on.

  12. I downloaded the workbook, but I get a warning claiming the PowerPivot model was created using a previous version of PowerPivot, so it needs to be upgraded. After the upgrade however, the model is corrupt and it cannot be opened.

  13. Hi, I have an Excel Power Pivot model that grabs data from multiple folders on our server; what do I change the localhost to when I edit the connection?

  14. Hello Kasper,

    I’m getting an error below:

    Calculation error in column ‘MemUsage'[]: An argument of function ‘RIGHT’ has the wrong data type or has an invalid value.

    Could it be due to using it against SQL 2016 with Excel 2016?

    Thank you!

  15. For Excel 2016, this query works well:

    let
        MemUsage = AnalysisServices.Database(Database, "", [Query="select * from $system.Discover_object_memory_usage", Implementation="2.0"]),
        #"Added ObjectPath" = Table.AddColumn(MemUsage, "ObjectPath", each if [OBJECT_PARENT_PATH] <> null then [OBJECT_PARENT_PATH] & "." & [OBJECT_ID] else [OBJECT_ID], type text),
        #"Split Column by Delimiter" = Table.SplitColumn(#"Added ObjectPath", "ObjectPath", Splitter.SplitTextByDelimiter(".", QuoteStyle.Csv), {"ObjectPath.1", "ObjectPath.2", "ObjectPath.3", "ObjectPath.4", "ObjectPath.5", "ObjectPath.6", "ObjectPath.7", "ObjectPath.8", "ObjectPath.9", "ObjectPath.10", "ObjectPath.11"}),
        #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter", {{"ObjectPath.1", type text}, {"ObjectPath.2", type text}, {"ObjectPath.3", type text}, {"ObjectPath.4", type text}, {"ObjectPath.5", type text}, {"ObjectPath.6", type text}, {"ObjectPath.7", type text}, {"ObjectPath.8", type text}, {"ObjectPath.9", type text}, {"ObjectPath.10", type text}, {"ObjectPath.11", type text}}),
        #"Added ModelRelated" = Table.AddColumn(#"Changed Type", "ModelRelated", each if [ObjectPath.1] <> "Global" and Text.Contains([OBJECT_PARENT_PATH], "Databases") then true else false, type logical),
        #"Added IsDimension" = Table.AddColumn(#"Added ModelRelated", "IsDimension", each if Text.Contains([OBJECT_PARENT_PATH], "Dimensions") and Text.Contains([OBJECT_PARENT_PATH], "Measures") then true else false, type logical),
        #"Added Database" = Table.AddColumn(#"Added IsDimension", "Database", each if [ObjectPath.2] = "Databases" then [ObjectPath.3] else null, type text),
        #"Added Table" = Table.AddColumn(#"Added Database", "Table", each if [ObjectPath.4] = "Dimensions" then [ObjectPath.5] else null, type text)
    in
        #"Added Table"

    Either create a parameter called Database and set it to your SSAS database name, or replace "Database" at the beginning of the query (e.g. AnalysisServices.Database(MyDatabaseName, …)).

  16. I am using Office 365 and get errors while refreshing the data. The connection works, but I think some of the formulas are no longer supported by Office 365?

    Maybe you can provide a version which can be used with Office 365. Would be awesome.
