
SSAS: All Possible Scenarios of Changing SSAS Data Folder


Scenario 1:

Recreating a New Data Folder in New Drive / Same Drive

Creating a New Data Folder and changing the Data File Location in configuration files as shown below –

1. Connect to your Analysis Services instance in SSMS as a user that is a member of the server administrator role but is NOT the user that installed the software

2. Right-click on the server in Object Explorer. Select ‘Properties…’

3. On the General page, change the value of the DataDir property to a different folder (it doesn’t matter which)

4. Click ‘OK’ on the “Analysis Server Properties” dialog. You will get a message telling you that the server needs to be restarted

5. Restart your Analysis Server in Services panel.

Caveat –

If you change the DataDir property of Analysis Services (i.e. the folder in which, by default, Analysis Services stores all its data), you will lose valuable metadata about your installation – namely, all members of the fixed server administrator role. The only user who can then connect to Analysis Services and carry out administrative functions is the user that installed the software.

The screenshots below demonstrate the same:




(Screenshot: DataDir changed to c:\temp)


(Screenshot: prompt to restart the service)

Then, on the Security page, the Server Administrators list is empty ….


Simple workaround –

To recover the members of the server administrator role, you can follow the steps below:

Before applying the steps mentioned above, take a script of the SSAS server administrator role members –

Connect to SSAS

Right-click on the server in Object Explorer. Select ‘Properties…’

Select the Security page -> open the Script drop-down ->

(Screenshot: the Script drop-down on the Security page)

From the drop-down, select Script to a file and save it; now follow the steps below.

1. On the general page, change the value of the DataDir property to a different folder (it doesn’t matter what)

2. Click ‘OK’ on “Analysis Server properties” dialog. You will get a message telling you that the server needs to be restarted

3. Restart your Analysis Server in Services panel.

4. Now log on to the server as the user that originally installed Analysis Services (for this is now the only user that will have server role privileges)

5. Open the script file that we saved earlier -> the script will open in an XMLA query window.

6. Execute the script, and after that you will find all the existing users back in place.
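For reference, the saved security script is an XMLA Alter command roughly along these lines – a sketch only, since the exact role ID and member list come from your own server, and DOMAIN\user1 here is a placeholder:

```xml
<Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
       xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <RoleID>Administrators</RoleID>
  </Object>
  <ObjectDefinition>
    <Role>
      <ID>Administrators</ID>
      <Name>Server administrators</Name>
      <Members>
        <!-- one Member element per administrator; placeholder account -->
        <Member>
          <Name>DOMAIN\user1</Name>
        </Member>
      </Members>
    </Role>
  </ObjectDefinition>
</Alter>
```

Executing the saved script against the restarted instance re-adds the listed accounts to the server role.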

Scenario 2:

Moving the Data Directory of Analysis Services – refer to this blog post

Scenario 3:

In this scenario, the default data folder is V:\Data, but while creating partitions the developer changed the drive of the partition storage location. You could use Scenario 2 to move the whole data directory, but here I will show how to move just the partition directory (in case the partition directory has issues and you need to relocate the partition files).

Default data folder – V:\Data

Partition 2006 for Sales reside in C:\Partitions

New Location F:\Partitions

First Option:

Open the database in BIDS -> go to Partitions -> in Storage Location, change the path to the new one.



New –


Before saving, ensure that the F: drive has a directory with the same name; otherwise you will get an error stating that the folder doesn’t exist.

As soon as you hit Save you will get this message – you need to process the partition to complete the change.


Once you process, you are good – the data will be in the new drive and folder.
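Another way to make the same change without opening BIDS is to script the partition from SSMS (right-click the partition -> Script Partition as -> ALTER To), edit the StorageLocation element, and execute the script. A sketch of the relevant fragment is below – the other partition elements from the scripted definition are omitted here but must stay in the Alter script you execute:

```xml
<Partition>
  <ID>2006</ID>
  <Name>2006</Name>
  <!-- ... all other elements from the scripted definition stay unchanged ... -->
  <StorageLocation>F:\Partitions</StorageLocation>
</Partition>
```

The same caveats apply: the target folder must exist, and the partition must be processed afterwards.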

Second Option:

Yes, we can use a database backup and, during restore, change the drive location for a specific partition. But if the partition file size is >4 GB and the server is Analysis Services 2005, you will be in trouble, because there is a limitation that a single file shouldn’t be >4 GB (yes, this was fixed in SQL 2008).



SSAS Monitoring Tool– ASTrace

June 5, 2012 4 comments

Guys, we have noticed customers asking for a tool to monitor SSAS activities, so we have added new features to the existing ASTrace for SQL 2005 and introduced a new version for SQL 2008/R2 and SQL 2012 (MOLAP and Tabular).

With this tool you can monitor anything from minimal to very detailed SSAS information. The architecture is pretty simple: ASTrace runs as a service on any server that has the prerequisites installed (as stated in the documentation), collects trace information based on a trace template for the SSAS instance stated in the configuration file (you can customize and use your own trace template), and finally pushes the trace data into a SQL table, enabling you to query that table and pull the required information.

You can customize the trace template, SQL table, and database name. The documentation will give you an idea of the enhancements made in the new version of ASTrace.


AS Trace2008 (SQL 2008 / R2)
AS Trace2012 (MOLAP / Tabular)

Caveat- By design

Restarting ASTrace Service –

By design, ASTrace will stop auditing whenever you restart the Analysis Service instance. Therefore, any time that you restart Analysis Services you will also need to restart the ASTrace service.

When the ASTrace service restarts, the service automatically writes the content of the existing table, ASTrace, to a new table with a timestamp. For example:

Old traces: ASTraceTable_Archive_Tuesday, January 31, 2012 12:56:35 AM
New and current traces: ASTrace

This workaround is required because trace files are always truncated by the function, InitializeAsWriter. Unless you take action to save the existing trace information, when the service restarts, all previous trace information would be lost. To preserve the trace information prior to the service restart, the ASTrace utility programmatically creates a table, adds a timestamp, and writes any existing trace to the time-stamped table. This lets you maintain a history of events prior to the service restart.

This enables you to collate the results of all the ASTrace tables and run a SQL query to pull information as if from a single table.
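For example, assuming the default table name ASTrace and one archived table, a query like the following pulls the slowest events across current and archived traces. The column names here follow the standard profiler trace columns captured by the shipped template – verify them against your own table before using this:

```sql
-- Sketch: query current and archived ASTrace tables as one set
-- (column names are assumptions based on the standard trace columns)
SELECT StartTime, NTUserName, Duration, TextData
FROM dbo.ASTrace
UNION ALL
SELECT StartTime, NTUserName, Duration, TextData
FROM dbo.[ASTraceTable_Archive_Tuesday, January 31, 2012 12:56:35 AM]
ORDER BY Duration DESC;
```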

Feel free to ping me if you need any assistance.



This is a sample tool and will not be supported by CSS.

Dimension: ProcessUpdate – Insight

ProcessUpdate applies only to dimensions. It is the equivalent of incremental dimension processing in Analysis Services 2000. It sends SQL queries to read the entire dimension table and applies the changes – member updates, additions, deletions.

Since ProcessUpdate reads the entire dimension table, it begs the question, “How is it different from ProcessFull?” The difference is that ProcessUpdate does not discard the dimension storage contents. It applies the changes in a “smart” manner that preserves the fact data in dependent partitions. ProcessFull, on the other hand, does an implicit ProcessClear on all dependent partitions. ProcessUpdate is inherently slower than ProcessFull since it is doing additional work to apply the changes.

Depending on the nature of the changes in the dimension table, ProcessUpdate can affect dependent (related) partitions. If only new members were added, then the partitions are not affected. But if members were deleted or if member relationships changed (e.g., a Customer moved from Redmond to Seattle), then the aggregation data and bitmap indexes on the related partitions are dropped. The cube is still available for queries, albeit with lower performance. (With ProcessUpdate, flexible aggregations and indexes on related partitions are dropped.)

After ProcessUpdate you need to rebuild indexes on the partitions. The simple way of dealing with this problem is to do default processing on the associated cubes; default processing will recreate the dropped aggregations and bitmap indexes for the affected partitions.

There are three options you can use for rebuilding indexes / aggregations –

1)      Lazy Processing

Lazy processing is the mechanism by which the Analysis server automatically builds bitmap indexes and aggregations for dimensions and partitions. Lazy processing is done in the background at a low priority, so it does not interfere with foreground operations such as user queries. The advantage of lazy processing is that it reduces the foreground processing time and the cube is available earlier to users, albeit with lower performance.

2)      ProcessAffectedObjects while processing the dimension (ProcessUpdate) – this tells the server to reprocess other objects impacted by the command; the benefit is that all affected objects are processed in the same batch.

3)      Process Index on Measure Groups or Process Default of a Cube
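As a sketch, option 2 can be issued as an XMLA batch – ProcessAffectedObjects is an attribute on the Batch element. The database and dimension IDs below are placeholders for your own objects:

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
       ProcessAffectedObjects="true">
  <Process>
    <Object>
      <!-- placeholder IDs: substitute your database and dimension -->
      <DatabaseID>Adventure Works DW 2008R2</DatabaseID>
      <DimensionID>TestProd</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
</Batch>
```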

Scenario A: Update a Product Color (Attribute Key is Color) for a single Product which had a Historical Record in Year 2005 and 2006.

1) Notice that this specific product has sales only in years 2005 and 2006:

select ProductKey,orderdatekey from testinternestsales

where Productkey=324


2) Created A Test Cube based on TestDate and TestProd-

TestDate –



With Color – it has a flexible relationship.


Four Partitions based on –

2005: WHERE OrderDateKey <= '20051231'

2006: WHERE OrderDateKey >= '20060101' AND OrderDateKey <= '20061231', and the same pattern for 2007 / 2008


Dimension Usage


Key Column and Name Column are both set to Color.


3) Auto-slicing is correct, which can be proved by a simple query –

Select {[Measures].[Sales Amount]} on 0, [Testprod].[Product Key 1].&[324] on 1

from [Test] WHERE [Test Date].[Calendar Year].&[2005]



4) Checking whether indexes are in place, via the partition dimension statistics DMV:

SELECT * FROM SYSTEMRESTRICTSCHEMA ($system.discover_partition_dimension_stat
, CUBE_NAME = 'TEST', DATABASE_NAME = 'Adventure Works DW 2008R2', MEASURE_GROUP_NAME = 'Testinternestsales')


If you notice here, for Color the indexes are in place in all years.


5) Also checked in the folders

E:\data\Adventure Works DW 2008R2.0.db\TEST.40.cub\Testinternestsales.40.det\2008.37.prt

E:\data\Adventure Works DW 2008R2.0.db\TEST.40.cub\Testinternestsales.40.det\2005.36.prt

Color .map files are in place.


6) Fired this query to update the color for ProductKey 324 (as shown earlier, it has records only for 2005 and 2006 in the fact table):

update testprod

set color = 'Test Green'

where productkey = 324

7) Process Update

Fired the same DMV – if you notice, the attribute index for Color is dropped for 2005 as well as 2008.


Scenario B: Update a product color (attribute key is Color) for a product which doesn’t have any record in the fact table (no historical or current record).

You can see all flex Aggregates


Changed the color of a product which does not have any record in the fact table.

After ProcessUpdate – all flexible aggregations are gone; only index files are left, and those for all attributes except Color.


Scenario C: In the earlier scenarios, Color was the key column. For testing, I changed the key column to ColorID and kept Color as the name column.


Changed Color

update testprod set color = 'ASAT' where productkey = 609

So in this case I am updating a color, but internally the Color attribute uses ColorID as its key column. The update therefore does not impact the index mapping, and hence does not drop indexes / aggregations.


If your design is close to Scenario A or B, indexes / aggregations will get dropped; but if you follow the practice shown in Scenario C, indexes are not dropped because the update does not impact the key column (and that’s why it’s a suggested practice to use the name column for attribute display values).

Scenario D: Dropped a record from the dimension to show the impact on aggregations. As expected, it dropped all attribute indexes (.map files) and flexible aggregations.



Aggregates are present.


SELECT * FROM SYSTEMRESTRICTSCHEMA ($system.discover_partition_stat
, CUBE_NAME = 'TEST', DATABASE_NAME = 'Adventure Works DW 2008R2', MEASURE_GROUP_NAME = 'Testinternestsales')


Now fired the below query to check the indexes:


SELECT * FROM SYSTEMRESTRICTSCHEMA ($system.discover_partition_dimension_stat
, CUBE_NAME = 'TEST', DATABASE_NAME = 'Adventure Works DW 2008R2', MEASURE_GROUP_NAME = 'Testinternestsales')




ProductKey 611 doesn’t exist in the fact table.

This signifies that we don’t have any fact record for this product.


delete from testprod

where productkey=611

Now, if you notice, all flexible aggregations are dropped – you can see their size as 0.


Indexes for all attributes are dropped, because we deleted a record and that impacted all attributes.


*———–End of Article—————-*

XMLA– Clear Analysis Services (SSAS) database cache

April 27, 2012 Leave a comment

How to clear cache of a cube

<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW</DatabaseID>
    <CubeID>Adventure Works DW</CubeID>
  </Object>
</ClearCache>



How to clear cache of a Database

<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>Adventure Works DW</DatabaseID>
  </Object>
</ClearCache>



SSAS Tip – IndexBuildThreshold

April 20, 2012 Leave a comment


Index creation is controlled by the number of rows specified as <IndexBuildThreshold> in the msmdsrv.ini file (default location – ?:\Program Files\Microsoft SQL Server\MSAS10_50.SQL2008R2\OLAP\Config\msmdsrv.ini), with a default of 4096. Partitions with fewer rows will not have indexes built.

This means that if you have fewer than 4096 records in a partition, you will not see an index for it.
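The relevant fragment of msmdsrv.ini sits under OLAP\Process – a sketch of the setting in context (stop the Analysis Services service before editing, and keep a backup of the file):

```xml
<ConfigurationSettings>
  <OLAP>
    <Process>
      <!-- partitions with fewer rows than this get no bitmap indexes -->
      <IndexBuildThreshold>4096</IndexBuildThreshold>
    </Process>
  </OLAP>
</ConfigurationSettings>
```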

How to determine whether an index is present or not – query the partition dimension statistics DMV:

SELECT * FROM SYSTEMRESTRICTSCHEMA ($system.discover_partition_dimension_stat
, CUBE_NAME = 'Adventure Works'
, DATABASE_NAME = 'Adventure Works DW 2008'
, MEASURE_GROUP_NAME = 'Internet Sales'
, PARTITION_NAME = 'Internet_Sales_2004')



Look for the attribute index column (ATTRIBUTE_INDEXED).

Step by Step Guide of Installing Power View with PowerPivot 2012

April 16, 2012 Leave a comment

Install Win 2008 R2 + SP1



Add .Net Framework 3.5 Features


Add Desktop Experience (needs reboot)

Install SQL Server 2012 (default instance)


Few things which we need to keep in mind during installation – (New Farm)

– Install SharePoint 2010, apply SP1, apply the latest CU (I believe the latest was released in Jan 2012)

– Install SQL 2012 – PPIV

– Configure it

– Install SSRS integrated with SharePoint (during this install you can select a Tabular instance of SSAS; otherwise your Power View will be limited to PowerPivot workbooks only – as you know, there are 2 options in Power View: a PPIV URL or a Tabular instance name)

– The last part is the most confusing, where we need to configure the application and content type


Steps as Follow –>

· Double click on Setup

· Click on Installation (left hand side), then click on New SQL Server stand-alone installation or add features to an existing installation.


· Setup Support Rules will identify problems, if any. If there are problems, fix them; otherwise press the OK button.


· On Product Key choose Evaluation, later you can specify the product key. Press Next button.


· Accept the License Terms then click Next button.

· Setup files will get installed. There will be a report at the end. Press Next button.


· In the Setup Role choose SQL Server Feature Installation then click Next


· Select all features you would like to install then click Next


· Installation Rules should show if any problem is remaining, else click Next button


· Define the Instance name (Default instance)


· Disk Space Requirements will show up. Press Next button.

· Assign appropriate Service Accounts

· In Database Engine Configuration add users to SQL Server Administrators

· In Analysis Services Configuration add users who will have administrative permissions

· In Reporting Services Configuration specify the configuration mode:


· In Distributed Replay Controller add users

· In Distributed Replay Client complete based on below image then click next


· Uncheck Send Windows and SQL Server….. option then click Next button

· Installation Configuration Rules will display any error. Click Next button


· Review the Ready to Install information to validate your selected options.

· Click Install button

· Wait until installation is completed.


· Click Close button

1. Install SharePoint

· Right-click on the Splash HTML application; this will allow you to review the prerequisites

· Click on Install Software prerequisites


· Click Next button


· Accept the License terms by clicking the checkbox then click Next button.


· Installation will start


· Once completed click Finish button.


· Click Install SharePoint Server

· Enter the Product Key; once it is validated press the Continue button


· Accept License Terms and click Continue


· Click Server Farm option


· Select Complete Option then Click Install Now button


· Wait until installation is completed


· Once the installation is completed, uncheck the option “Run the SharePoint Products Configuration Wizard now”


2. Install SharePoint SP1 (very important, else the PowerPivot Configuration Tool will fail)


· Double click on officeserver2010sp1-kb2460045-x64-fullfile-en-us application

· Accept the License Terms then click Continue button.


· Once completed press OK button.


3. Run SQL Server setup to install PowerPivot for SharePoint

This step will configure your SharePoint farm and install PowerPivot. It will also customize your farm with recommended settings for a PowerPivot installation.

a. Launch Microsoft SQL Server setup, click the Installation tab, and then click New SQL Server stand-alone installation or add features to an existing installation.

b. On Setup Support Rules page click OK. Once operation is completed click Next.

c. On Installation Type page select Perform a new installation of SQL Server “Denali” CTP3, click Next.


d. Either enter the product key or select the free edition when prompted, accept the License Terms and click Next.

e. On Setup Role page select SQL Server PowerPivot for SharePoint, and then leave the Add SQL Server Database Relational Engine Services to this installation checkbox selected. Click Next.


f. On the Feature Selection page you will see all the required features selected, click Next.


g. On the Installation Rules page click Next.

h. On the Instance Configuration page, enter the instance name then click Next.


i. On Disk Space Requirements, click Next button.

j. On the Server Configuration page, enter service accounts for each SQL Server service. The Analysis Services Service Account must be a domain account. Click Next to go to the Database Engine Configuration page.


k. On the Database Engine Configuration page, enter the name of an administrator for SQL Server (you can click Add Current User to ensure that you are a SQL Server admin), then click Next.


l. On the Analysis Services Configuration page, enter the name of an administrator for Analysis Services. At minimum click Add Current User to the administrators, this is required to ensure that PowerPivot is deployed correctly in the farm.


m. Click Next button until you reach Ready to Install page.

n. Click Install button.

o. Once complete click Please launch the PowerPivot Configuration Tool to configure the server.

p. On the Complete page of SQL Server “Denali” CTP3 Setup click Close button.

4. PowerPivot Configuration Tool

This tool (new in CTP3) performs the SharePoint Integration and configuration steps required for PowerPivot.

a. Click Configure or Repair PowerPivot for SharePoint


b. On the PowerPivot Configuration Tool complete all required information


The database server should be the instance you install with SharePoint integration (step 3), you can leave the default port or change it based on your preferences, keep the passphrase handy as you will require it later during the configuration.

c. Click Validate button.

d. If you entered the correct information validation should succeed and display below popup message:


e. Click OK.

f. Now click Run button to begin the configuration of your PowerPivot farm. There will be a warning message click Yes button.

g. You will see an Activity Progress popup window.


h. Once completed you will see following popup message:


i. Click Exit button.

j. Open SharePoint Central Administration to confirm that SharePoint is installed correctly.

5. Run SQL Server setup to install Reporting Services for SharePoint and a BISM instance of Analysis Services.

Although you already ran SQL Server Setup once, you now need to run it again to add Reporting Services and an Analysis Services server in VertiPaq mode. The Analysis Services instance will host the business intelligence semantic models that will serve as data sources for the “Crescent reports” you create.

a. Launch Microsoft SQL Server Setup, click the Installation tab, and then click New SQL Server stand-alone installation or add features to an existing installation.

b. Continue through Setup, selecting the defaults until you get to the Installation Type page. On this page, make sure Perform a new installation of SQL Server “Denali” CTP3 is selected.

c. Enter the product key when prompted and navigate through Setup, selecting the defaults until you reach the Setup Role page.

d. Select SQL Server Feature Installation (the default) and click next.

e. On the Feature Selection page, select Analysis Services, Reporting Services – SharePoint and Reporting Services Add-in for SharePoint Products. Click Next twice.


f. On the Instance Configuration page:


g. Click Next on the Disk Space Requirements page.

h. On the Server Configuration page enter a service account for SQL Server Analysis Services. The following screenshot shows a domain user account, but for an Analysis Services server in VertiPaq mode, you can also use the default virtual account provided by Setup.


i. On the Analysis Services Configuration page, enter the name of an administrator for Analysis Services. To ensure that you are (or the account you are using is) an Analysis Services Administrator, click Add Current User. Click Add if you want to add additional users as administrator. You should add anyone who will perform processing operations on the server. Also select BI Semantic Model as your Analysis Services Deployment mode. Click Next when you are finished.


j. Click Next on Reporting Services Configuration page.


k. Click Next until you are on Ready to install page, click Install button.

l. Close the SQL Setup window.

6. Create a Reporting Services application

a. Click SharePoint Central Administration > Application Management > Manage service applications.

b. Click New > SQL Server Reporting Services Service Application.


c. Specify a name for the service application and an application pool that this service will run under. While you can use an existing application pool such as Web Services Default, consider creating a new application pool to isolate Reporting Services from other web services like Excel Services and Secure Store Service.


d. On the same page, specify the Database Server to use in provisioning the service application database. SharePoint’s database server is the SQL Server Database Engine instance that was installed as a ‘POWERPIVOT’ named instance. To use this server, enter <ServerName>\POWERPIVOT. Ensure that Windows authentication (default) is selected. Also make sure the check box for the Web Application you wish to enable Crescent on (in this case SharePoint-80) is selected under Web Application Associations. Click OK


e. After clicking the OK button:



f. Your installation is complete and your server is configured and ready to use.

To add different content types



Click Yes and OK


On the same page under content type click


Add Different Content Types


To verify you added the content type, go back to the library and click New Document on the Documents area of the library ribbon. You should see BI Semantic Model Connection File in the New Documents list-


Click on BISM Connection – SSAS tabular Instance –


PowerPivot Workbook.

Workbook URL – http://localhost/PowerPivot Gallery/Book1.xlsx