OLAP Cubes in the SCSM Data Warehouse: Troubleshooting
Published Feb 15 2019 10:08 AM
First published on TECHNET on Feb 03, 2012

This is the seventh post in a series detailing the capabilities of OLAP cubes in the Data Warehouse, a new feature added in SCSM 2012 that enables self-service reporting.

Common Issues in Debugging/Troubleshooting

Processing Failures:

Although safeguards exist in the DWRepository to ensure the integrity of the data, they cannot completely prevent processing errors. The most common processing error is a DimensionKeyNotFound exception. Since the AS dimensions are processed every 60 minutes by default (a user-configurable value), it is possible that the dimension keys do not yet exist when the fact's measure group is processed. In this case, the processing logic will by default reprocess the AS dimensions via a ProcessUpdate and then reprocess the fact up to 2 times in order to resolve the key errors.
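The retry behavior described above can be sketched as follows. This is an illustrative Python sketch, not actual SCSM code; the class and function names are assumptions made for demonstration only.

```python
class DimensionKeyNotFound(Exception):
    """Stand-in for the AS key-not-found processing error."""

def process_measure_group(facts, known_dim_keys):
    # Fail if any fact row references a dimension key that does not exist yet.
    missing = {k for row in facts for k in row["dim_keys"] if k not in known_dim_keys}
    if missing:
        raise DimensionKeyNotFound(missing)
    return len(facts)

def process_with_retries(facts, known_dim_keys, refresh_dims, max_retries=2):
    # On key errors, refresh the dimensions (analogous to a ProcessUpdate)
    # and reprocess the fact, up to max_retries times.
    attempts = 0
    while True:
        try:
            return process_measure_group(facts, known_dim_keys)
        except DimensionKeyNotFound:
            if attempts >= max_retries:
                raise
            attempts += 1
            known_dim_keys |= refresh_dims()
```

If the refreshed dimensions still do not contain the keys after the retries are exhausted, the error surfaces to the operator, which is the rare failure case discussed next.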

There are some rare scenarios where the reprocessing may fail, for one of two reasons:

  1. Only the DW Repository enforces foreign keys to ensure the integrity of the data; the Datamart has no foreign keys, for performance reasons. Since the load process bulk-moves the data from the repository to the mart via ADO, a timing issue can cause fact data to be loaded before the corresponding dimension keys. To resolve this scenario, the load process must be run again to bring over the dimension keys.
  2. In multi-mart scenarios, all the dimensions of each mart point at the primary DW Datamart. This reduces the size and processing time of the cubes. It is possible, however, for facts in the OM or CM marts to reference dimension keys that do not yet exist in the primary DW Datamart. In this case, although it is counterintuitive, you must run the Load job on the primary mart in order to resolve the processing failure for cubes pointed at the OM or CM mart.
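The second scenario can be sketched as follows; the mart and table names are assumptions for illustration, not actual SCSM schema.

```python
# Cube dimensions all read from the primary DW datamart, so facts in the
# OM/CM marts can reference user keys the primary mart has not loaded yet.
primary_dw_mart = {"UserDim": {101, 102}}   # dimension keys the cubes see
om_mart_facts = [{"UserDimKey": 103}]       # OM fact referencing key 103

def missing_keys(facts, dim_keys):
    # Keys referenced by facts but absent from the primary mart's dimension.
    return {f["UserDimKey"] for f in facts} - dim_keys

def run_load_on_primary(new_dim_rows):
    # Running the Load job on the PRIMARY mart brings the keys over,
    # resolving the processing failure for cubes pointed at OM/CM marts.
    primary_dw_mart["UserDim"] |= new_dim_rows

assert missing_keys(om_mart_facts, primary_dw_mart["UserDim"]) == {103}
run_load_on_primary({103})
assert missing_keys(om_mart_facts, primary_dw_mart["UserDim"]) == set()
```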

Troubleshooting MDX customizations of the cube

Since many of the cube customizations require a working knowledge of MDX, it is common for the initial expression to contain syntax errors, and multiple iterations may be necessary before the expression suits the user's needs. It is advisable to test the MDX expression on the cube via BIDS or SSAS (without saving the changes) before adding it to the cube via a CubeExtension or defining it in the SystemCenterCube element.

However, if the user does have an error in an MDX expression included in the MP via a CubeExtension, they can simply uninstall the cube extension, which reverts any changes made to the cube. If the expressions are defined via a SystemCenterCube element instead, the user unfortunately must uninstall the MP and then manually delete the cube from Analysis Services before making corrections and redeploying the cube MP. For this reason, it is advisable to define cube customizations via the CubeExtension element.

Deployment of a cube MP fails or Changes/Updates are NOT reflected in the cube:

The handling of deployment failures for cube MPs is very similar to the current system in place for 2010 DW MPs. A user can always retry deployment to see whether the root cause was a transient issue, such as a connection failure or a temporary disruption in the Data Access Service, that will succeed on retry. Since certain cube customizations, such as custom MDX scripts, cannot be verified for syntactic correctness at import time, these are anticipated to be a common cause of deployment failures. In this scenario, users can uninstall and subsequently reinstall the MP with the corrected MDX script. As always, more information about failures can be found in the event logs under the Data Warehouse or Deployment sources.

All cube deployment changes are atomic transactions, so any errors encountered when making changes to any AS components (Data Sources, DSV, Dimensions, Cubes) will rollback all changes made ensuring that the system is always in a consistent state. In other words, either all changes to the DB occur, or nothing occurs.
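The all-or-nothing behavior can be sketched with a simple copy-and-swap pattern; this is an illustrative Python sketch of the rollback idea, not the actual deployment implementation.

```python
import copy

def deploy(model, changes):
    # Apply every change to a working copy; swap it in only if all succeed.
    # Any exception aborts before the swap, leaving the original untouched.
    working = copy.deepcopy(model)
    for change in changes:
        change(working)
    model.clear()
    model.update(working)  # "commit": all changes become visible at once

model = {"dimensions": ["UserDim"], "cubes": []}

def add_cube(m):
    m["cubes"].append("WorkItemsCube")

def bad_change(m):
    raise ValueError("invalid MDX")

try:
    deploy(model, [add_cube, bad_change])
except ValueError:
    pass
# model is unchanged: the successful add_cube was rolled back too
```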

Keep in mind that there are no measures in place to prevent the user from manually making changes to the cube itself. If this happens, we cannot guarantee the integrity of the cube, or that automated maintenance of the cube will succeed, since the cube structure may be compromised. A user who decides to manually manipulate a cube is advised to uninstall the associated cube MP. Uninstalling a cube MP never deletes the cube from AS, but the cube metadata stored in the CMDB is deleted, so the cube will no longer be automatically processed or partitioned. The user can still view the cube from the DW workspace and open Excel to browse it.

Filtering by Dimensions Does NOT Slice or Dice the Cube:

Let us take a scenario where a user is browsing the measure group "WorkItems Assigned To User" and wants to slice on all users in a particular department. When they attempt to filter on UserDim, however, nothing happens or no data is returned. This is very confusing, because the user knows for a fact that UserDim has a relationship to this particular measure group.

In this scenario, we must remember that the same database dimension can play multiple roles in the multidimensional model. We call these dimensions role-playing dimensions. For example, the time dimension can be used multiple times in a cube describing flight information. The different role-playing dimensions in this case could be “Departure Time” and “Arrival Time”, which both target the Time dimension.

In the “WorkItems Assigned To User” example, the given role-playing name of the User dimension is actually “AssignedToUser”. If the user filtered by this particular dimension instead of “UserDim”, they would return the correct information.
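A role-playing dimension can be sketched as one user table joined to the fact under two different keys; the table and column names below are assumptions for illustration, not actual SCSM schema.

```python
# One User dimension playing two roles against the same fact table.
users = {1: "Alice", 2: "Bob"}

work_items = [
    {"AssignedToUserDimKey": 1, "CreatedByUserDimKey": 2},
    {"AssignedToUserDimKey": 2, "CreatedByUserDimKey": 2},
]

def slice_by_role(facts, role_key, user_name):
    # Filtering must go through the ROLE's key column, not the base
    # dimension; the base "UserDim" has no direct relationship here.
    key = next(k for k, v in users.items() if v == user_name)
    return [f for f in facts if f[role_key] == key]

# Slicing on the AssignedToUser role returns Alice's one work item:
assigned_to_alice = slice_by_role(work_items, "AssignedToUserDimKey", "Alice")
```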

BIDS has an extremely useful feature called the Dimension Usage tab, which shows the relationships between dimensions and cubes so the user can determine which dimensions can be used to slice and dice the cube. Expounding further on the "WorkItems Assigned To User" example in the figure below, you can see that "UserDim" has no relationship to the "WorkItemAssignedToUser" measure group, while UserDim (AssignedToUser) does have a relationship to the measure group, with UserDimKey as the join attribute. In this case, the role-playing name is shown in parentheses.

Figure: Dimension Usage View for WorkItemAssignedToUser Measure Group

At this time SCSM does not have a native Dimension Usage tab capability, so the end user will need to look at BIDS to determine exactly which dimensions can be used to filter a particular cube.

Cubes on a Remote Analysis Service Server Fail To Process

In certain cases, processing a cube on a remote AS server will fail because the firewall has not been configured properly. The default instance of AS uses TCP/IP port 2383 and this port needs to be unblocked in the firewall to allow access.

To unblock the port, run the command line instructions below:

C:\Windows\system32>set port=2383
C:\Windows\system32>netsh advfirewall firewall add rule name="Analysis Services" protocol=TCP dir=in localport=%port% action=allow
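Once the rule is in place, you can verify that the port is reachable from the DW server. The helper below is an illustrative Python sketch (the host name is an assumption); it is not part of SCSM.

```python
import socket

def port_open(host, port, timeout=3.0):
    # Attempt a plain TCP connection; True means the firewall is letting
    # traffic through to a listening service on that port.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# port_open("my-as-server", 2383)  # hypothetical remote AS host
```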

Processing is Hung

There could be multiple causes for processing hanging. When the DW and AS server are on the same machine, the systems administrator must ensure that the system has enough RAM to run the DW ETL and cube processing jobs concurrently. A few potential solutions are offered below:

1) There are known deadlock issues in 2008 Analysis Services. The workaround in this case is to increase the number of threads in the processing thread pool before processing hangs. If the system is already hung, the workaround is to restart both the System Center Management service and the AS service, and then reset the cube processing workitem to a status of 3 (not started) so the workflow engine can restart it.

To get the relevant cube processing workitem, you can run the following queries on the DWStagingAndConfig database (the queries are shown one by one for illustration; they can easily be joined into a single query):

select processId from infra.process where processname like 'Process.{CubeName}'

select batchid from infra.batch where processId = {ProcessId from previous query}

select * from infra.workitem (nolock) where BatchId = {BatchId from previous query}

update infra.workitem set statusid = 3 where workitemId = {workitemId from previous query}
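As noted, the lookup steps can be collapsed into a single statement. The sketch below only composes that joined T-SQL as a string for inspection (it does not execute anything), assuming the infra schema shown above; the function name is illustrative.

```python
def build_reset_query(cube_name):
    # Joins infra.Process -> infra.Batch -> infra.WorkItem in one update,
    # mirroring the step-by-step queries above. Composed only, not executed.
    return (
        "update wi set wi.StatusId = 3 "
        "from infra.WorkItem wi "
        "join infra.Batch b on b.BatchId = wi.BatchId "
        "join infra.Process p on p.ProcessId = b.ProcessId "
        f"where p.ProcessName like 'Process.{cube_name}'"
    )
```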

2) Check the CoordinatorExecutionMode property on the AS service and make sure it is set properly. The following forum thread highlights some of the known issues: http://social.msdn.microsoft.com/Forums/en-US/sqlanalysisservices/thread/78feaf33-a64d-49e6-a04...

DWMaintenance is Hung on the ManageCubePartitions/ManageCubeTranslations Step

In this scenario the most common cause is also a nonresponsive AS server. The workaround is the same as for hung processing (item 1) above: restart the System Center Management service and the AS service, then reset the workitem to a status of 3 (not started) so it can re-run.

To get the relevant DW maintenance workitem, you can run the following queries on the DWStagingAndConfig database (the queries are shown one by one for illustration; they can easily be joined into a single query):

select processid from infra.process where processname = 'DWMaintenance'

select * from infra.ProcessModule where ProcessId = {ProcessId from previous query} (note the ProcessModuleId where the VertexName is ManageCubePartitions/ManageCubeTranslations)

select * from infra.batch where ProcessId = {ProcessId from previous query} (note the BatchId of the largest batch)

select * from infra.WorkItem where BatchId = {BatchId from previous query}

update infra.workitem set statusid = 3 where workitemId = {workitemId for the hung step with the corresponding ProcessModuleId for ManageCubePartitions/ManageCubeTranslations}

