PowerShell Script to Run ETL and Cube Processing in Sequence

This PowerShell script runs each of the data warehouse ETL jobs (Extract-SM, Extract-DW, Transform, Load) in sequence, followed by the Deployment and Cube Processing jobs.

Run-ETL.ps1
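
In outline, the script imports the Service Manager data warehouse cmdlets and then walks an ordered list of jobs, starting each one and waiting for it to finish before moving on to the next. A minimal sketch of that flow, with placeholder job names (run Get-SCDWJob to see the actual names in your environment), looks something like this:

    # Sketch only; not the exact script. Job names vary per environment.
    Import-Module 'C:\Program Files\Microsoft System Center 2012\Service Manager\Microsoft.EnterpriseManagement.Warehouse.Cmdlets.psd1'

    $jobSequence = @(
        'Extract_<DW management group>',   # placeholder name
        'Extract_<SM management group>',   # placeholder name
        'Transform.Common',
        'Load.Common'
        # ...followed by the deployment and cube processing jobs in your environment
    )

    foreach ($jobName in $jobSequence) {
        Start-SCDWJob -JobName $jobName
        # Poll until the job leaves the "Running" state before starting the next one.
        while ((Get-SCDWJob -JobName $jobName).Status -eq 'Running') {
            Start-Sleep -Seconds 30
        }
    }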


  • Stopped != Not Started
    1 Posts | Last post January 01, 2016
• Quick note: if you're trying to use this script and you've manually stopped the jobs (maybe because they were hung, or whatever), you'll want to change line 19 to read:
      
          if($JobStatus -eq "Not Started" -or $JobStatus -eq "Stopped")
      
      so that the script treats manually stopped jobs the same way as jobs that finished on their own.
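      For context, here is a hedged sketch of how that check might sit in the script's start logic (illustrative, not the exact Run-ETL.ps1 source):
      
          # Treat a manually stopped job like one that never started, so the
          # script starts it instead of exiting.
          $JobStatus = (Get-SCDWJob -JobName $JobName).Status
          if($JobStatus -eq "Not Started" -or $JobStatus -eq "Stopped") {
              Start-SCDWJob -JobName $JobName
          }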
  • Can we use this script for SCSM 2012 R2?
    1 Posts | Last post November 16, 2015
    • As I am having a transcriptional error with one of my clients and need to assist them with this, I would appreciate your support. Thanks
  • Partially Processed Cubes and Schema changes pending
    1 Posts | Last post October 22, 2014
    • Hi All,
      I have successfully executed the Run-ETL.ps1 script. After all the jobs ran successfully, I checked the status of the cubes. For the Change and Activity Management cube, the status showed as Partially Processed.
      
      For the Service Manager WorkItems and Service Catalog Library cubes, the Schema Changes Pending status was 'Yes'.
      
      Could you let me know the solution? Will this cause issues when generating up-to-date reports?
      Thanks in advance.
  • Will there be an impact on performance?
    2 Posts | Last post January 02, 2013
    • First of all, I would like to thank Travis for the ETL script, which gets data into the cubes as soon as the ETL process completes.
      However, I am concerned about Service Manager's performance when using the ETL script.
      We have 32 GB of RAM and 4 CPU cores on the SM DB and SM DW DB servers.
      Please tell me: if I use the ETL script in our customer environment, will there be any impact on performance?
      
      Thanks
    • Not any more than ETL running by itself. The script runs the exact same jobs that are provided out of the box. The only difference is that they are run sequentially, one after another, with practically no wait time in between.
  • Log
    4 Posts | Last post December 11, 2012
    • Is there a log to see why it keeps giving me the error "Exiting since the job is in an unexpected status"? I just want a better idea of why the cube processing doesn't run at all. It runs but never ends; I left one job going for over 3 days with the rest of the cube processing jobs disabled. Just trying to find my way around the 2012 environment. Thanks
    • Travis, I would also like to know more about the "Exiting since the job is in an unexpected status" error. I can't get past it, and there is nothing in the logs.
      
      <bump>
    • It looks like the Run-ETL.ps1 script throws the "Exiting since the job..." error whenever a job is in any state other than "Not Started". Once that is corrected, you'll be able to run the whole script, but it may stall on a cube that is in a "bad" state. I'm still trying to correct this (it looks like the data keys have not been processed in all the cubes).
    • Doug is correct. The script is pretty simple. If it finds that the job it is supposed to try next is in the "Not Started" state, it will start it. "Not Started" is a "good" status. If it finds the job in the "Running" status, it will wait 30 seconds for it to complete and then try again. If the job is in any other status, such as "Failed", it will write the error message "Exiting since the job is in an unexpected status" and quit.
      
      You'll then want to take a look at the job statuses and see what is going on. Check the Operations Manager event log on the DW management server for errors thrown by the jobs. Cube processing requires a lot of RAM/CPU on your DW management server, especially if you are using SQL Standard.
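      As a hedged illustration of that logic (a sketch, not the exact Run-ETL.ps1 source; Invoke-DWJob is a made-up helper name), the per-job handling amounts to something like:
      
          # Start a job only from "Not Started", quit on any status other than
          # "Not Started" or "Running", then poll every 30 seconds until done.
          function Invoke-DWJob([string]$JobName) {
              $JobStatus = (Get-SCDWJob -JobName $JobName).Status
              if ($JobStatus -eq "Not Started") {
                  Start-SCDWJob -JobName $JobName       # "good" status: kick it off
              } elseif ($JobStatus -ne "Running") {
                  Write-Host "Exiting since the job is in an unexpected status"
                  exit                                  # e.g. "Failed"
              }
              # Wait out the run; a completed DW job drops back to "Not Started".
              while ((Get-SCDWJob -JobName $JobName).Status -eq "Running") {
                  Start-Sleep -Seconds 30
              }
          }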
  • Update cmdlet path for SCSM 2012 RTM
    3 Posts | Last post December 11, 2012
    • Not really a question, but more of a request :).
      
      The cmdlet path needs to be updated for RTM; the script currently refers to the pre-RTM location.
    • Just to confirm and extend what Bryan mentioned: if you have a clean install of SCSM 2012, the path at the top of the script needs changing from:
      
      Import-Module 'C:\Program Files\Microsoft System Center\Service Manager 2012\Microsoft.EnterpriseManagement.Warehouse.Cmdlets.psd1'
      
      To:
      
      Import-Module 'C:\Program Files\Microsoft System Center 2012\Service Manager\Microsoft.EnterpriseManagement.Warehouse.Cmdlets.psd1'
      
    • Thanks for pointing this out, Graham/Bryan. I've updated the path and uploaded the new version.
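      For anyone keeping a local copy, one hedged way to cope with both layouts is to probe for whichever path exists and import that (the two paths are the ones quoted above; the rest is illustrative):
      
          # Prefer the SCSM 2012 RTM path; fall back to the pre-RTM location.
          $candidatePaths = @(
              'C:\Program Files\Microsoft System Center 2012\Service Manager\Microsoft.EnterpriseManagement.Warehouse.Cmdlets.psd1',
              'C:\Program Files\Microsoft System Center\Service Manager 2012\Microsoft.EnterpriseManagement.Warehouse.Cmdlets.psd1'
          )
          $modulePath = $candidatePaths | Where-Object { Test-Path $_ } | Select-Object -First 1
          if (-not $modulePath) { throw "Service Manager warehouse cmdlets not found." }
          Import-Module $modulePath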