Using Cognos Command Center for Cognos TM1
IBM Cognos Command Center is very effective at executing many tasks within Cognos TM1 (also known as Planning Analytics). In fact, we have completely replaced the use of chores in multiple TM1 implementations and moved all of those activities into Command Center. One thing that many customers don't realize is that a single Command Center process can span multiple TM1 server instances.
What is IBM Command Center?
Command Center is an application IBM acquired to help execute administration and sustainment activities across IBM and non-IBM applications. It was heavily used by Hyperion customers prior to IBM's acquisition. Within Command Center you create "Ecosystems," which represent businesses or logical groups where security should be applied. For example, an Ecosystem can be given access to one TM1 instance or to several. Within an Ecosystem you create "Processes" to execute tasks such as loading data or importing dimensions by running TI processes or chores.
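To make the terminology concrete, here is a minimal sketch of how these objects relate. Command Center defines Ecosystems and Processes through its own interface, not a Python API, so all of the class and field names below are hypothetical stand-ins for illustration only.

```python
from dataclasses import dataclass, field

# Hypothetical model for illustration -- Command Center builds these
# objects in its UI; there is no public Python API like this.

@dataclass
class Task:
    """One step in a process, e.g. running a TI process on an instance."""
    instance: str
    ti_process: str

@dataclass
class Process:
    """An ordered series of tasks executed within one Ecosystem."""
    name: str
    tasks: list = field(default_factory=list)

@dataclass
class Ecosystem:
    """An Ecosystem scopes security to a set of TM1 instances."""
    name: str
    tm1_instances: list = field(default_factory=list)
    processes: list = field(default_factory=list)

# Example: one Ecosystem with access to two TM1 instances,
# holding one process that runs a TI data load.
eco = Ecosystem("Finance", tm1_instances=["Planning", "Reporting"])
load = Process("NightlyLoad", tasks=[Task("Planning", "ImportGL")])
eco.processes.append(load)
```

The key point is the containment: security is granted at the Ecosystem level, so every process inside it can reach any of that Ecosystem's TM1 instances.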
Linking TM1 Instances
By assigning multiple TM1 instances to an Ecosystem, you can build a Command Center process that executes tasks across all of them. For example, a single process can export a file out of Instance 1, import that file into Instance 2, export a file out of Instance 2, import it into Instance 3, update timestamps in all three applications, and then send out an email if errors are encountered. The execution is seamless: there is no downtime waiting for a file to be dropped and then polling for it. Each task is executed as soon as the previous one completes.
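The cross-instance chain described above amounts to running an ordered list of tasks and stopping with a notification on the first failure. Here is a minimal sketch of that control flow; `run_ti` and `send_email` are hypothetical stand-ins for the tasks Command Center would actually execute (TI processes and an email step), and the instance and process names are made up.

```python
# Hypothetical stand-in: in Command Center this task would actually
# launch a TI process on the named TM1 instance.
def run_ti(instance, process):
    print(f"{instance}: running {process}")
    return True  # success

# Hypothetical stand-in for a Command Center email task.
def send_email(message):
    print(f"EMAIL: {message}")

def run_chain(tasks):
    """Run tasks back to back; email and stop on the first failure."""
    for instance, process in tasks:
        if not run_ti(instance, process):
            send_email(f"Failed: {process} on {instance}")
            return False
    return True

chain = [
    ("Instance1", "ExportSales"),    # export a file out of Instance 1
    ("Instance2", "ImportSales"),    # import it into Instance 2
    ("Instance2", "ExportActuals"),  # export a file out of Instance 2
    ("Instance3", "ImportActuals"),  # import it into Instance 3
    ("Instance1", "UpdateTimestamp"),
    ("Instance2", "UpdateTimestamp"),
    ("Instance3", "UpdateTimestamp"),
]
ok = run_chain(chain)  # each task starts as soon as the previous one ends
```

Because each step begins the moment the prior one finishes, there is no file-watching or scheduled gap between instances.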
Splitting TM1 Processes Across Multiple Processors
Large applications may have processes that take a long time to complete. For example, assume a data import into your application takes 10 minutes to run. In Command Center you can create a task called a "Parallel Group" that kicks off multiple TI processes at the same time, allowing each TI process to import a subset of the data concurrently and thereby speeding up the import. As a simple example, assume you regularly need to import a full year's worth of data. You can have four tasks running at the same time that import data for Q1, Q2, Q3, and Q4. Assuming each quarter holds roughly the same amount of data, in theory the 10-minute import drops to 2.5 minutes. The catch is that you will need four available cores. Take this even further and break the imports out by the 12 periods to speed things up more; in that scenario you would need 12 available cores, otherwise the processes will queue behind one another, defeating the purpose of splitting up the data imports.
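The quarterly Parallel Group can be sketched as a small concurrency example. This is not Command Center code; `import_quarter` is a hypothetical stand-in for a TI process that loads one quarter's slice of the data, and the row counts are made up.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a TI data import that loads one
# quarter's slice of the year; returns the number of rows loaded.
def import_quarter(quarter):
    return {"Q1": 250, "Q2": 250, "Q3": 250, "Q4": 250}[quarter]

quarters = ["Q1", "Q2", "Q3", "Q4"]

# With 4 workers (and 4 available cores on the TM1 server), the four
# imports overlap, so wall-clock time is roughly one quarter's import
# time instead of four: 10 minutes / 4 = 2.5 minutes.
with ThreadPoolExecutor(max_workers=4) as pool:
    rows = list(pool.map(import_quarter, quarters))

total = sum(rows)  # total rows loaded across all four imports
```

The same pattern scales to 12 monthly workers, but only if 12 cores are free; with fewer cores than workers, the extra imports simply queue, which is exactly the situation the text warns about.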