TI Shell

Eric
MVP
Posts: 373
Joined: Wed May 14, 2008 1:21 pm
OLAP Product: TM1
Version: 9.4
Excel Version: 2003
Location: Chicago, IL USA

TI Shell

Post by Eric »

Polling the user base for thoughts. Is this idea genius or overkill?

I am building a "Shell" TI process to execute all/most of my processes. Partial detail below.

The idea is to build most TIs as functions so they can be reused and abused, and then have a master TI to execute all of the functions.


Process: Shell
Parameters: Process_Name, Process_Parameters



Flow
Execute Shell
   Input Process_Name (example: "Function - Save")
   Input Process_Parameters (example: NULL) - optional, depending on the process being executed
   Process_Status = ExecuteProcess(Process_Name | Process_Parameters);
      SaveDataAll;
   *End Process Function - Save
   ExecuteProcess('Function - Error', 'Function - Error Parameters');
      Build text file for body of email in TI
      Process_Status = ExecuteProcess('Function - Email', 'Function - Email Parameters');
         Build email in TI
         Send email in TI
      *End Process Function - Email
   *End Process Function - Error
*End Process Shell
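
For the Shell Prolog itself I'm picturing something like the rough sketch below - all names are illustrative, and it assumes the optional parameter comes through as a single name/value pair rather than one packed string:

Code:

# Prolog of the 'Shell' process (sketch only; all names illustrative)
# pProcessName - process to run, e.g. 'Function - Save'
# pParamName / pParamValue - one optional string parameter pair to pass through

IF( TRIM( pParamName ) @= '' );
   nStatus = ExecuteProcess( pProcessName );
ELSE;
   # Assumes the called process declares a single string parameter
   nStatus = ExecuteProcess( pProcessName, pParamName, pParamValue );
ENDIF;

# Hand off to the error/email functions if the call did not finish cleanly
IF( nStatus <> ProcessExitNormal() );
   ExecuteProcess( 'Function - Error', 'pCaller', pProcessName );
ENDIF;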


Please comment. Criticism is welcome.

Thanks in Advance.
Regards,
Eric
Blog: http://tm1-tipz.blogspot.com
Articles: http://www.google.com/reader/shared/use ... /label/TM1


Production: 32 bit 9.0 SP2, Windows 2000 Advanced Server. Web: 32 bit 9.0 SP2, Windows 2000 Server. Excel 2003
Mike Cowie
Site Admin
Posts: 482
Joined: Sun May 11, 2008 7:07 pm
OLAP Product: IBM TM1/PA, SSAS, and more
Version: Anything thru 11.x
Excel Version: 2003 - Office 365
Location: Alabama, USA
Contact:

Re: TI Shell

Post by Mike Cowie »

Hi Eric,

I have used a master process approach (sometimes in place of listing the processes in a Chore) when running a series of other processes that, combined, make up an overall business process. I haven't done a more generic Shell process, but I can say that the approach I have used has helped control the overall flow through steps and allowed me to do certain things if one or more steps fail, in a way that is relatively easy to follow in TI script (at least as much as TI script can be).

I think your question regarding a Shell process depends largely on what you're trying to gain out of the whole approach. There are certainly some advantages to having a master process control the execution of other processes as I mentioned above, since you can do something more meaningful with the information returned in Process_Status in terms of error messages. I would see this as being useful for a data load or more significant process where you always want to check for error messages and send emails if there's a problem without repeating the same block of TI script everywhere, but maybe not for really small/simple processes.
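
For example, the email step only has to be written once in the master process. One common way - sketched below, assuming a command-line mailer such as blat is on the server; paths, addresses and flags are just illustrative - is to dump the body to a text file with ASCIIOutput and shell out with ExecuteCommand:

Code:

# Sketch: write the email body and send it via a command-line mailer (illustrative)
ASCIIOutput( 'D:\TM1\Logs\load_status.txt', 'Load failed in process: ' | sFailedProcess );
sCmd = 'D:\Utils\blat.exe D:\TM1\Logs\load_status.txt -to admin@example.com -subject "TM1 load failure" -server smtp.example.com';
ExecuteCommand( sCmd, 1 );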

One of the things this Shell approach will do is to make your server message log a little harder to navigate since it will reflect entries for all processes that you've run - the message log isn't the prettiest thing to begin with, but run some nested TI processes and it gets really tough to follow what started when and as part of which process.

Another thing to keep in mind with this approach is some of the possible difficulty with parameter passing. First, you'll need to put enough parameter placeholders in your Shell process to accommodate the largest number of parameters across all of the processes you plan to call. Then, there's a real lack of any ability to validate the passed parameters in terms of type, name or number - for example, if you run a process directly from Server Explorer that has a numeric parameter, TI will tell you right away if you entered something that isn't a number. Unfortunately, there aren't TI functions that I know of that allow you to test this information before trying an ExecuteProcess. If you get something wrong the error will occur in your Shell process and you may not be able to catch it in the Shell process (from the documentation):
The parameter names passed in the ExecuteProcess function are matched at runtime against the parameter names specified in the process to be executed. If the passed names cannot be found in the parameter list of the process to be executed, a serious error results, causing the immediate termination of the process from which ExecuteProcess is called.
Because you have no real control over the ExecuteProcess return codes/information (unless you feel like dealing with Global Variables) there is only so much you can do to "trap" errors in a meaningful way in your Shell process. In an ideal world you'd be able to run a process from your Shell process and be able to get rich information about what the other process did, but with TI you may need to build your other processes called by Shell in a way that helps them interface with/provide feedback to your Shell process.
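
To be clear, you can at least branch on the return value along these lines (a sketch; variable names are just for illustration) - it's getting anything richer than that back out of the called process that pushes you toward global variables:

Code:

# Sketch: classify the ExecuteProcess return value in the Shell process
nStatus = ExecuteProcess( pProcessName );

IF( nStatus = ProcessExitNormal() );
   sResult = 'Completed normally';
ELSEIF( nStatus = ProcessExitMinorError() );
   sResult = 'Completed with minor errors - see the process error file';
ELSEIF( nStatus = ProcessExitByQuit() % nStatus = ProcessExitByChoreQuit() );
   sResult = 'Stopped by ProcessQuit / ChoreQuit';
ELSE;
   sResult = 'Serious error or aborted during Prolog';
ENDIF;

# Anything richer has to come back via globals declared in both processes, e.g.:
# StringGlobalVariable( 'sShellFeedback' );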

Finally, one security concern I would have: if a user had READ access to the Shell process they could then run ANY other TI process on your server, provided they knew the process name and parameters. Is this really likely? Probably not, but that's what you'd be allowing if you created a Shell process like this and gave a user READ access to it.

So, I guess most of my points have more to do with limitations within TI rather than your own approach, but they are things to keep in mind.

Regards,
Mike
Mike Cowie
QueBIT Consulting, LLC

paulsimon
MVP
Posts: 808
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: PA 2.0.5
Excel Version: 2016
Contact:

Re: TI Shell

Post by paulsimon »

Eric

I've done something like this but I took things up a level. I used a control cube to drive it.

This had three dims

Master Process - Just a placeholder that serves the same function as a Chore Name
Process Number - Say Process 1 - 20 or whatever maximum you need to run together
Process Measures - This has:

Process Name
Process Parameters as pairs of Param Name and Value repeated up to the max required
Flag to get result
Setting to say whether the Master Process should continue to run other processes if a particular process errors. E.g. you might want to halt everything if a dim build fails but carry on if one data load fails. This can be tuned to act on serious or minor errors.

I then have a TI Process called Run Master Process. This takes a single parameter of the name of the Master Process to be run.

The TI Process loops through all the processes. It works out how many parameters they have. It executes each process and passes the parameters. It checks the results and works out whether to treat them as success or failure.

It uses some logic to work out how to call processes with different numbers of parameters.
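
In rough terms the loop looks like the sketch below - cube, measure and variable names are purely illustrative, and only two parameter pairs are shown:

Code:

# Sketch of the 'Run Master Process' Prolog, assuming a control cube
# 'Sys Master Process' with dims Master Process, Process Number, Process Measures
nProc = 1;
nMax  = DIMSIZ( 'Process Number' );

WHILE( nProc <= nMax );
   sProcNum = DIMNM( 'Process Number', nProc );
   sName    = CellGetS( 'Sys Master Process', pMasterProcess, sProcNum, 'Process Name' );

   IF( TRIM( sName ) @<> '' );
      sP1Name = CellGetS( 'Sys Master Process', pMasterProcess, sProcNum, 'Param 1 Name' );
      sP1Val  = CellGetS( 'Sys Master Process', pMasterProcess, sProcNum, 'Param 1 Value' );
      sP2Name = CellGetS( 'Sys Master Process', pMasterProcess, sProcNum, 'Param 2 Name' );
      sP2Val  = CellGetS( 'Sys Master Process', pMasterProcess, sProcNum, 'Param 2 Value' );

      # ExecuteProcess cannot take a variable-length argument list,
      # so branch on how many parameter names are populated
      IF( sP1Name @= '' );
         nStatus = ExecuteProcess( sName );
      ELSEIF( sP2Name @= '' );
         nStatus = ExecuteProcess( sName, sP1Name, sP1Val );
      ELSE;
         nStatus = ExecuteProcess( sName, sP1Name, sP1Val, sP2Name, sP2Val );
      ENDIF;

      # Halt the run if this step failed and the cube flags it as critical
      IF( nStatus <> ProcessExitNormal() & CellGetN( 'Sys Master Process', pMasterProcess, sProcNum, 'Halt On Error' ) = 1 );
         ProcessQuit;
      ENDIF;
   ENDIF;

   nProc = nProc + 1;
END;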

The advantage of this approach is that it is much easier to re-use processes. For example we have a series of data loads from different GLs. They all have the same structure but the database name has to be different. I have just one process, which dynamically changes the data source SQL to include the appropriate database. The name of the database is passed as a parameter. In the control cube I can then enter the name of the load process and specify the parameter. I can then copy and paste this down and change the database name appropriately. This is easier than a Chore, where I would need to specify the parameter value of the database name on each process, and this would be hidden from view so that I could not tell what the Chore was doing without re-checking each of the parameters individually.
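
The guts of that are just a Prolog along these lines (the DSN, table and column names below are made up for illustration):

Code:

# Prolog sketch of the single reusable GL load; pDatabase is the only thing
# that changes between source GLs
DatasourceNameForServer = 'GL_DSN';
DatasourceNameForClient = 'GL_DSN';
DatasourceQuery = 'SELECT account, period, amount FROM ' | pDatabase | '.dbo.gl_balances';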

The message log is a little harder to interpret, but I tend to look at the logfiles directory anyway to check for processerror files.

I have sets of functions that I use such as:

Copying data from one version to another
Zeroing out
SaveData

etc

Regards


Paul Simon
Eric
MVP
Posts: 373
Joined: Wed May 14, 2008 1:21 pm
OLAP Product: TM1
Version: 9.4
Excel Version: 2003
Location: Chicago, IL USA

Re: TI Shell

Post by Eric »

MIKE
advantages to having a master process control the execution of other processes as I mentioned above, since you can do something more meaningful with the information returned in Process_Status in terms of error messages.
Exactly! I intend on using the status as a trigger for other processes.
your server message log a little harder to navigate
True, but I think it is more of a learning curve; you get used to it after a while.
difficulty with parameter passing
Yeah, I came across this. Currently set it up as a string that will allow anything. Kind of ugly and no way to validate. Still need to work on this.
If you get something wrong the error will occur in your Shell process and you may not be able to catch it in the Shell process
Not sure I understand what you mean. Doesn't the following help with that issue?

Code:

Process_Status=ExecuteProcess(Process_Name|Process_Parameters);
there is only so much you can do to "trap" errors in a meaningful way in your Shell process
I haven't tested, but was hoping SetChoreVerboseMessages(Flag); would help with that.
Security concern
Good point, didn't even think of that. Probably will limit who has access to the Shell; too bad read-only rights allow the user to open the process. I could have an IF statement with a "password" as a parameter.
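Something crude like this at the top of the Shell Prolog is what I mean (the key value is obviously just an example, and it's not real security):

Code:

# pKey would be an extra parameter on Shell; bail out if the caller doesn't know it
IF( pKey @<> 'SomeSharedKey' );
   ProcessQuit;
ENDIF;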


PAUL
I've done something like this but I took things up a level. I used a control cube to drive it.
Interesting idea... need to ponder it a little more.
If uses some logic to work out how to call processes with different numbers of parameters.
Obstacle I will be facing. Any help would be appreciated.
The advantage of this approach are that it is much easier to re-use processes
My thoughts exactly. Why reinvent the wheel over and over and over.
I have just one process, which dynamically changes the data source SQL to include the appropriate database
Slick!
Regards,
Eric
Blog: http://tm1-tipz.blogspot.com
Articles: http://www.google.com/reader/shared/use ... /label/TM1


Production: 32 bit 9.0 SP2, Windows 2000 Advanced Server. Web: 32 bit 9.0 SP2, Windows 2000 Server. Excel 2003