TM1 Server RAM

camembert
Posts: 10
Joined: Mon Jun 18, 2012 11:45 am
OLAP Product: TM1
Version: 10.1.1
Excel Version: 2010

TM1 Server RAM

Post by camembert »

Hi All,

I've just joined an end-user company that operates several TM1 servers for around 1,500 worldwide users, all running exactly the same financial planning application but each covering a geographical scope, so there are rarely more than 200 declared users per server. Those "regional" servers then consolidate their data into a single consolidation server.

All the servers have at least 288GB RAM, but most are over 400GB with a maximum of 512GB. Even with that sizing we are seeing memory peaks of over 80%, so we are having to plan upgrades to 1TB.
That seems excessive to me for a budgeting and forecasting application, at least based on my previous experience.

My questions therefore are pretty general:
- In your experience, what is the largest setup you've seen, both in terms of users and in terms of RAM, and was there any justification for it (e.g. a particularly complex model, or simply a very badly designed one)?
- For financial planning applications with 150-200 users, what was the typical amount of RAM on the servers?

I simply want to get a feel for whether there are other companies out there with servers that NASA would be proud of :)

Thanks
declanr
MVP
Posts: 1815
Joined: Mon Dec 05, 2011 11:51 am
OLAP Product: Cognos TM1
Version: PA2.0 and most of the old ones
Excel Version: All of em
Location: Manchester, United Kingdom

Re: TM1 Server RAM

Post by declanr »

It entirely depends on the model.
What level of granularity is forecast, how many versions are held, how heavily rules and feeders are used, which interface is used (web will also take up RAM), how much history of actuals is held in the system, etc. etc. etc.

So it's bigger than a lot of models, but based purely on user numbers it would be impossible to say whether it's "higher than should be expected".
Declan Rodger
Gabor
MVP
Posts: 170
Joined: Fri Dec 10, 2010 4:07 pm
OLAP Product: TM1
Version: [2.x ...] 11.x / PAL 2.0.9
Excel Version: Excel 2013-2016
Location: Germany

Re: TM1 Server RAM

Post by Gabor »

For pure TM1 I always start by looking at the ratio of the stored volume in the DB folder (excluding feeder files) compared to the size in RAM.
1:1 is OK; anything where the RAM usage is larger than the size on disk is at least worth thinking about optimizations.
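
As a rough way of running that check, here is a minimal Python sketch (the data directory path is a placeholder, and the per-process RAM figure has to come from your own monitoring) that totals the size of the data directory excluding feeder files:

```python
# Rough disk-vs-RAM comparison along the lines described above: total the size of
# the TM1 data directory, excluding feeder files, then compare the result with the
# memory consumed by the tm1s process for that instance.
import os

DATA_DIR = r"D:\TM1\Planning\Data"   # placeholder: your server's data directory

data_bytes = 0
feeder_bytes = 0
for root, _dirs, files in os.walk(DATA_DIR):
    for name in files:
        size = os.path.getsize(os.path.join(root, name))
        if name.lower().endswith(".feeders"):
            feeder_bytes += size
        else:
            data_bytes += size

GB = 1024 ** 3
print(f"Data on disk (excluding feeders): {data_bytes / GB:.1f} GB")
print(f"Feeder files:                     {feeder_bytes / GB:.1f} GB")
# A RAM footprint far above the on-disk figure suggests rules/feeders (or cached
# views) are inflating the in-memory size, which is where optimization effort pays off.
```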
stephen waters
MVP
Posts: 324
Joined: Mon Jun 30, 2008 12:59 pm
OLAP Product: TM1
Version: 10_2_2
Excel Version: Excel 2010

Re: TM1 Server RAM

Post by stephen waters »

Camembert
As Declan says, it depends on the model, but 400 Gb for a 200 user planning and budgeting model does seem very high compared to most of our clients. Our largest models are around that size but these are extremely high volume insurance modelling with actuarial calculations going forward up to 20 years! Most of the models are much smaller.

Without giving away any commercially sensitive details, can you give an idea of:
- what sort of calcs you are doing (e.g. allocations, complex phasing with profiles, salary modelling)
- the level at which you are inputting/calculating data and the volumes (e.g. weekly sales qty and price for 20,000 SKUs, or 100 account lines at monthly level for 200 cost centres)
- the amount of history and number of versions (e.g. 5 years' history, 2 years of budget data, 5 budget/forecast versions per year)

Have you queried this with the developers of the system?
camembert
Posts: 10
Joined: Mon Jun 18, 2012 11:45 am
OLAP Product: TM1
Version: 10.1.1
Excel Version: 2010

Re: TM1 Server RAM

Post by camembert »

Thanks for your replies. I tried to word my question to make it more of a general survey as opposed to a "depends on how big your model is" debate :) I'm not sure, though, whether to be comforted or not by the lack of replies saying there's nothing unusual about such a large config...

Stephen, to give you a quick overview of the main features of the model, we have P&L by Product, P&L by Customer and Costs cubes in which forecast data is input and calculated on a monthly basis.
Actual data is imported from the ERP system; after that the user goes through their TM1 Websheet to run processes that copy from a frozen version to a working version, then inputs and allocates data (e.g. from generic product level down to detailed product level).
There are 2 years of history plus the current year.
Nothing particularly unusual so far...

However... I'm only just starting to get to grips with the model, but following your reply I discovered that there are over 50 possible scenarios/versions.
For example, the "January forecast" version receives actual data imported in January, then the user forecasts the next 11 months.
The following month a new version is used, i.e. "February forecast", into which Jan & Feb actual data is imported, and the user then forecasts the next 10 months.
And so on, until the end of the year, by which point the system is storing 12 different versions.
Each version is ruled, and the data is never made static or purged...
Add to that the other versions for prebudget, budget, plan etc. and the volume starts racking up.
Also, the product dimension (containing a few thousand products) has around 30 attributes set up; I don't know if that can have any impact.
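
Purely as an illustration of how quickly that multiplies (the figures below are invented, not the real model's dimensions), a back-of-envelope sketch:

```python
# Back-of-envelope illustration of how never-purged, ruled versions multiply volume.
# Every figure here is an invented assumption, not the actual model.
products   = 5_000      # "a few thousand products"
accounts   = 100        # P&L lines
months     = 36         # 2 years' history plus the current year
fill_rate  = 0.05       # fraction of intersections actually populated

cells_per_version = products * accounts * months * fill_rate

for versions in (3, 12, 50):
    cells = cells_per_version * versions
    # ~100 bytes per stored numeric cell is a rough ballpark guess; ruled and fed
    # cells cost more again once the feeder flags are counted.
    print(f"{versions:>3} versions: ~{cells:,.0f} cells, ~{cells * 100 / 1024**3:.1f} GB")
```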

The larger servers come back from a restart with 150-200GB of RAM already used, so it seems we can't blame the RAM usage solely on daily operational activity.

The developers aren't much help, they are paid just to.... develop.... :roll:
Wim Gielis
MVP
Posts: 3123
Joined: Mon Dec 29, 2008 6:26 pm
OLAP Product: TM1, Jedox
Version: PAL 2.0.9.18
Excel Version: Microsoft 365
Location: Brussels, Belgium

Re: TM1 Server RAM

Post by Wim Gielis »

camembert wrote:The developers aren't much help, they are paid just to.... develop.... :roll:
Not entirely, they are paid to develop in a good way...

What you describe seems to me to rely too heavily on rules and feeders. Bringing in Actuals with TI is fine, stored against the Actuals element in the Scenario dimension.
But then use TI again to copy to the other intersections; you probably don't want to use rules and feeders there.
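
As a minimal sketch of that copy-with-a-process pattern (using the TM1py Python library here rather than a TI process; cube, view and element names are invented, and the method names should be checked against your installed TM1py version):

```python
# Sketch only: copy stored values from a frozen version into the working version as
# static data, instead of deriving the working version through rules and feeders.
# Cube, view and element names are invented; TM1py method names are from memory.
from TM1py.Services import TM1Service
from TM1py.Utils import Utils

SOURCE_CUBE = "PL by Product"
SOURCE_VIEW = "Frozen Version Export"   # zero-suppressed view sliced on the frozen version
TARGET_VERSION = "Working Forecast"
VERSION_DIM_INDEX = 0                   # assumes Version is the cube's first dimension

with TM1Service(address="tm1server", port=8010, user="admin", password="***", ssl=True) as tm1:
    source_cells = tm1.cubes.cells.execute_view(SOURCE_CUBE, SOURCE_VIEW, private=False)

    target_cells = {}
    for unique_names, cell in source_cells.items():
        if cell["Value"] is None:
            continue
        # Assumes the view returns coordinates in the cube's dimension order.
        elements = list(Utils.element_names_from_element_unique_names(unique_names))
        elements[VERSION_DIM_INDEX] = TARGET_VERSION   # re-point the Version coordinate
        target_cells[tuple(elements)] = cell["Value"]

    # Plain stored values land in the working version; no rules or feeders involved.
    tm1.cubes.cells.write_values(SOURCE_CUBE, target_cells)
```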

Fixing calculations for past periods (making them static) is needed too in a model this large, or at least one that appears large in data volume.
Best regards,

Wim Gielis

IBM Champion 2024
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
lotsaram
MVP
Posts: 3661
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: TM1 Server RAM

Post by lotsaram »

I would say you really, really haven't given enough detail about the model for anyone to know definitively whether the memory is excessive for what the model is doing. That's more than likely the reason replies are thin on the ground.

For a TM1 model of over 100 GB I would expect a "big data" kind of model (to use a cliché that I detest about as much as Stephen Few does). That is more than just financial planning and reporting: for a model of that data volume I would expect to see at least a few "large" dimensions with more than 100K elements, and some detailed reporting with a degree of time retention, such as retail reporting by location by SKU by week, or customer sales over a very large customer dimension. Anything with close-to-transactional, but not quite transactional, data. Without something at that level of detail, or a genuinely enterprise-wide application covering many business areas with dozens or hundreds of cubes, it's difficult to understand how you get to the data volume you describe.

Although the details you gave are fairly scant, it does sound like there are significant issues with data redundancy arising from a versioning concept that sounds "sub-optimal". I wouldn't want to say more without looking at the model.
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.
stephen waters
MVP
Posts: 324
Joined: Mon Jun 30, 2008 12:59 pm
OLAP Product: TM1
Version: 10_2_2
Excel Version: Excel 2010

Re: TM1 Server RAM

Post by stephen waters »

Mmmm.... A few things you mention make me think.
camembert wrote:we have P&L by product, P&L by Customer and Costs cubes in which forecast data are input and calculated on a monthly basis.
How many elements are in the product and customer dimensions? If you are getting up into the tens of thousands you will need to be careful with design and operation. Also, are the product and customer cubes dimensioned by the full P&L dimension, or by a smaller dimension with just the lines appropriate to customer and product level?
We often find that inexperienced developers (frequently from an IT background) will use too many dimensions and, through not understanding the business side, will assume the same dimensions apply to all data types.
camembert wrote:Actual data is imported from the ERP system, after that the user goes through his TM1 Websheet to run processes to copy from a frozen to a working version, input and allocate (e.g. from generic product level to detailed product level) data.
Linking to my queries above, is the allocation appropriate? We sometimes find clients wanting to allocate down to a meaningless level, which creates vast amounts of meaningless data and slows the system down.

You mentioned 51 versions? Even if you do need them in the central reporting cube (which seems unlikely), do you need them all in the planning sub-systems? It sounds like a housekeep of versions would be a good, quick first step.
camembert wrote:The developers aren't much help, they are paid just to.... develop....
If true, this is your biggest problem. It sounds like they do not know what they are doing. Before you commit hundreds of thousands of USD to Cray-style servers (hardware and extra PVU licenses?), get an experienced TM1 partner in for a few days to come up with some ideas. Not sure where you are, but find someone local, preferably a business partner with an Applix background and experience of this type of system. They could spend 2 days doing a review and either
(i) confirm you have a mega system that needs to be the size it is
or
(ii) highlight areas you could address to make it more efficient and smaller.


Best of luck