Planning Analytics using twice the memory of 10.1

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Planning Analytics using twice the memory of 10.1

Post by paulsimon » Fri Jul 20, 2018 8:07 pm

Hi

My major client is testing an upgrade from TM1 10.1.1 to Planning Analytics 2.0.5. However, they are finding that the server is using twice as much memory as it was in 10.1.1.

To eliminate anything to do with their model, we compared the memory used by the SDATA sample model and found the same - about 30MB in 10.1.1 and with PA 2.0.5 on a Win Server 2008R2 it is using 71MB and on a Win Server 2016 it is using about 62MB. A colleague even installed PA on an AWS Server to eliminate anything to do with their hosting provider, and still found that SDATA was using double the memory.

On my own company's server which is Win Server 2012R2 it is using 40MB. Part of the increase is due to the extra Metrics cube that was not there in 10.1.1 which accounts for 4MB. Therefore the huge increase in RAM does not seem to be happening in all cases.

On the client's real model, the total of the memory used by all cubes in the }StatsCubes cube has not changed significantly from 10.1.1 to PA, but something else is using huge amounts of memory. Even after deleting all rules on the real model it is still using twice as much memory as the equivalent in 10.1.1. Anyway the fact that we can replicate the memory issue with the SDATA sample cubes shows that it is not related to their model.

Has anyone else had similar experience?

Is there any way around this, such as a Windows setting that we have missed? As far as I can see, all the prerequisites, such as .NET 4.6.1, are installed.

Is there anyone who could share some TM1S-Log.properties debug settings to trace where the memory is going?

It has been very hard to get any support from IBM on this.

Regards

Paul

Bakkone
Posts: 63
Joined: Mon Oct 27, 2014 10:50 am
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by Bakkone » Thu Jul 26, 2018 9:35 am

Hi,

Just posting to confirm that I noted the same thing with a model after moving it to a new PA 2.0.5 environment.

I haven't had time to investigate why. But I suspected it was related to this:
http://www-01.ibm.com/support/docview.w ... wg22008561

Did you compare the }Stats control cubes as well? Without totally understanding the changes made: if the space reserved for every text cell gets increased from 32K to 64K, that could mean that all those control cubes filled with text will explode in size - for example, cubes with element attributes.

Funny that you had such large differences between Win 2016 and Win 2008, though.

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Thu Jul 26, 2018 10:30 pm

Hi Bakkone

Thanks for confirming that you had the same problem. We are seeing about a 66-100% increase in RAM usage. Even with the sample SDATA cubes IBM Support themselves have shown a 63% increase. There is clearly a problem.

We did finally get some TM1S-Log.properties settings for memory from IBM. However, they proved to be useless, as they seem to generate about 20-30 log lines per normal line, giving the pool number, each bit of memory allocated, and the memory address - in other words, nothing useful for this problem. What we would want to see is the total memory used after each item.

IBM's latest suggestion is to use Perfmon, but we have already tried that. We logged the memory at 1-second intervals, but because so much can happen in the TM1Server.log in one second, it is impossible to relate any increase in memory to a particular object. Even then, three runs did not give consistent results, pointing to different TM1 objects each time. We need something that will include the memory usage in the TM1Server.log.

We still have a situation where the total memory used by the server is far more than the total memory used by the cubes. We have no way to find out what is using all this memory.

I did look at the link you gave, but from my reading they are only talking about the maximum size for a string. No strings in our model are using anything like 32K. As I understand it, a string in TM1 is stored as a pointer into a memory heap that gives the memory address and the length of the string, which will be terminated by a return character. Therefore, if the string is, say, a 20-byte comment, the total memory usage should not be much more than 30 bytes, even allowing for pointers, etc.

We do have our suspicions about the }Stats cubes. We did notice that the }StatsByCube was reporting that one of the }Stats cubes had 1.5 trillion populated String cells, although the actual memory used was not that high. I think that this was a spurious value. We have disabled Performance Monitor in the Config, and we have run a TI process with CubeClearData for all the }Stats cubes, including the new ones introduced in PA. The cube is no longer reporting 1.5 trillion string cells. However, the total memory used by the server is still about 66-100% more than it was in 10.1. We generally run with Performance Monitor off anyway, and had only turned it on to try to investigate this problem.

Regards

Paul Simon

User avatar
Michel Zijlema
Site Admin
Posts: 701
Joined: Wed May 14, 2008 5:22 am
OLAP Product: TM1, PALO
Version: both 2.5 and higher
Excel Version: 2003-2007-2010
Location: Netherlands
Contact:

Re: Planning Analytics using twice the memory of 10.1

Post by Michel Zijlema » Fri Jul 27, 2018 9:00 am

Hi Paul,

You're probably aware, but I just would like to mention that there are some server configuration settings that can trigger the kind of RAM usage impact you mention - most notably MaximumCubeLoadThreads and the newer MTCubeLoad, MTFeeders and MTFeeders.AtStartup.
One thing you could check is whether in the latest PA release the default settings for these parameters have changed (unintentionally?). You could check what the effect is when you, for instance, explicitly add MTFeeders=F to your tm1s.cfg file...
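For instance, to rule out multi-threaded cube loading and feeder processing altogether, you could pin everything explicitly in tm1s.cfg - just a sketch, so do verify the defaults documented for your exact PA release:

```
MaximumCubeLoadThreads=0
MTCubeLoad=F
MTFeeders=F
MTFeeders.AtStartup=F
```

If memory drops after a restart with these set, you know where to look.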

Michel

lotsaram
MVP
Posts: 3094
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TM1, CX
Version: TM1 10.2.2 PA 2.0x
Excel Version: 2010 2013 365
Location: Switzerland

Re: Planning Analytics using twice the memory of 10.1

Post by lotsaram » Fri Jul 27, 2018 9:14 am

Bakkone wrote:
Thu Jul 26, 2018 9:35 am
Hi,

Just posting to confirm that I noted the same thing with a model after moving it to a new PA 2.0.5 environment.

I haven't had time to investigate why. But I suspected it was related to this:
http://www-01.ibm.com/support/docview.w ... wg22008561
The change to .cub data storage format is for sure a red herring and has nothing to do with the memory increase in v11, because
  • this only affects the storage of cubes ON DISK and not in memory
  • this only affects string cells holding very long strings
It may not be something you expect, but I'll bet $100 the extra memory is related to dimensions and not cubes. The extra memory is from "automatic hierarchization", which is applied to dimensions on server startup and on any dimension update. In all previous versions, dimensions were only hierarchized upon the first MDX (or Rest API) request involving the dimension. The "hierarchization" involves building a dictionary of the MUNs (member unique names) for the dimension, as MDX and the Rest API (in the background via MDX) both require a member reference as opposed to an element reference. For every alternate parent of an element there is a different member. Moreover, this doesn't just apply to leaves: for dimensions with deep rollups, the fact of multiple members per element applies at every level of the rollup. So within a given dimension, if the rollup is deep, and especially if there is a high degree of multiple parentage, then there can be an order of magnitude (or sometimes MANY orders of magnitude) difference between the number of elements vs. the number of members.

Building the MUN lookup takes time, and it takes memory. Sometimes it takes a lot of memory! Pre v11, if the dimension was never involved in an MDX query, then you were never hit with this extra memory overhead (but conversely, on the first such MDX request, the user making that request got lumped with the whole processing and commit time from the hierarchization). As the way forward is all Rest API & MDX, they decided it would be better to pre-calculate, to isolate the user from the potential performance impact. But this does mean longer server load times and longer dimension save times.
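To make the element-vs-member arithmetic concrete, here is a toy Python sketch - not TM1 code, and it assumes one member per distinct root-to-element path, which is a simplification of the real MUN dictionary. Even a tiny dimension where three months roll up under both a quarter and a YTD consolidation already has more members than elements:

```python
# Toy illustration: if "members" correspond to distinct parent chains,
# multiple parentage multiplies their count at every level of the rollup.
from functools import lru_cache

# parent -> children; each month has two parents (Q1 and YTD Mar).
children = {
    "All": ["Q1", "YTD Mar"],
    "Q1": ["Jan", "Feb", "Mar"],
    "YTD Mar": ["Jan", "Feb", "Mar"],
}

parents = {}
for p, kids in children.items():
    for k in kids:
        parents.setdefault(k, []).append(p)

@lru_cache(maxsize=None)
def member_count(element):
    """Distinct root-to-element paths, one per 'member'."""
    ps = parents.get(element, [])
    if not ps:  # a root element has exactly one member
        return 1
    return sum(member_count(p) for p in ps)

elements = set(children) | set(parents)
total_members = sum(member_count(e) for e in elements)
print(len(elements), total_members)  # 6 elements but 9 members
```

Scale that up to a daily dimension with 20 alternate rollups and you can see how 28,000 elements can become millions of members.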

I came across this with a small sample model that used 300 MB in 10.2.2 but, when loaded in v11, was 4 GB. On analysis, all cubes were using the same amount of memory, give or take a few %. When all .cub files were removed and the same test performed, it was 80 MB in 10.2.2 and 3.9 GB in v11. Eventually the vast majority of the difference was isolated to a single dimension. This was a date dimension spanning 10 years (at daily level) with many alternate rollups for YTD, MTD, rolling totals, etc. (and then the same again, but with different element weights for averages). In all it looked seemingly innocuous, the dimension having around 28,000 elements (of which, of course, about 3,650 were the leaf date elements). However, when queried for the number of MEMBERS, the result was somewhere around 10.5 million. Over 90% of the extra server load time and extra memory for the server was due to this dimension.

It is a solvable problem: if the alternate rollups are split into distinct hierarchies, and within each hierarchy there is single parentage, then the number of members equals the total number of elements and the additional memory usage disappears. However, it does make management of detailed single time dimensions much more onerous/complex. I think the dimension in question now has something like 20+ hierarchies (and could easily have more). Yes, time dimensions are "special", and the fact that TM1 doesn't treat them as a special case can (as in an example like this) present some problems. It does add some extra spice to the old single vs. multiple time dimension debate.

Anyway ... long-winded explanation. If you want to see whether this might be what is causing the extra memory, it is easy to check. Just load the dimensions only and check the memory consumption between 10.2.2 and PAL. You can also check your dimensions with a simple Rest query to see the variation between #elements and #members.

Code: Select all

http://serverip:port/api/v1/Dimensions('dimension')/Hierarchies?$select=Name&$expand=Elements/$count,Members/$count
Just substitute in the required server name/IP, port number and dimension name to suit the environment you want to test. If you have a dimension with a big discrepancy between the count of members and elements, then this will be eating extra memory. (And if the number of members is in the 6 or 7 figure range, then this will be substantial.)
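If it helps, here is a hypothetical Python helper for sifting through the response of that query when you have many hierarchies. The "Elements@odata.count" / "Members@odata.count" field names and the sample payload are my assumptions about the OData response shape, not taken from a live server, so adjust to what your server actually returns:

```python
# Hypothetical helper: flag hierarchies where members vastly outnumber
# elements, given JSON like that returned by the $expand=$count query above.
import json

def flag_heavy_hierarchies(payload, ratio_threshold=10):
    """Return (name, elements, members) for hierarchies over the threshold."""
    heavy = []
    for h in payload.get("value", []):
        elems = h.get("Elements@odata.count", 0)
        membs = h.get("Members@odata.count", 0)
        if elems and membs / elems >= ratio_threshold:
            heavy.append((h["Name"], elems, membs))
    return heavy

# Illustrative payload only - the Date numbers echo the example in this post.
sample = json.loads("""
{"value": [
  {"Name": "Date", "Elements@odata.count": 28000, "Members@odata.count": 10500000},
  {"Name": "Region", "Elements@odata.count": 500, "Members@odata.count": 520}
]}
""")
print(flag_heavy_hierarchies(sample))
```

Loop it over all dimensions and anything it flags is a candidate for splitting into distinct single-parent hierarchies.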

Please post back to confirm (or deny) if this was your issue.
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.

Wim Gielis
MVP
Posts: 1748
Joined: Mon Dec 29, 2008 6:26 pm
OLAP Product: TM1
Version: PAL 2.0
Excel Version: 2016
Location: Brussels, Belgium
Contact:

Re: Planning Analytics using twice the memory of 10.1

Post by Wim Gielis » Fri Jul 27, 2018 12:36 pm

Now that's a useful piece of information Lotsaram, thanks !
Best regards,

Wim Gielis

Excel Most Valuable Professional, 2011-2014
http://www.wimgielis.com ==> 105 TM1 articles and a lot of custom code
Newest blog article: Looping over input files

User avatar
jim wood
Site Admin
Posts: 3577
Joined: Wed May 14, 2008 1:51 pm
OLAP Product: TM1
Version: TM1 10.2.2
Excel Version: 2007
Location: 1639 Route 10, Suite 107, Parsippany, NJ, USA
Contact:

Re: Planning Analytics using twice the memory of 10.1

Post by jim wood » Fri Jul 27, 2018 1:15 pm

Seconded Wim. Lotsaram that was very useful indeed.
Struggling through the quagmire of life to reach the other side of who knows where.
Application Consulting Group (ACG) TM1 Consulting
OS: Windows 10 64-bit. TM1 Version: 10.2.2

Bakkone
Posts: 63
Joined: Mon Oct 27, 2014 10:50 am
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by Bakkone » Fri Jul 27, 2018 2:54 pm

Thanks for stopping me before I dug deep into my theory. And thanks for the very informative post.

I will run some tests and start to think about how this affects building models going forward. It also makes the 64 GB of memory that the standard PA package comes with seem a bit on the low side, especially now that IBM wants to push for more analytics in addition to planning in PA.

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Sun Jul 29, 2018 7:06 pm

Hi Lotsaram

I will carry out some checks on Monday but I think that what you have said may explain the problem. We also have both Day and Month level dimensions with lots of alternate hierarchies and some of our business dimensions also have 20+ alternate hierarchies.

I was wanting to do a vanilla conversion first, before taking advantage of any of the new functionality in PA. However, it looks as though creating Hierarchies might be one of the things that we have to do first, even if we are not immediately going to use PAX - as I understand it, user-defined hierarchies are not visible in Perspectives or TM1 Web.

We have been working on this problem with IBM Support all week, but they have not really been able to help much. If we look in Perspectives at the memory used by dimensions, there is nothing out of the ordinary. However, presumably the memory used to generate every possible MUN is not included in that figure? Is there anywhere you can see this, or is the only way to find out to delete cubes?

I am concerned that some of the Time dimensions include Cumulative to Date calculations, which are by their nature recursive, eg 2017 Jan CTD is Starting Balance + 2017 Jan, then 2017 Feb CTD is 2017 Jan CTD + 2017 Feb. By the time you get to 2020 Dec CTD, the MUN that it generates is going to be quite long. Will creating a hierarchy such as All CTDs, which just lists out the CTD level for each month, prevent it from generating MUNs below the recursive hierarchy under every consol? If you don't know the answer to that one, I think I will be experimenting on Monday and will let you know.

Regards

Paul Simon

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Sun Jul 29, 2018 7:48 pm

Hi Lotsaram

One thing that does seem a little curious is that we are seeing a 63% size increase on the SDATA sample cubes, and those don't have any recursive time hierarchies. The only thing I can find with any sort of alternate hierarchy is the model dimension, which has different ways of looking at car types.

The annoying thing is that MUNs are only there for Cognos BI, as all elements in TM1 have to be unique. If they are causing the explosion in memory size, then since we no longer use Cognos BI to query TM1, it would be good if there were a TM1S.CFG setting to turn off the generation of all possible MUNs. Anyway, that is something we can take up with IBM Support. Once again, thanks for your help.

Regards

Paul Simon

David Usherwood
Site Admin
Posts: 1329
Joined: Wed May 28, 2008 9:09 am

Re: Planning Analytics using twice the memory of 10.1

Post by David Usherwood » Mon Jul 30, 2018 11:43 am

The annoying thing is that MUNs are only there for Cognos BI as all elements in TM1 have to be unique
I don't believe that non-leaf elements in the new, named hierarchies need to be unique across all hierarchies.
This kinda sorta confirms it - from a post by harrytm1 dated 21 Dec 2016:
Hierarchies created in PAW are actually created as dim files in the same Data folder of the model - just that you won't find them under the Data folder. Instead, a sub-folder, e.g. Product}hiers, is created, and each hierarchy, which is now a dim file, is stored there. That makes sense, because a hierarchy is treated as a dimension now.

Another thing to note is that we will not see such hierarchies in Architect's subset editor, probably because Architect requires uniqueness to be enforced through the dimension, and also because such hierarchies are not considered part of the dim in the traditional sense. There is a control dim, e.g. }Hierarchies_Product, that will list the hierarchies created by the undocumented TI function CreateHierarchyByAttribute or through PAW. The hierarchies will be shown as e.g. Product:Customer Target in the subset editor in Architect.

lotsaram
MVP
Posts: 3094
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TM1, CX
Version: TM1 10.2.2 PA 2.0x
Excel Version: 2010 2013 365
Location: Switzerland

Re: Planning Analytics using twice the memory of 10.1

Post by lotsaram » Mon Jul 30, 2018 4:24 pm

David Usherwood wrote:
Mon Jul 30, 2018 11:43 am
I don't believe that non-leaf elements in the new, named hierarchies need to be unique across all hierarchies.
No. Leaves obviously must be unique, as the Leaves hierarchy is shared and pools leaf elements from all hierarchies. But C elements don't need to have unique names, and you can explicitly have same-named C elements with different definitions per hierarchy. This is a really useful feature, as, say, your "Total" element can simply be called "Total", vis-a-vis alternate rollups within a hierarchy, where you need to be explicit with naming, e.g. total by this, total by that. This does present an extra challenge with rules, due to ambiguous name errors: if you did have a same-named element with a different definition per hierarchy and you wanted to apply a rule to it, you would (presumably) want the same rule to apply. Well, unfortunately there is no !hierarchy syntax available, so in such a case, on the LHS of the rule, all dim:hier:ele combinations need to be explicitly written within {}.

More discussion on this tangent would probably warrant its own thread.

lotsaram
MVP
Posts: 3094
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TM1, CX
Version: TM1 10.2.2 PA 2.0x
Excel Version: 2010 2013 365
Location: Switzerland

Re: Planning Analytics using twice the memory of 10.1

Post by lotsaram » Mon Jul 30, 2018 4:35 pm

paulsimon wrote:
Sun Jul 29, 2018 7:06 pm
I am concerned that some of the Time dimensions include Cumulative to Date calculations which are by their nature recursive, eg 2017 Jan CTD is Starting Balance + 2017 Jan. Then 2017 Feb CTD is 2017 Jan CTD + 2017 Feb. By the time you get to 2020 Dec CTD the size of the MUN that it is going to generate is going to be quite long. Will creating a hierarchy such as All CTDs that just lists out the CTD level for each month prevent it from generating MUNs below the recursive hierarchy under every consol? If you don't know the answer to that one, I think I will be experimenting on Monday and will let you know.
The memory bloat doesn't seem to be affected by very, very deep rollups, and hence very, very long member names. It seems to be the number of members that tips things over the edge. As long as a very deeply nested rollup absolutely obeys the single-parent rule, then there is no problem. But as soon as there is multiple parentage combined with a deeply nested hierarchy, you get an explosion of members vs. elements. So, to answer the 2nd part of your question: yes, you absolutely want to avoid "organizational" type rollups that group elements of the same concept but actually at different levels down from an ancestor.
paulsimon wrote:
Sun Jul 29, 2018 7:06 pm
The annoying thing is that MUNs are only there for Cognos BI as all elements in TM1 have to be unique.
No longer true, as the new clients, being Rest/MDX based, also need to know the listing of valid member names.

It's certainly a head-kicker. These sorts of issues really only apply to time dimensions, and it sure makes me wonder whether TM1 shouldn't be so agnostic about time dimensions and should treat them a bit special after all.

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Thu Aug 02, 2018 11:49 am

Hi Lotsaram

Unfortunately, this does not seem to be the cause of the problem, or if it is, then it is not confined to time dimensions.

I first tried creating Hierarchies for all the time dimensions, so that each Hierarchy was a simple hierarchy with each element having only one parent. I only allowed the All Years, Year, Half-Year, Quarter, Month hierarchy to go down to Month level. For all other hierarchies, such as the YTD hierarchy, where clearly eg 2017 Jan YTD and 2017 Feb YTD will both include 2017 Jan, I made the YTD hierarchy stop at the level of 2017 Jan YTD, so that it did not include the month level. Is this what you mean by creating Hierarchies? Short of stopping the Hierarchy at the level above the base-level element, I cannot think of any other way of generating a Hierarchy that does not have elements with more than one parent for something like YTD.

Creating the Hierarchies and restarting the service actually increased the RAM usage.

I then tried converting the recursive Cumulative to Date style in the time dimensions to a flat list, eg instead of

2017 Jan CTD being
2016 Dec CTD
2017 Jan

2017 Jan CTD now has direct children of the Starting Balance Period and all months up to 2017 Jan inclusive.
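For reference, generating that flat CTD rollup can be scripted; this is a hypothetical Python sketch emitting (parent, child, weight) records, which a TI process could then feed into DimensionElementComponentAdd calls. All element names here are illustrative:

```python
# Hypothetical sketch: flat (non-recursive) CTD rollup, where each
# "<month> CTD" consolidation directly holds the opening balance plus
# every month up to and including that month, all at weight 1.
def flat_ctd_edges(year, opening="Starting Balance"):
    months = [f"{year} {m}" for m in
              ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
               "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]]
    edges = []
    for i, m in enumerate(months):
        parent = f"{m} CTD"
        edges.append((parent, opening, 1))
        for child in months[: i + 1]:
            edges.append((parent, child, 1))
    return edges

edges = flat_ctd_edges(2017)
# Direct children of '2017 Feb CTD':
print([c for p, c, w in edges if p == "2017 Feb CTD"])
```

Each consolidation then sits one level above the leaves instead of at the bottom of a 12-deep recursive chain.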

This also made little difference - maybe 0.1 GB out of 30 GB. I am not too sure why, as with the old recursive approach, if it is generating a MUN to fully qualify every base level, then the MUNs generated for every consolidation chain that includes the Starting Balance must be very, very long.

I renamed our Day-level dimension to prevent it from loading. This dropped the RAM by 4 GB. The dimension alone takes 1 GB. Some of the memory reduction can be attributed to the cube that uses this dimension, which obviously also could not load. This is only used for logging user activity.

In the remaining two Month level dimensions I deleted all alternate consolidations apart from the basic Year, Half-Year, Quarter, Month one. This dropped memory by only 0.1GB.

We still have a server that is using 25GB in PA compared to 14GB in 10.1.

We have a number of other non-time dimensions that have multiple alternate hierarchies. We could potentially look at creating separate Hierarchies for each of these; I have a standard routine for doing that. However, in the case of the Report Line hierarchy, which is maintained by our Financial Accountants, that would be almost impossible. Admittedly it is not the greatest example of a hierarchy, but the point is that it was not a problem in 10.1.

We still have an open call on this with IBM. I will let you know if we find anything.

I will try using the REST API call to see if there are any suspect dimensions, but I think I already know which ones have several alternate hierarchies.

Regards

Paul Simon

User avatar
jim wood
Site Admin
Posts: 3577
Joined: Wed May 14, 2008 1:51 pm
OLAP Product: TM1
Version: TM1 10.2.2
Excel Version: 2007
Location: 1639 Route 10, Suite 107, Parsippany, NJ, USA
Contact:

Re: Planning Analytics using twice the memory of 10.1

Post by jim wood » Thu Aug 02, 2018 12:44 pm

paulsimon wrote:
Thu Aug 02, 2018 11:49 am
Unfortunately this does not seems to be the cause of the problem, or if it is then it is not confined to time dimensions.
You made me think there, Paul. I originally thought Lotsa only gave a time dimension as an example; I went back, and that seems to be the case. He does not specifically say that it applies only to time dimensions.

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Thu Aug 02, 2018 2:47 pm

Hi Jim

I understand that the problem does not only apply to time dimensions, since TM1 doesn't have the concept of a time dimension. I was just pointing out that lots of business-oriented dimensions also have alternate hierarchies. At least a time dimension is usually under the control of the developer, and we might be able to do something about that, although we still need to provide the time hierarchies that the users need. However, the business-oriented dimensions are generally determined by user requirements, and it is harder to generate all possible hierarchies in those, particularly in dimensions like Account, which are often also used to perform calculations, unlike eg Product hierarchies. In any event, adding the Hierarchies for the time dimensions does not seem to have made much difference to the memory used by our server. The cause of the memory increase therefore might be something other than the wholesale generation of MUNs. We were seeing the same sort of increase on the SDATA sample server, which does not have a lot of alternate hierarchies.

We have a call open with IBM and I will let you know what they find.

I am just setting up something to run the REST API queries - it took a while, as IT were supposed to open the port for the Rest API 3 weeks ago but still haven't done it, so I am having to run things locally on the server using localhost.

Regards

Paul

lotsaram
MVP
Posts: 3094
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TM1, CX
Version: TM1 10.2.2 PA 2.0x
Excel Version: 2010 2013 365
Location: Switzerland

Re: Planning Analytics using twice the memory of 10.1

Post by lotsaram » Thu Aug 02, 2018 5:00 pm

jim wood wrote:
Thu Aug 02, 2018 12:44 pm
paulsimon wrote:
Thu Aug 02, 2018 11:49 am
Unfortunately this does not seems to be the cause of the problem, or if it is then it is not confined to time dimensions.
You made me think there Paul. I originally thought Lotsa only gave a time dimension as an example. I went back and that seems to be the case. He does not specifically say that it applies to time dimensions.
For sure, the issue of massively more members than elements can apply to any dimension where elements have multiple parents, but it is more likely to be a significant issue, and potentially cause problems, in day-level time dimensions, as there are likely to be far more alternate rollup paths defined for reporting and calculation purposes than you would ever encounter in a business dimension.

In terms of Paul's question about building YTD with only single parents: yes, this is no problem if taking a nested approach.
[Attachment: 2018-08-02_18-46-23.png - leveled vs. nested "YTD by Month" rollups]
In this example there are 2 approaches to building a "YTD by Month" hierarchy for a date-level dimension: either leveled or nested. Leveled has the advantage of being a bit easier to navigate, and elements of the same reporting concept share the same level, but it will have multiple parents per element. On the other hand, a nested structure will have only a single parent per element, and therefore the same member count as element count. To preserve # members = # elements, rollups for organizational purposes need to be avoided.

If you also consider a "YTD by Day" rollup, the issue becomes (much) more extreme in terms of the disparity between # members and # elements with each approach. And of course Dec 31 YTD is then 365 levels deep. My experience with this is that TM1 doesn't seem to have an issue with the ridiculously long full member names; it's still the absolute count of members that seems to cause the memory blowout, not anything else. For sure, if you want to have BOTH a YTD by Month AND a YTD by Day rollup, then they need to be separated into distinct alternate hierarchies, to avoid the day-level elements having multiple parents.
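As a rough way to see the difference in numbers, here is a toy Python comparison of the two shapes. Again this assumes one member per distinct root-to-element path (a simplification, not TM1's actual internals), and a three-month year keeps the output readable:

```python
def count_members(children):
    """Elements vs members (distinct root-to-element paths) for a rollup."""
    parents = {}
    for p, kids in children.items():
        for k in kids:
            parents.setdefault(k, []).append(p)
    nodes = set(children) | set(parents)
    memo = {}
    def paths(e):
        if e not in memo:
            ps = parents.get(e, [])
            memo[e] = sum(map(paths, ps)) if ps else 1
        return memo[e]
    return len(nodes), sum(paths(n) for n in nodes)

months = [f"2017 {m}" for m in ("Jan", "Feb", "Mar")]

# Leveled: each "YTD" consolidation holds all months to date directly,
# so early months acquire multiple parents.
leveled = {f"{m} YTD": months[: i + 1] for i, m in enumerate(months)}

# Nested: each "YTD" holds the prior YTD plus the current month,
# so every element has exactly one parent.
nested = {"2017 Jan YTD": ["2017 Jan"]}
for i in range(1, len(months)):
    nested[f"{months[i]} YTD"] = [f"{months[i-1]} YTD", months[i]]

print(count_members(leveled), count_members(nested))  # (6, 9) (6, 6)
```

With 12 months, or 365 days, the leveled gap widens rapidly while nested stays at # members = # elements.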
paulsimon wrote:
Thu Aug 02, 2018 11:49 am
Unfortunately this does not seems to be the cause of the problem, or if it is then it is not confined to time dimensions.
Did you try either of:
- loading the model with just dimensions and no cubes?
- loading the model with only leaves in the dimensions?
Both experiments should equally be able to prove or disprove, fairly quickly and easily, whether the extra memory is coming from dimensions or cubes.

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Thu Aug 02, 2018 9:14 pm

Hi Lotsaram

We did try loading up the server with just dimensions and no cubes except the } cubes. The memory was still high which does point to there being an issue related to dimensions. A colleague did that particular experiment so I will check on the exact figures tomorrow.

I don't like multi-headed dimensions. I have always designed dimensions to have a single top-level consolidated element. Typically, the main hierarchy, an All Elements consolidation going down to base-level elements, joins to the top-level consolidation with a weight of 1. All other alternate hierarchies join to the top-level consolidation with a weight of 0, to avoid double counting. From the user's point of view, this gives them a single member from which to drill down and explore the different alternate hierarchies. A Show All in a dimension without a single top-level consolidation can look quite chaotic. With a single top-level consolidation, it is easy for them to minimise everything down to the top-level consolidation and then explore the alternate hierarchy they are interested in. However, I am wondering if this approach is now causing problems, because perhaps it increases the MUN issue?

What I did in the time dimension, when creating Hierarchies, was to create one Hierarchy for the top level, encompassing the top-level consolidation and its immediate children - the tops of the alternate hierarchies. I then created Hierarchies for each of the alternate hierarchies, but only went down to the base level in one of them. I am not clear whether this has any impact on the MUN issue. It certainly did not seem to make any difference; in fact, it increased the memory. Is it possible that, regardless of Hierarchies, it is still the number of alternates in the base dimension that matters?

However, as I mentioned, deleting the Day-level dimension completely, and all the alternate hierarchies from the Month dimension, still left us with the server using close to double what it does in 10.1.

It is possible that the memory increase comes from other dimensions, but there isn't much we can do to reduce the number of alternate hierarchies in those, and even if creating Hierarchies were the answer, which it doesn't seem to be, that would be impractical in some of these dimensions.

However, even removing a significant alternate hierarchy from one of the larger business dimensions did not give any significant reduction.

We are also seeing the issue on the SDATA sample cubes, which as far as I can see do not have significant numbers of alternate hierarchies (it is basically the Cars model if you remember it).

We have seen the issue with SDATA on the client's Win 2008 and Win 2016 servers, and on an AWS server. The only place I have not seen a significant increase is on my company's own Win 2012 server. There may be something in the configuration of the server, or in the number of CPUs, that exacerbates the issue. (We do have MaxCubeLoadThreads set to 0.)

My colleague has also experimented with just about every new TM1S.CFG setting (MTQ, etc.), but none of the changes made any difference to the memory.
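In case anyone wants to reproduce our tests, the kind of TM1S.CFG entries we experimented with looked roughly like the following. The values are illustrative only, not a recommendation; MaxCubeLoadThreads=0 is the one setting we actually run with, as mentioned above.

```ini
; tm1s.cfg fragment (illustrative values, not our production file)
MTQ=ALL
MTFeeders=T
MaxCubeLoadThreads=0
PersistentFeeders=T
```

None of the combinations we tried moved the memory figure.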

I think we are getting to the point where we will just buy more memory. We have already had to buy two more servers to support Docker and WebSphere. This is proving to be an expensive upgrade, particularly as we have 5 environments, so that is 10 more servers. However, we want the performance advantage of the Java-based TM1 Web, as the performance of the 10.1 ASP.Net TM1 Web is a frequent source of complaint from our users.

We have spent 4 weeks looking at the memory issue and don't want to delay the upgrade any longer, otherwise we will start clashing with the next accounting deadline and will have to delay again. However, the dramatic increase in memory is causing concerns that there may be other issues, e.g. with performance. I don't particularly want to embark on lots of user testing of the new version if we may still need to make changes to address the memory issue.

So far we have had considerably better support from you than we have from IBM.

IBM Support have now referred the issue to the Developers. At the very least it would be good to have a better way to determine where the memory is being used and why, which would help us to assess whether there are likely to be any other issues other than just the increase in RAM.

Regards

Paul Simon

User avatar
paulsimon
MVP
Posts: 603
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: 10.1.1 and 10.2.2
Excel Version: 2013

Re: Planning Analytics using twice the memory of 10.1

Post by paulsimon » Fri Aug 03, 2018 2:55 pm

Hi Lotsaram

I checked with my colleague. When we started the server up with just the dimensions and } cubes, it was using 8GB. The dimensions and } cubes themselves probably account for around 2GB of this, so the remaining 6GB can probably be attributed to the MUN issue. However, the overall increase in RAM usage is more than double that. So I think you have explained part of the problem, but it looks as though IBM will need to explain the rest of the increase.
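For what it's worth, here is the back-of-envelope model I have been using to sanity-check the 6GB attribution. It is only a sketch under stated assumptions (the per-string overhead and bytes-per-character constants are guesses, and IBM's actual MUN storage is not documented to us): if PA pre-generates every possible MUN string of the form [Dim].[Hier].[Element], memory should scale roughly with elements times hierarchies times string size.

```python
# Back-of-envelope MUN memory estimate (assumed constants, not IBM internals).
def estimate_mun_bytes(dims):
    """dims: list of (elements, hierarchies, avg_mun_chars) per dimension."""
    per_string_overhead = 40  # assumed allocator/object overhead per string
    bytes_per_char = 2        # assumed UTF-16-style character storage
    return sum(
        elements * hierarchies * (chars * bytes_per_char + per_string_overhead)
        for elements, hierarchies, chars in dims
    )

# Hypothetical dimension: 500k elements, 4 alternates, ~60-char MUNs
approx_gb = estimate_mun_bytes([(500_000, 4, 60)]) / 1024**3  # ~0.3 GB
```

Even under generous assumptions, one dimension of that shape only accounts for a few hundred MB, which is why the full multi-GB jump still needs explaining.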

So far IBM have come back and said that there is no way to turn off the generation of all possible MUNs in PA.

One thing they have said that does not hold true is that previous versions did this anyway, just at the time of the first MDX query rather than at startup. We have lots of MDX queries in the system, and it is inconceivable that it could run for long without someone triggering the MDX in a dynamic subset, yet this memory problem did not occur in 10.1.

Regards

Paul

User avatar
qml
MVP
Posts: 1057
Joined: Mon Feb 01, 2010 1:01 pm
OLAP Product: TM1 / Planning Analytics
Version: 2.0.4 and all previous
Excel Version: 2007 - 2016
Location: London, UK, Europe

Re: Planning Analytics using twice the memory of 10.1

Post by qml » Fri Aug 03, 2018 3:50 pm

paulsimon wrote:
Fri Aug 03, 2018 2:55 pm
One thing they have said that does not hold true is that previous versions did this anyway, just at the time of the first MDX query rather than at startup. We have lots of MDX queries in the system, and it is inconceivable that it could run for long without someone triggering the MDX in a dynamic subset, yet this memory problem did not occur in 10.1.
I suspect that by "an MDX query" they do not mean the MDX used in dynamic subsets. It would have to be a proper data MDX query coming from somewhere like Cognos BI, etc.
Kamil Arendt
