Why I created a blog

It's been four years since I first created this blog. It has remained true to Essbase and related information over that time, and hopefully it has answered questions and given you insight along the way. I will continue to provide my observations and comments on the ever-changing world of EPM. Don't be surprised if the scope of the blog changes and brings in other Hyperion topics.

Monday, November 9, 2015

Essbase 12c is here, or is it?

A few weeks ago when I was at Oracle Open World, there was a big to-do, partially started by me. Someone had posted that Essbase 12c had been released. Thinking quickly, I immediately found the readme and saw that EAS and Essbase Studio were not supported. I couldn't fathom that. Going back to the command line to build cubes? And how would load rules get created?
I was hoping that perhaps it had the new simplified interface that Gabby Rubin has been hinting about (and that I actually got to see at OOW -- it is awesome), but alas, no.
It turns out this 12c release is for use with Oracle Business Intelligence Enterprise Edition (OBIEE) 12c, which was released the Friday before. It does have cool things we can expect later in our own EPM version of Essbase.

First, it has the new Java kernel, so I'm happy to let the OBIEE people debug it for us. This Java version is supposed to be released first as the EssCloud service, so don't expect it in-house any time soon. I would say on-premise, but apparently Larry is rebranding on-premise as some kind of private cloud, or so the story goes.
Second, it supports all of the functions in hybrid mode, including cross-dimensional references. That is huge. It means they figured out the cross-dim performance issue. I can't wait to try it.
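
For context, a cross-dimensional reference is the "->" operator in a member formula or calc script. Here is a minimal sketch of the kind of formula hybrid mode previously could not resolve dynamically; the "Market Share" member is my own invention layered on Sample Basic names, not anything from the release:

/* "->" retrieves Sales at the top of the Market dimension */
"Market Share" = ("Sales" / "Sales"->"Market") * 100;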

Finally, they are putting Essbase in memory. The real use case for Essbase with OBIEE is for OBIEE to spin off Essbase cubes that cache data to make OBIEE reports faster. We won't be able to get at these cubes, but OBIEE reports will be able to.
So for OBIEE (at least) a REALLY cool version of Essbase is available, but we EPM-ers will have to wait. Maybe Uncle Gabby will give me a version for Hanukkah to play with, or a version for each night. I love new toys.

Tuesday, July 7, 2015

Exalytics X5-4 Fast and Furious

I love going to KScope because I learn about new features and products. This event was no different. In the Sunday symposiums with Oracle, there was a discussion of the new Exalytics X5-4. It was only September of last year, at Open World, that the X4-4 was announced; Edward Roske talks about it in his blog. It was a big deal then. With the introduction of the X5-4 only 9 months later, it becomes even bigger, better, and "badder". With the X5-4 we go to a maximum of 72 cores, up from 60, and get more memory. In addition to more cores, the X5-4 supports a new NVMe high-bandwidth flash technology that improves throughput by 2.5 times. I won't bore you with the details; if you want to read about them, here are the specs.

To me the most remarkable thing is that you get more and the price has not increased. All the way back to the X3-4, the price has remained the same. With a list price of $175K, it is what I consider cheap.

As John Booth mentions in his blog, you can get this as an X5-2 configuration as well, offering additional flexibility. Note: I had a correction from John. The X5-2 was more a wish from him than a reality. While you could create an X5-2 using sub-capacity licensing, you are still paying for the physical cores (thanks to Steve Libermensch for that clarification).

For us in EPM it keeps getting better and better.

Monday, July 6, 2015

Essbase Studio patch

Well, I survived KScope. It was a very good event, with participants getting over 175 sessions related to EPM/BI. I sat in a number of sessions and was impressed with the quality of the speakers and presentations. I also had the opportunity to speak in 4 sessions, and I think they went pretty well, at least judging from the questions people asked.

The Essbase Studio patch came out the other day, and I read through the readme file. There were only two bug fixes and one documentation change.

The first bug fix relates to a problem with stored dimensions (I assume ASO) where Studio would not let you use external consolidation operators.

The documentation change fixes the statement that you can drill through on any level of a hierarchy, including the top level. That is incorrect; you can't drill through from the top member of the hierarchy (the dimension name).

The most interesting bug fix is the second one, and I'm surprised they are calling it a bug, as it used to be described as a limitation. When doing a drill-through report on a recursive hierarchy, the drill-through would fail with an error message if more than 1000 level 0 members were returned by the query. For recursive queries, Essbase Studio created an IN clause with the list of level zero members under the selected member. The 1000-member list was a limitation for Oracle, as that is the maximum number of expressions allowed in an IN clause. I've not been able to test this yet, and I wonder how development got around that limitation.
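
For what it's worth, the usual workaround in plain SQL is to split the member list into chunks of at most 1,000 and OR the IN clauses together; whether development did that or something cleverer, I can't say. A schematic example, with made-up table and column names:

-- Oracle raises ORA-01795 when a single IN list exceeds 1000 expressions.
SELECT *
  FROM drill_through_fact
 WHERE member_name IN ('Mbr_0001', 'Mbr_0002' /* ... up to 1000 names ... */)
    OR member_name IN ('Mbr_1001', 'Mbr_1002' /* ... the next 1000 ... */);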

I guess the moral of the story is: even if something is listed as a product limitation, still submit bug and enhancement requests. It is very possible what you need will be changed.

Monday, June 8, 2015

Don’t believe everything you read (again)

I got an email from my boss, Edward Roske, about an entry in the Tech Reference. He is working on a cool super-secret project (all will be reveled and revealed at KScope), and he asked me about something he saw in the Tech Reference on the AGGMISSG command.

For those of you who don't like to read the Tech Reference, I'll save you the time of going to it. The entry says:


Specifies whether Essbase consolidates #MISSING values in the database.

The default behavior of SET AGGMISSG is determined by the global setting for the database, as described in the Oracle Essbase Database Administrator's Guide.

SET AGGMISSG commands apply to calculating sparse dimensions.


What struck him as funny, and me as well, was the statement:

SET AGGMISSG commands apply to calculating sparse dimensions. (my highlighting)

Neither he nor I could remember it acting that way. I reached out to MMIP Cameron Lackpour, and he opened his System 9.3.1 Tech Reference; it said the same thing.

Thinking this can't be right (think of Planning, with upper-level Periods allowing input and being dense), I decided to test it.

Using Cameron's FDITHWW Sample Basic, I cleared all the data and set the upper levels of Year to be stored.

I used Smart View to populate the following intersection:

(Note: Profit shows up because Measures is dynamic.)

I then ran the following calculation script:

CALC DIM ("Measures","Year");
AGG ("Product","Market");

Here are my results:

As you can see, my dense dimension acted as Edward and I expected: it ignored the #MISSING children, kept Q1, and aggregated it up to Year. This means the Tech Reference is slightly askew.
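
If you want to pin the behavior down rather than rely on the database-level default, the setting can be stated explicitly at the top of the script. A minimal sketch against the same Sample Basic test (the SET command itself is documented; the pairing with my test script is just my rig):

SET AGGMISSG OFF;   /* OFF: #MISSING children do not overwrite values stored at upper levels */
CALC DIM ("Measures","Year");
AGG ("Product","Market");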

As an aside, there is something else in the Tech Ref example. If you look, there is a statement:

I'd never heard of it, and a search of the Tech Reference finds the only mention of it in the AGGMISSG example. Trying to run it gives an invalid syntax error, so this is inaccurate as well.

I will be submitting both of these opportunities to the documentation group, as they actually do fix these types of errors when they are found.

Moral of the story: even if you read it in the documentation, try it yourself, and you might be surprised at the results.

Monday, May 11, 2015

ASO calculation bug


Note: it's funny how things work out. While I've not tested it yet, patch set update (PSU) 20859535 appeared today, after I created this post.

Defect fixed: MDX formulas are not calculating correctly for parent members of the accounts dimension, which are tagged with time balance properties and compression, in an ASO cube where the parent has more than one child.

This is the bug I reported last month, so it appears it might be fixed. I just have to test it now.

Glenn 5/11/15

I am a creature of habit. I have done the same calculation to put YTD net income into retained earnings in too many cubes to count. In my ASO cubes, I know that I have to set the solve order higher than normal for the ancestors of my calculated retained earnings member to get it to roll up properly, and it has always worked. That is, until now. I'm working on a new version and have run into an interesting issue.
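
For the curious, the calculation in question is the classic YTD roll. It looks something like this hypothetical MDX formula on [Retained Earnings] (the member and dimension names are invented for illustration, not from my actual cube), with a low solve order so the ancestors can be set higher:

/* Hypothetical formula on [Retained Earnings], solve order 10 */
Sum(PeriodsToDate([Year].Generations(1), [Year].CurrentMember), [Net Income])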

My retained earnings calculation works if I am at individual periods, but does not work if I'm at Total Periods. In addition, it works if I am at the single stored member of my View dimension, but not if I expand the View dimension; the stored member value actually changes. In tracing through the issue, it appears the formula for retained earnings is not firing when I'm at Total Year or when I have multiple members of my dynamic View dimension.

I was able to find a workaround. Instead of allowing the parent of my retained earnings calculation to be a natural rollup, I forced it to be a calculated formulaic member that adds up its children. That apparently is enough to force the calculation to occur, and it properly rolls up to all of the ancestors.
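
Sketched with the same invented names, the workaround amounts to giving the parent an explicit formula and a solve order above the retained earnings member, instead of leaving it as a stored rollup:

/* Hypothetical formula on [Total Equity], solve order 20 */
[Common Stock] + [Paid In Capital] + [Retained Earnings]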


I don't particularly like this solution, as it means that if the users add a new account, the formula has to be changed, as opposed to the hierarchy just rolling up correctly.

This is also part of a bigger issue. During my testing of formulas in a "View" dimension, I had issues where a formula would not work at a parent account level but would at the child level. Oracle has confirmed this bug, and I was able to get around it by giving the Accounts dimension a higher solve order by default.

Again, while this works, it is different from every earlier version. My advice: if you upgrade, check your calculations very carefully across all of your dynamically calculated dimensions. Don't assume things will work hunky-dory.

Thursday, March 19, 2015

A quick tip for DATAEXPORT

I love the DATAEXPORT function in calc scripts. I tend to use it a lot, both for writing data to flat files and for writing to relational databases. I've written multiple blog posts on it.

Today, I got an email from a fellow consultant who was having problems with it and needed help. It took a few emails back and forth, but I was able to help them. I decided to post the exchange so we don't all run into the same issue.

The original email was:

"There is a sparse dimension that is dynamically calculated in an app.

I want to export a parent which is dynamically calculated, but even when setting the data export with DataExportDynamicCalc ON; it still exports level 0 for that dimension. If I change that dimension to Dense, then it exports what I want, the parent. I even FIX on that parent member, but it still exports level 0 of that parent."

I first responded by asking if the member name was explicitly in the FIX statement and if DataExportLevel ALL; was set. I was assured it was. I was sent the whole calc script, and it looked good.


SET DATAEXPORTOPTIONS
{
DataExportDynamicCalc ON;
DataExportLevel ALL;
DataExportColHeader "Period";
DataExportOverwriteFile ON;
};

FIX ("1st Pass","Final","Budget","Actual","FY15", @RELATIVE("YearTotal",0),
     "ALY","SAP CC 1000","760","U-ctID","NZU72200", @RELATIVE("Cost Category",0))

    DATAEXPORT "File" "," "TESTENABLE.TXT";

ENDFIX


I was about to write back that I was stumped when I remembered something they said that I had skimmed over the first time: "If I change that dimension to Dense, then it exports what I want."

Hmmmm. I started to think about the difference between dense and sparse dimensions and how Essbase works. It worked on a dense dimension. OK, so it pulls in the block and can calculate the dynamic members. OK, that is reasonable.

A sparse dimension. Wait a second, there is no block for a dynamically calculated sparse member; the block is calculated upon retrieval. By default, the FIX would bypass empty blocks. I looked at the SET statements again and it hit me: there was no statement for empty blocks. I noticed because I always include it and turn it off in my scripts, so I know for sure it is off. I knew there was one, and looked it up in the Tech Reference: DataExportNonExistingBlocks ON|OFF.

The Tech Reference describes this option as:

"Specifies whether to export data from all possible data blocks. For large outlines with a large number of members in sparse dimensions, the number of potential data blocks can be very high. Exporting Dynamic Calc members from all possible blocks can significantly impact performance."

Again, hmmmm. All possible blocks. I had the consultant add this option to their extract, set it to ON, and try it. I did warn them it could be slow, but they told me it was a small outline.
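
The change is one line in the options block from the script above (DataExportNonExistingBlocks is the documented option; the rest are the consultant's original settings):

SET DATAEXPORTOPTIONS
{
DataExportDynamicCalc ON;
DataExportNonExistingBlocks ON;   /* export from potential blocks so sparse dynamic parents get calculated */
DataExportLevel ALL;
DataExportColHeader "Period";
DataExportOverwriteFile ON;
};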

Lo and behold, it worked like a champ. It is a valuable lesson: in our typical drive to improve performance, we turn things on and off without even thinking about it. Sometimes we have to go back and reevaluate our options.

Monday, March 2, 2015

We all need to thank Applied OLAP

I typically don't single out a person or company in my blog, but I am doing so today. Tim Tow, Oracle ACE Director, owner of Applied OLAP, Essbase friend and evangelist, announced on his blog the release of the newest version of the Next Generation Outline Extractor.

Why the big deal? Why am I praising him? First, Tim maintains the code out of his love of Essbase; he makes no money from it. Second, it costs him money: time taken from billable work to make changes is a cost, plus he has his help desk support people assist anyone with a problem, again at no charge.

That is all well and good, but the final thing is his responsiveness in improving the product. I emailed Tim on a Wednesday asking about missing features of the relational extract. Tim and I exchanged a few emails about what I would like to see and how I thought it should work. By Sunday, I had a beta version of the extractor with all I asked for and more. I know from Tim's questions that I was not just getting work he had already planned; he had modified the product for me. After I tested the changes (I found no bugs), he released it to the Essbase world.

We all need to thank Tim and Applied OLAP for their continued support of the Hyperion community. I don't work for Tim, but I do think his products are awesome. It is nice that he puts as much care into the free products he supports as he does into his fantastic Dodeca product.