Why I created a blog

It's been four years since I first created this blog. It has remained true to Essbase and related information over those years. Hopefully it has answered questions and given you insight along the way. I will continue to provide my observations and comments on the ever-changing world of EPM. Don't be surprised if the scope of the blog changes and brings in other Hyperion topics.

Tuesday, December 29, 2015

A Guest Post Reviewing Developing Essbase Applications Hybrid Techniques

OK, I’m just the middleman here. Tracy McMullen asked me to post her review of Developing Essbase Applications: Hybrid Techniques and Practices. These are all her words; I’ve not changed anything in this review except to put quotes around it.

“Truly Taking Your Essbase Knowledge to the Next Level
You know how sometimes when a sequel to a movie or good book comes out… it just stinks compared to the first release? For instance, Dumb and Dumber To or Speed 2… Other movies and books know that releasing a sequel to a classic first release just wouldn’t be the same, and they smartly do not follow up their product with a sequel (like Old School or E.T.). Cameron and team’s first edition of Developing Essbase Applications: Advanced Techniques for IT and Finance Professionals was a valuable addition to every Essbase administrator’s library. So how does their second edition (or “sequel”), Developing Essbase Applications: Hybrid Techniques and Practices, fare? This new book follows in the footsteps of sequels like The Empire Strikes Back, The Godfather Part II, or The Dark Knight, taking the great things about the first edition and improving upon them in the following release. They’ve hit another home run, providing an invaluable tool to the Essbase community.
This Essbase book has something for every type of Essbase consumer, from the super-techy Essbase administrator, to the IT/infrastructure team supporting Essbase, to the brand new Essbase administrator, and finally to the end user. So buy at least one (or more) for your organization and share with your Essbase stakeholders. The Developing Essbase Applications: Hybrid Techniques and Practices writing team (John, Martin, Tim, Cameron, Glenn, Mike, William) has given us a handy toolkit with code examples, reusable code, and detailed explanations of simple to complex topics like design best practices, how some of the new features work under the covers, and detailed “how to” steps so that you can follow along. When it comes to testing new features, they’ve done the testing for us.
A few of my favorite parts of the book: John provides general guidance on Essbase server configuration in Exalytics environments, along with testing results showing just how powerful Exalytics can be. If you are looking to upgrade and purchase new hardware, read the Exalytics chapter! It might give you some of the ammunition to make your case for Exalytics in the purchase debate. I loved Martin’s Magical Number 7 and how it applies to Essbase dimension design; even experienced admins can benefit from this design best practices chapter. All of the new buzz around hybrid cubes is really exciting! But what is hybrid exactly? Tim and Cameron dissect the new hybrid option for Essbase and share actual performance results; you’ll find some interesting (and surprising) numbers. Glenn’s chapter on SQL for Essbase is a must-read for every Essbase administrator. It helps both the IT developer and the business user understand how SQL can be used for loading data into Essbase and for extracting data from Essbase back to relational targets. If you’ve ever wanted to load test your Essbase environment, Tim’s chapter will show you how to accomplish this tricky task. As a mom of two sibling girls, I completely appreciated the analogy for OBIEE and Essbase in Mike’s chapter, Copernicus Was Right. Essbase isn’t the center of the universe? Why does symmetry matter? Mike rocks the boat a little in this part of the book but shows how to really address the challenges of Essbase and OBIEE integration. If you aren’t familiar with Dodeca, check out Cameron’s chapter on this alternative tool for end users to interact with both Essbase and relational sources. William’s Smart View chapter breaks down all of the different query options available within Smart View (did you even know these options existed?). He provides a super helpful comparison chart, then dives deep into the content with examples of the different ways to use Smart View to interact with data.
Developing Essbase Applications: Hybrid Techniques and Practices is not just a high-level book. This is a roll-up-your-sleeves and jump-in-the-weeds kind of book. There is a LOT of information, which can be overwhelming at times (but really, that is a good thing). Reread if you need to, because all of the details are there to learn about Essbase. I concur with Steve Liebermensch, who wrote the Afterword: add this book to your shopping cart and pay for it already! I’m certain you will learn something new (likely a lot of somethings new) that will help you in your journey with Essbase.”

Monday, November 9, 2015

Essbase 12c is here, or is it?

A few weeks ago when I was at Oracle Open World, there was a big to-do, partially started by me. Someone had posted that Essbase 12c had been released. Thinking quickly, I immediately found the readme and saw that EAS and Essbase Studio were not supported. I couldn’t fathom that. Going back to the command line to build cubes? And how would load rules get created?
I was hoping that perhaps it had the new simplified interface that Gabby Rubin has been hinting at (which I actually got to see at OOW -- it is awesome), but alas, no.
It turns out this 12c release is for use with Oracle Business Intelligence Enterprise Edition (OBIEE) 12c, which was released the Friday before. It does have cool things we can expect later in our own EPM version of Essbase.

First, it has the new Java kernel, so I’m happy to let the OBIEE people debug it for us. This Java version is supposed to be released first as the EssCloud service, so don’t expect it in-house any time soon. I would say on premise, but apparently Larry is rebranding on premise to be some kind of private cloud, or so the story goes.
Second, it supports all of the functions, including the cross-dimensional operator, in hybrid mode. That is huge. It means they figured out the cross-dim performance issue. I can’t wait to try it.
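For context, here is the kind of outline member formula that, until now, hybrid mode could not evaluate, forcing Essbase to fall back to regular block storage calculation for that member. This is purely an illustration; the member and dimension names are Sample.Basic-style placeholders:

/* Hypothetical formula on a "Sales % of Total Market" member (illustrative names only). */
/* The cross-dimensional operator (->) is the part hybrid aggregation could not handle.  */
"Sales" / "Sales"->"Market" * 100;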

Finally, they are putting Essbase in memory. The real use case for Essbase with OBIEE is for OBIEE to spin off Essbase cubes that cache data to make OBIEE reports faster. We won’t be able to get at these cubes, but OBIEE reports will be able to.
So for OBIEE (at least) a REALLY cool version of Essbase is available, but we EPM-ers will have to wait. Maybe Uncle Gabby will give me a version for Hanukkah to play with, or a version for each night. I love new toys.

Tuesday, July 7, 2015

Exalytics X5-4 Fast and Furious

I love going to KScope because I learn about new features and products. This event was no different. In the Sunday symposiums with Oracle there was a discussion on the new Exalytics X5-4. It was only last September at Open World that the X4-4 was announced; Edward Roske talks about it in his blog. It was a big deal then. With the introduction of the X5-4 only nine months later, it becomes even bigger, better, and “badder”. With the X5-4 we go to a maximum of 72 cores, up from 60, plus more memory. In addition to more cores, the X5-4 supports new NVMe high-bandwidth flash technology that improves throughput by 2.5 times. I won’t bore you with the details; if you want to read about them, here are the specs.

To me the most remarkable thing about this is that you get more and the price has not increased. All the way back to the X3-4, the price has remained the same. With a list price of $175K, it is what I consider cheap.

As John Booth mentions in his blog, you can get this as an X5-2 configuration as well, offering additional flexibility. Note: I had a correction from John. The X5-2 was more a wish from him than a reality. While you could create an X5-2 using sub-capacity licensing, you are still paying for the physical cores (thanks to Steve Liebermensch for that clarification).

For us in EPM it keeps getting better and better.

Monday, July 6, 2015

Essbase Studio 11.1.2.4.002 patch

Well, I survived KScope. It was a very good event, with participants getting over 175 sessions related to EPM/BI. I sat in a number of sessions and was impressed with the quality of the speakers and presentations. I also had the opportunity to speak in four sessions, and I think they went pretty well, at least judging from the questions people asked.

Patch 11.1.2.4.002 came out the other day and I read through the readme file. There were only two bug fixes and one documentation change.

The first bug fix relates to a problem with stored dimensions (I assume ASO) where it would not let you use external consolidation operators. 

The documentation change fixes the statement that you can drill through on any level of a hierarchy, including the top level. That is incorrect; you can’t drill through from the top member of the hierarchy (the dimension name).

The most interesting bug fix is the second one, and I’m surprised they are calling it a bug, as it used to be described as a limitation. When doing a drill-through report on a recursive hierarchy, the drill-through would fail with an error message if there were more than 1000 level 0 members returned in the query. For recursive hierarchies, Essbase Studio creates an IN clause with the list of level zero members under the selected member. The 1000-member list was a limitation on the Oracle database side, as that is the maximum number of expressions allowed in an IN clause. I’ve not been able to test this yet and wonder how development got around that limitation.

I guess the moral of the story is, even if something is listed as a product limitation, still submit bug and enhancement requests; it is very possible what you need will be changed.

Monday, June 8, 2015

Don’t believe everything you read (again)

I got an email from my boss, Edward Roske, about an entry in the Tech Reference. He is working on a cool, super-secret project (all will be reveled in and revealed at KScope), and he asked me about something he saw in the Tech Reference on the AGGMISSG command.

For those of you who don’t like to read the Tech Reference, I’ll save you the time of going to it.

SET AGGMISSG

Specifies whether Essbase consolidates #MISSING values in the database.

The default behavior of SET AGGMISSG is determined by the global setting for the database, as described in the Oracle Essbase Database Administrator's Guide.

Syntax

SET AGGMISSG ON | OFF ;


Notes



SET AGGMISSG commands apply to calculating sparse dimensions.



Example



SET AGGMISSG OFF;
CALC ALL;
CALC PERCENTS;


See Also



  • SET Commands





What struck him, and me as well, as funny was the statement:



    SET AGGMISSG commands apply to calculating sparse dimensions. (my highlighting).



Neither he nor I could remember it acting that way. I reached out to MMIP Cameron Lackpour, and he opened his System 9.3.1 Tech Reference; it said the same thing.



Thinking this can’t be right (think Planning, with upper-level periods allowing input and being dense), I decided to test it.



Using Cameron’s FDITHWW sample Basic, I cleared all the data and set the upper levels of Year to be stored.



I used Smart View to populate the following intersection:



[image]



    (Note Profit shows up because Measures is dynamic)



    I then ran the following calculation script:



SET AGGMISSG OFF;
SET UPDATECALC OFF;

CALC DIM (Measures, Year);
AGG (Product, Market);



    Here are my results:



[image]



As you can see, my dense dimension acted as Edward and I expected: it ignored the #MISSING children, kept Q1, and aggregated it up to Year. This means the Tech Reference is slightly askew.



As an aside, there is something else in the Tech Reference example. If you look, there is this statement:



    CALC PERCENTS;



I’d never heard of it, and a search of the Tech Reference shows its only occurrence is in the AGGMISSG example. Trying to run it gives an invalid syntax error, so this is inaccurate as well.



I will be submitting both of these opportunities to the Documentation group, as they actually do fix these types of errors when they are found.



Moral of the story: even if you read it in the documentation, try it yourself; you might be surprised at the results.

    Monday, May 11, 2015

    ASO calculation bug

    UPDATE

Note: it’s funny how things work out. While I’ve not tested it yet, a patch set update (PSU), 20859535, appeared today, after I created this post.

Defect Number: 20806331

Defect Fixed: MDX formulas are not calculating correctly for parent members of the accounts dimension, which are tagged with time balance properties and compression, in an ASO cube where the parent has more than one child.

This is the bug I reported last month, so it appears it might be fixed. I just have to test it now.

    Glenn 5/11/15

I am a creature of habit. I have done the same calculation to put YTD net income into retained earnings in too many cubes to count. In my ASO cubes, I know that I have to set the solve order higher than normal for the ancestors of my calculated retained earnings member to get it to roll up properly, and it has always worked. That is, until now. I’m working on 11.1.2.3.502 and have run into an interesting issue.

My retained earnings calculation works if I am at individual periods, but does not work if I’m at Total Periods. In addition, it works if I am at the single stored member of my View dimension but not if I expand the View dimension. The stored member value actually changes. In tracing through the issue, it appears the formula for retained earnings is not firing when I’m at Total Year or when I have multiple members of my dynamic View dimension.

I was able to find a workaround. Instead of allowing the parent of my retained earnings member to be a natural rollup, I forced it to be a formulaic member that adds up its children. That apparently is enough to force the calculation to occur, and it properly rolls up to all of the ancestors.

[image]

I don’t particularly like this solution, as it means that if the users add a new account, the formula has to be changed, as opposed to the hierarchy simply rolling up correctly.

This is also part of a bigger issue. During my testing of formulas in a “View” dimension, I had issues where a formula would not work at a parent account level but would at the child level. Oracle has confirmed this bug, and I was able to get around it by giving the Accounts dimension a higher solve order by default.

Again, while this works, it is different from every earlier version. My advice is that if you upgrade, check your calculations very carefully across all of your dynamically calculated dimensions. Don’t assume things will work hunky-dory.

    Thursday, March 19, 2015

    A quick tip for Dataexport

I love the DATAEXPORT function in calc scripts. I tend to use it a lot, both for writing data to flat files and for writing to relational databases. I’ve written multiple blog posts on it.

    Today, I got an email from a fellow consultant who was having problems with it and needed help.  It took me a few emails back and forth, but I was able to help them. I decided to post it so we all don’t run into the same issue.

The original email was:

    “There is a sparse dimension that is dynamically calculated in an app.

I want to export a parent which is dynamically calculated, but even when setting the data export with DataExportDynamicCalc ON; it still exports the level 0 for that dimension. If I change that dimension to Dense, then it exports what I want, the parent. I even FIX on that parent member but it still exports level 0 of that parent.”

I first responded asking if the member name was explicitly in the FIX statement and if DataExportLevel was set to ALL. I was assured it was. I was sent the whole calc and it looked good.

SET DATAEXPORTOPTIONS
{
  DATAEXPORTCOLFORMAT ON;
  DataExportDynamicCalc ON;
  DataExportLevel ALL;
  DataExportColHeader "Period";
  DataExportOverwriteFile ON;
};

/* EBIT */
FIX ("1st Pass", "Final", "Budget", "Actual", "FY15", @RELATIVE("YearTotal", 0),
  "EBIT",
  "ALY", "SAP CC 1000", "760", "U-ctID", "NZU72200", @RELATIVE("Cost Category", 0))

  DATAEXPORT "File" "," "TESTENABLE.TXT";

ENDFIX;

I was about to write back that I was stumped when I remembered something they said that I had skimmed over the first time: “If I change that dimension to Dense, then it exports what I want.”

Hmmmm. I started to think about the difference between dense and sparse dimensions and how Essbase works. It worked on a dense dimension. OK, so it pulls in the block and can calculate the dynamic members. OK, that is reasonable.

A sparse dimension. Wait a second, there is no block for a dynamically calculated sparse member; the block is calculated upon retrieval. By default, the FIX would bypass blocks that don’t exist. I looked at the SET statement again and it hit me: there was no option for non-existing blocks. I noticed because I always include that option and turn it off in my scripts so I know for sure it is off. I knew there was one and looked it up in the Tech Reference: DataExportNonExistingBlocks ON | OFF.

The Tech Reference describes this option as:

    Specifies whether to export data from all possible data blocks. For large outlines with a large number of members in sparse dimensions, the number of potential data blocks can be very high. Exporting Dynamic Calc members from all possible blocks can significantly impact performance.

Again, hmmmm. All possible blocks. I had the consultant add this option to their extract, set it to ON, and try it. I did warn them this could be slow, but they told me it was a small outline.
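For anyone who wants to see it in one place, here is roughly what the amended options block looks like; this is just a sketch, and everything else in the consultant’s script stays the same:

SET DATAEXPORTOPTIONS
{
  DATAEXPORTCOLFORMAT ON;
  DataExportDynamicCalc ON;
  /* The missing piece: also export from blocks that do not physically exist, */
  /* which is where a dynamically calculated sparse parent lives.             */
  DataExportNonExistingBlocks ON;
  DataExportLevel ALL;
  DataExportColHeader "Period";
  DataExportOverwriteFile ON;
};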

Lo and behold, it worked like a champ. It is a valuable lesson: in our usual push to improve performance, we turn things on and off without even thinking about it. Sometimes we have to go back and reevaluate our options.