Why I created a blog

It's been four years since I first created this blog. It has remained true to Essbase and related information, and hopefully it has answered questions and given you some insight along the way. I will continue to provide my observations and comments on the ever-changing world of EPM. Don't be surprised if the scope of the blog changes and brings in other Hyperion topics.

Tuesday, November 24, 2009

Miscellaneous Ramblings

Just a few musings to show you I still exist, since it's been a while since I last posted. I don't know how some people (John Goodwin) have the time or energy to post as often as they do with such good information.

First, as mentioned on Edward Roske's blog, the ODTUG board election results were announced, and Essbase's good friend Tim Tow was not reelected. While we have Mark Rittman on the board, Tim is/was the Hyperion advocate there, and I'm sure he will be missed. We owe a big thank you to Tim for all his hard work to get Hyperion included in the Kaleidoscope conference. It is truly the best home the products have had.

Second, in conjunction with item 1, the Kaleidoscope conference dates and location have been announced. It will be held in Washington D.C. on June 27th to July 1st (2010). While I am not on the Hyperion SIG board this year, I did get a peek at the list of submitted abstracts and they look awesome. I don't know what the final selections will be, but it looks like there will be some great topics. I recommend doing whatever you can to go to that conference. You will not be disappointed. From what I understand, there might even be a track with more introductory topics and case studies for people who want to start learning more about the product.

Finally, I am anxiously awaiting my copy of the new book on Essbase and BI. While I don't remember the name, I know some of the authors and I'm guessing it will be a good read. As soon as I read through it, I'll put a review here.

Monday, October 12, 2009

Open World 2009 Day 2

Oracle Open World Day 2.
This has been a hectic morning, with my needing to deal with some issues at clients, so I missed this morning's keynote. As I looked at the schedule (I did not preregister for any sessions, which was a mistake), I noticed that all of the Hyperion-related sessions are in the InterContinental Hotel on the 5th floor. Interestingly, all of the Hyperion vendor kiosks were outside the rooms. What a great idea: not huge booths, but a place to talk to vendors associated with the products you are looking at. Aside from the interRel kiosk, there was one for Applied OLAP (and their Dodeca product) and Star Analytics, which has a couple of great products as well. The other booths were competitors, so I guess I won't mention them. In total I think there were about 8 kiosks.
I wanted to go to the session on the EPM Roadmap, but when I looked on the board, I saw it was overbooked, so I figured I would have no chance to get in. There were people standing off to the side who had been shuffled there by the door wardens (a thankless job). I figured what the hell, so I got into line expecting to be pushed aside. I got to the door warden, he scanned my card, and he let me in. I figured out afterwards that my blogger pass is worth something. Thank you, Oracle.
The session was led by Bill Guilmart and Al Marciante. Like ALL of the sessions here, the first slide was a disclaimer saying nothing they say (or, for that matter, my reporting of it) is reality and may never be delivered. So take what I say with a grain of salt, or a pound, and rub it into your open wounds.
The session first talked about the existing releases since Open World last year, and since that is all old news, I won't repeat any of it here. As they got to talking about new features, there was a theme I had seen yesterday. OK, I get it: the theme for the conference seems to be "Complete Open Integrated," and the real theme for the session was unifying intelligence from transactional, BI, and management systems. Toward that end, the topics would be:
· Complete Integrated Close
· Extending the Planning Platform
· Expanded ERP integration
· Ease of use enhancements
· Integration
· Portfolio wide improvements
What does all this mean? I'll go through what I was able to jot down while actually trying to listen at the same time (because of this, my notes are not complete).
Financial Close Management is a four-piece enhancement:
1. A financial Calendar to let everyone know the schedule and task deadlines
2. Process monitoring – A dashboard to show how the close is going
3. Task Management. Lots of improvements, including integration with Outlook task lists and calendar
4. Account Reconciliation. This includes
a. Internal/ External reporting
b. Disclosure management (10Q, XBRL, SEC filings)
It will work with MS Office and other products to provide consistent reporting and accountability.

· It too has enhanced process management, including reassigning of tasks (delegation)
· A New Form designer
· Composite forms
· Web form enhancements including
o Conditional formatting
o Ad Hoc Data entry forms
o Formatting options (freeze frame, sorting, filtering, etc.)
o Better member selection
o And Built in validation rules
Public Sector Planning. This section went too fast for me to get the info, but it looked good.

Smart View
· Enhanced Excel experience
· Task lists
· Composite forms from Planning
· Context sensitive ribbon bars
· Enhanced 2007 look and feel
· Better Log in
· For HFM
o Smart slices
o Report designer
o Cascading reports
· For Essbase
o Drill through to ERPI
Shared Services.
Alas, I was not able to get down everything they talked about, but good stuff is coming, including
· New security deployment configuration
· 2 way SSL
· SSL offloading
· (I missed the rest of the points sorry)
It looks like they are doing a lot of work on EPMA to make it really usable; this is a good thing.
· Enhanced Essbase support
o ASO and BSO
· HCPM Validations
· Smart mapping of Planning lists to ASO cubes. OK, this means Planning will create ASO reporting cubes. It sounds like it turns Smart Lists into dimensions (attribute???), as Al talked about 25-dimension cubes for reporting
· Batch updates
· The ability to purge transaction log files
· HFM copy application ability
Calculation Manager
While they have enabled Calculation Manager to work on cubes outside of EPMA, it is also getting better:
· Procedural calculations on ASO cubes (Allocation and others)
· Template ability. Create one and use anywhere
· Parameter passing
Data Relationship manager
· Browser based
· Unicode support (Multilanguage)
· Role based
· Multiple applications per server
· Validations in real time and in batch
· Improved navigation
ERP Integrator (Part of FDQM)
In general improvements in functionality
· Starter Kits – IFRS, Japan, ….
· Enterprise extracts in HFM format
· Integration with Government/Risk
o Segregation of duties
o GRC manager
· Integration with OBIEE Answers
· Related content enhancement
· Grid level and POV passing enabled for snapshots and books
· MS word integration as Word tables
· LCM support to migrate
· Diagnostics and logging
This is a fairly new product and was not integrated into the suite. The major work being done will make it similar to the other products, including SSL, diagnostics, and logging.

HPCM (HCPM) I’ve heard it both ways.
· Expanded Driver functions
· Standard Cost drivers
· Sequence dependent drivers
· Expense assignment functionality
· Exp Model navigation
· POV management
· Improved performance
o 10-20 times better for direct drivers
o 3 times for genealogy
o 140 times from Allocations
And last but not least Essbase
· Improved EPMA deployment
· Improvements in LCM
o Outline compare
o Project naming
· Accessibility (they put the number 508 in parens after it; I don't know what that means)
· Functional enhancements
o Username/character length allowability
o ASO Formula editor
· Web services
· Studio Usability
· Diagnostics and logging
They covered a lot in 1 hour and could not go into a lot of depth, but as you can see, the products look like they are going in the right direction. It’s nice to know that Oracle thinks they are worth all the work.

OpenWorld 2009 Sunday

After an on-again, off-again, on-again scenario, I actually made it to Open World. I got asked to sit on the Customer Advisory Board (CAB) for Essbase. So if there is anything on your wish list for Essbase or one of its related products (Studio, OBIEE integration, etc.), let me know and I'll see if I can get your voice heard.
After a long day in CAB listening to things I’m not allowed to talk about, I’m sitting here with thousands of my closest friends waiting for the opening keynote address. One nice thing, they have set up an area for press and bloggers with tables so I don’t have to try to type in my lap. I hope there is some amazing tidbit about the Hyperion products I can tell you, but I’m not hopeful.
The session is starting with Scott McNealy from Sun Microsystems. His theme is innovation. He started off with a top-10 list of engineers gone wild. His list is cute, but not worth repeating (although I did like his sushi USB drives, and the Nobel prize for the gas mask bra is no more funny than other Nobel prizes). He then went into the top 10 innovations from Sun (not a funny list, but interesting). The list includes the following:
SPARC, Solaris, open source, Java, Open Storage, Blackbox datacenters (in containers for 3rd world countries), different chips, multithreading, etc.
I won’t go into them, but Scott talked about them all. And more.
John Fowler talked about Sun systems, integrated systems, Java, and security. He announced that Sun SPARC/Solaris is number 1 in all the commercial benchmarks, including Hyperion. He talked about the Sun Oracle Database Machine (didn't I see a similar presentation from Oracle and HP last year?): flash on a card that won't wear out like old flash, and a Flash array. A single rack equates to 1000s of disks with 4X the throughput.
It was a real love fest from Sun about Oracle.
Larry Ellison then came out, and the love fest continued with him talking about why he supports the Sun products and will not get rid of parts of them. He talked about his commitment to beat IBM. Expect to see ads on why Oracle and Sun have 25% better processing than IBM while being 6 times more green.

After the keynote, I headed off to the Oracle ACE dinner. I met some old friends, made some new ones, and had a nice meal (thanks, Oracle). While I was at the dinner, the rest of the crew from interRel was at the partner awards presentation. interRel won the EPM Titan award for the second year in a row. Congratulations to Edward, Eduardo, and everyone who was associated with the project.

Friday, August 28, 2009

New Essbase book Review - Oracle Essbase 9 Implementation Guide

Recently, I was asked by Packt Publishing to review a new book on Essbase by Joe Gomez and Sarma Anantapantula titled Oracle Essbase 9 Implementation Guide. I jumped at the chance. There are so few resources available out there for someone new to Essbase, it's a shame. I understand it's a small audience, but since the Oracle acquisition it is growing. I would like to thank the authors for spending their time, energy, sweat, and tears to create the book. I know it is a difficult task, for I have a hard enough time just updating my blog from time to time. So my hat is off to them for their undertaking.

Before my review, I feel it only proper that I give a couple of disclaimers.
1. I am not the intended audience for it, as they say it is for the IT professional who wants to start working with Essbase. I've been doing this way too long, about 14 years now (I think).
2. The company I work for puts out a competing book, Look Smarter Than You Are with Essbase (I am not the author nor an editor or reviewer of it).
3. I realize this is the first printing of the book, and I tried to overlook errors like typos, obvious misstatements, and wrong graphics for the text. I tried to concentrate on the subject matter.

With that said, I came into the assignment with an unbiased and open mind. I attempted to read the book from a new IT professional's point of view while using my knowledge to ensure the material was accurate.

I was excited when I got the book last week and, using time I don't have, began to go through it. I have to say, I really wanted to like the book, as another source of information would be invaluable to the Essbase community. I am sorry to say I feel this book falls very short of being a good guide or reference. There were some good points, but the problems outweighed those few glimmers of insight. There are a number of troubles I had with the book.

First the good. As bright notes, the book talks about Cube Preview in EAS. It is a useful tool for administrators that no one ever talks about. In addition, they spend time talking about report scripts and even go through some of the syntax. Everyone seems to ignore this topic, thinking report scripts are dead. I thank the authors for reminding us that they can still be useful. Even though the classic add-in is dying, they spend a bit of time on Query Designer. This feature can be very useful and they mention some of its high points. I also enjoyed the introduction to data warehousing. It was interesting, although not really applicable to the subject of the book.

What did I not like about the book? I'm afraid more than I liked. I won't go through everything, but will give you a number of items I had difficulty with.

First, the book was hard to follow. While in the middle of a subject, the authors would veer off to talk about something related but minor. In other cases it appeared pages might be missing or thought processes were incomplete. I would get into a subject and it would just end. As an example of being hard to follow, when talking about dimension building, they went into a detailed discussion of MaxL. With data loads, they did it again, and then for calc scripts veered off to Esscmd, all while having a whole chapter on automation and not really adding value to the topic. Another thing that made it hard to follow was that extreme detail would be presented on what the buttons and options on various screens meant, but important functionality would be glossed over. For example, they described the load screen extremely well, but glossed over the two different types of joins, selection and rejection criteria, adding text, etc., giving no examples at all. Finally, the examples were not cohesive. I never really got an overall understanding of what the database they were building looked like. It would morph into different dimensionality without explanation.

Second, there was not a cohesive flow to the examples. It would have been nice to have examples that built on each other. There was talk about the dimensions of the outline, and one example of how to build it, but I would have liked to build on prior steps. Perhaps first create the dimensions manually and build the dimension members manually, then exercises to add the members of other dimensions through load rules. One flaw here (and with the Look Smarter book line) is there should be a source for sample files used to do dimension builds, data loads, and results from calc scripts that would allow the reader to easily follow the examples.

Third, unknown or inaccurate terminology was used. For instance, I've never heard of a "parent dimension." Does this imply that there are child dimensions? Throughout the book the terminology was inconsistent; dimensions were called all sorts of things. I know this next one is trifling, but in the calculations section, they call the SET commands functions. They are not functions but commands. There is a difference, and if you are writing, you need to be accurate with your information.

Fourth, this is a book that is supposed to be based on System 9. In the install section you get the most basic of installs: Essbase, EAS, and the Excel add-in. No mention of Provider Services, external authentication, FR, etc. They talk about Esscmd (a lot). As a dying interface, why spend the time on it? I have to admit the authors do warn you in the automation chapter that Esscmd is being phased out. Get new users using MaxL, since it is the direction of the future. Yes, you can mention that Esscmd exists, but don't waste the reader's time teaching it. I have the same comment about the classic Excel add-in. While I love the add-in, new readers should be learning Smart View. I guess that would have required the authors to discuss how to install and configure APS, which would have made that section longer and more difficult.

Finally, I could live with all of the above and still be relatively happy, but there are too many inaccuracies in the book that would either confuse or mislead readers. In some cases the book would give completely wrong information. Here are a very few of the many examples I found.

Did you know that Smart View costs extra? I didn't, but according to the authors it does.

When talking about attribute dimensions, they use color as an example and put it on their Product dimension. Their Product dimension's level zero members are car models (like Sedan). This makes the reader believe you could have different colors (attribute dimension members) for the same model. Nowhere do they tell you that a base member can have only one attribute from a given attribute dimension associated with it. Attributes on dense dimensions? I don't know; they never tell you if you can, but they allude that it's possible.

A second example is their discussion of two-pass calculation. It is stated that two-pass is only allowed on the Accounts dimension and only on Dynamic Calc or Dynamic Calc and Store members. While "only on the Accounts dimension" was true back in version 3 of the software, it has not been the case since Dynamic Calc came along. And "only on Dynamic Calc (and Store)"? If that were the case, why is there a command called CALC TWOPASS? Further, in the section on two-pass, they give an example and lead the reader to believe that if you perform aggregations on the database with the dimensions in different orders, your results will be different. The example shows data that is added up to parents. The last time I checked, addition and subtraction were commutative. I surely hope that adding up my database in different orders will not affect the results. Had they used the typical example of a ratio giving different results depending on whether you add the sums or sum the adds, I could understand, but this example makes me scratch my head.
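For what it's worth, a stored member tagged two-pass (on any dimension) can be recalculated with the CALC TWOPASS calc command after the main aggregation. A minimal sketch of my own, not an example from the book:

```
/* Aggregate the database, then recalculate all members tagged Two-Pass */
CALC ALL;
CALC TWOPASS;
```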

A third example of misinformation is in the section on calc scripts. They give the following calc script in an example:
Fix(@IDescendants("Calendar Periods"))
    "Gross Sales" = "Sales" - "Discounts";
EndFix
While this calculation is not incorrect, it is no different than just running the calculation on the entire database. In the discussion on FIX, it is stated that dimensions left out of a FIX statement are excluded; in truth, dimensions left out of a FIX statement are included at all levels. Like the above, a lot of the examples are not well thought out. For @SUMRANGE, they use @Descendants(Products); based on the outline example they give, you would be summing together multiple levels, parents and children. I doubt that would give the answer one would want. I thought it interesting that they showed examples of calculations, but never showed the starting point or the result. The book claims that after reading the chapter on calculations, you would not need to take a calculation course. I believe the opposite is the truth. After reading their explanations I would need a course more than ever. If I knew nothing about calc scripts, I would have come out of that chapter more confused than when I went in.
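To illustrate the point about FIX with a sketch of my own against Sample.Basic (not an example from the book): dimensions you leave out of the FIX are included at all their levels, so to really restrict a calculation you have to name every dimension you care about.

```
/* Only Market is restricted here; Product, Year, Scenario, and
   Measures are all included at every level */
FIX("East")
    "Budget" = "Actual" * 1.1;
ENDFIX

/* To truly limit the scope, fix on the other dimensions too */
FIX("East", "Actual", @LEVMBRS("Product", 0), @LEVMBRS("Year", 0))
    "Budget" = "Actual" * 1.1;
ENDFIX
```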

I don’t remember a section on security and I was not impressed with the section on optimization. As you can tell by now, I was rather disappointed with the book. As I said earlier, I applaud the authors’ intentions but I think this book falls much too short to be useful.

Thursday, August 13, 2009

Upcoming Webinar

On Tuesday, August 18th (2009), I'll be giving the weekly interRel webinar. It will be on little-known features of Essbase. It is basically the same presentation as the one I gave at Kaleidoscope in June. If you missed it then, try to attend this week. You can register for it at www.interrel.com
I was scheduled to repeat the webcast on Thursday, August 20th, but I'll be unavailable, so my co-worker and friend Cameron Lackpour will be giving it then. Heck, sign up for both and see who does it better. At any rate, I think it's a worthwhile topic, and everyone I know who has seen it has picked up a few tidbits (or more).

Monday, August 10, 2009

4th addendum to Data Export

Well, I guess the team at Oracle was listening to someone. I just looked at the release notes for EPM Fusion Edition and there, plain as day, is the bug fix:

Data Extraction. While exporting data to a relational database, DATAEXPORT does not create the delimiters if there are one or more missing values in the last column. [8507606]

Another problem I was getting at a client was that when I would remote onto a server using the console option, the Essbase application would hang when I logged off. That apparently has been fixed as well:

Agent. When Essbase is running as a Windows service and the domain user logs off the machine,
the Essbase applications hang, which in turn causes Essbase Server to hang. [8279377, 8464004]

While I worked around both of these problems, it's nice to know I won't have to in the future.

Monday, July 13, 2009

3rd addendum to DataExport

My last blog post talked about dropping columns when exporting to relational using the DATAEXPORT command.
I found the answer looking through the knowledgebase. It turns out you have to set DEXPSQLROWSIZE 1 to get it to work. I don't know if I like this answer, as it requires you to basically turn off bulk insertion, but at least it works and I can continue with my development without having to add JExport to the mix.
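For anyone else hitting this, the setting goes in essbase.cfg on the server (this is my reading of the knowledgebase entry; a restart is needed for it to take effect):

```
; essbase.cfg
; Forces DATAEXPORT to insert rows one at a time instead of bulk inserts,
; which works around the dropped-column problem
DEXPSQLROWSIZE 1
```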

Friday, July 10, 2009

2nd Addendum to Data Export

OK, I'm starting to see why people don't like Data Export. I'm trying to export a particular set of data to a relational table, and for some reason one of the dimensions is not showing up in the table. If I change the export to write to a file, the column is there. It is interesting that the order of the row members is different in the flat file than in the relational export. What appears to be happening is that it is taking one of my two dense dimensions, which shows up as the last row member before my data values in my flat file, and moving (or perhaps overlaying) what is my second row member in the flat file. I know that is clear as mud, so to show you an example, a row of my flat file export looks like:


In the relational it looks like:

so it is shifting or overwriting the R1 with Tier1 and where tier1 should be is blank.

If anyone has figured out a workaround for this other than to export to a file and load it into relational, let me know. In the meantime I'm opening an SR with Oracle to see if they have a fix.

Tuesday, July 7, 2009

Addendum to Data Export post

As an addendum to my post on Data Export, John Goodwin reminded me of a workaround I had to do. There is a problem when you are exporting to a relational table and your column dimension does not have values in the trailing members (for example, you have Jan-Dec in the columns and you only have data in Jan-Mar): the data export will fail because the record columns don't match the table columns. In order to get around it, I set:
DataExportDynamicCalc On
DataExportLevel ALL

Then in my FIX statement, I had to make sure that the last column of the export would always have data. Testing it by sending it to a flat file confirmed that if the last column had data, the intermediate columns would have something as well. In Sample Basic, if Year is your column dimension, in your FIX statement you could fix on Jan:Dec and Year. Since Year is a dynamic calc member, if any month has data, so will Year. (Note, this could be a problem if you have time balance accounts without skip missing turned on.) Because you need to set all levels, you also have to make sure you fix on the level zero members of your other dimensions as well. I set the other dimensions to @RELATIVE(dimension name, level zero). Since the last column had data, it would load to the end.
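Putting the pieces together, the calc script looked roughly like this. This is a sketch using Sample.Basic member names and a made-up DSN, table, and credentials; adjust everything for your own outline and database:

```
SET DATAEXPORTOPTIONS
{
    DataExportLevel "ALL";
    DataExportDynamicCalc ON;
};

/* Year is dynamic calc, so if any month has data the last column will too */
FIX(@RELATIVE("Product", 0), @RELATIVE("Market", 0), "Actual",
    "Jan":"Dec", "Year")
    DATAEXPORT "DSN" "MyDSN" "MyFactTable" "username" "password";
ENDFIX
```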

In talking with others, they have had problems with Data export, So far, I’ve not found anything I could not work around.

Monday, July 6, 2009

Dataexport is great

I've had the need to work with the newish DATAEXPORT calc command recently and thought I would share some things I've found with it and with the SQL interface in load rules. I think you will find my musings interesting.

First, I was using Data Export to export data to a flat file to act as an audit report for some users. It worked like a charm. Some of the things I found: if you specify to export level zero, no matter what you put in your FIX statement it will only look at level zero blocks. Using DataExportDynamicCalc ON allowed me to export dynamically calculated members as well. For my 20K rows it did not seem to slow the export down, but I don't know the impact on a big data set. I could also specify DataExportDimHeader ON to get column names. Using DataExportColHeader "Dense dimension name" I could specify the dense dimension I wanted as the columns. It would have been nice if I could put a single member of a sparse dimension there, but I understand why I can't.
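A minimal sketch of the flat-file export I described, assuming Sample.Basic and a made-up file path:

```
SET DATAEXPORTOPTIONS
{
    DataExportLevel "LEVEL0";        /* only level zero blocks are considered */
    DataExportDynamicCalc ON;        /* include dynamically calculated members */
    DataExportDimHeader ON;          /* write dimension names as a header record */
    DataExportColHeader "Measures";  /* the dense dimension to lay out as columns */
};

FIX("Actual", @RELATIVE("Product", 0), @RELATIVE("Market", 0))
    DATAEXPORT "File" "," "c:\exports\audit.txt" "#MI";
ENDFIX
```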

Next I needed to back up some of the static information from my cube; static in that it is user input for my testing and I didn't have an easy source to reload from. I set up a FIX statement and used the binfile option (DATAEXPORT "Binfile" "fileName"). It created a file that I could reload to my db. I can see the usefulness of this on a production database where you need to save off something like budget, clear the database, and reload it with source actuals and the budget. It's much easier than the old export and reload and much quicker. In addition, you can be selective about what you export instead of exporting everything.
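The save and restore are just two small calc scripts; a sketch with a hypothetical file name:

```
/* Script 1: save the Budget slice to a binary file before the clear */
FIX("Budget")
    DATAEXPORT "Binfile" "c:\backup\budget.bin";
ENDFIX

/* Script 2 (run after the clear and the actuals reload): bring Budget back */
DATAIMPORTBIN "c:\backup\budget.bin";
```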

Finally, I needed to populate a SQL database with data from Essbase, modify it, and load it back. Yes, there are some things that SQL can do better than Essbase. In this case, it was to take two disassociated sets of data and merge them together. It needed to join them on the one common dimension member and basically turn the two 3K-row tables into about 1.5 million records that get loaded back into Essbase. I set up the ODBC driver with no problem and exported the data into flat files to see their structure. I then created tables that matched the structures. I will say that here is where I had minor difficulty. If the columns don't match exactly, the export fails with little information (just the typical messages in the log that tell you "you screwed up"). I played around with the table, figured out that I had miscounted the columns, fixed it, and it worked fine. I defined the amount columns as float and found that for #Missing values Essbase stuck -2e-15 in the columns that were once #Missing in Essbase. A quick stored procedure and I converted them to null.

Oh, but wait, how could I run the stored procedure? I could have run OSQL, but on the instance I was working on the tools were not working right. I could get into SQL Server, but could not run OSQL or BULK INSERT. So, thinking swiftly, I thought of load rules. A load rule is just supposed to take SQL commands, so how could I get it to run a stored procedure? I know I can put statements like a UNION in a SQL statement, so I tried something like:

EXEC myStoredProcedure

I clicked OK/retrieve, entered my id and password, and lo and behold, I got the system date back into my load rule. I checked my tables and the bad characters were converted to nulls. Wow, it worked. Who would have thought? I figured I could use the load date to update the alias of a member to set the last time the calculation was run. Another suggestion I had was to use rejection criteria to reject the row and load nothing. I used this technique to run another stored procedure that truncated and populated a table from the tables I loaded, so my next step was to create a load rule and bring the data back in. Everything was done without having to resort to anything but MaxL statements.
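The trick, as I understand it, is simply typing the EXEC into the load rule's SQL data source box instead of a SELECT (hypothetical procedure name; SQL Server syntax):

```
-- Entered in the load rule's SQL statement box.
-- The procedure does its work; whatever it selects at the end
-- (in my case, the system date) comes back as the "data", which you
-- can load somewhere harmless or discard with rejection criteria.
EXEC dbo.FixMissingValues
```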

I've since added a custom-defined function that will run the SQL statement directly from the calc script. I got this from Toufic Walkim, the development manager for Smart View, a very nice guy and CDF guru. Some clients don't like the idea of CDFs, so I have my original method available when necessary. If this works, you can get the CDF here.

Adding DATAEXPORTENABLEBATCHINSERT TRUE to my config file made the process run faster, as it allows the dataexport to use a batch insert method (when the ODBC driver allows it).
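That one is also an essbase.cfg setting (again, a sketch; restart the server after changing it):

```
; essbase.cfg
; Lets DATAEXPORT use batch inserts when the ODBC driver supports them
DATAEXPORTENABLEBATCHINSERT TRUE
```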

As I use DATAEXPORT more, I'll comment on my findings, but I have to say I'm impressed so far. I have asked for enhancements to the command, and they have been added to the enhancement list. I'm interested to see if or when they are implemented. I was very happy when I was told they were added to the list, since it appears that Oracle is open to hearing what people recommend. Some of the things I recommended were:
Add an append option to the file export so you could have multiple fix statements write to the same file
Add a debug option to the SQL export to see the SQL statement generated to make it easier to debug when you have problems
Allow aliases to be exported instead of member names.

Let’s see if these get implemented.

Friday, June 26, 2009

Kaleidoscope wrap up

Well, I just got back from Kaleidoscope and am still on a sleep-deprived high. I don't think I got to sleep before 2 am any night, thanks to the special midnight madness (ask-the-experts panel) and getting sucked into games of Werewolf. On Wednesday night we had over 70 people playing or watching multiple games. But the conference was not about game playing, it was about knowledge.

I presented two sessions: 10 Optimization Tips You (Probably) Never Heard Before and Little Used Features of Essbase. My first session was a 60-minute slot and I went 75 minutes, and hardly anyone left the room (sorry guys, I didn't realize I was running so long). For my next session (back to back with the first, argh), I had someone with cue cards telling me how much time I had left. This was a 90-minute session and I never made it to the last two topics, reference cubes and data mining. I put these as the last topics because I think they will be the least used, but many were disappointed that I didn't talk about data mining. Frankly, I think this is really better done in relational with larger sets of detailed data anyway.

The pre-conference started on Sunday with a symposium put on by the Oracle product managers. They discussed the road map and new products and solicited feedback. A lot of what was in the presentations we were asked not to blog about. For me the best presentation of the day was from Toufic Walkim, the new Smart View product manager. Toufic has been around a long time and has helped me with numerous things (I believe he wrote the original JExport CDF). He talked about changing the data connection manager (as he said, trying to get it right for the 4th time); the direction they are going looks like the right way. He also talked about a bunch of new functionality, like context-specific menus (so if you are in Essbase you only have Essbase options) and a lot of other things. In the middle of the day, Robert Gersten talked at a high level about where things are going. It looks like they are putting a lot of R&D into the products.

Sunday night was a reception for all attendees; it was well attended and the food was good. I was in the interRel booth most of the time, so afterwards we went and grabbed a bite of food on our own. I'm not sure if I can call the interRel booth a booth. There were couches and comfy chairs in front of a TV playing the best movies (The Princess Bride, The Goonies, Lord of the Rings, and Spaceballs were some of the offerings during the week). There were also video games and Rock Band going most of the time.

Monday started with a keynote from John Kopcke. It was an interesting discussion on using EPM and management excellence. I think it might have been lost on some of the APEX and other developers in the group, but it was a good talk sprinkled with humor. We then went into sessions. The first session I attended was Calc Scripts for Mere Mortals by Gary Crisi and Edward Roske. I was the ambassador for the session, so I introduced them. I have heard the presentation before from Edward, but it is always good to get a refresher; I always seem to come away with something new or something I had forgotten. At the same time there were two other presentations going on and a hands-on lab. At the lab, people almost came to blows trying to get in. You had to pre-register for it and it filled very quickly. People tried to crash the party, but when there is no room, there is no room. We have talked about trying to expand the labs next year, but as a reminder, next year sign up as soon as the schedule opens.

Next I sat in on a presentation from Matt Milella on the top 5 Essbase CDFs. It was a good presentation, which he said he was putting on the Essbase Labs blog. They have a library of over 300 CDFs that they are trying to find a home for. If you need something, ask them; they have probably already created it. Matt did a very good job and I can see a lot of use for his CDFs.
After that were vendor-sponsored presentations, the only sponsored sessions of the week. I heard good things about all of them. They were less ad-like than one would expect and all offered technical content. I was asked to sit in on the one interRel offered on implementing HFM in less than six months. I'm not an HFM guy, but Tracy McMullen did a good job. It was really more about managing this type of project to get it done than about the product itself, but that is what is so important in this type of project.

For the last session of the day, I sat in on Driver-Based Forecasting by Ron Moore. He had good content, but I was pulled away by some work issues and did not get to see the entire presentation. Bummer.

The day ended late with a special caffeine-and-chocolate Ask the Experts session from 10:00 to midnight. It was better attended than I expected, with over 60 people asking a panel various questions. I don't think the panel got stumped at all. A rare occurrence. After the panel, some people got together to play werewolf. I'll not tell you much about it, but it's a chance to kill off your friends, coworkers, etc. For more info go to http://www.eblong.com/zarf/werewolf.html
All I can say about Tuesday is ASO, ASO, ASO. I sat through two presentations from Rudy Zuca, Designing Your Way Out of BSO with ASO and ASO Fundamentals (these two were out of order due to an oversight), then Querying ASO with MDX from Mike Nadar and Gary Crisi, followed by I Knew How to Do It in BSO, How Do I Do It in ASO. That was followed by Optimizing MDX in ASO and Optimizing ASO, both given by Steve Libermensch. All I can say is wow. While there was some overlap in content, if you didn't know ASO before the sessions, by the end of the day you were a quasi-expert. The best thing I learned all day was from Steve: if the application log says an ASO formula is calculating in cell mode, the formula is too complicated for Essbase to handle efficiently. Create new members in the cube with intermediate calculations that your formula references. Even though these still have to be evaluated dynamically, it is more efficient for the ASO formula optimizer.
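To illustrate Steve's tip, here is a rough sketch of what splitting a formula might look like. The member names and formulas are my own hypothetical example, not from his presentation:

```
/* One monolithic MDX formula on [Margin %] that might push the
   optimizer into cell mode: */
([Measures].[Sales] - [Measures].[Discounts] - [Measures].[COGS])
    / ([Measures].[Sales] - [Measures].[Discounts])

/* Instead, add intermediate dynamic members to the outline... */
/* formula on [Net Sales]: */
[Measures].[Sales] - [Measures].[Discounts]
/* formula on [Margin]: */
[Measures].[Net Sales] - [Measures].[COGS]

/* ...and let the final formula reference them: */
/* formula on [Margin %]: */
[Measures].[Margin] / [Measures].[Net Sales]
```

After restructuring, check the application log on a retrieve; if the formula no longer reports cell mode, the optimizer is handling it efficiently.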

Tuesday evening was a sundown session with the Oracle ACE Directors: Tim Tow, Edward Roske, Tracy McMullen and Mark Rittman. There was interesting discussion during the presentation, but not any one thing that sticks in my mind. After the session there was a reception, then I was off to the Oracle ACE dinner and afterwards an appearance at the Hyperion SIG mixer, then more werewolf.

I was very busy Wednesday, as I was asked to sit in for moral support on an optimization session by Scot Marin from Cash America. Scot did a fine job and I just chimed in on some of the more arcane questions. After that were my two sessions; as I said before, I think they went pretty well. I missed the session on How Essbase Thinks. I really wanted to go to that, as Tom Tortolani and Edward Roske were speaking (Tom helped create Essbase long ago), but I got called and had to do real work. Bummer. I was able to attend the 64-bit optimization session. It reminded me that I have to rethink and retest when going to 64-bit, as the rules change a lot; in some cases you throw away the optimizations you did for 32-bit. For my last session of the day I was involved in an optimization round table. There were a ton of good questions. Wednesday night was the Kaleidoscope event, dinner and a comedian. Both were very good. Kaleidoscope does not skimp on its food budget; there was variety at every meal and all of it was tasty. After the comedian there was a DJ for dancing. Since I have two left feet, I went with a group for the last werewolf game. It was a lot of fun.

Thursday morning came very early and fast. Through some sort of mix-up, the speaker for the first session I was going to attend did not show up, so I had conversations with a number of others in the room. Then I listened to Cameron Lackpour talk about MaxL: using variables, error handling, encryption and, finally, putting it all together to do something that Planning currently can't do, use metaread filters. It was a very good presentation. I watched the first half of Automating with Perl by Angie Wilcox, but had to leave early and hit the road for a long drive home.
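To give a flavor of the first two topics, here is a minimal MaxL sketch of my own (not Cameron's code) combining positional variables with iferror-based error handling; the label names and the calc are made up:

```
/* run as: essmsh myscript.msh username password servername */
login $1 identified by $2 on $3;
iferror 'loginFailed';

execute calculation 'Sample'.'Basic'.'CalcAll';
iferror 'calcFailed';

logout;
exit;

define label 'loginFailed';
echo 'Login failed - check credentials and server name';
exit;

define label 'calcFailed';
echo 'Calculation failed - see the application log';
exit;
```

Passing credentials as arguments keeps them out of the script itself, and each iferror jumps to a label so a failed step doesn't silently cascade into the next one.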

In about 90 days the presentations will be on the ODTUG website. If you were not able to make the conference, look for them there. I recommend you start putting a bug in your boss's ear for next year. I'm sorry for those who could not attend; you missed a lot.

Oh, I forgot to mention that the new ODTUG Hyperion SIG board was announced. Elected were (in random order):
Edward Roske
Angie Wilcox
Natalie Delemar
Jeff McAhren
Doug Bliss
Gary Crisi
Cameron Lackpour
Quinlan Eddy
John Weimar

Please give them your support and offer assistance. You don’t have to be on the board to help.

Friday, June 5, 2009

Ask the experts at Kaleidoscope

On the Tuesday night of this year's (2009) Kaleidoscope conference there will be an Ask the Experts panel. We are soliciting questions for that session. If you have a burning desire to know something, now is your time to ask. Send your questions to Edward Roske at eroske@interrel.com (or you can send them to me), or you can even give them to us at the conference.

From what I hear, this special "Midnight" session, hosted by the Hyperion SIG, will feature lots of chocolate.

If you are unable to attend the conference this year, don't fret. I know Edward is planning on blogging on it and if I can get time, I'll post my thoughts on the sessions as well.

If you are at the conference, please say hello.

Thursday, May 28, 2009

Kaleidoscope revisited

With Kaleidoscope less than a month away, I sent in my presentation on little used features of Essbase. I think there is some cool stuff in it; at least I was impressed with some of the things I found. As a preview, I talk a lot about load rules and things I never used in them, the expanded role of substitution variables in newer versions, triggers, query logging and much more. I'm excited to give the presentation, as I think it has information anyone can use. I may end up doing a second presentation for a presenter who can't attend, but that is still in flux right now, so I won't comment on it more. After Kaleidoscope, I'll post sections of my presentations here for those who could not attend. But it's not the same as hearing it in person; for one, you don't get my dry humor (ok, attempt at humor) or my enthusiasm.

Speaking of attendance, if you are still on the fence about going, I recommend it (as I have in the past). I've seen some of the presentations and they are awesome: technical, with things that developers and advanced users need. For a couple more days, you can use the code IRC (interRel Consulting) to get the early discount, but I think it expires on June 2nd. Last year I paid for the conference myself and it was worth it. Take vacation if you have to in order to attend. It's better than any single training you can go to.

I hope to see you there. If you are, introduce yourself, or if I know you and you are not avoiding me, say hi.

Tuesday, March 24, 2009

Random thoughts

It's been a while since I entered anything on my blog, so I thought I would share some random items.

First, I spent a week at Oracle at an event called Xmonth, a partner seminar. The 15 or so people in the seminar were given an overview of things to come and had a chance to play with some of the features of the products. While non-disclosure does not allow me to say a lot about it, I will say that most of the Hyperion products are strategic in Oracle's future.

Second, I finally got out of the 18th century and joined LinkedIn. I did it to join the ODTUG Kaleidoscope group at http://www.linkedin.com/groups?gid=1796201 I'm not sold on it. It's a fad, like PCs and fire, but I look forward to seeing what it does for me.

Third, as I've mentioned before, I'm presenting at the Kaleidoscope conference a session called Little Used Features of Essbase. If there are particular features you would like more detail on, or that you think others would benefit from knowing about, let me know; I'm putting together the full content now. I've spent enough time talking about the merits of this conference, so I won't bore you more with it. If you want more information, look at the agenda at http://www.odtugkaleidoscope.com/

Finally, a little technical content.
I had a client create a small cube for some really quick analytics, under 100 MB. I got a call that when they tried to export data out of it, they got an error. When they sent me the log, I saw it was actually throwing an exception and crashing. I looked through the log and quickly found the problem: they had defined all of the dimensions as sparse. I replicated the problem on my system and, sure enough, it crashed as well. So it appears that in order to export data, you have to have at least one dense dimension. We talk about tiny blocks if you make everything sparse; well, here is a real reason not to.
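If you want to test the behavior on your own system, a plain MaxL export like the following is enough to trigger it on an all-sparse outline (the credentials and file name here are placeholders, and Sample.Basic stands in for the client's cube):

```
/* a minimal MaxL export; on an all-sparse outline this threw an
   exception and crashed, so keep at least one dimension dense */
login 'admin' identified by 'password' on 'localhost';
export database 'Sample'.'Basic' all data to data_file 'exp.txt';
logout;
exit;
```

Flipping a single dimension back to dense made the same statement run cleanly.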

I've also been playing with Smart View in EPM Fusion Edition 11.1.1, and while I have been an Essbase add-in bigot, I have to say the enhancements in the later versions make me want to use it more. If they could add substitution variables, the ability to show both member names and aliases, and the use of multiple alias tables at once, I would probably convert completely to Smart View. Oh, and I would like HFM to be part of the common provider; I believe that is coming. There are some really cool things: multiple retrievals on one sheet, the ability to create slices and, even better, the use of those slices to create reports with grids, tables and graphs make Smart View my choice for a lot of quick reporting. I also like that you can pull in Planning forms and use them. The next version will have more functionality, but you will have to wait until Oracle tells you about it. Just wait, it's cool.

Thursday, February 5, 2009

Update on report script commands and more on ODTUG

I was looking through the Tech Reference (9.x) the other day for something and noticed a couple of report writer commands I had not seen before. They deal with using aliases or member names. As you might remember, we used to use {OUTMBRALT}, {OUTALTMBR}, {OUTALT}, {OUTALTNAMES} or {OUTMBRNAMES}, sometimes with little success. You had to place one of these commands before the dimension you were defining in the report and change it before the next command if you wanted something different for the next dimension.

When they added duplicate member names, they created new commands. These offer more flexibility than the old ones: you can define multiple dimensions at once that share the same attribute.

For example, if you want aliases for Product and Market you would specify:
<RepAlias "Product" "Market"

It should be noted that you can't mix these commands with the old ones (the exception being <OutAltSelect to select the alias table), and you can't use them if you are using rename functions. There are additional commands to get fully qualified names; by looking the above up in the Tech Reference you should be able to find them.
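Here is a sketch of how <RepAlias might fit into a simple report script. The layout commands and member selections are my own illustration against Sample.Basic, not an example from the Tech Reference:

```
// One command shows aliases for both row dimensions at once
<PAGE ("Measures")
<COLUMN ("Year")
<ROW ("Product", "Market")
<RepAlias "Product" "Market"
<ICHILD "Product"
<ICHILD "Market"
!
```

With the old {OUTALTNAMES}-style commands you would have had to toggle the setting around each dimension; here a single line covers both.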

You may have noticed I've been posting a lot about the ODTUG conference. I had a comment about me being crazy (well, not as nicely put as that) to suggest that people pay for it themselves if their company won't. I really believe this conference adds so much value and knowledge that it is worth it. I paid my own way last year and would again if my boss were not so supportive. (An amazing compliment. Edward, don't get used to it.)
If you need a rationale to present to your boss to compel him/her to let you go, I suggest looking at Gary Crisi's blog entry: http://garycris.blogspot.com/2009/02/some-tips-for-kaleidoscope-2009.html He has great tips on how to get this conference approved in these tight times.

Tuesday, February 3, 2009

Exciting news about the ODTUG Kaleidoscope Conference

I just got word that ODTUG Kaleidoscope has confirmed John Kopcke for its keynote address and Robert Gersten for the Sunday Hyperion Symposium keynote. In addition, there are hands-on sessions planned for OBIEE, ODI, Essbase Studio and creating your first Java Essbase application. (Standard disclaimer: this is all subject to change.) I believe there will also be a "Midnight Madness" for old people (it starts at 8:00 pm) where you can bring your tough questions to the Essbase ACEs and get them answered.

While I realize budgets have been cut this year in many organizations, this is such a good conference with so much in-depth information that it is something you should not miss, even if you pay for it yourself. All the cool nerds will be there, including all of the Oracle ACEs and ACE Directors in the BI space. Last year we were a minority of the conference with one track; this year there will be four tracks, including the hands-on sessions. You could not pay for and get this much training in one week.

While this might sound like an ad, and in some ways it is, if I did not believe this were the best technical Hyperion conference to go to, I would not be talking about it.

Tuesday, January 27, 2009

Blog Rant

I apologize in advance if this entry offends anyone, but today I want to talk about a pet peeve of mine. As most of you know, I am a frequent contributor to the Oracle and other forums. There have been multiple threads with a common theme: "I know nothing about Essbase, how do I get certified?" In addition, I've looked at web casts and other web sites that help you with the certification questions. One went so far as to say "We know our questions and answers are good, we buy them from the internet." They are willing to teach you, not how to use Hyperion products, but how to pass the test. I will admit they have a disclaimer that passing the test is no substitute for experience. If you don't know anything about the product, you should not be certified.

Certification means nothing without experience. So what if I can memorize the formula for computing block size? Unless I know how to apply it and understand the implications of block size on performance, it means nothing for my clients. Diluting the pool of certified professionals just cheapens the value of certification. I have interviewed (and worked with) certified people who could not answer the most basic questions and who wasted so much client time and money. In some cases, people have gotten jobs (permanent or consulting) by being certified. Did they last long? In most cases, NO. So they got paid for a short time, don't have a reference, and leave behind unhappy and upset clients. That makes it harder to get the next job.

I realize that some people get certified because their company wants them to. I’ll admit, that is why I got my latest certification, but I have the experience and knowledge to back it up. Can you say the same thing?

Don’t get certified for the wrong reasons, get it because you know and understand the product(s) and can truly be a help to the client(s) you serve.

Please, I’m interested in your comments, Tell me why I’m wrong!