Why I created a blog

It's been many years since I first created this blog. It has remained true to Essbase and related information over those years, and hopefully it has answered questions and given you insight. I will continue to provide my observations and comments on the ever-changing world of EPM. Don't be surprised if the scope of the blog changes and brings in other Hyperion topics.


Monday, February 1, 2021

Using stored procedures in Essbase Drill through



And they said it could not be done.  And then they didn't.  What am I talking about?  Using a stored procedure inside a data source to pull data for a drill through report.  I have to thank Ashish Jain for taking the time to show me how to do it.  It is apparently an undocumented feature that will actually be enhanced in the future. I will walk you through the steps I took to get this to work on my Essbase 21c instance against a SQL Server database. Of course, your mileage may vary if you are using a different RDBMS; all have similar syntax, but the details will vary.  As I said, I did this in 21c, but it will work the same way in 19c.

And now I will tell you my title and previous paragraph lied to you. We do not use a stored procedure in this case, but a table-valued function.  I am going to use Cameron Lackpour's favorite database in the whole wide world, Sample Basic, for my example. From an old Studio build, I have a relational database named TBC_Sample that has the underlying data in it.

The first thing I did was to create the function with parameters. For my example, I created two parameters: one for the period and one for the product. Of course, I could have created any number, depending on what I wanted to base my drill through on.

Here is the function I created:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      Glenn Schwartzberg
-- Create date: 02/01/21
-- Description: This function is used to perform an Essbase
--              drill through by passing parameters
-- =============================================
CREATE OR ALTER FUNCTION fun_Drill_through
(
    -- Add the parameters for the function here
    @Period  varchar(40) = 'Jan',
    @Product varchar(40) = '100-10'
)
RETURNS TABLE
AS
RETURN
(
    SELECT pd.sku
          ,FORMAT(f.[TRANSDATE], 'MMM') AS [Month]
          ,m.State
          ,acct.child
          ,s.Scenario
    FROM Sales f
    JOIN [dbo].[PRODUCT] pr
        ON pr.productid = f.productid
    JOIN [dbo].[PRODUCTDIM] pd
        ON pr.sku = pd.sku
    JOIN Market m
        ON m.stateid = f.stateid
    JOIN Measures acct
        ON acct.measuresid = f.measuresid
    JOIN Scenario s
        ON s.[SCENARIOID] = f.[SCENARIOID]
    WHERE CAST(FORMAT([TRANSDATE], 'MMM') AS varchar(40)) = @Period
      AND (pr.sku = @Product OR pd.family = @Product)
)
GO

You will notice I have hard-coded default values for @Period and @Product. These get replaced when I pass in parameters.  I tested in SQL Server Management Studio to make sure I got what I needed (and that it actually worked):

select * from fun_Drill_through ('Feb','100-10')
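A side note on those defaults: unlike a stored procedure, a table-valued function will not quietly fall back to its parameter defaults if you leave an argument out. To use the defaults, you have to pass the DEFAULT keyword explicitly, like this:

```sql
-- Passing DEFAULT picks up the hard-coded defaults ('Jan' and '100-10')
select * from fun_Drill_through (default, default)
```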

 

Here is a sample of what I got back.

sku      Month   State      child   Scenario
100-10   Feb     New York   Sales   Budget
100-10   Feb     New York   Sales   Budget
100-10   Feb     New York   Sales   Budget
100-10   Feb     New York   Sales   Actual

 

I was very happy when I got this to work.

So now on to creating the data source. I will assume you have already set up the connection to the DB instance. Remember, both connections and data sources can be created at either the server or the database level. I'm doing server level in this example but, for my clients, I have found it is often better to create them within the application to avoid confusion. Doing it at the application level also makes migration between environments a little easier.

I'm not going to walk you through the steps to open the editor, but I created a new data source and connected it to the connection. I gave it a very stupid name, Test_Function, and in the Query I used the same query as when I tested it in SQL Server Management Studio EXCEPT I did not hard code the values. Instead, I used question marks for the variables.  I talked about this in a presentation I did recently. Using question marks allows you to use parameters, that silly step that you could never figure out how to use.
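So the query in the data source ends up looking like the SSMS test, with question marks standing in for the two values (the exact text below is my reconstruction, assuming the same function as above; the screenshot showed the real thing):

```sql
-- The '?' placeholders become the named parameters on the Parameters screen
select * from fun_Drill_through (?, ?)
```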

Once I had the SQL in place, I clicked Next (well, actually I clicked on the Columns tab), and I was amazed to see it read the function and brought back my columns.
Clicking Next (or Parameters), I now actually get something on the Parameters screen.
I edited the parameter names and gave them default values. Note: there is a bug; if you go back to General, you will lose what you entered on this screen.


Going to the Preview screen, we can see sample data. Wow, it works.

Of course, we have to save the data source lest we do all of this work in vain.

Now on to the drill through report. I inspected the database and went to Scripts, where I created a new drill through report. I gave it a name and associated it with the data source I had just created.  I selected all columns but did not associate them with dimensions.

I then went to my drillable regions and added this drillable region:

@levmbrs("year",0), @levmbrs("Product",0), @levmbrs("Product",1)

If you look back at my SQL, you will see I am doing the comparison on product against two different columns of the joined SQL. That is so I could drill from multiple levels of my cube for the Product dimension. More on this in a minute.

Next, we navigate to the Reporting Parameters tab. Wait, what is this? You've never seen this before? That's correct; if you did not create parameters in your data source, the option is hidden. In this tab, we see the parameter names and default values from the data source; these are not changeable. We do have to assign these parameters to dimensions, in my case Year and Product. This is where it is slightly different than column mapping. Notice the options are None and X generations, based on how deep the hierarchy goes. If we did not name our generations, it would be the generic Generation 1, 2, etc.

In the General mapping we have a level 0 option, and with that option we get back a list of level 0 members underneath the member we selected. That does not exist here. We get back a single member from Smart View.  The None option returns whatever member you are on; a generation option will only return a member from that generation.  I think the None option is safer. There is an enhancement to return a list of members, but I don't know when it will be delivered.

Once I had all this in place, I saved the drill through report and went to Excel to give it a try.

I created this grid:

I then tested where I could do drill through. I could do it at the months, level 0 products, and level 1 products (as expected).  I first tried the simple case of Mar and Diet Cola; I wanted to make sure it was not picking up the defaults. What I got back was (partial result):

I then tested to see if the retrieval from a level 1 product would work, so I changed the retrieval grid and selected Apr and Root Beer. It looks like it worked, as I got back:

Note: if I were really doing this, I'm sure I would have included the aliases in the report for the SKU so it was more meaningful.
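That would just mean adding the alias column to the function's select list. A sketch only; the column name SKU_ALIAS is my assumption about the TBC_Sample schema, not something I verified here:

```sql
-- Hypothetical tweak: surface the product alias alongside the SKU.
-- SKU_ALIAS is an assumed column name in PRODUCTDIM.
SELECT pd.sku
      ,pd.SKU_ALIAS AS [Product Alias]
      ,FORMAT(f.[TRANSDATE], 'MMM') AS [Month]
      -- ...remaining columns and joins exactly as in fun_Drill_through
```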

So there we go. Again, I would like to thank Ashish for the information on this and Glenn (what a great name) Hirshon for pushing to get an answer on this.
