Monday, December 12, 2016
An undocumented Update in the FCCS Dec-16 release
To do that, I went into the currency dimension and simply checked the reporting currency box for the currency I wanted.
Thursday, December 1, 2016
FCCS Update - Dec 16
From the read me
In Data Management, FCCS is now a source system. With this change, FCCS can be used as a source for other cloud systems like PBCS (Planning and Budgeting Cloud Service), EPBCS (Enterprise Planning and Budgeting Cloud Service), ARCS (Account Reconciliation Cloud Service), and PCMCS (Profitability and Cost Management Cloud Service). In addition, you can push data to a flat file for use in other systems.
Data Management now also supports multi-period loads. This allows you to have periods on the columns in your data file.
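A multi-period load file simply carries one column per period. The layout below is a hypothetical illustration (the member and column names are made up, not taken from the release notes):

```
Entity,Account,Jan,Feb,Mar
E100,Sales,1000,1100,1200
E100,COGS,400,440,480
```

In the import format you then map each period column to its target period rather than loading one period per file.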
There were a couple of bug fixes as well, dealing with members tagged as saved assumptions, Flow not accumulating over time, and data entry intersections not always clearing for parent members.
As you can see the FCCS team has been very busy adding needed enhancements. It just gets better and better.
Wednesday, November 23, 2016
Essbase patch 11.1.2.4.014 is here
Oracle released the latest patch for Essbase and Essbase Studio 11.1.2.4.014. These patches have no real new features but a number of important bug fixes. In Essbase, the two that caught my eye were:
"24675006 - In some cases, a Dynamic Calc member can return different values on subsequent refreshes"
and
"24439234 - In some cases, incorrect results will be returned when retrieving results from SmartView Client on an Aggregate Storage database."
Based on that alone, I would patch my system.
I've heard from a number of Essbase Studio users that drill-through reports that used to work no longer do. They will be happy with this patch, as the big bug fix is:
"24321974, 23222585 - Unable to run Drill Through reports when a member has more than 999 descendants"
If you want to see all of the fixes for Essbase, Studio, APS, and the clients, you can look at Oracle Blogs. Note that to actually read the readme files, you will need an Oracle ID.
Hopefully Oracle has a nice Christmas present for us with a new release that has additional features. The irony of a Jewish person wanting Christmas presents from Oracle is right up there with believing in the Easter Bunny.
Friday, September 23, 2016
Migrating from EIS to Essbase Studio
He agreed that it was very underdocumented, and that is why, a little while back, he investigated the process to get it working (he says it works pretty well). He created two white papers on the migration:
Oracle Support Document 2094412.1 (EIS - Essbase Integration Services to Essbase Studio migration) for the migration and
Oracle Support Document 2095995.1 (Troubleshooting the Essbase Integration Services (EIS) Migration Tool within Essbase Studio)
for troubleshooting migrations. To get to either of these documents, you have to sign in to Oracle Support with your ID.
For the migration, it seems pretty straightforward:
From the migration document (some of the formatting is mine)
1. Ensure that your EIS metaoutlines and models comply with the guidelines listed in the Essbase Studio User's Guide chapters "Guidelines for Migrating Metaoutlines and Models" and "Properties Not Migrated".
2. Install Studio Console on a 32-bit machine. Note: If you are planning to install the EPM bundled ODBC drivers, it should be a server operating system.
3. On the 32-bit machine where Studio Console is installed, install ODBC drivers that work with the EIS catalog database.
Note: You can install the DataDirect ODBC drivers that are shipped with EPM by installing the EPM Foundation Components. However, there is no need to run the EPM System Configurator; installation alone is enough to install the drivers.
4. Create a User DSN that can connect to the EIS Catalog database. Note: It must be a User DSN, not a System DSN.
5. Ensure that you can connect to the EIS Catalog database through the ODBC User DSN on the 32-bit machine. Note: EIS Server does not need to be running.
6. Ensure Studio Server is running. Note: Studio Server can be running on a separate 64-bit machine.
7. In Essbase Studio, create a data source that points to the same source database that was used for EIS.
EIS to Studio Migration Steps
1. Start Studio console on the 32-bit machine and connect to the Studio Server where the EIS catalog should be migrated to.
2. In Studio console, start the EIS migration tool. (Tools-> Essbase Integration Services Catalog migration)
3. Fill in the "EIS Catalog ODBC DSN", "User", and "Password" fields.
4. Click the "Fetch models" button, and the EIS models should be listed.
5. You can now choose
- the matching Data source connection that was used for EIS. Note: Typically different from the EIS Catalog connection
- the EIS Catalog and Schema from the drop down. Note: If your relational database is Oracle, selecting a catalog is not required.
6. Select a folder where the migrated metaoutline and model should be put.
7. Click the Migrate button.
8. If there are any errors, double-check that all documented guidelines are followed.
Note: If the migration does not work, you may need to rebuild the models from scratch in Essbase Studio. Some cases are too complex for the EIS Catalog migration to handle.
You should largely still be able to use the source tables that you were using with EIS, although a few tweaks might be needed.
Thank you, Aneel, for providing the community with this much-needed information.
Thursday, September 22, 2016
Quick Smart View post
A number of ACE Director friends and I met with the Oracle Smart View team at Oracle OpenWorld this week and evangelized on your behalf for changes to the product. More posts will be coming out from all of us on some of the things we learned, and if you follow all of us there might be some overlap. We just want to get the word out to as many people as we can.
One thing that came out during the meeting is that there are two Excel add-ins that cause issues for Smart View, and it is suggested you turn them off when using Smart View.
The first is the WebEx add-in and the second is the Adobe PDF add-in. Something in the way they take control of spreadsheets interferes with Smart View functionality and stability.
I'll be putting a slightly longer blog post together later on some Knowledge base articles that help with fixing some common problems.
Thursday, September 8, 2016
FCCS Dimension Security
Wednesday, August 31, 2016
A glimpse into FCCS
Dimensions
Calculations
Data Retrieval
Thursday, June 2, 2016
All Dimensions in DataExport rows, Is it possible?
“I am doing a data export in a calc script. I have 12 dimensions in the cube. The output file is coming out fine except for one issue. The file does not write the name of the period member for which the data is being extracted. It writes the members of 11 dimensions but does not write the period.
If I fix on one month, it will write a total of 12 columns: 11 dimensions + 1 data column.
What I want is 13 columns (12 dimensions + 1 data column).”
In the poster’s case, Period was being set as the column dimension, but that is not necessarily always going to be the case. As one response suggested, the poster could add:
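Presumably the suggested addition was the DataExport option that pins a specific dense dimension to the columns, along these lines:

```
SET DATAEXPORTOPTIONS
{
DataExportColHeader "Period";
};
```

DataExportColHeader tells DATAEXPORT which dense dimension to use as the column header.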
That would ensure that Period is used as the column dimension, but it would not give the poster what he wanted. In a 12-dimension cube, we will only ever get 11 row dimensions.

The only way I know to get what the poster wants is to add a dense dimension to the cube. This dimension could be a single-member dimension (just the dimension name). Since it is a single member, it would not take any additional space or increase calculation or retrieval times. We could then set that dimension to be the column dimension, and we would get all of our “real” dimensions as row members.

If we can’t alter the existing cube, we could clone it, add the dimension to the cloned cube, export (hopefully) only level 0 data, load it into the new cube, and then use DataExport there to get the format we need. Granted, that is extra work, but sometimes you do what you have to.
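As a sketch of that approach (the dimension name and file path here are made up for illustration), with a single-member dense dimension named Dummy added to the cube, the export script might look like:

```
SET DATAEXPORTOPTIONS
{
DataExportLevel "LEVEL0";
DataExportColFormat ON;
/* Put the single-member dense dimension on the columns,
   so all 12 "real" dimensions land on the rows */
DataExportColHeader "Dummy";
};
DATAEXPORT "File" "," "/tmp/alldims.txt";
```

Because Dummy has exactly one member, the "column" is really just the data column, and every real dimension member is written on the row.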
As a side note, you can make that new single-member dimension useful. Call it something like last_update and, in your batch process, set its alias to the date/time the cube was last updated.
Another possible solution would be to export all 12 periods every time, so you always know what the columns are. If we really need only certain periods, we could DataExport to a relational table, export a substitution variable, and join to get the period we want.
I’ve seen similar questions about ordering the output and ignoring certain columns. Neither can be done. DataExport is fast because it grabs the blocks and outputs them. It writes them in the order the dimensions exist in the outline (sparse first, then dense), just like an export. So be careful: if you switch the order of dimensions, or change something from dense to sparse or sparse to dense, the order in your DataExport output will change. And because it pulls whole blocks, it has to write all dimensions.
Monday, April 4, 2016
Undocumented SQL Data Load Rule Change
Recently I had a client do an in-place upgrade of Essbase from 11.1.2.2 to 11.1.2.4. The only thing that was upgraded was the EPM stack; the server and SQL Server instance were not changed. Everything seemed to go smoothly except for one thing. We had a process where data is extracted from Essbase into SQL Server, some data manipulation is done, and the data is loaded back to Essbase. This worked great in 11.1.2.2, but in 11.1.2.4 the numbers in my SQL Server table didn’t match what was being loaded into Essbase. Hmmm. What do I mean?
Well, I created this test sample for Oracle support. First I gave them a table definition:
CREATE TABLE [dbo].[test](
[Process] [nvarchar](50) NULL,
[Organization] [nvarchar](50) NULL,
[Produt] [varchar](50) NOT NULL,
[F_Code] [varchar](50) NOT NULL,
[Accounts] [varchar](50) NOT NULL,
[amt] [float] NOT NULL
) ON [PRIMARY]
Next, SQL to populate the table with some test values:
insert into Test
values('F1','00N18','CP.T354','F.21N1','E795000',1467149.93)
insert into Test
values('F1','00N18','CP.T354','F.21N2','E795000',1467149.93)
insert into Test
values('F1','00N18','CP.T354','F.21N3','E795000',1467149.93)
insert into Test
values('F1','00N18','CP.T354','F.21N4','E795000',1467149.93)
insert into Test
values('F1','00N18','CP.T354','F.21N5','E795000',1467149.93)
insert into Test
values('F1','00N18','CP.T354','F.21N6','E795000',1467149.93)
insert into Test
values('F1','00N18','CP.T354','F.21N7','E795000',1467149.93)
When I retrieved from the table, I got the following results (which I showed them):
Notice the last column, amt, matches what was loaded. Next I created a load rule and put in the following SQL statement:
Select *
From test
Since this was just a test, I didn’t bother to explicitly list the column names, but I usually do and recommend doing so.
I retrieved the data and got:
I don’t know about you, but to me 1467149.93 does not match 1467149.875. If my math is correct, it is off by .055. This would typically be no big deal, but these numbers were used in allocations and put the results out of balance by anywhere from 1 to 30 cents per allocation, and the differences added up across the enterprise.
The response I got from Oracle surprised me. They told me it was not Essbase but the way SQL Server handled floating-point conversions. Huh? SQL Server had not changed in my environment and had worked before. I wasn’t buying their response and rejected it completely. So they went back to development. I kept getting stalled with similar answers about it not being Essbase but SQL Server, and they kept pointing me to web links about floating-point math:
https://msdn.microsoft.com/en-us/library/ms173773(v=sql.90).aspx
https://msdn.microsoft.com/en-us/library/ms173773(v=sql.100).aspx
https://msdn.microsoft.com/en-us/library/ms173773(v=sql.110).aspx
OK, I get it. I understand floating-point math and how it is an approximation, but when SQL Server shows me the correct answer and the Essbase load rule doesn’t, how can it be a SQL Server issue? I used an analogy with support: I’ve looked at a painting for years and it was crystal clear. When I got a new pair of glasses, the painting got fuzzy, so that means the fault is in the painting and not, perhaps, an incorrect new prescription?
After literally months of going back and forth, I finally got an answer. I didn’t like it, but at least it made sense. I was told:
“A change was made in PS4 that explains the difference between the versions. Previously, all values were binded to SQL_C_CHAR type and the drivers were responsible of doing the conversion while displaying the values. In PS4 the values get binded according to their type (i.e. float is getting binded to SQL_C_FLOAT). Therefore, the value is converted accordingly by Essbase and not by the driver. This behavior is considered correct.
You can see that the value is indeed stored correctly using a workaround while creating the rule file: instead of selecting the amt column as is, use str(amt, 25, 2).
The syntax is: STR ( float_expression [ , length [ , decimal ] ] ).”
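Applied to the test table above, the workaround means changing the load rule's SQL to something like this (a sketch using the column names from my test table):

```
Select
    Process,
    Organization,
    Produt,
    F_Code,
    Accounts,
    str(amt, 25, 2) as amt  -- returns the value as the text '1467149.93'
                            -- instead of a float bound to SQL_C_FLOAT
From test
```

That .055 difference is exactly what single precision predicts: 32-bit floats near 1,467,149 are spaced 0.125 apart, and the nearest representable value to 1467149.93 is 1467149.875. Converting to a string in SQL sidesteps the SQL_C_FLOAT binding entirely.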
So now I have to go back and recode all the SQL to convert the data type. It would have been nice if Oracle had documented the change or, even better, provided an option for the old behavior. But at least I have an answer, and so do you, in case you are reading this because a Google search brought you here with the same problem.
Wednesday, March 16, 2016
Don't be a Glenn
\\Outline verification errors: