Cameron's Blog For Essbase Hackers

Out of the Past


Introduction

You’re all Robert Mitchum and Kirk Douglas fans, right? And all film noir fans, too? No? You really should give this film a try. To say that they don’t make ‘em like they used to is putting it mildly. I will spare you my rant on the vast empty wasteland that is modern entertainment and instead take you on a different journey into the past. One that, if you have knocked around the Essbase world long enough, may cause pangs of longing. What oh what could this be?

Remember me?

 
Sob, yes, it is Essbase Application Manager. Oh, AppMan, how I miss your consistent keyboard-driven functionality, your easy copy and paste of hierarchies into Excel, your easy-to-read outlines with nice clean text, and your simple installation. And in your place, we have…EAS Console. Sigh.


I know that I am not the only one that misses AppMan. What, Cameron, more of your delusional thinking? Nope, I took that screen shot from the desktop of one of my current clients. Yep, Essbase 11.1.2.2, and a rather smart Hyperion admin realizes that she can better understand Essbase when viewing BSO databases through AppMan than when using EAS. Oracle, are you listening? AppMan = 1992 technology, and yet your customers prefer it. She isn’t the first client I’ve bumped up against who has kept a copy of Application Manager around to make Essbase easier to understand.


I wish I had an old copy of AppMan.exe kicking around. Like a fool, I dumped Essbase 6.5.x’s binaries as fast as I could. Maybe I shouldn’t have been quite so impatient to embrace the future.


Parting shot

Here’s another picture to make you go all weak in the knees.

AppMan makes even Sample.Basic look good. Or in old Essbase-speak, Sample:Basic. Remember the colons instead of the periods as delimiters?


That does it, I’m doing my next automation project in Esscmd instead of MaxL, just because I still can. Okay, maybe I won’t do that, but I am going to look at old CDs and see if maybe I really did keep a copy of My Very Favorite Essbase Editor In The Whole Wide World.


Be seeing you, maybe in the past.


Going under the covers with TRACE_MDX


Introduction

I don't know about you, but I used to use the Classic Add-In's Essbase Query Designer to give me a leg up in writing Essbase Report Scripts. As far as I know, there is no way to do that in Smart View (although I am not a super user of it, so corrections to this sentence can be sent in care of this blog).

Except of course when there is a way. How? Read on.

Some other blog you probably ought to be reading on a regular basis
I know I do, and you should, too. :)

Check out the Oracle Business Analytics (when did the EPM name go away?) Proactive Support post on How to track MDX queries against Essbase.

There is a new (how new is open to debate, I would say it’s been there all along and is just now getting released to world+dog because it works on my oldish AWS instance from John Booth’s Metavero blog) Essbase.cfg setting that will log Essbase MDX queries.


The setting is called: TRACE_MDX.

What does it do?

It logs the MDX query and how long that query takes. That’s it, you say? Ah, but there is quite a bit of value in this, as I will attempt to explain.

The setting

Read the link(s), or know that the Essbase.cfg setting is: TRACE_MDX appname dbname 2

As always with Essbase, make the setting in either Essbase.cfg directly or via EAS (don’t forget to click the Apply button) and then bounce the Essbase service which of course is called anything but Essbase.

Here’s the setting for good old Sample.Basic:
TRACE_MDX Sample Basic 2

NB – On my EPM Windows instance, the Essbase service is called “Oracle Process Manager (EPM_epmsystem1)”. Intuitive, isn’t it? Umm, no.

Some errata

2 is the loneliest number
FWIW, I tried 0, 1, 2, and 3, and only 2 seems to make anything get logged. Why the number 2? Why don’t those other values do something? Or is it that I just don't know how to set it? Time will tell.

Log location
The results of the MDX queries get dumped to ARBORPATH\appname\dbname\mdxtrace.log.

Does it actually contain the user name and password of the person/tool doing the pull? Why of course not, that would be too easy. Sigh. You will have to build a cross referencing table. All the more reason Oracle should adopt at least a few ideas from this thread:
http://www.network54.com/Forum/58296/thread/1364255484/Collaborate+-+Application+Utilization

Also, you may note that this log file doesn't exactly go into the normal ODL log location. Why?

Just for the record – I’m not sorry that this log exists, I just wish there was a consistent logging architecture. I can barely remember where these things are from version to version; I just wish Oracle would pick a plan and stick with it. Okay, rant over.
 
How does it work?
Smart View
I *thought* that Smart View used MDX to query Essbase. That may very well be (or maybe not), but ad hoc retrieves against Essbase do not generate any entry in the log. Bummer.

Execute MDX

However, you can use Smart View’s “Execute MDX” command and get a value in the log. For those of you not writing MDX on a regular basis (and bear with me, because I think this log is going to drive a lot of people who are not super experienced with MDX towards it in future), you get to that option by right clicking on the Essbase database (ASO or BSO, it doesn’t matter) and selecting “Execute MDX”.


A dialog box pops up, and you can enter your MDX directly (yes, I stole this directly from the Support blog, just wait, I am going to expand on it):
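For reference, the query entered in the dialog is the same one that shows up in mdxtrace.log below:
SELECT
{[100-10], [100-20]} ON COLUMNS,
{[Qtr1], [Qtr2], [Qtr3], [Qtr4]} ON ROWS
FROM Sample.Basic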


That produces the following result in Excel:


Mdxtrace.log will have the following (the query is in the log):
===============================================================
Following MDX query executed at Mon Apr 08 08:56:13 2013
===============================================================
SELECT
{[100-10], [100-20]} ON COLUMNS,
{[Qtr1], [Qtr2], [Qtr3], [Qtr4]} ON ROWS
FROM Sample.Basic

=== MDX Query Elapsed Time : [0.009] seconds ===================


The corresponding Sample.log file has this:
[Sat Apr 13 13:04:24 2013]Local/Sample/Basic/hypadmin@Native Directory/7244/Info(1013091)
Received Command [MdxReport] from user [hypadmin@Native Directory]

[Sat Apr 13 13:04:24 2013]Local/Sample/Basic/hypadmin@Native Directory/7244/Info(1260039)
MaxL DML Execution Elapsed Time : [0] seconds


Pretty cool, eh?


One odd thing
I noticed, at least on my release of Smart View (version 11.1.2.2.000 (Build 453)), that the above MDX query cannot show the POV members on the sheet. I can toggle the POV button and have them in the floating palette, but that’s the only way it works. This is different than the 11.1.2.2 behavior with ad hoc queries. Maybe this is in the documentation and I missed it?  That would not be the first time I’ve blown by this sort of thing. Corrections please in care of this blog’s comment section.

Just for giggles, I tried fully qualifying the axes with the below MDX (again, forgive my child-like MDX skilz):

But all I got was this:

Note that this is NOT the way the MDX queries in, say, EAS display:


Not a big deal and yes, you could have used axis(0), axis(1), and axis(2) instead of COLUMNS, ROWS, and PAGES.
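If you want to try the three-axis form yourself, a minimal sketch along these lines works; the member picks are just ones I’m assuming from Sample.Basic, each qualified by its parent:
/* Assumed Sample.Basic members, qualified parent.child; PAGES is the third axis */
SELECT
{[Product].[100], [Product].[200]} ON COLUMNS,
{[Year].[Qtr1], [Year].[Qtr2], [Year].[Qtr3], [Year].[Qtr4]} ON ROWS,
{[Market].[East], [Market].[West]} ON PAGES
FROM Sample.Basic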

But wait there’s more in Smart View

Smart Slices

Even though not seeing the MDX from an ad hoc query is kind of a bummer (and again, maybe I am misunderstanding how Smart View queries data from Essbase), did you know that Smart Slices are just MDX queries? And that means that you can view the MDX in mdxtrace.log.

Define the Smart Slice any which way you want.

Do an ad hoc query off of the Smart Slice named Cameron’s test:

And, voila:
===============================================================
Following MDX query executed at Sat Apr 13 14:37:10 2013
===============================================================
SELECT
{ CROSSJOIN( { [Product] } , { [Year] } ) } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME] ,[GEN_NUMBER],[LEVEL_NUMBER] ON ROWS,
{ { [Measures] } } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[GEN_NUMBER],[LEVEL_NUMBER] ON COLUMNS WHERE {( [East] , [Budget] )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME]


=== MDX Query Elapsed Time : [0.001] seconds ===================

Now are you getting interested in this?

Did you note how there is no FROM Sample.Basic statement in the MDX above? I have to guess that it is somehow stored in the Smart Slice itself and so it isn’t necessary. Again, smarter minds than mine please chime in via the comments section.
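If you want to replay the Smart Slice query through Execute MDX, a minimal sketch is to strip out the PROPERTIES clauses and tack a FROM clause back on (assuming Sample.Basic, the database the Smart Slice was built against):
/* PROPERTIES dropped for readability; FROM Sample.Basic is my assumption */
SELECT
{ CROSSJOIN( {[Product]}, {[Year]} ) } ON ROWS,
{[Measures]} ON COLUMNS
FROM Sample.Basic
WHERE ([East], [Budget])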
What triggers MDX and just what kind of MDX?
Drilling up and down in the sheet does not generate new MDX queries. However, changing Measures to Profit through the Member Selection dialog box does.

Unsurprisingly, a Member Selection action produces a metadata query (you knew MDX could do that because you’ve been to, or read, Gary Crisci’s Kscope presentation on that, right?):
===============================================================
Following MDX query executed at Sat Apr 13 15:30:39 2013
===============================================================
SELECT {HEAD( DESCENDANTS( [Measures] ),5001 )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[SHARED_FLAG] ON COLUMNS


=== MDX Query Elapsed Time : [0.000] seconds ===================

Actually clicking on Refresh produces the following MDX – note Profit is now defined:
===============================================================
Following MDX query executed at Sat Apr 13 15:30:39 2013
===============================================================
SELECT
{ CROSSJOIN( { [Product] } , { [Year] } ) } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME] ,[GEN_NUMBER],[LEVEL_NUMBER] ON ROWS,
{ { [Profit] } } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[GEN_NUMBER],[LEVEL_NUMBER] ON COLUMNS WHERE {( [East] , [Budget] )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME]


=== MDX Query Elapsed Time : [0.000] seconds ===================

Getting back to that metadata query, what does it look like in MaxL (I have to stick it there to try to read what comes out)? Here’s my super simple MaxL:
login hypadmin password on localhost ;

alter session set dml_output alias off ;
alter session set dml_output numerical_display fixed_decimal ;
alter session set dml_output precision 15 ;
set column_width 80 ;
set timestamp on ;

spool on to "c:\\tempdir\\mdxoutput.log" ;

SELECT {HEAD( DESCENDANTS( [Measures] ),5001 )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[SHARED_FLAG] ON COLUMNS
FROM [Sample.Basic] ;

exit ;

And that produces:
===============================================================
Following MDX query executed at Sat Apr 13 15:34:00 2013
===============================================================
SELECT {HEAD( DESCENDANTS( [Measures] ),5001 )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[SHARED_FLAG] ON COLUMNS
FROM [Sample.Basic]


=== MDX Query Elapsed Time : [0.001] seconds ===================

And this:

You can pull the file down from here for your amusement and also because the above is just illegible.

I count 17 data values. There happen to be 17 Measures. I think (boy oh boy am I doing a lot of guessing in this blog post) those are internal index values for the Measures dimension members. Once again, pretty cool, eh? Dear Tim Tow, when you tell me things like this about Essbase, I do try to remember them, even though 90% of what you tell me flies over my head.

Query Designer

And of course there is a Query Designer in Smart View. If you guessed that this too was an MDX query, you would be 100% right.

When I click on the Apply Query button:

I get this in Smart View:

And this in mdxtrace.log:
===============================================================
Following MDX query executed at Sat Apr 13 14:41:19 2013
===============================================================
SELECT
{ CROSSJOIN( { [Product] } , { [Year] } ) } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME] ,[GEN_NUMBER],[LEVEL_NUMBER] ON ROWS,
{ { [Measures] } } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[GEN_NUMBER],[LEVEL_NUMBER] ON COLUMNS WHERE {( [East] , [Budget] )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME]


=== MDX Query Elapsed Time : [0.000] seconds ===================

Financial Reports

What, Smart View (mostly) exposed isn’t enough for you? Good grief. Did you know that Financial Reports uses MDX? And this time we can view it all, baby. Duluth, MN? Really? Moving on…and realizing that the DPD will likely arrest me on sight for that comment…

Let’s take this simple report:

Run it in HTML Preview mode:
And see that it produces:
===============================================================
Following MDX query executed at Sat Apr 13 15:49:05 2013
===============================================================
SELECT
{[Qtr1], [Qtr2], [Qtr3], [Qtr4]}
DIMENSION PROPERTIES [Year].[MEMBER_ALIAS] , [Year].[MEMBER_UNIQUE_NAME]
ON COLUMNS ,
{[Profit], [Margin], [Sales], [COGS]}
DIMENSION PROPERTIES [Measures].[MEMBER_ALIAS] , [Measures].[MEMBER_UNIQUE_NAME]
ON ROWS
FROM [Sample].[Basic]
WHERE ([Product], [East], [Actual])


=== MDX Query Elapsed Time : [0.000] seconds ===================

What about Planning?

I understand that Planning forms are MDX-based. Well, opening a form does not generate an MdxReport event in the Essbase application log – I think this is shades of Smart View ad hoc. (Yeah, I have a very long phone call with Tim in the near future so he can explain, again, how all of this stuff works under the covers. And yes, it behooves me to try this out with Dodeca as well but I’ll make that the subject of another blog post.)

However, ad hoc analysis does trigger an MDX query:

Which produces this:

And that in turn produces this in mdxtrace.log:
===============================================================
Following MDX query executed at Sat Apr 13 16:06:04 2013
===============================================================
SELECT {[Period].[YearTotal]} ON COLUMNS,
NONEMPTYBLOCK {[Account].[IncomeStatement],[Account].[300000],[Account].[310000],[Account].[320000],[Account].[330000],[Account].[340000],[Account].[350000]} ON ROWS
FROM SampTest.Consol
WHERE ([Year].[FY13],[Scenario].[Forecast],[Version].[Working],[Segments].[BAS],[Entity].[E01_0])


=== MDX Query Elapsed Time : [0.000] seconds ===================

Do you see what I see? There it is, that still (I think) undocumented super-cool MDX function NONEMPTYBLOCK. Oooh, I love it when a plan comes together.

The conclusion and the point behind all of this

If you can build it in a Smart View Smart Slice, or in Query Designer, you can interrogate mdxtrace.log to find out how Smart View did it. The same goes for Financial Reports. And, with some limitations, the same is true for Planning.

Why oh why oh why would you care about TRACE_MDX? Two reasons spring to mind.

  1. You should care because this has GOT to be the easiest way there is to figure out how to write good MDX, at least from a query perspective. MDX is not…intuitive. Oh sure, if you know it, it’s easy, but that’s because you already went through the pain. For the rest of us, it can be a little scary and painful. This simple Essbase.cfg setting can make that learning curve so much easier. Whoever in EPM product management pushed this one through, I give you my BSO-brain’s thanks.
  2. Why do these tools sometimes go KABOOM on us? How do they work under the covers? What interesting bits of functionality are they using? TRACE_MDX gives us a window into the way (mostly) EPM tools talk to Essbase. As an example, where did that rather cool undocumented keyword NONEMPTYBLOCK come from? Why, it came from examining MDX. I’ll bet there’s more cool stuff that we only need to look for.

I also have to give thanks to Oracle Support for once again coming through with some really cool stuff.

NB – One last point – I don’t do OBIEE but I am willing to bet that this setting came about because of the sometimes ugly MDX that OBIEE generates. Again, people more knowledgeable than I should drop me a line to tell world+dog all about it. In any case, we now all get to benefit from TRACE_MDX.

Be seeing you.

Using TRACE_MDX with Planning


Introduction

As I showed in my last post, TRACE_MDX can be utilized with Planning ad hoc forms. Oh, terrific, but do I really and truly have to go into an ad hoc form to see the layout? Besides, going into an ad hoc form changes its structure and maybe I want to see what that form’s MDX looks like from the word “go”. Is there a way to do it? You betcha.

It’s just this little chromium switch

All you need to do is to go into Planning form design and select “Suppress missing blocks”.
Simply save the form and open it back up.
 
 
And take a look at the (by now) good old mdxtrace.log file and see…
===============================================================
Following MDX query executed at Sun Apr 14 14:05:15 2013
===============================================================
SELECT {[Period].[Jan],[Period].[Feb],[Period].[Mar],[Period].[Q1],[Period].[Apr],[Period].[May],[Period].[Jun],[Period].[Q2],[Period].[Jul],[Period].[Aug],[Period].[Sep],[Period].[Q3],[Period].[Oct],[Period].[Nov],[Period].[Dec],[Period].[Q4],Hierarchize(Descendants([Period].[YearTotal]),POST)} ON COLUMNS,


NONEMPTYBLOCK {Hierarchize(Descendants([Account].[IncomeStatement]),POST)} ON ROWS


FROM SampTest.Consol


WHERE ([Segments].[BAS],[Entity].[E01_0],[Year].[FY13],[Scenario].[Forecast],[Version].[Working])


=== MDX Query Elapsed Time : [0.068] seconds ===================


And what do we get from this?
A whole bunch of things:
  1. There’s that NONEMPTYBLOCK statement again. You know, the thing that makes BSO MDX queries so fast. Yup, it sure is interesting that it has been around in MDX for such a long time, and used for such a long time in Planning (I am going to guess since 11.1.1.1 as that’s when I remember Suppress Empty Blocks becoming available). And yet it wasn’t ever documented. Why?
  2. The MDX portion of the form took only 0.068 seconds.
  3. Did you see the Hierarchize function? And the POST option? Check out the Essbase Technical Reference topic on Hierarchize – do you see how Oracle could make expansion work either way (up or down) if they wanted to? Although I suspect there is a reason that this is not exposed as I’ll explain/guess at in a bit.
  4. Columns get treated differently than rows. What do I mean? If you look at the form layout screen shot, you’ll see that the Planning form command to get all of the periods is IDescendants(YearTotal). That’s how Accounts are defined as well. And yet the MDX clearly shows individual selections for each period and a Descendants of YearTotal where Accounts are simply a Descendants function. Why?


What do I mean by that? I took the MDX and stuck it into Smart View using the Execute MDX command and got the following columns:
Jan, Feb, Mar, Q1, Apr, May, Jun, Q2, Jul, Aug, Sep, Q3, Oct, Nov, Dec, Q4, Jan, Feb, Mar, Q1, Apr, May, Jun, Q2, Jul, Aug, Sep, Q3, Oct, Nov, Dec, Q4, YearTotal


Interesting, eh? Apparently (well, definitely, actually as we can see) Planning needs the columns twice, once through explicit selections and then again through Hierarchize(Descendants([Period].[YearTotal]),POST). Isn’t that just odd?


This must be something internal to Planning as this simpler MDX gives me exactly what I would expect wrt columns, i.e., non-repeated months.
SELECT {Hierarchize(Descendants([Period].[YearTotal]),POST)} ON COLUMNS,
NONEMPTYBLOCK {Hierarchize(Descendants([Account].[IncomeStatement]),POST)} ON ROWS
FROM SampTest.Consol
WHERE ([Segments].[BAS],[Entity].[E01_0],[Year].[FY13],[Scenario].[Forecast],[Version].[Working])
 


Dear Oracle Planning Product Management (or more likely Development) – what the heck is going on? Why oh why oh why does Planning need almost double the columns? Weird.

And what’s really weird

As I stated in the beginning of this post, one needs to flip the Suppress Missing Blocks switch to make MDX fire on form retrieval. And that implies that only this setting (I suspect it is the only way to easily get to the functionality NONEMPTYBLOCK provides) makes Planning use MDX. I am further guessing, per what My Man In California, Glenn Schwartzberg, stated in the comments section of last week’s blog re default retrieves in Smart View, that Planning must use the Grid API to do standard retrieves. I find that fascinating because it has been “common” knowledge that Planning uses MDX to retrieve forms. TRACE_MDX tells us quite clearly that in fact that is not true.


And so that then suggests that maybe MDX still isn’t the fastest or best way to retrieve data from Essbase. I guess I shouldn’t be super surprised that nothing beats a native API, but I do wish this stuff was documented. Wait, it just was. :)


Be seeing you.

A very different kind of ODTUG webinar


Marketing introduction

The ODTUG virtual panel series goes from strength to strength. (What, marketing? You promised me a rose garden… er, a marketing-free blog. Yes, but this isn't marketing for me, so relax. My self-marketing ineptitude continues apace.)
 
Stuff you actually care about goes right here
What do I mean? None other than Chet Justice, aka ORACLENERD is moderating an ODTUG virtual panel in the form of a webinar.

Wait, ODTUG, virtual panels, ORACLENERD? Is Essbase involved somehow? No. Cameron, is it time to get some sleep? Why yes it is, but before I hit the hay, let me pull it all together:
  1. ODTUG is having another one of their successful-beyond-our-dreams virtual panels
  2. Chet is the moderator
  3. The speakers are: Cary Millsap, Dominic Delmolino, and Kris Rice.
  4. The webinar's subject is "Software Development in the Oracle Eco-System". Essbase is absent, but so what?
  5. The time and date are Thursday, May 30, 2013, 3:00 PM - 4:30 PM EDT

See here: https://www3.gotomeeting.com/register/759087318

Where's the beef?
Okay, that's all very interesting but why should a bunch of (mostly, or maybe exclusively if you're reading this blog) EPM people attend a webinar on something that is pretty obviously not EPM-related?
  1. Because these guys are good.   I heard Cary speak last year at the Kscope12 keynote.   Did you?   He was fantastic.   Inspirational even.   I don't know the other speakers (blush, my EPM-centricity shows yet again) but I have a sneaking suspicion they are really, really, really, insightful speakers.
  2. They are covering a subject (development, its frustrations and triumphs, and how to have more of the latter than the former) that all of us, each and every one, do to some extent or another. Isn't the point of ODTUG to learn from others and occasionally share something with them? That's pretty much why I'm involved. Here is your chance to do that with some Really Big Names in the Oracle world.
  3. It's free.   Think about how much you would pay to get trained by these guys.   Yeah, I couldn't afford it either.
What a nice way to end an undoubtedly hectic week.

I guess that's it, and it ought to be enough. I'm signed up. Are you?

For more gen
If you want to get a feel for them, check out their blogs.

There is some really good stuff there -- I love it (not that I ever do it myself) when geeks muse about why they do something as opposed to how although of course the how is important too.   I think this webinar/panel will be in that vein and I am really looking forward to it.

Be seeing (well, listening to, because remember, this is a panel and you can participate) you at 3 pm EDT, 30 May 2013.

Stupid Programming Trick No. 17 -- Hacking EPMA's Planning Time Distribution


The problem

A Planning 11.1.2.2 application was created with the wrong Time Distribution option. Oops. And no one noticed until the UAT. Double oops. The real oopsie in this is that once created, a Planning application CANNOT change its Time Distribution. For the uninitiated, this is the spread from upper level Time Periods members like YearTotal or quarters to lower level members. It defaults to an even spread or can be optionally set to fiscal calendar 445, 454, or 544 spreads. Again, this is a one-time shot, so whoever creates the application had best mind his p’s and q’s. Which didn’t happen. Like I wrote, oops.

The hunt for the guilty and his inevitable gruesome punishment shall await Cameron’s Star Chamber. I am waiting for HM The Queen to appoint me Privy Councillor so I may begin the joyous prosecution.
 
Putting that happy day aside, oh may it come, and soon, let’s start figuring out how to fix this oopsie without recreating the Planning application.

NB – We are going to go faaaaaaar beyond what Oracle recommends, supports, or will even give you the time of day on if you FUBAR this and then call Oracle Support. It is an interesting hack, and maybe the exigency of your situation calls for it, but know that you do this at your own risk. You Have Been Warned.
 
The beginning of the fix
Dave Farnsworth found Celvin Kattookaran’s blog post with the code to change the Even split to 4-5-4 in the Planning application.
 
Here’s the code to transform the Planning application (this is in the Planning app’s schema/database). In this example, it is to change it from Even to 4-5-4. NB – This is in SQL Server but I think it’s identical in PL/SQL except for the COMMIT commands.
 
USE HYP_PLN_ChgSprd
GO
/*
0 = even spread
1 = 4-4-5
2 = 4-5-4
3 = 5-4-4
*/

--Flip from Even to 4-5-4
BEGIN TRANSACTION;
update hsp_systemcfg set support445 = '2';
update hsp_account set use_445 = '2' where use_445 = '0';
COMMIT TRANSACTION;

So problem sorted, yes? 
 
Oops, not entirely
Did I mention this was an EPMA application? Ah, no I did not. And it’s important.
 
After making the above change, and bouncing the Planning service, deploys from EPMA work until there is a change to hierarchy. When that happens, the deploy fails with the below error message:
[May 9, 2013 1:38:35 PM]: Parsing Application Properties...Done
[May 9, 2013 1:38:35 PM]: Parsing Dimensions info...Done
[May 9, 2013 1:38:35 PM]: Registering the application to shared services...Done
[May 9, 2013 1:38:36 PM]: You cannot change the Weeks Distribution after deploying. You must select 454 as the Weeks Distribution before redeploying the application.[May 9, 2013 1:38:36 PM]: An Exception occurred during Application deployment.: You cannot change the Weeks Distribution after deploying. You must select 454 as the Weeks Distribution before redeploying the application.



Btw, here is what EPMA had selected – it’s Even, not 4-5-4. But Planning is 4-5-4. KABOOM.

The research
So a dive into the EPMA tables seems to be in order to change that “Use application distribution” to “Use 454 distribution”.
 
I took a look at the EPM data models to get a feel for what’s going on under the covers. The EPMA schema is pretty sparsely documented, to put it mildly, but after a fair bit of blundering about, I figured out that I needed to look at the DS_Property_Application table and its c_property_value field.
 
Alas, the documentation does not tell you what the property value should be for the time spread or even the property id number. So I created a bunch of different Planning apps, each with a different Time Spread and came up with the following possible settings when I interrogated that field:
  • Even
  • 445
  • 454
  • 544
 
Once I knew what to search for wrt the setting, I then needed to figure out the application id and the property member id. With those two (I only want to change one setting, and I only want to do it for the right app) pieces of information, I can write an UPDATE query to fix the spread issue in EPMA. 
 
What application, what property?
I wrote this query to get that information, knowing that the application name is ChgSprd:
 
SELECT
A.c_application_name
,P.*
FROM DS_Property_Application P
INNER JOIN DS_Application A ON
A.i_application_id = P.i_application_id
AND A.i_library_id = P.i_library_id
WHERE
A.c_application_name = 'ChgSprd'
AND c_property_value = 'Even'

When looking at the below, it looks like the setting is repeated for each deploy of the application:
c_application_name | i_library_id | i_application_id | i_prop_def_dimension_id | i_prop_def_member_id | c_property_value
ChgSprd | 1   | 7 | 1 | 346 | Even
ChgSprd | 87  | 7 | 1 | 346 | Even
ChgSprd | 88  | 7 | 1 | 346 | Even
ChgSprd | 89  | 7 | 1 | 346 | Even
ChgSprd | 90  | 7 | 1 | 346 | Even
ChgSprd | 91  | 7 | 1 | 346 | Even
ChgSprd | 92  | 7 | 1 | 346 | Even
ChgSprd | 93  | 7 | 1 | 346 | Even
ChgSprd | 94  | 7 | 1 | 346 | Even
ChgSprd | 95  | 7 | 1 | 346 | Even
ChgSprd | 96  | 7 | 1 | 346 | Even
ChgSprd | 97  | 7 | 1 | 346 | Even
ChgSprd | 98  | 7 | 1 | 346 | Even
ChgSprd | 99  | 7 | 1 | 346 | Even
ChgSprd | 100 | 7 | 1 | 346 | Even
ChgSprd | 101 | 7 | 1 | 346 | Even


The important bits are: this is application id 7 and the property id is 346. NB – I have been able to test this on a completely different system – 346 is the property that contains the time distribution and the i_application_id varies. You will need to run the above query to figure out the i_application_id.
 
The fix
With an i_application_id firmly in hand, I can write an UPDATE query as below:
/*
Possible values for c_property_value when i_prop_def_member_id = 346
Even
445
454
544
*/
UPDATE P
SET P.c_property_value = '454'
FROM DS_Property_Application P
WHERE
i_prop_def_member_id = '346'
AND i_application_id = '7'

Btw, I tried just changing the last record in the above list of applications (the 101) and it didn’t work (although I got to play with SQL’s MAX function in a subquery so there is that). I had to change ALL of the values from Even to 454. I think maybe I could have gotten away with changing just the i_library_id setting of 1 but I am not made of free time to test this stuff out. If you try it (on a throwaway instance, please) and it works, send in a comment to this blog.
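For the curious, the MAX-in-a-subquery attempt looked roughly like this – a sketch of what I tried, and as noted above, it did not do the trick:
-- Touch only the row from the most recent deploy (highest i_library_id); this did NOT fix the issue
UPDATE P
SET P.c_property_value = '454'
FROM DS_Property_Application P
WHERE P.i_prop_def_member_id = '346'
AND P.i_application_id = '7'
AND P.i_library_id = (
    SELECT MAX(i_library_id)
    FROM DS_Property_Application
    WHERE i_prop_def_member_id = '346'
    AND i_application_id = '7'
)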

Anyway, I changed them all, and then I bounced the Hyperion EPMA Server service and (since this was a compact deployment) the Hyperion EPM Server- Web Application service:

NB – In a real environment, I found I had to bounce all of the EPM services. Quite painful across a production environment but such is life. And yes, just restarting EPMA and Planning did not do it – there was a serious amount of relational caching going on.

I then logged back into Workspace, went to the Dimension Library, added CL_test3, and saw the following:

I now have 454 distribution. Success! Boil in bag! Hopefully.

The proof
So the proof of this particular pudding is to run a deploy.

And in Planning:

Fixed on both sides: Planning and EPMA. Whew.
 
A couple of notes
Again, for goodness’ sakes, this is a hack, and I had to do it, but I am pretty sure that if you try to use this blog post as evidence that this is okay and “Cameron said I can do it,” Oracle is going to laugh in your face. Do it if you must, but may you never have to.
 
With that enormous caveat, if you are going to do it in your environment:
  1. Make a backup of the development EPMA schema. And then in development…
  2. Run the query to confirm the application and property ids. Your app name will replace ChgSprd.
  3. Modify the UPDATE statement to change the spread from its current Even/445/454/544 value to the one you need.
  4. Don’t forget to change the Planning application’s 454 spread, so maybe a Planning application schema backup is in order too.
  5. Restart all Hyperion EPM services (I found that I needed complete restarts of all EPM services for this to work outside of a compact deployment and yes that hurt).
  6. See if the deploy works. Prayer to whatever God or gods you worship is recommended at this stage. 
 
This was fun, kind of, and I’m not as scared of the EPMA tables as I once was. And it was a pretty cool hack. So I guess it was worth it. But an Order in Council is still going to go out – vengeance shall be mine as all of the above ate a lot of time I didn’t have.

Be seeing you.

What Kscope13 sessions am I looking forward to, part one

Introduction
This is always a tough one because there are so many good sessions at ODTUG’s Kscope conferences. If you’ve read my posts in years past on the conference, you will know that I have ranted and raved about the unparalleled knowledge sharing, the training, the networking, the vendor expo (you would be amazed at what’s out there and I am a full member of the does-not-have-contract-signing-ability crowd and I still find it useful), the fun, etc., etc., etc. If you aren’t convinced by this point you either don’t read this blog (in which case I have to ask how you come to read this sentence) or you really don’t pay attention.

And because there is so much good stuff I am going to split my review (yet another sort-of tradition in this blog) of sessions I want to attend across multiple posts. There is just too much cool stuff going on and I want to give these sessions the due they deserve. And not write the Kscope equivalent of Gibbon’s The History of the Decline and Fall of the Roman Empire, at least from a length perspective.

With that, let’s kick off with my very favorite technological product: Essbase.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that my sessions are at least worth considering. I will cover what I am presenting later in the week – this block is for everyone not named Cameron.

Essbase sessions (stolen right off of Kscope13.com)

Practical Essbase Web Services

Jason Jones, Key Performance Ideas, Inc.
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Essbase - Subtopic: Other Essbase
This session will give an introduction to using the new Essbase Web Services feature that has recently been added to the Essbase stack. Attendees will be given an overview of functionality available and practical methods of using the available technology effectively. This presentation will be geared towards users that are familiar with basic programming concepts.

You’ve seen Jason pop up on this blog a few times – he is a real developer. I am not, sob. And I suspect most of you (apologies to those who are full time Computer Scientists) are not either. So let’s learn from a guy who has written real honest to goodness commercial software.

Using Calculation Manager with Essbase ASO

Co-presenter(s): Olivier Jarricot, ADI Strategies
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Essbase - Subtopic: Other Essbase
Get a sixty-minute cram session on using Calculation Manager with Essbase ASO. This session will navigate you through the key features within the Calculation Manager, specifically for Essbase ASO. The session will then guide you through the steps of creating calculation rules for ASO using the graphical interface and allocation and custom calculation components. Finally, the session will offer best practices and "insider" tips to shorten your learning curve and get you off to a turbo fast start!

In a previous life, I worked on one of Josie’s projects. She is all kinds of awesome. Also, I was on a project last year that screamed ASO as its solution. I had everything working except those !@#$ing level zero rate calculations and it was an interactive (budgeting) application. The client was on an 11.1.1.3 environment, so we were out of luck, to put it mildly, and BSO it was. I am super curious to see what BusyGal (her online moniker) has come up with.

Big Data and Analytics-led EPM

Al Marciante, Oracle Corporation
When: Jun 24, 2013, Session 1, 8:30 am - 9:30 am
Topic: Essbase - Subtopic: No Subtopic
What data is important to analyze in order to sign off on consolidated financial data or for accurately creating forecasts and budgets? How do users efficiently identify the key drivers and model possible financial outcomes based on uncertainty? This session will highlight the synergies between Enterprise Performance Management and Business Intelligence, and will showcase how the two together allows customers to propel their performance.

An Oracle presentation? Aren’t they just great big advertisements for Oracle? Not at Kscope they aren’t. Big Data, Big Data, Big Data, Big Data – it’s everywhere. Does it fit in EPM? I am super curious about this.

Optimizing ASO Models Using Procedural Calculations

Michael Nader, Blue Stone International
Co-presenter(s): Martin Slack, Ernst & Young
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: Essbase - Subtopic: Optimization
As analytic data sets grow, Essbase deployments are more and more focused on the ASO technology to take advantage of speed and scale. However, performing intricate calculations at run-time often leads to poor reporting performance. This session focuses on leveraging ASO procedural calculations and allocations to extend analysis and expedite reporting.

Did you read the above session by Josie Manzano-Stettler? This is the other side of calculations in ASO Essbase. I am, again, very, very, very interested in this subject.

How ASO Works and How to Design for Performance

When: Jun 25, 2013, Session 9, 2:00 pm - 3:00 pm
Topic: Essbase - Subtopic: Optimization
Why are some cubes fast and some are slow? When is it OK to use MDX and when should (or what types of) MDX be avoided? How do Solve Order, Dynamic, and Stored Hierarchies interact? By understanding how Essbase goes about resolving a query, many of these questions will answer themselves. Much of this understanding comes from the cryptic Bitmap Statistics dialog in Cube Properties. All of this is summarized in 12+1 rules. These 12+1 rules are discussed and used to illustrate how they should be used when designing your cubes. In particular, there are implications and options for the design of alternative hierarchies. These options will be discussed in terms of the trade-offs of cost (in storage size) vs. performance (in retrieval time). Concrete examples will be demonstrated.

This is one of those rare sessions that is a repeat from last year. Reruns? At Kscope? What, you want your money back? Trust me, this is worth repeating, both for those of us who attended last year and for the people who were foolish enough not to attend. Why? Because Dan has deconstructed the ASO kernel better than anyone whose email address doesn’t end in @oracle.com. It is a brilliant presentation.

Performance Optimization and Measurement: New Thoughts and Test Results for BSO and ASO

When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: Essbase - Subtopic: Optimization
Have you ever wondered why some loads are fast and others are slow? Why the same query performs differently at different times? Why your queries and loads are not as fast as they should be? Then this is the session for you. Starting with a discussion of how data file IO is handled in Windows and Unix, techniques to ensure apples-to-apples testing are presented. Then using the results of over three hundred load, and calculate/aggregate tests using very large BSO (9gB input level 175gB Calculated) and ASO cubes (1.4 billion cells 84GB aggregated). The testing spans Windows and UNIX; ASO and BSO; varying cache settings; sort order and file formats are presented. These variables are evaluated and ranked with several new and surprising conclusions. Conclusions, that in some cases, run contrary to existing best practices. Finally, expanding on the chapter "How ASO Works and How to Design for Performance" in Developing Essbase Applications, the speaker will discuss surrogate keys, MDX, multi-attribute queries. All of this will be discussed in light of real world experience where multiple cubes are running and data is prepared and hardware supplied by other parts of the organization with differing practices and priorities. In short: real techniques you can implement when you return from the conference.

I am somewhat familiar with the tests Dan has conducted. He has a very interesting take on what makes Essbase databases fast.

Essbase New and Upcoming Features

Gabby Rubin, Oracle Corporation
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: Essbase - Subtopic: No Subtopic
Over the last two years, Essbase footprint in the Oracle product portfolio had increase dramatically. In addition to being an application platform for many customer as well as Oracles own EPM applications, Essbase is a key part of Oracle BI Foundation, Exalytics and Fusion applications. These changes in the Essbase ecosystem along with other market trends such as cloud, require Essbase to adapt and evolve; But how do you prepare for the future while protecting your past? Join this session to learn about Oracles vision for Essbase and the product roadmap.

Oracle again? If you are interested in where Essbase is going, this is your best bet to hear all about it from the Oracle Product Manager himself. Will he deny everything he says the minute he steps out of the room? Probably. Will many of these projected roadmap items show up at Open World as publicly committed-to features? It has happened. :)

Advanced Essbase Studio Tips and Tricks

Glenn Schwartzberg, interRel Consulting
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Essbase - Subtopic: Other Essbase
Essbase Studio has been around since the beginning of version 11.1.X and there is still a lot of mystery around its usage. Come explore some of the more advanced features now that you have gotten your feet wet creating your first Studio model. This session will go over settings, optimizations, tips, and tricks that can make you more successful with Studio.

Glenn is my older brother from completely different parents. Of course he denies any actual familial relationship. Just like an older brother. You decide who has a firmer grasp on reality.

Clawing my way back to relevancy, Studio is a product that I seem to be drifting closer and closer to with the increasing level of SQL and ETL that I do on each project. Glenn’s sessions are always interesting and entertaining.

Thinking Outside the Box -- Optimizing Performance in Essbase and Planning

Glenn Schwartzberg, interRel Consulting
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: Essbase - Subtopic: Optimization
There are standard optimizations that developers do to improve performance, then there are those developers who think outside the box and create unique optimizations that can affect calculations, data loads, data transfers, etc. Come to this session to see some unique solutions and truly outrageous ways to improve performance.

I reviewed the first draft of this presentation – again, Glenn has an unusual mind (shades of Young Frankenstein?) and always comes up with how-the-h-e-double-hockey-sticks techniques.

Introducing the Outline Extractor NG (Next Generation)

Tim Tow, Applied OLAP
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: Essbase - Subtopic: Other Essbase
Although the OlapUnderground Essbase Outline Extractor has been downloaded by over 10,000 unique users, the technology used is quickly becoming outdated. The new Dodeca Essbase Outline Extractor is a complete redesign and rewrite of the popular Essbase Outline Extractor technology and adds the ability to output in Hyperion Planning Outline Load Utility format and to relational databases. It also adds the ability to run on 32-bit and 64-bit operating systems including Windows, Unix, Linux, and even MacOS. Attend this session to learn how to leverage this free utility in your company.

I’ve been on the beta (and have been a terrible beta participant, sorry Tim, but Kscope has eaten my life) for the NG extractor. This is the tool that we will all use going forward.

I for one am looking forward to running the NG OE on my Fat Mac.

Are you crying Uncle yet?

That is ten, count ‘em ten, different Essbase sessions. Is that the sum total of Essbase sessions at Kscope? Absolutely not. In fact there are 38 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About Hyperion Planning at Kscope.

Be seeing you at Kscope13.

What Kscope13 sessions am I looking forward to, part two


Introduction

I’ve already covered the Essbase side of the house in part one of this series. I lurv Essbase more than is likely healthy for my psyche. Or social life. But I seem to spend an awful lot of my professional life using a wrapper around Essbase. A wrapper called “Planning”. And that wrapper is a pretty powerful application in the EPM space, now with lots and lots of brand extensions.

One thing that I find interesting about Planning and all of the Financial Planning products is that many of these sessions are either focused on latest features or tips and tricks sessions with a few how-to’s thrown in. I suppose this is the nature of an application as opposed to a database, which tends to have a more theoretical bent. Or maybe I am just evincing that somewhat monomaniacal love for Essbase I wrote about above.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that the Planning session I am copresenting is at least worth considering. I will cover that later in the week – this block is for everyone not named Cameron.

Planning sessions (stolen right off of Kscope13.com) with my comments

Planning at Transaction Levels while Maintaining Performance

Chris Boehmer, The Hackett Group
Co-presenter(s): Danny Frasco, Kimco Realty Corporation
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Planning - Subtopic: General Planning
Would you like to obtain greater detail in your planning or forecasting process - e.g., to plan for greater product or customer granularity, more periods, etc.? But you're afraid that your Planning application just won't be able to cope with these requirements? In this session, you will hear how an organization's desire to forecast a large number of detailed items for a long period of time led to some innovative solutions built to enhance their Hyperion Planning application.

Hmm, this is interesting – sort of the anti-driver approach to Planning. Of course I’ve been on projects like that myself. It’s all very well to say, “Enter a few items, and let the magic of Planning allocate and aggregate the results,” and quite another to actually have a client do that. I’m interested to see what their techniques are. Could it be a paired BSO/ASO approach with partitions or maybe a reporting cube or maybe something completely different? Dunno, but this is intriguing.

Calc Manager: Go Beyond Basics and Unleash the Power of Oracle Hyperion Planning

Ludovic De Paz, TopDown Consulting
Co-presenter(s): Terrance Sundar, Shutterfly
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: Planning - Subtopic: General Planning
With Hyperion 11.1.2.3., Calc Manager is the only option to develop and deploy Business Rules. This is a fantastic opportunity for developers to leverage its latest advancements. This session will examine key features and functionality in Calc Manager and demonstrate how to successfully achieve your goals while improving quality. It will also include best practices, tips, tricks, and techniques that consultants, administrators, and end users can leverage to make completing projects and daily tasks easier.

I’m using Calculation Manager on my current project. I am not in love with the endless objects and drawings. I even had a client ask me, “Does it have to be all broken up like this?” But maybe we’re not giving the more GUI-ish nature a fair shot. For sure we aren’t using really advanced features. Could oh could oh could that be tied to treating Calculation Manager like a glorified Essbase calc script editor? Why yes it could, so I am going to try to attend this session to see the error of my ways.

Planning Experts Panel

Natalie Delemar, Ernst & Young
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: Planning - Subtopic: No Subtopic
TBD

Not a lot to go on, is it? I moderated this panel last year and it was a lot of fun. I’m not involved in the panel this year but I am looking forward to shooting a bunch of tough questions at whoever the panel is.

Dynamic Integrations for Multiple Hyperion Planning Applications

Co-presenter(s): Rodrigo Radtke de Souza, Dell
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: Planning - Subtopic: General Planning
This session will cover how to use Oracle Data Integrator and Oracle Hyperion Planning metadata repository to build dynamic, flexible, and reliable processes to maintain metadata, load, and extract data from any number of applications with a single generic component.

If it has ODI as a component of the presentation, I am interested. I almost like ODI more for its ability to tie together completely heterogeneous technologies without a ton of scripting than I do for its base purpose, which is ETL. ODI rocks! Actually, I’m more of a classic jazz fan, but I’ll try to bebop my way to this session.

Automating Hyperion Planning Tasks

Kyle Goodfriend, Rolta Solutions
When: Jun 25, 2013, Session 9, 2:00 pm - 3:00 pm
Topic: Planning - Subtopic: General Planning
Maintaining Hyperion Planning environments can be so time-consuming that there is little time for development. We are constantly being asked to do more with less. Understanding some of the utilities and options to automate redundant operations can significantly improve your ability to react to change. It allows more time for new development, eliminates human error, and increases productivity and system stability. Find out what options are available, how to use them, and see real world examples you can take home.

Kyle must be a lazy programmer. Why do I write that? And no, I am not trying to insult him, as I too am a lazy programmer. What’s at least one definition of a lazy programmer? A lazy programmer is someone who sees a task that is manual or semi-scripted, does the task once and says, “There is no way I am ever doing that again.” He then sets off to write an automated whatever. ‘Coz he’s got better things to do. I suppose a really lazy programmer doesn’t even go through the pain once. I don’t know Kyle so I can’t determine if he falls into the moderately or intensely lazy camp.

NB -- Lazy = smart.

Oracle Project Financial Planning -- The Owner's Manual

Josephine Manzano-Stettler , ADI Strategies
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: Planning - Subtopic: Project Planning
Oracle Project Financial Planning is the newest packaged application to be rolled out with the Oracle Hyperion Enterprise Planning Suite (v.11.1.2.3). This pre-built solution ties project financial planning to corporate financial planning activities within a single application, as well as, supports financial planning throughout the complete project management lifecycle. Want to find out more?...a LOT MORE? This presentation will dive deep into the new Oracle Project Financial Planning solution. The following will be covered: 1) Highlights of key out-of-the-box features, functionality, and analytical tools 2) Integration with the Workforce and Capex Planning Modules, ERP, and Project Management systems 3) Implementation considerations for a successful deployment 4) Minimizing customizations...Are your company's business needs the right fit for the packaged solution? Maximize your success for integrating Oracle Project Financial Planning into your Enterprise Planning tool set.

If Josie does it, it’s good. ‘Nuff said.

Managing Your Project Budgets: Introduction to the New Hyperion Project Planning Module

Tracy McMullen, interRel Consulting
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: Planning - Subtopic: Project Planning
Hyperion Planning 11.1.2.3 adds a new pre-built module to the existing suite of Workforce and Capital Expenditure planning. This new module, Project Planning, fills the gap of how to budget for projects both short- and long-term before they become capitalized assets. Whether you want to budget IT projects from initial proposal through implementation, capital projects to expand facilities, or development projects out in the field, the new Project Planning module can handle them all.

Guess what? If Tracy does it, it’s good. ‘Nuff said, part two.

Did you know (or care) that in a previous professional life I once turned down dancing with Josie and Tracy? On a ship? In the middle of the Gulf of Mexico? I must have been out of my mind. :) I did have a cold. But still.

Introduction to Predictive Planning in Hyperion Planning

Troy Seguin, interRel Consulting
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Planning - Subtopic: General Planning
Version 11.1.2.3 of Hyperion Planning is including Crystal Ball's predictor feature. Predictor utilizes established time series procedures to help with forecasting upcoming time periods. This presentation will walk you through how to effectively use predictor as part of your budgeting and forecasting duties as well as an intuitive explanation of the concepts working behind the scenes.

I have to say I am somewhat on the fence about this one. No, no, not about Troy, or his presentation skills, or anything like that. It’s about statistics in the hands of your average Planner. Face it, people love to gamble. Face it again, the house (casinos, state lottery boards, Nathan Detroit’s The Oldest Established (Permanent Floating Crap Game In New York), etc.) always wins over time; those institutions must if they are to survive. And yet people gamble in the “sure” knowledge that they will win. But casinos, and state lotteries, and even incredibly charming minor organized crime figures with great singing voices understand how statistical probability works, and in their favor. Wait, I just figured out who needs to go to this session. I hope Frank Sinatra is there (at least spiritually) as well.

Turbocharge Your Hyperion Planning Input Forms with Predictive Planning

Jake Turrell, US-Analytics
When: Jun 26, 2013, Session 13, 11:15 am - 12:15 pm
Topic: Planning - Subtopic: General Planning
One of the most exciting new enhancements to Planning 11.1.2.3 is its new Predictive Planning tool. This new feature allows users to plot their projections alongside those created by Predictive Planning, giving users another data set against which they can compare their results. This live demo will walk users through the process of setting up Predictive Planning and will provide several real-world examples. The session will cover: - When to use Predictive Planning and when to avoid it - Basic statistical concepts used by Predictive Planning - How to best configure input forms for Predictive Planning - A walk through of the Predictive Planning user interface - Running predictions. - Using Comparison Views to review the results of various scenarios - How to tweak your results with filters and reports Users will leave this session with the tools, knowledge, and confidence to implement Predictive Planning in their own environments.

If Troy’s presentation above is the theory, then Jake’s session is the application. So that makes Troy a physicist and Jake an engineer.

Lower your TCO With Oracle Planning and Budgeting Cloud Service (PBCS)

Shankar Viswanathan, Oracle Corporation
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Planning - Subtopic: General Planning
PBCS takes the best of breed Hyperion Planning solution to the Cloud with a compelling offering for companies looking to lower TCO by deploying their applications on the Cloud using the top Planning solution in the market. This session will introduce participants to the upcoming Planning Cloud offering from Oracle. Participants will hear the details of the offering and get a sneak preview of Oracle's Planning on the Cloud offering.

Oracle? Again? Do you want to hear about how Oracle’s cloud service works? Would the Planning product manager be the person to tell you all about it? Why yes, you should and he is. See, this is what makes Kscope so great – we the great unwashed get to interact with the people that make the products we use.

What's New in Planning?

Shankar Viswanathan, Oracle Corporation
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: Planning - Subtopic: General Planning
The latest Release 11.1.2.3 of Oracle Hyperion Planning takes Enterprise Planning to the next level by providing some valuable agile enablers within Planning. This session will discuss and showcase some of these key features in this release, and provide several good considerations for customers to help choose this release as the go-to release for deployment and upgrade.

I have but one question – Planning ASO, is it any good? This is where I will hear about it for the first time. I hope.

Are you crying Uncle yet?

That is 11 different Planning sessions. Is that the sum total of Planning sessions at Kscope? Absolutely not. In fact there are 31 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About Hyperion EPM Foundations and Data Management at Kscope.

Be seeing you at Kscope13.


What Kscope13 sessions am I looking forward to, part three


Introduction

I’ve already covered the Essbase and Planning side of the house in parts one and two of this series. What about the foundation for these tools? You know, the data and metadata that make EPM applications, well, right, accurate, useful, etc. Without good data (and metadata) all we EPM practitioners have is a pretty design and bad numbers. Hmm, I may have written a book (or at least a chapter) about this.

Happily, ODTUG agrees with me (Or do I agree with them? Whatever) and they have an EPM Foundations and Data Management track. This is the third in the series of sessions I am looking forward to, and if history and culture are any guide: there is the Rule of three, the Page Cavanaugh Trio’s version of The Three Bears, and perhaps most famously, “All Gaul is divided into three parts”. In other words, three is an important number. And so is this track.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that the SQL session I am presenting is at least worth considering. I will cover that later in the week – this block is for everyone not named Cameron.

EPM Foundation Data Management sessions (stolen right off of Kscope13.com) with my comments

Integrating PeopleSoft with Planning -- How Amerigroup Replaced HAL with ERPi & FDM

Roger Balducci, Amerigroup
When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: EPM Foundations & Data Management - Subtopic: FDM
In this session discover how Amerigroup replaced a black box HAL process with FDM & ERPi to load their Planning application. During this session the presenter will review the decision to use ERPi in conjunction with FDM to enable drill through to PeopleSoft. The session will highlight the automation that provides flexibility to process data for the entire company or a single business unit. Finally the session will demo the drill-through capabilities that ERPi provides - not only to the ledger but also to the subledger.

A project that replaces HAL? Death to HAL, I say, death to HAL. That product caused me grief, pain, and psychic discomfort every time I brushed up against its mediocre spaghetti diagrams. Yes, yes, I know, it has its defenders, but they’re wrong. Proof? Come see this presentation. You’ll feel clean afterwards, like after a mountain hike whilst eating a York Peppermint Patty. Or am I confusing that with Irish Spring soap and cheesy faux-Irish dialogue? Anyway, see how HAL got the coup de grace. And cheer.

Stump the Experts - Hyperion EPM Panel

Natalie Delemar, Ernst & Young
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
TBD

Intriguing content there, yes? :) I have no idea who is to be on this panel but Kscope always does these right with a good mix of freewheeling questions and lots of opinion. You know, the things consultants are afraid to say to their clients lest they be bounced out on their noggins. Ouch. But no clients (other than the punters in the seats) in this. I am looking forward to it.

ODI - Tips and Tricks to Build Better Integrations

Matthias Heilos, MindStream Analytics
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: EPM Foundations & Data Management - Subtopic: ODI
Oracle Data Integrator (ODI) is a powerful data integration suite which allows you to build integration processes with enormous productivity gains over conventional tools of the same breed. In this session you will gain insights in how ODI works and what you should consider to build master-class integrations. Learn about tricks and tips on architecture, Knowledge Module optimization, migration, flexible load processes, and many other areas that your organization should be aware of when working with ODI.

Matthias knows ODI. Really, really well. If he does a session on it, it’ll be good.

Think Outside the Box - New Ideas in Financial Data Management

Matthias Heilos, MindStream Analytics
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
Are you wondering if you could manage your (financial) data more efficiently? Often, the answer is yes. In this session you will see how other organizations found unusual ways to improve their financial processes. Looking at the bigger picture often allows discovery of new solutions to either automate more effectively, increase transparency, or improve your ability to adapt to change faster. Join this session to learn about unconventional ways to use Hyperion products and OBIEE.

See the above on my opinion on Matthias’ knowledge level and presentation skills. Also, I get sort of obsessed about data, so this ought to be really interesting.

How to Turn New Recruits into Oracle EPM Support Gurus

When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Foundations & Data Management - Subtopic: Infrastructure & Essbase Exalytics
Oracle EPM requires a knowledgeable team to provide production support due to its criticality as a service. Typically skill levels vary in the team as resources are pulled from other areas or are required to support multiple services. Consequently, the need for infrastructure training is a recurring theme in an organization. This presentation covers how to explain Hyperion and its architecture in a way to fully engage new support staff. It includes getting started with EPM modules, logs, and troubleshooting.

This is an interesting session and more of an Organizational Psychology topic than a technical one – I find these fascinating. Some of my clients understand how to do this and some…do not. It’s not easy finding the right person or persons, and as the EPM stack becomes more and more sophisticated and complicated the profile of the right EPM administrator has changed. And a bad admin = a bad system (yes, I have all too painfully experienced this), so this ought to be an informative session.

How Windstream Leverages Essbase Analytic Link to Increase Their Analytic Capabilities

Alexander Ladd, MindStream Analytics
Co-presenter(s): Jennifer Moline, Windstream Corporation
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
Windstream Corporation utilizes Essbase and Essbase Analytic Link to unlock analytic value from their HFM application. This presentation will detail how Windstream implemented Essbase and Essbase Analytic Link with drill through to transactional detail via FDM. See the architecture, the data flows, and how this environment was built, and hear the lessons learned about Essbase Analytic Link.

Once upon a time at a mildly disagreeable client, I worked on an HFM to Extended Analytics project. Which was somewhat amusing as I could and still can barely spell H-F-M, but it wasn’t my choice, and I met some great people along the way; nothing builds teamwork like adversity. In any case, this was back in the System 9 (remember that?) days and the link was…crude. Wouldn’t it have been great if there had been a HyperRoll module that pulled data out of HFM in real time and then pushed it to BSO Essbase as a quasi-transparent partition via a CDF? Why yes, it would have, and I wish it existed back then. And now it does, so come see how it works. Although, if I had had this, would I have made those friends? One of life’s imponderables.

Exalytics - An Apples to Apples Comparison

Co-presenter(s): Cameron Lackpour, John Booth, Tim German
When: Jun 25, 2013, Session 6, 8:30 am - 9:30 am
Topic: EPM Foundations & Data Management - Subtopic: Infrastructure & Essbase Exalytics
This session will be a panel discussion highlighting the results of our apples-to-apples test comparing an Exalytics-based solution to a comparable machine in the Amazon Cloud. These tests encompassed ASO and BSO; they covered data loads, BSO calculation, and ASO Aggregation; and finally multi-user performance tests of BSO Planning Rules and ASO Queries. Given the breadth of this testing, some of the results are applicable to non-Exalytics solutions (assuming you have "lots" of CPU and/or memory).

Okay, full disclosure here – I am involved in this one but I am but a supporting player. This is a really interesting session. And yes, I broke my own rule but if I can’t do that on my own blog, where can I?

FDM to ERPi - Upgrade/Migration Strategies & Considerations

Anthony Scalese, Edgewater Ranzal
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: EPM Foundations & Data Management - Subtopic: FDM
Not the fish, anything but the fish! The FDM product is nearing the end of its life. This session will introduce you to FDM 2.0, aka ERP Integrator (ERPi). The session will begin with a technology overview of the new product - architecturally and functionally. The session will continue on to explore key features/changes from FDM. The session will explore strategies, techniques, and key watch-outs for migration from your existing FDM application. Finally the session will discuss best/leading practices for ERPi implementation and go-forward maintenance.

The fish, the fish, oh the humanity! Hmm, something about that doesn’t make sense. Anyway, FDM has always struck me as somewhat old fashioned. And we all know that ODI is all kinds of awesome. And now we see FDM replaced with…ODI in a wrapper. This ought to be interesting. And I’m glad I never learnt how to be an FDM consultant. :)

The New and Improved FDM -- Financial Data Quality Management Enterprise Edition 11.1.2.3

Richard Wilkie, Oracle Corporation
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: EPM Foundations & Data Management - Subtopic: FDM
The FDM EE 11.1.2.3 release combines the deep functional flows of classic FDM with the deeply integrated technical aspects of ERP Integrator. This new solution allows customers to build deep integrations directly against popular ERPs like E-Business Suite, PeopleSoft, and SAP while taking advantage of the functional workflow required in any end user driven data quality process. This session will deep dive into the changes that were made, how they benefit new and existing customers, and typical use cases across the EPM product family.

Whoops, there I go slagging off Oracle’s (well, Hyperion’s which should actually be Upstream’s) fine products and yet I suggest that you attend an Oracle session on FDM EE. If you want to know where the product is going, and what it’s all about, I can’t think of a better person to listen to.

Are you crying Uncle yet?

That is nine, count ‘em nine (you will note that this is divisible by three, I stick to my themes come Hell or high water), different EPM Foundation and Data Management sessions. Is that the sum total of these sessions at Kscope? Absolutely not. In fact there are 17 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About EPM Reporting at Kscope.

Be seeing you at Kscope13.

What Kscope13 sessions am I looking forward to, part four



Introduction


I’ve covered the Essbase, Planning, and EPM Foundations and Data Management side of the house in parts one, two and three of this series. Those subjects are all about getting data and metadata into the EPM world. What about reporting it out? That is sort of the point, right?
 
ODTUG realizes that and created a reporting track just for this very purpose. And that’s a good thing as the EPM reporting options and solutions have gotten more and more complicated and sophisticated over time. When I started out, an Essbase report meant an Essbase report script. We’ve come a long way. The best way to make sense of all of that is to take a look at some of these sessions.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.
EPM Reporting sessions (stolen right off of Kscope13.com) with my comments

The Art of Ad-Hoc Analysis with Essbase

Joe Aultman, AutoTrader.com
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: EPM Reporting - Subtopic: No Subtopic
"Where is this number coming from?" "Why doesn't this look right?" "How do I explain these variances?" "Can you show me this data in a different way?" These are all questions our managers ask, and we are expected to answer with Essbase. Yet sometimes, staring at an empty spreadsheet or at a report someone else created, we may not know what do to next. Watching someone else do analysis may leave our heads spinning. Even if things aren't quite that bad, we may feel we're taking the long route to our answers, or that there might be technique we're missing from our repertoire. This session covers the basic methods of ad-hoc analysis for different modes of exploration. The session will discuss the various options, when to use them, and when not to. The session will talk about what kinds of questions lead to what kinds of analysis modes. The session will review techniques for efficiency and lesser-known features. While the session is aimed at the less experienced, there's a good chance even experts will hear something they didn't know. Smart View and Classic Add-In will also be covered.

Old school, but still what makes Essbase great. Are you getting all that you could out of Essbase? Also, Joe is a great speaker.

The Top Five Things You Should Know When Migrating from an Old BI Technology to OBIEE

Michael Bender, Performance Architects
Co-presenter(s): John McGale, Performance Architects, Inc.
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: EPM Reporting - Subtopic: No Subtopic
Upgrades and migrations are the perfect time to not only evaluate the latest and greatest technology features, but also to review your organization's business intelligence processes to make sure that you are working effectively and efficiently. The session will review how to "sell" your leadership on why to migrate; thoughts on best practices in upgrades; the difference between a "migration" and an "upgrade"; important items to note as you're thinking about this major change for your organization; migrating existing Oracle Hyperion Intelligence or Discoverer reports...or content from another vendor's BI solution...to OBIEE or BI Foundation; and much more!
 
Interactive Reporting is dead, long live OBIEE!  Time to switch, folks, if you haven't already.  And if you haven't, this would be a great session to attend.

Best Practice Methodologies with Oracle/Hyperion Financial Reporting

Joshua Forrest, Abercrombie & Fitch
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: EPM Reporting - Subtopic: Financial Reporting
Learn how developing financial reports with best practice methodologies results in successful implementations and efficient solutions. This session will utilize Oracle/Hyperion Financial Reports to demonstrate time-saving features including Row & Column Templates, Saved Objects, and Rolling Year Reports. Attendees will realize the benefit to using Annotations for commenting on results and related content to provide additional detail. Efficiency accelerators aimed at Reporting Standards and Change Impact will be covered and distributed. Attendees will leave this session with collateral and concepts that can be quickly applied into their environment.
 
I have seen good FR reporting approaches, and bad ones. The bad ones suck. A lot. If you have any doubts about what you are doing with FR, or want to improve your reporting solution, you should be here.

Smart View Query Tools: So Many Tools, So Little Time

Craig Mesenbring, Harbinger Consulting Group
When: Jun 24, 2013, Session 1, 8:30 am - 9:30 am
Topic: EPM Reporting - Subtopic: Smart View
Stuck in the same old rut in how you create Essbase reports in Smart View? Did you know that there are six different Smart View tools for querying Essbase data? In one short session, learn how, when, and why to use each tool. Get out of the rut and expand your horizons with all of the Smart View query tools.
 
Smart View has a lot of functionality beyond an ad-hoc query. Most of us (including me) don’t take advantage of said functionality. So let’s show up at Craig’s session and move on.

Leveraging Office and Smart View to Create a Data Entry Experience

Matt Milella, Oracle Corporation
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: EPM Reporting - Subtopic: Smart View
This session will take you through the steps necessary for creating advanced input templates for distribution to end users. The templates will include data from multiple sources and the examples shown will leverage new features in Office 2010 and 2013 along with Excel features that have been around for many versions (charting, functions, sparklines, outlining, and formatting). It will touch on the use of VBA to automate common tasks and we will go through some of the challenges of template distribution and maintenance.
 
Matt is an excellent speaker and of course as Product Manager knows Smart View really well. Highly recommended.

Smart View's New Features

Matt Milella, Oracle Corporation
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Reporting - Subtopic: Smart View
In this session we will get you up to speed on all the new features available in the latest version of Smart View. We will discuss general client features like sheet level options, re-designed function builder, or platform support like Office 2013 as well as provider specific features like in-cell Point of View or improved Planning Smart Lists. This session will also contain details on, and show demos of new extensions; most notably the OBI (Oracle Business Intelligence) extension.
 
What, you didn’t see my comments above? Matt is a great speaker, highly knowledgeable, and knows Smart View better than anyone. I predict (and this is a pretty safe prediction) that this will be SRO.

Customizing the Smart View Framework

Michael Nader, Blue Stone International
Co-presenter(s): Martin Slack, Ernst & Young
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: EPM Reporting - Subtopic: No Subtopic
Smart View for Office provides a dynamic framework for customizing the functionality and capabilities to suit a client's deployment. This session provides practical, deployed examples of customizing Smart View for both Planning and Essbase.

Fellow Developing Essbase Applications author Mike Nader is a great speaker and does really interesting work.

Hyperion Financial Reports: A Love and Hate Relationship

Mehmet Sevinc, University of California Berkeley
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: EPM Reporting - Subtopic: Financial Reporting
Hyperion Financial Reporting (FR) has been around for a long time. In spite of rumors, FR has been a go-to reporting tool for Profit & Loss, Balance Sheet, Cash Flow, and various other financial reports. The simplicity of the tool, the user-friendly design, and easy training of the users have made FR a very popular reporting tool in the Enterprise Performance Management field.
 
This sounds like a good basic (sorry, Mehmet if this is super advanced, let me know and I will edit accordingly) session on FR. If you’re starting out with FR, this is the place to be.

Automating Hyperion Reporting for the Rest of Your Organization

Jim Wilking, Harbinger Consulting Group (HCG)
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: EPM Reporting - Subtopic: Smart View
Everyone realizes that Hyperion users should be utilizing the built-in reporting tools, but how does the rest of your organization get their reports? Not everyone in your organization has Hyperion access. Learn from a Hyperion certified consultant how to harness the built-in power of Smart View VBA Functions to generate reports. This session will focus on how to develop and deliver reporting to the non-Hyperion user community in your organization utilizing your existing tools. Live examples and code will be used to explore solutions to many of the common reporting scenarios. Specific examples will display the benefit of the Smart View 11.1.2.3 multiple grid reporting and butterfly reporting functionality. This session will help you shave hours off of your daily, monthly, and quarterly reporting tasks.
 
More cool stuff with 11.1.2.3 with cool code. Doesn’t that sound cool? It does to me.

Are you crying Uncle yet?
That is nine, count ‘em nine, different EPM Reporting sessions. Is that the sum total of these sessions at Kscope? Absolutely not. In fact there are 14 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About Other Interesting Topics at Kscope13.

Be seeing you at Kscope13.


What Kscope13 sessions am I looking forward to, part five


Introduction

I’ve covered the Essbase, Planning, EPM Foundations and Data Management and EPM Reporting side of the conference – but what about the rest of Kscope13? Is there anything else at the conference worth attending other than this EPM love fest I have described?

Kscope13 doesn’t cover every single tool that Oracle offers (although I can see ODTUG’s president, Monty Latiolais, doing his usual and profoundly bone-chilling Dr. Evil impersonation as he discusses this very topic at the next ODTUG board meeting – and yes, that is a tough act to pull off for a 6’5” Texan with a full head of hair, but it is true, ODTUG board meetings do discuss the utility and best application of sharks with laser beams in our bid for world domination), but it does speak to an awful lot of them.

What do I mean? Other than EPM, there are the following tracks: Application Express, ADF and Fusion Middleware, Developer’s Toolkit, The Database, .Net, Building Better Software, and Business Intelligence. Did I mention that there is also an EPM Business Content track? Is ODTUG on its way to our planned world domination? That is an awful lot of technological ground; most of the Oracle world, then.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. There are even a few hated and deeply resented greatly admired ex-bosses in the mix. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

And yes, this list is waaaaaaaaaaaaaaaaaaayyyyyyyyyyyyyyy too long for any one person to attend. There is just Too Much Good Content. A nice dilemma to have when it comes to picking where to go.

Application Express

It looks like the Apex world is focusing on the cloud this year at Kscope. I’ve been using EPM in the cloud for almost three years and have happily made Amazon an even richer company. Why don’t my projects live in the cloud? I POC there, but what about the actual development?

It looks like Apex has made the leap to the Cloud, but for real. I can’t wait for EPM to get there.

Developing Real World Applications in the Cloud

Joseph Acero, JSA2 Solutions
Co-presenter(s): Gina Haub, South Texas Project Nuclear Operating Company
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Application Express - Subtopic: The Basics
Cloud technologies like Amazon Web Services and the new Oracle Cloud allow for rapid application development using distributed teams. This session will walk through the set up and best practices for developing in the cloud while walking through a real world case study.

Amazon Cloud Setup for APEX Environments

Martin D'Souza, ClariFit
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: Application Express - Subtopic: Infrastructure/Management/Security
Moving your database into the cloud is a popular option within organizations for development all the way through to hosting production applications. There are several large cloud service providers that offer Database as a Service solutions, Amazon being one of them. This presentation will guide you through setting up an Oracle database on Amazon's Web Service (AWS) Relational Database Service (RDS) platform and setting up web servers to host APEX environments. Other areas to be discussed will be usage for secure development, offline production calculation, other AWS features, and comparisons with other cloud service providers.

Oracle Database Cloud Update

Rick Greenwald, Oracle Corporation
When: Jun 27, 2013, Session 19, 11:00 am - 12:00 pm
Topic: Application Express - Subtopic: No Subtopic
The Oracle Database Cloud went live in 2012. This session will give an overview of the progress of the Database Cloud, including discussions on initial rollout, subsequent enhancements, customer adoption and best practices for working with your own Database Cloud Service. In addition, the session will discuss some general direction for the Database Cloud, as well as act as a forum for your ideas for this Cloud platform

ADF and Fusion Middleware
ADF is here for us EPM geeks. Some call that good, others not so much. If you want to understand why 11.1.2.x of Planning or HFM or whatever looks the way it looks, this is the track to follow.

Also, I see Most Excellent Debra Lilley but where oh where oh where is Really Quite Smelly Stanley? I do hope he makes an appearance. :)

Oracle's Roadmap to a Simple, Modern User Experience in Oracle Fusion Applications

Jeremy Ashley, Oracle Corporation
When: Jun 27, 2013, Session 19, 11:00 am - 12:00 pm
Topic: ADF and Fusion Development - Subtopic: Customizing Fusion Apps
Simplify your user experience. Lower implementation costs. Increase productivity. Delight your users. Are you looking to wow your employees with a user interface that is simple, modern, and compelling? Learn how Oracle's drive toward enhancing productivity helps you achieve value from your application's investment. This session will show you how you can exceed your employees' desire for enterprise data, delivered on any device, and then explain how to reduce the cost of your user interface customizations, configurations, and extensions.

Mobile Development: A Practical Example of Design Patterns in Action

Susan Duncan, Oracle Corporation
Co-presenter(s): Debra Lilley, Fujitsu
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: ADF and Fusion Development - Subtopic: Advanced ADF: Mobile, Cloud, Web Services, etc
Oracle's User Experience and ADF teams have worked together to produce a set of mobile design patterns that allow the development of intuitive, easy, and productive to use mobile applications. The patterns range from how to design your navigation, to list layout, to editing a business object, to how to invoke actions that yield a simple and apparent way to complete a task. These patterns work well across platforms (e.g. iOS, Android, and BlackBerry) and are supported by ADF. The patterns have been vetted in Oracle's own mobile products (e.g., Sales, Time Entry, Expenses, Field Service), and work across different user roles and product lines. In this session you'll see a practical example of building a mobile application using these scientifically proven UX design patterns to solve a real customer use case: a conference feedback application. Using ADF Mobile one hybrid mobile application will be developed for deployment both as an iOS and Android mobile app. The session will include a discussion on how the design patterns were used in approaching the problem by both the customer and ADF and UX teams and how this same approach is used for Oracle Fusion Applications. It will be jointly presented by Susan Duncan who leads Oracle's Mobile Development Program office and Debra Lilley, Oracle Customer/ACE Director instrumental in the Fusion User Experience Advocates Program: both active in mobile application design and development in their respective roles.

Developer’s Toolkit

In my dreams, and only my dreams, I am a Data Warrior. Oooh, that is such a good title. Essbase Hacker really doesn’t have the same ring.

Good data is the foundation of all of our systems. Well, okay, bad data is the foundation of some of our systems, but they aren’t likely long for this world, are they?

I almost wish this track were at another conference so I could simply sit in on all of the sessions and not worry about all of the other tracks I want/need to attend. It is simply that good, from features to the guts of the tools to optimizing to organizational psychology. It’s amazing content.

One last comment – this track has the most creative session names – I simply had to include the one on the 1980s and the other on successful dating.

Top Ten Cool Features in Oracle SQL Developer Data Modeler

Kent Graziano, Data Warrior
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: Developer's Toolkit - Subtopic: No Subtopic
Oracle SQL Developer Data Modeler (SDDM) has been around for a few years now and is up to version 3.x. It really is an industrial-strength data modeling tool that can be used for any data modeling task you need to tackle. Over the years, the presenter has found quite a few features and utilities in the tool that he relies on to make him more efficient (and agile) in developing his models. This presentation will demonstrate at least ten of these features, tips, and tricks for you. He will walk through things like installing the reporting repository, building a custom report on the repository using Oracle SQL Developer, modifying the delivered reporting templates, how (and when) to use the abbreviations utility, how to create and apply object naming templates, how to use a table template and transformation script to add audit columns to every table, how to add custom design rules for model quality checks (heck how to use the built-in quality checks), and several other cool things you might not know are there. Since there will likely be patches and new releases before the conference, there is a good chance there will be some new things for the presenter to show you as well. This might be a bit of a whirlwind demo, so get SDDM installed on your device and bring it to the session so you can follow along.

What? You're Still Not Using Groovy?

David Schleis, Wisconsin State Laboratory of Hygiene
Co-presenter(s): Joe Aultman
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Developer's Toolkit - Subtopic: Languages
If you spend time writing Java code, and you're not using Groovy, you're spending too much time writing code. If you've ever pondered an Essbase automation problem and said, "I wish I knew how to write Java," and you haven't looked into Groovy, your answer is here. Groovy is an object-oriented dynamic language (also referred to as a scripting language) like Ruby or PHP. Like these languages, Groovy is much easier to use and has a simpler syntax than Java. However, what makes Groovy different than other scripting languages is that it compiles to Java bytecode. This means that Java programs can run Groovy, and Groovy programs can run Java. This seamless integration with Java and its concise syntax are why Groovy is the language of choice for scripting of ADF Business Components. This integration also means that writing in Groovy makes it easier to use existing Java libraries; including the Java libraries of the Essbase Java API. This session is an introduction to the Groovy programming language and how it can be used in conjunction with the Essbase JAPI to make advanced automation more accessible.

The 80's Called, They Want Their Command Line Interface Back

Jeff Smith, Oracle Corporation
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Developer's Toolkit - Subtopic: IDEs
I only use SQL*Plus. I say that graphical IDEs are the best. Who is right? How can an old-school database pro be convinced to use newer technology, and more importantly, why SHOULD they be convinced? Tools are designed to do one thing - increase productivity. If your tool is slowing you down, you're doing it wrong, or you're using the wrong tool. Watch Oracle's SQL Developer product manager debate himself on why SQL Developer can be good for both the new and advanced Oracle user.

An Oracle Geek's Guide to Successful Dating

Sean Stuber, American Electric Power
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: Developer's Toolkit - Subtopic: Languages
This session will be a short examination of Oracle's date/time datatypes and best practices for manipulating them in SQL and PL/SQL.

The Database

Sometimes I wish this blog was titled Cameron’s Blog for SQL Hackers. My current project has turned into one big festival of SQL.  If only I weren’t like a 3-year-old playing with the stuff.  Maybe, just maybe, this track could help me.

Optimizer Hints: Top Tips for Understanding and Using Them

Maria Colgan, Oracle Corporation
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: The Database - Subtopic: Tuning
The most powerful way to alter an execution plan is via hints; but knowing when and how to use hints correctly is somewhat of a dark art. This session explains in detail how Optimizer hints are interpreted, when they should be used, and why they sometimes appear to be ignored. By attending this session you will arm yourself with the knowledge of how to apply the right hints, at the right time.

Oracle Optimizer: An Insider's View of How it Works

Maria Colgan, Oracle Corporation
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: The Database - Subtopic: DBA
With each new release the Optimizer evolves as we strive to find the optimal execution plan for every SQL statement. Understanding how the Optimizer operates and what influences its choices helps you provide the necessary information to make that nirvana a reality. This session explains in detail, how the latest version of the Optimizer works and the best ways you can influence its decisions.

Exadata and the Optimizer

Maria Colgan, Oracle Corporation
When: Jun 27, 2013, Session 17, 8:30 am - 9:30 am
Topic: The Database - Subtopic: Tuning
Knowing when and how to take advantage of each of Exadata's performance enhancing features can be a daunting task even for the Oracle Optimizer, whose goal has always been to find the optimal execution plan for every SQL statement. This session explains in detail how the Oracle Optimizer costing model has been impacted by the introduction of the performance-enhancing feature of the Exadata platform. It will show through the use of real-world examples what you can do to ensure the Optimizer fully understands the capabilities of the platform it is running on without having to mess with initialization parameters or Optimizer hints.

Bye-bye CONNECT BY - Using the New Recursive SQL Syntax

Dominic Delmolino, Agilex Technologies
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: The Database - Subtopic: SQL
Hierarchical queries in Oracle have always been a challenge, even for advanced SQL practitioners, with Oracle-specific SQL language elements which are not part of the SQL standards. Beyond the SQL-92 standard, the ANSI SQL:1999 standard added the definition of a recursive query which has now been adopted by Oracle. This presentation will talk about how to translate common CONNECT BY statements and use the new construct to solve more esoteric problems. The attendee will benefit by using this new standard, portable construct for hierarchical queries in Oracle and other databases.
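
If, like me, CONNECT BY is the only hierarchy trick in your bag, here is roughly what the translation looks like. This is my own simplified sketch against a made-up employees table, not something lifted from Dominic's session:

-- Old-school, Oracle-only hierarchy walk
SELECT employee_id, manager_id, LEVEL AS depth
FROM   employees
START WITH manager_id IS NULL
CONNECT BY PRIOR employee_id = manager_id;

-- The ANSI SQL:1999 recursive equivalent (a recursive WITH clause)
WITH emp_tree (employee_id, manager_id, depth) AS (
    SELECT employee_id, manager_id, 1
    FROM   employees
    WHERE  manager_id IS NULL                               -- anchor member: the top of the tree
    UNION ALL
    SELECT e.employee_id, e.manager_id, t.depth + 1
    FROM   employees e
           JOIN emp_tree t ON e.manager_id = t.employee_id  -- recursive member: walk down one level
)
SELECT employee_id, manager_id, depth
FROM   emp_tree;

Same answer either way, but the second form is standard, portable, and (per the abstract) handles the more esoteric problems that CONNECT BY fights you on.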

Oracle Database Tools 101: How Does All This Stuff Get Built Anyway?

John King, King Training Resources
When: Jun 25, 2013, Crossover Sessions, 5:30 pm - 6:30 pm
Topic: The Database - Subtopic: No Subtopic
If you've been an Essbase/Hyperion, Applications, or BI user you may wonder what all the "hubbub" on the other side of Kscope is all about. Or maybe you're curious -- "I know there's a database under the covers and lots of developers; what do they do?" If you want to know about the underpinnings of your favorite Oracle software, this session is for you. We'll talk about how it all fits together: database, SQL, PL/SQL, ADF, Forms, APEX, and more (without too many boring details)! Attending this session will improve your understanding of and ability to communicate with the "bit-twiddlers" in your organization.

Tom's Top Twelve Things About the Latest Generation of Database Technology

Thomas Kyte, Oracle Corporation
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: The Database - Subtopic: No Subtopic
The session will be taking a look at the latest generation of database technology and zeroing in on twelve high-impact capabilities, looking at what they are and why they are relevant.

WIT Session: "The Imposter Syndrome- When Successful Women Feel Like Frauds"

When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: The Database - Subtopic: No Subtopic
What is Imposter, (aka Fraud) Syndrome and why do so many people feel this way? Georgia University psychologists, Pauline Rose Clance and Sue Imes penned the term "Imposter Syndrome" back in 1978 when referring to those that were susceptible to feeling that they were frauds or imposters, no matter how skilled or successful they were. Both experts recognized that only about 20% of the cases involved men and started to investigate how much was due to nature vs. nurture and culture.

.Net

This is so interesting – a Microsoft tool track at an Oracle tools conference. Shall the Lion and the Lamb lie down together?

Getting Started with Oracle and .NET

Christian Shay, Oracle Corporation
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: .NET - Subtopic: .Net
This beginner-level session introduces Oracle's offerings for .NET programmers, including Oracle Data Provider for .NET (ODP.NET), Oracle Developer Tools for Visual Studio, Oracle Providers for ASP.NET, and .NET stored procedures. Step-by step-demos illustrate how to get started with developing Oracle Database .NET applications by using each of these free products. New and upcoming .NET features, including fully managed ODP.NET, Microsoft Entity Framework features, Microsoft Visual Studio 2012 support, and schema compare tools are also described briefly in the session.

PL/SQL Programming for .NET Developers: Tips, Tricks, and Debugging

Christian Shay, Oracle Corporation
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: .NET - Subtopic: .Net
.NET and Oracle programmers frequently work with PL/SQL, whether that means setting up a call to a stored procedure from a .NET program, executing a PL/SQL anonymous block, or writing and debugging the PL/SQL stored procedure code. In this session, we'll look at leveraging PL/SQL from the point of view of a .NET developer and will provide in-depth tips about how to configure and use the tightly integrated PL/SQL debugger in Visual Studio. We will also introduce the new Visual Studio Schema Compare tool and show how this new feature, along with automatic SQL script generation and source control integration assists in the Oracle database development lifecycle.

Building Better Software

Sigh, I should be attending this track as well. I have been on good projects, and I have been on bad ones. Happily more of the former than the latter but we all get to serve our time in hell. I have noticed that the bad projects are almost always a case of bad planning and design.  Wouldn’t it be nice to not go down that path? This track shows you how to do just that.

Five Ways to Make Data Modeling Fun

Kent Graziano, Data Warrior
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: Building Better Software - Subtopic: Modeling
Most people think data modeling is booooorrring, right? While data architects the world over all agree that data modeling is a critical success factor to any well-engineered database or data warehouse, many struggle with how to get their organizations to support their efforts. What if you could make data modeling sessions more engaging for the business folks? The end result would be better data models. Using some common games and concepts, this session will show you how to make data modeling fun. This will be a very interactive session complete with audience participation and maybe some prizes!

Performance is a Feature: Here is the Specification

Cary Millsap, Method R Corporation
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: Building Better Software - Subtopic: Instrumentation
To many software developers, designers, and architects, "performance" is a side-effect...an afterthought of designing and building proper features like "book an order" or "look up a book by author." But great performance at scale doesn't happen by accident. The first step is to know what performance *is*: it is the answer to the question, "What have people been *experiencing*?" Knowing what people experience when they use your software is possible only if you treat performance as a proper feature, a feature you analyze, design, build, test, and maintain. This session explains the steps that will get you started.

EPM Business Content

Business Content isn’t really my thing, except of course I don’t exactly build systems for laboratories. So actually, Business Content is my thing, or at least something I need to care about. Will my brain hold this much information? I may be reaching the tipping point.

What's the UPK?

Opal Alapat, TopDown Consulting
When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: EPM Business Content - Subtopic: Case Studies
Oracle User Productivity Kit (UPK) allows companies to develop, deploy, and maintain content for training and testing. It can help mitigate project risk, reduce deployment and project timelines, and assist with end-user adoption. This presentation will review the basics of UPK, different use cases for implementing it, and how to leverage it for testing and training purposes. In addition, this presentation will include tips and tricks for getting started and will highlight both the administrator and user perspectives.

The Evolution in Forecasting: Hyperion Planning 11.1.2.3

Tracy McMullen, interRel Consulting
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Business Content - Subtopic: Product Demos
This session will highlight new features available in Planning 11.1.2 including a number of enhancements like approvals, full Smart View functionality, adhoc analysis over the web in a data form, built-in data syncing to an ASO database, and Public Sector Planning. 11.1.2.3 goes even farther by introducing a new module to do project planning, integrating Crystal Ball for predictive planning, and impressively, adding charts and graphs to composite forms to make Planning into more of a dashboard experience.

Business Intelligence
BI and EPM – yes, they are finally coming together. It still isn’t perfect but then again what is? Take a look at the sessions below – if you are interested in EPM or BI, these sessions alone could justify the cost of Kscope. Arrrgh, I need to be cloned so my army of Camerons (now really, that is a scary thought and in theory I am their leader as Ur-Cameron) can attend all of this cool stuff.

I have to also say, when I look at these presentations, I think, “Geez, why am I not doing this stuff?” Indeed, why aren’t I? Maybe I should be a BI geek and not an EPM geek. It would probably be a better intellectual fit. Maybe.

Innovations in BI: Oracle Business Intelligence Against Essbase & Relational Part 1

Stewart Bryson, Rittman Mead
Co-presenter(s): Edward Roske, interRel Consulting
When: Jun 27, 2013, Session 18, 9:45 am - 10:45 am
Topic: Business Intelligence - Subtopic: OBIEE
In OBIEE (Oracle Business Intelligence Enterprise Edition), you can create models against multiple disparate sources that pull metadata and facts from relational databases and multi-dimensional sources. A particularly powerful combination is to use Essbase for pre-consolidated cube data with an Oracle database alongside for transactional information. This session will utilize the power of both sources to build on the strengths of each. Join Oracle ACEs, published authors, and presumed experts Stewart Bryson and Edward Roske as they demonstrate the fun of metadata development against a sample Essbase database sourced from an Oracle database. Attendees will leave the session knowing how to model complex Essbase options and integrate those with relational sources like the Oracle database.

Innovations in BI: Oracle Business Intelligence Against Essbase & Relational Part 2

Stewart Bryson, Rittman Mead
Co-presenter(s): Edward Roske, interRel
When: Jun 27, 2013, Session 19, 11:00 am - 12:00 pm
Topic: Business Intelligence - Subtopic: OBIEE
In OBIEE (Oracle Business Intelligence Enterprise Edition), you can create models against multiple disparate sources that pull metadata and facts from relational databases and multi-dimensional sources. A particularly powerful combination is to use Essbase for pre-consolidated cube data with an Oracle database alongside for transactional information. This session will utilize the power of both sources to build on the strengths of each. Join Oracle ACEs, published authors, and presumed experts Stewart Bryson and Edward Roske as they demonstrate the fun of metadata development against a sample Essbase database sourced from an Oracle database. Attendees will leave the session knowing how to model complex Essbase options and integrate those with relational sources like the Oracle database.

Using OBIEE and Data Vault to Virtualize Your BI Environment: An Agile Approach

Kent Graziano, Data Warrior
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Business Intelligence - Subtopic: OBIEE
Interview the users, design a reporting model, and follow up with mounds of ETL development, keeping the user community in the dark during that development. Sound familiar? This presentation will demonstrate an alternative approach using the Data Vault Data Modeling technique to build a "Foundation" layer in our data warehouse with an Agile methodology. Using the Business Model and Mapping (BMM) functionality of OBIEE, we can virtualize a dimensional model using the Data Vault Foundation layer to decrease the time it takes to get BI content in front of users. Attendees will see a sample Data Vault model designed iteratively and deployed to the semantic model of OBIEE.

Fusion Applications and Your BI/EPM Investment

Debra Lilley, UKOUG / Fujitsu
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: Business Intelligence - Subtopic: No Subtopic
Oracle Fusion Applications are here today providing the next generation of applications. They are about having everything the user needs in one place, and that includes information. Fusion Applications is a window on Oracle's Fusion Middleware stack and a very big part of that is BI/EPM and analytics. This presentation will include a small demo of how Fusion looks and is designed to give you an appreciation of how BI/EPM is embedded in Fusion. For anyone thinking of Fusion in the future, it will underline that your BI/EPM investment today is an investment in that future, and is protected.

Oracle BI Applications 11g with ODI: What You Need to Know

Kevin McGinley, Accenture
When: Jun 24, 2013, Session 1, 8:30 am - 9:30 am
Topic: Business Intelligence - Subtopic: OBIEE
Oracle BI Applications 11.1.1.7.1 PS1 was recently released, adding full support for ODI to the data integration side of BI Applications. This presentation will give details about the new release, including comparisons to previous releases along with demos of how to enable an out-of-the-box ETL run using the new features of ODI, Configuration Manager, and Functional Setup Manager.

Best Practices with Oracle Data Integrator : Flexibility

Gurcan Orhan, Rittman Mead
When: Jun 25, 2013, Session 6, 8:30 am - 9:30 am
Topic: Business Intelligence - Subtopic: No Subtopic
Oracle Data Integrator (ODI) seems slow when it is installed out-of-the-box. Since it has to comply with different versions of the databases and operating systems, the default installation is not the optimal choice. ODI is a flexible product that can be customized for specific requirements and to implement new features of the database or operating systems. Attendees will learn how to easily create a customized ODI environment. This presentation will demonstrate the flexibility of the Knowledge Module, configuration best practices and the best query response time tips and techniques. It will include information about how to load an extensive number of files quickly with a special algorithm, as well as how to define new or customized data types and analytical functions.

GoldenGate and ODI - A Perfect Match for Real-Time Data Warehousing

Michael Rainey, Rittman Mead
When: Jun 25, 2013, Session 9, 2:00 pm - 3:00 pm
Topic: Business Intelligence - Subtopic: Related BI Technologies
Oracle Data Integrator and Oracle GoldenGate excel as standalone products, but paired together they are the perfect match for real-time data warehousing. Following Oracle's Next Generation Reference Data Warehouse Architecture, this discussion will provide best practices on how to configure, implement, and process data in real-time using ODI and GoldenGate. Attendees will see common real-time challenges solved, including parent-child relationships within micro-batch ETL.

OBIEE and Essbase Integration in BI Foundation Suite 11.1.1.7 - Workspace Returns!

Mark Rittman, Rittman Mead
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Business Intelligence - Subtopic: OBIEE
The 11.1.1.7 release of Oracle BI EE incorporates Essbase into the product install, and provides a combined OBIEE/Essbase platform for high-performance BI and Analysis, optionally hosted on the Exalytics platform. In this session we'll look at this combined architecture, see what role Essbase plays within it and whether Shared Services is still needed, see how Workspace integration with OBIEE is now restored, and see how SmartView provides MS Office integration across both tools.

Using OBIEE to Retrieve Essbase Data: The Seven Steps You Won't Find Written Down

Edward Roske, interRel Consulting
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: Business Intelligence - Subtopic: OBIEE
If you've ever tried to find information on accessing Essbase from OBIEE, you'll be scouring badly written blogs for days, because there just isn't much published on this. This session will cover the seven poorly documented steps you must do to make sure your Essbase cube isn't flattened, it's in the correct outline order, aliases appear, and more. If you own Essbase & OBIEE and would like to integrate them, learn these seven steps and you too can start your own badly written blog (no offense).

Making Sense of Oracle's Business Intelligence and Analytics Offerings

Tim Vlamis, Vlamis Software Solutions, Inc.
Co-presenter(s): Dan Vlamis, Vlamis Software Solutions
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Business Intelligence - Subtopic: OBIEE
There is no best way to build BI systems and significant trade-offs exist. The BI gurus at Vlamis will compare and contrast alternative strategies for integrating different data sources into OBI systems and use scenarios to outline best practices and to evaluate the costs and ROI. They will talk about BI Apps and Fusion Apps and analytics, OBIEE and Endeca, Essbase and OLAP, Oracle Data Mining and Oracle R Enterprise, and Exalytics, Exadata, and Exalogic, and contrast the Oracle Database Appliance with the Oracle Big Data Appliance.

Are you crying Uncle yet?
Has the content completely overwhelmed you? I know it has for me. And I also know that I have said this many times over – there is simply nothing, and I mean nothing, that can touch Kscope for breadth and depth of content. Nothing.

If preparing presentations and the planning and setup work for Kscope wasn’t so time consuming (I am doing 3.5 presentations and I easily have 400+ hours of effort in this plus all of the other ODTUG things I am doing) I would want Kscope to be quarterly. Sanity and sheer physical exhaustion prevents that but I hope this little romp through the sessions shows how much great, cannot get it anywhere else, isn’t it amazing that people will share this, content Kscope has. I am soooo glad ODTUG exists.

The next and final blog post on Kscope13 sessions will be The Truth About Cameron’s Interesting (?) Kscope13 sessions.

Be seeing you at Kscope13.

What Kscope13 sessions am I looking forward to, Cameron edition


Introduction

In my five previous posts I’ve covered the Essbase, Planning, EPM Foundations and Data Management, EPM Reporting, and Everything Else sessions I am looking forward to (and admitting that I am somewhat unlikely to attend them all, as I would need to be Cameron * 5 to do so, and that is a scary prospect, even to me).


But what about my sessions?  Am I ever going to give you at least a hint as to what they are all about?  Am I, indeed.  I like to think, in my humble (ahem) way, that I in fact have some slightly useful information to impart.  You may not agree but after all you are reading this blog.  If you think my presentations stink and yet you’re still here you must at least admit you are somewhat confused.  OTOH, if you think my presentations aren’t half bad (and if they aren’t better than that I have wasted hundreds of hours on them, which I suppose is possible), you may be interested in the following sessions.

Top Six Advanced Planning Tips

When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
I am co-presenting this with Jessica Cordova and it’s a chance for the two of us to impart some of the lessons learnt and techniques we’ve picked up over the years in Hyperion Planning-land.  Jessica is the primary and I am the junior on this one, and you’ll understand what that means in a moment.

A bit of a side note

Actually “Six” in the title is something of a misnomer because in fact we are only going to present three of the sections.  Ah, I hear you (and I do, really, in my mind, which is confusing because of course one hears with one’s ears) exclaim, “Wot?  ‘e said six an’ now he says three.  ‘e’s barmy ‘e is.”  Writing dialect is tough, especially when your inner Cockney only extends as far as really liking the Lambeth Walk.  I lay blame at the feet of my love for British war movies which, just like American ones, always have a mix of men from civvy street and there’s always someone from south of the Thames.  Also, their RSMs have great script writers.  Wow, quite an excursion into the Weird Entertainment Cameron Likes.

And we now return to our regular programming

Anyway, Jessica and I wrote our sections, rehearsed them via GoToMeeting, and realized that we had quite a bit (a lot) more than 50-odd minutes’ worth of content.  So our choice was to either cut sections out entirely or skim through our topics.  As the whole point was to do advanced topics we were in a bit of a bind until I realized that we could put what sections we are doing up to a vote and do the balance at an ODTUG webinar after Kscope.  We are going to pinky promise on this and we all know that means that is a promise unto death.  Also, this will be an opportunity for you the audience to decide exactly what you want from us – it will be Kscope session democracy in action.  I hope you’re willing to indulge this slightly offbeat approach as I think what we have to say is pretty good.

Monday Night Madness/Hyperion EPM Open Mic Night

When: 8:00 pm to 10:00 pm
There is an Open Mic Night at Monday’s after-session fun.  Like a fool, I have volunteered to be one of the speakers in case there is a dearth of volunteers.  Also like a fool, and quite true to form (so I am repeating myself) I didn’t realize that one is not required, nay, is not allowed to write a presentation for this.  I wrote one.  Good thing I’ve got a really cool demo to go with it.  


We’ll see if I do this, but if I do, I have got a really awesome twist on focused aggregations in Calculation Manager/Planning, except this time the products are Calc Scripts/Dodeca.  All I’ll say is that I have a way round the big problem with focused aggregations in Calculation Manager.  It is fast, fast, fast, fast.

Exalytics - An Apples to Apples Comparison

When:  Jun 25, 2013, Session 6, 8:30 am - 9:30 am
I think this has to be the Kscope group project to end all group projects.  By that I mean that John Booth, Tim German, Dan Pressman, and I all got together on the Mother and Father of all benchmarking tests.


Thanks to the generosity and quite possibly world record patience of Mark Rittman of the eponymously named Rittman-Mead, we have access to an Exalytics box.  What oh what oh what to do with it?  Why benchmark it against a really fast generic Linux box (that John bought with his own money – we are committed, or nuts) of course.  I also love that these servers are named Asgard and Zeus.  Naming the servers after mythological figures was coincidental and I think indicative of their speed.  Both environments are fast and put to shame anything I have ever seen at a client.  This project encouraged me to buy a mega laptop (well, mega as of summer 2013 – 32 gigs of RAM, 8 CPUs, and a SSD); I shall never buy one with a physical drive again.


What are our results?  As is usual, we are testing this down to the very last minute (some, like me, would argue that we are testing this beyond the last minute), so I honestly cannot say.  I will bet that whatever box you’ve got, we’ve got our hands on a faster one.  :)

Practical SQL for EPM Practitioners

When:  Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
I’m terribly excited about this because I have found that I spend more and more of my time in projects using SQL to do all sorts of interesting and unusual (at least for an EPM geek) things.  It’s really opened my horizons wrt the roles I can play on a project and I find writing SQL code to be oddly therapeutic.  Yeah, I’m weird.


I’m going to use practical examples that I have used in the real world to show a bunch of techniques and approaches that I’ve found useful.  Of course, just like the Planning presentation I have more, much more, than I can possibly fit into a single session but I can temper that deficiency by blogging about it here.


As I was not all that long ago firmly in the “SQL means SELECT * FROM …” camp it’s been quite a transformation in skills for me and a valuable one to boot.  If you do EPM, and I don’t just mean Essbase and Planning, and yet are not a SQL hacker like me, I encourage you to attend.

Lunch n’ Learn

When:  25 June, 12:45 pm to 1:45 pm
I’m sharing the dais with my much-admired former boss, Tracy McMullen (I have no idea what her title is now other than Mrs. Awesome but when I worked at interRel, she was Director of Consulting) and Chris Barbieri (Chris, I admire you too, even if you hate Essbase for reasons of technological jealousy (Mr. HFM) or maybe just sheer perverse bloody mindedness); John Booth moderates.  These are always free-wheeling and open ended.  I do get quite hungry.  This time I’m bringing an energy bar.

A BSO Developer's Introduction to ASO Essbase

When:  Jun 26, 2013, Session 13, 11:15 am - 12:15 pm
I actually didn’t sign up to do this presentation but was instead asked to do it.  I’m not an ASO wizard by any means but that actually was a good thing for this presentation because the topic is in fact how to approach ASO when one is a BSO geek.  

Yes, this has been done to death at Kscope but I think I bring two unique approaches to this subject.
  1. I use Sample.Basic, the BSO database that just about everyone knows, as the subject for my conversion.  It is both harder and easier to convert than you think.  As everyone (hopefully) knows what Sample.Basic is all about wrt calcs, dimensionality, etc., I can use the familiar constructs of that database to explain a plethora of ASO topics.  Although ASO and BSO are very different technologies one can use many BSO constructs to understand ASO.
  2. I used Dan Pressman’s standout ASO chapter in Developing Essbase Applications to dive deeply into the ASO kernel and understand how to design ASO databases for performance.  This was really exciting for two reasons:
    1. I need to understand how tools work.  IOW, “I gots to know”.  Hopefully in a less cinematically violent way than Dirty Harry but I at least share that innate curiosity about How Things Work.  Have you ever seen a bitmap index?  You will, and it won’t make your head explode.  I now know how ASO works, or at least as well as anyone who doesn’t have an @oracle.com email address does.
    2. I thought that Dan’s chapter was simply amazing, but also very, very dense.  I don’t mean that as a criticism in any way – it is a difficult and highly technical subject and a complete explanation of it is therefore obliged to be just as difficult and highly technical.  I read the chapter six times from beginning to end in an effort to understand it as well as multiple phone calls with Dan.  I am happy to announce that I think I have made more accessible what I consider to be the most important work on Essbase extant.  Do I cover all of what he wrote about or even delve into it at the same depth?  No, that simply isn’t possible within the 50-odd minutes (that time thing keeps on popping up, doesn’t it?) allotted to me but I think if you’ve read Dan’s chapter and did a big “whaaaa?” you should come to my session.  With its lessons under your belt you’ll be able to tackle Dan’s work again and again.  Btw, Dan is reprising his session from last year (quite unusual in Kscope but such is the import of his work) on 25 June from 2 to 3.  Even if you go to Dan’s session I encourage you to come to mine – the bitmap, the bitmap, the bitmap is difficult to understand at first but it is the essence of ASO performance and I cover it from a beginner’s perspective.  Did I mention I was excited?  :)


Fwiw, this was a very difficult presentation for me to write because so much of it is theory – I tend to think of myself as more the engineer type who takes theory and applies it as opposed to a physicist who strictly stays on the theoretical side of things.  So a bit of a stretch but if you are a BSO developer or someone, like me, who has seen some real dogs of ASO applications and guessed that BSO design principles may not apply, come to this session.

Conclusion or my ODTUG cup runneth over

I’ve figured out that in addition to the above two full presentations, plus the Planning copresentation, plus the benchmarking presentation, plus the open mic presentation, plus two private presentations for ODTUG/Oracle, I am somewhere in the neighborhood of 4 ½ to 5 presentations this year. As I write all of my presentations from scratch (I do not exactly have an army of minions who do this for me) that is an incredible amount of work.  I would guess somewhere around 300 to 400 hours and it came at the expense of just about everything that wasn’t work related; there are others (hello, Dan) on these projects who have put in even more unpaid time.  Yes, I love Kscope, but maybe I can love Kscope14 a little more by trying just a wee bit less.  We’ll see if that actually comes true.  


In any case, I think I’ve got some great things in store for the next week and I will be live blogging starting Saturday with the volunteer event.  


Be seeing you round Kscope13.

Kscope13, day -1


Introduction

For the last two years at Kscope I have live-blogged during the conference; Kscope13 will be no different.  This is both a fun and difficult task/duty.  Sometimes I am really, really, really busy and cannot blog during the day – I will do my best, but no promises.  And with that slightly deflating disclaimer, off we go.

Where oh where am I?

Why, on the 31st floor of the Sheraton New Orleans.  Yes, that’s the Mississippi.  The view is pretty spectacular at night.


Volunteer day kickoff
The EPM community (Essbase only, actually) started its relationship with ODTUG at Kaleidoscope 2008.  2008 also saw the first volunteer event at the then-Kaleidoscope, which was in fact at a school.  We were back at a school, this time the Charles Easton Charter High School, in the Mid City/Bayou St. John districts of New Orleans.


We did the usual back breaking (but welcome) work:  moving, painting, sanding, cleaning, etc.  Here we are in the cafeteria about to split up to our respective tasks.


Like a fool (I think I mentioned this in my last post), and yes, with me that is redundancy, I “knew” that indoors was hot, and outdoors was…less hot.  Hmm, I did not count on the working air conditioning but as I ignored my way out of the work I should have done and instead went to the outdoors work, you can guess it was hot, really hot.  And what did I do?  Sanded wood to make it ready for picnic tables and benches.  Here’s William Booth hard at work.  Hot work, indeed.


Here’s a view of the painting.  We were busy, busy, busy.


You can see the lumber pile to the right – it’s just a fraction of what we sanded by hand.  


There was a bit of a spanner thrown into the works because of a rainstorm that happened in the middle of the outdoors work but we simply covered up and jumped back outside the moment the rain stopped.


The payoff and a warning

The school was really very nice about thanking us ODTUGers (not the official name but one I like) for the hard work we did.  It really is a pleasure and one of the things that differentiates Kscope from every other conference.


Back at the hotel, as I made my way up to my room, wet, smelly, and utterly wrung out, two women asked, “What’s an Od-Tug?”  I explained that we are an Oracle software conference (I passed by the whole user conference marketing moment although perhaps I shouldn’t have) and that we always kick off with a volunteer day.  They were pretty surprised and said that it was really awesome we gave back to whatever town hosts our conference.


It is really awesome and as I wrote, it makes Kscope unlike any other conference.


And now the warning

Hey, when the NOPD tells you not to bring a gun onto school property, they really mean it.  Hey, it’s only up to five years of hard labor in a chain gang; think how well you’ll be able to hone your Cool Hand Luke impersonation.  Happily I was only armed with my rapier wit.


There’s still the Welcome Reception to come later tonight.  


I hope to be seeing you all at Kscope13.

Kscope13, day 0


Introduction

Sundays at Kscope are always Symposium day.  I understand that there are multiple other Symposiums, one for each track.  Of course, being an Essbase geek, the only thing I care about is EPM, and here I am, sitting in the EPM Symposium, listening to Oracle management talk about the latest and greatest in the EPM space.

And the content is…

Sorry, can’t tell you.  That’s the deal – come to the symposium, do NOT blog about what you hear.  Safe harbor statements abound (so they’re going to tell us the future, but reserve the right to change their collective Oracle mind) as well as requests to NOT take photographs, NOT blog about what we learn.  Those are the rules.


So that means if you aren’t at Kscope13, you don’t know what is coming in the EPM space.  And I’m not (I cannot) tell you.  Stinks, doesn’t it?  The way to solve this is to come to Kscope14, and every year thereafter.  That’s my plan for career futures.  :)


What I can tell you is there is some very interesting news about Essbase.  It’s stuff we have all wanted for a long time.  Again, sorry if you are not at Kscope but we attendees are not allowed to tell you more about it per Oracle’s request.  So yes, a big, big tease.


And some very interesting news about Planning.  I have wanted this functionality for approximately forever, or at least since 2002 (ah, Planning 1.5, or maybe 1.1 – I no longer remember but oh my goodness you were buggy).  Alas, I again cannot tell you much of anything.  In fact nothing.

Conclusion

Are you gathering that Kscope gives you information that you cannot get anywhere else?  This is important stuff that defines the future of what we do and no other user conference delivers this information.


The brutal sadist in me sort of enjoys telling you that there is all sorts of cool stuff on offer at the Sunday symposium.  The caring nurturing inner Cameron wishes you were here.  Square the circle, bind the wound, cut the Gordian knot, for goodness’ sakes stop me from tortured metaphors and just make sure you are here next year at Kscope14 so I don’t have to keep on telling you about all the cool things I (and everyone else at Kscope13) know, and you don’t.  See, I really am the caring, nurturing sort.


Be seeing you at Kscope13 (oh, we happy few) and Kscope14.

Kscope13, day 1


Introduction

Monday is the first “real” day of the conference.  All very, very cool and for the first three sessions I am also room ambassador.  Alas, this does not mean that I shall henceforth be referred to as “Your Excellency” although I am totally okay with that form of address in future.  You decide.

And the content is…

I was the ambassador for the first three sessions.  That means I passed out evaluation slips, was a microphone monkey, and got cool pins I get to attach to my conference lanyard.

Essbase and Planning Calculation Basics

What’s worse than having an 8:30 session?  Why it’s having Oracle Essbase development kibitz on your session and offer improvements to your content.  But it was all in the spirit of helpfulness and John Booth was able to recover.

Essbase New and Upcoming Features

Super cool stuff.  You really, really, really should have been there.  Maybe I will update with details if allowed to.  Watch this space.  


Btw, if I am not allowed to, then all I can tell you is that Gabby Rubin, Essbase Product Manager, talked about some really interesting futures for Essbase.  And oh yeah, you should have been there.

Introducing the Outline Extractor NG (Next Generation)

I was on the beta for Tim’s latest and greatest version of the tool and was very possibly the worst beta participant although I did at least find one bug.  It was very exciting to see the tool actually on display for the world to see.


The Outline Extractor that Applied OLAP has maintained throughout the years has been incredibly useful.  I understand that the single largest downloader, based on domain name, is none other than Oracle, so it was perhaps no surprise to see Oracle development staff in the room.

Top Six Advanced Planning Tips

This is a session Jessica Cordova and I gave – she was the senior and I the junior.  And of course we lied – we didn’t actually do six tips, but instead three, which is sort of not what was promised.  We lied (well, we did, didn’t we?) because we had so much awesome stuff that we couldn’t possibly do all of it in a single session and instead have pinky promised to do the balance in an ODTUG webinar.  It took a lot of pressure off and we still had good content.  

Advanced Essbase Studio Tips and Tricks

This is a session given by my not-actually-related-in-any-way-and-in-fact-he-denies-the-whole-thing older brother Glenn Schwartzberg.  Glenn always does quality work and I wish oh wish oh wish I could have been there but unfortunately I got pulled away by other things.  Sorry big bro, but I’m sure I’m the worse off for not being there.

Keynote

This was really quite good.  Doc Hendley, of the Wine Into Water foundation, did a fantastic job.  A very inspiring and humble man who is determined to do good in this all too imperfect world.


ODTUG also announced that Kscope14 will be in Seattle, Washington.  My parents lived there back in the early 1960s in a basement apartment that had constantly moldy walls, no matter what my mother did to clean them.  My mother, who hasn’t returned to Seattle since 1961, also said, “It was a nice little town.”  I wonder if it has grown any since then.  :)

Conclusion

As always, Kscope is great, Kscope is exhausting, Kscope is the place you ought to be if you care at all about your overall knowledge and of course a chance to meet and greet all of your fellow Oracle geeks.


Be seeing you in New Orleans.

Kscope13, days 2, 3, and 4


Introduction

Oh dear, I am rather afraid that the entire concept of live-blogging, or live-tweeting (see that box to the right of this text on my blog), has been a complete and utter failure this year.  Sorry.


What happened is that I, as I seem to do, managed to completely overcommit and then had to live up to the promises.  The end result was I had no time to:
  • Blog
  • Tweet
  • Floss my teeth (yuck, TMI, why?, really?, and just what is your dentist going to say about that?)


The first two I shall hope to correct below and the last, rather personal failure, is this blog’s fault.  Multiple times during this past week I sat down on my hotel room’s couch/davenport/settee to write this blog post and simply fell asleep.  I then would wake up a few hours later, realize the battery on the laptop had died again, plug it in, brush my teeth and fall into bed.  I shall hope that extra attention to dental hygiene in future will correct my slackness.  So far, no cavities.

So what did I do, and what did I see?

Monday, 12:30 – Top Six Advanced Planning Tips

Jessica Cordova and I lied to ODTUG (well, we did actually clear this with them first, but this is our doing, not ODTUG’s) and only presented three advanced Planning tips.  This is not because we hate Planning, or ODTUG, or our audience but because when we combined our work we knew we had far more than 50-odd minutes of content.  So we timed everything (Jessica and I come from the rehearse-it-to-death school of presenting), figured out what would fit, and only presented that at Kscope13.  Look to this space and an ODTUG email for details about our webinar in August where we do the balance (or maybe all) of the tips.  What we presented was still Good Stuff and we got to cover it at the length and scope that it deserved.

Tuesday, 8:30 – Exalytics – An Apples to Apples Comparison

This was the group project to end all group projects.  We (John Booth, Tim German, Dan Pressman, and yr. obdnt. srvnt.), thanks to the rather incredible generosity of Mark Rittman, were able to benchmark a generic Linux box against Exalytics to see which was faster, and why.


The presentation itself was more of a journey in how we set up a benchmark (I think real benchmarkers would laugh at our methodology but we had never done this before) and what choices we made, and why, although there were some results.  


The benchmarking result re which is faster, btw, is the classic Essbase result – it all depends on what you are doing and why.  I will also note that from a storage perspective we really didn’t do a good job setting up like to like comparisons but this was a hobby project (and for all of us, just one of many) and we did our best.  Suffice to say that now we know how to benchmark much, much better.  Hopefully the audience didn’t feel cheated by that.  

Tuesday, 10:45 – Practical SQL for EPM Practitioners

This was the session I was most excited about presenting as I have recently been doing rather a lot with SQL in my EPM projects.  


The presentation was given from a beginner’s perspective (this is easy for me because, as far as SQL is concerned, I too am a beginner) and covered some of the techniques that I have found useful.


Everyone who does EPM needs to get on the SQL train (and yes, I was one of those Essbase geeks who until quite recently could only write “SELECT * FROM …” so thank you not-really-my-big-brother-but-oh-how-I-wish-you-were Glenn Schwartzberg for helping me (or maybe like completely doing my job) with HFM Extended Analytics; thanks also to Dan Pressman for other SQL content in the presentation).  I stand on the shoulders of giants.


The reason you, gentle reader, need to be more au courant with SQL is because it empowers you in your organization and with your systems.  It honestly isn’t that hard and I hope that this presentation helps you along the way to SQL mastery.

Tuesday, 12:45 Hyperion Apps Lunch n’ Learn

Thanks to the generosity of the OTN program, every year ODTUG presents multiple Lunch n’ Learn sessions across the tracks.  I have been in the Hyperion Apps one as I seem to do that for a living.


I was the Master of Ceremonies aka Microphone Monkey as the original MC/MM, John Booth, was unable to attend because of a family emergency.  I actually think John asked me to do this but I completely forgot (as you may notice, I have a few things going on at this conference and also my memory stinks) so this was a bit of a surprise.  I think the audience participation and the board’s ability to answer was pretty good – fellow ACE Director Tracy McMullen and ACE Chris Barbieri did a great job as usual.


I am quite pleased that Lunch n’ Learns have hit their stride.  I MC’d/MM’d one in, I think, 2010, and it was just painful eliciting questions from the audience.  That was not at all the case at Kscope13.  


Thanks again, OTN.

Wednesday, 8:30 Experts Panel:  Essbase BSO Optimization

This was supposed to be moderated by John Booth but as I explained above he had a family emergency and so regretfully was not available.


Glenn Schwartzberg stepped in to moderate and Edward Roske, Tim German, Mike Nader, Steve Liebermensch, and yr. obdnt. srvnt. all sat in.  It was a pretty freewheeling discussion and I learnt something new about Essbase Report Scripts and data extraction.  Will my former boss (Edward) be proven right yet again?  It may pain me, immensely, if so, but Watch This Space for a new data extract post in the next few weeks.

Wednesday, 10:45 – A BSO Developer’s Introduction to ASO Essbase

This was for me, a BSO developer, a bit of a stretch.  It was difficult to write because so very much of it was theoretical, rather than practical application of theory; if you notice this blog, I tend to fall on the practical side of things.  OTOH, if one wants to do ASO right, one must also understand how ASO Essbase works.  Dan Pressman wrote the book (okay, the chapter) on this subject but I always thought his work, while incredibly important, was too hard for many of us to really understand.  Maybe we (or maybe I mean me) are dumb, maybe it is just a really complex subject.


In any case, I used this session as an opportunity to use BSO constructs and descriptions to sort of, kind of, describe how the ASO kernel works (yes, this was a little dangerous and yes, I was very careful to note when the analogies completely broke down) and then apply that understanding to MVFEDITWWW aka Sample.Basic converted to ASO.  It’s really a case of using terms and concepts we BSO types are familiar with and then applying it to ASO.  In my many, many, many conversations with Dan over ASO, that’s the approach that finally led to the “Ah-ha!” moment and I hope that slant plus the conversion of Sample.Basic via two different techniques was the theory made concrete for the audience.


I hasten to add that this presentation was really just a small part of Dan’s work and I am not suggesting that downloading my deck is the same as reading (and rereading and rereading and rereading) his chapter.  If you haven’t yet understood the key to ASO’s internal design (and given that there were about 80 people in the session, I’d say not everyone has), I encourage you to read my presentation as an introduction and then tackle his work.


Thanks again, Dan, for putting up with what must have been a record number of calls.  Now I think I finally understand ASO.

Wednesday, the rest of the afternoon

I am officially Not Allowed To Talk About It (I must keep some mystery in my life), but I’ll just note that I had Yet Another Presentation.

Thursday, the rest of the conference

Alas, I missed all of the sessions on Thursday as I slept in (I sort of had a busy past few days) and so missed Steve Liebermensch’s Essbase Exalytics session, and then had a meeting with my Australian Sister-Across-The-Waters (aka fellow board member and Oracle ACE Bambi Price) about ODTUG’s relentless path to world domination (we talked about Seriously Practical conferences in Asia with Frank Chow, one of my “lucky” EPM buddies).


And that, for me, was the end of the conference.

The end of What Cameron Did This Kscope

I haven’t even begun to cover all of the other things that went on at Kscope13, all the cool things that I could have done and wished I did, how amazingly fast it all went by, or how incredibly tired I am.


Suffice to say, it was an AMAZING conference and proof, if proof be needed, that no other organization throws an Oracle conference/party the way ODTUG does.  Thanks goes to Oracle, fellow presenters, fellow attendees, YCC, the Kscope conference committee(s), my fellow board of directors, and the many, many, many volunteers who make this conference possible.  It is, without exaggeration, the professional peak of my year and I simply could not do my job without ODTUG and Kscope.  I am indebted to you all.


Be seeing you next year in Seattle, Washington, for Kscope14.  I can hardly wait.

What makes Essbase data extraction fast?


Introduction

I am so sad/pathetic/a-geek-desperately-in-need-of-a-life.  No, that is not the point of this blog post (although the statement is true) but instead it is an observation that I am an inveterate hacker who just cannot leave a potentially good hack alone.

What do I mean by that?

At Kscope13 (alas, sadly passed but wow it was fun) this past week (a bit of a time issue here as I started writing this whilst flying home on the plane – ooh, technology, and sadness but what else am I going to do with myself?)  I sat in on a BSO optimization panel.  My hated and despised former oppressor (I mean beloved former employer), Edward Roske, mentioned the subject of data extracts and also that he found Essbase Report Scripts to be faster than MDX for exporting data.  This intrigued me (actually, I think my eyebrows went over my hairline, round the top of head, and down my neck) because that is not exactly the result I saw in my testing here.  OTOH, Edward doesn’t drop hints like this unless he is pretty sure of what he says and thus it behooves me to give it a try and see what happens.


Edit -- Edward also mentioned that he used a report script keyword called {SSFORMAT} as part of his data extraction approach.  This was even more intriguing because I’ve never heard of it and I have been using report scripts for an awfully long time.  What oh what oh what is he going on about?

What I’m going to try to do with this post

While I started this blog entry out as a test to try to measure the impact of {SSFORMAT} on report script data extraction, of course my inquiries went off the rails as they are wont to do and I found myself measuring the much more interesting overall performance question (and I think what Edward was really alluding to):  What is the fastest Essbase data extraction to disk method?  As seemingly always with Essbase, “It depends”, but this blog post will attempt to qualify what the dependencies are and which approach is best for your data extraction needs.

Why this got so interesting so fast

I think one thing I’ve learnt from participating in an Exalytics benchmarking test (which was more like a treatise on how maybe not to perform a benchmark) is to try to have a variety of test cases.  And that turned out to be really important in this example because I soon found that the time Essbase takes to extract data is only one part of the performance puzzle.  There is also the not so little issue of how long Essbase/MaxL/whatever takes to write that information to disk.   Do not underestimate this component of performance because if you do, you will be guilty of the same mistake that I made, i.e., thinking that the time shown in the application log for a particular action is equivalent to the actual time for a data extraction  process ‘cause it ain’t necessarily so.


The first question (and the obvious one if you solely look at the logs) is which tool has faster Essbase performance?  Report scripts or MDX?  With or without {SSFORMAT} if a report script?  What makes {SSFORMAT} faster if it is indeed faster?  Can other techniques using report scripts have equivalent speed?  Is {SSFORMAT} any use at all?  


And then (once you, the tester, have noted some weirdo results) there is the elapsed time question:  which tool has faster overall (command issued to output complete on disk) performance – report scripts or MDX?


Whew, what a lot of questions.  I guess I am just a benchmarking fool because I try to answer these with databases you can (mostly) replicate/improve/totally prove me wrong with.


What do I mean by all of this?  Simply that the Essbase application log lies.  Sometimes.  


NB – I do a little bit with the BSO calc script language DATAEXPORT but as I am going to spend time bouncing between BSO and ASO I will pretty much be ignoring that approach.  There are some numbers in the last test suite for your edification.

The truth

The Essbase report script time is 100% accurate – if it takes 25 seconds to write a report script out to disk, it took 25 seconds from the time of issuing the command to the time that you can edit the file in a text editor.  So truth, justice, and the American way.

{SSFORMAT}

So what about {SSFORMAT}?  Does it really make any difference?  Edward mentioned it was undocumented (so how did he know?) but actually it can be found in official Oracle documentation.  So not documented as such, but shown in code samples.


Of course, once I heard this I simply had to try it out to see what it does.  And also of course, as it is an undocumented (mostly) keyword I didn’t really know what it could or should do.  From a bit of testing I can relate that the command:
  1. Removes page breaks and headers aka { SUPHEADING }
  2. Uses tab delimits aka { TABDELIMIT }
  3. Rounds to zero decimal points aka { DECIMAL 0 }
  4. Is supposed to make reporting fast, fast, fast
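
If you would rather not rely on a (mostly) undocumented keyword, the first three behaviors in that list can be written out explicitly with the documented commands.  Here is a minimal sketch of a report that mimics {SSFORMAT}’s formatting – the page, column, and row selections are placeholders for illustration only, not my modified Bottom.rep:

// A sketch only -- the explicit equivalents of items 1 through 3 above
{ SUPHEADING }
{ TABDELIMIT }
{ DECIMAL 0 }
// Placeholder layout; substitute your own page, column, and row selections
<PAGE (Measures, Scenario)
<COLUMN (Year)
<ROW (Market, Product)
<DIMBOTTOM Market
<DIMBOTTOM Product
!

The point of writing it out long-hand is simply that each piece is documented and you can pick and choose – keep the decimals, for instance, and drop only the headers.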


Is it really the case that {SSFORMAT} makes extracts faster?  Is Edward right or not?

A mangled <DIMBOTTOM report

I took the Bottom.rep report script that comes with every copy of Essbase and Sample.Basic, modified it to write lots and lots of rows, and came up with the following report.  Yes, it is kind of ugly, but it generates 133,266 rows when not using {SSFORMAT} – not exactly a big report but something that will hopefully register in the Sample application log.


Without SSFORMAT



Looking at this report, we can see column headers, space delimiting, and (although you can’t see it, I can) page breaks.


Total retrieval time?  1.172 seconds.

With SSFORMAT





The row count is now 120,962 rows.  The data is the same, but the header is suppressed so fewer rows.


Total retrieval time?  0.906 seconds.  That’s almost a quarter faster – 22.7%.  So in the case of this database at least, Edward is right.


And why is he right?  He’s right because {SSFORMAT} gets rid of stuff.  Stuff like spaces and headers, in other words, it’s making the file smaller and smaller = faster.  At least in this case.


But Sample.Basic is not exactly anyone’s description of the real world.  What about another sample database?

Enter ASOsamp.Sample

Just like Sample.Basic, it is sort of difficult to argue that this database is totally reflective of much of anything in the real world as it is a pretty small database in ASO terms.  Regardless, you too can run these tests if you are so inclined and it is at least a little more realistic than Sample.Basic.

Test cases

I came up with a bunch of different report scripts to try to see if I could duplicate what I saw in Sample.Basic and also if I could come up with a way of duplicating {SSFORMAT} or realize that there was some magic in that command.  Is there?


Here’s the base report for all tests except the last two. 


Note the order of the dimensions on the rows.  This will become important later.  This is (thanks, Natalie Delamar for finding the link) ASO good practice.  Except that it isn’t, at least sometimes.  Read on, gentle reader.


The test case basically takes the above report and modifies, a bit, how the data gets exported.
Name      Details
Test4a    Base report with missing rows suppressed, member names, repeated rows, smallest to largest on rows
Test4b    As Test4a with tab delimit, decimal 0, suppress headings
Test4c    As Test4a with SSFORMAT
Test4d    As Test4a with decimal 16, suppress headings
Test4e    As Test4d, with SSFORMAT
Test4f    As Test4d, largest to smallest on rows


I set up a simple MaxL script with the timestamp keyword to get the true export time.


The MaxL code to run the report scripts

spool on to "c:\\tempdir\\Report_script_query_ASOsamp.log" ;


login XXX XXXX on XXXX ;


alter application ASOsamp clear logfile ;


set timestamp on ;


export database ASOsamp.sample using server report_file "Test4a" to data_file "c:\\tempdir\\Test4a.txt" ;
export database ASOsamp.sample using server report_file "Test4b" to data_file "c:\\tempdir\\Test4b.txt" ;
export database ASOsamp.sample using server report_file "Test4c" to data_file "c:\\tempdir\\Test4c.txt" ;
export database ASOsamp.sample using server report_file "Test4d" to data_file "c:\\tempdir\\Test4d.txt" ;
export database ASOsamp.sample using server report_file "Test4e" to data_file "c:\\tempdir\\Test4e.txt" ;
export database ASOsamp.sample using server report_file "Test4f" to data_file "c:\\tempdir\\Test4f.txt" ;
export database ASOsamp.sample using server report_file "Test4g" to data_file "c:\\tempdir\\Test4g.txt" ;
export database ASOsamp.sample using server report_file "Test4h" to data_file "c:\\tempdir\\Test4h.txt" ;
exit ;


I can use the times that MaxL throws into the log file to figure out exactly how long the export process really takes.

Export results

Name      "Essbase time" (sec)   File size (bytes)   # of rows    Start time   End time   Elapsed (sec)
Test4a    48.429                 201,033,651         1,475,017    13:17:03     13:17:52   49
Test4b    47.337                 137,485,186         1,316,166    13:17:52     13:18:39   47
Test4c    46.541                 137,485,443         1,316,169    13:18:39     13:19:26   47
Test4d    50.431                 306,891,743         1,316,166    13:19:26     13:20:16   50
Test4e    49.978                 306,892,000         1,316,169    13:20:16     13:21:06   50
Test4f    41.633                 306,891,743         1,316,166    13:21:06     13:21:48   42


Hmm, that “It depends” comment is rearing its head again, isn’t it?  There’s barely any difference between the tests.  The difference that I saw with Sample.Basic might be an anomaly or (probably more likely) it might just be too small of a database to measure much of anything.


So in this case at least, Edward is wrong – {SSFORMAT} has no measurable impact, at least on ASOsamp.  And even file size has no real impact.  Weird.


There is one thing that doesn’t make a ton of sense – take a look at that last test, Test4f.  It does something that, in theory, ASO Essbase is supposed to hate – largest to smallest dimensions on the rows.


And that’s what provided better performance, even with 16 decimal points.  So is that post from 2009 wrong?

A brief side trip into proving an adage

So is that Essbase Labs blog post right, or wrong?  Only one way to know.


Name      Details
Test4g    Decimal 0, smallest to largest on ROWs, just about all dimensions
Test4h    Decimal 0, largest to smallest on ROWs, just about all dimensions


Name      "Essbase time" (sec)   File size (bytes)   # of rows    Start time   End time   Elapsed (sec)
Test4g    59.099                 204,609,652         1,316,166    13:21:48     13:22:47   59
Test4h    197.411                204,609,652         1,316,166    13:22:47     13:26:04   197


Here’s the code with smallest to largest on the row:


And per Test4h which is largest to smallest on the row:


So this is interesting.  Sometimes, organizing the report from smallest to largest, as in the above Test4g, results in much faster performance.


And sometimes as in Test4f (which, to be fair is not a complete orientation of dimensions to the row), largest to smallest is faster.


I hope you all know what I am about to write about which approach is right for your report:  It depends.

But what about MDX?

Indeed, what about it?  The output formatting that comes out of MaxL via MDX, in a word, stinks.  But who cares if it stinks if it’s fast, fast, fast.  Of course it has to be fast to be worthwhile.  So is it?

The test cases and the code

MDX doesn’t really have an output command the way a report script does – it can’t be part of an export database command in MaxL.  Instead, one must run it via a shell.  Happily (or not, as you will see in a moment), MaxL can be that shell.


I wanted to mimic the report script code as closely as I could.  Of course I can’t use  {SSFORMAT} but I can certainly try to test how long these queries run for and what happens when I write more or less data to disk.  I added or removed content from the output files by increasing/decreasing column width, decimals, and the non-specified POV dimensions on the row or not.
Test cases
Name    Details
MDX1    2 decimals, 40 wide, dims on row
MDX2    As MDX1, no dims on row
MDX3    As MDX1, 80 wide
MDX4    As MDX2, 80 wide
MDX5    16 decimals, 40 wide, dims on row
MDX6    As MDX5, no dims on row
MDX7    As MDX5, 80 wide
MDX8    As MDX6, 80 wide

Sample code

To give you a feel for the code, here’s the basic query with POV dimensions on the row.
SELECT
    { CrossJoin ( { [Years].Children }, { [Measures].[Original Price] } ) }
ON COLUMNS,
     NON EMPTY CrossJoin ( { Descendants( [Products] ) } ,CrossJoin ( { Descendants( [Stores] ) }, { Descendants ( [Geography] ) } ) )
ON ROWS,
    CrossJoin ( CrossJoin ( CrossJoin( CrossJoin ( {[Transaction Type]}, {[Payment Type]} ), {[Promotions]} ), {[Age]} ), {[Income Level]} ) ON PAGES
FROM [ASOsamp].[Sample]
WHERE ([Time].[MTD]) ;
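
For reference, here is roughly how a query like this gets run from the MaxL shell so that the output spools to a text file.  This is a minimal sketch only – the login placeholders mirror the MaxL script earlier in this post, and the spool path and column width are illustrative rather than the exact settings behind the numbers below:

spool on to "c:\\tempdir\\MDX1.txt" ;

login XXX XXXX on XXXX ;

set timestamp on ;
set column_width 40 ;

SELECT
    { CrossJoin ( { [Years].Children }, { [Measures].[Original Price] } ) }
ON COLUMNS,
     NON EMPTY CrossJoin ( { Descendants( [Products] ) } ,CrossJoin ( { Descendants( [Stores] ) }, { Descendants ( [Geography] ) } ) )
ON ROWS,
    CrossJoin ( CrossJoin ( CrossJoin( CrossJoin ( {[Transaction Type]}, {[Payment Type]} ), {[Promotions]} ), {[Age]} ), {[Income Level]} ) ON PAGES
FROM [ASOsamp].[Sample]
WHERE ([Time].[MTD]) ;

logout ;
spool off ;
exit ;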


Sample output

So is MDX quicker than a report script?

Name    "Essbase time" (sec)   File size (bytes)   # of rows    Start time   End time   Elapsed (sec)
MDX1    9.789                  318,513,488         1,316,184    9:18:01      9:24:47    406
MDX2    9.701                  265,867,452         1,316,187    10:18:49     10:25:30   401
MDX3    9.802                  634,394,487         1,316,188    8:31:13      8:41:23    610
MDX4    9.688                  529,101,199         1,316,187    8:12:52      8:21:30    518
MDX5    9.76                   318,514,116         1,316,185    10:59:35     11:15:45   970
MDX6    9.716                  265,867,401         1,316,187    10:33:16     10:45:26   730
MDX7    9.774                  634,394,436         1,316,188    11:26:09     11:41:57   948
MDX8    9.729                  529,101,001         1,316,187    11:50:58     12:03:32   754


Yes and no, all at the same time.  Yes, the Essbase time as logged in the ASOsamp.log file is much, much quicker than a report script.  But the overall time (again from the timestamp command in MaxL) is slower.  A lot slower.  MaxL is not particularly good at writing data out.  It’s single threaded, and writing out half-gigabyte text files is quite arguably not really what MaxL is supposed to do.  And it agrees.


Of course if one could grab that output via MDX and write it out more quickly, the MDX would be the fastest retrieve method bar none, but that simply isn’t an option in a simple scripting test.  I can’t say I know what Perl or the API or some other method might do with this kind of query.  Any takers on testing this out and pushing the Envelope of Essbase Knowledge?


For the record, Edward is right, report scripts beat MDX as an extraction process.  Or do they?

One last test

I have to say I was a bit stumped when I saw the above results.  Edward was right?  Really?  Against all of the testing I had done on that Really Big BSO database with report scripts, DATAEXPORT,  MDX NON EMPTY, and MDX NONEMPTYBLOCK?  Really?  Time to run the tests again.  


I’m not going to go through all of the tests as they are in that post, but here are the tests and the results (and note that I believe I goofed on that old post, report scripts are 10x as bad as I measured before.  Whoops) for your perusal.

The tests

Name            Details
ExalPlan_RS1    Extract of allocated data with 16 decimals, tab delimit, member names, repeated rows, missing rows suppressed
ExalPlan_RS2    Same report script layout as ExalPlan_RS1, but {SSFORMAT} and SUPEMPTYROWS only
ExalPlan_Calc   Essbase BSO calc script using DATAEXPORT
ExalPlan_MDX1   Same layout as report scripts, uses NON EMPTY keyword
ExalPlan_MDX2   Same as MDX1, but uses NONEMPTYBLOCK

The results

Name            "Essbase time" (sec)   File size (bytes)   # of rows   Start time   End time   Elapsed (sec)
ExalPlan_RS1    13,392.3               1,788,728           21,513      15:40:52     19:24:04   13,392
ExalPlan_RS2    13,068.7               1,243,630           21,513      19:24:04     23:01:53   13,069
ExalPlan_Calc   947.842                2,548,818           21,512      23:01:53     23:17:41   948
ExalPlan_MDX1   640.562                3,485,581           21,522      23:17:41     23:28:24   643
ExalPlan_MDX2   1.515                  3,536,291           21,835      23:28:24     23:28:29   5


So in this case, Edward is wrong.  MDX trumps report scripts.  I think maybe, just maybe, he’s human and is sometimes right, and sometimes wrong.

What’s going on?

What an awful lot of tests – what does it all prove?


I think I can safely say that {SSFORMAT} doesn’t really make much of a difference, at least with the report that I wrote for ASOsamp.  Maybe in other reports, like the one I wrote for Sample.Basic, it does.  Maybe not.  You will have to try it for yourself.


Think about what {SSFORMAT} will do to the output – especially the zero decimal points (and implicit rounding).  Is that what you want?  If yes, then it is a quick and dirty way to get rid of headers and tab delimit.  Other than that, I can’t say I see any real value in it.  


The current score (Edward and I are not really in any kind of competition, but this is a handy way to figure out who is right and who is wrong, or righter or wronger, or…well, you get the idea): Edward 0, Cameron 1.


Now as to the question of report script vs. MDX I think the question becomes a lot murkier.  In some cases, especially extracts that export many rows of data, report scripts – at least the reports I wrote against ASOsamp – are significantly faster than MDX.


The score has changed to Edward 1, Cameron 1.  A dead heat.


But what about queries that go after lots and lots and lots of data (think really big databases) and only output a relatively small amount of rows?  Then MDX, at least in the case of a ridiculously large BSO database, is faster.


One could argue Edward 1, Cameron 2, but honestly, it’s a pretty weak point.


What does it all mean for you?  Wait for it...


It depends.


In other words, just like seemingly every Essbase truism out there, you must think about what you are trying to do and then try different approaches.  I wish, wish, wish I could make a blanket statement and say, “Report scripts are faster” or “MDX queries are faster” but it simply isn’t so.


I will say that if I could get MDX output into something other than MaxL, I think MDX would beat report scripts each and every time.  But I have spent waaaaay more time on this than I expected and maybe if I can get a Certain Third Party Tool to play along, I can prove or disprove this hypothesis.  Or maybe someone else would take the output from MDX via the JAPI and throw it against a text file.  In any case, MaxL’s single threaded nature on log output (which is how MDX writes to disk) is slow, slow, slow.


I have to stop listening to that guy – the above is two full days of testing and coding to try to see if he was full of beans or not.  And in the end the results were…inconclusive.  Such is life.

This blog post brought to you by the John A. Booth Foundation for Impecunious Independent Consultants

No, I am not crying poverty, but know that the times and tests are because of the generosity of John who is allowing me to use one of his slower servers (yes, servers, and yes, it is very nice that he is willing to share it with me, and yes, it is kind of odd that he owns several real honest-to-goodness Essbase servers but he is just that kind of geek) for the testing.


Yes, I could have and should have tested this on AWS, and it isn’t as though I am a stranger to making Amazon even richer, but I thought since I did my first tests with extracts on one of his servers, I should keep on doing just that.  


Thanks again, John, for the  use of your Essbase box.


Be seeing you.

I cover the Antipodes

Okay, technically they’re only the Antipodes if you live in England. And, if you look at a globe, it’s easy to tell that this is just a figure of speech, not a direction for a Journey To The Center Of The Earth. In fact, near as I can tell, China fits for the States if I were to transfer the analogy to where I live. And that makes sense because that movie about Three Mile Island was called The China Syndrome and not The New Zealand Syndrome. (True story – I can remember as a kid my parents sitting at the kitchen table trying to figure out where to bug out if York, PA became a radioactive wasteland. Fun times, fun times. This stuff is safe, right? Riiiiight.) Have I lost everyone? Hopefully not, because there is good stuff to come.

Anyway, I am not in the nuclear power industry (and that is a good thing given my sometimes decided lack of attention and focus) nor am I going to China, but yr. obdnt. srvnt. is going to both New Zealand and Australia for two conferences. Yes, I am a glutton for punishment but I was asked and I said “Yes” before anyone could change his mind.

New Zealand
The New Zealand Oracle Users Group has its conference every 18 months. NZOUG 2013 is from 18 to 19 March 2013 in Te Papa, Wellington, NZ. In theory, I was the content chairman for the BI and EPM track at this conference but I have to admit that this really meant that I bugged, bothered, and pestered Erica Harris, Richard Philipson, and what seems like most of Oracle Australia/New Zealand (thanks to Kay Galbraith and Daniel O’Brien) with trying to figure out what would be appropriate content for NZ and who oh who would present. They did a great job identifying people to speak. NZOUG does quality work and their agenda is very strong. I plan on checking out the other tracks (something I never seem to be able to do at Kscope) while I am there as well as presenting two sessions, one on ODI and data quality (hey, come to NZ or buy my book and read my chapter on this) and the other, excitingly, on Dodeca. Now I just have to finish writing it.

Check out the agenda here.

Here’s what’s planned for BI and EPM:

The important bits are: NZOUG 2013 is from 18 to 19 March in Te Papa, Wellington and costs a mere 795 +GST NZD if you are a member and register under the early bird scheme. Read the full agenda – there’s amazing value for money.

Australia
Ah, another country, and a slightly different group of people to exasperate, although in this case it’s fellow board member and Oracle ACE Bambi Price that I think I annoyed the most and of course Oracle Australia (hi, Kay, and yeah, I owe you). Again, I helped out with the agenda and yes, I have written about this before for the ODTUG blog where you can read all about it.

This is an ODTUG Seriously Practical conference (NZOUG is their own full Oracle product line show, I am just there to present and help with the BI and EPM content selection) and as such will focus on a deep dive into the technical end of the BI and EPM tools. Yes, I am presenting the same two sessions at this conference and no, there will not be many NZers (I just made that word up as “Kiwis” is a bit twee) in Melbourne so I don’t view this approach as a rerun. More like a keep-Cameron-on-the-ragged-edge-of-sanity-because-he-takes-too-much-on approach.

Check out the agenda here.

Here’s what’s planned for BI and EPM:

The important bits are: the ODTUG SP Australia is from 21 to 22 March in Melbourne and costs a mere 599 AUD. Read the full agenda – there’s amazing value for money. Again. 

This is pretty exciting stuff
Okay, the flight in economy from home to NZ to Aus to NZ to LA to home is not exciting. At all. But helping out with BI and EPM geeks on the other side of the world is exciting. Yes, they have odd sounding accents (of course to them I’m the one with the weird way of pronouncing things and the incomprehensible slang) but their passion and commitment to technical knowledge, sharing, and evangelism is just like what you see here in the States with ODTUG’s events. I’m beyond happy and proud to help out and I’m hoping that both events will be a great success.

Thanks to the magic of Google Analytics, I know that both NZ and Australia read this blog. Australasians, if you have ever wondered what kind of idiot I am in person, now’s your chance. :) Seriously, they’re both good presentations and you can always go get a cup of tea (ah, real tea, I wonder what the NZ/Aus. version of Typhoo or PG Tips is) if I prattle on too much. I hope that you’ll be able to come to the conference that is closest – as you can see from the above there’s really some great content on offer.

Be seeing you.

Stupid Planning queries #11 -- Where and what are my Smart Lists


Where oh where has my Smart List gone?

Smart Lists are a wonderful thing. Well, that may be slightly exaggerating their usefulness but if you want to create a drop down list in a Planning Account (or other dimension but I have never seen it) Smart Lists are the way to go. Come to think of it, they are a bit of a Hobson’s Choice (a truly fantastic movie) if you want dropdown lists in Planning forms.

Planning even gives you a great way to view the Smart Lists in your Planning application by simply going to the Administration->Manage->Smart Lists menu.

Here is a (rather short) list of Smart Lists in my sample Planning app:

Live and in person
This is a very silly and pointless form that nevertheless shows a Smart List with nothing selected.

Clicking on the down arrow:

Selected Yes


Saved to Essbase


FWIW, if I pulled the Account YesOrNo in Essbase using a Smart View ad hoc analysis link, I would get a 1 in that cell as, bizarrely, Smart Lists do not resolve to Essbase Text Measures. I will try not to think about why that is the case as their functionality is the same. Different development groups is the best explanation I can think of but it is frustrating. Onwards, regardless.
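
If you ever need to translate that stored 1 back into its label, the entries behind a Smart List live in HSP_ENUMERATION_ENTRY (the same table that “Query the second” later in this post lists in full). Here is a quick sketch – the Smart List name 'YesNo' is an assumption for illustration, so substitute your own:

/*
    A sketch only:  map the 1 that Essbase stores back to the Smart List
    entry's name and label.
    NB -- the Smart List name 'YesNo' is assumed; use your own.
*/
SELECT
    EE.ENTRY_ID,
    EE.NAME,
    EE.LABEL
FROM
    HSP_ENUMERATION_ENTRY EE
INNER JOIN
    HSP_ENUMERATION E ON E.ENUMERATION_ID = EE.ENUMERATION_ID
WHERE
    E.NAME = 'YesNo'
    AND EE.ENTRY_ID = 1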

So all of this is great, and more than a little basic
Yes, I know, what is there to query if you just created the Smart List? Well, in the case of this sample Planning application, there is no real reason to query much of anything as we know what the Smart List is and what member it’s tied to.

But what if you didn’t know what member the Smart List was assigned to? How would you know? As far as I can tell, there is no magic report in Planning that will give out this information.

A different story
And what if you were working on a Planning migration/modify project (ahem) and the Planning application had 31 Smart Lists, and somehow the association of Smart List to member got lost (oh, thank you accursed EPMA), and you had to go back to the original Planning app to figure out what goes where? What would you do then? Scream? Cry? Curse your bad luck? Or how about write a query that looks just like this?

The query

/*
    Purpose:     Figure out what Smart Lists are assigned to which Members
    Modified:    23 February 2013, Cameron Lackpour
    Notes:       This is a nice and easy one, isn't it?
*/
SELECT
    O.OBJECT_NAME AS 'Member',
    E.ENUMERATION_ID AS 'SL #',
    E.NAME AS 'Smart List'
FROM
    HSP_MEMBER M
INNER JOIN
    HSP_ENUMERATION E ON M.ENUMERATION_ID = E.ENUMERATION_ID
INNER JOIN
    HSP_OBJECT O ON O.OBJECT_ID = M.MEMBER_ID


And that produces a result set like this:


That hypothetical Planning application I mentioned above? Would you believe 31 Smart Lists of which 14 were actually assigned? Yup, 17 dead Smart Lists. Isn’t application maintenance a stinker? Apparently so.
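
As an aside, finding those dead Smart Lists is just the reverse of the join above. Here is a sketch, using the same tables and columns as the query in this post, that lists every Smart List with no member assigned to it:

/*
    A sketch only:  Smart Lists that are not assigned to any member,
    i.e., the "dead" ones.
*/
SELECT
    E.ENUMERATION_ID AS 'SL #',
    E.NAME AS 'Smart List'
FROM
    HSP_ENUMERATION E
LEFT OUTER JOIN
    HSP_MEMBER M ON M.ENUMERATION_ID = E.ENUMERATION_ID
WHERE
    M.MEMBER_ID IS NULL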

Everything you ever wanted to know about a Smart List
Above I joined HSP_ENUMERATION (why wasn’t the table called HSP_SMARTLIST?) to HSP_MEMBER to get the link between member (in any dimension) and Smart List. But what if you just wanted a quick review of everything that ever made up a Smart List?

Query the second

/*
    Purpose:     Smart List contents by name
    Modified:    23 February 2013, Cameron Lackpour
    Notes:       Another Nice 'N Easy one.
*/
SELECT
      E.NAME,
      EE.ENTRY_ID,
      EE.NAME,
      EE.LABEL
FROM
      HSP_ENUMERATION_ENTRY EE
INNER JOIN
      HSP_ENUMERATION E ON EE.ENUMERATION_ID = E.ENUMERATION_ID
And that produces a result set like this:


And that’s it
I have to say that I wrote this blog post because I needed to get that list of Smart List associations to members and simply couldn’t find it on the series of tubes that make up the world wide web.   I’m sure it exists, somewhere, or maybe it was just so easy no one bothered to post it. Regardless, now world+dog has it.

I will note again what an incredibly helpful thing it is to write these queries – I cannot imagine going through each one of the Accounts in the application I am talking about (over three thousand across multiple Plan Types) trying to find the silly things – I’d have completely gone off my rocker (although I will admit it might be hard to spot when that happens) and I would have spent a *long* time trying to figure out where the non-assigned 17 Smart Lists should have been. Which was nowhere, thanks to the query. SQL saves the day yet again.

Be seeing you.

Stupid Planning query #12 -- Calculation Manager Rights


Introduction

Arrrgh, this one really got me going. Actually, I have noticed that I have never written a query against the Planning tables unless I am unable to get whatever it is out of Planning easily. And I suppose that sort of makes sense. And I suppose that also means I am always annoyed, and that missing Planning reporting features are my opportunity to increase my SQL skills, such as they are. Of course writing the query took way longer than I thought it would. Read on for the reason...

With that preamble, have you ever wondered what security is assigned to Calc Mgr rules in Planning? You basically have to go into each rule and edit the security to see what user or group has been assigned and what rights are set. Annoying, isn't it? And not practical when there are many rules. Here's an example of what it looks like:

I can see that the group PLN_CalcTest_Consol has Launch access to the rule AggAll, but what about AggPlan, CalcRev, etc., etc., etc.?

So yes, this is yet another opportunity to query the tables. And oh yes, I am using this as a teeny part of the Planning presentation I am giving with Jessica Cordova (hi, Jessica) at Kscope13.

NB – One other note, I was inspired to get around to this query in response to “vaio” and his Hyperion Business Rules security query from this Network54 thread: http://www.network54.com/Forum/58296/thread/1362011042/Export+only+Webform+security I figured if world+dog had it for EAS’ business rules, we needed it for Calc Mgr as CM is all there is from 11.1.2.2 onwards.

The reason this query drove me up the wall

Would you believe that deployed Calculation Manager rules do NOT have an object type in Planning? Would you believe that I spent more than a little bit of time trying to find it?

Oh yes, both statements are true. The latter one you are going to have to take on trust. The former I can prove.

Here are the Object Types in the Planning app schema:
SELECT DISTINCT
    OT.OBJECT_TYPE,
    OT.TYPE_NAME
FROM
    HSP_OBJECT_TYPE OT
OBJECT_TYPE   TYPE_NAME
1             Folder
2             Dimension
3             Attribute Dimension
4             Calendar
5             User
6             Group
7             Form
8             FX Table
9             Currency
10            Alias
11            Cube
12            Planning Unit
30            Attribute Member
31            Scenario
32            Account
33            Entity
34            Time Period
35            Version
37            Currency Member
38            Year
45            Shared Member
50            User Defined Dimension Member


And here is the OBJECT_TYPE that goes with Calc Mgr rules:
SELECT
    *
FROM
    HSP_OBJECT
WHERE
    OBJECT_TYPE = '115'

See the 115? I only know that because I did a search on the name of one of the rules and thus figured it out.
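
If you want to reproduce that bit of detective work, the search is nothing more than a lookup on HSP_OBJECT by rule name. A sketch – 'AggAll' is simply one of the rules shown in the screenshot above, so substitute any rule you know is deployed:

/*
    A sketch only:  find the OBJECT_TYPE of a deployed Calc Mgr rule
    by searching on its name.  'AggAll' is one of the rules above.
*/
SELECT
    OBJECT_ID,
    OBJECT_NAME,
    OBJECT_TYPE
FROM
    HSP_OBJECT
WHERE
    OBJECT_NAME = 'AggAll'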

Why would you care about OBJECT_TYPE 115?

Well, once you (or I) knew this, you (or I) could write this:
/*
    Purpose:     Calculation Manager security report by rule, group, and user
    Modified:    1 Feb 2013
    Notes:       Common Table Expressions make joining mostly disparate objects
                 relatively easy.
                 NB -- Calc Mgr rules do NOT have an entry in HSP_OBJECT_TYPE.
                 The OBJECT_TYPE seems to be 115.
*/
-- I am in love with CTEs over subqueries
WITH
-- CTE for Calc Mgr OBJECT_ID, Plan Type, and Name
BRName (CMID, PlanType, BRName) AS
(
    SELECT
        CMR.ID,
        CMR.LOCATION_SUB_TYPE,
        O.OBJECT_NAME
    FROM
        HSP_CALC_MGR_RULES CMR
    INNER JOIN
        HSP_OBJECT O ON O.OBJECT_ID = CMR.ID
),
-- CTE for Calc Mgr user OBJECT_ID, Calc Mgr OBJECT_ID, and Launch rights
BRAccess (UserID, CMID, Launch) AS
(
    SELECT
        AC.USER_ID AS 'User ID',
        O.OBJECT_ID AS 'CM Obj ID',
        --O.OBJECT_NAME AS 'CM Name',
        CASE AC.ACCESS_MODE
            WHEN -1 THEN 'No Launch'
            WHEN 4 THEN 'Launch'
            ELSE 'Unknown'
        END AS 'Access'
    FROM
        HSP_ACCESS_CONTROL AC
    INNER JOIN
        HSP_OBJECT O ON O.OBJECT_ID = AC.OBJECT_ID
    WHERE
        O.OBJECT_TYPE = '115'
),
-- CTE for user OBJECT_ID, user name, group name
UsersInGroups (UserID, [User Name], [Group Name]) AS
(
    SELECT
        --O.OBJECT_ID AS 'User ID',
        O2.OBJECT_ID AS 'CM Obj ID',
        O.OBJECT_NAME AS 'User Name',
        O2.OBJECT_NAME AS 'Group Name'
    FROM
        HSP_USERS U
    INNER JOIN
        HSP_OBJECT O ON O.OBJECT_ID = U.USER_ID
    INNER JOIN
        HSP_USERSINGROUP UG ON UG.USER_ID = U.USER_ID
    INNER JOIN
        HSP_OBJECT O2 ON O2.OBJECT_ID = UG.GROUP_ID
)
SELECT
    BRN.BRName AS 'Calc Mgr rule',
    BRN.PlanType AS 'Plan Type',
    BRA.Launch AS 'Launch',
    UIG.[User Name] AS 'User name',
    UIG.[Group Name] AS 'Group name'
FROM
    BRAccess BRA
INNER JOIN
    UsersInGroups UIG ON UIG.UserID = BRA.UserID
INNER JOIN
    BRName BRN ON BRN.CMID = BRA.CMID
ORDER BY
    BRN.BRName, UIG.[Group Name], UIG.[User Name]

And then you (or I) could run the above query, and get the following:
Calc Mgr rule   Plan Type   Launch      User name      Group name
AggAll          Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
AggAll          Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
AggAll          Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
AggPlan         Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
AggPlan         Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
AggPlan         Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
CalcRev         Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
CalcRev         Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
CalcRev         Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
ClrBS           Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
ClrBS           Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
ClrBS           Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
ClrFinal        Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
ClrFinal        Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
ClrFinal        Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
ClrTrgts        Consol      No Launch   TestPlanner1   PLN_CalcTest_Consol
ClrTrgts        Consol      No Launch   TestPlanner2   PLN_CalcTest_Consol
ClrTrgts        Consol      No Launch   TestPlanner3   PLN_CalcTest_Consol


Isn't that pretty? And useful? I (or you) think so.
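
One caveat and a possible extension: because UsersInGroups walks HSP_USERSINGROUP, the report above only picks up Launch access that was provisioned through a group, which it then expands to every user in that group. If you also grant access directly to individual users (an assumption about how your security might be provisioned; my test app only uses groups), a minimal sketch of the companion query looks like this, using the same tables and leaning on OBJECT_TYPE 5, which the Planning type list above says means User:

SELECT
    RO.OBJECT_NAME AS 'Calc Mgr rule',
    UO.OBJECT_NAME AS 'User name',
    CASE AC.ACCESS_MODE
        WHEN -1 THEN 'No Launch'
        WHEN 4 THEN 'Launch'
        ELSE 'Unknown'
    END AS 'Launch'
FROM
    HSP_ACCESS_CONTROL AC
-- RO = the rule's row in HSP_OBJECT (type 115 = deployed Calc Mgr rule)
INNER JOIN
    HSP_OBJECT RO ON RO.OBJECT_ID = AC.OBJECT_ID
-- UO = the grantee's row in HSP_OBJECT (type 5 = User)
INNER JOIN
    HSP_OBJECT UO ON UO.OBJECT_ID = AC.USER_ID
WHERE
    RO.OBJECT_TYPE = '115'
    AND UO.OBJECT_TYPE = '5'
ORDER BY RO.OBJECT_NAME, UO.OBJECT_NAME

If that comes back empty, all of your Calc Mgr rule security is group-based and the main query above is the whole story.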

The sting in the tail

Would you believe I spent a good half hour poking around in the Calculation Manager tables?

Would you believe that the CALCMGROBJECTACCESS table in the Calc Manager schema is completely empty? I have no idea what it is for, but it isn't for rules deployed to Planning. Terrific.
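
In case you want to see the tumbleweeds for yourself, the check is a one-liner. Note that this runs against the Calculation Manager repository schema, not the Planning application schema:

SELECT
    COUNT(*) AS 'Row count'
FROM
    CALCMGROBJECTACCESS

On the instance I was poking at, that count is a big fat zero.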

But the good news is that with a little poking about and a simple (well, CTEs aren't totally beginner's stuff, but they are so easy to read and work with) query, you (or I) can easily see who has access to deployed CM business rules and what kind of access those planners and groups have.

Be seeing you.

Australians, So where the bloody hell are you?

Wow, that is a pretty awful tourism slogan, isn't it? I figured if you lot inflicted it on the rest of the world, turnabout is fair play. With luck, I will be met with chants of "Yankee go home!" In the interest of good international relations and not getting pelted with eggs when I mount the podium at the just-about-here-why-haven't-you-registered-yet ODTUG SP Australia conference, Melbourne 21 to 22 March, let me note that we too have real clunkers like "Erie, Pennsylvania…Feel the Lake Effect". (I've been in Erie, PA during the winter and trust me, you don't want to experience Lake Effect.) Happily, the world shares equally in bad marketing slogans, so we can all engage in a little schadenfreude as we read these bon mots. There, international enough for you?

But there is a point to that now discarded Australian tourism slogan – why oh why oh why have you Australian EPM and BI practitioners not signed up for ODTUG's totally awesome Seriously Practical conference, Melbourne, 21 to 22 March, which is just the end of next week? Huh? Why? Why? C'mon, give me a good reason. You can't, can you?

Don’t believe me?  Check out the agenda

I will note that this Seriously Practical conference has some seriously awesome content. Again, don't believe me? Take a look at the content below. What are you waiting for? Me begging on my knees? You have it, metaphorically. A fantastic agenda? Cast your eyes downwards and learn more.

Day 1

Welcome and Opening Remarks, Bambi Price, ODTUG
8:45 - 9:00 AM

Keynote: What’s Coming in Oracle BI and EPM, Babar Jan Haleem, Oracle Corporation
9:00 - 9:30 AM
A glimpse into the future of Oracle BI and EPM, delivered by the Director of EPM BI Architecture & Technology for APAC.

SESSION 1
9:30 - 10:30 AM
Fusion Applications and Your BI/EPM Investment, Debra Lilley, Fujitsu
Oracle Fusion Applications are here today, providing the next generation of applications. They are about having everything the user needs in one place, and that includes information. Fusion Applications is a window on Oracle's Fusion Middleware stack, and a very big part of that is BI/EPM and analytics. This presentation will include a small demo of how Fusion looks and is designed to give you an appreciation of how BI/EPM is embedded in Fusion. For anyone thinking of Fusion in the future, it will underline that your BI/EPM investment today is an investment in that future and is protected.

SESSION 2
10:45 - 11:45 AM
Highlights and Capabilities of the Latest Release of EPM, Charles Pinda, Oracle Corporation
Enterprise Performance Management (EPM) comprises four streams to assist the Office of the CFO in delivering predictable financial results. These streams are Strategy Management, Planning and Forecasting, Profitability Management, and Financial Close. In addition, the reporting delivery layer of EPM is supported by Oracle BI to provide further analytical insights into this information. Now the latest release of EPM, version 11.1.2.2, has new functionality to enhance the business process of delivering predictable results. This session will highlight some of these enhancements and demonstrate the new capability.

LUNCH
11:45 AM - 12:45 PM

SESSION 3
12:45 - 1:45 PM
Endeca Information Discovery, Stephen Weingartner, Oracle Corporation
Endeca Information Discovery (EID) provides unique and powerful analytical capabilities that enable organisations to discover insights in information that would otherwise be unusable. EID’s strengths in unstructured data, agile business intelligence, and information discovery will be demonstrated and discussed. Several EID use-cases will be covered to illustrate the wide variety of solutions which have been implemented at several large organisations. During this presentation, a solution will be created in EID, demonstrating how it works in an end-to-end manner. Attendees will learn how EID differs from other business intelligence and big-data technologies and how it has created its own new niche which companies can fill.

SESSION 4
2:00 - 3:00 PM
Exploring Oracle BI Apps: How Does it Work and What Do I Get, Richard Philipson, James & Monroe
This presentation provides an overview of the BI Apps architecture for novice users, with clear, concise information presented in an easy-to-understand format. The presentation steps through the aspects of an implementation from conception to execution, concluding with example content for one of the many content areas, Financial Analytics.

SESSION 5
3:15 - 4:15 PM 
Thoughts from the Frontline – Issues and Opportunities Faced When Implementing or Upgrading HFM Applications,
Christine Aird, M-Power Solutions

This session will walk you through the project lifecycle of an HFM implementation/upgrade and cover key areas where problems/opportunities regularly occur. Using real life case studies, interspersed with best practice processes and approaches, the session will give you an insight into how you could avoid the challenges and take advantage of possible opportunities during your project. The session will touch on how we see HFM being used, will cover some of the misconceptions that follow HFM, and will drill into the potential issues this can cause and what is considered best practice for an HFM implementation.
This session will give you great insight into what you should use HFM for and how to deliver a successful project or upgrade.

SESSION 6
4:30 - 5:30 PM
The Spreadsheet Management System Known as Dodeca, Cameron Lackpour, CL Solve
Business users love Essbase for its unparalleled analytic power. Business users also love Excel because spreadsheets are where data is expressed, analyzed, and manipulated. Essbase + spreadsheets = analytic bliss.
But as soon as you move beyond ad-hoc Essbase functionality, a series of questions arise:
1) How do you handle complex functionality?
2) Code yes, but where, and how?
3) What about non-Essbase data?
4) Workbooks on the web?

What’s needed is a system that:
1) Is spreadsheet-centric
2) Ties easily to SQL
3) Automatically distributes and updates workbooks

Dodeca does all of the above, and more; it is the key to managing complex workbooks so you and your company can focus on the real task at hand: analyzing, understanding, and managing the numbers, not the spreadsheets. This presentation introduces the issues around spreadsheets and Dodeca's philosophy on managing multiple complex workbooks, and then demonstrates what Dodeca can do.

Day 2

SESSION 7
9:00 - 10:00 AM
Taking OBIEE to the Next Level, Maneesh Disawal, James & Monroe
Move over pivot tables and charts and incorporate Exalytics with KPIs, scorecards, maps, and advanced analytic functions into your regular reporting. Use the latest infrastructure and visualization techniques to create dazzling dashboards to quickly and directly communicate relevant information. Incorporate data in office communications and deliver reports on mobile devices.

SESSION 8
10:15 - 11:15 AM
Essbase ASO – A Brave New World in Australia but not for the Rest of the World, Steve Hitchman, m-power Solutions
Many of you will be familiar with the concept of Essbase ASO / Aggregate Storage. You've probably read about the differences from Essbase BSO, seen a case study, or used it on a project, but even though ASO has been available for over 5 years, adoption in Australia continues to be very limited. In this session, we will cover the basics of what ASO is before exploring what's so great about it and how it is being used to help companies in Australia. For lovers of the more traditional BSO model, we'll explain the differences, highlighting what you can and can't do in ASO models. This initially seems like a lot, but we'll pass on the tips and tricks that we've learned that allow this gap to be bridged, including how ASO and BSO can work together.
We’ll highlight the real world stuff that goes beyond the textbook and Oracle marketing to expose how ASO technology is revolutionising what Essbase can deliver.

SESSION 9
11:30 AM - 12:30 PM
Oracle BI and Oracle Essbase: Today and Tomorrow, Stephane Roman, Oracle Corporation
Essbase and OBIEE have come a long way together since they met years ago. Being both strategic solutions, Oracle have continuously improved their specific technologies while making the integration more and more seamless at the same time.
In this session, we will first of all give an overview of the OBIEE & Essbase architectures and explain their typical usages within the enterprise. We will then open the bonnet and look at how the two solutions work together. How is the Essbase multi-dimensional structure (outline) understood by the OBIEE semantic layer (Repository)? How can I federate relational and multi-dimensional data sources through the OBIEE RPD's logical layer? What are the different ways to import and model an Essbase cube in OBIEE? How do you work with unbalanced hierarchies? Are all Essbase features available through OBIEE (UDAs, Variables, Levels vs Generations…)?
Finally, we will look at how upcoming releases of OBIEE & Essbase will make the two solutions even more tightly integrated, as well as take a glance at how both are used with new Oracle Applications.

LUNCH
12:30 - 1:30 PM 

SESSION 10
1:30 - 2:30 PM
Slay the Evil of Bad Data in Essbase with ODI, Cameron Lackpour, CL Solve
Everyone knows that bad data kills Essbase applications. But did you know that bad data can also kill careers, consulting engagements, and company-wide labor agreements? Why is the rule of high-quality data in Essbase honored more in the breach than the observance? This session explains the real-world consequences of bad data, categories of data quality, and tools and strategies for ensuring that your Essbase database has the right data. A complete solution in ODI will show one path to salvation in the never-ending quest for quality data.

SESSION 11
2:45 - 3:45 PM 
Growing with Business Analytics - Keeping Updated and Informed, Paul Anderson, Oracle Corporation
What's the fastest way to find out about product certification and compatibility? Which social media channels are available for EPM and BI products? These and many more questions come up constantly in a busy, growing business world that has made the Internet highway part of our lives.
There have been recent changes, both significant improvements to and new implementations of the ways in which customers can communicate with Business Analytics support.
This presentation will run through changes made by the Oracle Business Analytics Proactive Team to areas including MOS Communities, knowledge management, and social media.
This session will show the way you can ask questions and find answers about Product Lifecycle, how to interact with Business Analytics support in a simple and efficient way, how to keep updated with news on areas such as patches and documentation, and will provide a demo of the new translations being introduced into My Oracle Support.

SESSION 12
Closing Panel
3:45 - 4:30 PM
Do you have a question about BI or EPM? Your BI and EPM speakers are here to give answers. This session is moderated but is expected to be freewheeling and open. Try to stump us!

Whew, is that enough?

What oh what oh what are you waiting for? Read what's on offer – that is great stuff, packaged for, and (mostly) delivered by Australians (hey, I have incredibly distant relatives somewhere in Victoria, so do I get a partial pass on the Yankee Go Home protests?). That means the content is targeted to your needs and your market. What more could you possibly want? Sign up today.

Be seeing you in Melbourne.