Channel: Cameron's Blog For Essbase Hackers

Where in the world is Cameron, days 3 and 4, Australasian edition


Wait, what happened to day three?

What happened was:


  1. I woke up really early
  2. I flew on an Air New Zealand plane to Melbourne
  3. I saw Bambi Price’s house and cruised round Melbourne in a jeep with Bambi’s husband, Fred Price
  4. I bought lots of real Cadbury’s chocolate (what we get in the States looks like Cadbury, but it is a Cruel Joke upon the tasting) to take home
  5. Lay down on the hotel bed “for a minute” before I went out for a beer and woke up the next morning

Okay, so what about day four?

I helped Bambi and Fred set up the conference room at Swinburne University (exciting pictures to be inserted as soon as I can find my boat anchor of a phone), had a flat white, and then proved that Cameron and Fred Do Not Do Networking as we tried, somewhat fruitlessly, to connect to the wireless network. Yes, you are reading this, so we are not hopelessly bad at this.

Here we are setting the room up. Oracle Ace Directors do it all, including moving furniture as required.
 

And all of this was for…

The ODTUG Seriously Practical Australia conference, natch. Yes, that link you see (go on, click on it and be surprised, and maybe just a little sad that you aren’t here) is the agenda, and yes, this is Exciting Stuff. We are bringing the same great focus and depth to Australia as we do to Kscope in the States.

Babar Jan Haleem, What’s Coming in Oracle BI and EPM, 9:00 am, Australian Central Daylight Time (ACDT)

Babar is giving the same (well, the same if you were at NZOUG 2013) session he gave at, wait for it, NZOUG 2013, but as that universe is pretty darn small, it is a fresh presentation to most.


My takeaway – I can’t wait to get EPM into the cloud at a client. I do it all the time from a development/self-training/generally-mucking-about perspective, but that’s a completely different thing from actually running an EPM implementation in the cloud. Let there be no more missed implementation schedules because of install problems!


Debra Lilley, Fusion Applications and Your BI/EPM Investment, 9:30 am to 10:30 am ACDT

Debra just said that Hyperion (aka EPM) is “exciting” and that therefore, I am exciting. She also says that she is not technical. Hmm.

And what is the calculation engine behind Fusion? Why, it’s Essbase. And it’s transactional. You know, the thing that we Essbase developers were Never To Do. Could these be “headless” ASO Essbase databases? Could be.

Debra’s getting a bunch of questions – I’m really glad to see this interaction although she might feel a bit like a trooper serving under General Custer at the moment. What am I talking about, she can more than handle herself. Not technical? Hmm.

Fusion Reporting and Analytics – Oracle Transactional BI (OTBI), Oracle BI Applications (OBIA), and Specialized Analytics. The last bit is all built on Essbase – Essbase is the aggregation engine. How cool is that?

Charles Pinda, Delivering Your Financial Results Better with Oracle EPM 11.1.2.2, 10:45 am to 11:45 am ACDT

Charles does a great job – the 11.1.2.2 functionality that I wish was in “normal” Planning was Decision Management. It’s part of PSPB (Public Sector Planning and Budgeting) and is, in a word, awesome. I can’t even find it documented although I’m sure that exists. It’s a way to collect all of the text, comments, justification, etc. around a budget. It is So Cool.

Endeca Information Discovery, Stephen Weingartner, 12:45 pm to 1:45 pm ACDT

A small world, indeed – I am working (you may or may not be surprised to note that this week is not a normal work week for me) at a client in St. Paul, MN. And Stephen is from…Minneapolis, MN. If you’ve heard of the Twin Cities, you’ll know that St. Paul and Minneapolis are practically one city. As the saying goes, what are the chances?


Beyond odd coincidences, Stephen is here to talk about Endeca. The more I hear about this tool, the more interested I become. Or maybe I finally understand the value of unstructured data and how it might be analyzed. It sure isn’t Essbase, although Essbase can be fodder for Endeca. Its ability to comb through public data and make sense out of it all is intriguing.


Stephen showed a Twitter data source analysis (Dan O’Brien at NZOUG 2013 did much the same, but on the #NZOUG and #NZOUG2013 hashtags) based on the political turmoil here (I barely understand American politics, so look up Gillard and Abbott on your own; I pick no sides) – all public data, all real time, all Real Cool.
 

Richard Philipson, Exploring Oracle BI Apps: How Does it Work and What Do I Get, 2:00 pm to 3:00 pm ACDT

Richard is fielding questions about why Informatica is used instead of ODI. Always a fun moment when one has to defend Oracle’s product decisions, but of course he’s doing fine. For the record, everyone wants ODI (but of course) and expects that it will come soon.
Overall, the BI Apps are pretty cool although certainly not simple. OTOH, they hook into the Oracle applications with a moderate amount of pain and complexity. There is a lot of functionality and flexibility in-built to the tools.
And oh btw, slowly changing attributes, the semi-holy grail of Essbase that is kind of, sort of there, is easily displayed in BI Apps.

Christine Aird, Thoughts from the Frontline – Issues and Opportunities Faced When Implementing or Upgrading HFM Applications, 3:15 pm to 4:15 pm ACDT


Another presenter who claims he (or I suppose she) “isn’t the least bit technical”.  I wonder if Australian English (almost as painful a term as American English) defines technical as “more than the Septics would do”.  She’s a geekette, but just doesn’t know it.  Or maybe that is admit it.  Why do I say this?  Because what she describes as her project work is what I do, and I think I’m technical.  I could be wrong about what I do – it wouldn’t be the first time.

Christine is taking us through the various stages of project implementation, what HFM is good for versus Planning, and general good practices around HFM implementations.

Yr. Obdnt. Srvnt., The spreadsheet management system known as Dodeca, 4:30 pm to 5:30 pm ACDT


I get to talk about my very favorite Essbase front end in the whole wide world.  Talking about Essbase is always a good thing.  Talking about Dodeca is always a good thing.  Talking about both is just perfect.

Would someone in Australia please buy this thing?  No, I don’t want the work, I just want Dodeca to plant its flag on yet another continent.  After all, trade follows the flag.

And after me, the bar


At Beer DeLuxe – what a nice way to end a long and useful ODTUG Seriously Practical first conference day.

Keep tuned
There’s more to come, including my Brush With Celebrity, but that is most definitely a case of “If there aren’t any pictures, it didn’t happen.” Oh yes it did happen and yeah, I have the photos as you can see below. I think we all know who this is.

Stanley looks like he’s been through a lot. He’s also a bit smelly. But then if you read his adventures and travails at his very own blog, it all makes sense. And yes, that is my brush with fame. Also, yes, that is a bullet hole, but Stanley marches on. He is made of Stern Stuff.



Goosebumps is the only way to describe how I feel. :)

Be seeing you.

Where in the world is Cameron, day 5, Australasian edition


Not the beginning of the end, but perhaps the end of the beginning

I’m not one to quibble with WSRC, but the ODTUG SP Australia conference is at day two of two and that means that I am finally coming to the end of Cameron’s Most Excellent Australasian Conference Adventure. It figures that my body is finally sort of, kind of, used to the time zone difference because I will be jetting away tomorrow. It took me a week to get used to the time here and I expect another week of sleepiness when I get home to the States. OTOH, I have lots of real Cadbury chocolate (see yesterday’s rant on the stupidity of US chocolate manufacturing practices – Something Must Be Done), I have really had a great time here, and learnt quite a bit about BI and EPM.

But none of the above really matters – what does matter is: did the conference attendees get value for money? Given the depth and breadth of the sessions, the passion that the presenters brought to their sessions, and the high technical level of the presentations (despite protestations to the contrary re “being technical”), I’d argue that yes, the attendees got their money’s worth, and more.

I was a little apprehensive about helping select sessions (read: beg Oracle Australia, James & Monroe, M-Power, Bambi Price, and just about anyone I knew in Australia to help put together the speaker list – oh dear, I am now on the hook for repaying favors but it is all worthwhile) as the Australian market differs somewhat from the US of A’s. Yes, the market details are different, but at the end of the day we are all trying to solve the same problems with Oracle’s BI and EPM tools. The attendee survey will tell the tale (how could a BI/EPM conference not try to wrap metrics around an event?), but based on conversations I’ve had, I think it will be a solid win.

Taking OBIEE to the Next Level, Maneesh Disawal, 9:00 am to 10 am, ACDT

Maneesh is taking us through a definitely-not-standard approach to making OBIEE more useful. It’s nice to know that hacks aren’t just an EPM-only approach. And besides, a good hack isn’t a hack at all, but instead is Just Really Cool.

It’s interesting to see how much OBIEE overlaps with EPM – yes, yes, I get it, Oracle are bringing the two together, but still, it’s interesting to actually observe it. Most ODTUG conferences have me running round like a chicken with its head cut off. I am really enjoying actually being able to sit back and listen.

Another thing I am noticing about OBIEE is how IT-oriented it is. This isn’t a bad thing but it is evidence (if it were needed) that there is still quite the gap between the BI and EPM worlds. Their eventual merger will be interesting to watch.

Essbase ASO – A Brave New World in Australia but not for the Rest of the World, Steve Hitchman, 10:15 am to 11:15 am, ACDT

This session hasn’t occurred yet, but it’s up next. M-Power worked with my buddy Dan Pressman and utilized his Rules of ASO Essbase. I’m very excited to see what they have on offer. Update – the session is in progress right now.

Oh, this is embarrassing, but kind of awesome at the same time. Dan Pressman, ASO wizard extraordinaire, just had a slide devoted to him and His Really Big Brain. What else was part of the slide? Why, an advertisement for Developing Essbase Applications. Yes, it is a good book, and internationally loved.

Steve is going through the ASO design principles Dan has tried to hammer into my head:

  • No formulas, unless you must
  • Stored instead of dynamic hierarchies, or at least Multiple hierarchies enabled
  • No more + and - operators, instead just + and flip the data signs to get round the dynamic hierarchy
  • Gary Crisci’s MDX chapter in Developing Essbase Applications just got mentioned as well as a resource (Are Gary and Dan soon to become Australian citizens? Could be.)
  • Do the simple stuff in MDX, do the complex logic in BSO and import results into ASO
  • Alternate YTD hierarchies to come up with YTD values through ASO’s aggregation capabilities
  • Solve order to handle variances

Oracle BI and Oracle Essbase: Today and Tomorrow, Stephane Roman, 11:30 am to 12:30 pm, ACDT

Stephane is taking us all on a journey through Essbase and OBIEE integration in the next release of the OBIEE stack. They are To Become One.

Stephane is reviewing Sample.Basic aka My Very Favorite Essbase Database In The Whole Wide World (MVFEDITWWW). It’s nice to know that The Beverage Company’s business continues forward. Who wants to bet that when Sample.Basic was created that a bunch of Arbor Software developers sat round and said, “Eh, a good first effort, but we have to replace that with something better, but soon.” Soon never came.

Watching Stephane’s presentation, I realize that I have a career decision to make – am I going to jump on the OBIEE bandwagon to get a leg up on the tool or just passively wait for the Bus Named OBIEE to run me over? Maybe getting flattened will be pleasurable? Probably not. So much to do, so little time.

One thing that is funny about OBIEE (and why I personally think things aren’t quite there yet wrt product convergence) – it takes an Essbase database in all of its Essbase awesomeness and turns it into a logical star schema. That is…odd looking. I realize this is how OBIEE federates data but it is still a little jarring for an Essbase developer to see.

Slay the Evil of Bad Data in Essbase with ODI, Cameron Lackpour, 1:30 pm to 2:30 pm ACDT

This is the same presentation I gave at NZOUG 2013 and I always enjoy a chance to spread the ODI gospel. My solution doesn’t exactly use standard ODI functionality, but, as one of the attendees at ODTUG SP Australia likes to say, “There’s always a way round a problem”. ODI is great at enabling those ways around issues. I am a super fan of the tool.

Growing with Business Analytics - Keeping Updated and Informed, Paul Anderson, 2:45 pm to 3:45 pm ACDT

I finally get to meet Paul – I’ve seen his posts on the Business Analytics – Proactive Support web site.

Oracle Support are doing great things to make supporting the not-exactly-simple EPM tools easier. They are trying to preempt problems before they occur. One might argue that they are trying to put themselves out of business. I think that isn’t likely to happen any time soon, but it is beyond great to know that Oracle understands the importance of fixing problems asap and maybe even preventing them entirely.

Master Notes, feedback, Advisor Webcasts, patches, product certification, social media/My Oracle Support Communities (hint, use this in lieu of Service Requests when your problem doesn’t involve fully-engulfed-in-flames Essbase servers), product version certifications, patch communities, product specific communities (HFCM and Endeca), Remote Diagnostic Agent (RDA), whew, you get the idea. Oracle Support are doing a lot of interesting things.

Closing Panel, Richard Philipson and Cameron Lackpour, 3:45 pm to 4:30 pm

This is an anything-goes, hit-us-with-your-best-shot session. They are a lot of fun at Kscope and I am hoping that this will be more of the same. However, some topics, like do you prefer wheat or white toast, are beyond the pale. Okay, I kid, I kid, wheat every time. Rye vs. wheat will have to remain a secret. Everything else is fair game. :)

Keep tuned

Almost done – I will put in my final thoughts when the party’s over.

Be seeing you.

Out of the Past


Introduction

You’re all Robert Mitchum and Kirk Douglas fans, right? And all film noir fans, too? No? You really should give this film a try. To say that they don’t make ‘em like they used to is putting it mildly. I will spare you my rant on the vast empty wasteland that is modern entertainment and instead take you on a different journey into the past. One that, if you have knocked around the Essbase world long enough, may cause pangs of longing. What oh what could this be?

Remember me?

 
Sob, yes, it is Essbase Application Manager. Oh, AppMan, how I miss your consistent keyboard-driven functionality, your easy copy and paste of hierarchies into Excel, your easy-to-read outlines with nice clean text, and your simple and easy installation. And in your place, we have…EAS Console. Sigh.


I know that I am not the only one that misses AppMan. What, Cameron, more of your delusional thinking? Nope, I took that screen shot from the desktop of one of my current clients. Yep, Essbase 11.1.2.2, and a rather smart Hyperion admin realizes that she can better understand Essbase when viewing BSO databases through AppMan than when using EAS. Oracle, are you listening? AppMan = 1992 technology, and yet your customers prefer it. This isn’t the first client I’ve bumped up against that has kept a copy of Application Manager around to make Essbase easier to understand.


I wish I had an old copy of AppMan.exe kicking around. Like a fool, I dumped Essbase 6.5.x’s binaries as fast as I could. Maybe I shouldn’t have been quite so impatient to embrace the future.


Parting shot

Here’s another picture to make you go all weak in the knees.

AppMan makes even Sample.Basic look good. Or in old Essbase-speak, Sample:Basic. Remember the colons instead of the periods as delimiters?


That does it, I’m doing my next automation project in Esscmd instead of MaxL, just because I still can. Okay, maybe I won’t do that, but I am going to look at old CDs and see if maybe I really did keep a copy of My Very Favorite Essbase Editor In The Whole Wide World.


Be seeing you, maybe in the past.

Going under the covers with TRACE_MDX


Introduction

I don't know about you, but I used to use the Classic Add-In's Essbase Query Designer to give me a leg up in writing Essbase Report Scripts. As far as I know, there is no way to do that in Smart View (although I am not a super user of it, so corrections to this sentence please write in care of this blog).

Except of course when there is a way. How? Read on.

Some other blog you probably ought to be reading on a regular basis
I know I do, and you should, too. :)

Check out the Oracle Business Analytics (when did the EPM name go away?) Proactive Support post on How to track MDX queries against Essbase.

There is a new (how new is open to debate, I would say it’s been there all along and is just now getting released to world+dog because it works on my oldish AWS instance from John Booth’s Metavero blog) Essbase.cfg setting that will log Essbase MDX queries.


The setting is called: TRACE_MDX.

What does it do?

It logs the MDX query and how long that query takes. That’s it you say? Ah, but there is quite a bit of value in this as I will attempt to explain.

The setting

Read the link(s), or know that the Essbase.cfg setting is:
TRACE_MDX appname dbname 2

As always with Essbase, make the setting in either Essbase.cfg directly or via EAS (don’t forget to click the Apply button) and then bounce the Essbase service which of course is called anything but Essbase.

Here’s the setting for good old Sample.Basic:
TRACE_MDX Sample Basic 2

NB – On my EPM Windows instance, the Essbase service is called “Oracle Process Manager (EPM_epmsystem1)”. Intuitive, isn’t it? Umm, no.

Some errata

2 is the loneliest number
Fwiw, I tried 0, 1, 2, and 3, and only 2 seems to make anything get logged. Why the number 2? Why don’t those other values do something? Or is it that I just don't know how to set it? Time will tell.

Log location
The results of the MDX queries get dumped to ARBORPATH\appname\dbname\mdxtrace.log.

Does it actually contain the user name of the person/tool doing the pull? Why, of course not, that would be too easy. Sigh. You will have to build a cross-referencing table. All the more reason Oracle should adopt at least a few ideas from this thread:
http://www.network54.com/Forum/58296/thread/1364255484/Collaborate+-+Application+Utilization

Also, you may note that this log file doesn't exactly go into the normal ODL log location. Why?

Just for the record – I’m not sorry that this log exists, I just wish there was a consistent logging architecture. I can barely remember where these things are from version to version; I just wish Oracle would pick a plan and stick with it. Okay, rant over.
 
How does it work?
Smart View
I *thought* that Smart View used MDX to query Essbase. That may very well be (or maybe not), but ad hoc retrieves against Essbase do not generate any entry in the log. Bummer.

Execute MDX

However, you can use Smart View’s “Execute MDX” command and get a value in the log. For those of you not writing MDX on a regular basis (and bear with me, because I think this log is going to drive a lot of people who are not super experienced with MDX towards it in future), you get to that option by right clicking on the Essbase database (ASO or BSO, it doesn’t matter) and selecting “Execute MDX”.


A dialog box pops up, and you can enter your MDX directly (yes, I stole this directly from the Support blog, just wait, I am going to expand on it):


That produces the following result in Excel:


Mdxtrace.log will have the following (the query is in the log):
===============================================================
Following MDX query executed at Mon Apr 08 08:56:13 2013
===============================================================
SELECT
{[100-10], [100-20]} ON COLUMNS,
{[Qtr1], [Qtr2], [Qtr3], [Qtr4]} ON ROWS
FROM Sample.Basic

=== MDX Query Elapsed Time : [0.009] seconds ===================
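Since the entry format is so regular, mdxtrace.log is easy to scrape programmatically – say, to hunt for your slowest queries. Here’s a minimal Python sketch (the helper name is mine, and the regex assumes the exact entry shape shown above, which may well vary by release):

```python
import re

# Matches one mdxtrace.log entry: header lines, the query text, and the
# trailing elapsed-time line. Format assumed from the excerpt above.
ENTRY_RE = re.compile(
    r"=+\s*\n"
    r"Following MDX query executed at (?P<when>.+?)\s*\n"
    r"=+\s*\n"
    r"(?P<query>.*?)"
    r"=== MDX Query Elapsed Time : \[(?P<elapsed>[\d.]+)\] seconds =+",
    re.DOTALL,
)

def parse_mdx_trace(text):
    """Return (timestamp, query, elapsed_seconds) for each trace entry."""
    return [
        (m.group("when"), m.group("query").strip(), float(m.group("elapsed")))
        for m in ENTRY_RE.finditer(text)
    ]

sample = """\
===============================================================
Following MDX query executed at Mon Apr 08 08:56:13 2013
===============================================================
SELECT
{[100-10], [100-20]} ON COLUMNS,
{[Qtr1], [Qtr2], [Qtr3], [Qtr4]} ON ROWS
FROM Sample.Basic

=== MDX Query Elapsed Time : [0.009] seconds ===================
"""

when, query, elapsed = parse_mdx_trace(sample)[0]
print(when)     # Mon Apr 08 08:56:13 2013
print(elapsed)  # 0.009
```

From there, sorting the tuples by elapsed time gives you a crude worst-offenders list.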


The corresponding Sample.log file has this:
[Sat Apr 13 13:04:24 2013]Local/Sample/Basic/hypadmin@Native Directory/7244/Info(1013091)
Received Command [MdxReport] from user [hypadmin@Native Directory]

[Sat Apr 13 13:04:24 2013]Local/Sample/Basic/hypadmin@Native Directory/7244/Info(1260039)
MaxL DML Execution Elapsed Time : [0] seconds
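Those application log lines are also what would make the cross-referencing table I grumbled about earlier buildable: pull the MdxReport timestamps and user names out of the application log and pair them with the query timestamps in mdxtrace.log. A hedged Python sketch (my own helper name; the regex assumes the exact two-line shape shown above):

```python
import re

# Hypothetical cross-reference helper: pull (timestamp, user) pairs for
# MdxReport commands out of an Essbase application log.
MDX_REPORT_RE = re.compile(
    r"^\[(?P<when>[^\]]+)\][^\n]*\n"
    r"Received Command \[MdxReport\] from user \[(?P<user>[^\]]+)\]",
    re.MULTILINE,
)

def mdx_report_users(app_log: str):
    """Return (timestamp, user) for every MdxReport entry in the log text."""
    return [(m.group("when"), m.group("user"))
            for m in MDX_REPORT_RE.finditer(app_log)]

sample_log = (
    "[Sat Apr 13 13:04:24 2013]Local/Sample/Basic/hypadmin@Native Directory"
    "/7244/Info(1013091)\n"
    "Received Command [MdxReport] from user [hypadmin@Native Directory]\n"
)
print(mdx_report_users(sample_log))
# [('Sat Apr 13 13:04:24 2013', 'hypadmin@Native Directory')]
```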


Pretty cool, eh?


One odd thing
I noticed, at least on my release of Smart View (version 11.1.2.2.000 (Build 453)), that the above MDX query cannot show the POV members on the sheet. I can toggle the POV button and have them in the floating palette, but that’s the only way it works. This is different than the 11.1.2.2 behavior with ad hoc queries. Maybe this is in the documentation and I missed it?  That would not be the first time I’ve blown by this sort of thing. Corrections please in care of this blog’s comment section.

Just for giggles, I tried fully qualifying the axes with the below MDX (again, forgive my child-like MDX skilz):

But all I got was this:

Note that this is NOT the way the MDX queries in, say, EAS display:


Not a big deal and yes, you could have used axis(0), axis(1), and axis(2) instead of COLUMNS, ROWS, and PAGES.

But wait, there’s more in Smart View

Smart Slices

Even though not seeing the MDX from an ad hoc query is kind of a bummer (and again, maybe I am misunderstanding how Smart View queries data from Essbase), did you know that Smart Slices are just MDX queries? And that means that you can view the MDX in mdxtrace.log.

Define the Smart Slice any which way you want.

Do an ad hoc query off of the Smart Slice named Cameron’s test:

And, voila:
===============================================================
Following MDX query executed at Sat Apr 13 14:37:10 2013
===============================================================
SELECT
{ CROSSJOIN( { [Product] } , { [Year] } ) } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME] ,[GEN_NUMBER],[LEVEL_NUMBER] ON ROWS,
{ { [Measures] } } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[GEN_NUMBER],[LEVEL_NUMBER] ON COLUMNS WHERE {( [East] , [Budget] )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME]


=== MDX Query Elapsed Time : [0.001] seconds ===================

Now are you getting interested in this?

Did you note how there is no FROM Sample.Basic statement in the MDX above? I have to guess that it is somehow stored in the Smart Slice itself and so it isn’t necessary. Again, smarter minds than mine please chime in via the comments section.
What triggers MDX and just what kind of MDX?
Drilling up and down in the sheet does not generate new MDX queries. However, changing Measures to Profit through the Member Selection dialog box does.

Unsurprisingly, a Member Selection action produces a metadata query (you knew MDX could do that because you’ve been to or read Gary Crisci’s Kscope presentation on that, right?):
===============================================================
Following MDX query executed at Sat Apr 13 15:30:39 2013
===============================================================
SELECT {HEAD( DESCENDANTS( [Measures] ),5001 )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[SHARED_FLAG] ON COLUMNS


=== MDX Query Elapsed Time : [0.000] seconds ===================

Actually clicking on Refresh produces the following MDX – note Profit is now defined:
===============================================================
Following MDX query executed at Sat Apr 13 15:30:39 2013
===============================================================
SELECT
{ CROSSJOIN( { [Product] } , { [Year] } ) } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME] ,[GEN_NUMBER],[LEVEL_NUMBER] ON ROWS,
{ { [Profit] } } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[GEN_NUMBER],[LEVEL_NUMBER] ON COLUMNS WHERE {( [East] , [Budget] )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME]


=== MDX Query Elapsed Time : [0.000] seconds ===================

Getting back to that metadata query, what does it look like in MaxL (I have to stick it there to try to read what comes out)? Here’s my super simple MaxL:
login hypadmin password on localhost ;

alter session set dml_output alias off ;
alter session set dml_output numerical_display fixed_decimal ;
alter session set dml_output precision 15 ;
set column_width 80 ;
set timestamp on ;

spool on to "c:\\tempdir\\mdxoutput.log" ;

SELECT {HEAD( DESCENDANTS( [Measures] ),5001 )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[SHARED_FLAG] ON COLUMNS
FROM [Sample.Basic] ;

exit ;

And that produces:
===============================================================
Following MDX query executed at Sat Apr 13 15:34:00 2013
===============================================================
SELECT {HEAD( DESCENDANTS( [Measures] ),5001 )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[SHARED_FLAG] ON COLUMNS
FROM [Sample.Basic]


=== MDX Query Elapsed Time : [0.001] seconds ===================

And this:

You can pull the file down from here for your amusement and also because the above is just illegible.

I count 17 data values. There happen to be 17 Measures. I think (boy oh boy, am I doing a lot of guessing in this blog post) those are internal index values for the Measures dimension members. Once again, pretty cool, eh? Dear Tim Tow, when you tell me things like this about Essbase, I do try to remember them, even though 90% of what you tell me flies over my head.

Query Designer

And of course there is a Query Designer in Smart View. If you guessed that this too was an MDX query, you would be 100% right.

When I click on the Apply Query button:

I get this in Smart View:

And this in mdxtrace.log:
===============================================================
Following MDX query executed at Sat Apr 13 14:41:19 2013
===============================================================
SELECT
{ CROSSJOIN( { [Product] } , { [Year] } ) } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME] ,[GEN_NUMBER],[LEVEL_NUMBER] ON ROWS,
{ { [Measures] } } PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME],[GEN_NUMBER],[LEVEL_NUMBER] ON COLUMNS WHERE {( [East] , [Budget] )} PROPERTIES [MEMBER_ALIAS],[MEMBER_UNIQUE_NAME]


=== MDX Query Elapsed Time : [0.000] seconds ===================

Financial Reports

What, Smart View (mostly) exposed isn’t enough for you? Good grief. Did you know that Financial Reports uses MDX? And this time we can view it all, baby. Duluth, MN? Really? Moving on…and realizing that the DPD will likely arrest me on sight for that comment…

Let’s take this simple report:

Run it in HTML Preview mode:
And see that it produces:
===============================================================
Following MDX query executed at Sat Apr 13 15:49:05 2013
===============================================================
SELECT
{[Qtr1], [Qtr2], [Qtr3], [Qtr4]}
DIMENSION PROPERTIES [Year].[MEMBER_ALIAS] , [Year].[MEMBER_UNIQUE_NAME]
ON COLUMNS ,
{[Profit], [Margin], [Sales], [COGS]}
DIMENSION PROPERTIES [Measures].[MEMBER_ALIAS] , [Measures].[MEMBER_UNIQUE_NAME]
ON ROWS
FROM [Sample].[Basic]
WHERE ([Product], [East], [Actual])


=== MDX Query Elapsed Time : [0.000] seconds ===================

What about Planning?

I understand that Planning forms are MDX-based. Well, opening a form does not generate an MdxReport event in the Essbase application log – I think this is shades of Smart View ad hoc. (Yeah, I have a very long phone call with Tim in the near future so he can explain, again, how all of this stuff works under the covers. And yes, it behooves me to try this out with Dodeca as well but I’ll make that the subject of another blog post.)

However, ad hoc analysis does trigger an MDX query:

Which produces this:

And that in turn produces this in mdxtrace.log:
===============================================================
Following MDX query executed at Sat Apr 13 16:06:04 2013
===============================================================
SELECT {[Period].[YearTotal]} ON COLUMNS,
NONEMPTYBLOCK {[Account].[IncomeStatement],[Account].[300000],[Account].[310000],[Account].[320000],[Account].[330000],[Account].[340000],[Account].[350000]} ON ROWS
FROM SampTest.Consol
WHERE ([Year].[FY13],[Scenario].[Forecast],[Version].[Working],[Segments].[BAS],[Entity].[E01_0])


=== MDX Query Elapsed Time : [0.000] seconds ===================

Do you see what I see? There it is, that still (I think) undocumented super-cool MDX function NONEMPTYBLOCK. Oooh, I love it when a plan comes together.

The conclusion and the point behind all of this

If you can build it in a Smart View Smart Slice, or in Query Designer, you can interrogate mdxtrace.log to find out how Smart View did it. The same goes for Financial Reports. And, with some limitations, the same is true for Planning.

Why oh why oh why would you care about TRACE_MDX? Two reasons spring to mind.

  1. You should care because this has GOT to be the easiest way there is to figure out how to write good MDX, at least from a query perspective. MDX is not…intuitive. Oh sure, if you know it, it’s easy, but that’s because you already went through the pain. For the rest of us, it can be a little scary and painful. This simple Essbase.cfg setting can make that learning curve so much easier. Whoever in EPM product management pushed this one through: I give you my BSO-brain’s thanks.
  2. Why do these tools sometimes go KABOOM on us? How do they work under the covers? What interesting bits of functionality are they using? TRACE_MDX gives us a window into the way (mostly) EPM tools talk to Essbase. As an example, where did that rather cool undocumented keyword NONEMPTYBLOCK come from? Why, it came from examining MDX. I’ll bet there’s more cool stuff that we only need to look for.

I also have to give thanks to Oracle Support for once again coming through with some really cool stuff.

NB – One last point – I don’t do OBIEE but I am willing to bet that this setting came about because of the sometimes ugly MDX that OBIEE generates. Again, people more knowledgeable than I should drop me a line to tell world+dog all about it. In any case, we now all get to benefit from TRACE_MDX.

Be seeing you.

Using TRACE_MDX with Planning


Introduction

As I showed in my last post, TRACE_MDX can be utilized with Planning ad hoc forms. Oh, terrific, but do I really and truly have to go into an ad hoc form to see the layout? Besides, going into an ad hoc form changes its structure and maybe I want to see what that form’s MDX looks like from the word “go”. Is there a way to do it? You betcha.

It’s just this little chromium switch

All you need to do is to go into Planning form design and select “Suppress missing blocks”.
Simply save the form and open it back up.
 
 
And take a look at the (by now) good old mdxtrace.log file and see…
===============================================================
Following MDX query executed at Sun Apr 14 14:05:15 2013
===============================================================
SELECT {[Period].[Jan],[Period].[Feb],[Period].[Mar],[Period].[Q1],[Period].[Apr],[Period].[May],[Period].[Jun],[Period].[Q2],[Period].[Jul],[Period].[Aug],[Period].[Sep],[Period].[Q3],[Period].[Oct],[Period].[Nov],[Period].[Dec],[Period].[Q4],Hierarchize(Descendants([Period].[YearTotal]),POST)} ON COLUMNS,
NONEMPTYBLOCK {Hierarchize(Descendants([Account].[IncomeStatement]),POST)} ON ROWS
FROM SampTest.Consol
WHERE ([Segments].[BAS],[Entity].[E01_0],[Year].[FY13],[Scenario].[Forecast],[Version].[Working])

=== MDX Query Elapsed Time : [0.068] seconds ===================


And what do we get from this?
A whole bunch of things:
  1. There’s that NONEMPTYBLOCK statement again. You know, the thing that makes BSO MDX queries so fast. Yup, it sure is interesting that it has been around in MDX for such a long time, and used for such a long time in Planning (I am going to guess since 11.1.1.1 as that’s when I remember Suppress Empty Blocks becoming available). And yet it wasn’t ever documented. Why?
  2. The MDX portion of the form took only 0.068 seconds.
  3. Did you see the Hierarchize function? And the POST option? Check out the Essbase Technical Reference topic on Hierarchize – do you see how Oracle could make expansion work either way (up or down) if they wanted to? Although I suspect there is a reason that this is not exposed as I’ll explain/guess at in a bit.
  4. Columns get treated differently than rows. What do I mean? If you look at the form layout screen shot, you’ll see that the Planning form command to get all of the periods is IDescendants(YearTotal). That’s how Accounts are defined as well. And yet the MDX clearly shows individual selections for each period and a Descendants of YearTotal where Accounts are simply a Descendants function. Why?


What do I mean by that? I took the MDX and stuck it into Smart View using the Execute MDX command and got the following columns:
Jan, Feb, Mar, Q1, Apr, May, Jun, Q2, Jul, Aug, Sep, Q3, Oct, Nov, Dec, Q4,
Jan, Feb, Mar, Q1, Apr, May, Jun, Q2, Jul, Aug, Sep, Q3, Oct, Nov, Dec, Q4, YearTotal


Interesting, eh? Apparently (well, definitely, actually as we can see) Planning needs the columns twice, once through explicit selections and then again through Hierarchize(Descendants([Period].[YearTotal]),POST). Isn’t that just odd?


This must be something internal to Planning as this simpler MDX gives me exactly what I would expect wrt columns, i.e., non-repeated months.
SELECT {Hierarchize(Descendants([Period].[YearTotal]),POST)} ON COLUMNS,
NONEMPTYBLOCK {Hierarchize(Descendants([Account].[IncomeStatement]),POST)} ON ROWS
FROM SampTest.Consol
WHERE ([Segments].[BAS],[Entity].[E01_0],[Year].[FY13],[Scenario].[Forecast],[Version].[Working])
 


Dear Oracle Planning Product Management (or more likely Development) – what the heck is going on? Why oh why oh why does Planning need almost double the columns? Weird.
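As an aside, because the log entries follow the fixed banner/footer shape shown above, they are easy to mine mechanically, say, to hunt for your slowest form queries. Here is a minimal, hypothetical Python sketch; the exact banner text may differ by release, so adjust the pattern to match your own mdxtrace.log:

```python
import re

# A captured mdxtrace.log fragment (abbreviated) in the shape shown above
LOG = """\
===============================================================
Following MDX query executed at Sun Apr 14 14:05:15 2013
===============================================================
SELECT {[Period].[Jan]} ON COLUMNS,
NONEMPTYBLOCK {[Account].[Sales]} ON ROWS
FROM SampTest.Consol
=== MDX Query Elapsed Time : [0.068] seconds ===================
"""

# One entry = timestamp banner, the query body, then the elapsed-time footer
ENTRY = re.compile(
    r"=+\nFollowing MDX query executed at (.+?)\n=+\n"
    r"(.*?)\n=== MDX Query Elapsed Time : \[([\d.]+)\] seconds =+",
    re.S,
)

def parse_mdx_trace(text):
    """Return a list of (timestamp, query, elapsed_seconds) tuples."""
    return [(ts, q.strip(), float(s)) for ts, q, s in ENTRY.findall(text)]
```

Sorting the result by the third element gives you a quick worst-offenders list.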

And what’s really weird

As I stated in the beginning of this post, one needs to flip the Suppress missing blocks switch to make MDX fire on form retrieval. And that implies that only this setting (I suspect it is the only easy way to get to the functionality NONEMPTYBLOCK provides) makes Planning use MDX. I am further guessing, per what My Man In California, Glenn Schwartzberg, stated in the comments section of last week’s blog re default retrieves in Smart View, that Planning must use the Grid API to do standard retrieves. I find that fascinating because it has been “common” knowledge that Planning uses MDX to retrieve forms. TRACE_MDX tells us quite clearly that this is in fact not true.


And so that then suggests that maybe MDX still isn’t the fastest or best way to retrieve data from Essbase. I guess I shouldn’t be super surprised that nothing beats a native API, but I do wish this stuff was documented. Wait, it just was. :)


Be seeing you.

A very different kind of ODTUG webinar


Marketing introduction

The ODTUG virtual panel series goes from strength to strength. (What, marketing? You promised me a rose garden… er, a marketing-free blog. Yes, but this isn't marketing for me, so relax. My self-marketing ineptitude continues apace.)
 
Stuff you actually care about goes right here
What do I mean? None other than Chet Justice, aka ORACLENERD is moderating an ODTUG virtual panel in the form of a webinar.

Wait, ODTUG, virtual panels, ORACLENERD? Is Essbase involved somehow? No. Cameron, is it time to get some sleep? Why yes it is, but before I hit the hay, let me pull it all together:
  1. ODTUG is having another one of their successful-beyond-our-dreams virtual panels
  2. Chet is the moderator
  3. The speakers are: Cary Millsap, Dominic Delmolino, and Kris Rice.
  4. The webinar's subject is "Software Development in the Oracle Eco-System". Essbase is absent, but so what?
  5. The time and date are Thursday, May 30, 2013, 3:00 PM - 4:30 PM EDT

See here: https://www3.gotomeeting.com/register/759087318

Where's the beef?
Okay, that's all very interesting but why should a bunch of (mostly, or maybe exclusively if you're reading this blog) EPM people attend a webinar on something that is pretty obviously not EPM-related?
  1. Because these guys are good. I heard Cary speak last year at the Kscope12 keynote. Did you? He was fantastic. Inspirational, even. I don't know the other speakers (blush, my EPM-centricity shows yet again) but I have a sneaking suspicion they are really, really, really insightful speakers.
  2. They are covering a subject (development, its frustrations and triumphs, and how to have more of the latter than the former) that all of us, each and every one, do to some extent or another. Isn't the point of ODTUG to learn from others and occasionally share something with them? That's pretty much why I'm involved. Here is your chance to do that with some Really Big Names in the Oracle world.
  3. It's free.   Think about how much you would pay to get trained by these guys.   Yeah, I couldn't afford it either.
What a nice way to end an undoubtedly hectic week.

I guess that's it, and it ought to be enough. I'm signed up. Are you?

For more gen
If you want to get a feel for them, check out their blogs.

There is some really good stuff there -- I love it (not that I ever do it myself) when geeks muse about why they do something as opposed to how although of course the how is important too.   I think this webinar/panel will be in that vein and I am really looking forward to it.

Be seeing (listening to, because remember, this is a panel and you can participate) you at 3 pm EDT, 30 May 2013.

Stupid Programming Trick No. 17 -- Hacking EPMA's Planning Time Distribution


The problem

A Planning 11.1.2.2 application was created with the wrong Time Distribution option. Oops. And no one noticed until UAT. Double oops. The real oopsie is that once created, a Planning application CANNOT change its Time Distribution. For the uninitiated, this is the spread from upper level Time Periods members like YearTotal or quarters to lower level members. It defaults to an even spread or can optionally be set to a fiscal calendar 445, 454, or 544 spread. Again, this is a one-time shot, so whoever creates the application had best mind his p’s and q’s. Which didn’t happen. Like I wrote, oops.
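To make the 445/454/544 business concrete: the digits are just weeks per month within a quarter, and the spread is a simple weighting. A toy Python sketch of the arithmetic (emphatically not Planning's actual spreading code) looks like this:

```python
# Illustrative only: how a quarterly value spreads to months under each
# Time Distribution option (weights are weeks per month within a quarter)
SPREADS = {
    "Even": (1, 1, 1),
    "445": (4, 4, 5),
    "454": (4, 5, 4),
    "544": (5, 4, 4),
}

def spread_quarter(value, pattern="454"):
    """Distribute a quarter total across its three months by week count."""
    weights = SPREADS[pattern]
    total = sum(weights)
    return [value * w / total for w in weights]
```

So under 4-5-4 a quarter total of 130 spreads as 40/50/40, while an even spread splits it into equal thirds.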

The hunt for the guilty and his inevitable gruesome punishment shall await Cameron’s Star Chamber. I am waiting for HM The Queen to appoint me Privy Councillor so I may begin the joyous prosecution.
 
Putting that happy day aside, oh may it come, and soon, let’s start figuring out how to fix this oopsie without recreating the Planning application.

NB – We are going to go faaaaaaar beyond what Oracle recommends, supports, or will even give you the time of day on if you FUBAR this and then call Oracle Support. It is an interesting hack, and maybe the exigency of your situation calls for it, but know that you do this at your own risk. You Have Been Warned.
 
The beginning of the fix
Dave Farnsworth found Celvin Kattookaran’s blog post with the code to change the Even split to 4-5-4 in the Planning application.
 
Here’s the code to transform the Planning application (this is in the Planning app’s schema/database). In this example, it is to change it from Even to 4-5-4. NB – This is in SQL Server but I think it’s identical in PL/SQL except for the COMMIT commands.
 
USE HYP_PLN_ChgSprd
GO
/*
Time distribution codes:
0 = even spread
1 = 4-4-5
2 = 4-5-4
3 = 5-4-4
*/

--Flip from Even to 4-5-4
BEGIN TRANSACTION;
update hsp_systemcfg set support445 = '2';
update hsp_account set use_445 = '2' where use_445 = '0';
COMMIT TRANSACTION;

So problem sorted, yes? 
 
Oops, not entirely
Did I mention this was an EPMA application? Ah, no I did not. And it’s important.
 
After making the above change, and bouncing the Planning service, deploys from EPMA work until there is a change to hierarchy. When that happens, the deploy fails with the below error message:
[May 9, 2013 1:38:35 PM]: Parsing Application Properties...Done
[May 9, 2013 1:38:35 PM]: Parsing Dimensions info...Done
[May 9, 2013 1:38:35 PM]: Registering the application to shared services...Done
[May 9, 2013 1:38:36 PM]: You cannot change the Weeks Distribution after deploying. You must select 454 as the Weeks Distribution before redeploying the application.[May 9, 2013 1:38:36 PM]: An Exception occurred during Application deployment.: You cannot change the Weeks Distribution after deploying. You must select 454 as the Weeks Distribution before redeploying the application.



Btw, here is what EPMA had selected – it’s Even, not 4-5-4. But Planning is 4-5-4. KABOOM.

The research
So a dive into the EPMA tables seems to be in order to change that “Use application distribution” to “Use 454 distribution”.
 
I took a look at the EPM data models to get a feel for what’s going on under the covers. The EPMA schema is pretty sparsely documented, to put it mildly, but after a fair bit of blundering about, I figured out that I needed to look at the DS_Property_Application table and its c_property_value field.
 
Alas, the documentation does not tell you what the property value should be for the time spread or even the property id number. So I created a bunch of different Planning apps, each with a different Time Spread and came up with the following possible settings when I interrogated that field:
  • Even
  • 445
  • 454
  • 544
 
Once I knew what to search for wrt the setting, I then needed to figure out the application id and the property member id. With those two (I only want to change one setting, and I only want to do it for the right app) pieces of information, I can write an UPDATE query to fix the spread issue in EPMA. 
 
What application, what property?
I wrote this query to get that information, knowing that the application name is ChgSprd:
 
SELECT
  A.c_application_name
 ,P.*
FROM DS_Property_Application P
INNER JOIN DS_Application A ON
  A.i_application_id = P.i_application_id
  AND A.i_library_id = P.i_library_id
WHERE
  A.c_application_name = 'ChgSprd'
  AND P.c_property_value = 'Even'

Looking at the results below, it appears that the setting is repeated for each deploy of the application:
c_application_name   i_library_id   i_application_id   i_prop_def_dimension_id   i_prop_def_member_id   c_property_value
ChgSprd              1              7                  1                         346                    Even
ChgSprd              87             7                  1                         346                    Even
ChgSprd              88             7                  1                         346                    Even
ChgSprd              89             7                  1                         346                    Even
ChgSprd              90             7                  1                         346                    Even
ChgSprd              91             7                  1                         346                    Even
ChgSprd              92             7                  1                         346                    Even
ChgSprd              93             7                  1                         346                    Even
ChgSprd              94             7                  1                         346                    Even
ChgSprd              95             7                  1                         346                    Even
ChgSprd              96             7                  1                         346                    Even
ChgSprd              97             7                  1                         346                    Even
ChgSprd              98             7                  1                         346                    Even
ChgSprd              99             7                  1                         346                    Even
ChgSprd              100            7                  1                         346                    Even
ChgSprd              101            7                  1                         346                    Even


The important bits are: this is application id 7 and the property id is 346. NB – I have been able to test this on a completely different system – 346 is the property that contains the time distribution and the i_application_id varies. You will need to run the above query to figure out the i_application_id.
 
The fix
With an i_application_id firmly in hand, I can write an UPDATE query as below:
/*
Possible values for c_property_value when i_prop_def_member_id = 346:
Even, 445, 454, 544
*/
UPDATE P
SET P.c_property_value = '454'
FROM DS_Property_Application P
WHERE
  i_prop_def_member_id = '346'
  AND i_application_id = '7'

Btw, I tried just changing the last record in the above list of applications (the 101) and it didn’t work (although I got to play with SQL’s MAX function in a subquery, so there is that). I had to change ALL of the values from Even to 454. I think maybe I could have gotten away with changing just the i_library_id setting of 1, but I am not made of free time to test this stuff out. If you try it (on a throwaway instance, please) and it works, send in a comment to this blog.
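If you go down this road, a quick sanity check after the UPDATE is cheap insurance. Something along these lines works (again, a sketch against the same hypothetical ids, so substitute your own i_application_id):

```sql
-- Should return zero rows once every copy of property 346 has been
-- flipped; any survivors mean the deploy will still fail
SELECT *
FROM DS_Property_Application
WHERE i_prop_def_member_id = '346'
  AND i_application_id = '7'
  AND c_property_value <> '454'
```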

Anyway, I changed them all, and then I bounced the Hyperion EPMA Server service and (since this was a compact deployment) the Hyperion EPM Server- Web Application service:

NB – In a real environment, I found I had to bounce all of the EPM services. Quite painful across a production environment but such is life. And yes, just restarting EPMA and Planning did not do it – there was a serious amount of relational caching going on.

I then logged back into Workspace, went to the Dimension Library, added CL_test3, and saw the following:

I now have 454 distribution. Success! Boil in bag! Hopefully.

The proof
So the proof of this particular pudding is to run a deploy.

And in Planning:

Fixed on both sides: Planning and EPMA. Whew.
 
A couple of notes
Again, for goodness’ sakes, this is a hack, and I had to do it, but I am pretty sure if you try to use this blog post as evidence that this is okay and, “Cameron said I can do it” Oracle is going to laugh in your face. Do it if you must, but may you never have to.
 
With that enormous caveat, if you are going to do it in your environment:
  1. Make a backup of the development EPMA schema. And then in development…
  2. Run the query to confirm the application and property ids. Your application name will replace ChgSprd.
  3. Modify the UPDATE statement to change from your current spread to the desired one (Even/445/454/544).
  4. Don’t forget to change the Planning application’s 454 spread, so maybe a Planning application schema backup is in order too.
  5. Restart all Hyperion EPM services (I found that I needed complete restarts of all EPM services for this to work outside of a compact deployment and yes that hurt).
  6. See if the deploy works. Prayer to whatever God or gods you worship is recommended at this stage. 
 
This was fun, kind of, and I’m not as scared of the EPMA tables as I once was. And it was a pretty cool hack. So I guess it was worth it. But an Order in Council is still going to go out – vengeance shall be mine as all of the above ate a lot of time I didn’t have.

Be seeing you.

What Kscope13 sessions am I looking forward to, part one

Introduction
This is always a tough one because there are so many good sessions at ODTUG’s Kscope conferences. If you’ve read my posts in years past on the conference, you will know that I have ranted and raved about the unparalleled knowledge sharing, the training, the networking, the vendor expo (you would be amazed at what’s out there, and I am a full member of the does-not-have-contract-signing-ability crowd and I still find it useful), the fun, etc., etc., etc. If you aren’t convinced by this point you either don’t read this blog (in which case I have to ask how you came to read this sentence) or you really don’t pay attention.

And because there is so much good stuff I am going to split my review (yet another sort-of tradition in this blog) of sessions I want to attend across multiple posts. There is just too much cool stuff going on and I want to give these sessions the due they deserve. And not write the Kscope equivalent of Gibbon’s The History of the Decline and Fall of the Roman Empire, at least from a length perspective.

With that, let’s kick off with my very favorite technological product: Essbase.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that my sessions are at least worth considering. I will cover what I am presenting later in the week – this block is for everyone not named Cameron.

Essbase sessions (stolen right off of Kscope13.com)

Practical Essbase Web Services

Jason Jones, Key Performance Ideas, Inc.
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Essbase - Subtopic: Other Essbase
This session will give an introduction to using the new Essbase Web Services feature that has recently been added to the Essbase stack. Attendees will be given an overview of functionality available and practical methods of using the available technology effectively. This presentation will be geared towards users that are familiar with basic programming concepts.

You’ve seen Jason pop up on this blog a few times – he is a real developer. I am not, sob. And I suspect most of you (apologies to those who are full time Computer Scientists) are not either. So let’s learn from a guy who has written real honest to goodness commercial software.

Using Calculation Manager with Essbase ASO

Josephine Manzano-Stettler, ADI Strategies
Co-presenter(s): Olivier Jarricot, ADI Strategies
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Essbase - Subtopic: Other Essbase
Get a sixty-minute cram session on using Calculation Manager with Essbase ASO. This session will navigate you through the key features within the Calculation Manager, specifically for Essbase ASO. The session will then guide you through the steps of creating calculation rules for ASO using the graphical interface and allocation and custom calculation components. Finally, the session will offer best practices and "insider" tips to shorten your learning curve and get you off to a turbo fast start!

In a previous life, I worked on one of Josie’s projects. She is all kinds of awesome. Also, I was on a project last year that screamed ASO as its solution. I had everything working except those !@#$ing level zero rate calculations and it was an interactive (budgeting) application. The client was an 11.1.1.3 environment so we were out of luck, to put it mildly and BSO it was. I am super curious to see what BusyGal (her online moniker) has come up with.

Big Data and Analytics-led EPM

Al Marciante, Oracle Corporation
When: Jun 24, 2013, Session 1, 8:30 am - 9:30 am
Topic: Essbase - Subtopic: No Subtopic
What data is important to analyze in order to sign off on consolidated financial data or for accurately creating forecasts and budgets? How do users efficiently identify the key drivers and model possible financial outcomes based on uncertainty? This session will highlight the synergies between Enterprise Performance Management and Business Intelligence, and will showcase how the two together allows customers to propel their performance.

An Oracle presentation? Aren’t they just great big advertisements for Oracle? Not at Kscope they aren’t. Big Data, Big Data, Big Data, Big Data – it’s everywhere. Does it fit in EPM? I am super curious about this.

Optimizing ASO Models Using Procedural Calculations

Michael Nader, Blue Stone International
Co-presenter(s): Martin Slack, Ernst & Young
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: Essbase - Subtopic: Optimization
As analytic data sets grow, Essbase deployments are more and more focused on the ASO technology to take advantage of speed and scale. However, performing intricate calculations at run-time often leads to poor reporting performance. This session focuses on leveraging ASO procedural calculations and allocations to extend analysis and expedite reporting.

Did you read the above session by Josie Manzano-Stettler? This is the other side of calculations in ASO Essbase. I am, again, very, very, very interested in this subject.

How ASO Works and How to Design for Performance

When: Jun 25, 2013, Session 9, 2:00 pm - 3:00 pm
Topic: Essbase - Subtopic: Optimization
Why are some cubes fast and some are slow? When is it OK to use MDX and when should (or what types of) MDX be avoided? How do Solve Order, Dynamic, and Stored Hierarchies interact? By understanding how Essbase goes about resolving a query, many of these questions will answer themselves. Much of this understanding comes from the cryptic Bitmap Statistics dialog in Cube Properties. All of this is summarized in 12+1 rules. These 12+1 rules are discussed and used to illustrate how they should be used when designing your cubes. In particular, there are implications and options for the design of alternative hierarchies. These options will be discussed in terms of the trade-offs of cost (in storage size) vs. performance (in retrieval time). Concrete examples will be demonstrated.

This is one of those rare sessions that is a repeat from last year. Reruns? At Kscope? What, you want your money back? Trust me, this is worth repeating, both for those of us who attended last year and for the people who were foolish enough not to attend. Why? Because Dan has deconstructed the ASO kernel better than anyone whose email address doesn’t end in @oracle.com. It is a brilliant presentation.

Performance Optimization and Measurement: New Thoughts and Test Results for BSO and ASO

When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: Essbase - Subtopic: Optimization
Have you ever wondered why some loads are fast and others are slow? Why the same query performs differently at different times? Why your queries and loads are not as fast as they should be? Then this is the session for you. Starting with a discussion of how data file IO is handled in Windows and Unix, techniques to ensure apples-to-apples testing are presented. Then the results of over three hundred load and calculate/aggregate tests on very large BSO (9 GB input level, 175 GB calculated) and ASO cubes (1.4 billion cells, 84 GB aggregated) are presented. The testing spans Windows and UNIX; ASO and BSO; varying cache settings; sort order; and file formats. These variables are evaluated and ranked with several new and surprising conclusions, conclusions that in some cases run contrary to existing best practices. Finally, expanding on the chapter "How ASO Works and How to Design for Performance" in Developing Essbase Applications, the speaker will discuss surrogate keys, MDX, and multi-attribute queries. All of this will be discussed in light of real-world experience where multiple cubes are running and data is prepared and hardware supplied by other parts of the organization with differing practices and priorities. In short: real techniques you can implement when you return from the conference.

I am somewhat familiar with the tests Dan has conducted. He has a very interesting take on what makes Essbase databases fast.

Essbase New and Upcoming Features

Gabby Rubin, Oracle Corporation
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: Essbase - Subtopic: No Subtopic
Over the last two years, the Essbase footprint in the Oracle product portfolio has increased dramatically. In addition to being an application platform for many customers as well as Oracle's own EPM applications, Essbase is a key part of Oracle BI Foundation, Exalytics, and Fusion applications. These changes in the Essbase ecosystem, along with other market trends such as cloud, require Essbase to adapt and evolve. But how do you prepare for the future while protecting your past? Join this session to learn about Oracle's vision for Essbase and the product roadmap.

Oracle again? If you are interested in where Essbase is going, this is your best bet to hear all about it from the Oracle Product Manager himself. Will he deny everything he says the minute he steps out of the room? Probably. Will many of these projected roadmap items show up at Open World as publicly committed-to features? It has happened. :)

Advanced Essbase Studio Tips and Tricks

Glenn Schwartzberg, interRel Consulting
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Essbase - Subtopic: Other Essbase
Essbase Studio has been around since the beginning of version 11.1.X and there is still a lot of mystery around its usage. Come explore some of the more advanced features now that you have gotten your feet wet creating your first Studio model. This session will go over settings, optimizations, tips, and tricks that can make you more successful with Studio.

Glenn is my older brother from completely different parents. Of course he denies any actual familial relationship. Just like an older brother. You decide who has a firmer grasp on reality.

Clawing my way back to relevancy, Studio is a product that I seem to be drifting closer and closer to with the increasing level of SQL and ETL that I do on each project. Glenn’s sessions are always interesting and entertaining.

Thinking Outside the Box -- Optimizing Performance in Essbase and Planning

Glenn Schwartzberg, interRel Consulting
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: Essbase - Subtopic: Optimization
There are standard optimizations that developers do to improve performance, then there are those developers who think outside the box and create unique optimizations that can affect calculations, data loads, data transfers, etc. Come to this session to see some unique solutions and truly outrageous ways to improve performance.

I reviewed the first draft of this presentation – again, Glenn has an unusual mind (shades of Young Frankenstein?) and always comes up with how-the-h-e-double-hockey-sticks techniques.

Introducing the Outline Extractor NG (Next Generation)

Tim Tow, Applied OLAP
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: Essbase - Subtopic: Other Essbase
Although the OlapUnderground Essbase Outline Extractor has been downloaded by over 10,000 unique users, the technology used is quickly becoming outdated. The new Dodeca Essbase Outline Extractor is a complete redesign and rewrite of the popular Essbase Outline Extractor technology and adds the ability to output in Hyperion Planning Outline Load Utility format and to relational databases. It also adds the ability to run on 32-bit and 64-bit operating systems including Windows, Unix, Linux, and even MacOS. Attend this session to learn how to leverage this free utility in your company.

I’ve been on the beta (and have been a terrible beta participant, sorry Tim, but Kscope has eaten my life) for the NG extractor. This is the tool that we will all use going forward.

I for one am looking forward to running the NG OE on my Fat Mac.

Are you crying Uncle yet?

That is ten, count ‘em ten, different Essbase sessions. Is that the sum total of Essbase sessions at Kscope? Absolutely not. In fact there are 38 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About Hyperion Planning at Kscope.

Be seeing you at Kscope13.


What Kscope13 sessions am I looking forward to, part two


Introduction

I’ve already covered the Essbase side of the house in part one of this series. I lurv Essbase more than is likely healthy for my psyche. Or social life. But I seem to spend an awful lot of my professional life using a wrapper around Essbase. A wrapper called “Planning”. And that wrapper is a pretty powerful application in the EPM space, now with lots and lots of brand extensions.

One thing that I find interesting about Planning and all of the Financial Planning products is that many of these sessions are either focused on latest features or tips and tricks sessions with a few how-to’s thrown in. I suppose this is the nature of an application as opposed to a database, which tends to have a more theoretical bent. Or maybe I am just evincing that somewhat monomaniacal love for Essbase I wrote about above.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that the Planning session I am copresenting is at least worth considering. I will cover that later in the week – this block is for everyone not named Cameron.

Planning sessions (stolen right off of Kscope13.com) with my comments

Planning at Transaction Levels while Maintaining Performance

Chris Boehmer, The Hackett Group
Co-presenter(s): Danny Frasco, Kimco Realty Corporation
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Planning - Subtopic: General Planning
Would you like to obtain greater detail in your planning or forecasting process - e.g., to plan for greater product or customer granularity, more periods, etc.? But you're afraid that your Planning application just won't be able to cope with these requirements? In this session, you will hear how an organization's desire to forecast a large number of detailed items for a long period of time led to some innovative solutions built to enhance their Hyperion Planning application.

Hmm, this is interesting – sort of the anti-driver approach to Planning. Of course I’ve been on projects like that myself. It’s all very well to say, “Enter a few items, and let the magic of Planning allocate and aggregate the results,” and quite another to actually have a client do that. I’m interested to see what their techniques are. Could it be a paired BSO/ASO approach with partitions or maybe a reporting cube or maybe something completely different? Dunno, but this is intriguing.

Calc Manager: Go Beyond Basics and Unleash the Power of Oracle Hyperion Planning

Ludovic De Paz, TopDown Consulting
Co-presenter(s): Terrance Sundar, Shutterfly
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: Planning - Subtopic: General Planning
With Hyperion 11.1.2.3., Calc Manager is the only option to develop and deploy Business Rules. This is a fantastic opportunity for developers to leverage its latest advancements. This session will examine key features and functionality in Calc Manager and demonstrate how to successfully achieve your goals while improving quality. It will also include best practices, tips, tricks, and techniques that consultants, administrators, and end users can leverage to make completing projects and daily tasks easier.

I’m using Calculation Manager on my current project. I am not in love with the endless objects and drawings. I even had a client ask me, “Does it have to be all broken up like this?” But maybe we’re not giving the more GUI-ish nature a fair shot. For sure we aren’t using really advanced features. Could oh could oh could that be tied to treating Calculation Manager like a glorified Essbase calc script editor? Why yes it could, so I am going to try to attend this session to see the error of my ways.

Planning Experts Panel

Natalie Delemar, Ernst & Young
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: Planning - Subtopic: No Subtopic
TBD

Not a lot to go on, is it? I moderated this panel last year and it was a lot of fun. I’m not involved in the panel this year but I am looking forward to shooting a bunch of tough questions at whoever the panel is.

Dynamic Integrations for Multiple Hyperion Planning Applications

Co-presenter(s): Rodrigo Radtke de Souza, Dell
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: Planning - Subtopic: General Planning
This session will cover how to use Oracle Data Integrator and Oracle Hyperion Planning metadata repository to build dynamic, flexible, and reliable processes to maintain metadata, load, and extract data from any number of applications with a single generic component.

If it has ODI as a component of the presentation, I am interested. I almost like ODI more for its ability to tie together completely heterogeneous technologies without a ton of scripting than I do for its base purpose, which is ETL. ODI rocks! Actually, I’m more of a classic jazz fan, but I’ll try to bebop my way to this session.

Automating Hyperion Planning Tasks

Kyle Goodfriend, Rolta Solutions
When: Jun 25, 2013, Session 9, 2:00 pm - 3:00 pm
Topic: Planning - Subtopic: General Planning
Maintaining Hyperion Planning environments can be so time-consuming that there is little time for development. We are constantly being asked to do more with less. Understanding some of the utilities and options to automate redundant operations can significantly improve your ability to react to change. It allows more time for new development, eliminates human error, and increases productivity and system stability. Find out what options are available, how to use them, and see real world examples you can take home.

Kyle must be a lazy programmer. Why do I write that? And no, I am not trying to insult him, as I too am a lazy programmer. What’s at least one definition of a lazy programmer? A lazy programmer is someone who sees a task that is manual or semi-scripted, does the task once and says, “There is no way I am ever doing that again.” He then sets off to write an automated whatever. ‘Coz he’s got better things to do. I suppose a really lazy programmer doesn’t even go through the pain once. I don’t know Kyle so I can’t determine if he falls into the moderately or intensely lazy camp.

NB -- Lazy = smart.
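Speaking of lazy (smart) programmers: the usual first step is a thin wrapper around whatever command-line utility you keep running by hand. Here is a minimal Python sketch of that idea; the utility name and flag syntax below are purely illustrative, not any real Planning utility's actual interface.

```python
import subprocess

def build_load_command(utility, app, user, data_file, log_file):
    """Assemble the argument list for a metadata-load utility.
    The flag names here are illustrative, not a real utility's syntax."""
    return [utility,
            f"/A:{app}", f"/U:{user}",
            f"/I:{data_file}", f"/L:{log_file}"]

def run_load(cmd, retries=2):
    """Run the load, retrying on failure -- the 'never do it by hand
    twice' part of lazy programming."""
    for attempt in range(1, retries + 2):
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode == 0:
            return attempt
    raise RuntimeError(f"load failed after {attempt} attempts")

cmd = build_load_command("OutlineLoad.cmd", "BUDGET", "admin",
                         "entities.csv", "load.log")
print(cmd[1])  # /A:BUDGET
```

Once something like this is scheduled, the moderately lazy programmer never types that command again.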

Oracle Project Financial Planning -- The Owner's Manual

Josephine Manzano-Stettler, ADI Strategies
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: Planning - Subtopic: Project Planning
Oracle Project Financial Planning is the newest packaged application to be rolled out with the Oracle Hyperion Enterprise Planning Suite (v.11.1.2.3). This pre-built solution ties project financial planning to corporate financial planning activities within a single application, as well as supports financial planning throughout the complete project management lifecycle. Want to find out more?...a LOT MORE? This presentation will dive deep into the new Oracle Project Financial Planning solution. The following will be covered: 1) Highlights of key out-of-the-box features, functionality, and analytical tools 2) Integration with the Workforce and Capex Planning Modules, ERP, and Project Management systems 3) Implementation considerations for a successful deployment 4) Minimizing customizations...Are your company's business needs the right fit for the packaged solution? Maximize your success for integrating Oracle Project Financial Planning into your Enterprise Planning tool set.

If Josie does it, it’s good. ‘Nuff said.

Managing Your Project Budgets: Introduction to the New Hyperion Project Planning Module

Tracy McMullen, interRel Consulting
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: Planning - Subtopic: Project Planning
Hyperion Planning 11.1.2.3 adds a new pre-built module to the existing suite of Workforce and Capital Expenditure planning. This new module, Project Planning, fills the gap of how to budget for projects both short- and long-term before they become capitalized assets. Whether you want to budget IT projects from initial proposal through implementation, capital projects to expand facilities, or development projects out in the field, the new Project Planning module can handle them all.

Guess what? If Tracy does it, it’s good. ‘Nuff said, part two.

Did you know (or care) that in a previous professional life I once turned down dancing with Josie and Tracy? On a ship? In the middle of the Gulf of Mexico? I must have been out of my mind. :) I did have a cold. But still.

Introduction to Predictive Planning in Hyperion Planning

Troy Seguin, interRel Consulting
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Planning - Subtopic: General Planning
Version 11.1.2.3 of Hyperion Planning includes Crystal Ball's Predictor feature. Predictor utilizes established time series procedures to help with forecasting upcoming time periods. This presentation will walk you through how to effectively use Predictor as part of your budgeting and forecasting duties, as well as provide an intuitive explanation of the concepts working behind the scenes.

I have to say I am somewhat on the fence about this one. No, no, not about Troy, or his presentation skills, or anything like that. It’s about statistics in the hands of your average Planner. Face it, people love to gamble. Face it again, the house (casinos, state lottery boards, Nathan Detroit’s The Oldest Established (Permanent Floating Crap Game In New York), etc.) always wins over time; those institutions must if they are to survive. And yet people gamble in the “sure” knowledge that they will win. But casinos, and state lotteries, and even incredibly charming minor organized crime figures with great singing voices understand how statistical probability works, and that it works in their favor. Wait, I just figured out who needs to go to this session. I hope Frank Sinatra is there (at least spiritually) as well.
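For the statistically wary, the flavor of "established time series procedures" fits in a few lines. Here is a toy Python sketch of simple exponential smoothing, one classic technique in the family Predictor draws from; it is in no way Oracle's actual algorithm.

```python
def exponential_smoothing(series, alpha=0.5):
    """Simple exponential smoothing: each forecast is a weighted blend
    of the latest actual and the previous forecast."""
    if not series:
        raise ValueError("need at least one observation")
    forecast = series[0]          # seed with the first actual
    for actual in series[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

# A planner's monthly history; the "prediction" for next month leans
# toward recent months without ignoring the past.
history = [100.0, 110.0, 105.0, 120.0]
print(round(exponential_smoothing(history, alpha=0.5), 2))  # 112.5
```

With alpha near 1 the forecast chases the latest actual; near 0 it barely moves: exactly the sort of knob the house always understands better than the gambler.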

Turbocharge Your Hyperion Planning Input Forms with Predictive Planning

Jake Turrell, US-Analytics
When: Jun 26, 2013, Session 13, 11:15 am - 12:15 pm
Topic: Planning - Subtopic: General Planning
One of the most exciting new enhancements to Planning 11.1.2.3 is its new Predictive Planning tool. This new feature allows users to plot their projections alongside those created by Predictive Planning, giving users another data set against which they can compare their results. This live demo will walk users through the process of setting up Predictive Planning and will provide several real-world examples. The session will cover: - When to use Predictive Planning and when to avoid it - Basic statistical concepts used by Predictive Planning - How to best configure input forms for Predictive Planning - A walk through of the Predictive Planning user interface - Running predictions. - Using Comparison Views to review the results of various scenarios - How to tweak your results with filters and reports Users will leave this session with the tools, knowledge, and confidence to implement Predictive Planning in their own environments.

If Troy’s presentation above is the theory, then Jake’s session is the application. So that makes Troy a physicist and Jake an engineer.

Lower your TCO With Oracle Planning and Budgeting Cloud Service (PBCS)

Shankar Viswanathan, Oracle Corporation
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Planning - Subtopic: General Planning
PBCS takes the best of breed Hyperion Planning solution to the Cloud with a compelling offering for companies looking to lower TCO by deploying their applications on the Cloud using the top Planning solution in the market. This session will introduce participants to the upcoming Planning Cloud offering from Oracle. Participants will hear the details of the offering and get a sneak preview of Oracle's Planning on the Cloud offering.

Oracle? Again? Do you want to hear about how Oracle’s cloud service works? Would the Planning product manager be the person to tell you all about it? Why yes, you should and he is. See, this is what makes Kscope so great – we the great unwashed get to interact with the people that make the products we use.

What's New in Planning?

Shankar Viswanathan, Oracle Corporation
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: Planning - Subtopic: General Planning
The latest Release 11.1.2.3 of Oracle Hyperion Planning takes Enterprise Planning to the next level by providing some valuable agile enablers within Planning. This session will discuss and showcase some of these key features in this release, and provide several good considerations for customers to help choose this release as the go-to release for deployment and upgrade.

I have but one question – Planning ASO, is it any good? This is where I will hear about it for the first time. I hope.

Are you crying Uncle yet?

That is 11 different Planning sessions. Is that the sum total of Planning sessions at Kscope? Absolutely not. In fact there are 31 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About Hyperion EPM Foundations and Data Management at Kscope.

Be seeing you at Kscope13.


What Kscope13 sessions am I looking forward to, part three


Introduction

I’ve already covered the Essbase and Planning side of the house in parts one and two of this series. What about the foundation for these tools? You know, the data and metadata that make EPM applications, well, right, accurate, useful, etc. Without good data (and metadata) all we EPM practitioners have is a pretty design and bad numbers. Hmm, I may have written a book (or at least a chapter) about this.

Happily, ODTUG agrees with me (Or do I agree with them? Whatever) and they have an EPM Foundations and Data Management track. This is the third in the series of sessions I am looking forward to, and if history and culture are any guide: there is the Rule of three, the Page Cavanaugh Trio’s version of The Three Bears, and perhaps most famously, “All Gaul is divided into three parts”. In other words, three is an important number. And so is this track.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that the SQL session I am presenting is at least worth considering. I will cover that later in the week – this post is for everyone not named Cameron.

EPM Foundation Data Management sessions (stolen right off of Kscope13.com) with my comments

Integrating PeopleSoft with Planning -- How Amerigroup Replaced HAL with ERPi & FDM

Roger Balducci, Amerigroup
When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: EPM Foundations & Data Management - Subtopic: FDM
In this session discover how Amerigroup replaced a black box HAL process with FDM & ERPi to load their Planning application. During this session the presenter will review the decision to use ERPi in conjunction with FDM to enable drill through to PeopleSoft. The session will highlight the automation that provides flexibility to process data for the entire company or a single business unit. Finally the session will demo the drill-through capabilities that ERPi provides - not only to the ledger but also to the subledger.

A project that replaces HAL? Death to HAL, I say, death to HAL. That product caused me grief, pain, and psychic discomfort every time I brushed up against its mediocre spaghetti diagrams. Yes, yes, I know, it has its defenders, but they’re wrong. Proof? Come see this presentation. You’ll feel clean afterwards, like after a mountain hike whilst eating a York Peppermint Patty. Or am I confusing that with Irish Spring soap and cheesy faux-Irish dialogue? Anyway, see how HAL got the coup de grace. And cheer.

Stump the Experts - Hyperion EPM Panel

Natalie Delemar, Ernst & Young
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
TBD

Intriguing content there, yes? :) I have no idea who is to be on this panel but Kscope always does these right with a good mix of freewheeling questions and lots of opinion. You know, the things consultants are afraid to say to their clients lest they be bounced out on their noggins. Ouch. But no clients (other than the punters in the seats) in this. I am looking forward to it.

ODI - Tips and Tricks to Build Better Integrations

Matthias Heilos, MindStream Analytics
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: EPM Foundations & Data Management - Subtopic: ODI
Oracle Data Integrator (ODI) is a powerful data integration suite which allows you to build integration processes with enormous productivity gains over conventional tools of the same breed. In this session you will gain insights into how ODI works and what you should consider to build master-class integrations. Learn about tricks and tips on architecture, Knowledge Module optimization, migration, flexible load processes, and many other areas that your organization should be aware of when working with ODI.

Matthias knows ODI. Really, really well. If he does a session on it, it’ll be good.

Think Outside the Box - New Ideas in Financial Data Management

Matthias Heilos, MindStream Analytics
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
Are you wondering if you could manage your (financial) data more efficiently? Often, the answer is yes. In this session you will see how other organizations found unusual ways to improve their financial processes. Looking at the bigger picture often allows discovery of new solutions to either automate more effectively, increase transparency, or improve your ability to adapt to change faster. Join this session to learn about unconventional ways to use Hyperion products and OBIEE.

See the above on my opinion on Matthias’ knowledge level and presentation skills. Also, I get sort of obsessed about data, so this ought to be really interesting.

How to Turn New Recruits into Oracle EPM Support Gurus

When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Foundations & Data Management - Subtopic: Infrastructure & Essbase Exalytics
Oracle EPM requires a knowledgeable team to provide production support due to its criticality as a service. Typically skill levels vary in the team as resources are pulled from other areas or are required to support multiple services. Consequently, the need for infrastructure training is a recurring theme in an organization. This presentation covers how to explain Hyperion and its architecture in a way to fully engage new support staff. It includes getting started with EPM modules, logs, and troubleshooting.

This is an interesting session and more of an Organizational Psychology topic than a technical one – I find these fascinating. Some of my clients understand how to do this and some…do not. It’s not easy finding the right person or persons, and as the EPM stack becomes more and more sophisticated and complicated, the profile of the right EPM administrator has changed. And a bad admin = a bad system (yes, I have all too painfully experienced this), so this ought to be an informative session.

How Windstream Leverages Essbase Analytic Link to Increase Their Analytic Capabilities

Alexander Ladd, MindStream Analytics
Co-presenter(s): Jennifer Moline, Windstream Corporation
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
Windstream Corporation utilizes Essbase and Essbase Analytic Link to unlock analytic value from their HFM application. This presentation will detail how Windstream implemented Essbase and Essbase Analytic Link with drill through to transactional detail via FDM. See the architecture, the data flows, and how this environment was built, and hear the lessons learned about Essbase Analytic Link.

Once upon a time at a mildly disagreeable client, I worked on an HFM to Extended Analytics project. Which was somewhat amusing as I could and still can barely spell H-F-M, but it wasn’t my choice and I met some great people along the way; nothing builds teamwork like adversity. In any case, this was back in the System 9 (remember that?) days and the link was…crude. Wouldn’t it have been great if there was a HyperRoll module that pulled data out of HFM in real time and then pushed it to BSO Essbase as a quasi-transparent partition via a CDF? Why yes, it would have, and I wish it existed back then. And now it does, so come see how it works. Although, if I had had this, would I have made those friends? One of life’s imponderables.

Exalytics - An Apples to Apples Comparison

Co-presenter(s): Cameron Lackpour, John Booth, Tim German
When: Jun 25, 2013, Session 6, 8:30 am - 9:30 am
Topic: EPM Foundations & Data Management - Subtopic: Infrastructure & Essbase Exalytics
This session will be a panel discussion highlighting the results of our apples to apples test comparing an Exalytics-based solution to a comparable machine in the Amazon Cloud. These tests encompassed ASO and BSO; they covered data loads, BSO calculation, and ASO Aggregation; and finally multi-user performance tests of BSO Planning Rules and ASO Queries. Given the breadth of this testing some of the results are applicable to non-Exalytics solutions (assuming you have the "lots" of CPU and or Memory).

Okay, full disclosure here – I am involved in this one but I am but a supporting player. This is a really interesting session. And yes, I broke my own rule but if I can’t do that on my own blog, where can I?

FDM to ERPi - Upgrade/Migration Strategies & Considerations

Anthony Scalese, Edgewater Ranzal
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: EPM Foundations & Data Management - Subtopic: FDM
Not the fish, anything but the fish! The FDM product is nearing the end of its life. This session will introduce you to FDM 2.0, aka ERP Integrator (ERPi). The session will begin with a technology overview of the new product - architecturally and functionally. The session will continue on to explore key features/changes from FDM. The session will explore strategies, techniques, and key watch-outs for migration from your existing FDM application. Finally the session will discuss best/leading practices for ERPi implementation and go-forward maintenance.

The fish, the fish, oh the humanity! Hmm, something about that doesn’t make sense. Anyway, FDM has always struck me as somewhat old fashioned. And we all know that ODI is all kinds of awesome. And now we see FDM replaced with…ODI in a wrapper. This ought to be interesting. And I’m glad I never learnt how to be an FDM consultant. :)

The New and Improved FDM -- Financial Data Quality Management Enterprise Edition 11.1.2.3

Richard Wilkie, Oracle Corporation
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: EPM Foundations & Data Management - Subtopic: FDM
The FDM EE 11.1.2.3 release combines the deep functional flows of classic FDM with the deeply integrated technical aspects of ERP Integrator. This new solution allows customers to build deep integrations directly against popular ERPs like E-Business Suite, PeopleSoft, and SAP while taking advantage of the functional workflow required in any end user driven data quality process. This session will deep dive into the changes that were made, how they benefit new and existing customers, and typical use cases across the EPM product family.

Whoops, there I go slagging off Oracle’s (well, Hyperion’s which should actually be Upstream’s) fine products and yet I suggest that you attend an Oracle session on FDM EE. If you want to know where the product is going, and what it’s all about, I can’t think of a better person to listen to.

Are you crying Uncle yet?

That is nine, count ‘em nine (you will note that this is divisible by three, I stick to my themes come Hell or high water), different EPM Foundation and Data Management sessions. Is that the sum total of these sessions at Kscope? Absolutely not. In fact there are 17 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About EPM Reporting at Kscope.

Be seeing you at Kscope13.

What Kscope13 sessions am I looking forward to, part four



Introduction


I’ve covered the Essbase, Planning, and EPM Foundations and Data Management side of the house in parts one, two and three of this series. Those subjects are all about getting data and metadata into the EPM world. What about reporting it out? That is sort of the point, right?
 
ODTUG realizes that and created a reporting track just for this very purpose. And that’s a good thing as the EPM reporting options and solutions have gotten more and more complicated and sophisticated over time. When I started out, an Essbase report meant an Essbase report script. We’ve come a long way. The best way to make sense of all of that is to take a look at some of these sessions.

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

EPM Reporting sessions (stolen right off of Kscope13.com) with my comments

The Art of Ad-Hoc Analysis with Essbase

Joe Aultman, AutoTrader.com
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: EPM Reporting - Subtopic: No Subtopic
"Where is this number coming from?" "Why doesn't this look right?" "How do I explain these variances?" "Can you show me this data in a different way?" These are all questions our managers ask, and we are expected to answer with Essbase. Yet sometimes, staring at an empty spreadsheet or at a report someone else created, we may not know what to do next. Watching someone else do analysis may leave our heads spinning. Even if things aren't quite that bad, we may feel we're taking the long route to our answers, or that there might be a technique we're missing from our repertoire. This session covers the basic methods of ad-hoc analysis for different modes of exploration. The session will discuss the various options, when to use them, and when not to. The session will talk about what kinds of questions lead to what kinds of analysis modes. The session will review techniques for efficiency and lesser-known features. While the session is aimed at the less experienced, there's a good chance even experts will hear something they didn't know. Smart View and Classic Add-In will also be covered.

Old school, but still what makes Essbase great. Are you getting all that you could out of Essbase? Also, Joe is a great speaker.

The Top Five Things You Should Know When Migrating from an Old BI Technology to OBIEE

Michael Bender, Performance Architects
Co-presenter(s): John McGale, Performance Architects, Inc.
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: EPM Reporting - Subtopic: No Subtopic
Upgrades and migrations are the perfect time to not only evaluate the latest and greatest technology features, but also to review your organization's business intelligence processes to make sure that you are working effectively and efficiently. The session will review how to "sell" your leadership on why to migrate; thoughts on best practices in upgrades; the difference between a "migration" and an "upgrade"; important items to note as you're thinking about this major change for your organization; migrating existing Oracle Hyperion Intelligence or Discoverer reports...or content from another vendor's BI solution...to OBIEE or BI Foundation; and much more!
 
Interactive Reporting is dead, long live OBIEE! Time to switch, folks, if you haven't already. And if you haven't, this would be a great session to attend.

Best Practice Methodologies with Oracle/Hyperion Financial Reporting

Joshua Forrest, Abercrombie & Fitch
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: EPM Reporting - Subtopic: Financial Reporting
Learn how developing financial reports with best practice methodologies results in successful implementations and efficient solutions. This session will utilize Oracle/Hyperion Financial Reports to demonstrate time-saving features including Row & Column Templates, Saved Objects, and Rolling Year Reports. Attendees will realize the benefit to using Annotations for commenting on results and related content to provide additional detail. Efficiency accelerators aimed at Reporting Standards and Change Impact will be covered and distributed. Attendees will leave this session with collateral and concepts that can be quickly applied into their environment.
 
I have seen good FR reporting approaches, and bad ones. The bad ones suck. A lot. If you have any doubts about what you are doing with FR, or want to improve your reporting solution, you should be here.

Smart View Query Tools: So Many Tools, So Little Time

Craig Mesenbring, Harbinger Consulting Group
When: Jun 24, 2013, Session 1, 8:30 am - 9:30 am
Topic: EPM Reporting - Subtopic: Smart View
Stuck in the same old rut in how you create Essbase reports in Smart View? Did you know that there are six different Smart View tools for querying Essbase data? In one short session, learn how, when, and why to use each tool. Get out of the rut and expand your horizons with all of the Smart View query tools.
 
Smart View has a lot of functionality beyond an ad-hoc query. Most of us (including me) don’t take advantage of said functionality. So let’s show up at Craig’s session and move on.

Leveraging Office and Smart View to Create a Data Entry Experience

Matt Milella, Oracle Corporation
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: EPM Reporting - Subtopic: Smart View
This session will take you through the steps necessary for creating advanced input templates for distribution to end users. The templates will include data from multiple sources, and the examples shown will leverage new features in Office 2010 and 2013 along with Excel features that have been around for many versions (charting, functions, sparklines, outlines, and formatting). It will touch on the use of VBA to automate common tasks, and we will go through some of the challenges of template distribution and maintenance.
 
Matt is an excellent speaker and of course as Product Manager knows Smart View really well. Highly recommended.

Smart View's New Features

Matt Milella, Oracle Corporation
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Reporting - Subtopic: Smart View
In this session we will get you up to speed on all the new features available in the latest version of Smart View. We will discuss general client features like sheet-level options, the re-designed function builder, and platform support like Office 2013, as well as provider-specific features like in-cell Point of View and improved Planning Smart Lists. This session will also contain details on, and show demos of, new extensions; most notably the OBI (Oracle Business Intelligence) extension.
 
What, you didn’t see my comments above? Matt is a great speaker, highly knowledgeable, and knows Smart View better than anyone. I predict (and this is a pretty safe prediction) that this will be SRO.

Customizing the Smart View Framework

Michael Nader, Blue Stone International
Co-presenter(s): Martin Slack, Ernst & Young
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: EPM Reporting - Subtopic: No Subtopic
Smart View for Office provides a dynamic framework for customizing the functionality and capabilities to suit a client's deployment. This session provides practical, deployed examples of customizing Smart View for both Planning and Essbase.

Fellow Developing Essbase Applications author Mike Nader is a great speaker and does really interesting work.

Hyperion Financial Reports: A Love and Hate Relationship
Mehmet Sevinc, University of California Berkeley
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: EPM Reporting - Subtopic: Financial Reporting
Hyperion Financial Reporting (FR) has been around for a long time. In spite of rumors, FR has been a go-to reporting tool for Profit & Loss, Balance Sheet, Cash Flow, and various other financial reports. The simplicity of the tool, the user-friendly design, and easy training of the users have made FR a very popular reporting tool in the Enterprise Performance Management field.
 
This sounds like a good basic (sorry, Mehmet, if this is super advanced; let me know and I will edit accordingly) session on FR. If you’re starting out with FR, this is the place to be.

Automating Hyperion Reporting for the Rest of Your Organization

Jim Wilking, Harbinger Consulting Group (HCG)
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: EPM Reporting - Subtopic: Smart View
Everyone realizes that Hyperion users should be utilizing the built-in reporting tools, but how does the rest of your organization get their reports? Not everyone in your organization has Hyperion access. Learn from a Hyperion certified consultant how to harness the built-in power of Smart View VBA Functions to generate reports. This session will focus on how to develop and deliver reporting to the non-Hyperion user community in your organization utilizing your existing tools. Live examples and code will be used to explore solutions to many of the common reporting scenarios. Specific examples will display the benefit of the Smart View 11.1.2.3 multiple grid reporting and butterfly reporting functionality. This session will help you shave hours off of your daily, monthly, and quarterly reporting tasks.
 
More cool stuff with 11.1.2.3 with cool code. Doesn’t that sound cool? It does to me.

Are you crying Uncle yet?

That is nine, count ‘em nine, different EPM Reporting sessions. Is that the sum total of these sessions at Kscope? Absolutely not. In fact there are 14 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About Other Interesting Topics at Kscope13.

Be seeing you at Kscope13.


What Kscope13 sessions am I looking forward to, part five


Introduction

I’ve covered the Essbase, Planning, EPM Foundations and Data Management and EPM Reporting side of the conference – but what about the rest of Kscope13? Is there anything else at the conference worth attending other than this EPM love fest I have described?

Kscope13 doesn’t cover every single tool that Oracle offers (although I can see ODTUG’s president, Monty Latiolais, doing his usual and profoundly bone-chilling Dr. Evil impersonation as he discusses this very topic at the next ODTUG board meeting – and yes, that is a tough act to pull off for a 6’5” Texan with a full head of hair, but it is true, ODTUG board meetings do discuss the utility and best application of sharks with laser beams in our bid for world domination), but it does speak to an awful lot of them.

What do I mean? Other than EPM, there are the following tracks: Application Express, ADF and Fusion Middleware, Developer’s Toolkit, The Database, .Net, Building Better Software, and Business Intelligence. Did I mention that there is also an EPM Business Content track? That is an awful lot of technological ground, so most of the Oracle world, then? Is ODTUG on its way to our planned world domination?

Whoops, before I continue, I should mention that I am friends with, or at least am acquainted with most of the people below. There are even a few hated and deeply resented greatly admired ex-bosses in the mix. Am I just shilling for them? Nope, these are good sessions. Of course you pick and decide what you want to attend – this is what I am interested in.

And yes, this list is waaaaaaaaaaaaaaaaaaayyyyyyyyyyyyyyy too long for any one person to attend. There is just Too Much Good Content. A nice dilemma to have when it comes to picking where to go.

Application Express

It looks like the Apex world is focusing on the cloud this year at Kscope. I’ve been using EPM in the cloud for almost three years and have happily made Amazon an even richer company. Why don’t my projects live in the cloud? I POC there, but what about the actual development?

It looks like Apex has made the leap to the Cloud, but for real. I can’t wait for EPM to get there.

Developing Real World Applications in the Cloud

Joseph Acero, JSA2 Solutions
Co-presenter(s): Gina Haub, South Texas Project Nuclear Operating Company
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Application Express - Subtopic: The Basics
Cloud technologies like Amazon Web Services and the new Oracle Cloud allow for rapid application development using distributed teams. This session will walk through the set up and best practices for developing in the cloud while walking through a real world case study.

Amazon Cloud Setup for APEX Environments

Martin D'Souza, ClariFit
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: Application Express - Subtopic: Infrastructure/Management/Security
Moving your database into the cloud is a popular option within organizations for development all the way through to hosting production applications. There are several large cloud service providers that offer Database as a Service solutions, Amazon being one of them. This presentation will guide you through setting up an Oracle database on the Amazon Web Services (AWS) Relational Database Service (RDS) platform and setting up web servers to host APEX environments. Other areas to be discussed will be usage for secure development, offline production calculation, other AWS features, and comparisons with other cloud service providers.

Oracle Database Cloud Update

Rick Greenwald, Oracle Corporation
When: Jun 27, 2013, Session 19, 11:00 am - 12:00 pm
Topic: Application Express - Subtopic: No Subtopic
The Oracle Database Cloud went live in 2012. This session will give an overview of the progress of the Database Cloud, including discussions on initial rollout, subsequent enhancements, customer adoption, and best practices for working with your own Database Cloud Service. In addition, the session will discuss some general direction for the Database Cloud, as well as act as a forum for your ideas for this Cloud platform.

ADF and Fusion Middleware
ADF is here for us EPM geeks. Some call that good, others not so much. If you want to understand why 11.1.2.x of Planning or HFM or whatever looks the way it looks, this is the track to follow.

Also, I see Most Excellent Debra Lilley but where oh where oh where is Really Quite Smelly Stanley? I do hope he makes an appearance. :)

Oracle's Roadmap to a Simple, Modern User Experience in Oracle Fusion Applications

Jeremy Ashley, Oracle Corporation
When: Jun 27, 2013, Session 19, 11:00 am - 12:00 pm
Topic: ADF and Fusion Development - Subtopic: Customizing Fusion Apps
Simplify your user experience. Lower implementation costs. Increase productivity. Delight your users. Are you looking to wow your employees with a user interface that is simple, modern, and compelling? Learn how Oracle's drive toward enhancing productivity helps you achieve value from your application's investment. This session will show you how you can exceed your employees' desire for enterprise data, delivered on any device, and then explain how to reduce the cost of your user interface customizations, configurations, and extensions.

Mobile Development: A Practical Example of Design Patterns in Action

Susan Duncan, Oracle Corporation
Co-presenter(s): Debra Lilley, Fujitsu
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: ADF and Fusion Development - Subtopic: Advanced ADF: Mobile, Cloud, Web Services, etc
Oracle's User Experience and ADF teams have worked together to produce a set of mobile design patterns that allow the development of intuitive, easy, and productive to use mobile applications. The patterns range from how to design your navigation, to list layout, to editing a business object, to how to invoke actions that yield a simple and apparent way to complete a task. These patterns work well across platforms (e.g. iOS, Android, and BlackBerry) and are supported by ADF. The patterns have been vetted in Oracle's own mobile products (e.g., Sales, Time Entry, Expenses, Field Service), and work across different user roles and product lines. In this session you'll see a practical example of building a mobile application using these scientifically proven UX design patterns to solve a real customer use case: a conference feedback application. Using ADF Mobile one hybrid mobile application will be developed for deployment both as an iOS and Android mobile app. The session will include a discussion on how the design patterns were used in approaching the problem by both the customer and ADF and UX teams and how this same approach is used for Oracle Fusion Applications. It will be jointly presented by Susan Duncan who leads Oracle's Mobile Development Program office and Debra Lilley, Oracle Customer/ACE Director instrumental in the Fusion User Experience Advocates Program: both active in mobile application design and development in their respective roles.

Developer’s Toolkit

In my dreams, and only my dreams, I am a Data Warrior. Oooh, that is such a good title. Essbase Hacker really doesn’t have the same ring.

Good data is the foundation of all of our systems. Well, okay, bad data is the foundation of some of our systems, but they aren’t likely long for this world, are they?

I almost wish this track was at another conference so I could simply sit in on all of the sessions and not worry about all of the other tracks I want/need to attend. It is simply that good, from features to the guts of the tools to optimizing to organizational psychology. It’s amazing content.

One last comment – this track has the most creative session names – I simply had to include the one on the 1980s and the other on successful dating.

Top Ten Cool Features in Oracle SQL Developer Data Modeler

Kent Graziano, Data Warrior
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: Developer's Toolkit - Subtopic: No Subtopic
Oracle SQL Developer Data Modeler (SDDM) has been around for a few years now and is up to version 3.x. It really is an industrial-strength data modeling tool that can be used for any data modeling task you need to tackle. Over the years, the presenter has found quite a few features and utilities in the tool that he relies on to make him more efficient (and agile) in developing his models. This presentation will demonstrate at least ten of these features, tips, and tricks for you. He will walk through things like installing the reporting repository, building a custom report on the repository using Oracle SQL Developer, modifying the delivered reporting templates, how (and when) to use the abbreviations utility, how to create and apply object naming templates, how to use a table template and transformation script to add audit columns to every table, how to add custom design rules for model quality checks (heck how to use the built-in quality checks), and several other cool things you might not know are there. Since there will likely be patches and new releases before the conference, there is a good chance there will be some new things for the presenter to show you as well. This might be a bit of a whirlwind demo, so get SDDM installed on your device and bring it to the session so you can follow along.

What? You're Still Not Using Groovy?

David Schleis, Wisconsin State Laboratory of Hygiene
Co-presenter(s): Joe Aultman
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Developer's Toolkit - Subtopic: Languages
If you spend time writing Java code, and you're not using Groovy, you're spending too much time writing code. If you've ever pondered an Essbase automation problem and said, "I wish I knew how to write Java," and you haven't looked into Groovy, your answer is here. Groovy is an object-oriented dynamic language (also referred to as a scripting language) like Ruby or PHP. Like these languages, Groovy is much easier to use and has a simpler syntax than Java. However, what makes Groovy different than other scripting languages is that it compiles to Java bytecode. This means that Java programs can run Groovy, and Groovy programs can run Java. This seamless integration with Java and its concise syntax are why Groovy is the language of choice for scripting of ADF Business Components. This integration also means that writing in Groovy makes it easier to use existing Java libraries; including the Java libraries of the Essbase Java API. This session is an introduction to the Groovy programming language and how it can be used in conjunction with the Essbase JAPI to make advanced automation more accessible.

The 80's Called, They Want Their Command Line Interface Back

Jeff Smith, Oracle Corporation
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Developer's Toolkit - Subtopic: IDEs
I only use SQL*Plus. I say that graphical IDEs are the best. Who is right? How can an old-school database pro be convinced to use newer technology, and more importantly, why SHOULD they be convinced? Tools are designed to do one thing - increase productivity. If your tool is slowing you down, you're doing it wrong, or you're using the wrong tool. Watch Oracle's SQL Developer product manager debate himself on why SQL Developer can be good for both the new and advanced Oracle user.

An Oracle Geek's Guide to Successful Dating

Sean Stuber, American Electric Power
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: Developer's Toolkit - Subtopic: Languages
This session will be a short examination of Oracle's date/time datatypes and best practices for manipulating them in SQL and PL/SQL.

The Database

Sometimes I wish this blog was titled Cameron’s Blog for SQL Hackers. My current project has turned into one big festival of SQL. If only I weren’t like a three-year-old playing with the stuff. Maybe, just maybe, this track could help me.

Optimizer Hints: Top Tips for Understanding and Using Them

Maria Colgan, Oracle Corporation
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: The Database - Subtopic: Tuning
The most powerful way to alter an execution plan is via hints; but knowing when and how to use hints correctly is somewhat of a dark art. This session explains in detail how Optimizer hints are interpreted, when they should be used, and why they sometimes appear to be ignored. By attending this session you will arm yourself with the knowledge of how to apply the right hints, at the right time.

Oracle Optimizer: An Insider's View of How it Works

Maria Colgan, Oracle Corporation
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: The Database - Subtopic: DBA
With each new release the Optimizer evolves as we strive to find the optimal execution plan for every SQL statement. Understanding how the Optimizer operates and what influences its choices helps you provide the necessary information to make that nirvana a reality. This session explains in detail, how the latest version of the Optimizer works and the best ways you can influence its decisions.

Exadata and the Optimizer

Maria Colgan, Oracle Corporation
When: Jun 27, 2013, Session 17, 8:30 am - 9:30 am
Topic: The Database - Subtopic: Tuning
Knowing when and how to take advantage of each of Exadata's performance enhancing features can be a daunting task even for the Oracle Optimizer, whose goal has always been to find the optimal execution plan for every SQL statement. This session explains in detail how the Oracle Optimizer costing model has been impacted by the introduction of the performance-enhancing features of the Exadata platform. It will show through the use of real-world examples what you can do to ensure the Optimizer fully understands the capabilities of the platform it is running on without having to mess with initialization parameters or Optimizer hints.

Bye-bye CONNECT BY - Using the New Recursive SQL Syntax

Dominic Delmolino, Agilex Technologies
When: Jun 25, 2013, Session 10, 3:30 pm - 4:30 pm
Topic: The Database - Subtopic: SQL
Hierarchical queries in Oracle have always been a challenge, even for advanced SQL practitioners, with Oracle-specific SQL language elements which are not part of the SQL standards. Beyond the SQL-92 standard, the ANSI SQL:1999 standard added the definition of a recursive query which has now been adopted by Oracle. This presentation will talk about how to translate common CONNECT BY statements and use the new construct to solve more esoteric problems. The attendee will benefit by using this new standard, portable construct for hierarchical queries in Oracle and other databases.
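The translation this session describes is easy to sketch. Oracle's CONNECT BY is proprietary, while the ANSI SQL:1999 recursive form is portable, which is exactly the session's point. Here is a minimal, hedged illustration run through SQLite via Python (chosen only because it ships everywhere and supports the standard syntax); the toy emp table and its column names are my own example, not taken from the session:

```python
import sqlite3

# A toy employee hierarchy: mgr holds the boss's empno (NULL = top of tree).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp (empno INTEGER, ename TEXT, mgr INTEGER);
    INSERT INTO emp VALUES (1, 'KING', NULL),
                           (2, 'BLAKE', 1),
                           (3, 'ALLEN', 2);
""")

# The Oracle-only version the session says goodbye to:
#   SELECT ename, LEVEL FROM emp
#   START WITH mgr IS NULL
#   CONNECT BY PRIOR empno = mgr;
#
# The portable ANSI SQL:1999 recursive equivalent:
rows = conn.execute("""
    WITH RECURSIVE chain (empno, ename, lvl) AS (
        SELECT empno, ename, 1 FROM emp WHERE mgr IS NULL  -- START WITH
        UNION ALL
        SELECT e.empno, e.ename, c.lvl + 1                 -- CONNECT BY PRIOR
        FROM emp e JOIN chain c ON e.mgr = c.empno
    )
    SELECT ename, lvl FROM chain ORDER BY lvl
""").fetchall()
print(rows)  # -> [('KING', 1), ('BLAKE', 2), ('ALLEN', 3)]
```

The anchor member of the CTE plays the role of START WITH, and the recursive member's join of each employee to his or her manager's row is the PRIOR clause in standard clothing.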

Oracle Database Tools 101: How Does All This Stuff Get Built Anyway?

John King, King Training Resources
When: Jun 25, 2013, Crossover Sessions, 5:30 pm - 6:30 pm
Topic: The Database - Subtopic: No Subtopic
If you've been an Essbase/Hyperion, Applications, or BI user you may wonder what all the "hubbub" on the other side of Kscope is all about. Or maybe you're curious -- "I know there's a database under the covers and lots of developers; what do they do?" If you want to know about the underpinnings of your favorite Oracle software, this session is for you. We'll talk about how it all fits together: database, SQL, PL/SQL, ADF, Forms, APEX, and more (without too many boring details)! Attending this session will improve your understanding of and ability to communicate with the "bit-twiddlers" in your organization.

Tom's Top Twelve Things About the Latest Generation of Database Technology

Thomas Kyte, Oracle Corporation
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: The Database - Subtopic: No Subtopic
The session will be taking a look at the latest generation of database technology and zeroing in on twelve high-impact capabilities, looking at what they are and why they are relevant.

WIT Session: "The Imposter Syndrome- When Successful Women Feel Like Frauds"

When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: The Database - Subtopic: No Subtopic
What is Imposter (aka Fraud) Syndrome and why do so many people feel this way? Georgia State University psychologists Pauline Rose Clance and Sue Imes penned the term "Imposter Syndrome" back in 1978 when referring to those who were susceptible to feeling that they were frauds or imposters, no matter how skilled or successful they were. Both experts recognized that only about 20% of the cases involved men and started to investigate how much was due to nature vs. nurture and culture.

.Net

This is so interesting – a Microsoft tool track at an Oracle tools conference. Shall the Lion and the Lamb lie down together?

Getting Started with Oracle and .NET

Christian Shay, Oracle Corporation
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: .NET - Subtopic: .Net
This beginner-level session introduces Oracle's offerings for .NET programmers, including Oracle Data Provider for .NET (ODP.NET), Oracle Developer Tools for Visual Studio, Oracle Providers for ASP.NET, and .NET stored procedures. Step-by step-demos illustrate how to get started with developing Oracle Database .NET applications by using each of these free products. New and upcoming .NET features, including fully managed ODP.NET, Microsoft Entity Framework features, Microsoft Visual Studio 2012 support, and schema compare tools are also described briefly in the session.

PL/SQL Programming for .NET Developers: Tips, Tricks, and Debugging

Christian Shay, Oracle Corporation
When: Jun 26, 2013, Session 15, 3:00 pm - 4:00 pm
Topic: .NET - Subtopic: .Net
.NET and Oracle programmers frequently work with PL/SQL, whether that means setting up a call to a stored procedure from a .NET program, executing a PL/SQL anonymous block, or writing and debugging the PL/SQL stored procedure code. In this session, we'll look at leveraging PL/SQL from the point of view of a .NET developer and will provide in-depth tips about how to configure and use the tightly integrated PL/SQL debugger in Visual Studio. We will also introduce the new Visual Studio Schema Compare tool and show how this new feature, along with automatic SQL script generation and source control integration assists in the Oracle database development lifecycle.

Building Better Software

Sigh, I should be attending this track as well. I have been on good projects, and I have been on bad ones – happily, more of the former than the latter, but we all get to serve our time in hell. I have noticed that the bad projects are almost always a case of bad planning and design. Wouldn’t it be nice to not go down that path? This track shows you how to do just that.

Five Ways to Make Data Modeling Fun

Kent Graziano, Data Warrior
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: Building Better Software - Subtopic: Modeling
Most people think data modeling is booooorrring, right? While data architects the world over all agree that data modeling is a critical success factor to any well-engineered database or data warehouse, many struggle with how to get their organizations to support their efforts. What if you could make data modeling sessions more engaging for the business folks? The end result would be better data models. Using some common games and concepts, this session will show you how to make data modeling fun. This will be a very interactive session complete with audience participation and maybe some prizes!

Performance is a Feature: Here is the Specification

Cary Millsap, Method R Corporation
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: Building Better Software - Subtopic: Instrumentation
To many software developers, designers, and architects, "performance" is a side-effect...an afterthought of designing and building proper features like "book an order" or "look up a book by author." But great performance at scale doesn't happen by accident. The first step is to know what performance *is*: it is the answer to the question, "What have people been *experiencing*?" Knowing what people experience when they use your software is possible only if you treat performance as a proper feature, a feature you analyze, design, build, test, and maintain. This session explains the steps that will get you started.

EPM Business Content

Business Content isn’t really my thing – except, of course, that I don’t exactly build systems for laboratories; I build them for businesses. So actually, Business Content is my thing, or at least something I need to care about. Will my brain hold this much information? I may be reaching the tipping point.

What's the UPK?

Opal Alapat, TopDown Consulting
When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: EPM Business Content - Subtopic: Case Studies
Oracle User Productivity Kit (UPK) allows companies to develop, deploy, and maintain content for training and testing. It can help mitigate project risk, reduce deployment and project timelines, and assist with end-user adoption. This presentation will review the basics of UPK, different use cases for implementing it, and how to leverage it for testing and training purposes. In addition, this presentation will include tips and tricks for getting started and will highlight both the administrator and user perspectives.

The Evolution in Forecasting: Hyperion Planning 11.1.2.3

Tracy McMullen, interRel Consulting
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Business Content - Subtopic: Product Demos
This session will highlight new features available in Planning 11.1.2 including a number of enhancements like approvals, full Smart View functionality, ad hoc analysis over the web in a data form, built-in data syncing to an ASO database, and Public Sector Planning. 11.1.2.3 goes even further by introducing a new module to do project planning, integrating Crystal Ball for predictive planning, and, impressively, adding charts and graphs to composite forms to make Planning into more of a dashboard experience.

Business Intelligence
BI and EPM – yes, they are finally coming together. It still isn’t perfect, but then again, what is? Take a look at the sessions below – if you are interested in EPM or BI, these sessions alone could justify the cost of Kscope. Arrrgh, I need to be cloned so my army of Camerons (now really, that is a scary thought, and in theory I am their leader as Ur-Cameron) can attend all of this cool stuff.

I have to also say, when I look at these presentations, I think, “Geez, why am I not doing this stuff?” Indeed, why aren’t I? Maybe I should be a BI geek and not an EPM geek. It would probably be a better intellectual fit. Maybe.

Innovations in BI: Oracle Business Intelligence Against Essbase & Relational Part 1

Stewart Bryson, Rittman Mead
Co-presenter(s): Edward Roske, interRel Consulting
When: Jun 27, 2013, Session 18, 9:45 am - 10:45 am
Topic: Business Intelligence - Subtopic: OBIEE
In OBIEE (Oracle Business Intelligence Enterprise Edition), you can create models against multiple disparate sources that pull metadata and facts from relational databases and multi-dimensional sources. A particularly powerful combination is to use Essbase for pre-consolidated cube data with an Oracle database alongside for transactional information. This session will utilize the power of both sources to build on the strengths of each. Join Oracle ACEs, published authors, and presumed experts Stewart Bryson and Edward Roske as they demonstrate the fun of metadata development against a sample Essbase database sourced from an Oracle database. Attendees will leave the session knowing how to model complex Essbase options and integrate those with relational sources like the Oracle database.

Innovations in BI: Oracle Business Intelligence Against Essbase & Relational Part 2

Stewart Bryson, Rittman Mead
Co-presenter(s): Edward Roske, interRel
When: Jun 27, 2013, Session 19, 11:00 am - 12:00 pm
Topic: Business Intelligence - Subtopic: OBIEE
In OBIEE (Oracle Business Intelligence Enterprise Edition), you can create models against multiple disparate sources that pull metadata and facts from relational databases and multi-dimensional sources. A particularly powerful combination is to use Essbase for pre-consolidated cube data with an Oracle database alongside for transactional information. This session will utilize the power of both sources to build on the strengths of each. Join Oracle ACEs, published authors, and presumed experts Stewart Bryson and Edward Roske as they demonstrate the fun of metadata development against a sample Essbase database sourced from an Oracle database. Attendees will leave the session knowing how to model complex Essbase options and integrate those with relational sources like the Oracle database.

Using OBIEE and Data Vault to Virtualize Your BI Environment: An Agile Approach

Kent Graziano, Data Warrior
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Business Intelligence - Subtopic: OBIEE
Interview the users, design a reporting model, and follow up with mounds of ETL development, keeping the user community in the dark during that development. Sound familiar? This presentation will demonstrate an alternative approach using the Data Vault Data Modeling technique to build a "Foundation" layer in our data warehouse with an Agile methodology. Using the Business Model and Mapping (BMM) functionality of OBIEE, we can virtualize a dimensional model using the Data Vault Foundation layer to decrease the time it takes to get BI content in front of users. Attendees will see a sample Data Vault model designed iteratively and deployed to the semantic model of OBIEE.

Fusion Applications and Your BI/EPM Investment

Debra Lilley, UKOUG / Fujitsu
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: Business Intelligence - Subtopic: No Subtopic
Oracle Fusion Applications are here today, providing the next generation of applications. They are about having everything the user needs in one place, and that includes information. Fusion Applications is a window on Oracle's Fusion Middleware stack, and a very big part of that is BI/EPM and analytics. This presentation will include a small demo of how Fusion looks and is designed to give you an appreciation of how BI/EPM is embedded in Fusion. For anyone thinking of Fusion in the future, it will underline that your BI/EPM investment today is an investment in that future and is protected.

Oracle BI Applications 11g with ODI: What You Need to Know

Kevin McGinley, Accenture
When: Jun 24, 2013, Session 1, 8:30 am - 9:30 am
Topic: Business Intelligence - Subtopic: OBIEE
Oracle BI Applications 11.1.1.7.1 PS1 was recently released, adding full support for ODI to the data integration side of BI Applications. This presentation will give details about the new release, including comparisons to previous releases along with demos of how to enable an out-of-the-box ETL run using the new features of ODI, Configuration Manager, and Functional Setup Manager.

Best Practices with Oracle Data Integrator : Flexibility

Gurcan Orhan, Rittman Mead
When: Jun 25, 2013, Session 6, 8:30 am - 9:30 am
Topic: Business Intelligence - Subtopic: No Subtopic
Oracle Data Integrator (ODI) seems slow when it is installed out-of-the-box. Since it has to comply with different versions of the databases and operating systems, the default installation is not the optimal choice. ODI is a flexible product that can be customized for specific requirements and to implement new features of the database or operating systems. Attendees will learn how to easily create a customized ODI environment. This presentation will demonstrate the flexibility of the Knowledge Module, configuration best practices and the best query response time tips and techniques. It will include information about how to load an extensive number of files quickly with a special algorithm, as well as how to define new or customized data types and analytical functions.

GoldenGate and ODI - A Perfect Match for Real-Time Data Warehousing

Michael Rainey, Rittman Mead
When: Jun 25, 2013, Session 9, 2:00 pm - 3:00 pm
Topic: Business Intelligence - Subtopic: Related BI Technologies
Oracle Data Integrator and Oracle GoldenGate excel as standalone products, but paired together they are the perfect match for real-time data warehousing. Following Oracle's Next Generation Reference Data Warehouse Architecture, this discussion will provide best practices on how to configure, implement, and process data in real-time using ODI and GoldenGate. Attendees will see common real-time challenges solved, including parent-child relationships within micro-batch ETL.

OBIEE and Essbase Integration in BI Foundation Suite 11.1.1.7 - Workspace Returns!

Mark Rittman, Rittman Mead
When: Jun 26, 2013, Session 12, 9:45 am - 10:45 am
Topic: Business Intelligence - Subtopic: OBIEE
The 11.1.1.7 release of Oracle BI EE incorporates Essbase into the product install, and provides a combined OBIEE/Essbase platform for high-performance BI and Analysis, optionally hosted on the Exalytics platform. In this session we'll look at this combined architecture, see what role Essbase plays within it and whether Shared Services is still needed, see how Workspace integration with OBIEE is now restored, and see how SmartView provides MS Office integration across both tools.

Using OBIEE to Retrieve Essbase Data: The Seven Steps You Won't Find Written Down

Edward Roske, interRel Consulting
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: Business Intelligence - Subtopic: OBIEE
If you've ever tried to find information on accessing Essbase from OBIEE, you'll be scouring badly written blogs for days, because there just isn't much published on this. This session will cover the seven poorly documented steps you must do to make sure your Essbase cube isn't flattened, it's in the correct outline order, aliases appear, and more. If you own Essbase & OBIEE and would like to integrate them, learn these seven steps and you too can start your own badly written blog (no offense).

Making Sense of Oracle's Business Intelligence and Analytics Offerings

Tim Vlamis, Vlamis Software Solutions, Inc.
Co-presenter(s): Dan Vlamis, Vlamis Software Solutions
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: Business Intelligence - Subtopic: OBIEE
There is no best way to build BI systems and significant trade-offs exist. The BI gurus at Vlamis will compare and contrast alternative strategies for integrating different data sources into OBI systems and use scenarios to outline best practices and to evaluate the costs and ROI. They will talk about BI Apps and Fusion Apps and analytics, OBIEE and Endeca, Essbase and OLAP, Oracle Data Mining and Oracle R Enterprise, and Exalytics, Exadata, and Exalogic, and contrast the Oracle Database Appliance with the Oracle Big Data Appliance.

Are you crying Uncle yet?
Has the content completely overwhelmed you? I know it has for me. And I also know that I have said this many times over – there is simply nothing, and I mean nothing, that can touch Kscope for breadth and depth of content. Nothing.

If preparing presentations and the planning and setup work for Kscope weren’t so time consuming (I am doing 3.5 presentations and easily have 400+ hours of effort in this, plus all of the other ODTUG things I am doing), I would want Kscope to be quarterly. Sanity and sheer physical exhaustion prevent that, but I hope this little romp through the sessions shows how much great, cannot-get-it-anywhere-else, isn’t-it-amazing-that-people-will-share-this content Kscope has. I am soooo glad ODTUG exists.

The next and final blog post on Kscope13 sessions will be The Truth About Cameron’s Interesting (?) Kscope13 sessions.

Be seeing you at Kscope13.

What Kscope13 sessions am I looking forward to, Cameron edition


Introduction

In my five previous posts I’ve covered the Essbase, Planning, EPM Foundations and Data Management, EPM Reporting, and Everything Else sessions I am looking forward to (and admitting that I am somewhat unlikely to attend as I would need to be Cameron * 5 to do so and that is a scary prospect, even to me).


But what about my sessions? Am I ever going to give you at least a hint as to what they are all about? Am I, indeed. I like to think, in my humble (ahem) way, that I in fact have some slightly useful information to impart. You may not agree, but after all, you are reading this blog. If you think my presentations stink and yet you’re still here, you must at least admit you are somewhat confused. OTOH, if you think my presentations aren’t half bad (and if they aren’t better than that, I have wasted hundreds of hours on them, which I suppose is possible), you may be interested in the following sessions.

Top Six Advanced Planning Tips

When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
I am co-presenting this with Jessica Cordova and it’s a chance for the two of us to impart some of the lessons learnt and techniques we’ve picked up over the years in Hyperion Planning-land. Jessica is the primary and I the junior on this one, and you’ll understand what that means in a moment.

A bit of a side note

Actually, “Six” in the title is something of a misnomer because in fact we are only going to present three of the tips. Ah, I hear you (and I do, really, in my mind, which is confusing because of course one hears with one’s ears) exclaim, “Wot? ‘e said six an’ now he says three. ‘e’s barmy ‘e is.” Writing dialect is tough, especially when your inner Cockney only extends as far as really liking the Lambeth Walk. I lay blame at the feet of my love for British war movies which, just like American ones, always have a mix of men from civvy street, and there’s always someone from south of the Thames. Also, their RSMs have great script writers. Wow, quite an excursion into the Weird Entertainment Cameron Likes.

And we now return to our regular programming

Anyway, Jessica and I wrote our sections, rehearsed them via GoToMeeting, and realized that we had quite a bit (a lot) more than 50-odd minutes worth of content. So our choice was to either cut sections out entirely or skim through our topics. As the whole point was to do advanced topics, we were in a bit of a bind until I realized that we could put the sections up to a vote and do the balance at an ODTUG webinar after Kscope. We are going to pinky promise on this, and we all know that means that is a promise unto death. Also, this will be an opportunity for you, the audience, to decide exactly what you want from us – it will be Kscope session democracy in action. I hope you’re willing to indulge this slightly offbeat approach as I think what we have to say is pretty good.

Monday Night Madness/Hyperion EPM Open Mic Night

When: 8:00 pm - 10:00 pm
There is an Open Mic Night at Monday’s after-session fun. Like a fool, I have volunteered to be one of the speakers in case there is a dearth of volunteers. Also like a fool, and quite true to form (so I am repeating myself), I didn’t realize that one is not required, nay, not allowed to write a presentation for this. I wrote one. Good thing I’ve got a really cool demo to go with it.


We’ll see if I do this but if I do, I have got a really awesome twist on focused aggregations in Calculation Manager/Planning except this time the products are  Calc Scripts/Dodeca.  All I’ll say is that I have a way round the big problem with focused aggregations in Calculation Manager.  It is fast, fast, fast, fast.

Exalytics - An Apples to Apples Comparison

When:  Jun 25, 2013, Session 6, 8:30 am - 9:30 am
I think this has to be the Kscope group project to end all group projects.  By that I mean that John Booth, Tim German, Dan Pressman, and I all got together on the Mother and Father of all benchmarking tests.


Thanks to the generosity and quite possibly world-record patience of Mark Rittman of the eponymously named Rittman-Mead, we have access to an Exalytics box.  What oh what oh what to do with it?  Why, benchmark it against a really fast generic Linux box (that John bought with his own money – we are committed, or nuts) of course.  I also love that these servers are named Asgard and Zeus.  Naming the servers after mythological figures was coincidental but I think indicative of their speed.  Both environments are fast and put to shame anything I have ever seen at a client.  This project encouraged me to buy a mega laptop (well, mega as of summer 2013 – 32 gigs of RAM, 8 CPUs, and an SSD); I shall never buy one with a physical drive again.


What are our results?  As is usual, we are testing this down to the very last minute (some, like me, would argue that we are testing this beyond the last minute), so I honestly cannot say.  I will bet that whatever box you’ve got, we’ve got our hands on a faster one.  :)

Practical SQL for EPM Practitioners

When:  Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
I’m terribly excited about this because I have found that I spend more and more of my time in projects using SQL to do all sorts of interesting and unusual (at least for an EPM geek) things.  It’s really opened my horizons wrt the roles I can play on a project and I find writing SQL code to be oddly therapeutic.  Yeah, I’m weird.


I’m going to use practical examples that I have used in the real world to show a bunch of techniques and approaches that I’ve found useful.  Of course, just like the Planning presentation I have more, much more, than I can possibly fit into a single session but I can temper that deficiency by blogging about it here.


As I was not all that long ago firmly in the “SQL means SELECT * FROM …” camp, it’s been quite a transformation in skills for me, and a valuable one to boot.  If you do EPM, and I don’t just mean Essbase and Planning, and yet are not yet a SQL hacker like me, I encourage you to attend.

Lunch n’ Learn

When:  25 June, 12:45 pm to 1:45 pm
I’m sharing the dais with my much-admired former boss, Tracy McMullen (I have no idea what her title is now other than Mrs. Awesome but when I worked at interRel, she was Director of Consulting) and Chris Barbieri (Chris, I admire you too, even if you hate Essbase for reasons of technological jealousy (Mr. HFM) or maybe just sheer perverse bloody mindedness); John Booth moderates.  These are always free-wheeling and open ended.  I do get quite hungry.  This time I’m bringing an energy bar.

A BSO Developer's Introduction to ASO Essbase

When:  Jun 26, 2013, Session 13, 11:15 am - 12:15 pm
I actually didn’t sign up to do this presentation but was instead asked to do it.  I’m not an ASO wizard by any means but that actually was a good thing for this presentation because the topic is in fact how to approach ASO when one is a BSO geek.  

Yes, this has been done to death at Kscope but I think I bring two unique approaches to this subject.
  1. I use Sample.Basic, the BSO database that just about everyone knows, as the subject for my conversion.  It is both harder and easier to convert than you think.  As everyone (hopefully) knows what Sample.Basic is all about wrt calcs, dimensionality, etc., I can use the familiar constructs of that database to explain a plethora of ASO topics.  Although ASO and BSO are very different technologies one can use many BSO constructs to understand ASO.
  2. I used Dan Pressman’s standout ASO chapter in Developing Essbase Applications to dive deeply into the ASO kernel and understand how to design ASO databases for performance.  This was really exciting for two reasons:
    1. I need to understand how tools work.  IOW, “I gots to know”.  Hopefully in a less cinematically violent way than Dirty Harry but I at least share that innate curiosity about How Things Work.  Have you ever seen a bitmap index?  You will, and it won’t make your head explode.  I now know how ASO works, or at least as well as anyone who doesn’t have an @oracle.com email address does.
    2. I thought that Dan’s chapter was simply amazing, but also very, very dense.  I don’t mean that as a criticism in any way – it is a difficult and highly technical subject and a complete explanation of it is therefore obliged to be just as difficult and highly technical.  I read the chapter six times from beginning to end in an effort to understand it as well as multiple phone calls with Dan.  I am happy to announce that I think I have made more accessible what I consider to be the most important work on Essbase extant.  Do I cover all of what he wrote about or even delve into it at the same depth?  No, that simply isn’t possible within the 50-odd minutes (that time thing keeps on popping up, doesn’t it?) allotted to me but I think if you’ve read Dan’s chapter and did a big “whaaaa?” you should come to my session.  With its lessons under your belt you’ll be able to tackle Dan’s work again and again.  Btw, Dan is reprising his session from last year (quite unusual in Kscope but such is the import of his work) on 25 June from 2 to 3.  Even if you go to Dan’s session I encourage you to come to mine – the bitmap, the bitmap, the bitmap is difficult to understand at first but it is the essence of ASO performance and I cover it from a beginner’s perspective.  Did I mention I was excited?  :)


Fwiw, this was a very difficult presentation for me to write because so much of it is theory – I tend to think of myself as more the engineer type who takes theory and applies it as opposed to a physicist who strictly stays on the theoretical side of things.  So a bit of a stretch but if you are a BSO developer or someone, like me, who has seen some real dogs of ASO applications and guessed that BSO design principles may not apply, come to this session.

Conclusion or my ODTUG cup runneth over

I’ve figured out that, counting the above two full presentations, plus the Planning co-presentation, plus the benchmarking presentation, plus the open mic presentation, plus two private presentations for ODTUG/Oracle, I am doing somewhere in the neighborhood of 4 ½ to 5 presentations this year. As I write all of my presentations from scratch (I do not exactly have an army of minions who do this for me) that is an incredible amount of work.  I would guess somewhere around 300 to 400 hours, and it came at the expense of just about everything that wasn’t work related; there are others (hello, Dan) on these projects who have put in even more unpaid time.  Yes, I love Kscope, but maybe I can love Kscope14 a little more by trying just a wee bit less.  We’ll see if that actually comes true.


In any case, I think I’ve got some great things in store for the next week and I will be live blogging starting Saturday with the volunteer event.  


Be seeing you round Kscope13.

Kscope13, day -1


Introduction

For the last two years at Kscope I have live-blogged during the conference; Kscope13 will be no different.  This is both a fun and difficult task/duty.  Sometimes I am really, really, really busy and cannot blog during the day – I will do my best, but no promises.  And with that slightly deflating disclaimer, off we go.

Where oh where am I?

Why, on the 31st floor of the Sheraton New Orleans.  Yes, that’s the Mississippi.  The view is pretty spectacular at night.


Volunteer day kickoff
The EPM community (Essbase only, actually) started its relationship with ODTUG at Kaleidoscope 2008.  2008 also saw the first volunteer event at the then-Kaleidoscope, and it too was at a school.  We were back at a school this year, this time the Charles Easton Charter High School, in the Mid City/Bayou St. John districts of New Orleans.


We did the usual back breaking (but welcome) work:  moving, painting, sanding, cleaning, etc.  Here we are in the cafeteria about to split up to our respective tasks.


Like a fool (I think I mentioned this in my last post), and yes, with me that is redundancy, I “knew” that indoors was hot, and outdoors was…less hot.  Hmm, I did not count on the working air conditioning, but as I ignored my way out of the work I should have done and instead went to the outdoor work, you can guess it was hot, really hot.  And what did I do?  Sanded wood to make it usable for picnic tables and benches.  Here’s William Booth hard at work.  Hot work, indeed.


Here’s a view of the painting.  We were busy, busy, busy.


You can see the lumber pile to the right – it’s just a fraction of what we sanded by hand.  


There was a bit of a spanner thrown into the works because of a rainstorm that happened in the middle of the outdoor work, but we simply covered up and jumped back outside the moment the rain stopped.


The payoff and a warning

The school was really very nice about thanking us ODTUGers (not the official name but one I like) for the hard work we did.  It really is a pleasure and one of the things that differentiates Kscope from every other conference.


Back at the hotel, as I made my way up to my room, wet, smelly, and utterly wrung out, two women asked, “What’s an Od-Tug?”  I explained that we are an Oracle software conference (I passed by the whole user conference marketing moment although perhaps I shouldn’t have) and that we always kick off with a volunteer day.  They were pretty surprised and said that it was really awesome we gave back to whatever town hosts our conference.


It is really awesome and as I wrote, it makes Kscope unlike any other conference.


And now the warning

Hey, when the NOPD tells you not to bring a gun onto school property, they really mean it.  Hey, it’s only up to five years of hard labor in a chain gang; think how well you’ll be able to hone your Cool Hand Luke impersonation.  Happily I was only armed with my rapier wit.


There’s still the Welcome Reception to come later tonight.  


I hope to be seeing you all at Kscope13.

Kscope13, day 0


Introduction

Sundays at Kscope are always Symposium day.  I understand that there are multiple other Symposiums, one for each track.  Of course, being an Essbase geek, the only thing I care about is EPM, and here I am, sitting in the EPM Symposium, listening to Oracle management talk about the latest and greatest in the EPM space.

And the content is…

Sorry, can’t tell you.  That’s the deal – come to the symposium, do NOT blog about what you hear.  Safe harbor statements abound (so they’re going to tell us the future, but reserve the right to change their collective Oracle mind), as do requests to NOT take photographs and NOT blog about what we learn.  Those are the rules.


So that means if you aren’t at Kscope13, you don’t know what is coming in the EPM space.  And I’m not going to (I cannot) tell you.  Stinks, doesn’t it?  The way to solve this is to come to Kscope14, and every year thereafter.  That’s my plan for career futures.  :)


What I can tell you is there is some very interesting news about Essbase.  It’s stuff we have all wanted for a long time.  Again, sorry if you are not at Kscope but we attendees are not allowed to tell you more about it per Oracle’s request.  So yes, a big, big tease.


And some very interesting news about Planning.  I have wanted this functionality for approximately forever, or at least since 2002 (ah, Planning 1.5, or maybe 1.1 – I no longer remember but oh my goodness you were buggy).  Alas, I again cannot tell you much of anything.  In fact nothing.

Conclusion

Are you gathering that Kscope gives you information that you cannot get anywhere else?  This is important stuff that defines the future of what we do and no other user conference delivers this information.


The brutal sadist in me sort of enjoys telling you that there is all sorts of cool stuff on offer at the Sunday symposium.  The caring nurturing inner Cameron wishes you were here.  Square the circle, bind the wound, cut the Gordian knot, for goodness’ sakes stop me from tortured metaphors and just make sure you are here next year at Kscope14 so I don’t have to keep on telling you about all the cool things I (and everyone else at Kscope13) know, and you don’t.  See, I really am the caring, nurturing sort.


Be seeing you at Kscope13 (oh, we happy few) and Kscope14.

Kscope13, day 1


Introduction

Monday is the first “real” day of the conference.  All very, very cool and for the first three sessions I am also room ambassador.  Alas, this does not mean that I shall henceforth be referred to as “Your Excellency” although I am totally okay with that form of address in future.  You decide.

And the content is…

I was the ambassador for the first three sessions.  That means I passed out evaluation slips, was a microphone monkey, and got cool pins I get to attach to my conference lanyard.

Essbase and Planning Calculation Basics

What’s worse than having an 8:30 session?  Why, it’s having Oracle Essbase development kibitz on your session and offer improvements to your content.  But it was all in the spirit of helpfulness and John Booth was able to recover.

Essbase New and Upcoming Features

Super cool stuff.  You really, really, really should have been there.  Maybe I will update with details if allowed to.  Watch this space.  


Btw, if I am not allowed to, then all I can tell you is that Gabby Rubin, Essbase Product Manager, talked about some really interesting futures for Essbase.  And oh yeah, you should have been there.

Introducing the Outline Extractor NG (Next Generation)

I was on the beta for Tim’s latest and greatest version of the tool and was very possibly the worst beta participant ever, although I did at least find one bug.  It was very exciting to see the tool finally on display for the world to see.


The Outline Extractor that Applied OLAP has maintained throughout the years has been incredibly useful.  I understand that the single largest downloader, based on domain name, is none other than Oracle, so it was perhaps no surprise to see Oracle development staff in the room.

Top Six Advanced Planning Tips

This is a session Jessica Cordova and I gave – she was the senior and I the junior.  And of course we lied – we didn’t actually do six tips, but instead three, which is sort of not what was promised.  We lied (well, we did, didn’t we?) because we had so much awesome stuff that we couldn’t possibly do all of it in a single session and instead have pinky promised to do the balance in an ODTUG webinar.  It took a lot of pressure off and we still had good content.  

Advanced Essbase Studio Tips and Tricks

This is a session given by my not-actually-related-in-any-way-and-in-fact-he-denies-the-whole-thing older brother Glenn Schwartzberg.  Glenn always does quality work and I wish oh wish oh wish I could have been there but unfortunately I got pulled away by other things.  Sorry big bro, but I’m sure I’m the worse off for not being there.

Keynote

This was really quite good.  Doc Hendley, of the Wine To Water foundation, did a fantastic job.  A very inspiring and humble man who is determined to do good in this all too imperfect world.


ODTUG also announced that Kscope14 will be in Seattle, Washington.  My parents lived there back in the early 1960s in a basement apartment that had constantly moldy walls, no matter what my mother did to clean them.  My mother, who hasn’t returned to Seattle since 1961, also said, “It was a nice little town.”  I wonder if it has grown any since then.  :)

Conclusion

As always, Kscope is great, Kscope is exhausting, Kscope is the place you ought to be if you care at all about your overall knowledge, and of course a chance to meet and greet all of your fellow Oracle geeks.


Be seeing you in New Orleans.

Kscope13, days 2, 3, and 4


Introduction

Oh dear, I am rather afraid that the entire concept of live-blogging, or live-tweeting (see that box to the right of this text on my blog), has been a complete and utter failure this year.  Sorry.


What happened is that I, as I seem to do, managed to completely overcommit and then had to live up to the promises.  The end result was I had no time to:
  • Blog
  • Tweet
  • Floss my teeth (yuck, TMI, why?, really?, and just what is your dentist going to say about that?)


The first two I shall hope to correct below; the last, rather personal failure, is this blog’s fault.  Multiple times during this past week I sat down on my hotel room’s couch/davenport/settee to write this blog post and simply fell asleep.  I would then wake up a few hours later, realize the battery on the laptop had died again, plug it in, brush my teeth, and fall into bed.  I shall hope that extra attention to dental hygiene in future will correct my slackness.  So far, no cavities.

So what did I do, and what did I see?

Monday, 12:30 – Top Six Advanced Planning Tips

Jessica Cordova and I lied to ODTUG (well, we did actually clear this with them first, but this is our doing, not ODTUG’s) and only presented three advanced Planning tips.  This is not because we hate Planning, or ODTUG, or our audience but because when we combined our work we knew we had far more than 50-odd minutes of content.  So we timed everything (Jessica and I come from the rehearse-it-to-death school of presenting), figured out what would fit, and only presented that at Kscope13.  Watch this space and look for an ODTUG email about our webinar in August, where we do the balance (or maybe all) of the tips.  What we presented was still Good Stuff and we got to cover it at the length and scope that it deserved.

Tuesday, 8:30 – Exalytics – An Apples to Apples Comparison

This was the group project to end all group projects.  We (John Booth, Tim German, Dan Pressman, and yr. obdnt. srvnt.), thanks to the rather incredible generosity of Mark Rittman, were able to benchmark a generic Linux box against Exalytics to see which was faster, and why.


The presentation itself was more of a journey in how we set up a benchmark (I think real benchmarkers would laugh at our methodology but we had never done this before) and what choices we made, and why, although there were some results.  


The benchmarking result re which is faster, btw, is the classic Essbase result – it all depends on what you are doing and why.  I will also note that from a storage perspective we really didn’t do a good job setting up like to like comparisons but this was a hobby project (and for all of us, just one of many) and we did our best.  Suffice to say that now we know how to benchmark much, much better.  Hopefully the audience didn’t feel cheated by that.  

Tuesday, 10:45 – Practical SQL for EPM Practitioners

This was the session I was most excited about presenting as I have recently been doing rather a lot with SQL in my EPM projects.  


The presentation was given from a beginner’s perspective (this is easy for me because from as far as SQL is concerned, I too am a beginner) and covered some of the techniques that I have found useful.  


Everyone who does EPM needs to get on the SQL train.  And yes, I was one of those Essbase geeks who until quite recently could only write “SELECT * FROM …”, so thank you, not-really-my-big-brother-but-oh-how-I-wish-you-were Glenn Schwartzberg, for helping me (or maybe like completely doing my job) with HFM Extended Analytics; thanks also to Dan Pressman for other SQL content in the presentation.  I stand on the shoulders of giants.


The reason you, gentle reader, need to be more au courant with SQL is because it empowers you in your organization and with your systems.  It honestly isn’t that hard and I hope that this presentation helps you along the way to SQL mastery.
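To give a flavor of the sort of technique I mean, here is a minimal, self-contained sketch (SQLite stands in for whatever relational source you have, and the parent/child table and members are made up for illustration – this is not code from the presentation) that flattens a parent/child dimension table into generations and full ancestor paths with a recursive common table expression:

```python
import sqlite3

# Hypothetical parent/child table, the shape most EPM metadata
# extracts arrive in (e.g. a dimension exported as parent, child).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim (parent TEXT, child TEXT)")
con.executemany(
    "INSERT INTO dim VALUES (?, ?)",
    [(None, "Market"), ("Market", "East"), ("Market", "West"),
     ("East", "New York"), ("East", "Florida"), ("West", "California")],
)

# A recursive CTE walks the hierarchy top-down, building the
# generation number and the full ancestor path for every member.
rows = con.execute("""
    WITH RECURSIVE tree (member, gen, path) AS (
        SELECT child, 1, child FROM dim WHERE parent IS NULL
        UNION ALL
        SELECT d.child, t.gen + 1, t.path || '->' || d.child
        FROM dim d JOIN tree t ON d.parent = t.member
    )
    SELECT member, gen, path FROM tree ORDER BY path
""").fetchall()

for member, gen, path in rows:
    print(gen, path)
```

The same pattern works in Oracle (recursive subquery factoring from 11gR2 on) and SQL Server, which is part of what makes it so handy for turning EPM metadata dumps into something loadable.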

Tuesday, 12:45 Hyperion Apps Lunch n’ Learn

Thanks to the generosity of the OTN program, every year ODTUG presents multiple Lunch n’ Learn sessions across the tracks.  I have been in the Hyperion Apps one as I seem to do that for a living.


I was the Master of Ceremonies aka Microphone Monkey as the original MC/MM, John Booth, was unable to attend because of a family emergency.  I actually think John asked me to do this but I completely forgot (as you may notice, I have a few things going on at this conference and also my memory stinks) so this was a bit of a surprise.  I think the audience participation and the panel’s ability to answer was pretty good – fellow ACE Director Tracy McMullen and ACE Chris Barbieri did a great job as usual.


I am quite pleased that Lunch n’ Learns have hit their stride.  I MC’d/MM’d one in, I think, 2010, and it was just painful eliciting questions from the audience.  That was not at all the case at Kscope13.  


Thanks again, OTN.

Wednesday, 8:30 Experts Panel:  Essbase BSO Optimization

This was supposed to be moderated by John Booth but as I explained above he had a family emergency and so regretfully was not available.


Glenn Schwartzberg stepped in to moderate and Edward Roske, Tim German, Mike Nader, Steve Liebermensch, and yr. obdnt. srvnt. all sat in.  It was a pretty freewheeling discussion and I learnt something new about Essbase Report Scripts and data extraction.  Will my former boss (Edward) be proven right yet again?  It may pain me, immensely, if so, but Watch This Space for a new data extract post in the next few weeks.

Wednesday, 10:45 – A BSO Developer’s Introduction to ASO Essbase

This was for me, a BSO developer, a bit of a stretch.  It was difficult to write because so very much of it was theoretical, rather than practical application of theory; if you notice this blog, I tend to fall on the practical side of things.  OTOH, if one wants to do ASO right, one must also understand how ASO Essbase works.  Dan Pressman wrote the book (okay, the chapter) on this subject but I always thought his work, while incredibly important, was too hard for many of us to really understand.  Maybe we (or maybe I mean me) are dumb, maybe it is just a really complex subject.


In any case, I used this session as an opportunity to use BSO constructs and descriptions to sort of, kind of, describe how the ASO kernel works (yes, this was a little dangerous and yes, I was very careful to note when the analogies completely broke down) and then apply that understanding to MVFEDITWWW aka Sample.Basic converted to ASO.  It’s really a case of using terms and concepts we BSO types are familiar with and then applying it to ASO.  In my many, many, many conversations with Dan over ASO, that’s the approach that finally led to the “Ah-ha!” moment and I hope that slant plus the conversion of Sample.Basic via two different techniques was the theory made concrete for the audience.


I hasten to add that this presentation was really just a small part of Dan’s work and I am not suggesting that downloading my deck is the same as reading (and rereading and rereading and rereading) his chapter.  If you haven’t yet understood the key to ASO’s internal design (and given that there were about 80 people in the session, I’d say not everyone has), I encourage you to read my presentation as an introduction and then tackle his work.


Thanks again, Dan, for putting up with what must have been a record number of calls.  Now I think I finally understand ASO.

Wednesday, the rest of the afternoon

I am officially Not Allowed To Talk About It (I must keep some mystery in my life), but I’ll just note that I had Yet Another Presentation.

Thursday, the rest of the conference

Alas, I missed all of the sessions on Thursday as I slept in (I sort of had a busy past few days) and so missed Steve Liebermensch’s Essbase Exalytics session, and then had a meeting with my Australian Sister-Across-The-Waters (aka fellow board member and Oracle ACE Bambi Price) about ODTUG’s relentless path to world domination (we talked about Seriously Practical conferences in Asia with Frank Chow, one of my “lucky” EPM buddies).


And that, for me, was the end of the conference.

The end of What Cameron Did This Kscope

I haven’t even begun to cover all of the other things that went on at Kscope13, all the cool things that I could have done and wished I did, how amazingly fast it all went by, or how incredibly tired I am.


Suffice to say, it was an AMAZING conference and proof, if proof be needed, that no other organization throws an Oracle conference/party the way ODTUG does.  Thanks goes to Oracle, fellow presenters, fellow attendees, YCC, the Kscope conference committee(s), my fellow board of directors, and the many, many, many volunteers who make this conference possible.  It is, without exaggeration, the professional peak of my year and I simply could not do my job without ODTUG and Kscope.  I am indebted to you all.


Be seeing you next year in Seattle, Washington, for Kscope14.  I can hardly wait.

What makes Essbase data extraction fast?


Introduction

I am so sad/pathetic/a-geek-desperately-in-need-of-a-life.  No, that is not the point of this blog post (although the statement is true) but instead it is an observation that I am an inveterate hacker who just cannot leave a potentially good hack alone.

What do I mean by that?

At Kscope13 (alas, sadly passed but wow it was fun) this past week (a bit of a time issue here as I started writing this whilst flying home on the plane – ooh, technology, and sadness but what else am I going to do with myself?)  I sat in on a BSO optimization panel.  My hated and despised former oppressor beloved former employer, Edward Roske, mentioned the subject of data extracts and also that he found Essbase Report Scripts to be faster than MDX for exporting data.  This intrigued me (actually, I think my eyebrows went over my hairline, round the top of head, and down my neck) because that is not exactly the result I saw in my testing here.  OTOH, Edward doesn’t drop hints like this unless he is pretty sure of what he says and thus it behooves me to give it a try and see what happens.  


Edit -- Edward also mentioned that he used a report script keyword called {SSFORMAT} as part of his data extraction approach.  This was even more intriguing because I’ve never heard of it and I have been using report scripts for an awfully long time.  What oh what oh what is he going on about?

What I’m going to try to do with this post

While I started this blog entry out as a test to measure the impact of {SSFORMAT} on report script data extraction, my inquiries of course went off the rails, as they are wont to do, and I found myself measuring the much more interesting overall performance question (and I think what Edward was really alluding to):  What is the fastest Essbase data-extraction-to-disk method?  As seemingly always with Essbase, “It depends”, but this blog post will attempt to qualify what the dependencies are and which approach is best for your data extraction needs.

Why this got so interesting so fast

I think one thing I’ve learnt from participating in an Exalytics benchmarking test (which was more like a treatise on how maybe not to perform a benchmark) is to try to have a variety of test cases.  And that turned out to be really important in this example because I soon found that the time Essbase takes to extract data is only one part of the performance puzzle.  There is also the not so little issue of how long Essbase/MaxL/whatever takes to write that information to disk.   Do not underestimate this component of performance because if you do, you will be guilty of the same mistake that I made, i.e., thinking that the time shown in the application log for a particular action is equivalent to the actual time for a data extraction  process ‘cause it ain’t necessarily so.


The first question (and the obvious one if you solely look at the logs) is which tool has faster Essbase performance?  Report scripts or MDX?  With or without {SSFORMAT} if a report script?  What makes {SSFORMAT} faster if it is indeed faster?  Can other techniques using report scripts have equivalent speed?  Is {SSFORMAT} any use at all?  


And then (once you, the tester, have noted some weirdo results) there is the question of which tool has faster overall elapsed-time (command issued to output complete on disk) performance – report scripts or MDX?


Whew, what a lot of questions.  I guess I am just a benchmarking fool because I try to answer these with databases you can (mostly) replicate/improve/totally prove me wrong with.


What do I mean by all of this?  Simply that the Essbase application log lies.  Sometimes.  


NB – I do a little bit with the BSO calc script language DATAEXPORT but as I am going to spend time bouncing between BSO and ASO I will pretty much be ignoring that approach.  There are some numbers in the last test suite for your edification.

The truth

The Essbase report script time is 100% accurate – if it takes 25 seconds to write a report script out to disk, it took 25 seconds from the time of issuing the command to the time that you can edit the file in a text editor.  So truth, justice, and the American way.

{SSFORMAT}

So what about {SSFORMAT}?  Does it really make any difference?  Edward mentioned it was undocumented (so how did he know?), but actually it can be found in official Oracle documentation – not described as such, but shown in code samples.


Of course, once I heard this I simply had to try it out to see what it does.  And also of course, as it is an undocumented (mostly) keyword I didn’t really know what it could or should do.  From a bit of testing I can relate that the command:
  1. Removes page breaks and headers aka { SUPHEADING }
  2. Uses tab delimiters aka { TABDELIMIT }
  3. Rounds to zero decimal places aka { DECIMAL 0 }
  4. Is supposed to make reporting fast, fast, fast
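If that reading is right, a report script header using {SSFORMAT} should behave much like one spelling the documented commands out explicitly. A sketch of the two equivalent headers (formatting commands only; the member selections for a real extract would follow):

```
// Explicit, documented formatting commands...
{ SUPHEADING }
{ TABDELIMIT }
{ DECIMAL 0 }

// ...versus the single (mostly) undocumented keyword that
// appears to bundle them together:
{ SSFORMAT }
```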


Is it really the case that {SSFORMAT} makes extracts faster?  Is Edward right or not?

A mangled <DIMBOTTOM report

I took the Bottom.rep report script that comes with every copy of Essbase and Sample.Basic, modified it to write lots and lots of rows, and came up with the following report.  Yes, it is kind of ugly, but it generates 133,266 rows when not using {SSFORMAT} – not exactly a big report but something that will hopefully register in the Sample application log.


Without SSFORMAT



Looking at this report, we can see column headers, space delimiting, and (although you can’t see it, I can) page breaks.


Total retrieval time?  1.172 seconds.

With SSFORMAT





The row count is now 120,962 rows.  The data is the same, but the headers are suppressed, so there are fewer rows.


Total retrieval time?  0.906 seconds.  That’s almost a quarter faster – 22.7%.  So in the case of this database at least, Edward is right.


And why is he right?  He’s right because {SSFORMAT} gets rid of stuff.  Stuff like spaces and headers; in other words, it’s making the file smaller, and smaller = faster.  At least in this case.
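The “smaller file = faster” intuition is easy to sanity-check outside of Essbase. Here is a toy Python sketch (nothing to do with the actual report scripts; purely to show the byte-count difference between space-padded and tab-delimited output of the same rows):

```python
import io

rows = [("New York", "Cola", "Jan", 678.0)] * 1000  # made-up toy data

def padded(rows):
    # Fixed-width, space-padded output, like a default report script.
    buf = io.StringIO()
    for r in rows:
        buf.write(f"{r[0]:<20}{r[1]:<20}{r[2]:<10}{r[3]:>15.2f}\n")
    return buf.getvalue()

def tabbed(rows):
    # Tab-delimited, no padding, rounded to zero decimals -
    # roughly the shape {SSFORMAT} produces.
    buf = io.StringIO()
    for r in rows:
        buf.write(f"{r[0]}\t{r[1]}\t{r[2]}\t{round(r[3])}\n")
    return buf.getvalue()

p, t = padded(rows), tabbed(rows)
print(len(p), len(t))  # → 66000 22000
```

Here the tab-delimited form is a third the size of the padded form; fewer bytes to format and write to disk is the whole game.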


But Sample.Basic is not exactly anyone’s description of the real world.  What about another sample database?

Enter ASOsamp.Sample

Just like Sample.Basic, it is sort of difficult to argue that this database is totally reflective of much of anything in the real world as it is a pretty small database in ASO terms.  Regardless, you too can run these tests if you are so inclined and it is at least a little more realistic than Sample.Basic.

Test cases

I came up with a bunch of different report scripts to try to see if I could duplicate what I saw in Sample.Basic and also if I could come up with a way of duplicating {SSFORMAT} or realize that there was some magic in that command.  Is there?


Here’s the base report for all tests except the last two. 


Note the order of the dimensions on the rows.  This will become important later.  This is (thanks, Natalie Delamar for finding the link) ASO good practice.  Except that it isn’t, at least sometimes.  Read on, gentle reader.


The test case basically takes the above report and modifies, a bit, how the data gets exported.
Name   | Details
Test4a | Base report with missing rows suppressed, member names, repeated rows, smallest to largest on rows
Test4b | As Test4a with tab delimit, decimal 0, suppress headings
Test4c | As Test4a with SSFORMAT
Test4d | As Test4a with decimal 16, suppress headings
Test4e | As Test4d, with SSFORMAT
Test4f | As Test4d, largest to smallest on rows


I set up a simple MaxL script with the timestamp keyword to get the true export time.


The MaxL code

spool on to "c:\\tempdir\\Report_script_query_ASOsamp.log" ;


login XXX XXXX on XXXX ;


alter application ASOsamp clear logfile ;


set timestamp on ;


export database ASOsamp.sample using server report_file "Test4a" to data_file "c:\\tempdir\\Test4a.txt" ;
export database ASOsamp.sample using server report_file "Test4b" to data_file "c:\\tempdir\\Test4b.txt" ;
export database ASOsamp.sample using server report_file "Test4c" to data_file "c:\\tempdir\\Test4c.txt" ;
export database ASOsamp.sample using server report_file "Test4d" to data_file "c:\\tempdir\\Test4d.txt" ;
export database ASOsamp.sample using server report_file "Test4e" to data_file "c:\\tempdir\\Test4e.txt" ;
export database ASOsamp.sample using server report_file "Test4f" to data_file "c:\\tempdir\\Test4f.txt" ;
export database ASOsamp.sample using server report_file "Test4g" to data_file "c:\\tempdir\\Test4g.txt" ;
export database ASOsamp.sample using server report_file "Test4h" to data_file "c:\\tempdir\\Test4h.txt" ;
exit ;


I can use the times that MaxL throws into the log file to figure out exactly how long the export process really takes.
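The arithmetic on those timestamps is trivial but worth pinning down; here is a hedged sketch that assumes the MaxL timestamps have been reduced to HH:MM:SS strings, as they appear in the result tables below:

```python
from datetime import datetime, timedelta

def elapsed_seconds(start, end):
    """Difference between two HH:MM:SS timestamps, tolerating a midnight wrap."""
    fmt = "%H:%M:%S"
    t0, t1 = datetime.strptime(start, fmt), datetime.strptime(end, fmt)
    if t1 < t0:                      # run crossed midnight
        t1 += timedelta(days=1)
    return int((t1 - t0).total_seconds())

print(elapsed_seconds("13:17:03", "13:17:52"))  # → 49
```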

Export results

Name   | "Essbase time" in seconds | File size in bytes | # of rows | Start time | End time | Elapsed time in seconds
Test4a | 48.429  | 201,033,651 | 1,475,017 | 13:17:03 | 13:17:52 | 49
Test4b | 47.337  | 137,485,186 | 1,316,166 | 13:17:52 | 13:18:39 | 47
Test4c | 46.541  | 137,485,443 | 1,316,169 | 13:18:39 | 13:19:26 | 47
Test4d | 50.431  | 306,891,743 | 1,316,166 | 13:19:26 | 13:20:16 | 50
Test4e | 49.978  | 306,892,000 | 1,316,169 | 13:20:16 | 13:21:06 | 50
Test4f | 41.633  | 306,891,743 | 1,316,166 | 13:21:06 | 13:21:48 | 42


Hmm, that “It depends” comment is rearing its head again, isn’t it?  There’s barely any difference between the tests.  The difference that I saw with Sample.Basic might be an anomaly or (probably more likely) it might just be too small of a database to measure much of anything.


So in this case at least, Edward is wrong– {SSFORMAT} has no measurable impact, at least on ASOsamp.  And even file size has no real impact.  Weird.


There is one thing that doesn’t make a ton of sense – take a look at that last test, Test4f.  It does something that, in theory, ASO Essbase is supposed to hate – largest to smallest dimensions on the rows.


And that’s what provided better performance, even with 16 decimal points.  So is that post from 2009 wrong?

A brief side trip into proving an adage

So is that Essbase Labs blog post right, or wrong?  Only one way to know.


Name   | Details
Test4g | Decimal 0, smallest to largest on rows, just about all dimensions
Test4h | Decimal 0, largest to smallest on rows, just about all dimensions


Name   | "Essbase time" in seconds | File size in bytes | # of rows | Start time | End time | Elapsed time in seconds
Test4g | 59.099  | 204,609,652 | 1,316,166 | 13:21:48 | 13:22:47 | 59
Test4h | 197.411 | 204,609,652 | 1,316,166 | 13:22:47 | 13:26:04 | 197


Here’s the code with smallest to largest on the row:


And per Test4h which is largest to smallest on the row:


So this is interesting.  Sometimes, organizing the report from smallest to largest, as in the above Test4g, results in much faster performance.  


And sometimes, as in Test4f (which, to be fair, does not put every dimension on the rows), largest to smallest is faster.


I hope you all know what I am about to write about which approach is right for your report:  It depends.

But what about MDX?

Indeed, what about it?  The output formatting that comes out of MaxL via MDX, in a word, stinks.  But who cares if it stinks if it’s fast, fast, fast.  Of course it has to be fast to be worthwhile.  So is it?

The test cases and the code

MDX doesn’t really have an output command the way a report script does – it can’t be part of an export database command in MaxL.  Instead, one must run it via a shell.  Happily (or not, as you will see in a moment), MaxL can be that shell.


I wanted to mimic the report script code as closely as I could.  Of course I can’t use {SSFORMAT} but I can certainly try to test how long these queries run for and what happens when I write more or less data to disk.  I added or removed content from the output files by increasing/decreasing column width and decimals, and by putting the non-specified POV dimensions on the rows or not.
Test cases
Name | Details
MDX1 | 2 decimals, 40 wide, dims on row
MDX2 | As MDX1, no dims on row
MDX3 | As MDX1, 80 wide
MDX4 | As MDX2, 80 wide
MDX5 | 16 decimals, 40 wide, dims on row
MDX6 | As MDX5, no dims on row
MDX7 | As MDX5, 80 wide
MDX8 | As MDX6, 80 wide

Sample code

To give you a feel for the code, here’s the basic query with POV dimensions on the row.
SELECT
    { CrossJoin ( { [Years].Children }, { [Measures].[Original Price] } ) }
ON COLUMNS,
     NON EMPTY CrossJoin ( { Descendants( [Products] ) } ,CrossJoin ( { Descendants( [Stores] ) }, { Descendants ( [Geography] ) } ) )
ON ROWS,
    CrossJoin ( CrossJoin ( CrossJoin( CrossJoin ( {[Transaction Type]}, {[Payment Type]} ), {[Promotions]} ), {[Age]} ), {[Income Level]} ) ON PAGES
FROM [ASOsamp].[Sample]
WHERE ([Time].[MTD]) ;


Sample output

So is MDX quicker than a report script?

Name | "Essbase time" in seconds | File size in bytes | # of rows | Start time | End time | Elapsed time in seconds
MDX1 | 9.789 | 318,513,488 | 1,316,184 | 9:18:01  | 9:24:47  | 406
MDX2 | 9.701 | 265,867,452 | 1,316,187 | 10:18:49 | 10:25:30 | 401
MDX3 | 9.802 | 634,394,487 | 1,316,188 | 8:31:13  | 8:41:23  | 610
MDX4 | 9.688 | 529,101,199 | 1,316,187 | 8:12:52  | 8:21:30  | 518
MDX5 | 9.76  | 318,514,116 | 1,316,185 | 10:59:35 | 11:15:45 | 970
MDX6 | 9.716 | 265,867,401 | 1,316,187 | 10:33:16 | 10:45:26 | 730
MDX7 | 9.774 | 634,394,436 | 1,316,188 | 11:26:09 | 11:41:57 | 948
MDX8 | 9.729 | 529,101,001 | 1,316,187 | 11:50:58 | 12:03:32 | 754


Yes and no, all at the same time.  Yes, the Essbase time as logged in the ASOsamp.log file is much, much quicker than a report script.  But the overall time (again from the timestamp command in MaxL) is slower.  A lot slower.  MaxL is not particularly good at writing data out.  It’s single threaded and writing out half a gigabyte text files is quite arguably not really what MaxL is supposed to do.  And it agrees.
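Splitting each run into engine time versus spool time makes the point starkly; using the MDX1 row from the table above:

```python
# MDX1: the engine finished in under 10 seconds, but MaxL spent nearly the
# entire elapsed time single-threadedly spooling the output to disk.
essbase_time = 9.789   # seconds, from ASOsamp.log
elapsed = 406          # seconds, from the MaxL timestamps
spool = elapsed - essbase_time
print(round(spool / elapsed * 100, 1))  # → 97.6
```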


Of course if one could grab that output via MDX and write it out more quickly, the MDX would be the fastest retrieve method bar none, but that simply isn’t an option in a simple scripting test.  I can’t say I know what Perl or the API or some other method might do with this kind of query.  Any takers on testing this out and pushing the Envelope of Essbase Knowledge?


For the record, Edward is right, report scripts beat MDX as an extraction process.  Or do they?

One last test

I have to say I was a bit stumped when I saw the above results.  Edward was right?  Really?  Against all of the testing I had done on that Really Big BSO database with report scripts, DATAEXPORT,  MDX NON EMPTY, and MDX NONEMPTYBLOCK?  Really?  Time to run the tests again.  


I’m not going to go through all of the tests as they are in that post, but here are the tests and the results (and note that I believe I goofed on that old post, report scripts are 10x as bad as I measured before.  Whoops) for your perusal.

The tests

Name | Details
ExalPlan_RS1  | Extract of allocated data with 16 decimals, tab delimit, member names, repeated rows, missing rows suppressed
ExalPlan_RS2  | Same report script layout as ExalPlan_RS1, but {SSFORMAT} and SUPEMPTYROWS only
ExalPlan_Calc | Essbase BSO calc script using DATAEXPORT
ExalPlan_MDX1 | Same layout as report scripts, uses NON EMPTY keyword
ExalPlan_MDX2 | Same as MDX1, but uses NONEMPTYBLOCK

The results

Name | "Essbase time" in seconds | File size in bytes | # of rows | Start time | End time | Elapsed time in seconds
ExalPlan_RS1  | 13,392.3 | 1,788,728 | 21,513 | 15:40:52 | 19:24:04 | 13,392
ExalPlan_RS2  | 13,068.7 | 1,243,630 | 21,513 | 19:24:04 | 23:01:53 | 13,069
ExalPlan_Calc | 947.842  | 2,548,818 | 21,512 | 23:01:53 | 23:17:41 | 948
ExalPlan_MDX1 | 640.562  | 3,485,581 | 21,522 | 23:17:41 | 23:28:24 | 643
ExalPlan_MDX2 | 1.515    | 3,536,291 | 21,835 | 23:28:24 | 23:28:29 | 5


So in this case, Edward is wrong.  MDX trumps report scripts.  I think maybe, just maybe, he’s human and is sometimes right, and sometimes wrong.

What’s going on?

What an awful lot of tests – what does it all prove?


I think I can safely say that {SSFORMAT} doesn’t really make much of a difference, at least with the report that I wrote for ASOsamp.  Maybe in other reports, like the one I wrote for Sample.Basic, it does.  Maybe not.  You will have to try it for yourself.


Think about what {SSFORMAT} will do to the output – especially the zero decimal points (and implicit rounding).  Is that what you want?  If yes, then it is a quick and dirty way to get rid of headers and tab delimit.  Other than that, I can’t say I see any real value in it.  
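To see what that implicit rounding means for the extracted values, here is a tiny illustration.  The data values are made up, and note that Python’s rounding rules are not necessarily Essbase’s; the only point is that everything after the decimal point is gone:

```python
# Values as they might sit in the cube vs. what a { DECIMAL 0 } style
# extract reduces them to.  Sample values are invented for illustration.
values = [102.49, 103.72, -3.75]
rounded = [round(v) for v in values]
print(rounded)  # → [102, 104, -4]
```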


The current score (Edward and I are not really in any kind of competition, but this is a handy way to figure out who is right and who is wrong, or righter or wronger, or…well, you get the idea): Edward 0, Cameron 1.


Now as to the question of report script vs. MDX, I think the question becomes a lot murkier.  In some cases, especially extracts that export many rows of data, report scripts – at least with the reports and MDX queries I wrote against ASOsamp – are significantly faster than MDX.


The score has changed to Edward 1, Cameron 1.  A dead heat.


But what about queries that go after lots and lots and lots of data (think really big databases) and only output a relatively small amount of rows?  Then MDX, at least in the case of a ridiculously large BSO database, is faster.


One could argue Edward 1, Cameron 2, but honestly, it’s a pretty weak point.


What does it all mean for you?  Wait for it...


It depends.


In other words, just like seemingly every Essbase truism out there, you must think about what you are trying to do and then try different approaches.  I wish, wish, wish I could make a blanket statement and say, “Report scripts are faster” or “MDX queries are faster” but it simply isn’t so.


I will say that if I could get MDX output into something other than MaxL, I think MDX would beat report scripts each and every time.  But I have spent waaaaay more time on this than I expected and maybe if I can get a Certain Third Party Tool to play along, I can prove or disprove this hypothesis.  Or maybe someone else would take the output from MDX via the JAPI and throw it against a text file.  In any case, MaxL’s single threaded nature on log output (which is how MDX writes to disk) is slow, slow, slow.
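To put a rough number on that hypothesis, here is a back-of-envelope sketch: take the engine time from the log and assume a better client could write the output at a modest disk rate.  Both the 100 MB/s figure and the arithmetic are my assumptions, not measurements:

```python
# MDX1 from the table above: ~9.8 s of engine time, ~318 MB of output,
# yet 406 s elapsed through MaxL's spool.
engine = 9.789        # seconds, from ASOsamp.log
size_mb = 318.5       # output size in MB
write_speed = 100.0   # MB/s -- assumed, not measured
best_case = engine + size_mb / write_speed
print(round(best_case, 1))  # → 13.0
```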


I have to stop listening to that guy – the above is two full days of testing and coding to try to see if he was full of beans or not.  And in the end the results were…inconclusive.  Such is life.

This blog post brought to you by the John A. Booth Foundation for Impecunious Independent Consultants

No, I am not crying poverty, but know that the times and tests are because of the generosity of John who is allowing me to use one of his slower servers (yes, servers, and yes, it is very nice that he is willing to share it with me, and yes, it is kind of odd that he owns several real honest-to-goodness Essbase servers but he is just that kind of geek) for the testing.


Yes, I could have and should have tested this on AWS, and it isn’t as though I am a stranger to making Amazon even richer, but I thought since I did my first tests with extracts on one of his servers, I should keep on doing just that.  


Thanks again, John, for the  use of your Essbase box.


Be seeing you.

An even better focused aggregation with Dodeca


The obligatory disclaimer

I gave a presentation on this very subject at the Hyperion SIG Open Microphone Night.  I am not entirely sure that a formal (well, informal, but it was PowerPoint) presentation is 100% a spontaneous thing, but they asked, I double-checked, they agreed, and so it was.  


The disclaimer?  I am a cheerleader for Dodeca, but not a salesman, not an employee, not a shareholder, etc., etc., etc.  Here’s the who-is-this-nut-in-front-of-the-room graphic for those who doubt.
Also, for the record, there were alcoholic drinks there but there was also…chocolate milk and regular milk.  2% and skim.  And cookies.  Yum.  It is somewhat incongruous to admit this at my age, but I really like the taste of milk.  I like mixing chocolate and regular milk (so I guess I like watered down chocolate milk) even more.  What an awesome idea for snacks.


Back to Dodeca – I like the product, a lot, I don’t do very much with it (if I could sell it, I would, but I can’t very much so I don’t), I do consider Tim Tow both a friend and a mentor, and that’s it.  If this post reads like an advertisement, it isn’t.  It’s just what the very last bullet point says it is – an interesting way to make processing faster.


Whoops, one more thing.  It would be awesome if Planning could pass the contents of the rows and columns to Calculation Manager.  Awesome.  As you will see if you but continue reading on, gentle reader.   

The good and the bad about focused aggregations

In Planning-land, focused aggregations off of forms, at least in the BSO Planning world almost all of us still live in, are the only way to fly.  You can read all about it here in detail.


Very, very briefly, the big win is only to calculate the ancestor blocks that are affected by the form.  In other words, instead of aggregating an entire dimension, including lots and lots of hierarchies that are not affected by the form input, just aggregate the hierarchies that matter.  It’s that easy.


As an example, let’s take a look at the Planning sample application and compare the calculation time of a CALC ALL (not something you are likely to do, but still a valid data point), an AGG of just the two sparse dimensions, and then of course a partially focused aggregation.  Remember, this is with Entity in the form row and Segments in the page dimension so Calculation Manager can read Segments via a Run Time Prompt variable but not Entity.


Approach | Seconds | Percent
Calc All | 5.766 | 3696%
Agg of Entity & Segment | 0.375 | 240%
Agg Entity and focused Segment | 0.156 | (baseline)


The blocks tell the tale

When we look at how many reads and writes occur, and how many cells are touched via a SET MSG DETAIL statement, it’s easy to see why a focused aggregation is so fast – it simply touches fewer data points.  To quote Ludwig Mies van der Rohe, “Less is more”.


Approach | Sparse writes and reads | Sparse cells addressed
CALC ALL | 22,236 writes / 107,570 reads | 101,820,000
Agg of Entity and Segment | 2,869 writes / 10,772 reads | 12,592,000
Agg of Entity, focused Segment | 888 writes / 2,960 reads | 3,897,400



Last bit of review

The code looks ugly, but this is how BSO Essbase walks the hierarchies when it does an AGG of sparse dimensions.  Weird, it’s true, but it works.

That was the good, here’s the bad

But focused aggregations in Planning have a problem – while Calculation Manager can read the point of view and drop down (page) dimensions, whatever dimensions are on the rows and columns of the sheet, and those dimensions’ members, are terra incognita to it.  


Why care about row and column dimensions?  If the dimensions are dense, and if I’ve designed the Planning app, I don’t care – everything is dynamic.  But if the dimensions are sparse (and if I’ve designed the Planning app I will have fought this tooth and nail but sometimes that’s just the way people interact with data), I do care because that means any aggregations off of the form will require a full AGG.  Not so focused, is it?


What oh what to do?  In Planning – there is not a blessed thing.  However, what if the application in question was Dodeca?  

What Dodeca can do

The problem with the focused aggregation approach in Planning is as I stated – I don’t know what the sparse dimensions on the sheet are, nor do I know the scope of the member selections.


If I did know that, why then I could write focused aggregations up to the top of the dimension and down to any subtotals.  In other words, a super focused aggregation with hopefully super fast results.  How oh how do we do this?  It’s actually quite easy.

Dynamic Rowset to the rescue

All I need to do is modify the approach I wrote about back in December of 2012 on how to do a dynamic rowset report in Dodeca and change the database from Sample.Basic to the Planning sample application.  And once I do that, I then need to take advantage of the selected Entity and insert it into my Dodeca-specific calc.


To review:  a Dynamic Rowset report takes a user selection in a dimension, figures out what the descendants (or children or siblings or whatever) are, and then sticks all of them on a sheet via an Essbase report script, MDX query, or delimited list.  Dodeca does all of this through something called a Workbook Script – sort of Dodeca’s enlightened approach to coding.


Set up a few ranges on the sheet to indicate where the retrieve range is and where to insert rows, and then “code” by modifying the below Workbook Script that fires when the workbook itself opens up, but before it retrieves.


  1. BuildRangeFromScript – I chose the EssbaseReportScript type; there are many ways of doing this including MDX (cannot be seen because this is an older release):


  2. ScriptText – Code to actually build the rows.  Tokenized to push the value of the select from the dimension treeview.




  3. StartCell – Range name of the repeated rows that are tied to the output from ScriptText.  This is the green range.
  4. Rows – This report has dynamic rows; it could just as easily be columns.
  5. EnterNumbersAsText – Just in case member names such as 100 are used, treat them as text.
  6. CopyFromRange – The name of the range to be repeated.  This is the blue range.
  7. Insert – Set to TRUE as I want the output of ScriptText to be reflected in the sheet.
  8. OutputRangeName – The name of the rows that are built during the insert process.
  9. OutputMap – The column that receives the output of ScriptText.
When all of that is built, and the rows are placed onto the sheet, Dodeca does its retrieve.  Magical, isn’t it?


A short note about tokens

Do you like Planning’s Run Time Prompt variables because of their utility in Calculation Manager?  If so, you will really like Dodeca’s tokens because they are even more powerful.  Tokens work in report scripts,  calc scripts (just like RTP variables in Calculation Manager), and they work in retrieves as well.  


You saw above how a token, with a Workbook Script function to force the evaluation of the token, worked in getting the contents of the rows.


Within a report (aka an Excel sheet) before:


And after in Dodeca:


And in the calc script itself:

Did you catch the clever cannot-be-done-anywhere-else bit?

It’s as easy as 1, 2, 3.
1 – After fixing on whatever the Segments dimension member is, calculate the inclusive descendants of the Entity dimension that drives the rows.
2 – Within that same FIX, calculate the ancestors of the Entity dimension.
3 – Then FIX on the inclusive descendants and the ancestors of the Entity dimension that drives the rows and aggregate the ancestors of the Segments dimension.


We’re done.  It was easy, and awesome.
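The member lists those three steps FIX on can be pictured as a walk up the outline from the selected members.  This sketch uses a made-up two-level Entity hierarchy (not the real Planning sample outline) and only shows the "ancestors of the selected member" half of the job:

```python
# Toy parent map standing in for an Entity hierarchy.
parent = {"NY": "East", "MA": "East", "East": "Entity"}

def ancestors(member):
    """Walk up the parent map, collecting every ancestor of the member."""
    chain = []
    while member in parent:
        member = parent[member]
        chain.append(member)
    return chain

print(ancestors("NY"))  # → ['East', 'Entity']
```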

What’s the payoff?

Time

The payoff is speed, a lot of it.  Let’s review the speed again, but now with the above row-based focused aggregation.  The difference is dramatic.  The Dodeca focused aggregation is 93 times as fast as a Calc All, 6 times as fast a dimension aggregation, and 2.5 times as fast as a partial focused aggregation.  Would you like to speed up your budgeting aggregations by a factor of 6?  Or 2.5?  You bet you would.  And you can, quite easily.
Approach | Seconds | Percent
Calc All | 5.766 | 9300%
Agg of Entity & Segment | 0.375 | 605%
Agg Entity and focused Segment | 0.156 | 252%
Dodeca agg of focused Entity and Segment | 0.062 | (baseline)
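Those speed-up factors fall straight out of the timings; a quick check in Python, using the seconds from the table:

```python
# Timings (seconds) for each approach, compared to the Dodeca
# row-focused aggregation as the baseline.
timings = {
    "Calc All": 5.766,
    "Agg of Entity & Segment": 0.375,
    "Agg Entity and focused Segment": 0.156,
    "Dodeca focused Entity and Segment": 0.062,
}
base = timings["Dodeca focused Entity and Segment"]
for name, secs in timings.items():
    print(name, round(secs / base, 1))
```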

Blocks

We can look at time or we can look at the blocks.  If time = money, then time also = blocks, and fewer is most definitely better.  The percentage improvements in sparse cells touched by the aggregations mirror the time pretty closely.  
Approach | Sparse writes and reads | Sparse cells addressed | Percent
CALC ALL | 22,236 writes / 107,570 reads | 101,820,000 | 7733%
Agg of Entity and Segment | 2,869 writes / 10,772 reads | 12,592,000 | 956%
Agg of Entity, focused Segment | 888 writes / 2,960 reads | 3,897,400 | 296%
Dodeca focused Entity and Segments | 300 writes / 1,000 reads | 1,316,700 | (baseline)

Takeaways are not necessarily fish ‘n chips

1 – Focused aggregations are good because they can make your aggregations quick.
2 – Planning can’t give you the rowset for a sparse aggregation, although its partially focused aggregation is still way better than nothing.
3 – Dodeca can give you the rowset for a sparse aggregation through a combination of Workbook Scripts and Tokens.


Try it, you’ll like it.  :)


Stupid Planning security trick 1 of 4


Introduction

You might be thinking, “A four part blog post on Planning security?  Quick, am I near a suitable surface for snoozing?  Blog-induced narcolepsy may ensue.”  You might be thinking that but you’d be oh so wrong.  I am, in Yr Hmbl & Obdnt Srvt’s opinion, going to write about four (well, okay, three, but it has to lead with the initial query) very interesting uses of queries against the Planning security tables.  I like to preach (So where’s my pulpit?  Right here, actually.) that SQL is the key to the EPM kingdom and this is yet another example of that very rule.

What this post is all about

But to do those interesting things (they’re just hacks but quite useful), one must first have a good Planning security query.  Yes, I wrote about querying Planning security back in October of 2011 (eeek, that long ago?) and in March of this year (2013 A.D.).  What these queries didn’t do was tie together (or at least tie together explicitly) the link between ALL of the objects in Planning’s security tables, users, groups, and the users in those groups.  

Objects in Planning

There’s a table in the Planning repository called HSP_OBJECT_TYPE.  Why do you care about this?  Because it is the table that describes Planning’s objects.  And those objects are the targets of security.  And you need to know about those if you are getting a report out of Planning and into the reporting tool of choice.  There, I have led you on quite the merry chase as to why HSP_OBJECT_TYPE is oh so important.  Fantastic, right?  But is it everything?

Well, I was playing around with security and I found objects that weren’t in HSP_OBJECT_TYPE.  How?  I have no idea – some of them don’t make sense from a security perspective (such as Menus) and others most definitely do (such as Business Rule folders).  Here’s the query I came up with to describe all of them and I’ve commented the object numbers and descriptions that I have found.  There may be more so consider this an incomplete list.  Also, I leave off the CalcMgrVariables because I can’t figure out how to assign security to these objects.  Please leave word via the comments section if you find more objects than I did.

/*
    1 to 50 defined in HSP_OBJECT_TYPE
    103 = Menus
    107 = Composite forms
    115 = Deployed Business Rule
    116 = Looks like Business Rules, but don't exist in CM?  So orphaned?
    117 = Calculation Manager variables
    118 = Business Rule Folder
    119 = CalcMgrRulesets -- that's actually the OBJECT_NAME, so defined by system?
    120 = There are four values in a three Plan Type Planning app:
            CalcMgrVariables
            CalcMgrVariablesPTName1
            CalcMgrVariablesPTName2
            CalcMgrVariablesPTName3
*/
--    The objects that we know
SELECT DISTINCT
    OBJECT_TYPE AS "OBJECT_TYPE",
    TYPE_NAME AS "TYPE_NAME"
FROM HSP_OBJECT_TYPE
--    The objects that we only know through accident
UNION
SELECT
    CONVERT(INT, '103') AS "OBJECT_TYPE",
    'Menu' AS "TYPE_NAME"
UNION
SELECT
    CONVERT(INT, '107') AS "OBJECT_TYPE",
    'Composite Form' AS "TYPE_NAME"
UNION
SELECT
    CONVERT(INT, '115') AS "OBJECT_TYPE",
    'Business Rule' AS "TYPE_NAME"
UNION
SELECT
    CONVERT(INT, '118') AS "OBJECT_TYPE",
    'Business Rule Folder' AS "TYPE_NAME"
UNION
SELECT
    CONVERT(INT, '119') AS "OBJECT_TYPE",
    'CalcMgrRuleSets' AS "TYPE_NAME"
ORDER BY OBJECT_TYPE, TYPE_NAME

Here’s the output:
OBJECT_TYPE | TYPE_NAME
1   | Folder
2   | Dimension
3   | Attribute Dimension
4   | Calendar
5   | User
6   | Group
7   | Form
8   | FX Table
9   | Currency
10  | Alias
11  | Cube
12  | Planning Unit
30  | Attribute Member
31  | Scenario
32  | Account
33  | Entity
34  | Time Period
35  | Version
37  | Currency Member
38  | Year
45  | Shared Member
50  | User Defined Dimension Member
103 | Menu
107 | Composite Form
115 | Business Rule
118 | Business Rule Folder
119 | CalcMgrRuleSets

For you Oracle PL/SQL users, the code would look like this (just a few minor differences):
/*
    1 to 50 defined in PLANAPP8.HSP_OBJECT_TYPE
    103 = Menus
    107 = Composite forms
    115 = Deployed Business Rule
    116 = Looks like Business Rules, but don't exist in CM?  So orphaned?
    117 = Calculation Manager variables
    118 = Business Rule Folder
    119 = CalcMgrRulesets -- that's actually the OBJECT_NAME, so defined by system?
    120 = There are four values in a three Plan Type Planning app:
            CalcMgrVariables
            CalcMgrVariablesPTName1
            CalcMgrVariablesPTName2
            CalcMgrVariablesPTName3
*/
SELECT DISTINCT
    OBJECT_TYPE AS "OBJECT_TYPE",
    TYPE_NAME AS "TYPE_NAME"
FROM HSP_OBJECT_TYPE
UNION
SELECT
    TO_NUMBER('103', '999') AS "OBJECT_TYPE",
    'Menu' AS "TYPE_NAME"
FROM DUAL
UNION
SELECT
    TO_NUMBER('107', '999') AS "OBJECT_TYPE",
    'Composite Form' AS "TYPE_NAME"
FROM DUAL
UNION
SELECT
    TO_NUMBER('115', '999') AS "OBJECT_TYPE",
    'Business Rule' AS "TYPE_NAME"
FROM DUAL
UNION
SELECT
    TO_NUMBER('118', '999') AS "OBJECT_TYPE",
    'Business Rule Folder' AS "TYPE_NAME"
FROM DUAL
UNION
SELECT
    TO_NUMBER('119', '999') AS "OBJECT_TYPE",
    'CalcMgrRulesets' AS "TYPE_NAME"
FROM DUAL
ORDER BY OBJECT_TYPE, TYPE_NAME

With this list (and as I said, who knows if it’s complete but there are some important object types that I’ve added on) we can go on to figure out who has access to what, and hopefully have all of the what.  Why oh why oh why are there more objects than identified in HSP_OBJECT_TYPE?  Beats me.  Oracle, do you have an explanation?

Users in groups

I actually threw this query code into the Calculation Manager query but didn’t really show the power of the query.  What it gives you is the users in the groups.  Big deal, you say?  If you want to see what rights a user has via a group, this query is essential.

Users in groups query

SELECT DISTINCT
    O1.OBJECT_NAME AS "Group",
    O2.OBJECT_NAME AS "User"
FROM HSP_USERSINGROUP G
INNER JOIN HSP_OBJECT O1
    ON G.GROUP_ID = O1.OBJECT_ID
INNER JOIN HSP_OBJECT O2
    ON G.USER_ID = O2.OBJECT_ID

And the output

Group | User
PLN_CalcTest | JessicaC
PLN_CalcTest | TestPlanner1
PLN_CalcTest | TestPlanner2
PLN_CalcTest_Consol | JessicaC
PLN_CalcTest_Consol | TestPlanner1
PLN_CalcTest_Consol | TestPlanner2
PLN_CalcTest_Consol_Americas | TestPlanner1
PLN_CalcTest_Consol_APAC | TestPlanner2
PLN_CalcTest_Consol_EMEA | JessicaC

Looking at user JessicaC, I can tell she has effective membership in groups PLN_CalcTest, PLN_CalcTest_Consol, and PLN_CalcTest_Consol_EMEA.  

When I look at those three groups (which are all hierarchically linked, with EMEA in Consol, and Consol in CalcTest) in Shared Services, I see that she is really only in PLN_CalcTest_Consol_EMEA.  That is the power of inherited security and the way Planning security and group hierarchies should be managed. There, I put it in bold red just in case anyone missed it.  It is Very Powerful Medicine and if you aren’t using it in your Planning (or name your EPM tool) you are really missing a very easy way to reduce maintenance in your system and still be just as flexible as you need to be with security.  You Have Been Warned.
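The inheritance JessicaC benefits from is a simple transitive walk up the group hierarchy.  The group nesting below is copied from the example; the resolution code itself is only an illustrative sketch, not how Shared Services actually stores or resolves membership:

```python
# EMEA is in Consol, Consol is in CalcTest.  Direct membership in the
# innermost group implies effective membership in every group above it.
parent_of = {
    "PLN_CalcTest_Consol_EMEA": "PLN_CalcTest_Consol",
    "PLN_CalcTest_Consol": "PLN_CalcTest",
}

def effective_groups(direct_group):
    """All groups a user effectively belongs to via one direct assignment."""
    groups = [direct_group]
    while groups[-1] in parent_of:
        groups.append(parent_of[groups[-1]])
    return groups

print(effective_groups("PLN_CalcTest_Consol_EMEA"))
```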

Putting it all together

This approach uses Common Table Expressions out the wazoo (I hope you are all appreciating my deep technical terms – wazoo = “a lot”) because I find them so handy when it comes to breaking down complex SQL queries into a series of steps.  There are quite a few here because the query is pulling so many things together.

At a very high level, this query combines:
CTE | Description
Dimensions | List of dimensions
ObjType | Object types, including undocumented ones
FinalCTE | Object security by user, group, access type, hierarchical relationship.  You could stop the query right here if you didn’t want the relationship between users and groups.
UsersInGroups | Users in groups via inheritance and direct assignments
UserDefinedSecurity | Security assigned directly to usernames
GroupDefinedSecurity | Security assigned to users via groups
UserAndGroupDefinedSecurity | UNION of UserDefinedSecurity and GroupDefinedSecurity

There’s a very simple SELECT statement that goes against UserAndGroupDefinedSecurity.  It’s almost kind of pointless but don’t worry, it starts to make a lot more sense as we progress through this series of security queries.  

Full security query

WITH
 -- Dimensions and all of their members
    Dimensions AS
    (
    SELECTDISTINCT
        O.OBJECT_ID,
        O.OBJECT_NAMEAS "Member",
         (SELECT OB.OBJECT_NAME
            FROM HSP_OBJECT OB
            WHERE OB.OBJECT_ID= M.DIM_ID)AS "Dimension"
    FROM HSP_OBJECT O
    INNERJOIN
        HSP_MEMBER M
        ON M.MEMBER_ID = O.OBJECT_ID
    ),
 -- All of the other object types, including the ones that aren't documented or in HSP_OBJECT_TYPE
    ObjType AS
    (
    /*
    1 to 50 defined in HSP_OBJECT_TYPE
    103 = Menus
    107 = Composite forms
    115 = Deployed Business Rule
    116 = Looks like Business Rules, but don't exist in CM?  So orphaned?
    117 = Calculation Manager variables
    118 = Business Rule Folder
    119 = CalcMgrRulesets -- that's actually the OBJECT_NAME, so defined by
            system?
    120 = There are four valies in a three Plan Type Planning app:
            CalcMgrVariables
            CalcMgrVariablesPTName1
            CalcMgrVariablesPTName2
            CalcMgrVariablesPTName3        
    */
    SELECTDISTINCT
        OBJECT_TYPE AS "OBJECT_TYPE",
        TYPE_NAMEAS "TYPE_NAME"
    FROM HSP_OBJECT_TYPE
    UNION
    SELECT
        CONVERT(INT,'103')AS "OBJECT_TYPE",
        'Menu'AS "TYPE_NAME"
    UNION
    SELECT
        CONVERT(INT,'107')AS "OBJECT_TYPE",
        'Composite'AS "TYPE_NAME"
    UNION
    SELECT
        CONVERT(INT,'115')AS "OBJECT_TYPE",
        'Business Rule'AS "TYPE_NAME"
    UNION
    SELECT
        CONVERT(INT,'118')AS "OBJECT_TYPE",
        'Business Rule Folder'AS "TYPE_NAME"
    ),
 --  Get every object in the application
    ObjectID AS
    (
    SELECT
        OBJECT_ID,
        OBJECT_NAME,
        OBJECT_TYPE,
        SECCLASS_ID
    FROM HSP_OBJECT
    ),
 -- This is almost the end of the road, but it doesn't take into account implicit security
 -- Stop here if that isn't important
    FinalCTE AS
    (
    SELECT
        --OT.TYPE_NAME AS "Type",
        --  If the OBJECT_TYPE = 50 then it is a user-defined or custom dimension
        --  so do a subquery to pull the dimension name
        CASE
            WHEN O_ID.OBJECT_TYPE != 50 THEN OT.TYPE_NAME
            ELSE (SELECT D."Dimension"
                    FROM Dimensions D
                    WHERE O_ID.OBJECT_ID = D.OBJECT_ID)
        END AS "Type",
        O_ID.OBJECT_NAME AS "Object",
        CASE
            -- Subquery to get user or group type
            (SELECT OA.OBJECT_TYPE
                FROM HSP_OBJECT OA
                WHERE OA.OBJECT_ID = AC.USER_ID)
            WHEN 5 THEN 'User'
            WHEN 6 THEN 'Group'
        END AS "Security Type",
            (SELECT OA.OBJECT_NAME
                FROM HSP_OBJECT OA
                WHERE OA.OBJECT_ID = AC.USER_ID) AS "User/Group Name",
        CASE AC.ACCESS_MODE
            WHEN 1 THEN 'Read'
            WHEN 2 THEN 'Write'
            WHEN 3 THEN 'Write'
            WHEN 4 THEN 'Launch'
            WHEN -1 THEN 'Deny'
        END AS "Read/Write",
        CASE AC.FLAGS
            WHEN 0 THEN '"' + O_ID.OBJECT_NAME + '"'
            WHEN 5 THEN '@CHI("' + O_ID.OBJECT_NAME + '")'
            WHEN 6 THEN '@ICHI("' + O_ID.OBJECT_NAME + '")'
            WHEN 8 THEN '@DES("' + O_ID.OBJECT_NAME + '")'
            WHEN 9 THEN '@IDES("' + O_ID.OBJECT_NAME + '")'
        END AS "Hierarchy function"
        FROM ObjectID O_ID
        INNER JOIN ObjType OT
            ON OT.OBJECT_TYPE = O_ID.OBJECT_TYPE
        INNER JOIN HSP_ACCESS_CONTROL AC
            ON O_ID.OBJECT_ID = AC.OBJECT_ID
    ),
    UsersInGroups AS
    (
    SELECT
        O1.OBJECT_NAME AS "Group",
        O2.OBJECT_NAME AS "User"
    FROM HSP_USERSINGROUP G
    INNER JOIN HSP_OBJECT O1
        ON G.GROUP_ID = O1.OBJECT_ID
    INNER JOIN HSP_OBJECT O2
        ON G.USER_ID = O2.OBJECT_ID
    ),
    --  Get the security that is specifically assigned to users
    UserDefinedSecurity AS
    (
    SELECT
        F."Type",
        F."Object",
        F."Security Type",
        F."User/Group Name",
        'User-assigned' AS "Parent Group",
        F."Read/Write",
        F."Hierarchy function"
    FROM FinalCTE F
    WHERE "Security Type" = 'User'
    ),
    --  Get the security that is specifically assigned to groups
    --  The join between the CTE UsersInGroups and FinalCTE is the key to implicit security
    GroupDefinedSecurity AS
    (
    SELECT
        F."Type",
        F."Object",
        F."Security Type",
        U."User" AS "User/Group Name",
        F."User/Group Name" AS "Parent Group",
        F."Read/Write",
        F."Hierarchy function"
    FROM FinalCTE F
    INNER JOIN UsersInGroups U
        ON U."Group" = F."User/Group Name"
    ),
    --  UNION the explicit to the user and the implicit via a group security
    UserAndGroupDefinedSecurity AS
    (
    SELECT
        *
    FROM UserDefinedSecurity  
    UNION
    SELECT * FROM GroupDefinedSecurity
    ) 
    --  Now report out however you like
SELECT
    "User/Group Name",
    "Security Type",
    "Parent Group",
    "Type",
    "Object",
    "Read/Write",
    "Hierarchy function"
FROM UserAndGroupDefinedSecurity
ORDER BY 1, 4, 3, 5

Sample output
And what does that write out?

Now that’s a security query.  
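If you ever need that same decode logic outside of SQL – say, when post-processing an export of HSP_ACCESS_CONTROL in a script – the CASE expressions for ACCESS_MODE and FLAGS boil down to two lookup tables. Here's a minimal Python sketch of just those mappings, using the code values from the query above (the function name and fallback behavior are my own):

```python
# Decode HSP_ACCESS_CONTROL codes the same way the query's CASE expressions do.
# Unlisted codes fall through to None rather than raising.

ACCESS_MODE = {1: "Read", 2: "Write", 3: "Write", 4: "Launch", -1: "Deny"}

# FLAGS drives which Essbase relationship function wraps the member name
FLAG_TEMPLATES = {
    0: '"{m}"',           # the member itself
    5: '@CHI("{m}")',     # children
    6: '@ICHI("{m}")',    # member and children
    8: '@DES("{m}")',     # descendants
    9: '@IDES("{m}")',    # member and descendants
}

def decode(member: str, access_mode: int, flags: int):
    """Return (read/write label, hierarchy function string) for one access row."""
    template = FLAG_TEMPLATES.get(flags)
    return (
        ACCESS_MODE.get(access_mode),
        template.format(m=member) if template else None,
    )

print(decode("Plan", 3, 9))   # ('Write', '@IDES("Plan")')
```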

What’s next?

Would you believe (my name should probably be Maxwell Smart, aka Agent 86):
  • Specific user by group queries (it is harder than you think once you bring inheritance into the picture)?
  • Building SECFILE.txt import files to do selective security migrations (sadly, not really doable with LCM if you ignore your consultant aka Yr Hmbl & Obdnt Srvt, and do user and group security)?
  • Programmatically building MaxL statements to assign filter security to ASO reporting databases via Planning security?

The code gets progressively more complex, with the last one combining a whole bunch of scripting technologies – in other words, one of my better hacks.  See, I told you there were some interesting queries coming out of this.  :)
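To give you a feel for where that last hack is headed, here's a rough sketch of turning rows from the security query into MaxL grant statements. Everything specific here is invented for illustration – the application and database names, the `f_` filter-naming convention, and the assumption that the filters already exist; the real version would read rows straight from the query and create the filters too:

```python
# Hypothetical sketch: turn (user, read/write) rows from the Planning security
# query into MaxL "grant filter" statements for an ASO reporting database.
# App, db, and filter names below are made up for illustration.

def maxl_grants(rows, app="ASORept", db="Rept"):
    """rows: iterable of (user_name, access) where access is 'Read'/'Write'/etc."""
    stmts = []
    for user, access in rows:
        # Launch and Deny rows would need their own handling; here we only
        # map Read/Write onto a per-user filter that is assumed to exist.
        if access in ("Read", "Write"):
            filter_name = f"f_{user}"
            stmts.append(
                f"grant filter '{app}'.'{db}'.'{filter_name}' to '{user}';"
            )
    return stmts

for stmt in maxl_grants([("jsmith", "Read"), ("pjones", "Write")]):
    print(stmt)
```

From there it's a short hop to spooling the statements to a .msh file and running it through the MaxL shell.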

Be seeing you.