Channel: Cameron's Blog For Essbase Hackers

Hear all about Developing Essbase Applications


You’ve read the voice, now hear it

Here’s a stumper for you – when you read a blog, a newspaper article, or a book, whose voice do you hear when you read?  Not yours, unless you’re reading your own work.  Surely not the character or authors either.  I note that Pierre, his mother, his father, the lion, and the doctor all sound rather the same, don’t they?

But of course that’s not what is meant by a writer’s voice – that’s the style, syntax, etc.  C’mon, you have to admit (for better or worse, likely worse) that this blog has a particular style.

But wouldn’t it be great (it would, wouldn’t it?) to hear what the authors really sound like?  You can do that if you come to Kscope.  Or listen to a webinar.  Or even better, buy the book and then listen to us talk about the book.

Oracle Author Podcasts has it

I am very happy to announce that John Baker of Oracle interviewed coauthor Gary Crisci and yr. obnt. srvnt. all about the very best, most advanced, and just generally chock full of good stuff Essbase book there ever has been, Developing Essbase Applications.  In my opinion, of course.  :)

Where oh where do you hear this?

Why, at the Oracle Author Podcast web page, of course.  We’re right there, at the top of the page, and of course if you have iTunes, you can see us there as well:
If you wonder about the title, there was a hard limit on the number of speakers I could bring onto the podcast (yes, I wanted all of you fabulous writing Essbase cats there so I could relive my editor-in-chief joyful agony, but for some reason that was felt to be slightly unwieldy so two of us it was).

Here’s the complete description for your viewing (and listening) pleasure:
Developing Essbase Applications:  Advanced Techniques for Finance and IT Professionals is the proceedings of the best technical Essbase conference there ever could be with unparalleled investigation and explanation of Essbase theory and good practices.

Essbase is a powerful and intuitive tool used to build highly useful analytical models, reporting systems, and forecasting applications.  Essbase’s ease of use and power enables a rapid development cycle.  This highly productive environment does not eliminate the need for good design.  Proper design results in an elegant solution with high data quality.  Poor design results in one-off solutions and difficult to maintain applications with poor data quality.

We love Essbase and hate to see Essbase done wrong.

What’s it all about, Alfie?

Have you ever wondered why we wrote this love letter to Essbase?  I could tell you, but then you wouldn’t listen to the podcast, would you?  So listen to hear who at least some of us are, why we did it, and where we think Essbase is going.  Truly a Ripping Yarn.  Join us there, won’t you?

Will the 13th be a lucky day


Nope, I am not becoming superstitious

Oh, I never walk on a crack, lest I crack my mother’s back (you have to see the pavement heaving in my town – I blame all those lovely trees), nor do I walk under ladders (having fallen off a tall one – yeah, that explains a lot, doesn’t it? – I know what is up will eventually come down, maybe on my head), and I never throw my hat on a hotel bed (when I am wearing one of my boonies, I am more likely to be near a sleeping pad and bag), so no, I am not becoming superstitious in my old age.  And 13 December is not Friday the 13th,  so really, what could possibly go wrong?  Ah, but a man's reach should exceed his grasp, Or what's a heaven for?

I am hoping for luck, however, because it will be required

Why?  Because I (I should say we, as in ODTUG) am going to participate in something new, innovative, quite possibly very rewarding, and also very possibly a bit of a stress-inducer.  What oh what oh what am I talking about?  Nothing other than the first ever ODTUG virtual experts panel.

Panel beating

I have participated in, and moderated, a few Kscope panels (I leave it up to the reader to decide if yr. obdnt. srvnt. deserved to be up front – I have to say I sometimes wonder) and I know how much fun, and yet how informative, they can be.  From a preparation perspective, all one needs to do is bring knowledge and experience, and a healthy desire to chime in – the rest is magic.  And from an audience perspective, panels are an opportunity to get some (hopefully) knowledgeable opinion on technical matters from a variety of perspectives and experience.  Did I mention that moderating these things is like trying to herd autistic cats?  Fun all around.

Kscope (like Christmas) comes but once a year, but the need and desire for panels happens the other 51 weeks.  What to do?  Enter a brainstorming IM session between John Booth and myself.  We were kicking around the idea of doing something like the show these two morons/idiots/geniuses/very funny guys do for the automotive world, but for the EPM world, when one of us (I know not which, but I suspect it was John) said, “Why can’t we do a panel?”  And thus the ODTUG virtual experts panel was born.  (If you follow the link, replace Chemical X with very strong coffee.)

Not just EPM

One thing to note – as John and I talked about this idea, we realized that to limit this just to EPM was silly.  You will note that two out of the four panelists are NOT from the EPM world at all, but the larger data integration and business intelligence communities.  I am particularly excited about this (and excited that we were able to go outside of the US and get people many, many, many hours ahead of the States to participate) because Oracle’s tools are crossing disciplines.  We chose Oracle Data Integrator because it is an exemplar of a tool that does just that.  Given ODI’s read-from-anywhere, write-to-anything nature, looking at ODI from an EPM perspective simply didn’t make sense.  And so we are not.

The vision

The way this is going to work is:
  1. ODTUG is going to scour the world for the best practitioners in a given field.  In our first go round, focusing on ODI, we have Matthias Heilos, Gurcan Orhan, Mark Rittman, and (somehow) yr. obdnt. srvnt, up on the panel.
  2. You, dear audience, will connect to ODTUG’s GoToWebinar session and listen to John kick off the panel.
  3. You will listen to our witty yet wise banter and repartee and this dialog will spur ODI questions.
  4. You will send questions to the ODTUG GoToWebinar administrator via G2W’s (I am not typing that out any more) chat feature.
  5. The ODTUG G2W (hi, Lori Lorusso) admin will collate the questions and pass them to John and John will ask the panel the question.  Btw, this approach is because moderating a panel and running a webinar is akin to rubbing your stomach and patting your head at the same time.
  6. Chaos/genius/a cacophony of panelist voices will ensue, and hopefully your question will be answered.
  7. This will all be a great success (bar some inevitable minor logistical SNAFUs as we climb the learning curve, and yes we have rehearsed it beforehand, but there will be stress for we panelists, not you the audience) and you will see many more virtual panel webinars from ODTUG.  Did I mention this will be great?

Be a part of it

A panel differs from a presentation in its spontaneity and improvised content.  In other words, while this webinar will be recorded, it does not follow a set path and you most certainly can and will influence what we talk about.  But of course you have to be there to impact it.

Here’s the gen

What:  ODTUG Expert’s Panel Webinar - Oracle Data Integrator (ODI)
When:  13 December 2012, 1:00 pm to 2:30 pm Eastern (US) Standard Time
How:  https://www3.gotomeeting.com/register/270041222

Join us, won’t you?

Stupid Programming Tricks #15 -- @SHARE the pain with FIX


What is the deal with shared members?

Have you ever been to a conference or seminar or webinar or meeting or whatever, and asked a seemingly simple question and just stumped the presenter 100%?  Some blather on forever hoping that the noise will confuse you, others pause, look at you with that I-hate-you-because-you-have-uncovered-a-chink-in-my-intellectual-armor-but-I-cannot-show-that-my-ignorance-is-causing-me-angst look (What, you’re not familiar with that? – go on, stump a self-important consultant <grin> and see what he does.), and will then say, “That’s a really good question.  We’ll cover that later.”  FWIW, I have noticed that later never comes.  <even bigger grin>

Adventures in deflating self-important people are always fun, except of course when you are the target, or at least are collateral damage.  What I’m talking about is this thread over on what-is-possibly-my-favorite-board-in-the-whole-wide-world, Network54:

Anyway, I am a consultant, hopefully not self-important, and I was 110% (Sammy Davis, Jr., one of my favorite singers – yeah I know what you’re thinking, but check out his older stuff, no Candy Man for me – always gave an impossible 110% effort when entertaining, so he is my blog idol) stumped by this question.

Later is now, at least for shared members.

Back up a bit

A super quick review – shared members are members that are referenced in more than one hierarchy, but only stored once.  A simple example of this is Sample.Basic’s Product dimension which has a Diet parent and three shared children as shown below:
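For reference, here is a text sketch of the relevant slice of Sample.Basic's Product dimension, standing in for the original screenshot:

```
Product
  100 (Colas)          100-10, 100-20, 100-30
  200 (Root Beer)      200-10, 200-20, 200-30, 200-40
  300 (Cream Soda)     300-10, 300-20, 300-30
  400 (Fruit Soda)     400-10, 400-20, 400-30
  Diet
    100-20 (Shared Member)
    200-20 (Shared Member)
    300-30 (Shared Member)
```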

Shared members + @REMOVE + FIX = huh?

Check out this thread.  Passing over the fact that I 100% (or should that be 110% in keeping with my Sammy Davis, Jr. theme?) forgot that I participated quite extensively in that thread, I proved that FIX statements with @REMOVE and shared members do not work.  Do not work?  Yup.  

Case #1 – First you don’t see it

All I am doing is removing the member 200-10/Old Fashioned from the FIX scope.  The database is empty, I am forcing the creation of blocks on sparse calculation, and my expectation is that Sales in Jan, Actual, and every product but 200-10 will have a value of 100.
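The original script was a screenshot, but it presumably looked something like this minimal sketch (the SET command and the dense/sparse assignment are my assumptions; the @REMOVE logic is as described). The Case #2 variant simply swaps "200-10" for the shared "200-20".

```
/* Case #1 sketch -- value all level-0 Products except stored member
   200-10.  SET CREATENONMISSINGBLK forces block creation on the empty
   database; that setting is an assumption on my part. */
SET CREATENONMISSINGBLK ON;

FIX ("Jan", "Actual", @REMOVE(@RELATIVE("Product", 0), @LIST("200-10")))
    "Sales" = 100;
ENDFIX
```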

Yup, just as expected – 200-10 is #Missing which is just what the code said to do – all level zero Products except 200-10.


Case #2 – And now you do

If I clear out the database, and then change the scope so that the FIX now removes 200-20 – that is the shared Diet Root Beer, the code looks the same except for the member name.  The result should be the same as the above, but just for a different member.


And what do we get?



Whoops.  That wasn’t what was supposed to happen, was it?  And of course we now see it twice, once in the stored location as a child of 200 and again as a shared child under Diet.  Huh?

Don’t believe me?  Remove the entire Diet hierarchy from your copy of Sample.Basic – 200-20 will then not have a value of 100, it will be #Missing.  Shared members are behaving in an interesting, yet confusing manner.  Hang on, it’s about to get more confusing.

Case #3 – Trying yet again, the same but differently, and getting different results

What happens if I change the hierarchy a wee bit?  In the below example, I have created a new parent, Total Product, and pushed products 100, 200, 300, and 400 underneath it.  Diet is now a sibling to Total Product.  What oh what oh what do you suppose happens when the same code as above gets applied to this?

The code in all its glory.  I have expanded this to all three shared members but really, the logic is the same.
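Reconstructed from the description (a sketch, not the original screenshot), the explicit-list version would be along these lines:

```
/* Case #3 sketch -- the three shared Diet children explicitly listed
   in the @REMOVE */
FIX ("Jan", "Actual",
     @REMOVE(@RELATIVE("Product", 0), @LIST("100-20", "200-20", "300-30")))
    "Sales" = 100;
ENDFIX
```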


A great big heaping bowl of whaaaaaaaaaat?  Doesn’t taste very good, does it?

Or maybe it’s just an acquired taste?  How oh how oh how did 100-20, 200-20, and 300-30 not get valued?  That is what we wanted, right?  Do we have an answer?  We are getting close.

Case #4 – Variation on a theme by Shared

What happens if I go after “Diet”?  FWIW, I also stuck a @LIST around Diet’s @RELATIVE – no difference in behavior.
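As a sketch (the original was a screenshot), the Case #4 selection looks like this:

```
/* Case #4 sketch -- the same removal expressed through Diet's
   parentage; wrapping the @RELATIVE in @LIST behaves identically */
FIX ("Jan", "Actual",
     @REMOVE(@RELATIVE("Product", 0), @RELATIVE("Diet", 0)))
    "Sales" = 100;
ENDFIX
```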

I would love to write that the below is a joke, but it is not.  Sigh, we’re right back to the beginning.


Case #5 – Throwing a spanner into the works

No, not this one:

Although this post is now at the Lost and Foundry stage.  

I am using a monkey wrench called “UDA”.  Throw a UDA called “SharedandRemove” against those shared members:

And then change the code to this:
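The screenshot is gone, but the change was presumably along these lines (a sketch; whether the original fixed on Product or on the new Total Product parent is not recoverable, so Total Product is assumed here):

```
/* Case #5 sketch -- shared members selected via the SharedandRemove
   UDA, with Total Product in place as a sibling to Diet */
FIX ("Jan", "Actual",
     @REMOVE(@RELATIVE("Total Product", 0), @UDA("Product", "SharedandRemove")))
    "Sales" = 100;
ENDFIX
```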
And we get…


There is no Ghost in the Machine, that’s exactly like using @REMOVE with @LIST(“100-20”, “200-20”, “300-30”).

Case #6 – Back to square 1, year dot, zero, etc.

But, if I remove Total Product and make the Product dimension look the way it “should”:

And change the code to just refer to the top of the dimension:
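In sketch form (again, the original was a screenshot), referring to the top of the dimension would look like this:

```
/* Case #6 sketch -- same UDA selection, but against the original
   hierarchy where stored and shared members sit under Product with no
   separate Total Product parent */
FIX ("Jan", "Actual",
     @REMOVE(@RELATIVE("Product", 0), @UDA("Product", "SharedandRemove")))
    "Sales" = 100;
ENDFIX
```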
We get…

As Charlie Brown would say, “AUUUUUUUGGGGGGHHHH!!!!”  Back again to the beginning.  What is going on?

What can we conclude at this stage, and where do we go from here?

The following can be observed:

  1. Case #1 – @REMOVE against non-shared members works
  2. Case #2 – @REMOVE with shared members does not behave the same way
  3. Case #3 – When @REMOVE is used against shared members with a different parent and those shared members are explicitly listed, @REMOVE against shared members does work
  4. Case #4 – When @REMOVE is used against shared members with a different parent and those shared members are referred to by a parent function, @REMOVE against shared members does not work
  5. Case #5 – When @REMOVE is used against shared members with a different parent and those shared members are referred to by a UDA function, @REMOVE against shared members does work
  6. Case #6 – When @REMOVE is used against shared members with a common parent and those shared members are referred to by a UDA function, @REMOVE against shared members does not work

From this, I think we can accurately surmise that:

  1. If you want to remove shared members from a FIX range, at the very least the shared members must not have a common parent.  Sticking Total Product into the hierarchy and making it a sibling to Diet allowed Case #3 and #5 to work.
  2. Even though it is completely non-intuitive, the way the members are referenced changes Essbase’s behavior.  Even in a non-common parent scenario as I outlined above, when @RELATIVE(“Diet”, 0) is used to remove the shared members, the @REMOVE does not work.  But a @UDA or an explicit list does.  So weird.

So then the question is – is there any function in the really rather large BSO calc script function set that might make any difference?  Why yes, there does seem to be one.  Or <insert corny ghost music right here> is there?

Can @SHARE make things better?

There appears to be a function that deals with shared members directly.  Could the name possibly be @SHARE?  Why yes it could.  Hopefully it fixes everything.  I am the optimistic sort, aren’t I?

What do the docs say?

If you want to get it from the horse’s mouth, go here:
http://docs.oracle.com/cd/E17236_01/epm.1112/esb_tech_ref/share.html

My summarization

Right at the very tippy top is the bit that piqued my (and maybe yours?  Surely, if you have read this far) interest:
“Checks each member from rangeList to see if it has a shared member and returns a list of the shared members it has found.”

And my question

Does identifying shared members through the @SHARE function make their selection any better?  One is given hope through this bit of the documentation:
To remove a specific member from the Product dimension, you can use @SHARE specifying the shared member to be removed:
@REMOVE(@DESCENDANT(Product),@SHARE("100-20"))  

Now that looks like good old Sample.Basic aka MVFEDITWWW (somewhat unbelievably, that bit of doggerel can be searched on so go ahead and find out what it means).  Channeling W.S. Gilbert, maybe not.

Case #7 – Time for some disappointment

So what happens when we use the above code with a mild modification to change the @REMOVE to go after level zero members only?  Should be good, right?
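That mild modification, sketched out (the level-0 narrowing is the point; the rest of the FIX scope is assumed):

```
/* Case #7 sketch -- the documentation's @SHARE example, narrowed from
   @DESCENDANTS to level-0 members only */
FIX ("Jan", "Actual",
     @REMOVE(@RELATIVE("Product", 0), @SHARE("100-20")))
    "Sales" = 100;
ENDFIX
```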


Oh, bugger.  Btw, I am going for definitions 9 and 11 in that link.  Why?  ‘Cause @SHARE is just as bad as before, if before means use cases #2, #4, and #6.  Put on a sad face.

Even with Tony Bennett singing the above is bad.  But at least we have the consolation of 33 & 1/3 and Hi-Fi.

Case #8 – Meet the new boss, same as the old boss

Any better with a different parent?  Uh, no, not really.



Oh, bugger yet again.  @SHARE is more than a bit pointless if you ask me, at least within the context of a FIX.

Case #9 – One last try

Nope, neither Conrad Birdie nor Dick Van Dyke nor Janet Leigh (phwoar) nor Ann-Margret (double-phwoar) can save this, but we can try.  And yes, you can likely tell that 1965 is my cutoff date for movies worth watching.  Moving beyond my antediluvian movie/music (mostly)/literature tastes, Kyle Whigham on that original Network54 board post noted:
Looks like I encountered a similar issue to yours. I was able to resolve by including the stored member and the dhared (sic) member in the list for members to be removed.

i.e.


@Remove("Products",0, @List("Member 1", @Share("Member 1")))

Hmm, the above is shades of the explicitly listed members.  So what happens with something like this?
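Something like this, sketched out with Kyle's pattern extended to all three shared members (the original screenshot is gone, so the exact FIX scope is assumed):

```
/* Case #9 sketch -- each stored member paired with its @SHARE
   counterpart in the removal list */
FIX ("Jan", "Actual",
     @REMOVE(@RELATIVE("Product", 0),
             @LIST("100-20", @SHARE("100-20"),
                   "200-20", @SHARE("200-20"),
                   "300-30", @SHARE("300-30"))))
    "Sales" = 100;
ENDFIX
```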

Honestly, what is the point of using @SHARE if you have to explicitly list the members to be excluded?  None, actually.  We are back at use case #3, but at least this time we don’t have to worry about having a separate non-shared parent.  Note that the other shared members are not excluded, as one might hope.



Bye-Bye, @SHARE

Given the almost unmeasurable improvement of @SHARE in FIXing on members, I think we can safely say @SHARE is a bit of a damp squib.

Get that bad taste out of your mouth

The awesomeness that is Essbase gives us a way out:  EXCLUDE...ENDEXCLUDE.  This isn’t exactly a fix for FIX (oh, I kill myself) and shared members, but it is surely a way round the problem, at least in Sample.Basic.
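The screenshot is lost, but piecing it together from the text, the "impossible" part was an EXCLUDE block closed by ENDFIX rather than ENDEXCLUDE; the member selection in this sketch is my guess:

```
/* EXCLUDE sketch -- note the (wrong-looking, yet validating and
   working) ENDFIX closing the EXCLUDE block, as described below */
EXCLUDE ("Jan", "Actual", @RELATIVE("Diet", 0))
    "Sales" = 100;
ENDFIX
```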

Huzzah!  Success!  Boil in bag!  Btw, do you see the impossible code above?  It validates and works – I love finding stuff like this.


I love a happy ending

Could @SHARE work with EXCLUDE…ENDEXCLUDE?  Why yes it could.  And this time no bug in the code.
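In sketch form (the screenshots are gone; the FIX scope is assumed, and this time the block is properly closed with ENDEXCLUDE):

```
/* @SHARE inside EXCLUDE...ENDEXCLUDE -- here @SHARE picks out just
   the shared instances, and the exclusion works as hoped */
FIX ("Jan", "Actual")
    EXCLUDE (@SHARE("100-20"), @SHARE("200-20"), @SHARE("300-30"))
        "Sales" = 100;
    ENDEXCLUDE
ENDFIX
```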





The above success of @SHARE within EXCLUDE…ENDEXCLUDE (we will leave aside the EXCLUDE…ENDFIX bug) and its failure within FIX…ENDFIX makes me think all the more that the way shared members work within FIX is part of some bigger bug.  Why, why, how, and most importantly, when will it get fixed is 100% (or is that 110%?) out of my bailiwick, but it is interesting.

Is there a moral to this story?

All of my stories end with a moral, or at least a conclusion, I hope.  And what is that finding, if you must use FIX...ENDFIX?  It is this:
If you want to remove shared members, and do not want to reference specific members, your best chance is use case #5, where you stick those shared members under separate parents and select them through UDAs (or attributes).

And if EXCLUDE...ENDEXCLUDE tickles your fancy, then @SHARE works perfectly, but the outline has to work out that way.  A great reason to get all calculation requirements before database design commences. 

Btw, if you wonder why I spent this much time working on a seemingly simple topic in a supposedly dead language (BSO ain’t Latin, yet, and one wonders sometimes except there are all of those BSO apps out there), it’s because I really think this one (two, actually) is a bug.  I leave it to someone working at a platinum partner (they can raise Service Requests and yr. obdnt. srvnt. does not rate that) or a customer (you must have a valid maintenance contract) to log this one with Oracle Support.  And when you do, please tell me the bug number.  

Be seeing you.

Fixing Planning's annoying LCM dimension import error


The problem

I am a big, big, big fan of Lifecycle Management (LCM) for migrating Planning applications.  It gets better and better with each release, and has (hopefully) forever gotten me out of the business of modifying Planning schemas and hoping that I got everything right when I did it.  Yup, LCM is great, but I have found that sometimes when I migrate an application, or try to pull dimensions out of an application, the dimensions import in the wrong order, and then those rogue dimensions cannot be moved in the Performance Settings screen.  

NB – In case you have still not twigged on to this, I am talking about Classic Planning.  I try, successfully I might add, my very best to stay away from EPMA.  You can draw your own conclusions as to why, but have a read of this thread for more information.

Back to the issue, am I imagining that Planning dimensions cannot be moved, sometimes, after LCM importation?  Nope, as it is documented in this Oracle Support KB:  Unable to Order Dimensions Through Hyperion Planning 'Performance Settings' Screen [ID 1072365.1].

And what does that really mean?  

Here is a brand new Planning database with the dimensions in the (sort of, actually, I want Segments to go right above Entity) right order.

Yes, that looks an awful lot like the Planning sample application, but it’s my take on it for use in a future (hint, hint) blog post.  I want the dimensions to be ordered (mostly) this way.  

So what happens when I use LCM to import those base dimensions from the Planning sample application?

Here I am grabbing them from an LCM file source:

And my target – have you figured out yet what the next blog post is likely to be about?

In progress:

Done, and successfully:

And here it is:

Guess what, Segments is where it ought to be (somewhat miraculously as it has moved, and that is a clue), but Entity is not.  Can I move Entity?  Why, I cannot move the !@#$ing thing.  The up and down arrows – they do nothing.  Ai, Caramba!  This is the bug referenced in Oracle Support.  If you read that Support article, it claims that this is fixed in 11.1.2.1.  I have news for you – this is on 11.1.2.1.  The bug is still with us.

Two possible fixes

Now to be fair to the KB article, it has two solutions.  The first suggests editing the HSP_DIMENSION table and modifying the dimension’s POSITIONx field.  I am a tremendous fan of reading from the Planning tables, but I get nervous (I am the nervous type in general) when I change the tables.  What could possibly go wrong?  Uh, everything, if I FUBAR/SNAFU it as is my wont.

The below may seem a little daunting, but is actually quite simple, although I would recommend using some of the queries I’ve written about in the past to figure out what those DIM_ID values actually pertain to:
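A hypothetical sketch of what that first solution looks like (HSP_DIMENSION and the POSITIONx field come from the KB article; the exact column name and the DIM_ID value here are illustrative only, so query the table first to find yours):

```sql
-- Set Entity's performance order to 4 for plan type 1.
-- POSITION1 and the DIM_ID value are assumptions -- verify both
-- against your own Planning repository before running anything.
UPDATE HSP_DIMENSION
   SET POSITION1 = 4
 WHERE DIM_ID = 50123;
```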

The other approach is (and I only quote this because it is incomplete) this:
SOLUTION 2:
1. Export the Standard dimension settings using LCM.
2. Edit the xml with notepad, ensure there are no duplicate values in the Plan1, Plan2 etc., performance orders.
3. Import the standard dimension you edited back into the app using LCM.
4. Planning performing as expected.

As far as I can tell, the duplicate values issue above is not the issue/fix.  Well, maybe it is sometimes, but not in the context of the single PlanType application that is the sample Planning application.  Maybe that is the right approach in multiple PT apps, but I leave that to you to prove or disprove.

The fix that works

The KB is so close – all we need to do is combine the first suggestion and the second one and we are going to be very happy geeks.  What do I mean?  Well, remember that LCM xml files contain everything there is to know about a dimension, including its position.  How else do you think the Planning repository gets that information?

And what oh what do we see below?  Is that the dimension order?  Why, yes it is.

And what happens if I turn that 3 to a 4 and reimport the dimension?
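A hedged sketch of the relevant bit of the Entity dimension's LCM export file (Plan1PerfOrder is the element discussed here; the surrounding element and attribute names are illustrative only):

```xml
<Dimension name="Entity">
    ...
    <Plan1PerfOrder>4</Plan1PerfOrder>  <!-- was 3 -->
    ...
</Dimension>
```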

‘Arf a mo, guv’nor,  won’t there be another dimension with an order of 4?  Yep.

Hmm, let’s take a look at all of the Plan1PerfOrder settings in the export files, and how we want them to look:

Dimension                                          Exported order     Desired order
Period                                             2                  1
Account                                            1                  2
Segments                                           4                  3
Entity                                             3                  4
Year                                               3 (yes, really)    5
Version                                            8                  6
Scenario                                           7                  7
Currency (to be revealed in the next blog post)    N/A                N/A

What needs to be done then is to set all of the dimensions to the right order and then bring them all in.  

Problem (hopefully) solved.

Set those dimension orders

Period


Account


Segments


Entity


Year


Scenario


Version


And do that import

Save the files, and then reimport.

Success, boil in bag

LCM works just fine, so those xml file mods were tickety-boo.
And then see what we have in Planning.
Ta da, we are done as the dimensions are in the right order.  That was only slightly painful, and now I can go on my merry way with the work I really want/need/am-compelled-to-do-even-though-it-may-spell-my-certain-doom-such-is-my-lot-what-is-yours to do.


Stupid Programming Tricks #16 -- Special characters in Substitution Variables with MaxL


Shouldn’t I be able to find this?

I was working on an upgrade project this week from Essbase 9.3.1 and as part of this process I needed to add a bunch of Essbase substitution variables to my test server.  This should have been No Big Deal, especially as the client was going to give me a MaxL script with the variables and their values all ready to go.  Simply change the username, password, and server and everything will be tickety-boo, right?

So how do you add a variable, anyway?

I don’t know how the variables got onto the client’s server, but I suspect they have been there for a good long time, given that this is a 9.3.1 instance of Essbase.  What I got from the client was code that looked like this:

alter database appname.dbname set variable variablename value;

What oh what is wrong with this approach when said variable (I am calling it CLTest) doesn’t exist?  Simply this:

After a bit (okay, a lot) of headscratching it made sense – the variable has to exist before it can be set.  In other words, set variable only works when the variable already exists.  Uh oh, as I had 76 variables to value.  For sure I didn’t want to set that in EAS.  So, I thought in my sometimes-logical mind, surely if I can set it, I can add it, right?

However, after a review of the alter command’s 11.1.2.2 (and yes, I am finally on an 11.1.2.2 project, only eight months or so after its release, Huzzah!) documentation at the system, application, and database level, I couldn’t find how to add (I hope with the repetition of that word I have clued you in by now) a variable.

So I went to the great false god Google and searched for how to do this and got lots of hits but got…nothing.  Lots and lots and lots of ways to set the values, but not to add them.

And then it hit me

What oh what oh what would happen if I modified the alter command, took out set and put in add, as adding a variable is what I want to do?

And so it was:
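In sketch form (the original screenshot is gone; the variable value is a placeholder):

```
/* add creates the variable; set -- which failed before the add --
   then values it */
alter database sample.basic add variable CLTest;
alter database sample.basic set variable CLTest 'SomeValue';
```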

A bozo is what I am for struggling with that for 15 minutes.

MaxL functionality through the ages

I don’t have a 9.3.1 instance to prove this out on, but the client said this (see below) worked and it most definitely did not in 11.1.2.2, so I am going to guess that it changed.  In any case, this stuff is not all that well documented, so I thought I’d illustrate these so you don’t have to go out of your mind (Question:  If I am already out of my mind, and I claim that these things drive me out of my mind, does that – make it worse, who can tell, or, and  most likely, does anyone care?  But I digress.) trying to figure it all out.

Doesn’t work #1 – variable as a numeric value

I got this (changed to something generic lest I get sued) and it supposedly (I have no reason to doubt them, I just can’t see it for myself) works in 9.3.1:

alter database sample.basic set variable CLTest 3 ;

In 11.1.2.2, that resolves to:
Hmm, I can’t remember setting a substitution variable in any version of Essbase where the variable value is a number.  How oh how oh how do we do it in 11.1.2.x?  Careful (well, lucky) observation shows single quotes around the 3 above.  Could that be the answer?  Why yes it could.

alter database sample.basic set variable CLTest '3' ;


And yes, you can reference CLTest as a value in a calc script.  A very interesting approach and one that I’ve never seen nor thought of.

Doesn’t work #2 – double quotes around the member name

Again, I don’t know how this works in 9.3.1; I suspect maybe not given the date of this Network54 thread that shows the solution.  In 11.1.2.2, simply sticking double quotes around a value will result in “success”, but it will also mean that the variable value will not resolve in BSO (or ASO, for that matter, although there [ and ] need to surround the member name) as BSO Essbase needs that " and " around the variable value.  What do I mean?
 

So fix it with special escape characters (this one I did not think up on my own as per above, but I thought I would illustrate it anyway).

alter database sample.basic set variable CLTest "\"Opening Inventory\"" ;


Btw, did you notice how the leading and trailing double quotes are different?  Why?  Why doesn’t this work?
 
alter database sample.basic set variable CLTest "\"Opening Inventory"\" ;

Dunno, but this is the result.  
Go on, try it.  You won’t get MaxL to execute that line.  My guess is that once "\" starts off a string, it needs to get closed with \"".  Or at least it appears that way.

Woot, woot, woot!  Geek alert, jump to the bottom to understand why the above is true.  Thank Jason Jones for the full explanation.  A summary of the conversation (I hope you are enjoying the Christmas-y colors) is that the double quotes in green are MaxL’s and the red \" is the escaped Substitution Variable’s double quote.

alter database sample.basic set variable CLTest "\"Opening Inventory\"" ;

And that’s the end

I have to admit that I debated (I do actually think, or what passes for thinking for me) about writing this one but then I was consoled by the thought that this is called a “Stupid Trick”, so it isn’t as though you weren’t warned.  

OTOH, the add functionality isn’t documented anywhere, nor are the single and double quote approaches, so I think this one is worthy of a “Stupid Trick”.

Be seeing you.

The totally geeky cool addendum
Jason and I were IM chatting as we are wont to do, and this blog post came up.  Follow along and excuse the misspelt words that are the product of geeky enthusiasm.  He’s giving me a pretty good lesson in why MaxL works the way it does.  See, this blog is educational.  

Herewith the conversation:

[15:53] Jason Jones: btw, just saw your newest blog article
[15:53] Jason Jones: seriously, hit me up if you have quotes issues -- its a programming thing that i know inside and out :)
[15:53] CameronLackpour: I was writing it.
[15:53] CameronLackpour: So why does the leading quote look like "/" and the trailing quote look like /""
[15:53] CameronLackpour: That is super weird.
[15:53] Jason Jones: not at all
[15:54] Jason Jones: man, i get to school you in this quotes stuff left and right
[15:54] Jason Jones: :)
[15:54] CameronLackpour: Doesn't bother me.
[15:54] CameronLackpour: Btw, those escape codes are 100% undocumented.
[15:54] Jason Jones: i've actually run into this same thing with sub vars
[15:54] Jason Jones: well, they are undocumented, kind of
[15:54] CameronLackpour: Oh, everyone has.
[15:54] Jason Jones: if you have, say, programmed in perl it's sort of implicit
[15:54] CameronLackpour: Hmm, but MaxL isn't perl.
[15:55] Jason Jones: it's not but it has Perl/PHPish quoting semantics
[15:55] Jason Jones: so the basic idea is this
[15:55] Jason Jones: sub vars are strings
[15:55] Jason Jones: right?
[15:55] Jason Jones: they don't have a type like numeric or whatever, they are arbitrary text -- a string
[15:55] CameronLackpour: Except of course when ' and ' surround it.  Then it's a (if numeric) value.
[15:55] CameronLackpour: I verified that the '3' worked as an assign in a calc script -- it does.
[15:55] Jason Jones: well that's sort of another issue you are talking about
[15:56] Jason Jones: so in scripting languages you can quote things in different ways
[15:56] CameronLackpour: I am seeing a JJ Essbase blog post, but go on.
[15:56] Jason Jones: :)
[15:56] Jason Jones: let's invent a scripting language right now called CLScript
[15:56] Jason Jones: and in CL script you can put values into variables and you can print things
[15:56] Jason Jones: variables shall be prefixed with a dollar sign
[15:57] Jason Jones: $favoriteColor = "Blue"
[15:57] Jason Jones: print $favoriteColor
[15:57] Jason Jones: output is Blue, with no quotes
[15:57] Jason Jones: so I'll denote that as this: > Blue
[15:57] Jason Jones: now, CLScript also allows us to use single quotes to denote variable values as well
[15:57] Jason Jones: $favoriteColor = 'Blue'
[15:57] Jason Jones: print $favoriteColor
[15:58] Jason Jones: > Blue
[15:58] Jason Jones: fair enough?
[15:58] CameronLackpour: Sure
[15:58] Jason Jones: $favoriteColor = "Dark Blue"
[15:58] Jason Jones: print $favoriteColor
[15:58] Jason Jones: > Dark Blue
[15:58] Jason Jones: we have a space in our value but that's okay because as far as the CLScript interpreter is concerned, it just takes everthing between the quotes and thats the value of the variable
[15:59] Jason Jones: so it doesn't care
[15:59] Jason Jones: let's also say that in CLScript we can write this:
[15:59] Jason Jones: $favoriteColor = Blue
[15:59] Jason Jones: print $favoriteColor
[15:59] Jason Jones: > Blue
[15:59] Jason Jones: it *works* but is kind of frowned upon
[15:59] Jason Jones: CLScript decided to be nice and said "well, I don't like it but there's only one thing there so I'll pretend it has quotes around it"
[15:59] Jason Jones: but saying $favoriteColor = Dark Blue doesn't work
[16:00] Jason Jones: because the CLScript interpreter is like: Well, it looks like shit but i'll either not take it or I'll just take Dark since that's the first thing I see
[16:00] Jason Jones: so that's a toss up
[16:01] Jason Jones: CLScript is smart and takes absolutely everything in between the double quotes as its value… but what if you want a double quote inside the double quotes?!
[16:01] Jason Jones: $favoriteColor = "Dark "Navy" Blue"
[16:01] Jason Jones: it can't figure it out
[16:01] Jason Jones: so for ease of use, CLScript says, okay, if you want double quotes INSIDE your value I will let you put them into single quotes
[16:01] Jason Jones: $favoriteColor = 'Dark "Navy" Blue'
[16:02] Jason Jones: print $favoriteColor
[16:02] Jason Jones: > Dark "Navy" Blue
[16:02] Jason Jones: but what if we really have to use double quotes to quote a string that has double quotes in it?
[16:02] CameronLackpour: Like "Beginning Inventory"
[16:03] Jason Jones: CLScript says, okay, you can "escape" the quote -- put a backslash in front of a double quote and I'll now that that isn't a double quote to indicate the start and stop of your value, but literally you want a double quote
[16:03] Jason Jones: (that's another issue actually -- one sec)
[16:03] Jason Jones: so we can "escape our quotes
[16:03] Jason Jones: $favoriteColor = "Dark \"Navy\" Blue";
[16:03] Jason Jones: print $favoriteColor
[16:03] Jason Jones: > Dark "Navy" Blue
[16:04] Jason Jones: so it's not that the backslash double quote combination was the special character sequence on the outside of the value we want, it is for all the internal occurrences of a double quote
[16:04] Jason Jones: and that's why you have "\"Opening Inventory\"" instead of "\"Opening Inventory"\"
[16:04] Jason Jones: in fact, if using the latter form is not an error, it results in a value of this:
[16:04] CameronLackpour: Ooooh, I see
[16:04] Jason Jones: "Opening Inventory
[16:05] Jason Jones: with no end quote
[16:05] Jason Jones: now, all that being said there is yet one more wrinkle here
[16:05] Jason Jones: which is probably what's really biting you
[16:05] Jason Jones: aside from the double quote thign
[16:05] Jason Jones: so, we can go into EAS itself and edit our sub vars
[16:05] Jason Jones: &CurrMeasure, for example
[16:05] Jason Jones: i can go into EAS and set the value to this: Opening Inventory
[16:05] Jason Jones: notice no quotes
[16:06] CameronLackpour: hmm, would just \"Opening Inventory\" work?  I wonder why not.  Ah, because the interpreter needs the outer " and " around the internal \" Opening Inventory \".
[16:06] Jason Jones: what you say would not work
[16:06] Jason Jones: because you have two issues here
[16:06] Jason Jones: issue 1: MaxL recognizing the value and 2: the value itself from an Essbase perspective
[16:06] CameronLackpour: No, I think i get it.  MaxL pukes when there is a value with spaces.
[16:06] CameronLackpour: It needs "
[16:06] CameronLackpour:  and "
[16:06] Jason Jones: right
[16:07] Jason Jones: but here's the rub
[16:07] Jason Jones: if you put quotes around Opening Inventory and MaxL reads that, MaxL goes "okay, the thing between the quotes is the variable value"
[16:07] Jason Jones: BUT
[16:07] Jason Jones: here's the rub
[16:07] Jason Jones: and why it's more complex than that
[16:07] CameronLackpour: But then to get the escaped " and " -- that needs to go inside.  So "\"Operating Income\""
[16:07] Jason Jones: from an ESSBASE SUBSTITUTION VARIABLE STANDPOINT -- you need quotes on the subvar itself!
[16:08] Jason Jones: that's the difference between going into EAS and looking at sub vars and seeing one of these
[16:08] Jason Jones: CurrMeasure --> Opening Inventory
[16:08] Jason Jones: or
[16:08] Jason Jones: CurrMeasure --> "Opening Inventory"
[16:08] Jason Jones: if the value of the substitution variable itself doesn't have the quotes on it, it wont work!
[16:08] Jason Jones: because it looks like this when the code goes to execute
[16:08] Jason Jones: FIX(Opening Inventory)
[16:08] CameronLackpour: It'll get assigned, but just won't work./
[16:08] Jason Jones: rightio
[16:08] Jason Jones: FIX (&CurrMeasure)
[16:09] Jason Jones: straightup text replacement
[16:09] CameronLackpour: I will amend my text and give you educational credit.
[16:09] Jason Jones: and you can't put the quotes in yourself because then the FIX wouldn't be able to interpolate the variable since it wouldnt see it
[16:09] Jason Jones: ie
[16:09] Jason Jones: FIX ("&CurrMeasure")
[16:09] CameronLackpour: I know that FIX("&CurrMeasure") doesn't work
[16:09] Jason Jones: doesn't work
[16:09] Jason Jones: however
[16:09] CameronLackpour: Right, it tries to read it as a member name
[16:10] Jason Jones: and this is due to this measure having a space in the name
[16:10] Jason Jones: if it was OpeningInventory you'd be fine and not notice the issue
[16:10] Jason Jones: so probably how this manifests itself every now and then is that someone ends up googling "Error when using substiution variable with space in the name" or something to that affect
[16:10] CameronLackpour: Yes, that makes sense.  As I noted, I will amend it before Glenn jumps on it.  :)  I will give you all credit.
[16:11] Jason Jones: it's just a doubly gnarly issue when you add in the MaxL piece since you gotta get the quotes exactly right
[16:11] Jason Jones: :)
[16:11] Jason Jones: and then very, very lastly to really beat the issue to death, note that only double quoted strings interpolate variables
[16:11] CameronLackpour: What is interesting is that the client stated that "Opening Inventory" worked perfectly.  And yet when we looked in EAS we say "Opening Inventory", not Opening Inventory.
[16:11] Jason Jones: ie, we have an environemnt variable of NAME or something
[16:11] Jason Jones: so you might be able to do alter server set whatever "Name_$NAME"
[16:12] Jason Jones: and $NAME is replaced with the value of NAME
[16:12] Jason Jones: but if you do single quotes
[16:12] Jason Jones: 'Name_$NAME'
[16:12] Jason Jones: then the value of the variable is literally the dollar sign
[16:12] Jason Jones: that's why you might have to escape double quotes
[16:12] Jason Jones: to get variable interpolation as well as the quotes in your variable
[16:12] Jason Jones: alternatively in your example you could to this:
[16:12] Jason Jones: '"Opening Inventory"'
[16:12] CameronLackpour: Yes, I have seen this in MaxL with parameters.  Generally, I can throw a $1 or $2 into a string (like for file names) and so long as there are no spaces, everything works.
[16:12] Jason Jones: (single quote, double quote, value, double quote, single quote)
[16:13] CameronLackpour: But I have to encapsulate all in double quotes.
[16:13] Jason Jones: well, now you know how to do it with spaces
[16:13] Jason Jones: so if you wanted to say, pass in a command line parameter to the script that is used in the variable, like say $1 = March, you could do this:
[16:13] CameronLackpour: Thanks.  Hey, I have to run (not to hide my face in shame, but to clean out a gutter before it gets dark), but thank you.  You can correct me any time.
[16:13] Jason Jones: alter foo set to "\"Opening Inventory for Month of $1\""
[16:14] Jason Jones: no problem
[16:14] Jason Jones: glad to help a fellow nerd :)
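For anyone who would like the moral of that conversation in runnable form, the same layered-quoting behavior can be demonstrated in any POSIX shell (the variable name here is purely illustrative; MaxL borrows these Perl-ish semantics):

```shell
# Demonstrating the layered quoting Jason describes, in plain shell.
VALUE='Opening Inventory'

# Double quotes interpolate the variable, and escaped \" survive as
# literal quotes, so the quotes end up INSIDE the resulting value --
# which is exactly what the calc script needs:
echo "\"$VALUE\""     # -> "Opening Inventory"

# Single quotes suppress interpolation entirely: the dollar sign
# stays literal, just as Jason warns at the end of the chat:
echo '"$VALUE"'       # -> "$VALUE"
```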

Dodeca dynamic reports


Introduction

One of the greatest strengths of Dodeca is that the many, many, many kinds of functionality that would elsewhere require scads of code are simply not needed, because Applied OLAP has already done all the work and built it into the tool.  Whether it be SQL drill through, frighteningly awesome member selectors that you can drive from dimensions, members, SQL, or delimited lists, or the ability to quickly convert complex Excel workbooks into highly-functional Essbase reporting decks – I could go on and on, and I have.

But what happens when you need to do something outside of those great pieces of functionality?  Yes, Dodeca does many, many, many things, but what about custom stuff?  Why thank you for asking, because that is where Workbook Scripts come into play.

What the heck are Workbook Scripts?

That is a very interesting question, and not an entirely easy one to answer.  Okay, part of the answer is straightforward -- WS (I am not typing out Workbook Scripts any more in this post) are how you customize a Dodeca View (for the uninitiated, Views are reports/forms/etc., usually, but definitely not always, a marriage of Essbase and Excel workbooks as hosted in Dodeca).  But WS are not some new-age Visual Basic for Applications for Dodeca.  They are more akin to Excel’s original macros (this is pre-VBA) but with multiple twists – WS have their own Essbase/SQL/Excel language, are tied intimately into Dodeca’s event model, and can even directly execute Excel formulas.  Did I mention that all of this happens in unison with Dodeca events occurring on the sheet AND all of the functionality and formulas that are intrinsic to Excel AND the Essbase API AND Essbase report scripts AND/OR MDX AND well, you get the idea?  It is, in a word, cool.

Crawl before you walk

If the above sounds overwhelming, I am doing WS a disservice – they are actually pretty easy to use (that is sort of their point) and a whole bunch of coding that you might have to do in another tool is simply not an issue, as Dodeca does it all for you.  Probably the best way to show this is an example, and per the title of this blog post, I am going to show you how to create a report that dynamically sets the rows to the level zero descendants of whatever off grid/POV Product is selected.  To do this I will write an extremely simple WS that will:
  1. Read the member selection from the Product treeview POV control
  2. Run a report script to get the inclusive level zero descendants of that Product
  3. Dynamically increase or reduce the number of Products on the sheet
  4. Retrieve the data from the sheet

A very simple report script


The first step is to write a report script for MVFEDITWWW (aka Sample.Basic) that mimics a drilldown in Excel.  I have been writing Essbase report scripts for longer than I care to think, so this was dead easy.  You could make this report as simple or as complex as you like – data is not the point, just metadata.  Here it is:
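The screenshot of the script has not survived syndication, but a report script of the kind described – rows built from the level zero descendants of Colas (member 100) in Sample.Basic – might look something like this sketch (an assumption on my part, not the actual script from the post):

```
// A sketch only -- most formatting commands trimmed; the real
// script is in the (missing) screenshot above.
{ TABDELIMIT }
<ROW ("Product")
<LINK (<DESC("100") AND <LEV("Product", 0))
!
```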
 

And the output (yes, there is data, no, I don’t care about the numbers, yet):

Tokens to the rescue

If I were writing this report to only do Colas I guess I could stop, but I have a mad lust for power, uh, flexibility, and who knows, I may share this report with someone who is responsible for Cream or Fruit Soda pop, so I am going to remove the member “100” and replace it with the token string [T.Product].  See my other posts on Dodeca for an explanation of how tokenization works.

Tokens via ScriptText

What matters now is that whatever Product member is selected is what gets put into the report script.  To do that, I will change the report: in the WS BuildRangeFromScript method’s ScriptText property I will put in a WS function called TokenValue, or @TVal (see what I mean about this looking like Excel 4-and-earlier macros?).  The bits in yellow are the token string [T.Product].  The bits in a sort of horrible brown are the function @TVal(<TokenName>), which makes Dodeca resolve the token before the report script is executed.

Also note that the whole thing is enclosed in double quotes – this makes sense as the Product name could be Fruit Soda instead of Colas.  Remember, Dodeca can pass either the name or the alias, depending on what the user has selected for the dimension control, so the report script has to accommodate either.

Just to be clear, to make Product tokenized, I put in the WS function and token string of:
@TVal([T.Product]) and encase that within double quotes so that it looks like:
"@TVal([T.Product])"

Ranges R Us

There are three ranges that must be defined for this process to work:
  1. An Essbase retrieve range – where Essbase will retrieve the data (remember, this is the range that we want to dynamically make larger or smaller).  This range will get larger or smaller depending on the output of the report script.  NB – Make this range one row (or column if this is a column build) longer than it needs to be to get Dodeca to insert rows correctly.
  2. A template range – a range that is repeated for each member in the output of the script.  Formatting, formulas, etc. go in here.  If the range is one row in height (which is the usual approach) then just one row will be inserted.  If the range is two or ten rows, then that many rows and their contents will be inserted when the row is repeated.  Think of this as a quick and dirty method to do multiple dimension drilldowns where the inner dimension is a fixed set of members.
  3. An Essbase target range – A single cell that tells Dodeca where to stick the copied template range.

Two simple range rules


  1. The target range must be inside the retrieve range
  2. The target range must be outside of the template range

Follow the above two rules and you will be happy.  Ignore them at your peril – I will illustrate later what happens to Essbase/Dodeca geeks that do not follow those two pretty simple strictures.  It’s ugly.

So what does this all look like?


Here’s the View template.  Note that there are no predefined members in the row headers – that will be the output of the report script.  You will also note the three ranges that I have named (Ess.Retrieve.Range.1 is fixed, but I could have named the other two Platypus and Orange if I wanted to; as I am not totally insane, yet, I chose meaningful names, and you should too).  Just so everyone is clear (are you getting the idea that the position of these ranges is important?), the ranges are as follows:
  1. In green in cell A7 is the target row for the inserted Products.  Note that the Insert.Marker range is within Ess.Retrieve.Range.1.
  2. In yellow from A4 to P8 is the Essbase retrieve range Ess.Retrieve.Range.1.  Note that it is one row longer than needed – this is to get the insert function to grow the range, just like in Excel (go on, try it).  If you don’t make Ess.Retrieve.Range.1 one row longer than necessary, you’ll get this when everything fires:
Note that rows have been inserted onto the sheet, but the range Ess.Retrieve.Range.1 has not been expanded.  You Have Been Warned but I am jumping ahead of the narrative.

  3. In blue from A3 to P3 is the row template range Row.Template – this is what will get repeated for each selected Product.

Again, note the application of the two range rules:

  1. Insert.Marker is within Ess.Retrieve.Range.1
  2. Insert.Marker is outside of Row.Template

Setting the properties

Dodeca helpfully supplies an event called OnAfterWorkbookOpen and just like the name suggests, this is after the workbook is open but before anything has been retrieved.  This is the time and the place to set the contents of the row.  To do that, this View must have nine properties set – it looks a little overwhelming at first but honestly it isn’t very hard.

  1. BuildRangeFromScript -- You must decide how you are to build the row set.  I have an older version (I am too lazy/overwhelmed-with-so-many-things-to-do-it-scares-me to download the latest and greatest but I should) of Dodeca, so I am missing the MDX script option.  In any case, I want the EssbaseReportScript type.

  2. ScriptText – The report script text as shown in the ScriptText editor a few sections above must be entered here.  Tokens, btw, are not a requirement and indeed when I was proving that this WS worked, I just used the Essbase report script with Product 100.  Once I had that working, I tokenized it.
  3. StartCell – Range name of the repeated rows that are tied to the output from ScriptText.  This is the green range.
  4. Rows – This report has dynamic rows; it could just as easily be columns.
  5. EnterNumbersAsText – Just in case member names such as 100 are used, treat them as text.
  6. CopyFromRange – The name of the range to be repeated.  This is the blue range.
  7. Insert – Set to TRUE as I want the output of ScriptText to be reflected in the sheet.
  8. OutputRangeName – The name of the rows that are built during the insert process.
  9. OutputMap – The column that receives the output of ScriptText.

Attaching the WS to the View


Once you have written the WS and committed it, simply assign it to the View in the WS property:

Let’s run the view

Diet Drinks

Colas

 

Cola

 

All Products


You get the idea from the colored ranges, right?  The Row.Template range in blue is repeated and inserted into Ess.Retrieve.Range.1 as many times as there are Products coming out of the ScriptText property.  Also note that Insert.Marker gets pushed down as the rows get inserted.  Lastly, the Row.Template range is no longer in the sheet – I added a second step to the WS to delete that range once the retrieve was complete.
 

What users would actually see

And let’s look at the same report as the above, but without the colors, to show the addresses of the ranges post-retrieve.

 

WS can be assigned to multiple views – think of it as a way to build a library of functionality within an application to be used over and over again.

That was easy, wasn’t it?

Think about how you would do that outside of Dodeca with a spreadsheet.  Think about all of the things Dodeca is giving you – connections, grids, selectors, tokenization, and now WS.  Think about the effort on your part to code all of that.  It’s sort of a consultant’s dream, isn’t it, so long as the dream consists of writing enormous amounts of code.  I’ve done it in the Classic (nope, now it is Legacy) Essbase Excel add-in toolkit and I don’t ever, ever, ever want to do it again coz I have way more important things to do with my life, like blog, or post on OTN or Network54 or work on “special projects” or prepare content for my three Kscope13 presentations or I dunno, try to have a life.  Yep, plenty of other things to do and not one of them includes writing tons of code.

Addendum – two examples of what not to do

I could have ended this post right up above but I thought I would save anyone who tries this approach from the errors I committed.  And more than that, it really illustrates how this technique works.  So with the thought that errors can be fun and educational, and the reminder that you should follow the below two rules at all times, let’s begin.

For the record once again (I think this is the third time I’ve written this, so yes, it is important) make your life simple and follow these two rules:
  1. Insert.Marker is within Ess.Retrieve.Range.1
  2. Insert.Marker is outside of Row.Template

What happens when you don’t follow those rules?  Or, in my case, had the rules explained to you by an ever-patient Tim Tow but then completely ignored them?  Pain, that’s what.  Let me show you what happens when Insert.Marker is not within Ess.Retrieve.Range.1.

Btw, the only way to see what is on the sheet when there are WS errors is to invoke the WS debugger, step the process, and use the CoverView button to display the View.  Otherwise, all you get is this:

The CoverView button is on the top toolbar of the debugger.


Error no. 1 – Insert.Marker outside of Ess.Retrieve.Range.1

Assuming that the WS is being stepped, and that the CoverView button has been selected, here’s the sheet before the ScriptText is applied.  Note how Insert.Marker is outside of Ess.Retrieve.Range.1.

And we get…KABOOM!
 
Do you see what happened?  The Row.Template got repeated correctly, but the Ess.Retrieve.Range.1 did not get expanded.  When that retrieve fires, there will be no Product within the range.  The error message is a bit cryptic although if you look at the above range, you’ll realize it makes perfect sense.
Error no. 2 – Insert.Marker within Row.Template

What happens if Insert.Marker is within Row.Template?  It’s actually quite logical – ScriptText returns one row for each Product and Dodeca inserts them, but they now land inside Row.Template, which expands with every insert.  Ess.Retrieve.Range.1 is what gets retrieved, but it doesn’t contain any Products, so you get the below result.

And a nasty error message of:

It’s the same error message (it is the same error from an Essbase perspective) but a completely different looking sheet.  Don’t do as I did and all will be well.  And use that debug function, and even color the ranges if you are confused as to what is where and why.
And now really the end 
Beyond user-initiated error, i.e., I screwed up, this is dead easy and takes much longer to read about than it does to set up and run.  I think practically every Dodeca customer out there uses WS in one form or another, but like all powerful tools, there is definitely a learning curve.  Hopefully I’ve brought you a bit along the way with WS in Dodeca and showed you some of the power.  In case you can’t tell, I really like this tool.

Be seeing you.

Death of a thousand cuts


It is no secret

I have no love for Essbase Load Rules.  Mostly because I have comprehensively, completely, and utterly tied myself into knots using them.  And I should be clear about this – they are dangerous in my mind because it is easy to bury crude ETL in them that you (or at least I) will promptly forget about.  When it comes time to change things, or the data changes, it is a complete Pain In The You-Know-What to figure out what was done in the rule.  Yes, they were Hot Stuff when invented, but that was almost 20 years ago (yes, really) and it is time for their closed nature to go, go, go.  Well, at least that’s my opinion.  I don’t really hate the things, but I do think that ETL via Load Rules is just asking for trouble, and I wonder why there is no other lightweight way of sending data in an automated fashion.  Read on, gentle reader (all three or four of you), and yr. obdnt. srvnt. will reveal all.

Four exceptions to Load Rules

Actually, there are at least four different ways of sending data to Essbase without using Load Rules.

Load in free form

Have you ever locked and sent (BSO and the classic add-in)/sent (ASO and the classic add-in) or submitted (BSO or ASO and Smart View)?  Then you have used Essbase’s free form data format and yes, you can load it directly into Essbase sans Load Rule.  Want an example?  Take a look at CALCDAT.txt in Sample.Basic.  It is a free form data file with level zero and consolidated data values.  Why Arbor (yes, it is that old) did that, I cannot say, but they did.  
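For reference, a free-form record is nothing more than the member names that uniquely identify a cell, followed by the value – exactly what a lock-and-send grid resolves to.  A sketch against Sample.Basic (the member combinations and values here are illustrative, not taken from CALCDAT.txt):

```
"Sales" "100-10" "New York" "Actual" "Jan" 678
"Sales" "100-10" "New York" "Actual" "Feb" 645
```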

Load via ODI

ODI has an option to load data directly into Essbase without a Load Rule.  However, there seem to be lots of issues with that approach (go on, search for it and you will note that most ODI practitioners use a Load Rule because it is cleaner) and of course you have to buy into ODI in the first place.  I am a huge fan of the tool but its learning curve has a very cliff-like look to it.

Load via Studio

Essbase Studio is a great way to explore data, model it, and ultimately express it in Essbase.  And yes, there is nary a Load Rule to be seen in the tool.  But, at the end of the day, Load Rules are used to get metadata and data into Essbase.  I am inclined to cut Studio a break as Studio developers don’t so much as dirty their fingernails with Load Rules.  Nevertheless, programmatically derived Load Rules (that use ultra-secret API calls) are still part of the equation as they are generated by Studio so it is still tarred with that brush.

Load via MaxL

Did you know that MaxL allows you to load data points to Essbase?  No?  Did you know the import data statement does this?  It’s really pretty easy.

Before

 

MaxL

 

After

 


Pretty cool, eh?  You will note that this is essentially a free form data load which is the same as a submit (see, I am divorcing myself, reluctantly, from the classic add-in in every way and manner) from the Microsoft Office tool of your choice.  But this approach isn’t really suitable for more than a small amount of information.  
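Since the screenshots above are just that, screenshots, here is a sketch of the kind of statement in play – MaxL’s import data grammar with a data_string (the member combination and value are illustrative only):

```
/* A sketch: one free-form data point, no Load Rule anywhere.
   Member names and the value are illustrative, not from the post. */
import database Sample.Basic data
    from data_string '"Sales" "100-10" "New York" "Jan" "Actual" 42'
    on error abort;
```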

Four exceptions, but none necessarily right

Free form simply is not a practical solution.  ODI is a great tool, but you have to be 100% committed to it, and it has its caveats.  Studio, while cool, is also not a simple solution, and it resolves to the dreaded Load Rules in the end.  And loading small bits of data via MaxL is interesting, but ultimately not practical.

What is needed is a lightweight way to load data, from SQL, without any of the above approaches and of course without Load Rules.  How oh how oh how can this be done?

HyperPipe to the rescue

Well, I didn’t write it

I was whining/complaining/ranting (I can do all of this at the same time and it is difficult to distinguish one from the other) to Jason Jones about Load Rules (yeah, this is one of the biggest bees that buzz around my bonnet) and he said (NB – Artistic License ahead), “Really?  You’d like to load data to Essbase without using Load Rules?  Give me a couple of days.”  And so it was.  It is great knowing people who are smarter than you.  

So what is it?

It’s a command line way to load from files or SQL to Essbase.  Note the words that are missing from that sentence – Load Rules.  <insert evil laugh>  This example will show loading to MVFEDINTWWW, aka Sample.Basic step by step.
Data in SQL

Oh, did I mention that the Merant ODBC drivers are no more?  What you see below is the open source H2 Java database.  Did I mention the client is running on a Mac?  It made me laugh out loud.

I should note that the above is not implying a Mac version of Essbase, but instead shows that with a web application orientation, one can access data and processes across platforms and machines.

Data target

Here’s the data target.  Jason has not yet made the switch to Smart View.  I understand his pain.

Where does it run?

Jason is a Java guy, so the code to run this is in the hyper-pipe (HyperPipe?  hyper-pipe?  Tomatoe, tomatoh) jar file through the bash Unix shell.
Are you catching all of the properties?  Just about everything one could possibly think of wrt a data load to Essbase.  You can read the screen as well as I can, but I draw your attention to two parameters.

sqlDriver

JDBC to the rescue, ODBC goes bye-bye.  Neat, huh?

sqlInitQuery

Want to manipulate columns and data?  Do it in SQL, as Zeus declared on Mount Olympus.  At least I’m pretty sure he said that in Clash of the Titans.  Just watch the good (original) one -- I think he mentions it right after “Release the Kraken”.

You may fire when you are ready, Gridley

And that’s it.  A load to Essbase, from SQL, with Load Rules being conspicuously absent.  <insert big grin>

Data in Essbase

How does it work?

Jason described it as akin to the MaxL data load process.  And that is all I know of it as I am no Java programmer.  However, the good news is that Jason is going to release this as an open source project so you can download it and tear it apart.  Have fun.

So is that the end of Load Rules?

Alas and alack, no.  They will be around, quite possibly forever (if you define forever as “for as long as there is an Essbase”, then yes, definitely forever) as there are simply too many people with too much code in them.

On the other hand, Oracle took the bold step of giving the chop to Hyperion Business Rules to be replaced in 11.1.2.2 onwards with Calculation Manager and the same is true for the Classic Excel add-in for Essbase, so there is hope.  

As of this writing, Jason has not released HyperPipe but as I noted, he is an open-source kind of guy.  Contact him via his blog and look there for a much more in depth review of the tool.

And let me also give a hearty thanks to Jason for figuring out how to do this.  Now if only a certain absolutely humongous software company takes this concept and runs with it.  <insert the biggest grin thus far>  As the immortal Robert Browning wrote, “Ah, but a man's reach should exceed his grasp, Or what's a heaven for?”

Be seeing you.

Get the FLASH about Kscope13

No, not that superhero with Mercury’s wings on his ankles (Slight side note:  How was that supposed to make him run faster?  I mean, wings on his feet I understand, but those buggers were on his ankles.  Feet are attached to ankles and so the feet were fast too?  Why not attach the wings to his elbows?  They’re attached to his feet, kind of.  Ah, I would have made a lousy scholar of Classics.  One last question -- why is he wearing a hardhat?  None of this makes sense and yes, I am digressing in a most spectacular way.  Back to the task at hand).  And no, not a flashbang grenade (I read too many…odd websites).  And no, not a 32GB flash drive.  No, no, no to all of the above.

Nope, what I am talking about is a soon-to-expire, very-possibly-never-to-be-repeated-but-don’t-hold-me-or-ODTUG-to-it, Kscope13 offer that will get you a cool $150 off the Kscope13 registration cost.  Is there a catch?  But of course.  You must register before Friday, 11 January 2013, 12:30 pm EST to get this great deal.  And how do you do this?  With the code “FLASH”.  Yep, that’s the code and that’s the flash, and that’s all, folks.  

Be seeing you in New Orleans, 23 to 27 June 2013.


The fastest way to export targeted data from BSO Essbase with NONEMPTYBLOCK

NB – The best way to experience this blog is with one of the following musical soundtracks open and playing in a loop on another tab or window.  Yes, Big Black’s RacerX gets the idea across just right.  And yes, it may be hard to believe for those of you who have met me that I was ever a fan of punk, but yup, that was my youthful musical rebellion.  Those of you who are not manly (womanly?  whatever) enough to take that can listen to the original Speed Racer theme as it ought to be heard.  All too much for you?  How about the version that most matches my current musical tastes?  Here’s a nice meld of a Quincy Jones bossa nova arrangement of “Desafinado” and of course the appropriate speed frame of mind.

Okay, with that bit of not-quite-totally irrelevant trivia out of the way, but with you most definitely in the mood for speed, speed, speed (and, if you are listening to Big Black, the desire to throw yourself into a mosh pit, yeah, this blog caters to all tastes) read on for some pretty darn exciting news about extracting data out of BSO Essbase.

The obligatory primer

When it comes to getting data out of BSO Essbase, there are a couple of ways to extract data.

Export it

From Esscmd’s EXPORT of days long ago (believe it or not, I saw a new system last year that relied 100% on Esscmd.  Look, I thought it was a nice, succinct language, but it is dead, dead, dead.  As you might imagine, the rest of the system was pants.), to MaxL’s export data command, to EAS’ export data functionality.  If you need to get ALL of the data out of the database for backup purposes, go crazy with parallel exports and have a good time.  And note, if you are working with a 92 .PAG file database as I am currently, that is NOT going to be a fast process.
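
By way of illustration, here is a hedged MaxL sketch of a parallel full export – the application, database, and file names are all hypothetical, and listing multiple data files is what makes the export run in parallel:

```
/* The app.db name and the file names are hypothetical -- adjust to taste */
export database sample.basic all data to data_file
    'c:\\tempdir\\exp1.txt', 'c:\\tempdir\\exp2.txt',
    'c:\\tempdir\\exp3.txt', 'c:\\tempdir\\exp4.txt' ;
```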

Focused exports

But what happens when you need to only export a portion of a database?  The above approaches aren’t going to do the trick because they are all or nothing.  

Happily Essbase provides many ways to extract data:  report scripts, the DATAEXPORT calc script command, and, for those of us who use BSO on a regular basis, the slightly exotic looking MDX queries.  Let’s examine each in turn and yes, I have a technique whose performance (and I think once you see it you will agree this is sans hyperbole) will leave you absolutely gobsmacked – it did for me and I am not all that easily overawed.

But just as with dinner, first the meat and veg, and then the oh-so-tasty dessert.  I am not doing this out of sheer bloody-mindedness but because I think you need to see all of the options.  And of course malva pudding tastes best after the Bobotie (mmmmm, South African food is lekker).  Enough of my culinary analogy – the other thing this review does is show how this new technique spanks every other approach.  It is a Most Awesome Hack.

Essbase report scripts

So what if we stepped back to year zero and tried this the old-fashioned way with Essbase report scripts?  They have been around, literally, from the beginning of Essbase and are known to most.

What does a report script look like that exports level zero Product and Postcode in the Forecast Scenario, Working Version, in Jan FY12 for the Account AT?  Oh, something like this:
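
A hedged sketch, at any rate – the exact selection commands will vary with your outline, and the member names are taken from the description above:

```
{ TABDELIMIT }
{ SUPMISSINGROWS }
{ ROWREPEAT }
<PAGE ("Scenario", "Version", "Year")
"Forecast" "Working" "FY12"
<COLUMN ("Period", "Account")
"Jan" "AT"
<ROW ("Product", "Postcode")
<DIMBOTTOM "Product"
<DIMBOTTOM "Postcode"
!
```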

We likely want to run this in batch (I dunno, do you like staring at a screen as you wait and wait for a process to finish?  Me neither.) via MaxL:
export database db.dbname using server report_file "Test1" to data_file "c:\\tempdir\\Test1.txt" ;

How long does it take?  1269.72 seconds for 21,788 records (one for the header).  The output looks like this (I could have suppressed the headers with the SUPHEADING keyword but chose not to):

BSO calc script DATAEXPORT

DATAEXPORT is the darling of many because it can be invoked within the context of a calc script and because it can produce nicely formatted (well, cleanly formatted) exports and even write to SQL.  
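
A hedged sketch of what such a calc script might look like – the comma delimiter and the output file name are my own choices, and the member names match the report script extract described above:

```
SET DATAEXPORTOPTIONS
    {
    DataExportLevel "LEVEL0";
    DataExportColFormat ON;
    DataExportOverwriteFile ON;
    };
FIX ("Forecast", "Working", "FY12", "Jan", "AT",
     @LEVMBRS ("Product", 0), @LEVMBRS ("Postcode", 0))
    /* "c:\tempdir\Test2.txt" is a hypothetical file name */
    DATAEXPORT "File" "," "c:\tempdir\Test2.txt";
ENDFIX
```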

Which produces something like this (which does look a bit ugly in WordPad):

Why oh why did I open it in WordPad?  Because DATAEXPORT only writes a Line Feed (LF) at the end of the record, not CRLF as Windows requires.  Poor old Notepad can’t handle that so you get this:
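
If you need the file to behave in Notepad, one workaround (not an Essbase feature, just a quick hedged Python sketch) is to expand the line feeds yourself:

```python
# DATAEXPORT writes bare LF line endings; Notepad wants CRLF.
# Normalize any existing CRLFs first so nothing gets doubled, then expand.
def lf_to_crlf(data: bytes) -> bytes:
    return data.replace(b"\r\n", b"\n").replace(b"\n", b"\r\n")

# The file names here are hypothetical -- point them at your own DATAEXPORT output.
def convert_file(src_path: str, dst_path: str) -> None:
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        dst.write(lf_to_crlf(src.read()))
```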

How long did all of that take?  Why a mere 1070.79 seconds for a 21,787 record export.  That’s just about 16% faster than a report script, so if speed is the purpose, then DATAEXPORT is the way to go.

So what about MDX?

Well, this wouldn’t be where I would normally turn for data extraction, mostly because MDX is so ASO-ish and I’m not writing formulas in an ASO Essbase database.  But in fact MDX has a query language with ROWs, COLUMNs, PAGEs, and as many AXISes as one can shake a stick at (actually the max is 64).  

I really got inspired to try out MDX as part of a Really Special Project (i.e., a project that I am doing for “fun” and getting $0/hour for and OMG the hours are killing me – I’d be a freaking millionaire if I were doing this for a client.  Write constructive suggestions to me care of this blog on how to conduct oneself professionally with an eye to not going broke.) and because of a recent thread on Network54.

Well, after an amazing amount of pain for really very little output (this is, sadly, my modus operandi when it comes to new-to-me technology) and a boost from my buddy Dan Pressman, Mr. ASO and by extension sorta-Mr. MDX query, I came up with the following:

That’s the query with pretty colors.  To make this actually somewhat readable when it gets output, I stuck the query in the following MaxL script:
spool on to "c:\\tempdir\\MDX_Extract_Test.log" ;

login usernamepassword on servername ;

alter application ep clear logfile ;

/*    The below settings are right out of Developing Essbase Applications    */
alter session set dml_output alias off ;
alter session set dml_output numerical_display fixed_decimal ;
alter session set dml_output precision 15 ;
set column_width 80 ;
set timestamp on ;

SELECT
    {CrossJoin({[Period].[Jan]}, {[Account].[Allocation Target]})}
ON COLUMNS,
    NON EMPTY (CrossJoin([Product].Levels(0).Members, [Postcode].Levels(0).Members))
ON ROWS
FROM [EP].[ExalPlan]
WHERE ([Version].[Working], [Scenario].[Forecast], [Year].[FY12]) ;

I have to give a plug to Gary Crisci – the settings I use to turn off aliases, set decimals and precision, and the column stamp are all out of Gary’s MDX chapter in Developing Essbase Applications.  As you’ll see, formatting in MDX isn’t all that great – it all goes into MDX_Extract_Test.log along with a bunch of other stuff.  Ick, yuck, eeewwwww.

The good news is that Network54 thread I referenced above has some great suggestions for getting round all of this.  Read it through and you’ll get some ideas.

How long does all of this take?  Ah, that’s where this gets interesting -- 638.926 seconds.  Now that is interesting – almost twice as fast as report scripts and takes just 60% of the time of the DATAEXPORT calc script time.

So what do we have?

Three different ways to write out data, with an increasing performance profile:
  • Essbase report scripts
  • DATAEXPORT calc script
  • MDX queries

Looking at MDX versus Essbase report scripts, I really have to say, at least in the database I am using and for the extract I am using, MDX is by far the fastest way to do this.  Awesome, right?  We have a winner.  Or do we?

Undocumented, unrecognized, and unreal

All those times, all of those techniques I wrote about above?  Yes, they’re all going to pull the same data, and yes, I have shown you a way that is twice as fast as the most common approach, at least for this database.  But you know what?  They stink.  I have, thanks to a number of people and, I might add, Oracle, a much better way to do this.  It’s so fast the first, oh, eight or nine times I ran it I thought for sure it was simply an error.  It is no error and it is freaking awesome.  

Did I figure this out myself?  Nope.

I was told of the command by my fellow ACE Director Tim Tow, who discovered it whilst working with Bryan Bain, one of the original AFSG (Arbor Field Services Group) Essbase consultants, when he was trying to extract the last ounce of performance out of Essbase for his flagship tool, Dodeca.  I should also mention that I tested this out as part of that Very Special Project on one of John Booth’s test servers.  I am, as always, standing on the shoulders of giants.

For the love of Mike, what is it?

It is an undocumented MDX BSO-only keyword that is hinted at in this old Network54 thread (which I was part of but, no, I did not hide it from you till this time) and mentioned in the 11.2.2 Essbase readme.  Look for defect 13037253.  And you have seen it in Planning forms.  Planning?  Yes, Planning.  Have you figured it out yet?

It is:  NONEMPTYBLOCK

How does it work?

Just like this:
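
That is, the first MDX query with NON EMPTY swapped for the NONEMPTYBLOCK keyword:

```
SELECT
    {CrossJoin({[Period].[Jan]}, {[Account].[Allocation Target]})}
ON COLUMNS,
    NONEMPTYBLOCK (CrossJoin([Product].Levels(0).Members, [Postcode].Levels(0).Members))
ON ROWS
FROM [EP].[ExalPlan]
WHERE ([Version].[Working], [Scenario].[Forecast], [Year].[FY12]) ;
```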

How long does it take?

The query (the same as the first MDX example save the keyword change) took 1.789 seconds with NONEMPTYBLOCK.  Really.  As I wrote, no, I didn’t believe it either.  Go on, try it yourself.

Are you laughing yet?  This didn’t make my day, or my week, or my month.  I think this is going to make me laugh all the way through 2013.  And considering I am stuck in an airport for hours and hours (thank you, US Airways, for eating all of my Sunday, again) after an ODTUG board meeting (the meeting was fun, the flying (or lack thereof) not so much), I’d say that is pretty strong medicine.

To put this into percent of time relative to an Essbase report script extracting the exact same data, this technique takes 0.14% as much time.  That’s right, not 14% but (I am going to say it out loud) zero-point-one-four percent.  In other words, it’s a freaking rocket.  And of course it’s Oracle that gave us this command and then mysteriously didn’t bother documenting it.  Why?  I have no idea.  It is still freaking neat.

Where does Planning fit in?

Have you ever edited a Planning (I am not going to bring up Planning just to snapshot this – trust me, it’s there and has been for a long time) form and ticked the box that tells Planning to suppress missing blocks?  Did you know that Planning builds its forms with MDX?  If not, you do now.  In any case, when you tick that box, you are using NONEMPTYBLOCK.  Remember, there is a slower (and documented) option to suppress missing in MDX – that is the NON EMPTY keyword I showed in the first MDX query example and that corresponds to Planning’s suppress missing form setting.

Does it always work like this?

With Essbase, the answer is easy:  no.  Or at least, test it, within the context of several constraints.

As I wrote above, this only makes sense within the context of BSO Essbase.  There are no blocks in ASO so this command doesn’t make sense and consequently doesn’t work (yes I tried it, no it doesn’t work).

This command also only makes sense if you are trying to extract lots of sparse data at level zero of the database.  Per that Planning documentation link I gave you, if you do not have many missing blocks: “The Suppress missing blocks setting can degrade performance if few or no rows are suppressed.  Test forms before and after using this setting to determine whether performance is improved.”

And of course if you are not suppressing data then this command makes no sense at all.  Having said that, custom exports from Essbase almost always are at level zero and almost always suppress missing data, at least in my experience.  And if that is the case, NONEMPTYBLOCK is your BFF.  Btw, I would go with definition #3 on that link.  :)

Be seeing you and enjoy the hack.

An awesome 40


Sadly, not my age

Oh if that were still true (funny how being 40 wasn’t particularly thrilling at the time) but alas and alack, it is not.  OTOH, 40mm can be awesome, when it is in the form of a Bofors.

Why ODTUG is awesome

What on earth does a WW II era anti-aircraft gun by way of Sweden and then Chrysler (just like Merlins were by way of Rolls-Royce and Packard) have to do with Essbase?  Well, as you can see in the snap below, I am manning a Bofors quad mount on the USS North Carolina during one of the two annual ODTUG board face-to-face meetings.  You may think I just splashed a Zero whilst defending my ship given my facial expression.  But no, that is not a Victory Flag, but instead an inducement to come to Kscope13, and yes, you are just looking at a geek at play.  And fun is a vital component of ODTUG.  Don’t think so?  Then you haven’t been to a Kscope, because it is fun, and exhausting, and extremely educational.  What more could anyone want in a technical conference?


Is there any point to this?  Why yes there is.

I would love to tell you that ODTUG board meetings (we have monthly telephone calls, too, oh the joy) mostly consist of climbing into, around, and over products of the Washington Naval Treaty with an eye to fully engaging my inner history geek, but alas and alack yet again, my chance to man a crew-served weapon doesn’t actually crop up very often in the course of ODTUG activities.  Nope, instead we examine, discuss, debate, and decide:  how ODTUG is doing (we have to cover costs and yes there is a Treasurer’s Report every month), what ODTUG is doing (are we meeting the needs of our members), and where ODTUG is going (we have today covered, we think, but what about the future).  It is serious stuff and it is how the board serves you, the ODTUG member.  

You would, I think, be astonished at the amount of work (and oh the time) it takes to make Kscope and the SP conferences and the webinars and the website and all of the other initiatives ODTUG performs actually come off.  And that isn’t to slight our volunteers, who are legion (I have been waiting to use that phrase forever – it could only be improved by replacing “volunteers” with “minions” but we are not Super Villains but instead a wholly benign user group) and contribute materially to ODTUG’s success.

All of the above is my tortured way of saying that ODTUG and ODTUG Kscope13 are the outcome of a lot of blood, and sweat, and tears and yes, a bit of fun.  That’s why Kscopes are, in my not entirely unbiased opinion, the best conference anyone in the EPM world can attend.  There’s nothing to touch it.  

Be seeing you in New Orleans.

Patching 11.1.2.x the wizard way


Introduction

Yes, the thought of me writing much of anything on infrastructure is slightly (completely?) laughable given my fully documented and freely admitted serial incompetence in this area.  However, not everyone has the luxury of saying, “Damn it, Jim, I’m an application consultant, not an infrastructure geek” and truth be told I get pulled into these situations, at least tangentially, from time to time.  And of course because I’m in a roll-your-own one man consulting band, I have to occasionally patch my development system as well.

You are likely not in that spot (once upon a time the EPM market was full of us hardy independent souls – we now seem to be a vanishing breed of which only the fittest/most stubborn remain, but I digress) and need to do the patching, or at least manage it, or maybe just have an appreciation for what is entailed in the process of patching and maintaining your company’s not-architected-for-scaredy-cats EPM system.

If this is a task that strikes fear into your heart (actually, if it doesn’t strike fear into your heart, you’re not paying nearly enough attention) then you are in luck, for Oracle Support (thanks, T.) has pulled out all the stops and produced its very own Patching & Maintenance Advisor: Enterprise Performance Management (EPM) 11.1.2.x (1517258.1).

What’s it all about, Cameron?

It is a wizard-based (you wondered about the title of this post, I’ll reckon) approach to:
  1. Why you should patch (this isn’t too hard to fathom)
  2. How you patch (ah, the details, where Old Scratch lives)

The goal is to get you to carefully consider the potential impact, possible considerations, and concrete actions you need to take, step by step, from an 11.1.2.NotEverPatchedNoWayNoHow release to the very latest version of 11.1.2.WowEvenOracleDoesn’tKnowAboutAllOfThesePatches.

My story of woe, agony, and defeat, that you can now avoid

I wish this had existed, oh, about two years ago when I was on my very first 11.1.2.0 Planning implementation.  The installing consulting company (whom I shall not name ‘cause getting sued isn’t in my list of things to do) insisted that patching was not necessary.  I thought this was likely one of the dumber things I had heard from a consultant’s mouth (and yeah, I’m a consultant and I say some pretty dumb things from time to time) but I had nothing to fight this with other than, “Why on earth would you not want to head off known issues?”  As you might imagine, he was Infrastructure, and I was Applications, and that battle was lost before it began.  It was fun a few weeks down the road whilst watching the client put them through the wringer when things didn’t work.  That patching I wanted up front eventually happened but oh what a waste of time and effort.

If I had had a time machine to get this patch advisor (cf. Mr. Peabody’s Improbable History and yes, I look disturbingly like his boy Sherman) from the then future, I could have gone to each one of the sections of Support’s step by step guide and rebutted their every claim with the icing on the cake of, “Well, that might be what you say, but Oracle Support (you know, the vendor) says…”  Yes, I have revenge dreams and whoever gets to do this really ought to have pity on me and post their success to this blog’s comment section for my Schadenfreude moment.  :)  

What oh what does it look like?

Just like this:

This is in line with other advisors I have written about before.  It is a wizard with subsections off to the left to show what needs to be completed in each section.

Wizard steps

Each overall phase of the patching process has individual step by step guides.  Every one of the guides takes you to a new document that, at the very least, provides food for thought.  And maybe ammunition against someone who maybe shouldn’t be in the installation game.  Ahem.

Evaluate

  • Business Plan Value
  • Increase Supportability
  • Overview of EPM Patching
  • Business Plan Considerations
  • Glossary Of Terms

Plan

  • Define Proactive & Reactive Patch Plans
  • Define Patch Test Plan
  • Define Patch Implementation Plan
  • Identify Patches / Patchsets to Apply
  • Read Patch Documentation
  • Assess Impact
  • Milestone Checklist and Feedback
  • Glossary Of Terms

Test

  • Apply Patch Test Plan
  • Verify Patch Install Has Been Successful
  • Verify That Backup / Recovery Works
  • Document Lessons Learned
  • Milestone Checklist and Feedback
  • Glossary Of Terms

Implement

  • Verify That Backup / Recovery Works
  • Apply Patch Implementation Plan
  • Verify Patch Install Has Been Successful
  • Milestone Checklist and Feedback
  • Glossary Of Terms

Pretty comprehensive, eh?  The advisor really spoon feeds the whole patching process.  Thanks, I need all the help I can get and I suspect I am not alone.

But wait, there’s more

Would you believe there are more goodies to be had?  As they might say at my current client (bonus points if you can place this by US state), you betcha.

What oh what oh what are the patches for each one of our beloved EPM products?  There’s a very nice and concise list of links to all of the patches (you can sort of see part of it in the screen shot above).  Of course it is now a pretty rare thing that only one product is being used at a time (I would say almost impossible given how Shared Services and sometimes EPMA are part and parcel of practically every implementation).  Would these links have been handy during my losing argument with the installer?  You betcha again.

  • Available Patch Sets and Patch Set Updates for Oracle Hyperion Shared Services – Document 1481942.1
  • Available Patch Sets and Patch Set Updates for Oracle Hyperion Financial Management – Document 1321453.1
  • Available Patch Sets and Patch Set Updates for Oracle Hyperion Reporting and Analysis, Financial Reporting and Interactive Reporting – Document 1360962.1
  • Available Patch Sets and Patch Set Updates for Oracle Hyperion Planning – Document 1395593.1
  • Available Patch Sets and Patch Set Updates for Oracle Hyperion Essbase – Document 1396084.1
  • Available Patch Sets and Patch Set Updates for Enterprise Performance Management Architect and Calculation Manager – Document 1400076.1
  • Available Patch Sets and Patch Set Updates for Hyperion Financial Data Quality Management and FDM ERPI Integrator – Document 1400561.1


Did you know there was a Hyperion Patch Reviews Community?  Nope, neither did I.  But now we both do.

May this bring an end to these questions on OTN

I am not totally sure why OTN (and to a lesser extent, Network54) has become the home of infrastructure-related questions when there are so many good resources in Oracle Support.  The two Johns (John Goodwin and John Booth) seem to answer most of these but I have to wonder if the original posters even bothered to look on Support.  Remember, if your company has an EPM product (actually, any Oracle product) you at least have read access to Support.  The answers Support come up with are the official word (this stuff gets vetted through a process ever so slightly more rigorous than OTN or Network54 or this blog) and ought to be at the very least your starting point for all things infrastructure.  With this patch advisor, I hope to never see (and never post myself) another patch question on OTN.

Be seeing you.

One Of These Things (is Not Like The Others)

A different kind of blog
I’m sort of a boring guy (go on, just ask those who know me personally) and I’m definitely a busy geek (you will have to decide if that is because I do a lot, or try to do too much, or simply have lousy time management skills, I tend to think the last) and thus most of the blogs I read are 100% technical and informative -- how do I do this, why doesn’t this work, what’s the workaround for this, what are the concepts behind that.  You get the idea – boring + busy + always scrambling for ideas = technical blogs, not blogs that espouse philosophy, or a particular Weltanschauung.

But I think maybe I’m doing myself a disservice with this kind of focus.  Simply being technically proficient (you decide to what extent I fit that description) is not enough to make me a well-rounded consultant or even person.  I don’t write philosophical (opinionated yes, philosophical barely) blogs on the state of EPM, but is there someone who does?  Why yes there is.

It’s a secret

Well, I’m not 100% sure it is a secret, but at least it isn’t a well-publicized blog.  What am I talking about?  The “hidden” counterpoint to The Travelling Consultant.  Take a gander at the url for that blog:  http://thetravelingconsultant2.wordpress.com/  Btw, I like this blog, a lot, strictly on technical grounds.  But moving on…

Do you see the “2”?  The number sort of implies that a predecessor exists (Thanks, John and no, he isn't the author, just more observant than I).  Try this url:
http://thetravelingconsultant.wordpress.com/

Ah, now we have something very different.  A personal blog but from a (mostly) EPM consulting perspective.  There is some very inside baseball, or industry specific stuff here (all names, places, and dates quite rightfully redacted but very interesting) and if you ever wanted to know what it’s like to be an EPM consultant, or what consultants are like, or the nature of EPM consultancy, I can’t think of another place to find it.  I would also recommend this for customers who want an insight into how a consulting practice works and what you might (for good or for ill) expect to see in EPM consultants.

Yes, these are told from a particular point of view (one that I mostly agree with).  You may not agree with everything written (that is sort of the point of a personal blog) but I think the articles ring very true.  I applaud the Travelling Consultant, whoever that is, on his honesty and candor.  And sense of humor.  :)  And no, I am not the writer.

Some of the more interesting posts

This is pretty straight stuff with little controversy:
Consulting 101: Project Roles
Consulting 101: Project Phases

I got a lot of pleasure out of these Profiles In Consultancy.  And yes, I have met every one of these types.  I try not to think too closely about which one I most nearly resemble.
Consultant Profiles: the “Great Guy/Gal”
Consultant Profiles: the “Flake”
Consultant Profiles: the “Diva”
Consultant Profiles: the “Talker”
Consultant Profiles: the “Over Biller”

Want to be a consultant?  How does it all work?  Read on.  I think every new hire to a consulting firm ought to get the links to these stories as part of their welcome package, especially if they are new to consulting.
Consultant Policies

These are good but I’ll bet the really good ones are Too Dangerous To Print.  
War Stories

So who wrote it?

That’s a secret, too.  I think if you read it, you’ll understand why.  The posts aren’t negative in any way but some of the truths they speak are…uncomfortable.  In the interest of self-preservation aka continued employment the writer is anonymous.  Such is life.

Go on, broaden your horizons

Given the non-technical nature of the blog, the great thing is that you don’t need to rush through it.  I wish there were more EPM blogs like this.  I do try to bring at least a little philosophy (some call it idiocy) to my posts but I fall far short of the other The Traveling Consultant.  I hope you enjoy it as much as I did.

Be seeing you over on the non-technical side.

I cover the Antipodes

Okay, technically they’re only the Antipodes if you live in England. And, if you look at a globe, it’s easy to tell that this is just a figure of speech, not a direction for a Journey To The Center Of The Earth. In fact, near as I can tell, China fits for the States if I were to transfer the analogy to where I live. And that makes sense because that movie about Three Mile Island was called The China Syndrome and not The New Zealand Syndrome. (True story – I can remember as a kid my parents sitting at the kitchen table trying to figure out where to bug out if York, PA became a radioactive wasteland. Fun times, fun times. This stuff is safe, right? Riiiiight.) Have I lost everyone? Hopefully not, because there is good stuff to come.

Anyway, I am not in the nuclear power industry (and that is a good thing given my sometimes decided lack of attention and focus) nor am I going to China, but yr. obdnt. srvnt. is going to both New Zealand and Australia for two conferences. Yes, I am a glutton for punishment but I was asked and I said “Yes” before anyone could change his mind.

New Zealand
The New Zealand Oracle Users Group has its conference every 18 months. NZOUG 2013 is from 18 to 19 March 2013 in Te Papa, Wellington, NZ. In theory, I was the content chairman for the BI and EPM track at this conference but I have to admit that this really meant that I bugged, bothered, and pestered Erica Harris, Richard Philipson, and what seems like most of Oracle Australia/New Zealand (thanks to Kay Galbraith and Daniel O’Brien) with trying to figure out what would be appropriate content for NZ and who oh who would present. They did a great job identifying people to speak. NZOUG does quality work and their agenda is very strong. I plan on checking out the other tracks (something I never seem to be able to do at Kscope) while I am there as well as presenting two sessions, one on ODI and data quality (hey, come to NZ or buy my book and read my chapter on this) and the other, excitingly, on Dodeca. Now I just have to finish writing it.

Check out the agenda here.

Here’s what’s planned for BI and EPM:

The important bits are: NZOUG 2013 is from 18 to 19 March in Te Papa, Wellington and costs a mere NZD 795 + GST if you are a member and register under the early bird scheme. Read the full agenda – there’s amazing value for money.

Australia
Ah, another country, and a slightly different group of people to exasperate, although in this case it’s fellow board member and Oracle ACE Bambi Price that I think I annoyed the most and of course Oracle Australia (hi, Kay, and yeah, I owe you). Again, I helped out with the agenda and yes, I have written about this before for the ODTUG blog where you can read all about it.

This is an ODTUG Seriously Practical conference (NZOUG is their own full Oracle product line show; I am just there to present and help with the BI and EPM content selection) and as such will focus on a deep dive into the technical end of the BI and EPM tools. Yes, I am presenting the same two sessions at this conference and no, there will not be many NZers (I just made that word up as “Kiwis” is a bit twee) in Melbourne so I don’t view this approach as a rerun. More like a keep-Cameron-on-the-ragged-edge-of-sanity-because-he-takes-too-much-on approach.

Check out the agenda here.

Here’s what’s planned for BI and EPM:

The important bits are: the ODTUG SP Australia is from 21 to 22 March in Melbourne and costs a mere 599 AUD. Read the full agenda – there’s amazing value for money. Again. 

This is pretty exciting stuff
Okay, the flight in economy from home to NZ to Aus to NZ to LA to home is not exciting. At all. But helping out with BI and EPM geeks on the other side of the world is exciting. Yes, they have odd sounding accents (of course to them I’m the one with the weird way of pronouncing things and the incomprehensible slang) but their passion and commitment to technical knowledge, sharing, and evangelism is just like what you see here in the States with ODTUG’s events. I’m beyond happy and proud to help out and I’m hoping that both events will be a great success.

Thanks to the magic of Google Analytics, I know that both NZ and Australia read this blog. Australasians, if you have ever wondered what kind of idiot I am in person, now’s your chance. :) Seriously, they’re both good presentations and you can always go get a cup of tea (ah, real tea, I wonder what the NZ/Aus. version of Typhoo or PG Tips is) if I prattle on too much. I hope that you’ll be able to come to the conference that is closest – as you can see from the above there’s really some great content on offer.

Be seeing you.

Stupid Planning queries #11 -- Where and what are my Smart Lists


Where oh where has my Smart List gone?

Smart Lists are a wonderful thing. Well, that may be slightly exaggerating their usefulness but if you want to create a drop down list in a Planning Account (or other dimension but I have never seen it) Smart Lists are the way to go. Come to think of it, they are a bit of a Hobson’s Choice (a truly fantastic movie) if you want dropdown lists in Planning forms.

Planning even gives you a great way to view the Smart Lists in your Planning application by simply going to the Administration->Manage->Smart Lists menu.

Here is a (rather short) list of Smart Lists in my sample Planning app:

Live and in person
This is a very silly and pointless form that nevertheless shows a Smart List with nothing selected.

Clicking on the down arrow:

Selected Yes


Saved to Essbase


FWIW, if I pulled the Account YesOrNo in Essbase using a Smart View ad hoc analysis link, I would get a 1 in that cell as, bizarrely, Smart Lists do not resolve to Essbase Text Measures. I will try not to think about why that is the case as their functionality is the same. Different development groups is the best explanation I can think of but it is frustrating. Onwards, regardless.

So all of this is great, and more than a little basic
Yes, I know, what is there to query if you just created the Smart List? Well, in the case of this sample Planning application, there is no real reason to query much of anything as we know what the Smart List is and what member it’s tied to.

But what if you didn’t know what member the Smart List was assigned to? How would you know? As far as I can tell, there is no magic report in Planning that will give out this information.

A different story
And what if you were working on a Planning migration/modify project (ahem) and the Planning application had 31 Smart Lists, and somehow the association of Smart List to member got lost (oh, thank you accursed EPMA), and you had to go back to the original Planning app to figure out what goes where? What would you do then? Scream? Cry? Curse your bad luck? Or how about write a query that looks just like this?

The query

/*
    Purpose:     Figure out what Smart Lists are assigned to which Members
    Modified:    23 February 2013, Cameron Lackpour
    Notes:       This is a nice and easy one, isn't it?
*/
SELECT
  O.OBJECT_NAME AS 'Member',
  E.ENUMERATION_ID AS 'SL #',
  E.NAME AS 'Smart List'
FROM
  HSP_MEMBER M
INNER JOIN
  HSP_ENUMERATION E ON M.ENUMERATION_ID = E.ENUMERATION_ID
INNER JOIN
  HSP_OBJECT O ON O.OBJECT_ID = M.MEMBER_ID


And that produces a result set like this:


That hypothetical Planning application I mentioned above? Would you believe 31 Smart Lists of which 14 were actually assigned? Yup, 17 dead Smart Lists. Isn’t application maintenance a stinker? Apparently so.
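If you want to flush the dead ones out directly, a LEFT JOIN variant of the query above should do it. This is just a sketch against the same HSP_ENUMERATION and HSP_MEMBER tables used throughout this post — verify it against your own Planning release before trusting it:

```sql
/*
    Purpose:     List Smart Lists that are not assigned to any member
    Notes:       A sketch only -- assumes the same HSP_ENUMERATION /
                 HSP_MEMBER schema as the query above
*/
SELECT
  E.ENUMERATION_ID AS 'SL #',
  E.NAME AS 'Smart List'
FROM
  HSP_ENUMERATION E
LEFT OUTER JOIN
  HSP_MEMBER M ON M.ENUMERATION_ID = E.ENUMERATION_ID
WHERE
  M.MEMBER_ID IS NULL
```

Any row that comes back is a Smart List nothing points at — a candidate for deletion.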

Everything you ever wanted to know about a Smart List
Above I joined HSP_ENUMERATION (why wasn’t the table called HSP_SMARTLIST?) to HSP_MEMBER to get the link between member (in any dimension) and Smart List. But what if you just wanted a quick review of everything that ever made up a Smart List?

Query the second

/*
    Purpose:     Smart List contents by name
    Modified:    23 February 2013, Cameron Lackpour
    Notes:       Another Nice 'N Easy one.
*/
SELECT
      E.NAME,
      EE.ENTRY_ID,
      EE.NAME,
      EE.LABEL
FROM
      HSP_ENUMERATION_ENTRY EE
INNER JOIN
      HSP_ENUMERATION E ON EE.ENUMERATION_ID = E.ENUMERATION_ID
And that produces a result set like this:


And that’s it
I have to say that I wrote this blog post because I needed to get that list of Smart List associations to members and simply couldn’t find it on the series of tubes that make up the world wide web.   I’m sure it exists, somewhere, or maybe it was just so easy no one bothered to post it. Regardless, now world+dog has it.

I will note again what an incredibly helpful thing it is to write these queries – I cannot imagine going through each one of the Accounts in the application I am talking about (over three thousand across multiple Plan Types) trying to find the silly things – I’d have completely gone off my rocker (although I will admit it might be hard to spot when that happens) and I would have spent a *long* time trying to figure out where the non-assigned 17 Smart Lists should have been. Which was nowhere, thanks to the query. SQL saves the day yet again.

Be seeing you.

Stupid Planning query #12 -- Calculation Manager Rights


Introduction

Arrrgh, this one really got me going. Actually, I have noticed that I never write a query against the Planning tables unless I am unable to get whatever it is out of Planning easily. And I suppose that sort of makes sense. It also means I am always annoyed, and that missing Planning reporting features are my opportunity to increase my SQL skills, such as they are. Of course writing the query took way longer than I thought it would. Read on for the reason...

With that preamble, have you ever wondered what security is assigned to Calc Mgr rules in Planning? You basically have to go into each rule and edit the security to see what user or group has been assigned and what rights are set. Annoying, isn't it? And not practical when there are many rules. Here's an example of what it looks like:

I can see that the group PLN_CalcTest_Consol has Launch access to the rule AggAll, but what about AggPlan, CalcRev, etc., etc., etc.?

So yes, this is yet another opportunity to query the tables. And oh yes, I am using this as a teeny part of the Planning presentation I am giving with Jessica Cordova (hi, Jessica) at Kscope13.

NB – One other note, I was inspired to get around to this query in response to “vaio” and his Hyperion Business Rules security query from this Network54 thread: http://www.network54.com/Forum/58296/thread/1362011042/Export+only+Webform+security I figured if world+dog had it for EAS’ business rules, we needed it for Calc Mgr as CM is all there is from 11.1.2.2 onwards.

The reason this query drove me up the wall

Would you believe that deployed Calculation Manager rules do NOT have an object type in Planning? Would you believe that I spent more than a little bit of time trying to find it?

Oh yes, both statements are true. The latter one you are going to have to take on trust. The former I can prove.

Here are the Object Types in the Planning app schema:
SELECT DISTINCT
  OT.OBJECT_TYPE,
  OT.TYPE_NAME
FROM
  HSP_OBJECT_TYPE OT
OBJECT_TYPE    TYPE_NAME
1              Folder
2              Dimension
3              Attribute Dimension
4              Calendar
5              User
6              Group
7              Form
8              FX Table
9              Currency
10             Alias
11             Cube
12             Planning Unit
30             Attribute Member
31             Scenario
32             Account
33             Entity
34             Time Period
35             Version
37             Currency Member
38             Year
45             Shared Member
50             User Defined Dimension Member


And here is the OBJECT_TYPE that goes with Calc Mgr rules:
SELECT
  *
FROM
  HSP_OBJECT
WHERE
  OBJECT_TYPE = '115'

See the 115? I only know that because I did a search on the name of one of the rules and thus figured it out.
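If you want to repeat that trick in your own app, the search is a one-liner — a hedged sketch, swapping in one of your own rule names for AggAll (a rule name from this post's sample app):

```sql
/*
    Purpose:     Discover the OBJECT_TYPE of a deployed Calc Mgr rule
                 by searching on its name
    Notes:       Sketch only -- substitute one of your own rule names
*/
SELECT
  OBJECT_ID,
  OBJECT_NAME,
  OBJECT_TYPE
FROM
  HSP_OBJECT
WHERE
  OBJECT_NAME = 'AggAll'
```

The OBJECT_TYPE column in the result is the undocumented type value you are after.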

Why would you care about OBJECT_TYPE 115?

Well, once you (or I) knew this, you (or I) could write this:
/*
    Purpose:     Calculation Manager security report by rule, group, and user
    Modified:    1 Feb 2013
    Notes:       Common Table Expressions make joining mostly disparate objects
                 relatively easy.
                 NB -- Calc Mgr rules do NOT have an OBJECT_TYPE in HSP_OBJECT.
                 The OBJECT_TYPE seems to be 115.
*/
-- I am in love with CTEs over subqueries
WITH
-- CTE for Calc Mgr OBJECT_ID, Plan Type, and Name
BRName (CMID, PlanType, BRName) AS
(
    SELECT
        CMR.ID,
        CMR.LOCATION_SUB_TYPE,
        O.OBJECT_NAME
    FROM
        HSP_CALC_MGR_RULES CMR
    INNER JOIN
        HSP_OBJECT O ON O.OBJECT_ID = CMR.ID
),
-- CTE for Calc Mgr user OBJECT_ID, Calc Mgr OBJECT_ID, and Launch rights
BRAccess (UserID, CMID, Launch) AS
(
    SELECT
        AC.USER_ID AS 'User ID',
        O.OBJECT_ID AS 'CM Obj ID',
        --O.OBJECT_NAME AS 'CM Name',
        CASE AC.ACCESS_MODE
            WHEN -1 THEN 'No Launch'
            WHEN 4 THEN 'Launch'
            ELSE 'Unknown'
        END AS 'Access'
    FROM
        HSP_ACCESS_CONTROL AC
    INNER JOIN
        HSP_OBJECT O ON O.OBJECT_ID = AC.OBJECT_ID
    WHERE
        O.OBJECT_TYPE = '115'
),
-- CTE for user OBJECT_ID, user name, group name
UsersInGroups (UserID, [User Name], [Group Name]) AS
(
    SELECT
        --O.OBJECT_ID AS 'User ID',
        O2.OBJECT_ID AS 'CM Obj ID',
        O.OBJECT_NAME AS 'User Name',
        O2.OBJECT_NAME AS 'Group Name'
    FROM
        HSP_USERS U
    INNER JOIN
        HSP_OBJECT O ON O.OBJECT_ID = U.USER_ID
    INNER JOIN
        HSP_USERSINGROUP UG ON UG.USER_ID = U.USER_ID
    INNER JOIN
        HSP_OBJECT O2 ON O2.OBJECT_ID = UG.GROUP_ID
)
SELECT
    BRN.BRName AS 'Calc Mgr rule',
    BRN.PlanType AS 'Plan Type',
    BRA.Launch AS 'Launch',
    UIG.[User Name] AS 'User name',
    UIG.[Group Name] AS 'Group name'
FROM
    BRAccess BRA
INNER JOIN
    UsersInGroups UIG ON UIG.UserID = BRA.UserID
INNER JOIN
    BRName BRN ON BRN.CMID = BRA.CMID
ORDER BY
    BRN.BRName, UIG.[Group Name], UIG.[User Name]

And then you (or I) could run the above query, and get the following:
Calc Mgr rule   Plan Type   Launch      User name      Group name
AggAll          Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
AggAll          Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
AggAll          Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
AggPlan         Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
AggPlan         Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
AggPlan         Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
CalcRev         Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
CalcRev         Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
CalcRev         Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
ClrBS           Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
ClrBS           Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
ClrBS           Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
ClrFinal        Consol      Launch      TestPlanner1   PLN_CalcTest_Consol
ClrFinal        Consol      Launch      TestPlanner2   PLN_CalcTest_Consol
ClrFinal        Consol      Launch      TestPlanner3   PLN_CalcTest_Consol
ClrTrgts        Consol      No Launch   TestPlanner1   PLN_CalcTest_Consol
ClrTrgts        Consol      No Launch   TestPlanner2   PLN_CalcTest_Consol
ClrTrgts        Consol      No Launch   TestPlanner3   PLN_CalcTest_Consol


Isn't that pretty? And useful? I (or you) think so.

The sting in the tail

Would you believe I spent a good half hour poking around in the Calculation Manager tables?

Would you believe that the CALCMGROBJECTACCESS table in the Calc Manager schema is completely empty? I have no idea what it is for, but it isn't for rules deployed to Planning. Terrific.

But the good news is that with a little poking about and a simple query (well, CTEs aren’t totally beginner’s stuff, but they are so easy to read and work with), you (or I) can easily see what kind of access planners and groups have to deployed CM business rules.
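One caveat, with a hedged sketch to go with it: the big query expands group assignments out to their member users, so access granted directly to a single user (rather than via a group) would not show up. A variant like this — same undocumented OBJECT_TYPE 115 assumption, so verify against your own schema before trusting it — reports the raw grants without the group expansion:

```sql
/*
    Purpose:     Calc Mgr rule access as granted, whether the grantee
                 is a user or a group (no group-to-user expansion)
    Notes:       Sketch only -- relies on the OBJECT_TYPE 115 assumption
                 discussed above
*/
SELECT
  O2.OBJECT_NAME AS 'Calc Mgr rule',
  O.OBJECT_NAME AS 'Grantee',
  CASE AC.ACCESS_MODE
    WHEN -1 THEN 'No Launch'
    WHEN 4 THEN 'Launch'
    ELSE 'Unknown'
  END AS 'Access'
FROM
  HSP_ACCESS_CONTROL AC
INNER JOIN
  HSP_OBJECT O ON O.OBJECT_ID = AC.USER_ID
INNER JOIN
  HSP_OBJECT O2 ON O2.OBJECT_ID = AC.OBJECT_ID
WHERE
  O2.OBJECT_TYPE = '115'
```

Here the grantee column shows exactly the object (user or group) that HSP_ACCESS_CONTROL records, which is handy when you want to audit what was assigned rather than who can ultimately launch.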

Be seeing you.

Australians, So where the bloody hell are you?

Wow, that is a pretty awful tourism slogan, isn’t it?  I figured if you lot inflicted it on the rest of the world, turnabout is fair play.  With luck, I will be met with chants of “Yankee go home!”  In the interest of good international relations and not getting pelted with eggs when I mount the podium at the just-about-here-why-haven’t-you-registered-yet ODTUG SP Australia conference, Melbourne 21 to 22 March, let me note that we too have real clunkers like – “Erie, Pennsylvania…Feel the Lake Effect”.  (I’ve been in Erie, PA during the winter and trust me, you don’t want to experience Lake Effect.)  Happily the world shares equally in bad marketing slogans so we can all engage in a little schadenfreude as we read these bon mots.  There, international enough for you?

But there is a point to that now discarded Australian tourism slogan – why oh why oh why have you Australian EPM and BI practitioners not signed up for ODTUG’s totally awesome Seriously Practical conference, Melbourne, 21 to 22 March which is just the end of next week?  Huh?  Why?  Why?  C’mon, give me a good reason.  You can’t, can you?

Don’t believe me?  Check out the agenda

I will note that this Seriously Practical conference has some seriously awesome content.  Again, don’t believe me?  Take a look at the below content.  What are you waiting for?  Me begging on my knees?  You have it, metaphorically.  A fantastic agenda?  Cast your eyes downwards and learn more.

Day 1

Welcome and Opening Remarks, Bambi Price, ODTUG8:45 - 9:00 AM

Keynote: What’s Coming in Oracle BI and EPM, Babar Jan Haleem, Oracle Corporation
9:00 - 9:30 AM
A glimpse into the future of Oracle BI and EPM, delivered by the Director of EPM BI Architecture & Technology for APAC.

SESSION 1
9:30 - 10:30 AM
Fusion Applications and Your BI/EPM Investment,  Debra Lilley, Fujitsu
Oracle Fusion Applications are here today, providing the next generation of applications. They are about having everything the user needs in one place, and that includes information. Fusion Applications is a window on Oracle’s Fusion Middleware stack, and a very big part of that is BI/EPM and analytics. This presentation will include a small demo of how Fusion looks and is designed to give you an appreciation of how BI/EPM is embedded in Fusion. For anyone thinking of Fusion in the future, it will underline that your BI/EPM investment today is an investment in that future, and protected.

SESSION 2
10:45 - 11:45 AM
Highlights and Capabilities of the Latest Release of EPM, Charles Pinda, Oracle Corporation
Enterprise Performance Management (EPM) is comprised of four streams to assist the Office of the CFO to deliver predictable financial results. These streams are Strategy Management, Planning and Forecasting, Profitability Management, and Financial Close. In addition, the reporting delivery layer of EPM is supported by Oracle BI to provide further analytical insights into this information. Now the latest release of EPM, version 11.1.2.2, has new functionality to enhance the business process of delivering predictable results. This session will highlight some of these enhancements and demonstrate the new capability.

LUNCH
11:45 AM - 12:45 PM

SESSION 3
12:45 - 1:45 PM
Endeca Information Discovery, Stephen Weingartner, Oracle Corporation
Endeca Information Discovery (EID) provides unique and powerful analytical capabilities that enable organisations to discover insights in information that would otherwise be unusable. EID’s strengths in unstructured data, agile business intelligence, and information discovery will be demonstrated and discussed. Several EID use-cases will be covered to illustrate the wide variety of solutions which have been implemented at several large organisations. During this presentation, a solution will be created in EID, demonstrating how it works in an end-to-end manner. Attendees will learn how EID differs from other business intelligence and big-data technologies and how it has created its own new niche which companies can fill.

SESSION 4
2:00 - 3:00 PM
Exploring Oracle BI Apps: How Does it Work and What Do I Get, Richard Philipson, James & Monroe
This presentation provides an overview of the BI Apps architecture for novice users, with clear, concise information presented in an easy-to-understand format. The presentation steps through the aspects of an implementation from conception to execution, concluding with example content for one of the many content areas, Financial Analytics.

SESSION 5
3:15 - 4:15 PM 
Thoughts from the Frontline – Issues and Opportunities Faced When Implementing or Upgrading HFM Applications,
Christine Aird, M-Power Solutions

This session will walk you through the project lifecycle of an HFM implementation/upgrade and cover key areas where problems/opportunities regularly occur. Using real life case studies, interspersed with best practice processes and approaches, the session will give you an insight into how you could avoid the challenges and take advantage of possible opportunities during your project. The session will touch on how we see HFM being used, will cover some of the misconceptions that follow HFM, and will drill into the potential issues this can cause and what is considered best practice for an HFM implementation.
This session will give you great insight into what you should use HFM for and how to deliver a successful project or upgrade.

SESSION 6
4:30 - 5:30 PM
The Spreadsheet Management System Known as Dodeca, Cameron Lackpour, CL Solve
Business users love Essbase for its unparalleled analytic power. Business users also love Excel because spreadsheets are where data is expressed, analyzed, and manipulated. Essbase + spreadsheets = analytic bliss.
But as soon as you move beyond ad-hoc Essbase functionality, a series of questions arise:
1) How do you handle complex functionality?
2) Code yes, but where, and how?
3) What about non-Essbase data?
4) Workbooks on the web?

What’s needed is a system that:
1) Is spreadsheet-centric
2) Ties easily to SQL
3) Automatically distributes and updates workbooks

Dodeca does all of the above, and more; it is the key to managing complex workbooks so you and your company can focus on the real task at hand—analyzing, understanding, and managing the numbers, not the spreadsheets. This presentation introduces the issues around spreadsheets, Dodeca’s philosophy around managing multiple complex workbooks, and then demonstrations of what Dodeca can do.

Day 2

SESSION 7
9:00 - 10:00 AM
Taking OBIEE to the Next Level, Maneesh Disawal, James & Monroe
Move over pivot tables and charts and incorporate Exalytics with KPIs, scorecards, maps, and advanced analytic functions into your regular reporting. Use the latest infrastructure and visualization techniques to create dazzling dashboards to quickly and directly communicate relevant information. Incorporate data in office communications and deliver reports on mobile devices.

SESSION 8
10:15 - 11:15 AM
Essbase ASO – A Brave New World in Australia but not for the Rest of the World, Steve Hitchman, m-power Solutions
Many of you will be familiar with the concept of Essbase ASO / Aggregate Storage. You’ve probably read about the differences from Essbase BSO, seen a case study, or used it on a project, but even though ASO has been available for over 5 years, adoption in Australia continues to be very limited.  In this session, we will cover the basics of what ASO is before exploring what’s so great about it and how it is being used to help companies in Australia.  For lovers of the more traditional BSO model, we’ll explain the differences, highlighting what you can and can’t do in ASO models. This initially seems like a lot, but we’ll pass on the tips and tricks that we’ve learned that allow this gap to be bridged, including how ASO and BSO can work together.
We’ll highlight the real world stuff that goes beyond the textbook and Oracle marketing to expose how ASO technology is revolutionising what Essbase can deliver.

SESSION 9
11:30 AM - 12:30 PM
Oracle BI and Oracle Essbase: Today and Tomorrow, Stephane Roman, Oracle Corporation
Essbase and OBIEE have come a long way together since they met years ago. Being both strategic solutions, Oracle have continuously improved their specific technologies while making the integration more and more seamless at the same time.
In this session, we will first of all give an overview of the OBIEE and Essbase architectures and explain their typical usages within the enterprise. We will then open the bonnet and look at how the two solutions work together. How is the Essbase multi-dimensional structure (outline) understood by the OBIEE semantic layer (Repository)? How can I federate relational and multi-dimensional data sources through the OBIEE RPD’s logical layer? What are the different ways to import and model an Essbase cube in OBIEE? How do you work with unbalanced hierarchies? Are all Essbase features available through OBIEE (UDAs, Variables, Levels vs Generations…)?
Finally, we will look at how upcoming releases of OBIEE and Essbase will make the two solutions even more tightly integrated, as well as take a glance at how both are used with new Oracle Applications.

LUNCH
12:30 - 1:30 PM 

SESSION 10
1:30 - 2:30 PM
Slay the Evil of Bad Data in Essbase with ODI, Cameron Lackpour, CL Solve
Everyone knows that bad data kills Essbase applications. But did you know that bad data can also kill careers, consulting engagements, and company-wide labor agreements? Why is the rule of high-quality data in Essbase honored more in the breach than the observance? This session explains the real-world consequences of bad data, categories of data quality, and tools and strategies for ensuring that your Essbase database has the right data. A complete solution in ODI will show one path to salvation in the never-ending quest for quality data.

SESSION 11
2:45 - 3:45 PM 
Growing with Business Analytics - Keeping Updated and Informed, Paul Anderson, Oracle Corporation
What's the fastest way to find out about product certification and compatibility? Which social media channels are available for EPM and BI products? These and many more questions are often asked in the busy and growing business world; incorporating the Internet highway into our lives.
There have been recent changes in both significant improvements and new implementation of ways in which customers can communicate with Business Analytics support.
This presentation will run through changes made by the Oracle Business Analytics Proactive Team to areas including MOS Communities, knowledge management, and social media.
This session will show the way you can ask questions and find answers about Product Lifecycle, how to interact with Business Analytics support in a simple and efficient way, how to keep updated with news on areas such as patches and documentation, and will provide a demo of the new translations being introduced into My Oracle Support.

SESSION 12
Closing Panel
3:45 - 4:30 PM
Do you have a question about BI or EPM? Your BI and EPM speakers are here to give answers. This session is moderated but is expected to be freewheeling and open. Try to stump us!

Whew, is that enough?

What oh what oh what are you waiting for?  Read what’s on offer – that is great stuff, packaged for, and (mostly) delivered by Australians (hey, I have incredibly distant relatives somewhere in Victoria so do I get a partial pass on the Yankee Go Home protests?).  That means the content is targeted to your needs and your market.  What more could you possibly want?  Sign up today.

Be seeing you in Melbourne.

Where in the world is Cameron, day 1, New Zealand edition


Introduction

Over the next few days (where I am, it’s Monday, so that’s a hint) I’m going to use my blog to highlight my travels through Australasia. This series is basically my take on Where In The World is Cameron (some wonder where I am, intellectually, all of the time, let alone when I’m on the other side of the world) for the next week. Hopefully for those of us not from the Antipodes, this will give you an American’s/Yank’s/Septic Tank’s/Seppo’s (yeah, the nicknames get less and less loving from left to right) take on what user groups are like in other countries. At the end of the day, we all speak Oracle, and the really awesome and cool (ahem) amongst us speak Oracle EPM, so I am looking forward to seeing how things differ from the US of A.

Day 1, NZOUG 2013, 18 March 2013

The kickoff, 9:30 am NZDST

Would you believe that New Zealand is a long way from the East coast of the States? Well, it is, but I’m here, somehow, and I am writing this in the kick off session of NZOUG (btw, last night I was lectured, and then tested, quite closely, on my ability to say “N-Zed-O-U-G” – I am happy to report that a lifetime of watching Trevor Howard and John Mills movies about Splendid English Chaps and Beastly Everyone Else well prepared me for this linguistic challenge) 2013.

This morning I’ve listened to man-without-a-country NZOUG president Francisco Munoz kick off the conference, Peter Idoine the NZ Oracle MD welcome everyone to New Zealand’s once-every-18-months conference, and now Stuart Speers the Platinum Sponsor talk about the cloud. As all seven or eight of you that follow this blog know, I am a huge fan of the cloud and use it all the time for EPM. No need to sell me on the cloud Stuart, I am 100% on board with the message. :) Okay, he’s not speaking to me, but I am a huge proponent of cloud functionality. It is The Way of The Future.

The keynote

The leadup

So this is a bit embarrassing for someone who makes his living off Oracle products: I have never listened to Tom Kyte speak. I suspect that fellow Essbase Hackers (again, all seven or eight of you, including my Mum) are likely similarly ignorant of probably the biggest name in the Oracle database world. This is a function of Oracle-is-bigger-than-the-sun and the siloing of many of the products, or maybe the siloing of my technical knowledge. No matter, I haven’t seen him speak and I am really looking forward to it.

The Tom Kyte update, 10:15 NZDST, 18 March 2013

OMG, this guy I like – “I dream in SQL”. Yup, that is my kind of guy. He loves SQL as much as I love Essbase – maybe more, which is saying something.
Perspective is a funny thing – for an EPM guy, I like to flatter myself that I have a decent basic understanding of SQL and have used that SQL hacking (sort of like my Essbase hacking) for fun and profit. I am not 100% deluded as I have always realized that I’m just scratching the surface with my knowledge level. Having said that, OMG yet again – nothing like a reminder of how basic basic really is when one bumps up against a master at a technology. No surprise to the rest of the world, Tom really knows what he’s talking about.

There is a lot of buzz around 12c and to quote Tom, it is “coming out soon”.

The break, 10:50 am to 11:10 am

Just like Kscope, between sessions there’s the ability to walk the vendor booths. And of course the really important stuff is just below:
Flat blacks (Americanos, sort of), flat whites (café au laits, sort of), espresso, mmmmm. What was I here for again? Oh, right, NZOUG. And here’s the user group booth.

Babar Jan-Haleem, BI Futures, 11:10 am to 12:00 pm, NZDST

I begged, harangued, and bothered Bambi Price, Erica Harris, Kay Galbraith, and Babar Jan-Haleem himself to come to NZOUG and talk BI futures. Perhaps they all realized I wasn’t going to go away until he said “Yes”. And I am quite happy to say that he is here, and is talking about BI futures. But what is this talk about Exalytics being anything more than the biggest and baddest Essbase box in the world? There’s more to life than Essbase? Apparently so.

Charles Pinda, Financial Results with EPM 11.1.2.2, 12:10 pm to 1:00 pm, NZDST

Ah, Oracle presales, but I like him. :) I kid, I kid. These guys are great – I have *tried* to do it and good grief is that hard to do (I might add that I ran away from it as fast as my little legs would take me). Implementation geeks like me cannot exist (or at least cannot earn a living) without guys like Charles and the sales reps he supports. And to be fair to them, they tend to have a much broader view of the tools, needs, the market, etc. than Essbase/Planning/ODI/whatever hackers tend to have. They have to speak to multiple products, multiple industries, and multiple customers, all at the same time.

So what’s this session all about? Smart View, for sure. And from an Essbase geek’s perspective, the fact that 11.1.2.2 has reached parity with the add-in, well…that’s it for me from a tool perspective now that I can seamlessly use Planning with Excel.

Project Financial Planning – the prebuilt colossus known as PFP (there is some really cool BSO and ASO integration behind the scenes). Personally, from a Capex and even Workforce perspective, this has to be the future (no, that is not official Oracle-speak, just what appears to be obvious to me, and of course I could be wrong). Its focus is long-term projects and yes, I have tried to do this in “normal” Planning and it is sort of a pain to do. Given its name, it isn’t super surprising that PFP is a better fit for this kind of Planning. Btw, this may either make you sad or make you jump for joy (I tend to be in the latter camp) – no EPMA.

Charles just talked about Decision Management – this is really cool – it’s a summary of budget requests with narrative justification and supporting detail. This is a new feature of Public Sector Planning and Budgeting aka PSPB as of 11.1.2.2. I don’t do PSPB but this is something that really ought to be in “normal” Planning IMHO.

Richard Philipson, Exploring Oracle BI Apps: How Does it Work and What Do I Get?, 2:30 pm to 3:20 pm, NZDST

I know Richard from multiple Kscopes – now I’m sitting in on his BI Applications suite session. There are multiple apps around Sales, Financials, HR, Marketing, Procurement & Spend, Supply Chain – who knew? Not me. And that is why I come to conferences like this – the world I need to learn is large, the amount of information I actually know is really pretty small, something has to bridge the gap, ergo user conferences like NZOUG 2013.

And what makes up BI Apps? An ETL tool, a central console to manage anything, built in ETL adaptors, a unified data model, and reports – ta da, BI Apps.

Richard’s presentation is showing Informatica as the ETL tool (I think that is actually how the BI Apps packages are sold by Oracle, which is a bit confusing, although I could have it wrong), but ODI can be part of this as well, and that’s how the BI Apps hook into the transactional system. A data warehouse is at the center (hmm, should that be centre or is that just too twee for words?). Just like ODI, OBIEE has physical and logical layers hung off the DW to come up with reporting.

(Hmm, there’s a Q&A going on right now about the Informatica vs. ODI issue – it sounds like Informatica is the historical solution but the future is likely to be ODI.)

Yr. Obdnt. Srvnt., The spreadsheet management system known as Dodeca, 3:40 pm to 4:30 pm, NZDST

Hmm, tough to say if this was a success or not. Most of the presentations here tend not to be super technical, at least in the BI/EPM track. This presentation was pretty technical, and I’m not 100% sure I hit the mark with this one. OTOH, I did see people furiously scribbling down notes (although I have to wonder if they were writing down what needs to be reported to the NZ Ministry of Health, as I could be a hazard to the public when I am at full chat) as I ranted and raved (in a positive way) about Dodeca, so maybe it wasn’t so bad. You decide – I’ve stuck the full presentation right here. This presentation is pretty big (34 megabytes) because of the embedded movies – You Have Been Warned.

Dan O’Brien, Oracle Business Intelligence: Model First, Build Later, 4:40 pm to 5:30 pm NZDST

High concept – Agile development with OBIEE.

Everything else concept – multiple techniques to quickly model business processes to OBIEE applications without going through a complex bottom-up build process.

This is pretty interesting stuff as Dan is talking about a bunch of different strategies about how to get round the formal, inflexible, “normal” way of developing OBI applications. Really the issues he’s talking about apply equally to EPM.

NB – One really funny comment in his presentation – “spreadmarts”. This is totally in line with my Dodeca presentation – you know your system/implementation/company is in trouble when really big spreadsheets become data marts. Spread + mart = spreadmart.

The end of today

Well, not really the end, but the end of what I’m going to blog for now. There’s a NZOUG event tonight at 7 pm and I hope to get a good sleep tonight. I have really totally given up trying to figure out what time it is, or what time my body thinks it is, and just think about strategies for blissful rest.

And that’s where I’m going to end this post.

Be seeing you.

Where in the world is Cameron, day 2, Australasian edition


Day 2, 19 March 2013, 8:55 am NZDST

It’s a good thing I am, seemingly, beyond embarrassment, as I fell asleep last night during dinner.


Actually, it’s a bit worse than that, as I actually fell asleep after the quite excellent NZOUG 2013 dinner, during the music quiz, in a room full of screaming people. With a live DJ. OTOH, I have never in my life awoken to the strains of “Wake up, Little Susie” sung just for me, so there is that. And before you go, “Ah, sloshed again. Get that geek dried out, asap,” I knew Cameron + tiredness + alcohol was a bad combination and very purposely held back from the truly excellent Mac’s Bitter. All to no effect – I might as well have had more of that beer for all the good abstaining did me. I will plead the excuse that when I finally walked back to the hotel I was able to figure out that, Cameron-time, it was about 4 am and I had just spent the entire night awake. My buddy Bambi Price keeps on telling me, “Don’t think about what time it really is”, and my conscious mind doesn’t, but I fear that my body does.

Yr. Obdnt. Srvnt., Slay the evil of bad data in Essbase with ODI, 9:30 am to 10:20 am, NZDST

I really enjoy doing this presentation as I am a huge fan of ODI and of good data in Essbase databases. It is a surprising and mildly shocking fact that many, many, many Essbase/Planning implementations do not handle data quality at all, or rely on 100% manual data validation to tie out numbers.


I am writing this update a half hour before the presentation, so I have no idea what turnout will be. Also, it is quite possible that after yesterday’s Dodeca presentation the word has spread and I will be facing the New Zealand equivalent of a ghost town.


Update – nope, I had a decent number of people show up a wee bit late – was everyone toasting the sleepy American or just out partying? They aren’t telling me and I’m not asking. :)

Charles Naslund, Infrastructure Preparations for Hyperion EPM 11.1.2.2, 10:40 am to 11:30 am, NZDST

Many things in life elude me: why do we vote for politicians that lie to us – they know that we know that they are lying – and yet we vote for them again and again; why, really, do all the things that taste so good end up being so bad for us; and why is Oracle EPM infrastructure so hard?


Well, lying politicians (as far as I can tell this is true for all parties and all countries) and fattening-yet-delicious food will be with us (and me) forever, but could there be hope when it comes to the complex concept known as EPM infrastructure? Regular readers of this blog know that EPM infrastructure is a continuing challenge for me so I have great hopes for Charles’ session. Maybe my limited knowledge can be expanded. Maybe.


Charles (yea!, fellow Septic) is going through the architecture topology in nice simple to understand terms. Keep it simple Charles, please.


Now we’re onto the topic of virtualization – yup, it’s the same story as in the States – real boxes for Essbase, virtualize everything else, and it would be a really good idea if you went with Oracle VM.


SLAs (Service Level Agreements) – ah, these tend to be somewhat more honored in the breach than in the observance. This one particularly frosts my cookies when the Essbase or Planning server (or servers) goes KABOOM and no one but no one in IT seems to own the servers.


Packet size, pre-compression – now I know why Smart View is faster than IE when it comes to forms. Here are the average network bandwidth requirements on a per-form basis:
  • HFM: 64 to 128 KB
  • Planning: 32 to 64 KB
  • Smart View: 28 KB – this is pretty darn amazing; the SV team has done some magic here.


SANs – every client I’ve had in the last five years has wanted to run Essbase (and everything else) off of a SAN. Essbase needs fast disk to perform well, memory and CPUs be damned. How to do this? Sort of virtualize, sort of don’t: use dedicated LUNs, CPUs, and memory. The data (PAG/IND or tablespace) goes on the dedicated LUN – everything else can be on shared SAN resources. This strategy gives you a 20 to 25 percent performance boost.
All in all, a nice session. Maybe if I attend enough of these some infrastructure wisdom will rub off on me. Maybe.

Richard Philipson, EPM Case Study: Rank Group, 2:00 pm to 2:50 pm

Richard, as always, does an excellent job. Quite how he does Essbase, Planning, HFM, BI, infrastructure, etc., etc., etc. is a bit beyond me. Did I mention that he’s a talented graphic artist? <insert envy> Or maybe I’m just stuck in a rut.


This is an interesting app – it’s architected so that the private equity firm (Rank Group) can bring in/drop companies really quickly. Unlike most other corporate systems, and because their business is so dynamic, they have both full public internet access (definitely not the norm) and four environments: dev, qual, prod, and archive. The last environment is used to snapshot their business at a given time so they can have a baseline to compare against. Again, that is not the way HFM is typically set up, to put it mildly.

The Teaser

This blog will be updated throughout the day (although looking at my laptop clock I realize it’s 9:30 pm EDST so how much bated-breath refreshing of this blog there may be is open to question) so stay tuned.


Be seeing you.

Where in the world is Cameron, days 3 and 4, Australasian edition


Wait, what happened to day three?

What happened was:


  1. I woke up really early
  2. I flew in a New Zealand Air plane to Melbourne
  3. I saw Bambi Price’s house and cruised round Melbourne in a jeep with Bambi’s husband, Fred Price
  4. I bought lots of real Cadbury’s chocolate (what we get in the States looks like Cadbury, but it is a Cruel Joke upon the tasting) to take home
  5. I lay down on the hotel bed “for a minute” before I went out for a beer and woke up the next morning

Okay, so what about day four?

I helped Bambi and Fred set up the conference room at Swinburne University (exciting pictures to be inserted as soon as I can find my boat anchor of a phone), had a flat white, and then proved that Cameron and Fred Do Not Do Networking as we tried, somewhat fruitlessly, to connect to the wireless network. Yes, you are reading this, so we are not hopelessly bad at this.

Here we are setting the room up. Oracle Ace Directors do it all, including moving furniture as required.
 

And all of this was for…

The ODTUG Seriously Practical Australia conference, natch. Yes, that link you see (go on, click on it and be surprised, and maybe just a little sad that you aren’t here) is the agenda, and yes, this is Exciting Stuff. We are bringing the same great focus and depth to Australia as we do to Kscope in the States.

Babar Jan Haleem, What’s Coming in Oracle BI and EPM, 9:00 am, Australian Central Daylight Time (ACDT)

Babar is giving the same (well, the same if you were at NZOUG 2013) session he gave at, wait for it, NZOUG 2013, but as that universe is pretty darn small, it is a fresh presentation to most.


My takeaway – I can’t wait to get EPM into the cloud at a client. I do it all the time from a development/self-training/generally-mucking-about perspective, but that’s a completely different thing than actually running an EPM implementation in the cloud. Let there be no more missed implementation schedules because of install problems!


Debra Lilley, Fusion Applications and Your BI/EPM Investment, 9:30 am to 10:30 am ACDT

Debra just said that Hyperion (aka EPM) is “exciting” and that therefore, I am exciting. She also says that she is not technical. Hmm.

And what is the calculation engine behind Fusion? Why, it’s Essbase. And it’s transactional. You know, the thing that we Essbase developers were Never To Do. Could these be “headless” ASO Essbase databases? Could be.

Debra’s getting a bunch of questions – I’m really glad to see this interaction although she might feel a bit like a trooper serving under General Custer at the moment. What am I talking about, she can more than handle herself. Not technical? Hmm.

Fusion Reporting and Analytics – Oracle Transactional BI (OTBI), Oracle BI Applications (OBIA), and Specialized Analytics. The last bit is all built on Essbase – Essbase is the aggregation engine. How cool is that?

Charles Pinda, Delivering Your Financial Results Better with Oracle EPM 11.1.2.2, 10:45 am to 11:45 am ACDT

Charles does a great job – the 11.1.2.2 functionality that I wish was in “normal” Planning was Decision Management. It’s part of PSPB (Public Sector Planning and Budgeting) and is, in a word, awesome. I can’t even find it documented, although I’m sure documentation exists. It’s a way to collect all of the text, comments, justification, etc. around a budget. It is So Cool.

Endeca Information Discovery, Stephen Weingartner, 12:45 pm to 1:45 pm ACDT

A small world, indeed – I am working (you may or may not be surprised to note that this week is not a normal work week for me) at a client in St. Paul, MN. And Stephen is from…Minneapolis, MN. If you’ve heard of the Twin Cities, you’ll know that St. Paul and Minneapolis are practically one city. As the saying goes, what are the chances?


Beyond odd coincidences, Stephen is here to talk about Endeca. The more I hear about this tool, the more interested I become. Or maybe I finally understand the value of unstructured data and how it might be analyzed. It sure isn’t Essbase, although Essbase can be fodder for Endeca. Its ability to comb through public data and make sense out of it all is intriguing.


Stephen showed a Twitter data source analysis (Dan O’Brien at NZOUG 2013 did much the same, but on the #NZOUG and #NZOUG2013 hashtags) based on the political turmoil here (I barely understand American politics so look up Gillard and Abbott on your own, I pick no sides) – all public data, all real time, all Real Cool.
 

Richard Philipson, Exploring Oracle BI Apps: How Does it Work and What Do I Get, 2:00 pm to 3:00 pm ACDT

Richard is fielding questions about why Informatica is used instead of ODI. Always a fun moment when one has to defend Oracle’s product decisions, but of course he’s doing fine. For the record, everyone wants ODI (but of course) and expects that it will come soon.
Overall, the BI Apps are pretty cool although certainly not simple. OTOH, they hook into the Oracle applications with a moderate amount of pain and complexity. There is a lot of functionality and flexibility in-built to the tools.
And oh btw, slowly changing attributes, the semi-holy grail of Essbase that is kind of, sort of there, is easily displayed in BI Apps.

Christine Aird, Thoughts from the Frontline – Issues and Opportunities Faced When Implementing or Upgrading HFM Applications, 3:15 pm to 4:15 pm ACDT


Another presenter who claims he (or I suppose she) “isn’t the least bit technical”.  I wonder if Australian English (almost as painful a term as American English) defines technical as “more than the Septics would do”.  She’s a geekette, but just doesn’t know it.  Or maybe that is admit it.  Why do I say this?  Because what she describes as her project work is what I do, and I think I’m technical.  I could be wrong about what I do – it wouldn’t be the first time.

Christine is taking us through the various stages of project implementation, what HFM is good for versus Planning, and general good practices around HFM implementations.

Yr. Obdnt. Srvnt., The spreadsheet management system known as Dodeca, 4:30 pm to 5:30 pm ACDT


I get to talk about my very favorite Essbase front end in the whole wide world.  Talking about Essbase is always a good thing.  Talking about Dodeca is always a good thing.  Talking about both is just perfect.

Would someone in Australia please buy this thing?  No, I don’t want the work, I just want Dodeca to plant its flag on yet another continent.  After all, trade follows the flag.

And after me, the bar


At Beer DeLuxe – what a nice way to end a long and useful ODTUG Seriously Practical first conference day.

Keep tuned
There’s more to come, including my Brush With Celebrity, but that is most definitely a case of “If there aren’t any pictures, it didn’t happen.” Oh yes it did happen and yeah, I have the photos as you can see below. I think we all know who this is.

Stanley looks like he’s been through a lot. He’s also a bit smelly. But then if you read his adventures and travails at his very own blog, it all makes sense. And yes, that is my brush with fame. Also, yes, that is a bullet hole, but Stanley marches on. He is made of Stern Stuff.



Goosebumps is the only way to describe how I feel. :)

Be seeing you.

Where in the world is Cameron, day 5, Australasian edition


Not the beginning of the end, but perhaps the end of the beginning

I’m not one to quibble with WSRC, but the ODTUG SP Australia conference is at day two of two and that means that I am finally coming to the end of Cameron’s Most Excellent Australasian Conference Adventure. It figures that my body is finally sort of, kind of, used to the time zone difference because I will be jetting away tomorrow. It took me a week to get used to the time here and I expect another week of sleepiness when I get home to the States. OTOH, I have lots of real Cadbury chocolate (see yesterday’s rant on the stupidity of US chocolate manufacturing practices – Something Must Be Done), I have really had a great time here, and learnt quite a bit about BI and EPM.

But none of the above really matters – what does matter is: did the conference attendees get value for money? Given the depth and breadth of the sessions, the passion that the presenters brought to their sessions, and the high technical level of the presentations (despite protestations to the contrary re “being technical”), I’d argue that yes, the attendees got their money’s worth, and more.

I was a little apprehensive about helping select sessions (read: beg Oracle Australia, James & Monroe, M-Power, Bambi Price, and just about anyone I knew in Australia to help put together the speaker list – oh dear, I am now on the hook for repaying favors but it is all worthwhile) as the Australian market differs somewhat from the US of A’s. Yes, the market details are different, but at the end of the day we are all trying to solve the same problems with Oracle’s BI and EPM tools. The attendee survey will tell the tale (how could a BI/EPM conference not try to wrap metrics around an event?), but based on conversations I’ve had, I think it will be a solid win.

Taking OBIEE to the Next Level, Maneesh Disawal, 9:00 am to 10 am, ACDT

Maneesh is taking us through a definitely-not-standard approach to making OBIEE more useful. It’s nice to know that hacks aren’t just an EPM-only approach. And besides, a good hack isn’t a hack at all, but instead is Just Really Cool.

It’s interesting to see how much OBIEE overlaps with EPM – yes, yes, I get it, Oracle are bringing the two together, but still, it’s interesting to actually observe it. Most ODTUG conferences have me running round like a chicken with its head cut off. I am really enjoying actually being able to sit back and listen.

Another thing I am noticing about OBIEE is how IT-oriented it is. This isn’t a bad thing but it is evidence (if it were needed) that there is still quite the gap between the BI and EPM worlds. Their eventual merger will be interesting to watch.

Essbase ASO – A Brave New World in Australia but not for the Rest of the World, Steve Hitchman, 10:15 am to 11:15 am, ACDT

This session hasn’t occurred yet, but it’s up next. M-Power worked with my buddy Dan Pressman and utilized his Rules of ASO Essbase. I’m very excited to see what they have on offer. Update – the session is in progress right now.

Oh, this is embarrassing, but kind of awesome at the same time. Dan Pressman, ASO wizard extraordinaire, just had a slide devoted to him and His Really Big Brain. What else was part of the slide? Why an advertisement for Developing Essbase Applications. Yes, it is a good book, and internationally loved.

Steve is going through the ASO design principles Dan has tried to hammer into my head:

  • No formulas, unless you must
  • Stored instead of dynamic hierarchies, or at least Multiple hierarchies enabled
  • No more - operators – use just + and flip the data signs on load to get round the dynamic hierarchy requirement
  • Gary Crisci’s MDX chapter in Developing Essbase Applications just got mentioned as well as a resource (Are Gary and Dan soon to become Australian citizens? Could be.)
  • Do the simple stuff in MDX, do the complex logic in BSO and import results into ASO
  • Alternate YTD hierarchies to come up with YTD values through ASO’s aggregation capabilities
  • Solve order to handle variances
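The sign-flip trick in the list above can be sketched in a few lines of Python. To be clear, this is my own toy illustration of the principle, not anything from Steve’s or Dan’s material – the member names and operator map are invented, and a real Essbase build would do the flipping in a load rule or in ODI:

```python
# Toy illustration of the "all-plus hierarchy" trick: instead of tagging
# members with a - consolidation operator (which can force a dynamic
# hierarchy in ASO), flip the data sign at load time so a stored,
# purely additive hierarchy produces the same result.

natural_operator = {"Sales": "+", "COGS": "-", "Expenses": "-"}

def flip_on_load(member, value):
    """Negate values for members whose natural operator is minus."""
    return -value if natural_operator.get(member) == "-" else value

data = {"Sales": 500.0, "COGS": 300.0, "Expenses": 50.0}
loaded = {m: flip_on_load(m, v) for m, v in data.items()}

# A plain sum over the stored hierarchy now yields Sales - COGS - Expenses.
profit = sum(loaded.values())
print(profit)  # 150.0
```

Same answer as the mixed-operator hierarchy, but the aggregation stays stored and fast – which is the whole point.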

Oracle BI and Oracle Essbase: Today and Tomorrow, Stephane Roman, 11:30 am to 12:30 pm, ACDT

Stephane is taking us all on a journey through Essbase and OBIEE integration in the next release of the OBIEE stack. They are To Become One.

Stephane is reviewing Sample.Basic aka My Very Favorite Essbase Database In The Whole Wide World (MVFEDITWWW). It’s nice to know that The Beverage Company’s business continues forward. Who wants to bet that when Sample.Basic was created that a bunch of Arbor Software developers sat round and said, “Eh, a good first effort, but we have to replace that with something better, but soon.” Soon never came.

Watching Stephane’s presentation, I realize that I have a career decision to make – am I going to jump on the OBIEE bandwagon to get a leg up on the tool, or just passively wait for the Bus Named OBIEE to run me over? Maybe getting flattened will be pleasurable? Probably not. So much to do, so little time.

One thing that is funny about OBIEE (and why I personally think things aren’t quite there yet wrt product convergence) – it takes an Essbase database in all of its Essbase awesomeness and turns it into a logical star schema. That is…odd looking. I realize this is how OBIEE federates data but it is still a little jarring for an Essbase developer to see.

Slay the Evil of Bad Data in Essbase with ODI, Cameron Lackpour, 1:30 pm to 2:30 ACDT

This is the same presentation I gave at NZOUG 2013 and I always enjoy a chance to spread the ODI gospel. My solution doesn’t exactly use standard ODI functionality, but as one of the attendees at ODTUG SP Australia likes to say, “There’s always a way round a problem”. ODI is great at enabling those ways around issues. I am a super fan of the tool.

Growing with Business Analytics - Keeping Updated and Informed, Paul Anderson, 2:45 pm to 3:45 pm ACDT

I finally get to meet Paul – I’ve seen his posts on the Business Analytics – Proactive Support web site.

Oracle Support are doing great things to make supporting the not-exactly-simple EPM tools easier. They are trying to preempt problems before they occur. One might argue that they are trying to put themselves out of business. I think that isn’t likely to happen any time soon, but it is beyond great to know that Oracle understands the importance of fixing problems asap and maybe even preventing issues before they occur.

Master Notes, feedback, Advisor Webcasts, patches, product certification, social media/My Oracle Support Communities (hint, use this in lieu of Service Requests when your problem doesn’t involve fully-engulfed-in-flames Essbase servers), product version certifications, patch communities, product specific communities (HFCM and Endeca), Remote Diagnostic Agent (RDA), whew, you get the idea. Oracle Support are doing a lot of interesting things.

Closing Panel, Richard Philipson and Cameron Lackpour, 3:45 pm to 4:30 pm

This is an anything-goes, hit-us-with-your-best-shot session. They are a lot of fun at Kscope and I am hoping that this will be more of the same. However, some topics, like whether you prefer wheat or white toast, are beyond the pale. Okay, I kid, I kid – wheat every time. Rye vs. wheat will have to remain a secret. Everything else is fair game. :)

Keep tuned

Almost done – I will put in my final thoughts when the party’s over.

Be seeing you.
