Cameron's Blog For Essbase Hackers

One of these Smart Views is not like the other, one of these things does not belong


You must look for the clew

A hat tip must be given to Gary Adashek aka @GADASHEK for it is his hard work and (apparently) incessant nagging that got this superduper feature.  See, mega multinational colossus software companies do listen to their customers.

Have a looky loo at the below two screenshots.

More than one red herring

Things you should discount:
  • Excel 2013 versus Excel 2010
  • Slightly different formatting styles (I used the bold red for data to show that I could write to an intersection via an ugly filter)
  • Cell C3 versus B5

Do you see it?

Standard

Something new has been added

Still can’t see it?  That’s not very Holmesian.  

Gotcha!

Perhaps the subtle visual hints below help.

Going walkies
Let’s clear out that grid.

Click on “Insert Attributes” and the magic occurs.  

There they are, the five attribute dimensions from Good Old Sample Basic aka MVFEDITWWD.  No more guessing or going into the Member Selection dialog.  Finally!

There is one teensy weensy problem


That Insert Attributes selection?  It performs a retrieve.  Sample.Basic retrieves the top of the house in 0.016 seconds but for all of my love of this database, it isn’t exactly a real world example.

How do I know that Insert Attributes does this?  I know because I tried this out on an ASO database with 15 attribute dimensions (I swear, not my design) and Smart View…went away for a while.  Bummer.

Imagine that retrieve against a BSO database.  Double bummer.

Could Oracle fix this?  Absolutely – just turn off the retrieve when the dimensions are on the sheet.  It isn’t as though the data value would be any different and the function would be faster, much faster.  Also that would give the user a chance to remove the attributes he doesn’t want.  Surely this is possible, says the geek who actually has zero idea how hard this is.  Never let it be said that I let my ignorance get in the way of my wishes.  Oops, wait…

How do you get this?  It’s as easy as pi.

Simply download the latest version of Smart View.  For those of you reading this in December 2015, that’s release 11.1.2.5.510.  What could be easier?

Again a big thanks

Gary Adashek is the one to thank, not yr. obt. svt. as I would have figured this out in approximately eleventy billion years.

This is pretty cool and proof that Smart View’s progress continues apace.

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 2. – Philip Hulsebosch's blog


Yeah, I’m stealing the content

Well, not really, more like giving a little publicity to a blog that most of this particular blog’s readers do not read.  

It’s by Philip Hulsebosch.  For those of you who don’t know Philip, he’s an accomplished EPM practitioner who has presented multiple times at Kscope as well as a prolific writer of white papers.

For some odd reason (Dear non-anglophone readers, please enjoy the following example of cultural arrogance) this blog is written in German instead of English.  Given that my German mostly consists of types of beers and descriptions of delicious meat dishes, I have been happy to turn to Google Translate to get a rather good account of what Philip has been up to.

For the record, I understand that people not in the States speak other languages.  Once upon a time, even yr. obt. svt. could speak Vlaams.  Disuse has ended whatever ability I had to communicate in something other than English.  Although some would argue that even my native tongue isn't something I've really mastered.

For those of you who are not fully monolingual, here’s Philip’s post in the original German:

And here it is in the Queen’s English:

Or to shorten that:  PBCS for a week.

What’s it all about, Alfie?

Do I have to put in that Michael Caine reference again?  Just listen to Cilla Black sing.  What a fantastic movie.  But I digress.  Again.

Moving beyond British movies from the 1960s, Philip covers:
  • Data center location
  • Planning user roles
  • Reports
  • Data export (quite a bit on this)

What I particularly like about Philip’s approach is that he is coming at these functions as someone who has experience in on-premises Planning but is a Cloud n00b.  Sort of just like me and perhaps you as well.

I have high hopes for many more posts from Philip.  You should too.

Be seeing you.

Do you know who runs the ODTUG EPM Community?


Probably not, but you should

Have you ever wondered who makes the ODTUG EPM Community initiatives come off?  I don’t mean Kscope (there’s a Conference Committee for that) but instead the webinars, the Kscope Monday Night EPM event, the Technical Journal articles, the regional meetups, and the quarterly newsletter?

In large part, you do, Oh Gentle Reader, when you present a webinar, or have fun on Monday night, or write a whitepaper, or help out with the newsletter, or attend a meetup.  But who organizes those initiatives?  Why do they even exist?

‘Cos

These initiatives exist, and engage you, and succeed because of volunteers.  Volunteers who are just like you or yr. obt. svt. only crazier and more dedicated.  They’re ODTUG volunteers that serve everyone in the EPM community within and without ODTUG.  That’s right – ODTUG is the foundation of these enterprises but they serve all EPM practitioners everywhere.  It sounds kind of awesome and it is.

They (and yes, I’m going to get to them in a moment so please bear with me) recruit webinar speakers (hey, if you’re interested in presenting an ODTUG webinar, sign up here), organize the Monday Night Kscope event, recruit Technical Journal writers, appleseed EPM meetups, and recruit writers, write articles themselves, and edit the quarterly newsletter.  None of this happens by magic and all of it happens for “free” to your benefit.  Have you noticed how these activities have picked up over the last year?  That’s not a happy accident but instead the outcome of a group of talented and hardworking individuals.

We happy few

This is the first in a series of posts on ODTUG’s EPM community and in future posts I’ll highlight each of the volunteers, but for this first go round I shall leave the panegyrics to a simple listing of the volunteers and their roles and responsibilities.  As promised, more detail will follow.

*A note and a happy one about GaryC (two Garys overwhelm my limited mind so I refer to them as GaryC and GaryA, probably to their intense annoyance):  GaryC is a recently elected ODTUG board member and as the EPM community is run by EPM community members, not the Board of Directors, he had to find a replacement.  I’m glad to note that GaryC has a namesake colleague GaryA just as interested in the EPM community.  Also, the almost-the-same names keep my level of naming confusion to a minimum.  Sort of.

What about yr. obt. svt.?

I’m very happy to report that I am entering my last year on the board (not actually happy about that) and thus my last year as the EPM community liaison to the BoD (Hmm, not happy about that either but such are term limits.  That’s the way she goes.).  What does that title mean?  While I would love to tell you that it means I expend an enormous amount of hard work, that the success of these initiatives is due to my sterling efforts, etc., the role really is only keeping an eye on what the EPM community does and bringing any requests for funding or direction to the attention of the BoD.  It’s an easy billet.

More coming

As noted, you’ll see more on the EPM community, what we’re doing, and how you can contribute as well as benefit from these volunteers’ hard work.

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 3 -- Managing PBCS


Quid Novi PBCS

Never fear, that’s the last Latin you’ll see in this post.  But I will indeed be covering what’s new in the administration of the PBCS service.  There’s a lot of interesting information available.  Remember, this isn’t about managing your firm’s Planning application but instead managing the PBCS instance.

NB – We still haven’t gotten to PBCS itself and that is the very next post but managing the Cloud is important.  
A warning, this is a very picture heavy post as there’s an awful lot to see.  Dial up (are there any left?) users probably ought to go get a cup of coffee.  Or two.  Perhaps grinding the beans wouldn’t be a bad idea.  Roasting from green wouldn’t necessarily be a bad idea either.  You get the idea.

One other thing – what I’m going to review isn’t what a user or even a Planning administrator would see but instead what Oracle calls the Service Administrator.  This person, whoever he is, manages the Cloud service(s) so this applies to more than just PBCS.

Oracle has a lot of Cloud offerings and they’re all managed through the Oracle Cloud My Services portal.

Wouldn’t this be nice?

It is.  As I’ve noted before, one of the things I like so much about PBCS is how Oracle brings useful information to the front instead of making you search through scads of documentation.

As an example, when I log in as a cloud administrator…

I see the below as I have my username set up to show this on login.

Do I want to know the very latest and greatest about Cloud My Services (CMS)?  It’s right there staring me in the face.

Remember how I wrote about how great the PBCS documentation is?  It’s baked into the tool itself.  

Clicking on the Account Administrator (AA) link takes me to all of the tasks an AA does and then shows how to do them.

What’s new?

Want to know what the new CMS functionality is without searching through patches and ReadMes and all of the other stuff we need to do with on-premises tools?  Easy peasy.
NB – For you Frankie fans, that song (see the above section title link) is from one of his suicide albums.  Fun times.  Oracle Cloud doesn’t engender that feeling in me but rather the opposite.

Want to understand what all the pretty icons mean? Click away.

The Surrey with the Fringe on Top

Once I get past the help panel, the next thing I see is the status of my PBCS instance.
Another American Songbook note: Mel Torme sings “…the dashboard’s genuine leather…” in the song.  Sometimes my references are too obscure.

Back to reality, things are running well, thankfully.  Click on the Details button and I get, unsurprisingly, details.

I can take a longer view of the service status – here’s the past year.

Navigation takes me to my pod’s service outages and all kinds of other metrics.

I can even see the Cloud (not actually users within PBCS but) user accounts.

Users with FTP access:

PBCS user roles:

As well as contact and account management (there’s not much to see and even I weary of cut and paste).

Wot’ll she do, mister?

Concerned about how long it will take to upload that humungous data file?  Here’s my totally lame-o fiber upload speed.  Not actually all that great.

What does this all mean?

I wanted to walk you all through this because I think it’s important to understand how Oracle gets you out of the infrastructure business and yet lets you keep track of everything that is infrastructure.  The information you need, the pain you don’t.  Is there anyone outside of infrastructure consultants who likes infrastructure?  Not yr. obt. svt., and I suspect not you.

As noted in the introduction the next post will finally, finally, finally touch PBCS itself.  I thought it was important for you to understand the underpinnings (documentation, Cloud management) behind the tool.

Be seeing you.

Stupid Programming Tricks No. 28 -- LCM, 7z, and Planning migrations


How long can Yr. Obt. Svt. be wrong?

The answer to that question is apparently indefinitely.  And the task so trivial.  Sigh.

The problem

This post was supposed to be one in my Compleat Idiot’s Guide to PBCS series, and I will use some screenshots from a future post on on-premises-to-PBCS-and-back migration, but I got hung up on making this work.

And then I realized I made the same mistake at a client.  Remember what consultants are supposed to do:  help customers.  I did, sort of, but I made my task much harder.  Sorry.

Let’s walk through this using PBCS although as you will see the issue is exactly the same in on-premises.  Sigh.

The Brotherhood of Man

I have to give credit to my younger, taller, smarter brother from a completely different set of parents, Celvin Kattookaran.  In my hour of need (I have many, too many) he came through for me and didn’t even make all that much fun of me when he explained the answer.

Breaking LCM

I want to migrate an on-premises application to  PBCS.  I have the old Planning sample application from (I think) 11.1.1.3.  I have it working in both 11.1.2.3.500 as well as 11.1.2.4.  It’s a simple application, really simple, and I find simple hard enough as will be shown.

Please leave the premises

Here I am on my on-premises install of 11.1.2.4.  I click on Application Management and…

Here I am in good old Shared Services:

I then export the application file objects:

I’ve moved it to another VM, my Windows 7 VM (I try to keep my 11.1.2.4 VM local only):

Not a cloud in the sky

Here I am in PBCS’ Application Management.  It’s just about the same as Shared Services:

Upload the LCM zip file:

And, as expected, here are the artifacts in PBCS.  As Danny Kaye says, everything is tickety-boo.

So what’s the problem?

If I unzip the application download and then make a change, any change, or even no change at all, and then rezip the file I get this on reimport.

Here I am in 7-Zip, a really awesome WinZip open source clone.  Fwiw, I actually have WinZip (I even paid for it and don’t like violating licenses by installing it on more than one machine) on my bare-metal laptop but 7-Zip is free and has almost all of the functionality; in some ways it’s quite a bit more advanced.

Here’s the folder unzipped.  If I were migrating this to an on-premises install I could copy this entire folder without compression to the import_export Shared Services folder.

Now I’m going to zip this to SampApp1a.zip.  Note that not a blessed thing has changed in the contents.  Also note that the compression engine, be it 7-Zip or Windows’ own compression functionality, makes no difference.
The below are defaults:

Zip-a-Dee-Doo-Dah indeed


Uh-oh.  There’s a difference in size of the compressed archive.  How can that be?  Nothing and I mean nothing has been changed.  Oh well, there couldn’t possibly be anything to worry about, yr. obt. svt. blithely thinks.

The upload goes swimmingly, I think.

Oh how wrong I am despite this lovely message:

And what happens when I try to open the file?

¡Ay, caramba!

Is the service really not available?  That’s silly as it’s clearly there and thus we see Yet Another Confusing Error Message.  Perhaps Oracle never anticipated someone doing something as boneheaded as is described below?  Probably.

Here’s the problem.  I zipped an unzipped folder and that was the issue.  Wot?  It’s the same, right?  Nope.

Here’s the zip file from Planning.

And here’s the unzipped-to-zipped archive.  Do you see what I did?  I zipped the SampApp1 folder within the archive.  What?

It actually makes sense – I zipped c:\users\cameronl\downloads\SampApp1 to c:\users\cameronl\downloads.  The SampApp1 folder is part of the overall path and thus it gets included in the zip file.  Ultimate Fail.  Although to be fair that isn’t actually an intuitive result.  Regardless, I should have looked into the zip archive itself but alas did not.

Nothing’s Impossible

The solution is to go into the SampApp1 folder and zip from there.
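If you would rather script the rezip (the paths below are just the ones from my example, and this is only a sketch, not anything LCM ships with), Python’s standard zipfile module can build the archive with its entries rooted at the folder’s contents:

    import os
    import zipfile

    # Hypothetical paths -- adjust to wherever the exported LCM folder lives.
    src_folder = r"C:\Users\cameronl\Downloads\SampApp1"
    target_zip = r"C:\Users\cameronl\Downloads\SampApp1a.zip"

    with zipfile.ZipFile(target_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(src_folder):
            for name in files:
                full_path = os.path.join(root, name)
                # Store each entry relative to the folder's contents, not the folder
                # itself -- this is the difference between a zip LCM will accept and
                # one it will reject.
                zf.write(full_path, os.path.relpath(full_path, start=src_folder))

The second argument to write() is what keeps the SampApp1 folder itself out of the archive’s internal paths.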

And here we go, just as LCM defines the file structure.

Success! Boil in bag!

Upload it and all is well.  Despite the changed (or in this case not changed) zip file.

Ballin’ the Jack

And that’s it.  So trivial and so painful.

Oh yes, my poor unfortunate client.  Ugh.  I was tasked with splitting up a Planning application.  I downloaded the LCM xml files, did my modifications, zipped them back up and…failure.  Bugger.

As this was on-premises Planning, after a moderate amount of pain I was able to get access to the import_export folders and move the modified LCM files.  I understate the case:  getting that access was really painful.  If only I had begged on my knees, er, asked for Celvin’s help back then.

May my errors not be yours.  

Great American Songbook

It’s difficult to tell if any of you ever click through on the hyperlinks I sprinkle throughout these posts.  Assuming that you do (or maybe assuming that you don’t) I thought I would give you a listing of the music (and one TV show and a few movies) so you can have some idea of my strange cultural tastes.  As I like to remind Natalie Delemar – @EssbaseLady – popular culture pretty much doesn’t interest me past 1965.  It shows.
 
In order:
  1. How to Succeed in Business Without Really Trying, The Brotherhood of Man, Robert Morse (The 1967 version is the OBC definitive version or as close as we can get save a time machine to get back to 1961.)
  2. The Honeymooners, Please Leave the Premises, Jackie Gleason aka The Great One, et al.
  3. Merry Andrew, Everything is Tickety-Boo, Danny Kaye
  4. Song of the South, Zip-a-Dee-Doo-Dah, James Baskett
  5. Swing Time, Pick Yourself Up (medley), Andy Williams and Jack Jones
  6. That’s My Boy, Ballin’ The Jack, Dean Martin, Polly Bergen, and a guy who reminds me of me only with a lot more talent

I like to think that a hundred years from today The Great American Songbook will be what our descendants will view as the musical acme of the 20th century.  Jazz is America’s Classical Music, or at least it is on this blog.

Be seeing you.

Working in EPM? Live in South Florida? Not going to the South Florida EPM Meetup? Why?


Why indeed

Just what is a meetup?  Given EPMers’ technological bent, it is surprising to me that we don’t readily cotton on to the concept of a meetup.  That’s a pity because they are a great way to (danger ahead:  a geek who thinks he’s witty) meet up with like-minded individuals.  Think of them as a social media tool involving living, breathing meatware, using their collective wetware, all occurring in real life.  Isn’t slang wonderful?

What does that all mean in Plain English?  


Have you been to an ODTUG-nurtured EPM meetup?  Hardworking (given her ODTUG volunteer workload it’s more like insanely hardworking) Janice D'Aloia heads that initiative within the ODTUG EPM community.  A note about these meetups:  ODTUG encourages them through funding and support but at the end of the day a meetup is owned by its members, not ODTUG, so please don’t think attendees are taking any orders from what-is-likely-the-best-Oracle-user-group-ever.  It’s all part of ODTUG’s service to its community members and yes, it sounds pretty noble and it just plain is.

If you are interested in getting help starting up a meetup in your area, please contact ODTUG at erin@odtug.com to start the meetup ball rolling.  ← That’s an idiom, not slang, but aren’t idioms just as wonderful as slang?  Discuss.

South Florida EPM Meetup

And that brings us to a specific meetup, namely the upcoming South Florida EPM meetup.  It’s occurring on Thursday, 18th February, 2016 at Dave & Buster’s Hollywood, Florida location from 3:30 to 6:30 pm (or later if you’re having that much fun).

What’s on offer?  The very things that make meetups so much fun:  education in the form of a Special ODTUG Surprise (And no, I do not qualify as a surprise, or at least not a pleasant one.  There will be a projector, and a laptop, and a demo.), a super geeky-cool game, and the opportunity to meet your fellow Floridian EPM practitioners.  What’s not to like?

It’s easy-peasy to register – you can do this on ODTUG’s EPM meetup page right here.  Meetup.com provides a lovely confirmation screen once you’ve registered.

And Bob’s your uncle, you’re set to attend.  You are going to, right?  You should if you’re not.

And to whom do we owe the pleasure?

As I noted, ODTUG is an enabler but meetups are intrinsically grassroots.  They are founded and led and staffed and attended by you, the EPM geek.  The organizers are just like us – people who live, breathe, and eat EPM.

In the case of the South Florida meetup it’s Jessica Cordova of ARC EPM and Kris Calabro of Tyco International.  Jessica and Kris make meetings like this possible and we’re all in their debt.  Meetup organizers, whether they be hiking enthusiasts, British sports car owners (make mine a Sunbeam Tiger with Minilites), or yes, even EPM practitioners do it because they love whatever the passion is.  Benefit from their enthusiasm if you’re in the South Florida area on 18 February, 2016.  Join us, won’t you?

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 4 - Philip Hulsebosch and Smart Push


A note

I have again managed to convince Philip to write a guest post.  Actually, he suggested it but I like to think that I am Svengali or at least a salesman.  Nope, this was all his idea.  It’s really great that our EPM community can come together and share in this way.

And with that, and a wee conclusion at the end, take it away Philip!

Smart Push, this blog, and you


Cameron, some other consultants and I were given the opportunity to work with a Planning and Budgeting Cloud Service (PBCS) pod (Version 15.11.65). We could see, touch, and feel the latest version of Planning, including all the new functionality already included there. We were able to explore and experiment with the specifics of the Cloud Service and found that there are some major differences between working with an on-premises and cloud version. As an example, we (sort of, or have at least resigned ourselves to it) love EAS, because it gives us a feeling of control. During our exploratory work, I sometimes missed it, but seriously, I did not need it. Now, I can imagine a life without EAS.  PBCS has much to offer and a different and often better way of handling common Planning requirements.

As Cameron noted, this series of PBCS blog posts is all about sharing our knowledge with the wider community as it marks a new era in the category of planning tools.

Smart Push Functionality

Smart Push is a method to push data into different Plan Types with a process triggered from a data form.  You can push data including comments, attachments, and supporting detail from Block Storage Option (BSO) Plan Types into other BSO or Aggregate Storage Option (ASO) Plan Types. It is different from EPMA Data Transfer, partitioning, or @XWRITE/@XREF. It comes close to the existing on-premises data copy into the reporting Plan Type through scheduled transfers, but I think it is easier to set up and more integrated into Planning.

Why is this so exciting?

Some users want to see the results at aggregated levels of the data they entered and saved a few seconds ago. Yes! They also want to have all their KPIs and variances calculated as well. Yes! And you want to deliver, because you are so nice to them!

First, you tried aggregations. Then you needed to optimize these aggregations to the very last. Then you added a reporting Plan Type to your Planning application and built scheduled transfers running every couple of minutes and eating much of your server resources. Ugh, ugh, and ugh.  No, do not give up. Smart Push comes to the rescue.

This post will cover how you can implement this functionality and show you how it really works using the PBCS version of the sample Vision application which has a reporting Plan Type.

How to do this?

First, you need at least two Plan Types in your Planning application to connect. A reporting Plan Type will do very well as a target, although it could be another kind of Plan Type if required.

The next step will be to map the source to the target. In this we need to take care of the dimensions, the point-of-view and the data options. Conceptually, this is familiar ground.

We will then define this connection in a data form and configure that form to run the data transfer on the save data process.

Lastly, I will show an example where Smart Push is tested with data.

Whoops, really lastly, I will show in the Simplified Interface where to check for errors.

Map Reporting Application

The first step is to map the reporting application. This target can be any BSO or ASO Plan Type in the Planning Application. Select the menu “Administration” and then the option “Map Reporting Application”.  


Figure 1: Administration menu to select the option “Map Reporting Application”.

The window “Map Reporting Application” opens. Here you can see existing mappings and with the green plus sign, new mappings can be created.


Figure 2: Overview of the mapped applications.

It is best practice to use a name which describes the mapping of the source and target Plan Type. I have taken here the name Plan1-VisASO. Accordingly, I take “Plan1” as the source and “VisASO” as the target Plan Type. Please note, multiple connections can be made between Plan Types.


Figure 3: Selection of source and target Plan Type.

The second tab is called “Map Dimensions”. Here, the dimensions of the source application and the target application are mapped. There are three options for this:
  • Not Linked
  • Dimension to Dimension
  • Smart List to Dimension


Figure 4: Mapping of the dimensions.

In the member selection column, a selection can be made of the members of the individual dimensions. Therefore, it is important to know which members exist in both Plan Types; otherwise, the data cannot be captured at the receiving side.

There is no dimension “HSP_View” in the reporting application “AVision” and there is no dimension “AltYear” in the Plan1 cube of the Planning application. Therefore, no member has been selected in figure 4 for the dimension “AltYear”.

Note: At least one dense dimension and the “Account” or “Period” dimension must be mapped. This is also visible at the upper section of figure 4.

When opening the Dimensions section in the Administration menu, one can select the receiving Plan Type, which is in our example “VisASO”, and see the reduced member set of the dimension accounts as shown in figure 5.

Figure 5: Account dimension of Plan Type “VisASO”.

In the member properties, you can see under Source Plan Type in which Plan Type the data is stored. The checkmarks further down in the Plan Type section show in which Plan Types this member exists. In this example it is in “Plan1” and “VisASO”. Unchecking one will remove this member and all descendants from that Plan Type. Therefore, take caution here!

Figure 6: Member Properties and Plan Type selection.

Because we have a mismatch in dimensions between the Planning application and the reporting application, we need to address this in the “Point of View” tab. Here we choose the member which holds the data and which will serve as a placeholder for the missing dimension. In figure 7 you see the member “Base Data” selected for “HSP_View” and the member “FY09” for the dimension “AltYear”.

Figure 7: Mapping of the POV.
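Conceptually, the whole mapping boils down to a small data structure. Purely as an illustration – this is my own sketch of the three things the Map Reporting Application dialog collects, using the names from the example above, not how PBCS actually stores the definition:

    # Sketch only: the pieces of a Smart Push mapping, not a PBCS structure.
    mapping = {
        "name": "Plan1-VisASO",
        "source": "Plan1",            # BSO Plan Type the data form writes to
        "target": "VisASO",           # ASO reporting Plan Type
        # Dimension-to-dimension mappings; member subsets can be restricted per dimension.
        "dimension_map": {
            "Account": "Account",
            "Period": "Period",
            # ...the other shared dimensions map the same way
        },
        # Dimensions that exist on only one side get a fixed point-of-view member instead.
        "pov": {
            "HSP_View": "Base Data",  # exists only in the source Plan Type
            "AltYear": "FY09",        # exists only in the reporting Plan Type
        },
    }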

At the last tab of the Map Reporting Application you can see the “Data Options”.

Figure 8: Options about pushing comments, attachments and supporting detail or not.

Note the interesting option “Append multiple cells into one cell”.  This is new functionality that will merge the comments and attachments in the relational data source – something that is otherwise impossible to do out-of-the-box.

Now all settings are done and we can save the mapping. As you can see, multiple mappings are possible in this window (figure 9). In the next section I will continue by adding the mapping to a data form and enabling Smart Push. This is the really new part.


Figure 9: Saved Application Mapping.

Activate Smart Push in Form Management

The next step is to enable Smart Push in Form Management. As you can see in figure 10, there is a new tab called “Smart Push”. Here you can add, change and remove Smart Push definitions of this data form.

Figure 10: Enable Smart Push at the data form.

After pushing the add connection button, a window with all mappings will be shown for selection. In this, I select the mapping created in the previous steps.
Figure 11: Selection of the Application Mapping.

It is very important to select the option “Run on Save”. This rather tiny option is on the top right side of the window.

Usually, you will select the option “Use Form Context” for all dimensions, but there is an option to overwrite the selection of the data form. An example is shown in figure 12. In other words, technically, you can copy data from Scenario A on the data form into Scenario B in the Reporting application. It is possible to add a few members into the selection. Mmm, this will be interesting…

It is also possible to add more than one mapping to one data form. Pushing the same data into different locations….


Figure 12: Setting about Form Content and example of an Overwrite Selection.

This is really smart, isn’t it?

I saved the form and now move over to test it.

Testing of Smart Push

I am doing some data entry on the data form. Then pushing the save button.


Figure 13: Data on the form before save and kicking off Smart Push.

The Smart Push actions always take place after the save and calculate process.


Figure 14: Information box after save data. Smart Push was successful.

Figure 15 shows the situation in the reporting application before pressing the save button on the data form. We see there is no data in the reporting application on the same POV as on the data form of Plan1. We have selected the member “FY09” in the dimension “AltYear”. This is where we defined the data target to be in the mapping.

Figure 15: Reporting Application (VisASO) before saving data in the “Plan1” application.

Figure 16 shows the situation after pushing the data into the reporting application. At level 0 the data look fine. We see summarized data in the member “Sales”, but the amount on “Net Income” -> “Sales” is incorrect!
Figure 16: Reporting Application (VisASO) – the data were pushed over and aggregated into the nodes.

Where does this delta come from? In the reporting application, the data for “Gross Profit” is too low and there is no data at all for “Total Cost of Sales and Service”. That data is missing because it did not get transferred: when we have a look at the data form used for the Smart Push, we see these members are not included in the selection and not present on the data form.
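To make the mechanics concrete, here is a tiny illustration with made-up numbers (not the Vision data):

    # Hypothetical numbers, purely to show the mechanics -- not the Vision data.
    # Suppose a parent account aggregates from two children in the outline:
    child_on_form     = 1000   # on the data form, so Smart Push copies it
    child_not_on_form = 250    # not on the form, so it arrives in the target as #missing

    parent_in_source_cube = child_on_form + child_not_on_form   # 1250
    parent_in_target_cube = child_on_form + 0                   # 1000 -- the totals disagree

Whatever the signs and member names in the real outline, the aggregated totals in the reporting Plan Type can only match the source if every contributing level-0 member is on the form being pushed.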

To correct the situation, I added the “Cost of Sales” member onto the data form and repeated the data entry and save. This ran the Smart Push of the data again.

Figure 17: Data at the modified data form, now including the Member “Cost of Sales”, save and Smart Push.

When reviewing the data in figure 18, I see the correct values for “Net Income” -> “Sales”. Smart Push looks at the data form to determine which data to transfer.

Figure 18: Reporting Application (VisASO) – now the data of “Cost of Sales” were pushed as well and aggregated.

Smart Push always clears the target data area before loading the data. In other words, #missing values are transferred as well.

There is a log of Smart Push processes. The application manager should review this for problem areas. The interesting part is that you can see this log only in the Simplified Interface. To get there, select “Console” > “Jobs” and set the filter.

Figure 19: Adjust the jobs filter for Smart Push.

Figure 20: Then select the status.

Figure 21: Example of Smart Push Job in the overview.

Philip’s final comments

Smart Push is already widely adopted within PBCS Planning applications and will be in on-premises whenever that comes out, because it is so powerful and can be applied with a lot of creativity. However, it is very important that all data is copied. Nothing kills the reputation of an application faster than wrong data.

Likely a data transfer process that synchronizes all data will still need to be established alongside Smart Push. Smart Push is for the end users who need to see their changes without waiting for some admin process to finish.

Compared to partitioning, @XWRITE and other options of data transfer, Smart Push is rather simple to configure and maintain. I am sure you will love it too!

Regards, Philip Hulsebosch.

Cameron’s final comments

Philip, again thank you so much for putting all of this hard work into this post.  We all owe you a debt of gratitude for showing us on-premises luddites how Smart Push works.  I (we) can hardly wait till it and the many other cool features of PBCS make it to on-premises.

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 5 -- Migrating from PBCS to On-Premises Planning


Wrong Way Corrigan

There are plenty of blogposts, videos, and Oracle documentation on how to go from on-premises Planning to PBCS.  But what about the other way round?  What if you want to take a PBCS application and migrate it to on-premises?  It is actually doable and not all that hard.  

Why would you want to do this, keeping in mind the lower level of functionality in on-premises (as of the writing of this post and 11.1.2.4)?  Beyond the “OMG, PBCS isn’t what we thought it would be.  Eject, eject, eject” use case, which I don’t think is particularly likely, I see these as:
  • Developers need to make changes to hierarchy, calculations, forms (with some caveats about functionality) outside of the development environment (think dev, qual, and prod)
  • Prototyping within a limited PBCS instance environment for customers and consultancies that can’t spring for more than one instance
  • Customers that have a real production environment and treat PBCS as their development environment
  • Any kind of hack is super cool and besides it gives one’s inner geek insight into how PBCS works beneath the covers

So is this approach the wrong way?  It’s certainly possible that the above may fit one of your use cases.  Just remember that all of the neat-o, gee whiz, spiffy deluxe functions below are not currently in on-premises.  I’ll do my best to show how to get round these although I may not be catching every function:
  • Smart Forms
  • Smart Push
  • Sandboxing
  • Valid Combinations
  • I’m sure there’s something else but I can’t think of what it is.  You PBCSer’s with more experience than yr. obt. svt. will be able to tell us.  Comment away to this post and I’ll include your notes.

Imports Aweigh

As you remember from my post on my inability to correctly zip an LCM file, PBCS supports LCM although it is called Application Management.  Assuming I actually manage to export an application, bring it down to my on-premises instance, and then make the even greater leap to importing the zip file, Shared Services looks like this in my 11.1.2.4 VM instance on PBCS application import:

Who’s got my data?

Thinking that this worked like an on-premises LCM migration, I created a Planning data source (HYP_PBCSVision) and a Planning app called Vision with the Plan Types from the LCM artifacts.  I ran the import and…

Ah, the datasource name.  I find it intriguing, sort of, that it is named PlanningDatasource1.  If there’s a “1” maybe there will one day be a “2”?  A geek can dream.

Begin the Beguine

Dimensions

After creating a Planning data source with the name “PlanningDatasource1”, I ran the import again and got:
The dimension Year doesn’t exist?  Is this some kind of weird-o PBCS thing that changes the name of the dimension?  It’s certainly there in on-premises:

And it’s there in the PBCS LCM artifacts as well.  What’s going on?

It is one of yr. obt. svt.’s more glorious moments.  Did you see what I selected up at the top?  Plan Types but not Global Artifacts?  Arrrgh.

Select the right Global Artifacts so that there actually is a Year dimension.

KABOOM
Wow, that didn’t work the way I hoped it would.

What’s not working here?
  1. Accounts are failing
  2. Adhoc thingies aren’t playing nice
  3. For the Love of Mike, even the Version dimension doesn’t work
  4. HSP_View?  Whazzat?

Other than that, everything’s tickety-boo.

Insane in the Membrane

Hewing to the commonly held definition of insanity, and putting aside all of the other things that went tits up the last time round, I tried reimporting the standard dimensions:

And had fewer (and smaller) errors.  Yes, sorry for the small typeface but I’m not going through all of this again just to make it easy to read.  I should think about how this is going to look when I write it all up but as I could barely get it to work I’m glad I made it hard to read.  Whew, that was a mouthful.

Everything Happens to Me

The important one is here.  “Sandbox Enabled”?
We have now reached one of those it-ain’t-in-on-premises-yet features.  Sandboxing data is possible in PBCS via Version dimension functionality but the concept just doesn’t exist for on-premises.  And that failure of Version to get created makes the member formulas for Earnings Per Share fail because they reference the Version dimension.  Bummer.

On a Clear Day (You Can See Forever)

It’s easy to find in the LCM Version.csv file:

And just as easy to delete:
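If you would rather script the edit than hand-edit the CSV, here is a minimal sketch.  It assumes the sandbox flag really does surface as its own column in the export and that the string “Sandbox” identifies it – verify against your own Version.csv (and remember that LCM files may carry extra header records above the column row) before trusting it:

    import csv

    # Sketch: drop any column whose header contains "Sandbox" from the LCM Version.csv.
    # Assumes row 0 is the column header row -- adjust if your export has extra header records.
    with open("Version.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))

    header = rows[0]
    keep = [i for i, col in enumerate(header) if "Sandbox" not in col]

    with open("Version_noSandbox.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for row in rows:
            writer.writerow([row[i] for i in keep if i < len(row)])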

As with on-premises, dimensions are also available from the Administration menu’s Import and Export Metadata to File functionality.

Minus all of the LCM-specific header records, the columns are the same:

With much of the application already built, I could import that dimension directly via the same process on the way into on-premises.

However it’s done, the result is (less Sandboxing) complete:

Too clever by half

Ever wondered what the name of the Essbase applications and databases are in PBCS?  You know, the ones you can’t see ‘cos there ain’t no EAS?  Refresh the database to Essbase and there they are:

NB – Vision corresponds to the application name; AVision and BVision hold the ASO plan types.  Interesting, no?

If I have Essbase databases, I can bring in data.  Or can I?

Bugger

Unicode?  Oops.  I didn’t create my Planning datasource with an Essbase unicode connection and it looks like PBCS uses that by default.  As an aside, I don’t know why any Planning or Essbase application doesn’t use Unicode by default but that’s above my pay grade.

Surely I can get round that with EAS?  Uh, no.

Given that I goofed, the easiest way round this encoding issue is to convert the data from UTF-8 to ANSI or at least that’s what I think I need to do.  An easy way, assuming there’s enough memory to do so, is to open the file in Notepad and then do a File->Save As.  The encoding type will be right there next to the Save button.

I used my very favorite text editor, Notepad++, to convert from UTF-8 to ANSI.  Yes I could have done that in Notepad but I lurv Notepad++ and it works for much bigger files:

If you want a treatise on Encode vs. Convert, have a read here.

I also could have used the Essbase Unicode File Utility but find Notepad++ to just be a whole lot easier.  
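For truly huge files, a scripted conversion works too.  Here is a minimal sketch – the file names are hypothetical and cp1252 is just the usual Western European Windows “ANSI” code page, so pick whichever code page matches your locale:

    # Re-encode an export from UTF-8 to a Windows ANSI code page, line by line,
    # so even very large files never need to fit in memory at once.
    src = "Vision_data_utf8.txt"   # hypothetical file names
    dst = "Vision_data_ansi.txt"

    with open(src, "r", encoding="utf-8-sig") as fin, \
         open(dst, "w", encoding="cp1252", errors="replace") as fout:
        for line in fin:
            fout.write(line)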

As a further aside, I could have converted the Vision application to Unicode if I wanted to:

But not the ASO applications:

The odd thing about the ASO database  data loads is that I did not need to go through a Unicode to ANSI conversion.  The data came in the native file format and I can only conclude that Essbase treats the data encoding differently between the two database types.  Or it’s a bug.  I’d be interested to hear if anyone knows for sure.

With that, data can now be loaded via a load rule.  Yes, I could have brought this back inside the LCM folder hierarchy but I’m tired of that tool.  Besides, I’ve finally gotten this beast into EAS so I’m not letting go.

Success!  Boil in bag!


Is all well?  Well (I kill myself), no, it isn’t.  Yes you saw a successful import, but what you didn’t catch was this data column that I had to ignore in the Load Rule.

What is BaseData?  What indeed.  It’s there in Philip Hulsebosch’s post on PBCS and Smart Push but not in LCM.  Huh?  BaseData is part of the HSP_View dimension which supports Sandboxing which (a bit of a run-on sentence here) is not in LCM’s dimensions.  It’s an intrinsic part of PBCS just like the HSP_Rates dimension in multicurrency Planning apps, which is also not specifically defined in LCM.  The application import takes care of this once it realizes that the application is Sandbox Enabled.

The dilemma is that there isn’t an on-premises Sandboxing function.  What to do?  One approach would be to ignore the HSP_View dimension but as we’re replicating PBCS and because the forms all need the dimension, it must be added.  But how given that LCM doesn’t support it?  

Just as with the Version dimension, simply export it from PBCS’ metadata export:

The output from that PBCS metadata export:

Import it in through on-premises’ metadata import function after first creating the dimension.



Ta da, we now have a HSP_View dimension.

The Rest of the Story

There are other artifacts to import but hopefully we’re (I’m) a bit more cognizant of the limitations of on-premises vs. PBCS.  Alas and alack, Jobs and Valid Combinations Rules (especially alack on the latter) aren’t supported in on-premises, so they need to be excluded from the import.

Let’s bring in relational data objects, excluding those oh-so-cool-and-oh-so-not-there-in-on-premises Sandbox Changes:

And KABOOM yet again:

I didn’t do security.  Arrgh.  Let’s do it:

I’m blurring the specific names but the types are Planner, Poweruser, Service Account, and Viewer.  The two groups I need to care about for security are Finance Management and Vision Planner.  I must first create them in Shared Services as they don’t exist.  With that done, I can import them.

Users

As I have security that is not assigned by group, I have to create the PBCS email-driven usernames, e.g., cameron@somethingorotherandthisisntreal.com.

Groups


Text artifacts can then be imported.

Tablets

I did have a spot of bother with forms being available in Tablet mode and ended up manually selecting them:

The end of the beginning

That should be it, I hope.  Is my PBCS-to-on-premises Vision the same barring the functionality that’s not quite there in on-premises?  

PBCS

On-premises


The Roses of Success

That was quite the trip but proof that with a few hacks here and there it’s actually quite easy to bring PBCS down to on-premises Planning.  

Only Oracle and your company’s contracts department know if any of the above is actually legal.  You Have Been Warned.

I love hacks, cf. the name of this blog.  When the hacks are useful so much the better.  Hopefully this post falls into that useful category.  I know it made me better understand PBCS and as the purpose of the Compleat Idiot’s Guide to PBCS is to educate us on-premises practitioners, I think it meets that condition.

Be seeing you.

The Compleat Idiot’s Guide to PBCS, No. 6 – Philip Hulsebosch and Valid Intersections


This is your host

Yr. Obt. Svt. may be the host but Philip Hulsebosch has again done all the work.  It is, as usual, top drawer stuff and yet another dive into the coolness that is PBCS.  
Sit back, adjust your comfy slippers (well, I’m writing this in my comfy slippers; hopefully you are in yours as well), and give this a read.  It’s well worth your time.
A note about comfy slippers – I take my sartorial hints from Fred MacMurray for which I blame a childhood of watching My Three Sons.  Check out this episode– not only are the slippers in play but so is the cardigan sweater throughout the episode.  Think about it – I largely work from home doing something that hardly anyone understands (Just what did Steve Douglas do for a living?  I couldn’t ever figure it out.), I wear slippers just like this (slate floors are cooooold), and the modern day equivalent of that cardigan. I’m livin’ the dream.
And with that minor digression about Cameron’s complete lack of fashion sense – unless you live as I do in 1961 – out of the way, on with Philip’s considerably more relevant work.

Valid Intersections.

The functionality of “Valid Intersections“, currently available in the “Planning Budgeting Cloud Service” (PBCS), is a subject worthy of review because it gives write access to a node of two or more dimensions and read access to those areas that are not defined in the set. Finally Planning can assign access to detailed member-level intersections – something we’ve been able to do in Essbase for a long time.
With Valid Intersections, we can define sections in the database as having “cross-dimensional irrelevance” and assign read-only access to them. By doing so, we prevent data from being stored accidentally where it should not be and enable the relevant slices for data input.
Let me give an example: when you have an account such as “Mobile Subscribers” and you have product categories “Mobile”, “Fixed Line” and “Cable”, then it is logical to have data only in the first category. One could create a Valid Intersection of the account “Mobile Subscribers” and product category “Mobile”. This would make the combinations with “Fixed Line” and “Cable” read-only.
Set Theory
Valid Intersections are based, like many of the functions in a multidimensional database, on Set Theory. Recently, when I was working on this subject of Valid Intersections, my son came into the office. In a parental attempt to motivate him for school subjects which might not be in line with his current interests, I told him that I was working with Set Theory again and how much I had benefited from learning about it in school. He said “interesting”, looked at my screen, and asked why I did not have a touch screen so I could just mark the intersections I needed in a graphical interface. That would be so much easier… well… a good question! Today, it is all about the interface.
Back to Set Theory – see, the stuff we learn in school really is relevant later in life no matter how much the 16 year old in us rebels. In Figure 1 you see a diagram with an intersection of two circles A and B. These represent Dimensions. In a Valid Intersection, you define A in a rule and B in a rule. As a result, the red area means write access, the white area means read access.
Figure 1: A Venn diagram illustrating the intersection of two sets. Red is the intersection of A and B and would result into write access.
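Philip’s Venn diagram translates almost directly into code. Purely as an illustration of the set logic – the member names come from his “Mobile Subscribers” example above, and the function is my own sketch, not anything PBCS exposes:

    # Sketch of the set logic only; PBCS evaluates this for you, nothing here is its API.
    rule = {
        "Account": {"Mobile Subscribers"},   # set A
        "Product": {"Mobile"},               # set B
    }

    def is_writable(cell, rule):
        # A cell is write-enabled only if it sits inside the intersection,
        # i.e. it matches the rule in every dimension the rule mentions.
        return all(cell[dim] in members for dim, members in rule.items())

    print(is_writable({"Account": "Mobile Subscribers", "Product": "Mobile"}, rule))      # True  -> write
    print(is_writable({"Account": "Mobile Subscribers", "Product": "Fixed Line"}, rule))  # False -> read-only

Extending the idea to more dimensions or more members is just a matter of growing the sets.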

How do I define “Valid Intersections”?

The functionality to define “Valid Intersections” is only available in the “Simplified Interface”. It is the user interface of the future for both the administrator and the power user. Over the last few days, I have worked extensively with it on a PC and I really like it. A tablet is also possible, but I love my large computer screen and keyboard. It makes me so much more efficient.
After logging into PBCS, select the Option “Console”, then select the third tab at the left side with the name “Valid Intersections” and it will open a window. Here you see existing “Valid Intersections” and at the top left a menu for adding or modifying them.
Figure 2: Overview of the Valid Intersections.
In this post, I want to show how to create “Valid Intersections” in the “Vision” application. In this example, the yellow-marked areas should be write-enabled for scenarios “Forecast” and “Plan”. The dark yellow areas should only have write access for “Forecast”. All other areas should have read access. In the rows is the dimension “Entity” and in the columns is the dimension “Product”.
Figure 3: Example to apply Valid Intersections in the Vision application.
By pressing the button “Create” a new Intersection can be created. First, you need to think about which Dimension should be the “Anchor Dimension”. This is the….
I will take the “Entity” dimension and along this dimension, I will create various rules.
Figure 4: Selection of the Anchor Dimension.
I will give it the name “Entity-Product”, so I can see in the name which Dimensions are used in the Intersection.
Figure 5: Option “Unselected members are valid” is activated.
I have enabled the option “Unselected members are valid”. This means that members of the anchor dimension which are not selected in any rule still count as valid and therefore stay write-enabled; the effect of this option is demonstrated in more detail near the end of this post.
The next step is to select the “Product” dimension, by clicking the “Add Dimension” button.
Figure 6:  Selection of an additional dimension.
I tag this dimension as required, so a selection is needed in every rule within this Valid Intersection.
Figure 7: Enabled the required option for this dimension.
Now, I can create the different rules. To do this, I click the “Add Rule” button and the selection window with the two dimensions opens. At the top of the window, you can switch between the dimensions while selecting the intersection. Check figure 8 for details.  Members can be selected with the checkmark next to the name, and functions are available at the right side of the member name.
Figure 8: Selection in the dimension “Product”.
At the right top side of the page, you can find the menu with the display options.
Figure 9: Display options.
When you are familiar with the member names, you can type them into the box. It is not necessary to select them in the member selection.
Figure 10: Editing the selection by typing.
Click the “Save and Close” button and the Valid Intersection is added in the list.
Figure 11: Valid Intersections. The top one is disabled.
Now let’s see this on an Ad-Hoc report. A refresh on an existing report shows the result. A logout and new login is not necessary. The entities “420“ and “421“ and the child products of “TP1“ were defined in a Valid Intersection. These are write-enabled, while the rest become read-only.
Figure 12: Result of the Valid Intersection definition.
Now I add a new rule to the Valid Intersection. I can edit the rule directly in the window and do not need to go into the member selector.
Figure 13: Adding a new combination of members as a rule.

A new query. The results become immediately visible.
Figure 14: Result of the changed Valid Intersection definition.
And now I add the next rule and hand-pick the members of the product dimension.
Figure 15: And we add another combination into a rule.
With this, the third section of rows has been defined.
Figure 16: Result of the changed Valid Intersection definition.
And now, I combine the remaining entities “410“, “450“ and “810“ with the dimension member “Product”. This is quick and dirty, but it works, because these members will become read-only.
Figure 17: Adding the last rule.
With this, all sections are now defined. Let’s check the results.

Figure 18: Result of the changed Valid Intersection definition.
I change the account. Good, the access rights do not change on the entities and products.
Figure 19: Result after selecting a different account.
Now I change the scenario. The write enabled sections do not change.
Figure 20: Result after selecting scenario “Actual”.
Now I have to differentiate between scenarios “Forecast” and “Plan”. In “Forecast” we wanted to add write access to the entities “410”, “450” and “810”. So we have to extend the current Valid Intersections.
You can easily duplicate a Valid Intersection. Then it is really easy to modify it.
Figure 21: Duplicate a Valid Intersection.
Figure 22: Result from the duplication step.
Then I will rename the intersection to “Entity-Product - Forecast“.
To differentiate between scenarios, I will need to add the Scenario dimension to this Valid Intersection. I will select the member “Forecast” in each row. In the last row, with the entities “410”, “450” and “810”, I replace “Product” with “Children(P_TP1)”.
Figure 23: Changed the selection in this new Valid Intersection.
Figure 24: Result after saving.
The month has been changed to “July” to see the effect on scenario “Forecast”, but as you can see in figure 25, the entities are not write-enabled! What is wrong?
Figure 25: Result of selecting scenario “Forecast”. Nothing has changed.
The reason is that both Valid Intersections add up. I will select “Budget” in the scenario and I see the rule with “Forecast” works.
Figure 26: Result after selecting scenario “Plan”. The intersections are adding up and scenario “Plan” is not in the intersection anymore.
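In code terms – again only my sketch of the behaviour the screenshots show, reusing the is_writable helper from the Set Theory example earlier – several active Valid Intersections combine like an AND across groups, with the rules inside each group acting as an OR:

    # A cell must satisfy at least one rule in *every* active Valid Intersection,
    # which is why the new Forecast-only group cannot open up cells that the
    # original Entity-Product group still marks as read-only.
    def writable_under_all(cell, intersection_groups):
        return all(
            any(is_writable(cell, rule) for rule in group)
            for group in intersection_groups
        )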

Figure 27: Menu options under “Action”.  In Englisch, the menu choices are:  edit, duplicate, delete, up, and down.  German is the Esperanto of the future.  
It is simple to enable and disable Valid Intersections. Just remove the green checkmark in the Activate column. As shown in figure 28, a green checkmark means it is enabled and a grey one means it is disabled.

  
Figure 28: Enable and disable a Valid Intersection.

I disable the Valid Intersection “Entity-Product”.
Figure 29: The first Valid Intersection is disabled.
And now, entity “410” in “Forecast” is write-enabled. So that is verified to be the reason. I will need to modify the Entity-Product Valid Intersection.  
Figure 30: Result when selecting scenario “Forecast”. Also entity “410” is write-enabled.
And in scenario “Plan” it is not, as can be seen in figure 31.
Figure 31: Result with scenario “Plan”.
In the Valid Intersection Entity-Product, I also add the dimension Scenario. In each row, I add “IChildren(Scenario)”. I enable this Valid Intersection after saving. Now, both Valid Intersections are active again.
These are the 4 rules now:
Figure 32: Valid Intersection 1 has been adjusted.
Refreshing the reports. Scenario “Forecast” is now correct.
Figure 33: The result in scenario “Forecast” looks correct.
Scenario “Plan” also looks OK.
Figure 34: Result in scenario “Plan”. It is correct, because entity “410” is read-only.
It is possible to reduce the number of rules to a minimum ….
Figure 35: The Valid Intersection has been reduced to the relevant part.
And it is also possible to combine both Valid Intersections into one.
Figure 36: Combination of the rules in both Valid Intersections into one Valid Intersection.
A check and the results stay correct!

The option Unselected members are valid.

To show the effect of the option “Unselected members are valid", I will extend the current report with additional entities. These entities are “815”, “830” and “840”. In the Valid Intersection, I select the anchor dimension Entity and open the menu. Here I make sure the option “Unselected members are valid" is enabled.
Figure 37: Entity with the option “Unselected members are valid".
The members “815”, “830” and “840” are write-enabled in the scenario “Plan” and “Forecast”.
Figure 38: The members “815”, “830” and “840” are write-enabled.
Figure 39: The members “815”, “830” and “840” are also write-enabled in Forecast.
I will switch off the option “Unselected members are valid". This means that the members not selected become read-only.
Figure 40: Entity with the option “Unselected members are valid" disabled.
The result is as expected. The entities “815”, “830” and “840”, which are now excluded from the selection set, are read-only.
Figure 41: The entities “815”, “830” and “840” are read-only in the scenario “Forecast“.
Figure 42: The entities “815”, “830” and “840” are also read-only in the scenario “Plan“.
Now we have reached our goal which is displayed in figure 43.
Figure 43: Overview of the write enabled areas for scenario “Plan” and “Forecast“.

Conclusion

Valid Intersections give us a very nice form of access control. It can be defined down to the single-cell level, which was not possible until now in Oracle Hyperion Planning. This functionality is available in PBCS and will likely be coming soon to the on-premises version as well.
I did not check whether Valid Intersections also work with data loads. In data forms it is working very well and I hope this post helps you on this subject.

Yr. Obt. Svt.‘s conclusion

I’m not sure what is more amazing:  Planning finally supporting ANDs in security or Philip’s ability to explain a complex concept and process in a clear way.  Okay, what’s even more amazing is his willingness to do this as a guest writer.  This was no quick note as Word tells me this is page 20, which is proof, if proof were needed, that his commitment to the Oracle EPM community’s education is deep and sustaining.
Philip’s clients are lucky to have him.  We’re lucky to have him as a blogger (in Englisch), writer, and presenter.
Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 7 -- Smart Forms are Smart


A triumph, a warning, and a lament

Yes, it’s another one of my “All of Gaul is divided into three parts” posts.  Or maybe it’s my Descartesian nature.  Or maybe PBCS’ Smart Forms simply lend themselves to threes (four but I can’t count and it breaks the analogy).  Read on, Gentle Reader, and you may agree.

What makes Smart Forms so smart

When I first heard Shankar Viswanathan, Planning’s product manager, talk about Smart Forms at ODTUG’s Kscope15 (Or was it Kscope14?  It all blurs together after a while.) I thought he was off his rocker/absolutely brilliant because what he described was using Excel to build Planning forms and use Excel logic within Planning itself.  Howzat supposed to work?

At first I thought he meant we’d be able to create traditional forms in Smart View instead of Planning’s web interface the way the Planning admin extension to Smart View provides but that is wrong, wrong, wrong or at least partially, partially, partially right.  What Shankar actually meant is that we’d be able to create forms (so that bit I had right) and calculations in Excel and have those both persist in Planning.  I thought this would be really cool – we could (actually we can) create calculations using Excel’s functions instead of Planning’s form calculations and then have them be available for other users.  Even better, those Excel formulas would get translated into web-based forms.  Could it really work that way?  The answer is yes.

You can read through this blog (please, else why should I have spent a day of my all too short life writing this), but also have a read of the rather excellent PBCS documentation on it as well as I’ve undoubtedly missed something.

And it’s dead easy.  Let’s walk through the process, shall we?

Step 1

Smart Forms start out in – wait for it – Smart View.  That’s right, unlike every other kind of form they cannot be created in the web interface but instead must originate from a Planning ad-hoc query.  

Ad-hoc queries can either be created from a single cell retrieve or via the conversion of a proper form into an ad-hoc query.   In the screenshot below, I picked a Nice N’ Easy Income Statement form from PBCS’ Vision application.  

Step 2

I then converted the form into an ad hoc view by right clicking on the form name and choosing Ad hoc analysis.

I also could have started the ad hoc analysis by creating a new tab and then selecting Ad hoc analysis.  

The end result is the below:

I then did a mild bit of moving members about and created a four column report with a 2nd Half member in Column D with the super-sophisticated formula of SUM(B3:C3).  Silly, yes, but it prevents me from having to go into the outline and add that number as I only care about it once in a Blue Moon.

NB – To get this to work, I have to put a label on the column.  If that isn’t done the Smart Form won’t work.

Step 3

Once I have an ad hoc sheet with a formula, I simply save the analysis as a Smart Form using “Save As Smart Form”.

Step 4

Planning converts the ad hoc grid to a form, nicely centers the year across columns B through E, and then gives a visual cue that column D is calculated in Excel, not Planning itself by shading “2nd half” in orange and the calculated values  from D3 to D12 in green.

Big deal you may say but you’re missing three (see, Descartes does have a role in this post) points:
  1. That calculation is now part of a form that you can pull onto any new sheet just as you wish.  The calculation is available to all with no need to go into ad hoc mode.
  2. That calculation is now in Planning Simplified Interface’s web forms.
  3. The need to use Planning grid formulae is gone, gone, gone.

Here’s what it looks like in the Simplified Interface:

There’s my 2nd half column in Planning itself.  That means that the boffins at Oracle have managed to convert Excel formulae into Planning’s web interface.  Pretty cool.

One big caveat – these forms are only available in the Simplified Interface.  Oh, you can see them in Workspace but click on them and Wait for Godot.

Coming back to the Simplified Interface, it seems logical that editing the form further should be possible.  

Alas and alack, that isn’t the case.  It’s a bit difficult to see but the form definition below is read-only.  I don’t think I’ve ever seen that in on-premises Planning.
Note also how column A is Q3 and Q4 and column B shows only the dimension name “Period”.  However PBCS translates Excel into the web interface, we’re not going to see it.

Coming back to Smart View, as this is a form I can change the Product dimension from T_TP (I think that’s Total Product.  Sorry, I was too lazy/sloppy to make sure that the Default Alias was used) and just like a form it changes.

And if I want to convert this form back into an ad hoc analysis, that’s available to me as well.  The formula that I created when I saved the original ad hoc view is retained.

Where exactly does this ability to convert Excel formulae into a Smart Form begin and end?  There are lots and lots and lots of formula functions in Excel – that’s one of its many strengths.  Unfortunately, not all formula functions are supported and if there’s a list anywhere of what works and what doesn’t I was unable to find it.

Arrrgh

Just what is supported?

An example of this is what I thought to be a rather clever (Clever is as clever does but I don’t actually think this is all that clever.) use of Excel to predict Net Income through Excel’s TREND function using the least squares method.  Think of it as a poor man’s Predictive Planning.

I will gloss over the amount of time it took me to actually figure out how to use TREND and show you instead the super awesomeness that I created:
   

That’s a bit hard to see but I wanted to show you how TREND does a halfway decent approach to fitting a curve.  It’s not a patch on Essbase’s own BSO @TREND function but it is a lightweight approach that might only make sense in a handful of forms or even just one – a great use case for Smart Forms.

Or so I thought because when I try to save this as a Smart Form I get this:

Bugger.  So that’s at least one formula that isn’t supported via Smart Form.  It sure would be nice to have that list of supported/unsupported.

No pretty pictures

One other thing doesn’t work:  applying formats.  According to the documentation, I should be able to apply formatting to both Planning (Essbase really) data as well as the Excel-derived cells.  Here’s what I get when I try:
Oddly enough I can use the Apply Formats function for the normal Planning cells but again that doesn’t work with the Excel-derived cells.  Bugger yet again.

Conclusion

Other than that, everything works, and two minor bugs (one’s a function of missing documentation and the other really is a bug, although I can’t find it in MOS) in what is pretty new functionality isn’t half bad.

I’ll take the compromises – I now never, ever, ever have to remember Planning grid formulas, I can do all sorts of neat-o labels, I can build it from an ad hoc view to a Smart Form back to an ad hoc form – that is pretty cool.

Smart Forms are Yet Another Feature That PBCS Has That On-Premises Doesn’t aka YAFTPHTONPD pronounced “yaphtopd”.  Oracle, please, please, please bring this to on-premises.  The rest of your customers are waiting with bated breath.  It’s that good of a feature.

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 8 -- Supported Smart Forms Excel Functions


What’s in, what’s out, and what’s not official (that would be all of it)

I’ve done this before, and I’m doing this again:  this blog is (like many if not all of my posts) information that is not in any way supported by Oracle.  Do not go to Oracle whining about, “Cameron said this worked, but it doesn’t, so I hate you Oracle, blah, blah, blah, blah, blah,” as I am telling you that what you read below is unsupported, unofficial, incomplete, tied to the PBCS release of today, 22 March 2016, unknown to anyone at @oracle.com, etc. In other words, enjoy and maybe use the below and don’t have a conniption if it all goes sideways on you.

You Have Been Warned.

The warning is over, here’s the cool stuff

Could whining actually be an effective approach?  Maybe.

In my last PBCS post on Smart Forms in Smart View, I whined the following:
“Where exactly does this ability to convert Excel formulae into a Smart Form begin and end?  There are lots and lots and lots of formula functions in Excel – that’s one of its many strengths.  Unfortunately, not all formula functions are supported and if there’s a list anywhere of what works and what doesn’t I was unable to find it.”

And then went on to whine:
“Bugger.  So that’s at least one formula that isn’t supported via Smart Form.  It sure would be nice to have that list of supported/unsupported.”

Yes, I am a whiner, to the detriment and annoyance of all who know me.  And yes, whining seems to be a theme in this post.  Perhaps I have done it recently and feel guilty about it?  The Psychology of Cameron is a frightening thing.

Now to the cool stuff

How about that currently supported/but use at your own risk list of functions?  Ask and ye shall receive.
Very, very, very nice and an awful lot of them to boot.

The list, grouped by category:

Financial (48):  ACCRINT, ACCRINTM, AMORDEGRC, AMORLINC, COUPDAYBS, COUPDAYS, COUPDAYSNC, COUPNCD, COUPNUM, COUPPCD, CUMIPMT, CUMPRINC, DB, DDB, DISC, DOLLARDE, DOLLARFR, DURATION, EFFECT, FV, FVSCHEDULE, INTRATE, IPMT, IRR, ISPMT, MDURATION, MIRR, NOMINAL, NPER, NPV, PMT, PPMT, PRICE, PRICEDISC, PRICEMAT, PV, RATE, RECEIVED, SLN, SYD, TBILLEQ, TBILLPRICE, TBILLYIELD, XIRR, XNPV, YIELD, YIELDDISC, YIELDMAT

Information (2):  ISERR, ISERROR

Logical (6):  AND, IF, NOT, OR, TRUE, FALSE

Math and trigonometry (48):  ABS, MOD, PI, PRODUCT, ROUND, SUM, TRUNC, ACOS, ACOSH, ASIN, ASINH, ATAN, ATAN2, ATANH, CEILING, COMBIN, COS, COSH, DEGREES, EVEN, EXP, FACT, FACTDOUBLE, FLOOR, GCD, INT, LCM, LN, LOG, LOG10, MROUND, MULTINOMIAL, ODD, POWER, QUOTIENT, RADIANS, RAND, RANDBETWEEN, ROUNDDOWN, ROUNDUP, SIGN, SIN, SINH, SQRT, SQRTPI, SUMSQ, TAN, TANH

Statistical (6):  AVERAGE, AVERAGEA, COUNT, COUNTA, MAX, MIN

Date and time (18):  DATE, DAY, DAYS360, EDATE, EOMONTH, HOUR, MINUTE, MONTH, NETWORKDAYS, NOW, SECOND, TIME, TODAY, WEEKDAY, WEEKNUM, WORKDAY, YEAR, YEARFRAC

Awesome is the only word that describes this function list.  Note that it’s 128 functions long.  Think of the Excel formulae you could write, think of the rich functionality in Excel, think of how many of these functions are not supported in BSO and then think about how many are not supported in MDX.  

Conclusion and a whine

And that Gentle Reader, is why Smart Forms are 100 (128?) times better than traditional form formulae.  Think about what you could write as one offs in a form.  No more creating a member in the Account dimension that only gets used once and is deadwood otherwise.   Instead, create the base members you need to perform the calculations and use the quite considerable power of Excel to do the heavy lifting.  Remember that Smart Forms are easily made into ad hoc forms thus keeping the calculated member in play.  Also remember that these functions are available in the Simplified Interface.  Happy times indeed.

One last whine:  when oh when oh when will we see this in on-premises?  I love, love, love the idea of Oracle writing this stuff as it is top drawer functionality but thus far it is for PBCS only.  Please Oracle, for the balance of customers who are not on The Cloud, bring this to on-premises so all of your customers can benefit from this functionality.  

Be seeing you.

The Compleat Idiot's Guide to PBCS No. 9 -- The A to Z of EPM Automate


The sum of more than its parts

In the on-premises Planning world we have access to a variety of EPM script languages:  Essbase’s MaxL (probably the most commonly used), Shared Services’ LCM utility, and the many Planning utilities.

In the PBCS world we have precisely none of that.  No MaxL, no LCM utility, no suite of Planning utilities.  How oh how oh how are we supposed to manage PBCS except through Workspace or the Simplified Interface?  The answer is EPM Automate.

Death of a thousand cuts aka Yet Another EPMA Rant by Cameron

I am happy to relate that the first four characters of EPM Automate do not in any way, manner, or form refer to some kind of zombie-like continuation of EPMA.  As worthy as that product’s genesis may have been (and I do actually think the idea of a single metadata editor across multiple products has quite a bit of merit), its execution has been an unending source of pain, confusion, and anguish since the first day I (and practically everyone else I’ve met) set eyes on it back in 2008.  Whew, that was quite a rant and whoever owns it at Oracle, I’m sorry, but your product has caused quite a few of my grey hairs.

#NoAccess

Pointless but deeply satisfying rants aside, there are several problems with PBCS automation:  there’s no way to actually connect to the server OS – no terminal, no OS user id, no OS password, no view of the server file system; the lack of live connectivity means that automation has to run from a client machine, not the server itself; and, lastly and most importantly, the suite of automation tools we use on a regular basis doesn’t exist.

What are we trying to do

If we are to consider a typical Planning administrative use case of a monthly load of actuals into a forecast scenario, it will require the following steps:
  • Partially clearing a Plan Type of the forecast out month(s).
  • Loading metadata and pushing out a refresh.
  • Loading data.
  • Running a calculation.

Those steps in turn require:  
  • MaxL to clear via a calc script,
  • outlineload.cmd to load metadata,
  • REFRESHCUBE.cmd to refresh the Planning application
  • MaxL again to run an administrative calc script (yes, you could do this with a Calc Mgr rule but I’ve never seen it).

All of that can be done in EPM Automate and you’ll see this bit by bit over the next few weeks but first we (or at least I) need to understand some kind of mapping between the on premises and cloud world.
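To give you a sense of where this is heading, here’s a rough sketch of the shape such a script might take.  This is a hedged example only:  the password file, URL, identity domain, job names, and rule names are all placeholders I’ve invented, and the exact options should be checked against the EPM Automate documentation for your release.

:: Hypothetical batch sketch of the monthly Actuals-into-Forecast load via EPM Automate
:: Everything after the command names below is made up for illustration
call epmautomate login serviceAdmin C:\Automate\password.epw https://planning-test-mydomain.pbcs.us2.oraclecloud.com mydomain
call epmautomate uploadfile C:\Automate\Product.csv
call epmautomate importmetadata ImportNewProduct Product.csv
:: Depending on release, refreshcube may want the name of a pre-built refresh job
call epmautomate refreshcube
call epmautomate runbusinessrule ClearCurrentMonth
call epmautomate uploadfile C:\Automate\ActualsJul.zip
call epmautomate importdata LoadForecastData ActualsJul.zip
call epmautomate runbusinessrule AggregateCurrentMonth
call epmautomate logout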

The universe of commands in the other tools is too large to manage or even comprehend – I’ve been working with Planning since 2001 and I’d guess at least 30% of the command line grammar is still terra incognita to me – I’ve never used SortMember or TaskListDefUtil or DeleteSharedDescendant – let alone all of the many, many, many commands in MaxL.  Instead what we’ll do is look at what EPM Automate can do and match it to the commands that exist in the on-premises world.  From that I’ll have another post on what it actually takes to follow that use case with a few surprises along the way.

With that, let’s do that compare and contrast exercise.

EPM Automate commands

A quick note about documentation – Oracle has done sterling work with their PBCS documentation and EPM Automate is no exception.  I encourage you to peruse Working with EPM Automate for Oracle Enterprise Performance Management Cloud in addition to the genius (ahem) below.

Applicable across all services

NB – These seem awfully PBCS-specific despite the documentation’s claim otherwise.  Having said that, as I’ve never used anything other than PBCS, I could yet again be 100% wrong.  Those who take glee in correcting me, please write a comment to this blog.

NB No. 2 – If I only RTFMed, I’d have seen that there’s a whole section on Account Reconciliation Cloud that does have commands of its own.  Okey dokey, that means these are global commands.

Each EPM Automate command below is listed with its closest on-premises equivalent and a comment:

  • help – On-premises equivalent:  most if not all utilities offer some kind of terse help.  Comment:  command help is what it is.  MaxL is pretty bad at this.
  • encrypt – On-premises equivalent:  Planning has its own PasswordEncryption utility; MaxL has for ages had its own approach.  Comment:  whatever you do, if there’s a plain text file that is used to create the encrypted password, move it off the server/client to some secure place.
  • login – On-premises equivalent:  again, duplicated in Planning utilities and MaxL.  There’s but a single password for EPM Automate.  Comment:  in on-premises it’s possible – likely even – to have different passwords for Planning and Essbase.  That isn’t the case with PBCS.  You’ll have to decide if that’s a good or bad thing.
  • logout – On-premises equivalent:  a lot of the Planning utilities are run from a command line and there is no logout after the process is complete.  Comment:  MaxL requires either a logout or exit command.
  • uploadfile – On-premises equivalent:  load metadata via outlineload.cmd, data via Essbase/MaxL.  Comment:  the concept of uploading data or metadata to a folder for further processing doesn’t apply to on-premises.  One could look at OS file system folders to do the same thing but they are not intrinsic to Planning.
  • downloadfile – On-premises equivalent:  DATAEXPORT BSO calc command, LCM utility.  Comment:  again there is no concept of a folder in the on-premises world.  Up to three different utilities are required to duplicate EPM Automate’s functionality.
  • listfiles – On-premises equivalent:  OS folders only.  Comment:  you could use file system folders in the OS to sort of, kind of duplicate this functionality but really there is no direct analogue.
  • deletefile – On-premises equivalent:  LCM utility, OS deletes.  Comment:  again, not a great match between the two worlds because of the folder construct.
  • exportsnapshot – On-premises equivalent:  LCM EPMExportAll utility.  Comment:  this is a pretty tight match in functionality.
  • importsnapshot – On-premises equivalent:  LCM EPMImportAll utility.  Comment:  again a pretty good match.
  • recreate –f – On-premises equivalent:  CubeRefresh utility, sort of.  Comment:  CubeRefresh can recreate the Essbase outline whilst getting rid of any hard-earned calc scripts, load rules, etc. (why would anyone ever want to do this beyond the first go round?).  What it can’t do is delete the app as recreate –f can.
  • feedback – On-premises equivalent:  N/A.  Comment:  call your friendly Planning Product Manager (not much chance of that really) or submit an enhancement request to Oracle Support.
  • resetservice – On-premises equivalent:  bounce the Planning service.  Comment:  this is a complete PITA in on-premises because it will affect all Planning apps.
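To put the session-oriented commands above in context, here’s a minimal batch sketch.  The password, key, file names, URL, and identity domain are all invented for illustration.

:: One-time setup:  create an encrypted password file
call epmautomate encrypt MyP@ssword myEncryptionKey C:\Automate\password.epw
:: Then a typical session:  log in, poke around the inbox/outbox, tidy up, log out
call epmautomate login serviceAdmin C:\Automate\password.epw https://planning-test-mydomain.pbcs.us2.oraclecloud.com mydomain
call epmautomate listfiles
call epmautomate uploadfile C:\Automate\ActualsJul.zip
call epmautomate deletefile SomeOldUpload.zip
call epmautomate logout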

PBCS-specific commands

NB – Now we really are getting to functionality that only makes sense within PBCS.

Two notes

A note about the oft-referred to Jobs.  Jobs are actions that are used to support EPM Automate functions, e.g. a job to import data into an application.  This concept just doesn’t exist in on-premises.

As with on-premises Planning, Calculation Manager Business Rules are used to perform all calculations.  Essbase calc scripts are not available in PBCS.  There’s no reason to use MaxL to call a script as there are no scripts to call.

Again, each EPM Automate command with its on-premises equivalent and a comment:

  • importdata – On-premises equivalent:  MaxL, outlineload.  Comment:  the on-premises (and PBCS) native data load functionality is so brain-dead it beggars belief.  Really, it’s just awful.  At least the two are at parity.  :)
  • exportdata – On-premises equivalent:  MaxL, outlineload.  Comment:  ibid.
  • refreshcube – On-premises equivalent:  CubeRefresh.  Comment:  there are far fewer options in EPM Automate, e.g. there is no ability to refresh just the filters.
  • runbusinessrule – On-premises equivalent:  CalcMgrCmdLineLauncher.  Comment:  the functionality is at parity, right down to ways to pass RTP values.
  • runplantypemap – On-premises equivalent:  PushData.  Comment:  the functionality is at parity.
  • importmetadata – On-premises equivalent:  OutlineLoad.  Comment:  the functionality is at parity except outlineload.cmd can read from SQL.
  • exportmetadata – On-premises equivalent:  OutlineLoad.  Comment:  the functionality is at parity except outlineload.cmd can write to SQL.
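The application-level commands follow the same job-driven pattern; here’s a hedged sketch of the ones the monthly use case doesn’t touch.  The job and map names (ExportProduct, ExportForecastData, PushToASO) are invented and would have to exist as jobs in the application first.

:: Assumes an already-logged-in EPM Automate session and pre-built jobs with these made-up names
call epmautomate exportmetadata ExportProduct Product.zip
call epmautomate downloadfile Product.zip
call epmautomate exportdata ExportForecastData ForecastData.zip
call epmautomate runplantypemap PushToASO clearData=true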

Data management specific commands

As you may have noted in my rant on importdata, to state that the native data load to Planning is flawed is to be excessively kind.  I can’t think of one system I’ve ever come across that uses it and I’ve been working with Planning since 2002.

To that end, and because FDMEE is the data integration tool of choice for EPM, PBCS uses FDMEE.  The EPM Automate commands reflect that approach.

And the Data Management commands:

  • rundatarule – On-premises equivalent:  FDMEE’s loaddata script.  Comment:  EPM Automate cannot load metadata through this command; FDMEE’s on-premises loadmetadata can.
  • runbatch – On-premises equivalent:  runbatch.  Comment:  on-premises requires a username and password in addition to the batch name.  This isn’t required in EPM Automate because batch execution takes place within an already-logged-in session.
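Again as a sketch only – the data load rule name, periods, file name, and batch name below are placeholders, and the import/export mode keywords should be checked against the documentation for your release:

:: Data Management examples; rule name, periods, file, and batch name are made up
call epmautomate rundatarule ActualsLoad Jul-16 Jul-16 REPLACE STORE_DATA ActualsJul.txt
call epmautomate runbatch MonthlyActualsBatch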

That’s just the beginning

You now have yr. obt. svt.’s take on EPM Automate.  I rather like it because I no longer have to mash together a bunch of tools to get a common Planning automation working.  I really like the fact that I don’t need to run the Planning utilities from the Planning server itself.  Explaining that to a very skeptical/hostile IT department is not one of my favorite Planning tasks.

I’ll walk through some, but not all, of these commands in my next post as I do a compare-and-contrast approach to my on-premises version of PBCS’ Vision application and the real PBCS deal.  

One note:  pity my younger, smarter, not-in-any-way-from-the-same-parents brother and fellow ACE Director Celvin, as I asked him to confirm a few statements.  I did warn you that I am a Compleat Idiot when it comes to PBCS but I thought it’d be nice if I didn’t spread misinformation.

Be seeing you.

The Compleat Idiot's Guide to PBCS No. 10, PBCS and metadata


The Big One

In some ways I really enjoy writing these long ones – I get to really dive into a subject, I absolutely learn how to do something, and in theory I am imparting valuable information to you, Gentle Reader.

On the other hand, this sort of thing takes time.  Lots of time.  There’s stuff I have to learn (I’m not calling this the Compleat Idiot’s Guide for nothing), dreaming up use cases, actually performing the example, and then writing it all up.  

All of the above is an excuse for not getting something out quicker.  OMG this took a while what with paid work (I am trying to reverse my normal “never let paid work get in the way of unpaid work”), in theory having a life, and oh yeah preparing for my Kscope16 sessions.  Whoops, right back to my work-before-and-after-work.

Regardless of my seemingly non-stop whining over this, for your reading pleasure below I’ve tried to compare and contrast what it’s like to do a common Planning administrative task of monthly Actual loads into a forecast scenario in both PBCS and on-premises 11.1.2.4 Planning in an automated fashion.

To do that, I must:
  1. Load new metadata
  2. Refresh the Planning application
  3. Clear out existing Actual data just in case there’s anything there.  No, there shouldn’t be but one never knows if there’ll be more than one pass.
  4. Load data.
  5. Aggregate the database

Happily, most of this is very similar (so what’s my excuse for not getting this out weeks ago?) to on-premises Planning, so the only thing we will really have to consider is how data gets loaded via the concept of inboxes.

Splitting things into two and more

I’m going to write this series in two major and many minor parts.

The two main themes:
  1. What are the user interface methods of doing this?  There are a lot of steps here and a few surprises along the way.
  2. How do I automate this in both environments?  What seems to be a bit of a disappointment in the first part turns out to be worth it in the second.
^^^See how I tease or at least hope I tease you into sticking around for both major themes?  I try.

As there is so much to show, I am going to split that into many posts.  Why do that?  Mostly because there are so many, many, many steps that I want – and I think in the SI world you need – to show in excruciating detail.  I for real and for true took 99 screenshots of this process to illustrate how to do things and I simply can’t put all of that into one post or even two.

Before you dive any further, know that this post is long, like really long, like OMG how long is it?  Long.  I did that for two reasons:  to show you exactly how it’s done and to drive home again and again how different and frankly clickicious (just made that one up) the Simplified Interface can be.  Different is another word for mind-bogglingly click happy.  You’ll see.

What’s the use case?

Simply the export and import of metadata in both on-premises Planning and PBCS.  For those of us Compleat Idiots (I mostly refer to myself in this regard), the on-premises bit is old hat but as we delve into PBCS you’ll see that isn’t the case at all.

One other really important bit

Every act I take in PBCS in this post is, with a few exceptions, just about the same as on-premises when PBCS’ Workspace mode is used.

Where there are differences I will point them out but there is generally administrative parity between on-premises Workspace and PBCS’ Workspace.

The Simplified Interface (SI) will differ quite a bit as we shall see.

I want doesn’t get

Wouldn’t it be nice if PBCS supported EAS?  Wouldn’t it?  It doesn’t.

But as I’ve gone and converted PBCS’ Vision application to on-premises, I now have quasi-PBCS-enabled-Vision on my local VM and once there I can view the Plan Types in Essbase form, just as Bob Earle decreed.  ←Technically, he didn’t write EAS, but he did write AppMan, and if he were still in this space he’d be an advocate of real outline editors.  Or not ’cos he likely really does have a life.

Let’s take a look at the Product dimension.  Lots of lovely computers, but no Tricorders?  My Inner Spock rebels at the thought.

Is that the case in PBCS?   It’s the same but that should be no surprise as I’ve not touched this dimension.

Export it out

I can never remember the format of outlineload.cmd (or in this case, Planning’s UI) metadata file format.  Rather than try to tough it out or – gasp – RTM, I will instead take the lazy geek’s way out and export the metadata from Planning itself.  I can then modify it and be on my merry way.

Workspace

On-premises

I’ll export the dimension through Administration=>Import and Export->Export Metadata to File

Pick the Product dimension:

Save the file as a .csv file:

Once saved, the text file is as follows:

Not easy to read but all there.

PBCS

Workspace

It’s the same.  That’s it.  I told you there was parity on most administrative functions.

Simplified Interface

SI is quite a bit different, and there are two ways to do this.
  • Through local files
  • Via a PBCS job

Local files are easier but are one-offs whereas PBCS jobs can be automated.  Let’s examine the local approach first.

All news is local

Having selected the most awesome Candy Crush (I stole that description from He Who Shall Remain Nameless) SI Console, I then click on the Export button:

And then click on the dimension I want to export (Product) and then the Export button itself:

Note that, unlike Workspace, I can define what delimiters I want to use.  In on-premises and PBCS Workspace, it’s comma or nothing.

The result is the same export dimension metadata except it now comes across as a .zip file:

Why bother making it zipped?  Part of that has to do with moving files across an inter- rather than intra-net connection.  The other, rather more interesting part is that it is possible to export more than one or indeed all of the dimensions at the same time and even export them as a mix of comma and tab and who knows what else delimited outputs.  Choice is good.

In any case, here it is as a single dimension zip file:

Get a job

The local approach works but it’s strictly manual just like the Workspace approach is.  I can hear you say, “But Cameron, all actions in the User Interface are manual.  That’s the quintessence of something done in any kind of GUI.  Duh.”  And  Yr. Most Humbl. & Obt. Svt. agrees; that’s why this next, rather more convoluted Job approach is worth the pain.

Here’s my take on a Job (and I imagine there are better descriptions out there but this is what’s worked for me):  a PBCS Job is an administration process definition.

Setting the job itself up is a complete PITA and unfortunately not something we’ll see the benefit of till the next installment (or two or three or whatever down the line) but trust me on this, it’s eventually worthwhile.

Using the same use case, I’ll select the Product dimension and then click on the “Save as Job” button:

I’ll name the job (you can’t see it from this screenshot but it’s called ExportProduct):

Ta da, a job to export metadata has been created:

How does one run a job?  It’s a defined administrative task which is just dandy, but there’s no point to a process that can’t actually be run.

The way to run a job is to navigate to the calendar-ish (there’s a hint there) Job tab and then click on “Schedule Job”.

Remember that comment about calendars?  That’s one of the nice things about a job – it can be scheduled and a variety of tasks can be scheduled including the Export Metadata task that we need.

Don’t miss the ability to schedule tasks.  This illustration doesn’t take advantage of that but know that this is something completely alien to on-premises Planning which relies on OS schedulers.

For the time being, let’s just run the job right now:

And walk our way through the wizard:

One last check to see if this is really and truly something I want to do:

There’s my job:

I can check the status:

And then go look in the Inbox/Outbox Explorer – that’s where the dimension export will have been written to:

Download the file:

Finally, here we are:

Importing metadata

Good grief that was painful (and long – we’re at page 20 in Word).  It is pretty much the same story on the way back in.  I’m not going to walk you through all of the painful detail – I’ll give you a taste but the Workspace/Local/Job concepts are the same.

Here’s the import file:


That’s hard to read, so here’s the almost-easier-to-read text word-wrapped many times:
Product, Parent, Alias: Default, Alias: English, Valid For Consolidations, Data Storage, Two Pass Calculation, Description, Formula, UDA, Smart List, Data Type, Hierarchy Type, Enable for Dynamic Children, Number of Possible Dynamic Children, Access Granted to Member Creator, Plan Type (Plan1), Aggregation (Plan1), Data Storage (Plan1), Formula (Plan1), Plan Type (VisASO), Aggregation (VisASO), Data Storage (VisASO), Formula (VisASO), Solve Order (VisASO), Plan Type (Vis1ASO), Aggregation (Vis1ASO), Data Storage (Vis1ASO), Formula (Vis1ASO), Solve Order (Vis1ASO)
P_170,P_TP1,Tricorder,Tricorder,false,store,false,,<none>,,,unspecified,none,,,,true,+,store,<none>,true,+,store,<none>,0,true,+,store,<none>,0

What could possibly not be clear about that file format?

At least it’s easy to see the Tricorder.
Workspace
It’s the same simple task it was in exporting only the other way round.  Yeah, I know, duh, but it really is simple.

Pick the file:

Click on Update to make it all happen:

And there it is:
‘Cos I am a compleatist (groan) I will show the last few steps.  

Kick off the refresh:

Do I want to do it?  Yes I do.

And I did.

Simplified Interface

Local

Back to the Console/Application tab, then Dimensions, and then click on Import:

Click on Create

Then on the dimension itself and then browse for the file:

Note that I am importing a .csv file, not a zip file:

Then click on Import.

Clicking on Import gives me the choice of refreshing the application or not.  I vote Refresh.

Did it refresh?  All that I know is that the metadata job was submitted.  So did the refresh happen or not?

If I go to Jobs and look at the activity pane, I see that indeed the refresh has happened.

And there’s my Tricorder.  Spock, eat your heart out.

Explicitly with a job

This time I’m going to explicitly move this file to the Planning Inbox.

Pretend that I’ve shown you how to navigate to the Console, click on Dimensions, Import, and Create.  

To get this into the Inbox/Outbox, click on Planning InBox, provide the file name (no browsing via Windows Explorer), and then Save as Job.


I again have the choice of refreshing the application or not but this isn’t an immediate act.  Instead, I am tying the refresh to the job ImportNewProduct.  

Yes, I have defined a job, but no, the actual file hasn’t been uploaded to PBCS.  To do that, go back to the Application tab and then select Inbox/Outbox Explorer.

Once inside, click on the job and then Upload.

Browse to the file.

Upload it.

You’ve managed to do it.

Let’s take stock:  we have an import metadata job, a file loaded to the Inbox tied to that job, and so now need to run that import.  To do so, go back to the Console and select the Jobs tab, then click on Schedule Jobs.

Pick the job type and run it right now.

Pick the just created ImportNewProduct job and then click Next.

Click on Finish and the job will run.

As can be seen, both the load and the refresh are happening.

And it’s done.

Import without Refresh

To define the Refresh as a separate act (remember how multiple dimensions can be defined and loaded in a single job?), I’ll not tick the box “Refresh Database” when I create the job.

Again, going back to the Console, select the Refresh Database selection from the Actions dropdown.

Yup, you’ll have to create a job.  Once again, there is the choice of instant gratification or defining this as a job.  Note the options for performing a refresh.  Would that on-premises Planning had these options for managing users.

Let’s do it right now.  

The refresh is ongoing.

And done.

But what of the alternative of going all the way with this as a job?  Click on that Save as Job button and name it.

And then run it.

Right now.

Yup, we go through the wizard again.

And again.

Check again.

It’s done.

For giggles, let’s go see the dimensionality in Smart View as I am getting a bit punch drunk (I’ll bet you are too) with the SI.

Yippee!  We have a Tricorder of our own.

Still here?

You’re a brave one for persevering through all 55 pages of screenshots.  This is the longest post I’ve ever written and I hope to never write one as long again.  Amen.

But this pain has brought home a couple of things when it comes to the steps an administrator has to go through to perform common tasks.  Btw, steps, clicks, stares of despair at yet another screen all mean approximately the same thing here.  As I noted up above, this is a long post and I’m tired.  So sue me.

Let me count the ways

Dimension export

  • PBCS’ (and on-premises) single dimension export took five steps.
  • PBCS’ Simplified Interface, local edition, took four steps.
  • Using PBCS’s job approach, we’re approximately at 22 steps.  ←I could be wrongish on that as it’s late and all of this is a bit of a blur but the number is close.

Dimension import

  • PBCS’ and on-premises’ Workspace dimension import took six steps.
  • PBCS’ local approach took 11 steps.
  • PBCS’ job approach with a refresh tacked on took 22 steps.

Database refresh

  • PBCS’ and on-premises’ Refresh database is four steps.
  • PBCS’s Simplified Interface run-it-right-now process  is four steps.
  • PBCS’s job approach is seven steps.

Counting the cost

If we do the math, we’ll see that to export, import, and then refresh takes:
  • Workspace – 15 steps
  • PBCS’ local approach – 19 steps
  • PBCS’ job approach – 51 steps

51 steps vs. 19 or 15.  Who would go down the path of jobs and why? 

What’s next

You thought we were done?  No such luck, I’m afraid.  Per the steps written lo so many days ago at the tippy-top of this post, we still have to tackle clearing data, loading data, and then aggregating.  Happily, the business rules side of things is much easier although alas the import of data still has the same multiple path approach.  It’ll be long but not nearly so much as this post.

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 11, PBCS and data


Short(ish) and sweet(ish)

The previous post on this was long, too long maybe (Maybe?  Definitely.) and certainly the longest post I’ve ever written if counting the pages in MS Word is a guide – 56 – and that is too much of a good thing. In my defense, the length came from trying to show you all of the different ways PBCS supports loading metadata.  I’m going to try to keep this a bit shorter although the same approach of using graphical how-to between Workspace and the Simplified Interface will continue.

This post is going to show how to follow the next steps in loading Actuals into a forecast:
  1. Clear out the current Actual month just in case there’s something there or if a reload is required.  ←No, I’m not going to do that this time round as this post will again be too long.  You’ll have to wait.
  2. Load of Actuals data to the current month.
  3. Aggregate the database ← Ibid.  Sorry, but I can’t do another 50+ page post.

The post after this one (edit:  nope, the one after the one after this one, see above) will show (finally) how to do this in an automated way in both on-premises and PBCS.  If you thought there were differences between the two products thus far, you ain’t seen nothing yet.  

To reiterate, based on the length of this post, it’s data, data, data and no calculations as yet.  Sooner or later…

Sort of a mea culpa about data

Back in part 9 of this series, I wrote:

The on-premises (and PBCS) native data load functionality is so brain-dead it beggars belief.  Really, it’s just awful.  At least the two are at parity.  :)

Well, it is brain-dead.  And it is awful.  But, it does work.  This isn’t quite like my EPMA rant (see here and here and here and I’m sure there other places) but let’s take a look at data file formats for PBCS and then decide for yourself.  Btw, if you disagree with me, I’d really like to hear why.

An important bit to remember

Usually (always, actually) when I load data into a Planning application, I use an Essbase load rule to do so.  In fact, I’ve never, ever, ever used Planning to load data.  There is no direct access to Essbase in PBCS and hence no load rule.  You must provide a preformatted and pretransformed data file and it must be a text file unless you load through the FDMEE-lite that comes with PBCS.  I’ll cover that in a future post and as it’s not required and maybe not even desired, I’m going to stick with loading text files.  

I’ll also note that the data files can be in a zip format but again I’ll cover that in later post to help somewhat with brevity.

First the sweet, then the bitter

This is what Essbase (PBCS) data ought to look like.

There are the following dimensions in the PBCS sample application Vision’s Plan1 Plan Type:
  • HSP_View
  • Year
  • Scenario
  • Version
  • Entity
  • Product
  • Account
  • Period

It looks ugly when Yr. Obt. Svt. marks it up, but it’s a very simple dimensional mapping of data with each column representing a dimension and the last being the data values.

The tab delimited record:
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_000"    "4110"    "Jul" 2000

Also note that all field values except data must be enclosed in double quotes.  This is not necessary when loading through an Essbase load rule in on-premises Planning.  Why the tab and then double quote delimiters?  Damfino.

That bitterness?  Is it cyanide?

I might be resorting to a teensy-weensy bit of hyperbole, but this format just offends my Essbase and relational and I’m sure something else sensibilities.  

In any case, this is what PBCS data looks like and it oughtn’t.

The Planning data format is…different.  There’s a concept of a row and a column which in this case is all level 0 Products and July respectively.  The other data elements are in the Point of View but they’re actually analogous to the data columns above.  In addition, the Plan Type is defined in the record as well.

NB – This record is comma delimited but it could have used a tab or some other character if desired.

The comma-delimited record:
P_000, 2000, "BaseData, FY15, Forecast, Working, 410, 4110", Plan1

Why would Oracle Hyperion do this?  The data ends up being sandwiched between the row (which ought to be more than one dimension in just about anyone’s definition of a fact table row) and the column and the “POV” which btw requires all of the other dimensions – and thus fulfills the role of the row columns except it’s got a different name – to be defined within double quotes.  Why oh why oh why is this considered “better” or even “necessary” or “not-put-out-of-its-miserable-life-by-a-bullet-to-the-back-of-its-head-whilst-kneeling-at-the-edge-of-a-lime-filled-pit”?  ←This last bit might be a tad over the top but take it as read that I’m not a fan of seemingly needless complexity.

Seriously, Gentle Reader, tell me why it’s so and I’ll write a panegyric about why you’re right and I’m not.

Just in case you think I’m telling porkies, that Planning file format comes out of PBCS’ own self:

No, one cannot put more than one dimension into the Row slice definition.  Believe me, I tried.  

So I call mea culpa on my mea culpa– the Planning data format is brain-dead.  I’m sure someone, somewhere, somehow uses it but I cannot for the life of me understand why.

Step-by-step-by-step

Putting aside that rant – and it was a good one wasn’t it – let’s play the how-does-a-Compleat Idiot-do-this game.

Before and after the fact

NB – I’m only going to show this once as loading the data will have the same impact each time.

July is empty

Here’s my oh-so-simple load of data.  Note that member P_170 aka Tricorders is part of the form.  Yes, I do actually write these posts in serial.

July is loaded

There it is.  Rinse, lather, repeat through the three approaches to loading data to follow.

Workspace

As with metadata, Workspace is dead easy and fast.  Do you see a pattern here?  And no, it’s not because I’m used to Workspace, it’s because there are just so few steps compared to the SI.

Navigate to Administration->Import And Export->Import Data From File.

Pick your poison.

I’ve selected Native Essbase and pointed it to Plan1:
Ta-da, we’re done.

Did that in five (give or take) steps.  Will I match that with the SI?  When Sus scrofa domesticus becomes airborne should be your guess.

Simplified Interface

By now you should be familiar with the Console as this is where most processing occurs.

Get a job but not this time

One thing to note is that I’m not explicitly creating a job this go round as I’m directly loading the file.  Regardless, I’ll be viewing the success or failure of this load via the Job console.  Weird.

Putting aside weirdness, I’ll go to the Actions button and select Import Data.

You can see that I’ve already created jobs to import data.  No mind, I’m going to create a new one:

Oooh, I’m at the bit where I define the file I want to import.  Let’s go pick it.  

What could possibly go wrong?  I’m loading the Essbase format file.

As I am a bit on the cautious side, let’s validate that file before actually loading it.

Hmm, something isn’t working out.

The Plan Type name isn’t there.

The problem is I picked an Essbase file format where a Planning file is required.  Bugger.

Let’s try that again but this time select the right file type.  Duh.  

Sparing you the act of getting back to the file, this time Click on import having selected Essbase Source Type.

Ahhh, that tastes better.

This time it’s all good although the message…
…is just wrong Gentle Reader as I checked to see if the data loaded; it did.  Oh well, what would life be without a scintilla of uncertainty?

So long as we’re counting steps, I make that seven.

Get a job you bum!

This time round we’re going to load this to the PBCS InBox.

Our first stop naturally then is the Inbox/Outbox Explorer.

And then Upload the file to the InBox:

Pick the file.

Actually upload it

There’s the data file:

It worked:

A job can’t be run that doesn’t exist.   Create it by beginning an Import Data action.

Then Create it:

Does this look familiar?  It should as it’s the same screen we used for the local file import.

Change the location to the inbox, set the format as Essbase, and name the file with care.  It must match the file name in the inbox.  Then click on Save as Job.

Give it a name

We now have an import data job:

Close out, back to the Console, select Jobs, and now click on Schedule Jobs.

Once in the scheduler, click on Import Data, Run Now, and then Next.

Pick the Inbox file and then click Next.

This is your very last chance.  Since you are that devil-may-care sort of a chap that I know you to be, throw caution to the wind and click on Finish.

Lo, the mighty Infernal Machines begin their terrible work:

It works!

Let’s see what happened by clicking on the title, “Load Forecast Data”.  Oh, PBCS has a sense of humor.  Zero records?  Really?  Really?  Really?  No, not really.  

In fact it loaded:

Whew.

What have we learnt?

Other than I could keep this down to a mere 37 pages?  

More importantly, there’s an interesting observation for all to see with regard to effort:  the traditional Workspace requires less effort than local file Simplified Interface which requires less effort than jobs in the Simplified Interface.

Using this post as a guide, to load a single data file into a single plan type takes by environment:
  • Workspace – five steps
  • Simplified Interface Local – seven steps
  • Simplified Interface Job – 25 steps

Hmmm.
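For what it’s worth – and as a teaser for the automation post to come – once this is scripted, the same upload-and-load collapses to a couple of commands.  A sketch, assuming an already-logged-in session and a pre-built import job with the made-up name LoadForecastData:

:: Sketch only; the job name and file name are placeholders
call epmautomate uploadfile C:\Automate\ForecastData.zip
call epmautomate importdata LoadForecastData ForecastData.zip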

The other thing you have figured out is that I’m not fond of the Planning data file format, but in comparison to an “old-fashioned” user interface that is five times as easy (using the rough number of clicks as a measure of effort) as the super-duper new one, that dislike ain’t much cop.

We’re almost at the end of the follow-the-monkey approach.  There are just the Business Rules that clear and aggregate the data left to cover.  I will note a handful of differences between on-premises and PBCS in that post.

And after that, I’ll finally tie this all together using both PBCS’ EPM Automate as well as traditional on-premises scripting.

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 12, PBCS and calculations


I have spared you Gallia est omnis divisa in partes tres.  Are you grateful?

After my last bout of dead language infatuation, the answer is almost undoubtedly yes.  

Although I am sticking to the Queen’s English this time, like Caesar and what became France, I am at the third and last post on the current month Actuals in Forecast use case via the various UIs.

In part one of this series I covered what it takes to get new metadata into PBCS, in the last post I reviewed loading data (and had a mildly epic rant about Planning’s native data format), and I shall now illustrate what it takes to write and run a Calculation Manager Business Rule.

With that let’s begin.

Calculation Manager is everywhere

Calculation Manager is Calculation Manager is Calculation Manager

Using version 16.2.2.0.0 of Oracle Planning and Budgeting Cloud service, Calculation Manager doesn’t have a Simplified Interface, uh, interface.  Oh you can get to Calc Mgr through the SI, but once done, it’s the same Workspace we all know and love.

Navigating to the Navigator’s Rules…

…launches (briefly) a new window…

…and then finally into good old Workspace:

Alternatively, for those of us Bittereinders who are holding on to Workspace with every ounce of energy we can muster, it looks just like on-premises and takes us to the same Calc Mgr explorer.

What does the code actually do?

Let’s take a quick break to review the very simple code.

Again coming back to the steps that a monthly Actual load automated process has to go through, it must:
  1. Load new metadata
  2. Refresh the database
  3. Clear out the most current month’s data
  4. Load that data
  5. Aggregate the data

Clearing the deck

The data clear is simple.  I’m showing it in PBCS but the code is the same in on-premises.

The logic is simple – FIX on all level 0 members for the current month using the Essbase Substitution Variable &CurMth to clear out the Forecast Scenario.  Easy-peasy.
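For those who like to see the code rather than a screenshot, a hypothetical reconstruction of that rule is below.  Only &CurMth, the Forecast scenario, and the level 0 FIX come from the description above; the other member names are guesses based on the Vision application.

/* Hypothetical sketch of the ClearCurrentMonth rule; member names are assumptions */
FIX ("Forecast", "Working", &CurMth, @LEVMBRS ("Entity", 0), @LEVMBRS ("Product", 0))
    CLEARDATA "BaseData";
ENDFIX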

Sum of the parts

After the data has been loaded there’s an even easier aggregation of the Entity dimension.  Believe it or not, in the PBCS Vision application all other dimensions are either fully dynamic sparse or dense so there’s just Entity that needs to be aggregated.
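In calc script terms that’s about as small as a rule gets.  A sketch, again with the Scenario and Version members assumed rather than copied from the real rule:

/* Hypothetical sketch of the AggregateCurrentMonth rule */
FIX ("Forecast", "Working", &CurMth)
    AGG ("Entity");
ENDFIX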


Executing Business Rules

Traditional from Workspace

Whether on-premises or PBCS, Workspace navigation is the same:

And then run from the list:

It runs:

It’s done:

Simplified Interface

In the SI, things are a bit different but conceptually it’s the same.

Navigate to Rules:

Find the one you want, in this case ClearCurrentMonth, and run it.

No confirmation message pops up when complete.  Instead go back to the good old Console’s Jobs tab to see the results:

Smart View

Finally, it’s possible to run business rules from Smart View – both on-premises and PBCS work the same way:

Just as with Workspace and SI, there’s notification of both execution:

And completion:

How many ways to skin this cat?  Four.

Other than the title bar that says, “Planning and Budgeting Cloud Service Workspace” can you see a subtle addition to the Calc Mgr code snippet above?  Look on the top toolbar in the editor.  See it?  No?  It’s oh so little and yet oh so useful.  

Let’s make it a bit more explicit.


That launch button is not in on-premises like so many other functions.  Sigh.

Putting aside my eternal lament about feature parity, this is Yet Another Pretty Cool PBCS Function (YAPCPF, pronounced yapkapfif).  No more need to deploy the rule to Planning to test it, although for Workspace, the SI, and Smart View that will still have to be done.  Fwiw, if you didn’t find it, be glad because I had to really exercise my inner OCD to see the difference as I compared icon-by-icon-by-icon across the two platforms.

Clicking on that button delivers a Validation before run (remember, if it’s deployed it’s validated):

And then a processing message:

And then finally a completion message:

Btw, there’s a Log Messages panel in Calc Mgr to give you log file information.  Note that there is no other way to get to the log file. Bummer on that one.

So what’s it all supposed to do?

Let’s take it in calculation steps.

Assuming this:

The ClearCurrentMonth business rule gets executed.  As July is the current month and there is no data, nothing changes.

Data gets loaded in as per The Compleat Idiot's Guide to PBCS, No. 11, PBCS and data which now looks like:

To aggregate it, run the AggregateCurrentMonth rule.  I chose to run it from Smart View but it could happen in Workspace, the SI, or Calc Mgr itself:

Ta da, aggregated data:

Btw, I am almost resigned to working with Planning ad hoc forms in lieu of Essbase ad hoc connections.  Almost.

Summing it all up

If I were to look at the four approaches and count clicks as a way of measuring complexity, I see the following:
  • Workspace – four including clicking on OK when the process is finished
  • Simplified Interface – four including navigating to the Jobs console
  • Smart View – six including closing the Business Rules dialog box
  • Calc Mgr itself – two assuming being already in the editor

So not much of a difference in terms of effort between traditional Workspace and the Simplified Interface.

PBCS really moves beyond on-premises with that ostensibly simple ability to run the business rule from within the editor.  No more writing the code, deploying it, watching it fail/having useless junk in your Planning application.  Instead, just write, run, edit, run, edit, run, approve, deploy, drink a celebratory beverage of your choice.

Let’s take stock of where we are with this series:

With those three posts we’ve covered how to interactively load current Actual into Forecast.  That’s all well and good but in the real world no one (hopefully) would ever do this.  Instead it needs to be scripted so it can be run on demand or through a scheduler.

And that scripting process for both on-premises and PBCS will be the subject of the next post.  Expect to see quite a few differences between the two platforms.  You’ll have to decide which one is the best for you although I think that will be pretty obvious.

Be seeing you.

The Compleat Idiot's Guide to PBCS No. 13, LCM aka Application Management


Danger 5

I have been accused of being somewhat conservative in my consulting career:  taking the simple approach over the clever even when the clever is…clever ‘cos it inevitably blows up when the consultant genius who put it in leaves/is fired/can’t get it to work in the new release, never develop in production (oh, the arguments I have over this and if you think “There’s no test like production” you’ll be sooooooorrrrrrrryyyyyyy), actually running calculations through regression tests, and always making sure I’ve got stuff backed up before I make a change.  You say cowardly, I say deeply burnt by terrible experiences and am thus quite good at avoiding danger.  That’s why you hire experienced practitioners – they’ve (or at least I’ve) made almost every mistake there ever was and have learnt to never ever ever ever do it again.

Back it up

And with that, we’ve come to an awfully important part of our series, and one that I jumped over but didn’t forget in my excitement over diving into how to manage the Vision PBCS application.  In a word:  backups.  Boring, yes.  Vital, absolutely when things go pear shaped.  And PBCS gives you two (sort of) different ways of doing it.

I’m not going to recap how to do an on-premises Essbase back up as I’ve covered it here:  A lightweight, modular, and better Essbase backup.  So long as I’m trying to encourage you, note that the code I’ve written is error checked, parameterized, and extremely lightweight.  I’ve seen more complex backups – actually a lot more complex, cf. my comment about complexity vs. simplicity above – but I haven’t actually seen any that back up Essbase more completely.

But when it comes to Planning and FR and SS and who knows what else, an Essbase backup isn’t going to cut it.  Instead, in the on-premises world, we use LCM to either back up application components through the UI or via the LCM utility.

NB – Some of the more perspicacious amongst you may note that LCM backs up Essbase data as part of the Planning application.  That’s true as far as it goes but it’s pretty crippled and a far more flexible approach is to have a separate Essbase backup.  To each his own.

In the land of Cloud (Sky of Cloud?  Heaven of Cloud?  Hope and Glory?) things are both easier and the same.  But I’m getting ahead of myself.  Let’s stick with Good Old On-premises for the time being.

On-premises

What’s easier to archive?  Essbase via that backup code I wrote?  LCM backups of Shared Services groups, users, and provisioning?  Or Calculation Manager?  Or Financial Reports?  Or the Planning application itself?  All of these must be (at least in the UI if LCM is to be used) backed up one after the other.  It’s a complete pain.

It’s easy enough to do, just involved.

You’d have to (and we will do this in the next post) script the whole thing using the Lifecycle Management Utility and then schedule it unless you really are a glutton for punishment.  All of my readers (the three or four of you) are of course of the Best and Brightest so there’s little chance of that happening.  Right?  Right.
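For reference, the on-premises LCM Command Line Utility is driven by a migration definition file, typically one exported from Shared Services after defining the migration interactively.  A hedged sketch – the instance path and file name below are made up, and depending on configuration the utility will prompt for credentials:

:: Sketch only; Export.xml is a migration definition exported from Shared Services
cd /d E:\Oracle\Middleware\user_projects\epmsystem1\bin
call Utility.bat C:\Backups\Export.xml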

In the meantime I will spare you the selects and exports but here’s what you end up with:

Navigating through the products isn’t so awful, it’s just that I had to select ‘em all.

Downloading it is dead easy:

And here it is on my local drive:

Do with it what you will.  In the real world, a nightly export is often archived off to a safe location.

Not really Workspace, not really the Simplified Interface

And in PBCS?  It backs up the application automagically each night.  There’s no need to script anything or run a utility or fire up your OS’ scheduler or think about having a separate Essbase backup (although the backups are a bit odd – see my post on migrating from PBCS to on-premises).  It just isn’t necessary.  All that you, Gentle Reader, need to do is to make two simple selections  from the admin landing page.

And then pick when you want the backup to occur having set your time zone.

And you are done.  That was nice, wasn’t it?

So just what does the scheduled snapshot show?

Workspace

Let’s go take a look from the traditional soon-to-be-gone Workspace:

And there it is.  Instead of on-premises’ “File System”, LCM backups are called “Application Snapshots”.

The Artifact Snapshot is the system-generated backup and it has everything but everything all together.  

Just as with on-premises, if last night’s backup needs to be restored or just a part of that backup, simply navigate to the object and click on import:

And then confirm that you really want to import the object:

There it is.

Also note that the Migration Status Report shows the nightly backup.  Of course that makes sense because that’s exactly what was scheduled.  If you’re trying to reconcile the time between what you see in this screenshot and the Maintenance Time dialog box, remember that 22:00 Pacific = 01:00 Eastern.

All or nothing at all


And here’s a bit more than the on-premises backup as I only downloaded the Vision Planning application.  Remember that on-premises doesn’t combine products.  

You have a choice

That’s the scheduled export.  What about if a manual backup is needed?  This use case is solved almost exactly the same as on-premises LCM:  pick the product, click on Select All then Export then name the file:

Ta da, we have a single product export.

What about the Simplified Interface?

It’s a bit of a work in progress.  Oh, it looks like it’s there in the Navigator.  So go on, click on it:

A new window opens:

And here we are again:

Look for this and everything that is in Workspace to eventually be transferred to the Simplified Interface but not quite yet.  One of the problems with this series in particular is that PBCS is changing so quickly some of this will be out of date far faster than on-premises products.  Customers are still on 11.1.2.2 as of the date of this writing – there won’t be PBCS instances three plus years out of date in future.

How hard was that?

There are two axes of effort:  automatic versus custom full exports.  Automated PBCS backups consist of one set-and-forget step.  On-premises we’ll have to tackle next time because it’s all got to be scripted and then scheduled; that post will also take on EPM Automate’s backup functionality.  Let’s take it as read that on-premises is way harder.

As for interactive backups, on-premises vs. PBCS Workspace/Simplified Interface is largely the same.  One could argue that having to navigate through multiple applications in on-premises makes the process a bit harder but that’s counting how many angels can dance on the head of a pin.

Our next post will be tying all of this together – backups, metadata loads, data clears, data loads, data aggregations – in an overall script.  It’s not hard to predict which will be the more painful although if you can’t guess you’ll just have to wait till the next post and All Will Be Revealed.

Be seeing you.

The Compleat Idiot's Guide to PBCS No 14, Chris Rothermel and Planning data loads


Introduction

Yes, another Compleat Idiot post, and yes another one not written by yr. obt. svt. Aren’t you glad?

This time round we have a lament coupled with a plea by my friend and colleague Chris Rothermel.  He’s got some actually useful information and informed opinion on PBCS’ native data handling which is a bit more helpful than my “The on-premises (and PBCS) native data load functionality is so brain-dead it beggars belief.  Really, it’s just awful.  At least the two are at parity.”  quote which on reflection is quite whiny but moderately funny.  It did spur Chris to write this so it wasn’t completely without effect.

With that mea culpa out of the way, let’s dive into what Chris has to say about PBCS and data loading.

Lament:  PBCS Native Essbase Data Loads Lack Functionality

Like Cameron I’m used to loading data files using an Essbase load rule.  This option is not available in the brave new world that is PBCS.  This post describes two behaviors of PBCS Native Essbase loads I find frustrating.
  1. The data load is all or nothing
  2. Data load error handling is poor

The data load is all or nothing.  If you have a single erroneous record in the file none of the valid records in the file are loaded.  

Can the “Validate” button help before I try to load?  Not in this case.  Validate checks the file format, not for valid members.  In Developing Essbase Applications, Chapter 2: Slay Bad Data In Essbase, Cameron creates a process to ensure members exist in the outline before data is loaded.  Here in PBCS our tools are limited.


So know that when you see a screenshot like the one below, nothing got loaded, not a single record from the file.
Let’s click “View Status” to see what the complaint is about and why the file errored out.

🍗 TurkeyLeg is not a valid member in the Entity Dimension.  So my data file containing 99 other valid data records did not load.  Here is the file that doesn’t load because of a TurkeyLeg.

The following webform confirms the data did not get loaded:


So what now?  Well it is probably no surprise that “TurkeyLeg” is not a valid Entity member for this database.  It was only there for demonstration purposes.  When I update the record to the correct entity the file loads and we see the data in the system.  Hooray #1:

Hooray #2:

What if the All-Or-Nothing approach is the design?

There is an advantage to the all-or-nothing approach:  The complete data file is loaded only when it is completely perfect thus avoiding the situation of partial data loads.  So if you’re an administrator in this example you would field the complaint:  “Why don’t we see any data for April?!” instead of “We are missing some data in the system.  Is the data loaded for April complete?”

The traditional way of loading Essbase data gives you a list of kickouts.  For my example file I had 99 valid records and 1 invalid record.  The 99 valid records would have loaded and the one kickout record would have been identified.  The kickout file for traditional Essbase is in a format ready for re-loading.  

The PBCS Native Essbase Data Load is different from this familiar reporting.  A list of kickouts is not provided – only the first record is!  That 100 record file was just a sample.  I really have a 27,528 record file.  When that tries to load it fails on another member that is not in my outline but should be.  An inefficient process would be to correct the single error record and then try the re-load, find the next missing member, and repeat.   

The All-Or-Nothing approach has limited merit but the functionality of the Native Essbase Data Load falls far short of what we’re used to.  

PBCS Data Load Error Handling is Poor

Navigate to Tools/Import and Export Status and find this sorry report on the data load.  Several disappointing items:
  • First it is labeled as a “Metadata Import.”  Boo.  I think the most junior programmer should be able to solve this for Oracle and correctly label this as a “Data Import.”  
  • How many records were read in my file?  0?!  Oh please put a small amount of code to say the data file had 100 records.  Again, a junior programmer can easily code this.
  • Records Processed:  0.  Well I guess this is fair when the data load fails because nothing was loaded.  The problem is when 100 records are loaded this is also 0!
  • Record Index:  This process failed on the 23rd line in the file.  If there were multiple records that fail to load it would be great to see them as well.


My improvement request to Oracle

  1. Give us the option for the All-Or-Nothing data load and realize we don’t want to be held up by one failing record
  2. Give us a complete reject list similar to the old Essbase kickout file
  3. Label Data Load logs as “Data” and fill in the record count details appropriately

Chris Rothermel
Rothermel Consulting LLC

Talking turkey?

Eating turkey engenders calmness through amino acids.  Turkey has tryptophan.  Does PBCS have an analogue?  

Be seeing you.

The Compleat Idiot's Guide to PBCS, No. 15 -- batching it up in PBCS and on-premises


Finally

Yes, finally.  You, oh Gentle Reader, and me, yr. obt. svt., are at that stage of finally tying together the different threads needed to perform the ostensibly simple monthly load of actuals into a Planning application.  As I’ve covered functionality I’ve made an effort to illustrate how to perform a given task in both PBCS and on-premises.  Arguably, that approach made the posts longer than perhaps needed but I, and I suspect you if you’re following along in this series, come to PBCS from an on-premises perspective and I wanted to highlight the differences.  No matter, it has been quite a bit of fun as well as hopefully educational.

And there is quite a bit that is different:  Jobs, Inbox/Outbox, Application Management, and most especially the epmautomate utility as will be covered in this post.  If you need a refresher or simply have far too much free time on your hands, I encourage you to peruse the below posts:

There’s quite a bit of content there, or at least an awful lot of screenshots.  There are 200+ screenshots on just my contributions to this particular Compleat Idiot theme of monthly actuals.  That’s not counting my other posts as well as skipping over the contributions of Philip Hulsebosch and Chris Rothermel.  Good grief that’s a lot of PBCS content.  Some of it may even be correct, most likely the stuff Philip and Chris wrote.

It seems so simple

It is.  Let’s recap what needs to happen:
  1. Current application state is backed up to the local drive
  2. New metadata is loaded and the Planning application is refreshed
  3. The current month’s actual data is cleared
  4. Current month data is loaded
  5. The current month’s data is aggregated

And yet it is so different

The actions and concepts are exactly the same in on-premises and PBCS.  It’s those damn details that make it interesting.

When it comes to automation, the real difference is the number and type of utilities that are needed to perform identical actions in both platforms.  

Step the first – backup current application state

I showed in Compleat Idiot No. 13 how to use on-premises LCM and PBCS’ Application Management to manually back up applications; while the UI approach is largely the same, the automation tasks are not.

On-premises

As with PBCS, one must first create an on-premises LCM export definition via the Shared Services console.  It’s a fair number of clicks but in the end it all makes sense:

LCM will write out the contents of the exported artefacts in one shot.  

If you have access to the Shared Services’ import_export folder, the exported objects are there for the taking:

Failing that, download the File System object from Shared Services and then look inside the zip file:

Whatever the approach, a file called Export.xml is created – this is the file definition of the export process.

Export.xml describes tasks which in turn define what is to be exported when the LCM utility is used:

The quintessence of eponymous

It isn’t often that EPM makes me laugh but this tickles my sense of humor:  what’s the logical name for LCM’s command line utility?  Why, “utility” of course.  Thus utility.bat exists and you can read all about it in the Lifecycle Management Guide.  

There are two things you must keep in mind, both of them fairly maddening.
Utility.bat (I am actually falling out of love with that name:  everything else I use at the command line in EPM-land is a utility with a name like outlineload.cmd or refreshcube.cmd that at least hints at what it does, so a utility named “utility” doesn’t have a lot of utility.  Groan.  Really, the name is confusing.) must be run on the Shared Services server.  Actually, strike that – it has to be installed in some gawdawful way that is far beyond Infrastructure-Allergic Cameron’s abilities.  On my all-in-one VM, it’s not a big deal but it might be in the real world.  Check with someone who has a clue, IOW not me.

On initial run, utility.bat will ask for a correctly provisioned Shared Services username and password, which it will use to encrypt those fields in Export.xml.  And then it…deletes the file with the just-input and just-encrypted password on execution.  Double, no triple, no quadruple bugger.  You can catch it, just, in an editor like Notepad++ or TextPad if the xml file is already open and then not update the file when the editor catches the file change.  And then you make a copy of that file and thereafter copy that copy on top of the copy actually used.  OMG, it’s too awful to even type – just read Peter Nitscke aka @EssbaseDownUnder’s excellent post on it.  I thought I was losing what little mind I had left when this happened to me.  I think we can all agree that ain’t no feature.

Initial state:

Here’s the file somehow encrypted:

And here’s what you end up with right back at the beginning:

Maddening I tell you.  Maddening.

The answer is as Peter noted:  simply create a copy of the encrypted username/password file (btw, you can’t pass parameters to utility.bat – do it the utility’s way or do it all by hand) and copy that on top.  It doesn’t have to be sophisticated:

Did you enjoy the additional mini rant in the comments?  And how annoyed I was when I wrote it, i.e. the somewhat tortured grammar?  See, it is worth writing those things out if for the amusement factor alone.

Calling the code itself is then easy peasy no big deasy:
This will write the export content to the Export.xml (remember, it’s really the just-copied-over Export2.xml but hey, who’s counting?) location:
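In sketch form it boils down to a copy and a call – the paths below are placeholders and the LCM bin folder moves around between releases, so adjust to your own install:

@echo off
rem Sketch only:  paths are placeholders; the LCM bin folder varies by release and install.
set LCM_BIN=C:\Oracle\Middleware\EPMSystem11R1\common\utilities\LCM\11.1.2.0\bin
set EXPORT_DIR=C:\Automation\LCM

rem Export2.xml is the copy that still holds the encrypted credentials.
rem Copy it over Export.xml before every run because utility.bat strips them out on execution.
copy /Y "%EXPORT_DIR%\Export2.xml" "%EXPORT_DIR%\Export.xml"

rem utility.bat takes a single parameter:  the migration definition file.
call "%LCM_BIN%\Utility.bat" "%EXPORT_DIR%\Export.xml"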

PBCS

I am happy to report that PBCS’ approach is much, much, much faster both to code and to describe.  

Remember that, just as with the on-premises backup, a manual backup must first be created to act as a template for LCMing (or App Managing) the application objects.

Step 1 – cause a complete backup of the Vision application:


Step 2 – download that backup to the local disk


That’s it.
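Scripted, it’s hardly any more work.  In the sketch below the credentials, URL, and identity domain are placeholders (set them however you manage secrets) and the snapshot name is the standard nightly one:

@echo off
rem Sketch only:  the %PBCS_*% variables are placeholders for your pod's details.
call epmautomate login %PBCS_USER% %PBCS_PASSWORD% %PBCS_URL% %PBCS_DOMAIN%

rem Re-run the export behind the named snapshot (here the nightly Artifact Snapshot)...
call epmautomate exportsnapshot "Artifact Snapshot"

rem ...and then pull it down to the local drive.
call epmautomate downloadfile "Artifact Snapshot"

call epmautomate logout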

And here you go in zipped format (on-premises creates uncompressed folders by tool although the ultimate content is the same):

Step the second – load metadata

On-premises

On-premises is easier this time, thankfully, as it couldn’t get a whole lot harder.  There are eleventy billion possible switches to that utility, most of which I have thankfully never used.  

Here’s the same file I used back in Compleat Idiot’s Guide No. 10:

Outlineload.cmd must be run from the Planning bin folder.  Good luck in getting IT to go along with that when it comes to automation.  It has always been one of my more painful conversations during implementations.

If you can get past that (and really, what other choice is there), running it is a doddle.  I’m using Windows environment variables to make my code a wee bit easier to read but otherwise it’s 100% stock:
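In sketch form – the application name, dimension, file names, and paths below are placeholders, and the full list of eleventy billion switches lives in the Planning documentation:

rem Sketch only:  run from the Planning bin folder; names and paths are placeholders.
set APP=VisionPln
set ADMIN=admin
set AUTODIR=C:\Automation

call OutlineLoad.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% ^
     /I:%AUTODIR%\NewProduct.csv /D:Product ^
     /L:%AUTODIR%\outlineload.log /X:%AUTODIR%\outlineload.exc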

You’ll note that there’s an encrypted password file which, unlike LCM, does not get deleted on run.  Huzzah!

There’s output, lots of output from this command.  Chatty is one way of describing it although I suppose the proper Comp Sci way of describing it is “verbose”.

Here’s the output piped to a log file:

There’s also an error file which, whether it has any readable content or not, always exists, and always has at least three bytes of content, thus making it imperative that any automation process not only look for that file but look inside that file to see if something went badly or not.  At least it keeps us employed…

I could have used the /C switch in outlineload.cmd to force a refresh but I like to spell everything out so I’ve used refreshcube.cmd.  It, like outlineload.cmd, is easy to invoke:
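A sketch of the call – the utility’s actual file name is CubeRefresh.cmd in the same Planning bin folder, and the switches should be checked against your release’s docs:

rem Sketch only:  refresh (/R) the application's database (/D) rather than create it (/C).
call CubeRefresh.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% /R /D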

PBCS

Would you believe it isn’t quite as easy?  There are certainly more steps but in fact they’re quite a bit simpler:

It’s just four steps:
  1. Make sure the file NewProduct.zip isn’t in the InBox
  2. Upload the file NewProduct.zip to the InBox
  3. Use the job ImportNewProduct to load NewProduct.zip.  Remember that NewProduct.zip is the same file that the outlineload.cmd utility uses.
  4. Refresh the database.
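In epmautomate terms, and assuming a login session already exists, those four steps look roughly like this (the local path is a placeholder; the file and Job names are the ones from this example):

rem Sketch only:  assumes epmautomate login has already been run.
rem 1.  Make sure NewProduct.zip isn't already sitting in the InBox.
rem     (deletefile complains if the file isn't there, so real automation would check the errorlevel.)
call epmautomate deletefile NewProduct.zip

rem 2.  Upload the metadata file.
call epmautomate uploadfile C:\Automation\NewProduct.zip

rem 3.  Run the import Job against the just-uploaded file.
call epmautomate importmetadata ImportNewProduct NewProduct.zip

rem 4.  Refresh the database.
call epmautomate refreshcube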

I’ve gone into the overloading of zip files in Jobs for metadata (and a bit later data) before.  I encourage you to read Compleat Idiot No. 10 for all of the gen on how to do that.

Once the file has been uploaded, the Job called “ImportNewProduct” does the actual load and a database refresh finishes things off.

Just a note about epmautomate.cmd’s logging – it’s errorlevel return codes or nothing:
Status Code    Description
0              Operation completed without errors.
1              Operation failed to execute because of an error.
2              Cancel pending.
3              Operation is terminated by the user.
4              Incorrect parameters.
5              Insufficient privileges.
6              Service is not available.
7              Invalid command.
8              Invalid parameter.
9              Invalid user name, password or identity domain.
10             Expired password.
11             Service is not available.

NB – The above is cribbed directly from the docs.  I really do encourage you to read (and hey, steal – sorry, give full attribution to – sources) and learn from them as they have improved quite a bit from the on-premises version.

Step the third – clear out the current month Actual data

On-premises

We have yet another utility on offer:  calcmgrcmdlinelauncher.cmd  Isn’t this fun?  It really is an example of how on-premises has evolved.  Remember that before 11.1.2.2 (11.1.2.1?) Hyperion Business Rules were written in EAS.  If you’ve really been around, business rules had its own horrific desktop tool.  Ah, the bad old days that are best forgotten.

In any case, there is a command line tool to launch business rules.  I’m not using rtp files or any of the other functionality.  See the docs here for more information.

NB – If you’ve ever wondered why I put in so many references to the documentation it is both because I am lazy (yes, I have been called that and haven’t quite decided if I’m flattered or insulted; I’m inclined towards the former) and because Oracle do a better job with the details than I ever could.

In any case, here are the parameters:  password, application, username, plan type, rule name.  

Unfortunately, there is no echoing of status.  Within the context of an extraordinarily simple batch script sans error checking I threw in an echo statement to at least tell me that it’s running.  Before you get excited about this lack of rigor, remember that this is a blog post, not an actual implementation.
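Here’s a sketch of the call – the plan type and rule name are placeholders and I’ve skipped the /RTP run time prompt switches entirely:

rem Sketch only:  clear the current month's Actual data via a business rule.
echo Launching the clear rule...
call CalcMgrCmdLineLauncher.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% ^
     /D:Plan1 /R:ClearCurrentMonth
echo Back from the launcher.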

PBCS

It’s good old epmautomate.cmd to the fore:
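Something like the below, where the rule name is a placeholder for whatever your clear rule is really called and a login session is assumed:

rem Sketch only:  run the clear rule in PBCS.
call epmautomate runbusinessrule ClearCurrentMonth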

Again, additional parameters are possible.  See the PBCS documentation for details on how to parameterize the command.

Step the fourth – load data

On-premises

As I wrote in Compleat Idiot Nos. 9 and 11, Planning’s native file format is brain dead.  Yes, I can sort of see the point when loading text or even Smart Lists (see, Peter, I do sometimes listen), but for the purposes of this series and most data load use cases, it’s beyond useless.

And, if PBCS supports the Essbase data file format, and on-premises’ outlineload.cmd doesn’t, what’s a geek to do?  The answer is spelt M-a-x-L.

The code to call MaxL is simple:
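A sketch of that call – essmsh is the MaxL shell and the script path is a placeholder:

rem Sketch only:  hand the data load off to the MaxL shell.
call essmsh C:\Automation\LoadForecastData.msh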

Here’s the MaxL code as called by the above batch script.  You now know the username and password to my 11.1.2.4 VM.  Don’t do this at home.

Did you spot what’s missing?  There’s no load rule.  None.  In that way it’s the same as PBCS loading Essbase’s data format.

The data file fully describes the outline layout.
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_000"    "4110"    "Jul"    2001
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_100"    "4110"    "Jul"    6184
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_110"    "4110"    "Jul"    6807
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_120"    "4110"    "Jul"    6425
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_130"    "4110"    "Jul"    6778
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_140"    "4110"    "Jul"    5198
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_150"    "4110"    "Jul"    3129
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_160"    "4110"    "Jul"    3750
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_170"    "4110"    "Jul"    1500

If there is an error on load, a file is created.

PBCS

Are you getting tired of references to epmautomate.cmd or does it fill your heart with joy?  Hopefully the latter is true as it makes things so simple.

There’s no separate language such as MaxL.  There is the requirement to load the data file to the InBox after first making sure that the file isn’t already there.  If it is there, you’ll get a lovely error as there is no overwrite option.

Once that is done, as before it’s simple:
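Something along these lines, with placeholder file and Job names and a login session assumed:

rem Sketch only:  run the data import Job against the just-uploaded file.
call epmautomate importdata ImportForecastData ForecastData.txt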

Step the fifth – aggregate the latest month

On-premises
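It’s the same CalcMgrCmdLineLauncher.cmd pattern as the clear, just pointed at an aggregation rule; the rule name is, once again, a placeholder:

rem Sketch only:  aggregate the freshly loaded month.
call CalcMgrCmdLineLauncher.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% ^
     /D:Plan1 /R:AggCurrentMonth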


PBCS
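And the epmautomate equivalent, again with a placeholder rule name:

rem Sketch only:  run the aggregation rule in PBCS.
call epmautomate runbusinessrule AggCurrentMonth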

Step the sixth – exit

On-premises

There really isn’t anything to do with the Planning command line utilities as each one is self-contained.

I’m not going to repeat a screenshot but the MaxL script has a logout command.  There, that was fair, wasn’t it?

PBCS
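One command and the session is closed:

rem End the epmautomate session.
call epmautomate logout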


Strictly speaking this isn’t really necessary but the OCD/conscientious programmer in me insists on this.

Finally

We have reached the point in the show where all of the code is brought together in one delicious goulash.

The pictures tell a thousand words and as I’ve typed over two thousand thus far let’s leave it at that.

Results

On-premises

In the dimension editor

Metadata and data in Smart View

Aggregated

PBCS

In the dimension editor

Metadata and data in Smart View

Aggregated

Overall code

On-premises

I’ve enjoyed (?) counting the steps and using that as a way to measure complexity.  

In the case of on-premises I say complexity is a function of:
  1. Steps
  2. Tools
  3. Number of scripts

If you’re counting that’s:  12 steps, five tools (LCM’s utility.bat, outlineload.cmd, refreshcube.cmd, MaxL, and calcmgrcmdlinelauncher.cmd), and two scripts.  Add ‘em up and we get 19 discrete objects so long as your definition of object is sufficiently elastic.  Work with me on this – there are worse metrics out there.

The main DemoBatch.cmd code:
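If you’d rather read it than squint at a screenshot, a bare-bones skeleton of that kind of script – built from the placeholder names used in the sketches above and with exactly zero error checking – runs along these lines:

@echo off
setlocal
rem Sketch only:  application, paths, and rule names are placeholders; no error checking.
set APP=VisionPln
set ADMIN=admin
set AUTODIR=C:\Automation
set PLANNING_BIN=C:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1
set LCM_BIN=C:\Oracle\Middleware\EPMSystem11R1\common\utilities\LCM\11.1.2.0\bin

rem 1.  Back up current application state via LCM.
copy /Y "%AUTODIR%\LCM\Export2.xml" "%AUTODIR%\LCM\Export.xml"
call "%LCM_BIN%\Utility.bat" "%AUTODIR%\LCM\Export.xml"

rem 2.  Load metadata and refresh (the Planning utilities run from their own folder).
pushd "%PLANNING_BIN%"
call OutlineLoad.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% /I:%AUTODIR%\NewProduct.csv /D:Product /L:%AUTODIR%\outlineload.log /X:%AUTODIR%\outlineload.exc
call CubeRefresh.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% /R /D

rem 3.  Clear the current month's Actual data.
call CalcMgrCmdLineLauncher.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% /D:Plan1 /R:ClearCurrentMonth

rem 4.  Load the current month's data via MaxL.
call essmsh %AUTODIR%\LoadForecastData.msh

rem 5.  Aggregate the current month.
call CalcMgrCmdLineLauncher.cmd -f:%AUTODIR%\password.txt /A:%APP% /U:%ADMIN% /D:Plan1 /R:AggCurrentMonth
popd
endlocal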

The auxiliary Essbase MaxL LoadForecastData.msh code:

PBCS

There are more steps with PBCS’ epmautomate.cmd primarily because of file management.

Counting the downloads, deletes, and uploads along with all of the other steps there are:  15 steps, one tool, and one overall script for a value of 17.  

Do we have a winner?

Even though Cameron thinks Math Class is Tough, especially for him, 17 is only two less than 19.  Is that a fair measure of PBCS being just a wee bit simpler than on-premises or quite a bit more or something else?  The metric I used in the other posts in this series focused on the number of steps in the user interface to complete a task.  Writing an automation script isn’t focused on clicks but instead commands and parameters.  

Perhaps a better way to measure simplicity is to make that score a product of the number of utilities by the number of possible commands or switches.  Assuming that more means less simple, there’s a fairly obvious winner and loser.

On-premises

Command                    Parameters
LCM utility                 1
outlineload.cmd            57
refreshcube                 8
calcmgrcmdlinelauncher      7
MaxL import                12
Total                      85

PBCS

Command             Parameters
login                5
exportsnapshot       1
download             2
deletefile           1
uploadfile           2
importmetadata       2
refreshcube          0
importdata           2
runbusinessrule      2
logout               0
Total               17

On-premises uses five different utilities with a total of 85 possible parameters and switches while PBCS uses one utility with 17.

From an ease of understanding, writing, and managing within a script, PBCS is the clear winner unless you have a strange love for arcane and little used parameters.

When will we have epmautomate for on-premises Planning?

What’s next?

This is part six of a thankfully six part series on comparing on-premises versus PBCS administrative task processing in both interactive and now batch form.  To do this I had to document what seemed an almost maddening level of detail but if I want to understand, really understand something, I simply have to do it.  I now have that basic level of knowledge and I hope that you’ll be able to use these posts as your initial guide to PBCS when (I think we will all be PBCS customers sooner or later) you make the switch.

I haven’t exhausted the subject of PBCS and given the development cadence of the tool I have to wonder if I ever will.  Yes, the title of this blog contains the word Essbase and I hope to get back to that most excellent of databases but it might not be till the fall.  Really.

As for this administration use case I’m not done with this subject.  Look for an expansion (Yeah, I know, how could that possibly be but it will be, I promise.)  of this at the soon-to-be-here Kscope16 where I’ll present On-Premises Planning vs. PBCS: Common Administrative Tasks Compared, Contrasted, and Recommended with Jason Jones on Monday, 27 June 2016 at 12:45 pm.  

For those of you who don’t know Jason, here’s a recent photo of the two of us.  I’m on the right.
http://vignette2.wikia.nocookie.net/uncyclopedia/images/6/6f/Dean_martin_and_jerry_lewis.jpg/revision/latest?cb=20110601232043

Be seeing you.

Kscope16 Apex sessions I'm interested in


Apex?  Whhaaaaaat?

Although it may be somewhat hard to believe, in fact yr. obt. svt. has technology interests outside of EPM, many of which – like what would it take to really build that fusion reactor in my back shed – are likely best left unexplored.

What is best explored is what else Kscope16 has on offer.  As most everyone knows, my quite-a-bit-smarter-not-at-all-genetically-or-familially-related-but-somehow-confused-for-me-brother Martin D’Souza is Mr. Apex.  Although explaining Apex to me required Martin to exercise patience and lots of really small words, I now have an appreciation for Application Express aka Apex.  For those of you as clueless as I (so, exactly no one but bear with me regardless), it took a while for the penny to drop:  Apex’s concept of a relational back end driving programmatically generated web pages is the same pattern behind a great many of the web sites you visit, except maybe ones like this.  

Good grief I – and I’ll bet you – spend a lot of time on the web:  message boards, technical documentation, user group portals, really cool upcoming conferences, this idiotic web site.  Given that just about all of the EPM products are web based, and are becoming even more so over time, and are in fact conceptually Apex-like, it occurred to me that I ought to actually try to understand what’s going on under the hood.  This is a blog for hackers, right?  And what better conference than Kscope16 for understanding that?

And with that…

Paranoia

Data breaches are everywhere on the web.  Everywhere.  It’s nice to know that Kscope developers get it in the form of defensive measures.  I suppose they’re exercising their inner paranoia which nicely matches with mine.

Security from Tim Austin

APEX Security: Discussing Real-World Security Risks, Jun 28, 2016, Session 10, 2:00 pm - 3:00 pm
APEX Security: Anatomy of a Cross-Site Scripting Attack, Jun 29, 2016, Session 18, 4:30 pm - 5:30 pm

I think, I hope, I pray that attending Tim’s sessions on real world security makes the paranoia go away.  I hope.

Internet of Things

Will EPM ever be part of IoT?  I am not holding my breath.  For those of us who are not terminally dull, Anton Nielsen is running a lab on just that (IoT, not being terminally boring as I am the world domain lead for that) via his Hands-On Training: Build Something! IOT = Internet + Oracle + Things lab on Jun 27, 2016, Traditional HOT 3, 2:00 pm - 4:15 pm.

My code never sucks.  Never.  Never. Ever.  

How does that go?  Admit nothing, deny everything, make counter-accusations, never change your story. When things go tits up, obviously someone else buggered up the code.  And Peter Raganitsch agrees. Ah, I just read his summary more closely so that isn’t strictly true:  he’ll be talking about how to find the bugs, not shift the blame.  Where’s the fun in that as blame shifting is my modus operandi.  Given that I am kidding it behooves me to attend his It Wasn't Me: Finding Bugs in APEX Applications session on Jun 28, 2016, Session 12, 4:45 pm - 5:45 pm because sooner or later someone else is going to figure out my code stinks.

Why is Apex doing so well?  

Apex is cool.  EPM?  You decide.  Perhaps we can crib a page from Juergen Schuster’s Why? session Jun 27, 2016, Session 6, 4:30 pm - 5:30 pm.  I do think in the end that it boils down to Apex geeks being far cooler than we EPMers.  Seriously, we need to figure out why they have meetups, why their exclusive conferences are so awesome, and why they are kicking EPM’s conference butt.  

Is that enough?

It ought to be.

There’s some really good content in the Apex track even if we never ever ever plan on writing a line of code in that framework.  Security, IoT, code quality – these apply to us in equal measure.  

I hope to be at each and every one of these sessions and I hope you will be too.

Postscript

Blog Hop

Thanks for attending this ODTUG blog hop! Looking for some other juicy cross-track sessions to make your Kscope16 experience more educational? Check out the following session recommendations from fellow experts!


Things that go FOOM!

As a child of the Cold War, anything that involves acronyms like AEC or NRC or IAEA makes me positively squirm in my seat no matter how intriguing the thought may be.  Even solar radiation is something I avoid.  Playing with that stuff without fully understanding it (or wearing a dosimeter)?  Madness.  Apex?  Coolness.

Be seeing you.

Hybrid Essbase, Oracle, Kscope16, and you


Surveying the Hybrid landscape

If you read this blog, you know that Hybrid is the future of Essbase.  If you don’t agree with that you should read this, and this, and buy this book, and listen to this podcast, and read this white paper, and read this, and this, and this, and then go find the downloads for these two presentations.  Convinced?  You should at least be convinced that I think Hybrid is the future.

But as with so many things in my life, what I think really doesn’t matter.  I need only reflect on my 15 years of life with my cat as proof positive that what I think or want or need is 100% not important except for meal times and when it’s cold and I can act as the human furnace.  This harsh relationship has taught me to focus on what others need, not what I want.  See cat haters, felines are agents of self-actualization.  And hairballs.  

What really matters

What really matters in the world of Essbase is what you want the product to do.  Oracle owns Essbase, invests money in it, brings new features to market, and is literally invested in your adoption, advocacy, and use of the tool but only if you buy and use it.

How does Oracle decide what, when, and how will a feature be supported or not in Essbase?

They set product direction based on what you tell them.

Case in point and the point of this post

Hybrid Essbase – the magical combination of BSO flexibility and ASO power – has been out there for almost two years.   There hasn’t been a lot of real world noise about Hybrid’s glorious success or ignominious failure which is odd given the push many, including Yr. Obt. Svt, have made.

So what’s really going on with Hybrid Essbase?  Has anyone actually for real and for true implemented Hybrid in production?  Is it wonderful?  Awful?  Something in between?  What challenges did you face?  What was amazingly easy and awesome?
You have an opportunity to tell world+dog at Kscope16 and an opportunity to send this feedback right back to Oracle product management via John Booth’s survey:  https://www.surveymonkey.com/r/GotHybrid

Who knows – you might actually get picked for the panel.  Then you can tell Oracle both to their face (I estimate the chances of someone from Essbase product management being in the room to be quite high) and via the survey whether you actually use Hybrid in production, why, what’s holding you back, and what makes it awesome.  And of course you can share your experience with your fellow Kscope16 attendees.

Noting names

Fellow ACEs John Booth, Tim German, Mike Nader, and Yr. Obt. Svt. have all signed on for this.  We’re True Believers in this open relationship with Oracle.  We hope you will too.

Be seeing you.