Thursday, July 22, 2010

Running BPC process chains within non-BPC process chains

Business Planning and Consolidation version for Netweaver makes extensive use of process chains for running BPC processes. These process chains are automatically invoked by the BPC application when a BPC user executes processes from the front end. Should these process chains be executed exclusively within BPC, or should we be able to execute them outside BPC using native Netweaver BW, perhaps from within custom process chains of our own? Is there any need to do so? And finally, is there any way to do it? Let us try to answer these questions in this blog. We begin by asking whether there is a business reason to run a BPC process chain outside the BPC application. To answer that, we need to understand how the optimization process works in BPC version for Netweaver.

Optimizing the BPC data model:

A dimension in BPC is equivalent to a characteristic in Netweaver BW, and dimension members in BPC are equivalent to characteristic values in Netweaver BW. Taking this further, when a user creates a dimension in BPC version for Netweaver, a Netweaver BW characteristic is generated for it in the BPC namespace. When a user creates a dimension member for that dimension, a characteristic value is generated in Netweaver BW in the master data of the characteristic corresponding to that BPC dimension. When a user creates a BPC application by selecting a few of the BPC dimensions, an infocube (as well as a multiprovider containing that infocube) is generated in the BPC namespace that includes all the characteristics corresponding to the selected BPC dimensions. (You can read more about the BPC namespace at https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/11279)

We should distinguish the BPC dimension from the Netweaver BW dimension. In Netweaver BW, the term dimension is used to group characteristics. How are the characteristics in a BPC infocube organized among the Netweaver BW dimensions of the generated infocube? It depends upon the number of dimensions included in the BPC application. If the number of BPC dimensions in the BPC application is 13 or fewer, then all of them are automatically modeled as line item dimensions in the BPC infocube. This is because Netweaver BW allows up to 13 user-defined dimensions in an infocube. If the number of BPC dimensions exceeds 13, then the BPC infocube model is automatically generated for those BPC dimensions. The data model generated when the cube is created may not remain the most optimized one as the fact table of the cube begins to grow. BPC version for Netweaver gives the BPC user the option to optimize the data model from the front end. As shown below, there are two options to optimize: Lite optimize and Full optimize.

image

The Lite Optimize option does not make any changes to the data model. It just closes the open request, compresses and indexes the cube, and updates database statistics. The Full Optimize option is the one that may rearrange the characteristics among the 13 user-defined Netweaver BW dimensions. The Full Optimize process checks whether the size of each dimension table is less than 20% of the fact table and creates as many line item dimensions as possible. To do this reconfiguration, it takes the appset offline; creates a shadow cube with the optimal data model; links the new optimal cube to the multiprovider for the application; moves data to the shadow cube; deletes the original cube; closes the open request; compresses and indexes the cube; updates database statistics; and brings the appset online again. Though this results in a new infocube, the multiprovider remains the same, and all BPC reports are built on the multiprovider rather than the underlying infocube. Hence this optimization does not affect the BPC reports on this data.

Using ETL for BPC infocubes:

Since the data that the BPC user enters from the BPC front end is stored in the underlying real-time infocube for that application, one may ask whether it is possible to load data to that cube with a normal Netweaver BW ETL process. The answer is 'yes' - but with a caveat.

We can use Netweaver BW ETL for the BPC infocubes. Here is an example of a DTP to load data through a flat file to a BPC infocube.

image

Now, if the BPC user chooses to do a 'Full Optimize' for this application, it may result in a new infocube with a more optimal data model. That new infocube, though it gets automatically linked to the multiprovider for the BPC application, at present does not inherit the ETL structures that were built on the original cube. So in the above example, if the BPC user executes a 'Full Optimize' for the Finance application, the new optimal infocube for the Finance application may not inherit the DTP created on the original /CPMB/DZID30P infocube. The source system, data source, infosource etc. will remain, but the transformation that links these to the infocube will get deleted and has to be recreated. If this optimization happens in the production system, then the transformation may have to be recreated and transported up the landscape.

A way to obviate such a situation is to execute the process chains used by BPC to load data using native Netweaver BW tools, outside the BPC application. In the above example, a flat file is being loaded to the BPC infocube using Netweaver BW ETL tools. However, the BPC application itself offers front-end functionality, the Data Manager, to load data either through a flat file or from any other infoprovider. The Data Manager uses BPC process chains in the background to load the data, as shown below.

image

If we can run these process chains outside BPC - from the EDW layer using native Netweaver BW - then not only can we integrate them with custom process chains, but we also obviate the issue of ETL structures getting deleted on 'Full Optimize'. Running BPC process chains outside BPC is also important if we are using open hub and want to automate the flat file load to BPC cubes by creating a user-defined process chain that integrates the file creation of the open hub with the loading of that file to the BPC cube. If our user-defined (custom) process chain (created in transaction 'rspc') can run the BPC process chain that loads the data to the BPC cube, then we have an 'industrial strength' solution for loading data to BPC infocubes using the Netweaver toolset. The question now becomes how to accomplish this. Let us try to understand the steps involved.

Steps in using BPC process chain within non-BPC process chain:

The first step is to upload the flat file. If we want to use open hub, then the open hub can place the file at any specified location on the BPC application server, or we can upload the flat file to the BPC file service (transaction 'ujfs') as shown below.

image

The second step is to create a transformation file using the BPC front end. Though we want to run the BPC process chain with native Netweaver tools, this is the only step that we have to do with the BPC front end. This is because the BPC process chain looks for the XML version of the transformation file. When we process the transformation file from the BPC front end, this XML version of the transformation file is automatically created and stored in the file service.

image

The third step is to create an answer prompt file that passes the required parameters to the BPC process chain. This file should be a tab delimited file. The format of the answer prompt file is as follows:

      %FILE%    'csv file path in file service'

      %TRANSFORMATION%                 'transformation file path in file service '

      %CLEARDATA%       1/0

      %RUNLOGIC%         1/0

      %CHECKLCK%        1/0

Here is an example of the answer prompt file:

image
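As a sketch, the tab-delimited contents of such an answer prompt file might look like the following. The file paths and names here are hypothetical examples, not values from the screenshot above; your paths must match the files actually stored in the BPC file service.

```
%FILE%	\ROOT\WEBFOLDERS\APSHELL\FINANCE\DATAMANAGER\DATAFILES\data.csv
%TRANSFORMATION%	\ROOT\WEBFOLDERS\APSHELL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\transformation.xls
%CLEARDATA%	0
%RUNLOGIC%	1
%CHECKLCK%	1
```

Note that the separator between each parameter name and its value must be a tab character, not spaces.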

The fourth step is to run program ujd_test_package with the right Appset and Application. We should use the answer prompt file created in the above step and save the variant for the program as shown below.

image

image

However, please note that the ujd_test_package program was originally designed to assist in debugging data manager packages. Hence it may not be a bad idea to copy it to a user-defined program and use that in the next step - just to be on the safer side, so that if future development changes the nature of this program, we do not get unwelcome surprises!

Now, in the final step, we are ready to create our custom process chain that executes the BPC process chain. As shown below, create a user-defined process chain in transaction 'rspc' and include a process type to execute an ABAP program. Include the ujd_test_package program (or the user-defined program created from it) with the saved variant.

image

image

Activate the process chain and execute it.

image

Thus we can run the BPC process chain from within non-BPC process chains. These steps will work not only for the process chain that loads a flat file into a BPC infocube with open hub, but also for loading data from another infoprovider to a BPC infocube (using the BPC process chain that loads data from an infoprovider).

SAP BPC 5.1 : Heuristic Installation

As a computer geek, one of the things I like doing when I have time is installing and trying software, especially software that cannot be installed with just a few clicks. Some of you may already have read my blog about my NW2004s IDES installation experience, in which I described how I installed NW2004s IDES on my laptop in 55 hours. After that installation experience I was looking for new tools, and time, to install and try them. Over the last couple of months I read a lot about SAP BPC and decided to install and try it. Recently I found the time, and in this blog I will share my SAP BPC 5.1 heuristic installation experience - done without using the installation guide.

As I mentioned above, I read a lot about SAP BPC in the last couple of months, and the message SAP always highlights for SAP BPC is "set up by IT and managed by Business". With this in mind, I decided to install SAP BPC without any installation guides or other materials - in a purely heuristic way - in order to see how easy it is to install, manage, and use.

I started the installation on the same w2k3 server laptop (2 GB RAM, 80 GB disk) that I had used for the SAP NW2004S IDES installation. In the first screens, the installer asked which version of MS SQL Server would be used for the SAP BPC installation. I chose MS SQL Server 2005, clicked next, and the installation software could not find an installed MS SQL Server on my laptop. It may sound silly, but I had assumed that MS SQL Server was packed with the SAP BPC installation files; it is not. So I cancelled the installation, found MS SQL Server 2005, and installed it without any problems. After installing MS SQL Server 2005, I started the SAP BPC installation again, and this time the installation software was able to locate the SQL Server installation. After I provided the required parameters, the installer started copying files to my laptop - though before doing so it gave some warnings, which I ignored, and the installation let me continue. Meanwhile, I remembered that IIS is used by SAP BPC as a web server but was not enabled on my w2k3 system, so I enabled IIS. The installation interface is clear enough and guides you well; actually, once you have provided the required parameters, all you need to do is wait for the installation to complete. I can say that my installation was completed in less than an hour.

SAP BPC Main Screen

After the installation was completed, it was time to play with SAP BPC, and one hour was enough to discover it - it is very easy to use with its action panes. While trying to create an application set (infoarea) and an application (infocube), I was getting a database-related error and could not create them. While trying to find the reason for the error, I found the SAP BPC server manager, which has a diagnostic option to run against the installation. After I ran the diagnostic, I saw the warnings that I had ignored during the installation, telling me that my MS SQL Server was not updated. I then downloaded the latest service packs and applied them to my MS SQL Server. Now SAP BPC works properly, and the performance on my laptop is satisfactory.

For BW people, the terms used in SAP BPC may sound quite different at first sight, but after spending some time with SAP BPC it is easy to get familiar with them. Compared to the NW2004S installation, I can say that installing SAP BPC is much easier and less time consuming, and it can be done without looking at any docs. If you want a guide for SAP BPC, I strongly recommend How to Get Help from SAP Regarding SAP BPC; additionally, there are several materials on SDN for SAP BPC waiting for you.

I hope to blog about a new installation experience soon. Thanks to Macir for the reminder about the heuristic approach.

How to debug 'Business Rule(Table Based Logic)' in BPC for Microsoft platform

A few weeks ago, I visited one of our customer sites to do some technical consulting.
While I was working with them, one of the customers told me that a Business Rule had suddenly stopped working. That business rule had been working properly before I visited. :(
So I checked the business rule (it was for currency conversion) and found nothing that needed to be fixed.
As we know, it is really hard to find the root cause, because a Business Rule just returns an 'FX-XXX' error message. It gives some clues, but they do not always point to the root cause, and then users are lost as to what they need to fix.
So I tried to find an easy way to debug out of this situation, because the customer had started to suspect that I had broken the logic. (Of course I didn't!)
As we know, all Business Rule stored procedures are encrypted, so customers and partners cannot debug them at the source code level.
BUT... (there is always a BUT :)
We can get a more detailed error message from Management Studio using the method I will explain.
First, we need to create a temporary scope table. Usually the scope table is created automatically based on the parameters passed in, but we need to create it ourselves to run the procedure in Management Studio.
Here is the scope table structure. You can give it any name, because we will refer to it when we call the stored procedure.
image
Second, fill some values into the table.
These should be the same values used when you run that Business Rule.
image
Third, execute the SPRUNXXXX stored procedure in Management Studio.
You need to specify the correct one, because there are several stored procedures in BPC. For example, currency conversion is SPRUNCONVERSION, account transformation is SPRUNCALCACCOUNT, etc. Now you can see the result in the screenshot below.

Please remember that when you run the SPRUNXXXX stored procedure, you should use the scope table name that we created. The example below uses the scope table name 'Test_Scope'.
image
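Putting the three steps together, a sketch of the Management Studio session might look like the following. The table columns, the inserted member values, and the parameter list of SPRunConversion are illustrative assumptions, not the documented signature - check the actual procedure definition in your system before running it.

```sql
-- Step 1: a hypothetical temporary scope table; column names are assumptions
CREATE TABLE Test_Scope (
    [CATEGORY]    nvarchar(50),
    [ENTITY]      nvarchar(50),
    [TIME]        nvarchar(50),
    [RPTCURRENCY] nvarchar(50)
)

-- Step 2: fill it with the same member values used when the Business Rule runs
INSERT INTO Test_Scope VALUES ('ACTUAL', 'SALESUS', '2010.JUL', 'USD')

-- Step 3: call the stored procedure, passing the scope table name;
-- the parameter list below is illustrative only
EXEC SPRunConversion 'ApShell', 'Finance', 'ACTUAL', 'GLOBAL', 'Test_Scope'
```

The point is simply that calling the procedure directly, with your own scope table, surfaces the underlying SQL error instead of the generic 'FX-XXX' message.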
As you can see, you will find more detailed information here. In this example, it says Currency_type is an invalid column. It means the mbrrptcurrency table should have a currency_type column (property), but it does not.
Therefore, if we create a Currency_type property for the RPTCURRENCY dimension and fill in a proper value, we can solve the issue.
By the way, the customer I mentioned earlier had a wrong property value in their time dimension, but the dimension processed properly because it was only a property value. I also solved that case using this method. This is not the solution for all cases, but I am pretty sure it will give you more hints.

If you have any questions, please leave a comment.

Tuesday, July 20, 2010

BPC 7M, Migrations and Upgrades

After a couple of months of performing migrations and gathering feedback, I am now ready to continue my dialog on SAP BPC migrations and upgrades relevant to the Microsoft version. Since my last blog in February, we have successfully launched BPC 7.0M SP3 and SP4, and exited both RAMP-UP programs for the Microsoft and Netweaver versions. In addition, the EPM RIG team has been working hard to build, test, and migrate various version combinations using migration tools. So what have we learned? Lots - plus we have some guidance on current programs for these requests.

Now, this blog will focus on the BPC for Microsoft version, since a core goal for our group is to help customers begin the process of migrating away from historical OutlookSoft versions of BPC, as the end of maintenance is coming up fast at the end of March 2010. That's ONLY 290 days from today; not that much time. So what do you need to know? Well, all version 4.2 SP3, SP4, and SP5 customers may take advantage of a Migration Utility to assist in the transition to BPC 7M (SP4 on SQL 2008 is preferred). How do you get more details? You can find them at the following SAP web site.

http://www.planningandconsolidationupgrade.com

This is the first step in the process for the customer. Next, we have a plan in place to certify the partners and SAP consultants who wish to assist our customers with migrations. You must attend the Migration Training class and pass a written test to become certified. (Note: this process isn't limited to the Microsoft product either; there are classes for both products.) If you choose to perform a manual migration, you don't need any certification, but if you plan to utilize the Migration Utilities, we expect a certified consultant to assist the process.

Many of the questions that we often get asked include:

- How hard is it to migrate using the utility?

- What value does the utility provide to the process?

- If I am on version 5 do I need the migration tool?

- How long might the process take to migrate?

- Where can I get the migration utility?

Well the answers aren't simple, but here is my personal synopsis:

- Questions 1, 2, and 4 are closely related. The utility performs the technical migration steps; it moves dimensions, data, Excel files and menus, web contents, comments, and data manager files, and sets up the new environment for processing and manual interaction. The tool itself is easy to use, but some of the errors you might encounter can be frustrating to solve. That is why we have created a certification class. I have performed around 15 migrations just using the tool (so without all the functional fixes needed), and they all take on average 1-2 days to complete the whole technical, tool-specific process. But then there is the functional work that needs to be done.

- Question 3. If you are on version 5, then you DO NOT need the migration tool (unless you are migrating to the NW version, which is where the next migration tool starts). In FACT, we are trying to differentiate HOW people describe these processes. Moving from BPC 5 to BPC 7 is an UPGRADE - especially if you are going to upgrade to SQL 2008, this is mostly a technical process. Any movement from BPC versions 4, 3, or 2 is considered a MIGRATION. Any movement from BPC 5 to BPC NW is also a MIGRATION. UPGRADES from version 5 to version 7M do require some work, but the primary product components are 100% transferable. After you conclude the minor add-ons for the UPGRADE and finish building the new environment, the process should only require a backup and restore of the customer application set.

- Question 5. The ONLY people who have been granted access to the Migration Utility are the certified resources. This is our attempt to monitor the process and understand WHO is performing migrations and WHEN they are going to take place. In addition, we plan to assign a migration liaison (or contact, or coach) to interface with the certified partner projects and relay the details back to the SAP EPM Migration Back Office team. We are committed to the customers and want them all to have good migration experiences and help when needed.

In conclusion, the goal is to help all customers move to the newest versions of BPC before the deadline.  We know this will be a challenge for everyone, but we are committed to this process.  The new version of BPC 7M on SQL2008 is a big move, but you will really enjoy the capabilities and power of the new tool.

More ways to manage security in BPC7NW

Business Planning and Consolidation version for Netweaver has some additional features that we can leverage while setting up security in BPC. In this blog we will discuss how the security concept in BPC7NW differs in some ways from the traditional security concept in SAP, and how we can use additional tools to build secure BPC applications.

Basics of BPC security:

In BPC7NW, security is defined within BPC. Netweaver security is not invoked, and the BPC user interacts with Netweaver through a service user. Thus the security for BPC namespace infoobjects is governed from within BPC. (For more discussion on the BPC namespace, please see http://www.sdn.sap.com/irj/scn/weblogs;jsessionid=(J2EE3414700)ID1897954850DB00258195100095085303End?blog=/pub/wlg/11279)

There are four basic steps to set up security in BPC. These are:

  • Adding users
  • Assigning users to teams
  • Assigning task profiles to users or teams
  • Assigning member access profiles to users and teams

Adding users and grouping users into teams in order to assign security to the group instead of each user may be intuitive enough for many traditional SAP users. Task profiles determine various tasks that you can perform within BPC. Depending on our business needs we can create new task profiles or just modify existing profiles.

Member access profiles are where we can define read access, read/write access, or no access to the dimension members in BPC for each application.

image

We can define multiple member access profiles each with a different set of access authorizations and a BPC user can be assigned multiple member access profiles. This gives a lot of flexibility to define and manage access to the dimension members. However this also creates some challenges in resolving the conflicting access issues that we should be aware of. Let us consider these conflicts and also talk about ways to resolve them.

Potential conflicts while using multiple member access profiles:

Member access profiles are fairly simple to understand and construct, but resolving conflicting information among them may not be trivial. If each user has a single member access profile, then there is no conflict. A potential conflict may arise when a user is assigned two or more member access profiles and the access authorizations in those profiles oppose each other. For example, what if in one member access profile a user has "Write" access to a member, and in another profile the same user has "Deny" access to the same member? Does the user have Write or Deny access? This conflict is resolved within BPC security by following two simple rules:

  • WITHIN one member access profile, it is always possible to apply more specific security at a lower level in the hierarchy and the least privileged access wins
  • ACROSS multiple member access profiles, the one with the "greatest" access (or least restrictions) wins

While applying these rules, the system looks at one dimension at a time, cycling through all assigned member access profiles. By cycling through all dimension restrictions, the dimension value in the record being tested will effectively only be limited by the least restrictive rule. This serial ‘dimensional' approach does not take into consideration any other dimension in the record or the member access profile. Thus, though we define the member access profiles for a set of dimensions, the default conflict resolution takes place by considering one dimension at a time.

Let us take an example to clarify this scenario.

Let us consider that a member access profile (say MAP1) is set up as follows:

Category =       Plan -               Read/Write

Time =             2009.OCT -     Read/write

Time =             [ALL] -            Read only

The idea here is that the user should be able to read the plan values for all time periods but write only to the 2009.OCT plan values.

Now consider that there is another member access profile (say MAP2) that is set up as follows:

Category =       Forecast -                    Read/Write

Time =             [ALL] -                        Read/Write

The idea here is that the user should be able to write forecast values to all periods.

Now, if these profiles are assigned to two different users, there is no issue. However, if they are assigned to the same user, then there is a conflict. Clearly, the security administrator intends to RESTRICT writing of values to the Plan category for time periods other than 2009.OCT. However, if the user who has both of these member access profiles assigned enters a Plan value for, say, 2009.SEP, it gets saved, according to the rule of least restrictions by dimension - in this case the time dimension. This is not what the administrator intended. Hence we have to be aware of this fact when defining member access profiles and assigning multiple member access profiles to the same user. Member access profiles are very simple to administer and manage, but if we want the 'matrix' type security that traditional SAP users may be more familiar with, we have to use other tools in addition to member access profiles to resolve such conflicts.
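The per-dimension resolution just described can be captured in a small toy model. This is an illustration of the two rules only - not BPC's actual implementation - and the profile names and member IDs mirror the MAP1/MAP2 example above.

```python
# Toy model of BPC member-access conflict resolution:
# WITHIN a profile the least privileged access wins;
# ACROSS profiles the greatest access wins, one dimension at a time.
ACCESS_ORDER = ["Deny", "Read", "Write"]  # least -> greatest privilege

def effective_access(profiles, dimension, member):
    """profiles: list of dicts mapping (dimension, member) -> access level."""
    per_profile = []
    for profile in profiles:
        levels = [lvl for (dim, mbr), lvl in profile.items()
                  if dim == dimension and mbr == member]
        if levels:
            # rule 1: within one profile, least privileged access wins
            per_profile.append(min(levels, key=ACCESS_ORDER.index))
    if not per_profile:
        return "Deny"
    # rule 2: across profiles, greatest access wins
    return max(per_profile, key=ACCESS_ORDER.index)

# MAP1: write to 2009.OCT, read-only on other periods (flattened for the toy model)
map1 = {("TIME", "2009.OCT"): "Write", ("TIME", "2009.SEP"): "Read"}
# MAP2: write to all periods
map2 = {("TIME", "2009.OCT"): "Write", ("TIME", "2009.SEP"): "Write"}

print(effective_access([map1], "TIME", "2009.SEP"))        # → Read
print(effective_access([map1, map2], "TIME", "2009.SEP"))  # → Write
```

With MAP1 alone the user can only read 2009.SEP, but adding MAP2 lifts the restriction - exactly the surprise described above.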

Alternatives to resolve such potential conflicts:

There are many ways to augment security provided by the member access profiles and achieve the ‘matrix' security. These include the use of following approaches:

  • Member access profiles to provide the read authorization and partial write back control
  • Use BPC validations and leverage validation Badi to control the writing of records. More information about BPC validations is available at: http://www.sdn.sap.com/irj/scn/weblogs;jsessionid=(J2EE3414700)ID1897954850DB00258195100095085303End?blog=/pub/wlg/14726
  • Work status - Use work status in conjunction with the member access profiles. The combination of member access profiles and work status alone may serve the purpose in many situations
  • Excel based logic - Last but not the least, the Excel based validations and macros can be very handy in providing some basic validations for writing the records through input schedule.

With all this ammunition at our disposal, we can surely make any complex BPC configuration as secure as a fortress.

Modifying Web Interface Builder - Part 4 - Hiding Cell Contents

BW-BPS layouts presented through the Web Interface Builder may present cells that are expected to contain a zero value or are not relevant. This script allows you to "blank out" or hide any specific cells of a layout's output.
Example code:
<script language=JavaScript type=text/javascript>
….
document.getElementById("LAYOUT_LAY002-12-2-cell").innerHTML = "";
document.getElementById("LAYOUT_LAY002-13-2-cell").innerHTML = "";
document.getElementById("LAYOUT_LAY002-14-2-cell").innerHTML = "";
….
</script>
This sample script "blanks-out" or hides cells in rows 12, 13, and 14, all associated with column 2 of layout "LAYOUT_LAY002".

Modifying Web Interface Builder - Part 3 - Hiding Rows in Layouts

There may be situations in which multiple layouts are presented in one Web Interface frame. It may be desirable to eliminate column header rows that detract from a unified multiple-layout presentation and duplicate data unnecessarily. This script allows you to eliminate specific rows of a layout's output.
Example code:
<script language=JavaScript type=text/javascript>
document.getElementById("LAYOUT_LAY002-table").deleteRow(0);
</script>
This sample script removes the first row (Row 0) of layout LAYOUT_LAY002 which contains the data column titles.

Modifying Web Interface Builder - Part 2 - Adjusting Column Widths

One or more BPS layouts may be viewed within a frame of a web application. When multiple layouts are presented this way, it is sometimes desirable to vertically align the columns of each layout to enhance overall readability. An easy way to force alignment is to specify similar column widths. The following JavaScript code allows adjustment of specified column widths:
Example code:
<script language=JavaScript type=text/javascript>
document.getElementById("LAYOUT_LAY001-1-1-cell").width = "300";
document.getElementById("LAYOUT_LAY001-1-2-cell").width = "80";
document.getElementById("LAYOUT_LAY001-1-3-cell").width = "80";
document.getElementById("LAYOUT_LAY002-2-1-cell").width = "300";
document.getElementById("LAYOUT_LAY002-2-2-cell").width = "80";
document.getElementById("LAYOUT_LAY002-2-3-cell").width = "80";
document.getElementById("LAYOUT_LAY003-2-1-cell").width = "300";
document.getElementById("LAYOUT_LAY003-2-2-cell").width = "80";
document.getElementById("LAYOUT_LAY003-2-3-cell").width = "80";
</script>
The above script sets column 1 of layouts LAY001, LAY002, and LAY003 to a width of 300. Columns 2 and 3 of each layout are set to 80.
The result is vertical unification of the three layouts.

BOBJ Planning and Consolidations, Netweaver

BusinessObjects Planning and Consolidation, version for Netweaver (formerly known as Business Planning and Consolidation, version for Netweaver) has the ability to incorporate Script Logic to perform calculations and customize functionality. In this blog series, I will attempt to provide guidance on the use of key words in BPC NW Script Logic.

Basics:

Script logic can be called from two locations within BPC NW:

Default Logic –

A "default logic" file is automatically defined within every BPC application. Upon the creation of new records, such as through the action of a BPC input schedule, this default logic is automatically executed, specifically against all of the newly created records. In other words, the "scope" of default logic initially consists of all new records created or modified by the input schedule.

In some cases, the scope of the default logic may need to include additional member values not contained in the original scope. For example, you might enter Quantity on an input schedule, but you may need to calculate TotalCost by finding UnitCost and multiplying it by the manually entered Quantity. In this scenario, the scope initially defined by the newly created records needs to be expanded to include TotalCost and UnitCost.
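As an illustrative sketch in Script Logic, such a scope expansion plus calculation might look like the following. The account member IDs QUANTITY, UNITCOST, and TOTALCOST, and the dimension name ACCOUNT, are assumptions for this example, not part of any delivered model.

```
*XDIM_ADDMEMBERSET ACCOUNT = UNITCOST, TOTALCOST

*WHEN ACCOUNT
*IS QUANTITY
*REC(FACTOR = GET(ACCOUNT="UNITCOST"), ACCOUNT = "TOTALCOST")
*ENDWHEN

*COMMIT
```

The *XDIM_ADDMEMBERSET line widens the scope beyond the newly entered Quantity records so that UnitCost can be read and TotalCost written; the keywords that control scope are discussed next.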

Controlling scope

There are several key words included in BPC70NW Script Logic that directly control scope:

*XDIM_MEMBER

*XDIM_MEMBERSET
*XDIM_ADDMEMBERSET

*XDIM_MEMBER {Dimension name} = {Members Set}

As of BPC NW SP01, my observation is that this command has the same functionality as *XDIM_MEMBERSET.

*XDIM_MEMBERSET {Dimension name} = {Members Set}

This command allows the definition of the specific member values to be included in the execution of the subsequent script commands.  There are several potential useful formats:

Example 1

*XDIM_MEMBERSET TIME = 2007.JAN  

This statement restricts the execution of any subsequent logic to only the one time member, 2007.JAN.

Example 2

*XDIM_MEMBERSET TIME = 2007.JAN, 2007.FEB, 2007.MAR  

This statement restricts the execution of any subsequent logic to a predefined selection of time members: 2007.JAN, 2007.FEB, and 2007.MAR

Example 3

*XDIM_MEMBERSET TIME = 2007.JAN to 2007.MAR  

This statement restricts the execution of any subsequent logic to a predefined selection of time members: 2007.JAN, 2007.FEB, and 2007.MAR using the “to” qualifier.

Example 4

*XDIM_MEMBERSET TIME = BAS(2007.TOTAL)

This statement restricts the execution of any subsequent logic to the twelve base member children associated with parent node 2007.TOTAL. 

Example 5

*XDIM_MEMBERSET TIME = DEP(2007.Q3)

This statement restricts the execution of any subsequent logic to the three member children associated with parent node 2007.Q3.  In this case scope would be restricted to 2007.JUL, 2007.AUG, and 2007.SEP. 

Example 6

*XDIM_MEMBERSET TIME = ALL(2007.TOTAL)

This statement restricts the execution of any subsequent logic to all parent as well as base member children associated with the parent node 2007.TOTAL.  In this case scope would be restricted to 2007.JAN, 2007.FEB, 2007.MAR, 2007.APR, 2007.MAY, 2007.JUN, 2007.JUL, 2007.AUG, 2007.SEP, 2007.OCT, 2007.NOV, 2007.DEC, as well as including 2007.Q1, 2007.Q2, 2007.Q3, 2007.Q4, and 2007.TOTAL.

Example 7       

*XDIM_MEMBERSET TIME = %VARIABLE%

This statement restricts the execution of any subsequent logic to all members included in the variable set %VARIABLE%.  This variable set can be populated using the SELECT statement, to be discussed in more detail in a future blog.

===================================================================================

*XDIM_ADDMEMBERSET {Dimension name} = {Members Set}

With the keyword *XDIM_ADDMEMBERSET, the logic can merge a specific set of members with the members passed in the region for which the logic should be executed. This instruction is similar to *XDIM_MEMBERSET. The difference is that, while *XDIM_MEMBERSET redefines the region passed by the user, *XDIM_ADDMEMBERSET adds the defined set to the passed region.

Example 8

*XDIM_MEMBERSET TIME = 2007.JAN

*XDIM_ADDMEMBERSET TIME = 2006.JAN

The first line restricts the execution to 2007.JAN.  The second line adds to the defined scope, namely 2007.JAN, the additional value 2006.JAN.  Scope is now defined as 2007.JAN and 2006.JAN.
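Putting the scope keywords together, a short illustrative preamble might read as follows (the ENTITY member and the prior-year month are invented for this sketch, not taken from the examples above):

```
*XDIM_MEMBERSET TIME = BAS(2007.TOTAL)
*XDIM_ADDMEMBERSET TIME = 2006.DEC
*XDIM_MEMBERSET ENTITY = A1000
```

Scope for TIME is the twelve base months of 2007 plus 2006.DEC, while ENTITY is restricted to the single member A1000; any subsequent script commands execute only against that region.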

Modifying Cell Background Colors in Web Interface Builder

As delivered, BW-BPS layout functionality differentiates “ready for input” cells from “view only” cells by enclosing the “ready for input” cell within a heavy line border. In some cases, color-coding the cell background helps identify the ready-for-input condition. Insert the following JavaScript code to modify a specific cell's background color:
<script language="JavaScript" type="text/javascript">

document.getElementById("LAYOUT_LAY001-2-3-cell").style.backgroundColor = "yellow";

</script>
(where LAYOUT_LAY001 is the technical name of the element specifying the layout in the Web Interface, and 2-3 addresses the cell in the 2nd row of the 3rd column.)

BW Objects technical name changeability for SAP BO Planning & Consolidation, version for Netweaver

In SAP BusinessObjects Planning & Consolidation, version for Netweaver (BPC_NW), transports operate by transferring metadata about the existing source BPC_NW configuration to a target system.  Upon successful import into the target system, the metadata describing the new (or changed) BPC_NW objects in the transport is used to recreate or change those objects in the target system.

BW users that wish to connect to these objects should be aware of which objects will retain their original (source system) technical name, and which should not be counted upon to remain static.

DIMENSIONS

SAP Note 1445688 advises that “… (the) BPC_NW transport is designed to guarantee the info object technames of BPC dimensions (will remain) identical between source and target system”.  In other words, BPC_NW transports will not change the technical name of the BW info object associated with a particular BPC dimension.

PROPERTIES

Likewise I have confirmed that the BW attribute technical names associated with BPC_NW Dimension Properties also will not be changed by transport.  As with the BPC_NW Dimensions, the technical name retention is by design of the transport.

APPLICATION

The BW technical name of the BPC_NW Application is a different situation.  These applications can be “fully optimized” and as a result the data model of an optimized application may change.   The current solution is for the underlying BW info provider to be deleted then recreated with the new data model.  BPC_NW transports of application configuration act in a similar manner.   The technical name of the BPC_NW application in the target system is not guaranteed to match the technical name of the same application in the transport source system.

MULTIPROVIDER

The generated BPC_NW multi info provider does retain its technical name after transport.  This is also by design of the transport process. 

LIMITATIONS

If configuration is performed directly in the target system, the two-character technical ID encoding the application (a designation that is unique for every application; “QW” in the examples below) may already be in use.  As a result, this two-character combination will not be available for generating the same technical ID as in the source system when the transport is imported.

In summary, assuming the above limitations do not apply, technical-name retention after transport is as follows:

Info Objects (Dimensions) - technical name is retained


Attributes (Properties) - technical name is retained

Info Provider (Applications) - technical name can be changed!


Multi Info Providers - technical name is retained


SAP BusinessObjects Planning and Consolidation version for Netweaver - Deleting an Appset

The application SAP BusinessObjects Planning and Consolidation version for Netweaver (SBOP PC NW) contains delivered functionality to transport configuration.  In some cases transports may fail and the only way to correct the issue is to delete an appset and re-import it.   This blog lists the steps that I found successful in performing this activity.

Note: this process will effectively delete the appset in the target system; all master and transactional data will be destroyed!  Be sure to back up master and transactional data so that they can be entered back into the restored appset!

The situation:

Configuration for an appset had been transported into the target system.  At some point in the project the source system was refreshed, leaving the old configuration in the target system. Attempts at reproducing the old configuration in the source system, then transporting the deletion, proved ineffective, primarily due to the difficulty of controlling the generated technical names of newly created BPC objects. The decision was made to delete the entire appset in the target system and re-transport the new configuration.

The steps needed to delete the appset are listed below.  If you attempt this, please follow all steps in sequence, and pay close attention to which steps are executed in which systems!

1.  Target System only:  Delete the appset in the target system by using the delivered program (SE38) UJS_ACTIVATE_CONTENT.  Enter the AppSet ID and uncheck all options except for “Delete the Appset”:


2. Source and Target systems:  Delete all entries in the tables ujt_trans_hdr and ujt_trans_obj by running the program (SE38) UJT_DLT_TRANS_DATA.  Select the appset to be deleted (you may need to type the appset name directly into the prompt box):


3. Target and Source systems:  Verify that all table entries for the appset have been deleted.  Run program (SE38) UJT_CHECK_TABLE_ENTRIES.  Enter the appset name and UJ* to capture all BPC tables.


Target System: all UJ tables should indicate 0 rows

Source System:  Check that tables ujt_trans_hdr and ujt_trans_obj indicate 0 rows:


4. Only after steps 1-3 are completed: create a new transport (t-code: UJBPCTR) in the source system for the appset and release to the target system.



Using JavaScript in SAP BusinessObjects Planning and Consolidation, version for Netweaver

General Syntax:
The data to be converted is initially identified in the system variable %external%. JavaScript functionality can be accessed by preceding any command with the prefix "js:" to indicate that a JavaScript command follows.

To use JavaScript, ensure that the CONVERT_INTERNAL parameter is set in the calling Transformation file as follows:
If the transformation option Convert_Internal = NO, JavaScript can be used in the Internal column.
Example: the function js:parseInt(%internal%) placed in the internal column of the conversion sheet will return the Integer portion of the converted data.

If the transformation option Convert_Internal = YES, JavaScript can be used in the External column.
Example: the function js:parseInt(%external%) placed in the external column of the conversion sheet will return the Integer portion of the incoming data.

JavaScript can also be used in the FORMULA column of the Conversion sheet. The system variable "Value" is defined as the initial value of the string before processing:
js:Math.round(Value) and Value =0.60 returns the quantity "1"
js:Math.round(Value) and Value =0.50 returns the quantity "1"
js:Math.round(Value) and Value =0.49 returns the quantity "0"
js:Math.round(Value) and Value =-4.43 returns the quantity "-4"

Example:
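Run outside BPC as plain JavaScript, the FORMULA-column rounding described above behaves like this sketch (the ordinary variable here stands in for the system variable Value):

```javascript
// Sketch of the FORMULA-column behaviour: Math.round rounds to the
// nearest integer, with exact halves rounding up (0.50 -> 1).
function roundValue(value) {
  return Math.round(value);
}

console.log(roundValue(0.60));  // 1
console.log(roundValue(0.50));  // 1
console.log(roundValue(0.49));  // 0
console.log(roundValue(-4.43)); // -4
```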

Wildcards
You can use the asterisk (*) and question mark (?) wildcards in the External or Internal columns. An asterisk (*) stands for any sequence of characters, while a question mark (?) stands for any single character. For example, if you want to reference all members, use the asterisk (*). Example:

JavaScript function: parseInt()
This function parses an input string and returns an integer value. For example:
js:parseInt(%external%) and %external%="10 " returns the value 10
js:parseInt(%external%) and %external%="10.00" returns the value 10
js:parseInt(%external%) and %external%="50.33" returns the value 50
js:parseInt(%external%) and %external%="000000010" returns the value 10
js:parseInt(%external%) and %external%="C000010" returns the value "NaN"

Tips and Notes:
1. Only the first number in the string is returned!
2. Leading and trailing spaces are allowed.
3. If the first character cannot be converted to a number, parseInt() returns NaN ("Not a Number").

Example:
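The behaviour listed above can be reproduced in plain JavaScript outside BPC; this sketch passes an explicit radix of 10, which older engines need in order to avoid octal interpretation of leading zeros:

```javascript
// parseInt returns only the leading integer portion of a string;
// leading/trailing spaces are tolerated, and a non-numeric first
// character yields NaN.
function toInteger(external) {
  return parseInt(external, 10);
}

console.log(toInteger("10 "));       // 10
console.log(toInteger("10.00"));     // 10
console.log(toInteger("50.33"));     // 50
console.log(toInteger("000000010")); // 10
console.log(toInteger("C000010"));   // NaN
```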

JavaScript function: if...then...else
Use conditional logic to execute one expression when a condition is true and another when it is false.  JavaScript itself has no "then" keyword, so this logic is best written as a conditional (ternary) expression, for example:
js:(isNaN(%external%)) ? %external%.substring(0,6) : parseInt(%external%)
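Run outside BPC as plain JavaScript, the same conditional can be written as a ternary expression (an ordinary variable stands in for %external% here):

```javascript
// Keep the first six characters when the input is not numeric,
// otherwise return the parsed integer.
function convert(external) {
  return isNaN(external) ? external.substring(0, 6) : parseInt(external, 10);
}

console.log(convert("C000010")); // "C00001"
console.log(convert("50.33"));   // 50
```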

JavaScript function: toUpperCase()
This function converts all characters in the input string into upper-case characters.
Example:
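A minimal plain-JavaScript sketch (the sample member ID is invented for illustration):

```javascript
// toUpperCase normalises an incoming member ID to upper case.
function normalise(external) {
  return external.toUpperCase();
}

console.log(normalise("cc1000")); // "CC1000"
```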

JavaScript function: replace
This function finds a match between a substring (or regular expression) and a string, and replaces the matched substring with a new substring.
Example:
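A minimal plain-JavaScript sketch (the "CC" prefix is an invented example). Note that with a string pattern only the first match is replaced; a regular expression with the g flag replaces all matches:

```javascript
// Strip a hypothetical "CC" prefix from an incoming account code.
function stripPrefix(external) {
  return external.replace("CC", "");
}

console.log(stripPrefix("CC4010")); // "4010"
```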

JavaScript function: split & join
split breaks a string into an array of substrings at a specified delimiter; join combines the elements of an array back into a single string, inserting a specified delimiter between them.

Example:
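A minimal plain-JavaScript sketch (the hyphenated time format is an invented example):

```javascript
// split breaks the string into an array at each "-", and join
// reassembles the pieces with "." between them.
function redelimit(external) {
  return external.split("-").join(".");
}

console.log(redelimit("2007-JAN")); // "2007.JAN"
```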

SAP BusinessObjects Planning and Consolidations, version for Netweaver: DESTINATION_APP "How To" Guide

The script logic functionality for the application SAP BusinessObjects Planning and Consolidation 7.0 version for Netweaver (SAP BPC NW) does not currently have the inherent functionality to move transaction records from one application to another.  This functionality exists in previous Microsoft based versions of BPC (keyword: *DESTINATION_APP), and is desirable from a data modeling design perspective.

New coding is now available to replicate the DESTINATION_APP functionality in SAP BPC NW.  The new functionality allows the transfer of records from one application to another within the same AppSet.  The dimension values contained in the source and the target application can be mapped, renamed, and/or added through a series of parameters available in the new functionality.

The SAP BusinessObjects EPM Regional Implementation Group has recently posted on SDN a How-To Guide that provides this functionality back into SAP BPC NW through the use of a Business Add-In (BADI).  You can view the How-To Guide and an associated file that provides the ABAP code to implement it (contained in an easy-to-use, self-contained transport) at the following links:

How To... Custom BADI for replicating "Destination_App" script logic functionality in SAP BUSINESSOBJECTS PLANNING AND CONSOLIDATION, version for SAP NetWeaver:

http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e04b5d24-085f-2c10-d5a2-c1153a9f9346&overridelayout=true

Transport File for Destination App :

http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/1048e171-0a5f-2c10-42a6-852f6caa0c05&overridelayout=true

SAP BusinessObjects Planning and Consolidations (SAP BPC NW) Tips -6 [Allocations]

The application SAP BusinessObjects Planning and Consolidation version for Netweaver (SAP BPC NW) contains delivered functionality to execute allocations.  This functionality differs from the Microsoft-based versions of BPC, mainly in that some capabilities are missing.  Nevertheless, the functionality that is provided allows quick and easy configuration of a variety of allocation scenarios.  This blog identifies the current capabilities of the allocation function for BPC versions up to and including BPC 7.0 NW SP04.

The general format for defining an allocation is:

*RUNALLOCATION

*FACTOR={expression}

*DIM {dim name} WHAT={set}; WHERE={set};[USING ={set};] [TOTAL={set}]

*DIM ...

*ENDALLOCATION

*FACTOR

This instruction defines an arithmetic expression (written in the {expression} parameter) that may contain operands, parentheses, constants, and one or both of the keywords USING and TOTAL, representing, respectively, the amount coming from the USING region (i.e. the amount of the driver) and the amount coming from the TOTAL region (i.e. the sum of the drivers). Another keyword supported by this parameter is COUNT, which represents the number of members into which an amount must be allocated. For example, when allocating a yearly value evenly across all months of a year, the administrator can simply define the factor expression as 1/COUNT; COUNT will automatically contain the value 12. (This keyword is most helpful in cases where the number of members is not predictable.)

If omitted, the factor defaults to 1.
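Outside BPC, the arithmetic that FACTOR=USING/TOTAL performs can be sketched in plain JavaScript (the member names and driver values below are invented for illustration):

```javascript
// Each WHERE member receives WHAT * (its USING driver / TOTAL of drivers).
function allocate(whatAmount, drivers) {
  const total = drivers.reduce((sum, d) => sum + d.using, 0);
  return drivers.map(d => ({
    member: d.member,
    amount: (whatAmount * d.using) / total,
  }));
}

// Spread 1200 across three months in proportion to their drivers.
const result = allocate(1200, [
  { member: "2009.JAN", using: 10 },
  { member: "2009.FEB", using: 20 },
  { member: "2009.MAR", using: 30 },
]);
console.log(result); // JAN gets 200, FEB gets 400, MAR gets 600
```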

There are two basic syntax definitions that can be used when defining the allocation function:

Format 1: this format is commonly used when an allocation is to be performed once in the script logic file.

*RUNALLOCATION

*FACTOR=USING/TOTAL

*DIM P_ACCT WHAT=CE0004010; WHERE=CE0004020; USING=CE0004030; TOTAL=<<<

*DIM ENTITY WHAT=A1000; WHERE=<<<; USING=<<<; TOTAL=<<<

*DIM TIME WHAT=2009.JAN; WHERE=>>>; USING=BAS(2009.TOTAL); TOTAL=<<<

*ENDALLOCATION

Format 2: this format is used when it is desirable to call an allocation multiple times within a script logic file.

*ALLOCATION ALLOCATE

*FACTOR=USING/TOTAL

*DIM P_ACCT WHAT=CE0004010; WHERE=CE0004020; USING=CE0004030; TOTAL=<<<

*DIM ENTITY WHAT=A1000; WHERE=<<<; USING=<<<; TOTAL=<<<

*DIM TIME WHAT=2009.JAN; WHERE=>>>; USING=BAS(2009.TOTAL); TOTAL=<<<

*ENDALLOCATION

...

...

*RUN_ALLOCATION ALLOCATE  

...

***************************************************************

***************************************************************

Delta functionality from the Microsoft version of BPC:

*APP

The *APP keyword is not available in BPC 70 NW.  This implies that the ability to source and/or post data into applications outside of the calling application is not currently supported by BPC 70 NW.  Unfortunately, the *APP keyword is accepted by BPC70NW during script logic file validation; no error message is displayed when using *APP.  An allocation involving external application references in the *APP command line will still run, but the system will only source and/or post data from/to the home application.

In the following example:

*RUNALLOCATION

*FACTOR=USING/TOTAL

*APP WHAT=PLANNING; WHERE=NEWAPP; USING<<<; TOTAL<<<

*DIM P_ACCT WHAT=CE0004010; WHERE=CE0004020; USING=CE0004030; TOTAL=<<<

*DIM ENTITY WHAT=A1000; WHERE=<<<; USING=<<<; TOTAL=<<<

*DIM TIME WHAT=2009.JAN; WHERE=>>>; USING=BAS(2009.TOTAL); TOTAL=<<<

*ENDALLOCATION

The intention of the developer was to post the allocated values into the application "NEWAPP".  However during execution, the allocation function will actually post into the "PLANNING" application.

*DIM AMOUNT

In previous versions of BPC, it was possible to specify a special dimension name called "AMOUNT". This AMOUNT dimension was used to set a filter on the amounts to be allocated or to be used as drivers; for example, an allocation might be performed only on members where the amount of sales is greater than zero.  As with the *APP command, this keyword is accepted during script logic validation but ignored during actual execution.

Business Planning and Consolidations (BPC) Tips -5

Business Planning and Consolidations (BPC)

The application SAP BusinessObjects Planning and Consolidation version for Netweaver (SAP BPC_NW) provides the ability to create offline copies of input schedules and reports.  These schedules and reports contain "EV" functionality that continues to work even though the document is offline.  This is facilitated by the system converting "EV" functions into a special offline version prefixed "EV_".  For example, EVDES is converted into EV_DES in the offline Park-n-Go version.  In addition, the associated dimension data is cached in hidden sheets within the offline workbook.

These hidden sheets can be accessed using the following Visual Basic code entered as a macro in the offline file.  The full procedure is as follows:

Open the offline Park-n-Go template and note the number of workbook sheets visible (in the example shown there is only one sheet visible):


Access Visual Basic (Alt+F11), this is located in the "Developers" tab for Excel 2007:


Select the "ThisWorkbook" object and add the following Visual Basic (VB) code to the right screen:

Sub showAllSheets()
    Dim i As Integer
    For i = 1 To Sheets.Count
        Sheets(i).Visible = -1
    Next i
End Sub

Your screen should look similar to this:


Run the script by pressing F5 or clicking the Run button.

The VB code exposes the cache sheet generated when the Park-n-Go offline options were executed:


In this example: EVDRE, EVGET, EVCOM, and EVPROPS data were all cached to allow for the proper display of all EV functions in this workbook.

Examples of the cached data:


Business Planning and Consolidations (BPC) Tips -4

Business Planning and Consolidations (BPC)

BPC Excel has a variety of delivered macros that are utilized by the BPC application builder.  These macros can be incorporated into any BPC design and can be executed via the instructions provided in my previous posting [https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/9344].

Please note, although this list is substantial, I do not wish to imply it is in any way 100% complete:

Send and Refresh Schedule

MNU_ESUBMIT_REFRESH

Open My Schedule

MNU_ESUBMIT_OPENMY

Save My Schedule

MNU_ESUBMIT_SAVEMY

Open Dynamic Schedule Template

MNU_ESUBMIT_SCHEDULE

Open Schedule Library

MNU_ESUBMIT_OPENSTANDARD

Validate Submission

MNU_ESUBMIT_VALIDATE

Publish to Content Library

MNU_ESUBMIT_SUBMIT

Modify Work Status

MNU_ESUBMIT_MODIFY

Spread

MNU_eSUBMIT_ESPREAD

Trend

MNU_eSUBMIT_ETREND

Weight

MNU_eSUBMIT_EWEIGHT

Manage Dynamic Hierarchies

MNU_ESUBMIT_MANAGE_DYNAMICHIERARCHIES

Using Schedule's Help

MNU_ESUBMIT_HELP

Open My Reports

MNU_EANALYZE_OPENMY

Save My Reports

MNU_EANALYZE_SAVEMY

Open Dynamic Report Template

MNU_EANALYZE_REPORTWIZARD

Open Report Library

MNU_EANALYZE_OPENSTANDARD

Audit Reports

MNU_EANALYZE_VIEWAUDIT

Using Report Help

MNU_EANALYZE_HELP

Open Web-Ready File from BPC Web

MNU_EPUBLISH_OPENPUBLICATION

Save as Web-Ready File

MNU_EPUBLISH_PUBLISHSHEET

Book Publication Wizard

MNU_EPUBLISH_PUBLISHBOOK_WIZARD

Create New Book

MNU_EPUBLISH_PUBLISHBOOK_NEW

Edit Book

MNU_EPUBLISH_PUBLISHBOOK_EDIT

Save Book

MNU_EPUBLISH_PUBLISHBOOK_SAVE

Save As Book

MNU_EPUBLISH_PUBLISHBOOK_SAVEAS

Validate Book

MNU_EPUBLISH_PUBLISHBOOK_VALIDATE

Offline Distribution Wizard

MNU_EPUBLISH_OFFLINE_WIZARD

Manage Distribution List - New

MNU_EPUBLISH_OFFLINE_NEW

Manage Distribution List - Edit

MNU_EPUBLISH_OFFLINE_EDIT

Manage Distribution List - Save

MNU_EPUBLISH_OFFLINE_SAVE

Manage Distribution List - Save As

MNU_EPUBLISH_OFFLINE_SAVEAS

Manage Distribution List - Validate

MNU_EPUBLISH_OFFLINE_VALIDATE

View BPC Web

MNU_EPUBLISH_VIEWeDASH

Publishing Help

MNU_EPUBLISH_HELP

Park N Go

MNU_ETOOLS_PARKNGO

Expand All

MNU_ETOOLS_EXPAND

Refresh Workbook

MNU_ETOOLS_REFRESH

Drill Down

MNU_ETOOLS_DRILLDOWN

Drilldown Back

MNU_ETOOLS_DRILLDOWN_BACK

Drill Down Forward

MNU_ETOOLS_DRILLDOWN_FORWARD

Drill Through

MNU_ETOOLS_DRILLTHROUGH

Select Member

MNU_ETOOLS_MEMBERSELECTOR(xxxxxxx)

Function Wizard

MNU_ETOOLS_FUNCTIONWIZARD

Refresh Dimension Members

MNU_ETOOLS_UPDATEAPPINFO

Data Manager

MNU_ETOOLS_DATAMANAGER

Workbook Options

MNU_ETOOLS_WBOPTION

Open Dynamic Templates

MNU_ETOOLS_OPENSTANDARD

Save Dynamic Templates

MNU_ETOOLS_SAVESTANDARD

Open Custom Menu

MNU_ETOOLS_PSMANAGER_OPEN

Test Current Worksheet

MNU_ETOOLS_PSMANAGER_COMPILE

Save Custom Menu

MNU_ETOOLS_PSMANAGER_SAVE

Change Application Set

MNU_ETOOLS_CHANGEAPP

Client Options

MNU_ETOOLS_OPTION

Journal

MNU_ETOOLS_JOURNAL

View BPC Action Pane

MNU_ETOOLS_TASKPANE

BPC for Office Help

MNU_ETOOLS_HELP

About BPC

MNU_ETOOLS_ABOUT

Run Package

MNU_eData_RUNPACKAGE

Packages Schedule Status

MNU_eData_PackageSechedules

View Status

MNU_eData_ViewStatus

Organize Package List

MNU_eData_OrganizePackage

Manage Team User Package Access

MNU_eData_ManageSitePackage

Data Preview

MNU_eData_DataPreview

Data Upload

MNU_eData_DataUPLoad

Data Download

MNU_eData_DataDownLoad

New Transformation File

MNU_eData_NewTransformation

Manage Transformation Files

MNU_eData_OpenTransformation

Validate and Process Transformation File

MNU_eData_SaveTransformation

Copy Transformation File

MNU_eData_SaveAsTransformation

New Conversion File

MNU_eData_NewConversionFile

Manage Conversion File

MNU_eData_OpenConversionFile

Validate and Process Conversion File

MNU_eData_SaveConversionFile

Copy Conversion File

MNU_eData_SaveAsConversionFile

Add Conversion Sheet

MNU_eData_NewConversionSheet

Clear Save Prompt Values

MNU_eData_ClearPromptValue

Data Manager Help

MNU_eData_Help

About Data Manager

MNU_eData_About

Show Member List

MNU_eData_Showmemberlist

Business Planning and Consolidations (BPC) Tips -3

Business Planning and Consolidations (BPC) 

The REC Statement provides a convenient method to create and manipulate the values of new records using Script Logic.  The basic format of the REC statement is as follows:

            ...

            *WHEN Dim1

            *IS *

            *REC(FACTOR=1, Dim2="A")

            *ENDWHEN

            ...

NOTE: The REC statement requires the selection of an existing record to process, hence the mandatory inclusion of the WHEN / ENDWHEN statements.

The new record(s) created by the REC statement inherit the dimension values of the original record.  Any dimension specified within the REC expression overrides the original dimension value with the REC definition for that dimension.  In the example above, the new record inherits each of the original record's dimension values except for Dim2: regardless of the original record's Dim2 value, the new record will have Dim2 set to the value "A".

The REC statement can accept multiple dimension assignments; each must be separated by a comma.  Example:

            ...

            *REC(FACTOR=1, Dim2="A",Dim3="C",Dim4="D")

            ...

The keyword FACTOR allows the definition of a numeric value that is used to determine the new record's value. Factors are multiplied against the original record's values.  In the example provided, a FACTOR=1 essentially keeps the original value in the new records being created.  A FACTOR=2 will double the value of the new record, and a FACTOR=0.5 will reduce the new record's value by half.   Example:

            *WHEN ....

            *IS ....

            *REC(FACTOR=.5)

            *ENDWHEN

A FACTOR can take positive or negative values:

            ...

            *REC(FACTOR=-2)

            ...

A FACTOR can contain simple calculated values:

            ...

            *REC(FACTOR=3.6/4.7) 

            ...

In this case, the factor that will be applied is 3.6 divided by 4.7 or 0.765957.

The keyword EXPRESSION may also be used to modify the REC generated record values. The EXPRESSION formula can include regular arithmetic operators, fixed values and the keyword %VALUE% (representing the original retrieved value).

Examples:

            ...

            *REC(EXPRESSION=%VALUE%+1500)

            ...

In this example, the value of the new record is determined by adding a fixed amount (1500) to the value of the original record (%VALUE%).

Business Planning and Consolidations (BPC) Tips -2

Relevant versions: BPC5.x+

Business Planning and Consolidations (BPC)

EVLST and EVSET are two powerful BPC functions that can work together to present filtered member data selections for use in BPC Planning applications.   EVLST accesses the member data file and EVSET filters the returned list according to MDX filtering specifications.

A practical example in using these functions is as follows:

Return a list of dimension members, but only those members with a specific property (i.e. attribute).

In the example provided below, the scenario is to return a list of dimension member IDs that only contain the property "Group" set to "Group A".

The EVLST function requires the following configuration:


EvLST Parameters:

AppName = the name of the BPC Application

DimensionName = the name of the dimension from which you are requesting the list of members.

SetExpression = the cell location of the EVSET function.  EVSET will be applied as a filter against the entire member list associated with "DimensionName".

Target = the cell locations where you want the filtered list to appear

PropertyName = EVLST can return any property within the specified dimension, in this example the member's ID is being requested.

ExpandDown = enter TRUE to expand members by row, FALSE to expand members by column.  Value if omitted is TRUE.

RepeatDuplicates = TRUE displays duplicate properties, FALSE suppresses multiple copies of the same property.  Value if omitted is TRUE.
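Assembled on a worksheet, a hypothetical call matching the parameter list above might look like this (the application name and cell references are invented for illustration):

```
=EvLST("Planning", "ACCOUNT", D2, A5, "ID", TRUE, FALSE)
```

Here D2 would contain the EvSET function, and the filtered member IDs would be written downward starting at A5.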

The EvSET function is referenced by the EvLST function in order to filter the listing to specific parameters.  In this example, the EvSET function is limiting the selection to a MDX filter specified in the Filter parameter:


EvSET parameters:

AppName = the name of the BPC Application

Member = enter one valid member of the dimension; this is required for the MDX statement to execute properly.  The selection of which member to enter is irrelevant, since the returned list will consist of all members matching the "Filter" criteria.

Include Flag = enter "EVMEMEBRS" to specify base members

LevelDown = specific number of hierarchy levels down the filter needs to expand to

ParentBefore = if returning a hierarchy, enter TRUE to place the parent before the children, FALSE to place the parent after the children.

Filter = an optional parameter, but required in this example to filter on member properties.  Complete example expression:

"ACCOUNT.CurrentMember.PROPERTIES('GROUP')='GROUP_A' "

This expression will select only those members in the dimension "ACCOUNT" that have the property "GROUP" set equal to "GROUP_A".


Business Planning and Consolidations (BPC) Tips -1

Business Planning and Consolidations (BPC)

Relevant versions: BPC5.x+

Using Excel “FORMS” to provide end user initiated function selection capability:

Steps

  1. From the Excel menu, choose View > Toolbars > Forms.  The Forms toolbar will appear.
  2. To create a user-selectable “button”, select the form that looks like a grey rectangle.
  3. Create the button by moving your cursor to the desired location and tracing out the button's outline.  You can always resize and reposition the button at a later date by accessing the object’s Format Control.
  4. When you create the button, you will be asked to identify a macro for the object.  The macro is the Excel program that will be executed when the button is clicked.
  5. The following macros are typically used within BPC applications:

           MNU_ETOOLS_REFRESH

           [Same as "REFRESH WORKBOOK"]

           MNU_ESUBMIT_REFRESH

           [Same as "SEND AND REFRESH SCHEDULE"]

  6. Edit the text of the new button by right-clicking on the button outline and selecting “Edit Text” from the context menu.
  7. Final product: you now have a “button” that will allow the user to initiate specific actions.

SAP Insider: Reporting and Business Intelligence conference - Day 1

This week I'm attending the "Reporting & Business Intelligence with SAP and Business Objects" conference. It is hosted by SAP Insider and being held in Oak Brook, IL, from July 21-23. This blog entry gives a review of the seminar and provides you with some tips that you should find useful for planning your reporting implementation.

The speakers for the week are as follows: David Dixon and Jenny Shah, both from Inforte. Bryan Katis and Bobby Coates are from SAP and BOBJ/SAP respectively. All speakers have a solid amount of SAP and Business Objects experience to draw from.

First off, I'm a little disappointed by the location. The brochure shows pictures of downtown Chicago, and I had visions of hitting the town each night after the seminars were over. But that idea quickly faded as I Googled the hotel and found out that we are not even remotely close to downtown Chicago. Oh well, at least I won't be distracted and I'll have ample time to write my blog....

One thing I found unique about this conference when compared to others that I've attended is that the speakers have written an incredible amount of material for each presentation. In a typical seminar you might get 15 PowerPoint slides with a few bullet points on them. In this seminar, each slide is packed with data and there are 50-60 slides for each presentation. This makes it impossible for the speaker to cover everything in the slide deck, and some slides are skipped completely. There was just too much information to cover it all within the given time frame (1.5 hrs per presentation). As you're reading this, you might be thinking that this is a bad thing. But it's just the opposite. This gives us ample information to take back to the office (or in my case, the hotel room) and read up on each topic in more detail. Most seminars I go to have such sparse slides that when I go back to the hotel room to review them, I frequently have no idea what some of them are about b/c they all seem disconnected from each other. In addition to the detailed slides, we were given a CD that has even more content about each seminar (including walk-throughs of how to perform certain tasks, and all the demos are available to watch again). Clearly, each speaker has put in a lot of work to make their seminar informative for everyone. If your boss is expecting you to come back from the conference and share what you learned, you will have more than enough material to draw from.

In the following summaries, it may appear that I cover most of what each speaker talked about. But I only give a few highlights because there is just too much information that was covered.

Session 1: SAP and BOBJ Fundamentals for every customer.

David Dixon gave us an overview of all the tools we'll be covering this week. This was a great overview because it put into perspective how the BOBJ acquisition changed the reporting and analysis landscape. There are many new tools, and they have redundant feature sets with existing SAP products. This appears to be both bad and good (or at least confusing).

A common problem I heard was how to decide when to use which tool, and what to do with the existing resources you already have in place. According to the roadmap we were shown, many of the SAP tools are replaced with their BOBJ equivalents. According to David, the SAP tools are limited to SAP data and are not suited for accessing non-SAP data. Of course, the majority of SAP customers use disparate data sources and need a way to bring them together. The BOBJ tools have been designed from the start to work with multiple vendors and have the connectivity built in to use a variety of data sources. By replacing the SAP tools with BOBJ products, you get to access all your data sources within the same user interface. He covered Crystal Reports 2008, Crystal Xcelsius, SAP BW and BOBJ XI integration paths, NetWeaver CE, Dashboard Builder, Web Intelligence, Voyager, Polestar, Live Office, BOBJ Mobile, Data Integrator and Data Quality. Whew!

David stressed that SAP is not advocating a "rip and replace" policy for using the new tools. Instead, you should 'adapt and adopt'. Use your existing tools as normal, but new development can focus on using the new products and integrating them into your landscape. For example, BEx Reporting will be replaced by Crystal Reports, but BEx Reporting will still be supported by SAP through 2016. There is no need to throw away what you already created.

Session 2: Practical recommendations for mapping SAP and Business Objects capabilities to your requirements.

Bryan Katis went into more detail about how the new BOBJ products can be used within your company. There was a little cross-over between his talk and David's talk in the previous session, but they each had a different focus. Bryan focused on how usage of BOBJ products impacts the technical aspect of your infrastructure as well as how the BOBJ products are more user friendly.

From a user standpoint, BOBJ tools give the user more freedom to get access to the data they need to do their job. The tools put more power into the user's hands. That being said, the tools are not for everyone. They are for the 'pro-sumer': the business user who is more tech-savvy than his or her peers, but not enough to work in the IT area. These users are known for being the ones who aren't happy just looking at Excel data and instead will write some simple macros to make their job easier. The BOBJ products make it possible for these users to create reports and perform ad-hoc analysis for their departments without having to contact the IT department for help.

This impacts the IT department in two ways. The first is that since the end-user has the power to do their own ad-hoc queries and reporting, the IT dept has time to focus on more important work. But that doesn't mean that they can be hands-off. Instead, they still need to look at security and performance issues. Now that end-users can do ad-hoc queries, you have to make sure that you have the infrastructure in place so that this doesn't drag down performance for other users. Bryan gave recommendations on how to keep your users happy and keep the system running at peak performance. The second impact on the IT department is that even though you don't have strict governance over the tools, you don't want an uncontrolled environment either. You need to prevent junk reports and reduce redundancy by promoting reuse.

Session 3: New guidelines for enterprise performance management.

Unfortunately, a client called me during this session and I missed almost all of it. Thus, I don't have many comments about it. Nonetheless, I'll highlight a few points I see in the slide deck:

What's new in EPM?
* Optimizing financial operations with Financial Performance Management.
* Tracking spending and supply chain activities with Operational Performance Management.

Over the long term, SAP will move to a unified application experience for the full set of applications, starting with planning and consolidations. The applications will be merged on a common business intelligence platform, consuming enterprise SOA services that abstract the user layer from the data layer and ensure a unified application environment for business users.

BPM enhancements will reduce reliance on IT. A focus on familiar and easy-to-use planning interfaces (Office 2007 and Web) facilitates broader use and more collaborative planning.

When choosing between BPC and BI-IP, BPC is the recommended choice. Your existing investments in BPS/BI-IP are protected and can co-exist with BPC.

Session 4: Designing, Formatting and Delivering Enterprise Ready Reports

Jenny Shah gave us an introduction to Crystal Reports and how to use it. The presentation started with an overview of the functionality of Crystal Reports and later went into detail about how each feature works. For example, a couple of the topics covered were how to create parameters, report alerts, filters, etc. One thing I would have liked to have seen is a direct comparison between BEx Reporting and Crystal Reports. Having a chart that compares feature sets would have been very interesting.

She went through a demo on how to create a basic report and the steps for adding more complex features. Afterwards, she led a high-level discussion of Web Intelligence (WebI), Voyager, and Xcelsius. At first I was disappointed that the coverage of Crystal Reports was so high level and didn't discuss anything about SAP integration. But she reminded us that the third presentation on Tuesday will be 100% about how CR and SAP work with each other. I'm looking forward to it!

That is it for the first day. It was a great day to say the least. As I said earlier, it was an enormous amount of information, and this blog doesn't do justice to how much material each speaker covered. Also, if your company is looking seriously at using Crystal Reports, I pack just as much information into my book, Crystal Reports Encyclopedia. I cover basic and advanced reporting topics, and each chapter has numerous tutorials to reinforce what you learned. You can find out more information and read reviews at Amazon.com: http://www.amazon.com/exec/obidos/ASIN/0974953601/sap4india-20

Check back tomorrow for my blog entry about the sessions on the second day of the conference.

Monday, July 19, 2010

SAP – BPC: Implementation Lesson Learnt Part – 5.


Continuing our series, we will discuss Point #5: 'Configuration/Development Best Practices.'
      
Object & development method standardization -


To start with, let me reiterate the term 'Flexibility', which I am so passionate about. It is no doubt BPC's main strength, but I also consider it a potential weakness, because you can do the same thing in multiple different ways, and a creative team will do anything to achieve a result by a different route. So before you start developing, set your standards and guidelines, settle on the best route to the solution, and communicate it. A creative solution is not always the best solution for everyday use, and it might not be easy to maintain and control.

Consider scalability as a building block -
Always build and configure with scalability in mind. For example, suppose you are creating a Financial Planning AppSet and foresee a future demand for customer-level integration. In that scenario, add a 'Customer' dimension from the start; adding a dimension in BPC at a later stage is quite a job. Also, hierarchies eat up a considerable amount of resources during reporting, so try to limit the number of hierarchies you build in a system.

InApp consideration -
The InApp property in the dimension design works like the navigational attribute concept in BI. All the BPC MDX OLAP queries get loaded (overloaded) with every dimension property whose InApp property is switched on. Studies have shown response-time improvements of more than 33% after cleaning up unneeded InApp properties.


Audit trail & high maintainability by ‘Request ID’ -


SAP BI has the concept of tracking data loads by 'Request ID', which helps tremendously during support and data management. Unfortunately, BPC doesn't have any request ID concept. What BPC has is a 'Clear' package, which imports zero values for a selected data set. But there can be several scenarios (particularly in planning) where people plan at the same overlapping combination level. To clearly segregate who did what, a custom dimension populated with a sequence number helps tremendously.
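The sequence-number idea can be illustrated with a small sketch. This is not BPC code: the dimension name `SEQID`, the member format, and the record layout are all hypothetical; a real implementation would populate the member during the data manager package run.

```python
import itertools

# Hypothetical illustration: stamp every record of a load with the same
# sequence-number member, so overlapping loads on the same combination
# can be told apart later, much like a BI Request ID.
_seq = itertools.count(1)

def stamp_load(records):
    """Tag all records of one load with a shared SEQID member."""
    seq_id = f"SEQ{next(_seq):04d}"  # e.g. SEQ0001, SEQ0002, ...
    return [dict(rec, SEQID=seq_id) for rec in records]

# Two loads hit the same ENTITY, but SEQID separates who sent what.
load1 = stamp_load([{"ENTITY": "US", "AMOUNT": 100}])
load2 = stamp_load([{"ENTITY": "US", "AMOUNT": 120}])
```

With the SEQID dimension in place, a bad load can be located and cleared selectively instead of hunting through overlapping postings.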


One AppSet -

In QA, and particularly in Production, always try to have only one AppSet.

Default Logic & modularization -

Default logic in BPC gets executed every time a data load runs or an input schedule sends data, so default logic needs to be well planned. If you put all your code in the default logic, it will add overhead to the system; consider modularizing your code to manage it better. Talking about modularization, SAP BPC can hold all global constants in a 'System Constant' file on the server. The file can be found in the ' \Data\Webfolders\\AdminApp\' location. Avoid hard-coding constants across different programs; use the 'System Constant' file wisely instead.
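The "central constants instead of hard-coding" advice can be sketched as follows. The KEY=VALUE file format and the constant names here are assumptions for illustration; the actual System Constant file on a BPC server may use a different layout.

```python
import os
import tempfile

# Hypothetical sketch: read shared constants from one central file rather
# than hard-coding them in every script. The KEY=VALUE format below is an
# assumption for illustration only.
def load_constants(path):
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    constants = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#"):
                key, _, value = line.partition("=")
                constants[key.strip()] = value.strip()
    return constants

# Demo with a throwaway file standing in for the server-side constants file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as fh:
    fh.write("# global constants\nCURRENT_YEAR=2010\nDEFAULT_CURRENCY=USD\n")
    tmp = fh.name

consts = load_constants(tmp)
os.unlink(tmp)
```

The payoff is that a change such as the current planning year is made once, in one file, instead of being chased through every script that hard-coded it.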

Control transformation file –
There will be cases where BPC creates an input file for loading into an external system. In this scenario we use the DTS package to 'Export from the Fact table'. There are cases where the column sequence gets interchanged for some reason. To counter this, always use a transformation file to control the columns in the output file.
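The effect of a transformation file that pins down the output columns can be sketched like this. The column names and CSV layout are hypothetical; the point is that an explicit mapping, not the fact table's internal order, decides the output sequence.

```python
# Hypothetical sketch: an explicit column mapping makes the export layout
# deterministic, so downstream systems never see columns interchanged.
# The column names below are illustrative, not a real BPC fact-table layout.
COLUMN_ORDER = ["ENTITY", "ACCOUNT", "TIME", "AMOUNT"]

def export_rows(fact_rows):
    """Render fact-table rows as CSV lines in a fixed column order."""
    lines = [",".join(COLUMN_ORDER)]
    for row in fact_rows:
        lines.append(",".join(str(row[col]) for col in COLUMN_ORDER))
    return "\n".join(lines)

# The input dict deliberately lists its keys in a scrambled order;
# the output still follows COLUMN_ORDER.
csv_text = export_rows([{"AMOUNT": 100, "TIME": "2009.JAN",
                         "ENTITY": "US", "ACCOUNT": "SALES"}])
```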

Transformation debugging –

When testing a conversion or transformation file, keep the files open in Excel and try loading the data. If there are any problems in those files, the errors will be highlighted directly in them. This works like a charm when you have lengthy control files.
        
Work status & Security-

Use 'work status' extensively rather than building a complex security design.
When building security for the system, make sure you have two separate users for the Application Administrator and System Administrator roles. The SAP BPC tool is heavily dependent on this segregation.

Also, during the security design, never assign a 'Team' to another 'Team' or grant somebody two different access profiles. In BPC, whenever there is a conflict between multiple access profiles, the less restrictive access always wins. Following these rules ensures a robust and friendly security design.
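The "less restrictive access wins" rule can be sketched as a simple resolution function. The access-level names and their ranking are assumptions for illustration, not BPC's internal representation.

```python
# Hypothetical sketch of BPC's conflict rule: when a user holds several
# access profiles, the least restrictive one wins. The level names and
# their numeric ranking are assumptions for illustration.
LEVELS = {"NONE": 0, "READ": 1, "WRITE": 2}

def effective_access(profiles):
    """Resolve a set of access profiles to the least restrictive level."""
    return max(profiles, key=lambda p: LEVELS[p])

# A user granted READ, NONE, and WRITE ends up with WRITE access,
# which is why stacking profiles can silently widen access.
access = effective_access(["READ", "NONE", "WRITE"])
```

This is exactly why granting somebody two profiles is risky: the wider grant always prevails, even if the narrower one was intended.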
It is also wise to use the formatting options (for EvDRE) whenever you create a report. Make a format template available on the server and ask your development team to reuse it whenever they build reports. This ensures consistency and decreases development effort.

Comments -

Lastly, 'Comments'. BPC supports an intelligent auditing process, and among many other features it also lets you add comments (more like metadata information). Always train your users (or enforce it through the system) to add comments with all base-level members selected. People sometimes save a comment with one or another dimension value left blank, and those comments are hard to find. Always remember: in classical SAP a blank selection means 'all', whereas in BPC it is just the opposite.

 

SAP – BPC: Implementation Lesson Learnt Part – 4.


Continuing from where we left off last time -

Point #4: System Landscaping:


I recommend using a 3-tier landscape to manage your SAP BPC environment.
             

BPC 5.1 doesn't support the classical SAP transport process. You need to use the Backup/Restore option at the server-management level, and then a file transfer process, to support the implementation and post-production support. So when you back up the development server and restore it to a new box to create the QA server, you need to do some manual work, such as setting up the initial security on the newly built system. Because all of this work (including the file movement analogous to transports) is manual in nature, always keep a detailed work list with proper validation to ensure high accuracy and low cutover time.

[diagram: 3-tier BPC landscape]

If you are using a standalone BPC system only, a 2-tier architecture will also work.

Now consider a scenario where you have already implemented BPC Planning, and you want to implement BPC Consolidation. In this case you have two development pipelines. The diagram below shows how to manage the system landscape in that situation.

In the diagram below, you will notice two lines. The top line represents the first development rollout, and the line below shows the current one. For example, your firm has already implemented Planning, and now an initiative is under way to roll out Consolidation.

[diagram: two parallel development pipelines during the second rollout]


The green arrow signifies a 'One-time Copy' to build the initial AppSet. The blue line shows the sequence of object migrations originating from Development AppSet-1. The black line signifies the object migrations originating from Development AppSet-2. The red cross signifies that once Development AppSet-2 is ready, Development AppSet-1 will be deleted, and so on.

Lastly, as I have mentioned, BPC 5.1 doesn't support the classical SAP transport process, so you need to be creative in establishing your transport management. I plan to discuss this in my next installment.

 

SAP – BPC: Implementation Lesson Learnt Part – 3


Continuing from where we left off last time -


Point # 3: BPC Software components:


Microsoft released Office 2007 for general use some time back, and several user groups have started using MS Office 2007 or are in the process of moving to it. But the fact is that BPC 5.1 is predominantly more comfortable with MS Office 2003, particularly where the MS Excel functionality-based model is concerned.

I personally came across a lot of situations where a transition (from MSO '03 to MSO '07) during a BPC implementation, or a mix of team members on '07 and '03, created a lot of unwanted confusion and breaks, resulting in an unnecessary loss of valuable time.

My suggestion: before you embark on a BPC project, decide which MS Office version you will use during the implementation and stick to it. If you decide to start with MS Office 2007, there are several OSS notes available to make it work, some at the server level and some on the client side.

Here are the impact areas for Office compatibility problems:
  • BPC - Admin area (where you maintain your dimension members)
  • BPC for excel navigation links
  • BPC for Word/PowerPoint navigation links

I have documented some key OSS notes I found handy here, but I strongly recommend checking the SAP support portal for the latest updates. Always remember, in all probability you are not among the 'lucky' few suffering that particular issue for the first time; somebody else has gone through the same pain. Here is the list one must have in this situation:

  • 1089848 Users Cannot Log Into BPC for Excel After MS Update
  • 1173726 65K row limitation and Office 2007
  • 1240811 Known issues with BPC and Office 2007
  • 1243550 Templates become corrupted after saving in Office 2007
  • 1264171 Microsoft Excel 2003 template is unrecognized in Excel 2007.
  
And lastly, make sure you have the proper SP patch (>= SP2) implemented in SAP BPC to start with. As of 10th March 2009, SAP has published Patch 8 for general use. I personally believe in a (Current - 1) strategy, so one should be at least at the SP7 level as of today.

SAP BPC comes with a standard workflow engine, 'Business Process Flow'. A lot of value and process integration can be achieved using this BPF functionality, and its strengths are mentioned in every piece of BPC sales material you may have come across. But there are several projects where BPF didn't work the way it was expected to. Early planning and strategy formulation for BPF functionality in your rollout is time well spent.

Now my favorite topic: sequencing your Consolidation and Planning effort. To discuss this, I must admit that I still have some confusion about BPC's positioning in the bigger roadmap. I have read quite a few write-ups and listened to SAP folks, and came to the conclusion that SAP really sees BPC as the tool for financial planning going forward.

But if you look at the consolidation scenario, SEM-BCS is still in SAP's plan. Also, whoever has used SEM-BCS knows that it comes with a lot of predefined processes which are not present in BPC (though they can be configured/developed with some tweaking). And as IFRS gains momentum, SAP BusinessObjects' consolidation tool, 'Cartesis', is getting a lot of focus.

There have been several communications from SAP on this, and considering all the points above, I feel that rolling out BPC with Financial Planning first and then Consolidation makes a lot of sense from a product roadmap angle. But from a business process angle, planning without actual data in the system is a big NO. Keeping that in mind, it is preferable to roll out Consolidation first, and then Budgeting & Forecasting, in BPC. You should think about which fits your company best.