Roadmap: to install and set up Oracle BI Applications, do the following:
Once you have installed and set up Oracle BI Applications components, configured the modules (optional), and customized Oracle BI Applications (optional), you are ready to start running ETL processes.
During the Oracle Business Analytics Warehouse configuration process, when you create the data warehouse tables (see "Creating Data Warehouse Tables"), you can create tables in one tablespace and indexes in another. However, for performance reasons, it is recommended that you create tablespaces as described below.
Tablespace Name | List of Tables |
---|---|
DIM_STG | W_*DS |
FACT_STG | W_*FS |
DIM | W_*D and W_*MD |
FACT | W_*F |
AGG | W_*A |
OTHER | Remaining W_* tables (for example, W_*G and W_*GS tables) |
DIM_INDX | Indexes of W_*D tables |
FACT_INDX | Indexes of W_*F tables |
OTHER_INDX | Remaining indexes of W* tables |
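As a sketch, the tablespace layout above could be created with DDL along these lines. The datafile paths and sizes here are assumptions; adapt them to your own storage layout and volumetry:

```sql
-- Sketch only: file names, sizes, and autoextend settings are assumptions.
CREATE TABLESPACE DIM_STG    DATAFILE '/u01/oradata/obia/dim_stg01.dbf'    SIZE 2G AUTOEXTEND ON;
CREATE TABLESPACE FACT_STG   DATAFILE '/u01/oradata/obia/fact_stg01.dbf'   SIZE 4G AUTOEXTEND ON;
CREATE TABLESPACE DIM        DATAFILE '/u01/oradata/obia/dim01.dbf'        SIZE 4G AUTOEXTEND ON;
CREATE TABLESPACE FACT       DATAFILE '/u01/oradata/obia/fact01.dbf'       SIZE 8G AUTOEXTEND ON;
CREATE TABLESPACE AGG        DATAFILE '/u01/oradata/obia/agg01.dbf'        SIZE 2G AUTOEXTEND ON;
CREATE TABLESPACE OTHER      DATAFILE '/u01/oradata/obia/other01.dbf'      SIZE 1G AUTOEXTEND ON;
CREATE TABLESPACE DIM_INDX   DATAFILE '/u01/oradata/obia/dim_indx01.dbf'   SIZE 2G AUTOEXTEND ON;
CREATE TABLESPACE FACT_INDX  DATAFILE '/u01/oradata/obia/fact_indx01.dbf'  SIZE 4G AUTOEXTEND ON;
CREATE TABLESPACE OTHER_INDX DATAFILE '/u01/oradata/obia/other_indx01.dbf' SIZE 1G AUTOEXTEND ON;
```

During "Creating Data Warehouse Tables" you then map tables and indexes to these names.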
To avoid fatal deadlocks during the ETL, make sure that you select the 'Session Level Retry on Deadlock' option in Informatica.
To configure the Business Analytics Data Warehouse on Oracle databases more easily, refer to the parameter template files init10gR2.ora and init11g.ora. These files are located in
<DRIVE>:\<BI Applications install directory>\dwrep\Documentation\.
The parameter template files provide parameter guidelines based on the cost-based optimizer for Oracle 10g and 11g.
Copy the appropriate template file into your ORACLE_HOME/dbs directory. Then, review the recommendations in the template file, and make the changes based on your specific database configuration. The database administrator should make changes to the settings based on performance monitoring and tuning considerations.
Note: The NLS_LENGTH_SEMANTICS parameter enables you to define byte- or character-length semantics. Oracle BI Applications supports BYTE and CHAR values for this parameter. You can add this parameter to the init10gR2.ora and init11g.ora files if you are using MLS characters.
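For example, to switch to character-length semantics for MLS data, the parameter would be added to the init file like this (a hedged fragment; BYTE is the Oracle default, so only add it if you need CHAR semantics):

```
# init10gR2.ora / init11g.ora fragment
NLS_LENGTH_SEMANTICS = CHAR
```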
This section contains additional suggestions for optimizing performance for Oracle databases.
These settings are required for adequate performance from the dedicated Web client.
To increase data throughput between the Oracle BI Server and the Oracle database, change the SDU and TDU settings in listener.ora. The default is 2 KB; the example below increases it to 16 KB.
# listener.ora (server side)
SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC = (SDU=16384)(TDU=16384)
      (ORACLE_HOME = /.....)
      (SID_NAME = SOLAP)
    )
  )

# tnsnames.ora (client side)
myhost_orcl.world =
  (DESCRIPTION = (SDU=16384)(TDU=16384)
    (ADDRESS = (PROTOCOL = TCP)(HOST = myhost)(PORT = 1521))
    (CONNECT_DATA = (SID = ORCL))
  )
After the Oracle BI Applications files have been installed on Machine A, the DAC Client is installed on Machine B, and the DAC Server is installed on Machine C.
In addition, the following files are copied from the installation machine (Machine A) to the Business Intelligence Deployment Tier (Machine Group F) as follows:
The BI Deployment tier is used to deploy the business intelligence dashboards.
The Oracle BI Applications installer requires that you have already installed Oracle Business Intelligence Enterprise Edition on a Windows machine. The instance of Oracle Business Intelligence Enterprise Edition must be a complete installation; that is, you must install it using the Complete setup type option. This installation is only required to enable the Oracle BI Applications installer to install the Oracle BI Applications files on a machine.
All of the information below can be found in the document System Requirements and Supported Platforms for Oracle® Business Intelligence Applications.
Software | Version |
---|---|
Oracle Business Intelligence Enterprise Edition | 10.1.3.4.1 |
Informatica PowerCenter | 8.6.1 with Hotfix 6 |
Data Warehouse Administration Console (DAC) | 10.1.3.4.1 with Patch 9179169 (file p9179169_101341_Generic.zip) |
Oracle Database | 10gR2: 10.2.0.4 or higher; 11gR1: 11.1.0.7; 11gR2: 11.2.0.1 or higher |
DAC produces parameter files that are used by Informatica. If an execution plan fails in DAC and you want to debug the workflow by running it directly from Informatica, then the parameter file produced by DAC should be visible to Informatica. This is one reason for the requirement to co-locate the DAC and Informatica components as stated above.
The DAC Client uses the Informatica pmrep and pmcmd command line programs when communicating with Informatica PowerCenter. The installation of PowerCenter Client Tools provides the pmrep executable file. The installation of PowerCenter Services provides the pmcmd executable file. The DAC installer performs post-install configuration tasks related to the pmrep and pmcmd executable files.
Component | Minimum Requirements |
---|---|
Oracle Business Analytics Warehouse | CPU: 2 GHz or better, 8 CPU cores minimum; RAM: 8 GB |
ETL Client (Oracle Business Intelligence Data Warehouse Administration Console Client and Informatica PowerCenter Client Tools) | CPU: 1 GHz or better; RAM: 1 GB |
ETL Server (Oracle Business Intelligence Data Warehouse Administration Console Server and Informatica PowerCenter Services) | CPU: 2 GHz or better, 4 CPU cores minimum; RAM: 8 GB; storage: 100 GB free space |
Software | Supported Platform |
---|---|
Informatica PowerCenter Services (Oracle Business Intelligence Data Warehouse Administration Console Server version 10.1.3.4.1) | Oracle Enterprise Linux 5 (32-bit); processor: x86 |
Informatica PowerCenter Client Tools (Oracle Business Intelligence Data Warehouse Administration Console Client version 10.1.3.4.1) | Windows 2000, XP, and above (32-bit) |
Oracle eBusiness Suite R12.1.1 is supported with Oracle Business Intelligence Applications version 7.9.6.1.
Setting NLS_LANG correctly allows proper conversion between the client and database character sets.
When these settings are the same, Oracle assumes that the data being sent or received is encoded in the same character set as the database character set, so character set validation or conversion may not be performed. This can lead to corrupt data if the client code page and the database character set are different and conversions are necessary.
You need to set the NLS_LANG environment variable on each machine that has the Oracle client installed.
The NLS_LANG character set should reflect the setting of the operating system character set of the client.
For example, if the database character set is AL32UTF8 and the client is running on a Windows operating system, then you should not set AL32UTF8 as the client character set in the NLS_LANG parameter because there are no UTF-8 WIN32 clients. Instead, the NLS_LANG setting should reflect the code page of the client.
To determine the code page of the Windows machine, run the following command
C:\Documents and Settings\Administrator>chcp
Active code page: 437
And then set the appropriate character set for the NLS_LANG windows environment variable.
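For the code page 437 shown above, the matching Oracle client character set is US8PC437, so the variable could be set as follows (an example; check Oracle's code page mapping for your own chcp output, and pick the language and territory that match your locale):

```bat
set NLS_LANG=AMERICAN_AMERICA.US8PC437
```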
The version file version_apps.txt can be found in this directory OracleBI_Home\Document\
Build: 7.9.6.1.100609.2038
Release Version: Oracle Business Intelligence Applications 7.9.6.1
Package: 100609.2038
The following directories or files have been created:
Oracle Database - Installation
Use the init.ora parameter template file located in OracleBI_Home\dwrep\Documentation.
The init files contain valuable information and a schema for data warehouse database creation.
One or more database instances are needed to hold the following:
Note the following points:
CREATE ROLE SSE_ROLE;
GRANT CONNECT TO SSE_ROLE;
GRANT RESOURCE TO SSE_ROLE;
GRANT SELECT ANY TABLE TO SSE_ROLE;
GRANT INSERT ANY TABLE TO SSE_ROLE;
GRANT UPDATE ANY TABLE TO SSE_ROLE;
GRANT DELETE ANY TABLE TO SSE_ROLE;
GRANT GRANT ANY OBJECT PRIVILEGE TO SSE_ROLE;
GRANT GRANT ANY PRIVILEGE TO SSE_ROLE;
For an Oracle database, the additional privileges SELECT, INSERT, UPDATE, DELETE, and GRANT must be granted to SSE_ROLE if the DAC Repository and the Informatica Repository are stored in a different database from the data warehouse database.
CREATE USER DWH_REP IDENTIFIED BY DWH_REP
  DEFAULT TABLESPACE users
  TEMPORARY TABLESPACE temp
  QUOTA UNLIMITED ON users;
GRANT CONNECT, RESOURCE, CREATE VIEW TO DWH_REP;
GRANT SSE_ROLE TO DWH_REP;
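To verify that the role and grants were applied, you could query the data dictionary as a privileged user (a quick sanity check, not part of the official setup steps):

```sql
-- Check the roles granted to the warehouse user.
SELECT grantee, granted_role
  FROM dba_role_privs
 WHERE grantee = 'DWH_REP';

-- Check the system privileges granted through SSE_ROLE.
SELECT privilege
  FROM dba_sys_privs
 WHERE grantee = 'SSE_ROLE';
```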
When you run the Repository Service:
Action Status for Restore Contents: Succeeded.
Repository Service is running in exclusive mode, please restart the Repository Service to change to normal mode.
Copy from the OBIA installation machine:
on the Informatica PowerCenter Server Machine respectively:
You can see the parameter values PMSourceFileDir and PMLookupFileDir in the Administration Console, in the Processes tab of the Integration Service.
ValidateDataCodePages enforces data code page compatibility.
When this option is disabled, the restrictions are lifted for source and target data code page selection, stored procedure and lookup database code page selection, and session sort order selection.
Administration Console > Integration Service > Properties tab. > Configuration Properties > Edit and Deselect the 'ValidateDataCodePages' check box.
The Integration Service performs data code page validation in Unicode data movement mode only (General Properties section).
Administration Console > Integration Service > Properties tab. > Custom Properties area > Edit and add the following parameters:
Enter the number of an available port. For example, 4006. This custom property configures Informatica PowerCenter Services to listen on the Server port number. DAC communicates with the PowerCenter Integration Services service using this port.
Enables Informatica to evaluate parameters within mapplets.
This procedure is not specific to the Siebel source system. It must be performed for all types of source systems.
If your source to target data movement configuration is Unicode to Unicode, you need to create a custom property called SiebelUnicodeDB on Integration Services. If your source to target data movement configuration is either Code Page to Code Page or Code Page to Unicode, you do not need to create this property.
In the custom properties of the Integration Service, you must add one property with:
For example, oltp@db204007.host.com olap@db204008.host.com, where:
Value | Field | Description |
---|---|---|
oltp | [user_OLTP] | The database user for the OLTP source database |
db204007.host.com | [ConnectString_OLTP] | The connect string for the OLTP database |
olap | [user_OLAP] | The database user for the Oracle Business Analytics Warehouse database |
db204008.host.com | [ConnectString_OLAP] | The connect string for the data warehouse database |
You need to create a connection for:
Example for the DataWarehouse:
Where:
For the EBS target:
The name of the relational connection must match the name entered in the "Physical Data Source" tab of the DAC Client.
The DAC installer installs DAC in the \orahome\10g\bifoundation\dac directory.
The DAC Client uses an ODBC connection to create and alter tables in the Oracle Business Analytics Warehouse.
If your source system is Siebel CRM, you must create an additional ODBC connection to the OLTP (source) database.
Create a DSN ODBC connection (see: ODBC - How to create a DSN connection in Windows) to the Oracle Business Analytics Warehouse database on the machine that hosts the DAC Client, with the following:
DAC uses the Informatica pmrep and pmcmd command line programs to communicate with Informatica PowerCenter in the following ways:
The pmrep/pmcmd program must be installed:
This is why the two pieces of software (client and server) must be co-located.
pmrep is already installed with the default installations; the pmcmd program must be manually copied from the PowerCenter Services bin folder to the PowerCenter Client bin folder.
DAC metadata files are extracted by the Oracle BI Applications installer (into the directory OracleBI\dwrep\DAC_metadata\) and you need to copy these files to the machines hosting the DAC Client and Server.
DAC_Client in this table refers to the directory OracleBI\dwrep\DAC_metadata.
DAC Machine | File/Folder Name | Source File/Folder | Destination File/Folder |
---|---|---|---|
Client | The complete Export folder | DAC_metadata\DAC_Client\export | DacHome\bifoundation\dac\export |
Client | The deletetriggers.list file | DAC_metadata\DAC_Client | DacHome\bifoundation\dac\conf |
Server | The parameterfileOLTP.txt file | DAC_metadata\DAC_Server | DacHome\bifoundation\dac\Informatica\parameters\input |
Server | The parameterfileDW.txt file | DAC_metadata\DAC_Server | DacHome\bifoundation\dac\Informatica\parameters\input |
If your source or target database is a DB2/390 database, run DAC_DB2390_Source.sql or DAC_DB2390_Target.sql immediately after importing the seed data by connecting to the database where the DAC Repository resides. These files are stored in the \DAC directory.
as sys@obia_dwh
SELECT * FROM V$NLS_PARAMETERS where parameter = 'NLS_CHARACTERSET';
PARAMETER VALUE
------------------------------ ------------------------------
NLS_CHARACTERSET AL32UTF8
If a 'Success' message is displayed, the data warehouse tables have been created. If you want to see log information about the process, use the following log files.
If a 'Failure' message is displayed, the data warehouse tables have not been created. Use the log information in DAC_Home\bifoundation\log\config\generate_ctl.log to diagnose the error. The createtables.log is not generated.
On Windows, start the DAC Server by double-clicking the \DAC\startserver.bat script.
title Oracle DAC Server
call config.bat
Rem Uncomment the below if you want to see a DOS window with messages.
Rem and comment out the JAVAW line.
Rem
%JAVA% -Xms256m -Xmx1024m -cp %DACCLASSPATH% -Duser.dir=%DAC_HOME% com.siebel.etl.net.QServer
Rem
Rem start %JAVAW% -server -Xms256m -Xmx1024m -cp %DACCLASSPATH% -Duser.dir=%DAC_HOME% com.siebel.etl.net.QServer
28-sep-2010 4:34:47 com.siebel.etl.bootup.DAWUtils checkPermission
SEVERE: Checking the localhost with the values defined in the DAC repository!
28-sep-2010 4:34:47 com.siebel.etl.bootup.DAWUtils checkPermission
INFO: Adding server name : localhost
28-sep-2010 4:34:47 com.siebel.etl.bootup.DAWUtils checkPermission
INFO: Adding alternate server name : localhost
1 SEVERE Tue Sep 28 04:34:47 PDT 2010 Checking the localhost with the values defined in the DAC repository!
2 INFO Tue Sep 28 04:34:47 PDT 2010 Adding server name : localhost
3 INFO Tue Sep 28 04:34:47 PDT 2010 Adding alternate server name : localhost
4 INFO Tue Sep 28 04:34:47 PDT 2010 Bound to port 3141
5 INFO Tue Sep 28 04:34:47 PDT 2010 Creating ClientMessage Dispatcher with 2 worker-threads
6 INFO Tue Sep 28 04:34:47 PDT 2010 SERVER_NETWORK_MESSAGE: Created ClientMessage Dispatcher with 2 worker-threads
7 INFO Tue Sep 28 04:34:47 PDT 2010 com.siebel.etl.net.ClientMessageDispatcher registered with HeartBeatManager
8 INFO Tue Sep 28 04:34:47 PDT 2010 CONNECTION_ISSUE Repository starting polling
9 INFO Tue Sep 28 04:34:47 PDT 2010 CONNECTION_ISSUE Repository 10 aquired for polling
10 INFO Tue Sep 28 04:34:47 PDT 2010 CONNECTION_ISSUE Repository ending polling
Where:
For a description of all DAC System Properties, see “How to Set DAC System Properties”
Be sure that INFA_HOME\Server\bin and INFA_HOME\Client\bin are in the PATH environment variable. Otherwise, the DAC Server and DAC Client cannot find the pmcmd and pmrep programs, and the connection test will fail.
For a test machine without much power, you can halve the Max Sessions parameter, which determines the maximum number of sessions.
The Physical Data Sources tab displays a precreated record for the data warehouse with the name DataWarehouse, and one or more records for the OLTP sources. The records that are created by DAC for the OLTP sources depend on the business application source systems you selected when importing the DAC metadata.
You only need to set up the following as data sources:
The "Connection Type" field doesn't need any configuration.
where:
The data source number is passed as a parameter to the Informatica workflows. If you are using multiple sources, each data source has a unique number. Typically, all source dependent extracts will use this parameter to populate the DATASOURCE_NUM_ID column, and the source independent workflows will carry these values to the final dimension and fact tables.
Create the mails that you need on the setup > Email Recipient tab with the notification levels as follows:
The Inactive check box indicates whether the selected email recipient is active or inactive.
To set up the SMTP parameters, go to Tools > DAC Server Setup > Email Configuration (see: Email - How to verify your SMTP connection and parameters (TLS/SSL) with TELNET).
You cannot make any changes to the preconfigured containers. You must make a copy of a container before you can make any changes to it.
You set source system parameters (also known as DAC ETL Preferences) in DAC to specify how the ETL routines process data for a container. For example, if operating in the United States, you might set the $DLFT_COUNTRY to 'USA' to identify your data.
The following preferences are applied to all tasks within a container.
If extracting data from more than one source, these preferences will need to be re-applied to each associated container. They can be overridden at the task level by adding the parameter to the specific task and assigning a value there.
With the copy of the container, modify the parameters:
The Oracle BI Applications repository file, named OracleBIAnalyticsApps.rpd and located in the ORACLEBI_HOME\dwrep\server\Repository folder, has the credentials Administrator/SADMIN or Administrator/Admin123.
Before starting a full load (see: BI Apps: Problems With Index Creation During Full Load), make sure that there are no duplicates in the table PER_ALL_PEOPLE_F. The Vision database (the EBS sample database) contains a bad record.
SELECT person_id,
effective_start_date,
effective_end_date
FROM per_all_people_f
WHERE person_id = 6272;
person_id | effective_start_date | effective_end_date |
---|---|---|
6272 | 04-JAN-91 | 06-FEB-02 |
6272 | 04-JAN-91 | 31-DEC-12 |
A person ID cannot be effective twice for the same dates. This is technically possible because uniqueness in the database is enforced on the combination of all three columns person_id, effective_start_date, and effective_end_date, but it is functionally invalid. Correct the second row by setting its effective_start_date to 07-FEB-02.
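A hedged corrective statement for the Vision sample record above, assuming the row to fix is the one ending on 31-DEC-12; verify against your own data before updating:

```sql
-- Shift the start date of the overlapping row to the day
-- after the first row's effective_end_date (06-FEB-02).
UPDATE per_all_people_f
   SET effective_start_date = TO_DATE('07-FEB-02', 'DD-MON-RR')
 WHERE person_id = 6272
   AND effective_start_date = TO_DATE('04-JAN-91', 'DD-MON-RR')
   AND effective_end_date   = TO_DATE('31-DEC-12', 'DD-MON-RR');
COMMIT;
```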
The following two mappings fail in 7.9.6 (SDE_ORA_Stage_ARTransactionFact_DiffManDerive and SDE_ORA_Stage_GLRevenueFact_DiffManDerive) with this error: "Database connection name PARAM_OLTP_ORA11I for database connection variable DBConnection_OLTP is not well defined." See this solution: SDE_ORA_Stage_ARTransactionFact_DiffManDerive error.
Don't add the "SA Financial - Group Account Cleanup" subject area to the execution plan. See: ORA-00001: unique constraint error for the W_AR_XACT_F_U1 index.
$$Hint1= /*+ USE_NL(TARGET_TABLE SCD_HISTORY)*/
$$Hint2= /*+ USE_NL(SCD_OUTER DELTA_TABLE)*/
$$Hint3= /*+ INDEX(DELTA_OUTER W_PARTY_PER_D_T1_TEST_AK)*/
In the DAC log directory (DAC_Home\10gR3_1\bifoundation\dac\log), you can find the log file of the execution (such as Financial_Analytics_Gerardnico.21484569.log) containing this error:
Request to start workflow : 'SIL_InsertRowInRunTable' has completed with error code 3
Error Message : An error occurred in starting or running or stopping the workflow or task.
Just grab the pmcmd command and test it in a DOS shell.
>pmcmd startworkflow -sv int_obia7961 -d Domain_OBIA7961 -u dac -p dac -wait -f SILOS -paramfile E:\Informatica\PowerCenter8.6.1\server\infa_shared\SrcFiles\FlatFileConnection.DataWarehouse.SILOS.SIL_InsertRowInRunTable.txt SIL_InsertRowInRunTable
Informatica(r) PMCMD, version [8.6.1 HotFix10], build [412.0123], Windows 32-bit
Copyright (c) Informatica Corporation 1994 - 2010
All Rights Reserved.
Invoked at Sat Nov 06 13:15:54 2010
[PCSF_46007] No gateway connectivity is provided for domain [Domain_OBIA7961].
ERROR: Cannot connect to Integration Service [int_obia7961].
Completed at Sat Nov 06 13:15:54 2010
This error comes from pmcmd not being able to find the definition file of the domains. Just set the INFA_DOMAINS_FILE environment variable to point to the domains.infa file, such as:
C:\>set INFA_DOMAINS_FILE=E:\Informatica\PowerCenter8.6.1\domains.infa
More log: E:\Informatica\PowerCenter8.6.1\server\infa_shared\SessLogs
I don't know why, but it seems that the current schema is not the correct one when the Integration Service connects to the Oracle database.
To resolve this problem, you can add this command in the Connection Environment SQL field of the relational connection parameters:
ALTER SESSION SET CURRENT_SCHEMA = "DWH_REP";
After you have installed and configured Oracle BI Applications, your Oracle Business Analytics Warehouse (OLAP) database is empty. You need to perform a full load ETL to populate your Oracle Business Analytics Warehouse.
The ETL processes for Oracle BI Applications are created and managed in DAC.
This section uses an example to show you how to get started quickly with running a full load ETL. In this example, you have installed Oracle Financial Analytics with an Oracle EBS OLTP data source, and you want to load OLTP data for the subject area Receivables, as follows:
If the ETL runs:
Notice the global log file. It is in the DAC log directory and has the process ID number in its name.
If an ETL run fails, you cannot re-run the ETL until the failed ETL has been cleared from the Current Run tab. To clear an ETL from the Current Run tab, right click on the ETL and select Mark As Completed.
For any DAC problem (also during the load), see the log files.
If the process fails, use the \DAC\log\import.log file to diagnose errors.
If a 'Failure' message is displayed, the data warehouse tables have not been created. Use the log information in DAC_Home\bifoundation\log\config\generate_ctl.log to diagnose the error. The createtables.log is not generated.
It's not an error. You have to continue the configuration process in order to set them after the first start of the DAC Server.
The log files are located in DAC_HOME\dac\log. (such as C:\orahome\10gR3_1\bifoundation\dac\log)
Example of content of an error file:
ANOMALY INFO::: Error while executing pmcmd pingservice -sv obia7961int_obia7961 -d Domain_OBIA7961
MESSAGE:::Cannot run program "pmcmd": CreateProcess error=2, The system cannot find the file specified
Tip: this error occurs when you forget to add INFA_HOME\Server\bin and/or INFA_HOME\Client\bin to the PATH environment variable.
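Using the install location from this document, the PATH could be extended like this in a Windows shell. The Client bin path is an assumption here (this document only shows the server side); adjust both paths to your own installation:

```bat
set INFA_HOME=E:\Informatica\PowerCenter8.6.1
set PATH=%PATH%;%INFA_HOME%\server\bin;%INFA_HOME%\client\bin
```

For a permanent fix, set the same values in the system environment variables rather than in the shell session.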
E:\Informatica\PowerCenter8.6.1\server\bin>infacmd.bat ping
[ICMD_10135] Command requires at least one option specified.
[ICMD_10004] Usage:
ping [<-DomainName|-dn> domain_name]
[<-ServiceName|-sn> service_name]
[<-GatewayAddress|-dg> domain_gateway_host:port]
[<-NodeName|-nn> node_name]
[<-ResilienceTimeout|-re> timeout_period_in_seconds]
E:\Informatica\PowerCenter8.6.1\server\bin>infacmd.bat ping -dn "Domain_OBIA7961"
[ICMD_10052] Domain [Domain_OBIA7961] Host:Port [OBIA7961:6001] was successfully pinged.