Training details
- Training duration: 4 days
- Number of people in the group: 4 - 8
- Target group: Developers
- Software: PowerCenter version 10
- Prerequisites: RDBMS, SQL, PowerCenter: Data Integration for Developers
- List price for 1 participant: 550 EUR/day; discounts possible for larger groups
Discover Informatica PowerCenter 10. Learn to use the PowerCenter Designer, Workflow Manager, and Workflow Monitor tools in data integration processes.
Objectives
Upon successful completion of the course, the participant will acquire the skills necessary to:
- understand the architecture of the Informatica PowerCenter solution
- define the structure and use of PowerCenter parameter files
- implement advanced and user-defined data processing functions
- normalize and denormalize data with PowerCenter
- use the Lookup transformation in dynamic mode
- call SQL stored procedures from PowerCenter mappings
- create and configure SQL transformations
- design error handling strategies appropriate to the purpose of data processing in a workflow
- use PowerCenter's source-based, target-based, and user-defined transaction control features
- use data loading mechanisms that maintain referential integrity constraints in the target data model
- make proper use of PowerCenter's built-in and optional recovery capabilities for workflows and tasks
- create batch scripts using the PMCMD and PMREP command line tools
- apply the PowerCenter performance tuning methodology
- determine the performance impact of mapping design and apply this principle when designing mappings
- calculate the memory used by PowerCenter processes and manage its level of use
- apply pipeline partitioning to distribute data and optimize CPU and memory usage
Training agenda
MODULE 1
PowerCenter 10 Architecture
- Describe the components of the Informatica PowerCenter 10 architecture and define key terms
- Describe PowerCenter's optional and built-in high availability features
MODULE 2
Parameter Files
- Ascertain the use of the IsExprVar property in a mapping
- Determine the structure of a parameter file
- Establish the use of parameter files in mappings and sessions
- Describe the flexibility of using parameter files to build mapping expression logic
- Describe the use of a date/time mapping variable in a parameter file for incremental loading
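To illustrate the structure this module covers: PowerCenter parameter files use an INI-like layout in which section headings scope values to a folder, workflow, or session, `$` prefixes mark session parameters, and `$$` prefixes mark mapping parameters and variables. The folder, workflow, session, and parameter names below are hypothetical placeholders:

```
[Global]
; Values visible to all workflows and sessions
$$LoadDate=2024-01-01

[DEV_Folder.WF:wf_daily_load.ST:s_m_load_customers]
; Session-scoped overrides (names are illustrative)
$DBConnection_Source=Oracle_DEV
$$IncrementalStartTime=01/01/2024 00:00:00
```

A date/time mapping variable such as `$$IncrementalStartTime` is the typical building block for the incremental-loading pattern mentioned above.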
MODULE 3
User-Defined and Advanced Functions
- Describe and implement advanced functions
- Describe User-Defined functions
- Create a public user-defined function that standardizes name formatting and implement the UDF in a mapping
- Use the AES_Encrypt and Encode functions to encrypt and encode customer data before writing it to a flat file
- Debug the mapping using an existing session and observe the results
MODULE 4
Pivoting Data
- Describe the use of a Normalizer transformation to normalize data
- Describe the use of an Aggregator to denormalize data
- Normalize data into a relational table
- Denormalize data into a Fact table
MODULE 5
Dynamic Lookups
- Define Dynamic Lookup
- Describe the Dynamic Lookup Cache
- Use a Dynamic Lookup to load data into a dimension table
- Use a Dynamic Lookup in tandem with an Update Strategy transformation to keep historic data in a dimension table
MODULE 6
Stored Procedure and SQL Transformations
- Call a SQL stored procedure from a PowerCenter mapping
- Create and configure a SQL transformation in script mode
- Create and configure a SQL transformation in query mode
- Use a SQL transformation to create tables on an "as needed" basis
- Enter a properly formatted query into a SQL transformation
- Locate database errors in the result output of a SQL transformation
MODULE 7
Troubleshooting Methodology and Error Handling
- Design error handling strategies appropriate for the intended purpose of a workflow
- Identify data errors and load them to an error table
- Describe Update Strategies
MODULE 8
Transaction Processing
- Describe PowerCenter source-based, target-based, and user-based transaction control with and without the high availability option
- Describe constraint-based loading in databases with referential integrity constraints
- Load data to a set of tables with an RDBMS primary key/foreign key relationship
MODULE 9
Transaction Control Transformation
- Describe the use of the transaction control transformation for data-driven transaction control
- Control when data is committed to disk or the target database
- Use a transformation variable to create a flag that determines when to commit data to the RDBMS based upon data values
MODULE 10
Recovery
- Describe workflow and task recovery with and without the high availability option
- Recover tasks and workflows that stop, abort, or terminate
- Verify that recovery workflow works in a consistent, reliable manner
MODULE 11
Command Line Programs
- Describe PMCMD, PMREP, and INFACMD command line functionality
- Build batch files that use PMCMD and PMREP command line programs
- Use the command line utilities to execute a variety of platform status, query, object export, and workflow tasks
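A minimal sketch of the kind of batch file this module targets. The service, domain, repository, folder, and workflow names are placeholders, and credentials are assumed to come from environment variables:

```
REM Start a workflow and wait for completion (all names are placeholders)
pmcmd startworkflow -sv IntSvc_Dev -d Domain_Dev -u %PM_USER% -p %PM_PASS% -f DEV_Folder -wait wf_daily_load

REM Connect to the repository and export the workflow definition to XML
pmrep connect -r Repo_Dev -d Domain_Dev -n %PM_USER% -x %PM_PASS%
pmrep objectexport -n wf_daily_load -o workflow -f DEV_Folder -u wf_daily_load.xml
```

Scripts like this are typically driven by an enterprise scheduler, with the pmcmd exit code used to detect workflow failure.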
MODULE 12
Performance Tuning Methodology
- Isolate source, target and engine bottlenecks
- Interpret the performance counters
- Tune different types of bottlenecks
- Run a benchmark test
- Run a target bottleneck test
- Evaluate the results
MODULE 13
Performance Tuning: Mapping Design
- Apply best practices in your mappings to optimize performance
- Locate session properties that can unnecessarily lower performance
- Inspect and edit mappings for optimal performance design
- Inspect and edit transformations for optimal performance design
MODULE 14
Memory Optimization
- Tune session-level memory
- Tune transformation caches
- Calculate how much memory a session uses
- Become familiar with PowerCenter Performance Counters
- Edit session memory limits
- Edit transformation cache memory properties
- Calculate memory cache sizes for transformations
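The cache-sizing exercise above can be approximated outside the tool. The sketch below uses a deliberately simplified model (a fixed per-row overhead), not Informatica's exact sizing algorithm; all numbers are illustrative only:

```python
def estimate_lookup_cache(rows, key_bytes, row_bytes, overhead_per_row=16):
    """Rough index- and data-cache estimate in bytes (simplified model,
    not Informatica's actual formula)."""
    index_cache = rows * (key_bytes + overhead_per_row)  # cached lookup keys
    data_cache = rows * (row_bytes + overhead_per_row)   # cached output rows
    return index_cache, data_cache

# Example: 1M rows, 8-byte key, 120 bytes of connected output ports
index_cache, data_cache = estimate_lookup_cache(1_000_000, 8, 120)
print(f"Index cache ~ {index_cache / 2**20:.1f} MiB")  # ~22.9 MiB
print(f"Data cache  ~ {data_cache / 2**20:.1f} MiB")   # ~129.7 MiB
```

Estimates like this feed directly into the session-level index and data cache properties tuned in this module.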
MODULE 15
Performance Tuning: Pipeline Partitioning
- Apply partition points to efficiently utilize your CPU
- Partition your data efficiently to utilize your CPU
- Distribute your partitioned data to preserve functionality while optimizing your CPU
- Optimize your memory usage according to your partitioning strategy