
Improve Performance In Informatica


Optimizing the Target Database

If your session writes to a flat file target, you can optimize session performance by writing to a flat file that is local to the Informatica Server.

If your session writes to a relational target, consider performing the following tasks to increase performance:
  • Drop indexes and key constraints.
  • Increase checkpoint intervals.
  • Use bulk loading.
  • Use external loading.
  • Turn off recovery.
  • Increase database network packet size.
  • Optimize Oracle target databases.


Dropping Indexes and Key Constraints

When you define key constraints or indexes in target tables, you slow the loading of data to those tables. To improve performance, drop indexes and key constraints before running your session. You can rebuild those indexes and key constraints after the session completes.
If you decide to drop and rebuild indexes and key constraints on a regular basis, you can create pre- and post-load stored procedures to perform these operations each time you run the session.
Note: To optimize performance, use constraint-based loading only if necessary.
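As a rough sketch, the pre- and post-load SQL might look like the following, using Oracle-style syntax; the table, index, and constraint names are hypothetical:

  -- Pre-load: drop the key constraint and index before the session runs
  ALTER TABLE sales_target DROP CONSTRAINT pk_sales_target;
  DROP INDEX idx_sales_cust;

  -- Post-load: rebuild them after the session completes
  ALTER TABLE sales_target ADD CONSTRAINT pk_sales_target PRIMARY KEY (sale_id);
  CREATE INDEX idx_sales_cust ON sales_target (customer_id);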


Increasing Checkpoint Intervals

Informatica Server performance slows each time the server waits for the target database to perform a checkpoint. To increase performance, consider increasing the database checkpoint interval. When you increase the checkpoint interval, the database performs checkpoints less often, ideally only when the database log file reaches its size limit.
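How you change the checkpoint interval depends on the target database. On Oracle, the interval is typically controlled by the LOG_CHECKPOINT_INTERVAL and LOG_CHECKPOINT_TIMEOUT parameters in init.ora. On Microsoft SQL Server, a minimal sketch with an illustrative value looks like this:

  -- Raise the recovery interval (in minutes) so automatic checkpoints occur less often
  EXEC sp_configure 'show advanced options', 1;
  RECONFIGURE;
  EXEC sp_configure 'recovery interval (min)', 10;
  RECONFIGURE;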


Bulk Loading on Sybase and Microsoft SQL Server

You can use bulk loading to improve the performance of a session that inserts a large amount of data to a Sybase or Microsoft SQL Server database. Configure bulk loading on the Targets dialog box in the session properties.
When bulk loading, the Informatica Server bypasses the database log, which speeds performance. Without writing to the database log, however, the target database cannot perform rollback. As a result, the Informatica Server cannot perform recovery of the session. Therefore, you must weigh the importance of improved session performance against the ability to recover an incomplete session.
If you have indexes or key constraints on your target tables and you want to enable bulk loading, you must drop the indexes and constraints before running the session. After the session completes, you can rebuild them. If you decide to use bulk loading with the session on a regular basis, you can create pre- and post-load stored procedures to drop and rebuild indexes and key constraints.
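On Microsoft SQL Server, for example, you can complement bulk loading by switching the target database to a minimally logged recovery model for the duration of the load. This is a sketch rather than a required step, and the database name is hypothetical:

  -- Before the load: minimize logging for bulk operations
  ALTER DATABASE target_dw SET RECOVERY BULK_LOGGED;
  -- After the load: restore full logging (and back up the log to keep the backup chain usable)
  ALTER DATABASE target_dw SET RECOVERY FULL;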

For other target databases, even if you configure the bulk loading option, the Informatica Server ignores the commit interval defined in the session properties and commits data as needed.


External Loading on Teradata, Oracle, and Sybase IQ

You can use the External Loader session option to integrate external loading with a session.
If you have a Teradata target database, you can use the Teradata external loader utility to bulk load target files.
If your target database runs on Oracle, you can use the Oracle SQL*Loader utility to bulk load target files. When you load data to an Oracle database using a partitioned session, you can increase performance if you create the Oracle target table with the same number of partitions you use for the session.
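For example, if the session runs with four partitions, the Oracle target table might be created with four matching partitions. The sketch below uses hash partitioning and hypothetical table and column names:

  -- Hypothetical Oracle target created with the same number of partitions as the session (4)
  CREATE TABLE sales_target (
    sale_id      NUMBER,
    customer_id  NUMBER,
    sale_amount  NUMBER(12,2)
  )
  PARTITION BY HASH (customer_id)
  PARTITIONS 4;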
If your target database runs on Sybase IQ, you can use the Sybase IQ external loader utility to bulk load target files. If your Sybase IQ database is local to the Informatica Server on your UNIX system, you can increase performance by loading data to target tables directly from named pipes. Use pmconfig to enable the SybaseIQLocaltoPMServer option. When you enable this option, the Informatica Server loads data directly from named pipes rather than writing to a flat file for the Sybase IQ external loader.


Increasing Database Network Packet Size

You can increase the network packet size in the Informatica Server Manager to reduce a target bottleneck. For Sybase and Microsoft SQL Server, increase the network packet size to 8K-16K. For Oracle, increase the network packet size in tnsnames.ora and listener.ora. If you increase the network packet size in the Informatica Server configuration, you must also configure the database server network memory to accept larger packet sizes.
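On Microsoft SQL Server, the database-side change might look like the sketch below (the 8192-byte value is illustrative). For Oracle, the equivalent change is typically made by adding an SDU setting to the relevant entries in tnsnames.ora and listener.ora.

  -- Allow larger network packets on the database server (advanced option)
  EXEC sp_configure 'show advanced options', 1;
  RECONFIGURE;
  EXEC sp_configure 'network packet size (B)', 8192;
  RECONFIGURE;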


Optimizing Oracle Target Databases

If your target database is Oracle, you can optimize the target database by checking the storage clause, space allocation, and rollback segments.
When you write to an Oracle database, check the storage clause for database objects. Make sure that the tables use large INITIAL and NEXT extent values. The database should also store table and index data in separate tablespaces, preferably on different disks.
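A sketch of that layout, with hypothetical object names, tablespaces, and sizes:

  -- Table data in its own tablespace with large initial and next extents
  CREATE TABLE sales_target (
    sale_id     NUMBER,
    sale_amount NUMBER(12,2)
  )
  TABLESPACE target_data
  STORAGE (INITIAL 100M NEXT 50M);

  -- Index data in a separate tablespace, ideally on a different disk
  CREATE INDEX idx_sales_id ON sales_target (sale_id)
  TABLESPACE target_idx
  STORAGE (INITIAL 50M NEXT 25M);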
When you write to Oracle target databases, the database uses rollback segments during loads. Make sure that the database stores rollback segments in appropriate tablespaces, preferably on different disks. The rollback segments should also have appropriate storage clauses.
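On instances that still use manually managed rollback segments, that setup might look like the sketch below; the names and sizes are hypothetical, and newer Oracle releases use automatic undo management instead:

  -- Rollback segment in its own tablespace, preferably on a separate disk
  CREATE ROLLBACK SEGMENT rbs_load
  TABLESPACE rbs_data
  STORAGE (INITIAL 10M NEXT 10M OPTIMAL 100M);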
You can optimize the Oracle target database by tuning the Oracle redo log. The Oracle database uses the redo log to log loading operations. Make sure that redo log size and buffer size are optimal. You can view redo log properties in the init.ora file.
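Before changing these settings, you can check the current redo log and log buffer sizes with a couple of dictionary queries (a sketch; LOG_BUFFER itself is set in init.ora):

  -- Size of each online redo log group, in bytes
  SELECT group#, bytes FROM v$log;
  -- Current redo log buffer size, in bytes
  SELECT name, value FROM v$parameter WHERE name = 'log_buffer';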
If your Oracle instance is local to the Informatica Server, you can optimize performance by using IPC protocol to connect to the Oracle database. You can set up Oracle database connection in listener.ora and tnsnames.ora.
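A minimal sketch of an IPC connect entry in tnsnames.ora follows; the alias, key, and service name are hypothetical, and listener.ora needs a matching IPC address entry:

  ORCL_IPC =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = IPC)(KEY = ORCL))
      (CONNECT_DATA = (SERVICE_NAME = ORCL))
    )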

