We have tried to come up with some of the best practices in Informatica.
1) Always try to add an Expression transformation after the Source Qualifier and before the Target. If the source or target definition changes, it is easier to reconnect the ports.
2) Always use the COBOL file for the Normalizer in binary format; otherwise there are a lot of issues, especially with COMP-3 fields.
3) Remove unused ports. Though unused ports do not have any effect on performance, it is always better to remove them for better visibility.
4) If possible, do calculations in output ports instead of variable ports, as variable ports need to be assigned/reassigned for each row, which can slow down performance.
5) Avoid complex filter expressions. Instead, evaluate the expression in an upstream Expression transformation and pass the result to the Filter transformation. Too many complex calculations in the filter condition expression can slow down performance.
6) In the workflow's Source/Target directory properties, take advantage of Unix links. Instead of hard-coding the path in the source/target directory, specify the path through a Unix link.
For example, suppose in the development environment you specify the source directory path as /devserver/team/source, and on the production server you specify it as /prodserver/team/source. You can create a link named src_file_dir in $PMRootDir, pointing to /devserver/team/source on the dev server and to /prodserver/team/source on the prod server, and in your Source/Target file directory put the path $PMRootDir/src_file_dir. Then there is no need to change the Source/Target directory every time you move between development, testing, and production.
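The link setup above can be sketched in shell as follows. This is a minimal illustration, not Informatica's own setup: the $PMRootDir stand-in and the /tmp paths below are assumptions for the demo, and on a real server you would use the actual $PMRootDir and source directories.

```shell
#!/bin/sh
# Illustrative sketch only: stand-in directories, not real Informatica paths.
PMRootDir=${PMRootDir:-/tmp/pmrootdir_demo}     # stand-in for the real $PMRootDir
mkdir -p "$PMRootDir" /tmp/devserver/team/source

# On the dev server: point src_file_dir at the dev source directory.
# On prod, the same command would use /prodserver/team/source instead:
ln -sfn /tmp/devserver/team/source "$PMRootDir/src_file_dir"

# Sessions always reference $PMRootDir/src_file_dir; the link resolves
# to the environment-specific path:
readlink "$PMRootDir/src_file_dir"
```

Because the session property never changes, only the link target differs per environment, which is exactly what makes migrations painless.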
7) In the Sequence Generator, do not connect the CURRVAL port to a downstream transformation (unless required), because when CURRVAL is connected the Informatica Server processes one row in each block. You can optimize performance by connecting only the NEXTVAL port in a mapping.
8) Improve lookup performance by putting all conditions that use the equality operator '=' first in the list of conditions on the Condition tab.
9) Remember the rule of thumb not to cache lookup tables having more than 550,000 rows (assuming a row size of 1024). If your row size is smaller or larger than 1024, adjust the number of rows accordingly.
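As a back-of-the-envelope check of this threshold, here is the arithmetic, assuming the 1024 row size is in bytes (the post does not state the unit explicitly):

```shell
#!/bin/sh
# Rough size of a 550,000-row lookup cache, assuming 1024 bytes per row.
rows=550000
row_size=1024                          # bytes, assumed
total=$((rows * row_size))
echo "$total bytes"                    # 563200000 bytes
echo "$((total / 1024 / 1024)) MB"     # ~537 MB
```

So under the bytes assumption, the rule of thumb caps the lookup cache at roughly half a gigabyte; with a smaller row size, proportionally more rows can be cached.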
10) Avoid calculating the same value again and again. Instead, store it in a variable port and reuse it.
If you have further queries, please mail support@itnirvanas.com.
Tuesday, 3 February 2009
Best Practices in Informatica (Part1)
Hi,
Can you tell how you came to the conclusion of point 9? What unit is the 1024 here? KB? Bytes? If I assume bytes, your 550,000 rows come to about 537 MB; if I assume 1024 KB, then the 550,000 rows contribute about 537 GB.
What are you trying to infer here?
Cheers,
Sarma.
Hi,
I've given this suggestion and a few more in this article. Please have a look at it.
Cheers,
Sarma.
Hello,
Very nice posts. I really appreciate your efforts.
I have a doubt about determining the lookup cache size and index cache size for the Lookup transformation. On what basis do we decide their size?
For example, if we have a target of size 100 GB and we are taking that target as the lookup table, can we go with the default lookup cache and index cache sizes? If we need to change the cache, on what measure do we change it?
Thanks in advance.
--
suri