Monday, 18 November 2013

IDQ Interview Questions Part1

Q1: What is the difference between the PowerCenter Integration Service and the Data Integration Service? The PowerCenter Integration Service is an application service that runs sessions and workflows. The Data Integration Service is an application service that performs data integration tasks for the Analyst tool, the Developer tool, and external clients. The Analyst tool and the Developer tool send data integration task requests to the Data Integration...

IDQ Functionality

Use IDQ to design and run processes that complete the following tasks. Profile data: profiling reveals the content and structure of data. Profiling is a key step in any data project, as it can identify strengths and weaknesses in data and help you define a project plan. Create scorecards to review data quality: a scorecard is a graphical representation of the quality measurements in a profile. Standardize data values: Standardize...

Tuesday, 29 October 2013

IDQ Parser Transformation

In this article we cover the Parser transformation, one of the most important transformations in IDQ. Parsing is the core function of any data quality tool, and IDQ provides rich parsing functionality to handle complex patterns. A Parser transformation can be created in two modes: Token Parsing mode and Pattern-Based Parsing mode. Token-based parsing is used to parse strings that match token sets regular...
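As a rough analogy (this is plain Python, not IDQ itself), token-based parsing amounts to labeling each token of a string with the first token set whose pattern it matches. The token sets and sample value below are illustrative assumptions, not IDQ's built-in definitions:

```python
import re

# Illustrative token patterns (assumptions, not IDQ's built-in token sets)
TOKEN_SETS = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "PHONE": r"\d{3}-\d{3}-\d{4}",
    "WORD": r"[A-Za-z]+",
}

def parse_tokens(value: str) -> list[tuple[str, str]]:
    """Label each whitespace-separated token with the first matching token set."""
    labeled = []
    for token in value.split():
        for name, pattern in TOKEN_SETS.items():
            if re.fullmatch(pattern, token):
                labeled.append((name, token))
                break
        else:
            # No token set matched; IDQ would route this to an overflow port
            labeled.append(("UNPARSED", token))
    return labeled

print(parse_tokens("John Doe john.doe@example.com 555-123-4567"))
```

In IDQ the equivalent configuration is done in the Developer tool by attaching token sets or regular expressions to the transformation's output ports; the sketch only mirrors the matching logic.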

Generating Dynamic Multiple Target files in Informatica

Recently we came across a scenario to generate multiple dynamic target files in Informatica. We receive vendor data from a legacy database in the table below, with columns Invoice_ID, Invoice_No, Invoice_Amount, and Vendor_Id. We need to separate all details related to one vendor into its own file, so that we can pass data to third-party vendors in separate files.

INVOICE_DETAILS
INVOICE_ID  INVOICE_NO  INVOICE_AMOUNT  VENDOR_ID
1           A01         100.00          10
2           A02         125.00          10...
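Outside Informatica, the same split reduces to grouping rows by vendor and writing one file per group. The sketch below is a minimal Python equivalent, not the mapping itself; the file naming scheme and the third sample row (vendor 20) are assumptions added for illustration:

```python
import csv
from collections import defaultdict

# Sample rows; the first two match the excerpt above, the third is a
# hypothetical row for a second vendor so two files get produced
rows = [
    {"INVOICE_ID": 1, "INVOICE_NO": "A01", "INVOICE_AMOUNT": "100.00", "VENDOR_ID": 10},
    {"INVOICE_ID": 2, "INVOICE_NO": "A02", "INVOICE_AMOUNT": "125.00", "VENDOR_ID": 10},
    {"INVOICE_ID": 3, "INVOICE_NO": "A03", "INVOICE_AMOUNT": "150.00", "VENDOR_ID": 20},
]

# Group rows by vendor
by_vendor = defaultdict(list)
for row in rows:
    by_vendor[row["VENDOR_ID"]].append(row)

# Write one CSV file per vendor (hypothetical naming scheme)
for vendor_id, vendor_rows in by_vendor.items():
    filename = f"invoices_vendor_{vendor_id}.csv"
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=vendor_rows[0].keys())
        writer.writeheader()
        writer.writerows(vendor_rows)
```

In Informatica the usual approach is a Transaction Control transformation together with the FileName port on the flat-file target; the sketch only shows the grouping logic the mapping has to express.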

Saturday, 26 October 2013

Informatica Port Order

Port Order. Informatica calculates ports in the following order: Input ports. Informatica calculates all input ports first, as they do not depend on any other ports, so we can create input ports in any order. Variable ports. Variable ports can reference input ports and other variable ports, but not output ports. Because variable ports can reference input ports, Informatica calculates variable ports after input ports. Likewise, since variables can reference...
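The ordering can be illustrated with a toy evaluator in plain Python (the port names and expressions are invented for illustration, not taken from any real mapping):

```python
# Toy model of port evaluation order: input ports first, then variable
# ports top to bottom, then output ports.
def evaluate_row(inputs: dict) -> dict:
    ports = dict(inputs)  # 1. input ports are available first

    # 2. variable ports, evaluated in port order; they may reference
    #    input ports and previously evaluated variable ports
    ports["v_total"] = ports["price"] * ports["qty"]
    ports["v_discounted"] = ports["v_total"] * 0.9  # references v_total

    # 3. output ports, evaluated last; they may reference input and
    #    variable ports, but no port may reference an output port
    return {"OUT_AMOUNT": round(ports["v_discounted"], 2)}

print(evaluate_row({"price": 10.0, "qty": 3}))
```

Reordering the two variable lines would break `v_discounted`, which is exactly why variable port order matters inside an Expression transformation.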

Thursday, 24 October 2013

Informatica 9 new features for developers

Informatica 9 has a lot of new features, including IDQ and Informatica Analyst. In this post we will focus on the features that are especially useful for developers. Lookup Transformation: Cache updates. We can update the lookup cache based on the results of an expression: when an expression is true, we can add to or update the lookup cache. We can update the dynamic lookup cache with the results of an expression. Multiple rows return: we can configure the Lookup transformation to return all rows that...

Wednesday, 23 October 2013

Informatica Scenario Based Interview Questions (Part1)

The interviewer asks, "How will you get the first 3 records from a flat file source?" You think to yourself, "That's simple," and reply, "We can use a variable port in an Expression transformation, increment it for each row, and then use a Filter transformation to pass just the first three records." The interviewer then pulls a new trick from his sleeve: "How will you get the last 3 records from a flat file source?" Déjà vu... You will think, try to build some logic on the fly, and try to explain...
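Stripped of the Informatica transformations, the two scenarios reduce to the sketch below (a Python analogy of the row-counting and bounded-buffer ideas, not the mapping itself; row values are made up):

```python
from collections import deque

def first_n(rows, n=3):
    # Mirrors the Expression + Filter approach: number each row as it
    # streams past and pass only rows whose number is <= n
    out = []
    for row_num, row in enumerate(rows, start=1):
        if row_num <= n:
            out.append(row)
    return out

def last_n(rows, n=3):
    # A buffer bounded to n rows keeps only the most recent n; whatever
    # remains after the final row is the answer
    return list(deque(rows, maxlen=n))

rows = ["r1", "r2", "r3", "r4", "r5"]
print(first_n(rows))  # ['r1', 'r2', 'r3']
print(last_n(rows))   # ['r3', 'r4', 'r5']
```

The last-3 case is the interesting one: because a flat file is read as a stream, you cannot know a row is among the last three until the file ends, which is why it needs a different trick than the simple row counter.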