Extract transform load

Application Life Cycle Management Tool – HP Quality Centre 11.0 for Beginners


Hi All,

HP Quality Center is quality management software offered by the HP Software Division of Hewlett-Packard, with many capabilities acquired from Mercury Interactive Corporation.[1] HP Quality Center offers software quality assurance, including requirements management, test management and business process testing for IT and application environments. HP Quality Center is a component of the HP Application Lifecycle Management software solution set. (Taken from Wikipedia)


I have been working in QC from version 9.0 through 11.0, and many key functionalities have been added in the new version.
I assume most of you have worked with Quality Center as a management tool. Here I will explain QC and its modules for those who want to learn it.

QC is a web tool; users can access it from anywhere, but only on the INTRANET (specific to your company network). You cannot access it from outside your company. Projects are created by QC Admins, and access to the projects is given by QC Admins as per your role. Quality Center has many functionalities; a user cannot access all of them and can use only those allowed by the access given.

  1. If you are a Test Analyst, you will not have access to delete any entity created at folder level.
  2. If you are a Test Analyst, you will not have access to move defects from Analysis status to Retest status.
  3. If you are a Developer, you will only have access to move defects from Analysis status to Retest status.
  4. If you are a Test Lead / Dev Lead, you will have most of the access.
  5. If you are a Test Manager, you will have access to all the functionalities in the Dashboard module.

Logging into Quality Center:

  • Use the URL (web link).
  • Login Name – enter your user ID (usually your Windows login user name).
  • Password – enter your password. (You will be given a default password at the time of user ID creation, and you can change it via Tools → Customize → Change Password.)
  • After entering your user name and password, check their authentication by clicking the Authenticate button.
  • After clicking it, QC shows the Domain and Project name assigned to you. If you are assigned to more than one project, use the drop-down and select the project you want to work on.
  • You can change the project from the QC home screen (top left corner) via Change Project.

Module 1 > Dashboard Module:

In the Dashboard module a user can see two types of view: the Analysis view and the Dashboard view. The Analysis view can be used to get reports for particular test sets, defects from particular test sets, and reports for a particular project. Dashboard views are used by Test Managers, who can create reports across programme level. In each view you will have two folders, Public and Private. If you create a report under Private, it is visible to the creator only; if you create a report under Public, it is visible to all users who have access to the domain and the project.

1. Analysis View

In this view a user can create a graphical report or a standard report. There is a (+) button; if you click it, you can see:

Graph Wizard, New Graph, New Project Report, New Excel Report and New Standard Report. Click on the type of report that you want and choose where to post it, either Private or Public.

Now I am using New Graph functionality.

  1. Click on New Graph (under the Private section).
  2. Upon clicking, you will get a new window where you need to select:
    1. Entity (Defects, Tests, etc.)
    2. Graph Type (Progress, Age, Summary, Trend)
    3. Graph Name (project name – user specific)
    4. After creating the report, you can see the name you gave it under the Private section (left side of the tool).
    5. Click on the report, then check your right-hand side; you can see 3 tabs: Details, Configuration and View.
    6. Details tab – the user can't edit anything; it auto-populates the values that we gave when creating the report.
    7. Configuration tab – this is the place where we set up what we need in the report.
      1. X-Axis – choose from the drop-down what you need to see on the X-axis.
      2. Y-Axis – choose from the drop-down what you need to see on the Y-axis.
      3. Grouped By – choose the entity by which the X-axis and Y-axis need to be grouped.
      4. You can also see a Filter button (funnel symbol); click on it and you will get a pop-up window where you need to set the target folder from which the graph gets its data.
      5. In that window you will have Filter and Cross Filter tabs. Choose the Cross Filter tab and you can see different sections like Defects, Test Sets, Requirements, etc.
      6. Choose the entity that you need. Right now I want to show my lead how many test cases I have executed over a period of time, so I will choose the Test Set section.
      7. Under the Test Set section you will have 3 radio buttons; click the radio button next to the None radio button.
      8. At the end of the text box you will see three dots (…). Click on them; this takes you to the Test Lab, where you can see all the test sets created under the given domain.
      9. Select the test set that you want from the test set tree.
      10. Click OK, then click on View; you can see a graphical representation of the data that you have in your Test Lab.
      11. By clicking Data Grid on the View tab, you can see the data as numbers.

Module 2 > Management Module:

The Management module allows Test Managers or Test Leads to set the Cycle Start and Cycle End dates for a given test set in the Test Lab. A user can't access a test set after the Cycle End date. This module can provide the data to drive test metrics related to effort.

Module 3 > Requirements Module:

Business requirements are captured in this module. You can add the actual business scenarios, or you can add test conditions as requirements. You can add requirements by simply clicking the Add New Requirement button, or you can upload them using the Excel upload add-in.

By capturing testable requirements we can achieve requirement traceability. Suppose you are testing a login screen; then you would set up a test condition such as

'Verify the login screen is working as expected by giving a valid user name and password'

This test condition can be explained further in detailed steps under the Test Plan module.

The key points that we need to maintain when setting up the test conditions are:

  1. We need to set the priority of each test condition.
  2. We need to write a meaningful test condition name.
  3. We need to choose the correct requirement type from the drop-down box (top right corner).
  4. The author name will be auto-populated from your login credentials (if I logged in, then my name would be populated as the author name).
  5. You can't delete a created test requirement, but you can cut and paste it into the Recycle Bin folder. (The Requirements, Test Plan and Test Lab modules each have a Recycle Bin to keep discarded content.)
  6. The created test conditions will be mapped to test cases from the Test Plan module. If you have not mapped a condition, its Direct Cover Status shows Not Covered; if you have mapped it but not executed it, it shows Not Completed; once you have executed the test case, the Direct Cover Status shows whether it Passed or Failed.

Module 4 > Testing

Under the Testing module we have the below-mentioned tabs:

Test Resources – this tab is used by QTP to keep the scripts

Test Plan – maintaining all the detailed steps involved in particular test conditions

Test Lab – we pull the test cases into the Test Lab for execution

Test Plan

We can simply add a test case using the New Test button, or we can write the test cases in a spreadsheet and upload them into QC using the QC Excel Add-in.

The key points that we need to maintain when setting up the Test Plan are:

  1. Click on the New Test button; you will see a window open.
  2. Enter a valid test name; it should usually match the name that we gave the test condition.
  3. Select the type of test that we are performing (top right corner).
  4. If any section is marked in red, it's a mandatory field and you need to enter a value.
  5. Select the SDLC phase from the drop-down box, e.g. System Test, E2E Test, Regression Test.
  6. Select the priority from the drop-down box; this priority should match the priority that we gave the test condition.
  7. Select the test type from the drop-down: what kind of testing you are doing, e.g. Functional or Non-Functional.
  8. Capability is mostly left as Do Not Know.
  9. Select the application that you are going to work on; this will be pre-defined by the QC Admin team.
  10. When you are creating the test case, please keep the Reviewed question as Not Reviewed and assign the reviewer name.
  11. Once you save the test case, an automatic email will be sent to the reviewer.
  12. The reviewer will then review the test case and set the status to Reviewed.
  13. Under the Description section we need to enter the following details; these will be pre-defined by the QC Admin team and populated for all the resources associated with the domain.

Objective:

Description:

Pre-Requisites/Data Setup:

Expected Results:

  1. We have set up the test case; now we need to map it to the test conditions. To do that, click on the test case name in the left-side pane so that you can see the below-mentioned tabs on your right side.
    1. Summary – all the details related to the test case (what we gave when creating it) are populated.
    2. Design Steps – holds the detailed steps that will be performed for the test condition.
    3. Parameters – the data that needs to be passed for automation frameworks.
    4. Test Configuration – auto-populates the values.
    5. Attachments – if you would like to add any documents related to this particular scenario, you can add them under this tab.
    6. Requirement Coverage – from this tab we can search for the test requirement (test condition) related to the test case and map it by drag and drop or using the Select Requirement button.
    7. Linked Defects – defects can be attached to a test case or to test steps, and the attached defects are shown under this tab.
    8. Dependency – this tab is used for automation tests, where you need to create dependencies between different modules.

PS: After mapping the test conditions to the test cases, please go to the Requirements module and check the Direct Cover Status; if the status is Covered, leave it, else refresh it to reflect the status.

Test Lab

We have set up the test conditions and test cases and linked them; now we need to pull them into the Test Lab and make them available for test execution.

Before pulling the test cases into the Test Lab, please create a test set using the New Test Set button. Clicking it opens a window where we need to enter a logical test set name.

Now that we have created the test set, pull the test cases into the Test Lab by going to the Execution Grid tab in the Test Lab module and clicking Select Tests; this takes you to the Test Plan tree, where you browse for the test cases that you want to move.

Pull all the test cases that you want. In all QC modules you will have the Select Columns button to customize the details displayed on the screen, so play across the columns you need and make them available on your screen.

The key points that we need to maintain when setting up the Test Lab are:

  • Running the test case
  1. You have the Run button to execute a test case. Under Run we have Run with Manual Runner and Continue Manual Run.
  2. Simply clicking Run leads you to the Run with Manual Runner option. Now you can see the test steps that were written in the Test Plan module.
  3. To pass a test step, press CTRL+P; to fail it, press CTRL+F. Please attach test evidence for each test case that you execute.
  4. Do not execute a test case more than once; if you do, an instance is created for each execution, and this will skew your test reports.
  5. If you kept a test step as Not Completed for any reason at the first execution and you want to execute that particular step, choose Continue Manual Run under the Run menu.
  6. If you are failing a step, you can create a defect for it.
  7. After failing the step, go to the Linked Defects tab and click the Add (+) button; it takes you to the Defects module, from where you can create a defect that is automatically linked to that particular step.
  8. If you are marking a test case as Not Applicable, then you need to attach evidence of why this test case was chosen as Not Applicable.
  9. If you are marking a test case as Deferred, then you need to attach evidence of why this test case was chosen as Deferred.

Module 5 > Defects

Defects can be added from the Test Plan module or from the Defects module. If a defect is related to a test case, then please do link the test case to the defect. You can also raise orphan defects without linking them to any test case.

Go to the Defects module and click the Add Defect button; you will get a window where we need to input the defect details.

  1. Summary – a brief description of the defect.
  2. A few columns will be auto-populated as per the QC configuration.
  3. When we raise a defect, it will be in New status as per the defect life cycle.
  4. Defect Type – we need to select the defect type from the drop-down, e.g. Application Code, Requirement Defect.
  5. Discovery Phase – we need to select the phase in which the defect was discovered, e.g. System Test, UAT, E2E.

And there will be more than 10 mandatory fields that we need to enter as per the project-specific details.

  1. Description – testers need to give a detailed description of the defect in the Description section:

Test User Name/ID
Pre-requisite
Steps to Replicate
Expected Result
Actual Result
Test Data Reference
Test Case Reference [Test Case Name (Step number)]

Defect Life Cycle in QC

  • New – when the defect is created
  • Analysis – when the defect is moved to the developers
  • Fix – when the fix is given by the developers
  • Deploy – when the developers deploy the fixed code into the test environment
  • Retest – when the code is ready for retesting
  • Closed – when the defect is retested and closed by the testers. (Once you close a defect, it can't be modified.)

Note: A user can't jump from one status to another by bypassing a status in between.
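In other words, the life cycle above is a simple linear state machine. Here is a minimal sketch of that rule in Python (the status names come from the list above; the helper itself is just an illustration, not part of QC):

# Defect life cycle statuses in QC, in order, as described above.
LIFE_CYCLE = ["New", "Analysis", "Fix", "Deploy", "Retest", "Closed"]

def is_valid_transition(current: str, target: str) -> bool:
    """A defect may only move to the immediately next status; no bypassing."""
    if current == "Closed":
        return False  # a closed defect can't be modified
    i, j = LIFE_CYCLE.index(current), LIFE_CYCLE.index(target)
    return j == i + 1

print(is_valid_transition("New", "Analysis"))  # True
print(is_valid_transition("New", "Fix"))       # False - bypasses Analysis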

Setting the Priority and Severity of Defects:

Do you remember that we set a priority on each test requirement when we created it in the Requirements module? If I raise a defect related to a test case whose priority was set to Low, then I keep the defect priority as Low too.

Cheers – Asik

Importance of Non Functional Testing in Data warehouse


Hi All

In this post I would like to share my knowledge of non-functional testing in the data warehouse world.


There are different types of non-functional testing that we do in the testing world; some of them are:

  1. Baseline testing
  2. Compatibility testing
  3. Compliance testing
  4. Documentation testing
  5. Endurance testing
  6. Load testing
  7. Localization testing and Internationalization testing
  8. Performance testing
  9. Recovery testing
  10. Resilience testing
  11. Security testing
  12. Scalability testing
  13. Stress testing
  14. Usability testing
  15. Volume testing

To me, non-functional testing is something that will not give direct business value; it is more about dealing with the environment. When we extract data from heterogeneous source systems, we might need to think about handling the following.

Verifying the volume of the data

No business can be sure of the volume of data that they will send; they can only say approximately. Our code should have the capability of pulling the maximum number of records that the source system can send at any point in time. To manage the volume of the data, Teradata has features called M-Load and T-Pump. When developers design the system, they fix a limit by which data will be loaded into the warehouse.

Example:

  • M-Load – if we get a data file with 100 records (the configured limit), the records will be loaded by the M-Load functionality
  • T-Pump – if we get a data file with fewer than 100 records, the records will be loaded by T-Pump

What we need to test here is: send a file with 100 records and check that the records are loaded by M-Load. This can be verified using the load job names.
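As a rough sketch of this routing rule in Python (the 100-record limit is the example figure from above, and the function is illustrative, not a Teradata API):

# Illustrative routing rule: files at or above the configured limit go via
# M-Load, smaller files via T-Pump. Real limits are fixed by the designers.
MLOAD_LIMIT = 100  # example limit from the post

def choose_loader(record_count: int) -> str:
    return "M-Load" if record_count >= MLOAD_LIMIT else "T-Pump"

# Tester's check: a 100-record file should be picked up by the M-Load job.
assert choose_loader(100) == "M-Load"
assert choose_loader(99) == "T-Pump"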

Verifying the date and time of data file arrival in the Unix landing directory

Most companies do not operate on weekends and public holidays, so our source systems will not send any transactional data on those days. Because of this, developers will design their jobs to archive any files arriving on these days.

Normally, Monday's transactional data will come to us for loading early on Tuesday morning, and so on through the week, with Friday's transactional data hitting us early on Saturday morning.

We as testers need to verify these schedules are working as per the specification. This can be achieved by:

  • sending a file on a weekend and checking that the file is archived
  • sending a file on a public holiday and checking that the file is archived
  • verifying Monday's transactional data is received on Tuesday morning, through to Friday's data on Saturday morning
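A minimal sketch of the archive rule in Python, assuming an example holiday list (real lists are region- and project-specific):

import datetime

# Example public holidays; the real list comes from the project calendar.
PUBLIC_HOLIDAYS = {datetime.date(2013, 12, 25), datetime.date(2013, 12, 26)}

def should_archive(arrival_date: datetime.date) -> bool:
    """Files arriving on weekends or public holidays are archived, not loaded."""
    is_weekend = arrival_date.weekday() >= 5  # 5 = Saturday, 6 = Sunday
    return is_weekend or arrival_date in PUBLIC_HOLIDAYS

assert should_archive(datetime.date(2013, 12, 21))      # Saturday -> archived
assert should_archive(datetime.date(2013, 12, 25))      # holiday -> archived
assert not should_archive(datetime.date(2013, 12, 24))  # Tuesday -> loaded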

Verifying Purging and Truncate Loads

I have already mentioned purging and truncate loads in my earlier blogs.

Purging – the AutoSys jobs will purge the data, leaving only the required data in the staging table. Suppose I have loaded the 10th, 11th and 12th of January data into the staging table; when I load the 13th of January data, the 10th of January data will be purged.

Truncate – simply load day_01 data, and when you load day_02 data, the day_01 data will be deleted.

We as testers need to verify that truncating and purging happen as per the design requirement.
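A minimal sketch of the purge rule, using the January example above (the three-day window is just that example; real windows are design-specific):

import datetime

RETENTION_DAYS = 3  # the example above keeps three load days in staging

def purge(staging_rows, new_load_date):
    """Drop rows older than the retention window (the 10th-13th Jan example)."""
    cutoff = new_load_date - datetime.timedelta(days=RETENTION_DAYS)
    return [row for row in staging_rows if row["load_date"] > cutoff]

rows = [{"load_date": datetime.date(2013, 1, d)} for d in (10, 11, 12)]
kept = purge(rows, datetime.date(2013, 1, 13))
assert [r["load_date"].day for r in kept] == [11, 12]  # the 10th is purged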

Verifying File Watcher Script

There will be a file watcher script that looks for a file until it arrives in the Unix landing directory. Say the source system promises to send the Day_01 file on 10-01-2013, so we set that date in the file watcher script. When the source system sends the records on 10-01-2013, our file watcher script checks the date in the file header; if both match, it processes the file into the staging table. If the source system fails to send the data on 11-01-2013, our file watcher job will look for the file on 11-01-2013 for the given time interval, and if it has not arrived, an automated email is sent to the concerned source system saying the file has not arrived.

So we as testers need to verify the file watcher job is working as expected.
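A minimal sketch of that decision in Python, assuming a header layout like the sample file in my hospital project post (the function and file names are hypothetical):

import os

def watch_for_file(landing_dir: str, filename: str, expected_date: str) -> str:
    """Sketch of the file-watcher decision described above.

    expected_date is the date configured in the watcher, e.g. '20130110'.
    The header date is read from the first record, assuming a layout like
    |H|Customer_Records|20130110|1245872| (borrowed from a later post).
    """
    path = os.path.join(landing_dir, filename)
    if not os.path.exists(path):
        # file did not arrive within the watch interval
        return f"ALERT: {filename} not arrived - email the source system"
    with open(path) as f:
        header_fields = f.readline().strip().split("|")
    if header_fields[3] == expected_date:
        return "process file into staging"
    return "reject: header date does not match the schedule"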

Cheers – Asik.

My Imagination – How Scientists and Software Testers are related to each other?


Hi All,

Hope you guys had a wonderful X-Mas holiday :-).


As we all know, all inventions are made by asking more and more questions.

If Newton had not asked 'Why did the apple fall down from the tree?' then we would not have the theory of gravity.

If Archimedes had not asked 'Why did water spill out of the tub when I was bathing?' then we would not have Archimedes' principle.

Likewise, if a tester asks questions about an application, like Why or How, and he/she finds an answer that does not match what it is supposed to be, then IT IS A Discovery. So we are scientists 🙂

As far as I am concerned, testing an application means asking questions about the application I am testing; it is the perfect way of finding software anomalies.

Each functional defect that we stop through our testing just saves production fixes, but if we stop a business gap then it saves the whole business need. So before kicking off the testing, make sure we know the business needs.

You will get a functional defect if you ask how things are working: the expected behaviour is already described in the functional specifications, so any functionality missed out in the developed code can be found.

You will get a requirement/specification/design defect if you ask why things are working: you will need to check the business requirements, and if you feel something is wrong you may uncover anomalies in any of the specifications related to that particular work request.

In this post I also want to explain how important domain knowledge is for a tester.

Domains like Banking, Health Care, Manufacturing, Insurance, etc. are all closely related to our daily lives.

To modernize these functional areas:

> Business people write specs to cater to the business needs, as Business Requirement Documents.

> Based on the business specifications, solution designers prepare Functional Specification Documents.

> We testers and developers refer to the above documents to develop and test the application.

How can you learn the business easily?

If you are working in the banking domain and you have loan functionality to be tested on your client's website, then please create a loan application in real time with your bank or with some other bank's online application.

If you are working in the health care domain and you have inventory functionality to be tested, then go to a nearby chemist (medical shop), pick some medicines from here and there, go for billing, and watch how the shopkeeper handles your goods.

Like the two examples above, whatever business you are testing, please imagine that YOU ARE ALSO GOING TO USE THIS PRODUCT. Would you accept a defective product from your manufacturer? 'No', right? Then your testing will be perfect.

I hope all the readers know about validation and verification?

Let me tell you what I think about it.

Validation means we need to verify that all the documents related to the given functionality are acceptable and valid.

Verification means that the code written from the validated specifications is verified by us.

In the data warehouse world, the specification documents need to be validated, because even a simple mistake will create a huge problem at the end.

For example:

In a warehouse we keep amount columns in negative numbers (the bank owes us) as well as positive numbers (we owe the bank).

Business need – all the transactions of the day are to be extracted.

For the extract, if the specification document asks us to pull the records where Balance > 0, then you will get only the customers who owe money to the bank.

So even a single symbol matters a lot!!! Before we start the verification we need to validate first!!!
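A tiny illustration in Python of what that one symbol costs (toy numbers, just for the sake of the example):

# Toy balances: negative = bank owes the customer, positive = customer owes the bank.
balances = [-250.00, 120.50, 0.00, -10.75, 300.00]

all_transactions = balances                    # the stated business need
spec_extract = [b for b in balances if b > 0]  # what the flawed spec asks for

print(len(all_transactions))  # 5 - every transaction of the day
print(len(spec_extract))      # 2 - only customers owing the bank; the rest are lost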

Let's discover along with finding defects 🙂

Cheers – Asik

Process Control & its importance in ETL Testing


Hi All,

After a gap, today my blog is about Process Control and its importance in ETL testing.


Everything in the world is controlled by something, like how a mom controls her kids 🙂 Why do we need to control the ETL process? And why does it matter for testers?

If we do not control the ETL process, then we might not do the correct inserts and updates to the target system. The code may wear out and behave crazily, and if something goes wrong you can't find the root cause of the issue. If you lose a few records in an uncontrolled ETL process, those records will remain unknown forever; with control, a data fix is easy if something goes wrong. We can easily trace the source behaviour by controlling the ETL process.

The extract, transform and load mechanism is controlled by a few attributes:

1. Load Date

2. Load Time

3. Process Identifiers

4. Record Active Date

5. Record Expiry Date

Testers should verify these attributes to answer:

1. Is the data loaded for today's run?

2. Is the data updated correctly?

3. Are we loading the correct data from the source system?

4. Are we maintaining the historical data?

5. Are the insert and update processes happening correctly?

6. Are we doing the correct catch-up for the delta records?

(Catch-up means: I have inserted a record today and its delta record comes one week later; this record should expire the existing record and insert the new record.)

7. Is reconciliation achieved in both insert and update scenarios?

In the ETL process, the first step is loading into intermediate tables: just dumping all the source data into a table for further processing. You can't keep the data in the intermediate tables forever, because of the database size, so you need to delete the data. You can either truncate or purge it.

Truncate – every time, delete and reload the data in the intermediate tables.

Purge – keep a few days of data in the intermediate tables and delete the older data. This is useful for retaining the historical data needed to do the catch-up.

Here testers need to verify that truncating of records is functioning, using the date columns in the intermediate tables, and that purging of records happens for the given retention interval, using the date and time columns.
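Minimal sketches of those two checks from a tester's point of view (assuming we can pull the distinct load dates from the intermediate table; the function names are illustrative):

import datetime

def verify_truncate(staging_load_dates, run_date):
    """After a truncate-and-load, only the current run's load date should remain."""
    return set(staging_load_dates) == {run_date}

def verify_purge(staging_load_dates, run_date, retention_days):
    """After a purge, no load date should be older than the retention window."""
    cutoff = run_date - datetime.timedelta(days=retention_days)
    return all(d > cutoff for d in staging_load_dates)

today = datetime.date(2013, 1, 13)
assert verify_truncate([today, today], today)
assert verify_purge([datetime.date(2013, 1, d) for d in (11, 12, 13)], today, 3)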

So now we have extracted the source data and it is controlled for transformation, and we do not have any delta logic in the intermediate tables.

Coming to the warehouse tables, we have inserts, updates and logical deletes; let me explain how these processes are controlled and how testers need to verify the same.

Again, let me take my Facebook account for better understanding.

Inserts and Process Control

Today I created a profile using the Facebook sign-up page!! When the Facebook engineers load my data into their warehouse, they attach today's date as my Effective Date, that is 21/12/2013, and an Expiry Date of HIGH DATE (because they don't know when I will deactivate the account). They will also give the record a Process Active Identifier (to denote whether it is an insert or update record); as the record is active, they will not give it a Process Expiry Identifier.

> Testers need to verify whether the date columns and the identifiers are for the current day's load.
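To make the shape of that record concrete, here is a minimal sketch in Python (the column names and the HIGH DATE value are common warehouse conventions, not Facebook's actual schema):

import datetime

HIGH_DATE = datetime.date(9999, 12, 31)  # a common convention for "open" records

def insert_record(key, attributes, load_date, process_id):
    """Sketch of an insert with process-control columns, per the sign-up example."""
    return {
        "key": key,
        **attributes,
        "effective_date": load_date,      # today's load date
        "expiry_date": HIGH_DATE,         # deactivation date unknown
        "active_process_id": process_id,  # identifier of the inserting run
        "expiry_process_id": None,        # active record -> no expiry identifier
    }

row = insert_record("asik", {"display_name": "Asik"},
                    datetime.date(2013, 12, 21), process_id=101)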

Updates and Process Control

The very next day I updated my display name, so now my record goes through delta processing.

What will happen to the existing record's Effective Date, Expiry Date, Process Active Identifier and Process Expiry Identifier?

and

what will happen to the new record's Effective Date, Expiry Date, Process Active Identifier and Process Expiry Identifier?

The answers to the above questions depend on our requirements. So testers should verify that the transformations of these process-control values are achieved as per the source-to-target specifications.
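One common pattern, and this is also the catch-up from the checklist earlier, is to expire the current record and insert the new version. A minimal sketch, reusing the record shape from the previous snippet (again, the exact column behaviour is whatever your requirements say):

import datetime

HIGH_DATE = datetime.date(9999, 12, 31)  # as in the previous sketch

def apply_delta(existing, new_attributes, load_date, process_id):
    """Expire the current record and insert the new version (one common pattern)."""
    existing["expiry_date"] = load_date          # close the old version
    existing["expiry_process_id"] = process_id   # identifier of the expiring run
    new_record = {
        **existing, **new_attributes,
        "effective_date": load_date,
        "expiry_date": HIGH_DATE,
        "active_process_id": process_id,
        "expiry_process_id": None,
    }
    return existing, new_record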

Logical Deletes and Process Control

On the third day I deactivate my Facebook account, so now my record in the warehouse should be deactivated. In that case,

what will happen to the Effective Date, Expiry Date, Process Active Identifier and Process Expiry Identifier columns for my deleted record?

Again, testers should verify that the correct transformation is achieved for the deleted records as per the transformation rules.

By verifying these process-control attributes we can ensure that:

"we are inserting the correct records into the warehouse"

"we are updating the correct records"

"we are logically deleting the correct records"

"catch-up of the records is achieved"

"the historical data is maintained, and so on :-)"

Cheers

Asik 🙂

Chapter 14 – A Sample ETL Project – Customer Management in a Hospital


Hi All,

Have a pleasant week ahead !!!

In this post let me explain a Project Structure.


I have a multi-specialty hospital located across the world. My hospital is famous for vaccination. Patients who come to my hospital anywhere in the world are given a user card with which they can access any of my hospitals.

Business Need

Now we maintain all customers in one database, and there are heaps and heaps of customers related to my hospital. So I decided to split up the customers based on country and load them into corresponding country tables. Elaborated details of this requirement will be mentioned in the Business Requirement Specification.

Design Need

To pull the customers per country, a designer should know all the places where the customer data is available. The data extraction will be done by our source system: they will pull all the relevant customer data and give us a data file.

In the design documents you will have a

Source File Specification

1)   File name specification – name string, extension of the files
2)   Date and time format of the file – YYYYMMDD, HHMMSSTT or any other format
3)   Control file specification – name string, extension of the files
4)   Header record layout – |H|Customer_Records|20131012|1245872|
5)   Detail record layout – |D|Asik|123456|20101012|20121013|MVD|Sanjay|NSW|AU|06031987|A
6)   Trailer record layout – |T|10|

The detail record layout tells you what data you are getting from the source, its data type, whether it is mandatory or not, and the length of each column.

File Position | Column Name         | Field Length | Data Type | Mandatory (Y/N) | Key Column
1             | Customer Name       | 255          | VARCHAR   | Y               | Y
2             | Customer ID         | 18           | VARCHAR   | Y               | N
3             | Customer Open Date  | 8            | DATE      | Y               | N
4             | Last Consulted Date | 8            | DATE      | N               | N
5             | Vaccination Type    | 5            | CHAR      | N               | N
6             | Doctor Consulted    | 255          | CHAR      | N               | N
7             | State               | 5            | CHAR      | N               | N
8             | Country             | 5            | CHAR      | N               | N
9             | Post Code           | 5            | INT       | N               | N
10            | Date of Birth       | 8            | DATE      | N               | N
11            | Active Customer     | 1            | CHAR      | N               | N


Please refer to my post – Source File Validation – to learn how to validate the above details.

The sample file format will be

|H|Customer_Records|20131012|1245872|
|D|Asik|123456|20101012|20121013|MVD|Sanjay|NSW|AU|06031987|A
|D|Alex|123457|20101012|20121013|MVD|Sanjay1|SA|USA|06031987|A
|D|John|123458|20101012|20121013|MVD|Sanjay|TN|IND|06031987|A
|D|Mathew|123459|20101012|20121013|MVD|Sanjay|WAS|PHIL|06031987|A
|D|Matt|12345|20101012|20121013|MVD|Sanjay|BOS|NYC|06031987|A
|D|Jacob|1256|20101012|20121013|MVD|Sanjay|VIC|AU|06031987|A
|D|Arun|1456|20101012|20121013|MVD|Sanjay|QA|AU|06031987|A
|D|Wague|23456|20101012|20121013|MVD|Sanjay|AOL|CAN|06031987|A
|D|Steve|126|20101012|20121013|MVD|Sanjay|NSW|AU|06031987|A
|D|Neil|156|20101012|20121013|MVD|Sanjay|NSW|AU|06031987|A
|T|10|
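Before the staging load, a tester can check the file structure itself. A minimal sketch in Python, assuming the layout above (the file name is hypothetical):

def validate_file(lines):
    """Minimal header/detail/trailer validation for the sample layout above."""
    records = [ln.strip() for ln in lines if ln.strip()]
    header, details, trailer = records[0], records[1:-1], records[-1]
    assert header.startswith("|H|"), "first record must be the header"
    assert trailer.startswith("|T|"), "last record must be the trailer"
    assert all(d.startswith("|D|") for d in details), "body must be detail records"
    expected = int(trailer.split("|")[2])  # |T|10| -> 10
    assert len(details) == expected, "trailer count must match the detail records"
    return True

with open("customer_records.dat") as f:  # hypothetical file name
    validate_file(f.readlines())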

Now, using the ETL process, we have loaded the data into the staging tables. The staging (intermediate) table will look like the one below.

Staging Table Load 


Please check my ETL Process post for how the data is loaded and what we need to verify in this step:

Name   | Cust_ID | Open_Dt  | Consul_Dt | VAC_ID | DR_Name | State | Country | DOB     | FLAG
Asik   | 123456  | 20101012 | 20121013  | MVD    | Sanjay  | NSW   | AU      | 6031987 | A
Alex   | 123457  | 20101012 | 20121013  | MVD    | Sanjay1 | SA    | USA     | 6031987 | A
John   | 123458  | 20101012 | 20121013  | MVD    | Sanjay  | TN    | IND     | 6031987 | A
Mathew | 123459  | 20101012 | 20121013  | MVD    | Sanjay  | WAS   | PHIL    | 6031987 | A
Matt   | 12345   | 20101012 | 20121013  | MVD    | Sanjay  | BOS   | NYC     | 6031987 | A
Jacob  | 1256    | 20101012 | 20121013  | MVD    | Sanjay  | VIC   | AU      | 6031987 | A
Arun   | 1456    | 20101012 | 20121013  | MVD    | Sanjay  | QA    | AU      | 6031987 | A
Wague  | 23456   | 20101012 | 20121013  | MVD    | Sanjay  | AOL   | CAN     | 6031987 | A
Steve  | 126     | 20101012 | 20121013  | MVD    | Sanjay  | NSW   | AU      | 6031987 | A
Neil   | 156     | 20101012 | 20121013  | MVD    | Sanjay  | NSW   | AU      | 6031987 | A

Now my staging data is ready, and we need to load it into the corresponding target tables.

As a project, we will test the data load from file to staging as Phase I, and from staging to the target tables as Phase II.

In this project we do not have any transformation rules, so we will insert the records from the staging tables into the corresponding target tables.

For example, all customers related to India will go to the India table, and so on. In my next blog let me come up with a few more complex examples of target table loads.

The source-to-target document will look like the one below.

Customers from India will be loaded by the below logic:

S.No | Source Column       | Source Table | Transformation Rule                            | Target Table | Key/Value | Target Column
1    | Customer Name       | CUST_STG     | If source value is null do not load, else load | CUST_IND     | Key       | CUST_NAME
2    | Customer ID         | CUST_STG     | If source value is null do not load, else load | CUST_IND     | Key       | CUST_ID
3    | Customer Open Date  | CUST_STG     | Assign source value                            | CUST_IND     | Value     | CUST_OP_DT
4    | Last Consulted Date | CUST_STG     | Assign source value                            | CUST_IND     | Value     | CUST_CONS_DT
5    | Vaccination Type    | CUST_STG     | Assign source value                            | CUST_IND     |           | CUST_VAC_TYPE
6    | Doctor Consulted    | CUST_STG     | Assign source value                            | CUST_IND     |           | CUST_DR_CONS
7    | State               | CUST_STG     | If source value is null do not load, else load | CUST_IND     | Value     | CUST_STATE
8    | Country             | CUST_STG     | If source value is null do not load, else load | CUST_IND     | Value     | CUST_COUNTRY
9    | Post Code           | CUST_STG     | If source value is null do not load, else load | CUST_IND     |           | CUST_POS_C
10   | Date of Birth       | CUST_STG     | Assign source value                            | CUST_IND     |           | CUST_DOB
11   | Active Customer     | CUST_STG     | Assign source value                            | CUST_IND     |           | CUST_STATUS

So based on the above rules we will load the data into the CUST_IND table.
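To make the mapping concrete, here is a minimal sketch in Python of the null-check and routing rules above (the dictionary keys follow the staging and target column names; only a few columns are shown, the rest follow the same pattern):

def transform_row(stg_row):
    """Sketch of the S2T rules above: key columns must be present,
    the remaining columns are assigned straight from the source."""
    if not stg_row.get("Name") or not stg_row.get("Cust_ID"):
        return None  # 'if source value is null do not load' on the key columns
    return {
        "CUST_NAME": stg_row["Name"],
        "CUST_ID": stg_row["Cust_ID"],
        "CUST_OP_DT": stg_row.get("Open_Dt"),      # assign source value
        "CUST_CONS_DT": stg_row.get("Consul_Dt"),  # assign source value
        "CUST_STATE": stg_row.get("State"),
        "CUST_COUNTRY": stg_row.get("Country"),
    }

def route(stg_rows):
    """Split staging rows into per-country tables, e.g. IND rows -> CUST_IND."""
    tables = {}
    for row in stg_rows:
        target = transform_row(row)
        if target is not None:
            tables.setdefault(f"CUST_{row['Country']}", []).append(target)
    return tables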

Cheers

Asik