
What are Quality Assurance and Quality Control? And who are the PQA, CC and DPA in QA?


Hi All,

Quality Assurance and Quality Control


As test professionals we should know the difference between QA and QC. What we do day to day is actually Quality Control, while the people involved in bringing in CMMI and ISO standards do Quality Assurance. Quality Assurance is about maintaining the process in any given project or programme. Quality Control is about delivering the right product by verifying that the requirements are covered and work as expected.

We know there are multiple levels of testing in the testing world, such as System Test and Integration Test, and methodologies by which we execute them, such as Waterfall and Iterative.

Quality Assurance:

Let me explain what I know about Quality Assurance.

There are three different roles responsible for assuring the process in any project:

  1. PQA – Project Quality Analyst
  2. CC – Configuration Controller
  3. DPA – Defect Prevention Analyst

Project Quality Analyst – PQA role

A person in this role needs to know the pre-defined industry standards of the organization.

PQA’s Roles and Responsibilities

  1. Checking that document naming conventions meet the industry standard
  2. Checking the names of who prepared, reviewed and approved the deliverables
  3. Checking that reviews happened across all the customer-facing documents
  4. Checking that review defects are found, fixed, verified and captured for metrics
  5. Checking whether all the deliverables are kept in a common place that the stakeholders can access
  6. Checking that all the necessary deliverables are prepared by the project team
  7. Checking that the actual project delivery date and the date mentioned in the documents are the same
  8. Checking that stakeholder names and document owner names are mentioned correctly in all customer deliverables
  9. Differentiating customer-facing deliverables from internal-audit-specific deliverables as per the industry standards
  10. Verifying that the entry and exit criteria of every level of the SDLC are met, and collecting proof of the same
  11. PQAs will be involved in all levels of the SDLC

Each team has its own set of deliverables:

  1. Business Analyst teams will have deliverables like Business Requirement Documents, Business Understanding Documents, Requirement Traceability documents, etc.
  2. Development teams will have deliverables like High Level Design, Low Level Design, Functional Specifications, etc.
  3. Testing teams will have documents like Test Plans and Test Conditions

The PQA should validate all the documents that are supposed to be delivered to the clients and maintain them for internal audits.

CC – Configuration Controller

The Configuration Controller controls the versions and the placement of the documents in tools like VSS (Microsoft Visual SourceSafe) or Documentum.

Configuration Controller Roles and Responsibilities

  1. CCs are responsible for creating the folder structures in VSS or Documentum.

For example, in any project the following folders will be created to maintain the project's deliverables:

  1. Project Kick off
  2. Minutes of Meeting
  3. Requirements
  4. Review Log
  5. Development
     5.1 High Level Design
     5.2 Low Level Design
     5.3 Issue Log
     5.4 Emails
  6. Testing
     6.1 Unit Testing
     6.2 System Testing
     6.3 System Integration Testing
     6.4 User Acceptance Testing
  7. Production Support

CCs will have the admin rights to grant and revoke access to folders.

Developers should not have access to the folders related to Testing, and vice versa.

  2. CCs will maintain the check-in and check-out of the documents that go into VSS
  3. CCs will make sure the relevant documents are kept in the corresponding folders in VSS

DPA – Defect Prevention Analyst

Defect Prevention Analysts maintain the defects across the SDLC. For any raised defect the workflow should be maintained, and proper comments should be given when the defect is created. To begin with, all high severity defects should be fixed before moving from one phase to the next.

As testers, when we raise defects we need to concentrate on applying the Defect Cause and Defect Type in the defect management tool. This will help DPAs classify the defects and come up with prevention tips.

Defect Cause – what is the root cause of the defect? That is:

  • Is the defect caused by the upstream source data or test data?
  • Is the defect caused by incomplete or missing requirements?
  • Is the defect caused by an inconsistent requirement?
  • Is the defect caused by a code discrepancy?
  • If you find any anomalies in any documents, then raise the defects against the artefacts
  • If any of your defects leads to a change in the requirement, then please raise it as a Change Request – a CR can be on the actual business requirement or on design changes.

Defect Type – the classification of the defect, that is:

  • Is the defect related to Data Error
  • Is the defect related to Application Code
  • Is the defect related to Work Request
  • Is the defect related to Change Request
  • Is the defect related to Deployment
  • Is the defect related to Specification
  • Is the defect related to Artefact
  • Is the defect related to Production
  • Is the defect related to Process
  • Is the defect related to Environment
  • Is the defect related to Requirements
  • Is the defect related to Reviews

The DPA's most prominent work is to prepare the CAPA – "Corrective Action and Preventive Action".

DPA Roles and Responsibilities

  1. DPAs will collect the metrics related to defects periodically – weekly, monthly or ad hoc
  2. DPAs will group the defects by defect classification – for example, in a given period of time how many defects were raised against reviews, code and requirement changes – and collect the cause of each defect
  3. Then, using the metrics retrieved from the defect management tool, they draw a Fishbone diagram by filling in the necessary details
  4. Using statistical tools like Minitab, they calculate the Defect Density of the defects from each phase
  5. Then they will create prevention actions for the areas where the Defect Density is above the threshold limit.

Suppose your organization has a Defect Density threshold of 0.5 and the density for the Review defect type is more than 0.5; then they will ask the reviewers to do their work better and minimize review defects at every level of the SDLC.
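To make the idea concrete, here is a minimal sketch of how such a density figure could be pulled from a defect management tool's export. The table names (DEFECT_LOG, TEST_EXECUTION_LOG), the columns and the per-executed-test-case density measure are all assumptions for illustration; your organization may measure density per KLOC or per function point instead.

-- Defects per executed test case, by defect type (assumed schema).
SELECT  d.defect_type,
        COUNT(*) AS defect_count,
        COUNT(*) * 1.0 / t.executed_cases AS defect_density
FROM    DEFECT_LOG d
CROSS JOIN (SELECT COUNT(*) AS executed_cases
            FROM   TEST_EXECUTION_LOG) t
GROUP BY d.defect_type, t.executed_cases
HAVING  COUNT(*) * 1.0 / t.executed_cases > 0.5;   -- the assumed threshold

Any defect type returned by a query like this is a candidate for a preventive action.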

Cheers – Asik

My Imagination – How are Scientists and Software Testers related to each other?


Hi All,

Hope you guys had a wonderful X-Mas holiday :-).


As we all know, all inventions are made by asking more and more questions.

If Newton had not asked 'Why did the apple fall down from the tree?' then we would not have the theory of gravity.

If Archimedes had not asked 'Why did the water spill out of the tub when I was bathing?' then we would not have the Archimedes principle.

Likewise, when a tester asks questions about the application, like Why or How, and finds an answer that does not match what it is supposed to be, then IT IS A Discovery. So we are Scientists 🙂

As far as I am concerned, testing an application is asking questions about the application I am testing – that is the perfect way of finding software anomalies.

Each functional defect that we stop through our testing just saves a production fix, but if we stop a business gap then it saves the whole business need. So before kicking off the testing, make sure we know the business needs.

You will get a Functional Defect if you ask 'How are things working?' – the expected behaviour is already mentioned in the functional specifications, so any functionality missed out in the developed code can be found.

You will get a Requirement / Specification / Design defect if you ask 'Why are things working this way?' – you will need to check the business requirements, and if you feel something is wrong you will encounter anomalies that may come from any of the specifications related to that particular work request.

In this post I wanted to explain how important domain knowledge is for a tester.

Domains like Banking, Health Care, Manufacturing, Insurance, etc. – all these domains are closely related to us.

To modernize these functional areas:

> Business people will write specs to cater to the business needs, as Business Requirement Documents.

> Considering the business specifications, solution designers will prepare Functional Specification Documents.

> We testers and developers will refer to the above documents and develop and test the application.

How can you learn the business easily?

If you are working in the Banking domain and you have Loan functionality to be tested on your client's website, then please create a loan application in real time with your bank or through some other bank's online application.

If you are working in the Health Care domain and you have Inventory functionality to be tested, then go to a nearby chemist (medical shop), pick some medicines from here and there, go for billing and check how the shopkeeper handles your goods.

Like the above two examples, whatever business you are testing, please imagine that YOU ARE ALSO GOING TO USE THIS PRODUCT. Would you accept a defective product from your manufacturer? 'No', right? Then your testing will be perfect.

I hope all the readers know about Validation and Verification.

Let me tell you what I think about it,

Validation means we need to check that all the documents related to the given functionality are acceptable and valid.

Verification means the code is written from the validated specifications and is verified by us.

In the data warehouse world, the specification documents need to be validated, because even a simple mistake will create a huge problem at the end.

For example,

In the warehouse we keep amount columns in negative numbers (the bank owes us) as well as positive numbers (we owe the bank).

Business Need – All the transactions of the day to be extracted

For the extract, if the specification document asks us to pull the records where Balance > 0, then you will get only the customers who owe money to the bank – every negative-balance transaction is missed, which contradicts the business need.

So even a single symbol matters a lot!!! Before we start the verification, we need to validate first!!!
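A minimal SQL sketch of the difference, assuming a hypothetical transaction table TXN with columns TXN_DATE and BALANCE_AMT:

-- Spec as written: only positive balances, so negative-balance
-- transactions (where the bank owes the customer) are silently dropped.
SELECT *
FROM   TXN
WHERE  TXN_DATE = CURRENT_DATE
AND    BALANCE_AMT > 0;

-- Business need: every transaction of the day, whatever the sign.
SELECT *
FROM   TXN
WHERE  TXN_DATE = CURRENT_DATE;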

Let's discover along with finding defects 🙂

Cheers – Asik

Process Control & its importance in ETL Testing


Hi All,

After a small gap, today my blog is about Process Control and its importance in ETL Testing.


Everything in the world should be controlled by something, like how a mom controls her kids 🙂 Why do we need to control the ETL process? And why does it matter for the testers?

If we do not control the ETL process then we might not do the correct inserts and updates to the target system. The code may wear out and behave crazily, and if something goes wrong you can't find the root cause of the issue. If you lose a few records in an ETL process that has no process control, those records will remain unknown for ever. With process control, a data fix is easy if something goes wrong, and we can easily track the source behaviour.

The extract, transform and load mechanism is controlled by a few attributes:

1. Load Date
2. Load Time
3. Process Identifiers
4. Record Active Date
5. Record Expiry Date

Testers should verify these attributes to answer questions like:

1. Is the data loaded for today's run?
2. Is the data updated correctly?
3. Are we loading the correct data from the source system?
4. Are we maintaining the historical data?
5. Are the insert and update processes happening correctly?
6. Are we doing the correct catch-up for the delta records? (Catch-up means: I inserted a record today and its delta record comes one week later; this delta should expire the existing record and insert the new record.)
7. Is reconciliation achieved in both insert and update scenarios?
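Here is a minimal sketch of the kind of query a tester can run for questions 1 and 5. The table name TGT_CUSTOMER and the control columns LOAD_DT, RECORD_ACTIVE_DT, RECORD_EXPIRY_DT and PROCESS_ID are assumptions; use whatever your Source to Target specification defines.

-- Did today's run load anything, and under which process identifier?
SELECT  LOAD_DT, PROCESS_ID, COUNT(*) AS loaded_rows
FROM    TGT_CUSTOMER
WHERE   LOAD_DT = CURRENT_DATE
GROUP BY LOAD_DT, PROCESS_ID;

-- No record should expire before it becomes active.
SELECT  COUNT(*) AS bad_rows
FROM    TGT_CUSTOMER
WHERE   RECORD_EXPIRY_DT < RECORD_ACTIVE_DT;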

In the ETL process, the first step is loading into intermediate tables – just dumping all the source data into a table for further processing. You can't keep the data in the intermediate tables for ever because of the database size, so you need to delete it. You can either Truncate or Purge the data.

Truncate – every run, delete all the data and load fresh data into the intermediate tables.

Purge – keep a few days of data in the intermediate tables and delete only the older data. This is useful for keeping the historical data needed to do the catch-up.

Here testers need to verify, using the date columns in the intermediate tables, whether truncating of records is functioning, and, using the date and time columns, whether purging of records is happening for the given retention interval.
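A minimal sketch of both checks, assuming an intermediate table STG_TXN with a LOAD_DT column and a 7-day purge retention (both assumptions):

-- Truncate check: after the run, only today's load date should remain.
SELECT DISTINCT LOAD_DT FROM STG_TXN;

-- Purge check: nothing older than the retention window should remain.
SELECT COUNT(*) AS too_old
FROM   STG_TXN
WHERE  LOAD_DT < CURRENT_DATE - 7;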

So now we have extracted the source data and it is controlled for transforming – and we do not have any delta logic in the intermediate tables.

Coming to the warehouse tables, we have Inserts, Updates and Logical Deletes. Let me explain how these processes are controlled and how testers need to verify them.

Again, let me take my Facebook account as an example for better understanding.

Inserts and Process Control

Today I created a profile using the Facebook sign-up page!! When the Facebook engineers load my data into their warehouse they attach today's date as my Effective Date, that is 21/12/2013, and an Expiry Date of HIGH DATE (because they don't know when I will deactivate the account). They will give the record a Process Active Identifier (to denote whether it is an insert or update record), and as the record is active they will not give it a Process Expiry Identifier.

> Testers need to verify whether the date columns and the identifiers are for the current day's load or not.
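A sketch of that check, assuming hypothetical names: table DIM_PROFILE, columns EFFECTIVE_DT, EXPIRY_DT, PROCESS_ACTIVE_ID, PROCESS_EXPIRY_ID, and '9999-12-31' as the HIGH DATE:

-- Records inserted today should carry the high date, an active identifier
-- and no expiry identifier; anything returned here is a defect candidate.
SELECT  COUNT(*) AS bad_inserts
FROM    DIM_PROFILE
WHERE   EFFECTIVE_DT = CURRENT_DATE
AND   ( EXPIRY_DT <> DATE '9999-12-31'
     OR PROCESS_ACTIVE_ID IS NULL
     OR PROCESS_EXPIRY_ID IS NOT NULL );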

Updates and Process Control

The very next day I updated my Display Name, so now my record is going in as a delta:

what will happen to the existing record's Effective Date, Expiry Date, Process Active Identifier and Process Expiry Identifier?

and

what will happen to the new record's Effective Date, Expiry Date, Process Active Identifier and Process Expiry Identifier?

The answers to the above questions are based on our requirements, so testers should verify that the transformation of these process control values is achieved as per the Source to Target specification.
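For instance, if the requirement follows the common pattern where the old record is expired and the new record becomes the only open one, the checks could look like this sketch (same hypothetical names as above, plus a natural key column PROFILE_ID):

-- Each profile should have exactly one open (high-date) record after the delta.
SELECT  PROFILE_ID, COUNT(*) AS open_records
FROM    DIM_PROFILE
WHERE   EXPIRY_DT = DATE '9999-12-31'
GROUP BY PROFILE_ID
HAVING  COUNT(*) <> 1;

-- Every expired record should carry a Process Expiry Identifier.
SELECT  COUNT(*) AS badly_expired
FROM    DIM_PROFILE
WHERE   EXPIRY_DT < DATE '9999-12-31'
AND     PROCESS_EXPIRY_ID IS NULL;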

Logical Deletes and Process Control

On the third day I deactivate my Facebook account. Now my record in the warehouse should be deactivated. In this case,

what will happen to the Effective Date, Expiry Date, Process Active Identifier and Process Expiry Identifier columns for my deleted record?

Again, testers should verify that the correct transformation is achieved for the deleted records as per the transformation rules.
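If the rule is that a logical delete simply expires the open record without inserting a replacement row (an assumption; follow your own transformation rules), the check is a sketch like:

-- The deactivated profile should have no open record left.
SELECT  COUNT(*) AS still_active
FROM    DIM_PROFILE
WHERE   PROFILE_ID = 12345              -- the deactivated account (example key)
AND     EXPIRY_DT  = DATE '9999-12-31';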

By checking these process control attributes we can ensure that:

“Are we inserting the correct records into the warehouse?”

“Are we updating the correct records?”

“Are we logically deleting the correct records?”

“Is catch-up of the records achieved?”

“Are we maintaining the historical data?” and so on :-)

Cheers

Asik 🙂

Chapter 6 – Test Data Fabrication in Business Intelligence


Hi All,

Thanks for viewing my blog so far! I want to keep sharing my knowledge as much as I can – Happy Reading 🙂

In this blog I am going to explain Test Data. In other testing practices you don't need predefined test data, as we are the users who update the values either in the front-end applications or in some other applications. But in data warehouse testing we completely depend on our upstream (front-end) applications. We test whatever data the upstream guys send us.


We testers are the intermediate guys who buy the data from one team, then validate and transform it and sell it to the other guys 🙂


Let me explain in normal terms: you are doing business, you need to buy the goods from your supplier and sell them to your customer!

What all do you need to do?

– Need to buy quality products

– Need to check the count of what you ordered

– Need to check the specifications against what you asked for

– Need their support in fixing the issues

– If they can't fix the issues, then you need to repair the goods yourself and sell them to your customer

Now coming to our Business Intelligence scenario:

As a tester, what do you need to do with the data?

– Check the quality of the data (is it as per the specifications?)

– Check the count is correct using trailer records (refer to my Source Data Validation blog)

– If the data is not as per the requirement, ask the upstream guys (the people who gave you the data) to fix the data issues

– Check you have got all the data that you require; if not, ask your upstream to provide more data, and if they deny the request, please manufacture your own data 😦

As I said in my earlier blogs, we have the below types of DWH testing:

1. Loads from file to staging tables to IM tables
2. Loads from table to staging tables to IM tables
3. DIMN/FACT projects
4. Extract projects
5. Reports

Fabricating test data for staging-table-to-target-table loads is quite easy, but for extract testing and report testing it is a bit difficult 😦

For most of the positive scenarios the upstream will provide the data for us!! But we testers want to break the code, so we will have a few negative scenarios for which we will not get data from upstream. And for the scenarios which happen only once in a while we will not get the data either, so we need to create it ourselves.

For extracts, the Teradata SQL code will be written by joining more than 10 tables, and each table will have its own WHERE clauses. If, after extracting the records using the SQL, you find that a few scenarios are missing, then you need to set up data that flows across all the tables. (This will be explained in my Know about Extracts blog.)

As I said in my Test Environment blog about access, testers should have Create Table access in the test environment and Select access to the view database.

So using Create Table access we can create our own test data as follows:

Please read the steps below very carefully; if you don't understand, please send me an email: aashiqamrita@gmail.com


1. Create a TEMP table in your test environment using a CREATE TABLE query.
2. Insert sample records into your TEMP table using an INSERT INTO query.
3. Modify the data as per your requirement.
4. Insert the modified data back into your actual table.
5. Your test data is ready to use 🙂
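Here is a minimal Teradata-flavoured sketch of those steps. All database, table and column names (TEST_DB, SRC_DB, CUSTOMER, ACCOUNT_STATUS, CUSTOMER_ID) are made up for illustration:

-- 1. Create a TEMP table with the same structure as the source (no data copied).
CREATE TABLE TEST_DB.TEMP_CUSTOMER AS SRC_DB.CUSTOMER WITH NO DATA;

-- 2. Seed it with a few sample records.
INSERT INTO TEST_DB.TEMP_CUSTOMER
SELECT * FROM SRC_DB.CUSTOMER SAMPLE 10;

-- 3. Bend the data towards the scenario you need to cover.
UPDATE TEST_DB.TEMP_CUSTOMER
SET    ACCOUNT_STATUS = 'CLOSED'
WHERE  CUSTOMER_ID = 1001;

-- 4. Insert the modified data back into the actual table used by the ETL run,
--    so the fabricated records flow through the next load.
INSERT INTO SRC_DB.CUSTOMER
SELECT * FROM TEST_DB.TEMP_CUSTOMER;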

Hope you guys understood the test data manufacturing technique. If you need more support please comment on this blog 🙂

Thanks –

Happy Reading

Asik

Chapter 3 – Setting up Test Environment for BI Projects


Test Environment:


In general, BI projects will have 3 environments:

Production – where our tested code will function in real time

Development – where developers develop and unit test the code

Test Environment (System Test / SIT / E2E / UAT) – where the developed code will be deployed for testing

Setting up a test environment for a Business Intelligence project is critical because the data for each level of testing is different. Developers will develop the ETL code in the DEV (development) environment, and when the testing phase kicks off they will point their code to the testing environment.

Why do we need a separate environment?

1. ACCESS and Test Environment

Developers are the ones who design the code, and they will keep changing it until it works. We don't have version control in DataStage or in Teradata. The development environment is an open space for all the developers involved, and there is a high possibility of irregular updates to the code, so the DEV environment is loosely controlled. Developers will have INSERT / UPDATE / DELETE access to all the objects in the DEV and test environments. But testers will have only VIEW access on VIEWs (I will explain what a Table and a View are in my next blog). Testers can only verify that the data is as per the ETL code; they can't update any records to make them correct as per the requirement. And developers should also agree that whatever defects are found in our environment should be accepted and fixed in the DEV environment.

2. Data and Test Environments 

The data used for developing the code should be different from the data used for testing. Developers create test data for unit testing and are happy when the functionality works as expected, so they are not interested in source data quality. If any data causes trouble for them, they simply delete it and load the rest of the data. But testers should be very conscious about the data, because data is the key for us to proceed with testing. So testers should have their own data in their own environment; they should not depend on the source data that lives in the development environment.

What all components does a tester need access to?

1. File Landing Directory – if your project is a file-to-table load then the source system will send the files to a specified location. This location is different for Development and Testing. Testers should get access to this directory.

2. SELECT access to source views – testers can easily check whether they have Select access to the tables by simply querying them. If you do not have access, then please create a GRANT statement and send it to your DBA.

[GRANT SEL ON DB_Name to User_ID;]

Good practices when availing the access:

  1. A tester should run his queries against Views, not Tables, so leads should make sure that testers do not have Select access to the tables before they start execution.
  2. Developers create Views over the tables, and when they deploy the code into PRODUCTION their code will point to the VIEWs, not the TABLEs.
  3. Developers might have introduced a few filter conditions in the Views in order to stop duplicates, which might cause errors. If testers test the target data using the Tables, then the errors related to the Views cannot be identified.
  4. Testers should not have Insert, Update or Delete access to the Source and Target tables.
  5. Testers should have Create Table access on the test database for manufacturing data (explained in the Test Data Management blog).

Availability of the Source Data in Test Environment:

If the data load is into existing warehouse tables, then check that historical data is available in the test environment by simply selecting from the table. If there are no records in the test environment, then request the developer to copy sample records from PRODUCTION into the test environment.

[INSERT INTO TEST_DB.TABLE SEL * FROM PROD_DB.TABLE SAMPLE 1000;]

Q: Why should we have existing data in the test environment if the load is into an existing table?

A: Because our load into an existing table should not disturb the records that are already there. This can be verified using the record count before and after the load – the existing count should not be disturbed.

If the data load is into new warehouse tables, then the tables should not contain any data in the test environment.

Q: Why should we not have any data in the new target tables?

A: Because these tables do not exist in PRODUCTION yet, and we are the ones going to load the data into them, so they should not contain any data.

Reference Tables in Test Environment 

MAP and TYPE tables are used for referential integrity. A project can use the existing MAP or TYPE tables, or it can create its own based on the project requirement.

  1. Testers should verify that all the MAP and TYPE tables mentioned in the S2T (even if not used in an S2T transformation) are created in the test environment
  2. If a MAP or TYPE table already exists, then testers should verify that its data in the test environment and in the PROD environment is the same (see the sketch after this list)
  3. If the MAP or TYPE tables are created project-specific, then testers should verify the data is inserted correctly as per the insert statements provided in the S2T
  4. Testers should cross-check that the values inserted into the MAP and TYPE tables match the DDS (refer to the Appendix section for reference data). If you find any data missing from or added to the tables, please raise a defect to the data designer.
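A quick sketch of the test-versus-PROD comparison for item 2, assuming a hypothetical reference table named COUNTRY_MAP and that the databases are exposed as TEST_DB and PROD_DB:

-- Rows in PROD that are missing from TEST, and vice versa.
SELECT * FROM PROD_DB.COUNTRY_MAP
MINUS
SELECT * FROM TEST_DB.COUNTRY_MAP;

SELECT * FROM TEST_DB.COUNTRY_MAP
MINUS
SELECT * FROM PROD_DB.COUNTRY_MAP;

-- Both queries should return zero rows if the reference data matches.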

Now we have everything in our environment 🙂 Next we will go for validation of the specification in the next blog.

See you @ my next Blog.

Regards – Asik