Quality Assurance and Quality Control
As test professionals we should understand QA and QC. What we do day to day is actually Quality Control, while the people involved in bringing in CMMI and ISO standards do Quality Assurance. Quality Assurance is about defining and maintaining the process in any given project or programme. Quality Control is about delivering the right product by verifying that the requirements are covered and working as expected.
We know the testing world talks about multiple levels of testing, like System Test and Integration Test, and about the methodologies by which we execute them, like Waterfall and Iterative.
Let me explain what I know about Quality Assurance:
There are three different roles responsible for assuring the process in any project:
- PQA – Project Quality Analyst
- CC – Configuration Controller
- DPA – Defect Prevention Analyst
Project Quality Analyst – PQA role
A person in this role needs to know the organization's pre-defined industry standards.
PQA’s Roles and Responsibilities
- Checking that document naming conventions are met as per the industry standard
- Recording the names of who prepared, reviewed and approved the deliverables
- Ensuring reviews happened across all customer-facing documents
- Ensuring review defects are found, fixed, verified and captured for metrics
- Checking whether all the deliverables are kept in a common place where the stakeholders can access them
- Checking that all the necessary deliverables are prepared by the project team
- Checking that the actual project delivery date and the date mentioned in the documents are the same
- Checking that stakeholder names and document owner names are mentioned correctly in all customer deliverables
- Differentiating customer-facing deliverables from internal-audit-specific deliverables as per the industry standards
- Verifying that the entry criteria and exit criteria of every level of the SDLC are met, and collecting proof of the same
- PQAs will be involved in all levels of the SDLC
- Business Analyst teams will have their own set of deliverables, like Business Requirement documents, Business Understanding documents and Requirement Traceability documents
- Development teams will have their own set of deliverables, like High Level Design, Low Level Design and Functional Specifications
- Testing teams will have their own set of documents, like Test Plans and Test Conditions
The PQA should validate all the documents that are supposed to be delivered to the clients and maintain them for internal audits.
CC – Configuration Controller
The Configuration Controller controls the versions and the placement of documents in tools like VSS (Microsoft Visual SourceSafe) or Documentum.
Configuration Controller Roles and Responsibilities
- CCs are responsible for creating the folder structures in VSS or Documentum
For example, in any project the following folders will be created to maintain the project's deliverables:
- Project Kick-off
- Minutes of Meeting
- Review Log
- High Level Design
- Low Level Design
- Issue Log
- Unit Testing
- System Testing
- System Integration Testing
- User Acceptance Testing
CCs will have the admin rights to grant and revoke access to folders.
Developers should not have access to the folders related to testing, and vice versa.
- CCs will manage the check-in and check-out of the documents that go into VSS
- CCs will ensure that the relevant documents are kept in the corresponding folders in VSS
DPA – Defect Prevention Analyst
Defect Prevention Analysts track defects across the SDLC. For any raised defect the workflow should be maintained, and proper comments should be given when the defect is created. To begin with, all high-severity defects should be fixed before moving from one phase to the next.
As testers, when we raise defects we need to concentrate on applying the Defect Cause and Defect Type in whatever defect management tool we use. This helps DPAs classify the defects and come up with prevention tips.
Defect Cause – what is the root cause of the defect? That is:
- Is the defect caused by the upstream source data or the test data?
- Is the defect caused by incomplete or missing requirements?
- Is the defect caused by an inconsistent requirement?
- Is the defect caused by a code discrepancy?
- If you find any anomalies in any documents, raise the defects against Artefacts
- If any of your defects leads to a change in the requirement, please raise it as a Change Request – the CR can be on the actual business requirement or on design changes
Defect Type – the classification of the defect, that is:
- Is the defect related to Data Error
- Is the defect related to Application Code
- Is the defect related to Work Request
- Is the defect related to Change Request
- Is the defect related to Deployment
- Is the defect related to Specification
- Is the defect related to Artefact
- Is the defect related to Production
- Is the defect related to Process
- Is the defect related to Environment
- Is the defect related to Requirements
- Is the defect related to Reviews
The DPA's most prominent work is to prepare the CAPA – "Corrective Actions and Preventive Actions".
DPA Roles and Responsibilities
- DPAs will collect defect-related metrics periodically – weekly, monthly or ad hoc
- DPAs will group defects by defect classification – for example, how many defects were raised in a given period against reviews, code or requirement changes – and collect the causes of those defects
- Then, using the metrics retrieved from the defect management tool, they build a fishbone diagram by filling in the necessary details
- Using statistical tools like Minitab, they calculate the defect density for each phase
- Then they create prevention actions for the areas where the defect density is above the threshold limit.
Suppose your organization has a defect density threshold of 0.5 and the density of your defects under the Review defect type is more than 0.5; then they will ask the reviewers to do their work more thoroughly to minimize review defects at all levels of the SDLC.
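To give a simple worked example (the size measure varies by organization – it may be KLOC, function points or the number of documents reviewed): defect density = number of defects / size of the deliverable. If reviewers logged 12 defects across 20 reviewed documents, the review defect density is 12 / 20 = 0.6, which is above a 0.5 threshold, so the DPA would raise a preventive action for the review phase.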
Cheers – Asik
In this post I would like to share my knowledge of non-functional testing in data warehouse testing.
There are different types of non-functional testing that we do in the testing world; some of them are:
- Baseline testing
- Compatibility testing
- Compliance testing
- Documentation testing
- Endurance testing
- Load testing
- Localization testing and Internationalization testing
- Performance testing
- Recovery testing
- Resilience testing
- Security testing
- Scalability testing
- Stress testing
- Usability testing
- Volume testing
To me, non-functional testing is testing that does not deliver business value directly; it deals more with the environment. When we extract data from heterogeneous source systems, we need to think about handling the following.
Verifying the volume of the data
No business can guarantee the volume of data they will send; they can only give an approximate figure. Our code should have the capability to pull the maximum volume of data that the source system can send at any point in time. To manage the volume of data, Teradata has utilities called M-Load (MultiLoad) and T-Pump (TPump). When developers design the system, they fix a limit by which data will be loaded into the warehouse. For example, with a limit of 100:
- M-Load – if we get a data file with 100 records or more, the records will be loaded by M-Load
- T-Pump – if we get a data file with fewer than 100 records, the records will be loaded by T-Pump
What we need to test here is to send a file with 100 records and check that the records are loaded by M-Load; this can be verified using the load job names.
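A minimal sketch of that check, assuming the load jobs write an entry into a load-audit table (the table and column names here are illustrative only – substitute your project's own audit structure):

-- Check which utility loaded a given file. Table and column names
-- (stg_load_audit, file_name, record_count, load_job_name) are
-- assumptions for illustration only.
SELECT file_name,
       record_count,
       load_job_name
FROM   testdb.stg_load_audit
WHERE  file_name = 'CUSTOMER_20130110.dat';
-- Expect the M-Load job name when record_count >= 100
-- and the T-Pump job name when record_count < 100.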
Verifying the Date and Time of the Data File's Arrival in the Unix Landing Directory
Most companies do not function on weekends and public holidays, so our source systems will not send any transactional data on those days. Because of this, developers design their jobs to archive any files that arrive on those days.
Normally, Monday's transactional data will come to us for loading early on Tuesday morning, and so on until Friday's transactional data hits us early on Saturday morning.
We as testers need to verify that these schedules work as per the specification. This can be achieved by (a database-side sketch follows the list):
- sending a file on a weekend and checking that the file is archived
- sending a file on a public holiday and checking that the file is archived
- verifying that Monday's transactional data is received on Tuesday morning, and so on through Saturday morning
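One way to check the weekend rule from the database side is to confirm that nothing was loaded for a non-business date. A minimal sketch, assuming the staging table carries a business_date column (table and column names are assumptions – use your project's actual ones):

-- Expect zero rows: no transactional data should be loaded
-- for a weekend business date.
SELECT COUNT(*)
FROM   testdb.stg_transactions
WHERE  business_date = DATE '2013-01-12';  -- a Saturday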
Verifying Purging and Truncate Loads
I have already mentioned Purging and Truncate loads in my earlier blogs.
Purging – the AutoSys jobs will purge the data, leaving only the required data in the staging table. Suppose I have loaded the 10th, 11th and 12th of January's data into the staging table; when I load the 13th of January's data, the 10th of January's data will be purged.
Truncate – simply load day_01 data; when you load day_02 data, the day_01 data will be deleted.
We as testers need to verify that the truncate and purging happen as per the design requirement, as sketched below.
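A minimal sketch of the purge check, assuming the staging table keeps a business_date column and the design retains three days of data (both are assumptions taken from the example above):

-- After loading 13-Jan, only 11-Jan, 12-Jan and 13-Jan should remain.
SELECT DISTINCT business_date
FROM   testdb.stg_transactions
ORDER  BY business_date;

-- For a truncate load, no rows older than the latest day should survive:
SELECT COUNT(*)
FROM   testdb.stg_transactions
WHERE  business_date < DATE '2013-01-13';  -- expect 0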
Verifying File Watcher Script
There will be a file watcher script that looks for a file until it arrives in the Unix landing directory. Say the source system promises to send the day_01 file on 10-01-2013, so we set that date in the file watcher script. The source system sends the records on 10-01-2013; our file watcher script reads the date from the file header, and if the two match, it processes the file into the staging table. Now suppose the source system fails to send the data on 11-01-2013: our file watcher job looks for the file on 11-01-2013 for a given time interval, and if the file has not arrived, an automated email is sent to the concerned source system saying the file has not arrived.
So we as testers need to verify that the file watcher job works as expected.
Cheers – Asik.
Thanks for viewing my blog so far! I want to keep sharing my knowledge as much as I can – happy reading 🙂
In this blog I am going to explain Test Data. In other testing practices you don't need predefined test data, because we are the users who enter the values, either in the front-end applications or in some other application. But in data warehousing we are completely dependent on our upstream (front-end) applications: we test whatever data the upstream teams used.
We testers are the intermediate guys who buy the data from one team, then validate it, transform it and sell it to the next guys 🙂
Let me explain in everyday terms: you are running a business, so you need to buy goods from your supplier and sell them to your customer!
What do you need to do?
– You need to buy quality products
– You need to check the count against what you ordered
– You need to check the specification matches what you asked for
– You need the supplier's support in fixing any issues
– If they can't fix an issue, you need to repair it yourself and then sell it to your customer
Now come to our Business Intelligence scenario.
As a tester, what do you need to do with the data?
– Check the quality of the data (is it as per the specifications?)
– Check the count is correct using trailer records (refer to my Source Data Validation blog) – a quick sketch of this check follows the list
– If the data is not as per the requirement, ask the upstream guys (the people who gave you the data) to fix the data issues
– Check you have got all the data you require; if not, ask your upstream to provide more data, and if they deny the request, manufacture your own data 😦
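A minimal sketch of the trailer-record count check, assuming the trailer count is captured in a control table during the load (table and column names are assumptions):

-- Rows actually loaded into the staging table:
SELECT COUNT(*)
FROM   testdb.stg_transactions;

-- Record count declared in the file trailer, as captured at load time:
SELECT trailer_count
FROM   testdb.file_control
WHERE  file_name = 'CUSTOMER_20130110.dat';

-- The two counts should match; any difference means records were
-- lost or duplicated between the file and the staging table.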
As I said in my earlier blogs, we have the below types of DWH testing:
1. Loads from file to staging tables to IM tables
2. Loads from table to staging tables to IM tables
3. DIMN/FACT projects
4. Extract projects
Fabricating test data for staging-table-to-target-table loads is quite easy, but for extract testing and reports testing it is a bit difficult 😦
For most of the positive scenarios the upstream will provide the data for us!! But we testers want to break the code, so we will have a few negative scenarios for which we will not get data from upstream. And for the scenarios that happen only once in a while, we will not get the data either, so we need to create it ourselves.
For extracts, the Teradata SQL code is written by joining more than 10 tables, and each table has its own WHERE clauses. If, after extracting the records using the SQL, you find that a few scenarios are missing, you need to set up data that flows across all of those tables. (This will be explained in my Know about Extracts blog.)
As I said in my Test Environment blog about access, testers should have Create Table access in the test environment and Select access to the view database.
So, using the Create Table access, we can create our own test data as follows:
Please read the steps below very carefully; if you don't understand, please send me an email: firstname.lastname@example.org
1. Create a TEMP table in your test environment using a CREATE TABLE query.
2. Insert sample records into your TEMP table using an INSERT INTO query.
3. Modify the data as per your requirement.
4. Insert the modified data back into your actual table.
5. Your test data is ready to use 🙂
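A minimal Teradata-style sketch of these steps (the database, table and column names are illustrative only – substitute your own, and note the staging table is assumed to have the same structure as the temp table):

-- 1. Create a TEMP table with the same structure as the source table.
CREATE TABLE testdb.customer_tmp AS
       (SELECT * FROM viewdb.customer) WITH NO DATA;

-- 2. Insert sample records into the TEMP table (SAMPLE 10 picks 10 rows).
INSERT INTO testdb.customer_tmp
SELECT * FROM viewdb.customer SAMPLE 10;

-- 3. Modify the data as per your requirement,
--    e.g. blank out a postcode to create a negative scenario.
UPDATE testdb.customer_tmp
SET    postcode = NULL
WHERE  customer_id = 1001;

-- 4. Insert the modified data back into your actual (staging) table.
INSERT INTO testdb.stg_customer
SELECT * FROM testdb.customer_tmp;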
Hope you guys understood the test data manufacturing technique. If you need more support, please comment on this blog 🙂