Non-functional testing

What is Quality Assurance & Quality Control, and who are the PQA, CC and DPA in QA?


Hi All,

Quality Assurance and Quality Control


As test professionals we should know the difference between QA and QC. What we do day to day is actually Quality Control, while the people involved in bringing in CMMI and ISO standards do Quality Assurance. Quality Assurance is about defining and maintaining the process in any given Project or Programme. Quality Control is about delivering the right product by verifying that the requirements are covered and working as expected.

We know the testing world talks about multiple levels of testing, like System Test, Integration Test etc., and about the methodologies by which we execute them, like Waterfall, Iterative etc.

Quality Assurance:

Let me explain what I know about Quality Assurance:

There are three different roles responsible for assuring the process in any project:

  1. PQA – Project Quality Analyst
  2. CC – Configuration Controller
  3. DPA – Defect Prevention Analyst

Project Quality Analyst – PQA role

A person in this role needs to know the pre-defined industry standards of that organization.

PQA’s Roles and Responsibilities

  1. Checking that document naming conventions are met as per the industry standard
  2. Recording the names of who prepared, reviewed and approved the deliverables
  3. Ensuring reviews happen across all the customer-facing documents
  4. Ensuring review defects are found, fixed, verified and captured for metrics
  5. Checking whether all the deliverables are kept in a common place that the stakeholders can access
  6. Checking that all the necessary deliverables are prepared by the project team
  7. Checking that the actual project delivery date and the date mentioned in the documents are the same
  8. Checking that the stakeholders' names and document owners' names are mentioned correctly in all customer deliverables
  9. Checking that the separation between customer-facing deliverables and internal audit-specific deliverables is as per the industry standards
  10. Verifying that the entry criteria and exit criteria of every level of the SDLC are met, and collecting the proofs for the same
  11. PQAs will be involved in all levels of the SDLC

Each team has its own set of deliverables:

  1. Business Analyst teams will have deliverables like Business Requirement documents, Business Understanding documents, Requirement Traceability documents etc.
  2. Development teams will have deliverables like High Level Design, Low Level Design, Functional Specifications etc.
  3. Testing teams will have deliverables like Test Plans and Test Conditions

The PQA should validate all the documents that are supposed to be delivered to the clients, and maintain them for internal audits.

CC – Configuration Controller

The Configuration Controller controls the versions and the placement of the documents in tools like VSS (Microsoft Visual SourceSafe) or Documentum.

Configuration Controller Roles and Responsibilities

  1. CCs are responsible for creating the folder structures in VSS or Documentum

For example, in any project the following folders will be created to maintain the project's deliverables:

  1. Project Kick Off
  2. Minutes of Meeting
  3. Requirements
  4. Review Log
  5. Development
     5.1 High Level Design
     5.2 Low Level Design
     5.3 Issue Log
     5.4 Emails
  6. Testing
     6.1 Unit Testing
     6.2 System Testing
     6.3 System Integration Testing
     6.4 User Acceptance Testing
  7. Production Support
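For illustration only, here is a minimal sketch of that folder layout in Python, as if it were created on a plain filesystem; in reality the CC creates this structure inside VSS or Documentum, and the root path here is hypothetical.

```python
import os

# Standard deliverables folder tree; subfolders mirror the list above.
FOLDERS = [
    "Project Kick Off",
    "Minutes of Meeting",
    "Requirements",
    "Review Log",
    "Development/High Level Design",
    "Development/Low Level Design",
    "Development/Issue Log",
    "Development/Emails",
    "Testing/Unit Testing",
    "Testing/System Testing",
    "Testing/System Integration Testing",
    "Testing/User Acceptance Testing",
    "Production Support",
]

def create_project_folders(root: str) -> None:
    """Create the standard project folder tree under the given root."""
    for folder in FOLDERS:
        os.makedirs(os.path.join(root, folder), exist_ok=True)

# Hypothetical project root:
create_project_folders("ProjectX")
```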

CCs will have the admin rights to grant and revoke access to folders.

Developers should not have access to the folders related to testing, and vice versa.

  1. CCs will maintain the check-in and check-out of the documents that go into VSS
  2. CCs will make sure the relevant documents are kept in the corresponding folders in VSS

DPA – Defect Prevention Analyst

Defect Prevention Analysts maintain the defects across the SDLC. For any raised defect the workflow should be maintained, and proper comments should be given when the defect is created. To begin with, all high-severity defects should be fixed before moving from one phase to the next.

As testers, when we raise defects we need to concentrate on applying the Defect Cause and Defect Type in the defect management tool. This helps DPAs classify the defects and come up with prevention tips.

Defect Cause – what is the root cause of the defect? That is:

  • Is the defect caused by the upstream source data or the test data?
  • Is the defect caused by incomplete or missing requirements?
  • Is the defect caused by an inconsistent requirement?
  • Is the defect caused by a code discrepancy?
  • If you find any anomalies in any documents, raise the defect against the artefact
  • If any of your defects leads to a change in the requirements, raise it as a Change Request – a CR can be on the actual business requirement or on design changes.

Defect Type – the classification of the defect. That is:

  • Is the defect related to a data error?
  • Is the defect related to application code?
  • Is the defect related to a work request?
  • Is the defect related to a change request?
  • Is the defect related to deployment?
  • Is the defect related to a specification?
  • Is the defect related to an artefact?
  • Is the defect related to production?
  • Is the defect related to process?
  • Is the defect related to the environment?
  • Is the defect related to requirements?
  • Is the defect related to reviews?

The DPA's most prominent work is to prepare the CAPA – "Corrective Action and Preventive Action" report.

DPA's Roles and Responsibilities

  1. DPAs will collect the metrics related to defects in a periodic manner – weekly, monthly or ad hoc
  2. DPAs will collect the defects by defect classification – for example, how many defects were raised in a given period against reviews, code or requirement changes – and will collect the cause of each defect
  3. Then, using the metrics retrieved from the defect management tool, they build the fishbone diagram by filling in the necessary details
  4. Using statistical tools like Minitab, they calculate the defect density of the defects from each phase
  5. Then they will create prevention actions for the areas where the defect density is above the threshold limit.

Suppose your organization has a defect density threshold of 0.5, and the defect density of your Review defect type is more than 0.5; the DPA will then ask the reviewers to do their work better, to minimize review defects at all levels of the SDLC.
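As a rough illustration, here is a minimal sketch of that threshold check in Python. It assumes defect density is computed as defects found per unit of size (per reviewed artefact, per KLOC, and so on); the counts, sizes and defect types below are hypothetical, and real numbers would come from the defect management tool's metrics export.

```python
DENSITY_THRESHOLD = 0.5  # organization-wide limit from the example above

# Hypothetical metrics: defects raised per defect type, and the size
# measure used to normalize each type (artefacts reviewed, KLOC, ...).
defects_by_type = {"Review": 12, "Application Code": 8, "Requirements": 3}
size_by_type = {"Review": 20, "Application Code": 25, "Requirements": 10}

for defect_type, count in defects_by_type.items():
    density = count / size_by_type[defect_type]
    if density > DENSITY_THRESHOLD:
        # Candidate area for a preventive action in the CAPA report.
        print(f"{defect_type}: density {density:.2f} exceeds {DENSITY_THRESHOLD}")
```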

Cheers – Asik


Importance of Non-Functional Testing in a Data Warehouse


Hi All

In this post I would like to share my knowledge of non-functional testing in the data warehouse world.


There are different types of non-functional testing that we do in the testing world; some of them are:

  1. Baseline testing
  2. Compatibility testing
  3. Compliance testing
  4. Documentation testing
  5. Endurance testing
  6. Load testing
  7. Localization testing and Internationalization testing
  8. Performance testing
  9. Recovery testing
  10. Resilience testing
  11. Security testing
  12. Scalability testing
  13. Stress testing
  14. Usability testing
  15. Volume testing

To me, non-functional testing is testing that does not directly deliver business value; it deals with the environment. When we extract data from heterogeneous source systems, we might need to think about handling the following.

Verifying the volume of the data

No business can guarantee the volume of data it will send; they can only give an approximate figure. Our code should be capable of pulling the maximum volume of data the source system can send at any point in time. To manage the volume of data, Teradata has the MultiLoad (M-Load) and TPump (T-Pump) utilities. When developers design the system, they fix a threshold by which data is loaded into the warehouse.

Example:

  • M-Load – if we get a data file with 100 or more records, the records will be loaded by the M-Load functionality
  • T-Pump – if we get a data file with fewer than 100 records, the records will be loaded by T-Pump

What we need to test here is: send a file with 100 records and check that the records are loaded by M-Load. This can be verified using the load job names.
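Here is a minimal sketch of that check in Python, assuming the 100-record threshold from the example and a job naming convention in which the load utility appears in the job name; both the convention and the file format (one record per line) are assumptions, not a real standard.

```python
MLOAD_THRESHOLD = 100  # from the example above

def count_records(data_file: str) -> int:
    """Count the data records in a file, assuming one record per line."""
    with open(data_file) as f:
        return sum(1 for _ in f)

def expected_utility(record_count: int) -> str:
    """Files at or above the threshold should be loaded by M-Load."""
    return "MLOAD" if record_count >= MLOAD_THRESHOLD else "TPUMP"

def verify_load_utility(data_file: str, actual_job_name: str) -> bool:
    """Check that the load job name matches the utility we expect."""
    return expected_utility(count_records(data_file)) in actual_job_name.upper()

# Usage – the job name would be taken from the scheduler or load logs:
# verify_load_utility("day_01.dat", "WH_CUST_MLOAD_LD01")  # hypothetical names
```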

Verifying the Date and Time of Data File Arrival in the Unix Landing Directory

Most companies do not operate on weekends and public holidays, so our source systems will not send any transactional data on those days. Because of this, developers will design their jobs to archive any files coming in on those days.

Normally, Monday's transactional data will come to us for loading early on Tuesday morning, and so on through the week until Friday's transactional data hits us early on Saturday morning.

We as testers need to verify that these schedules are working as per the specification. This can be achieved by:

  • sending a file on a weekend and checking that the file is archived
  • sending a file on a public holiday and checking that the file is archived
  • verifying that Monday's transactional data is received on Tuesday morning, and so on through to Saturday morning
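A minimal sketch of that calendar logic in Python, against which the above checks could assert; the holiday list is hypothetical and would really come from the project's scheduling configuration.

```python
from datetime import date, timedelta

# Hypothetical public-holiday calendar.
PUBLIC_HOLIDAYS = {date(2013, 1, 1)}

def is_business_day(d: date) -> bool:
    """Weekdays that are not public holidays are business days."""
    return d.weekday() < 5 and d not in PUBLIC_HOLIDAYS

def expected_action(business_date: date) -> str:
    """Files dated on non-business days should be archived, not loaded."""
    return "LOAD" if is_business_day(business_date) else "ARCHIVE"

def expected_arrival(business_date: date) -> date:
    """A business day's data arrives early the next morning."""
    return business_date + timedelta(days=1)

# Monday 07-01-2013 data arrives on Tuesday and is loaded;
# a file dated Sunday 06-01-2013 should be archived.
assert expected_action(date(2013, 1, 7)) == "LOAD"
assert expected_arrival(date(2013, 1, 7)) == date(2013, 1, 8)
assert expected_action(date(2013, 1, 6)) == "ARCHIVE"
```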

Verifying Purging and Truncate Loads

I have already written about purging and truncate loads in my earlier blogs.

Purging – the AutoSys jobs will purge the data, leaving only the required data in the staging table. Suppose I have loaded 10, 11 and 12 January's data into the staging table; when I load 13 January's data, the 10 January data will be purged.

Truncate – simply load Day_01 data, and when you load Day_02 data, the Day_01 data will be deleted.

We as testers need to verify that the truncate and purging happen as per the design requirement.
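A minimal sketch of the purge check in Python, assuming the three-day retention window implied by the example and daily consecutive loads; in a real test the dates left in staging would come from a query like SELECT DISTINCT load_date against the staging table.

```python
from datetime import date, timedelta

RETENTION_DAYS = 3  # 11, 12 and 13 Jan remain after the 13 Jan load

def expected_staging_dates(latest_load: date) -> set[date]:
    """Staging should hold only the newest RETENTION_DAYS load dates."""
    return {latest_load - timedelta(days=i) for i in range(RETENTION_DAYS)}

def verify_purge(actual_dates: set[date], latest_load: date) -> bool:
    """Compare the dates actually left in staging with the retention window."""
    return actual_dates == expected_staging_dates(latest_load)

# After the 13 Jan load, 10 Jan should have been purged.
staging_after_load = {date(2013, 1, 11), date(2013, 1, 12), date(2013, 1, 13)}
assert verify_purge(staging_after_load, date(2013, 1, 13))
```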

Verifying File Watcher Script

There will be a file watcher script that looks for files until they arrive in the Unix landing directory. Say the source system promises to send the Day_01 file on 10-01-2013, so we set that date in the file watcher script. When the source system sends the records on 10-01-2013, our file watcher script checks the date in the file header; if both match, it processes the file into the staging table. If the source system fails to send the data on 11-01-2013, our file watcher job will look for the file for a given time interval and, if it has not arrived, an automated email will be sent to the concerned source system saying the file has not arrived.

So we as testers need to verify that the file watcher job is working as expected.
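A minimal sketch of such a watcher in Python to test against; the polling interval, paths and downstream steps are all hypothetical, and the load and email steps are left as comments.

```python
import os
import time

def watch_for_file(path: str, timeout_secs: int = 3600, poll_secs: int = 60) -> bool:
    """Poll the landing directory until the file arrives or the wait times out."""
    deadline = time.time() + timeout_secs
    while time.time() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll_secs)
    return False

def header_date_matches(path: str, expected_date: str) -> bool:
    """Check the date stamped in the file header against the schedule."""
    with open(path) as f:
        return expected_date in f.readline()

# Flow the tests should exercise (the steps below are hypothetical):
# if watch_for_file("/land/day_01.dat"):
#     if header_date_matches("/land/day_01.dat", "10-01-2013"):
#         pass  # process the file into the staging table
# else:
#     pass      # send the "file not arrived" email to the source system
```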

Cheers – Asik.

Chapter 2 – Business Intelligence Test Planning [Estimation, Test Plan, Test Case Design]


The Test Planning phase consists of test estimation, test plan preparation and test case design.


Test Estimation

Test estimation is an art. Estimation is a variable component of the SDLC; however, it can be done well based on experience. A person who estimates the project timelines should be technically sound.

Test Estimates are typically required to be produced at three stages:

  1. Ball Park Estimate (BPE) should be produced in the Plan & Analyse phase of the SDLC.
  2. High Level Estimate (HLE) should be produced towards the end of High Level Solution Design in the Design phase.
  3. Detailed Estimate (DE) should be produced towards the end of Detailed Design in the Build phase

In the data warehouse world, the estimate can differ for each of the components mentioned below:

  • Loads from file to staging tables to Integrated Model (IM) tables testing
  • Loads from table to staging tables to IM tables testing
  • DIMN/FACT projects testing
  • Extract projects testing
  • Reports testing
  • Non-functional tests (T-Pump, M-Load, file watcher catch-up logic, purging loads, truncate loads)

Test Plan

The Test Plan document identifies the risks to the program/project. The test analyst needs to ask as many questions as possible about the project and detail them in a document; that document is called the Test Plan. Below are common questions that you need to ask.

What are you testing? When are you starting and finishing testing? Who is doing the testing? Are there any risks? What is in scope and out of scope? Who is giving data to you? To whom are you giving your outputs? Who depends on you, and how do you depend on others? What files/tables/extracts are you testing? How are you going to publish your reports? Are you maintaining testing data in Quality Center?

  • This is the ‘WHAT’ that will be done. The Test Plan deals with the actions that will be taken to mitigate those risks;
  • This is the ‘HOW’ and the ‘WHO’. The Test Plan may deal with a single test phase or multiple phases, depending on the number of authors and the size/complexity of each phase;

– This will be identified in the Test Strategy. The typical test phases where the organization either authors or reviews Test Plans are the System, Application Acceptance, System Integration, End-to-End and Production Assurance testing phases.

Test Case Design

The difference between a Test Condition and a Test Case is that a Test Condition is “what” we need to test and a Test Case is
“how” we will test the Test Condition.

  • Test Conditions are the ‘What’ we need to test and are derived from the identified requirements.
  • Test Conditions individually identify the functional behaviors that make up a requirement (note: this differs from what is currently done, namely attempting to cover more than one function in the one Test Condition).

For data warehouse projects we will have the below test scenarios:

  1. Verify the reconciliation – are the counts between source and target the same after the ETL loads? (A minimal sketch of this check follows the list.)
  2. Verify inserting a new record into the warehouse – a brand-new record which has not been loaded into the warehouse before.
  3. Verify how a delta record is processed – an update to an existing record in the warehouse.
  4. Verify values that are loaded by the ETL code.
  5. Verify values that are loaded by DataStage jobs.
  6. Verify deleting a record (expiring without updating).
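For scenario 1, here is a minimal reconciliation sketch in Python, assuming a DB-API style connection to the warehouse; the table names are hypothetical.

```python
# Works with any DB-API 2.0 driver (e.g. teradatasql, pyodbc);
# connection details are omitted.
def row_count(conn, table: str) -> int:
    """Return the row count of a table."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

def verify_reconciliation(conn, source_table: str, target_table: str) -> bool:
    """Scenario 1: source and target counts must match after the ETL load."""
    return row_count(conn, source_table) == row_count(conn, target_table)

# Usage (hypothetical staging and Integrated Model tables):
# assert verify_reconciliation(conn, "STG_CUSTOMER", "IM_CUSTOMER")
```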

Folks, I will explain each of the above in detail in upcoming blogs. Please provide feedback to improve my blogs 🙂

Many Thanks – Asik