In this post I would like to share my knowledge of non-functional testing in data warehouse testing.
There are different types of non-functional testing that we do in the testing world; some of them are:
- Baseline testing
- Compatibility testing
- Compliance testing
- Documentation testing
- Endurance testing
- Load testing
- Localization testing and Internationalization testing
- Performance testing
- Recovery testing
- Resilience testing
- Security testing
- Scalability testing
- Stress testing
- Usability testing
- Volume testing
To me, non-functional testing is testing that does not verify business value directly; it is more about dealing with the environment. When we extract the data from heterogeneous source systems, we might need to think of handling:
Verifying the volume of the data
Any business can't guarantee the volume of data they will send; they can only give an approximate figure. Our code should be capable of pulling the maximum number of records that the source system can send at any point in time. To manage the volume of the data, Teradata has utilities called MultiLoad (M-Load) and TPump (T-Pump). When developers design the system, they fix a limit by which data will be loaded into the warehouse.
- M-Load – If we get a data file with 100 or more records, the records will be loaded by the M-Load utility
- T-Pump – If we get a data file with fewer than 100 records, the records will be loaded by T-Pump
What we need to test here is: send a file with 100 records and check that the records are loaded by M-Load. This can be verified using the load job names.
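As a sketch, the routing rule above can be expressed like this. The 100-record threshold is the example from this post; the function and names are illustrative, not a real Teradata API:

```python
# Hypothetical sketch of the volume-based load routing described above.
# The threshold of 100 records is the example figure from the post.

MLOAD_THRESHOLD = 100  # files at or above this size go through MultiLoad

def choose_load_utility(record_count: int) -> str:
    """Return the Teradata utility expected to load a file of this size."""
    return "MLOAD" if record_count >= MLOAD_THRESHOLD else "TPUMP"

# A tester can assert the expected load job for boundary volumes:
assert choose_load_utility(100) == "MLOAD"  # exactly at the threshold
assert choose_load_utility(99) == "TPUMP"   # just below it
```

Testing exactly at the boundary (100 records) is the interesting case, since off-by-one mistakes in the developer's limit show up there.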
Verifying Date and Time of the Data file arrival to the Unix Landing directory
Most companies do not function on weekends and public holidays, so our source systems will not send any transactional data on those days. Because of this, developers will design their jobs to archive any files arriving on those days.
Normally, Monday's transactional data will reach us for loading early on Tuesday morning, and so on until Friday's transactional data hits us early on Saturday morning.
We as testers need to verify that these schedules work as per the specification. This can be achieved by:
- Sending a file on a weekend and checking that the file is archived
- Sending a file on a public holiday and checking that the file is archived
- Verifying that Monday's transactional data is received on Tuesday morning, and so on through Saturday morning
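The archive-or-load decision above can be sketched as a small check. The holiday list here is a made-up assumption for illustration:

```python
from datetime import date

# Hypothetical sketch of the scheduling rule above: files arriving on a
# weekend or a public holiday should be archived, not loaded.
PUBLIC_HOLIDAYS = {date(2013, 1, 1)}  # e.g. New Year's Day (assumed list)

def should_archive(arrival: date) -> bool:
    """True when a file arriving on this date must be archived instead of loaded."""
    return arrival.weekday() >= 5 or arrival in PUBLIC_HOLIDAYS

assert should_archive(date(2013, 1, 5))      # Saturday -> archived
assert should_archive(date(2013, 1, 1))      # public holiday -> archived
assert not should_archive(date(2013, 1, 8))  # Tuesday -> loaded
```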
Verifying Purging and Truncate Loads
I have already mentioned about Purging and Truncate loads in my earlier blogs.
Purging – The AutoSys jobs will purge the data, leaving only the required data in the staging table. Suppose I have loaded the 10th, 11th and 12th of January data into the staging table; when I load the 13th of January data, the 10th of January data will be purged.
Truncate – Simply load day_01 data; when you load day_02 data, the day_01 data will be deleted.
We as testers need to verify that the truncate and purging happen as per the design requirement.
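The purge example above amounts to a rolling retention window. A minimal sketch, assuming a three-day window as in the 10th/11th/12th/13th January example:

```python
from datetime import date

# Illustrative sketch of the purge rule above: keep only the most recent
# load dates in staging. The window size of 3 is taken from the example
# in the post, not a fixed standard.

def purge(staging_dates, keep_days=3):
    """Return the load dates that survive a purge: the newest `keep_days` only."""
    return sorted(staging_dates)[-keep_days:]

loaded = [date(2013, 1, 10), date(2013, 1, 11),
          date(2013, 1, 12), date(2013, 1, 13)]
# After loading 13th Jan, the 10th Jan data is purged:
assert purge(loaded) == [date(2013, 1, 11), date(2013, 1, 12), date(2013, 1, 13)]
```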
Verifying File Watcher Script
There will be a file watcher script that looks for files until they arrive in the Unix landing directory. Say the source system promises to send the Day_01 file on 10-01-2013, so we set that date in the file watcher script. When the source system sends the records on 10-01-2013, the file watcher script reads the date from the file header; if both match, it processes the file into the staging table. If the source system fails to send the data on 11-01-2013, the file watcher job will look for the file for a given time interval and, if it has not arrived, an automated email will be sent to the concerned source system saying the file has not arrived.
So we as testers need to verify that the file watcher job is working as expected.
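The three outcomes described above (file missing, header date matching, header date mismatching) can be sketched as one check. The path handling, file naming and header layout here are illustrative assumptions, and the email step is reduced to an "ALERT" result:

```python
import os
from datetime import date

# Hypothetical file-watcher sketch based on the description above: see
# whether the expected file has arrived in the landing directory, and
# whether its header date matches the expected business date.

def check_file(landing_dir: str, filename: str, expected: date) -> str:
    path = os.path.join(landing_dir, filename)
    if not os.path.exists(path):
        return "ALERT"    # would trigger the 'file not arrived' email
    with open(path) as f:
        header_date = f.readline().strip()  # assume the date sits in the header row
    if header_date == expected.strftime("%d-%m-%Y"):
        return "PROCESS"  # load the file into the staging table
    return "REJECT"       # header date does not match the expected date
```

A tester exercising this would cover all three branches: no file, a matching header, and a stale or wrong header date.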
Cheers – Asik.
Hope you guys had a wonderful X-Mas holiday :-).
As we all know, all inventions are made by asking more and more questions.
If Newton had not asked 'Why did the apple fall from the tree?', we would not have the theory of gravity.
If Archimedes had not asked 'Why did the water spill out of the tub while I was bathing?', we would not have Archimedes' principle.
Likewise, if a tester asks questions about an application, like Why? or How?, and finds an answer that does not match what it is supposed to be, then IT IS A Discovery. So we are scientists 🙂
As far as I am concerned, testing an application means asking questions about the application I am testing – it is the perfect way of finding software anomalies.
Each functional defect that we stop through our testing just saves production fixes, but if we stop a business gap, it saves the whole business need. So before kicking off the testing, make sure we know the business needs.
You will get a functional defect if you ask HOW things are working: the behaviour is already described in the functional specifications, so any functionality missed out in the developed code can be found.
You will get a requirement/specification/design defect if you ask WHY things are working: you will need to check the business requirements, and if you feel something is wrong, you will encounter anomalies that may come from any of the specifications related to that particular work request.
In this post I want to explain how important domain knowledge is for a tester.
Domains like banking, health care, manufacturing, insurance etc. are all closely related to us.
To modernize these functional areas:
> Business people write specs to cater to the business needs, as Business Requirement Documents.
> Based on the business specifications, solution designers prepare Functional Specification Documents.
> We testers and developers refer to the above documents to develop and test the application.
How can you learn the business easily?
If you are working in the banking domain and you have loan functionality to be tested on your client's website, then please create a loan application in real time with your own bank or with some other bank's online application.
If you are working in the health care domain and you have inventory functionality to be tested, then go to a nearby chemist (medical shop), pick some medicines here and there, go for billing, and watch how the shopkeeper handles your goods.
Like the two examples above, whatever business you are testing, please imagine that YOU ARE ALSO GOING TO USE THIS PRODUCT. Would you accept a defective product from your manufacturer? 'No', right? Then your testing will be perfect.
I hope all the readers know about validation and verification.
Let me tell you what I think about it.
Validation means we need to verify that all the documents related to the given functionality are acceptable and valid.
Verification means that the code written from the validated specifications is verified by us.
In the data warehouse world, the specification documents need to be validated, because even a simple mistake will create a huge problem at the end.
For example:
In the warehouse we keep amount columns in negative numbers (the bank owes us) as well as positive numbers (we owe the bank).
Business Need – All the transactions of the day are to be extracted.
For extracts, if the specification document asks us to pull the records where Balance > 0, then you will get only the customers who owe money to the bank.
So even a single symbol matters a lot!!! Before we start the verification, we need to validate first!!!
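A tiny illustration of that single-symbol mistake, with made-up balances (negative = the bank owes the customer, positive = the customer owes the bank):

```python
# Made-up day's transactions: two credit balances, two debit balances.
transactions = [-250.00, 100.00, -75.50, 40.00]

# Spec as written: Balance > 0 silently drops every credit balance.
extracted = [t for t in transactions if t > 0]
assert extracted == [100.00, 40.00]  # only customers who owe the bank

# Business need: ALL transactions of the day.
assert len(transactions) == 4        # two records were lost by the filter above
```

Validating the spec against the business need catches this before a single line of ETL code is verified.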
Let's discover along with finding defects 🙂
Cheers – Asik
The test planning phase consists of test estimation, test plan preparation and test case design.
Test estimation is an ART. Estimation is a variable component of the SDLC; however, it can be done well based on experience. The person who estimates the project timelines should be technically sound.
Test Estimates are typically required to be produced at 3 stages
- Ball Park Estimate (BPE) should be produced in the Plan & Analyse phase of the SDLC.
- High Level Estimate (HLE) should be produced towards the end of High Level Solution Design in the Design phase.
- Detailed Estimate (DE) should be produced towards the end of Detailed Design in the Build phase
In the data warehouse world, the estimates can differ for each of the components mentioned below:
- Loads from file to staging tables to Integrated Model testing
- Loads from table to staging tables to IM tables testing
- DIMN/FACT projects testing
- Extract projects testing
- Reports testing
- Non-functional tests (T-Pump, M-Load, file watcher catch-up logic, purging loads, truncate loads)
Test Plan :
The test plan document identifies the risks to the program/project. The test analyst needs to ask as many questions as possible about the project and detail them in a document; that document is the test plan. Here are common questions that you need to ask:
What are you testing? When are you starting and finishing testing? Who is doing the testing? Are there any risks? What is in scope and out of scope? Who is giving data to you? To whom are you giving your inputs? Who depends on you? How do you depend on others? Which files/tables/extracts are you testing? How are you going to publish your reports? Are you maintaining testing data in Quality Center?
- The Test Strategy identifies 'WHAT' will be done; the Test Plan deals with the actions that will be taken to mitigate the identified risks.
- The Test Plan is the 'HOW' and 'WHO'. It may deal with a single test phase or multiple phases, depending on the number of authors and the size/complexity of each phase.
- The phases will be identified in the Test Strategy. The typical test phases where the organization either authors or reviews Test Plans are System, Application Acceptance, System Integration, End-to-End and Production Assurance Testing.
Test Case Design
The difference between a Test Condition and a Test Case is that a Test Condition is "what" we need to test and a Test Case is "how" we will test the Test Condition.
- Test Conditions are the ‘What’ we need to test and are derived from the identified requirements.
- Test Conditions individually identify the individual functional behaviors that make up a requirement (Note: This differs from what is currently done, namely attempting to cover more than one function in the one Test Condition).
For data warehouse projects we will have the test scenarios below:
1. Verify the reconciliation – are the counts between source and target the same after the ETL loads?
2. Verify inserting a new record into the warehouse – a brand new record that has not been loaded into the warehouse before.
3. Verify how a delta record is processed – an update to an existing record in the warehouse.
4. Verify the values that are loaded by the ETL code.
5. Verify the values that are loaded by DataStage jobs.
6. Verify deleting a record (expiring without updating).
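Scenario 1 above can be sketched as a simple count check. The row data here is a made-up stand-in for the results of real source and target queries:

```python
# Minimal sketch of the reconciliation scenario: after an ETL load, the
# record counts in source and target should match. Rows are illustrative.

source_rows = [("C001", 100), ("C002", -50), ("C003", 75)]
target_rows = [("C001", 100), ("C002", -50), ("C003", 75)]

def reconcile(source, target) -> bool:
    """Pass when source and target row counts agree after the load."""
    return len(source) == len(target)

assert reconcile(source_rows, target_rows)           # counts match -> pass
assert not reconcile(source_rows, target_rows[:-1])  # a dropped record fails
```

In practice the counts would come from `SELECT COUNT(*)` queries against the source and target tables rather than in-memory lists.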
Folks, I will explain each of the above in detail in upcoming blogs. Please provide feedback to improve my blogs 🙂
Many Thanks – Asik