Trading Life Cycle – Trade Attributes


Hello Guys,

Again, let me take the example of buying an Apple iPhone through the Amazon Marketplace. Once the order is placed, Amazon will create an Order Ticket that holds the information below:

  • PRODUCT Details
  • Date & Time of the Transaction
  • Price Details
  • Quantity
  • Is it a BUY or Sell?
  • The Executive from Amazon who handles this order.
  • Customer Name
  • Seller Details
  • Expected Delivery Dates
  • Updates to the Order if any

Just apply the same logic to Trade Attributes.

Let’s take this example – Bank of Moon (:p) buys equities worth 20M from Bank of Venus, and the trade is placed on the National Stock Exchange of India. That is the high-level information about the trade; let us break it down.

Nature of the Trade 

A trade between counterparties can be:

  • Buying or selling the shares of one counterparty to another – the buyer of the shares eventually earns the dividends, while the seller raises capital to run the business – EQUITY
  • Buying or selling assets at a fixed price, with a fixed or floating interest payout to the buyer of the bond – FIXED INCOME
  • Buying and selling currencies – FOREX

This is what we call the TRADE TYPE – FX SPOT, FORWARDS and NON-DELIVERABLE FORWARDS are some of them. I will explain each Trade Type in upcoming posts.

Mandatory Information about the Trade

Every department in the Trade Life Cycle requires this information until the trade settles.

  • Trade Identifier, created by the bank’s internal process per department. These values should be unique.
  • Trade Type – FXSpot
  • Trade Date
  • Execution Time
  • Executed at – Location

Financial Information about the Trade

  • PRICE of the Trade
  • Buy or Sell
  • Product details (USDINR)
  • Underlying & Accounting Currency details
  • Notional of the Trade (Size of the Trade)

Booking Information of the Trade

  • Who is booking the trade – is it the SALESPERSON or the EXECUTING TRADER?
  • Each bank has its own designated area (DESK) to execute trades for each asset class
  • BOOK – a trader trades into a Book, which is nothing but the record of all positions held by the trader
  • Trader’s information – trader name / location, etc.

COUNTERPARTY INFORMATION of the TRADE

  • Counterparty name 
  • Counterparty Location
  • Settlement Type of Counterparty 
  • Settlement Date
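
To make these attributes concrete, below is a minimal sketch of how such a trade ticket could be stored as a table. This is only an illustration under my own assumptions – the column names are hypothetical, not any bank’s actual schema.

    -- Hypothetical trade ticket table; all names are illustrative.
    CREATE TABLE trade_ticket (
        trade_id        VARCHAR(20) NOT NULL, -- unique internal trade identifier
        trade_type      VARCHAR(10),          -- e.g. 'FXSPOT'
        trade_date      DATE,
        execution_time  TIMESTAMP,
        execution_venue VARCHAR(30),          -- executed at - location
        side            CHAR(4),              -- 'BUY' or 'SELL'
        product         VARCHAR(10),          -- e.g. 'USDINR'
        price           DECIMAL(18,6),
        notional        DECIMAL(18,2),        -- size of the trade
        book            VARCHAR(20),          -- the trader's book
        trader_name     VARCHAR(50),
        counterparty    VARCHAR(50),
        settlement_date DATE,
        PRIMARY KEY (trade_id)
    );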

In my next post, I will try to explain the multiple dates used in the trading world.

Thanks for reading – please comment and/or share it with your friends if you like it.

Happy Reading – Asik


Investment Banking – Trade Life Cycle


Hello Folks,

My new website – http://asketltester.com/trading-life-cycle/ – please check there for more updates!!

I have not blogged for a couple of years due to family commitments; now I have time to share my experiences with you all.

Okay, let’s go – many QAs work in the Investment Banking domain without business knowledge (even I did & still do :p). So I sneaked into Google & referred to a few books. Now I am going to blend my day-to-day experience with that theoretical knowledge.

Let me give you a classic and simple example of how a trade happens.

Asik wants to buy an Apple iPhone from Amazon. Here Asik & Amazon are considered COUNTERPARTIES, and the iPhone is the PRODUCT being traded. Asik places an order by paying $1000 for 1 iPhone. The number of iPhones is nothing but the NOTIONAL. Amazon is an online e-commerce website – this is called the MARKETPLACE. So when Asik places an order, he should check the availability and accept the price mentioned on the website; the SIZE & PRICE should match for the trade to happen.

After the order is placed, Amazon should accept the ORDER – as soon as the SIZE & PRICE match, Amazon will tag that product to Asik, and the TRADE has taken place.

Different Order & Trade statuses are maintained before the order gets FILLED (the order size & price are fulfilled, and the order is closed once the trade completely fills).

Sample Order & Trade statuses – New, Pending New, Pending Cancel, Cancel Request, Cancelled, Filled, Done for the Day (I will explain each status in upcoming posts – don’t worry :p).

Okay, now why are people, industries, and investment banks TRADING?

  1. If I want to increase my current holdings, then I will go to the market and BUY stock
  2. I want to get rid of 50% of my stock because I anticipate there will be a price drop – SELL
  3. To make a profit when he/she anticipates a price spike
  4. To reduce RISK by hedging against a future event

BUY or SELL is considered the SIDE of the Order/Trade – Buy Trade / Sell Trade, or Buy Order / Sell Order.

Who are the typical market participants in any trading activity?

  • PRODUCER or MANUFACTURER
  • CONSUMER or CUSTOMERS
  • SPECULATOR or MARKET WATCHER
  • MARKET MAKERS

WHO IS A PRODUCER?

  • Apple (the producer) has invested $100 million to produce the iPhone XI.
  • Apple’s entire income depends on the total number of units (size) and the price of each unit.
  • Apple can control the number of units produced; however, they can predict little about the price.
  • To avoid this risk, Apple fixes a target price & the number of units that will be sold to Apple dealers.

WHO IS A CONSUMER?

  • Apple (here the consumer) wants a few thousand OLED screens for their new iPhone model, so they are ready to pay Samsung more money to get the product to market on time.
  • So Samsung reduces its exposure to the OLED screens to manage the supply & demand.

WHO IS A SPECULATOR?

  • Best Buy (the speculator) takes a view on the likely direction of the price change.
  • Best Buy buys the Apple units on a forward contract when they anticipate a shortage of iPhone units in the US market, to supply them later.
  • Best Buy sells the Apple units if they anticipate a future glut (e.g., new products introduced by another vendor).

High-Level Trade Flow

BROKER –> EXCHANGE –> OTC – Over the Counter.

BROKER 

  • If an individual wants to buy some EQUITIES from the market, he can’t go directly to the MARKET and BUY/SELL the EQUITIES.
  • Because the BUYER or SELLER doesn’t know your creditworthiness
  • And you will not have access to the liquidity of the equity.

So individuals or small-scale institutional customers go to a BROKER for their trading requirements.

EXCHANGE

  • An organized place where anyone can trade any available Products
  • Only EXCHANGES display the BUY & SELL prices of an asset.
  • Makes sure enough liquidity of the product is available on the exchange
  • Assesses the counterparty risk limitations
  • Electronically traded exchanges – the virtual marketplace

Over the Counter – OTC

  • Exchanges operate only for certain hours; if anyone wants to BUY/SELL products outside those hours, they have to go to the Over the Counter market.
  • Exchanges will not have all the products; because of these product limitations, one has to go to OTC to trade them.
  • Only certified members can trade on EXCHANGES; however, in the OTC market non-members can also trade any products.

The Trading Life Cycle will be continued in the next post.

Thanks – Asik

Data Warehouse – Test Data Management



Hi All,

Hope all are doing great… it’s been a very long time since I posted on my blog… my sincere apologies 😦

Happy Reading……:-)


Test data availability is one of the most significant issues that lead to schedule slippage in DWH projects.

So testers and test managers should know all the requirements for the test data, and they need to define the test data strategy.

The Test Data Management should cover the below points:

  1. Subset of PROD data catered for Testing Requirements

Suppose you are working on an enhancement project; then you can avail yourself of the existing data that is already loaded into the warehouse. Even then, you can’t use data pulled directly from Production as-is, because it’s sensitive to the bank and its customers.

  2. Mask the data as per regulatory and data privacy requirements

All the customer-related information should be masked. Masked data is difficult to test: if the Name column should accept CHAR(20) and the data is masked, then we cannot do BVA (boundary value analysis) on the masked column.
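
One common mitigation is a length-preserving mask, so width and boundary checks still mean something. A rough sketch, assuming a hypothetical CUSTOMER table and CUST_NAME column:

    -- Replace the name with 'X's of the same length, so a CHAR(20)
    -- column still carries values of the original length after masking.
    UPDATE customer
    SET    cust_name = SUBSTR('XXXXXXXXXXXXXXXXXXXX', 1, CHARACTER_LENGTH(cust_name));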

  3. Fabricate test data in case of unavailability

Your source system cannot provide data for all your testing scenarios, and lack of test data is a major issue in data warehouse projects. A tester should analyze the system and create the test data in the form of insert scripts. The test data should be extracted correctly by the query before it is loaded into the target tables.
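
For example, a fabricated record can be seeded straight into the staging table with a hand-written insert script (a sketch only; the table and column names are my assumptions):

    -- Fabricate a scenario the source system cannot supply.
    INSERT INTO stg_acct_balance (cust_id, acct_no, acct_type, balance, load_dt)
    VALUES (900001, 'SAV-000123', 'SAV', 105.00, DATE '2014-01-01');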

Typical Challenges involved in Test Data Management:

  1. Delay in Schedule

The test data from upstream should be their UAT-tested data, else our testing will not be effective. Suppose we are in SIT and our upstream has not completed UAT; then our schedule will be postponed because UAT-tested data is unavailable from upstream.

  2. Issues to cover Business Cases

Test data for a few business scenarios cannot be produced upstream or through data fabrication. Take this example: the reopening of an account. A customer had an account, closed it a few years back, and his records are kept in ACCT_STATUS as closed records. When the customer comes back, the bank will somehow identify his existing account and try to reopen it across all the tables, so as not to end up with one more record in the warehouse. For this kind of scenario it’s difficult to manufacture the data in the given time.
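
If you do have to fabricate it, the reopening boils down to flipping the closed record back to active rather than inserting a new customer. A sketch – ACCT_STATUS comes from the example above; the status codes and other names are my assumptions:

    -- Reactivate the customer's closed record instead of creating a new one.
    UPDATE acct_status
    SET    status_cd = 'ACTIVE'
    WHERE  cust_id   = 900001
    AND    status_cd = 'CLOSED';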

  3. Data Privacy & Compliance

Banks are cautious about customer data because of the risk of data theft. So when we get data from PROD for our regression testing, most of the customer-related data will be masked as per compliance policy, and we cannot reproduce that data or test those scenarios in our environment.

  4. Dependency on SMEs

SME availability is one of the biggest issues in the data warehouse world.

System Architect – accountable for the system design based on the business requirements. He/She works closely with the Data Modeler to design the Source to Target. The System Architect needs to fit the current requirement into the existing landscape, or create a new application landscape for the new requirement.

Data Modeler – accountable for designing the S2T. The Data Modeler should know the physical and logical design of the system and the database. For any data warehouse project, the S2T is the most important document.

ETL Developer – accountable for the high-level and low-level design of the requirements into code. An ETL developer should be capable of designing the ETL code without compromising the performance of the extraction and loading mechanisms, and should know what kind of transformation mechanism to design for a particular requirement.

Test Analysts / Test Managers – test managers should foresee whether all the technical and business requirements can be tested in the given time frame, because the test data and system behaviours keep changing, which might cause schedule slippage.

  5. Availability of Adequate Data

We should start our system testing with the source system’s UAT-tested data. If, due to some issue, the source system has completed only their system testing and can provide only system-tested data, then that data is not sufficient to continue our system testing, so we need to fabricate test data in order to kick off our testing.

  6. Volume of Test Data

The data volume threshold will be declared by the system owners. However, we cannot create that declared volume of data in our environment to test the performance above and below the threshold limit, so there might be spool space issues when we go to Production.

  7. Multiple Sources & Formats of Data

When multiple source systems participate in the ETL, it is difficult to get the data from the different source systems at the same time to begin our testing; in this case again we need to create mock test files to begin our testing.

  8. Version Control of Data

Versioning of data is very difficult in the data warehouse world, although we can see the history of the data using a few housekeeping columns (EFFT_D, EXPY_D). The data loads and extracts should still be versioned to avoid confusion.
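
For example, the history those housekeeping columns give you can be read back like this (a sketch; the table name and the high-date convention for open records are my assumptions):

    -- Full change history of one account, oldest version first.
    SELECT acct_no, balance, efft_d, expy_d
    FROM   tgt_acct_balance
    WHERE  acct_no = 'SAV-000123'
    ORDER  BY efft_d;

    -- Only the currently active version, assuming open rows carry a high date.
    SELECT acct_no, balance
    FROM   tgt_acct_balance
    WHERE  acct_no = 'SAV-000123'
    AND    expy_d  = DATE '9999-12-31';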

Cheers,

Asik

About Source to Target (S2T) Document or Mapping Document



The S2T document is the bible of any ETL project. It is designed by the Data Modelers using the FSDs (Functional Specification Documents).

Data Modelers are interested in:

  • The nature of the source data
  • Whether the source data arrives as mentioned in the FSD
  • Whether the source is sending the correct data types as promised
  • Whether all the date and time fields arrive as expected
  • Whether the columns participating in referential integrity fall within the data set given in the FSD
  • The nature of the source and target tables – SET, MULTISET, etc.
  • Whether the source is time variant or non-time variant
  • The key columns that make a record from the source unique
  • The value columns that trigger a delta from the source
  • Logical connections of the source tables participating in ETL loads
  • Logical connections of the target tables participating in ETL loads
  • Physical connections of the source tables participating in ETL loads
  • Physical connections of the target tables participating in ETL loads

Most S2Ts are written in an Excel spreadsheet, where you will find many columns; each column is important for the ETL load being designed.

Major components of an S2T:

  1. Version Number – the ongoing changes happening in the S2T are maintained and tracked using version numbers.

  2. Source Database – the source database name or names are mentioned in this space.

  3. Source Column – all the source columns from the respective source database are mentioned in this space (these columns will undergo transformations if required).

  4. Extraction Criteria – as I already mentioned, we are not going to pull all the data from the source before transformation; we pull only the data required for the transformation, and that is mentioned in this space. All the joins and unions are done here, so as a tester we need to understand and analyse this area with extra care.

Example – Now I want to load customer details and their balances from savings accounts. Customer details will be fetched from the Cust_Detl table and the balances from the ACCT_BALANCE table, so you need to perform a join between them. The extraction filters only customers’ savings accounts, so I consider this my data extraction criteria.

  5. Filter Criteria – after we extract the data from the source, we need to filter it if required for a few or more target tables; that is covered in this space.

Example – I have extracted 100 records from the source, linked to the savings accounts and to ACCT_OVER_LIMIT. We want to load only the customers whose account balance is more than $100, so the Data Modelers will place the filter rule just below the record extraction criteria. Both criteria are sketched in SQL below.
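
A minimal sketch of the extraction and filter criteria together – Cust_Detl and ACCT_BALANCE come from the examples above; the column names and the 'SAV' code are my assumptions:

    -- Extraction criteria: join customer details to balances,
    -- keeping only savings accounts.
    SELECT cd.cust_id,
           cd.cust_name,
           ab.balance
    FROM   cust_detl cd
    INNER JOIN acct_balance ab
           ON ab.cust_id = cd.cust_id
    WHERE  ab.acct_type = 'SAV'   -- extraction: savings accounts only
    AND    ab.balance   > 100;    -- filter: balance over $100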

  6. Target Database – the target database name or names are mentioned in this space.

  7. Target Column – all the target columns of the respective target database are mentioned in this space (these columns receive the transformed values if required).

  8. Key or Value Columns – next to the target column names you will see Key / Value. A key column is one that makes the record unique, and a value column is one that makes the record time variant. A delta check built on them is sketched below.
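
For instance, testers can detect a delta by joining staging to target on the key columns and comparing the value columns (a sketch with hypothetical table and column names):

    -- Same key, changed value => a delta that should create a new
    -- version of the record in the target.
    SELECT s.cust_id, s.acct_no,
           t.balance AS old_balance,
           s.balance AS new_balance
    FROM   stg_acct_balance s
    INNER JOIN tgt_acct_balance t
           ON  t.cust_id = s.cust_id   -- key columns
           AND t.acct_no = s.acct_no
    WHERE  s.balance <> t.balance;     -- value column differs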

  9. Comments and Version Changes – this space explains what was in the S2T before and what has changed now. The comments tell us more about the transformation.

What do we need to look at in the S2T?

As soon as you get the S2T, query your staging source tables and check whether the data you have satisfies your transformation rules. S2Ts are written against SIT-phase data, and the rules mentioned might change as soon as we get the UAT-phase data. So, to achieve good quality testing, we are always interested in UAT data.

Example: suppose the Data Modeler marked a column as a key column (which should not change across consecutive loads), but you notice that the column’s values are changing from the source on consecutive loads. Then you should notify the data modeler and the source system; either one should be correct and has to make the changes at their end. A query like the sketch below can flag this.
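
One way to spot unstable keys is to compare two consecutive daily snapshots of the staging table (the Day-1/Day-2 table names and key columns are my assumptions):

    -- Keys present yesterday but missing today: if these records still
    -- exist in the source, the chosen "key" columns are not stable.
    SELECT d1.cust_id, d1.acct_no
    FROM   stg_acct_day1 d1
    LEFT JOIN stg_acct_day2 d2
           ON  d2.cust_id = d1.cust_id
           AND d2.acct_no = d1.acct_no
    WHERE  d2.cust_id IS NULL;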

Most of us get confused by the transformation logic below:

  1. Difference between the IN and NOT IN operators

Example: a rule may say “if the source column is NULL, blank, space or 0, DO NOT LOAD the record”. In this case we must check that the record is absent from the target when the source has these unknown values – and be careful, because NOT IN behaves unexpectedly with NULLs, as sketched below.
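
A minimal sketch of the pitfall and a matching test query (STG_TXN, TGT_TXN and TXN_CD are hypothetical names):

    -- Pitfall: a row whose TXN_CD is NULL satisfies neither IN nor
    -- NOT IN, because any comparison with NULL evaluates to UNKNOWN.
    -- The filter below silently drops NULL rows along with '', ' ', '0'.
    SELECT *
    FROM   stg_txn
    WHERE  txn_cd NOT IN ('', ' ', '0');

    -- Test query against the target: any row returned is a defect,
    -- since records with unknown values must not have been loaded.
    SELECT *
    FROM   tgt_txn
    WHERE  txn_cd IS NULL
       OR  TRIM(txn_cd) IN ('', '0');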

  2. Joins – even if the Data Modelers mention the joins in the extraction rules, we need to ensure the joins work with the source data provided. This should be done because the data modelers do not try out the transformation rules on the actual source data given.
  3. Reference tables – every S2T has separate spreadsheets containing the reference table names, the data that needs to be inserted, and the insert scripts prepared by the Data Modelers.

 Thanks for reading – Comments are Expected 🙂

Cheers – Asik

ETL Testing and the Teradata Control Framework, widely called the TCF Framework


Hello All,

Hope you have tested yourself on your ETL Testing skills, at the theoretical level at least. In this post let me explain the Teradata Control Framework and the significance of the testing role in it.

I am thankful to all the viewers from 52 countries, and I request you to comment on how my blog is useful to you – is it Good / Bad / Ugly?

The basic difference between a conventional ETL framework and the TCF framework is that TCF is fully automated. Everything we need is generated by a macro written in an Excel sheet. TCF is very helpful for developers because it reduces the coding effort.

TCF has three patterns, and based on our requirement we need to choose which pattern fits our solution:

  1. Full Apply Pattern
  2. Delta Apply Pattern
  3. Insert & Append Pattern

The solution architect decides which pattern to follow and cascades it to the data designers.

  • If your target tables are not time variant and you are going to have only inserts, then the Full Apply pattern is applicable.
  • If your target table is time variant, then the Delta Apply pattern is applicable (sketched just below this list).
  • Insert & Append is applicable for target tables where you do not do deltas; you just append the new values on top of the old values, with inserts only.
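
A rough sketch of what a Delta Apply load does, in plain SQL. The names are illustrative, the real statements are generated by the TCF macros, and open records are assumed to carry a high-date EXPY_D:

    -- Step 1: expire the active version of every changed record.
    UPDATE tgt_acct_balance
    SET    expy_d = CURRENT_DATE - 1
    WHERE  expy_d = DATE '9999-12-31'
    AND    EXISTS (SELECT 1
                   FROM   stg_acct_balance s
                   WHERE  s.acct_no  = tgt_acct_balance.acct_no
                   AND    s.balance <> tgt_acct_balance.balance);

    -- Step 2: insert new and changed records as the open-ended version.
    INSERT INTO tgt_acct_balance (acct_no, balance, efft_d, expy_d)
    SELECT s.acct_no, s.balance, CURRENT_DATE, DATE '9999-12-31'
    FROM   stg_acct_balance s
    LEFT JOIN tgt_acct_balance t
           ON  t.acct_no = s.acct_no
           AND t.expy_d  = DATE '9999-12-31'
    WHERE  t.acct_no IS NULL;  -- no active version left => new or just expired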

We as testers need to understand the patterns and test that they are working as expected. Just as we have Oracle control tables to maintain and control a conventional ETL process, we have many control tables in TCF to maintain and control the ETL process.

Around 6 columns are loaded by these TCF control tables to maintain the TCF process. We as testers need to verify that these column values are populated from the corresponding control tables.

In the conventional ETL architecture we have [File → Staging table → Intermediate table → Target table], and we maintain VARCHAR in all the staging and intermediate tables. In TCF we have [File → Staging table → Source Image table → Intermediate table → Target table]. While loading from the staging table we do the type casting – changing the data types as per the target table. So in TCF, except for the staging tables, we do not keep all the columns as VARCHAR. As testers we need to verify that the data type conversion is achieved as per the specification. These type-casting SQLs are part of the TCF and are generated using the TCF macros.
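
The type-casting step amounts to something like this (a sketch with my own names; per the post, the generated TCF SQL additionally traps invalid values and loads them as NULL, while a plain CAST is shown here for brevity):

    -- Staging keeps everything as VARCHAR; the source image table
    -- receives properly typed values.
    INSERT INTO src_img_acct_balance (acct_no, balance, open_dt)
    SELECT acct_no,
           CAST(balance AS DECIMAL(18,2)),
           CAST(open_dt AS DATE FORMAT 'YYYY-MM-DD')  -- Teradata-style cast
    FROM   stg_acct_balance;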

Unknown records are now handled in the Source Image tables only, so in the Source Image table to intermediate table transformation we do not need to verify for NULL, blank, space, or 0s.

  1. Source Image table DDLs are generated by the Excel macro, so we need to check that the data types and lengths of all columns are as per the specifications.
  2. We need to ensure the ETL works as per the pattern chosen – e.g., if we chose Full Apply and the records are loaded as Delta Apply, then it’s a defect.
  3. In conventional staging tables we do not keep historical records, but if the Delta Apply pattern was chosen, then the Source Image tables will have expiries and inserts.
  4. In TCF Source Image tables, copy records are treated as delta records in Delta Apply patterns.
  5. If you are testing a soft delete (logical delete), then please refer to the column Record Delete Flag = 1.
  6. In conventional ETL, a logical delete sets Expiry Date = ETL Date – 1, but in TCF the record stays active as per EFFT_D and EXPY_D while the record deleted flag is set to 1.
  7. Date and time validations are done in the Source Image tables only; if you send an invalid date or time, the type-casting SQLs load it as NULL in the Source Image tables.
  8. As the Source Image tables are not SET tables, we need to run queries to get the duplicates out of them (see the sketch after this list).
  9. We cannot use TCF to load data into existing tables.
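A typical duplicate check on a MULTISET source image table (the key columns are my assumptions):

    -- More than one row per key means a duplicate slipped in.
    SELECT acct_no, efft_d, COUNT(*) AS row_cnt
    FROM   src_img_acct_balance
    GROUP  BY acct_no, efft_d
    HAVING COUNT(*) > 1;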

Cheers – Asik

 

ETL Testing Rollback Post – Ask Yourself and Get Ready to Move to the ETL Testing World


Hello Folks

It has been a long time since I posted something through my blog; I’m really sorry for that 😦


I met a stranger in a public place; we were discussing ETL testing and other stuff, and I told him I write blogs about ETL Testing. He was surprised and said that he is a visitor of my blog. I am happy that I met an unknown person who reads and benefits from my blog.

I have got a few requests: ‘Can you please share some interview Q & A?’

I replied to them:

“If you have hands-on experience and you have mentioned your skills and responsibilities in your CV, then why would you want to know about Q&A in ETL Testing?”

Guys who are trying to get into the ETL Testing domain think that if they know how to write SQL they can clear the interview and get the job. Folks, that’s not how it works. I request you to remember that ETL testing is not a different entity; your job is testing, whatever the technology may be.

If you are attending an interview, please go through your CV and be strong on what you have mentioned. If you are asked about something other than what you have mentioned, then please raise your hand: ‘I’m sorry, I haven’t worked on it.’

Okay, let me come to the subject I wanted to discuss here!!!

Here I’m going to ask myself questions to test my technical capability; if I can win against myself, then I can win anywhere in the globe, right guys?

So try to ask yourself as many questions as possible!!!!

Q1

INTRV: Asik, Can you please let us know your roles and responsibilities in your previous project?

CAND: Hello, I’m from here and blah, blah, blah………

Q2

INTRV: Asik, How would you rate yourself on writing SQL, from 1 to 5?

CAND:  [—-Fill your Best Answer here——]

Q3

INTRV: Asik, What are all the tools you have used in your ETL projects?

CAND: [—-Fill your Best Answer here——]

Q4

INTRV: Asik, Can you please explain the ETL architecture?

CAND: [—-Fill your Best Answer here——]

Q5

INTRV: What are all the joins you have worked with so far? And can you please write down the joins using two tables, T1 and T2?

CAND: [—-Fill your Best Answer here——]

Q6

INTRV: Asik, what is the difference between Inner join and Outer Join?

CAND: [—-Fill your Best Answer here——]

Q7

INTRV: Asik, can you please explain the significance of data types in ETL Testing?

CAND: [—-Fill your Best Answer here——]

Q8

INTRV: Asik, I have a VARCHAR column from the source, but the corresponding target column is INTEGER; how would you write a query to check the target column?

CAND: [—-Fill your Best Answer here——]

Q9

INTRV: Asik, what is the significance of the Staging tables in the ETL loads?

CAND: [—-Fill your Best Answer here——]

Q10

INTRV: Asik, what is the difference between, copy records and duplicate records?

CAND: [—-Fill your Best Answer here——]

Q11

INTRV: Asik, what is the significance of Oracle Control tables in ETL Testing?

CAND: [—-Fill your Best Answer here——]

Q12

INTRV: Asik, what does it mean if you get a NULL, Blank, Space or 0 from Source?

CAND: [—-Fill your Best Answer here——]

Q13

INTRV: Asik, I have a VARCHAR column from the source; this column can allow 0. Suppose the source is sending 0000 instead of 0, and the rule is that we should not allow this record into the target table. Will our SQL stop this record if we write the condition as [Column <> 0]?

CAND: [—-Fill your Best Answer here——]

Q14

INTRV: Asik, what is the difference between the Primary Index and Primary Key?

CAND: [—-Fill your Best Answer here——]

Q15

INTRV: Asik, what could be the consequence if the reconciliation fails in the Day_01 load or Day_02 load?

CAND: [—-Fill your Best Answer here——]

Q16

INTRV: Asik, Why would you want to test the DDL of a table?

CAND: [—-Fill your Best Answer here——]

Q17

INTRV: Asik, what is the difference between Data Definition Language and View Definition?

CAND: [—-Fill your Best Answer here——]

Q18

INTRV: Asik, as a tester, what are all the accesses you require? And do you know the SQL syntax to get that access?

CAND: [—-Fill your Best Answer here——]

 Q19

INTRV: Asik, what is the difference between DB level access, Table level access and view level access?

CAND: [—-Fill your Best Answer here——]

Q20

INTRV: Asik, What are all the steps you take to validate the source files? What is the significance of source file names, header, trailer and detail records?

CAND: [—-Fill your Best Answer here——]

Q21

INTRV: Asik, if you are not able to open a file manually by double-clicking due to the file size, how will you access the file and look at the records?

CAND: [—-Fill your Best Answer here——]

Q22

INTRV: Asik, What is the highest-severity defect you have raised so far in any ETL project, and why did you classify it as Sev-01?

CAND: [—-Fill your Best Answer here——]

Q23

INTRV: Asik, What is the highest-priority defect you have raised so far in any ETL project, and why did you classify it as Priority-01?

CAND: [—-Fill your Best Answer here——]

Q24

INTRV: Asik, Explain inserts, updates and logical deletes in any given ETL project.

CAND: [—-Fill your Best Answer here——]

Q25

INTRV: Asik, Why are reference tables necessary in any ETL project?

CAND: [—-Fill your Best Answer here——]

Q26

INTRV: Asik, Can you modify Reference table data if you have Update access to the reference tables?

CAND: [—-Fill your Best Answer here——]

Q27

INTRV: Asik, What are all the steps by which you can obtain the test data for a given scenario?

CAND: [—-Fill your Best Answer here——]

Q28

INTRV: Asik, What is the difference between time variant and non-time variant tables in any ETL Testing project, and how will you test them?

CAND: [—-Fill your Best Answer here——]

Q29

INTRV: Asik, Can you delete a table or drop a table? What happens if we drop a table?

CAND: [—-Fill your Best Answer here——]

Q30

INTRV: Asik, What is the significance of Views in any ETL testing projects?

CAND: [—-Fill your Best Answer here——]

Q31

INTRV: Asik, Can a table contain more than one Primary Key? And can a table contain more than one Primary Index?

CAND: [—-Fill your Best Answer here——]

Q32

INTRV: Asik, The DBA has granted SELECT access to a DB; does it mean that you have SELECT access to all the tables sitting in that DB?

CAND: [—-Fill your Best Answer here——]

Q33

INTRV: Asik, if the target column is Integer (30), can you load a Varchar (30) into it?

CAND: [—-Fill your Best Answer here——]

Q34

INTRV: Asik, The target date column is constrained as DDMMYYYY, but from the source we are getting it as MMDDYYYY. What will happen to the load job?

CAND: [—-Fill your Best Answer here——]

Q35

INTRV: Asik, What is meant by full snapshot and historical records?

CAND: [—-Fill your Best Answer here——]

Q36

INTRV: Asik, What is meant by catch-up logic in loads, and how will you test it?

CAND: [—-Fill your Best Answer here——]

Q37

INTRV: Asik, What is the difference between Load projects and Extract projects?

CAND: [—-Fill your Best Answer here——]

Q38

INTRV: Asik, if the target column is defined as Decimal (4, 3), can you load values as Decimal (4, 5)? Or can you load values as Decimal (4, 3)?

CAND: [—-Fill your Best Answer here——]

Q39

INTRV: Asik, What is the difference between UNION and UNION ALL? And where would we use UNION, and where would we use UNION ALL?

CAND: [—-Fill your Best Answer here——]

Q40

INTRV: Asik, What is the use of a CROSS JOIN? And in what circumstances would you use a CROSS JOIN?

CAND: [—-Fill your Best Answer here——]

Q41

INTRV: Asik, What are the AutoSys commands to trigger a job, abort a job, and hold a job?

CAND: [—-Fill your Best Answer here——]

Q42

INTRV: Asik, What is the significance of the GROUP BY function in SQL, and where do we use GROUP BY?

CAND: [—-Fill your Best Answer here——]

Q43

INTRV: Asik, How can you find the duplicates in the staging tables?

CAND: [—-Fill your Best Answer here——]

Q44

INTRV: Asik, What is the difference between SET tables and MULTISET tables? Where can we get these details?

CAND: [—-Fill your Best Answer here——]

Q45

INTRV: Asik, Can one view refer to another view to create a new view?

CAND: [—-Fill your Best Answer here——]


Q46

INTRV: Asik, What are all the common SQL syntax errors you have encountered, and how did you resolve them?

CAND: [—-Fill your Best Answer here——]

Q47

INTRV: Asik, Do you have any idea what Data Modeling is?

CAND: [—-Fill your Best Answer here——]

Q48

INTRV: Asik, What is the difference between the Physical Design and Logical Design?

CAND: [—-Fill your Best Answer here——]

Q49

INTRV: Asik, What are all the items available in a generic S2T?

CAND: [—-Fill your Best Answer here——]

Q50

INTRV: Asik, what does it mean if a column is marked as a value column in the S2T?

CAND: [—-Fill your Best Answer here——]

Q51

INTRV: Asik, Have I answered all of my basic questions, so I can move on to the advanced questions?

CAND: I am still thinking and trying to answer a few questions 🙂

What are Quality Assurance & Quality Control? And who are the PQA, CC and DPA in QA???


Hi All,

Quality Assurance and Quality Control


Being test professionals, we should know about QA and QC. What we do is actually Quality Control related work, while the people involved in bringing in CMMI and ISO standards do Quality Assurance. Quality Assurance relates more to maintaining the process in any given project or programme. Quality Control is nothing but delivering the right product by verifying that the requirements are covered and working as expected.

We know there are multiple levels of testing spoken of in the testing world, like System Test, Integration Test, etc., and we have the methodologies by which we execute them, like Waterfall, Iterative, etc.

Quality Assurance:

Let me explain what I know and am aware of regarding Quality Assurance:

There are 3 different roles responsible for assuring the process in any project:

  1. PQA – Project Quality Analyst
  2. CC – Configuration Controller
  3. DPA – Defect Prevention Analyst

Project Quality Analyst – PQA role

A person involved in this role needs to know the pre-defined industry standards of that organization.

PQA’s Roles and Responsibilities

  1. Checking that document naming conventions are met as per the industry standard
  2. Recording the names of who prepared, reviewed and approved the deliverables
  3. Ensuring reviews happened across all the customer-facing documents
  4. Ensuring review defects are found, fixed, verified and captured for metrics
  5. Checking whether all the deliverables are kept in a common place where the stakeholders can access them
  6. Checking that all the necessary deliverables are prepared by the project team
  7. Checking that the actual project delivery date and the date mentioned in the documents are the same
  8. Checking that the stakeholder names and document owner names are mentioned correctly in all customer deliverables
  9. Differentiating the customer-facing deliverables and the internal-audit-specific deliverables as per the industry standards
  10. Verifying that the entry and exit criteria of all levels of the SDLC are met, and collecting proof of the same
  11. PQAs are involved in all levels of the SDLC

Each team has its own set of deliverables:

  1. Business Analyst teams have Business Requirement documents, Business Understanding documents, Requirement Traceability documents, etc.
  2. Development teams have High Level Design, Low Level Design, Functional Specifications, etc.
  3. Testing teams have Test Plans, Test Conditions, etc.

The PQA should validate all the documents that are supposed to be delivered to the clients, and maintain them for internal audits.

CC – Configuration Controller

The Configuration Controller controls the versions and the placement of the documents in tools like VSS (Microsoft Visual SourceSafe), Documentum, etc.

Configuration Controller Roles and Responsibilities

  1. CCs are responsible for creating the folder structures in VSS or Documentum. For example, in any project the following folders are created to maintain the project deliverables:

     1. Project Kick-off
     2. Minutes of Meeting
     3. Requirements
     4. Review Log
     5. Development
        5.1 High Level Design
        5.2 Low Level Design
        5.3 Issue Log
        5.4 Emails
     6. Testing
        6.1 Unit Testing
        6.2 System Testing
        6.3 System Integration Testing
        6.4 User Acceptance Testing
     7. Production Support

CCs have the admin rights to grant and revoke access to the folders.

Developers should not have access to the folders related to Testing, and vice versa.

  2. CCs maintain the check-in and check-out of the documents that go into VSS
  3. CCs ensure that the relevant documents are kept in the corresponding folders in VSS

DPA – Defect Prevention Analyst

Defect Prevention Analysts maintain the defects across the SDLC. For any raised defect, the workflow should be maintained, and proper comments should be given when defects are created. All high-severity defects should be fixed before moving from one phase to the next, to begin with.

As testers, when we raise defects we need to concentrate on applying the Defect Cause and Defect Type in the defect management tool. This helps DPAs classify the defects and come up with prevention tips.

Defect Cause – the root cause of the defect; that is:

  • Is the defect caused by the upstream source data or test data?
  • Is the defect caused by incomplete or missing requirements?
  • Is the defect caused by an inconsistent requirement?
  • Is the defect caused by a code discrepancy?
  • If you find any anomalies in any documents, then raise the defects against Artefacts
  • If any of your defects leads to a change in the requirement, then please raise it as a Change Request – a CR can be on the actual business requirement or on design changes

Defect Type – the classification of the defect; that is:

  • Is the defect related to Data Error
  • Is the defect related to Application Code
  • Is the defect related to Work Request
  • Is the defect related to Change Request
  • Is the defect related to Deployment
  • Is the defect related to Specification
  • Is the defect related to Artefact
  • Is the defect related to Production
  • Is the defect related to Process
  • Is the defect related to Environment
  • Is the defect related to Requirements
  • Is the defect related to Reviews

The DPA’s most prominent work is to prepare the CAPA – “Corrective And Preventive Actions”.

DPA roles and Responsibilities

  1. DPAs collect the metrics related to defects periodically – weekly, monthly or ad hoc
  2. DPAs group the defects by defect classification – in a given period of time, how many defects were raised in reviews, code, requirement changes – and collect the causes of the defects
  3. Then, using the metrics retrieved from the defect management tool, they build the fishbone diagram by filling in the necessary details
  4. Using statistical tools like Minitab, they calculate the defect density of the defects from each phase (a rough illustration of the arithmetic follows this list)
  5. Then they create prevention actions for the areas where the defect density is above the threshold limit
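
As a rough illustration of the defect density arithmetic (the size measure – documents, KLOC, test cases – varies by organization, so treat the numbers as hypothetical):

    Defect Density = number of defects found / size of the work product
    e.g. 12 review defects across 20 documents = 12 / 20 = 0.6, which breaches a 0.5 threshold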

Suppose your organization has a defect density threshold limit of 0.5, and the defect density under the Review defect type is more than 0.5; then they will ask the reviewers to do their work better, to minimize review defects at all levels of the SDLC.

Cheers – Asik