Database Testing Checklist for Test Engineers

Some time back I decided to create a checklist for database testing, to make sure we cover every aspect of it. With the help of my colleagues, Neha and Ankit, we prepared this list. Add one more column to the list to record the result of each check.

 

Database Testing Checklist

1) Data Integrity

1. Is the complete data in the database stored in tables?
2. Is the data logically well organized?
3. Is the data stored in the tables correct?
4. Is any unnecessary data present?
5. Is the data present in the correct table?
6. Is the data present in the correct field within the table?
7. Is the stored data correct with respect to the data updated from the front end?
8. Are LTRIM and RTRIM performed on data before it is inserted into the database? (see the sample query below)
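
For example, check 8 can be probed with a simple query. A minimal sketch, assuming a hypothetical Customers table with a FirstName column:

    -- Find rows that still carry leading/trailing spaces, i.e. rows where
    -- LTRIM/RTRIM was apparently not applied before insert. DATALENGTH is
    -- used because SQL Server ignores trailing spaces in = comparisons.
    -- Customers and FirstName are placeholder names.
    SELECT CustomerId, FirstName
    FROM Customers
    WHERE DATALENGTH(FirstName) <> DATALENGTH(LTRIM(RTRIM(FirstName)));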

 

 

2) Field Validations

1. Is the 'Allow Null' condition removed at the database level for fields that are mandatory on the UI?
2. Is 'NULL' as a value not allowed in the database?
3. Is the field non-mandatory on the UI when NULL values are allowed for that field?
4. Is the field length specified on the UI the same as the field length in the table that stores that element?
5. Is the length of each field of sufficient size?
6. Is the data type of each field as per the specifications? (see the metadata query below)
7. Do all similar fields have the same names across tables?
8. Are there any computed fields in the database?
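
Checks 1, 5 and 6 can be compared against the UI specification with a metadata query; this sketch uses the standard INFORMATION_SCHEMA views:

    -- List nullability, data type and maximum length of every column,
    -- for comparison against the UI field specification.
    SELECT TABLE_NAME, COLUMN_NAME, IS_NULLABLE, DATA_TYPE,
           CHARACTER_MAXIMUM_LENGTH
    FROM INFORMATION_SCHEMA.COLUMNS
    ORDER BY TABLE_NAME, ORDINAL_POSITION;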

 

 

3) Constraints

1. Are the required primary key constraints created on the tables?
2. Are the required foreign key constraints created on the tables?
3. Are the references for the foreign keys valid? (see the orphan-row query below)
4. Is the data type of a primary key and the corresponding foreign key the same in both tables?
5. Is 'Allow Null' disallowed for the primary key?
6. Does the foreign key contain only non-NULL values?
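
Check 3 can be probed with an orphan-row query. A sketch, assuming hypothetical Orders and Customers tables where Orders.CustomerId should reference Customers:

    -- Orders rows whose CustomerId has no matching parent row.
    -- Orders and Customers are placeholder names.
    SELECT o.OrderId, o.CustomerId
    FROM Orders o
    LEFT JOIN Customers c ON c.CustomerId = o.CustomerId
    WHERE o.CustomerId IS NOT NULL
      AND c.CustomerId IS NULL;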

 

 

4) Stored Procedures/Functions

1. Are proper coding conventions followed?
2. Is proper exception handling in place?
3. Are all conditions/loops covered by the input data?
4. Is TRIM applied when data is fetched from the database?
5. Does executing the stored procedure manually give the correct result? (see the example below)
6. Does executing the stored procedure manually update the table fields as expected?
7. Does execution of the stored procedure fire the required triggers?
8. Are all stored procedures/functions used by the application (i.e., no unused stored procedures present)?
9. Do stored procedures have the 'Allow Null' condition checked at the database level?
10. Do all stored procedures and functions execute successfully when the database is blank?
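
Checks 5 and 6 amount to running the procedure by hand and inspecting its effect. A sketch with a hypothetical procedure name and parameters:

    -- Execute the stored procedure manually with known inputs.
    -- usp_UpdateOrderStatus, its parameters and the Orders table are placeholders.
    EXEC dbo.usp_UpdateOrderStatus @OrderId = 1001, @Status = N'Shipped';

    -- Verify the table fields were updated as expected.
    SELECT OrderId, Status
    FROM Orders
    WHERE OrderId = 1001;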

 

 

5) Triggers

1. Are proper coding conventions followed in the triggers?
2. Are the triggers executed for the respective DML transactions?
3. Does the trigger update the data correctly once executed? (see the example below)
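
Checks 2 and 3 can be exercised by firing the DML statement directly and then inspecting the table the trigger writes to. All names in this sketch are placeholders:

    -- Fire the UPDATE that the trigger listens for.
    UPDATE Orders SET Status = N'Cancelled' WHERE OrderId = 1001;

    -- The trigger is expected to have written an audit row; verify it.
    SELECT TOP (1) *
    FROM OrdersAudit
    ORDER BY ChangedAt DESC;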

 

 

6) Indexes

1. Are the required clustered indexes created on the tables?
2. Are the required non-clustered indexes created on the tables? (see the catalog query below)
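
Both checks can be verified against the SQL Server catalog views; a sketch:

    -- List every clustered and non-clustered index per table.
    SELECT t.name AS TableName, i.name AS IndexName, i.type_desc
    FROM sys.indexes i
    JOIN sys.tables t ON t.object_id = i.object_id
    WHERE i.type_desc IN ('CLUSTERED', 'NONCLUSTERED')
    ORDER BY t.name, i.name;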

 

 

7) Transactions

1. Are the transactions performed correctly?
2. Is the data committed if the transaction executes successfully?
3. Is the data rolled back if the transaction does not execute successfully? (see the example below)
4. Is the data rolled back if the transaction does not execute successfully and multiple databases are involved in the transaction?
5. Are all the transactions executed using the TRANSACTION keyword?
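
Checks 2 and 3 can be demonstrated with the classic transfer pattern (SQL Server 2012+ syntax; the Accounts table is a placeholder):

    BEGIN TRY
        BEGIN TRANSACTION;
        -- Both statements must succeed together or not at all.
        UPDATE Accounts SET Balance = Balance - 100 WHERE AccountId = 1;
        UPDATE Accounts SET Balance = Balance + 100 WHERE AccountId = 2;
        COMMIT TRANSACTION;       -- check 2: commit on success
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION; -- check 3: roll back on failure
        THROW;
    END CATCH;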

 

 

8) Security

1. Is the data secured from unauthorized access?
2. Are different user roles created with different permissions? (see the example below)
3. Do all the users have the intended level of access to the database?
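
Check 2 can be set up and then probed per role; a sketch with placeholder role, user, and table names (SQL Server 2012+ ALTER ROLE syntax):

    -- A read-only role: SELECT allowed, all writes explicitly denied.
    CREATE ROLE ReportingReadOnly;
    GRANT SELECT ON dbo.Orders TO ReportingReadOnly;
    DENY INSERT, UPDATE, DELETE ON dbo.Orders TO ReportingReadOnly;
    ALTER ROLE ReportingReadOnly ADD MEMBER ReportUser;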

 

 

9) Performance

1. Does the database perform as expected (within the expected time) when a query is executed for a small number of records?
2. Does the database perform as expected (within the expected time) when a query is executed for a large number of records?
3. Does the database perform as expected (within the expected time) when multiple users access the same data?
4. Is performance profiling done? (see the example below)
5. Is performance benchmarking done?
6. Is a query execution plan created?
7. Is database testing done when the server is behind a load balancer?
8. Is the database normalized?
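
Checks 1, 2 and 4 can be measured directly in a query window; the query under test here is a placeholder:

    -- Report CPU/elapsed time and logical reads for the statement below.
    SET STATISTICS TIME ON;
    SET STATISTICS IO ON;

    SELECT o.OrderId, c.CustomerId
    FROM Orders o
    JOIN Customers c ON c.CustomerId = o.CustomerId;

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;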

 

 

10) Miscellaneous

1. Are log entries added in the database for all login events?
2. Is it verified in SQL Profiler that queries are executed only through stored procedures (i.e., no direct query visible in Profiler)?
3. Do scheduled jobs execute on time? (see the query below)
4. Is a test data development tool available?
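
Check 3 can be verified from the msdb job history tables of SQL Server Agent; a sketch:

    -- Most recent outcome of each scheduled job (run_status 1 = succeeded).
    SELECT j.name, h.run_date, h.run_time, h.run_status
    FROM msdb.dbo.sysjobs j
    JOIN msdb.dbo.sysjobhistory h ON h.job_id = j.job_id
    WHERE h.step_id = 0          -- step 0 holds the overall job outcome
    ORDER BY h.run_date DESC, h.run_time DESC;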

 

 

11) SQL Injection

1. Are the queries parameterized? (see the example below)
2. Does the URL contain any query details?
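
Check 1, on the database side, means dynamic SQL should go through sp_executesql with bound parameters rather than string concatenation. A sketch with placeholder table and column names:

    -- The value is bound as a parameter, so input such as
    -- ' OR 1=1 -- stays inert instead of altering the query.
    DECLARE @UserName NVARCHAR(50) = N'alice'' OR 1=1 --';

    EXEC sp_executesql
         N'SELECT UserId FROM Users WHERE UserName = @name',
         N'@name NVARCHAR(50)',
         @name = @UserName;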

 

 

12) Backup and Recovery

1. Is a timely backup of the database taken? (see the example below)
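
The check can be paired with a verification step; the database name and path below are placeholders:

    -- Full backup with page checksums, then verify the file is restorable.
    BACKUP DATABASE SalesDb
    TO DISK = N'D:\Backups\SalesDb_Full.bak'
    WITH CHECKSUM, INIT;

    RESTORE VERIFYONLY
    FROM DISK = N'D:\Backups\SalesDb_Full.bak'
    WITH CHECKSUM;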

 

 


Database Testing for Quality Products

Until a few years ago, database testing was done by the developers who also developed the database objects. However, as database complexity grows, there is demand for dedicated database test engineers who can test the database as both black box and white box testers. Database testing is important, considering that a business application is about capturing users' inputs, processing them, and saving the transactional data back to the database.

There is a need for test engineers who can test the database objects to ensure early detection of issues. Considering the importance of database testing, a separate database test plan is now created. A database test plan covers two aspects of the database: black box and white box.

Black box testing:

• Verify the database schema against the ER diagram.

White box testing:

• Unit testing of database objects such as stored procedures, functions, etc.
• Performance tuning
• Optimization of queries, query execution plans, stored procedures, and functions.

Orthogonal Array: High Test Coverage at Low Cost and Effort

The Orthogonal Array (OA) testing technique is a statistical way of reducing test cases by removing redundant test conditions. A software system works on a large number of interactions and integrations, and it is very difficult to find the erring interaction or integration. It has been observed that interactions and integrations are a major source of bugs.

The bugs in software are usually caused by a specific condition, independent of the other conditions in the system. Here a condition is a combination of variables, configurations, and logical links between variables.

Think about testing a simple system that has four variables, where each variable can take 3 values. We need 81 cases (3x3x3x3) to test every combination in this system. If we use the OA technique, we need only 9 cases, in which every pair-wise combination is validated. This will unearth the combination that is causing the failure in the system.

In the above case, the OA technique has reduced the test conditions by 89%; i.e., all the variable pairs are now validated with 11% of the original test cases. Interestingly, it provides 100% coverage of pair combinations, 33% of three-way combinations, and 11% of four-way combinations. To reduce the test conditions further, the OA technique can also be applied to the three-way and four-way combinations.
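
For the example above, the standard L9(3^4) orthogonal array gives the 9 cases. The four variables are the columns A-D and their 3 values are coded 1-3; every pair of columns contains each of the 9 possible value pairs exactly once:

    Case   A  B  C  D
     1     1  1  1  1
     2     1  2  2  2
     3     1  3  3  3
     4     2  1  2  3
     5     2  2  3  1
     6     2  3  1  2
     7     3  1  3  2
     8     3  2  1  3
     9     3  3  2  1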

Use of the OA technique will ensure:

  • All the pair-wise combinations of the selected variables are validated.
  • An effective test set with fewer test cases.
  • Lower testing cost, as test cycles are shorter.

For one of our clients, a leader in long-term health care, the OA technique reduced the potential test cases from approximately 3592 to 1151, i.e., a reduction of 68%. Test cycle effort was reduced from 89 to 29 person-days.

The testing team implemented the OA technique on the target system module by module. The team listed all the test conditions and then broke them down, where possible, into smaller conditions, each of which could be tested with a pair of variables. This activity resulted in one OA for each module; depending on the count of variables and input values, the size of each OA differed.

Sqoop – Data Transfer Tool Between Relational Databases and HDFS

Sqoop is software for transferring data between relational databases and Hadoop. Sqoop became a top-level Apache project in March 2012. It can be used, for example, to populate tables from a relational or NoSQL database for use with Hive or HBase.

Microsoft uses a Sqoop-based connector to help transfer data from Microsoft SQL Server databases to Hadoop. Couchbase also provides a Couchbase-Hadoop connector by means of Sqoop.
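
A minimal sketch of typical usage: pulling one relational table into Hive with a single import command. The JDBC connection string, credentials, and table name below are hypothetical.

    # Import the "customers" table from a MySQL database into Hive.
    # URL, user, and table name are placeholders; -P prompts for a password.
    sqoop import \
      --connect jdbc:mysql://dbserver/sales \
      --username etl_user -P \
      --table customers \
      --hive-import \
      -m 4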

Limitations of Hadoop

• Hadoop MapReduce and HDFS are under active development.

• The programming model is very restrictive: the lack of central data can be limiting.

• Joins of multiple datasets are tricky and slow: there are no indices, and often the entire dataset gets copied in the process.

• Cluster management is hard: in the cluster, operations like debugging, distributing software, and collecting logs are difficult.

• There is still a single master, which requires care and may limit scaling.

• Managing job flow isn't trivial when intermediate data should be kept.

• The optimal configuration of nodes is not obvious, e.g., the number of mappers and reducers and the memory limits.