Top Interview Questions
Database testing is a crucial aspect of software quality assurance that focuses on validating the integrity, consistency, and reliability of data stored in a database. In modern applications, databases serve as the backbone, storing critical business data and ensuring smooth operation of software systems. Unlike functional testing, which primarily focuses on the user interface or application logic, database testing dives deep into the backend, examining data structures, data manipulation operations, and their adherence to business rules.
Databases are central to most applications, whether it is an e-commerce platform, banking system, healthcare application, or enterprise software. Any flaw in the database can lead to incorrect reports, business losses, or even system failures. Database testing ensures that the data retrieved, stored, or manipulated is accurate and consistent with the application requirements. Key reasons why database testing is essential include:
Data Integrity: Ensures that the data remains accurate and consistent across all operations.
Reliability of Data Retrieval: Verifies that queries fetch the correct data according to business logic.
Compliance with Business Rules: Confirms that the database constraints, triggers, and stored procedures adhere to defined rules.
Performance Optimization: Detects slow queries or performance bottlenecks, ensuring fast response times.
Security: Checks data access restrictions and prevents unauthorized operations on sensitive information.
Without proper database testing, an application may appear to work correctly from the user interface yet fail to deliver reliable data, leading to significant operational issues.
Database testing has several objectives, all aimed at maintaining the robustness and reliability of the database systems:
Validation of Schema: Ensures that tables, columns, keys, and relationships are correctly implemented according to the design.
Data Accuracy: Verifies that data stored matches the intended input and is correctly processed.
Stored Procedure and Trigger Testing: Ensures that business logic embedded in the database functions correctly.
Transaction Testing: Validates that database transactions (insert, update, delete) are atomic, consistent, isolated, and durable (ACID compliant).
Backup and Recovery Testing: Checks whether the database can be restored correctly after failure or crash.
Data Integrity Testing: Ensures that relationships among data entities are maintained and constraints are enforced.
Database testing can be broadly categorized into several types, each focusing on a specific aspect of the database:
Structural testing, also known as schema testing, focuses on verifying the design and structure of the database. It includes:
Validation of tables, columns, primary keys, and foreign keys.
Checking indexes, constraints, and triggers.
Ensuring normalization to eliminate data redundancy.
Confirming data types and default values for each field.
Functional testing validates that database operations perform as expected. This includes:
Verifying that stored procedures, functions, and triggers work correctly.
Checking the execution of CRUD (Create, Read, Update, Delete) operations.
Ensuring that business rules defined in the database are enforced accurately.
Non-functional database testing checks performance, scalability, and security aspects, including:
Performance Testing: Measures query execution time, indexing efficiency, and response under load.
Security Testing: Validates user roles, permissions, and access restrictions.
Backup and Recovery Testing: Ensures the database can be restored to a previous state without data loss.
Whenever the database schema or business logic is updated, regression testing ensures that the existing functionality is not affected. This includes testing previously validated stored procedures, triggers, and queries after modifications.
In scenarios where data is migrated from legacy systems or other databases, data migration testing ensures:
All data is migrated accurately.
Relationships between data entities are preserved.
Data quality is maintained post-migration.
Several techniques are employed in database testing to ensure comprehensive coverage:
In black box testing, testers validate database operations without considering the internal implementation. The focus is on input and output, ensuring the data stored matches expectations. For example, customer data inserted via an application form should be correctly reflected in the database.
White box testing involves examining the internal database logic. Testers check stored procedures, triggers, functions, and SQL queries for correctness. This technique ensures that all logical paths are tested for proper functionality.
SQL query testing is essential in database testing. Testers validate queries for:
Accuracy: Whether queries return the correct data.
Efficiency: Whether queries are optimized for performance.
Safety: Whether queries prevent SQL injection or unauthorized access.
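A minimal sketch of the first two checks, assuming a hypothetical orders table (injection safety is usually exercised from the application layer with crafted inputs rather than in raw SQL):

```sql
-- Accuracy: the aggregate the application reports should match the database.
SELECT customer_id, SUM(amount) AS db_total
FROM orders
GROUP BY customer_id;

-- Efficiency: inspect the execution plan (MySQL/PostgreSQL syntax shown;
-- Oracle uses EXPLAIN PLAN FOR).
EXPLAIN
SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id;
```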
Data integrity testing ensures that the database maintains consistency and accuracy over its lifecycle. This includes:
Entity Integrity: Primary keys must be unique and not null.
Referential Integrity: Foreign keys must correctly reference primary keys in related tables.
Domain Integrity: Data values must be within defined constraints.
User-Defined Integrity: Custom rules specific to business logic must be enforced.
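The four integrity types map directly onto constraint definitions. A minimal illustrative schema (table and column names are hypothetical; column-level REFERENCES and CHECK syntax varies slightly across RDBMSs):

```sql
CREATE TABLE departments (
    dept_id   INT PRIMARY KEY,                      -- entity integrity
    dept_name VARCHAR(50) NOT NULL UNIQUE
);

CREATE TABLE employees (
    emp_id    INT PRIMARY KEY,                      -- entity integrity
    dept_id   INT REFERENCES departments(dept_id),  -- referential integrity
    salary    DECIMAL(10,2) CHECK (salary > 0),     -- domain integrity
    CONSTRAINT chk_salary_cap
        CHECK (salary <= 500000)                    -- user-defined business rule
);
```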
Transactions are critical in ensuring that operations are executed reliably. Transaction testing verifies:
Atomicity: Either all steps of a transaction succeed, or none do.
Consistency: Database remains in a valid state after a transaction.
Isolation: Transactions do not interfere with each other.
Durability: Changes made by a transaction persist after system failure.
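A minimal atomicity check, assuming a hypothetical accounts table (BEGIN is PostgreSQL syntax; MySQL uses START TRANSACTION):

```sql
BEGIN;
INSERT INTO accounts (acc_id, balance) VALUES (9001, 100.00);
UPDATE accounts SET balance = balance + 50 WHERE acc_id = 9001;
ROLLBACK;

-- Verification: neither statement should have persisted.
SELECT COUNT(*) FROM accounts WHERE acc_id = 9001;  -- expected: 0
```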
Several tools can automate or assist in database testing, increasing efficiency and coverage:
Selenium with JDBC: Selenium can automate UI interactions while JDBC connects to the database for validation.
QTP/UFT: Unified Functional Testing tools allow both UI and database validation.
SQL Server Management Studio (SSMS): Useful for testing SQL Server databases manually.
Oracle SQL Developer: Used for testing Oracle databases and executing queries.
DbUnit: Java-based tool for database unit testing.
Tosca Testsuite: Supports database testing along with UI and API testing.
Automation in database testing reduces repetitive manual work and ensures accurate verification of complex queries and transactions.
Database testing is complex and comes with several challenges:
Large Volume of Data: Modern databases store millions of records, making manual validation difficult.
Complex Relationships: Ensuring consistency across multiple related tables can be challenging.
Dynamic Data: Real-time applications continuously modify data, complicating testing scenarios.
Performance Bottlenecks: Identifying inefficient queries or indexing issues requires deep expertise.
Security Concerns: Testing sensitive data without breaching privacy regulations is critical.
Synchronization Issues: Distributed databases may face synchronization or replication problems during testing.
Addressing these challenges requires careful planning, a combination of manual and automated testing, and skilled database testers.
To ensure effective database testing, organizations should follow best practices:
Understand the Database Design: Knowledge of schema, relationships, and business logic is critical before testing.
Use a Staging Environment: Avoid testing directly on production databases to prevent accidental data loss.
Automate Repetitive Tests: Use tools to automate query validation, data integrity checks, and regression testing.
Test Data Variety: Use a combination of valid, invalid, boundary, and stress data for comprehensive coverage.
Maintain Test Scripts: Keep SQL scripts organized, reusable, and maintainable.
Perform Regular Backups: Ensure backup and recovery processes are validated periodically.
Collaborate with Developers: Coordinate with database developers to understand complex logic and triggers.
Include Security and Compliance Checks: Ensure sensitive data is encrypted and access permissions are correctly implemented.
Database testing continues to evolve with technology advancements. Some trends include:
Cloud Database Testing: As more applications migrate to cloud-based databases like AWS RDS, Azure SQL, and Google Cloud SQL, testing in cloud environments becomes essential.
Big Data Testing: Testing massive datasets in technologies like Hadoop and Spark requires specialized strategies.
Automated Regression Testing: Advanced automation frameworks can run continuous database validation during DevOps pipelines.
AI-Driven Testing: Artificial intelligence can optimize query testing, detect anomalies, and predict performance issues.
Answer:
Database Testing is the process of testing the database to ensure data integrity, consistency, reliability, and correct behavior of database operations. It involves validating the data stored in tables, relationships, and transactions to ensure the application behaves as expected.
Key points to mention:
Ensures data consistency between the application and the database.
Verifies SQL queries and stored procedures.
Checks triggers, constraints, and indexes.
Validates database performance and security.
Answer:
Database Testing can be categorized into the following types:
Structural Testing (White Box Testing):
Verifies database structures such as tables, columns, indexes, and constraints.
Functional Testing (Black Box Testing):
Validates whether database operations meet the business requirements.
Non-Functional Testing:
Includes performance, scalability, reliability, and security testing of the database.
Data Integrity Testing:
Ensures data is accurate and consistent across the database.
Transactional Testing:
Checks whether transactions are committed or rolled back correctly.
Answer:
Data Integrity refers to maintaining accuracy and consistency of data in the database over its entire lifecycle.
Types of Data Integrity:
Entity Integrity: Ensures primary keys are unique and not null.
Referential Integrity: Ensures foreign keys match primary keys in another table.
Domain Integrity: Validates data type, format, and range of values.
User-Defined Integrity: Business-specific rules applied to the database.
Answer:
The main SQL commands are:
DDL (Data Definition Language): CREATE, ALTER, DROP
DML (Data Manipulation Language): SELECT, INSERT, UPDATE, DELETE
DCL (Data Control Language): GRANT, REVOKE
TCL (Transaction Control Language): COMMIT, ROLLBACK, SAVEPOINT
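One statement from each category, against a hypothetical demo table (MySQL/PostgreSQL syntax shown; Oracle's ALTER form differs slightly):

```sql
-- DDL
CREATE TABLE demo (id INT PRIMARY KEY, name VARCHAR(50));
ALTER TABLE demo ADD COLUMN created_at DATE;

-- DML
INSERT INTO demo (id, name) VALUES (1, 'sample');
UPDATE demo SET name = 'updated' WHERE id = 1;

-- DCL (report_user is a placeholder role)
GRANT SELECT ON demo TO report_user;

-- TCL
BEGIN;                         -- START TRANSACTION in MySQL
DELETE FROM demo WHERE id = 1;
SAVEPOINT sp1;
UPDATE demo SET name = 'oops';
ROLLBACK TO sp1;               -- undoes only the UPDATE
COMMIT;                        -- makes the DELETE permanent
```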
Answer:
Steps to perform database testing:
Requirement Analysis:
Understand the business logic, data model, and expected outputs.
Test Case Preparation:
Prepare SQL queries to validate data in tables.
Data Validation:
Check whether data entered through the application is correctly stored in the database.
Data Integrity Testing:
Verify relationships, constraints, and triggers.
Query Testing:
Validate stored procedures, views, functions, and triggers.
Transaction Testing:
Ensure proper commit and rollback mechanisms are working.
Performance Testing:
Validate indexing, query response time, and database performance.
| Aspect | Database Testing | Application Testing |
|---|---|---|
| Focus | Data, queries, integrity, and transactions | User interface, functionality, and workflow |
| Type | Back-end testing | Front-end testing |
| Tools | SQL, DBUnit, Oracle, MySQL | Selenium, QTP, JMeter |
| Goal | Ensure correct storage, retrieval, and integrity | Ensure software meets business requirements |
Answer:
ACID properties are the fundamental rules of database transactions:
Atomicity: Transaction is all or nothing (either fully completed or fully rolled back).
Consistency: Database remains in a valid state before and after the transaction.
Isolation: Transactions do not interfere with each other.
Durability: Once a transaction is committed, changes are permanent.
Answer:
Triggers are stored programs that execute automatically when certain events occur on a table, such as INSERT, UPDATE, or DELETE.
Testing Triggers:
Insert, update, or delete data to check trigger execution.
Validate if the trigger correctly modifies data or logs actions.
Ensure it doesn’t violate data integrity.
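A minimal trigger test, sketched in MySQL syntax with hypothetical employees and emp_audit tables (Oracle and PostgreSQL declare triggers differently):

```sql
CREATE TRIGGER trg_salary_audit
AFTER UPDATE ON employees
FOR EACH ROW
INSERT INTO emp_audit (emp_id, old_salary, new_salary, changed_at)
VALUES (OLD.emp_id, OLD.salary, NEW.salary, NOW());

-- Trigger the event, then verify exactly one audit row was written.
UPDATE employees SET salary = salary * 1.10 WHERE emp_id = 101;
SELECT COUNT(*) FROM emp_audit WHERE emp_id = 101;  -- expected: 1
```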
Answer:
A stored procedure is a set of SQL statements stored in the database that can be executed repeatedly.
Testing steps:
Execute stored procedure with valid inputs.
Validate output against expected results.
Test with invalid inputs to verify error handling.
Check transaction management within the procedure.
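A sketch of those steps in MySQL syntax, using a hypothetical procedure (CALL and parameter syntax vary by RDBMS):

```sql
DELIMITER //
CREATE PROCEDURE get_employee_count(IN p_dept_id INT, OUT p_count INT)
BEGIN
    SELECT COUNT(*) INTO p_count FROM employees WHERE dept_id = p_dept_id;
END //
DELIMITER ;

-- Valid input: compare the OUT parameter against a direct query.
CALL get_employee_count(10, @cnt);
SELECT @cnt, (SELECT COUNT(*) FROM employees WHERE dept_id = 10);

-- Edge-case input: a non-existent department should return 0, not an error.
CALL get_employee_count(-1, @cnt);
SELECT @cnt;  -- expected: 0
```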
Answer:
Constraints enforce rules on data in tables:
Primary Key (PK): Unique identifier for a table row.
Foreign Key (FK): Ensures referential integrity with another table.
Unique: Ensures column values are unique.
Not Null: Column must have a value.
Check: Restricts values based on a condition.
Default: Provides a default value if none is supplied.
Answer:
Data validation involves checking:
Consistency: Compare database values with application input.
Accuracy: Correct calculations, aggregations, and sums.
Completeness: No missing or null values where mandatory.
Referential integrity: All foreign key references are correct.
| Feature | OLTP | OLAP |
|---|---|---|
| Purpose | Transaction processing | Analytical processing |
| Data Volume | Small, frequent transactions | Large, historical data |
| Query Type | Simple, fast queries | Complex queries |
| Example | Banking system, e-commerce | Data warehouse, BI reports |
Answer:
SQL Developer / TOAD: Execute queries, test procedures, and triggers.
DBUnit: Java-based testing framework.
Selenium + JDBC: For end-to-end testing.
Data Factory / Informatica: ETL testing.
QuerySurge: Automates ETL and database testing.
| Join Type | Description |
|---|---|
| Inner Join | Returns matching rows from both tables |
| Left Join | Returns all rows from the left table and matched rows from right |
| Right Join | Returns all rows from the right table and matched rows from left |
| Full Join | Returns all rows when there is a match in either table |
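Illustrative queries for the table above, assuming hypothetical employees and departments tables:

```sql
SELECT e.name, d.dept_name
FROM employees e
INNER JOIN departments d ON e.dept_id = d.dept_id;  -- matching rows only

SELECT e.name, d.dept_name
FROM employees e
LEFT JOIN departments d ON e.dept_id = d.dept_id;   -- all employees, matched or not

SELECT e.name, d.dept_name
FROM employees e
FULL OUTER JOIN departments d
    ON e.dept_id = d.dept_id;                       -- all rows from both sides (not supported in MySQL)
```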
Answer:
Compare row counts of source and target tables.
Validate data accuracy by sampling critical records.
Verify constraints, triggers, and stored procedures.
Test application functionality using migrated data.
Ensure performance and indexing are not affected.
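A sketch of the first two checks, with placeholder source/target names:

```sql
-- Row-count comparison between source and target.
SELECT
    (SELECT COUNT(*) FROM source_db.customers) AS source_rows,
    (SELECT COUNT(*) FROM target_db.customers) AS target_rows;

-- Records present in the source but missing from the target.
SELECT s.customer_id
FROM source_db.customers s
LEFT JOIN target_db.customers t ON s.customer_id = t.customer_id
WHERE t.customer_id IS NULL;
```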
Answer:
ETL Testing (Extract, Transform, Load) ensures data is correctly extracted from the source, transformed as per business rules, and loaded into the target database or data warehouse.
Focus Areas:
Data completeness
Data accuracy
Transformation logic verification
Performance of ETL processes
Answer:
Handling large volumes of data.
Complex business rules for validation.
Testing stored procedures and triggers.
Maintaining test environments consistent with production.
Ensuring data integrity after migrations or updates.
Answer:
Monitor query execution time.
Check index usage and optimize queries.
Test concurrent user access.
Validate database response time for high volumes.
Identify and fix bottlenecks like locking or deadlocks.
Answer:
Yes. Using tools like SQL Developer, TOAD, or DBUnit, you can:
Execute SQL queries directly.
Run stored procedures.
Insert, update, and delete test data.
Validate outputs, triggers, and constraints.
Prepare a detailed test plan including all tables, constraints, and procedures.
Use sample test data and boundary cases.
Automate repetitive query validations.
Ensure data integrity after each transaction.
Document test cases and expected vs actual results.
💡 Tips for Freshers:
Learn basic SQL queries: SELECT, JOIN, GROUP BY, HAVING, ORDER BY.
Understand relational database concepts and table relationships.
Practice testing stored procedures and triggers on a sample database.
Know the difference between OLTP and OLAP.
Be prepared for scenario-based questions: e.g., “If a transaction fails, how do you verify rollback?”
Answer:
A database schema is the blueprint of a database. It defines how data is organized and how the relationships between tables are structured.
Types of schema:
Physical Schema: How data is physically stored.
Logical Schema: How data is logically organized (tables, views, keys).
Example: Employees and Departments tables linked by a foreign key relationship.
Answer:
Normalization is the process of organizing data in a database to reduce redundancy and improve data integrity.
Forms of normalization:
1NF: Eliminate repeating groups.
2NF: Remove partial dependencies.
3NF: Remove transitive dependencies.
Importance:
Reduces duplicate data.
Ensures consistency and efficient storage.
Makes maintenance easier.
Answer:
Denormalization is the process of introducing redundancy to improve query performance.
When used:
When complex joins reduce performance.
For reporting databases or OLAP systems.
Trade-off: Improved performance but increased storage and maintenance.
Answer:
Indexes are database structures that improve query performance by reducing search time.
Testing Indexes:
Validate that queries execute faster with indexes.
Check whether indexes exist for frequently searched columns.
Test insert/update/delete operations to ensure indexes are updated.
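A before/after sketch, assuming a hypothetical orders table:

```sql
EXPLAIN SELECT * FROM orders WHERE customer_id = 42;  -- expect a full table scan

CREATE INDEX idx_orders_customer ON orders (customer_id);

EXPLAIN SELECT * FROM orders WHERE customer_id = 42;  -- expect an index lookup
-- Also re-run INSERT/UPDATE/DELETE timings: each index adds write overhead.
```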
Answer:
A deadlock occurs when two or more transactions wait for each other to release locks, leaving each blocked until the database detects the cycle and rolls one back.
Testing Deadlocks:
Simulate multiple concurrent transactions.
Use DB tools to monitor locks.
Check whether the database handles deadlocks (abort or rollback one transaction).
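A classic two-session sketch that provokes a lock cycle on a hypothetical accounts table; a well-behaved RDBMS should detect the cycle and roll back one victim:

```sql
-- Session 1:
BEGIN;
UPDATE accounts SET balance = balance - 10 WHERE acc_id = 1;

-- Session 2:
BEGIN;
UPDATE accounts SET balance = balance - 10 WHERE acc_id = 2;

-- Session 1 (blocks, waiting on session 2's lock):
UPDATE accounts SET balance = balance + 10 WHERE acc_id = 2;

-- Session 2 (completes the cycle; the database aborts one transaction):
UPDATE accounts SET balance = balance + 10 WHERE acc_id = 1;
```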
Answer:
Identify the type of trigger (BEFORE/AFTER, INSERT/UPDATE/DELETE).
Execute the corresponding action in the application or SQL.
Check if the trigger executed correctly (e.g., logging, updating another table).
Verify that no unintended changes occurred.
| Feature | DELETE | TRUNCATE |
|---|---|---|
| Deletes rows | Yes, based on condition | All rows |
| DML/DDL | DML | DDL |
| Transaction log | Logged | Minimal logging |
| Rollback | Can rollback | Can rollback in some DBs (depends on RDBMS) |
| Triggers | Activates triggers | Doesn’t activate triggers |
| Feature | Primary Key | Unique Key |
|---|---|---|
| Null values | Not allowed | Allowed (in some DBs, usually 1 null) |
| Uniqueness | Ensures unique record | Ensures unique value |
| Number of keys per table | One | Multiple |
| Purpose | Identifies record | Prevents duplicates |
Answer:
Identify input parameters and expected outputs.
Execute procedure with valid input and validate outputs.
Execute with invalid or edge case input to check error handling.
Test transaction behavior (COMMIT/ROLLBACK).
Answer:
A view is a virtual table created from a query of one or more tables.
Testing Views:
Check if the view displays correct data.
Test join and filter conditions used in the view.
Validate that updates or inserts through views work as expected (if allowed).
Answer:
A cursor is a database object that retrieves rows one by one from a result set.
Testing Cursors:
Open the cursor and fetch rows sequentially.
Validate that the fetched data matches the query.
Close the cursor and ensure no resource leaks occur.
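A minimal sketch in Oracle PL/SQL (cursor syntax differs in T-SQL and MySQL), iterating a hypothetical employees table:

```sql
DECLARE
    CURSOR emp_cur IS SELECT emp_id, name FROM employees;
    v_id   employees.emp_id%TYPE;
    v_name employees.name%TYPE;
BEGIN
    OPEN emp_cur;
    LOOP
        FETCH emp_cur INTO v_id, v_name;
        EXIT WHEN emp_cur%NOTFOUND;       -- stop once the result set is exhausted
        DBMS_OUTPUT.PUT_LINE(v_id || ': ' || v_name);
    END LOOP;
    CLOSE emp_cur;                        -- always close to avoid resource leaks
END;
/
```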
| Feature | Clustered Index | Non-Clustered Index |
|---|---|---|
| Data storage | Sorts data physically | Separate structure |
| Number per table | One | Multiple allowed |
| Query performance | Faster for range queries | Slower than clustered |
| Use case | Primary key column | Search-heavy columns |
Answer:
Verify that mandatory columns don’t accept NULL.
Check whether queries handle NULL correctly using IS NULL or COALESCE().
Validate reports and calculations for NULL handling.
Test default values for optional columns.
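Two illustrative checks, assuming hypothetical customers and orders tables:

```sql
-- Mandatory column: no NULLs should ever appear.
SELECT COUNT(*) FROM customers WHERE email IS NULL;   -- expected: 0

-- Optional column: COALESCE guards calculations against NULLs.
SELECT order_id, COALESCE(discount, 0) AS discount_applied
FROM orders;
```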
Answer:
Data migration testing validates that data moved from source to target is accurate, complete, and consistent.
Steps:
Compare row counts between source and target.
Validate data mapping rules and transformations.
Test primary/foreign key relationships.
Validate triggers, stored procedures, and views.
Answer:
Execute a transaction with multiple DML operations.
Introduce an error in one operation.
Verify that none of the previous operations are committed.
Check database logs and ensure data integrity.
Answer:
A sequence generates unique numbers automatically.
Used for primary keys or auto-increment values.
Testing sequences:
Verify correct increment values.
Check behavior after rollback or deletion (consumed values are not reused, so gaps are expected).
Ensure uniqueness across concurrent transactions.
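A brief sketch (Oracle/PostgreSQL style; MySQL uses AUTO_INCREMENT columns instead of standalone sequences):

```sql
CREATE SEQUENCE order_seq START WITH 1 INCREMENT BY 1;

SELECT NEXTVAL('order_seq');  -- PostgreSQL form; Oracle: SELECT order_seq.NEXTVAL FROM dual;

-- Uniqueness check after concurrent inserts: no duplicates expected.
SELECT order_id, COUNT(*)
FROM orders
GROUP BY order_id
HAVING COUNT(*) > 1;
```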
| Feature | Normal View | Materialized View |
|---|---|---|
| Storage | No storage | Stores data physically |
| Updates | Always reflects real-time data | Needs refresh to update data |
| Performance | Slower for large datasets | Faster as data is precomputed |
| Use Case | Simple reporting | Aggregated reporting |
Answer:
Work in short iterations with developers.
Write SQL queries to validate database changes for each sprint.
Use automation tools for repeated regression testing.
Validate data integrity after frequent deployments.
Answer:
Check user roles and privileges.
Verify that unauthorized users cannot access tables or data.
Test SQL injection vulnerabilities.
Ensure audit logs are maintained.
Answer:
Compare source vs target row counts.
Validate transformation rules applied correctly.
Check primary/foreign key relationships.
Verify aggregate functions and calculations.
Test incremental and full load processes.
Answer:
Orphan records: Records in a child table without a matching parent record.
Detection query:
```sql
SELECT *
FROM ChildTable c
LEFT JOIN ParentTable p ON c.ParentID = p.ID
WHERE p.ID IS NULL;
```
Important for data integrity testing.
| Feature | UNION | UNION ALL |
|---|---|---|
| Duplicates | Removes duplicates | Keeps duplicates |
| Performance | Slower (removes duplicates) | Faster |
| Use Case | Combine unique records | Combine all records |
Answer:
Ensures that foreign key values match primary key values in another table.
Testing steps:
Insert valid child records and verify success.
Insert invalid child records and verify error.
Delete parent records and check cascade or restrict behavior.
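Those three steps as SQL, assuming orders.customer_id references customers.customer_id:

```sql
INSERT INTO customers (customer_id, name) VALUES (1, 'Alice');

INSERT INTO orders (order_id, customer_id) VALUES (100, 1);    -- valid parent: should succeed
INSERT INTO orders (order_id, customer_id) VALUES (101, 999);  -- missing parent: should fail

DELETE FROM customers WHERE customer_id = 1;  -- should cascade or be rejected, per the FK definition
```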
Answer:
Perform a database backup.
Delete or corrupt some data in the database.
Restore database from backup.
Verify that data, structure, and relationships are intact.
| Feature | DBMS | RDBMS |
|---|---|---|
| Data storage | File-based | Table-based with relationships |
| Relationships | Not enforced | Enforced via keys |
| SQL support | Limited | Full SQL support |
| Example | Microsoft Access | Oracle, MySQL, SQL Server |
Answer:
Database Testing is validating the back-end database to ensure that data is accurate, consistent, secure, and performs well.
Includes verification of tables, views, triggers, stored procedures, indexes, constraints, and transactions.
Focuses on data integrity, performance, and security.
Also includes ETL, migration, and API data verification.
Answer:
Use SQL queries to verify primary key, foreign key, and unique constraints.
Validate referential integrity between parent and child tables.
Compare source vs target datasets (for migrations).
Sample data with boundary and negative cases.
Use automation scripts (Python, Java, or SQL) for repetitive checks.
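Two typical constraint checks, against a hypothetical employees table:

```sql
-- Primary/unique key: duplicate values indicate a broken constraint.
SELECT emp_id, COUNT(*)
FROM employees
GROUP BY emp_id
HAVING COUNT(*) > 1;

-- Mandatory column: NULLs indicate a missing NOT NULL constraint.
SELECT COUNT(*) FROM employees WHERE emp_id IS NULL;
```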
| Feature | OLTP | OLAP |
|---|---|---|
| Purpose | Transaction processing | Analytical reporting |
| Data | Current, small | Historical, large |
| Queries | Short, simple | Complex, aggregations |
| Testing Focus | CRUD operations, ACID | Aggregations, ETL, performance |
| Example | Banking system | Data warehouse |
Answer:
For experienced testers, OLTP testing focuses on transaction accuracy and ACID compliance, while OLAP testing focuses on ETL validation, aggregation accuracy, and the performance of analytical queries.
Answer:
Data Completeness: All records from source loaded into target.
Data Accuracy: Values match after transformation.
Transformation Rules: Business rules applied correctly.
Data Type Validation: Target column types match expected formats.
Performance: ETL loads executed within SLA.
Error Handling: Invalid data logged correctly.
Incremental Loads: Only new/changed records updated.
Answer:
Identify heavy queries or frequent stored procedures.
Measure query execution time using EXPLAIN PLAN or profiling tools.
Test concurrent transactions for locking, deadlocks, and contention.
Check indexes and optimize queries.
Monitor CPU, memory, and disk I/O for database performance.
Use tools like SQL Profiler, Oracle AWR, Query Analyzer.
Answer:
Atomicity: Ensure a transaction either commits fully or rolls back completely.
Consistency: Verify that constraints and rules are enforced after transactions.
Isolation: Test concurrent transactions do not interfere with each other.
Durability: Committed transactions persist after crash/restart.
Validation Example:
Create a multi-step transaction and intentionally cause an error to verify rollback (atomicity).
Simulate concurrent updates to test isolation levels.
Answer:
Identify the trigger type (INSERT/UPDATE/DELETE, BEFORE/AFTER).
Perform the triggering action and validate its effect (e.g., audit table insert).
Test boundary and negative scenarios.
Ensure triggers do not cause performance issues.
Monitor logs and error handling.
| Feature | Stored Procedure | Function |
|---|---|---|
| Return Value | Optional | Must return a value |
| Usage | Can be called independently | Can be called in SQL statements |
| Side Effects | Can modify data | Should not modify data (ideally) |
| Transaction Control | Allowed | Limited |
Answer:
Experienced testers validate parameters, output, error handling, and transaction handling for stored procedures and functions.
Answer:
Row Count Validation: Ensure total rows in source and target match.
Data Validation: Compare sample records for accuracy.
Schema Validation: Check constraints, indexes, triggers.
Business Rules Validation: Ensure transformation logic applied correctly.
Performance Testing: Validate queries run efficiently in the new system.
Backup/Restore Testing: Verify rollback and recovery procedures.
Answer:
Normalization: Process of organizing data to reduce redundancy and dependency.
Validation:
Check tables for repeating groups (1NF).
Ensure non-key attributes depend on the primary key (2NF).
Ensure attributes are non-transitively dependent (3NF).
Verify database changes don’t break relationships or constraints.
Answer:
Use execution plans (EXPLAIN PLAN in Oracle, SET SHOWPLAN_ALL in SQL Server).
Monitor CPU, memory, and I/O usage during execution.
Execute with large datasets to check scalability.
Compare response times before and after optimization.
Answer:
Worked with Informatica, Talend, DataStage, or QuerySurge.
Automate ETL validation using SQL scripts or automation frameworks.
Validate data completeness, accuracy, and transformation rules automatically.
Schedule daily/weekly ETL regression tests.
Answer:
Test using multiple simultaneous transactions.
Validate locking mechanisms and isolation levels (READ COMMITTED, SERIALIZABLE).
Simulate deadlocks and long-running queries.
Ensure ACID compliance during concurrent updates.
Answer:
Verify user roles and permissions.
Ensure no unauthorized access to sensitive tables or views.
Test encryption of sensitive data.
Check audit logs and monitor database activity.
Validate SQL injection prevention and security best practices.
Answer:
Check if frequently queried columns are indexed.
Use EXPLAIN PLAN to ensure queries use indexes.
Validate impact on insert/update/delete performance.
Monitor index fragmentation and maintenance.
Answer:
Maintain regression scripts for frequent data validation.
Automate SQL queries to check data integrity and business rules.
Validate stored procedures, triggers, and functions after changes.
Use test data management to ensure repeatable test scenarios.
Answer:
Perform full and incremental backups.
Corrupt or delete some data.
Restore from backup and validate data accuracy.
Check database recovery times and integrity after crash scenarios.
Test transaction rollback recovery.
Answer:
Validate joins, aggregations, subqueries, and group by conditions.
Compare results with manual calculations or sample datasets.
Test edge cases like NULLs, empty tables, and large data.
Monitor execution time and optimize queries if needed.
Answer:
Validate extract, transform, and load processes.
Data quality checks:
Completeness (all rows loaded)
Accuracy (values transformed correctly)
Consistency (data matches source)
Conformity (formats and types correct)
Automation: SQL scripts, ETL validation tools, or custom scripts.
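A sketch of completeness and accuracy checks with placeholder staging/warehouse names (EXCEPT is PostgreSQL/SQL Server syntax; Oracle uses MINUS):

```sql
-- Completeness: staged vs loaded row counts.
SELECT
    (SELECT COUNT(*) FROM staging.sales)   AS staged_rows,
    (SELECT COUNT(*) FROM warehouse.sales) AS loaded_rows;

-- Accuracy: a transformed aggregate should reconcile with the source.
SELECT region, SUM(amount) AS total_amount
FROM staging.sales
GROUP BY region
EXCEPT
SELECT region, total_amount
FROM warehouse.sales_by_region;
-- An empty result means the transformation reconciles.
```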
Answer:
Check application query that generates the report.
Validate source database values.
Test ETL transformation logic if data comes from multiple sources.
Verify joins, filters, and aggregations in the SQL.
Compare report output vs database output.
Identify root cause (application, ETL, database, or report logic).
Answer:
Compare row counts and sample data across environments.
Check schema consistency (tables, columns, indexes).
Verify stored procedures, triggers, and views exist and behave the same.
Validate ETL jobs and reports across environments.
Answer:
SQL Developer, TOAD, MySQL Workbench – Query validation and stored procedure testing.
QuerySurge, Informatica, Talend – ETL validation and automation.
Selenium + JDBC / Python scripts – End-to-end automation including DB validation.
LoadRunner / JMeter – Database performance testing.
Answer:
Deadlock: Two transactions wait for each other, causing a cycle.
Testing: Simulate multiple concurrent transactions with locks.
Resolution:
Use proper transaction isolation levels.
Commit transactions quickly.
Detect deadlocks using database monitoring tools and terminate one transaction.
Answer:
Compare row counts and sample data before and after upgrade.
Validate stored procedures, triggers, and indexes.
Execute critical queries and reports to check results.
Test performance of queries and transactions.
Validate security and permissions post-upgrade.
Answer:
Check referential integrity constraints (foreign key).
Verify triggers and stored procedures affecting child table.
Identify failed transactions or partial commits.
Compare audit/log tables for changes.
Fix by enforcing cascade updates or error handling in triggers/procedures.
Answer:
A deadlock occurs when two or more transactions wait indefinitely for resources held by each other.
Detection methods:
Use database monitoring tools (e.g., Oracle Enterprise Manager, SQL Server Profiler).
Query system tables (V$LOCK, sys.dm_tran_locks) to identify waiting transactions.
Enable deadlock trace or logging.
Resolution:
Commit transactions quickly.
Apply proper locking and isolation strategies.
Use retry logic in applications.
Answer:
Orphan records: Child table rows with no corresponding parent.
Detection query example:
```sql
SELECT *
FROM ChildTable c
LEFT JOIN ParentTable p ON c.ParentID = p.ID
WHERE p.ID IS NULL;
```
Handling:
Fix data manually or through ETL.
Enforce foreign key constraints.
Implement cascading updates/deletes if appropriate.
Answer:
Compare source vs target data for row counts and values.
Validate transformation rules: aggregation, calculations, data type conversions.
Check data quality: nulls, duplicates, formatting errors.
Validate performance of ETL jobs under large volumes.
Use automation tools like QuerySurge, Informatica, Talend.
| Key Type | Primary Key | Unique Key | Foreign Key |
|---|---|---|---|
| Null Allowed | No | Yes (sometimes 1 null) | Depends on constraint |
| Uniqueness | Must be unique | Must be unique | Not necessarily unique |
| Purpose | Identify row | Prevent duplicates | Maintain referential integrity |
| Number per table | 1 | Multiple | Multiple |
Answer:
Check input and output parameters.
Execute with valid and invalid inputs.
Validate business rules implemented in procedure.
Check transaction handling (COMMIT/ROLLBACK).
Test performance for large data sets.
Answer:
Identify heavy queries or frequently accessed tables.
Test using concurrent transactions to simulate multiple users.
Monitor CPU, memory, and I/O usage.
Optimize queries with indexes or query rewriting.
Use tools: LoadRunner, JMeter, SQL Profiler.
Answer:
Index: Database structure to speed up query performance.
Validation:
Ensure indexes exist on frequently queried columns.
Use EXPLAIN PLAN to check query execution paths.
Check impact on INSERT/UPDATE/DELETE operations.
Monitor index fragmentation and rebuild if necessary.
Answer:
Compare row counts for all tables across Dev, QA, and Prod.
Validate critical data values by sampling.
Ensure schema consistency: columns, constraints, indexes, triggers.
Execute stored procedures, views, and reports to verify consistent behavior.
Answer:
Check SQL query generating the report.
Validate source database values.
Verify ETL transformation logic if data comes from multiple sources.
Check filters, joins, and aggregation in the report.
Compare report output with database output.
Identify root cause: query, ETL, or application logic.
Answer:
Data migration testing ensures data moved from source to target is accurate and complete.
Validation steps:
Compare row counts and key columns.
Check data types and formats.
Validate business rules and constraints.
Test ETL jobs, stored procedures, and triggers.
Validate performance and indexing in target.
Answer:
Identify mandatory columns that should not allow NULL.
Test queries, reports, and calculations with NULL values.
Validate default values applied for optional columns.
Check for application-level handling of NULL.
Answer:
Create a multi-step transaction.
Introduce an error in one operation.
Verify previous steps are rolled back (no partial commits).
Check database logs to confirm rollback.
Validate data integrity and constraints remain intact.
Answer:
Identify trigger type (BEFORE/AFTER, INSERT/UPDATE/DELETE).
Perform actions that activate the trigger.
Verify data modifications or audit logs.
Test negative scenarios (invalid inputs).
Monitor performance impact.
Answer:
ETL automation involves automating data validation during ETL loads.
Use SQL scripts, Python scripts, or ETL testing tools like QuerySurge.
Validate row counts, transformation rules, and key columns automatically.
Schedule regression ETL tests for each load.
Answer:
Verify user roles and permissions.
Test restricted access to sensitive tables and columns.
Validate encryption and data masking.
Check audit logs for unauthorized activity.
Test SQL injection and vulnerabilities.
Answer:
Verify foreign key constraints.
Check triggers or stored procedures that update child tables.
Identify partial or failed transactions.
Analyze audit logs.
Fix using cascading updates or proper error handling.
Answer:
Compare report data with database tables.
Validate aggregations, joins, and filters.
Test boundary cases, null values, and special characters.
Verify performance of report queries.
Automate regression using SQL scripts or testing tools.
Answer:
Compare schema (tables, columns, constraints, indexes).
Verify stored procedures, functions, and triggers.
Test critical queries and reports.
Validate performance and security.
Check ETL jobs and automated scripts still work correctly.
Answer:
Materialized view: Stores precomputed data physically.
Testing steps:
Validate data accuracy with source tables.
Test refresh schedules (complete/fast/force).
Validate performance improvement over normal views.
Check indexes on materialized views.
Answer:
Use sampling techniques for validation.
Perform batch processing for insert/update/delete.
Optimize queries with indexes and partitions.
Automate repetitive validation with scripts.
Test performance and scalability with large volumes.
| Feature | UNION | UNION ALL |
|---|---|---|
| Duplicate rows | Removed | Retained |
| Performance | Slower (duplicates removed) | Faster |
| Use Case | Unique combined results | Combine all records |
Answer:
Simulate multiple transactions updating or inserting data simultaneously.
Verify triggers fire correctly without data conflicts.
Check for deadlocks or locking issues.
Validate audit and logging triggers.
Answer:
Ensures foreign key values exist in parent table.
Testing steps:
Insert valid child records – should succeed.
Insert invalid child records – should fail.
Delete parent records – check cascade/restrict behavior.
Answer:
Maintain SQL scripts for validation of tables, views, and procedures.
Automate checks for row counts, transformations, and calculations.
Execute stored procedures and triggers after updates.
Validate ETL jobs and reports for consistent output.
Document expected vs actual results for each regression cycle.
Answer:
Check query execution plan to identify bottlenecks.
Verify indexes and partitions used in query.
Analyze table size and joins.
Check locking or deadlocks causing delays.
Optimize the query by adding indexes or hints, or rewrite the SQL.
Test performance after changes before deploying.