A Few Brain-Teasing Questions on Verification and Validation (V&V)
Q. 1: What sorts of errors are covered by regression testing?
Regression testing mainly covers four types of errors:
1) Data Corruption Errors: These errors arise from the sharing of data and result in side effects.
2) Inappropriate Control Sequencing Errors: These errors result from changes to the execution sequence. For example, an attempt to remove an item from a queue before it has been placed into the queue.
3) Resource Contention: Potential bottlenecks and deadlocks are some examples of these types of errors.
4) Performance Deficiencies: Timing errors and storage utilization errors are some examples of these types of errors.
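The control-sequencing case above can be made concrete with a small sketch. This is an illustrative example (class and test names are invented): a regression test that catches a dequeue-before-enqueue call sequence.

```python
# Sketch: a regression test that catches an "inappropriate control
# sequencing" error, i.e. taking from a queue before anything was put in.
from collections import deque

class JobQueue:
    def __init__(self):
        self._items = deque()

    def put(self, item):
        self._items.append(item)

    def take(self):
        # Guard against a misordered call sequence; without this check a
        # changed execution order would surface as a raw IndexError.
        if not self._items:
            raise RuntimeError("take() called before any put()")
        return self._items.popleft()

def test_take_before_put_is_rejected():
    q = JobQueue()
    try:
        q.take()
        assert False, "sequencing error not detected"
    except RuntimeError:
        pass  # expected: the misordered call is caught

def test_put_then_take():
    q = JobQueue()
    q.put("job-1")
    assert q.take() == "job-1"

test_take_before_put_is_rejected()
test_put_then_take()
```

Rerunning such tests after every change is what lets regression testing flag sequencing side effects early.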
<<<<<< =================== >>>>>>
Q. 2: What is Criticality Analysis?
It is a method to locate and reduce high-risk problems. It is performed at the beginning of the project. It identifies the functions and modules that are required to implement critical program functions or quality requirements such as safety and security.
The following steps are used in criticality analysis:
Step – 1: Develop a block diagram or control flow diagram of the system and its software elements. Each block or control flow box represents a system or software function (module).
Step – 2: Trace each critical function or quality requirements through the block or control flow diagram.
Step – 3: Classify as critical all traced software functions (modules) on which either the proper execution of critical software functions or the quality requirements depend.
Step – 4: Focus additional analysis on these traced critical software functions (modules).
Step – 5: Repeat criticality analysis for each life-cycle process / activity to determine whether the implementation details shift the emphasis of the criticality.
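Steps 1 through 3 can be sketched in code. This is a hypothetical illustration (the graph and function names are invented): given a call/control-flow graph and the functions that directly implement a critical requirement, mark everything they depend on as critical.

```python
# Hypothetical sketch of Steps 1-3: trace a critical requirement through
# a call graph and collect every module it depends on.
def trace_critical(graph, critical_roots):
    """graph maps each function to the functions it calls."""
    critical = set()
    stack = list(critical_roots)
    while stack:
        fn = stack.pop()
        if fn in critical:
            continue
        critical.add(fn)
        stack.extend(graph.get(fn, []))
    return critical

# Example system: 'authenticate' implements a security requirement.
graph = {
    "main": ["authenticate", "report"],
    "authenticate": ["hash_password", "audit_log"],
    "report": ["format_output"],
}
print(sorted(trace_critical(graph, ["authenticate"])))
```

The returned set is the list of modules that Step 4 would single out for additional analysis.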
<<<<<< =================== >>>>>>
Q. 3: What is Traceability Analysis?
Traceability analysis traces each software requirement back to the system requirements. This is done to ensure that each requirement correctly satisfies the system requirements. This analysis will also determine whether any derived software requirements are consistent with the original objectives, physical laws and technologies described in the system document.
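A minimal form of this check can be automated. The sketch below is illustrative (the requirement IDs are invented): every software requirement must trace back to at least one system requirement, and any that do not are flagged.

```python
# Sketch of a traceability check: flag software requirements with no
# parent system requirement in the trace matrix.
def untraced(software_reqs, trace_matrix):
    """trace_matrix maps software-req IDs to lists of system-req IDs."""
    return [r for r in software_reqs if not trace_matrix.get(r)]

software_reqs = ["SRS-1", "SRS-2", "SRS-3"]
trace_matrix = {"SRS-1": ["SYS-10"], "SRS-2": ["SYS-11", "SYS-12"]}
print(untraced(software_reqs, trace_matrix))  # SRS-3 has no parent
```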
<<<<<< =================== >>>>>>
Q. 4: What is Interface Analysis?
It is a detailed examination of the interface requirements specifications. The evaluation focuses on the interfaces between the software and hardware, users, and external software.
Criticality analysis is continued and updated for the software. A criticality level is assigned to each software requirement. When requirements are combined into functions, the combined criticality of the requirements forms the criticality of the aggregate function.
The criticality analysis is updated periodically as requirement changes are introduced, since such changes can cause a function's criticality to increase or decrease depending on how the revised requirements impact system criticality.
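One common way to combine requirement criticalities into a function-level criticality is to take the highest level among the contributing requirements. The sketch below assumes that convention and an invented three-level scale:

```python
# Assumed convention: an aggregate function inherits the highest
# criticality of any requirement combined into it.
CRITICALITY = {"low": 1, "medium": 2, "high": 3}

def aggregate_criticality(requirement_levels):
    return max(requirement_levels, key=lambda lvl: CRITICALITY[lvl])

print(aggregate_criticality(["low", "high", "medium"]))  # -> "high"
```

Under this rule, adding or revising a single high-criticality requirement is enough to raise the criticality of the whole function, which is why the analysis must be refreshed whenever requirements change.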
<<<<<< =================== >>>>>>
Q. 5: Describe the tool support for review processes.
As tools become available to perform some of the tasks previously done by humans, the cost effectiveness of review processes increases.
For example, a compiler can detect syntax errors in code, relieving the reviewers of this task.
Another example is design and specification consistency checkers.
<<<<<< =================== >>>>>>
Q. 6: Describe some techniques to find the types of errors.
Some of the techniques are described below:
1) Algorithm Analysis: It examines the logic and accuracy of the software requirements by translating algorithms into some language or structured format. The analysis involves re-deriving equations or evaluating the suitability of specific numerical techniques. Algorithm analysis examines the correctness of the equations and numerical techniques, truncation and rounding effects, the numerical precision of word storage and variables, and data typing influences.
2) Analytic Modeling: It provides performance evaluation and capacity planning information on the software design. It represents the program logic and processing in a model and analyzes the model for efficiency.
3) Control Flow Analysis: It is used to show the hierarchy of main routines and their sub-functions. It checks that the proposed control flow is free of problems like unreachable or incorrect code.
4) Database Analysis: It ensures that the database structure and access methods are compatible with the logical design. It is performed on programs with significant data storage to ensure that common data and variable regions are used consistently among all calling routines and that the integrity of the data is maintained. It also verifies that an overflowing data table cannot accidentally overwrite any variable or portion of the data, and that data typing and its use are consistent across the entire program.
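One of the control-flow checks named above, detecting unreachable code, can be sketched with Python's standard `ast` module. This is a simplified illustration, not a full analyzer: it only flags statements that directly follow a `return` or `raise` in the same block.

```python
# Sketch of a control-flow check: flag statements that appear after a
# return/raise in the same block and can therefore never execute.
import ast

def unreachable_lines(source):
    flagged = []
    for node in ast.walk(ast.parse(source)):
        body = getattr(node, "body", None)
        if not isinstance(body, list):
            continue
        for i, stmt in enumerate(body[:-1]):
            if isinstance(stmt, (ast.Return, ast.Raise)):
                flagged.append(body[i + 1].lineno)
    return flagged

code = """
def f(x):
    return x * 2
    print("never runs")
"""
print(unreachable_lines(code))  # line number of the dead statement
```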
<<<<<< =================== >>>>>>
Q. 7: What is Desk Checking?
It involves the examination of the software design or code by an individual. It includes
1) Looking over the code for defects.
2) Checking for correct procedure interfaces.
3) Reading the comments and comparing them to the external specifications and software design.
<<<<<< =================== >>>>>>
Q. 8: What are Petri-nets?
Petri-nets model the system to assess the adequacy of the software design against catastrophic failure. The system is modeled using conditions and events represented by state transition diagrams (STDs). The model can be executed to see how the software design will actually behave under certain conditions.
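The execution idea can be sketched with a toy Petri net: places hold tokens, and a transition fires only when all of its input places are marked. The place and transition names below are illustrative.

```python
# Minimal sketch of Petri-net execution: a transition fires only when
# every input place holds at least one token.
def fire(marking, transition):
    inputs, outputs = transition
    if all(marking.get(p, 0) >= 1 for p in inputs):
        for p in inputs:
            marking[p] -= 1
        for p in outputs:
            marking[p] = marking.get(p, 0) + 1
        return True
    return False  # transition not enabled under this marking

# Model: 'process' can fire only while 'start' holds a token.
marking = {"start": 1, "done": 0}
process = (["start"], ["done"])
print(fire(marking, process), marking)
```

Executing the net from different initial markings shows which event sequences the design permits, and which states (e.g. a deadlock, where no transition is enabled) it can reach.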
<<<<<< =================== >>>>>>
Q. 9: What is program slicing?
Slicing is a program decomposition technique used to trace an output variable back through the code to identify all code statements relevant to a computation in the program.
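A worked illustration (done by hand here, not by a slicing tool): the backward slice on the output variable `total` keeps only the statements that can affect its value at the final `print`.

```python
# Backward slice on `total` at the print statement: only the lines
# marked "in slice" influence the computation of `total`.
n = 5          # in slice: feeds the loop bound
total = 0      # in slice: initializes the output variable
label = "sum"  # NOT in slice: never influences `total`
for i in range(n):       # in slice
    total = total + i    # in slice
print(total)   # slicing criterion: the value of `total` here
```

Removing the non-slice statement leaves a smaller program that computes the same output, which is what makes slicing useful for tracing defects back through the code.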
<<<<<< =================== >>>>>>
Q. 10: What is Test Certification?
It ensures that the reported test results are the actual findings of the tests. Test-related tools, media, and documentation are certified to ensure the maintainability and repeatability of tests. In critical software systems, certification is also used to verify that the required tests have been executed and that the delivered software product is identical to the product that was subjected to software V&V.
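One step of that last check can be sketched with standard cryptographic digests: the delivered artifact is certified as byte-for-byte identical to the artifact that underwent V&V. The artifact contents below are placeholders.

```python
# Sketch of an identity check for certification: compare SHA-256 digests
# of the V&V'd artifact and the delivered artifact.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

vv_artifact = b"release build subjected to V&V"   # placeholder contents
delivered = b"release build subjected to V&V"     # placeholder contents
assert digest(delivered) == digest(vv_artifact), "delivered product differs"
print("certified: delivered product matches the V&V'd product")
```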