Friday, February 25, 2011

What are the points that should be taken care of when validating a text box?




Validation Criteria for Text/String Fields
While checking a text field, the following points should be taken into consideration:

1. Aesthetic (Visual) Conditions
2. Validation Conditions
3. Navigation Conditions
4. Usability Conditions
5. Data Integrity Conditions
6. Modes (Editable/Read-only) Conditions
7. General Conditions
8. Specific Field Tests
8.1. Date Field Checks
8.2. Numeric Fields
8.3. Alpha Field Checks

1. Aesthetic Conditions

• Check that the text field has a caption.
• The label is not editable.
• Check the spelling of the label.
• Move the mouse cursor over all enterable text boxes; the cursor should change from an arrow to an insert bar (I-beam).
• If it does not, the text in the box should be greyed out or non-updateable.
• Are the field prompts the correct color?
• Are the field backgrounds the correct color?
• In read-only mode, are the field prompts the correct color?
• In read-only mode, are the field backgrounds the correct color?
• Are all the field prompts aligned perfectly on the screen?
• Are all the field edits boxes aligned perfectly on the screen?
• Are all the field prompts spelt correctly?
• Are all character or alpha-numeric fields left justified? This is the default unless otherwise  specified.
• Are all numeric fields right justified? This is the default unless otherwise specified.
• Is all the micro help text spelt correctly on this screen?
• Is all the error message text spelt correctly on this screen?
• Is all user input captured consistently in UPPER case or lower case?
• Assure that the password entered is masked (shown as asterisks or dots) and stored in encrypted format, never visible as plain text.

2. Validation Conditions

• Does a failure of validation on every field cause a sensible user error message?
• Is the user required to fix entries which have failed validation tests?
• Have any fields got multiple validation rules, and if so, are all rules being applied?
• If the user enters an invalid value and clicks on the OK button (i.e. does not TAB off the field), is the invalid entry identified and highlighted correctly, with an error message?
• Is validation consistently applied at the screen level unless specifically required at the field level?
• For all numeric fields, check whether negative numbers can, and should, be entered.
• For all numeric fields, check the minimum and maximum allowable values, and also some mid-range values.
• For all character/alphanumeric fields, check that a character limit is specified and that it exactly matches the specified database column size.
• Do all mandatory fields require user input?
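The mandatory-field, type, and min/max rules above can be sketched in code. The example below is a minimal, hypothetical field validator in Python; the "age" field, its 0–120 limits, and the error messages are all illustrative assumptions, not from any real application:

```python
# Hypothetical sketch of field-level validation; field name, limits,
# and messages are illustrative only.

def validate_age(value):
    """Validate an 'age' text field: mandatory, whole number, 0-120."""
    if value is None or str(value).strip() == "":
        return "Age is required."              # mandatory-field rule
    try:
        number = int(value)
    except ValueError:
        return "Age must be a whole number."   # type rule
    if not 0 <= number <= 120:
        return "Age must be between 0 and 120."  # min/max boundary rule
    return None                                # None means the value passed
```

Probing it with the minimum, maximum, a mid-range value, and out-of-range values exercises exactly the boundary checks the list calls for.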

3. Navigation Conditions

• Does the Tab order specified on the screen go in sequence from top left to bottom right? This is the default unless otherwise specified.

4. Usability Conditions

• Is all date entry required in the correct format?
• Are all read-only fields skipped in the TAB sequence?
• Are all disabled fields skipped in the TAB sequence?
• Can the cursor be placed in the micro-help text box by clicking on the text box with the mouse?
• Can the cursor be placed in read-only fields by clicking in the field with the mouse?
• Is the cursor positioned in the first input field or control when the screen is opened?
• SHIFT + arrow keys should select characters; selection should also be possible with the mouse. A double click should select all the text in the box.

5. Data Integrity Conditions

• Check the maximum field lengths to ensure that there are no truncated characters.
• Where the database requires a value (other than null), this should be defaulted into the field. The user must either enter an alternative valid value or leave the default value intact.
• Check maximum and minimum field values for numeric fields.
• If numeric fields accept negative values, can these be stored correctly in the database, and does it make sense for the field to accept negative numbers?
• If a particular set of data is saved to the database, check that each value gets saved fully, i.e. beware of truncation (of strings) and rounding of numeric values.
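The truncation check above can be automated as a round-trip test: save a value, read it back, and compare. The sketch below simulates a fixed-width text column with a plain dict; the column width of 10 and the `save` helper are illustrative stand-ins for a real database, not any actual API:

```python
# Illustrative round-trip check for string truncation; the column
# width and the dict-based "database" are hypothetical stand-ins.

COLUMN_WIDTH = 10

def save(db, key, value):
    """Simulate a VARCHAR(10)-style column that silently truncates."""
    db[key] = str(value)[:COLUMN_WIDTH]

def round_trip_ok(value):
    """Save a value and verify it comes back unchanged."""
    db = {}
    save(db, "field", value)
    return db["field"] == str(value)
```

A string at exactly the limit should survive the round trip, while one character over should fail it, exposing the silent truncation.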

6. Modes (Editable/Read-only) Conditions

• Are the screen and field colors adjusted correctly for read-only mode?
• Are all fields and controls disabled in read-only mode?
• Check that no validation is performed in read-only mode.

7. General Conditions

• Assure that the Tab key sequence which traverses the screens does so in a logical way.
• Errors on continue should return the user to the tab containing the error, with focus on the field causing the error (i.e. the tab is opened, highlighting the field with the error on it).
• All fonts should be the same.

8. Specific Field Tests

8.1. Date Field Checks

• Assure that leap years are validated correctly & do not cause errors/miscalculations
• Assure that month values 00 and 13 are validated correctly & do not cause errors/miscalculations
• Assure that month values 00 and 13 are reported as errors
• Assure that day values 00 and 32 are validated correctly & do not cause errors/miscalculations
• Assure that Feb. 28 and 29 are validated correctly (29 only in leap years) & do not cause errors/miscalculations
• Assure that Feb. 30 is reported as an error
• Assure that century change is validated correctly & does not cause errors/ miscalculations
• Assure that out of cycle dates are validated correctly & do not cause errors/miscalculations 
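Most of the date checks above reduce to a single rule: try to construct a real date and see whether it is accepted. A minimal sketch, relying on the Python standard library's calendar rules (month 00/13, day 00/32, and Feb. 30 all raise `ValueError`, and leap years are handled automatically):

```python
from datetime import date

def is_valid_date(year, month, day):
    """Return True if (year, month, day) is a real calendar date.

    datetime.date enforces month 1-12, day 1-(days in month),
    and the leap-year rules, so invalid combinations raise ValueError.
    """
    try:
        date(year, month, day)
        return True
    except ValueError:
        return False
```

For example, Feb. 29, 2000 is valid (a leap year), while Feb. 29, 1900 is not (century years must be divisible by 400).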

8.2. Numeric Fields

• Assure that lowest and highest values are handled correctly
• Assure that invalid values are logged and reported
• Assure that valid values are handled by the correct procedure
• Assure that numeric fields with a blank in position 1 are processed or reported as an error
• Assure that fields with a blank in the last position are processed or reported as an error
• Assure that both + and - values are correctly processed
• Assure that division by zero does not occur
• Include value zero in all calculations
• Include at least one in-range value
• Include maximum and minimum range values
• Include out of range values above the maximum and below the minimum
• Assure that upper and lower values in ranges are handled correctly 
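The boundary-value items above can be written down as a probe table: exact limits, values just outside them, zero, mid-range, and both signs. A sketch, assuming an illustrative numeric field with a made-up range of -100 to 100:

```python
# Hypothetical boundary-value probe set; the -100..100 range is
# illustrative, not from any real specification.

MIN, MAX = -100, 100

def in_range(n):
    """The range rule under test."""
    return MIN <= n <= MAX

probes = [
    (MIN, True), (MAX, True),            # exact lower and upper boundaries
    (MIN - 1, False), (MAX + 1, False),  # just outside the range
    (0, True),                           # include zero
    (50, True), (-50, True),             # mid-range, both + and - values
]
```

Running every probe against the rule covers the minimum, maximum, out-of-range, zero, and signed cases in one pass.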

8.3. Alpha Field Checks

• Use blank and non-blank data
• Include lowest and highest values
• Include invalid characters & symbols
• Include valid characters
• Include data items with first position blank
• Include data items with last position blank 
• Include HTML tags, to check that markup is escaped rather than rendered.
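The alpha-field checks above, including the HTML-tag case, can be sketched as follows. The letters-and-spaces rule and the helper names are illustrative assumptions, not a universal standard:

```python
import html
import re

# Hypothetical rule: letters and internal spaces only.
ALPHA_RE = re.compile(r"[A-Za-z ]+")

def check_alpha(value):
    """Reject blank input, leading/trailing blanks, digits, and symbols."""
    if value != value.strip():
        return False                    # blank in first or last position
    return ALPHA_RE.fullmatch(value) is not None

def safe_display(value):
    """Escape HTML tags before echoing user input back to a page."""
    return html.escape(value)
```

Escaping on output means a value like `<b>` is shown literally instead of being interpreted as markup, which is the point of the HTML-tag probe.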

Some definitions


Black box testing – Internal system design is not considered in this type of testing. Tests are based on requirements and functionality.
White box testing – This testing is based on knowledge of the internal logic of an application’s code. Also known as Glass box Testing. Internal software and code working should be known for this type of testing. Tests are based on coverage of code statements, branches, paths, conditions.
Unit testing – Testing of individual software components or modules. Typically done by the programmer and not by testers, as it requires detailed knowledge of the internal program design and code. May require developing test driver modules or test harnesses.
Incremental integration testing – A bottom-up approach to testing, i.e. continuous testing of an application as new functionality is added. Application functionality and modules should be independent enough to be tested separately. Done by programmers or by testers.
Integration testing – Testing of integrated modules to verify combined functionality after integration. Modules are typically code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.
Functional testing – This type of testing ignores the internal parts and focuses on whether the output is as per the requirements. Black-box-type testing geared to the functional requirements of an application.
System testing – Entire system is tested as per the requirements. Black-box type testing that is based on overall requirements specifications, covers all combined parts of a system.
End-to-end testing – Similar to system testing, involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate.
Sanity testing – Testing to determine if a new software version is performing well enough to accept it for a major testing effort. If the application crashes on initial use, the system is not stable enough for further testing, and the build or application is sent back to be fixed.
Regression testing – Testing the application as a whole after a modification in any module or functionality. It is difficult to cover the whole system in regression testing, so automation tools are typically used for this type of testing.
Acceptance testing – Normally this type of testing is done to verify that the system meets the customer-specified requirements. The user or customer does this testing to determine whether to accept the application.
Load testing – Performance testing to check system behavior under load. Testing an application under heavy loads, such as testing a web site under a range of loads to determine at what point the system's response time degrades or fails.
Stress testing – The system is stressed beyond its specifications to check how and when it fails. Performed under heavy load, such as putting numbers beyond storage capacity, running complex database queries, or giving continuous input to the system or database.
Performance testing – A term often used interchangeably with 'stress' and 'load' testing. Checks whether the system meets performance requirements. Different performance and load tools are used for this.
Usability testing – A user-friendliness check. The application flow is tested: can a new user understand the application easily, and is proper help documented wherever a user might get stuck? Basically, system navigation is checked in this testing.
Install/uninstall testing – Testing of full, partial, or upgrade install/uninstall processes on different operating systems and under different hardware and software environments.
Recovery testing – Testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.
Security testing – Can the system be penetrated by any hacking method? Testing how well the system protects against unauthorized internal or external access, and whether the system and database are safe from external attacks.
Compatibility testing – Testing how well software performs in a particular hardware/software/operating system/network environment, and in different combinations of the above.
Comparison testing – Comparison of product strengths and weaknesses with previous versions or other similar products.
Alpha testing – An in-house virtual user environment can be created for this type of testing. Testing is done at the end of development; minor design changes may still be made as a result of such testing.
Beta testing – Testing typically done by end users or others. The final testing before releasing the application for commercial purposes.