Non-Functional Testing

Non-Functional testing?

Is it a myth, or just a contradictory phrase that means something other than GUI testing?

First, let's check its definition on Wikipedia:
Non-functional testing is the testing of a software application or system for its non-functional requirements: the way a system operates, rather than specific behaviors of that system.
This is in contrast to functional testing, which tests against functional requirements that describe the functions of a system and its components.
The names of many non-functional tests are often used interchangeably because of the overlap in scope between various non-functional requirements. For example, software performance is a broad term that includes many specific requirements like reliability and scalability.

Non-functional testing includes: Baseline testing, Compliance testing, Documentation testing, Endurance testing, Load testing, Localization testing and Internationalization testing, Performance testing, Recovery testing, Resilience testing, Security testing, Scalability testing, Stress testing, Usability testing, Volume testing.


Baseline Testing -

Baseline testing means validating the documents and specifications on which test cases will be designed; validating the requirement specification is baseline testing.
Generally, a baseline is defined as a line that forms the base for any construction, or for measurement, comparison or calculation. Baseline testing helps a great deal: a majority of the issues that are discovered are solved through it.
So it is actually testing the design; we verify that:

  • The requirements are accurate, clear and internally consistent
  • The requirements capture the end user's perspective and the expected results
  • The requirements are consistent across the whole project

Poor requirement specifications will lead to serious faults and defects in development and testing.

When converting the project specification document into development and testing requirements, we need to consider a few parameters:

  1. Be Specific - Eliminate uncertainty as much as possible; avoid words like probably, maybe and might.
  2. Measurable - Define success criteria; words like faster or better are not measurable definitions.
  3. Testable - Can we evaluate and verify the requirement? We must have the knowledge and understanding of how to verify it. A requirement that cannot be verified or confirmed is not testable.
  4. Consistent - No requirement should contradict another; if a contradiction exists, we must resolve it before the development process starts.
  5. Exclusive - The spec should also define what will not be done.
  6. Why - Requirement documentation should focus only on the 'WHY' and set answers accordingly. A requirement should NOT describe the 'HOW'.
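Parts of this checklist can even be automated. Here is a minimal sketch of a requirement "linter" that flags words violating the Specific and Measurable rules above; the word lists and the sample requirement are purely illustrative, and a real project would tune them to its own vocabulary:

```python
import re

# Illustrative word lists; a real project would tune these.
VAGUE_WORDS = {"probably", "maybe", "might", "etc"}         # violate "Be Specific"
UNMEASURABLE_WORDS = {"faster", "better", "user-friendly"}  # violate "Measurable"

def lint_requirement(text):
    """Return a list of (rule, word) findings for one requirement sentence."""
    words = {w.lower().strip(".,") for w in re.split(r"\s+", text)}
    findings = []
    for w in sorted(words & VAGUE_WORDS):
        findings.append(("Specific", w))
    for w in sorted(words & UNMEASURABLE_WORDS):
        findings.append(("Measurable", w))
    return findings

# Hypothetical requirement sentence used only to demonstrate the checks.
req = "The search page should probably load faster than today."
for rule, word in lint_requirement(req):
    print(f"{rule} rule violated by word: {word!r}")
```

Such a script will never replace a human review of the spec, but it can catch the most common vague phrasings before the document reaches the development team.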


Compliance Testing

Compliance testing relates to the IT standards followed by the company; it is the testing done to find deviations from the company's prescribed standards.
It determines whether we are implementing and meeting the defined standards.
While doing this testing we should also ask whether there are any drawbacks in how the standards are implemented in our project, and analyse how the standards can be improved.
It is basically an audit of a system carried out against a known criterion.
Also known as type testing, it determines whether a product, system or medium complies with the requirements of a specification, contract or regulation.


Performance Testing - 

Performance testing targets the application's performance under multi-user load. We test performance under specific conditions and traffic. In some cases we run the tests against live systems; in others we build simulators to generate the traffic and the other conditions.
Also common is the use of capture-and-playback tools (automated testing). A capture tool is used to record the actions of a typical user performing a typical task on the system. A playback tool is then used to reproduce the action of that user multiple times simultaneously. The multi-user playback provides an accurate simulation of the stress the real-world system will be placed under. 
Capture-and-playback tools must be used with caution, however. Simply repeating the exact same series of actions on the system may not constitute a proper test. Significant amounts of randomization and variation should be introduced to correctly simulate real-world use.
You also need to understand the technical architecture of the system. If you don’t stress the weak points, the bottlenecks in performance, then your tests will prove nothing. You need to design targeted tests which find the performance issues.
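The playback-with-randomization idea can be sketched in a few lines. The snippet below simulates five concurrent users replaying a recorded script with randomized think times; the action names, user count and timings are all hypothetical placeholders, and the fake server latency would be replaced by real requests against the system under test:

```python
# A minimal multi-user playback sketch. All names and timings are
# illustrative; a real test would issue actual requests to the system.
import random
import threading
import time

def simulated_user(user_id, actions, results):
    """Replay a recorded action sequence with randomized think times."""
    for action in actions:
        # Randomized "think time" so users do not move in lockstep.
        time.sleep(random.uniform(0.01, 0.05))
        start = time.perf_counter()
        # Placeholder for the real request, e.g. urllib.request.urlopen(...)
        time.sleep(random.uniform(0.005, 0.02))  # fake server latency
        results.append((user_id, action, time.perf_counter() - start))

recorded_script = ["open_home", "search", "open_result"]  # captured actions
results = []
threads = [threading.Thread(target=simulated_user,
                            args=(i, recorded_script, results))
           for i in range(5)]  # 5 concurrent simulated users
for t in threads:
    t.start()
for t in threads:
    t.join()

latencies = [latency for (_, _, latency) in results]
print(f"{len(results)} actions completed, max latency {max(latencies):.3f}s")
```

Dedicated load-testing tools do essentially this at much larger scale, but the core loop is the same: replay, randomize, measure.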


Usability Testing

Usability testing is the process of observing users’ reactions to a product and adjusting the design to suit their needs. Marketing knows usability testing as 'focus groups' and while the two differ in intent many of the principles and processes are the same.
In usability testing a basic model or prototype of the product is put in front of evaluators who are representative of typical end-users. They are then set a number of standard tasks which they must complete using the product. Any difficulties or obstructions they encounter are noted by a host or observers, and design changes are made to the product to correct them. The process is then repeated with the new design to evaluate those changes.

There are some fairly important tenets of usability testing that must be understood:

  • Users are not testers, engineers or designers – you are not asking the users to make design decisions about the software. Users will not have a sufficiently broad technical knowledge to make decisions which are right for everyone. However, by seeking their opinion the development team can select the best of several solutions.
  • You are testing the product and not the users – all too often developers believe that it's a 'user' problem when there is trouble with an interface or design element. Users should be able to 'learn' how to use the software if they are taught properly! But if the software is designed properly, maybe they won't have to learn it at all?
  • Selection of end-user evaluators is critical – you must select evaluators who are directly representative of your end-users. Don't pick just anyone off the street, don't use management and don't use technical people unless they are your target audience.
  • Usability testing is a design tool – Usability testing should be conducted early in the lifecycle when it is easy to implement changes that are suggested by the testing. Leaving it till later will mean changes will be difficult to implement.
  • Small team – one misconception about usability studies is that a large number of evaluators is required. Research has shown that no more than four or five evaluators might be required. Beyond that number the amount of new information discovered diminishes rapidly, and each extra evaluator offers little or nothing new. And five is often convincing enough: if all five evaluators have the same problem with the software, is it likely the problem lies with them or with the software? With one or two evaluators it could be put down to personal quirks; with five it is beyond a shadow of a doubt.
  • The proper way to select evaluators is to profile a typical end-user and then solicit the services of individuals who closely fit that profile. A profile should consist of factors such as age, experience, gender, education, prior training and technical expertise.
  • Separating the evaluators from the observers is a good idea too, since no one performs well with a crowd looking over their shoulder. This can be done with a one-way mirror or by putting the users in another room and observing them over a video link. You should also consider their legal rights and make sure you have their permission to use any materials gathered during the study in presentations or reports. Finally, confidentiality is usually important in these situations, and it is common to ask individuals to sign a Non-Disclosure Agreement (NDA).


If you liked this Article, please comment, and we will update it with more information.

Thank You for reading






