OneStopTesting

    12 Tips to Improve Your Usability Testing Technique


Usability testing is a technique used to evaluate a product, such as an application, website, or book, by observing people using it. The goal is to discover usability problems, collect quantitative data (e.g. time on task, error rates), and determine the participants' satisfaction with the product.

In this article I've gathered 12 tips to sharpen your usability testing technique, which is key to discovering more errors and areas of improvement in your product.
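Quantitative measures like time on task and error rate are easy to summarize once sessions are logged. A minimal sketch in Python, where the session records (participant IDs, completion flags, timings and error counts) are all made-up illustration data:

```python
# Summarize quantitative usability-test data.
# Each record: (participant, completed_task, seconds_on_task, error_count)
# All values below are hypothetical.
sessions = [
    ("P1", True, 74, 1),
    ("P2", True, 52, 0),
    ("P3", False, 120, 4),
    ("P4", True, 66, 2),
    ("P5", True, 48, 0),
]

completion_rate = sum(s[1] for s in sessions) / len(sessions)
mean_time = sum(s[2] for s in sessions) / len(sessions)
mean_errors = sum(s[3] for s in sessions) / len(sessions)

print(f"Task completion rate: {completion_rate:.0%}")   # 80%
print(f"Mean time on task: {mean_time:.0f}s")           # 72s
print(f"Mean errors per task: {mean_errors:.1f}")       # 1.4
```

Even this much lets you compare rounds of testing: if completion goes up and time on task goes down after a redesign, the change is probably working.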

    1. Setting clear criteria for participant recruitment

    Recruiting the right participants is key for effective user research, because your research results are only as good as the participants involved.

You should screen out participants who have conflicts of interest (e.g. they work for your client or a competitor), whose computer and Web experience is inappropriate (too little or too much, unless that fits the project), and those who are not very expressive.

    2. Number of participants

    A long time ago (2000) Jakob Nielsen wrote that only 5 participants are necessary for a valuable usability test, and that the insight gained diminishes rapidly after the fifth. Later research arrived at this number with the help of a formula, and it is not that different from Nielsen's figure.

    Usually 3 to 5 respondents per round are enough to encounter many of the most significant problems related to the tasks you're testing. It's pretty much a certainty that you won't uncover some of the serious problems in a given round of testing. That is why you'll be doing more than one round.

    Number of participants - as soon as you collect data from a single test user, your insights shoot up: you have already learned almost a third of all there is to know about the usability of the design.
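Nielsen's claim rests on the problem-discovery model he published with Tom Landauer: the share of usability problems found by n users is 1 - (1 - L)^n, where L is the proportion a single user uncovers (about 31% in their data). A quick sketch of what the curve looks like:

```python
# Nielsen & Landauer problem-discovery model:
# proportion of usability problems found by n test users,
# assuming each user uncovers about 31% of them (L = 0.31).
def problems_found(n, discovery_rate=0.31):
    return 1 - (1 - discovery_rate) ** n

for n in (1, 3, 5, 10):
    print(f"{n:>2} users: {problems_found(n):.0%} of problems found")
# 1 user  -> 31%
# 5 users -> ~84%
```

With five users the model predicts roughly 84-85% of problems found, which is why the insight curve flattens so quickly after the fifth participant.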

    3. Mention your objectives clearly to the user

    Put the candidates at ease and walk them through the software, tools and equipment. Explain the objectives of the test, how long it will take and how the gathered data will be used.

    Inform the participants that you are testing the product, not their skills. Respondents have a tendency to attribute failure in a task to their own incapability rather than to a flaw in the design. Tell them they can't do anything wrong; in fact, the more problems they run into while testing, the better. Stress this point more than once so test participants understand it clearly.

    4. Choosing tasks carefully

    Set tasks that are essential to the success of the new website or application, such as buying products, paying bills or contacting the client. If these 'top-tasks' are not clear to you, you could always ask the client which questions your research will need to answer.

    People also tend to perform more naturally if you provide them with scenarios rather than instructions. Instead of asking them to find the contact section of your application, you could phrase it as a scenario. For example: "You fell down the stairs and had to call an ambulance. You're wondering whether your medical insurance covers this and would like to contact the insurer. Find the telephone number."

    A scenario provides some context and supplies information the user needs to know but doesn't have (e.g. a username and password for a test account). It's important not to give away any clues in the scenario.

    5. Ask your respondents to think aloud during the test

    Think-aloud protocols, or TAP, involve participants thinking aloud as they are performing a set of specified tasks. Ask them to say whatever they are looking at, doing and feeling as they move through the user interface.

    This method has several advantages: you'll learn what your users really think about the design, which can turn into actionable redesign recommendations.
