
Hewlett Packard: delivering quality products


Hewlett Packard was developing an innovative type of printer. As part of the process the product engineers would execute quality assurance (QA) testing on a daily basis. The QA testing was carried out by a team that would print reams of paper to assess any errors, which would indicate design flaws.


The QA testing team produced data that were delivered only once a day and were unreliable. The engineers had to spend a couple of hours reviewing the data in order to get it into a workable format. The root cause was an inefficient QA process - repetitive data entry into multiple legacy systems, manual processes and high employee turnover.


Project Edison: An overhaul of the entire testing process with the help of bar-code technology and an enhanced interface design.


The results were multiple: QA testers could perform tests and enter results in less time; a single system eliminated the need to reconcile information; and engineers received test results in real time. In addition, engineers no longer had to spend time reviewing the data, and testing activities could be traced back to the tester in case there were questions.

Data were entered into the desktop app and then sent in real time to the product engineers.

How it happened

I worked in an Agile team with the following members:

  • Stakeholders
  • Developers
  • UX designer (me)
  • Visual designer  

Review of current state

I first reviewed the process end-to-end, interviewing users to understand their pain points and observing how they performed their tasks. There were two sets of users:

  1. Quality assurance (QA) testers, who executed test plans and reported results in the form of data
  2. Product engineers (Product), who received the data and relied on them as the basis for defect analysis

The main focus was on the QA testers. I met with a sample of the team members to get a sense of their day-to-day activities - observing their work environment and asking questions.

The QA testing team was based in Vancouver, WA and worked in shifts around the clock in order to accommodate overseas engineers. Each test consisted of printing a ream of paper from the latest prototype and detecting errors, e.g. smudges or uneven color tone. Each error had to be reviewed by a tester, who then recorded it, categorizing by factors such as error type, location on page, and frequency. Testers often felt overwhelmed, and the average tenure was six months.

Key pain points included:

  • Test procedures involved manual entry of administrative info such as test sample number, prototype name, test number, etc.
  • Work backlog resulting from high turnover and training new employees
  • Repetitive data entry into two legacy systems plus Excel template
  • Back-and-forth follow-up from engineers

User journey

I mapped out the journey for the tester, noting pain points.

Determining the future state

Next, I ran a series of discovery workshops with the stakeholders, end users, and developers to review pain points and prioritize features based on technological limitations, corporate reporting cycles, and other key factors, and then mapped out a future-state process.

The plan was to submit test scripts electronically. Each page printed would include a bar code with a unique identifier. Pages with errors would be scanned, and QA would enter errors by checking off categories on a new interface.

  • Manual steps would be reduced.
  • Product would receive error data in real time.
  • Each error page could be tracked, along with the QA tester who reviewed it.
  • Product would no longer need to "scrub" data, saving 2 hours per day.

Site map

After mapping out the flow and getting sign-off, I determined where the interactions would live, taking into consideration factors such as employee access levels and a taxonomy for categorizing error reporting.

Once I determined the site map I started developing a set of wireframes using Axure, and iterated based on feedback from users and developers.

For the new process, each page would be assigned a bar code when printed. The QA tester would print out a ream of paper and examine each page. Pages with errors would be scanned and automatically appear in the system. The tester would further evaluate the page and select the error type on the interface, plus add notes.
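The barcode workflow above can be sketched as a simple data model. This is a minimal illustration, not the actual HP system: the field names, barcode format, and tester ID are all hypothetical, chosen to show how a scanned page becomes a traceable error record without manual entry of administrative info.

```python
# Sketch of a barcode-based error record. All field names and identifiers
# are illustrative assumptions, not taken from the original system.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ErrorRecord:
    page_barcode: str                    # unique identifier printed on each page
    tester_id: str                       # traces the record back to the QA tester
    error_types: list                    # e.g. ["smudge", "uneven_color_tone"]
    location_on_page: str                # e.g. "top-left"
    notes: str = ""                      # free-text notes added by the tester
    scanned_at: datetime = field(default_factory=datetime.now)

# Scanning a page with errors creates a record immediately; the barcode
# encodes the administrative info (prototype, test number) that testers
# previously had to type into multiple legacy systems by hand.
record = ErrorRecord(
    page_barcode="HP-PROTO-0042-P117",   # hypothetical barcode format
    tester_id="qa-007",
    error_types=["smudge"],
    location_on_page="bottom-right",
)
```

Because each record carries both the page's barcode and the tester's ID, test results can be sent to engineers in real time and traced back to the tester if questions arise.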

I performed iterative testing to identify opportunities in the design:

  • Preference for a black screen, as it differentiated from other monitors they had open
  • "Accordion" feature to quickly view range of error categories, as some pages had more than one error type.
  • For selecting errors, group by error category rather than alphabetical to be in sync with testing instructions.
  • "Clear" and "Save" features, as well as a prompt to save if there is no user action after an extended time.
  • Ability to create templates
  • Icons for easy identification.

Product engineers

Once the real-time data reporting feature came to fruition, I sat with the product engineers to understand how they would use the data. I discovered the following:

Only about half of the error categories were needed, and data could only be sorted by one field at a time (date, numeric order of a given data field, etc.). These issues were addressed in subsequent iterations of the reporting design.
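The sorting limitation is easy to illustrate. The snippet below is a generic sketch, with made-up field names rather than the real report schema: a legacy single-field sort versus the multi-field sort the engineers needed.

```python
# Illustrative report rows; field names and values are assumptions,
# not from the actual HP reporting system.
reports = [
    {"date": "2015-03-02", "error_type": "smudge", "count": 4},
    {"date": "2015-03-01", "error_type": "smudge", "count": 9},
    {"date": "2015-03-01", "error_type": "banding", "count": 2},
]

# Single-field sort (the legacy limitation): by date only.
by_date = sorted(reports, key=lambda r: r["date"])

# Multi-field sort: by date, then by error type within each date,
# using a tuple key so both fields are applied in one pass.
by_date_then_type = sorted(reports, key=lambda r: (r["date"], r["error_type"]))
```

With a tuple key, any combination of fields can act as a compound sort order, so engineers are no longer limited to one feature at a time.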


In the end the team and I were able to deliver a product that was usable and time-saving for both sets of end-users. The QA testers only needed one system for a process that took a fraction of the time, and the engineers were able to receive reliable data in real-time.

A series of next steps were identified, including:

  • Enhancing the design to provide more sophisticated analysis for product engineers, e.g. regression analysis
  • Further automating the QA testing process. Although human oversight was still needed during testing, management was exploring the idea of executing tests automatically rather than having a tester initiate them.
  • Tracking downtime, part changes, and other metrics in order to gauge operating efficiency
