PTC Servigistics.
Usability Testing for PTC Servigistics Inventory Management Page.
Overview.
An analysis of a new feature currently being implemented by the Servigistics team: the Inventory Collaboration page. This project included collaborating with the Servigistics team, conducting a heuristic evaluation, and running a usability evaluation.
Duration: 4 months
Team: 5 members
Skills: Heuristic Evaluation, Usability Testing, Quantitative Analysis, Qualitative Research
Tools: Zoom, Microsoft Excel
The Product
PTC Servigistics is a leading service parts optimization software that helps companies optimize their supply chains.
We analyzed the inventory collaboration page of the Servigistics service parts management software.
This product is intended to be used by parts managers responsible for inventory in heavy-equipment dealerships, automotive dealerships, computer companies, and the military.
Client Meeting
The client project kickoff meeting is an opportunity to hear from the client, their project team, and other key stakeholders. It helped us understand the project background and the business drivers that led to the project being initiated, review the project brief, and discuss how to make the project a success. Our kickoff covered:
Introductions: Our team, Client team, contact information
Client project
Client goals
Project scope
Participant profile
Heuristic Evaluation
Heuristic evaluation is a process for identifying and focusing on specific issues without having to speak to users. It uncovers usability problems with individual interface elements and how they impact the overall user experience, and it provides quick and inexpensive feedback to designers. We followed a three-step process for the heuristic evaluation:
We converged to establish a common vision and understanding of the software, which helped us decide which aspects of the system the team would evaluate.
We diverged to examine the interface individually, evaluating it against our list of heuristics.
We discussed the violations we found and aggregated them into a single combined evaluation for the group. For each issue, we described the problem, rated its severity, and offered potential solutions.
46 Heuristic Violations Found
7 usability catastrophes: imperative to fix before the product can be released
11 major usability problems: important to fix, so should be given high priority
13 minor usability problems: fixing these should be given low priority
15 cosmetic problems only: need not be fixed unless extra time is available on the project
Test Plan
A test plan is helpful for adaptation: usability testing requires a willingness to adapt to battlefield conditions like last-minute changes to the software, participant no-shows, and equipment availability and malfunctions. Our test plan defined:
Why you want to conduct the test (goals and objectives)
What you are going to test (aspects of the system)
Who you are going to test with (participants)
How you are going to conduct it (test design)
When you are going to perform it
What data you will collect
What you will provide upon completion
Test Materials
Developing materials early on helps to explicitly structure and organize the test. Difficulty developing one particular type of test material can be a sign that there are flaws in the test objectives and test design.
List of Test Materials:
Orientation Script
Informed Consent
Team Roles
Background Questionnaire
Pre-Test Questionnaire
Think-Aloud Practice
Task Scenarios
Post-Task Questionnaire
Post-Test Questionnaire
Debrief
Pilot Testing
Pilot testing is a rehearsal of your actual usability test. It is important to pilot test your equipment and materials with a volunteer participant.
We conducted our pilot tests in the evening through Zoom, so the physical environments in which participants were located varied. Each call had three people: a team member, a participant, and an employee of the company. The employee provided the software on their computer, which the participant controlled remotely through Zoom. It is important to note that while the employee shared their screen, they were not present throughout the call.
Screener
Screener questions prevent unsuitable participants from joining usability testing sessions. A screener consists of multiple-choice questions that either eliminate users from taking part in your study or qualify them for it.
We wanted to include participants who had at least some basic knowledge of using an inventory management system. We also needed to exclude participants who had used the PTC Servigistics software, to avoid bias in the study.
We circulated our screener through platforms like WhatsApp, Discord, and Slack to find potential participants, and recruited 5 participants for the usability testing of the software.
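As a rough sketch of the kind of include/exclude logic a screener like this encodes (the field names, answer values, and criteria below are illustrative assumptions, not our actual screener questions):

    # Minimal sketch of screener filtering (hypothetical field names and values).
    # Include: at least basic experience with an inventory management system.
    # Exclude: anyone who has already used PTC Servigistics, to avoid bias.
    def is_eligible(response: dict) -> bool:
        has_experience = response.get("inventory_system_experience") in {"Basic", "Intermediate", "Advanced"}
        used_servigistics = response.get("used_ptc_servigistics") == "Yes"
        return has_experience and not used_servigistics

    responses = [
        {"name": "P1", "inventory_system_experience": "Basic", "used_ptc_servigistics": "No"},
        {"name": "P2", "inventory_system_experience": "None", "used_ptc_servigistics": "No"},
        {"name": "P3", "inventory_system_experience": "Advanced", "used_ptc_servigistics": "Yes"},
    ]
    print([r["name"] for r in responses if is_eligible(r)])  # ['P1']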
Research Questions
Defining research questions is an important part of the usability testing process; it gives us an idea of what to focus on while designing the usability test. Our research questions were:
What are the major usability issues in the SKU levels section within the application?
Are participants able to successfully find inventory parts? How easy/hard is it?
Are participants able to successfully use the liquid layout feature? How easy/hard is it?
How difficult is it to find specific inventory parts, as compared to participants’ initial expectations?
Are there any false, missing, or wrong affordances? Do participants find it easy to navigate the application? Do participants recognize what icons/symbols refer to?
How and when do participants use help functions? Are they satisfied with the help they receive?
Data Collection
Quantitative data:
5-point Likert ratings for:
Ease of use (Very Easy, Easy, Neutral, Difficult, Very Difficult)
Satisfaction with the look/feel of the user interface and help features (Very Satisfied, Satisfied, Neither Satisfied nor Dissatisfied, Dissatisfied, Very Dissatisfied)
Agreement with statements (Strongly Agree, Somewhat Agree, Neither Agree nor Disagree, Somewhat Disagree, Strongly Disagree)
Likelihood of using a system like this in the future (Extremely likely, Somewhat likely, Neither likely nor unlikely, Somewhat unlikely, Extremely unlikely)
Background questionnaire results, aggregated across participants
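As a minimal sketch of how ratings like these could be tabulated per task across participants (the data, field names, and values below are hypothetical, not our actual results):

    from collections import Counter

    # Hypothetical post-task ease-of-use ratings, one entry per participant per task.
    ratings = [
        {"task": "Scenario 1", "ease": "Easy"},
        {"task": "Scenario 1", "ease": "Very Difficult"},
        {"task": "Scenario 2", "ease": "Neutral"},
        {"task": "Scenario 2", "ease": "Easy"},
    ]

    # Map the 5-point scale to numbers so a mean can be reported alongside raw counts.
    scale = {"Very Easy": 5, "Easy": 4, "Neutral": 3, "Difficult": 2, "Very Difficult": 1}

    by_task = {}
    for r in ratings:
        by_task.setdefault(r["task"], []).append(r["ease"])

    for task, answers in by_task.items():
        counts = Counter(answers)
        mean = sum(scale[a] for a in answers) / len(answers)
        print(task, dict(counts), round(mean, 2))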
Qualitative Data:
Open-ended questions
Notes taken during the test sessions
Recordings from the testing sessions
Task Setup
Testing Environment
Zoom
Remote laptop with software
Researchers’ personal laptop
Participant's laptop
Testing Times
All usability tests took place after 5 pm
Testing Setup
Participants took control of the remote laptop that had the software on it through Zoom
Recordings were captured through Zoom and included the screen view of the software and the participant's face
Scenario 1:
You want to save a spreadsheet of the current SKU Levels chart for reference. Please let me know when you are done.
Scenario 2:
Your boss wants to ensure that the following parts, P44J417 and 910-014, are confirmed in the system. Please let me know when you are done.
Scenario 3:
Due to the pandemic you had to override several of the predictions in the software. The last time you did this was December 2020, when you were sure things would be back to normal by May 2021. It's looking like you will still need those overrides for now, so you want to change the date of the override to a time when you think it would be good to check back and reevaluate.
Scenario 4:
You have a bulk order to change the part ‘DA320’ for 50 cars in a transport agency scheduled for next week. Make sure you have enough parts to meet the requirements of the order. Please let me know when you are done.
Scenario 5:
You want to do a side-by-side comparison of the Part Chain Details section with the Fill Rate Exchange Curve section. Please let me know when you are done.
Scenario 6:
The Forecast vs ROP Value graph did give you information on the changes in reorder values, but you no longer want a graph view, so try removing it. Please let me know when you are done.
You are now interested in the exact ROP value of a specific part. Try adding the tab that gives you a part-wise ROP value. Please let me know when you are done.
Task Completion
Results
Usability Issues Found
The data collected through post-task questionnaires and the notes taken by the observers were analyzed and results were drawn out. We aligned our findings with our heuristic evaluation report, and our usability tests supported most of the issues we had listed as expert evaluators in that report.
Download Button
Heuristic Violation: External Consistency
The placement of the download button is externally inconsistent with other systems.
Its location and size make it difficult for users to find.
Liquid Layout
Heuristic Violation: Use of appropriate design patterns
Although some users were able to figure out the liquid layout of the system, they still had some difficulty figuring out how to place the tabs and move things around.
Button Size (help, close, and minimize)
Heuristic Violation: External Consistency
The buttons are small, and users had trouble seeing what they were.
System Feedback
Heuristic Violation: Visibility of System Status
There is no indication that the changes were processed in the SKU Levels Chart
Recommendations
The latest update shows that some of the issues mentioned in the submitted heuristic evaluation report have been worked on and design solutions for them have been implemented.
Make the buttons on the top right of the view larger
Add feedback for when users make changes to the chart
To reduce the cognitive load on the user and make the system more efficient, the system can follow a cleaner design pattern and limit the information on screen.
An example of the proposed design can be seen below.