Prepared by
NACD Rental and Used Equipment Services
Global Rental and Used Equipment Group
August 2006
Table of Contents

Executive Summary ................................ 3
Project Charter .................................. 4
Project and Team Summary ......................... 6
Evaluation Process ............................... 7
System Recommendations .......................... 10
Vendor Final Weighted Scores .................... 11
Scoring Detail by Vendor
   Rental Result ................................ 15
   RentalMan .................................... 16
   DBS .......................................... 17
   Systematic ................................... 18
   Exact ........................................ 19
   Solutions by Computer ........................ 20
Pricing ......................................... 21
Selection Issues ................................ 24
Appendix
   Usability Testing ............................ 26
   Usability Lab Report: Rental Result .......... 28
   Usability Lab Report: RentalMan .............. 33
Contacts ........................................ 36
Executive Summary
Background
Earlier this year, when Caterpillar announced that Exact rental software would not be
included in DBSi 5.0, some dealers began evaluating other options. NACD's Rental and
Used Equipment Services group agreed to help with the evaluation process. A joint team,
including representatives from NACD and Global Rental and Used Equipment,
developed a process for comparing alternative solutions. The process was then used to
evaluate six rental business systems. This document recaps the results of the analysis.
Conclusions and recommendations
Selecting a new rental system is a costly, complex decision. Therefore, the team
recommends that prior to deciding whether to purchase a third-party rental solution,
any dealer using DBS/EMS should consult with Dealer Distribution Systems to
ensure that current system capabilities are being maximized.
NACD dealers who decide to replace DBS/EMS are encouraged to invest in either
Rental Result or RentalMan.
o Rental Result. A Web-based system, proven in Europe, it offers superior
configuration, customization, and on-line training capabilities, as well as
excellent workflow and easy access to all day-to-day business processing from
a single screen. Other features include integrated reporting tools and multi-language and multi-currency capabilities.
o RentalMan. An established rental package, well known in North America, it
is a text-driven, transaction-based system that features fast data entry and
quick user training. It has excellent built-in reporting and good online system
and analysis tools, as well as some multi-lingual and multi-currency
capabilities.
Next Steps
Caterpillar will work with the two selected vendors to define a standard deployment
configuration and methodology. In addition, Caterpillar will work with these vendors to
define data fields that are needed for common dealer interfaces, work with DDS and
these vendors to provide a cross-reference of data fields between DBS and these systems,
and define the needed validation rules for DBS, Rental Result, and RentalMan. Although
Caterpillar is going to work with these vendors on deployment and interface issues, we
have no plans to develop the interface code. This development work will be the
responsibility of the vendor or the dealer. It should be noted that the defined work (stated
above) will only be carried out with the recommended vendors for NACD. Any dealer
choosing a system by another vendor will not receive this level of support.
Project Charter
Business Case
Caterpillar and dealers agreed to remove the Exact rental software application from
DBSi. The Rental & Used Equipment Services group in NACD offered to help evaluate
and recommend alternative solutions.
Opportunity Statement
If Caterpillar guides dealer efforts to evaluate and select a new rental business system,
then:
Dealers will have access to key information that can help them make better
decisions.
Caterpillar will not be faced with an unmanageable number of rental applications.
Dealers will have systems that meet their requirements.
Goal Statement
Develop a process to evaluate the functionality and usability of alternative rental business
systems. Use the process to create a list of recommended solutions.
The following considerations will be made in creating the process:
Determine which applications dealers are currently using.
Evaluate what is functional and what is not (as is).
Ensure compatibility of any recommended software with current rental best
practices and dealer architecture in terms of asset management, rental activities
(all equipment divisions) and financial control.
Include factors such as marketing group, dealer size and marketplace
requirements into analysis.
Project Scope
Support all Marketing Profit Centers (MPCs).
Utilize MPC input.
Pilot with NACD; roll out to other MPCs for validation/modification.
Final selection of a rental solution by dealers is not included in the scope of this
project.
Project Plan
Kickoff February 16, 2006
Present charter and scope to Rental Advisory Group March 2006
Present project to NADITA members May 2006
Present project to MPCs June 2006
GRUC presentation June 2006
MPCs announce to dealers June 2006
Update Rental Advisory Group June 2006
Conduct usability lab (dealers provide needed personnel) July 2006
Present summary and recommendations to sponsors August 2006
Project Sponsors
Glen Fauntleroy, NACD Rental & Used Equipment Services, General Manager
Jim Johnson, MPSD Global Rental and Used Equipment, General Manager
Process Owner
Sam Cooper, MPSD Global Rental and Used Equipment, Rental Support Manager
Team Leaders
William Hood, MPSD Global Rental and Used Equipment, Senior Systems Consultant
Mickey Avirett, Rental Business Consultant
Team Members
Alison Hixson, Rental Business Consultant
Stanley Hartwig, MPSD Global Rental and Used Equipment, Systems Consultant
Larry Bordner, NACD e-Business, Process Consultant Specialist
Charlie Pink, Global IT Dealer Distribution Systems, Senior IT Supervisor
Evaluation Process
The evaluation process was consistent across all vendors.
Each vendor was contacted with an introductory phone call and letter.
A two-day on-site visit was arranged to evaluate each vendor's product and
complete the Software Evaluation Scenario spreadsheet.
Each software demonstration was scored on a scale of 1 to 5.
1 = does not meet the specific requirement.
2 = meets some of the specific requirement.
3 = meets most of the specific requirement.
4 = meets the entire specific requirement.
5 = far exceeds the specific requirement.
Vendors were scored independently by all team members present at the
evaluation.
If the independent scores were notably different, team members discussed and
reconciled the differences at the end of each day.
At the conclusion of the evaluation, team scores were totaled and recorded.
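The weighting arithmetic behind the score sheets can be sketched as follows. This is an illustrative reconstruction, not the team's actual spreadsheet: each category's team score (0-100) is multiplied by its item weight, and the three section averages then combine at 60% / 20% / 20% into the final weighted score. The category names and numbers in the example are a small hypothetical subset, not the full scoring sheet.

```python
# Illustrative sketch of the weighted-scoring arithmetic (not the team's
# actual spreadsheet). Each category's team score (0-100) is multiplied by
# its item weight; section averages then combine 60/20/20 into the final
# weighted score. Category names and numbers below are made-up examples.

def weighted_average(scores, weights):
    """Combine category scores (0-100) using percentage weights summing to 100."""
    assert abs(sum(weights.values()) - 100) < 1e-9
    return sum(scores[item] * weights[item] / 100 for item in weights)

# Hypothetical three-category functionality section:
functionality = weighted_average(
    {"Customer Search": 84, "Invoicing": 100, "Reporting": 94},
    {"Customer Search": 10, "Invoicing": 50, "Reporting": 40},
)
integration = 74.0        # section averages, computed the same way
vendor_criteria = 80.0

# Sections combine at 60% / 20% / 20%:
final_score = functionality * 0.60 + integration * 0.20 + vendor_criteria * 0.20
```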
Below are samples of the scenario-scoring sheet and the final weighted scoring sheet:
Scenario-scoring sheet (sample): for each named scenario, the sheet lists the scenario
steps and requirements, flags each requirement as "Explanation" or "Explanation /
Demonstrate," and records the functionality score (1 to 5) assigned by each team member.

Final weighted scoring sheet (sample): each vendor (Rental Result, RentalMan, DBS,
Exact, Solutions by Computer, Systematic) receives a weighted score per category.
Functionality items (60% of the final score) carry weights from 1% (e.g., General
Parameters/System Set-up) to 10% (e.g., Equipment/Consumables Search). Integration and
Technology items (20%) include Hardware and Platform, Database, Interface Capabilities,
System Administration, and Security. Vendor Criteria and Service items (20%) are
weighted as follows: Vendor presence, maturity, stability and investment (21%);
Flexibility, extensibility, integration and upgradeability (31%); Internet and
eCommerce capabilities (21%); Implementation Considerations (21%); References (6%);
and Pricing (0%).
Process focus
Prior to evaluating each solution, the team identified dealers' core business processes and
made a list of key functional requirements that enabled those processes. The list served as
a focal point for the evaluation. We viewed each demonstration in the context of the
following guidelines:
Verify that the software addresses key business processes.
Avoid getting overly impressed with new features that add functionality unrelated
to core processes. (For example, don't be swayed by a product that can be
accessed remotely with a PDA, but does not enable demand planning.)
Never assume software is capable of handling something we consider a standard
business function.
Ask tough questions. Don't settle for a "yes-we-can-do-that" response. Verify that
the task can not only be done, but done at the level required.
Ask detailed questions to determine exactly how the software works. Confirm that
the data elements required to complete each task are present.
Consider usability in addition to functionality. Functionality answers the
question, "Can the software do something?" Usability addresses "How does a user
get the task done?"
System Recommendations
Based on the analysis we completed, and the results of usability tests conducted on the
highest scoring software products, the team recommends the following systems for dealer
consideration:
1. Rental Result
2. RentalMan
3. DBS
Rental Result scored highest in terms of functionality and usability. The higher score
was due to its greater configuration capabilities, good workflow through the control
center and excellent training tools. Another plus is the fact that the product is developed
in Java. Some usability testers expressed concern about Rental Result's workflow.
However, the evaluation team believes the software is very flexible and can be easily
configured and formatted to suit dealers' needs.
RentalMan was a close second. It is a solid package, well known within the equipment
rental industry. However, the fact that a major competitor owns the software could create
information security issues and limit a dealer's ability to achieve differentiation. In
addition, the programming language is aging, so it may become harder and more costly to
find people to maintain the software.
DBS ranked fourth in rental functionality and was not tested for usability. However, we
recognize that it may be a dealer's only choice when the cost of implementing a new
rental system is not practical.
Dealers are not bound by the results of this project. Ultimately, it is up to each dealer to
choose the system that meets its requirements. However, by limiting choices to two or
three common systems, dealers may realize the following benefits:
Strengthen price negotiation position.
Improve information exchange; ease the process of collecting data at a corporate
level (SIMS reporting).
Reduce development costs.
Reduce implementation time and costs.
Improve business efficiency by driving common best practices.
The following pages (11-20) provide more detailed scoring information about each
vendor, starting with the Functionality Weighted Scores for all vendors.
Vendor Final Weighted Scores

The detailed scoring sheets are summarized below as team final averages for each
weighted section. (RentalMan is the Wynne Systems product; Systematic is the Texada
product.)

Functionality / 60% - Final Averages
   Rental Result           82.05
   RentalMan               78.67
   Systematic              66.73
   DBS                     62.73
   Exact                   62.10
   Solutions by Computer   61.96

Functionality items scored (item weights of 1% to 10% each): General Parameters/System
Set-up; Misc./Other Modules; Customer Account/Credit; Equipment/Consumables Search;
Customer Search; Quotation; Reservation; Rental Contract; Single Item; Substitute
Equipment; Update Rental/Update Contract; Swapping Equipment/Request for Repair;
Partial Equipment Return; Return Equipment; Rental Purchase Option/Equipment Sale;
Special Contract; Services; Invoicing; Lost Deal; Contract Printing; Rental Rate
Maintenance; Repair and Maintenance; Reporting.

Integration and Technology / 20% - Final Averages
   Rental Result           74.11
   RentalMan               71.50
   DBS                     67.44
   Systematic              58.00
   Solutions by Computer   57.83
   Exact                   52.44

Integration and Technology items scored (item weights of 5% to 25% each): Hardware and
Platform; Database; Interface Capabilities; System Administration; Security; Support
(Help Desk); Performance.

Vendor Criteria and Service / 20% - Final Averages
   Rental Result           80.00
   RentalMan               67.50
   Exact                   62.32
   DBS                     57.09
   Systematic              54.71
   Solutions by Computer   44.20

Vendor Criteria and Service items scored: Vendor presence, maturity, stability and
investment (21%); Flexibility, extensibility, integration and upgradeability (31%);
Internet and eCommerce capabilities (21%); Implementation Considerations (21%);
References (6%); Pricing (0%).

Final Weighted Scores
   Rental Result           80.06
   RentalMan               75.00
   Systematic              62.58
   DBS                     62.55
   Exact                   60.22
   Solutions by Computer   57.58

Scoring Detail by Vendor

Rental Result
Weighted Scores
   Functionality                 82
   Integration and Technology    74
   Vendor Criteria and Service   80
   Final Weighted Score          80
RentalMan
RentalMan finished second in the analysis and had the following strengths:
Established package, proven in the US marketplace.
Current industry standard, used by most major US rental companies (so
employees who have worked in the industry may already be familiar with it).
Text-driven, transaction-based system; easy to learn, with fast data entry capabilities.
Functional, easy-to-use search tools.
Excellent built-in reporting, good online system and analysis tools.
Mike Young from the Cat Usability Laboratory called RentalMan "more closely matched
to the systems and processes users currently utilize."
Despite the popularity of RentalMan, it has several limitations. It is owned by a major
competitor, which could jeopardize dealer differentiation and put confidential
information at risk. In addition, it has been on the market since 1993. As its
programming language continues to age, it will become more difficult and costly to
maintain the software. The company did not convey a clear direction about future
language and product development. Finally, the team found some insufficiency in the
vendor's support resources and has heard negative comments from some dealers regarding
the company's rapid deployment methodology.
Weighted Scores
   Functionality                 79
   Integration and Technology    71
   Vendor Criteria and Service   67
   Final Weighted Score          75
DBS
DBS was virtually tied for third place with Systematic in functionality and was not tested
for usability, since nearly all NACD dealers currently use DBS. Although the system is
deployed at Cat dealerships, it was never engineered as a rental solution. It is not user
friendly and is difficult to train on. It, too, has an aging programming language and is
not Web enabled. Despite its many limitations, DBS may be a dealer's preferred choice
when the cost of implementing a new rental business system cannot be justified.
Weighted Scores
   Functionality                 63
   Integration and Technology    67
   Vendor Criteria and Service   57
   Final Weighted Score          62
Systematic
Systematic scored third in functionality and was not tested for usability. Its strengths
include:
Well-established rental application with Web enablement.
User-friendly system, developed by a company with good knowledge of the
industry.
Looks and feels like a typical Unix application with many cascading drop-down
menus.
Systematic has begun developing a new Java-based rental business application. After
the product has been released and gained market maturity, it may warrant further
analysis.
Due to low functionality scores, the Systematic product is not being recommended for
dealer consideration.
Weighted Scores
   Functionality                 67
   Integration and Technology    58
   Vendor Criteria and Service   55
   Final Weighted Score          62
Exact
The stand-alone Exact product ranked fifth in functionality. The version we tested had
more features than the version included in DBSi, which was frozen at a previous release.
However, DBSi Rental (which includes all software components used in the rental
business) worked better than the stand-alone Exact product, due to integration and
functionality of the individual software components.
The team found an obvious lack of industry and rental knowledge during the evaluation.
As Exact is a complete ERP package focused primarily on manufacturing, it was never
engineered as a dedicated rental system. The company appears to have no direction or
budget for developing a future product that better meets the needs of the rental business.
We were extremely disappointed that the stand-alone version offered by Exact is not the
same version that was deployed in SCM in Japan (e.g., the stand-alone version still
cannot print reports, which is standard functionality in the SCM version).
Exact was not invited to participate in the Usability Laboratory.
Due to its low functionality scores and lack of commitment, Exact will not be
recommended for dealer consideration.
Weighted Scores
   Functionality                 62
   Integration and Technology    52
   Vendor Criteria and Service   62
   Final Weighted Score          60
Solutions By Computer
SBC chose to demonstrate Enfinity, a new rental services product. While the system
scored high in functionality on the elements presented, the overall score was driven down
by missing segments in the new product. SBC targets its products and marketing
toward smaller, mom-and-pop type operations. We feel this marketing niche would
hinder SBC in implementing an effective rental business solution for Cat dealers.
SBC was not invited to participate in the Usability Laboratory.
Due to low functionality scores, SBC will not be recommended for dealer consideration.
Weighted Scores
   Functionality                 62
   Integration and Technology    58
   Vendor Criteria and Service   44
   Final Weighted Score          57
Pricing
As part of our evaluation, the team asked the vendors to provide pricing information for
review. When you review the pricing, you will see that Wynne Systems bases some of
its pricing on the number of branches, while Rental Result bases its pricing on the number
of users, so it is difficult to make an apples-to-apples comparison. In addition to the
differences in pricing structure, Wynne Systems lists optional software; the functionality
provided by this optional software is included in Rental Result's standard system.
For your review, we have included the pricing models provided to Caterpillar by
Wynne Systems, Inc. and Rental Result on the following pages. However, to
understand the actual cost of the rental business system for your dealership, it will
be each dealer's responsibility to solicit quotes from the vendors.
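Because the two pricing structures differ, a dealer has to map each quote onto its own branch and user counts before the totals become comparable. The sketch below illustrates that normalization using figures from the quotes on the following pages; the tier-selection rule (choosing the smallest quoted tier that covers the branch count) is our assumption for illustration, not a vendor term.

```python
# Illustrative normalization of the two pricing models (figures taken from
# the vendor quotes in this section). The tier-selection rule below is an
# assumption for comparison only; actual pricing must come from the vendors.

RENTAL_RESULT_PER_USER = 3_500       # USD per user (Rental Result quote)
RENTAL_RESULT_SUPPORT_RATE = 0.25    # 17,500 / 70,000 in the 20-user quote

# Wynne Systems license totals by quoted location tier (USD)
WYNNE_TIER_TOTALS = {5: 366_165, 10: 414_790, 20: 512_040}

def rental_result_license(users):
    """Capital license cost and annual support for a given user count."""
    capital = users * RENTAL_RESULT_PER_USER
    return capital, capital * RENTAL_RESULT_SUPPORT_RATE

def wynne_license(locations):
    """License total at the smallest quoted tier covering the branch count."""
    tier = min(t for t in WYNNE_TIER_TOTALS if t >= locations)
    return WYNNE_TIER_TOTALS[tier]

# Example: a dealership with 8 branches and 60 rental users
capital, support = rental_result_license(60)   # 210,000 capital, 52,500/yr
wynne_total = wynne_license(8)                 # falls into the 10-location tier
```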
Prepared for: Caterpillar, Inc.
Submitted by: Wynne Systems, Inc.
August 23, 2006
All information contained in this quote is proprietary and may not be reproduced without
the written consent of Wynne Systems, Inc.
NOTE
Wynne Systems quotes RentalMan as a base license with optional software. The system
we evaluated and tested in the Usability Lab utilized all the optional software listed here.
RentalMan license quote. The itemized license column shown is the 20-location tier;
license totals for the 5- and 10-location tiers follow the table.

Item                                                  License (USD)  Annual Maintenance
RentalMan                                               $150,000.00  19% of license
Options Pricing:
  RentalMan Customer Portal                              $30,000.00  19% of license
  RentalMan Sales Tool                                   $30,000.00  19% of license
  RentalMan Dashboard                                    $80,000.00  19% of license
  GUIStyle by ABL                                       $174,500.00  20% of license
  FastFax by Quadrant (one partition)                    $11,995.00  18% of license
  FormSprint by Integrated Custom Software
    (one partition)                                      $11,000.00  18% of license
  Surveyor/400 by Linoma Software (one partition)         $1,995.00  18% of license
  File Access by Oasis Software (one partition)             $955.00  18% of license
  Jcharge Core Software License by Verifone
    (one processing network)                             $21,595.00  18% of license
Total                                                   $512,040.00

License totals by dealership size:
  5 Locations     $366,165.00
  10 Locations    $414,790.00
  20 Locations    $512,040.00
All information contained in this quote is proprietary and may not be reproduced
without the written consent of Rental Result
Indicative Costs (16-Aug-06)

Software license: $3,500 per user.

                                      20 Users                 60 Users
Element                        Capital    Annual Support  Capital    Annual Support
Software license (users x
  $3,500)                       70,000        17,500      210,000        52,500
Other cost elements              9,500         2,375        9,500         2,375
                                 7,500         1,875        7,500         1,875
                                 7,500         1,875        7,500         1,875
                                 2,736           684        2,736           684
Sub Totals                      97,236        24,309      237,236        59,309

Professional Services (Phase 1)
                                   Rate/day   20 Users          60 Users
                                              Days    Cost      Days    Cost
Project Director                      1,750     12    21,000      18    31,500
Project Manager                       1,500     24    36,000      36    54,000
Application Consultancy / Training    1,500     38    57,000      44    66,000
Installation                          1,250      5     6,250       5     6,250
Data Conversions                      1,250      8    10,000       8    10,000
BI Training & Consultancy             1,500     16    24,000      16    24,000
Total Professional Services Phase - 1                154,250            191,750

Totals (license + professional services)
  20 users:  251,486 capital, 24,309 annual support
  60 users:  428,986 capital, 59,309 annual support
Selection Issues
Selecting the right system involves more than simply comparing features, functionality
and price. It requires finding a system that not only meets business objectives, but also
can be successfully implemented in an acceptable timeframe and maintained at a
reasonable cost over the long haul.
Some of the key factors to consider when evaluating potential systems include the
following:
System intuitiveness. This will affect the level of training required and the ability
of the system to control business processes. Intuitiveness is an important
consideration when staff turnover is high or significant business growth is
planned.
Payback period. The payback period should be calculated and understood prior
to selection. Some of the elements contributing to Return on Investment will be
tangible and others will be assumptions. For example, a reduction in headcount is
a calculated savings, while a 5% increase in rental revenue is an estimate.
To-be perspective. The costs and benefits of a new system should be based on
the to-be business model, as opposed to the as-is. For example, if there is an
expected headcount reduction and the system being considered has a user-based
license structure, then the projected staffing levels should be used in the cost
calculation.
Integration. The level of integration required between the rental service business
and the rest of the dealership must also be considered. In some cases, the need for
a totally integrated system may make it cost-prohibitive to implement a stand-alone
rental solution. It is a very costly mistake to develop system interfaces that
are not truly necessary. Carefully analyze the business configuration driving the
need for interfaces and only develop after examining the costs and benefits.
Other issues. Several other critical factors must also be considered while
evaluating alternative rental packages.
The team recognizes that the selection of rental business software is a complex and costly
undertaking. Therefore, we recommend that dealers consult with Dealer Distribution
Systems prior to making a final decision, to ensure that they are maximizing DBS/EMS to
its fullest extent.
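The payback guidance above (separate calculated savings from estimated gains, and cost the system at to-be staffing) can be sketched as a simple model. Every figure in the example is invented for illustration, except the $3,500-per-user license rate, which comes from the pricing section.

```python
# Hypothetical payback-period sketch following the guidance above: cost the
# system at projected ("to-be") staffing, and keep calculated savings
# separate from estimated revenue gains. All example figures are invented.

def payback_years(upfront_cost, annual_support, annual_savings,
                  annual_revenue_gain, margin):
    """Years to recover the upfront cost from net annual benefits."""
    annual_benefit = annual_savings + annual_revenue_gain * margin - annual_support
    if annual_benefit <= 0:
        raise ValueError("system never pays back under these assumptions")
    return upfront_cost / annual_benefit

years = payback_years(
    upfront_cost=60 * 3_500 + 190_000,  # to-be user licenses + implementation
    annual_support=60_000,
    annual_savings=120_000,             # e.g. headcount reduction (calculated)
    annual_revenue_gain=400_000,        # e.g. 5% rental revenue lift (estimate)
    margin=0.30,                        # contribution margin on added revenue
)
```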
Appendix

Usability Lab Report: Rental Result
Field labels, error messages, page titles, etc. all need to be written in terms of the
target audience and users in general.
Users did not go to "movements." Suggest changing the label.
****There is very little process apparent in the system. It's just a collection of
pages/tabs where users fill in information. For transactional tasks like these, the
process needs to be very clear or it will result in data issues in the system.
****Users expect to be able to use searches and then perform tasks from within
that contract/customer. Making the application contract- or customer-centric
rather than area-centric would probably be more intuitive.
Rely on recognition, rather than recall. Use selection methods when possible;
provide easy access to instructions; make icons, menu names/options easy to
understand so users do not have to memorize or remember them. Score = 3
**** Invisible key strokes should not be required by users to activate something
in the GUI (enter to cancel/ok, etc.)
Users did not readily see icons (new, save, etc.).
Contract lists don't contain the information users need, such as equipment ID.
Users need to see equipment from the list of contracts.
****Most users did not recognize the tiled windows for what they were. Tiles
should have more descriptive labels to better distinguish what is open.
Icons should be redundant rather than primary; users expect main commands on
screen for these types of applications.
****Users did not know what "movements" contained. Even once they were
prompted there, it was not clear to them what they needed to do.
Need a legend for the different statuses, or an information icon that gives users
the ability to find out what the different statuses are.
****Users expect a selection from create/update to create or update, rather than
relying on user to recognize what they need to do once in the area.
Need area labels within a page (i.e., charges).
Provide useful error messages. Strive to develop an error-proof design to minimize
the need for error messages. But when messages are required, make sure they're
expressed in plain language, indicate the problem and suggest a solution. Score = 3
Some error messages provided both the problem and the solution, but others were
not clear to users and did not seem to match what the application had just done.
Good message: "The account number entered does not exist. To search by
customer name, click the button next to the account field."
Poor message: "The application is currently busy and can't be closed at this time.
Please close or cancel all active documents and try again." (This situation was
caused by another window being opened, but the message made it sound like the
system was busy or overwhelmed and the user could just wait.)
****Poor message: "Returned fuel not entered against a returned line with
dispatched fuel." (What does the user need to do?)
****Even worse: "There is no quantity on line ___." (No indication of where the
line is that the message is referring to. The line was not apparent on the page, but
buried in another tab.)
Allow easy reversal of actions. Users often experiment, click wrong buttons or type
incorrect information. They need to be able to undo a recent action quickly. Score = 3
Users had difficulty changing the price for the pick-up charge. It was not clear
that they can highlight and type in changes.
****Users expect a back button to recover from an error.
Cancel should cancel the page and return the user to the previous state, not the
starting point.
Provide flexibility and efficiency. Individual users have differing needs which may
change over time. Provide flexibility (shortcuts, accelerator keys, pop-up menus, etc.)
to meet their changing needs and preferences. Score = 3
****Information is a bit siloed. Users can find the contract, but can't return it from
search. Users expect to be able to find contracts and perform rental returns (actions
in general) from the search area. The system needs to flow both ways: find a contract
and act on it, or go to the action area.
Users have to go through several search areas to get to what they want to search
on.
Expect typed entry in the previous field to carry through if the user types and then
hits the look-up.
Requiring amendments each time was frustrating to users. Amendments appeared
even if the user hadn't changed anything.
****The cursor should come up in the field to be searched to minimize mouse-to-keyboard movement.
In some areas, the user had to essentially search twice on a field to get to a results
page, as it brought up a confirming account-number page for the customer that
had been searched on.
Keep designs attractive and simple. Eliminate unnecessary or seldom-used
information. Extra information and visual clutter compete with what is relevant.
Present information in a natural and logical order. Score = 3
Select-unit buttons are not in the area where units are being selected.
Rental return: if the only job is to create new, just go there, not to a screen that
makes users select "new."
****The report is very busy and poorly laid out, with no clear heading labels. It
looks similar to what we see in Exact today.
Tiles should not remain on the top left if the user has saved and exited that area.
Offer useful help & documentation. Help information should be easy to search,
focus on users task, list steps to be performed and require minimal space. Score = 4
Tutorial should be available from every screen, with ability to access different
topics.
Suggest moving the tutorials to the right, as they are in prime real estate now
and most users look for help in the upper right corner.
Users thought tutorials were too quick.
Need a way to access the list of videos along with an indexed topic search, as
sometimes users need help finding where something is.
Some users may not see the progress bar, so we wouldn't suggest that they restart.
There should just be a replay button.
Videos were used by several users to help them complete their tasks.
Videos should demonstrate where exactly the user is, rather than just the general
area.
Usability Lab Report: RentalMan
****Users expect to be able to search on customer name from most areas, as they
don't always have the contract number.
****Users didn't pick up on the advanced search (F4) for finding other
information.
Some DBS users confused the function keys, but others were able to transfer their
knowledge of how to work with this type of system.
Rely on recognition, rather than recall. Use selection methods when possible;
provide easy access to instructions; make icons, menu names/options easy to
understand so users do not have to memorize or remember them. Score = 2
****Global commands should be visible on page at all times.
Sub-menu categories (sales, inquiries, etc.) need to be more visible. There are
redundant areas with categories (e.g., several users started with the incorrect
equipment menu item).
The date icon is not recognizable as a calendar; only one user attempted to use it.
Need a legend explaining what the different credit status codes mean.
Don't abbreviate labels if at all possible.
****The first list of Function links on the left should remain visible when the user looks at more information.
Provide useful error messages. Strive to develop an error-proof design to minimize
the need for error messages. But when messages are required, make sure they're
expressed in plain language, indicate the problem, and suggest a solution. Score = 2
****Informational messages at the bottom need to be brought into the main grey area;
only the error messages should be red.
****Users must be protected from losing data by means of confirmation prompts.
Ensure visibility of system status/information. Provide feedback to users about
what is happening, how long the wait may be and what the consequences of their
actions are. Score = 3
****If there is a multiple-field search where the fields are mutually exclusive, the
other fields should grey out once one field is entered.
Need to show the format of date fields so users don't have to guess.
Once the user has clicked a check box, the related fields should appear immediately, not after moving
to another field.
When searching, all customers appeared, rather than just those that met the search
criteria; this is not typical search behavior.
Maintain consistency & standards. Consistency reduces learning time and increases
productivity. Provide consistency within and between applications. Score = 2
****If the user clicked in the middle of a field, the password and search fields were
corrupted, as the system recorded the leading positions as spaces. When the user clicks in a field,
the cursor should appear in the first position.
The password field should show asterisks as characters are typed.
****Typed input should not wrap into the next field when the user enters a long string.
Allow easy reversal of actions. Users often experiment, click wrong buttons, or type
incorrect information. They need to be able to undo their most recent actions quickly.
Score = 2
****Need a "previous screen" button as a global function key.
****Users expect F3 to take them to the previous screen, not back to a starting
point.
Provide flexibility and efficiency. Individual users have differing needs which may
change over time. Provide flexibility (shortcuts, accelerator keys, pop-up menus, etc.)
to meet their changing needs and preferences. Score = 3
****Check boxes force the user to switch to the mouse; a strength of this type of system
is that it can be operated solely from the keyboard.
****Buried or hidden commands will greatly increase the learning curve. Is there a
site map that shows the user what commands are available in a given area?
Keep designs attractive and simple. Eliminate unnecessary or seldom-used
information. Extra information and visual clutter compete with what is relevant.
Present information in a natural and logical order. Score = 3
Credit status needs to appear together with the credit limit.
When searching on customer name, there is an extra step of selecting the account number before
the customer details are shown.
****Need visual breaks between areas on a page, and a layout of information that
increases scannability and readability.
Offer useful help & documentation. Help information should be easy to search,
focus on the user's task, list the steps to be performed, and require minimal space. Score = 1
Some users expected to be able to type a question mark into a field to get
contextual help on that field.
Help screens did not provide useful information to the users.
The help window locked out the other screens until it was closed.
Appendix: Contacts
For additional information pertaining to this report, please contact the following:
Will Hood
Sr. Rental Systems Consultant
Global Rental & Used Equipment Department
Hood_William_A@cat.com
Phone: 559-323-9606 (Office)
559-260-7459 (Mobile)
Stan Hartwig
Rental Systems Consultant
Global Rental & Used Equipment Department
Hartwig_Stanley_G@cat.com
Phone: 309-675-6478 (Office)
309-253-2347 (Mobile)
Sam Cooper
Global Rental Support Manager
Global Rental & Used Equipment Department
Cooper_Sam_E@cat.com
Phone: 309-675-4720 (Office)
309-258-2206 (Mobile)