Table 1. Number of usability barriers and facilitators of the MPA
Columns report, for each category, the number (and column percentage) of barriers identified in the first iteration (n = 7) and second iteration (n = 6), and of facilitators identified in the first iteration (n = 7) and second iteration (n = 6), together with an illustrative example.

System: features governed by policies, regulations or operational demands of the health care system.
- Barriers, first iteration: 17 (25.8%). Example: participants' lack of knowledge regarding pain screening tools.
- Barriers, second iteration: 12 (21.4%). Example: participants' lack of knowledge regarding interpretation of pain screening scores.
- Facilitators, first iteration: 0.
- Facilitators, second iteration: 4 (8.2%). Example: explanations of the tools were found to be very helpful.

UI: graphical presentation of content, including layout, ease of navigation and data entry fields.
- Barriers, first iteration: 28 (42.4%). Example: locating patient education resources in encounter guides was not easy.
- Barriers, second iteration: 22 (39.3%). Example: empty fields added clutter to the main page.
- Facilitators, first iteration: 7 (16.7%). Example: the visual presentation (i.e. size and style of font, and colour) of Pain Assistant was pleasing.
- Facilitators, second iteration: 17 (34.7%). Example: the ability to enlarge and minimise questionnaires in encounter guides was found to be helpful.

Content: clinical aspects of the MPA (e.g. quality of the medication list).
- Barriers, first iteration: 15 (22.7%). Example: Pain Assistant does not present any history of pain diagnosis.
- Barriers, second iteration: 20 (35.7%). Example: numeric ICD-9 codes were not explicit.
- Facilitators, first iteration: 29 (69.0%). Example: the plotted trend of changes in pain scores helped to identify patterns of pain.
- Facilitators, second iteration: 28 (57.1%). Example: Pain Assistant offers different management strategies for different types of pain rather than only medication therapies.

Technical: software functions (e.g. ability to save, broken links).
- Barriers, first iteration: 6 (9.1%). Example: Pain Assistant did not save automatically.
- Barriers, second iteration: 2 (3.6%). Example: when clicking on tools in Pain Assistant, the new pages are large and cannot be resized.
- Facilitators, first iteration: 6 (14.3%). Example: automatic calculation of scores.
- Facilitators, second iteration: 0.

Total: 66 barriers (first iteration), 56 barriers (second iteration), 42 facilitators (first iteration) and 49 facilitators (second iteration). Percentages are relative to the corresponding column total.
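The percentages in Table 1 are simple column shares: each category count divided by its column total. The following minimal sketch, written for this revision rather than taken from the study, recomputes those shares from the counts in the table; the dictionary layout and variable names are illustrative only.

```python
# Recompute the column percentages of Table 1 from the raw counts.
# Counts are transcribed from the table; everything else is illustrative.

counts = {
    "Barriers, 1st iteration":     {"System": 17, "UI": 28, "Content": 15, "Technical": 6},
    "Barriers, 2nd iteration":     {"System": 12, "UI": 22, "Content": 20, "Technical": 2},
    "Facilitators, 1st iteration": {"System": 0,  "UI": 7,  "Content": 29, "Technical": 6},
    "Facilitators, 2nd iteration": {"System": 4,  "UI": 17, "Content": 28, "Technical": 0},
}

for column, by_category in counts.items():
    total = sum(by_category.values())          # column total (66, 56, 42, 49)
    print(f"{column} (total = {total})")
    for category, n in by_category.items():
        share = 100 * n / total                # percentage of the column total
        print(f"  {category:<10} {n:>3} ({share:.1f}%)")
```

Running this reproduces the rounded values shown in the table, for example 17/66 = 25.8% for System barriers in the first iteration.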