IAI experience with methods


Contents of this page:

  • Stakeholder meeting
  • Analyse context of use
  • Paper prototyping: Task analysis
  • Task scenarios
  • Evaluate usability of existing system
  • Set usability requirements
  • Paper prototyping of screens
  • Style guides
  • Evaluate usability of computer prototype
  • Test usability against requirements

Stakeholder meeting

We previously conducted a project initiation meeting involving project development management and technical staff, while user-related (operational) requirements were defined and discussed separately by a specialised pilots' group. Conducting a stakeholder meeting allowed us to identify previously unforeseen users and stakeholders, better understand the project scope and objectives, define the success factors, and surface differing interpretations for follow-up discussion and resolution. The involvement of senior managers and marketing personnel contributed to the identification of some strategic issues. How to do it.

Analyse context of use

We had never used this method before. The facilitator guided us through a long checklist covering many aspects of the users' skills, tasks and the MPC working environment. Many terms were unfamiliar to us and required explanation. Most of the data captured was not new to the participants, given their good familiarity with the users' environment. Some valuable information was captured, although some parts were not relevant to the MPC. We concluded that, to be efficient, the checklist should be tailored to the system under development and written in less specialised terms. In addition, an experienced facilitator is very important to the success of this method. How to do it.

Paper prototyping: Task analysis

This method was also new to us. We realised during the planning stage that it needed significant tailoring to our needs, which we did. We wrote down on sticky notes every user function anyone could think of. The sticky notes were grouped logically, and once grouped, the hierarchy was developed. This was done dynamically during the meeting and took several iterations. The functional hierarchy changed significantly and was agreed upon, and as a consequence the system architecture was modified accordingly. The method had a great impact on the MPC software look and feel, as well as on its software requirements and architecture.
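The product of the exercise is essentially a tree of user functions. As a rough illustration only (the function names below are hypothetical, not the actual MPC hierarchy), such an agreed hierarchy can be recorded as nested groups and printed for review:

    # Hypothetical functional hierarchy, as might emerge from grouping
    # sticky notes; the names are illustrative, not the MPC functions.
    HIERARCHY = {
        "Mission preparation": {
            "Define route": {},
            "Select weapons": {},
        },
        "Mission review": {
            "Replay mission": {},
            "Print mission summary": {},
        },
    }

    def print_tree(node, indent=0):
        # Print the hierarchy as an indented outline.
        for name, children in node.items():
            print("  " * indent + name)
            print_tree(children, indent + 1)

    print_tree(HIERARCHY)

How to do it.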

Task scenarios

This method's contribution to the MPC system was low for the following reasons:

  • The few operational scenarios required for the MPC are obvious to pilots.
  • Given the detailed documentation produced by the task analysis, documenting scenarios did not seem to add value.
It was concluded that this technique was not particularly relevant to the MPC. Another trial will be conducted on an avionics project to evaluate the technique's relevance to LAHAV. How to do it.

Evaluate usability of existing system

Four users evaluated the existing system. Each user was given short (15-minute) training on the system, was then given a mission to prepare, and commented as he went along. The facilitators captured the comments, generating a detailed list of about fifty problems. The pilots defining the new system reviewed the problems to find ways to avoid them in the new design. The users filled in SUMI questionnaires after the evaluation. The technique was very productive, though it was applied in a semi-formal way. A more formal trial (more training, better instructions, more users) is being considered. How to do it.


Set usability requirements

Goals for task time were agreed, and a list of potential user errors was identified. We recognise the need for the technique and its potential, but more work is needed to define it better.
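One way to make such goals testable is to record each of them as a measurable criterion. A minimal sketch, with hypothetical task names, time limits and error descriptions rather than the actual MPC figures:

    # Hypothetical usability requirements; all names and numbers
    # are illustrative, not the agreed MPC goals.
    TASK_TIME_GOALS = {  # maximum minutes allowed per task
        "plan mission": 30,
        "modify mission": 15,
    }
    POTENTIAL_USER_ERRORS = [  # errors the design should prevent
        "waypoint entered in the wrong coordinate system",
        "mission saved without a fuel check",
    ]

    def meets_goal(task, measured_minutes):
        # True if a measured task time is within the agreed goal.
        return measured_minutes <= TASK_TIME_GOALS[task]

    print(meets_goal("plan mission", 27))  # True

How to do it.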


Paper prototyping of screens

We had not used this method before and had doubts about its value, mainly because it is now very easy to create computerised UI prototypes. The doubts proved unfounded: the potential users and developers liked the method and its contribution to MPC usability. Mock-ups of the screens were posted on the wall and provided the "big picture", although they were too small to show the detail. Each screen was then displayed using an overhead projector, resulting in very fruitful and productive discussions among the potential users. A detailed list of 23 usability comments was created. How to do it.


Style guides

Off-the-shelf style guides were provided to the developers. These style guides turned out to be very detailed and difficult to use. Given intuitive visual development tools, developers prefer to learn by clicking and seeing rather than by reading lengthy manuals.

We recognise the need for a style guide, but currently do not have a good one. A good style guide, in our view, should:

  • Be at a level of detail appropriate to the developers
  • Not be overly restrictive, leaving some space for creativity

It is still an open issue at LAHAV.

How to do it.


Evaluate usability of computer prototype

The system was only partially developed, but the UI was complete and the main modules were working. General training was held for the users at the beginning, resulting in some comments that were captured by the facilitators.

Each user received instructions regarding the mission he had to plan and worked without assistance. The users spoke freely during the evaluation, and the facilitators documented all comments. The software developers were present and observed the evaluation. In general the developers were very receptive and co-operative, although towards the end of the evaluation they seemed to lose patience.

A summary meeting was held at the end of the evaluation. Comments were listed and prioritised, and it was agreed to fix 93 of the 97 problems. The problems were points of detail rather than major issues, showing that the earlier design was sound.

How to do it.


Test usability against requirements

The major parts of the MPC were complete. The system was tested against the timing requirements defined for two typical tasks.

Eight pilots, including fighter pilots, helicopter pilots and navigators, participated in this technique.

First, classroom familiarisation training on the MPC was held for all pilots (two hours), followed by individual hands-on practice for another two hours.

Each pilot received written instructions regarding the mission he had to plan and modify, and worked without assistance. He could also write down comments on a collection of printed screens. The facilitators and developers observed the work on a repeater display and documented their observations. The time taken to complete each task was recorded.

After completing both tasks, each pilot finalised his comments on the printed screens, filled in the SUMI questionnaire, and explained his comments and impressions to the facilitators. All the pilots were happy with the MPC; this was confirmed by the SUMI results, which were well above the industry average.

The overall duration of the tasks was within the requirements.
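Verifying this amounts to comparing each pilot's recorded time with the corresponding requirement. A minimal sketch of such a check (the names and timings below are hypothetical, not the measured MPC data):

    # Hypothetical recorded task times in minutes; illustrative only.
    LIMITS = {"plan mission": 30, "modify mission": 15}
    TIMES = {
        "pilot 1": {"plan mission": 27, "modify mission": 12},
        "pilot 2": {"plan mission": 29, "modify mission": 14},
    }

    for pilot, tasks in TIMES.items():
        for task, minutes in tasks.items():
            verdict = "within" if minutes <= LIMITS[task] else "over"
            print(f"{pilot}: {task} {minutes} min, "
                  f"limit {LIMITS[task]} min ({verdict})")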

A summary meeting was held at the end of the evaluation. Comments were listed and prioritised. It was agreed to incorporate 39 of the 54 comments; seven comments were not accepted and another eight were left undecided. The problems were points of detail rather than major issues, showing that the earlier design was sound.

Following the summary meeting, the pilots filled in the technique evaluation form.

How to do it.




Copyright © 2002 Serco Ltd. Reproduction permitted provided the source is acknowledged.