User interface evaluation

Section 5.4 discussed the potential for a system's user interface to contribute to a hazardous environment and therefore to clinical risk. For products reliant on a user interface for their operation it is therefore prudent to undertake an evaluation to determine whether the user interface successfully mitigates human factors risk. [Pg.250]

The extent to which usability can impact safety has long been established, and the design and evaluation techniques involved are well documented in the literature [1, 2]. For medical devices there is usually an explicit regulatory requirement to carry out an evaluation, which often translates into complying with IEC 62366:2007 [3] or specific FDA requirements [4]. These approaches may be overkill for HIT which falls short of regulation. Nevertheless, a great deal can be learnt from these requirements, and their methods can be applied to non-medical device HIT products to an extent commensurate with the risk. [Pg.250]

Usability evaluation can essentially be seen as happening in two phases: a formative evaluation carried out during design and a summative evaluation of the finished system. [Pg.250]

For the safety case, evidencing both evaluations is particularly convincing, especially when they are carried out by the manufacturer. The formative evaluation helps to shape the requirements, take account of control options, iteratively refine the solution and build confidence in the design strategy. The summative evaluation allows specific user interface controls to be validated both in terms of their presence and their effectiveness. [Pg.250]

A full evaluation of the user interface will look at a number of characteristics including: [Pg.250]


There are many formal methods for user interface evaluation in the literature. A selection of some of the more common approaches which are suitable for HIT is set out below. [Pg.251]

Jeffries, R., Miller, J. R., Wharton, C., and Uyeda, K., 1991. User interface evaluation in the real world: A comparison of four techniques. In: Conference on Human Factors in Computing Systems: Reaching Through Technology (CHI '91), pp. 119-124. [Pg.380]

The user interface is realised by two windows on the monitor (besides the window with the image under evaluation) ... [Pg.564]

The control of the airborne sound location system, the coupling monitor and the real-time evaluation of all signals, including the echo indications from the ultrasonic instrument, are carried out on two additional boards in the PC. The graphical user interface (under Windows 95), including online help, enables easy operation of the system. The evaluation program links all echo indications in real time with the probe position and displays them in a graphic repre-... [Pg.775]

When selecting a performance testing system, issues that could affect the practicality, accuracy, and general utility of the system include: the specific performance tests included in the system; the availability of norms against which performance can be evaluated (and upon which decisions regarding readiness to perform are based); the reliability and validity of the measures; the accuracy of the measurement; the user interface; and the administrative interface. The relevance of each of these issues is discussed, as are some of the specific questions that merit consideration when evaluating performance testing systems. [Pg.101]
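To make the point about norms concrete, deciding readiness to perform typically means comparing an individual's test score with a normative mean and standard deviation. The following is a minimal sketch and is not taken from any particular testing system; the normative values, the 1.5-standard-deviation cut-off, and the function name readiness_check are all illustrative assumptions.

```python
# Sketch: flag a performance test result against normative data using a z-score.
# All numbers and the cut-off are invented for illustration (higher score = better).
def readiness_check(score, norm_mean, norm_sd, cutoff_z=-1.5):
    """Return the z-score and whether the score is within the acceptable range."""
    z = (score - norm_mean) / norm_sd
    return z, z >= cutoff_z

z, ready = readiness_check(score=82.0, norm_mean=100.0, norm_sd=15.0)
print(f"z = {z:.2f}, ready = {ready}")  # z = -1.20, ready = True
```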

The visual user interface is clustered into four main segments (cf. Fig. 35): general setup, value chain model, external parameters and evaluation. The general setup items and a subset of the external parameters (e.g., transportation costs, exchange rates) can be used across multiple value... [Pg.165]

As a result, the user interface of KomPaKt was designed, implemented, and evaluated. The evaluations have shown that KomPaKt is very well suited for sporadic usage in the context of a design process. All test persons, experienced as well as inexperienced computer users, could solve all tasks for initiating and performing a communication without problems. More details on these results are described in Sect. 5.2. [Pg.273]

Consequently, the software tools developed as part of IMPROVE were examined both analytically and empirically. Depending on the state of development of each tool, one or more evaluation techniques were chosen, and, by means of concrete examples of designing user interfaces, suggestions for a better configuration were derived. In the following, we discuss the analysis and evaluation of the design support system EVA, of the flowsheet editor FBW (see also Subsect. 3.1.3), of the administration system AHEAD (see also Subsect. 3.4.2), and finally of the communication platform KomPaKt (see also Subsect. 3.3.2). [Pg.537]

This interface is a mock-up with which all relevant activities can be performed. However, the mock-up cannot be connected to and used with the fully functional AHEAD system. As can be seen from Fig. 5.52, some menu items were eliminated by the software-ergonomic review and redesign. The scenario developed for evaluation and improvement of the user interface aimed at the creation of a new document. Divided into subtasks for the work analysis, this goal is achieved by successively going through nine subtasks (6.1 to 6.9), which are represented in Fig. 5.52. The decision whether the creation is started by menu or by button represents the initial activity (6.1.1 or 6.1.2). After that, one task must be chosen by entering the task number or clicking on the task. This step... [Pg.549]

Following the analysis of the user interface, the results were statistically evaluated. In order to compare two sample averages, t-tests for dependent samples were used. Figure 5.53 displays, as an example, the box plots for the time consumed to solve states and tasks on the left side (t = -11.485, p < 0.01). On the right side the box plots for the number of solved states and solved tasks are presented (t = 8.333, p < 0.01). [Pg.551]
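The comparison described above is a dependent-samples (paired) t-test, used because each participant worked with both user interface versions. A minimal sketch of such a test in Python follows, assuming SciPy is available; the timing values are invented and are not the data behind Fig. 5.53.

```python
# Paired (dependent-samples) t-test on task completion times for the same
# participants using two user interface versions. Values are illustrative only.
from scipy import stats

time_original_ui = [412, 388, 455, 430, 401, 467, 420, 395]     # seconds
time_alternative_ui = [298, 310, 325, 301, 289, 340, 295, 305]  # seconds

t_statistic, p_value = stats.ttest_rel(time_original_ui, time_alternative_ui)
print(f"t = {t_statistic:.3f}, p = {p_value:.4f}")
# A p-value below 0.01 indicates a significant difference between the interfaces.
```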

Overall, the analysis and evaluation showed that the performance of the participants using the alternative user interface differed significantly from their performance using the original user interface. Moreover, it was shown experimentally that the results are independent of the order in which the two versions of the software were presented. Consequently, the order of presentation has no influence on participant performance. [Pg.551]

We are not aiming at the development of a tool comparable to a commercial one with respect to properties like robustness, performance, etc. Rather, the prototype serves to evaluate the applicability of the developed concepts by our industrial partner. Nevertheless, the prototype has to be stable enough that it can be used for evaluation. This also covers the appropriateness of the user interface. [Pg.739]

As far as software is concerned, the interactive nature of the solution process naturally sets its own requirements (Hakanen, 2006). First of all, a good graphical user interface (GUI) is needed in order to enable the interaction between the DM and the method. In addition, visualizations of the solutions obtained must be available for the DM to compare and evaluate the solutions generated. With interactive methods, more than three objective functions can easily be considered, which places more demands on the visualization compared to, e.g., visualizing the Pareto optimal set of a bi-objective problem. [Pg.168]
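To illustrate the bi-objective case mentioned above, the Pareto optimal (non-dominated) subset of a set of candidate solutions can be extracted and shown in a simple scatter plot; with four or more objectives no such direct plot exists, which is what drives the extra visualization requirements. The sketch below uses randomly generated objective vectors and assumes NumPy and Matplotlib are available; it is not tied to any particular interactive method.

```python
# Sketch: extract and plot the non-dominated (Pareto optimal) points of a
# bi-objective minimization problem. Candidate objective vectors are random.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
candidates = rng.uniform(0.0, 1.0, size=(200, 2))  # columns: f1, f2 (both minimized)

def is_dominated(point, others):
    """True if some other point is no worse in both objectives and strictly
    better in at least one (minimization)."""
    return bool(np.any(np.all(others <= point, axis=1) & np.any(others < point, axis=1)))

pareto_mask = np.array([
    not is_dominated(p, np.delete(candidates, i, axis=0))
    for i, p in enumerate(candidates)
])

plt.scatter(candidates[:, 0], candidates[:, 1], alpha=0.4, label="dominated")
plt.scatter(candidates[pareto_mask, 0], candidates[pareto_mask, 1],
            color="red", label="Pareto optimal")
plt.xlabel("objective f1")
plt.ylabel("objective f2")
plt.legend()
plt.show()
```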

The third experiment evaluated Prompt and the alternative user interface CogZ. The experiment focused on evaluating the cognitive support provided by the tools in terms of their effectiveness, efficiency, and satisfaction [Falconer 2009]. Researchers assigned eighteen matching and comprehension tasks to participants, which they had to perform using each tool (nine per tool). The evaluators then measured the time that it took a participant to complete each task and the accuracy with which they performed it. They measured participant satisfaction via exit interviews and the System Usability Scale [Brooke 1996]. [Pg.45]
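The System Usability Scale mentioned above has a fixed, published scoring rule: each of the ten items is answered on a 1-5 scale, odd (positively worded) items contribute their response minus 1, even (negatively worded) items contribute 5 minus their response, and the summed contributions are multiplied by 2.5 to give a 0-100 score. A small sketch of that calculation follows; the example responses are invented.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten responses on a 1-5 scale.

    Odd-numbered items (positively worded) contribute (response - 1); even-numbered
    items (negatively worded) contribute (5 - response); the sum is scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = 0
    for item_number, response in enumerate(responses, start=1):
        if item_number % 2 == 1:      # items 1, 3, 5, 7, 9
            total += response - 1
        else:                         # items 2, 4, 6, 8, 10
            total += 5 - response
    return total * 2.5

# Example with invented responses from one participant's exit questionnaire.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```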

