
User input verification

LIMS usually work tightly with SOPs or a method management system. These systems provide the documented rules for exception management in sample data analysis and batch evaluation. The concept of user input verification may replace the rule-based calculation functionality in existing LIMS. The required information logistics to solve this task are based on the following assumptions ... [Pg.354]

An interpreted programming language with many similarities to Java. It is most often used in the context of client-side scripts embedded into HTML pages (for animation, user input verification, and similar tasks) or as a scripting language for VRML worlds. See Internet. [Pg.1456]
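As a minimal sketch of the client-side use mentioned above, the following TypeScript fragment verifies a numeric form entry before submission; the field name, range limits, and messages are hypothetical and not taken from the source.

```typescript
// Minimal sketch of client-side user input verification (hypothetical field and rules).
interface ValidationResult {
  valid: boolean;
  message?: string;
}

// Verify that a sample weight entry is numeric and within an allowed range.
function verifySampleWeight(raw: string): ValidationResult {
  const value = Number(raw.trim());
  if (Number.isNaN(value)) {
    return { valid: false, message: "Weight must be a number." };
  }
  if (value <= 0 || value > 1000) {
    return { valid: false, message: "Weight must be between 0 and 1000 mg." };
  }
  return { valid: true };
}

// Attach the check to a (hypothetical) form field so invalid entries block submission.
const input = document.querySelector<HTMLInputElement>("#sample-weight");
input?.addEventListener("change", () => {
  const result = verifySampleWeight(input.value);
  input.setCustomValidity(result.valid ? "" : result.message ?? "Invalid input");
});
```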

Verification of user input in a data management system is a typical request from departments with regulated workflows. The basic requirements for such systems are as follows ... [Pg.350]

Double-checking: system verification of critical user input (NSR; A11.9); specification, design, data load, OQ ... [Pg.125]

In SkyParams.xlsx the user defines the sky map to be created if no science input map is used. The first sheet in the file is the one that the simulator reads, so different configurations can be stored in the same file. The first option is the Full Width at Half Maximum (FWHM): if empty, a point source is created; if filled, a Gaussian source with this FWHM is simulated. The user may input the source position in the sky at x_pos and y_pos; if empty, the simulator will display a sky grid where the user selects the position of the source by clicking on it. Temp is the temperature in kelvin of a blackbody spectrum; cut-on and cut-off filter the blackbody spectrum at the wavenumbers given by these parameters. For verification purposes, if the user inputs 1 the spectrum for that given source will be the one measured with the Cardiff-UCL FIRI testbed. If the user inputs 0 the spectrum will consist of a single wavenumber tone, defined at tone_wl. [Pg.76]
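The parameter logic described above can be summarized in a short TypeScript sketch; the field names (fwhm, xPos, yPos, temp, cutOn, cutOff, measuredSpectrum, toneWl) are hypothetical stand-ins for the spreadsheet columns and are used for illustration only.

```typescript
// Sketch of the source/spectrum selection logic described for SkyParams.xlsx
// (field names are hypothetical; values would come from the first sheet of the file).
interface SkyParams {
  fwhm?: number;           // empty -> point source; set -> Gaussian source with this FWHM
  xPos?: number;           // source position; empty -> user picks position interactively
  yPos?: number;
  temp: number;            // blackbody temperature in kelvin
  cutOn: number;           // low wavenumber cut of the blackbody spectrum
  cutOff: number;          // high wavenumber cut
  measuredSpectrum: 0 | 1; // 1 -> testbed-measured spectrum, 0 -> single tone
  toneWl: number;          // wavenumber of the single tone
}

function describeSource(p: SkyParams): string {
  const shape = p.fwhm === undefined ? "point source" : `Gaussian source (FWHM ${p.fwhm})`;
  const position =
    p.xPos === undefined || p.yPos === undefined
      ? "position selected interactively on a sky grid"
      : `position (${p.xPos}, ${p.yPos})`;
  const spectrum =
    p.measuredSpectrum === 1
      ? "measured testbed spectrum"
      : `single tone at wavenumber ${p.toneWl}`;
  return `${shape}, ${position}, blackbody at ${p.temp} K filtered to ` +
         `[${p.cutOn}, ${p.cutOff}], spectrum: ${spectrum}`;
}

console.log(describeSource({ temp: 10, cutOn: 5, cutOff: 50, measuredSpectrum: 0, toneWl: 20 }));
```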

IV.74. For the purposes of Appendix IV, the following terms, as defined in Safety Series No. 113 [IV.3], apply: applicant, assessment, audit, controlled document, corrective action, design input, design output, examination, inspection, item, maintenance/servicing, measuring and test equipment, non-conformance, objective evidence, procedure, procurement document, qualification, quality, quality elements, quality assurance programme, quality plan, repair, services, specification, supplier, traceability, user, and verification. [Pg.312]

Do the design controls ensure that design reviews and design verification and validation are recorded and demonstrate that the product meets the design input and user requirements ... [Pg.81]

The process of field validation and testing of models was presented at the Pellston conference as a systematic analysis of errors (6). In any model calibration, verification, or validation effort, the model user is continually faced with the need to analyze and explain differences (i.e., errors, in this discussion) between observed data and model predictions. This requires assessments of the accuracy and validity of observed model input data, parameter values, system representation, and observed output data. Figure 2 schematically compares the model and the natural system with regard to inputs, outputs, and sources of error. Clearly there are possible errors associated with each of the categories noted above, i.e., input, parameters, system representation, and output. Differences in each of these categories can have dramatic impacts on the conclusions of the model validation process. [Pg.157]
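A minimal TypeScript sketch of this kind of error analysis: it compares an observed series with model predictions and reports bias and root-mean-square error. The metric choices and example values are illustrative, not from the source.

```typescript
// Sketch of comparing observed data with model predictions and summarizing the errors.
function errorStatistics(observed: number[], predicted: number[]) {
  if (observed.length !== predicted.length || observed.length === 0) {
    throw new Error("Observed and predicted series must have the same non-zero length.");
  }
  const residuals = observed.map((obs, i) => predicted[i] - obs);
  const n = residuals.length;
  const bias = residuals.reduce((s, r) => s + r, 0) / n;                // mean error
  const rmse = Math.sqrt(residuals.reduce((s, r) => s + r * r, 0) / n); // root-mean-square error
  return { residuals, bias, rmse };
}

// Example: compare an observed concentration series with model output.
const stats = errorStatistics([1.2, 0.9, 1.5], [1.0, 1.1, 1.4]);
console.log(stats.bias.toFixed(3), stats.rmse.toFixed(3));
```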

The following considerations, when applied during method development, are likely to produce more robust, reliable, and transferable methods: (a) the concerns of the customer (user) are considered in advance, (b) key process input variables are identified, (c) critical-to-quality factors are determined, (d) several method verification tests are installed, (e) proactive evaluation of method performance during development is performed, (f) continuous customer involvement and focus are institutionalized, and (g) method capability assessment (suitability to be applied for release testing against specification limits) is established. [Pg.3]

A successor to PESTANS has recently been developed which allows the user to vary the transformation rate with depth, i.e., it can describe nonhomogeneous (layered) systems (39, 111). This successor actually consists of two models, one for transient water flow and one for solute transport. Consequently, much more input data and CPU time are required to run this two-dimensional (vertical section), numerical solution. The model assumes Langmuir or Freundlich sorption and first-order kinetics referenced to liquid and/or solid phases, and has been evaluated with data from an aldicarb-contaminated site in Long Island. Additional verification is in progress. Because of its complexity, it would be more appropriate to use this model at a higher level, rather than a screening level, of hazard assessment. [Pg.309]

Archive and Retrieval of Documents and Records; User Results Input and Displays; Analytical Report Generation; Audit Trail Verification ... [Pg.533]

A verification system can automatically create user dialogs or forms for input of fielded data on the basis of a human-readable textual specification, preferably in XML. This specification contains the following information ... [Pg.350]

The specification is preferably documented in XML format and can be customized by the user or administrator of the system. The verification module reads the specification and creates the corresponding input user interface dynamically rather than using any hard-coded user dialogue. A dynamic user dialogue is created with a set of... [Pg.350]
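A rough sketch of how such a module might build an input dialogue dynamically from a specification, assuming a hypothetical XML schema and field attributes (the source does not define them); written in TypeScript for a browser context.

```typescript
// Sketch of building an input dialogue dynamically from a field specification
// (the XML schema and field attributes shown here are hypothetical).
const specXml = `
  <verificationSpec>
    <field name="sampleId" label="Sample ID"   type="text"   required="true"/>
    <field name="weightMg" label="Weight (mg)" type="number" required="true"/>
    <field name="comment"  label="Comment"     type="text"   required="false"/>
  </verificationSpec>`;

function buildForm(xml: string): HTMLFormElement {
  const doc = new DOMParser().parseFromString(xml, "application/xml");
  const form = document.createElement("form");
  for (const field of Array.from(doc.querySelectorAll("field"))) {
    const label = document.createElement("label");
    label.textContent = field.getAttribute("label") ?? "";
    const input = document.createElement("input");
    input.name = field.getAttribute("name") ?? "";
    input.type = field.getAttribute("type") === "number" ? "number" : "text";
    input.required = field.getAttribute("required") === "true";
    label.appendChild(input);
    form.appendChild(label);
  }
  return form;
}

// The dialogue is generated at run time from the specification, not hard-coded.
document.body.appendChild(buildForm(specXml));
```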

Software quality assurance for the MACCS2 code was provided through a limited-distribution beta test with DOE users and through a formal, independent verification study of the code package by the University of New Mexico, which included detailed hand calculations. An independent analyst, familiar with the MACCS2 code and its applications, performed validation of the standard input (Liscum-Powell 1997). [Pg.170]

The standardization provided by the SCALE system arises from the use of well-established functional modules and data libraries in standard analytical sequences. The unified input, specified in terms of easily visualized engineering parameters, reduces the occurrence of input errors and lends itself to easy verification. Through the selection of standard analytical sequences, the SCALE system user can perform sophisticated state-of-the-art analysis in a commonly understood, well-documented manner. [Pg.584]

Safety-related controllers, in conjunction with safety or fail-safe I/O modules, are used for critical and hazardous applications where an incident can result in danger to persons and/or damage to the plant and environment. These safety-related controllers can work with a safety-related distributed I/O system (possibly with internal verification of input or output via safety switches as described in Clause 5.0.1, the safe PLC approach), or directly with fail-safe transmitters connected via the fieldbus. These controllers are expected to detect faults both in the process and internally (self-diagnosis), and they must automatically set the plant to a safe state in the event of a fault. These controllers need to work in a multitasking environment, possibly in a mix of standard BPCS and safety-related applications, if integrated operation is permitted by the end-user. The programs of the BPCS and SIS must be functionally separate, so that faults in BPCS applications have no effect on safety-related applications and vice versa. Special tasks with very short response times can also be implemented [14]. For safety applications, controllers and I/O modules need to be individually certified by a third party and to comply with SIL 2/SIL 3 (as the case may be; SIL 4 only for nuclear applications) as per IEC 61508. For safety-related applications a few restrictions are followed, such as ... [Pg.675]

The most prominent distinction between simulation-based verification and formal verification is that the former requires input vectors and the latter does not. The approach in simulation-based verification is first to generate input vectors and then to derive reference outputs. The thinking process is reversed in formal verification: the user states what output behavior is desirable (without considering input stimuli) and then lets the formal checker prove or disprove it. [Pg.2409]
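A toy TypeScript sketch of the simulation-based flow described above: input vectors are generated, reference outputs are derived from a golden model, and the design under test is compared against them. The adder example and names are illustrative only.

```typescript
// Sketch of the simulation-based flow: generate input vectors, derive reference
// outputs from a golden model, and compare against the design under test.
type Vec = { a: number; b: number };

const goldenModel = (v: Vec) => v.a + v.b;            // reference behaviour
const designUnderTest = (v: Vec) => (v.a + v.b) | 0;  // implementation being verified

function simulate(numVectors: number): boolean {
  for (let i = 0; i < numVectors; i++) {
    // Generate a random input vector.
    const v: Vec = {
      a: Math.floor(Math.random() * 256),
      b: Math.floor(Math.random() * 256),
    };
    // Compare the design's output with the reference output for this vector.
    if (designUnderTest(v) !== goldenModel(v)) {
      console.error("Mismatch for input vector", v);
      return false;
    }
  }
  return true; // passes only for the vectors tried, unlike a formal proof
}

console.log(simulate(1000) ? "All simulated vectors passed" : "Failure detected");
```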

FPGAs may contain large numbers of states which are defined as don't-care for certain modes of operation. Many inputs and internal variables are often defined as constants. Different synthesis tools handle the don't-care states and constants quite differently. This makes formal verification a very user-intensive process requiring manual customization of the verification tool. [Pg.212]

We choose LNT [5] as the target specification language; it derives from two standards, Lotos [1] and E-Lotos [2]. This choice is justified by the expressiveness and richness of LNT: it provides sufficiently expressive operators for data and behaviour description, and it has a user-friendly notation that simplifies writing the specification. Indeed, LNT is an input language of CADP [6] (Construction and Analysis of Distributed Processes), a popular formal verification toolbox that implements many formal methods. [Pg.147]

The primary mechanism for verifying safety is constraint solving. After the schema is defined, the platform mechanically derives verification conditions and translates them into an Event-B model that serves as an input notation for an SMT-LIB-compliant SMT solver (Yices [8]). One downside of applying constraint solving in our context is that we cannot always receive useful feedback indicating the source of a problem should an error be discovered. To compensate for this, whenever the SMT solver detects a problem, the tool runs the ProB model checker [9]. Unlike solvers, the model checker explores the state space of a discrete transition system that gives semantics to the SafeCap schemas, represented as the Event-B model with the DSL axioms defined in the machine context. It is thus able to report a sequence of steps (discrete train movements, point switching, etc.) that leads to violation of a safety condition (e.g. a collision or derailment). In most cases, such a sequence can be visually replayed by the tool platform to help the user debug the schema. [Pg.134]

