Big Chemical Encyclopedia


Safety Historical Data Modeling

As part of the plant safety framework, a common data component is proposed to manage all common data for the framework and to accumulate safety-related common data across plant activities. This component is part of the data warehousing for plant-wide activities across the plant lifecycle. An object-oriented modeling approach is used to abstract these common elements within the plant-wide conceptual model, while the physical data reside within the data warehouse frame. The safety common data (SCD) component includes (but is not limited to) possible sources of data errors, documentation standards (vocabulary), generic cause-consequence relationships for each component type, checklists for operation-type jobs, and standard safety interlock levels. The common data need to be organized and formatted in a ... [Pg.43]

Data Source | Possible Error | Corrective Actions
Operator input (manual data entry) | Bad data; key entry error | Interactive limit check with alarm
External data system (serial communication link) | Bad transmission | Limit check with alarm; message validity check; limit check with default
Application software | Program error | Rigorous testing; limit check with alarm
[Pg.44]
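
The corrective actions listed above amount to simple validation rules applied where data enters the system. A minimal sketch of a limit check with alarm and default substitution is shown below; the variable names, limits, and alarm behaviour are illustrative assumptions, not a specific vendor implementation.

```python
from dataclasses import dataclass

@dataclass
class LimitCheck:
    """Validate an incoming value against engineering limits."""
    low: float
    high: float
    default: float  # substituted when a transmitted value is out of range

    def check(self, value: float, source: str) -> float:
        if self.low <= value <= self.high:
            return value
        # Out-of-range value: raise an alarm and fall back to the default.
        print(f"ALARM: {source} value {value} outside [{self.low}, {self.high}]")
        return self.default

# Hypothetical example: a temperature received over a serial link, limits in deg C.
temp_check = LimitCheck(low=20.0, high=350.0, default=25.0)
validated = temp_check.check(912.0, source="external data system")
```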


Safety historical data has been highlighted as part of the conceptual plant safety model. Figure 3-11 shows the detailed data groups of the safety historical data component, where safety historical data is divided into two parts: plant specifications, which covers all specification changes due to design, operation, or any other process throughout the plant lifecycle; and accident/incident data, which includes plant, events, process, equipment, cause, consequence, and procedures in place. [Pg.130]
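
As a rough illustration of this grouping, the two parts of the safety historical data component could be abstracted as simple record types. The class and field names below are assumptions for illustration only, not the notation used in Figure 3-11.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PlantSpecification:
    """A specification change due to design, operation, or another lifecycle process."""
    change_id: str
    reason: str          # e.g. "design" or "operation"
    description: str

@dataclass
class AccidentIncidentRecord:
    """Accident/incident data: plant, events, process, equipment, cause, consequence, procedures."""
    plant: str
    events: List[str]
    process: str
    equipment: str
    cause: str
    consequence: str
    procedures_in_place: List[str]

@dataclass
class SafetyHistoricalData:
    plant_specifications: List[PlantSpecification] = field(default_factory=list)
    accident_incident_data: List[AccidentIncidentRecord] = field(default_factory=list)
```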

Gupta and Maranas (2003), as one example of a demand uncertainty model, present a demand and supply network planning model that minimizes costs. Production decisions are made here and now, and demand uncertainty is balanced with inventories, independently incorporating penalties for safety stock and demand violations. Uncertain demand quantity is modeled as normally distributed random variables with a mean and standard deviation. The philosophy of keeping one production plan separated from demand uncertainty can be transferred to the problem considered here: penalty costs for unsatisfied demand and normally distributed demand based on historical data... [Pg.128]
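
The sketch below illustrates the general idea of evaluating a fixed production plan against normally distributed demand with penalties for unmet demand and safety-stock shortfall. It is not the Gupta and Maranas formulation itself; all quantities and penalty values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative numbers: a fixed production plan evaluated against uncertain demand.
production_plan = 1000.0          # units produced "here and now"
demand_mean, demand_std = 950.0, 120.0
safety_stock_target = 100.0
penalty_unsatisfied = 5.0         # cost per unit of unmet demand
penalty_safety_stock = 1.0        # cost per unit of safety-stock shortfall

# Sample normally distributed demand, e.g. parameterized from historical data.
demand = rng.normal(demand_mean, demand_std, size=10_000)

inventory_left = production_plan - demand
unmet = np.maximum(-inventory_left, 0.0)                          # demand violation
stock_shortfall = np.maximum(safety_stock_target - inventory_left, 0.0)

expected_penalty = (penalty_unsatisfied * unmet + penalty_safety_stock * stock_shortfall).mean()
print(f"Expected penalty cost for this plan: {expected_penalty:.1f}")
```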

Most companies, especially those at Level 3 or below in the safety maturity model, tend to focus their safety programs strictly on lagging safety performance indicators, which are based on historical data. They do, and should, track accidents, incidents, safety problems, and corrective actions, and make sure that these are implemented appropriately or that the corrective actions or controls are still valid. This is all very important and must be done, not dropped. However, it is not enough. [Pg.33]

In addition to the integration into a system safety performance model, it was concluded that grouping all similar consequences together into a so-called virtual consequence would be very useful if the virtual consequences correspond to accident categories collated by industry or regulatory bodies, since these can be used to calibrate the model against the historical data. [Pg.78]
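
A minimal sketch of this calibration idea follows: modeled consequence frequencies are aggregated into virtual consequences matching reported accident categories, and a per-category calibration factor is derived against historical rates. The consequence names, category mapping, and numbers are invented for illustration.

```python
# Per-year consequence frequencies from a safety performance model (illustrative).
modeled_frequency = {
    "small fire": 0.020, "large fire": 0.004,
    "minor release": 0.050, "major release": 0.006,
}
# Mapping of modeled consequences to categories reported by industry/regulators.
category_of = {
    "small fire": "fire", "large fire": "fire",
    "minor release": "loss of containment", "major release": "loss of containment",
}
historical_rate = {"fire": 0.030, "loss of containment": 0.045}   # industry statistics

# Group similar consequences into virtual consequences per category.
virtual = {}
for consequence, freq in modeled_frequency.items():
    category = category_of[consequence]
    virtual[category] = virtual.get(category, 0.0) + freq

# Compare each virtual consequence with the historical category rate.
for category, modeled in virtual.items():
    factor = historical_rate[category] / modeled
    print(f"{category}: modeled {modeled:.3f}/yr, historical "
          f"{historical_rate[category]:.3f}/yr, calibration factor {factor:.2f}")
```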

Eventually, two were selected for further analysis as parameters of scale for modelling the RFR: GDPPC and HDI. GDPPC, however, helps with forecasts (irrespective of time) of the human development index (HDI) and the motorisation rate (MRMV). It can be used to simulate and analyse motorisation rates and road safety at different stages of a country's social and economic development (irrespective of time). The downside of GDPPC is that it reduces the scale as its numerical value decreases (this applies mainly to historical data). [Pg.103]

The proposed method has some limitations, which leave room for future improvement. The main limitation relates to the estimation of the severity factor (D). For the sake of simplicity, a single-point severity value has been used for all the at-risk behaviours; a more robust estimation should consider different severities for different causes. Again, this estimation could be done by combining historical data, when available, with expert judgments, as sketched below. A further limitation concerns the effectiveness of the model: it should be tested by means of a long-term pilot implementation to check for a positive correlation between safety performance trends and the implementation maturity of the proposed method. That is, after the method is implemented, a test should verify that the number of non-conforming behaviours, as well as the related risk level, decreases over time. [Pg.1317]
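
The following sketch shows one way a cause-specific severity factor could blend historical incident data with expert judgment, as suggested above. The weighting scheme, function name, and severity values are illustrative assumptions, not part of the original method.

```python
from typing import Optional

def blended_severity(historical: Optional[float], expert: float, weight_hist: float = 0.7) -> float:
    """Return a severity factor D for one cause of an at-risk behaviour.

    historical: mean severity observed in incident records (None if unavailable)
    expert:     severity elicited from expert judgment (same scale)
    """
    if historical is None:
        return expert                                  # fall back to expert judgment alone
    return weight_hist * historical + (1.0 - weight_hist) * expert

# One at-risk behaviour, with different severities for different causes:
causes = {
    "missing PPE":        blended_severity(historical=2.1, expert=3.0),
    "bypassed interlock": blended_severity(historical=None, expert=4.5),
}
print(causes)
```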

Multivariable controls (MVCs) are particularly well suited for controlling highly interactive fractionators where several control loops need to be simultaneously decoupled. MVCs can simultaneously consider all the process lags, and apply safety constraints and economic optimization factors in determining the required manipulations to the process. The technique of multivariable control requires the development of dynamic models based on fractionator testing and data collection. Multivariable control applies the dynamic models and historical information to predict future fractionator characteristics. For towers that are subject to many constraints, towers that have severe interactions, and towers with complex configurations, multivariable control can be a valuable tool. [Pg.253]
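
As a rough illustration of how an identified dynamic model can be used to predict future behaviour and respect a constraint, the sketch below propagates a single input move through a step-response model and scales the move back if the predicted output violates a limit. The coefficients, limits, and horizon are invented for illustration; a real MVC handles many interacting inputs and outputs simultaneously.

```python
import numpy as np

# Unit-step response of one output to one input, identified from plant testing (illustrative).
step_response = np.array([0.0, 0.2, 0.5, 0.8, 0.95, 1.0])
current_output = 78.0          # e.g. a tray temperature, deg C
high_limit = 82.0              # safety/quality constraint on the output

def predict(output_now: float, move: float) -> np.ndarray:
    """Predicted output trajectory for a single input move of size `move`."""
    return output_now + move * step_response

planned_move = 6.0             # requested change in the manipulated variable
trajectory = predict(current_output, planned_move)

if trajectory.max() > high_limit:
    # Scale the move back so the predicted trajectory stays within the constraint.
    planned_move *= (high_limit - current_output) / (trajectory.max() - current_output)

print(f"Constrained move: {planned_move:.2f}, "
      f"peak prediction: {predict(current_output, planned_move).max():.2f}")
```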

In modern chemical plants, thousands of measurements are recorded at frequencies that can exceed 1 Hz. Aside from plant operating conditions such as pressures, temperatures, flow rates, and stream compositions, other recorded variables include product purity, contamination levels (air, water, soil), and even safety compliance. All this information is stored in enormous databases. This historical record may be interrogated to monitor process performance and control, for troubleshooting, to demonstrate environmental compliance, and for modeling. Often, smoothing techniques are required to help identify trends in the data that may be masked by low signal-to-noise ratios. [Pg.74]
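
A minimal sketch of such a smoothing technique is shown below: first-order exponential smoothing applied to a noisy historical signal so that a slow drift is not masked by measurement noise. The data are synthetic and the filter constant is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(2_000)                         # e.g. samples recorded at 1 Hz
signal = 100.0 + 0.002 * t                   # slow drift in a process variable
noisy = signal + rng.normal(0.0, 2.0, t.size)

def exponential_smoothing(x: np.ndarray, alpha: float = 0.02) -> np.ndarray:
    """First-order exponential filter: smaller alpha gives heavier smoothing."""
    out = np.empty_like(x)
    out[0] = x[0]
    for i in range(1, x.size):
        out[i] = alpha * x[i] + (1.0 - alpha) * out[i - 1]
    return out

smoothed = exponential_smoothing(noisy)
print(f"Raw last value: {noisy[-1]:.1f}, smoothed last value: {smoothed[-1]:.1f}")
```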

The information that was used for this simulation is generic data; that is, the information about the process unit is limited to an instrument count, and the instruments were modeled with generic equipment models. This is the coarsest application of the Platypus software, which we term the generic analysis. A detailed analysis is also possible: in that case, the total length of pipework, the number of connections to a vessel, and the SIL level of technical safety barriers may be modeled to represent the unit as closely as possible. Historic failure data of the plant can also be incorporated. This is possible because Platypus allows for the adaptation of the generic models at each level of detail of the model. This makes it very flexible, but the workload associated with modeling an entire process plant (which may literally consist of thousands of pieces of equipment) may not be very practical. In addition, it may be hard to find relevant data. [Pg.1367]







