Crash modelling

Figure 16 Opel Omega B Crash Model, 76,000 elements. (From Kohlhoff et al. 1994. Reprinted by permission of VDI Verlag GmbH.)
Helmer, T., Samaha, R. R., Scullion, P., Ebner, A., and Kates, R. Injury risk to specific body regions of pedestrians in frontal car crashes modeled by empirical, in-depth accident data. In Proceedings of the 54th Stapp Car Crash Conference (2010)... [Pg.201]

The ASA (now ANSI) performance code for Safety Glazing Materials was revised in 1966 to incorporate these improvements in windshield construction. The addition of test no. 26, requiring support of a 2.3-kg ball dropped from 3.7 m, defined this level of improvement. It was based on a correlation established between 10-kg, instrumented, head-form impacts on windshields, on 0.6 × 0.9-m flat laminates, and the standard 0.3 × 0.3-m laminate with the 2.3-kg ball (28). Crash cases involving the two windshield interlayer types were matched for car impact speeds and were compared (29). The improved design produced fewer, less extensive, and less severe facial lacerations than those produced in the pre-1966 models. [Pg.527]

The accuracy of absolute risk results depends on (1) whether all the significant contributors to risk have been analyzed, (2) the realism of the mathematical models used to predict failure characteristics and accident phenomena, and (3) the statistical uncertainty associated with the various input data. The achievable accuracy of absolute risk results depends strongly on the type of hazard being analyzed. In studies where the dominant risk contributors can be calibrated with ample historical data (e.g., the risk of an engine failure causing an airplane crash), the uncertainty can be reduced to a few percent. However, many authors of published studies and other expert practitioners have recognized that the uncertainties can amount to one or two orders of magnitude, or more, in studies whose major contributors are rare, catastrophic events. [Pg.47]
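
To make the scale of that last point concrete, here is a minimal sketch (Python, using invented, purely illustrative failure-rate distributions rather than data from any cited study) of how wide input uncertainties propagate through a simple rare-event cut set into an order-of-magnitude-plus spread in the computed risk:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical cut set: the rare outcome requires an initiator I plus
# failures of A and B.  Each sparse-data input is log-normally distributed.
lam_I = rng.lognormal(mean=np.log(1e-2), sigma=1.0, size=n)  # initiator, per year
p_A   = rng.lognormal(mean=np.log(1e-3), sigma=1.5, size=n)  # failure prob. of A
p_B   = rng.lognormal(mean=np.log(1e-2), sigma=1.5, size=n)  # failure prob. of B

risk = lam_I * p_A * p_B   # frequency of the rare, catastrophic outcome

lo, med, hi = np.percentile(risk, [5, 50, 95])
print(f"5th/50th/95th percentiles: {lo:.2e} / {med:.2e} / {hi:.2e} per year")
print(f"90% interval spans {np.log10(hi / lo):.1f} orders of magnitude")
```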

Statistical and algebraic methods, too, can be classed as either rugged or not: they are rugged when algorithms are chosen that, on repetition of the experiment, do not get derailed by the random analytical error inherent in every measurement, that is, when similar coefficients are found for the mathematical model and equivalent conclusions are drawn. Obviously, the choice of the fitted model plays a pivotal role. If a model is to be fitted by means of an iterative algorithm, the initial guess for the coefficients should not be too critical. In a simple calculation a combination of numbers and truncation errors might lead to a division by zero and crash the computer. If the data evaluation scheme is such that errors of this type could occur, the validation plan must make provisions to test this aspect. [Pg.146]
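
As a minimal sketch of the kind of provision such a validation plan might test (Python; the data, tolerance, and function name are hypothetical and not taken from the cited work), the fit below guards explicitly against the degenerate case in which the normal-equation denominator vanishes, so the scheme fails with a clear message instead of a division by zero:

```python
import numpy as np

def fit_line(x, y):
    """Least-squares fit of y = a + b*x via the normal equations, guarding
    against the division by zero that occurs when all x values coincide
    (exactly the kind of degenerate input a validation plan should exercise)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    sxx = n * np.sum(x * x) - np.sum(x) ** 2      # denominator of the slope
    if abs(sxx) < 1e-12 * max(1.0, np.sum(x * x)):
        raise ValueError("x values are (nearly) identical: slope undefined")
    b = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / sxx
    a = (np.sum(y) - b * np.sum(x)) / n
    return a, b

# Well-behaved (hypothetical) data: expect roughly a = 1.0, b = 2.0.
print(fit_line([0, 1, 2, 3, 4], [1.1, 2.9, 5.2, 7.0, 9.1]))

# The degenerate case is caught instead of crashing the calculation.
try:
    fit_line([2, 2, 2], [1.0, 1.1, 0.9])
except ValueError as err:
    print("caught:", err)
```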

Spectra and quantum theory in particular seemed to indicate an order. A planetary model almost suggested itself, but according to classical physics the moving electrons should emit energy and consequently collapse into the nucleus. The 28-year-old Niels Bohr ignored this principle and postulated that the electrons in these orbits were "out of law". This clearly meant that classical physics could not describe or explain the properties of atoms. The framework of physical theory came crashing down. Fundamentally new models had to be developed.1... [Pg.25]

Stan Shostak No, I have much more respect for models than to call them analogies. Models are serious business. You allow a model to work and figure out the parts of the model. It is not a matter of taking the truck apart. You drive your truck up and it suddenly stops. Reductionist - Well, I ran out of gas. Put some gas into the tank, truck starts again, you drive off, you don't care whether that's the real explanation, but it works. Your computer breaks down, you have a crash, you can't put gas in the tank, you reboot. That's the model. Where reductionism doesn't work, you begin with modelling. [Pg.108]

Unless, of course, the costs could be paid by insurance. This might be feasible for some effects, but not for really large-scale ones: the whole insurance industry would be put out of business. Thus Ford were condemned in the 1970s for calculating that the costs in terms of deaths and injuries caused by a faulty model were less than the costs of rectifying the fault (the fuel tanks exploded in crashes at over 23 miles per hour) (Dowie, 1977). [Pg.153]

Chemists are not alone. The Mars Climate Orbiter crashed into Mars in September 1999 after a software confusion in which forces were modeled in English units instead of the metric units expected by the thruster controllers and other programs. The incident is related in the delightfully titled NASA document Mars Climate Orbiter Mishap Investigation Report (Stephenson 1999). The mishap cost around $125,000,000. [Pg.21]
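
The failure mode is easy to reproduce in miniature. The sketch below (Python; the function and variable names are made up for illustration and do not reflect the actual flight software) shows how an impulse reported in pound-force seconds is silently misread as newton-seconds unless the conversion (1 lbf = 4.448 N) is applied explicitly:

```python
LBF_TO_N = 4.4482216152605   # 1 pound-force in newtons (exact by definition)

def thruster_impulse_si(impulse_lbf_s: float) -> float:
    """Convert an impulse reported in lbf*s to SI units (N*s) before use."""
    return impulse_lbf_s * LBF_TO_N

reported = 10.0                        # ground-software output, in lbf*s
misread  = reported                    # treated as N*s without conversion
correct  = thruster_impulse_si(reported)

print(f"misread as N*s: {misread:.1f}, actual N*s: {correct:.1f} "
      f"(a factor-of-{LBF_TO_N:.2f} error fed into the trajectory model)")
```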

If you work with a model that consists of only a Cα trace of the protein backbone, make sure you use only Cα potentials; other types could lead to a program crash. [Pg.173]
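
A minimal sketch of that precaution (Python, for a generic PDB-format file; the file name is hypothetical, and this is not the interface of any particular modelling package) is simply to verify that the model really is a Cα-only trace before Cα potentials are selected:

```python
def is_calpha_trace(pdb_path: str) -> bool:
    """Return True if every ATOM record in the file is a C-alpha atom."""
    with open(pdb_path) as fh:
        atom_names = [line[12:16].strip()        # columns 13-16: atom name
                      for line in fh
                      if line.startswith("ATOM")]
    return bool(atom_names) and all(name == "CA" for name in atom_names)

# Hypothetical usage: pick the potential set according to what the model contains.
if is_calpha_trace("model.pdb"):
    print("C-alpha trace detected: use C-alpha potentials only.")
else:
    print("Full-atom model detected: C-alpha-only potentials are not appropriate.")
```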

One problem with computer models, or indeed with the scientists using them, is that when the models do not crash, the scientists tend to believe the results. Without proper validation, however, deducing anything from any model is a hazardous... [Pg.264]

Some vehicles have a delay system that allows deployment of the device even if the battery is destroyed or deactivated in the crash. General Motors models will deploy up to 10 minutes after power is disconnected. Mitsubishi cars reportedly have a delay of up to about half a minute. [Pg.30]

However, in the context of the everyday laboratory the question is moot. It may be that the immutable laws of a deterministic universe dictated that in the middle of a star at the edge of the universe millions of years ago an atom was stripped of its electrons and sent our way at just less than the speed of light. But when that "cosmic ray" crashes through our cloud chamber, ruining our experiment, we have no choice but to regard it as a chance event. Even if we had an infinite capacity to store facts about the present state of the universe and had them all in place (a universal data base), and if we had an infinite processing rate (the ultimate computer), we would still need an exact model of the universe... [Pg.113]

Activity Tests with Model Compounds. Activity tests with model compounds were also carried out for the fresh, regenerated, and aged catalysts in a fixed-bed reactor under vapor-phase conditions at 5.0 MPa. 3 cm³ of crushed catalyst (0.35–0.5 mm) was diluted with 9 cm³ of inactive alumina particles. Catalyst activities, namely hydrodesulfurization (HDS), hydrodenitrogenation (HDN), and hydrogenation (HG), were measured by feeding a mixture of 1 wt% carbon dioxide, 1 wt% dibenzothiophene, 1 wt% indole, and 1 wt% naphthalene in n-heptane. The catalysts were presulfided with a 5% H2S/H2 mixture at 400 °C for two hours and aged with a liquid feed under reaction conditions for 24 hours. Tests for the HDS and HDN reactions were conducted at 275 °C, while those for the HG reaction were done at 325 °C. Condensed liquid products were analyzed by gas chromatography. Since all the reactions took place over the crushed catalysts in the vapor phase, we assumed that the effectiveness factors were unity (9). [Pg.211]

Emulsion polymerization first gained industrial importance during World War II, when a crash research program in the United States resulted in the production of styrene-co-butadiene [SBR] synthetic rubber. The Harkins-Smith-Ewart model [5-6] summarized the results of early research, which focussed on this and similar systems. Current thinking is not entirely in accord with this mechanism. It is nevertheless worthwhile to review it very briefly here, because it is still widely referenced in the technical literature and because some aspects of the model provide valuable insights into operating procedures. [Pg.285]
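
For reference, the central quantitative result of that treatment (Smith-Ewart "Case II") can be stated compactly; the expression below is the standard textbook form rather than a quotation from references [5-6]:

$$ R_p = \frac{k_p\,[M]_p\,\bar{n}\,N}{N_A}, \qquad \bar{n} = \tfrac{1}{2}\ \text{(Case II)}, $$

where $R_p$ is the rate of polymerization, $k_p$ the propagation rate constant, $[M]_p$ the monomer concentration in the particles, $N$ the number of particles per unit volume of the aqueous phase, $\bar{n}$ the average number of radicals per particle, and $N_A$ Avogadro's number.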

Fortunately, for a couple of reasons, the likelihood of a terrorist attack on a nuclear reactor is quite low. Nuclear reactors operate under tight security and incorporate safety systems. In addition, the extensive shielding around reactors would require large amounts of explosives to create a breach. Even if terrorists could transport large amounts of explosives, they would have to breach a security cordon to reach the reactor. Alternatively, they could commandeer a jumbo jet to crash into a reactor or a nuclear pond of used cores, but they would have to breach security measures to do so. Computer modeling indicates that the construction of most reactors would withstand a 300 mph impact from a commercial aircraft, but not all scientists agree with these findings (1). [Pg.162]

While Rutherford s model of the atom seemed well established in an experimental and theoretical sense, it also had some problems. The biggest problem was the electron death spiral. If the laws of physics applied to the electrons that orbited the nucleus, they should lose energy with every orbit and spiral down and crash into the nucleus. If this happened, the universe as we know it would not exist, so there had to be a problem with the model or with physics. [Pg.95]


See other pages where Crash modelling is mentioned: [Pg.336]    [Pg.284]    [Pg.443]    [Pg.102]    [Pg.804]    [Pg.68]    [Pg.152]    [Pg.153]    [Pg.66]    [Pg.17]    [Pg.34]    [Pg.265]    [Pg.197]    [Pg.28]    [Pg.32]    [Pg.61]    [Pg.124]    [Pg.169]    [Pg.501]    [Pg.496]    [Pg.232]    [Pg.95]    [Pg.93]    [Pg.25]    [Pg.259]    [Pg.25]    [Pg.399]    [Pg.53]    [Pg.108]    [Pg.181]
See also in source #XX -- [Pg.284, Pg.287]



