Big Chemical Encyclopedia


Time-to-event analysis statistics

In this section we take the aforementioned principles and guidelines for analysis data sets and apply them to creating the most common ones. The critical-variables, change-from-baseline, and time-to-event data sets are presented. Although these are the most common analysis data sets that a statistical programmer will encounter, they are by no means exhaustive: there is no limit to the diversity of analysis data sets you may have to create. [Pg.118]
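As a minimal sketch of the time-to-event derivation described above, the snippet below turns raw subject dates into an analysis record with an event time and a censoring flag. The record layout and field names are illustrative assumptions, not any particular data standard.

```python
from datetime import date

# Hypothetical raw subject records (layout is illustrative only):
# an event date of None means no event was observed by the cutoff.
subjects = [
    {"id": "001", "start": date(2020, 1, 1),  "event": date(2020, 3, 1), "cutoff": date(2020, 6, 30)},
    {"id": "002", "start": date(2020, 1, 15), "event": None,             "cutoff": date(2020, 6, 30)},
]

def to_tte(rec):
    """Derive time-to-event in days plus a censoring flag (1 = censored)."""
    if rec["event"] is not None:
        return {"id": rec["id"], "time": (rec["event"] - rec["start"]).days, "censored": 0}
    # No event observed: censor at the data cutoff date
    return {"id": rec["id"], "time": (rec["cutoff"] - rec["start"]).days, "censored": 1}

tte = [to_tte(r) for r in subjects]
```

Subject 001 contributes an observed event at day 60; subject 002, with no event by the cutoff, contributes a censored time of 167 days.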

There are also special considerations as to how to statistically evaluate specific aspects of these studies. Specifically, analysis of time to event becomes very important (Anderson et al., 2000). [Pg.743]

Anderson, H., Spliid, H., Larsen, S. and Dall, V. (2000). Statistical analysis of time to event data from preclinical safety pharmacology studies. Toxicology Methods 10: 111-125. [Pg.760]

Survival analysis. The analysis of time-to-event data, in particular (but not exclusively) where the event is death. A common feature of such data is that they are heavily skewed and contain many censored values. Survival analysis is one of the single most important topics in medical statistics, although, because of the nature of the trials usually run in drug development, it is relatively less prominent in pharmaceutical statistics than standard textbooks on medical statistics might suggest. [Pg.478]
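To make the censoring idea concrete, here is a small from-scratch Kaplan-Meier estimator, the standard nonparametric survival curve for such data. This is an illustrative sketch, not the implementation of any cited source.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- observed times (event or censoring)
    events -- 1 if the event occurred at that time, 0 if censored
    Returns (time, S(t)) steps at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, steps, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        m = sum(1 for tt, e in data if tt == t)   # all subjects leaving at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            steps.append((t, surv))
        n_at_risk -= m
        i += m   # skip past all records tied at t
    return steps

curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
```

Note how the censored observation at time 2 produces no step but still reduces the number at risk, so the drop at time 3 is larger (from 0.75 to 0.375) than it would be without censoring.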

If there are separate analysis plans for the clinical and economic evaluations, efforts should be made to make them as consistent as possible (e.g., shared use of an intention-to-treat analysis, shared use of statistical tests for variables used commonly by both analyses, etc.). At the same time, the outcomes of the clinical and economic studies can differ (e.g., the primary outcome of the clinical evaluation might focus on event-free survival, while the primary outcome of the economic evaluation might focus on quality-adjusted survival). Thus, the two plans need not be identical. [Pg.49]

There are a number of nonparametric methods for analyzing continuous data. The last statistical method included in this chapter is used when the continuous outcome is the time to an event. [Pg.169]
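The chapter's method is not reproduced here, but a common nonparametric test for comparing a time-to-event outcome between two groups is the log-rank test, sketched from scratch below. The data and group labels are hypothetical.

```python
def logrank_statistic(times, events, groups):
    """Two-group log-rank chi-square statistic (1 df).

    times  -- observed times; events -- 1 = event, 0 = censored
    groups -- 0 or 1 per subject
    """
    data = sorted(zip(times, events, groups))
    event_times = sorted({t for t, e, g in data if e == 1})
    o_minus_e = 0.0   # sum of (observed - expected) events in group 1
    var = 0.0
    for t in event_times:
        at_risk = [(tt, e, g) for tt, e, g in data if tt >= t]
        n = len(at_risk)
        n1 = sum(1 for tt, e, g in at_risk if g == 1)
        d = sum(e for tt, e, g in at_risk if tt == t)
        d1 = sum(e for tt, e, g in at_risk if tt == t and g == 1)
        o_minus_e += d1 - d * n1 / n          # hypergeometric expectation
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var

stat = logrank_statistic([1, 2, 3, 4], [1, 1, 1, 1], [0, 0, 1, 1])
```

The statistic is referred to a chi-square distribution with one degree of freedom; for these four subjects it evaluates to 49/17 ≈ 2.88.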

Details of statistical analyses for potential toxicities that should be explicitly considered for all products and AEs of special interest. Analyses for these events will in general be more comprehensive than for standard safety parameters. These analyses may include subject-year adjusted rates, Cox proportional hazards analysis of time to first event, and Kaplan-Meier curves. Detailed descriptions of the models would typically be provided. For example, if Cox proportional hazards analysis is specified, a detailed description of the model(s) that will be used should be provided. This would generally include study as a stratification factor, covariates, and model selection techniques. More advanced methods, such as multiple-events models or competing-risk analyses, should be described if used. It is recommended that graphical methods also be employed, for example, forest plots and risk-over-time plots (Xia et al., 2011). [Pg.61]
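As an illustration of the first of these measures, a subject-year (exposure-) adjusted event rate can be sketched as follows; the event counts and follow-up durations are hypothetical.

```python
def subject_year_rate(event_counts, follow_up_years, per=100.0):
    """Exposure-adjusted incidence: events per `per` subject-years of follow-up."""
    return per * sum(event_counts) / sum(follow_up_years)

# Hypothetical arm: 4 subjects, 3 with an event, with per-subject follow-up in years
rate = subject_year_rate([1, 1, 1, 0], [1.0, 2.0, 0.5, 1.5])
```

Here 3 events over 5.0 subject-years gives 60 events per 100 subject-years. Adjusting by exposure rather than subject count matters when follow-up durations differ between treatment arms.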

When conducting a quantitative risk assessment the analyst(s) must define its probabilistic basis, and in most cases this means either to use the classical frequency approach or the Bayesian approach. The classical statistical approach is, and has been, the most commonly used probabilistic basis in health care (Schneider 2006). This approach interprets a probability as the relative fraction of times the event considered occurs if the situation analyzed were hypothetically repeated an infinite number of times. The underlying probability is unknown, and is estimated in the risk analysis. In the alternative, the Bayesian perspective, a probability is a measure of uncertainty about future events and outcomes (consequences), as seen through the eyes of the assessor(s) and based on his/her background information and knowledge at the time of the analysis. Probability is a subjective measure of uncertainty. [Pg.1707]
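The contrast can be sketched numerically: a classical relative-frequency estimate versus a Bayesian posterior mean for an event probability. The uniform Beta(1, 1) prior and the counts below are illustrative assumptions, not taken from the cited source.

```python
def classical_estimate(events, trials):
    """Classical (frequentist) point estimate: the observed relative frequency."""
    return events / trials

def bayesian_posterior_mean(events, trials, a=1.0, b=1.0):
    """Bayesian estimate: Beta(a, b) prior + binomial likelihood gives a
    Beta(a + events, b + trials - events) posterior; return its mean."""
    return (a + events) / (a + b + trials)

p_freq = classical_estimate(3, 100)        # 3 / 100
p_bayes = bayesian_posterior_mean(3, 100)  # (1 + 3) / (2 + 100)
```

With 3 events in 100 trials the two estimates (0.030 vs about 0.039) are close; the Bayesian value is pulled slightly toward the prior mean of 0.5, a difference that shrinks as data accumulate.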

Korb et al. proposed a model for the dynamics of water molecules at protein interfaces, characterized by the occurrence of variable-strength water binding sites. They used extreme-value statistics of rare events, which led to a Pareto distribution of the reorientational correlation times and a power law in the Larmor frequency for spin-lattice relaxation in D2O at low magnetic fields. The method was applied to the analysis of multiple-field relaxation measurements on D2O in cross-linked protein systems (see Section 3.4). The reorientational dynamics of interfacial water molecules next to surfaces of varying hydrophobicity was investigated by Stirnemann and co-workers. Making use of MD simulations and analytical models, they were able to explain the non-monotonic variation of water reorientational dynamics with surface hydrophobicity. In a similar study, Laage and Thompson modelled the reorientation dynamics of water confined in hydrophilic and hydrophobic nanopores. [Pg.256]

We were interested in measuring the effect of cyber-attacks on the system under study. We chose to compare the behavior of a baseline model, i.e. a model without cyber-attacks, with the behavior of the model in which cyber-attacks are enabled (the "system under attack"). For the comparison we chose as a reward (utility function) the deviation of the supplied power, in the presence of failures and attacks, from the known maximum supplied power of 10,940 MW. This reward has been used in the analysis of power systems by others [9]. Other suitable candidates would be the size of cascades, as we have done in the past [3]. We compute the reward at every state-machine event in the model and log these values during the simulations. Clearly, for every simulation run, the value of the supplied power varies over time to form a continuous-time stochastic process. We study the following two statistics of this process ...
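The two statistics themselves are not quoted in this excerpt. As one plausible example of summarizing such a log, the time average of a piecewise-constant process recorded as (timestamp, value) pairs at each state change can be computed as below; the log format is an assumption.

```python
def time_average(event_log, t_end):
    """Time average of a piecewise-constant process.

    event_log -- list of (timestamp, value) pairs, one per state change,
                 sorted by timestamp; the value holds until the next entry.
    t_end     -- end of the observation window.
    """
    avg = 0.0
    for (t0, v), (t1, _) in zip(event_log, event_log[1:] + [(t_end, None)]):
        avg += v * (t1 - t0)          # weight each value by its holding time
    return avg / (t_end - event_log[0][0])

# Synthetic run: full power for 2 h, then a drop after a failure event
mean_power = time_average([(0.0, 10940.0), (2.0, 10000.0)], 4.0)
```

Weighting by holding time (rather than averaging the logged samples directly) is what makes this correct for event-driven logs, where long quiet periods produce few entries.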

Data Analysis. In LDA there are two major problems faced when making a statistical analysis of the measurement data: velocity bias and the random arrival of seeding particles in the measuring volume. Although velocity bias is the predominant problem for simple statistics, such as mean and RMS values, random sampling is the main problem for statistical quantities that depend on the timing of events, such as spectra and correlation functions (see Tropea, 1995). [Pg.220]
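One widely used correction for velocity bias (not necessarily the one Tropea describes) weights each sample by the particle's transit (residence) time in the measuring volume: fast particles are over-sampled because they arrive more often, but they spend proportionally less time in the volume. The samples below are synthetic.

```python
def transit_time_weighted_mean(velocities, transit_times):
    """Transit-time-weighted mean velocity, a standard correction for
    LDA velocity bias: each sample is weighted by the particle's
    residence time in the measuring volume."""
    num = sum(u * t for u, t in zip(velocities, transit_times))
    return num / sum(transit_times)

# Synthetic samples: the fast particle (u = 2) has half the transit time
u_corrected = transit_time_weighted_mean([1.0, 2.0], [2.0, 1.0])
```

The naive arithmetic mean of these two samples is 1.5, while the transit-time-weighted mean is 4/3, shifted toward the slower velocity as the bias correction requires.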

From a theoretical point of view, the blinking kinetics of these CdTe QDs can be quantified by analysis of the on- and off-time probability densities, P(t_on) and P(t_off), respectively. Figure 9.8 displays the luminescence intermittency statistics for CdTe QDs in a trehalose environment. The distribution of off times involved in the blinking is almost linear on this scale, indicating that the lengths of off-time events are distributed according to an inverse power law of the type P(t_off) = P_0 t_off^(-m), where P_0 is... [Pg.164]
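A simple way to extract a power-law exponent from a probability density such as P(t_off) is a least-squares fit of the slope in log-log coordinates, which is why the distribution looks "almost linear on this scale". The sketch below uses synthetic data following an exact power law; real blinking data would be binned and noisy.

```python
import math

def powerlaw_exponent(t_vals, p_vals):
    """Least-squares slope in log-log coordinates: for P(t) = P0 * t**(-m)
    the fitted slope is -m, so the function returns m."""
    xs = [math.log(t) for t in t_vals]
    ys = [math.log(p) for p in p_vals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic off-time density following P(t) = 5 * t**(-1.5) exactly
t_off = [1.0, 2.0, 4.0, 8.0]
p_off = [5.0 * t ** (-1.5) for t in t_off]
m = powerlaw_exponent(t_off, p_off)
```

On exact power-law data the fit recovers the exponent m = 1.5; in practice, logarithmic binning of the measured off times is usually needed before such a fit is reliable.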





