Shell script

Scripts are often fairly simple software components that can be generated very quickly to execute recurring operations or collect multiple operations into a large block for efficient usage. Scripts can be simple shell scripts with no graphical output capability, but the use of extended graphical toolbox scripts provides a simple way of implementing software features with easy user interaction. [Pg.58]
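A minimal sketch of this idea, with every path and file pattern a placeholder rather than anything from the cited source, collects a few recurring housekeeping operations into one reusable shell script:

    #!/bin/sh
    # Collect recurring housekeeping operations into a single block.
    # Paths and file patterns below are placeholders.
    set -e                                  # stop at the first failing command

    RESULTS_DIR=${1:-./results}             # directory to tidy; defaults to ./results

    mkdir -p "$RESULTS_DIR/archive"
    # move finished output files into the archive
    find "$RESULTS_DIR" -maxdepth 1 -name '*.out' -exec mv {} "$RESULTS_DIR/archive/" \;
    # compress everything in the archive (ignore the case where it is empty)
    gzip -f "$RESULTS_DIR"/archive/*.out 2>/dev/null || true
    echo "Housekeeping finished for $RESULTS_DIR"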

Once you have your SAS data ready for transport, you need to determine a means to deliver it. There are many ways to send data, but you should strive for process simplicity and data security. To keep your data secure and to comply with 21 CFR Part 11, you need to encrypt your data files for transport. The best encryption you can use is key-exchange, high-bit encryption software such as PGP, which creates essentially unbreakable files when used properly. Once your data files are encrypted, you can either send them on physical media such as CD-ROM or send them electronically with secure transmission software such as Secure File Transfer Protocol (SFTP). If you need to send data to someone once, a CD-ROM is simple enough to produce. However, if you need to send the data repeatedly, then you should use a more automated electronic method of data exchange. Shell scripts and batch files can be written to automate the electronic data transfer process. [Pg.288]
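One hedged sketch of such an automated transfer, assuming GnuPG is available as a PGP-compatible encryption tool and that key-based SFTP authentication is already set up, with the file, recipient, and host names purely illustrative:

    #!/bin/sh
    # Encrypt a data file and upload it over SFTP in batch mode.
    # DATAFILE, RECIPIENT, and HOST are placeholders; adjust to your environment.
    set -e
    DATAFILE=study_data.xpt
    RECIPIENT=data-manager@example.org
    HOST=sftp.example.org

    # encrypt with the recipient's public key (GnuPG, a PGP-compatible tool)
    gpg --yes --encrypt --recipient "$RECIPIENT" --output "$DATAFILE.gpg" "$DATAFILE"

    # upload the encrypted file; key-based authentication is assumed
    echo "put $DATAFILE.gpg /incoming/" | sftp -b - "$HOST"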

Separating the data transformation into three distinct steps enforces a completely modular software design. In practice, the data transformation is executed via command shell scripts, using freely available software for both the XSLT transformation and XSD validation. The raw data contained in the 39 surveys in the New Brunswick compilation are exported into 7,000 individual KML files, which can be viewed online at http://gdr.nrcan.gc.ca/geochem. [Pg.391]
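The freely available tools are not named in this excerpt; as a hedged example only, a command shell script could chain the two steps with the common xsltproc and xmllint utilities, with every file name below purely illustrative:

    #!/bin/sh
    # Transform raw survey XML to KML, then validate each result against a schema.
    # Stylesheet, schema, and directory names are placeholders.
    set -e
    mkdir -p kml
    for raw in raw_data/*.xml; do
        out="kml/$(basename "$raw" .xml).kml"
        xsltproc survey_to_kml.xsl "$raw" > "$out"    # XSLT transformation
        xmllint --noout --schema kml21.xsd "$out"     # XSD validation
    done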

An error message occurs: the executable shell script is not found. [Pg.84]

UNIX SHELL SCRIPT TO CREATE MULTIPLE SIMULATION FILES... [Pg.335]
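The script itself is not reproduced in this excerpt; a generic sketch of the approach, using a template input file and placeholder parameter values, might look like this:

    #!/bin/sh
    # Create one simulation input file per temperature from a common template.
    # template.inp is assumed to contain the token @TEMP@; the values are placeholders.
    for T in 280 300 320 340 360; do
        sed "s/@TEMP@/$T/" template.inp > "sim_T${T}.inp"
    done
    echo "Created $(ls sim_T*.inp | wc -l) simulation input files"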

For Mac and Windows, just drag the uncompressed executable program to a location of the user's choice; for Unix/Linux the program is in the form of a file, BugView.jar, which should be placed in the same location (most conveniently one in the user's path) as a supplied shell script, bugview.sh. For Mac OS8/9 it is advisable to rebuild the desktop to ensure the application and files acquire the correct icons. [Pg.110]

The Mac and Windows versions are launched by double-clicking the BugView icon; the Unix/Linux version is launched by running the shell script, bugview.sh. [Pg.110]
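The contents of bugview.sh are not shown in the excerpt; a minimal launcher of this kind, assuming only that BugView.jar sits in the same directory as the script and that a Java runtime is on the PATH, would be:

    #!/bin/sh
    # Launch the BugView application from the directory containing this script.
    DIR=$(dirname "$0")
    exec java -jar "$DIR/BugView.jar" "$@"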

K2 [13] is a distributed query system that has been developed at the University of Pennsylvania. K2 relies on a set of data drivers, each of which handles the low-level details of communicating with a single class of underlying data sources (e.g., Sybase relational databases, Perl/shell scripts, the BLAST family of similarity search programs, etc.). A data driver accepts queries expressed in the query language of its underlying data source. It transmits each such query to the source for evaluation and then converts the query result into K2's internal complex value representation. Data drivers are also responsible for providing K2 with data source metadata (i.e., types and schemas), which are used to type-check queries. [Pg.395]

Figure 13.3 shows an entity relationship diagram for the tables created by the sdfloader script. Once files have been loaded into the database, the dbutils shell script is run to define several utility functions that operate on these tables. The dbutils script is listed in the Appendix. This... [Pg.167]

Several utility functions are discussed in Chapter 13 that can be used from the Linux command line. These are molgrep, molcat, molview, molarb, molrandom, and molnear. They operate on tables named structure that contain columns of SMILES, fingerprints, and names. Schemas containing tables like these can be created using the smiloader and sdfloader functions described in the next section. This section lists the shell script that defines the molgrep and other commands. [Pg.205]
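That script is not reproduced here; purely as a hedged illustration of the pattern, a molgrep-style command can be written as a thin wrapper around psql, where the database name, column names, and substructure-matching SQL function are all hypothetical and not the definitions used in the cited text:

    #!/bin/sh
    # molgrep-style sketch: print names and SMILES of structures matching a SMARTS pattern.
    # The database name, column names, and smiles_matches() are illustrative placeholders.
    SMARTS=$1
    [ -z "$SMARTS" ] && { echo "usage: $0 SMARTS" >&2; exit 1; }
    psql -d chemdb -t -c \
        "SELECT name, smiles FROM structure WHERE smiles_matches(smiles, '$SMARTS');"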

The calculations are straightforward once we have set up the input, which consists of an integral calculation, four CASSCF/CASPT2 calculations, and a final CASSI calculation for the SOC. It takes about 2 h for each internuclear distance on the Pentium 4 laptop I am writing this chapter on. We write a shell script that loops over 16 chosen distances, most densely spaced around the equilibrium geometry, and let the computer work a couple of days. The program MOLCAS-6.0 is used [67]. [Pg.757]
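The loop script is not reproduced in the excerpt; a hedged sketch of the idea, with the distance values, the input template, and the MOLCAS invocation all simplified placeholders, is:

    #!/bin/sh
    # Run one MOLCAS job per internuclear distance (16 placeholder values,
    # spaced more densely around an assumed equilibrium region).
    # The template token @RDIST@ and the molcas driver call are assumptions.
    for R in 1.40 1.50 1.60 1.65 1.70 1.75 1.80 1.90 2.00 2.20 2.40 2.60 2.80 3.00 3.50 4.00; do
        sed "s/@RDIST@/$R/" molecule.template > "r_${R}.input"
        molcas "r_${R}.input" > "r_${R}.log" 2>&1
    done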

All programs are operated through Linux shell scripts. Through shell scripts, users can process many different SRA files at the same time. This pipeline was designed with the following versions of the aforementioned programs ... [Pg.28]
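The version list is not included in this excerpt. As a hedged sketch of how several SRA files can be processed concurrently from the shell (the fastq-dump call from the SRA Toolkit and the file layout are assumptions, not necessarily the pipeline's actual commands):

    #!/bin/sh
    # Convert every .sra file in the current directory, running up to four jobs at once.
    # fastq-dump (SRA Toolkit) is assumed to be on the PATH.
    printf '%s\n' *.sra | xargs -n 1 -P 4 fastq-dump --gzip --split-files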

Has been automated by using Unix shell scripts... [Pg.136]

The Shell-Script (see Note 12) sybyl wrapper.sh, also a text file ... [Pg.285]

The main intention was to give a very simple example that even the non-expert can reproduce; therefore, every step that is normally carried out by a computational chemist within Sybyl is converted into a single clear command. To circumvent a graphical user interface, a shell script is needed that handles input, output, and command arguments for use in Sybyl. [Pg.291]
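The actual wrapper is given in the source; as a generic, hedged sketch only, such a wrapper takes the input and output file names as command arguments and hands them to a non-interactive invocation of the program, where the sybyl_batch command on the last line is a placeholder rather than the documented Sybyl call:

    #!/bin/sh
    # Generic wrapper: hide an interactive interface behind a single clear command.
    # Usage: ./wrapper.sh input.mol2 output.mol2
    # sybyl_batch is a placeholder for the real batch-mode Sybyl invocation.
    INPUT=$1
    OUTPUT=$2
    [ -z "$OUTPUT" ] && { echo "usage: $0 input output" >&2; exit 1; }

    sybyl_batch "$INPUT" "$OUTPUT" > wrapper.log 2>&1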

Two useful commands are report_reference and report_cell. These commands are helpful when one needs to find the references and instances inferred in a design after compile. For example, if a design comprises two library cells, each of which is used three times, the report_cell command will list all six instances and point to the corresponding reference, while the report_reference command will list just the two references and the number of times each reference is used, as shown below. The distinction between cells, instances, and references is extremely significant, particularly when writing dc_shell scripts. Shown below are the outputs of the report_cell and report_reference commands on the netlist discussed in Example 1.6. Notice that report_reference shows just one reference, while report_cell shows four instances or cells. [Pg.24]

Example 1.11 shows a simple dc_shell script to read in a design, compile, and write out a netlist of the design in VHDL. The file constraints.scr must contain timing and area constraints. Optimization constraints are discussed in greater detail in Chapter 4. For man pages on any of the DC commands, use the help command at the dc_shell prompt as shown below ... [Pg.28]
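The help output itself is not reproduced in this excerpt. Separately, such a script is often driven non-interactively from a Unix shell script by passing it to dc_shell with the -f option and capturing the log; a hedged sketch with illustrative file names:

    #!/bin/sh
    # Run a Design Compiler script in batch mode and keep the session log.
    dc_shell -f compile_design.scr > compile_design.log 2>&1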

The characterize command helps capture the constraints imposed on the sub-design by the surrounding logic. Consider the dc_shell script shown below. The write_script command writes the constraints to a file. It is advisable to characterize and compile each instance separately, since after the characterize and compile of one sub-block, the characterize data for the other block is quite different. [Pg.114]

Characterize and compile each sub-block individually. In other words, say you have two sub-designs s1 and s2 (with instance names u1 and u2, respectively) contained in top. The dc_shell script will be as follows ... [Pg.119]

Try to capture the logic in the critical path into a separate level of hierarchy. DC does a better job of optimization when the critical path does not traverse hierarchical boundaries. This can be done by ungrouping existing blocks and regrouping them using dc_shell scripts. A similar approach helps when declaring point-to-point false paths. [Pg.120]

You wish to find all the clocks defined in your design and their clock periods within a dc_shell script file. Using this information, you then wish to specify some constraints and attributes related to the clocks. [Pg.122]

You are using the embedded script feature of the VHDL Compiler to automatically write out a pre-optimized version of the VHDL. In other words, you wish to read in the source VHDL and write out a .db file prior to compile via a dc_shell script embedded in the source VHDL. But you find that DC issues an error. The embedded script is as follows ... [Pg.136]

This is because, in an embedded dc_shell script, only commands that set constraints and attributes are allowed. Commands such as compile, ungroup, or report cannot be used in embedded scripts. [Pg.136]

In the dc_shell script file used to synthesize the FSM, notice that the script changes the encoding style to one-hot, in spite of the encoding being different in the source code. Figure 5.3 shows the FSM inferred. The same dc_shell script can be used for the Verilog code along with the appropriate file names. [Pg.145]

After performing an initial place and route, you have accurate load values available for the nets in the design. You specify these loads on the appropriate nets in the design using set_load commands in a dc_shell script. You then perform in-place optimization, but find that... [Pg.151]

