Big Chemical Encyclopedia


Entropy of mixing

2 Entropy of Mixing Flory counted the number of possible arrangements of the polymer chains on the lattice sites of the solution and compared it with the number of arrangements on the sites available before mixing, that is, in the melt. Thus, he obtained the entropy of mixing ΔS_mix per site as [Pg.72]

We can show that the ΔS_mix given by Eq. 2.3 is greater than the entropy of mixing for an ideal solution of the same solute and solvent molecules (Problem 2.1). The difference is due to the greater number of conformations a polymer chain can take when the requirement that all the sites be occupied by the monomers is lifted. [Pg.72]

The entropy of mixing is an important quantity, as the entropic term always supports the formation of a homogeneous mixture (Eq. (3.1)). The entropy characterizes the degree of disorder in the system, and can be determined using the lattice [Pg.96]

Figure 3.1 Schematic representation of spatial arrangements of molecules in a binary mixture of (a) low-molecular-weight components and (b) a polymer blend. [Pg.97]

After this substitution, and using Eqs. (3.10) or (3.11), one of the commonly used expressions for the entropy of mixing in polymer systems is obtained: [Pg.98]

One problem with this approach appears in the case of two polymers with considerably different chemical structures - that is, with monomer units of unequal size. It is then necessary to choose repeating units such that they occupy the same volume for both polymers. A detailed discussion of the lattice approach and alternative derivations of the entropy of mixing is provided in Ref. [11]. [Pg.98]

The specific interactions are not random; rather, they are formed between specific segments of the polymer chains, and therefore the specific interactions will also impact the spatial arrangement of the blend - that is, they [Pg.99]

We note that if we have a total of N sites to occupy, there are N! different ways of occupying these sites on the assumption that each configuration is distinct. However, in our case this logic must be amended on the basis of the realization that there are N_A equivalent A objects and N_B equivalent B objects. This implies that in reality there are [Pg.120]

As we can see in this instance, the calculation of the entropy of mixing has been reduced to the combinatorial problem of identifying the number of distinct microstates consistent with our overall macrostate which is characterized by a fixed number of A and B objects. [Pg.120]
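The combinatorial count just described can be checked numerically for small systems. This is an illustrative sketch of ours (the function names are not from the source): W = N!/(N_A! N_B!) distinct microstates, with the Boltzmann entropy S = k ln W expressed in units of k_B.

```python
import math

def microstates(n_a: int, n_b: int) -> int:
    """Distinct arrangements of n_a equivalent A objects and n_b
    equivalent B objects on n_a + n_b sites: W = N! / (n_a! n_b!)."""
    n = n_a + n_b
    return math.factorial(n) // (math.factorial(n_a) * math.factorial(n_b))

def mixing_entropy(n_a: int, n_b: int) -> float:
    """Boltzmann entropy S = k ln W, in units of k_B (i.e. k = 1)."""
    return math.log(microstates(n_a, n_b))

# 2 A objects and 2 B objects on 4 sites: W = 4!/(2! 2!) = 6
print(microstates(2, 2))     # -> 6
print(mixing_entropy(2, 2))  # ln 6
```

For a pure substance (n_a = 0 or n_b = 0) the count collapses to W = 1 and the entropy to zero, consistent with the macrostate/microstate argument in the text.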

Further insights into this configurational entropy can be gleaned from an attempt to reckon the logarithms in an approximate fashion. We note that our expression for the entropy may be rewritten as [Pg.120]

This sum may be represented graphically as shown in fig. 3.17, which suggests the approximate expression [Pg.121]

Here it must be noted that we have made an imprecise description of what in more formal terms is known as the Stirling approximation. Within the context of our characterization of the configurational entropy, the Stirling approximation of ln N! allows us to rewrite the entropy as [Pg.121]
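How good the Stirling approximation is for large N can be verified directly. The sketch below is ours; it compares ln n! ≈ n ln n − n against the exact value supplied by `math.lgamma`.

```python
import math

def ln_factorial_exact(n: int) -> float:
    """Exact ln(n!) via the log-gamma function: ln n! = lgamma(n + 1)."""
    return math.lgamma(n + 1)

def ln_factorial_stirling(n: int) -> float:
    """Stirling approximation: ln n! ~ n ln n - n."""
    return n * math.log(n) - n

# The relative error shrinks as n grows
for n in (10, 100, 10_000):
    exact = ln_factorial_exact(n)
    approx = ln_factorial_stirling(n)
    print(n, exact, approx, (exact - approx) / exact)
```

At the particle numbers relevant to thermodynamics (N of order 10^23) the neglected terms are utterly negligible, which is why the approximation is used without further comment in the entropy expressions that follow.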

As a result of the increased number of configurational possibilities, the entropy of a mixture is higher than the sum of the entropies of the components in the demixed state, under otherwise identical conditions. [Pg.32]

For the sake of simplicity, let us consider an ideal mixture of two components, 1 and 2. The entropy of mixing is [Pg.32]

S_m is the entropy of the mixture containing N_1 molecules of 1 and N_2 molecules of 2; S_1 and S_2 are the entropies of N_1 molecules of pure component 1 and N_2 molecules of pure component 2, respectively [Pg.32]

For an ideal system the changes in energy and volume upon mixing are zero; that is, U and V are constant. At constant U and V the molecular degrees of freedom such as translation, rotation, and vibration are the same for the mixed and the demixed states [Pg.32]

In the demixed state (i.e., for the pure components), there is only one way to distribute the indistinguishable molecules over the sites, which means that ln W_1 = 0 for pure 1. Similarly, ln W_2 = 0 for pure 2. For the mixture of 1 and 2, [Pg.33]

Let us first consider the entropy of mixing. A mixture has more accessible states than a pure substance, hence a higher entropy. This can be quantified in the following manner. [Pg.225]

Let N_A be the number of A atoms and N_B be the number of B atoms. The total number of atoms is N = N_A + N_B. The N atoms can be arranged in N! different ways, but since atoms of the same kind are indistinguishable, the N_A! and N_B! permutations among like atoms are redundant. Thus the number of unique ways of arranging the N atoms is [Pg.225]

Since the Ns are very large, the factorials can be represented by Stirling's approximation, ln x! ≈ x ln x − x. The entropy of mixing can then be expressed as [Pg.226]

It is more convenient to work with the mole fraction x = N_B/N. Introducing this in the above expression, [Pg.226]
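With the mole fraction substituted in, Stirling's approximation gives the mixing entropy per atom as s/k = −(x ln x + (1 − x) ln(1 − x)). A small sketch of ours, in units of k_B:

```python
import math

def s_mix_per_atom(x: float) -> float:
    """Entropy of mixing per atom, in units of k_B:
    s = -(x ln x + (1 - x) ln(1 - x))."""
    if x in (0.0, 1.0):
        return 0.0  # pure substance: only one arrangement, s = 0
    return -(x * math.log(x) + (1 - x) * math.log(1 - x))

for x in (0.0, 0.1, 0.25, 0.5):
    print(f"x = {x:4.2f}  s/k = {s_mix_per_atom(x):.4f}")
```

The curve is symmetric about x = 0.5, where it peaks at ln 2 ≈ 0.693; it falls steeply near x = 0 and x = 1, which is why even trace impurities carry a disproportionate mixing entropy.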

Polymers undergoing dissolution show much smaller entropies of mixing than do conventional solutes of low relative molar mass. This is a consequence of the size of the polymer molecules when segments of a molecule are covalently bonded to each other they cannot adopt any position in the liquid, but have to stay next to each other. Hence, the possible disordering effect when such big molecules are dissolved in solvent is much less than for molecules of, say, a typical low molar mass organic substance. [Pg.81]

We assume that polymer molecules consist of a large number of chain segments of equal length, joined by flexible links. Each link then occupies one site on the lattice. The solution has to be sufficiently concentrated that the occupied lattice sites are distributed at random, rather than having them clustered together in a non-random way. [Pg.82]

Using the lattice model, the approximate value of W in the Boltzmann equation can be estimated. Two separate approaches to this appeared in 1942, one by P. J. Flory, the other by M. L. Huggins, and though they differed in detail, the approaches are usually combined and known as the Flory-Huggins theory. This gives the result for the entropy of mixing as follows: [Pg.82]
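A minimal numerical sketch of the Flory-Huggins result referred to here, assuming the standard form ΔS_mix = −R(n_1 ln φ_1 + n_2 ln φ_2) with segment-based volume fractions; the function and variable names are ours, not the book's.

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def flory_huggins_entropy(n1: float, n2: float, x_seg: int) -> float:
    """Flory-Huggins entropy of mixing (J/K) for n1 moles of solvent
    and n2 moles of polymer, each chain occupying x_seg lattice sites:
        dS_mix = -R (n1 ln phi1 + n2 ln phi2)
    where phi1, phi2 are segment (volume) fractions."""
    sites = n1 + x_seg * n2
    phi1 = n1 / sites
    phi2 = x_seg * n2 / sites
    return -R * (n1 * math.log(phi1) + n2 * math.log(phi2))

# With x_seg = 1 this reduces to the ideal entropy of mixing:
print(flory_huggins_entropy(0.5, 0.5, 1))    # R ln 2, about 5.76 J/K
print(flory_huggins_entropy(0.5, 0.5, 100))  # polymer of 100 segments
```

The x_seg = 1 limit recovering the ideal-solution result is the consistency check usually quoted for this expression.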

The heat of mixing for polymer solutions, by analogy with solutions of low molar mass solutes, is given by  [Pg.83]

The symbol χ stands for the interaction energy per solvent molecule divided by kT. Combining equations (5.7) and (5.8) gives the Flory-Huggins equation for the free energy of mixing of a polymer solution: [Pg.83]
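Combining the entropic and enthalpic terms as described gives the Flory-Huggins free energy of mixing. The sketch below is ours and assumes the standard form ΔG_mix = RT(n_1 ln φ_1 + n_2 ln φ_2 + χ n_1 φ_2); it is not a transcription of equations (5.7)-(5.8).

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def flory_huggins_gibbs(n1: float, n2: float, x_seg: int,
                        chi: float, T: float) -> float:
    """Flory-Huggins free energy of mixing (J):
        dG_mix = R T (n1 ln phi1 + n2 ln phi2 + chi n1 phi2)
    chi is the dimensionless solvent-polymer interaction parameter."""
    phi1 = n1 / (n1 + x_seg * n2)
    phi2 = 1.0 - phi1
    return R * T * (n1 * math.log(phi1)
                    + n2 * math.log(phi2)
                    + chi * n1 * phi2)

# Athermal case (chi = 0): the entropic terms make dG_mix < 0
print(flory_huggins_gibbs(0.9, 0.001, 1000, 0.0, 298.15))
# chi > 0 adds a positive enthalpic penalty that opposes mixing
print(flory_huggins_gibbs(0.9, 0.001, 1000, 0.6, 298.15))
```

Because the log terms are always negative while the χ term is positive for χ > 0, the sign of ΔG_mix, and hence miscibility, is a competition between the two, which is the content of the Flory-Huggins equation.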

In ideal solutions it is assumed for the pair interactions that no energy is released on replacing a unit of group 1 by a unit of group 2, that is, Δε in Equation (6-11) equals zero. Consequently, the enthalpy of mixing of an ideal solution is also equal to zero. [Pg.211]

Since all pair-interaction energies are, by definition, equal in magnitude in ideal solutions, no environment-dependent entropy contributions can contribute [Pg.211]

With low-molar-mass substances, 1 and 2, the mole fractions x_1 and x_2 are those of the molecules, since each molecule occupies a cell in the lattice that the solution is considered to be (Fig. 6-3). In contrast, the mole fractions are with respect to monomeric units when macromolecules are considered. [Pg.212]

Two pure ideal gases 1 and 2, both at the same temperature T and pressure P, are mixed isothermally with final compositions y_1 and y_2. What is the entropy of mixing per mole of mixture formed? [Pg.84]

We start by noticing that as a result of the mixing process the pressure of component 1 goes from P to its partial pressure P_1, i.e. its contribution to the mixture pressure, and the same applies to component 2. Thus, according to Eq. 3.12.5: [Pg.84]

The same equation applies in the isothermal formation of an ideal solution, while real gases and real solutions will be considered in Chapter 11. Returning to Eq. 3.12.8, we notice that, since all y's are less than one, the mixing process leads to an increase in entropy. This, of course, should be expected. Mixing is a natural (spontaneous) process, and as such it leads to an entropy increase. [Pg.84]
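The conclusion that mixing always raises the entropy can be checked directly from ΔS_mix = −R Σ y_i ln y_i: every y_i < 1 makes its term positive. A sketch of ours (not the book's notation):

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def ideal_mixing_entropy(mole_fractions) -> float:
    """Entropy of mixing per mole of ideal-gas mixture (J K^-1 mol^-1):
        dS_mix = -R sum(y_i ln y_i)."""
    assert abs(sum(mole_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return -R * sum(y * math.log(y) for y in mole_fractions if y > 0)

# Equimolar binary mixture: dS_mix = R ln 2, about 5.76 J K^-1 mol^-1
print(ideal_mixing_entropy([0.5, 0.5]))
# Any composition with all y_i < 1 gives a positive result
print(ideal_mixing_entropy([0.2, 0.8]))
```

The maximum, R ln 2 for a binary system, occurs at the equimolar composition; the result is independent of what the two gases are, which is the sense in which this is a limiting ideal law.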

We are ready now to carry out entropy change calculations and use them in applications of the second law, such as the establishment of the feasibility of processes and the efficient utilization of energy in them. [Pg.85]

But first, to develop some sense about the physical meaning of entropy, we consider a molecular interpretation of it. [Pg.85]

This is equivalent to assuming that there is no free volume, or that the fluid is incompressible. [Pg.333]

FIGURE 11-7 Schematic diagram of a filled lattice. The blue balls are indistinguishable, as are the red ones. [Pg.334]

FIGURE 11-5 Two different ways of placing the first ball on the lattice. There are actually n_0 ways to do this. [Pg.334]

the number of ways of putting all n_0 molecules back on the lattice is: [Pg.334]


The broken bond approach has been extended by Nason and co-workers (see Ref. 85) to calculate the surface energy as a function of surface composition for alloys. The surface free energy follows on adding an entropy of mixing term, and the free energy is then minimized. [Pg.270]

The entropy of mixing of very similar substances, i.e. the ideal solution law, can be derived from the simplest of statistical considerations. It too is a limiting law, of which the most nearly perfect example is the entropy of mixing of two isotopic species. [Pg.374]

Since the φ's are fractions, the logarithms in Eq. (8.38) are negative and ΔG_mix is negative for all concentrations. In the case of athermal mixtures, entropy considerations alone are sufficient to account for polymer-solvent miscibility at all concentrations. Exactly the same is true for ideal solutions. As a matter of fact, it is possible to regard the expressions for ΔS_mix and ΔG_mix for ideal solutions as special cases of Eqs. (8.37) and (8.38) for the situation where n happens to equal unity. The following example compares values of ΔS_mix for ideal and Flory-Huggins solutions to examine quantitatively the effect of variations in n on the entropy of mixing. [Pg.517]

We express the calculated entropies of mixing in units of R. For ideal solutions the values of ΔS_mix are evaluated directly from Eq. (8.28) ... [Pg.518]

Figure 8.1 The entropy of mixing (in units of R) as a function of mole fraction solute for ideal mixing and for the Flory-Huggins lattice model with n = 50, 100, and 500. Values are calculated in Example 8.1.
A plot of these values is shown in Fig. 8.1. Note the increase in the entropy of mixing over the ideal value with increasing n. Also note that the maximum occurs at decreasing mole fractions of polymer with increasing degree of polymerization. [Pg.520]
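The trend described here - the Flory-Huggins ΔS_mix rising above the ideal value as n grows - can be reproduced with a short calculation. The per-mole-of-molecules forms assumed below are our reading of the standard ideal and Flory-Huggins expressions, not a transcription of Eqs. (8.28) and (8.37).

```python
import math

def ideal_ds_over_R(x2: float) -> float:
    """Ideal entropy of mixing per mole of molecules, in units of R."""
    x1 = 1.0 - x2
    return -(x1 * math.log(x1) + x2 * math.log(x2))

def fh_ds_over_R(x2: float, n: int) -> float:
    """Flory-Huggins dS_mix/R per mole of molecules, for a solute of
    n segments; volume fractions replace mole fractions in the logs."""
    x1 = 1.0 - x2
    phi2 = n * x2 / (x1 + n * x2)
    phi1 = 1.0 - phi2
    return -(x1 * math.log(phi1) + x2 * math.log(phi2))

x2 = 0.01  # mole fraction of solute
print(f"ideal : {ideal_ds_over_R(x2):.4f}")
for n in (50, 100, 500):
    print(f"n = {n:3d}: {fh_ds_over_R(x2, n):.4f}")
```

At x2 = 0.01 the Flory-Huggins values exceed the ideal one by roughly an order of magnitude and keep growing with n, mirroring the curves in Fig. 8.1.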

Subtracting the entropy contributions of the pure components from the total entropy gives the entropy of mixing according to the present model: [Pg.556]

The separation of liquid crystals as the concentration of cellulose increases above a critical value (30%) is mostly because of the higher combinatorial entropy of mixing of the conformationally extended cellulosic chains in the ordered phase. The critical concentration depends on solvent and temperature, and has been estimated from the polymer chain conformation using lattice and virial theories of nematic ordering (102—107). The side-chain substituents govern solubility, and if sufficiently bulky and flexible can yield a thermotropic mesophase in an accessible temperature range. Acetoxypropylcellulose [96420-45-8], prepared by acetylating HPC, was the first reported thermotropic cellulosic (108), and numerous other heavily substituted esters and ethers of hydroxyalkyl celluloses also form equilibrium chiral nematic phases, even at ambient temperatures. [Pg.243]

The simple Bragg-Williams treatment of this behaviour assumes an unlike-atom bond strength which is greater than the like-atom bonding and calculates the entropy of mixing as a function of the disorder which counterbalances this negative heat of formation. The relationship between the Curie temperature, Tc, and the bond energies is... [Pg.189]

The thermodynamic data for Cu2S-FeS (Krivsky and Schuhmann, 1957) show that these sulphides mix to form approximately ideal ionic liquids. These are molten salts in which the heat of mixing is essentially zero, and the entropy of mixing is related to the ionic fractions of the cations and anions. In the present case the ionic fractions yield values for the activities of the two sulphides [Pg.339]

This approach to solution chemistry was largely developed by Hildebrand in his regular solution theory. A regular solution is one whose entropy of mixing is ideal and whose enthalpy of mixing is nonideal. Consider a binary solvent of components 1 and 2. Let n_1 and n_2 be the numbers of moles of 1 and 2, φ_1 and φ_2 their volume fractions in the mixture, and V_1, V_2 their molar volumes. This treatment follows Shinoda. [Pg.413]
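A sketch of the regular-solution bookkeeping just described, in the Scatchard-Hildebrand form: ideal entropy of mixing plus an enthalpy built from solubility parameters. The numerical inputs below are illustrative round values of ours (roughly hexane/toluene-like), not figures from Shinoda's treatment.

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def regular_solution_mixing(x1, V1, V2, d1, d2, T):
    """Regular-solution (Scatchard-Hildebrand) mixing quantities per
    mole of mixture. V1, V2 are molar volumes (m^3/mol); d1, d2 are
    solubility parameters (Pa^0.5).
        dH_mix = (x1 V1 + x2 V2) phi1 phi2 (d1 - d2)^2   (nonideal)
        dS_mix = -R (x1 ln x1 + x2 ln x2)                (ideal)
    Returns (dH, dS, dG) with dG = dH - T dS."""
    x2 = 1.0 - x1
    Vmix = x1 * V1 + x2 * V2
    phi1 = x1 * V1 / Vmix
    phi2 = 1.0 - phi1
    dH = Vmix * phi1 * phi2 * (d1 - d2) ** 2
    dS = -R * (x1 * math.log(x1) + x2 * math.log(x2))
    return dH, dS, dH - T * dS

# Illustrative inputs: V ~ 131 and 107 cm^3/mol, d ~ 14.9 and 18.2 MPa^0.5
dH, dS, dG = regular_solution_mixing(0.5, 131e-6, 107e-6,
                                     14.9e3, 18.2e3, 298.15)
print(dH, dS, dG)
```

For these inputs ΔH_mix is a few hundred J/mol, small compared with TΔS_mix, so ΔG_mix stays negative and the pair is miscible - the typical regular-solution situation for chemically similar liquids.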

Due to the smallness of the entropy of mixing, most polymer mixtures are at least partially incompatible, and blends contain A-rich and B-rich domains, separated by interfaces. The intrinsic width of these interfaces is rather broad (it varies from w = aJin... [Pg.204]

As shown by Frank and Evans [41], solutions of apolar substances in water are characterized by a large negative entropy of mixing, leading to a high positive free energy of dissolution. [Pg.5]

It is simplest to consider these factors as they are reflected in the entropy of the solution, because it is easy to subtract from the measured entropy of solution the configurational contribution. For the latter, one may use the ideal entropy of mixing, −R Σ x_i ln x_i, since the correction arising from the usual deviation of a solution (not a superlattice) from randomness is usually less than about 0.1 cal/deg-g atom. (In special cases, where the degree of short-range order is known from x-ray diffuse scattering, one may adequately calculate this correction from quasi-chemical theory.) Consequently, the excess entropy of solution, ΔS^E, is a convenient measure of the sum of the nonconfigurational factors in the solution. [Pg.130]

This result is nearly equal to 4.87 J K^-1 mol^-1, the value that would be calculated for the entropy of mixing to form an ideal solution. We will show in Chapter 7 that the equation to calculate Δ_mix S_m for the ideal mixing process is the same as the one to calculate the entropy of mixing of two ideal gases. That is, Δ_mix S_m = −R Σ x_i ln x_i. [Pg.168]
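The quoted 4.87 J K^-1 mol^-1 follows from Δ_mix S_m = −R Σ x_i ln x_i at a suitable composition. In the sketch below, the composition x ≈ 0.272/0.728 is our own back-calculated guess that reproduces the quoted number; the actual composition of the worked example is not stated in this excerpt.

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def ideal_solution_ds(mole_fractions) -> float:
    """d_mix S_m = -R sum(x_i ln x_i), J K^-1 mol^-1; the same form
    as the entropy of mixing of ideal gases."""
    return -R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Hypothetical composition chosen to reproduce the quoted 4.87 value:
print(ideal_solution_ds([0.272, 0.728]))
# For comparison, the equimolar maximum is R ln 2:
print(ideal_solution_ds([0.5, 0.5]))
```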

The conclusion is that Δ_mix S_m = 4.66 J K^-1 mol^-1 for the solution process at 0 Kelvin. If one assumes that the entropies of the AgBr and AgCl are zero at 0 Kelvin, then the solid solution must retain an amount of entropy that will give this entropy of mixing. [Pg.169]

An alternate way of calculating this entropy discrepancy is to attribute it to the mixing of two forms of CO that differ by having C and O reversed. The entropy of mixing 1/2 mole of each form is the residual entropy S_0 and can be calculated from [Pg.172]
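For reference, treating solid CO as an ideal 50/50 mixture of the two molecular orientations gives a residual entropy of R ln 2 per mole. This arithmetic check is ours:

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

# Residual entropy of CO modeled as an equimolar mixture of the two
# orientations (C-O vs O-C), per mole of CO:
s0 = -R * (0.5 * math.log(0.5) + 0.5 * math.log(0.5))
print(s0)  # R ln 2, about 5.76 J K^-1 mol^-1
```

The fully random value, R ln 2 ≈ 5.76 J K^-1 mol^-1, slightly exceeds the measured discrepancy, indicating that the frozen-in orientational disorder in real CO crystals is not quite complete.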


© 2024 chempedia.info