
Next Issue
Volume 7, June
Previous Issue
Volume 6, December
Entropy, Volume 7, Issue 1 (March 2005) – 6 articles, Pages 1-121

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Article
Van der Waals gas as working substance in a Curzon and Ahlborn-Novikov engine
by Delfino Ladino-Luna
Entropy 2005, 7(1), 108-121; https://doi.org/10.3390/e7010108 - 18 Mar 2005
Cited by 13 | Viewed by 9344
Abstract
Using a van der Waals gas as the working substance, the so-called Curzon and Ahlborn-Novikov engine is studied. It is shown that some previous results found in the literature of finite-time thermodynamics can be written in a more general form by means of this gas and by taking a nonlinear heat transfer law. Full article
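For reference, the baseline result the paper generalizes is the well-known Curzon-Ahlborn efficiency at maximum power, η = 1 − √(T2/T1), compared here with the Carnot bound. A minimal sketch; the reservoir temperatures are arbitrary illustrative values:

```python
import math

def carnot_efficiency(t_hot, t_cold):
    """Carnot (reversible) efficiency between two heat reservoirs."""
    return 1.0 - t_cold / t_hot

def curzon_ahlborn_efficiency(t_hot, t_cold):
    """Curzon-Ahlborn-Novikov efficiency at maximum power output."""
    return 1.0 - math.sqrt(t_cold / t_hot)

t1, t2 = 600.0, 300.0  # hot and cold reservoir temperatures, in kelvin
print(f"Carnot:         {carnot_efficiency(t1, t2):.3f}")          # 0.500
print(f"Curzon-Ahlborn: {curzon_ahlborn_efficiency(t1, t2):.3f}")  # 0.293
```

The Curzon-Ahlborn value always lies below the Carnot bound, since √(T2/T1) > T2/T1 for 0 < T2 < T1.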
Figure 1. Curzon and Ahlborn-Novikov cycle in the temperature T and entropy S plane. T1 and T2 are the temperatures of the hot and cold reservoirs, respectively; T1w and T2w are the hot and cold temperatures of the system, respectively.
Figure 2. Comparison between the Curzon and Ahlborn-Novikov efficiency, ηCAN, and the efficiency ηPDPVW at zero order in λVW, for the Dulong and Petit heat transfer law.
Article
Physical Premium Principle: A New Way for Insurance Pricing
by Amir H. Darooneh
Entropy 2005, 7(1), 97-107; https://doi.org/10.3390/e7010097 - 28 Feb 2005
Cited by 8 | Viewed by 8360
Abstract
In our previous work we suggested a way of computing the non-life insurance premium. The probable surplus of the insurer company is assumed to be distributed according to the canonical ensemble theory; the Esscher premium principle appears as a special case. The difference between our method and traditional premium calculation principles was shown by simulation. Here we construct a theoretical foundation for the main assumption of our method; in this respect we present a new (physical) definition of economic equilibrium. This approach lets us apply the maximum entropy principle to economic systems. We also extend our method to deal with the problem of premium calculation for correlated risk categories. Like the Bühlmann economic premium principle, our method considers the effect of the market on the premium, but in a different way. Full article
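The Esscher principle recovered as a special case has a simple closed form, π(X) = E[X e^{hX}] / E[e^{hX}]. A minimal numerical sketch; the discrete claim distribution below is an invented illustration, not data from the paper:

```python
import numpy as np

def esscher_premium(claims, probs, h):
    """Esscher premium E[X e^{hX}] / E[e^{hX}] for a discrete claim distribution.

    A tilt h > 0 shifts probability weight toward large claims, so the
    premium is loaded above the pure expected claim E[X].
    """
    weights = probs * np.exp(h * claims)
    return float(np.sum(claims * weights) / np.sum(weights))

claims = np.array([0.0, 100.0, 1000.0])  # hypothetical claim sizes
probs = np.array([0.90, 0.09, 0.01])     # their probabilities
print(esscher_premium(claims, probs, 0.0))    # h = 0 recovers E[X] = 19.0
print(esscher_premium(claims, probs, 0.001))  # loaded premium, above 19
```

At h = 0 the tilt disappears and the premium reduces to the expected claim; the paper's canonical-ensemble premium differs from this by incorporating market effects (compare Figure 3 below).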
Figure 1. The loading parameter, (p/p0) − 1, versus the contract duration for a large β parameter. The figure shows the dependence of the premium on the period of the insurance contract and agrees with trading experience: the premium of a risk category does not change linearly with the contract duration, so a long-term contract is more advantageous than several short-term contracts. The loading parameter is used to remove the dependence on the monetary unit.
Figure 2. Dependence of ln βT on the ruin probability, the initial wealth and the mean claim size. The β parameter is apparently very small for the wealthier insurer; such a company may offer insurance at a low price.
Figure 3. The loading parameter, (p/p0) − 1, versus the βT parameter. The squares display the results of the canonical ensemble theory and the triangles correspond to the Esscher premium principle. The difference between the two curves is the result of market effects.
Article
The meanings of entropy
by Jean-Bernard Brissaud
Entropy 2005, 7(1), 68-96; https://doi.org/10.3390/e7010068 - 14 Feb 2005
Cited by 75 | Viewed by 6740
Abstract
Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has been successively assimilated to different concepts such as disorder and information. In this paper we revisit these conceptions and establish the three following results: entropy measures lack of information, but it also measures information, and these two conceptions are complementary; entropy measures freedom, and this allows a coherent interpretation of entropy formulas and of experimental facts; to associate entropy with disorder implies defining order as absence of freedom. Disorder or agitation is shown to be more appropriately linked with temperature. Full article
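The information-theoretic entropy at issue here is Shannon's H = −Σ p log p. A small sketch illustrates the "lack of information" reading: a uniform distribution (maximal freedom of outcome) maximizes H, while a deterministic one gives H = 0:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).

    Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25] * 4))       # uniform over 4 outcomes: 2.0 bits
print(shannon_entropy([1.0, 0.0, 0.0]))  # certain outcome: 0.0 bits
```

Both readings in the abstract fit this quantity: H is the information missing before the outcome is known, and equally the information gained once it is.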
Article
Numerical Study On Local Entropy Generation In Compressible Flow Through A Suddenly Expanding Pipe
by Hüseyin Yapici, Nesrin Kayatas, Nafiz Kahraman and Gamze Bastürk
Entropy 2005, 7(1), 38-67; https://doi.org/10.3390/e7010038 - 11 Feb 2005
Cited by 23 | Viewed by 8476
Abstract
This study presents an investigation of the local entropy generation in compressible flow through a suddenly expanding pipe, with air as the working fluid. The air enters the pipe with a turbulent profile following the 1/7th power law. The simulations cover expansion ratios reduced gradually from 5 to 1. To determine the effects of the mass flux, φ, the ambient heat transfer coefficient, hamb, and the inlet temperature, Tin, on the entropy generation rate, the compressible flow is examined for various cases of these parameters. The flow and temperature fields are computed numerically with the Fluent computational fluid dynamics (CFD) code. In addition, a computer program has been developed to calculate numerically the entropy generation and other thermodynamic parameters from the computed flow and temperature fields. The values of the thermodynamic parameters in the sudden expansion (SE) case are normalized by their base quantities obtained from calculations in the uniform cross-section (UC) case. Contracting the radius of the throat (from 0.05 to 0.01 m) significantly increases the maximum value of the volumetric entropy generation rate (by about 60%) and raises the total entropy generation rate to 11 times its base value. The normalized merit number decreases by 73% with the contraction of the cross-section and by 40% with the increase of the ambient heat transfer coefficient (from 20 to 100 W/m2-K), whereas it rises by 226% with the decrease of the maximum mass flux (from 5 to 1 kg/m2-s) and by 43% with the increase of the inlet temperature (from 400 to 1000 K). Consequently, the ratio of the useful energy transfer rate to the irreversibility rate improves as the mass flux decreases and as the inlet temperature increases. Full article
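The Bejan number reported in Figure 6 is conventionally the heat-transfer share of the total volumetric entropy generation. A sketch using the standard local expressions; the property values and gradients below are illustrative, not taken from the paper:

```python
def entropy_generation_rates(k, mu, temp, grad_t, phi_visc):
    """Split the volumetric entropy generation into its two standard parts:

    s_heat = k * |grad T|^2 / T^2   (heat-transfer irreversibility)
    s_fric = mu * Phi / T           (viscous dissipation Phi)
    """
    s_heat = k * grad_t ** 2 / temp ** 2
    s_fric = mu * phi_visc / temp
    return s_heat, s_fric

def bejan_number(s_heat, s_fric):
    """Be = s_heat / (s_heat + s_fric); Be -> 1 when heat transfer dominates."""
    return s_heat / (s_heat + s_fric)

# Rough air-like values (thermal conductivity W/m-K, viscosity kg/m-s)
s_h, s_f = entropy_generation_rates(k=0.026, mu=1.8e-5, temp=400.0,
                                    grad_t=5000.0, phi_visc=1.0e6)
print(f"Be = {bejan_number(s_h, s_f):.3f}")
```

With these values the thermal term dominates (Be close to 1), which is the regime where reducing temperature gradients, rather than friction, pays off most.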
Figure 1. Coordinate system and two-dimensional axisymmetric model of the suddenly expanding pipe (the dimensions are not to scale).
Figure 2. Temperature contours within the uniform section of the pipe (Tin = 400 K, φmax = 5 kg/m2-s and hamb = 20 W/m2-K).
Figure 3. Variations of axial velocities at the various radial planes in the uniform section of the pipe (Tin = 400 K, φmax = 5 kg/m2-s and hamb = 20 W/m2-K).
Figure 4. Variations of radial velocities at the various radial planes in the uniform section of the pipe (Tin = 400 K, φmax = 5 kg/m2-s and hamb = 20 W/m2-K).
Figure 5. Logarithmic volumetric local entropy generation rate contours within the uniform section of the pipe (Tin = 400 K, φmax = 5 kg/m2-s and hamb = 20 W/m2-K).
Figure 6. Variations of the Bejan number with (a) the throat radius, (b) the maximum mass flux, (c) the ambient heat transfer coefficient and (d) the inlet temperature.
Figure 7. Variations of the normalized total entropy generation with (a) the throat radius, (b) the maximum mass flux, (c) the ambient heat transfer coefficient and (d) the inlet temperature.
Figure 8. Variations of the normalized ratio of the heat transfer to the irreversibility generated in the system, (Q̇/İ)*, with (a) the throat radius, (b) the maximum mass flux, (c) the ambient heat transfer coefficient and (d) the inlet temperature.
Figure 9. Variations of the normalized exergy transfer rate with (a) the throat radius, (b) the maximum mass flux, (c) the ambient heat transfer coefficient and (d) the inlet temperature.
Figure 10. Variations of the normalized merit number with (a) the throat radius, (b) the maximum mass flux, (c) the ambient heat transfer coefficient and (d) the inlet temperature.
Article
The entropy of a mixture of probability distributions
by Alexis De Vos
Entropy 2005, 7(1), 15-37; https://doi.org/10.3390/e7010015 - 20 Jan 2005
Cited by 4 | Viewed by 7211
Abstract
If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an 'arbitrary' calculation, this mixing distribution has the tendency to become uniform over a flat probability space of ever decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal $\frac{1}{m+1} + \frac{1}{m+2} + \dots + \frac{1}{n}$, where n is the number of possible inputs and m the number of possible outcomes of the computation. Full article
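The expected entropy loss quoted above is a tail of the harmonic series, H(n) − H(m), and can be evaluated exactly:

```python
from fractions import Fraction

def expected_entropy_loss(m, n):
    """Tail of the harmonic series from the abstract:
    1/(m+1) + 1/(m+2) + ... + 1/n, for a computation with n possible
    inputs and m possible outcomes (m <= n). Exact rational arithmetic."""
    return sum(Fraction(1, k) for k in range(m + 1, n + 1))

print(expected_entropy_loss(1, 2))         # 1/2
print(float(expected_entropy_loss(2, 8)))  # 1/3 + 1/4 + ... + 1/8
```

For m = n (as many outcomes as inputs) the sum is empty and no entropy is lost, consistent with a reversible computation.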
Article
Lagrangian submanifolds generated by the Maximum Entropy principle
by Marco Favretti
Entropy 2005, 7(1), 1-14; https://doi.org/10.3390/e7010001 - 12 Jan 2005
Cited by 3 | Viewed by 6665
Abstract
We show that the Maximum Entropy principle (E.T. Jaynes, [8]) has a natural description in terms of Morse families of a Lagrangian submanifold. This geometric approach becomes useful when dealing with the M.E.P. with nonlinear constraints. Examples are presented using the Ising and Potts models of a ferromagnetic material. Full article
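In its simplest finite-dimensional form, maximizing entropy under a mean-value constraint yields an exponential-family (Gibbs) distribution p_i ∝ exp(λ x_i). A sketch solving for the multiplier λ by bisection; the three-state values and target mean are illustrative assumptions, not the paper's Ising/Potts examples:

```python
import math

def maxent_distribution(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution on `values` subject to E[X] = target_mean.

    The constrained maximizer is p_i proportional to exp(lam * x_i); since the
    resulting mean is monotone increasing in lam, bisection finds lam.
    """
    def mean(lam):
        w = [math.exp(lam * v) for v in values]
        z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# Three spin-like states with a prescribed mean magnetization of 0.3
p = maxent_distribution([-1.0, 0.0, 1.0], target_mean=0.3)
print([round(pi, 4) for pi in p])
```

With nonlinear constraints the multiplier equation can have multiple branches, which is exactly the situation the paper's Lagrangian-submanifold picture is built to handle.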