Entropy and Information, by Mikhail V. Volkenstein; translated by Abe Shenitzer and Robert G. Burns. ISBN 9783034600774
Note: Supplemental materials are not guaranteed with Rental or Used book purchases.
  • ISBN: 9783034600774 | 3034600771
  • Cover: Hardcover
  • Copyright: 9/3/2009

  • Rent (recommended): $108.14. This item is part of an exclusive publisher rental program and requires an additional convenience fee, which is reflected in the shopping cart.
  • Buy New: $147.75. Special order, ships in 1-2 weeks.
  • eBook (eTextbook from VitalSource): $108.96. Available instantly; online and downloadable access for 180 days.
This treasure of popular science by the Russian biophysicist Mikhail V. Volkenstein is at last, more than twenty years after its appearance in Russian, available in English translation. As its title Entropy and Information suggests, the book deals with the thermodynamical concept of entropy and its interpretation in terms of information theory. The author shows that entropy is not to be considered a mere shadow of the central physical concept of energy, but more appropriately a leading player in all of the major natural processes: physical, chemical, biological, evolutionary, and even cultural.

The theory of entropy is thoroughly developed from its beginnings in the foundational work of Sadi Carnot and Clausius in the context of heat engines, including expositions of much of the necessary physics and mathematics, and illustrations from everyday life of the importance of entropy. The author then turns to Boltzmann's epoch-making formula relating the entropy of a system directly to its degree of disorder, and to statistical physics as created by Boltzmann and Maxwell; here again the necessary elements of probability and statistics are expounded. It is shown, in particular, that the temperature of an object is essentially just a measure of the mean square speed of its molecules. "Fluctuations" in a system are introduced and used to explain why the sky is blue and how, perhaps, the universe came to be so ordered. Whether statistical physics reduces ultimately to pure mechanics, as "Laplace's demon" would have it, is also discussed.

The final three chapters concentrate on open systems, that is, systems which exchange energy or matter with their surroundings: first linear systems close to equilibrium, and then non-linear systems far from equilibrium. Here entropy, as it figures in the theory of such systems developed by Prigogine and others, affords explanations of the mechanism of cell division, the process of aging in organisms, and periodic chemical reactions, among other phenomena. Finally, information theory is developed, again from first principles, and the entropy of a system is characterized as the absence of information about the system.

In the final chapter, perhaps the pièce de résistance of the work, the author examines the thermodynamics of living organisms in the context of biological evolution. Here the "value of biological information" is discussed and linked to the concepts of complexity and irreplaceability. The chapter culminates in a fascinating discussion of the significance of these concepts, all centered on entropy, for human culture, with many references to particular writers and artists.

The book is recommended reading for all interested in physics, information theory, chemistry, and biology, as well as literature and art.
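For readers who want a concrete anchor for the ideas the description gestures at, the standard textbook statements (not quotations from the book itself) can be written in LaTeX as follows: Boltzmann's relation between entropy and disorder, the kinetic-theory reading of temperature for an ideal gas, and Shannon's entropy as missing information.

  S = k_B \ln W
      % Boltzmann: entropy S grows with the number W of microstates
      % ("degree of disorder") compatible with the macroscopic state;
      % k_B is Boltzmann's constant.

  \tfrac{3}{2} k_B T = \left\langle \tfrac{1}{2} m v^2 \right\rangle
      % For an ideal gas, temperature T is a measure of the mean square
      % molecular speed (average translational kinetic energy).

  H = -\sum_{i=1}^{n} p_i \log_2 p_i
      % Shannon: the average missing information, in bits, about a system
      % whose possible states occur with probabilities p_1, ..., p_n.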