Entropy is a concept that emerged in the 19th century. It was originally associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. In the 20th century, however, an unprecedented scientific revolution took place thanks to one of its most essential innovations, information theory, which also encompasses the concept of entropy.

Entropy is a measure widely used in science and engineering. It has been applied to biology, economics, engineering, linguistics, and cosmology, and it lies at the center of one of the greatest open problems in science. The term was initially introduced in thermodynamics by Clausius, developed by Boltzmann and Gibbs through the 19th century, and generalized by Shannon in the 20th century to the point that it can be applied in a broad range of areas.

Entropy is most commonly defined as "disorder", although this is a poor analogy, since "order" is a subjective human concept and "disorder" cannot always be obtained from entropy. The following question therefore arises naturally: what is the difference, if any, between the concepts of entropy in each field of knowledge? Misconceptions persist, and there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. This paper therefore presents a historical background on the evolution of the term "entropy" and provides mathematical evidence and logical arguments regarding its interconnection across various scientific areas, with the objective of providing a theoretical review and reference material for a broad audience.
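To make the Shannon generalization concrete, here is a minimal illustrative sketch (not taken from the paper) of the discrete Shannon entropy, H = -Σ p·log(p), which reduces uncertainty about a random outcome to a single number of bits:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution.

    With base=2 the result is in bits; with base=math.e it is in nats,
    the unit that connects directly to the Boltzmann-Gibbs form
    S = -k_B * sum(p * ln p) used in statistical thermodynamics.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))

# A certain outcome carries no information: 0 bits.
print(shannon_entropy([1.0]))
```

The same formula, up to the constant k_B and the choice of logarithm base, appears in both information theory and statistical mechanics, which is the formal basis of the interconnection the paper discusses.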