Information entropy (Shannon entropy) originates from the first quantitative theory of the communication and transmission of information. It initially related to the complexity of a transmitted message, but it has since been adapted in diverse sciences. In addition to its parent field, it is currently used to describe objects of mathematics (e.g., graphs and sets), the natural sciences (dissipative structures in physics, electron density, the complexity of chemical and biological systems, etc.), engineering (urban infrastructure, image analysis, etc.), and the liberal arts (texts, etc.). This list is not complete and is permanently extending, as information entropy is efficient for assessing the complexity of various objects.

‘Pure’ chemical applications of information entropy are wide and can be separated into two major areas: (a) the analysis of molecular graphs and (b) the analysis of the electron density of molecules. The first group deals with the information entropy of molecular graphs, which has been seminal for introducing various entropy-based topological descriptors for physical organic chemistry, digital chemistry, and QSAR/QSPR studies (quantitative structure–activity and structure–property relationships). The second group deals with the quantum-chemical analysis of the electron density distribution in molecules and its redistribution upon chemical transformations (e.g., see ). These applications have been systematically reviewed in previous works. We also mention in brief other chemical applications, such as signal processing, where molecules act as signal carriers (e.g., in molecular switches based on transitions between isomeric species). Despite the comprehensive reviews on the chemical applications of information entropy, some recent advances deserve separate mention, especially in the context of existing theories and hypotheses that are not familiar to the broad chemical community. Additionally, some issues of information entropy permanently accompany the relevant chemical studies, e.g., the comparison of information and thermodynamic entropies. Hence, it would be insightful to review the following points under one title. As follows from the names of the points, information entropy is mainly applied to molecular species described with finite mathematical models.

The thermodynamic (Boltzmann) entropy S = k ln W describes a particular macrostate that is yielded by W microstates (various combinations of particles in various energy states). Herewith, the microstates are equally likely, so that the probability of a given microstate is p_i = 1/W. Thus, it is not surprising that the information and thermodynamic entropies are usually compared with a focus on their similarity, and that theories and experiments linking them have been developed (e.g., Szilard engines and Landauer’s principle).
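As a minimal numerical sketch of the two ideas above, the snippet below first computes the Shannon entropy of an atom-type partition of a small molecule (ethanol is used purely as a hypothetical illustration; real entropy-based topological descriptors partition atoms by symmetry orbits, not merely by element), and then shows that for W equally likely microstates, where p_i = 1/W, the same formula collapses to log2 W, the log-of-multiplicity form echoed by Boltzmann's S = k ln W:

```python
import math
from collections import Counter

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# (a) Molecular-graph flavour: entropy of the partition of a molecule's
# atoms into equivalence classes.  Element types serve here as a crude
# stand-in for the symmetry orbits used by actual descriptors.
# Ethanol, C2H5OH: 2 C, 6 H, 1 O (hypothetical illustration).
atoms = ["C", "C", "H", "H", "H", "H", "H", "H", "O"]
counts = Counter(atoms)
n = len(atoms)
h_mol = shannon_entropy([c / n for c in counts.values()])
print(f"partition entropy of ethanol atoms: {h_mol:.3f} bits")

# (b) Thermodynamic analogy: for W equally likely microstates, p_i = 1/W,
# so H reduces to log2(W), the same log-of-multiplicity dependence as
# Boltzmann's S = k * ln(W) (differing only in the unit-fixing constant).
W = 16
h_equi = shannon_entropy([1 / W] * W)
print(f"H for {W} equiprobable microstates: {h_equi:.1f} bits (= log2 W)")
```

The equiprobable case makes the formal kinship explicit: both quantities are logarithms of a multiplicity, which is exactly why the two entropies invite comparison despite their different physical content.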