Thermodynamic Entropy & Information Entropy

 
 
Deckard
 
Reply Fri 5 Feb, 2010 12:25 am
As at least some of you already know, the equations used to represent thermodynamic entropy and the equations used to represent Shannon entropy (or information entropy) are very similar.

Entropy in thermodynamics and information theory - Wikipedia, the free encyclopedia

How important is it that the mathematics seem to line up?

1) Is it pure random coincidence and ultimately unimportant?
2) Does this point to some mystical connection between Thermodynamics and Information?
3) Does this similarity reveal something important about systems and existence in general?

I'm voting for the third option but how should we characterize the importance of this similarity?

We need not look to something as complicated as thermodynamic entropy and information entropy to see this same matching up of mathematics between fields. Here is another example:

2 planets plus 3 planets equals five planets.
But this is not just the case with astronomy but also entomology!
2 ants plus 3 ants equals five ants.

Isn't that amazing? The equations are identical!
In a way it is amazing when you think about it a little.

I'm still hacking my way through the mathematics to prove it to myself but I think the coincidence between the equations used to model thermodynamic entropy and the equations used to model the amount of information contained in a message is, in the final analysis, really no more amazing than my silly example that compares planets to ants.

If we are to get at all in depth in this discussion we will need to deal with the equations. Keep in mind that the i in p_i should be a subscript. (Is there a keyboard shortcut for getting into and out of subscript?)

Equation for thermodynamic entropy:

S = -k Σ p_i log p_i

Equation for information (Shannon) entropy:

H = -Σ p_i log p_i
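To make the comparison concrete, here is a short Python sketch that evaluates both sums side by side (the four-outcome distribution is just an illustrative toy). Note that the thermodynamic sum carries the Boltzmann constant k while the Shannon sum does not:

```python
import math

# A toy probability distribution over four microstates / symbols
p = [0.5, 0.25, 0.125, 0.125]

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

# Thermodynamic (Gibbs) entropy: S = -k * sum(p_i * ln p_i), in J/K
S = -k_B * sum(pi * math.log(pi) for pi in p)

# Shannon entropy: H = -sum(p_i * log2 p_i), in bits
H_bits = -sum(pi * math.log2(pi) for pi in p)

print(f"S = {S:.4e} J/K")
print(f"H = {H_bits} bits")
```

Apart from the constant k and the choice of logarithm base (natural log versus log base 2), the two computations are term-for-term the same sum.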

I will continue adding to this initial post until/unless (and possibly even as) someone responds. Maybe I'm recreating the Wikipedia and Wolfram MathWorld wheel here, but this is something I need to sort out in my own words. (Thanks.)

For starters we need to understand what k, the Boltzmann constant, is, since it is clearly one major difference between the two equations. But what is the Boltzmann constant? As it turns out, the Boltzmann constant would be unnecessary if we were working in Planck units.

Planck units - Wikipedia, the free encyclopedia

So what are Planck units?

In physics, Planck units are physical units of measurement defined exclusively in terms of five universal physical constants. (Yes, that's 5 for Robert Anton Wilson fans.) The five constants are:

The Gravitational constant
The Planck constant
The speed of light
The Coulomb constant
The Boltzmann constant

Each of these constants is a quantity, but they are not naked quantities, for each must be understood within the context of dimensional analysis. That is, each of these quantities wears the clothes of the metric system (to continue the "naked" metaphor). I'll let you look up for yourself what each of these quantities is and what dimensional analysis is if you do not know already.

Metric measurements are conventional. There is nothing magical about them. They are arbitrary. Is Man the measure of all things? Perhaps, but the man Planck was attempting to remove the arbitrariness of Man's units of measurement. Planck units in fact eliminate the need for these five constants (and dimensional analysis), since these constants (and dimensional analysis) arise from the arbitrary/conventional nature of metric units.
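As a sketch of how those five constants combine, the following Python snippet derives the base Planck units from them. The constant values are the standard CODATA/SI figures, and the reduced Planck constant ħ is used, as is conventional for Planck units:

```python
import math

# CODATA/SI values of the five constants (reduced Planck constant used)
G    = 6.67430e-11     # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34 # reduced Planck constant, J s
c    = 2.99792458e8    # speed of light, m/s
k_e  = 8.9875517923e9  # Coulomb constant, N m^2 C^-2
k_B  = 1.380649e-23    # Boltzmann constant, J/K

l_P = math.sqrt(hbar * G / c**3)        # Planck length, m
t_P = math.sqrt(hbar * G / c**5)        # Planck time, s
m_P = math.sqrt(hbar * c / G)           # Planck mass, kg
q_P = math.sqrt(hbar * c / k_e)         # Planck charge, C
T_P = math.sqrt(hbar * c**5 / G) / k_B  # Planck temperature, K

for name, value, unit in [("length", l_P, "m"), ("time", t_P, "s"),
                          ("mass", m_P, "kg"), ("charge", q_P, "C"),
                          ("temperature", T_P, "K")]:
    print(f"Planck {name}: {value:.4e} {unit}")
```

Measured in these units, each of the five constants is exactly 1, which is why k drops out of the entropy formula below.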

So, as it turns out, if the equation for thermodynamic entropy S were expressed in Planck units (where k = 1), it would be:

S = -Σ p_i log p_i

So, if we use this Planck-unit rendering of the equation for thermodynamic entropy, we can set aside Boltzmann's constant k as irrelevant to our discussion. This makes the two equations (thermodynamic entropy and Shannon entropy) identical. The equations are identical except, of course, for what they represent.

S does not equal H any more than ants equal planets or at least that is what I intend to show.

To be continued...