

   sci.optics      Discussion relating to the science of optics      12,750 messages   


   Message 11,691 of 12,750   
   haiticare2011@gmail.com to All   
   Re: Simple lock-in design for Oz-type me   
   08 Feb 14 04:35:57   
   
   > Ain't no *possible* about entropy, it's as real as energy.     
   >    
   > Isn't that the point, information is entropy.    
   >    
   I'll put a dog in that fight. :)    
   Entropy is NOT as real as energy. The second law says that if you take a   
   sorted deck of cards and throw them up in the air, they will usually land   
   in a random (disordered) order. Strictly, it says the change must be   
   spontaneous, as in ice melting, but the deck of cards conveys the idea.   
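   To put a number on "usually", here is a minimal Python sketch (my own   
   illustration, not from the thread) of the odds that a tossed 52-card   
   deck lands back in sorted order:   
   
      import math
      
      # A tossed 52-card deck can land in any of 52! orderings;
      # exactly one of those orderings is the fully sorted deck.
      orderings = math.factorial(52)
      print(f"52! is about {orderings:.3e}")                  # ~8.066e+67
      print(f"P(lands sorted) is about {1 / orderings:.3e}")  # ~1.240e-68
   
   At roughly one chance in 10^68, "usually disordered" is an understatement.   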
   Now, in chemistry, entropy is treated as something. G = H - TS: the Gibbs   
   free energy G is the enthalpy minus the temperature times the entropy.   
   The entropy term changes with temperature because the number of possible   
   states goes up, like increasing the number of cards in the deck.   
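   To make that concrete, here is a minimal sketch using the standard   
   textbook values for melting ice (dH ~ 6.01 kJ/mol, dS ~ 22.0 J/(mol K);   
   the numbers are my illustration, not from the thread). The sign of   
   G = H - TS flips right at the melting point:   
   
      # Gibbs free energy change for ice -> water at three temperatures.
      dH = 6010.0   # enthalpy of fusion, J/mol
      dS = 22.0     # entropy of fusion, J/(mol K)
      
      for T in (263.0, 273.0, 283.0):   # 10 K below, at, and above 0 degC
          dG = dH - T * dS              # G = H - T*S, per mole
          verdict = "spontaneous" if dG < 0 else "not spontaneous"
          print(f"T = {T:5.1f} K: dG = {dG:+7.1f} J/mol ({verdict})")
   
   Below 273 K the dG comes out positive and ice stays frozen; above it,   
   negative and ice melts; at 273 K it is essentially zero, which is why   
   that temperature is the melting point.   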
   Just as an aside, biological beings are held together by entropic forces.   
   It's very counter-intuitive, so few "get" it. Proteins are shaped   
   statistically, not by the kind of forces we are used to. That's why eggs   
   "cook" when we heat them: the "T" term in the Gibbs equation changes the   
   energy of solubility. Eggs don't "uncook" when you cool them, because   
   disulfide bonds lock them cooked, but otherwise the change is reversible   
   with T.   
   I mention all this to give you a "feel" for entropy, but also to point   
   out that entropy only works when you have a constrained system, like a   
   chemistry flask or an egg. The classic case is life on Earth. If you take   
   the Earth alone as the context, why, the entropy is magically going down.   
   But if you draw your contextual boundaries to include the Sun, you see   
   entropy going up, as it should by the 2nd law. (i.e. the Sun's energy is   
   driving the order of life.)   
   So, getting back to the case of information = -entropy: where do we draw   
   the contextual boundaries to calculate the entropy of the information   
   therein? Around the communication channel, as Shannon does? If so,   
   consider that a certain channel has a capacity of so many states. One guy   
   codes it "the universe is made up of atoms." Another guy codes it, "I   
   need to use the toilet now." So Shannon, at least in this simple reading,   
   by equating information with entropy, is ignoring context.   
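   To see that numerically, a minimal Python sketch (mine, not Shannon's):   
   the per-character entropy of the two sentences is computed purely from   
   symbol frequencies, with no reference to what either sentence means.   
   
      from collections import Counter
      from math import log2
      
      def shannon_entropy(msg: str) -> float:
          """Per-character entropy: H = -sum(p * log2(p))."""
          n = len(msg)
          return -sum((c / n) * log2(c / n)
                      for c in Counter(msg).values())
      
      for msg in ("the universe is made up of atoms.",
                  "I need to use the toilet now."):
          print(f"{shannon_entropy(msg):.3f} bits/char  <- {msg!r}")
   
   Both sentences land in the same few-bits-per-character range; what   
   either one actually says never enters the formula.   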
   That's the initial foray of my dog, off the leash.    
      
   JB   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   


