On Monday, February 3, 2014 1:56:15 PM UTC-5, Phil Hobbs wrote:
> On 01/31/2014 01:03 PM, haiticaore2081@gmail.com wrote:
> > On Thursday, January 30, 2014 9:55:56 AM UTC-5, Phil Hobbs wrote:
> >> On 01/30/2014 09:35 AM, haiicare200991@gmail.com wrote:
> >>> On Wednesday, January 29, 2014 10:37:03 AM UTC-5, Phil Hobbs
> >>> wrote:
> >>>> How about "there is no encoding scheme that can send data at an
> >>>> average rate higher than BW*log_2(1+SNR)?" All it takes is one
> >>>> counterexample.
> >>> Uh, could you say that in english?
> >> It's Shannon's Theorem. Entirely disprovable, novel, interesting,
> >> useful, and all that.
> > Yes, OK, point taken. But I am serious (as much as I can muster)
> > about what Shannon's theorem looks like in plain english. Or to be
> > even more challenging, how would you explain it to a "pre-scientific"
> > person like Aristotle or even an Australian bush-man? You CAN explain
> > Newton's laws, the energy laws, and much of classical physics. If you
> > go into "mystical" concepts (though they are not concepts proper),
> > you might convey aspects of quantum physics. (The primitive man might
> > be more acquainted with the unknowable than us.) I just wonder if
> > Shannon is a general aspect of life, or whether it only applies to
> > communication channels as usually presented.
>
> Well, I wouldn't necessarily want to explain, e.g. spherical harmonics
> or the Laplace transform to a cave man either. Some things take a
> little mathematical preparation. Mathematics as used in science and
> engineering is primarily a means of expressing certain abstruse ideas
> consistently, and performing correct inferences of quite astonishing
> complexity and accuracy. Takes a bit of training.
In reply to that I would say, first, that mathematics is not considered a
science. Putting a requirement of mathematical complexity on the scientific
method would, in modern times, rule out Einstein, Heisenberg, and Feynman. You
might say that Einstein solved some abstruse formulae for E = mc^2, but as a
historical fact he had help, and his "Gedankenexperimente" were his stock in
trade. Quantum mechanics had its Schrödinger wave equation, but the main ideas
of duality and uncertainty don't in themselves require math. Finally, Feynman
is known for explanations that shortcut around abstruse math. So to put a wall
up around scientific ideas, such that only those trained in complex math can
understand them, is not part of the scientific method.
> > BUT, even without that generalization, would be good to hear verbal
> > explanation of Shannon.
>
> For arm waving, it's pretty straightforward apart from the normalization
> constants. In the low SNR limit, it becomes the usual
>
> CC ~ BW*SNR/ln(2), i.e. at a SNR of 0.1 you have to integrate ten times
> as long as you do at SNR=1. The normalization constant of ln(2) comes
> from the definition of channel capacity in bits per second.
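As a quick numeric sanity check of that low-SNR limit, here is a small sketch
in Python (the 1 kHz bandwidth is just an arbitrary example value, not
anything from the thread):

```python
import math

def capacity(bw_hz, snr):
    """Shannon capacity BW*log2(1+SNR), in bits/s, for linear (not dB) SNR."""
    return bw_hz * math.log2(1.0 + snr)

def capacity_low_snr(bw_hz, snr):
    """Low-SNR approximation: CC ~ BW*SNR/ln(2)."""
    return bw_hz * snr / math.log(2)

bw = 1000.0  # hypothetical 1 kHz channel

# At SNR = 0.1 the approximation is already within a few percent:
print(capacity(bw, 0.1))          # exact: 1000*log2(1.1), about 137.5 bits/s
print(capacity_low_snr(bw, 0.1))  # approx: 1000*0.1/ln(2), about 144.3 bits/s

# Capacity at SNR = 0.1 is roughly a tenth of capacity at SNR = 1,
# which is the "integrate ten times as long" observation above.
print(capacity(bw, 1.0))          # 1000*log2(2) = 1000 bits/s
```

Since capacity scales linearly with SNR in this regime, tenfold less SNR
means tenfold less rate, hence tenfold longer integration for the same data.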
      
   OK, thanks. I'm not sure this is science - it might be engineering and math.    
   One problem I see with Shannon's theory is that it is called "information   
   theory." This is dicey, since probability and entropy don't equate with   
   something called information in a simple way.    
On a historical note, I was dismayed to see the write-up on Wikipedia calling
Shannon the "Father of Information Theory." Nowhere does Leo Szilard get
credit for his ground-breaking 1929 paper on entropy and information. The
reasons why are historical, and I will state them plainly. Szilard was a buddy
of Einstein. The letter starting the Manhattan Project was signed by both. But
at the time there was a social problem in that both were Jews, and both
leftists. They parked Einstein at Princeton, and Szilard is relatively
unknown. He never got a Nobel prize for the A-bomb or for information theory,
and was not accepted into the academic establishment then.
   Getting back to the particular formulation of "information theory," it cannot,   
   for reasons I won't belabor here, pretend to that role. But what about a   
   lesser role, as a formula governing transmission through a channel? Is that   
   science? Maybe, but it's    
   kinda narrow, and not what the theory called "information theory" promised   
   from Szilard onwards.   
      
> In the high-SNR limit, doubling the SNR allows you to distinguish 2x
> more unique voltage levels in a given measurement, but that only gets
> you one more bit, i.e.
>
> CC ~ BW*log2(SNR), SNR >> 1.
>
> So again, apart from a normalization constant, the asymptotic behaviour
> is pretty intuitive.
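That "doubling SNR buys one more bit" claim is easy to check numerically; a
minimal sketch (the SNR values 100 and 1000 are arbitrary illustrations):

```python
import math

def capacity_per_hz(snr):
    """Spectral efficiency log2(1+SNR), in bits/s/Hz, for linear SNR."""
    return math.log2(1.0 + snr)

# At high SNR, doubling the SNR adds almost exactly one bit per Hz:
for snr in (100.0, 1000.0):
    gain = capacity_per_hz(2 * snr) - capacity_per_hz(snr)
    print(snr, round(gain, 4))  # gain approaches 1.0 as SNR grows
```

The gain is log2((1+2*SNR)/(1+SNR)), which tends to log2(2) = 1 bit, matching
the "2x more distinguishable voltage levels = 1 extra bit" intuition above.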
OK. I'm still a skeptic in the big picture tho. I hope we can resume this
discussion, as I have more to say about information theory and information
entropy - and their shortcomings. The second law is a peculiar part of
physics, as its basis is statistical as well as observable. And it gives
physics the polarity of time, since all other equations run back and forth in
time.
      
I don't mean to lecture, but here are several problems with information theory
(IT). First, consider Paul Revere's famous signal: one if by land, two if by
sea. According to Shannon, this has an information content of one bit. But it
meant much more in the context of that war, so just considering the channel
does not adequately describe the information transfer.
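For concreteness, here is how that one-bit figure falls out of Shannon's
definition of self-information, -log2(p) (a sketch; the probabilities are
illustrative assumptions, not historical estimates):

```python
import math

def self_information_bits(p):
    """Shannon self-information -log2(p), in bits, of an event with probability p."""
    return -math.log2(p)

# Two equally likely messages ("by land" vs "by sea"): one bit each.
print(self_information_bits(0.5))   # 1.0

# If "by sea" were thought the less likely route, say p = 0.25, observing it
# would carry more surprisal - but still says nothing about its consequences.
print(self_information_bits(0.25))  # 2.0
```

The formula measures only surprisal relative to the receiver's prior over
messages; the military significance of the message lies outside the model,
which is exactly the objection being raised here.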
Here is another difficulty. The second law says all spontaneous processes
result in an increase in entropy, or disorder. Now consider a super-saturated
solution which spontaneously crystallizes. We are presented with a case where
an ordered crystal forms spontaneously from a more disordered solution -
apparently a decrease in entropy.
      
      
      
> > I got into these philosophical navel gazing because I thought the
> > scientific method inadequate to investigate complex phenomena like
> > the causation of cancer. Most researchers know that it has many
> > factors, but there is no simple "container" in the scientific method
> > for that. And it was immediately relevant to attempts to make an AI
> > SW for cancer preventrics and cures.
>
> The scientific method is inadequate for a lot of things, because most
> questions that we care about in real life aren't scientific questions.
      
      
   [continued in next message]   
      