
Forums before death by AOL, social media and spammers... "We can't have nice things"

   sci.optics      Discussion relating to the science of op      12,750 messages   


   Message 11,618 of 12,750   
   haiticare2011@gmail.com to All   
   Re: Simple lock-in design for Oz-type me   
   27 Jan 14 05:30:03   
   
       
   > > Hi Jeroen,
   > >
   > > To be fair, you are correct, the signal IS growing faster than the
   > > noise. But what allows Mr. Horowitz to see it is that the 'standard
   > > deviation' of the noise goes way down, more than the SD of the
   > > signal. So if you do Y measurement episodes of X measures each, the
   > > variation in the noise (in the reference beam) will be tight, and
   > > the signal (in the measurement beam) will almost always be greater
   > > than that tight noise floor.
   >
   > Weird, what do you mean by the SD of the signal, or even of the noise
   > for that matter? (Hmm, OK, I guess for random noise the amplitude is
   > Gaussian, and that could be used to define some standard deviation of
   > the amplitude distribution... Is that what you mean?)
   >
   > George H.
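   The episode-averaging argument in the quote above can be checked with a
   toy simulation (illustrative only; the amplitude, noise level, and the
   counts X and Y are made-up numbers, not anything from the actual
   lock-in setup being discussed):

   ```python
   import numpy as np

   rng = np.random.default_rng(0)

   signal = 1.0    # assumed per-measure signal amplitude (arbitrary units)
   noise_sd = 5.0  # assumed per-measure noise SD; signal is buried in noise
   X = 10_000      # measures per episode
   Y = 200         # measurement episodes

   # Each episode averages X measures: measurement beam (signal + noise)
   # and reference beam (noise only).
   meas = (signal + noise_sd * rng.standard_normal((Y, X))).mean(axis=1)
   ref = (noise_sd * rng.standard_normal((Y, X))).mean(axis=1)

   # The episode-to-episode SD of both beams shrinks as 1/sqrt(X),
   # approx. noise_sd / sqrt(X) = 0.05 here ...
   print(meas.std(), ref.std())

   # ... so the averaged signal almost always clears the tightened
   # noise floor, even though it started 5x below the raw noise SD.
   print((meas > ref).mean())
   ```

   The point being: averaging does not shrink the signal, only the spread
   of each beam's episode mean, which is what "the SD of the noise goes
   way down" amounts to.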
   I am frankly using SD in the qualitative sense, as a description of the
   variation of the 'noise'. My particular background in dealing with this
   is the study of entropy as a biochemist, where the entropy of water in
   the free energy equation is probably the most powerful force in
   biological microscopic interactions, and the most widely misunderstood.
   It determines just about everything microscopically. It is
   misunderstood because most people want to see a "force" or a thing of
   quantitative power, like temperature, but no one has ever seen an
   "entropy." It is the entropic structure of water which gives our whole
   body its shape.
   Now, Leo Szilard in 1929 wrote a paper equating information and
   entropy. (He also conceived, and patented, the nuclear chain reaction.)
   In the paper, a Maxwell demon sits at an opening in a partition and,
   based on his knowledge of approaching molecules, opens or shuts a door.
   The negentropy gained is equivalent to his information. It was this
   quantitation which formed the basis of Shannon's formulations, and of
   information theory in general.
   So I am interested in an exact description of "noise," but not sure if
   it's possible in open systems like electronic measuring apparatuses.
   (In chemistry, particularly in the Gibbs free energy G = H - TS,
   everything is a closed system.)
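   Szilard's quantitation can be stated exactly (standard textbook form,
   not a formula from the 1929 paper verbatim): one bit of information
   about the molecule lets the demon extract at most

   ```latex
   W_{\max} = k_B T \ln 2 \approx 2.87 \times 10^{-21}\,\mathrm{J}
   \quad \text{at } T = 300\,\mathrm{K},
   ```

   and that same k_B T ln 2 is the minimum entropic cost of acquiring (or
   erasing) the demon's one-bit record, so no net work comes for free.
   Shannon's entropy, H = -sum_i p_i log2 p_i bits, generalizes the
   one-bit case to arbitrary message distributions.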
   JB   
   "One man's noise is another man's music."   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca