
Forums before their death by AOL, social media, and spammers... "We can't have nice things."

   comp.os.vms      DEC's VAX* line of computers & VMS.      264,096 messages   


   Message 263,997 of 264,096   
   Dan Cross to arne@vajhoej.dk   
   Re: DCL2   
   18 Dec 25 15:03:57   
   
   From: cross@spitfire.i.gajendra.net   
      
   In article <10hvmbb$3uroi$2@dont-email.me>,   
   Arne Vajhøj   wrote:   
   >On 12/17/2025 11:57 AM, Dan Cross wrote:   
   >> In article <10ht0rk$34qt0$2@dont-email.me>,   
   >> Arne Vajhøj   wrote:   
   >>> On 12/16/2025 9:05 AM, Dan Cross wrote:   
   >>>> Python is interpreted, yes, but people who are using it to do   
   >>>> numerical analysis are often using the jit-compiled variant,   
   >>>   
   >>> Most still use CPython.   
   >>   
   >> In your world of enterprise IT?  Sure.   
   >>   
   >> For numerical analysis?  Directly in Python?  No.   
   >>   
   >>> None of the JIT implementations (PyPy, GraalPy, Codon, etc.) have
   >>> really gotten traction.
   >>>
   >>> Reason: fear of compatibility issues, combined with the fact that
   >>> JIT usually does not matter.
   >>>   
   >>> Because:   
   >>>   
   >>>>                                                             and   
   >>>> more often the actual heavy computational lifting is being done   
   >>>> in a library that's exposed to Python via an FFI; so the actual   
   >>>> training code is in Fortran or C or some more traditional   
   >>>> compiled language.   
   >>>   
   >>> If CPython interpretation uses 0.1-1.0% of total CPU usage
   >>> and native library execution uses 99.0-99.9% of total CPU usage,
   >>> then ...   
   >>   
   >> See above.  I'm talking about software that's doing numerical   
   >> analysis directly in Python, _not_ via FFI.   
   >   
   >But practically nobody does that.   
      
   I haven't seen any evidence that suggests you are really in a   
   position to know that one way or the other.  On the other hand,   
   I have direct, first-hand knowledge of people who are doing   
   just that.   
      
   >By using the high level packages (pandas, polars,   
   >tensorflow, pytorch, numpy, scipy etc.) they can   
   >do what they need to do using much higher level   
   >constructs. No need to fiddle with LAPACK, BLAS,   
   >matrix multiplication and inversion algorithms.   
   >   
   >And on top of having much less,
   >much higher-level code to deal with,
   >they get far better performance. The
   >standard libraries are much faster than
   >custom Python code, even if the custom
   >code is JIT compiled.
   >   
   >Nobody wants to write 5-10 times more lines
   >of code to run 5-10 times slower.
      
   Sorry, but if you want to make a statement about some aspect of   
   the field that you just aren't involved in, you really need some   
   evidence to back it up, not just your intuition.   
      
   	- Dan C.   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   
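   [Editorial aside, not part of the archived message: the two performance
   claims quoted above - that native-library time dwarfs interpreter time,
   and that vectorized standard libraries beat hand-written Python loops -
   can be sketched in a few lines. This assumes numpy is installed; the
   exact ratios are machine-dependent.]

```python
# Editorial sketch, assuming numpy is available; ratios vary by machine.
import timeit
import numpy as np

# Amdahl's law for the quoted 0.1% / 99.9% split: even an infinitely
# fast interpreter (or JIT) caps the overall speedup at 1/(1 - 0.001).
interp_fraction = 0.001
max_speedup = 1.0 / (1.0 - interp_fraction)  # ~1.001x: JIT barely matters

N = 100_000
xs = list(range(N))
ax = np.arange(N, dtype=np.float64)

def dot_pure(a, b):
    # Pure-Python dot product: every multiply/add runs in the interpreter.
    total = 0.0
    for u, v in zip(a, b):
        total += u * v
    return total

def dot_numpy(a, b):
    # One call into numpy's compiled kernel; Python is only the glue layer.
    return float(np.dot(a, b))

t_pure = timeit.timeit(lambda: dot_pure(xs, xs), number=20)
t_np = timeit.timeit(lambda: dot_numpy(ax, ax), number=20)
print(f"max JIT speedup at 99.9% native: {max_speedup:.4f}x")
print(f"pure Python: {t_pure:.4f}s  numpy: {t_np:.4f}s")
```

   On a typical machine the loop lands one to two orders of magnitude
   behind the numpy call, which is the gap Arne points at; the Amdahl
   figure shows why JIT-compiling only the glue layer buys little.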



(c) 1994,  bbs@darkrealms.ca