From: james.harris.1@gmail.com   
      
   On 10/02/2022 18:34, antispam@math.uni.wroc.pl wrote:   
   > James Harris wrote:   
      
   ...   
      
   >> From memory, it is advised to put a jump immediately after enabling   
   >> Pmode so as to flush the instruction pipeline on early processors 386,   
   >> 486, and possibly Pentium before Pentium Pro. But is the pipeline   
   >> flushed before or after the jump is executed? IOW, is the jump itself   
   >> decoded in real mode or in PMode16 (the state the processor is put in by   
   >> the load of CR0)?   
   >   
   > Your question is confused. Instructions are decoded in parallel with
   > execution of previous instructions. This may happen a few clocks
   > before execution. So _normally_ the jump is decoded before CR0 bit 0
   > is set, i.e. decoded in real mode. But I can imagine a rather obscure
   > situation in which the jump is decoded after CR0 bit 0 is set, so in
   > protected mode.
      
   The point is that the encoding of relative jumps appears to be the same
   in Real Mode and 16-bit Protected Mode, so why does it matter which
   mode they are decoded in?
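   
   For example, assembling a short jump under "bits 16" gives the same
   bytes whether the code will run in Real Mode or in PM16, because the
   rel8 displacement is mode-independent (a sketch; byte values are from
   the Intel opcode map):
   
   bits 16
   jmp short next ;assembles to EB 00 whether executed in RM or PM16
   next: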
      
   >   
   >> Perhaps all jump instructions assemble to the same bytes so it wouldn't   
   >> matter. Certainly a jump to the next instruction will be coded as EB00   
   >> in either 16-bit or 32-bit mode.   
   >   
   > That is the recommended instruction, safe under all circumstances.
      
   So, it seems, are other jumps. And possibly most instructions.   
      
   Any idea which instructions would be coded differently in RM and PM16?   
      
   AFAICS almost none. I am beginning to suspect it's only direct memory   
   accesses (not relative ones such as relative jmp and call).   
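   
   To illustrate what I mean (my own sketch, not from the manuals): the
   bytes of a direct memory access are identical in both modes, but the
   processor interprets the segment part of the address differently:
   
   bits 16
   mov ax, [0x1234] ;encodes as A1 34 12 in RM and in PM16 alike
   ;RM:   linear address = (DS << 4) + 0x1234
   ;PM16: DS holds a selector, so linear = descriptor base + 0x1234,
   ;      subject to limit and access checks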
      
   >   
   >> In fact, as the assembler (Nasm, in this case but the point is   
   >> presumably generally applicable) recognises only "bits 16" or "bits 32"   
   >> and not Rmode and Pmode perhaps instructions in PM16 decode exactly as   
   >> they do in Real mode.   
   >>   
   >> But then if the encodings are the same then why would flushing the   
   >> instruction queue be necessary?   
   >   
   > Correspondence between mnemonics and instruction bits is the same,
   > so no need for an extra assembler directive. But the processor treats
   > instruction bits differently depending on mode.
      
   Can you think of an example other than direct memory accesses?   
      
   >   
   >> To be clear, where I believe a "bits 32" directive sits is /after/ the   
   >> far jump as in   
   >>   
   >> mov cr0, eax ;Set bit 0   
   >> ... (potentially a large number of instructions)   
   >> jmp seg:offset   
   >> bits 32   
   >> offset:   
   >>   
   >> It seems a bit of a conundrum and leads to the obvious question: exactly
   >> what differences are there between instruction decoding in real mode and
   >> in PM16 (the mode immediately after setting CR0 bit 0)?
   >   
   > Note: the "canonical" sequence has _two_ jumps: one short jump to
   > flush the decode queue and a long jump to load CS. Namely, setting
   > CR0 bit 0 gives you 16-bit protected mode. You switch to 32-bit
   > mode by loading a 32-bit segment descriptor.
      
   Yes, and one can apparently mix the two modes - some segments in PM16   
   and some in PM32.   
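   
   For reference, the canonical two-jump sequence sketched in NASM (the
   labels and the CODE32_SEL selector are my own names; CODE32_SEL is
   assumed to index a 32-bit code descriptor in the GDT):
   
   bits 16
   mov eax, cr0
   or eax, 1 ;set PE (bit 0)
   mov cr0, eax ;now in 16-bit protected mode
   jmp short flush ;near jump to flush the prefetch queue
   flush:
   jmp CODE32_SEL:pm32_entry ;far jump loads CS from the GDT
   bits 32
   pm32_entry: ;CS now refers to a 32-bit code segment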
      
      
   >   
   > IIRC a simpler sequence failed (on an actual 386). But I did my
   > tests more than 25 years ago so I am not 100% confident in my
   > memory. So, if you want to be sure, or want to know what happens
   > beyond simple fail/works, then get a real 386 and run enough tests...
   > OTOH I would not expect detailed explanations; during the switch
   > the processor is in a strange transitory mode so it can exhibit
   > weird behaviour. Intel documented how to avoid the pitfalls,
   > but they have no interest in providing a deeper explanation.
      
   Indeed.   
      
      
   --   
   James Harris   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   