
Forums before death by AOL, social media and spammers... "We can't have nice things"

   comp.lang.c      Meh, in C you gotta define EVERYTHING      243,242 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 243,013 of 243,242   
   David Brown to James Russell Kuyper Jr.   
   Re: printf and time_t   
   14 Jan 26 09:26:39   
   
   From: david.brown@hesbynett.no   
      
   On 14/01/2026 03:24, James Russell Kuyper Jr. wrote:   
   > On 2026-01-11 08:32, Michael S wrote:   
   >> On Sun, 11 Jan 2026 04:59:47 -0800   
   >> Keith Thompson  wrote:   
   >>   
   >>> Michael S  writes:   
   >>>> On Sat, 10 Jan 2026 22:02:03 -0500   
   >>>> "James Russell Kuyper Jr."  wrote:   
   >>>>> On 2026-01-09 07:18, Michael S wrote:   
   >>>>>> On Thu, 8 Jan 2026 19:31:13 -0500   
   >>>>>> "James Russell Kuyper Jr."    
   >>>>>> wrote:   
   >>>>> ...   
   >>>>>>> I'd have no problem with your approach if you hadn't falsely   
   >>>>>>> claimed that "It is correct on all platforms".   
   >>>>>>   
   >>>>>> Which I didn't.   
   >>>>>   
   >>>>> On 2026-01-07 19:38, Michael S wrote:   
   >>>>> ...   
   >>>>>   > No, it is correct on all implementation.   
   >>>>   
   >>>> The quote is taken out of context.   
   >>>> The context was that on platforms that have properties (a) and (b)   
   >>>> (see below) printing variables declared as uint32_t via %u is   
   >>>> probably UB according to the Standard (I don't know for sure,   
   >>>> however it is probable),   
   > ...   
   >>   
   >>> I'm sure.  uint32_t is an alias for some predefined integer type.   
   >>>   
   >>> This:   
   >>>      uint32_t n = 42;   
   >>>      printf("%u\n", n);   
   >>> has undefined behavior *unless* uint32_t happens to be an alias for   
   >>> unsigned int in the current implementation -- not just any 32-bit   
   >>> unsigned integer type, only unsigned int.   
   >>>   
   >>> If uint32_t is an alias for unsigned long (which implies that   
   >>> unsigned long is exactly 32 bits), then the call's behavior is   
   >>> undefined.  (It might happen to "work".)   
   >>>   
   >>   
   >> What exactly, assuming that conditions (a) and (b) are fulfilled,   
   >> should an implementation do to prevent it from working?   
   >> I mean, short of completely crazy things that would get the   
   >> maintainer immediately fired?   
   >   
   > I'm quite positive that you would consider anything that might give   
   > unexpected behavior to such code to be "crazy". The simplest example I   
   > can think of is that unsigned int is big-endian, while unsigned long is   
   > little-endian, and I would even agree that such an implementation would   
   > be peculiar, but such an implementation could be fully conforming to the   
   > C standard.   
   >   
      
   Would it be allowed (in the sense of being possible in a hypothetical   
   but fully conforming implementation) to have "unsigned long" be 32-bit,   
   without padding, while "unsigned int" is 64-bit wide with 32 value bits   
   and 32 padding bits?  A CPU might be able to handle 64-bit lumps faster   
   than 32-bit lumps and choose such a setup to make "unsigned int" as fast   
   as it can.  (uint32_t in this case would be an alias for "unsigned   
   long", as it can't have padding bits.)   
      
   (I realise this is swapping your pink unicorn C implementation for a   
   green unicorn C implementation, but sometimes it is fun to see how weird   
   an implementation you can imagine while still supporting conforming C.)   
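   
   Incidentally, if anyone wants to see which type uint32_t actually   
   aliases on their own implementation, C11's _Generic can report it   
   directly - a small sketch (nothing here is implementation-specific   
   beyond requiring a C11 compiler and <stdint.h>):   
   
   ```c
   #include <stdint.h>
   #include <stdio.h>
   
   int main(void)
   {
       /* _Generic selects on the type of the controlling expression,
          so this tells us what uint32_t is a typedef for here. */
       const char *alias = _Generic((uint32_t)0,
           unsigned int:   "unsigned int",
           unsigned long:  "unsigned long",
           unsigned short: "unsigned short",
           default:        "some other type");
   
       printf("uint32_t is %s here\n", alias);
       return 0;
   }
   ```
   
   On typical 32-bit and 64-bit desktop targets this prints "unsigned   
   int", but nothing in the standard requires that.   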
      
   >>> If uint32_t and unsigned long have different sizes, it still might   
   >>> happen to "work", depending on calling conventions.  Passing a   
   >>> 32-bit argument and telling printf to expect a 64-bit value clearly   
   >>> has undefined behavior, but perhaps both happen to be passed in 64-bit   
   >>> registers, for example.   
   >>>   
   >>   
   >> And that is sort of intimate knowledge of the ABI that I don't want to   
   >> exploit, as already mentioned in my other post in this sub-thread.   
   >   
   > Which is precisely what's wrong about your approach - it relies upon   
   > intimate knowledge of the ABI. Specifically, it relies on unsigned int   
   > and unsigned long happening to have exactly the same size and   
   > representation.   
   >   
      
   I don't think there is anything intrinsically wrong with writing code   
   that makes assumptions about the target ABI - non-portable code has its   
   essential place in programming.  But there /is/ something wrong about   
   making assumptions about an ABI while claiming you are writing portable   
   code that does not make such assumptions.  And there is something that   
   is at least "stylistically questionable" about needlessly and wantonly   
   doing so.  By all means write code that relies on the specifics of the   
   target or compiler, but do so knowingly, do so only when you have good   
   reason for it, and do so in a way that is clear to anyone later trying   
   to re-use the code on some other system.   
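   
   For completeness, both fully portable ways of printing a uint32_t,   
   with no ABI assumptions at all, are one line each (PRIu32 comes from   
   <inttypes.h> and expands to the correct conversion specifier for   
   whatever type uint32_t aliases on the implementation):   
   
   ```c
   #include <inttypes.h>
   #include <stdint.h>
   #include <stdio.h>
   
   int main(void)
   {
       uint32_t n = 42;
   
       /* Option 1: the <inttypes.h> format macro supplies the right
          length modifier for this implementation's uint32_t. */
       printf("%" PRIu32 "\n", n);
   
       /* Option 2: convert to a type with a fixed conversion spec;
          unsigned long can hold any uint32_t value. */
       printf("%lu\n", (unsigned long)n);
   
       return 0;
   }
   ```
   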
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca