So I tried to find an authoritative source for you on this, but I believe it depends on the vendor.
When it comes to Linux (and iOS, it seems) you can assume that `long` == machine word size. A machine word is the unit your architecture uses to address memory: on a 32-bit Linux or iOS system, a `long` is 32 bits wide; on a 64-bit Linux or iOS system, it's 64 bits wide. You'll note I hedge strongly here by referring only to Linux and iOS. Windows behaves differently (a `long` stays 32 bits even on 64-bit Windows), and I've seen enough anecdotes to suggest that there are in fact many different interpretations of what `int` and `long` should be. This Wikipedia article is instructive.
In the case of your SO example, on a 32-bit machine it just so happened that `UInt32` and `unsigned long` took up the same number of bits. Hooray! On a 64-bit machine, `unsigned long` takes 64 bits, whilst `UInt32`, as you might assume, still takes 32 bits.
Now, where it gets a bit weird is that the `unsigned long` (a 64-bit value on your 64-bit machine) of `AQInputCallback` and the `UInt32` (32 bits, natch) of `AudioFileWritePackets` represent the same numeric quantity, which becomes a problem in the hypothetical case where your 64-bit quantity exceeds the maximum value representable in 32 bits. The cast in the example answer, `UInt32 inNumPacketsTmp = inNumPackets;`, does the right thing as long as the value is no greater than 4,294,967,295. When it exceeds that, C's conversion rules for unsigned types kick in: the value is reduced modulo 2^32, so the high-order bits are discarded and the result wraps around. You could safely assume that whatever happens at that point is your problem, though. :)
4,294,967,295 may also be written as `UINT32_MAX`, or whatever the contract provided by `UInt32` specifies.