On 31/07/09 21:13, Paul wrote:
> A char, for instance, is always 8 bits or one byte. A Unicode char,
> if implemented, is 16 bits or two bytes.
A char is always one byte (by definition, sizeof(char) == 1), but that byte need not be 8 bits: CHAR_BIT is at least 8, and is larger on some DSPs.
> An int is a "natural" integer, a.k.a. "word", for the CPU architecture,
> so 8, 16, 32 or 64 bits depending on the CPU.
An int is at least 16 bits, even on an 8-bit CPU, because it must be able to represent at least the range -32767..32767.
--
Paul