

max value of integer


Question

In C, the integer (for a 32-bit machine) is 32 bits, and it ranges from -32,768 to +32,767. In Java, the integer (long) is also 32 bits, but ranges from -2,147,483,648 to +2,147,483,647.

I do not understand how the range is different in Java, even though the number of bits is the same. Can someone explain this?

2020/02/14


"In C, the integer (for a 32-bit machine) is 32 bits, and it ranges from -32,768 to +32,767."

Wrong. A 32-bit signed integer in two's-complement representation has the range -2^31 to 2^31 - 1, which is -2,147,483,648 to 2,147,483,647.
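
For illustration, a minimal sketch (assuming a C99 compiler that provides the fixed-width types in <stdint.h>) that prints those bounds:

    #include <stdio.h>
    #include <stdint.h>
    #include <inttypes.h>

    int main(void)
    {
        /* 2^31 - 1, computed in 64-bit arithmetic to avoid overflow */
        int64_t max = ((int64_t)1 << 31) - 1;

        printf("INT32_MIN = %" PRId32 "\n", INT32_MIN); /* -2147483648 */
        printf("INT32_MAX = %" PRId32 "\n", INT32_MAX); /*  2147483647 */
        printf("2^31 - 1  = %" PRId64 "\n", max);
        return 0;
    }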

2013/02/21

A 32-bit integer ranges from -2,147,483,648 to 2,147,483,647. However, the fact that you are on a 32-bit machine does not mean your C compiler uses 32-bit integers.

2013/02/21

The C language definition specifies minimum ranges for various data types. For int, this minimum range is -32767 to 32767, meaning an int must be at least 16 bits wide. An implementation is free to provide a wider int type with a correspondingly wider range. For example, on the SLES 10 development server I work on, the range is -2147483647 to 2147483647.
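
You can check what a particular implementation provides; here is a minimal sketch that queries <limits.h>:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* CHAR_BIT is the number of bits in a byte (almost always 8) */
        printf("int is %zu bytes (%zu bits)\n",
               sizeof(int), sizeof(int) * CHAR_BIT);
        printf("INT_MIN = %d\n", INT_MIN);
        printf("INT_MAX = %d\n", INT_MAX);
        return 0;
    }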

There are still some systems out there that use 16-bit int types (All The World Is Not A VAX, nor an x86), but there are plenty that use 32-bit int types, and maybe a few that use 64-bit.

The C language was designed to run on different architectures. Java was designed to run in a virtual machine that hides those architectural differences.

2013/02/21

The strict equivalent of the Java int is long int in C.

Edit: if int32_t is defined, then it is the equivalent in terms of precision. long int guarantees the precision of the Java int, because it is guaranteed to be at least 32 bits in size.
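
As a sketch of the difference (assuming <stdint.h> is available): int32_t is exactly 32 bits, while long is only guaranteed to be at least that wide:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        int32_t exact = INT32_MAX;   /* exactly 32 bits, like Java's int */
        long    wide  = 2147483647L; /* at least 32 bits, possibly 64 */

        printf("sizeof(int32_t) = %zu\n", sizeof exact);
        printf("sizeof(long)    = %zu\n", sizeof wide);
        return 0;
    }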

2013/02/21

That's because in C, an "integer on a 32-bit machine" doesn't mean that 32 bits are used to store it; it may be 16 bits as well. It depends on the machine (the width of int is implementation-defined).
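
One way to make that explicit in code is a compile-time guard; a sketch that rejects targets where int is narrower than 32 bits:

    #include <limits.h>

    /* Fail the build on a 16-bit-int target instead of miscomputing */
    #if INT_MAX < 2147483647
    #error "this code assumes int is at least 32 bits wide"
    #endif

    int main(void) { return 0; }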

2013/02/21