This little JavaScript application converts arbitrarily large numbers between their decimal, hexadecimal, binary, and octal representations and also computes their one's and two's complements. Enter a value into one of the fields at the top, press Enter, and read the result from all the other fields.
[Calculator widget: input fields for Dec, Hex, Bin, and Oct; result sections Unsigned (Dec, Hex, Bin, Oct), One's Complement (Hex, Bin, Oct), and Two's Complement (Dec, Hex, Bin, Oct), each at 8, 16, 32, 64, and n bits]
This calculator also exists as the application Bit Fiddle for Windows and macOS.
The resulting values are presented in different bit sizes, where the value is clamped to the given number of least significant bits. The result at n bits adapts its size to the value, so it always displays the full value without clamping, using the minimal number of bits required for the unsigned variant.
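The clamping described above can be sketched as follows, using BigInt so that values of any size can be handled; the function names are illustrative and not taken from the application's source:

```javascript
// Keep only the n least significant bits of a value.
function clampToBits(value, n) {
  return value & ((1n << BigInt(n)) - 1n);
}

// Minimal number of bits needed for the unsigned representation,
// as used by the "n bits" result row.
function minimalBits(value) {
  return value === 0n ? 1 : value.toString(2).length;
}

// 0x1ff does not fit into 8 bits, so only the low 8 bits remain:
clampToBits(0x1ffn, 8);   // 0xffn
minimalBits(0x1ffn);      // 9
```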
The following characters can be entered:
Dec 0,1,2,3,4,5,6,7,8,9
Hex 0,1,2,3,4,5,6,7,8,9,a,b,c,d,e,f,A,B,C,D,E,F
Bin 0,1
Oct 0,1,2,3,4,5,6,7
All other characters are simply ignored. This makes it possible to enter numbers in various formats (for example, hexadecimal numbers with a 0x prefix) without problems. Be careful with the decimal point: it is also not detected, so a value like 123.456 will be interpreted as 123456. The output values always contain automatically generated separator characters to improve their display in the browser and enhance readability.
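The "ignore everything else" rule can be sketched like this (an illustrative helper, not the application's actual code): characters outside the digit set of the chosen base are simply dropped before conversion.

```javascript
// Drop every character that is not a valid digit in the given base.
function sanitize(input, base) {
  const digits = "0123456789abcdef".slice(0, base);
  return [...input.toLowerCase()].filter(c => digits.includes(c)).join("");
}

sanitize("0x5A7F", 16);  // "05a7f" (the x is dropped; the leading 0 does not change the value)
sanitize("123.456", 10); // "123456" (the decimal point is ignored)
```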
When displaying decimal values, a negative sign is used to show the two's complement interpreted as a negative number. Beware: the result of the complement is not necessarily the negative of the entered value, since the clamping of excess bits can make the two values differ entirely. The decimal display of the one's complement is turned off by default, as on modern computers a one's complement is only meaningful for binary, hexadecimal, or octal numbers. If anybody is interested, the code for displaying these values is commented out in the source code.
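Reading an n-bit pattern as a signed decimal value can be sketched as follows (an illustrative helper using BigInt, not the application's source): if the top bit is set, the pattern represents a negative number in two's complement.

```javascript
// Interpret an n-bit unsigned value as a signed two's complement number.
function toSigned(bits, n) {
  const half = 1n << BigInt(n - 1);
  return bits >= half ? bits - (half << 1n) : bits;
}

toSigned(0xffn, 8); // -1n : all bits set reads as -1 in 8-bit two's complement
toSigned(0x7fn, 8); // 127n : top bit clear, so the value is unchanged
```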
The size of the entered values is limited only by the capabilities of JavaScript.
How does the program calculate?
The application works with arrays. Any input value is first converted into its binary representation, and the bits are stored in an array. Since an array can be arbitrarily large, arbitrarily large values can be entered. To display the value, the array is then converted to a string representation separately for each output field. For example, the hexadecimal input 5a7f generates the array 0101101001111111, which in turn results in the decimal output value 23 167.
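The pipeline for the 5a7f example can be sketched like this (illustrative code, not the application's source; for brevity BigInt performs the final binary-to-decimal step, which the application implements itself):

```javascript
// Expand each hex digit to four bits and collect them in one array.
function hexToBits(hex) {
  return [...hex].flatMap(d =>
    parseInt(d, 16).toString(2).padStart(4, "0").split("").map(Number)
  );
}

// Fold the bit array into a decimal string.
function bitsToDecimal(bits) {
  return bits.reduce((acc, bit) => acc * 2n + BigInt(bit), 0n).toString();
}

const bits = hexToBits("5a7f"); // [0,1,0,1, 1,0,1,0, 0,1,1,1, 1,1,1,1]
bitsToDecimal(bits);            // "23167"
```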
The biggest problem when converting between numeral systems are the decimal numbers. While the decimal-to-binary direction is relatively easy using some simple additions, the binary-to-decimal direction had to be implemented using a subtraction algorithm.
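One way to realise the binary-to-decimal direction without big-integer arithmetic is a sketch like the following (not necessarily the exact algorithm the application uses): the bit array is repeatedly divided by 10 using binary long division, where each division step is a conditional subtraction, and the remainders become the decimal digits.

```javascript
// Divide a bit array by 10; the quotient is again a bit array.
function divBitsBy10(bits) {
  const quotient = [];
  let remainder = 0;
  for (const bit of bits) {
    remainder = remainder * 2 + bit;  // shift in the next bit
    if (remainder >= 10) {            // subtract the divisor where it fits
      quotient.push(1);
      remainder -= 10;
    } else {
      quotient.push(0);
    }
  }
  // Drop leading zeros but keep at least one bit.
  while (quotient.length > 1 && quotient[0] === 0) quotient.shift();
  return { quotient, remainder };
}

// Collect the remainders, least significant digit first.
function bitsToDecimalString(bits) {
  let digits = "";
  let current = bits;
  do {
    const { quotient, remainder } = divBitsBy10(current);
    digits = remainder + digits;
    current = quotient;
  } while (current.some(b => b === 1));
  return digits;
}

bitsToDecimalString([1, 0, 1, 1]); // "11" (binary 1011)
```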
Since the application works with arrays, it is not hyper-performant, but it is fast enough.
The reader is invited to have a look at the source code. If you find bugs or have other comments, an email to the author is highly appreciated. More information about binary conversions and numeral systems is available here (in German):