
I have two 16-bit integers of raw data.

For example:
High word = 17142 (decimal) or 0100001011110110 (binary)
Low word = 59769 (decimal) or 1110100101111001 (binary)

If you treat the two words together as one 32-bit float, the value is 123.456.
Binary --> 01000010111101101110100101111001

How do I convert the integer array [59769, 17142] to the float 123.456 in JavaScript?

Note: [X (16-bit low word), Y (16-bit high word)] ==> Z (32-bit float)


1 Answer


You can do this with typed arrays and an ArrayBuffer, which let you interpret the same bits in different ways (though the byte order is platform-specific). It's also possible with a DataView on the buffer, which lets you control the endianness explicitly.

Here's the typed array approach, which works with the endianness of my platform; see the comments:

// Create a buffer
var buf = new ArrayBuffer(4);
// Create a 16-bit int view of it
var ints = new Uint16Array(buf);
// Fill in the values
ints[0] = 59769;
ints[1] = 17142;
// Create a 32-bit float view of it
var floats = new Float32Array(buf);
// Read the bits as a float; note that by doing this, we're implicitly
// converting it from a 32-bit float into JavaScript's native 64-bit double
var num = floats[0];
// Done
console.log(num);
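
Since the typed-array version depends on the platform's byte order, you may want to check it before relying on the word order above. Here's a minimal sketch of how you could detect it (the isLittleEndian helper is my own illustration, not a built-in API):

// Illustrative helper: detect platform endianness by writing a known
// 16-bit value and checking which byte lands first in memory
function isLittleEndian() {
    var buf = new ArrayBuffer(2);
    new Uint16Array(buf)[0] = 0x0102;
    // On a little-endian platform, the low byte (0x02) comes first
    return new Uint8Array(buf)[0] === 0x02;
}
console.log(isLittleEndian()); // true on most consumer hardware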

Here's the DataView approach; note that the ints are written in the opposite order, because DataView methods default to big-endian:

// Create a buffer
var buf = new ArrayBuffer(4);
// Create a data view of it
var view = new DataView(buf);
// Write the ints to it
view.setUint16(0, 17142);
view.setUint16(2, 59769);
// Read the bits as a float; note that by doing this, we're implicitly
// converting it from a 32-bit float into JavaScript's native 64-bit double
var num = view.getFloat32(0);
// Done
console.log(num);
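
If you need to do this repeatedly, you could wrap the DataView approach in a small function. This is just a sketch (the name wordsToFloat is mine, not from any library), taking the words in the [low, high] order used in the question:

// Sketch of a reusable helper; wordsToFloat is an illustrative name
function wordsToFloat(lowWord, highWord) {
    var buf = new ArrayBuffer(4);
    var view = new DataView(buf);
    // DataView defaults to big-endian, so the high word goes first
    view.setUint16(0, highWord);
    view.setUint16(2, lowWord);
    return view.getFloat32(0);
}
console.log(wordsToFloat(59769, 17142)); // ~123.456 (as a 64-bit double)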