Unity3D and RawInput

Post questions, comments and feedback to our 3Dconnexion Windows Development Team.


Hypersonic
Posts: 255
Joined: Mon Jul 12, 2010 5:58 pm

Unity3D and RawInput

Post by Hypersonic » Sun Feb 23, 2014 2:50 pm

I've recently found out that Unity3D uses RawInput, not DirectInput:

http://forum.unity3d.com/threads/131603 ... Any-advice
http://forum.unity3d.com/threads/136277 ... t-Now-what

The Editor and Player are separate

[HKEY_CURRENT_USER\Software\Unity\UnityEditor]
[HKEY_CURRENT_USER\Software\Unity\Player]

Hightree apparently wrote a UnityEditor extension; perhaps it can be easily ported to the Player portion?
http://forum.unity3d.com/threads/182382 ... OpenSource
Glancing at the GitHub source, it appears that Hightree is using 3DX's own API via TDxInput.dll, rather than the RawInput provided by the Unity engine.
I suppose this has an advantage over RawInput in that the 3DX driver can configure the input, whereas accessing RawInput bypasses 3DX's driver entirely?
Unless the driver can issue commands to the firmware, and the firmware then configures the input that it sends down the USB wire?

I found another extension called cinput2. http://cinput2.weebly.com/
What got me into this was trying to configure this new 6DOF game (still in development) that uses the Unity3D engine: http://www.neonxsz.com/
While NeonXSZ does use cinput2, I'm not sure which version. There's another Unity3D cinput2 6DOF game coming as well https://www.facebook.com/pages/Geocore/163786060381825

According to Unity3D's documentation
https://docs.unity3d.com/Documentation/ ... isRaw.html
http://docs.unity3d.com/Documentation/S ... tAxis.html
it will convert the raw values to -1 to +1 (floating point, I assume).
AFAIK RawInput values are unprocessed values straight from the USB HID report: 2-byte integers (signed or unsigned?), which Unity3D then converts to -1 to +1 floats or doubles.
I'm thinking: what if Unity3D guessed wrong about whether this device's integers are signed or unsigned?
By not giving game developers access to the 'raw RawInput', but rather to their interpretation of it, perhaps game developers won't be able to compensate, due to the information lost in conversion.
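To make the guess concrete, here's a small sketch (my own assumption of how a 16-bit HID axis value might get normalized; this is not Unity's actual code) showing how far apart the two interpretations of the same wire value land:

```python
# Illustrative only: two plausible ways an engine could normalize a raw
# 16-bit HID axis value to the -1..+1 range. Neither is Unity's actual code.

def normalize_as_unsigned(raw):
    # Assume 0..65535 with rest near the middle: recenter, then scale.
    return (raw - 32768) / 32768.0

def normalize_as_signed(raw):
    # Assume the same bits are two's-complement -32768..32767.
    if raw >= 32768:
        raw -= 65536
    return raw / 32768.0

# The same wire value lands in very different places depending on the guess:
raw = 0xFFFF  # the device's maximum if unsigned, but -1 if signed
print(normalize_as_unsigned(raw))  # -> 0.999969482421875 (near +1)
print(normalize_as_signed(raw))    # -> -3.0517578125e-05 (near 0)
```

If the guess is wrong, the game only ever sees the already-normalized float, so it can't tell which interpretation produced it.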

case1: if RawInput is 0 to 65535, and they thought it was -32768 to 32767

-when minimum input (0=0x0000) is applied it thinks input is at rest
-when half-min input (16383=0x3FFF) is applied it thinks you're pushing half-max
-when at rest input (32767=0x7FFF) is applied it thinks you're pushing maximum
-when half-max input (49151=0xBFFF) is applied it thinks you're pushing half-min (0xBFFF is interpreted as -16385)
-when maximum input (65535=0xFFFF) is applied it thinks input is near rest (0xFFFF is interpreted as -1)
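A quick check of case 1's numbers (purely illustrative; `as_signed` is my own helper, not anything from Unity):

```python
# Case 1 (my reconstruction): the device sends unsigned 0..65535,
# but the engine reinterprets the same bits as two's-complement signed.

def as_signed(raw):
    """Reinterpret an unsigned 16-bit value as two's-complement signed."""
    return raw - 65536 if raw >= 32768 else raw

print(as_signed(0))      # -> 0      : minimum input reads as rest
print(as_signed(16383))  # -> 16383  : half-min input reads as half-max push
print(as_signed(32767))  # -> 32767  : rest reads as maximum push
print(as_signed(49151))  # -> -16385 : half-max input reads as half-min push
print(as_signed(65535))  # -> -1     : maximum input reads as near rest
```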

case2: if RawInput is -32768 to 32767, and they thought it was 0 to 65535

-when minimum input (-32768=0x8000) is applied it thinks input is at rest
-when half-min input (-16385=0xBFFF) is applied it thinks you're pushing half-max
-when at rest input (0=0x0000) is applied it thinks you're pushing minimum
-when half-max input (16383=0x3FFF) is applied it thinks you're pushing half-min
-when maximum input (32767=0x7FFF) is applied it thinks input is near at rest
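And the mirror-image check for case 2 (again just my own illustration, using a bit mask to reinterpret the signed value's bits as unsigned):

```python
# Case 2 (my reconstruction): the device sends signed -32768..32767,
# but the engine reads the same bits as unsigned 0..65535 (rest ~32767).

def as_unsigned(raw):
    """Reinterpret a signed 16-bit value's bits as unsigned."""
    return raw & 0xFFFF

print(as_unsigned(-32768))  # -> 32768 : minimum input reads as rest
print(as_unsigned(-16385))  # -> 49151 : half-min input reads as half-max push
print(as_unsigned(0))       # -> 0     : rest reads as minimum push
print(as_unsigned(16383))   # -> 16383 : half-max input reads as half-min push
print(as_unsigned(32767))   # -> 32767 : maximum input reads as near rest
```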

2 byte signed integer format:

0 (0x0000)
+1 (0x0001)
...
+32766 (0x7FFE)
+32767 (0x7FFF)
(sudden break 'value-wise', but not 'bit-wise')
-32768 (0x8000)
-32767 (0x8001)
...
-2 (0xFFFE)
-1 (0xFFFF)
-----

How to compensate for either case

Map Unity3D (0 to +1) to (-1 to 0)
Map Unity3D (-1 to 0) to (0 to +1)

if (x > 0) {x -= 1;} else if (x < 0) {x += 1;}
Basically shift input 1 unit in the direction of zero.
(maybe mirroring after shifting, e.g. mirroring around 0.5: 0.4->0.6, 0.6->0.4, 0.3->0.7, 0.7->0.3)
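The proposed shift (plus an optional mirror step; simple negation is just my guess at what the mirroring would amount to) could be sketched like this:

```python
# Untested hypothesis: undo a signed/unsigned mix-up after the fact by
# shifting the engine's normalized -1..+1 value one unit toward zero.

def compensate(x, mirror=False):
    if x > 0:
        x -= 1.0
    elif x < 0:
        x += 1.0
    if mirror:
        x = -x  # one possible reading of the "mirroring" step (assumption)
    return x

# Case 1 example: actual rest (raw 0x7FFF) gets misread as almost +1.0;
# after shifting it lands back near 0, where rest belongs.
print(compensate(32767 / 32768.0))  # -> -3.0517578125e-05 (~= 0)
```

Note the lost information still bites at the edges: an actual minimum of raw 0 normalizes to exactly 0, which the shift leaves at 0 instead of -1, so the compensation can only be approximate.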

The behavior I'm experiencing in NeonXSZ

HKEY_CURRENT_USER\Software\IntravenousSoftware\NeonXSZ
I can't seem to adjust cInput_indAxSens_h or cInput_indAxGrav_h in the registry; it always sets them back to 3.

cInput_saveCals_h -1

~ANY% down: no movement (sudden break value-wise)
rest: pitch down full speed
~33% up: pitch down slow
~100% up: no movement

cInput_saveCals_h 0

~ANY% down: pitch up full speed (sudden break value-wise)
rest: pitch down full speed
~33% up: no movement
~100% up: pitch up full speed

cInput_saveCals_h 1

~ANY% down: pitch up full speed (sudden break value-wise)
rest: no movement
~1% up: pitch up slowly
~33% up: pitch up full speed (sens is locked at 3, making 33% maxed I suppose)

Observations of this data

Subtracting 1 removes the ability to pitch up
Adding 1 removes the ability to pitch down

The problem is that calibration shifts all values in the same direction. Perhaps my hypothesis is correct: shifting input 1 unit in the direction of zero might fix it (maybe you'd need to do mirroring as well). Or perhaps it can't be compensated for, due to the information lost in Unity3D's erroneous conversion to -1 to +1. In that case, I think Unity3D game developers should be given access to unprocessed raw input, allowing them to fix it themselves.
