Source code of xcube?

Post questions, comments and feedback to our 3Dconnexion UNIX and Linux Development Team.

Moderator: Moderators

Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Source code of xcube?

Post by Hans Meine »

Hi,

I have no problems receiving the 6DOF input events in my app, but I am not sure I'm doing the rotation correctly. The xcube example works fine and looks very simple; is there any chance we get to see the source for it?

I know about the difference between three single-axis rotations and a single rotation about a 3D axis, which 3DxInputI_API.pdf from the Windows SDK describes as "important":
The translation vector is fairly easy to interpret. The three components (X, Y, and Z) of the translation vector can be applied in the same manner as similar data from the keyboard or mouse is applied to the viewing transform. The rotation vector is a different matter. Applying the rotation vector as individual parts will not give the same result as rotation about the vector (see below). If the data is applied individually the user will notice a “wobble” when performing a rotation.
The problem is that there's no hint as to how the three components relate to a 3D rotation vector/angle pair.

Greetings,
Hans
crobl
Moderator
Posts: 138
Joined: Mon Feb 26, 2007 8:34 am
Location: Freiham, Germany

Post by crobl »

Hello Hans,

I've found two links that you might find quite useful. Please have a look at:
Transformations in 3D and Appendix G, Homogeneous Coordinates and Transformation Matrices
Both describe all you need for translation, rotation, scaling and viewing in 3D.

The latter link points to the online version(?) of "OpenGL Programming Guide: The Official Guide to Learning OpenGL" - a good book, worth buying! Besides OpenGL, it explains the formulas and matrices behind translation, rotation and viewing of objects in 3D.

Regards,
Christian Robl
3Dconnexion
Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Post by Hans Meine »

Oh, thanks, but it's not the math that is my problem; in fact, I have successfully written some 3D visualization software, and you can find a C++ quaternion class of mine here:
http://kogs-www.informatik.uni-hamburg. ... ernion.hxx

The problem is that I do not know how to interpret the three rotational sensor values coming from the 3Dconnexion device; if it were three individual rotations, I'd need to know the order in which they are to be applied, but the cited PDF suggests that this is wrong.

It does not seem to be a rotation axis vector either, but maybe that's it and I should just normalize it to represent the axis and interpret its magnitude as the rotation angle?
Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Post by Hans Meine »

I have had a look at the "helix" demo program by Simon, and its SpacePilot control seemed rather intuitive. Looking at the code, I found that it indeed interprets the rx/ry/rz values as three separate rotations; the code looks roughly like this:

Code:

Rotate `camera_pos` around `camera_up` by `ry`
left := cross(camera_pos, camera_up)
Rotate `camera_pos` and `camera_up` around `left` by `rx`
Rotate `camera_up` around `camera_pos` by `rz`
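For reference, here is roughly what that scheme looks like as plain C - a sketch of my reading of it, not the actual helix source; Vec3, cross(), normalize() and rotate_about_axis() are my own illustrative helpers, and I assume rx/ry/rz have already been scaled down to small angles in radians:

Code:

#include <math.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 cross(Vec3 a, Vec3 b)
{
    Vec3 r = { a.y * b.z - a.z * b.y,
               a.z * b.x - a.x * b.z,
               a.x * b.y - a.y * b.x };
    return r;
}

static Vec3 normalize(Vec3 v)
{
    double len = sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    Vec3 r = { v.x / len, v.y / len, v.z / len };
    return r;
}

/* Rotate v about a unit-length axis by `angle` radians (Rodrigues' formula). */
static Vec3 rotate_about_axis(Vec3 v, Vec3 axis, double angle)
{
    double c = cos(angle), s = sin(angle);
    double d = (1.0 - c) * (axis.x * v.x + axis.y * v.y + axis.z * v.z);
    Vec3 kxv = cross(axis, v);
    Vec3 r = { v.x * c + kxv.x * s + axis.x * d,
               v.y * c + kxv.y * s + axis.y * d,
               v.z * c + kxv.z * s + axis.z * d };
    return r;
}

/* Apply the three sensor rotations to the camera vectors, helix-style. */
static void apply_sensor_rotation(Vec3 *camera_pos, Vec3 *camera_up,
                                  double rx, double ry, double rz)
{
    *camera_pos = rotate_about_axis(*camera_pos, normalize(*camera_up), ry);
    Vec3 left = normalize(cross(*camera_pos, *camera_up));
    *camera_pos = rotate_about_axis(*camera_pos, left, rx);
    *camera_up  = rotate_about_axis(*camera_up,  left, rx);
    *camera_up  = rotate_about_axis(*camera_up, normalize(*camera_pos), rz);
}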
Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Post by Hans Meine »

I should add that I am still looking for documentation on "the proper way" to do it.

Also, I wonder what to do with the MagellanPeriod value?
IIUC, one could simply multiply the sensor values by the period? Or would one rather need to multiply the previous sensor values by the period, if it gives the time between the last event and the current one, during which the previous values were valid?

Do people use the period value at all in practice?

Oh, BTW: does anybody have a link/hint on where to look at the relevant code in Blender? (I suspect that implementation is quite well-tested and officially approved?!)
crobl
Moderator
Posts: 138
Joined: Mon Feb 26, 2007 8:34 am
Location: Freiham, Germany

Post by crobl »

Hi Hans,

Yes, indeed those are independent values for translation (x/y/z) and rotation (x/y/z), not a composite rotation/translation value.
I suppose you know that already, but for completeness: the values are of course not angles, but represent a velocity, a rate of change.

You can safely ignore the MagellanPeriod. It's a legacy thing we still have in our driver/SDK. I also think that no current applications (or at least very few of them) use this value.

Regarding Blender, support should be integrated in the latest version and thus be available in the source code there.

Regards,
Christian
3Dconnexion
jwick
Moderator
Posts: 3331
Joined: Wed Dec 20, 2006 2:25 pm
Location: USA

Post by jwick »

The rotation components are, as you guessed, the axis of rotation. The length of the vector represents a dimensionless instantaneous angle about that axis.

The Windows SDK has the src for a function to convert the components to a rotation matrix if you wish to use that code (we always do).
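If you'd rather roll your own, the conversion is just the standard axis/angle-to-matrix math. A rough sketch in C (the function name is mine, not the SDK's, and you still have to scale the raw values to an angle in radians that feels right for your app):

Code:

#include <math.h>

/* Build a 3x3 rotation matrix R such that v' = R * v (column vectors)
 * from the device's rotation components: (rx, ry, rz) is the axis,
 * its length the angle (Rodrigues' formula). */
static void rotation_vector_to_matrix(double rx, double ry, double rz,
                                      double m[3][3])
{
    double angle = sqrt(rx * rx + ry * ry + rz * rz);
    if (angle < 1e-9) {                        /* no rotation: identity */
        m[0][0] = m[1][1] = m[2][2] = 1.0;
        m[0][1] = m[0][2] = m[1][0] = m[1][2] = m[2][0] = m[2][1] = 0.0;
        return;
    }
    double x = rx / angle, y = ry / angle, z = rz / angle;   /* unit axis */
    double c = cos(angle), s = sin(angle), t = 1.0 - c;

    m[0][0] = t * x * x + c;      m[0][1] = t * x * y - s * z;  m[0][2] = t * x * z + s * y;
    m[1][0] = t * x * y + s * z;  m[1][1] = t * y * y + c;      m[1][2] = t * y * z - s * x;
    m[2][0] = t * x * z - s * y;  m[2][1] = t * y * z + s * x;  m[2][2] = t * z * z + c;
}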
Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Post by Hans Meine »

Hi Christian,

thanks for your reply.
crobl wrote:Yes, indeed those are independent values for translation (x/y/z) and rotation (x/y/z), not a composite rotation/translation value.
OK, but what is the proper order when applying the three rotations? This will probably only matter when you twist the knob about multiple axes at once, which is not very common I guess, but then why does the manual mention this at all? In helix, the order is Y, X, Z, which looks quite arbitrary to me. In Blender, there does not seem to be a 3-axis rotation mode (from looking at the source - somehow it won't detect the SpacePilot here, so I can't test), so I am not getting more educated from looking at that code.
crobl wrote:You can safely ignore the MagellanPeriod. It's a legacy thing we still have in our driver/SDK. I also think that no current applications (or at least very few of them) use this value.
Ah, OK, thanks for the info. But then - doesn't that mean that the X11 events have to arrive at a fixed frequency? Or do I have to discard most events and take care of the timing myself? (E.g., evaluate the last reported 6DOF device state 20 times a second.)
crobl wrote:Regarding Blender, support should be integrated in the latest version and thus be available in the source code there.
Yes, I found the code in Blender; searching for NDOF or NDof brings up the following more or less relevant files:
intern/ghost/GHOST_ISystem.h
intern/ghost/intern/GHOST_SystemX11.h
intern/ghost/intern/GHOST_C-api.cpp
intern/ghost/intern/GHOST_SystemWin32.cpp
intern/ghost/intern/GHOST_System.cpp
intern/ghost/intern/GHOST_System.h
intern/ghost/intern/Makefile
intern/ghost/intern/GHOST_EventNDOF.h
intern/ghost/intern/GHOST_NDOFManager.cpp
intern/ghost/intern/GHOST_NDOFManager.h
intern/ghost/intern/GHOST_SystemCarbon.cpp
intern/ghost/intern/GHOST_SystemX11.cpp
intern/ghost/GHOST_C-api.h
intern/ghost/GHOST_Types.h
source/blender/include/transform.h
source/blender/include/blendef.h
source/blender/include/BIF_mywindow.h
source/blender/include/BIF_resources.h
source/blender/include/BSE_view.h
source/blender/include/mydevice.h
source/blender/include/BIF_transform.h
source/blender/src/ghostwinlay.c
source/blender/src/editscreen.c - contains filterNDOFvalues (dominant axis filter)
source/blender/src/transform.c
source/blender/src/header_view3d.c (GUI code)
source/blender/src/transform_generics.c
source/blender/src/space.c (only manages GUI mode switching original/fly/transform)
source/blender/src/transform_ndofinput.c
source/blender/src/view.c
I have not looked at the GHOST files, since I assume they contain platform-dependent low-level code. I have annotated the files relevant for the transform; AFAICS one can safely ignore the rest.

There's also on-line access to the repository here:
https://svn.blender.org/svnroot/bf-blen ... ender/src/

Hopefully, the above links are helpful for someone else; for me it was not very interesting after all, since my assumption that it contains 3DOF rotation code seems to be wrong.
Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Post by Hans Meine »

But please, could someone from 3Dconnexion check whether it would be possible to release the source code of "xcube"? It's a very simple example after all; what reason could there be not to release its code?
jwick
Moderator
Posts: 3331
Joined: Wed Dec 20, 2006 2:25 pm
Location: USA

Post by jwick »

Hans,

Christian should be able to scare up the src code for xcube (it may not be meant for outside consumption though). The Windows 3DxWare SDK contains src for several demos that have been scrubbed a bit better.

The rotation values are NOT Euler Angles. You do not apply them in a specific order. They represent the axis of rotation. They are more like a quaternion that you mentioned you are familiar with. You must not accumulate the axis values by themselves. You must accumulate them by applying them as a delta to an existing orientation. E.g., as a delta to a rotation matrix or a quaternion. As mentioned earlier, the Windows SDK contains a C function to convert them to a delta matrix. You must then accumulate that with your existing orientation.
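As a sketch of what that accumulation can look like with a quaternion (my own minimal types and names here, not SDK code; `sensitivity` is whatever factor you use to map the raw counts to an angle):

Code:

#include <math.h>

typedef struct { double w, x, y, z; } Quat;   /* illustrative type */

/* Hamilton product r = a * b */
static Quat quat_mul(Quat a, Quat b)
{
    Quat r;
    r.w = a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z;
    r.x = a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y;
    r.y = a.w * b.y + a.y * b.w + a.z * b.x - a.x * b.z;
    r.z = a.w * b.z + a.z * b.w + a.x * b.y - a.y * b.x;
    return r;
}

/* Per device event: build a small delta rotation from (rx, ry, rz),
 * where the vector is the axis and its length (times sensitivity) the
 * angle, and accumulate it onto the stored orientation. Re-normalize
 * the orientation occasionally to fight rounding drift. */
static void accumulate_delta(Quat *orientation,
                             double rx, double ry, double rz,
                             double sensitivity)
{
    double len = sqrt(rx * rx + ry * ry + rz * rz);
    if (len < 1e-9)
        return;                                   /* no rotation */
    double angle = len * sensitivity;
    double s = sin(angle * 0.5) / len;
    Quat delta = { cos(angle * 0.5), rx * s, ry * s, rz * s };
    /* delta * orientation applies the delta in eye/world space;
     * swap the operands to apply it in object space instead. */
    *orientation = quat_mul(delta, *orientation);
}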

That having been said, you can cheat if the angles are very small. You won't notice the difference if you multiply small X,Y,Z rotations together. You must then append that to your existing matrix. And perform the same step at the next event from the device.
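In fixed-function OpenGL terms, that cheat boils down to something like this rough sketch (the function name and the degrees scaling are my own assumptions):

Code:

#include <GL/gl.h>

/* Call once per device event; rx, ry, rz are the raw sensor values
 * already scaled to small angles in degrees (glRotatef expects degrees). */
static void apply_small_rotations(float rx, float ry, float rz)
{
    glMatrixMode(GL_MODELVIEW);
    /* The order barely matters because the angles are tiny (the error is
     * second order in the angles). glRotatef post-multiplies the current
     * matrix, so these act in the local frame of whatever is on the
     * matrix stack - see the note below about the coordinate space. */
    glRotatef(rx, 1.0f, 0.0f, 0.0f);
    glRotatef(ry, 0.0f, 1.0f, 0.0f);
    glRotatef(rz, 0.0f, 0.0f, 1.0f);
}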

Finally, make sure you are applying the transformation in the correct coordinate space. If you want to move the camera vs. moving an object, you need to apply the transformation in the correct space. Always transform the vectors from the eye space to the space you want to control. The SDK doc contains details on this math. Some forum topics have also drilled down into the math.

Jim
3Dx Software Development
Hans Meine
Posts: 12
Joined: Mon Jun 16, 2008 5:21 am

Post by Hans Meine »

jwick wrote:Christian should be able to scare up the src code for xcube (it may not be meant for outside consumption though). The Windows 3DxWare SDK contains src for several demos that have been scrubbed a bit better.
Ah, good hint. I had a look at the Windows SDK in the past, but somehow it did not occur to me to look there this time.
jwick wrote:The rotation values are NOT Euler Angles. You do not apply them in a specific order. They represent the axis of rotation.
That's a very interesting statement; I have not found that information anywhere.
jwick wrote:They are more like a quaternion that you mentioned you are familiar with.
OK, "more like" is a bit vague.. I am not sure that I can put that into code. :wink:
jwick wrote:You must not accumulate the axis values by themselves. You must accumulate them by applying them as a delta to an existing orientation. E.g., as a delta to a rotation matrix or a quaternion. As mentioned earlier, the Windows SDK contains a C function to convert them to a delta matrix. You must then accumulate that with your existing orientation.
As I said, the accumulating, the coordinate spaces, and the math are not the problem. Your mention of "a C function to convert them to a delta matrix" makes me very curious, though.

I must confess that I was quite angry when I first noted how much the Linux and Windows SDK APIs and drivers differ from each other. My impression is that the Windows version is more stable, better designed, and contains more features.

Now I will have a look at whether "the Windows SDK contains" means "if you used Windows, you could simply use Sensor::Rotation to get an AngleAxis object, but all this is unavailable on Unix/Linux", or whether you mean "look at the example apps, which contain the appropriate conversion math in the provided source code". I thought it was the former, but maybe I was too pessimistic. In the former case, I hope you can make the correct formula/algorithm public, so that all Linux users can benefit from it.
jwick wrote:That having been said, you can cheat if the angles are very small. You won't notice the difference if you multiply small X,Y,Z rotations together. You must then append that to your existing matrix. And perform the same step at the next event from the device.
That's what I am doing now, and what the "helix" demo seems to do as well. But you just supported my suspicion that this is not the correct way (tm).

Thanks so far for your insightful answer,
Hans
jwick
Moderator
Posts: 3331
Joined: Wed Dec 20, 2006 2:25 pm
Location: USA

Post by jwick »

Hi Hans,

There are two SDKs available on Windows.

The first is a COM-based SDK (3DxInput/TDxInput). It uses COM to get the data to you, which is particularly useful for some apps and languages (especially VB) but is a bit of work for C apps. This is the SDK off the SDK page. We've named the Sensor::Rotation object specifically to bring out the AngleAxis nature of the rotation data. The same data is delivered on Unix, but in only three numbers, not four.

There is another, older Windows SDK, the 3DxWare SDK. You actually have to go into the archive area under your device to get it. It contains the function I am referring to, SPW_ArbitraryAxisToMatrix. It's just C; you can use it anywhere. There is also a lot of src code there. All these demos/libs started on Unix but have been converted to Windows.

All the SDKs (Windows, Unix, Mac) try to match the familiar programming model (if one exists) of the platform. The data you are dealing with should be essentially the same.
erlkonig
Posts: 1
Joined: Thu Feb 02, 2012 8:16 am

Quaternion rotation of manipulated objects

Post by erlkonig »

If you're digging through xcube looking for good rotation handling, I've achieved sane rotations during object manipulation (in GLUT, which I hate, although I admire that it seems to be using the spaceball through Xinput correctly). Hopefully this'll be of use to someone.

1) A handler accepts the xrf, yrf, and zrf ("f" just means "float" here) rotations from the spaceball (GLUT has a callback for spaceball events).
2) I convert them into a quaternion with code resembling:

Code:

   typedef struct {  float w, x, y, z; } Quaternionf_t;
   Quaternionf_t axis6_input_rotation;
   QuaternionFromAngles( & axis6_input_rotation, xrf, yrf, zrf);
That function's basically:

Code:

   void QuaternionFromAngles(Quaternionf_t *trg, float xr, float yr, float zr)
   /*:note: pitch = xr, yaw = yr, roll = zr; angles are in radians */
   {
       float s1 = sin(zr/2.0);
       float s2 = sin(yr/2.0);
       float s3 = sin(xr/2.0);
       float s2_x_s1 = s2 * s1;
       float c1 = cos(zr/2.0);
       float c2 = cos(yr/2.0);
       float c3 = cos(xr/2.0);
       float c2_x_c1 = c2 * c1;
       trg->w = c3 * c2_x_c1  +  s3 * s2_x_s1;
       trg->x = s3 * c2_x_c1  -  c3 * s2_x_s1;
       trg->y = c3 * s2 * c1  +  s3 * c2 * s1;
       trg->z = c3 * c2 * s1  -  s3 * s2 * c1;
       QuaternionNormalize(trg);
   }
   
3) Then I compose the newly obtained rotational delta into the retained rotation:

Code:

    QuaternionMultiply(&rotation, &axis6_input_rotation, &rotation);
That function looks like:

Code:

   void QuaternionMultiply(Quaternionf_t *trg, Quaternionf_t *a, Quaternionf_t *b)
   /*:note: trg may be either of a, b, or some other target area */
   {
      float x = a->w * b->x + a->x * b->w + a->y * b->z - a->z * b->y;
      float y = a->w * b->y + a->y * b->w + a->z * b->x - a->x * b->z;
      float z = a->w * b->z + a->z * b->w + a->x * b->y - a->y * b->x;
      float w = a->w * b->w - a->x * b->x - a->y * b->y - a->z * b->z;
      trg->w = w;
      trg->x = x;
      trg->y = y;
      trg->z = z;
   }
4) I eventually apply the quaternion during OpenGL rendering with the following, which converts it into a rotation (angle) around an arbitrarily oriented axis instead of one of the X, Y, or Z axes:

Code:

    XYZf_t axis;
    float angle = QuaternionGetAxisAngle(&rotation, &axis);
    glRotatef(Rad2Deg(angle), axis.x, axis.y, axis.z);
That function being:

Code:

    float QuaternionGetAxisAngle(Quaternionf_t *q, XYZf_t *axis)
    {
        float scale = sqrt(q->x * q->x + q->y * q->y + q->z * q->z);
        float angle = acos(q->w);  // (radians), will be zeroed or doubled shortly
        if(NearZerof(scale)) {
            angle = 0;
            axis->x = axis->y = 0;
            axis->z = 1;
        } else {
            angle *= 2.0;
            axis->x = q->x / scale;
            axis->y = q->y / scale;
            axis->z = q->z / scale;
            QuaternionNormalize(q);
        }
        return angle;
    }
---
With all that, my rotations end up consistently rotating the object with no regard to its own orientation, instead twisting it according to the viewer's notion of the object's up, left, and right.