Latency, gesture recognition, absolute positioning

davidhirzel Posts: 3
edited August 2013 in Omni SDK
Hi Folks,

two questions came to mind after I ordered :)

regarding latency:

Supposing that the currently pre-orderable version of the Omni uses only the Kinect as a sensor device, how will the latency of gesture recognition be addressed?

Are there empirical values for how large this latency is going to be?

regarding absolute positioning:

I suppose that the Omni will only identify gestures (e.g. walking, running, etc.) and translate them into keystrokes (e.g. w = walk).
Those gestures would need to be identified for every angle relative to the Kinect sensor; otherwise we would get flawed keystrokes depending on our position relative to the sensor.

Will the Omni software derive the orientation of the user within the treadmill and pass this information on to the PC?
My impression is that this information is needed anyway to identify the gestures correctly,
and it would be very helpful to use it for proper orientation of the avatar body.
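To sketch what I mean (purely illustrative Python with invented names, not any real Omni SDK call): the recognized step direction would be rotated into the user's body frame before being quantized into a keystroke:

```python
# Purely illustrative: rotate a recognized step direction (measured in
# the Kinect/sensor frame) into the user's body frame, then quantize
# it into a WASD keystroke. Function name and angle conventions are invented.

def step_to_key(step_angle_deg, body_yaw_deg):
    """Pick a movement key from a step direction and the user's yaw."""
    relative = (step_angle_deg - body_yaw_deg) % 360.0
    if relative < 45 or relative >= 315:
        return "w"  # stepping forward relative to the body
    if relative < 135:
        return "d"  # right
    if relative < 225:
        return "s"  # backward
    return "a"      # left
```

Without knowing the body yaw, the same physical step would map to different keys depending on where you stand relative to the sensor.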

regards
david

Comments

  • Raoul Posts: 125
    Hi David welcome to the forum.

    Although Virtuix has said they will continue to support the Kinect in the Omni SDK, it will not be required. Tracking will be integrated into the Omni shoes and/or base. The only thing that might require the Kinect is recognizing leaning/crouching.

    Regarding latency, there is nothing that can be done here; you are totally dependent on the Kinect hardware. Latency is slightly reduced in the new Kinect for Xbox One, but you still have a delay of around 60 ms (compared to 90 ms for the old Kinect).

    Not sure if the final Omni tracking will also include orientation. The current Omni prototype uses the Rift to determine your direction. Your feet will be tracked one to one, but this requires support from games. For games that require traditional input (keystrokes or HID), a simple translation through the Omni SDK should be enough.

    There are currently very few games that decouple your viewing, aiming and/or walking direction anyway. But I suspect you will need another tracker, such as Sixense's wireless STEM (the successor to the Razer Hydra), if you want to fully utilize games that do.
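    To make that decoupling concrete, here is a minimal sketch (Python, all names and values invented, not tied to any real SDK) where the avatar's movement comes from a walking yaw that is independent of the head and aim yaw:

    ```python
    import math

    # Illustrative only: movement velocity is derived from the walking yaw
    # (e.g. the treadmill), while the camera would use the head yaw (HMD)
    # and the weapon a hand tracker's yaw. Yaw 0 points along +z here.

    def move_vector(walk_yaw_deg, speed):
        """World-space (x, z) velocity from a walking direction and speed."""
        r = math.radians(walk_yaw_deg)
        return (speed * math.sin(r), speed * math.cos(r))

    head_yaw, walk_yaw, aim_yaw = 30.0, 90.0, 200.0  # three independent directions
    vx, vz = move_vector(walk_yaw, 1.5)  # movement ignores head_yaw and aim_yaw
    ```

    A game that only supports coupled input would instead feed `head_yaw` into `move_vector`, which is exactly the limitation being discussed.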
  • Hi Raoul, hi all,

    That's great news that the Kinect will not be needed for the core functions.
    If it's confirmed that the Kinect will not be used for motion translation, then the latency
    of the Kinect will not have any effect.

    Assuming that the shoes/base will incorporate some sensor solution, I would suppose that the orientation of the walking direction should be easy for the Omni to derive.
    Of course this can only be used if the software handles walking and viewing direction independently (like ARMA II).

    Obviously this kind of information is not needed/useful for the head position, which will be handled by the Oculus Rift, hopefully improved by some additional sensor such as integrated positional tracking in the final Oculus, or "manually" by using some external tracker like the STEM you mentioned.

    Ultimately I do not see any chance of "real :-)" VR without adding absolute position tracking for body and viewport.
    Just think about drift effects in the Oculus tracker: even with a small drift, your body position and view position would deteriorate over time.
    If we furthermore add STEM-like controllers, then we can walk one way, watch another and shoot/bash in a third direction.
    From the tracker's point of view, it seems that the hardware "magic" for this is already there.
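    As a toy example of what such drift compensation could look like (a Python sketch; the gain value and names are made up), an absolute reference could continuously nudge a drifting relative yaw back into place:

    ```python
    # Toy drift correction: blend a gyro-style drifting yaw with an
    # absolute reference (e.g. an external tracker), in the spirit of a
    # complementary filter. The gain is an arbitrary illustrative value.

    def correct_yaw(drifting_yaw, absolute_yaw, gain=0.02):
        """Pull the drifting estimate toward the absolute reading."""
        # Shortest signed angular difference, wrapped into [-180, 180).
        error = (absolute_yaw - drifting_yaw + 180.0) % 360.0 - 180.0
        return (drifting_yaw + gain * error) % 360.0
    ```

    With a small gain the correction is invisible frame to frame, but the accumulated drift can no longer grow without bound.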

    Another thing to consider is some standardization process.
    Think of how easy it is to configure controllers nowadays:
    a set of axes and buttons is reported as available to the operating system,
    and those input methods can then be configured by some kind of configuration and calibration tool.
    Games then use this "driver-like" interface to map the input elements to the input methods in the software.

    No big deal from a technological point of view, but most likely a question of collaboration between the important players in the industry.
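    A rough sketch of such a driver-like layer (Python, with all device and action names invented): the device reports its axes and buttons, and a per-game binding maps them onto in-game actions:

    ```python
    # Invented names throughout; this only illustrates the idea of a
    # standardized report/binding split, not any real driver interface.

    DEVICE_REPORT = {"axes": ["walk_yaw", "walk_speed"], "buttons": ["recenter"]}

    GAME_BINDING = {
        "move_heading": "walk_yaw",   # in-game action -> reported input
        "move_speed": "walk_speed",
        "reset_view": "recenter",
    }

    def resolve(binding, report):
        """Keep only bindings whose input the device actually reports."""
        available = set(report["axes"]) | set(report["buttons"])
        return {action: src for action, src in binding.items() if src in available}
    ```

    The game never needs to know which physical device fills the binding, which is exactly what would let an Omni and a STEM share one interface.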
  • Raoul Posts: 125
    I'm not so confident that direction is so easily derived, even when the tracking is integrated into the shoes/base. There are quite a few ways to sense the speed at which your feet move that can't accurately determine the direction. As long as the tracking is not finalized, I don't think we will get an answer from Virtuix about this.

    Drift on the Rift is certainly an issue, and this is why it may be wiser for the Omni to only report the user's walking speed rather than the direction as well. If you do report direction, you will have to compensate for drift somehow. This is why I feel that something like the STEM tracker, which can report absolute position, is your best bet for now.

    That said, a single button to recalibrate (recenter) works pretty well in most Rift demos at the moment, and the problem is greatly reduced in demos that utilize the drift correction from the latest Oculus SDK.
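    That one-button recenter can be sketched in a few lines (Python, illustrative only): on the button press, store the current raw yaw as an offset so the drifted heading reads as zero again:

    ```python
    # Illustrative recenter: remember the raw yaw at the moment of the
    # button press; afterwards report headings relative to that offset.

    class Recenter:
        def __init__(self):
            self.offset = 0.0

        def recenter(self, raw_yaw):
            self.offset = raw_yaw  # the drifted "forward" becomes the new zero

        def corrected(self, raw_yaw):
            return (raw_yaw - self.offset) % 360.0
    ```

    It doesn't remove the drift, it just resets the accumulated error each time the user presses the button.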

    Definitely agree with you that absolute positional tracking is a must for proper VR. But I also realize that games take years to develop, so we will most likely be stuck with games that are designed for traditional gaming input for the next year or two.

    As far as standardization is concerned, I think it would be best if Virtuix and Sixense come together to standardize input for VR games. If a developer can integrate the input from STEM trackers and the Omni through one SDK this could greatly increase the chance games will support it.

    Considering Simon Solotko is chief marketing officer for Virtuix and on the advisory board for Sixense I have high hopes that he can make this happen.
  • Raoul Posts: 125
    edited April 2014
    Just came across this video showing what the Kinect combined with the Razer Hydra and the Rift can already do. Amazing stuff, definitely going to give this a try sometime this weekend.
