Position tracking 2 devices into Unity - Tech demo animation research project

Hi everybody,

Long time since I've been here. I'll just get right to it.

I'm looking to make a little tech demo that requires reading the positions of 2 devices. This could be done with Wii or PS Move controllers, I guess, but they're kinda clunky and expensive. Phones could also work. Ideally we'd use something simpler and smaller, maybe a custom controller. I'm open to ideas. All I need is to display the real-time positions of multiple devices from the perspective of a "camera" (which could be a phone).

I can program, but I'd really rather not... I'm more of an animator.
I'm pretty sure Unity can do this. I did some Googling.

My plan is to take the positions of these devices and generate procedural geometry/FX animations from them. I'm currently shifting my animation toolset over to SideFX Houdini, and I want to experiment with the Houdini Engine inside Unity.
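
To be concrete about the plan (everything here is a placeholder sketch, not the real thing): if the two tracked devices end up as Transforms in Unity, the simplest possible FX driver just reads their positions every frame - a LineRenderer standing in for the eventual Houdini-driven geometry:

    using UnityEngine;

    // Placeholder sketch: assumes something else (the tracking layer)
    // already updates deviceA/deviceB every frame. A LineRenderer
    // stands in for the eventual Houdini Engine FX.
    public class TwoDeviceFxDriver : MonoBehaviour
    {
        public Transform deviceA;  // tracked device 1 (assumed)
        public Transform deviceB;  // tracked device 2 (assumed)
        public LineRenderer line;  // stand-in for the real FX

        void Update()
        {
            line.positionCount = 2;
            line.SetPosition(0, deviceA.position);
            line.SetPosition(1, deviceB.position);
        }
    }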

To get an idea of what I'm thinking of creating, here is a link to a non-interactive version:


I would like to make something like that, except real-time, and obviously using my own FX.

So does anyone like the idea of helping me with this? I'd be super keen to do it as a game jam or something. I can't do this Ludum Dare because I have a deadline on Monday, but maybe there's another jam coming up soonish? I've somewhat lost touch with game community events.

Otherwise, does anyone have ideas that could point me in the right direction? I can do a little Unity, so I could theoretically do this all myself if the right libraries/assets exist to make it coding-handicapped-friendly.

Comments

  • I think modern phones, the ones with gyroscopes, can track their orientation, but Wii and PS Move controllers only have accelerometers in them and so can only track the force exerted on them (i.e. some smartphones, PS Move controllers and Wii remotes know what direction changes affect them, but don't know where they are or which direction they are travelling in). There's a small Unity sketch at the end of this comment of what those sensors actually report.

    I think PlayStation VR does some clever calculations to track the Move controllers using a camera. The same is true of Oculus controllers.

    Samsung Galaxy LTE phones, for instance, don't have a gyroscope. The phones Google is approving for its Daydream VR platform will all have gyroscopes, though.

    Most VR gear is designed to track controllers, but the ones using cameras, like Oculus's controllers, won't work when the controller is hidden from the camera. The HTC Vive uses lasers, which are a bit better, and it's certainly the best hand-tracking equipment I've experienced.

    It'd be interesting to see a thing like this work with a Kinect, as games like Just Dance are pretty rad on that platform; the catch is that working with the Kinect is a huge technical challenge (as I understand it).
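
    The sketch mentioned above - Input.gyro and Input.acceleration are the real Unity APIs; the rest is illustrative:

        using UnityEngine;

        // Sketch of what phone sensors report in Unity.
        // Note that neither value is a position.
        public class SensorPeek : MonoBehaviour
        {
            void Start()
            {
                Input.gyro.enabled = true; // the gyro is off by default on mobile
            }

            void Update()
            {
                Quaternion attitude = Input.gyro.attitude; // orientation only
                Vector3 accel = Input.acceleration;        // force, not position

                // Getting position from accel means integrating twice, and the
                // error grows so fast it's useless within seconds - which is why
                // VR rigs add cameras or lasers for positional tracking.
                Debug.Log("attitude: " + attitude + "  accel: " + accel);
            }
        }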
  • Sounds like some computer vision stuff might be an easier solution: you just track the hue and/or brightness of the pixels. It doesn't even have to be light-based - you could just put colored tape on some cubes or something. Doing it in Processing is probably simpler. I have used this tutorial before to great effect (in Processing).
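
    A rough Unity version of that hue-tracking idea might look like this (WebCamTexture, GetPixels32 and Color.RGBToHSV are stock Unity APIs; the thresholds are guesses you'd tune, and the brute-force pixel loop is too slow for full-res frames - it's the idea, not the implementation):

        using UnityEngine;

        // Finds the average screen position of pixels whose hue is near a
        // target colour - the same trick as the Processing tutorial.
        public class HueTracker : MonoBehaviour
        {
            public float targetHue = 0.33f;    // ~green; match your tape/LED
            public float hueTolerance = 0.05f; // made-up threshold, tune it
            WebCamTexture cam;

            void Start()
            {
                cam = new WebCamTexture();
                cam.Play();
            }

            void Update()
            {
                Color32[] pixels = cam.GetPixels32();
                float sumX = 0, sumY = 0;
                int count = 0;
                for (int y = 0; y < cam.height; y++)
                {
                    for (int x = 0; x < cam.width; x++)
                    {
                        float h, s, v;
                        Color.RGBToHSV(pixels[y * cam.width + x], out h, out s, out v);
                        if (Mathf.Abs(h - targetHue) < hueTolerance && s > 0.5f && v > 0.5f)
                        {
                            sumX += x; sumY += y; count++;
                        }
                    }
                }
                if (count > 0)
                    Debug.Log("blob centre: " + (sumX / count) + ", " + (sumY / count));
            }
        }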

  • Thanks for the comments :)

    @bevis - yeah, that looks like the right kinda direction.

    @Evan - I read something about gyroscopes being better than accelerometers. If I need to swing around a R10000 phone, I'm not so sure about that plan though... I could look into building a custom gyroscope controller that sends a wireless signal. That sounds long, but might be worth it.
    Kinect seems like overkill, and I don't like the idea of swinging Vive controllers around too much either. I would like to get to some kind of VR option eventually; the Microsoft HoloLens has some smaller control options.

    @Bensonance - That looks like a plan I could test sooner. I was going to be tracking LED poi, so light can work. Prototyping in Processing works for getting the idea down, and then it should be possible to port it to Unity.

    There's an app that uses light input to make trails already, called Luminancer:
    One of the reasons I want to make this is because it only works on iPhone. lol
    I do want to make something more interesting than basic light trails, though. So if I track lights with a camera input, I should be able to generate anything I want from there (like something out of a Skinner-type algorithm) - see the sketch below.
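
    Something like this, maybe (ScreenToWorldPoint and TrailRenderer are stock Unity; trackedScreenPos would come from whatever tracker I end up with):

        using UnityEngine;

        // Sketch: park an object (with a TrailRenderer attached) wherever the
        // light was tracked on screen, at a fixed depth from the camera. The
        // TrailRenderer draws the trail; swap it for Houdini-driven FX later.
        public class TrailFromTracker : MonoBehaviour
        {
            public Vector2 trackedScreenPos; // fed in by the tracker (assumed)
            public float depth = 2f;         // metres in front of the camera

            void Update()
            {
                Vector3 p = new Vector3(trackedScreenPos.x, trackedScreenPos.y, depth);
                transform.position = Camera.main.ScreenToWorldPoint(p);
            }
        }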

    Yay.
  • Those Luminancer trails are just the actual light emitted from the LED poi, by the way.
  • The light tracking thing also obviously doesn't work beyond line-of-sight.
  • Unless I can have multiple webcam inputs... :)
    Another thing is that most webcams have crappy refresh rates, so I would need to get a developer-type camera. I found one that can get up to 120fps going for 60 EUR (plus shipping etc.).
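
    For what it's worth, Unity can at least enumerate the attached cameras and ask for a frame rate - whether the hardware honours the request is another story:

        using UnityEngine;

        // Lists every attached webcam; requestedFPS is only a request,
        // and cheap cameras will silently fall back to 30fps anyway.
        public class CameraProbe : MonoBehaviour
        {
            void Start()
            {
                foreach (WebCamDevice d in WebCamTexture.devices)
                    Debug.Log("camera: " + d.name);

                // ask the first camera for 120fps at 640x480
                var cam = new WebCamTexture(WebCamTexture.devices[0].name, 640, 480, 120);
                cam.Play();
            }
        }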

    I think I'll probably focus on getting the FX right for now.