Arena to Unity streaming
Re: Arena to Unity streaming
Thank you for your reply. Actually, after a lot of tests, it turns out my algorithm only uses the central marker position (the one used as the origin). In other words, I'm no longer using the rigid body data for position and orientation, so I'm interested in the second method. Do you have a tutorial about it, or can you tell me where I can set the flexibility of my rigid body?
NaturalPoint Employee
Posts: 199
Joined: Tue Jun 24, 2008 2:01 pm
Location: Corvallis, OR, USA
Re: Arena to Unity streaming
The TT 2.4 manual is your best bet:
http://www.naturalpoint.com/optitrack/d ... tools.html
I would look at the Rigid Bodies settings section, especially the flexibility parameter.
Re: Arena to Unity streaming
Thank you!!! I was using Arena until now, but Tracking Tools is just amazing. After the Christmas break I'll test my system with Tracking Tools instead of Arena. Hopefully setting the deflection and the flexibility will solve my problems. Thanks again!
Re: Arena to Unity streaming
Great. As a general rule, you should use Tracking Tools unless you need full-body motion capture. Tracking Tools contains advanced configuration settings for rigid body tracking.
Re: Arena to Unity streaming
Hi again,
I'm still having problems getting the streaming data into Unity correctly. Basically I do get the data into Unity, but there is a delay of around half a second between Arena and Unity. As I am building a real-time VR application, this delay is too much! I used Arena before to stream live data to the Blender game engine and had no delay at all. Does anybody have an idea where this delay is coming from? This is what I use right now:
using UnityEngine;
using System;
using System.Collections;
using System.Net;
using System.Net.Sockets;
using System.Runtime.InteropServices;

public class Arena : MonoBehaviour {
    public Camera Maincam;
    public Int32 trackingPort = 1511;
    UdpClient udpClient = new UdpClient();
    IPEndPoint remoteIPEndPoint;

    [StructLayout(LayoutKind.Sequential)]
    struct PacketHeader
    {
        public ushort ID;
        public ushort bytes;
        public uint frame;
        public uint markerSetCount;
        public uint unknownMarker;
        public uint rigidBodys;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct Vector3
    {
        public float x;
        public float y;
        public float z;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct RigidBody
    {
        public uint rb;
        public Vector3 Pos;
        public UnityEngine.Quaternion Rot;
        public uint mCount;
        public float errors;
    }

    // Use this for initialization
    void Start () {
        udpClient.ExclusiveAddressUse = false;
        remoteIPEndPoint = new IPEndPoint(IPAddress.Any, trackingPort);
        udpClient.Client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReuseAddress, true);
        udpClient.Client.Bind(remoteIPEndPoint);
        IPAddress multicastAddr = IPAddress.Parse("239.255.42.99");
        udpClient.JoinMulticastGroup(multicastAddr);
    }

    // Update is called once per frame
    void Update () {
        byte[] data = udpClient.Receive(ref remoteIPEndPoint); // note: blocks until a packet arrives
        if (data.Length > 0)
        {
            GCHandle handle = GCHandle.Alloc(data, GCHandleType.Pinned);
            PacketHeader header = (PacketHeader)Marshal.PtrToStructure(handle.AddrOfPinnedObject(), typeof(PacketHeader));
            // Skip past the 20-byte header to the first rigid body
            IntPtr dataPtr = new IntPtr(handle.AddrOfPinnedObject().ToInt64() + Marshal.SizeOf(typeof(PacketHeader)));
            for (uint i = 0; i < header.rigidBodys; ++i)
            {
                RigidBody body = (RigidBody)Marshal.PtrToStructure(dataPtr, typeof(RigidBody));
                // Advance past this rigid body to the next one (the original
                // code reset to a fixed offset here, which only reads correctly
                // for a single rigid body)
                dataPtr = new IntPtr(dataPtr.ToInt64() + Marshal.SizeOf(typeof(RigidBody)));
                if (body.rb == 1)
                {
                    // Negate x to convert the streamed quaternion into Unity's coordinate convention
                    Quaternion localRot = new Quaternion(-body.Rot.x, body.Rot.y, body.Rot.z, body.Rot.w);
                    //Debug.Log(body.Rot.ToString());
                    Maincam.transform.localRotation = localRot;
                }
            }
            handle.Free();
        }
    }
}
Here is a short impression of the delay:
http://www.youtube.com/watch?v=ea3CVQYm3pc
Thanks in advance,
Quen
Re: Arena to Unity streaming
Hi Quen,
It's possible it has to do with synchronously servicing the socket in the Update() loop. If you are streaming from Arena at 120 fps and your Update() routine is only servicing the socket at your render framerate, it's possible packets are buffering up on the socket itself and you are getting old packets when reading them. Up to 500 ms seems pretty high but possible, especially depending on your framerate and what the socket size / buffering behavior is.
To minimize latency I'd suggest servicing the socket in a separate thread. The original script I posted uses an asynchronous callback mechanism for listening to the port, so the socket is serviced at a rate independent of the Unity Update loop. In this manner only the most recent frame is applied to the model's transforms; any older frames are discarded (you could of course keep them if you wanted). You'll probably want to either put a lock around your shared data (e.g. Maincam.transform.localRotation) or double buffer it if you don't want a lock in your update loop.
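A minimal sketch of the threaded approach, assuming the same port and multicast group as the script above; names like ThreadedReceiver and ParsePacket are illustrative, and the actual packet unpacking is left as in the original script. Note that Unity's Transform API may only be touched from the main thread, which is exactly why the background thread only stores the latest rotation under a lock and Update() applies it:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

public class ThreadedReceiver : MonoBehaviour {
    public Camera Maincam;
    public int trackingPort = 1511;

    UdpClient udpClient;
    Thread listenThread;
    volatile bool running;
    readonly object poseLock = new object();
    Quaternion latestRot = Quaternion.identity;
    bool hasNewPose;

    void Start() {
        udpClient = new UdpClient();
        udpClient.Client.SetSocketOption(SocketOptionLevel.Socket,
            SocketOptionName.ReuseAddress, true);
        udpClient.Client.Bind(new IPEndPoint(IPAddress.Any, trackingPort));
        udpClient.JoinMulticastGroup(IPAddress.Parse("239.255.42.99"));
        running = true;
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    void Listen() {
        var remote = new IPEndPoint(IPAddress.Any, trackingPort);
        while (running) {
            byte[] data;
            try { data = udpClient.Receive(ref remote); } // blocks here, not in Update()
            catch (SocketException) { break; }            // socket closed on shutdown
            Quaternion rot = ParsePacket(data);
            lock (poseLock) { latestRot = rot; hasNewPose = true; } // newest frame wins; older ones are discarded
        }
    }

    void Update() {
        lock (poseLock) {
            if (hasNewPose) {
                Maincam.transform.localRotation = latestRot;
                hasNewPose = false;
            }
        }
    }

    void OnDestroy() {
        running = false;
        udpClient.Close(); // unblocks Receive() in the listener thread
    }

    Quaternion ParsePacket(byte[] data) {
        // Unpack the NatNet packet exactly as in the original script.
        return Quaternion.identity; // placeholder
    }
}
```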
Alternatively, you could try tuning your socket size/buffering settings - I don't know offhand what the default is in the .NET sockets layer. You could also change your Update() function to loop through all packets currently queued on the socket, or disable any form of socket buffering. I don't recommend this as much, but it can be simpler than the threaded approach, and it may be a quick way to test whether packets buffering on the socket is in fact your issue.
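The drain-the-socket variant could look something like this sketch, dropped into the existing script; ApplyPacket is a hypothetical stand-in for the unpacking code already in Update():

```csharp
// Each frame, read every packet already queued on the socket and keep
// only the last (newest) one, so stale frames never accumulate.
void Update() {
    byte[] latest = null;
    while (udpClient.Available > 0) {
        // Non-blocking in practice: Available > 0 means data is already queued.
        latest = udpClient.Receive(ref remoteIPEndPoint);
    }
    if (latest != null) {
        ApplyPacket(latest); // hypothetical helper wrapping the existing unpacking code
    }
}
```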
hope this helps,
Morgan
Re: Arena to Unity streaming
the youtube link is marked as private - can you repost?
Re: Arena to Unity streaming
Thanks for your answer, Morgan!
The video should be viewable now. The performance in the capture was very poor, but the capturing is not the reason for the delay; it happens without capturing, too.
My first idea was similar to what you described, but I haven't been able to fix it yet. Right now I am not at the office anymore, but I will try your suggestion with the asynchronous callback. I am not experienced with this at all and not sure I really understand how it works yet, but I will give it a few tries.
Thank you
Re: Arena to Unity streaming
I reduced Arena's streaming framerate to 20% and the lag was gone at once! So you were completely right.
I will try to use the asynchronous callback method now.
Thanks again.
edit: reducing the receive buffer size from the default (8192 bytes) to 512 already solved the problem completely. Still, I will try out your method.
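For anyone following along, the setting in question is presumably the .NET Socket.ReceiveBufferSize property on the UdpClient's underlying socket; a minimal sketch of the tweak:

```csharp
using System.Net.Sockets;

class BufferTuning {
    static void Tune(UdpClient udpClient) {
        // Shrink the OS-level receive buffer so stale packets cannot pile up
        // between reads. 512 bytes still fits a small rigid-body frame; the
        // .NET default is 8192 bytes.
        udpClient.Client.ReceiveBufferSize = 512;
    }
}
```

This masks the symptom (old packets queuing up) rather than fixing the blocking read itself, which is why the threaded approach is still worth trying.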
Re: Arena to Unity streaming
Hey Quen, if you need any more help with this approach, I'd be happy to give you some pointers. We're currently working on a similar project, it would seem. I am using an async approach that processes incoming frames outside of the Update loop, while using the Update loop to apply the incoming data.
However, if you use this approach, be very careful about how you intend to work with the physics system. If you plan to track props like we do, you will have to apply the mocap data on the FixedUpdate() loop instead of the normal Update loop. You may need to turn up your physics update cycles to make things appear "low latency".
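A hedged sketch of what applying mocap data on the physics step could look like; MocapProp, mocapPos, and mocapRot are illustrative names, with the pose fields assumed to be filled in by an async receiver elsewhere:

```csharp
using UnityEngine;

// Drive a physics prop from streamed mocap poses inside FixedUpdate(),
// so the physics engine sees every applied pose.
public class MocapProp : MonoBehaviour {
    public Rigidbody prop;      // the tracked prop's Rigidbody
    public Vector3 mocapPos;    // latest streamed position (written by the receiver)
    public Quaternion mocapRot; // latest streamed rotation (written by the receiver)

    void FixedUpdate() {
        // MovePosition/MoveRotation let the physics engine interpolate and
        // resolve collisions instead of teleporting the transform.
        prop.MovePosition(mocapPos);
        prop.MoveRotation(mocapRot);
    }
}
```

Turning up the physics update cycles corresponds to lowering the Fixed Timestep under Edit > Project Settings > Time, at the cost of more physics steps per rendered frame.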
Here's our project working with 2 simple rigid bodies in the capture volume. We're using Flex 13 and Unity 4.1 in this demo:
http://www.vrcade.com/2013/03/20/vrcade ... me-demo-1/
As you can see, the latency is VERY low for rigid bodies. The end goal is to get the game frame-rate up to 120fps with physics following close behind.
VRcade - Full Motion Virtual Reality
VRcade.com | facebook.com/VRcade