Using Microsoft Surface 2.0 with Unity
I'm trying to use the Microsoft Surface 2.0 assemblies with Unity to see if creating Surface games would be possible. I'm getting an error:
"Unhandled Exception: System.TypeLoadException: Could not load type 'System.Runtime.Versioning.TargetFrameworkAttribute' from assembly 'Microsoft.Surface.Core'."
Things I've done/tried: I've set the build type to .NET 4.0 (this is on Unity 3.5; I've tried both VS and Mono as IDEs).
I've switched the API compatibility level in Player Settings from the '.NET 2.0 Subset' to the full '.NET 2.0' profile.
I've tried setting the project to x86. I've added the Microsoft.Surface.Core and Microsoft.Surface assemblies to the Assets folder. I've tried adding mscorlib to the Assets folder (was unable to do so).
I've been fighting with this for the last 3 hours; any ideas? Any further info I can provide to help get ideas/solutions?
One other note: I believe this is similar to what the users trying to use the MS Kinect SDK were hitting. They moved to using TCP/IP and other methods, but it seems a shame to have to proxy when we have direct C# access.
Hi Jfarro, have you solved your problem? I am about to purchase a multi-touch table for a Unity project and would like to know if the Microsoft Surface is an option.
Thanks
Answer by Tseng · Apr 25, 2012 at 06:25 PM
Unity3D uses Mono, not .NET. Mono does not implement the full .NET framework: some features are implemented, others are not, and some (WPF, for example) are not even planned to be ported.
Also, Unity3D 3.x uses Mono 2.6 (the current Mono version is 2.10).
Answer by Jfarro · Apr 25, 2012 at 08:17 PM
Thank you, Tseng, for the answer. It's disappointing, but given the necessity for Unity to be able to go cross-platform, I understand the decision.
Georgedim, this would've been ideal since it would've allowed for the vision based properties of Surface to be applied directly to Unity. Surface 2.0 devices support HID input, so any framework that can accept multitouch through Windows 7's framework will work on Windows 7.
For me, the more interesting scenario was getting the object recognition, orientation, and other advanced properties of Microsoft Surface working with Unity. The other approach is to simply bubble up the events via a TCP layer. I'll begin prototyping that solution when I'm done with my current project (end of May).
Answer by Isak · Jun 25, 2012 at 08:50 AM
I was recently faced with the same problem: We had a Unity prototype that we wanted to run on a Microsoft Surface 2.0 device.
Given the current state, where Unity doesn't even support touch on Windows 7 (please Unity, please add this!), there are a few things that need to be done to make your Unity application run on Surface:
Your app needs to signal the Surface shell that it has loaded successfully
You need to receive touch events from the Windows OS in your Unity app
Unfortunately, the Surface SDK (which is needed to do the two things above) is .NET 4.0 only. You cannot load the Microsoft.Surface.*.dll files from within your Unity application.
The solution I settled on was to write a wrapper application - a .NET 4.0-based Windows Forms application - that took care of handling touch events and exposed them to my Unity application through named pipes. (You could do the same over sockets as well.)
The wrapper also took care of signalling that the app had loaded.
Here's in a nutshell what the wrapper did:
Set up a global listener for touch events using the TouchTarget class. The trick is to pass IntPtr.Zero as the window handle, meaning you get all touch events and not only those above a particular window: new TouchTarget(IntPtr.Zero, EventThreadChoice.OnBackgroundThread)
Launch the Unity application using the Process class. I also set up a notification so that I get an event when the process quits (so I can close the wrapper if that happens). I also kill the Unity application if the wrapper dies (for example, if the Surface shell decides to kill it due to a timeout).
Create a NamedPipeServerStream and wait for a client connection. The client will be the Unity application.
(After a client has connected:) When I get a new touch event (TouchEnter, TouchMove, TouchLeave), I send the complete list of current touch points to the client. I use a simple struct containing only the X, Y, and ID of the touch point, but you could send as much or as little as you need. I recommend you create your own struct, though, and don't try to send the entire TouchPoint class. You need to be in control of the serialization of the data since you are dealing with two different runtimes (Mono vs. Microsoft .NET).
I also recommend that you send data at regular intervals over the named pipe even if you don't get any touch events. This is because once you quit your app, the Unity application won't exit properly until a read call has completed. This could leave your Unity app (or the Unity editor) frozen if the wrapper process crashes. I think this is a bug in Mono's implementation of named pipes.
The wrapper does one more thing: as soon as it detects that it has a connection, it tells Windows to bring the Unity application to the foreground. Otherwise, the Surface shell will keep the wrapper process in the foreground and the Unity app will be completely black.
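A rough sketch of such a wrapper is below. Only the TouchTarget constructor call comes verbatim from this answer; the pipe name "SurfaceTouch", the Unity exe path, the count-plus-X/Y/ID wire format, and the TouchPoint member names are illustrative assumptions that should be checked against your SDK version.

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.IO.Pipes;
using Microsoft.Surface;        // ApplicationServices
using Microsoft.Surface.Core;   // TouchTarget, TouchPoint

class SurfaceWrapper
{
    static void Main()
    {
        // Global listener: IntPtr.Zero as the window handle means we receive
        // ALL touch events, not only those above a particular window.
        var target = new TouchTarget(IntPtr.Zero, EventThreadChoice.OnBackgroundThread);
        target.EnableInput();

        // Tell the Surface shell we loaded (verify this name against your SDK).
        ApplicationServices.SignalApplicationLoadComplete();

        // Launch the Unity player and tie the two process lifetimes together.
        Process unity = Process.Start("UnityGame.exe");   // path is an assumption
        unity.EnableRaisingEvents = true;
        unity.Exited += (s, e) => Environment.Exit(0);    // close wrapper with Unity
        AppDomain.CurrentDomain.ProcessExit +=
            (s, e) => { if (!unity.HasExited) unity.Kill(); };

        using (var pipe = new NamedPipeServerStream("SurfaceTouch", PipeDirection.Out))
        {
            pipe.WaitForConnection();                     // the Unity app is the client
            var writer = new BinaryWriter(pipe);

            // Send the full current touch list at a fixed interval; writing even
            // when no touch events arrive keeps the Unity side's blocking read
            // from hanging forever, as recommended above.
            while (!unity.HasExited)
            {
                var touches = target.GetState();          // snapshot of touch points
                writer.Write(touches.Count);
                foreach (TouchPoint tp in touches)
                {
                    writer.Write((int)tp.CenterX);        // member names from the 2.0
                    writer.Write((int)tp.CenterY);        // SDK; verify against yours
                    writer.Write(tp.Id);
                }
                writer.Flush();
                System.Threading.Thread.Sleep(16);        // ~60 Hz
            }
        }
    }
}
```

This polls GetState() on a timer rather than wiring the TouchEnter/TouchMove/TouchLeave events; either works, as long as something is written regularly.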
In my Unity application, I created a handler for the Surface touch wrapper that, in a nutshell, does the following:
Create a background thread that takes care of communicating with the wrapper
The background thread function loops until it gets a connection (you'll get an exception if it can't connect)
Once it has a connection, enter a new loop which will read data from the pipe. The read call will block until it gets data from the wrapper
Once data has arrived, deserialize it into an array of the custom touch-point struct I mentioned above.
Store this array in a (static) class variable
To get the touch points from my game object scripts, I call a static method on the surface script from my Update() method that simply retrieves a copy of this array of touch points.
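The Unity-side handler steps above might look roughly like this, assuming a hypothetical pipe name "SurfaceTouch" and a wire format of a count followed by X/Y/ID int triples (both must match whatever your wrapper sends; none of these names come from the original answer):

```csharp
using System.IO;
using System.IO.Pipes;
using System.Threading;
using UnityEngine;

public struct TouchPointData { public int X, Y, Id; }   // keep serialization trivial

public class SurfaceTouchClient : MonoBehaviour
{
    static TouchPointData[] currentTouches = new TouchPointData[0];
    static readonly object touchLock = new object();
    Thread readerThread;
    volatile bool running = true;

    // Game object scripts call this from Update() to get a snapshot.
    public static TouchPointData[] GetTouches()
    {
        lock (touchLock) { return (TouchPointData[])currentTouches.Clone(); }
    }

    void Start()
    {
        readerThread = new Thread(ReadLoop) { IsBackground = true };
        readerThread.Start();
    }

    void ReadLoop()
    {
        while (running)
        {
            try
            {
                using (var pipe = new NamedPipeClientStream(".", "SurfaceTouch",
                                                            PipeDirection.In))
                {
                    pipe.Connect(1000);        // throws if the wrapper isn't up yet
                    var reader = new BinaryReader(pipe);
                    while (running)
                    {
                        int count = reader.ReadInt32();  // blocks until data arrives
                        var touches = new TouchPointData[count];
                        for (int i = 0; i < count; i++)
                        {
                            touches[i].X = reader.ReadInt32();
                            touches[i].Y = reader.ReadInt32();
                            touches[i].Id = reader.ReadInt32();
                        }
                        lock (touchLock) { currentTouches = touches; }
                    }
                }
            }
            catch (TimeoutException) { /* keep retrying until the wrapper is up */ }
            catch (IOException) { /* pipe broke; reconnect on the next iteration */ }
        }
    }

    void OnDestroy() { running = false; }
}
```

The lock-and-clone in GetTouches() is one way to hand the background thread's data to the main thread safely; the array swap itself is atomic, so the copy mainly protects callers from a mid-frame update.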
Integration with the Surface Shell: This is no different from any other Surface 2.0 application. Just remember that the application exe in the Surface XML file should point to the wrapper process and not the Unity application exe. The wrapper will take care of starting (and possibly killing) the Unity application.
The only limitation I've found with the above approach is that once the Unity application is running, the only way to quit is to hit Alt-F4 on a keyboard. I haven't figured out a way to have the standard Surface corner quit controls visible while the app is running. One solution is to add a button or similar inside your Unity application that exits it.
I would love to share the code, but unfortunately it's too big to fit the format of a single answer. Hopefully the above description of how I solved it can help other people who are in the same situation.
Can't you put it on GitHub? Thanks for the description, though.
Hi chuck,
I saw your project on GitHub. We are French engineering-school students and we want to create a Unity3D/PixelSense game. We are real noobs, so if you could contact me so we can discuss this, it would be great: alexandrejaulin@orange.fr
Bye!