Adding touch screen functionality to a Windows Standalone build for a Surface Pro - is it possible?
Hi all,
I have an application which is currently a PC/Windows Standalone build, but I need it to work with the touchscreen on a Surface Pro tablet when the same build is run on one. The only interaction the user has is tapping buttons to open and close windows etc. (all buttons use the OnClick() functionality, roughly as in the sketch below), so any input required would just be simple button taps.
I simply need this to work on a Surface Pro touchscreen when the build runs there, i.e. with taps of the buttons replacing mouse clicks.
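For context, this is roughly how my buttons are wired up; "WindowToggle" and the "window" field are just placeholder names for illustration:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of how my buttons are set up.
public class WindowToggle : MonoBehaviour
{
    [SerializeField] private GameObject window; // the panel this button opens/closes

    private void Awake()
    {
        // Button.onClick is raised by the EventSystem for pointer clicks,
        // which (as I understand it) should cover both mouse clicks and touch taps.
        GetComponent<Button>().onClick.AddListener(ToggleWindow);
    }

    private void ToggleWindow()
    {
        window.SetActive(!window.activeSelf);
    }
}
```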
I've read many posts on the subject, but most date from Unity 4.6 and I've found nothing conclusive for Unity 5. Some posts state that this feature was introduced in Unity 5, while others suggest it is not supported and third-party plugins are needed.
I don't currently have access to a Windows touchscreen device, so I'm unable to do a quick check myself. I therefore want to make sure this is possible before I purchase a device and commit to the update.
From what I gather, the 'Standalone Input Module' has now replaced the 'Touch Input Module', which apparently had a checkbox you could tick to make touch work in Standalone builds. As no such checkbox exists in the Standalone Input Module, I am unsure whether this feature is still supported.
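Once I do get hold of a device, I was planning to drop in something like this diagnostic script just to confirm whether touches are reaching Unity at all (the script name is a placeholder, and I'm assuming Input.touchSupported and Input.touchCount behave this way on Windows Standalone):

```csharp
using UnityEngine;

// Placeholder diagnostic to check whether touch events reach Unity on the device.
public class TouchDiagnostics : MonoBehaviour
{
    private void Update()
    {
        // Input.touchSupported reports whether the OS exposes touch to Unity;
        // Input.touchCount is the number of active touches this frame.
        if (Input.touchCount > 0)
        {
            Debug.Log("Touches this frame: " + Input.touchCount
                      + " (touchSupported = " + Input.touchSupported + ")");
        }
    }
}
```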
So I would like to know -
1 - Is it supported in Unity 5?
If so -
2 - Does it 'just work' automatically, or does extra work need to be done to tell the build to use touch input when it is running on a Windows touch screen device?
Many thanks in advance