Detect when a touch has been captured by the GUI
I'm making a game for the iPhone with GUI buttons. In Update, I look for touches for another purpose, but I don't want to process touches if they are on a GUI element.
The problem is that iPhoneInput.touchCount and iPhoneInput.touches contain all the touch information whether or not it has been processed by the GUI, and those variables are actually available in Update and FixedUpdate before the GUI gets to do anything with them.
Now I can come up with a hacked solution, no problem (have my GUI function validate the touch data, and only use touch data that it hasn't consumed). But I wondered if there is a clean, supported solution.
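The "hacked" approach described above might look something like this (a sketch only; `buttonRect` and `HandleGameplayTouch` are hypothetical names, and the modern `Input`/`Touch` names are used in place of the old `iPhoneInput`/`iPhoneTouch` ones):

```csharp
using UnityEngine;
using System.Collections.Generic;

public class TouchFilter : MonoBehaviour
{
    // fingerIds claimed by the GUI this frame
    private HashSet<int> guiTouches = new HashSet<int>();

    public Rect buttonRect; // screen-space rect of a GUI button (assumed)

    void Update()
    {
        guiTouches.Clear();

        foreach (Touch t in Input.touches)
        {
            // GUI rects use a top-left origin while touch positions use
            // bottom-left, so flip the y coordinate before testing.
            Vector2 guiPos = new Vector2(t.position.x, Screen.height - t.position.y);
            if (buttonRect.Contains(guiPos))
            {
                guiTouches.Add(t.fingerId);
                continue; // the GUI owns this touch, skip it
            }
            HandleGameplayTouch(t);
        }
    }

    void HandleGameplayTouch(Touch t)
    {
        // gameplay-specific touch handling goes here
    }
}
```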
Answer by equalsequals · Nov 19, 2009 at 08:38 PM
We originally tried OnGUI() for iPhone; it didn't really seem to work well (probably for the reasons you described above).
What we ended up doing was just implementing rectangles for various areas of interest on the screen, and then checking the touches to see if they were within one of those rectangles and at the correct phase. From there we just handled the touch appropriately.
Of course at this point it is no longer immediate-mode because we just had predefined GUITextures in the scene.
Hope that helps.
Not using OnGUI might indeed be the way to go, since they probably won't address this problem, as doing so would add latency.
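The rectangle approach from this answer could be sketched like so (the rects and log messages are illustrative; `Input.touches` is the later name for `iPhoneInput.touches`):

```csharp
using UnityEngine;

public class RectTouchHandler : MonoBehaviour
{
    // Predefined screen-space areas of interest (example values)
    public Rect pauseButtonArea = new Rect(0, 0, 64, 64);
    public Rect fireButtonArea  = new Rect(256, 0, 64, 64);

    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase != TouchPhase.Began)
                continue; // only react at the correct phase

            if (pauseButtonArea.Contains(touch.position))
                Debug.Log("Pause pressed");
            else if (fireButtonArea.Contains(touch.position))
                Debug.Log("Fire pressed");
            // touches outside every rect fall through to gameplay
        }
    }
}
```

A GUITexture drawn at each rect provides the visuals, so nothing goes through OnGUI at all.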
Answer by Bampf · Nov 20, 2009 at 03:14 AM
In my game the GUI does not normally overlap the gameplay area. The only time it does is when a "modal" dialog box is showing. So before reacting to a click or tap, the gameplay code asks the dialog box manager class if there's a dialog box showing. If there is, the click is ignored.
This works but is a little weak. It requires cooperation between my classes. If I were to write a new class that put up its own masking visual element, the gameplay code would have to check for that as well.
A more elaborate system might be the following: classes interested in events would register with an event manager class. Before taking any action on a click they would first check to see if they were the "top-most" event handler. If not, they would ignore the click.
This would handle modal dialog boxes, but if you wanted to allow the current event handler to "pass" and let the next handler take an event (a non-modal dialog box for example, when the user clicks outside) then you have a new set of problems to solve.
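The register-with-a-manager idea might be sketched as follows (all names here are hypothetical). Handlers are kept in a list used as a stack; only the top-most one may act on a click:

```csharp
using System.Collections.Generic;

public interface IClickHandler
{
    void OnClick(float x, float y);
}

public class EventManager
{
    private readonly List<IClickHandler> handlers = new List<IClickHandler>();

    public void Register(IClickHandler h)   { handlers.Add(h); }
    public void Unregister(IClickHandler h) { handlers.Remove(h); }

    // A dialog box registered last becomes top-most, implicitly modal.
    public bool IsTopMost(IClickHandler h)
    {
        return handlers.Count > 0 && handlers[handlers.Count - 1] == h;
    }

    public void DispatchClick(float x, float y)
    {
        if (handlers.Count > 0)
            handlers[handlers.Count - 1].OnClick(x, y);
    }
}
```

Gameplay code would call `IsTopMost(this)` before reacting; a dialog registers when it opens and unregisters when it closes. Letting a handler "pass" would require `DispatchClick` to walk down the list until someone accepts the event, which is exactly where the non-modal complications mentioned above begin.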
Answer by crevelop · Jun 21, 2011 at 01:01 AM
Use layers: create a custom layer for your touchable objects and ignore the rest, or limit your raycasting by creating a layer mask that ignores the objects or areas you don't want.
Check this out: he uses a layer mask that ignores a custom layer named GUI and the Ignore Raycast layer, and gets touch events from everything else. Apply the same concept, so your touch code only triggers when a ray reaches your object (on your custom layer).
http://www.youtube.com/watch?v=oOfPMKdJdKk
source code is here:
http://www.revelopment.co.uk/tutorials/unitytutorials/73-howtorotateanobjectbytouch
Hope it helps.
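A minimal sketch of that layer-mask approach, assuming a layer named "GUI" has been created in the Tag Manager and your GUI elements have been assigned to it:

```csharp
using UnityEngine;

public class TouchRaycaster : MonoBehaviour
{
    private int mask;

    void Start()
    {
        // Exclude the custom GUI layer and Unity's built-in
        // Ignore Raycast layer (index 2) from the raycast.
        mask = ~((1 << LayerMask.NameToLayer("GUI")) | (1 << 2));
    }

    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase != TouchPhase.Began)
                continue;

            Ray ray = Camera.main.ScreenPointToRay(touch.position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit, Mathf.Infinity, mask))
            {
                // Only objects outside the masked layers reach here,
                // so GUI elements never swallow gameplay touches.
                Debug.Log("Touched " + hit.collider.name);
            }
        }
    }
}
```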