(2) Cameras, (2) Displays, all UI events seem to go to one camera
Hello. I have two non-overlapping cameras on separate displays, with separate canvases and separate UI elements, and I want button click events to call the script on the appropriate camera.
The problem: instead, click events on either display call the script on only one of the cameras. Changing 'depth' on the cameras chooses which one gets the events, but I want each button's click events to call the script only on its own camera.
Do I have to abandon the editor-configured events and use something like ViewportPointToRay() (would that even work?), or is there a simpler way to fix this?
Thanks.
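For context, the manual alternative mentioned above would look roughly like the sketch below for clickable world-space objects (using ScreenPointToRay, the pixel-coordinate sibling of ViewportPointToRay, since Input.mousePosition is in pixels). This is a sketch only: it bypasses the UI event system rather than fixing it, and the component and field names are illustrative, not from the original project.

```csharp
using UnityEngine;

public class ManualClickRaycaster : MonoBehaviour
{
    public Camera targetCamera; // assumed: assigned per display in the Inspector

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Cast a ray from this camera through the click position.
            Ray ray = targetCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit))
                Debug.Log("Clicked: " + hit.collider.name);
        }
    }
}
```

Note that with multiple displays, Input.mousePosition alone does not say which display was clicked; Display.RelativeMouseAt would be needed to map the click to a display first, which is one reason this route is more work than fixing the event setup.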
May I ask how the events are being called? Are the buttons set up at runtime by code? If so, can you share your code? I am not sure I understood 100%, sorry if this has nothing to do with your question.
Thanks for your response. Events are set up in the editor prior to runtime, with the appropriate camera set as the onClick event object.
I have tried to replicate your case with 2 canvases and 2 cameras (but one monitor, I don't have 2) and the scripts are working fine. Do you have a project you can share with me in which it happens? Also, just to confirm: the canvases are set to Screen Space - Camera render mode, right?
Here is an example project that shows the issue.
https://1drv.ms/u/s!AjkOmmF8XhUUg711ZNemzRtQsHT7AQ
There are two cameras, two canvases, two displays, and two buttons. When the user clicks the button on either display, display1 receives the event. If the depth on either camera is changed to exceed the other's, events map to the associated display.
Again, the goal for me is to have events from elements on canvas1 to map to display1's camera. canvas2's element events to map to display2's camera.
Thanks.
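For reference, the editor-configured setup described above corresponds to roughly the following wiring (a sketch only: `canvas2` and `camera2` are illustrative names for the second display's objects, not names from the project):

```csharp
using UnityEngine;

public class SecondDisplaySetup : MonoBehaviour
{
    public Canvas canvas2; // the canvas meant for display 2
    public Camera camera2; // the camera rendering display 2

    void Start()
    {
        // Render this canvas through its own camera...
        canvas2.renderMode = RenderMode.ScreenSpaceCamera;
        canvas2.worldCamera = camera2;

        // ...and send that camera's output to the second display
        // (display indices are zero-based).
        camera2.targetDisplay = 1;
    }
}
```

In Screen Space - Camera mode the canvas follows its render camera's target display; Canvas.targetDisplay would only come into play in Screen Space - Overlay mode.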
Answer by xxmariofer · Mar 27, 2019 at 12:23 PM
Hello, you are missing two things. First, you are never enabling multi-display; that's done like this:
Debug.Log("Displays connected: " + Display.displays.Length);
// Display.displays[0] is the primary, default display and is always ON.
// Check if additional displays are available and activate each.
if (Display.displays.Length > 1)
    Display.displays[1].Activate();
if (Display.displays.Length > 2)
    Display.displays[2].Activate();
And second, remember that you CAN'T test multi-display from inside the Unity editor, so you will have to build the game or find a workaround. I have tested your project; adding those lines in Start (be sure you only do it once) and building worked for me.
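To satisfy the "only do it once" caveat, one option is to put the activation behind a static guard; a minimal sketch (the component name is illustrative):

```csharp
using UnityEngine;

public class DisplayActivator : MonoBehaviour
{
    // Static, so the guard survives scene reloads within one player session.
    static bool activated;

    void Start()
    {
        if (activated) return;
        activated = true;

        // Display.displays[0] is the primary display and is always on;
        // activate any additional connected displays.
        for (int i = 1; i < Display.displays.Length; i++)
            Display.displays[i].Activate();
    }
}
```

Attaching this to an object in the first loaded scene means the additional displays are activated exactly once, no matter how many times scenes are reloaded.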
Yes, my sample project neglected to Activate() the display. But you're right - the editor makes it look like you can test multiple displays, but you can't interact with them. In other words, you can see display2 correctly, but clicking anything sends the event to display1. I'll just build to test. Thanks for your help!