Question by DM_J · Feb 08, 2021 at 02:54 AM · ui · multi-display
UI input only works on Display 1 in the editor.
I'm building a project that uses multiple displays. In the editor, UI input only works on Display 1; in a built player, input works on the other displays as well. Having to build the project every time just to test the UI makes development difficult. How can I handle this?
Answer by DM_J · Feb 08, 2021 at 03:32 AM
I found a solution. I toggle the Graphic Raycaster of the UI canvas on Display 1 on and off depending on the mouse position.
Below is the code I use to read the mouse position (via the Win32 GetCursorPos API, which returns coordinates across the whole virtual desktop rather than just the game view).
using System.Runtime.InteropServices;

/// <summary>
/// Win32 call that returns the OS cursor position in virtual-desktop
/// (multi-monitor) coordinates.
/// </summary>
[DllImport("user32.dll")]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool GetCursorPos(out MousePosition lpMousePosition);

[StructLayout(LayoutKind.Sequential)]
public struct MousePosition
{
    public int x;
    public int y;

    public override string ToString()
    {
        return "[" + x + ", " + y + "]";
    }
}

// Cached cursor position, updated each frame.
MousePosition mPos;
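For reference, a minimal sketch of how the toggle itself might look (this is not the original poster's full code): the class name Display1RaycasterToggle, the display1Raycaster field, and the display1Bounds rect are assumptions for illustration, and the bounds would need to match your actual monitor layout.

using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.UI;

public class Display1RaycasterToggle : MonoBehaviour
{
    [DllImport("user32.dll")]
    [return: MarshalAs(UnmanagedType.Bool)]
    private static extern bool GetCursorPos(out MousePosition lpMousePosition);

    [StructLayout(LayoutKind.Sequential)]
    public struct MousePosition
    {
        public int x;
        public int y;
    }

    // Graphic Raycaster on the canvas that targets Display 1.
    [SerializeField] private GraphicRaycaster display1Raycaster;

    // Area covered by Display 1 in virtual-desktop coordinates
    // (assumed values; adjust to your monitor arrangement).
    [SerializeField] private Rect display1Bounds = new Rect(0, 0, 1920, 1080);

    void Update()
    {
        if (GetCursorPos(out MousePosition pos))
        {
            bool overDisplay1 = display1Bounds.Contains(new Vector2(pos.x, pos.y));
            // Disable the raycaster while the cursor is on another display so
            // Display 1's canvas does not swallow UI events in the editor.
            display1Raycaster.enabled = overDisplay1;
        }
    }
}

Attach the component to any scene object and drag the Display 1 canvas's Graphic Raycaster into the display1Raycaster field in the Inspector.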