
My HUD UI events are being intercepted

I am building a 2D game with a HUD. The HUD has a button (for pause). In the game, I detect taps on the screen to fire a gun, using a GameObject that spans the screen. I reviewed EventSystems for UI events from here: https://www.youtube.com/watch?v=EVZiv7DLU6E

Screen taps are caught by implementing OnPointerClick on a GameObject that spans the screen. I am successfully generating click events when the screen is touched. However, this object is somehow blocking the UI of my HUD. My HUD is on a Canvas with its render mode set to Screen Space - Overlay. My understanding is that with Overlay the Canvas is the top-most object, so clicks on the pause button should not be intercepted.

My object hierarchy is:

Parent
-> ScreenTouchManager
-> HUD Canvas (Screen Space - Overlay)

with all elements at Z = 0. So I'm guessing this is not a z-order issue but an order-of-priority issue in the event handling: the Physics Raycaster on my camera is somehow beating the Graphic Raycaster on the Canvas. I suspect there is a small feature or gotcha here that is easy to solve, possibly to do with layers and blocking? I'm too new to Unity to figure this out.

EDIT: Just an extra note, I was able to solve this by moving my ScreenTouchManager onto the Canvas. But I still need an answer to why I can't get it to work the other way. I feel that it is a Unity thing that I should understand.
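For reference, here is a minimal sketch of the kind of ScreenTouchManager described above. The class and method names other than the EventSystems interface are assumptions, since the actual script isn't shown in the post; the firing logic is stubbed with a log call.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical sketch (not the poster's actual code): a screen-spanning
// GameObject that catches taps through the EventSystem. For OnPointerClick
// to fire on a non-UI object, the object needs a Collider (e.g. a
// BoxCollider2D sized to cover the screen) and the camera needs a
// Physics2DRaycaster (or PhysicsRaycaster for 3D colliders) attached,
// plus an EventSystem in the scene.
public class ScreenTouchManager : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        // Fire the gun toward the tapped screen position.
        Debug.Log("Tap at " + eventData.position);
    }
}
```

With this setup, both the camera's PhysicsRaycaster (hitting the collider) and the Canvas's Graphic Raycaster (hitting the pause button) can report hits for the same tap, which is where the conflict in the question arises.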
