explainer.md
Low-level inputs that intersect the DOM overlay rectangle (including transparent areas) will be forwarded to the overlay's DOM element for processing according to the usual DOM event propagation model, using event x/y coordinates mapped to the DOM overlay rectangle. For example, screen touch or ray inputs are converted to DOM input events including `"click"` events (required) and optionally also `mousedown`/`mousemove`/`mouseup` events if supported by the implementation.
If a WebXR application uses a DOM overlay in conjunction with XR input, it is possible that a user action could be interpreted both as an interaction with a DOM element and as 3D input to the XR application. For screen-based phone AR, a screen touch represents a ray being cast into the world which the application can use to trigger interactions such as placing a virtual object on the floor. The ambiguity arises when the touched screen location also shows a DOM element in the overlay. In this case, the user's intent may be to use a world-interaction ray, for example when touching a transparent part of the DOM overlay, or it may be to interact with a DOM UI element such as a button. The application should be able to control whether screen touches are treated as world interactions, as DOM UI interactions, or as both.
The same situation also applies to headset-based AR, where the DOM overlay is displayed as a floating rectangle along with a tracked motion controller. DOM UI interactions would typically be based on a ray emanating from the controller. Whenever this ray intersects the DOM overlay, the UA emits DOM events at the intersected element, such as a `click` event when the controller's primary trigger is pressed. If the same controller and primary trigger are also used for world interactions, the application needs to be able to disambiguate the input. For example, if the application uses the primary trigger to activate a sculpting tool, this should be temporarily suppressed while the user is trying to click a button shown in the DOM overlay.
More specifically, WebXR's [input events](https://github.com/immersive-web/webxr/blob/master/input-explainer.md#input-events) (`"selectstart"`, `"selectend"`, and `"select"`) potentially duplicate DOM events when the user is interacting with a part of the scene covered by the DOM overlay, including transparent areas. To help applications disambiguate, the user agent generates a `beforexrselect` event on the clicked/touched DOM element. If the application calls `preventDefault()` on the event, the WebXR "select" events are suppressed. The `beforexrselect` event bubbles, so the application can set an event handler on a container DOM element to prevent XR "select" events for all elements within this container.
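As a minimal sketch of this mechanism, an application could cancel the duplicate XR events in a `beforexrselect` handler. The element wiring is shown only in a comment since the page structure here is hypothetical:

```javascript
// Handler for the UA-generated `beforexrselect` event. Calling
// preventDefault() suppresses the WebXR "selectstart"/"select"/"selectend"
// events that would otherwise duplicate the DOM interaction.
function onBeforeXRSelect(event) {
  event.preventDefault();
}

// In a page this would be wired up as (element id is hypothetical):
//   document.getElementById('ui-panel')
//     .addEventListener('beforexrselect', onBeforeXRSelect);
```

Because the event bubbles, the handler can equally be installed on a container element to cover all UI elements inside it.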
A typical application would set such a `beforexrselect` handler for regions of the DOM UI that contain touchable UI elements. This often corresponds to opaque regions of the DOM overlay, but can be different. For example, noninteractive text regions could be treated as non-touchable (without a `beforexrselect` handler) so that they don't block the user from world interactions in those areas. Conversely, the application could set a `beforexrselect` handler on a transparent container element that's slightly larger than the UI elements in it, so that a slightly inaccurate touch that barely missed a button doesn't trigger an unexpected world interaction.
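One way to sketch this selective behavior, assuming a hypothetical selector for the page's interactive elements, is a single bubbling handler on the overlay root that only cancels XR events when the touched target is (or is inside) an interactive element:

```javascript
// Hypothetical selector for the page's touchable UI elements; a `data-xr-ui`
// attribute could mark transparent containers that should also capture input.
const INTERACTIVE = 'button, a, input, select, textarea, [data-xr-ui]';

// Bubbling `beforexrselect` handler for the overlay root: suppress XR
// "select" events only for interactive regions, leaving transparent and
// plain-text areas available for world interactions.
function containerBeforeXRSelect(event) {
  if (event.target.closest(INTERACTIVE)) {
    event.preventDefault();
  }
}
```

This keeps the decision in one place instead of attaching handlers to each individual button.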
WebXR also supports non-event-based input. This includes controller poses, button/axis states, and transient XR input sources such as [Screen-based input](https://github.com/immersive-web/webxr/blob/master/input-explainer.md#screen) that creates 3D targeting rays based on 2D screen touch points. These inputs are not affected by DOM overlays and continue to be processed as usual. Applications are responsible for deduplicating these non-event inputs if they overlap with DOM events, though it's recommended to avoid UI designs that depend on this. There are many ambiguous corner cases, for example pointer movement that starts on a DOM element and ends outside it.
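For the non-event inputs, deduplication logic is entirely application-defined. One possible sketch (the notion of "handled UI rectangles" and the coordinate space are assumptions, not part of the API) is to ignore a transient screen input whose originating touch point falls inside a region the overlay UI already handled:

```javascript
// Returns true if the point (x, y) lies inside rect, where rect has
// { left, top, width, height } in the same coordinate space as the touch.
function pointInRect(x, y, rect) {
  return x >= rect.left && x < rect.left + rect.width &&
         y >= rect.top && y < rect.top + rect.height;
}

// App-side deduplication sketch for screen-based transient inputs:
// skip the world-interaction ray if the touch landed on UI that the
// DOM overlay already handled.
function shouldUseWorldRay(touchX, touchY, handledUiRects) {
  return !handledUiRects.some((r) => pointInRect(touchX, touchY, r));
}
```

As the text notes, UI designs that depend on this kind of deduplication are best avoided; the helper above does not resolve corner cases such as pointer movement that starts on a DOM element and ends outside it.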
If using a DOM overlay in a headset, the implementation should behave the same as for smartphones whenever the primary XR controller's targeting ray intersects the displayed DOM overlay. In other words, generate `"click"` events when the primary trigger is pressed, subject to whatever additional restrictions are specified, such as those for disambiguation.