* translate touch events from glium to egui
Unfortunately, winit does not seem to create _Touch_ events for the touchpad
on my Mac. Only _TouchpadPressure_ events are sent.
Found some issues (like
[this](https://github.com/rust-windowing/winit/issues/54)), but I am not sure
what exactly they mean: sometimes, touch events are mixed with
touch-to-pointer translation in the discussions.
* translate touch events from web_sys to egui
There are a few open topics:
- egui_web currently translates touch events into pointer events.
I guess this should change, such that egui itself performs this kind of
conversion.
- `pub fn egui_web::pos_from_touch_event` is a public function, but I
would like to change the return type to an `Option`. Shouldn't this
function be private, anyway?
* introduce `TouchState` and `Gesture`
`InputState::touch` was introduced with type `TouchState`, just as
`InputState::pointer` is of type `PointerState`.
The TouchState internally relies on a collection of `Gesture`s. This commit
provides the first rudimentary implementation of a Gesture, but has no
functionality, yet.
* add method InputState::zoom()
So far, the method always returns `None`, but it should work as soon as the
`Zoom` gesture is implemented.
* manage one `TouchState` per individual device
Although quite unlikely, it is still possible to connect more than one touch
device. (I have three touch pads connected to my MacBook in total, but
unfortunately `winit` sends touch events for none of them.)
We do not want to mix up the touches from different devices.
* implement control loop for gesture detection
The basic idea is that each gesture can focus on detection logic and does not
have to care (too much) about managing touch state in general.
* streamline `Gesture` trait, simplifying impl's
* implement first version of Zoom gesture
* fix failing doctest
a simple `TODO` should be enough
* get rid of `Gesture`s
* Provide a Zoom/Rotate window in the demo app
For now, it works for two fingers only. The third finger interrupts the
gesture.
Bugs:
- Pinching in the demo window also moves the window -> Pointer events must be
ignored when touch is active
- Pinching also works when doing it outside the demo window -> it would be nice
to return the touch info in the `Response` of the painter allocation
* fix comments and non-idiomatic code
* update touch state *each frame*
* change egui_demo to use *relative* touch data
* support more than two fingers
This commit includes an improved Demo Window for egui_demo, and a complete
re-write of the gesture detection. The PR should be ready for review soon.
* cleanup code and comments for review
* minor code simplifications
* oops – forgot the changelog
* resolve comment fee8ed83db (r623226656)
* accept suggestion https://github.com/emilk/egui/pull/306#discussion_r623229228
Co-authored-by: Emil Ernerfeldt <emil.ernerfeldt@gmail.com>
* fix syntax error (d'oh!)
* remove `dbg!` (why didn't clippy see this?)
* apply suggested diffs from review
* fix conversion of physical location to Pos2
* remove redundant type `TouchAverages`
* remove trailing space
* avoid initial translation jump in plot demo
* extend the demo so it shows off translation
Co-authored-by: Emil Ernerfeldt <emil.ernerfeldt@gmail.com>
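The gesture detection described in these messages boils down to comparing per-frame averages of the finger positions: the zoom factor is the ratio of the average finger spread in two consecutive frames. A standalone sketch of that idea, with all names hypothetical (the real implementation lives in egui's `TouchState`):

```rust
// Illustrative sketch: derive a zoom delta from two consecutive frames of
// finger positions. Names are made up; this is not the egui API.

#[derive(Clone, Copy)]
struct Pos {
    x: f32,
    y: f32,
}

/// Average distance of all touch points from their common center.
fn avg_distance_from_center(touches: &[Pos]) -> f32 {
    let n = touches.len() as f32;
    let cx = touches.iter().map(|p| p.x).sum::<f32>() / n;
    let cy = touches.iter().map(|p| p.y).sum::<f32>() / n;
    touches
        .iter()
        .map(|p| ((p.x - cx).powi(2) + (p.y - cy).powi(2)).sqrt())
        .sum::<f32>()
        / n
}

/// Zoom is the ratio of the average finger spread in two consecutive frames.
fn zoom_delta(previous: &[Pos], current: &[Pos]) -> f32 {
    avg_distance_from_center(current) / avg_distance_from_center(previous)
}

fn main() {
    // Two fingers move from 2 units apart to 4 units apart => zoom factor 2:
    let prev = [Pos { x: 0.0, y: 0.0 }, Pos { x: 2.0, y: 0.0 }];
    let curr = [Pos { x: 0.0, y: 0.0 }, Pos { x: 4.0, y: 0.0 }];
    assert!((zoom_delta(&prev, &curr) - 2.0).abs() < 1e-6);
}
```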
This commit is contained in: (parent 0d71017ad4, commit 03721dbfd8)
11 changed files with 686 additions and 15 deletions
@@ -15,6 +15,9 @@ NOTE: [`eframe`](eframe/CHANGELOG.md), [`egui_web`](egui_web/CHANGELOG.md) and [
 * [Pan and zoom plots](https://github.com/emilk/egui/pull/317).
 * [Users can now store custom state in `egui::Memory`.](https://github.com/emilk/egui/pull/257).
 * Zoom input: ctrl-scroll and (on `egui_web`) trackpad-pinch gesture.
+* Support for raw [multi touch](https://github.com/emilk/egui/pull/306) events,
+  enabling zoom, rotate, and more. Works with `egui_web` on mobile devices,
+  and should work with `egui_glium` for certain touch devices/screens.

 ### Changed 🔧

 * Make `Memory::has_focus` public (again).
@@ -96,7 +96,7 @@ impl RawInput {
 /// An input event generated by the integration.
 ///
 /// This only covers events that egui cares about.
-#[derive(Clone, Debug, Eq, PartialEq)]
+#[derive(Clone, Debug, PartialEq)]
 pub enum Event {
     /// The integration detected a "copy" event (e.g. Cmd+C).
     Copy,
@@ -133,6 +133,22 @@ pub enum Event {
     CompositionUpdate(String),
     /// IME composition ended with this final result.
     CompositionEnd(String),
+
+    Touch {
+        /// Hashed device identifier (if available; may be zero).
+        /// Can be used to separate touches from different devices.
+        device_id: TouchDeviceId,
+        /// Unique identifier of a finger/pen. Value is stable from touch down
+        /// to lift-up.
+        id: TouchId,
+        phase: TouchPhase,
+        /// Position of the touch (or where the touch was last detected).
+        pos: Pos2,
+        /// Describes how hard the touch device was pressed. May always be `0` if the platform does
+        /// not support pressure sensitivity.
+        /// The value is in the range from 0.0 (no pressure) to 1.0 (maximum pressure).
+        force: f32,
+    },
 }

 /// Mouse button (or similar for touch input)
@@ -296,3 +312,47 @@ impl RawInput {
         .on_hover_text("key presses etc");
     }
 }
+
+/// This is a `u64`, as values of this kind can always be obtained by hashing.
+#[derive(Clone, Copy, Debug, Eq, PartialEq, PartialOrd, Ord)]
+pub struct TouchDeviceId(pub u64);
+
+/// Unique identification of a touch occurrence (finger or pen or ...).
+/// A Touch ID is valid until the finger is lifted.
+/// A new ID is used for the next touch.
+#[derive(Clone, Copy, Debug, Eq, PartialEq, PartialOrd, Ord)]
+pub struct TouchId(pub u64);
+
+#[derive(Clone, Copy, Debug, Eq, PartialEq)]
+pub enum TouchPhase {
+    /// User just placed a touch point on the touch surface.
+    Start,
+    /// User moves a touch point along the surface. This event is also sent when
+    /// any attributes (position, force, ...) of the touch point change.
+    Move,
+    /// User lifted the finger or pen from the surface, or slid off the edge of
+    /// the surface.
+    End,
+    /// Touch operation has been disrupted by something (various reasons are possible,
+    /// maybe a pop-up alert or any other kind of interruption which may not have
+    /// been intended by the user).
+    Cancel,
+}
+
+impl From<u64> for TouchId {
+    fn from(id: u64) -> Self {
+        Self(id)
+    }
+}
+
+impl From<i32> for TouchId {
+    fn from(id: i32) -> Self {
+        Self(id as u64)
+    }
+}
+
+impl From<u32> for TouchId {
+    fn from(id: u32) -> Self {
+        Self(id as u64)
+    }
+}
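`TouchDeviceId` being a plain `u64` means that any hashable backend device handle can be mapped into it, which is what the `egui_glium` integration does with `DefaultHasher`. A standalone sketch of that mapping (the struct is redefined locally, and the helper name `device_id_from` is made up for illustration):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

#[derive(Clone, Copy, Debug, Eq, PartialEq, PartialOrd, Ord)]
pub struct TouchDeviceId(pub u64);

/// Map any hashable backend device handle to a `TouchDeviceId`.
/// (Hypothetical helper; the real backend inlines this.)
fn device_id_from<T: Hash>(handle: &T) -> TouchDeviceId {
    let mut hasher = DefaultHasher::new();
    handle.hash(&mut hasher);
    TouchDeviceId(hasher.finish())
}

fn main() {
    // `DefaultHasher::new()` is deterministic within a process, so the same
    // handle always yields the same device id:
    assert_eq!(device_id_from(&"pad-0"), device_id_from(&"pad-0"));
    assert_ne!(device_id_from(&"pad-0"), device_id_from(&"pad-1"));
}
```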
@@ -1,8 +1,12 @@
+mod touch_state;
+
 use crate::data::input::*;
 use crate::{emath::*, util::History};
-use std::collections::HashSet;
+use std::collections::{BTreeMap, HashSet};

 pub use crate::data::input::Key;
+pub use touch_state::MultiTouchInfo;
+use touch_state::TouchState;

 /// If the pointer moves more than this, it is no longer a click (but maybe a drag)
 const MAX_CLICK_DIST: f32 = 6.0; // TODO: move to settings
@@ -15,9 +19,13 @@ pub struct InputState {
     /// The raw input we got this frame from the backend.
     pub raw: RawInput,

-    /// State of the mouse or touch.
+    /// State of the mouse or simple touch gestures which can be mapped to mouse operations.
     pub pointer: PointerState,

+    /// State of touches, except those covered by PointerState (like clicks and drags).
+    /// (We keep a separate `TouchState` for each encountered touch device.)
+    touch_states: BTreeMap<TouchDeviceId, TouchState>,
+
     /// How many pixels the user scrolled.
     pub scroll_delta: Vec2,
@@ -55,6 +63,7 @@ impl Default for InputState {
         Self {
             raw: Default::default(),
             pointer: Default::default(),
+            touch_states: Default::default(),
             scroll_delta: Default::default(),
             screen_rect: Rect::from_min_size(Default::default(), vec2(10_000.0, 10_000.0)),
             pixels_per_point: 1.0,
@@ -70,7 +79,7 @@ impl Default for InputState {

 impl InputState {
     #[must_use]
-    pub fn begin_frame(self, new: RawInput) -> InputState {
+    pub fn begin_frame(mut self, new: RawInput) -> InputState {
         #![allow(deprecated)] // for screen_size

         let time = new
@@ -84,6 +93,10 @@ impl InputState {
                 self.screen_rect
             }
         });
+        self.create_touch_states_for_new_devices(&new.events);
+        for touch_state in self.touch_states.values_mut() {
+            touch_state.begin_frame(time, &new, self.pointer.interact_pos);
+        }
         let pointer = self.pointer.begin_frame(time, &new);
         let mut keys_down = self.keys_down;
         for event in &new.events {
@@ -97,6 +110,7 @@ impl InputState {
         }
         InputState {
             pointer,
+            touch_states: self.touch_states,
             scroll_delta: new.scroll_delta,
             screen_rect,
             pixels_per_point: new.pixels_per_point.unwrap_or(self.pixels_per_point),
@@ -121,7 +135,13 @@ impl InputState {
     /// * `zoom > 1`: pinch spread
     #[inline(always)]
     pub fn zoom_delta(&self) -> f32 {
-        self.raw.zoom_delta
+        // If a multi touch gesture is detected, it measures the exact and linear proportions of
+        // the distances of the finger tips. It is therefore potentially more accurate than
+        // `raw.zoom_delta` which is based on the `ctrl-scroll` event which, in turn, may be
+        // synthesized from an original touch gesture.
+        self.multi_touch()
+            .map(|touch| touch.zoom_delta)
+            .unwrap_or(self.raw.zoom_delta)
     }

     pub fn wants_repaint(&self) -> bool {
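`zoom_delta()` is a per-frame factor, so callers accumulate it multiplicatively rather than additively. A minimal sketch of that accumulation (the delta values are made up for illustration):

```rust
// Per-frame zoom deltas multiply up to a total zoom factor,
// the same pattern as `self.zoom *= multi_touch.zoom_delta` in the demo.
fn main() {
    let frame_deltas = [1.0_f32, 1.1, 1.1, 0.9]; // hypothetical `zoom_delta()` per frame
    let mut zoom = 1.0_f32;
    for delta in frame_deltas {
        zoom *= delta;
    }
    // 1.0 * 1.1 * 1.1 * 0.9 = 1.089:
    assert!((zoom - 1.089).abs() < 1e-3);
}
```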
@@ -188,6 +208,52 @@ impl InputState {
         // TODO: multiply by ~3 for touch inputs because fingers are fat
         self.physical_pixel_size()
     }
+
+    /// Returns details about the currently ongoing multi-touch gesture, if any. Note that this
+    /// method returns `None` for single-touch gestures (click, drag, …).
+    ///
+    /// ```
+    /// # use egui::emath::Rot2;
+    /// # let ui = &mut egui::Ui::__test();
+    /// let mut zoom = 1.0; // no zoom
+    /// let mut rotation = 0.0; // no rotation
+    /// if let Some(multi_touch) = ui.input().multi_touch() {
+    ///     zoom *= multi_touch.zoom_delta;
+    ///     rotation += multi_touch.rotation_delta;
+    /// }
+    /// let transform = zoom * Rot2::from_angle(rotation);
+    /// ```
+    ///
+    /// Not all touch devices are supported, and the details depend on the `egui`
+    /// integration backend you are using. `egui_web` supports multi touch for most mobile
+    /// devices, but not for a `Trackpad` on `MacOS`, for example. The backend has to be able to
+    /// capture native touch events, but many browsers seem to pass such events only for touch
+    /// _screens_, but not touch _pads_.
+    ///
+    /// Refer to [`MultiTouchInfo`] for details about the touch information available.
+    ///
+    /// Consider using `zoom_delta()` instead of `MultiTouchInfo::zoom_delta`, as the former
+    /// falls back to a synthetic zoom factor based on ctrl-scroll events.
+    pub fn multi_touch(&self) -> Option<MultiTouchInfo> {
+        // In case of multiple touch devices, simply pick the touch state of the first active device:
+        if let Some(touch_state) = self.touch_states.values().find(|t| t.is_active()) {
+            touch_state.info()
+        } else {
+            None
+        }
+    }
+
+    /// Scans `events` for device IDs of touch devices we have not seen before,
+    /// and creates a new `TouchState` for each such device.
+    fn create_touch_states_for_new_devices(&mut self, events: &[Event]) {
+        for event in events {
+            if let Event::Touch { device_id, .. } = event {
+                self.touch_states
+                    .entry(*device_id)
+                    .or_insert_with(|| TouchState::new(*device_id));
+            }
+        }
+    }
 }

 // ----------------------------------------------------------------------------
@@ -517,6 +583,7 @@ impl InputState {
         let Self {
             raw,
             pointer,
+            touch_states,
             scroll_delta,
             screen_rect,
             pixels_per_point,
@@ -537,6 +604,12 @@ impl InputState {
             pointer.ui(ui);
         });

+        for (device_id, touch_state) in touch_states {
+            ui.collapsing(format!("Touch State [device {}]", device_id.0), |ui| {
+                touch_state.ui(ui)
+            });
+        }
+
         ui.label(format!("scroll_delta: {:?} points", scroll_delta));
         ui.label(format!("screen_rect: {:?} points", screen_rect));
         ui.label(format!(
egui/src/input_state/touch_state.rs (new file, 273 lines)
@@ -0,0 +1,273 @@
use std::{
    collections::BTreeMap,
    f32::consts::{PI, TAU},
    fmt::Debug,
};

use crate::{data::input::TouchDeviceId, Event, RawInput, TouchId, TouchPhase};
use epaint::emath::{Pos2, Vec2};

/// All you probably need to know about a multi-touch gesture.
pub struct MultiTouchInfo {
    /// Point in time when the gesture started.
    pub start_time: f64,
    /// Position of the pointer at the time the gesture started.
    pub start_pos: Pos2,
    /// Number of touches (fingers) on the surface. Value is ≥ 2 since for a single touch no
    /// `MultiTouchInfo` is created.
    pub num_touches: usize,
    /// Zoom factor (Pinch or Zoom). Moving fingers closer together or further apart will change
    /// this value. This is a relative value, comparing the average distances of the fingers in
    /// the current and previous frame. If the fingers did not move since the previous frame,
    /// this value is `1.0`.
    pub zoom_delta: f32,
    /// Rotation in radians. Moving fingers around each other will change this value. This is a
    /// relative value, comparing the orientation of fingers in the current frame with the previous
    /// frame. If all fingers are resting, this value is `0.0`.
    pub rotation_delta: f32,
    /// Relative movement (comparing previous frame and current frame) of the average position of
    /// all touch points. Without movement this value is `Vec2::ZERO`.
    ///
    /// Note that this may not necessarily be measured in screen points (although it _will_ be for
    /// most mobile devices). In general (depending on the touch device), touch coordinates cannot
    /// be directly mapped to the screen. A touch is always considered to start at the position of
    /// the pointer, but touch movement is always measured in the units delivered by the device,
    /// and may depend on hardware and system settings.
    pub translation_delta: Vec2,
    /// Current force of the touch (average of the forces of the individual fingers). This is a
    /// value in the interval `[0.0 ..= 1.0]`.
    ///
    /// Note 1: A value of `0.0` either indicates a very light touch, or it means that the device
    /// is not capable of measuring the touch force at all.
    ///
    /// Note 2: Just increasing the physical pressure without actually moving the finger may not
    /// necessarily lead to a change of this value.
    pub force: f32,
}
/// The current state (for a specific touch device) of touch events and gestures.
#[derive(Clone)]
pub(crate) struct TouchState {
    /// Technical identifier of the touch device. This is used to identify relevant touch events
    /// for this `TouchState` instance.
    device_id: TouchDeviceId,
    /// Active touches, if any.
    ///
    /// `TouchId` is the unique identifier of the touch. It is valid as long as the finger/pen
    /// touches the surface. The next touch will receive a new unique ID.
    ///
    /// Refer to [`ActiveTouch`].
    active_touches: BTreeMap<TouchId, ActiveTouch>,
    /// If a gesture has been recognized (i.e. when exactly two fingers touch the surface), this
    /// holds state information.
    gesture_state: Option<GestureState>,
}

#[derive(Clone, Debug)]
struct GestureState {
    start_time: f64,
    start_pointer_pos: Pos2,
    previous: Option<DynGestureState>,
    current: DynGestureState,
}

/// Gesture data that can change over time.
#[derive(Clone, Copy, Debug)]
struct DynGestureState {
    avg_distance: f32,
    avg_pos: Pos2,
    avg_force: f32,
    heading: f32,
}

/// Describes an individual touch (finger or digitizer) on the touch surface. Instances exist as
/// long as the finger/pen touches the surface.
#[derive(Clone, Copy, Debug)]
struct ActiveTouch {
    /// Current position of this touch, in device coordinates (not necessarily screen position).
    pos: Pos2,
    /// Current force of the touch. A value in the interval `[0.0 ..= 1.0]`.
    ///
    /// Note that a value of `0.0` either indicates a very light touch, or it means that the
    /// device is not capable of measuring the touch force.
    force: f32,
}
impl TouchState {
    pub fn new(device_id: TouchDeviceId) -> Self {
        Self {
            device_id,
            active_touches: Default::default(),
            gesture_state: None,
        }
    }

    pub fn begin_frame(&mut self, time: f64, new: &RawInput, pointer_pos: Option<Pos2>) {
        let mut added_or_removed_touches = false;
        for event in &new.events {
            match *event {
                Event::Touch {
                    device_id,
                    id,
                    phase,
                    pos,
                    force,
                } if device_id == self.device_id => match phase {
                    TouchPhase::Start => {
                        self.active_touches.insert(id, ActiveTouch { pos, force });
                        added_or_removed_touches = true;
                    }
                    TouchPhase::Move => {
                        if let Some(touch) = self.active_touches.get_mut(&id) {
                            touch.pos = pos;
                            touch.force = force;
                        }
                    }
                    TouchPhase::End | TouchPhase::Cancel => {
                        self.active_touches.remove(&id);
                        added_or_removed_touches = true;
                    }
                },
                _ => (),
            }
        }
        // This needs to be called each frame, even if there are no new touch events.
        // Otherwise, we would send the same old delta information multiple times:
        self.update_gesture(time, pointer_pos);

        if added_or_removed_touches {
            // Adding or removing fingers makes the average values "jump". We better forget
            // about the previous values, and don't create delta information for this frame:
            if let Some(ref mut state) = &mut self.gesture_state {
                state.previous = None;
            }
        }
    }

    pub fn is_active(&self) -> bool {
        self.gesture_state.is_some()
    }
    pub fn info(&self) -> Option<MultiTouchInfo> {
        self.gesture_state.as_ref().map(|state| {
            // state.previous can be `None` when the number of simultaneous touches has just
            // changed. In this case, we take `current` as `previous`, pretending that there
            // was no change for the current frame.
            let state_previous = state.previous.unwrap_or(state.current);
            MultiTouchInfo {
                start_time: state.start_time,
                start_pos: state.start_pointer_pos,
                num_touches: self.active_touches.len(),
                zoom_delta: state.current.avg_distance / state_previous.avg_distance,
                rotation_delta: normalized_angle(state.current.heading, state_previous.heading),
                translation_delta: state.current.avg_pos - state_previous.avg_pos,
                force: state.current.avg_force,
            }
        })
    }
    fn update_gesture(&mut self, time: f64, pointer_pos: Option<Pos2>) {
        if let Some(dyn_state) = self.calc_dynamic_state() {
            if let Some(ref mut state) = &mut self.gesture_state {
                // updating an ongoing gesture
                state.previous = Some(state.current);
                state.current = dyn_state;
            } else if let Some(pointer_pos) = pointer_pos {
                // starting a new gesture
                self.gesture_state = Some(GestureState {
                    start_time: time,
                    start_pointer_pos: pointer_pos,
                    previous: None,
                    current: dyn_state,
                });
            }
        } else {
            // the end of a gesture (if there is any)
            self.gesture_state = None;
        }
    }
    fn calc_dynamic_state(&self) -> Option<DynGestureState> {
        let num_touches = self.active_touches.len();
        if num_touches < 2 {
            None
        } else {
            let mut state = DynGestureState {
                avg_distance: 0.,
                avg_pos: Pos2::ZERO,
                avg_force: 0.,
                heading: 0.,
            };
            let num_touches_recip = 1. / num_touches as f32;

            // first pass: calculate force and center of touch positions:
            for touch in self.active_touches.values() {
                state.avg_force += touch.force;
                state.avg_pos.x += touch.pos.x;
                state.avg_pos.y += touch.pos.y;
            }
            state.avg_force *= num_touches_recip;
            state.avg_pos.x *= num_touches_recip;
            state.avg_pos.y *= num_touches_recip;

            // second pass: calculate distances from center:
            for touch in self.active_touches.values() {
                state.avg_distance += state.avg_pos.distance(touch.pos);
            }
            state.avg_distance *= num_touches_recip;

            // Calculate the direction from the first touch to the center position.
            // This is not the perfect way of calculating the direction if more than two fingers
            // are involved, but as long as all fingers rotate more or less at the same angular
            // velocity, the shortcomings of this method will not be noticed. One can see the
            // issues though, when touching with three or more fingers, and moving only one of them
            // (it takes two hands to do this in a controlled manner). A better technique would be
            // to store the current and previous directions (with reference to the center) for each
            // touch individually, and then calculate the average of all individual changes in
            // direction. But this approach cannot be implemented locally in this method, making
            // everything a bit more complicated.
            let first_touch = self.active_touches.values().next().unwrap();
            state.heading = (state.avg_pos - first_touch.pos).angle();

            Some(state)
        }
    }
}
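The heading computed above is just the direction from the first touch to the centroid, which reduces to an `atan2` call. A standalone sketch of the same math (plain tuples stand in for `Pos2`; the values are made up):

```rust
// Direction from the first touch to the center of all touches,
// the quantity `state.heading` stores above, computed with atan2.
fn main() {
    let first = (0.0_f32, 0.0_f32); // hypothetical first touch
    let center = (1.0_f32, 1.0_f32); // hypothetical centroid of all touches
    let heading = (center.1 - first.1).atan2(center.0 - first.0);
    // The vector (1, 1) points 45 degrees from the x-axis:
    assert!((heading - std::f32::consts::FRAC_PI_4).abs() < 1e-6);
}
```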
impl TouchState {
    pub fn ui(&self, ui: &mut crate::Ui) {
        ui.label(format!("{:?}", self));
    }
}

impl Debug for TouchState {
    // This outputs less clutter than `#[derive(Debug)]`:
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        for (id, touch) in self.active_touches.iter() {
            f.write_fmt(format_args!("#{:?}: {:#?}\n", id, touch))?;
        }
        f.write_fmt(format_args!("gesture: {:#?}\n", self.gesture_state))?;
        Ok(())
    }
}

/// Calculate the difference between two directions, such that the absolute value of the result
/// is minimized.
fn normalized_angle(current_direction: f32, previous_direction: f32) -> f32 {
    let mut angle = current_direction - previous_direction;
    angle %= TAU;
    if angle > PI {
        angle -= TAU;
    } else if angle < -PI {
        angle += TAU;
    }
    angle
}

#[test]
fn normalizing_angle_from_350_to_0_yields_10() {
    assert!(
        (normalized_angle(0_f32.to_radians(), 350_f32.to_radians()) - 10_f32.to_radians()).abs()
            <= 5. * f32::EPSILON // many conversions (= divisions) involved => larger error
    );
}
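The wrap-around behavior of `normalized_angle` works in both directions. A standalone copy of the function above (reproduced here only for illustration), exercised across the 0°/360° boundary:

```rust
use std::f32::consts::{PI, TAU};

/// Difference between two directions with minimal absolute value
/// (standalone copy of the function defined above).
fn normalized_angle(current_direction: f32, previous_direction: f32) -> f32 {
    let mut angle = current_direction - previous_direction;
    angle %= TAU;
    if angle > PI {
        angle -= TAU;
    } else if angle < -PI {
        angle += TAU;
    }
    angle
}

fn main() {
    // Rotating from 350 degrees to 10 degrees is a +20 degree rotation, not -340:
    let delta = normalized_angle(10_f32.to_radians(), 350_f32.to_radians());
    assert!((delta - 20_f32.to_radians()).abs() < 1e-5);
    // And the other way around it is -20 degrees:
    let delta = normalized_angle(350_f32.to_radians(), 10_f32.to_radians());
    assert!((delta + 20_f32.to_radians()).abs() < 1e-5);
}
```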
@@ -339,7 +339,7 @@ pub use {
     },
     grid::Grid,
     id::Id,
-    input_state::{InputState, PointerState},
+    input_state::{InputState, MultiTouchInfo, PointerState},
     layers::{LayerId, Order},
     layout::*,
     memory::Memory,
@@ -16,6 +16,7 @@ impl Default for Demos {
         let demos: Vec<Box<dyn super::Demo>> = vec![
             Box::new(super::dancing_strings::DancingStrings::default()),
             Box::new(super::drag_and_drop::DragAndDropDemo::default()),
+            Box::new(super::zoom_rotate::ZoomRotate::default()),
             Box::new(super::font_book::FontBook::default()),
             Box::new(super::DemoWindow::default()),
             Box::new(super::painting::Painting::default()),
@@ -23,6 +23,7 @@ pub mod toggle_switch;
 pub mod widget_gallery;
 mod widgets;
 pub mod window_options;
+pub mod zoom_rotate;

 pub use {app::*, demo_window::DemoWindow, demo_windows::*, widgets::Widgets};
egui_demo_lib/src/apps/demo/zoom_rotate.rs (new file, 140 lines)
@@ -0,0 +1,140 @@
use egui::{
    emath::{RectTransform, Rot2},
    vec2, Color32, Frame, Pos2, Rect, Sense, Stroke, Vec2,
};

pub struct ZoomRotate {
    previous_arrow_start_offset: Vec2,
    rotation: f32,
    smoothed_velocity: Vec2,
    translation: Vec2,
    zoom: f32,
}

impl Default for ZoomRotate {
    fn default() -> Self {
        Self {
            previous_arrow_start_offset: Vec2::ZERO,
            rotation: 0.,
            smoothed_velocity: Vec2::ZERO,
            translation: Vec2::ZERO,
            zoom: 1.,
        }
    }
}
impl super::Demo for ZoomRotate {
    fn name(&self) -> &'static str {
        "👌 Multi Touch"
    }

    fn show(&mut self, ctx: &egui::CtxRef, open: &mut bool) {
        egui::Window::new(self.name())
            .open(open)
            .default_size(vec2(512.0, 512.0))
            .resizable(true)
            .show(ctx, |ui| {
                use super::View;
                self.ui(ui);
            });
    }
}
impl super::View for ZoomRotate {
    fn ui(&mut self, ui: &mut egui::Ui) {
        ui.vertical_centered(|ui| {
            ui.add(crate::__egui_github_link_file!());
        });
        ui.colored_label(
            Color32::RED,
            "This only works on devices which send native touch events (mostly mobiles).",
        );
        ui.separator();
        ui.label("Try touch gestures Pinch/Stretch, Rotation, and Pressure with 2+ fingers.");
        Frame::dark_canvas(ui.style()).show(ui, |ui| {
            // Note that we use `Sense::drag()` although we do not use any pointer events. With
            // the current implementation, the fact that a touch event of two or more fingers is
            // recognized does not mean that pointer events (which are always generated for the
            // first finger) are suppressed. Therefore, if we do not explicitly consume pointer
            // events, the window will move around, not only when dragged with a single finger,
            // but also when a two-finger touch is active. This problem can probably only be
            // cleanly solved when the synthetic pointer events are created by egui, and not by
            // the backend.

            // set up the drawing canvas with normalized coordinates:
            let (response, painter) =
                ui.allocate_painter(ui.available_size_before_wrap_finite(), Sense::drag());
            // normalize painter coordinates to ±1 units in each direction with [0,0] in the center:
            let painter_proportions = response.rect.square_proportions();
            let to_screen = RectTransform::from_to(
                Rect::from_min_size(Pos2::ZERO - painter_proportions, 2. * painter_proportions),
                response.rect,
            );
            let dt = ui.input().unstable_dt;
            // check for touch input (or the lack thereof) and update zoom and scale factors, plus
            // color and width:
            let mut stroke_width = 1.;
            let mut color = Color32::GRAY;
            if let Some(multi_touch) = ui.input().multi_touch() {
                // This adjusts the current zoom factor and rotation angle according to the dynamic
                // change (for the current frame) of the touch gesture:
                self.zoom *= multi_touch.zoom_delta;
                self.rotation += multi_touch.rotation_delta;
                // the translation we get from `multi_touch` needs to be scaled down to the
                // normalized coordinates we use as the basis for painting:
                self.translation += to_screen.inverse().scale() * multi_touch.translation_delta;
                // touch pressure shall make the arrow thicker (not all touch devices support this):
                stroke_width += 10. * multi_touch.force;
                // the drawing color depends on the number of touches:
                color = match multi_touch.num_touches {
                    2 => Color32::GREEN,
                    3 => Color32::BLUE,
                    4 => Color32::YELLOW,
                    _ => Color32::RED,
                };
            } else {
                // This has nothing to do with the touch gesture. It just smoothly brings the
                // painted arrow back into its original position, for a nice visual effect:
                const ZOOM_ROTATE_HALF_LIFE: f32 = 1.; // time [sec] after which half the amount of zoom/rotation will be reverted
                let half_life_factor = (-(2_f32.ln()) / ZOOM_ROTATE_HALF_LIFE * dt).exp();
                self.zoom = 1. + ((self.zoom - 1.) * half_life_factor);
                self.rotation *= half_life_factor;
                self.translation *= half_life_factor;
            }
            let zoom_and_rotate = self.zoom * Rot2::from_angle(self.rotation);
            let arrow_start_offset = self.translation + zoom_and_rotate * vec2(-0.5, 0.5);
            let current_velocity = (arrow_start_offset - self.previous_arrow_start_offset) / dt;
            self.previous_arrow_start_offset = arrow_start_offset;

            // aggregate the average velocity of the arrow's start position from latest samples:
            const NUM_SMOOTHING_SAMPLES: f32 = 10.;
            self.smoothed_velocity = ((NUM_SMOOTHING_SAMPLES - 1.) * self.smoothed_velocity
                + current_velocity)
                / NUM_SMOOTHING_SAMPLES;

            // Paint an arrow pointing from bottom-left (-0.5, 0.5) to top-right (0.5, -0.5), but
            // scaled, rotated, and translated according to the current touch gesture:
            let arrow_start = Pos2::ZERO + arrow_start_offset;
            let arrow_direction = zoom_and_rotate * vec2(1., -1.);
            painter.arrow(
                to_screen * arrow_start,
                to_screen.scale() * arrow_direction,
                Stroke::new(stroke_width, color),
            );
            // Paint a circle at the origin of the arrow. The size and opacity of the circle
            // depend on the current velocity, and the circle is translated in the opposite
            // direction of the movement, so it follows the origin's movement. Constant factors
            // have been determined by trial and error.
            let speed = self.smoothed_velocity.length();
            painter.circle_filled(
                to_screen * (arrow_start - 0.2 * self.smoothed_velocity),
                2. + to_screen.scale().length() * 0.1 * speed,
                Color32::RED.linear_multiply(1. / (1. + (5. * speed).powi(2))),
            );

            // we want continuous UI updates, so the circle can smoothly follow the arrow's origin:
            ui.ctx().request_repaint();
        });
    }
}
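The spring-back effect in the `else` branch above is plain exponential decay with a configurable half-life: after `ZOOM_ROTATE_HALF_LIFE` seconds, half of the accumulated rotation is reverted. A standalone sketch of the same math (constants match the demo; the 60 fps frame time is an assumption):

```rust
// Exponential decay with a half-life, as used for the demo's spring-back effect.
fn main() {
    const HALF_LIFE: f32 = 1.0; // seconds until half of the rotation is reverted
    let dt = 1.0 / 60.0; // hypothetical frame time at 60 fps
    let half_life_factor = (-(2_f32.ln()) / HALF_LIFE * dt).exp();

    // Applying the per-frame factor for one simulated second
    // leaves roughly half of the original rotation:
    let mut rotation = 1.0_f32;
    for _ in 0..60 {
        rotation *= half_life_factor;
    }
    assert!((rotation - 0.5).abs() < 1e-3);
}
```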
@@ -28,7 +28,12 @@ pub use painter::Painter;
 use {
     copypasta::ClipboardProvider,
     egui::*,
-    glium::glutin::{self, event::VirtualKeyCode, event_loop::ControlFlow},
+    glium::glutin::{
+        self,
+        event::{Force, VirtualKeyCode},
+        event_loop::ControlFlow,
+    },
+    std::hash::{Hash, Hasher},
 };

 pub use copypasta::ClipboardContext; // TODO: remove
@@ -185,6 +190,40 @@ pub fn input_to_egui(
                 input_state.raw.scroll_delta += delta;
             }
         }
+        WindowEvent::TouchpadPressure {
+            // device_id,
+            // pressure,
+            // stage,
+            ..
+        } => {
+            // TODO
+        }
+        WindowEvent::Touch(touch) => {
+            let pixels_per_point_recip = 1. / pixels_per_point;
+            let mut hasher = std::collections::hash_map::DefaultHasher::new();
+            touch.device_id.hash(&mut hasher);
+            input_state.raw.events.push(Event::Touch {
+                device_id: TouchDeviceId(hasher.finish()),
+                id: TouchId::from(touch.id),
+                phase: match touch.phase {
+                    glutin::event::TouchPhase::Started => egui::TouchPhase::Start,
+                    glutin::event::TouchPhase::Moved => egui::TouchPhase::Move,
+                    glutin::event::TouchPhase::Ended => egui::TouchPhase::End,
+                    glutin::event::TouchPhase::Cancelled => egui::TouchPhase::Cancel,
+                },
+                pos: pos2(touch.location.x as f32 * pixels_per_point_recip,
+                    touch.location.y as f32 * pixels_per_point_recip),
+                force: match touch.force {
+                    Some(Force::Normalized(force)) => force as f32,
+                    Some(Force::Calibrated {
+                        force,
+                        max_possible_force,
+                        ..
+                    }) => (force / max_possible_force) as f32,
+                    None => 0_f32,
+                },
+            });
+        }
         _ => {
             // dbg!(event);
         }

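Because winit's `DeviceId` is an opaque type, the hunk above hashes it into a `u64` before wrapping it in `egui::TouchDeviceId`. A minimal sketch of that technique (the generic helper `device_key` is an illustrative name, not part of the codebase):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Derive a u64 key from any hashable id, as the diff does for winit's
// opaque `DeviceId` before storing it in `egui::TouchDeviceId`.
fn device_key<T: Hash>(id: &T) -> u64 {
    let mut hasher = DefaultHasher::new();
    id.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    // Equal ids map to equal keys within one program run, so events from the
    // same touch device always land in the same per-device `TouchState`:
    assert_eq!(device_key(&42_u32), device_key(&42_u32));
}
```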
@@ -84,6 +84,9 @@ pub struct WebInput {
     /// Required because we don't get a position on touched
     pub latest_touch_pos: Option<egui::Pos2>,

+    /// Required to maintain a stable touch position for multi-touch gestures.
+    pub latest_touch_pos_id: Option<egui::TouchId>,
+
     pub raw: egui::RawInput,
 }

@@ -114,13 +114,65 @@ pub fn button_from_mouse_event(event: &web_sys::MouseEvent) -> Option<egui::Poin
     }
 }

-pub fn pos_from_touch_event(canvas_id: &str, event: &web_sys::TouchEvent) -> egui::Pos2 {
-    let canvas = canvas_element(canvas_id).unwrap();
-    let rect = canvas.get_bounding_client_rect();
-    let t = event.touches().get(0).unwrap();
+/// A single touch is translated to a pointer movement. When a second touch is added, the pointer
+/// should not jump to a different position. Therefore, we do not calculate the average position
+/// of all touches, but we keep using the same touch as long as it is available.
+///
+/// `touch_id_for_pos` is the `TouchId` of the `Touch` we previously used to determine the
+/// pointer position.
+pub fn pos_from_touch_event(
+    canvas_id: &str,
+    event: &web_sys::TouchEvent,
+    touch_id_for_pos: &mut Option<egui::TouchId>,
+) -> egui::Pos2 {
+    let touch_for_pos;
+    if let Some(touch_id_for_pos) = touch_id_for_pos {
+        // search for the touch we previously used for the position
+        // (unfortunately, `event.touches()` is not a rust collection):
+        touch_for_pos = (0..event.touches().length())
+            .into_iter()
+            .map(|i| event.touches().get(i).unwrap())
+            .find(|touch| egui::TouchId::from(touch.identifier()) == *touch_id_for_pos);
+    } else {
+        touch_for_pos = None;
+    }
+    // Use the touch found above or pick the first, or return a default position if there is no
+    // touch at all. (The latter is not expected as the current method is only called when there is
+    // at least one touch.)
+    touch_for_pos
+        .or_else(|| event.touches().get(0))
+        .map_or(Default::default(), |touch| {
+            *touch_id_for_pos = Some(egui::TouchId::from(touch.identifier()));
+            pos_from_touch(canvas_origin(canvas_id), &touch)
+        })
+}
+
+fn pos_from_touch(canvas_origin: egui::Pos2, touch: &web_sys::Touch) -> egui::Pos2 {
     egui::Pos2 {
-        x: t.page_x() as f32 - rect.left() as f32,
-        y: t.page_y() as f32 - rect.top() as f32,
+        x: touch.page_x() as f32 - canvas_origin.x as f32,
+        y: touch.page_y() as f32 - canvas_origin.y as f32,
     }
 }
+
+fn canvas_origin(canvas_id: &str) -> egui::Pos2 {
+    let rect = canvas_element(canvas_id)
+        .unwrap()
+        .get_bounding_client_rect();
+    egui::Pos2::new(rect.left() as f32, rect.top() as f32)
+}
+
+fn push_touches(runner: &mut AppRunner, phase: egui::TouchPhase, event: &web_sys::TouchEvent) {
+    let canvas_origin = canvas_origin(runner.canvas_id());
+    for touch_idx in 0..event.changed_touches().length() {
+        if let Some(touch) = event.changed_touches().item(touch_idx) {
+            runner.input.raw.events.push(egui::Event::Touch {
+                device_id: egui::TouchDeviceId(0),
+                id: egui::TouchId::from(touch.identifier()),
+                phase,
+                pos: pos_from_touch(canvas_origin, &touch),
+                force: touch.force(),
+            });
+        }
+    }
+}

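The selection rule in the new `pos_from_touch_event` can be isolated from `web_sys`: keep following the touch used previously, fall back to the first available touch, and remember the choice. A standalone sketch over a plain slice (the mock `TouchId` wrapper and the helper name `select_touch` are illustrative, not part of the codebase):

```rust
// Sketch of the touch-selection rule: prefer the touch we followed before,
// so the pointer does not jump when a second finger is added.
#[derive(Clone, Copy, PartialEq, Debug)]
struct TouchId(u64);

fn select_touch(touches: &[TouchId], previous: &mut Option<TouchId>) -> Option<TouchId> {
    let chosen = previous
        .and_then(|prev| touches.iter().copied().find(|t| *t == prev))
        .or_else(|| touches.first().copied());
    *previous = chosen; // remember the choice for the next event
    chosen
}

fn main() {
    let mut prev = None;
    // First touch arrives: start following id 1.
    assert_eq!(select_touch(&[TouchId(1)], &mut prev), Some(TouchId(1)));
    // A second finger is added: keep following id 1, even though it is no
    // longer first in the list.
    assert_eq!(select_touch(&[TouchId(2), TouchId(1)], &mut prev), Some(TouchId(1)));
    // Touch 1 is lifted: fall back to the first remaining touch.
    assert_eq!(select_touch(&[TouchId(2)], &mut prev), Some(TouchId(2)));
}
```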
@@ -876,7 +928,10 @@ fn install_canvas_events(runner_ref: &AppRunnerRef) -> Result<(), JsValue> {
         let runner_ref = runner_ref.clone();
         let closure = Closure::wrap(Box::new(move |event: web_sys::TouchEvent| {
             let mut runner_lock = runner_ref.0.lock();
-            let pos = pos_from_touch_event(runner_lock.canvas_id(), &event);
+            let mut latest_touch_pos_id = runner_lock.input.latest_touch_pos_id;
+            let pos =
+                pos_from_touch_event(runner_lock.canvas_id(), &event, &mut latest_touch_pos_id);
+            runner_lock.input.latest_touch_pos_id = latest_touch_pos_id;
             runner_lock.input.latest_touch_pos = Some(pos);
             runner_lock.input.is_touch = true;
             let modifiers = runner_lock.input.raw.modifiers;

@@ -890,6 +945,8 @@ fn install_canvas_events(runner_ref: &AppRunnerRef) -> Result<(), JsValue> {
                 pressed: true,
                 modifiers,
             });
+
+            push_touches(&mut *runner_lock, egui::TouchPhase::Start, &event);
             runner_lock.needs_repaint.set_true();
             event.stop_propagation();
             event.prevent_default();

@@ -903,7 +960,10 @@ fn install_canvas_events(runner_ref: &AppRunnerRef) -> Result<(), JsValue> {
         let runner_ref = runner_ref.clone();
         let closure = Closure::wrap(Box::new(move |event: web_sys::TouchEvent| {
             let mut runner_lock = runner_ref.0.lock();
-            let pos = pos_from_touch_event(runner_lock.canvas_id(), &event);
+            let mut latest_touch_pos_id = runner_lock.input.latest_touch_pos_id;
+            let pos =
+                pos_from_touch_event(runner_lock.canvas_id(), &event, &mut latest_touch_pos_id);
+            runner_lock.input.latest_touch_pos_id = latest_touch_pos_id;
             runner_lock.input.latest_touch_pos = Some(pos);
             runner_lock.input.is_touch = true;
             runner_lock

@@ -911,6 +971,8 @@ fn install_canvas_events(runner_ref: &AppRunnerRef) -> Result<(), JsValue> {
                 .raw
                 .events
                 .push(egui::Event::PointerMoved(pos));
+
+            push_touches(&mut *runner_lock, egui::TouchPhase::Move, &event);
             runner_lock.needs_repaint.set_true();
             event.stop_propagation();
             event.prevent_default();

@@ -940,6 +1002,8 @@ fn install_canvas_events(runner_ref: &AppRunnerRef) -> Result<(), JsValue> {
             });
             // Then remove hover effect:
             runner_lock.input.raw.events.push(egui::Event::PointerGone);
+
+            push_touches(&mut *runner_lock, egui::TouchPhase::End, &event);
             runner_lock.needs_repaint.set_true();
             event.stop_propagation();
             event.prevent_default();

@@ -952,6 +1016,20 @@ fn install_canvas_events(runner_ref: &AppRunnerRef) -> Result<(), JsValue> {
         closure.forget();
     }
+
+    {
+        let event_name = "touchcancel";
+        let runner_ref = runner_ref.clone();
+        let closure = Closure::wrap(Box::new(move |event: web_sys::TouchEvent| {
+            let mut runner_lock = runner_ref.0.lock();
+            runner_lock.input.is_touch = true;
+            push_touches(&mut *runner_lock, egui::TouchPhase::Cancel, &event);
+            event.stop_propagation();
+            event.prevent_default();
+        }) as Box<dyn FnMut(_)>);
+        canvas.add_event_listener_with_callback(event_name, closure.as_ref().unchecked_ref())?;
+        closure.forget();
+    }

     {
         let event_name = "wheel";
         let runner_ref = runner_ref.clone();