* translate touch events from glium to egui
Unfortunately, winit does not seem to create _Touch_ events for the touch pad
on my Mac; only _TouchpadPressure_ events are sent.
I found some issues (like
[this](https://github.com/rust-windowing/winit/issues/54)), but I am not sure
what exactly they mean: the discussions sometimes mix raw touch events with
touch-to-pointer translation.
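To illustrate what the glium/winit backend would have to do once such events arrive, here is a minimal sketch of the translation; the `TouchInput` target type is a stand-in for illustration, not egui's actual event type:

```rust
use winit::event::{Touch, TouchPhase, WindowEvent};

/// Stand-in target type, used only for this sketch.
struct TouchInput {
    id: u64,
    pos: egui::Pos2,
    started: bool,
    ended: bool,
}

fn translate_touch(event: &WindowEvent, pixels_per_point: f32) -> Option<TouchInput> {
    match event {
        WindowEvent::Touch(Touch { id, phase, location, .. }) => Some(TouchInput {
            id: *id,
            // winit reports physical pixels; egui positions are in points.
            pos: egui::pos2(
                location.x as f32 / pixels_per_point,
                location.y as f32 / pixels_per_point,
            ),
            started: *phase == TouchPhase::Started,
            ended: matches!(*phase, TouchPhase::Ended | TouchPhase::Cancelled),
        }),
        // On the built-in Mac touch pad only these arrive; they carry a
        // pressure value but no position, so they cannot drive gestures.
        WindowEvent::TouchpadPressure { .. } => None,
        _ => None,
    }
}
```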
* translate touch events from web_sys to egui
There are a few open topics:
- egui_web currently translates touch events into pointer events.
I guess this should change, such that egui itself performs this kind of
conversion.
- `pub fn egui_web::pos_from_touch_event` is a public function, but I
would like to change the return type to an `Option`. Shouldn't this
function be private, anyway?
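For reference, a sketch of what the `Option`-returning signature could look like (assuming the relevant `web_sys` features are enabled; the conversion from client to canvas coordinates is omitted here):

```rust
fn pos_from_touch_event(event: &web_sys::TouchEvent) -> Option<egui::Pos2> {
    // `touches().get(0)` is `None` when no finger is on the surface, which is
    // exactly the case the current signature cannot express.
    let touch = event.touches().get(0)?;
    Some(egui::pos2(touch.client_x() as f32, touch.client_y() as f32))
}
```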
* introduce `TouchState` and `Gesture`
`InputState::touch` was introduced with type `TouchState`, just as
`InputState::pointer` is of type `PointerState`.
The `TouchState` internally relies on a collection of `Gesture`s. This commit
provides a first, rudimentary `Gesture` implementation, which has no
functionality yet.
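Roughly the shape I have in mind; the field names and the `detect` method are illustrative assumptions, not the final API:

```rust
use std::collections::BTreeMap;

pub struct TouchState {
    /// Current touch points, keyed by the per-finger id the backend reports.
    active_touches: BTreeMap<u64, egui::Pos2>,
    /// The gesture detectors that consume those touch points.
    gestures: Vec<Box<dyn Gesture>>,
}

pub trait Gesture {
    /// Inspect the current touch points and update internal detection state.
    fn detect(&mut self, touches: &BTreeMap<u64, egui::Pos2>);
}
```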
* add method InputState::zoom()
So far, the method always returns `None`, but it should work as soon as the
`Zoom` gesture is implemented.
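Roughly the intended shape, with stub types standing in for the real ones:

```rust
struct TouchState;

impl TouchState {
    fn zoom(&self) -> Option<f32> {
        None // will report the pinch factor once the `Zoom` gesture exists
    }
}

struct InputState {
    touch: TouchState,
}

impl InputState {
    /// Relative zoom (pinch) factor since the last frame, if any.
    pub fn zoom(&self) -> Option<f32> {
        self.touch.zoom()
    }
}
```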
* manage one `TouchState` per individual device
Although quite unlikely, it is still possible to connect more than one touch
device. (I have three touch pads connected to my MacBook in total, but
unfortunately `winit` sends touch events for none of them.)
We do not want to mix up touches from different devices.
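A sketch of the bookkeeping, keyed by the device id the backend reports (names other than `TouchState` are illustrative):

```rust
use std::collections::BTreeMap;

#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct TouchDeviceId(u64);

#[derive(Default)]
struct TouchState; // placeholder for the real per-device state

struct InputState {
    touch: BTreeMap<TouchDeviceId, TouchState>,
}

impl InputState {
    /// Get (or lazily create) the state for the device an event came from.
    fn touch_state_mut(&mut self, device: TouchDeviceId) -> &mut TouchState {
        self.touch.entry(device).or_default()
    }
}
```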
* implement control loop for gesture detection
The basic idea is that each gesture can focus on detection logic and does not
have to care (too much) about managing touch state in general.
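A sketch of that control loop; the trait shape here is an assumption for illustration:

```rust
trait Gesture {
    /// Called on every touch update. Returning `false` means the gesture has
    /// given up (e.g. because the wrong number of fingers is involved).
    fn touch_changed(&mut self, touches: &[egui::Pos2]) -> bool;
}

struct TouchState {
    gestures: Vec<Box<dyn Gesture>>,
}

impl TouchState {
    fn update(&mut self, touches: &[egui::Pos2]) {
        // The loop owns the bookkeeping: every gesture sees every update, and
        // gestures that are no longer interested are dropped.
        self.gestures.retain_mut(|gesture| gesture.touch_changed(touches));
    }
}
```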
* streamline `Gesture` trait, simplifying impl's
* implement first version of Zoom gesture
* fix failing doctest
a simple `TODO` should be enough
* get rid of `Gesture`s
* Provide a Zoom/Rotate window in the demo app
For now, it works for two fingers only. A third finger interrupts the
gesture.
Bugs:
- Pinching in the demo window also moves the window -> Pointer events must be
ignored when touch is active
- Pinching also works when doing it outside the demo window -> it would be nice
to return the touch info in the `Response` of the painter allocation
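For the demo itself, the idea is simply to accumulate the per-frame gesture deltas into the window's own state; a sketch (the delta names are placeholders for whatever the touch API ends up exposing):

```rust
struct ZoomRotateDemo {
    zoom: f32,  // accumulated scale factor
    angle: f32, // accumulated rotation in radians
}

impl ZoomRotateDemo {
    fn apply(&mut self, zoom_delta: Option<f32>, rotation_delta: Option<f32>) {
        if let Some(z) = zoom_delta {
            self.zoom *= z; // pinch scales multiplicatively
        }
        if let Some(r) = rotation_delta {
            self.angle += r; // rotation accumulates additively
        }
    }
}
```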
* fix comments and non-idiomatic code
* update touch state *each frame*
* change egui_demo to use *relative* touch data
* support more than two fingers
This commit includes an improved demo window for egui_demo and a complete
rewrite of the gesture detection. The PR should be ready for review soon.
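One way gesture detection can support an arbitrary number of fingers is to compare the touch points' average distance from their common centroid between frames. A sketch of that idea (illustration only, not necessarily the PR's exact code):

```rust
/// Average distance of the touch points from their centroid.
fn avg_spread(points: &[egui::Pos2]) -> Option<f32> {
    if points.len() < 2 {
        return None;
    }
    let n = points.len() as f32;
    let cx = points.iter().map(|p| p.x).sum::<f32>() / n;
    let cy = points.iter().map(|p| p.y).sum::<f32>() / n;
    let spread = points
        .iter()
        .map(|p| ((p.x - cx).powi(2) + (p.y - cy).powi(2)).sqrt())
        .sum::<f32>()
        / n;
    Some(spread)
}

/// Relative zoom between the previous and the current frame.
fn zoom_factor(prev: &[egui::Pos2], current: &[egui::Pos2]) -> Option<f32> {
    let before = avg_spread(prev)?;
    let after = avg_spread(current)?;
    (before > 0.0).then(|| after / before)
}
```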
* cleanup code and comments for review
* minor code simplifications
* oops – forgot the changelog
* resolve comment