
Closed

Attached Events for gestures

description

As per the discussion here: http://blakenui.codeplex.com/Thread/View.aspx?ThreadId=228130
 
In summary: it would be nice to be able to use attached events for gestures so that one could do:
 
<Grid gestures:HoldGesture="myHoldGestureHandler" >
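A rough sketch of what the matching code-behind handler could look like (the handler signature is an assumption; the exact event args type would be defined by the patch):

using System.Windows;

private void myHoldGestureHandler(object sender, RoutedEventArgs e)
{
    // React to the hold gesture here, e.g. show a context menu for the touched element.
}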
 
I'm working on a patch for this right now.
Closed Oct 7, 2010 at 7:35 AM by JoshB
Fixed with Patch #6945 by isaks, committed in changeset #67526

comments

isaks wrote Sep 28, 2010 at 3:52 PM

OK, I've uploaded the patch in the Source Code tab (id: 6935), but I think I forgot to attach it to this work item.

This is what it adds/modifies:
  • engines - still mostly the same as before; one engine takes care of a single gesture.
  • EngineHandler - a wrapper class that takes care of interfacing with the individual engines. It is needed so that engines can be used transparently by both triggers and the attached events. I'm not entirely satisfied with needing two different variants of it, but I couldn't come up with a cleaner solution.
  • IGestureEngine - the interface that all engines implement; it allows the EngineHandler to treat engines in a generic way.
  • Events - a static class containing the attached events and properties that make the gestures usable directly from XAML. To enable these events, an element needs to set the attached property Events.AreGesturesEnabled to true, since this is what creates and enables the gesture engines (see the sketch after this list).
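A minimal sketch of how these pieces might fit together. Only the names IGestureEngine, EngineHandler, Events and AreGesturesEnabled come from the description above; the member signatures are assumptions:

using System.Windows;
using System.Windows.Input;

// Hypothetical shape of an engine: each one tracks the touches for a single gesture type.
public interface IGestureEngine
{
    void TrackTouchDown(TouchEventArgs e);
    void TrackTouchUp(TouchEventArgs e);
}

public static class Events
{
    // Attached property that, when set to true, creates and enables the
    // gesture engines for the element.
    public static readonly DependencyProperty AreGesturesEnabledProperty =
        DependencyProperty.RegisterAttached(
            "AreGesturesEnabled", typeof(bool), typeof(Events),
            new PropertyMetadata(false, OnAreGesturesEnabledChanged));

    public static void SetAreGesturesEnabled(DependencyObject element, bool value)
    {
        element.SetValue(AreGesturesEnabledProperty, value);
    }

    public static bool GetAreGesturesEnabled(DependencyObject element)
    {
        return (bool)element.GetValue(AreGesturesEnabledProperty);
    }

    private static void OnAreGesturesEnabledChanged(
        DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        // When enabled, an EngineHandler would be created here to wire the
        // element's touch events to the individual gesture engines.
    }
}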

I don't expect the patch to be merged as-is; rather, I'd like some early feedback on whether this is something you think is useful to add to Blake.NUI.

JoshB wrote Sep 28, 2010 at 4:16 PM

Isaks,
Thanks for staying involved. I just finished a project at work yesterday, so I'll have a little more time to evaluate patches. I'll take a look at this and give some feedback.

isaks wrote Sep 29, 2010 at 3:26 PM

Perfect Josh, thanks.

I've uploaded a new version of the patch: http://blakenui.codeplex.com/Project/Download/FileDownload.aspx?DownloadId=153500
It has the following improvements over the first patch:
  • To enable gesture support, you now call Blake.NUI.WPF.Gestures.Events.RegisterGestureEventSupport(this) with the root element from which to enable gestures (typically the main window). There is no need to set it on each control listening to the events (see the sketch after this list).
  • Gestures now also work even if some control higher up in the hierarchy handles the individual touch events. I'm not 100% sure this is desired, but it is how the Contact*Gesture events work in the original Surface SDK, so I went with that.
  • The events now properly set Source and OriginalSource in the event arguments, just like normal input events. They even use the input manager to do the hit testing. This means that if you perform a gesture over an image, you can capture it in a parent grid, for instance, and still see which control initially received it.
  • Much more IntelliSense documentation.
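A hedged usage sketch based on the points above. Only RegisterGestureEventSupport and the HoldGesture event name come from this thread; the handler name and event args type are assumptions:

using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Enable gesture events for the whole visual tree rooted at this window.
        Blake.NUI.WPF.Gestures.Events.RegisterGestureEventSupport(this);
    }

    // Attached in XAML on a parent Grid, e.g. gestures:HoldGesture="OnHoldGesture".
    private void OnHoldGesture(object sender, RoutedEventArgs e)
    {
        // OriginalSource is the element that was actually hit-tested under the
        // touch, even when the event is handled higher up in the tree.
        var touchedElement = e.OriginalSource as UIElement;
    }
}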
-Isak
