Blake.NUI includes some simple yet useful gesture recognition engines for WPF 4 Touch.
You can easily listen to touch events on any UIElement, but interpreting those events as actual gestures is more difficult. The v1 Surface SDK includes a ContactTapGesture event as well as a ContactHoldGesture event, but these are not present in the WPF 4 Touch API.
This represents a gap in the API.
Blake.NUI bridges this gap by adding support for Tap, DoubleTap, and Hold gestures. These gesture engines are available in two forms:
- Wrapped in Blend Behaviors. This means they can be used easily within Expression Blend to trigger any Blend Action.
- As Attached routed events.
A tap gesture is the equivalent of a single-click event in a traditional mouse environment. It is fully configurable: you can set it to activate for short taps or long taps, or use a custom minimum and maximum contact time. You can also customize how far the finger is allowed to move (jitter) before the gesture is aborted.
The double tap gesture is the equivalent of a double click in a mouse environment. It has all the same customizable properties as the tap gesture, plus a maximum gap time, which limits how much time can elapse between the first and second tap. The second
tap must start within the jitter distance of the end of the first tap.
The hold gesture is typically not used in a mouse environment, but in touch-based UIs it is often used to emulate the behavior of a right click (context menu), for example to perform additional actions on an item. A hold gesture is performed by pressing down on a control and then waiting for a certain time. Like the tap gestures above, this is also configurable: you can specify the timeout before a hold event is raised, as well as how far the finger is allowed to move before the gesture is aborted.
The gesture engines can optionally handle the touch events they listen to. They default to not handling touch events, which is also the recommended setting if you plan to use more than one gesture on a control. You might need to change this if you don't want the touch events to propagate up the visual tree, where they could be caught by a manipulation.
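As a rough sketch, configuring this in code might look like the following. HandlesTouches appears later in this article; the trigger class name and the commented-out property names are hypothetical placeholders based on the description above, so check the actual trigger classes for the real names:

```csharp
// HoldGestureTrigger and the commented property names are assumptions
// paralleling the TapGestureTrigger shown later in this article.
HoldGestureTrigger hold = new HoldGestureTrigger();
hold.HandlesTouches = false; // let touch events continue up the visual tree
// hold.HoldTimeout = TimeSpan.FromSeconds(1); // assumed name: time before Hold is raised
// hold.MaxMovement = 10;                      // assumed name: jitter allowance before abort
```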
Using the gesture triggers
As mentioned above, there are two main ways to use the gestures: as attached events or as Blend behaviors.
As attached events:
This is essentially the same way you would listen to any other input event in WPF. The only difference is that you have to add the Blake.NUI gestures XML namespace and prefix the event names with the class that contains them (typically you'd declare the namespace on the top-level element):
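Putting that together, the attached-event syntax might look like the sketch below. The namespace URI and the `Events` class name are based on Blake.NUI's source layout; verify them against the version you are using:

```xml
<Window x:Class="GestureDemo.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:gestures="clr-namespace:Blake.NUI.WPF.Gestures;assembly=Blake.NUI.WPF">
    <Grid gestures:Events.Tap="Grid_Tap">
        <!-- Grid content here -->
    </Grid>
</Window>
```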
There is one more thing you need to do to enable gesture recognition: register a root element that the engine can monitor for touch events. In most cases you would do this on the main window of your application, and anything beneath it in the tree will then automatically get gesture support:
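In code, the registration might look like this sketch. I'm assuming a static registration method on the gesture `Events` class, so verify the exact class and method name against your copy of Blake.NUI:

```csharp
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        // Register this window as the root element the gesture engines monitor.
        // The class/method name here is an assumption based on Blake.NUI's layout.
        Blake.NUI.WPF.Gestures.Events.RegisterSource(this);
    }
}
```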
As Blend behaviors:
The following will make the Grid respond to the Tap event: any actions added in the XAML will be executed when the user taps it, and the Grid_Tapped method in the code-behind will be called as well.
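Reconstructed as a sketch, with the xmlns prefixes assumed to map to System.Windows.Interactivity (`i`) and the Blake.NUI trigger namespace (`gestures`):

```xml
<Grid x:Name="LayoutRoot">
    <i:Interaction.Triggers>
        <gestures:TapGestureTrigger Tap="Grid_Tapped">
            <!-- Add actions here -->
        </gestures:TapGestureTrigger>
    </i:Interaction.Triggers>
    <!-- Grid content here -->
</Grid>
```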
Notice that you can use the Tap event or add Blend Actions as children of the trigger. The xmlns definitions would normally be added to the root element of the XAML document, of course. You'll need to add references to Blake.NUI.WPF.dll and System.Windows.Interactivity.dll in your project.
You can add the trigger programmatically as well:
TapGestureTrigger tap = new TapGestureTrigger();
tap.HandlesTouches = true;
tap.Tap += new EventHandler(tap_Tap);
// Attach the trigger to the target element (here "myGrid", any UIElement)
// via the System.Windows.Interactivity triggers collection.
Interaction.GetTriggers(myGrid).Add(tap);