TouchDevices

Blake.NUI includes two custom TouchDevices for WPF 4 Touch. These TouchDevices feed additional input sources into the WPF 4 touch input system.

MouseTouchDevice

MouseTouchDevice takes regular mouse input and promotes it to touch events. This is useful when you are writing custom touch controls, using touch events, or enabling manipulations. By default, mouse input does not raise those events, so you would need to write additional code to handle mouse events and feed them into the same interface logic. That duplicates code and increases complexity and testing requirements. For applications that need to target both touch and mouse, it would be ideal to write against touch events only.

MouseTouchDevice enables this. Add this single line to the constructor of your window and the whole window will respond to mouse input through the touch system:

MouseTouchDevice.RegisterEvents(this);
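For context, a typical placement looks like the sketch below. The window class name is illustrative, and the using directive assumes the TouchDevices live in a Blake.NUI.WPF.Touch namespace; adjust it to match your copy of the library.

using System.Windows;
using Blake.NUI.WPF.Touch; // namespace may differ in your Blake.NUI version

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // Promote mouse input to WPF 4 touch events for this whole window.
        MouseTouchDevice.RegisterEvents(this);
    }
}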

You can test whether a specific touch event came from the mouse by checking whether the TouchDevice is a MouseTouchDevice:

private void OnTouchDown(object sender, TouchEventArgs e)
{
    if (e.TouchDevice is MouseTouchDevice)
    {
        // Do something specific for the mouse
    }
}
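If you attach the handler in code rather than in XAML, the wiring might look like this (a minimal sketch, assuming the handler above lives in your window's code-behind):

// In the window constructor, after MouseTouchDevice.RegisterEvents(this):
this.TouchDown += OnTouchDown;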

This also means that the mouse can participate in manipulations. If you have hardware that allows simultaneous touch and mouse input, then you can even manipulate an interface element with one finger and one mouse.
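To make this concrete, here is a minimal manipulation sketch. The element name photo and the handler names are illustrative and not part of Blake.NUI; once MouseTouchDevice.RegisterEvents has been called, fingers and the mouse both drive these handlers through the same code path.

// XAML (illustrative):
// <Rectangle x:Name="photo" Width="200" Height="150" Fill="SkyBlue"
//            IsManipulationEnabled="True"
//            ManipulationStarting="Photo_ManipulationStarting"
//            ManipulationDelta="Photo_ManipulationDelta" />

private void Photo_ManipulationStarting(object sender, ManipulationStartingEventArgs e)
{
    // Report deltas relative to the window so the element can move freely within it.
    e.ManipulationContainer = this;
    e.Handled = true;
}

private void Photo_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // Accumulate the translation delta into the element's render transform.
    var transform = photo.RenderTransform as TranslateTransform;
    if (transform == null)
    {
        transform = new TranslateTransform();
        photo.RenderTransform = transform;
    }

    transform.X += e.DeltaManipulation.Translation.X;
    transform.Y += e.DeltaManipulation.Translation.Y;
    e.Handled = true;
}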

SurfaceTouchDevice

After doing WPF 4 Touch development, you may wish that the Surface SDK had integrated touch events and manipulations, and you may want to write code once and run it on both Surface and Windows 7. You can install .NET 4 and Visual Studio 2010 on your Surface device, but the Surface input system will not raise WPF 4 touch events on its own. SurfaceTouchDevice bridges that gap.

SurfaceTouchDevice listens for Surface Contact events and promotes them to WPF 4 Touch events. This enables you to write WPF 4 Touch code, including touch events and manipulations, and run it on Surface with no modification. Just add this code to your SurfaceWindow constructor:

SurfaceTouchDevice.RegisterEvents(this);
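As with MouseTouchDevice, a typical placement is a single line in the constructor. The class name below follows the default Surface SDK window template and is only illustrative; the using directives assume the Surface SDK 1.0 namespace and a Blake.NUI namespace that may differ in your version.

using Microsoft.Surface.Presentation.Controls;
using Blake.NUI.WPF.Touch; // namespace may differ in your Blake.NUI version

public partial class SurfaceWindow1 : SurfaceWindow
{
    public SurfaceWindow1()
    {
        InitializeComponent();

        // Promote unhandled Surface Contact events to WPF 4 touch events.
        SurfaceTouchDevice.RegisterEvents(this);
    }
}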

In most cases you won't need to do anything else. Surface SDK controls, such as TagVisualizer, will react to and handle Contact events. Unhandled events will be promoted to WPF 4 Touch events, which your WPF 4 code will handle.

If you do want to check an actual Surface property, such as whether the TouchDevice represents a tag or a blob, you can get access to the original Contact by casting the TouchDevice to a SurfaceTouchDevice:

private void OnTouchDown(object sender, TouchEventArgs e)
{
    bool isFinger = true;

    SurfaceTouchDevice device = e.TouchDevice as SurfaceTouchDevice;
    if (device != null)
    {
        isFinger = device.Contact.IsFingerRecognized;
    }
    // Do something with isFinger
}

You can also filter specific touches out of your manipulations: in a ManipulationStarting handler, iterate through e.Manipulators, cast each IManipulator to SurfaceTouchDevice, and call Manipulation.RemoveManipulator for the ones you want to exclude, as shown in the sketch below.
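Here is a minimal sketch of that filtering, assuming the ManipulationStarting handler is attached directly to the element being manipulated and that you only want finger contacts to participate (the handler name is illustrative):

private void OnManipulationStarting(object sender, ManipulationStartingEventArgs e)
{
    // Copy the manipulators first; removing them while iterating would modify the collection.
    var manipulators = new List<IManipulator>(e.Manipulators);

    foreach (IManipulator manipulator in manipulators)
    {
        SurfaceTouchDevice device = manipulator as SurfaceTouchDevice;
        if (device != null && !device.Contact.IsFingerRecognized)
        {
            // Exclude tags and blobs from this manipulation.
            Manipulation.RemoveManipulator((UIElement)sender, manipulator);
        }
    }
}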
