I mentioned in a previous post that I am working on a new project related to gesture input. That very day I hit a wall with desktop scaling, and last night I finally broke through it! Perhaps a topic for another post: with some applications, TDD can only get you so far, and integration testing becomes a must pretty early on.

The FRBTouch project is no exception! It has a few distinct problems to solve:

  • Touch event capturing
  • Gesture detection
    • Taking touch events and making gestures
    • e.g. one touch event down then up is a tap (sketched just after this list)
  • Coordinate translation
    • Taking window coordinates and translating them into world coordinates in an application (e.g. a FlatRedBall game)
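
To make the gesture-detection bullet concrete, here is roughly the shape the tap logic takes. This is a simplified sketch rather than FRBTouch's actual implementation: the 300 ms threshold is an assumption, the fragment assumes System.Linq, and the writable GestureSample DTO is illustrative (XNA's own GestureSample struct is read-only and constructor-initialized).

        // Simplified tap detection: a Down followed by an Up with the same id,
        // within a short window, yields a tap. A real implementation would also
        // bound how far the contact is allowed to move between the two events.
        public IEnumerable<GestureSample> DetectTaps(IList<TouchEvent> events)
        {
            const double tapTimeoutMs = 300; // illustrative threshold

            foreach (var down in events.Where(e => e.Action == TouchEvent.TouchEventAction.Down))
            {
                var up = events.FirstOrDefault(e =>
                    e.Id == down.Id &&
                    e.Action == TouchEvent.TouchEventAction.Up &&
                    (e.TimeStamp - down.TimeStamp).TotalMilliseconds <= tapTimeoutMs);

                if (up != null)
                {
                    yield return new GestureSample
                    {
                        Position = up.Position,
                        Delta = Vector2.Zero,  // a tap carries no movement
                        Delta2 = Vector2.Zero
                    };
                }
            }
        }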

The first two bullet points turned out to be the easiest, because they were mockable. For instance:

        public void single_tap_registers_from_one_touch()
        {
            // Arrange
            _container
                .Arrange<ITouchEventProvider>(p => p.Events)
                .Returns(new List<TouchEvent>
                {
                    new TouchEvent
                    {
                        Id = 1,
                        Position = Vector2.Zero,
                        Action = TouchEvent.TouchEventAction.Down,
                        TimeStamp = DateTime.Now
                    },
                    new TouchEvent
                    {
                        Id = 1,
                        Position = Vector2.Zero,
                        Action = TouchEvent.TouchEventAction.Move,
                        TimeStamp = DateTime.Now.AddMilliseconds(10)
                    },
                    new TouchEvent
                    {
                        Id = 1,
                        Position = Vector2.Zero,
                        Action = TouchEvent.TouchEventAction.Up,
                        TimeStamp = DateTime.Now.AddMilliseconds(200)
                    }
                });
            var gestureProvider = _container.Instance;

            // Act
            var samples = gestureProvider.GetSamples();

            // Assert
            var gestureSamples = samples.ToList();
            Assert.AreEqual(1, gestureSamples.Count);

            var tap = gestureSamples[0];
            Assert.AreEqual(Vector2.Zero, tap.Delta);
            Assert.AreEqual(Vector2.Zero, tap.Delta2);
            Assert.AreEqual(Vector2.Zero, tap.Position);
        }

That’s the test proving a tap gesture is detected from the touch events provided. It was easy to set up mock scenarios for drag and pinch as well, and just assert the required gesture return values. The TouchEvent object also maps pretty closely to the events that User32.dll provides, so there wasn’t much to test for actually capturing events.
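
For context on the capture side, the standard User32 plumbing looks roughly like this. This is a generic sketch of the Win32 touch API (requiring System.Runtime.InteropServices), not FRBTouch's actual code: register the window for WM_TOUCH, then unpack the TOUCHINPUT structs on each message.

        // Generic WM_TOUCH plumbing (sketch). TOUCHINPUT maps closely onto the
        // TouchEvent fields above: dwID -> Id, x/y -> Position, dwFlags -> Action.
        [StructLayout(LayoutKind.Sequential)]
        struct TOUCHINPUT
        {
            public int x;               // screen coordinates, in hundredths of a pixel
            public int y;
            public IntPtr hSource;
            public int dwID;            // contact id, stable for the life of the touch
            public int dwFlags;         // TOUCHEVENTF_DOWN / _MOVE / _UP
            public int dwMask;
            public int dwTime;          // timestamp in milliseconds
            public IntPtr dwExtraInfo;
            public int cxContact;
            public int cyContact;
        }

        const int WM_TOUCH = 0x0240;
        const int TOUCHEVENTF_MOVE = 0x0001;
        const int TOUCHEVENTF_DOWN = 0x0002;
        const int TOUCHEVENTF_UP   = 0x0004;

        [DllImport("user32.dll")]
        static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

        [DllImport("user32.dll")]
        static extern bool GetTouchInputInfo(IntPtr hTouchInput, uint cInputs,
            [Out] TOUCHINPUT[] pInputs, int cbSize);

        [DllImport("user32.dll")]
        static extern bool CloseTouchInputHandle(IntPtr hTouchInput);

        // In the window procedure, on WM_TOUCH:
        //   int count = (int)(wParam.ToInt64() & 0xFFFF);   // low word = touch count
        //   var inputs = new TOUCHINPUT[count];
        //   if (GetTouchInputInfo(lParam, (uint)count, inputs, Marshal.SizeOf(typeof(TOUCHINPUT))))
        //       foreach (var ti in inputs) { /* translate into a TouchEvent */ }
        //   CloseTouchInputHandle(lParam);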

The major problems came when attempting to translate coordinates from touching an XNA game window into world coordinates. I use a Surface Pro for all development, and keeping 150% scaling on at all times is pretty much a necessity because the screen is small. Windows scales all windows up, but in doing so it breaks the coordinate system for touch input. This is not something you can see or solve with test-driven development (at least not with traditional unit tests), because it requires a live, scaled window and a graphics object to reproduce.
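
To put numbers on it (illustrative only, assuming 150% scaling): the game believes its client area is one size, Windows renders it half again as large, and raw touch coordinates arrive against the rendered size.

        // Illustrative arithmetic only, assuming 150% display scaling.
        const float scale = 1.5f;
        var believedSize = new Vector2(800, 480);    // what XNA thinks it created
        var renderedSize = believedSize * scale;     // what Windows actually draws: 1200x720

        var rawTouchAtCentre = renderedSize / 2f;    // a touch at the visual centre arrives as (600, 360)
        var expectedAtCentre = believedSize / 2f;    // the game expects (400, 240)
        // Everything downstream of here is off by a factor of 1.5.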

To solve the problem, you simply disable the auto-scaling and tell Windows that the application will handle the DPI settings itself. In other words, you make your application DPI-aware. (More info). The window will then not be auto-scaled, the coordinate system stays intact, and normal translation routines work.
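
In code, that can be as small as one P/Invoke made before the game window is created. This is a sketch (MyGame stands in for your Game subclass); the declarative equivalent is a <dpiAware>true</dpiAware> entry in the app manifest.

        // Opt the process out of DWM auto-scaling (sketch). Must run before
        // any window is created, or Windows will already have virtualized us.
        [DllImport("user32.dll")]
        static extern bool SetProcessDPIAware();

        static void Main(string[] args)
        {
            SetProcessDPIAware();           // legacy API, but sufficient here

            using (var game = new MyGame()) // MyGame is a hypothetical Game subclass
                game.Run();
        }

        // With scaling out of the way, an ordinary screen-to-world routine works
        // again, e.g. transforming by the inverse of the camera's view matrix
        // (details depend on your camera setup):
        //   var world = Vector2.Transform(screenPoint, Matrix.Invert(viewMatrix));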
