Pondering How Game and Arcade User Input Influences User Interface for Modern Software Client Applications

Over the past few months I started playing Halo 4 thanks to my stepson. If you have played Halo or one of the other modern high-end game franchises like Call of Duty or Grand Theft Auto, you can't help but be a little overwhelmed by modern game console controllers, like the Xbox 360 controller, and by the quality of the games themselves. I mean, there are lots of buttons and sticks, and they can turn, twist and even rumble in your hands. Modern games can take advantage of each of these tactile features to immerse players deep in the experience.

As I was recently playing I got to thinking about all the things that have become second nature to me over the past couple of months, and just how far the game controller has come since the late 70's vintage Atari units. The original joysticks were simple sticks that could pass along the user's desire to move up, down, left or right, and of course fire the weapon with the single button. Games were much simpler back then. We had simple processors and plugged the console into the family TV (gee, that has not changed over the past 40 years). Pong was the original leader, a very simple game, and eventually we had Space Invaders and Asteroids.

Atari Joystick

Other games of course followed, and by the early 80's we had great home titles like Pitfall. But to a larger extent, if you were like me, we were wasting countless quarters on games like Pac-Man, Defender, or my favorite, Galaga. I could play for hours, and being poor I learned to play well out of necessity. But the inputs for these games were not much more sophisticated than the classic Atari joysticks, just bigger. Typically you had a button and a joystick with a knob. Some games had two or even three buttons. Pac-Man just needed a joystick, plus the 1- and 2-player buttons. And the buttons were large, so we could just slap them in an expression of dominance or frustration.

Galaga Controller

Also in the early 80's we had other game input devices, like the Intellivision. The Intellivision controller featured a disc you would rotate your thumb around to indicate direction. A couple of decades later we saw this concept resurrected in the iPod and the Zune.

Intellivision Joystick

Now step back from the controller history and think about how it affected the games themselves. Simple controllers lead to simple games. That does not mean enjoyment, for games or applications for that matter, has to be sacrificed. Think about the hours you spent playing those games, and then watch kids today play games on our phones. I am amazed at the countless hours adults and children alike spend playing casual smartphone games like Angry Birds or Flight Control with nothing more than a simple finger swipe and release. The game features and graphics are somewhat more sophisticated, but in many cases not by much compared to the early 80's games I grew up playing.

Then consider Halo, Call of Duty, Madden (which, if I recall, started back in the late 80's) and others that take full advantage of the game console platforms and controllers. Extremely sophisticated and immersive.

Microsoft Kinect

When it comes to general software applications today, we have a whole new paradigm with touch, motion and soft keyboards. First, let's consider motion. The two coolest game input creations of the past 5-6 years have been the Wii controllers and the Kinect. The Wii created the gateway we needed to start playing basketball and tennis the way we would outside, without the sunburn of course. Then came the Kinect, and we don't even have to worry about tossing the controller through the TV! We are now free to play many games, exercise and much more just like we would without the game console, so we actually get some cardio and have something giving us measured feedback on our performance.

Microsoft added additional value to the Kinect experience with Kinect for Windows, offering a whole new way to receive user input and drive our boring old desktop applications. Think about defining a set of hand gestures for the user/customer to interact with the software you and I create on a daily basis. Now let users define their own sets of gestures. I can't wait for someone to pick up on the double fist pound to indicate how frustrated a user is and launch Clippy :)
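To make the idea concrete, here is a minimal sketch of what recognizing that double fist pound could look like, given a stream of skeleton-tracking frames such as a sensor like the Kinect can supply. The frame format, joint names, and thresholds here are all illustrative assumptions, not the actual Kinect for Windows SDK (which is a .NET API):

```python
# Hypothetical sketch: detect a two-handed "pound" gesture from
# skeleton-tracking frames. Each frame is assumed to be a dict with
# the vertical position (in meters) of each hand joint. These names
# and numbers are illustrative, not a real SDK's API.

def detect_double_fist_pound(frames, drop=0.3, window=5):
    """Return True if both hands drop by at least `drop` meters
    within `window` consecutive frames."""
    for start in range(len(frames) - window + 1):
        first = frames[start]
        last = frames[start + window - 1]
        left_drop = first["left_hand_y"] - last["left_hand_y"]
        right_drop = first["right_hand_y"] - last["right_hand_y"]
        if left_drop >= drop and right_drop >= drop:
            return True
    return False

# Example: both hands fall from ~1.2 m to ~0.8 m over five frames.
pound = [{"left_hand_y": y, "right_hand_y": y}
         for y in (1.2, 1.1, 1.0, 0.9, 0.8)]
idle = [{"left_hand_y": 1.2, "right_hand_y": 1.2}] * 5

print(detect_double_fist_pound(pound))  # True
print(detect_double_fist_pound(idle))   # False
```

A real implementation would of course work with the sensor's own joint data and smooth out tracking noise, but the core idea — watching joint positions over a short window of frames — stays the same.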

Of course, everything has quickly migrated to touch for that up close and personal experience with our computing needs. This is going to have major implications for consumers and, to an even larger extent, the enterprise. We have to completely rethink the way we architect user experiences for all applications, consumer and business. The input paradigms of the past just won't work going forward. Tiny touch points, cluttered menus and toolbars are just distractions. We can make the data the primary target today, something the customer touches and interacts with directly. Gradually we will need to incorporate the features offered by Kinect: vision, voice and motion.

My first after-college project was to develop a wireless, voice-activated, touch-input plant-floor process data entry system. That was so much fun. But I did not realize at the time just how far ahead of the technology I really was. I would love to go back and leverage today's hardware, voice recognition software, etc.

Companies that embrace these new natural user interfaces and start thinking about the NUIs of tomorrow will stand out and be successful. Much of the implementation pain has been worked out over the past 2-3 decades. Clinging to classic input paradigms is a recipe for defeat. It's great to reminisce about the controllers of yesterday, but we need to embrace the controllers and NUIs of today to make better client software. Be confident and start integrating these mechanisms today, not tomorrow. It only has to be a small part of your enterprise; trust me, one small success will lead to many, many more.
