Chapter 13: Digital Interaction

In Chapter 13: Digital Interaction, Hinton argues that interaction between humans and digital systems through interfaces is a vital part of today’s environment. He discusses interfaces built from semantic information that define how humans interact with digital systems, and he also examines modes in interfaces and their impact.

Interfaces and Humans
Hinton begins by explaining that interfaces define interactions between different systems. Interfaces in software define the interactions between programs, while those in hardware define interactions between physical devices. As explained in Chapter 12, computers process information that is abstract and logical. Therefore, interfaces between computer hardware or software components succeed when they are designed using detailed, logic-based structures.
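The idea of a software interface as a logic-based contract between programs can be sketched in code. This is an illustrative example only, not from the book; the `PaymentProcessor` name and method are hypothetical.

```python
from abc import ABC, abstractmethod

class PaymentProcessor(ABC):
    """A software interface: any program that implements this
    contract agrees to interact through the same defined method."""

    @abstractmethod
    def charge(self, amount_cents: int) -> bool:
        """Attempt a charge; return True on success."""

class MockProcessor(PaymentProcessor):
    """One concrete implementation honoring the contract."""
    def charge(self, amount_cents: int) -> bool:
        # Stand-in logic; a real implementation would call a payment service.
        return amount_cents > 0

processor = MockProcessor()
print(processor.charge(500))  # True
```

Because the contract is explicit and logical, any two programs that agree on it can interact without knowing each other's internals.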

The most important, yet hardest to design, interface is the one that bridges human systems and digital systems. The interface creates an artificial environment used to communicate information between human and computer. Human systems work in nearly the opposite way from computers, so a bad interface forces humans to think like a computer, while a good interface is intuitive for humans to master. Hinton wants to “establish that interfaces are part of the environment we inhabit and are themselves smaller environments nested within the larger ecological context.” Interfaces are part of the environment because they inhabit the screens of computing devices that people use regularly, and each interface has its own set of objects and actions to perform.

Human-computer interfaces have only existed as long as computers themselves. Fifty years ago, interfaces were built for specialists completing specific jobs; today they serve ordinary people completing a wide range of everyday tasks. Hinton explains that this demographic change means interfaces had to become an “everyday environment.” Everyday people do not want complex instructions on how to communicate in the computer’s language. Language must instead be used to create digital agents that translate from the computer’s language into the way humans think.

One example of the poor translation that still happens is the digital agent Hinton encountered at a gas pump. Its screen merely stated “ENTER DATA,” with a sticker above explaining that the machine was asking for his zip code. Hinton emphasizes that successful interfaces are necessary because “there will always be the need to translate the abstraction of digital information into invariants that users can comprehend.” “ENTER DATA” is the digital information: “data” can mean anything to the user, so a better interface is needed to make clear that the computer wants a zip code.

Semantic Function of Simulated Objects
The famous painting by René Magritte, The Treachery of Images, depicts a realistic pipe above a French caption meaning “This is not a pipe.” The painting draws attention to the fact that humans do not usually distinguish between semantic and physical information: both a picture of a pipe and a physical pipe are called a pipe. Technology takes advantage of people’s tendency to associate semantics with physical objects by simulating an object’s physical properties.

With digital technology came the ability to simulate any physical object on a screen’s interface, which opened up far more possibilities than physical controls or buttons. For designing simulated controls, there are two well-known approaches. Hinton explains that skeuomorphic design consists of a “literal translation of the physical surfaces and objects we encounter in the physical world”: a button on the screen looks like a photorealistic physical button. At the other end of the design spectrum is flat design, which strips away realism and is minimal in how it simulates physical objects. Apple’s iOS has used both designs across its generations. Before iOS 7, iOS was known for skeuomorphic design, with details, shading, and highlights adding depth. With iOS 7, Apple switched to flat design; by that point, people understood how interactive objects on screens worked and no longer needed semantic information to be so realistic. For example, the iOS 7 email app replaced its “Cancel” button with plain text, not even simulating a button anymore.

How humans use digital controls depends on how such controls have worked for them in the past. Because the Cancel button performed a certain action before iOS 7, users understood that plain “Cancel” text in the top right corner would perform the same task. Unlike with physical objects, humans have no direct control over what happens when a digital button is pressed; they can predict based on past experience, but the computer determines the outcome.

Hinton describes how digital environments take objects from their physical world and “abstract them into something made entirely out of language”. For example, the interface that humans use for email is a well-known simulated environment based around a physical mailbox. The mailbox interface has folders, mail, and drafts. However, none of these physically exist. They are merely represented through semantic information on the interface.

Modes and Meaning
In this section, Hinton defines a mode as a “condition that changes the result an action would have under a different condition.” Modes define what occurs when a user interacts with a control, such as a button, and a change of mode can let the user perform different actions without changing the controls. A typical example is a PC game: in play mode, the WASD keys control the player’s movement, while in chat mode the same keys revert to printing letters when pressed.
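The game example above can be sketched as a small dispatch function, where the mode, not the key, decides the result. This is a minimal illustration of the idea, not anything from the book; the mode and action names are made up.

```python
def handle_key(mode: str, key: str) -> str:
    """Return the result of pressing a key under a given mode.
    The same key produces different results in different modes."""
    if mode == "play":
        moves = {"w": "move up", "a": "move left",
                 "s": "move down", "d": "move right"}
        return moves.get(key, "no action")
    if mode == "chat":
        return f"type '{key}'"  # keys revert to printing letters
    return "no action"

print(handle_key("play", "w"))  # move up
print(handle_key("chat", "w"))  # type 'w'
```

Nothing about the physical keyboard changes between the two calls; only the mode condition does, which is exactly what Hinton's definition describes.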

Modes can be either sticky or non-sticky. With sticky modes, a mode remains changed until it is actively switched back with a control. With non-sticky modes, a control must be held down for the mode to change, and the mode switches back when the control is released. Caps lock is an example of a sticky mode: after pressing the caps lock key, the interface stays in caps lock mode until another press of the key removes it. The control key enables a non-sticky mode: pressing ‘S’ alone types the letter ‘s’, but holding down control while pressing ‘S’ makes the current application save its work. That mode ends as soon as the control key is released.
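The sticky/non-sticky distinction can be modeled with a toy keyboard, where caps lock toggles a persistent state and control is active only while held. This is a hedged sketch of the concept, not real keyboard-driver code.

```python
class Keyboard:
    """Toy model: caps lock is a sticky mode, control is non-sticky."""

    def __init__(self):
        self.caps_lock = False  # sticky: toggles and persists
        self.ctrl_held = False  # non-sticky: active only while held

    def press_caps_lock(self):
        self.caps_lock = not self.caps_lock  # toggle stays until toggled again

    def hold_ctrl(self):
        self.ctrl_held = True

    def release_ctrl(self):
        self.ctrl_held = False  # mode ends the moment the key is released

    def press(self, key: str) -> str:
        if self.ctrl_held and key == "s":
            return "save"  # Ctrl+S saves instead of typing
        return key.upper() if self.caps_lock else key

kb = Keyboard()
kb.press_caps_lock()
print(kb.press("s"))   # 'S' — sticky mode persists across presses
kb.press_caps_lock()
kb.hold_ctrl()
print(kb.press("s"))   # 'save' — non-sticky mode active while held
kb.release_ctrl()
print(kb.press("s"))   # 's' — non-sticky mode ended on release
```

The sticky mode needs a second explicit action to end, while the non-sticky mode ends automatically, which is the distinction Hinton draws.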

While modes can allow a wider range of functionality with fewer controls, they can also confuse the user, who must keep track of which mode is active. Nested modes, where the user must enter one mode after another before performing a task, can also reduce clarity. Five fatal plane crashes from 1988 to 1996 occurred because the pilots had been using controls in the wrong mode.

Hinton explains that modes can be dangerous because they “change the fundamental meaning of action.” In the physical world, the same action reliably produces the same result. With digital interactions, however, a computer determines the result of every human action, which makes the concept of modes harder for humans to grasp.

The use of modes today is best seen in smartphones. Before smartphones, cell phones were actual phones; now the word “phone” is merely a nickname for what Hinton calls a “modal device.” The physical phone controls are gone. The only control is a rectangular piece of glass, but with the modes provided by applications it can perform a practically unlimited range of tasks. That breadth of functionality is simply not possible outside of software.

Analysis
Chapter 13 is one of the most important chapters in the book for digital writers, since digital interaction is precisely what digital writers must define in their work. The chapter discusses in detail the interfaces, simulated objects, and modes that make up the digital environment a writer works in. Readers will gain a fundamental understanding of these concepts, allowing them to produce a better interactive experience for the user.

Digital writers should learn the importance of a well-designed visual interface that both communicates information and is intuitive to use. In section two, Hinton explains how interactive semantic objects work. Semantic objects are necessary for the user to interact with a digital system: the picture of a button lets a human know that touching that section of the screen will cause a reaction. Hinton also covers skeuomorphic and flat design, two well-known design practices. Understanding these styles allows one to understand the purpose they serve in interface usability. Skeuomorphic design was more popular when digital interfaces were new and people were still learning how they worked, so the semantic button looked like a physical button. Flat design dominated after people had become comfortable with the norms of interactive design, and the button was simplified to a colored box.

Hinton explains the important role of modes in the third section. Many digital writers might not even be aware of modes and their presence, despite working with them every day on their smartphones. Badly designed modes can have dangerous consequences for the user, such as faulty air conditioning or plane crashes. Hinton therefore provides a solid explanation of the types of modes and how they change the user’s intuitive sense of what an action means. The digital writer can then implement modes knowing how they will affect the audience.

Of course, today one can argue that everyone is a digital writer. In “We’re All Coders Now,” Clive Thompson describes how a creative-writing major used Google’s App Inventor to create the No Text While Driving app. This is an example of a successful human-computer interface at work. Google’s App Inventor allows people to complete the usually complex task of coding; it successfully uses language to bridge the gap and relate to how a human thinks. Humans can then reapply these principles in their own digital writing while using such interfaces.