Chapter 12: Digital Cognition and Agency

In Chapter 12, Hinton explains the origins of digital agents, beginning with information science. He focuses on how information is defined and how it differs for humans and digital systems. As digital agents become more present in the physical world, translating information between humans and computers becomes an important science. Hinton explores the need for good translation through a look at ontology in computers and interface design.

Shannon’s Logic
Hinton begins the chapter by covering the science of information, which existed long before computers. Claude Shannon, known as the “Father of Information Theory”, was one of the biggest influences on how information is used in today’s digital technology.

Shannon removed human meaning from the language passed to machines to create more accurate transmissions. Hinton explains that Shannon’s information starts “with definition and measurement”. Shannon redefined information as a structure that could be represented by Boolean logic in the form of 1s and 0s. In contrast, humans work with information that originates from perception, not logic. Digital technology adopts Shannon’s definition of information and thereby creates a translation barrier between digital and human information. Translating across that barrier is a core part of the field of design.
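Shannon’s meaning-free definition can be made concrete. Below is a minimal Python sketch of his measurement of information (the function name is mine, for illustration): the amount of information in a message depends only on symbol frequencies, never on what the symbols mean to a human.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average bits of information per symbol, per Shannon's definition.

    Human meaning is ignored entirely: only symbol frequencies matter.
    """
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A message with only one symbol carries no information:
print(shannon_entropy("aaaa"))  # 0.0
# Two equally likely symbols carry exactly 1 bit per symbol:
print(shannon_entropy("abab"))  # 1.0
```

The sketch shows the translation barrier directly: to the function, “berry” is just a frequency distribution over characters, with no perception involved.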

Digital Learning and Agency
Computing used to be a human job, but humans make errors and can only take on so much work. By the mid-1900s a field emerged around creating computing machines. Hinton states these computing machines were “made of logic itself” and came along thanks to Claude Shannon, Alan Turing, and Norbert Wiener. For example, the Turing Machine used Boolean logic to create rules that dictated its actions. Turing’s idea was that “anything that could be represented by mathematical symbols and logic could be computed.” So if the world was represented to the Turing Machine through logic, the machine could compute decisions without direct human commands. Turing’s theory leads to digital agency: the digital system acts as an agent, capable of functioning alone and making decisions based on the given rules.
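The rule-driven machine Hinton describes can be sketched in a few lines of Python. This toy is a deliberate simplification of Turing’s formal model (a single tape and an explicit halt state), but it shows how a table of rules alone dictates every action the machine takes:

```python
def run_turing_machine(tape, rules, state="start"):
    """Minimal Turing-machine sketch: a rule table dictates every action.

    `rules` maps (state, symbol) -> (symbol to write, move 'L'/'R', next state);
    the machine halts when it reaches the state 'halt'.
    """
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head]
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape)

# Rules that flip every bit, then halt at the blank marker '_':
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011_", flip_rules))  # 0100_
```

No human issues commands while it runs; once the world (the tape) is represented in symbols, the rules decide everything, which is the seed of digital agency.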

Digital agency changed human culture. The human world had to be represented by algorithms so that computers could make decisions autonomously. However, human and computer cognition differ in how they use information. Humans perceive the physical world and afterwards develop methods to abstract it through symbols or language. Computers can only work with information that has already been abstracted.

Abstracting information becomes more difficult the less logical it is. Moravec’s Paradox describes how tasks grow harder to teach a computer as they move from the logical toward the perceptual. For example, a computer can easily be taught to recognize the text “berry” because it is already abstracted. But the task becomes much harder when it involves recognizing different pictures of a berry, a trivial task for a human to learn.

Since a computer cannot find meaning in objects like a human can, that information must be purposefully given to it. An example of a digital agent that must be given meaning is Siri from iOS. Siri is software that performs actions based on human voice inquiries. At some point, abstracted information must be written to tell Siri what humans want when they say things such as “Find me a restaurant nearby.”

Everyday Digital Agents
Hinton compares digital and physical information. Physical information is dominated by concrete laws, while digital information lacks friction to slow it down. Even when left alone, digital agents continue to make decisions. The Flash Crash shows the negative effects this lack of friction can have. In 2010, software trading at a faster rate than humanly possible caused the stock market to drop 9% in minutes before recovering over the next hour.

Another issue that arises with digital agents is knowledge of cause. When an event occurs due to a computer algorithm, humans are stuck guessing what the digital agent was trying to accomplish unless the agent was programmed to tell them. In 2012, anonymous trading software performed 4% of trading activity in a week by making and canceling orders. The reasons behind the software's actions remain unknown.

On a more positive note, as software evolves, people without programming knowledge gain their own ways of creating digital agents. Hinton gives the example of an application called IFTTT (If This Then That), which allows people to define a digital action that should occur in response to a digital event, such as saving a shared photo. Hinton explains that IFTTT turns human actions into “triggers.” Many programs have defined responses to events, such that one event triggers another. With IFTTT, humans can easily define their own triggers and digital actions. Hinton states that digital agents will continue to evolve and automate actions. Ironically, digital agents will evolve to control the very complexity they add to human life.
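The trigger/action pattern behind IFTTT can be sketched in Python. The names and event fields here are hypothetical illustrations, not IFTTT’s actual API; the point is only that a “recipe” pairs a human-defined trigger with a digital action, so defining agents no longer requires writing a full program:

```python
recipes = []  # each recipe pairs a trigger predicate with an action

def add_recipe(trigger, action):
    """Define 'if this (trigger), then that (action)' as a single pair."""
    recipes.append((trigger, action))

def handle(event):
    """Fire the action of every recipe whose trigger matches the event."""
    return [action(event) for trigger, action in recipes if trigger(event)]

# "If a photo is shared, then save it" as one recipe:
add_recipe(lambda e: e["type"] == "photo_shared",
           lambda e: f"saved {e['file']}")

print(handle({"type": "photo_shared", "file": "beach.jpg"}))  # ['saved beach.jpg']
print(handle({"type": "email_received"}))                     # []
```

Each recipe is a tiny digital agent: once defined, it reacts to events on its own, without further human commands.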

Ontologies
Computers do not rely on human information to function; translation is required only when humans must give or receive information. From this need for human-computer translation stems a new definition of ontology.

In philosophy, ontology is the branch of metaphysics that studies “being” and its relationship with language. Hinton defines ontology in information science as “the way we teach digital machines how to understand the inputs we expect them to process”. He means that humans must provide definitions for computers to understand the information humans give them. The contrast between human and computer information, explained earlier through the berry example, is why definitions are needed. For example, a computer does not understand what the word “calendar” means to humans unless it has been defined.
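Hinton’s sense of ontology can be illustrated with a toy example. The structure below is my own hypothetical sketch, not a real ontology format: a machine “understands” a human term like “calendar” only if someone has explicitly written a definition for it.

```python
# A hypothetical, hand-written ontology: explicit definitions a program
# needs before it can act on human terms (structure is illustrative only).
ontology = {
    "calendar": {
        "is_a": "collection",
        "contains": "event",
        "event_fields": ["title", "start_time", "end_time"],
    },
}

def machine_understands(term):
    """A machine 'understands' a term only if a definition was provided."""
    return term in ontology

print(machine_understands("calendar"))  # True
print(machine_understands("friend"))    # False: no definition was written
```

Any human term left out of (or oversimplified in) such a table is invisible or distorted to the machine, which is exactly the kind of gap interface design must close.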

Computers must also be made to translate their information into a format understandable to humans. The translation is necessary so that the two can interact. Hinton finally defines the ontology of user interface design as “establishing understandable semantic function that solves for the contextual gap between person and machine”. While similar to ontology in information science, this definition focuses more on benefiting the user. An example of failed ontology is the definition of “friend” used in Google’s Buzz application in 2010. Buzz automatically generated lists of “friends” for users based on their Gmail activity. Ex-spouses, bosses, and more were added to the lists without user knowledge, and the friend lists were public for all to see. The human definition of friend is complex: it differs from person to person and can change for the same person depending on the situation. The definition given to Buzz was much too simple to account for this variety, and disaster followed.

Analysis
Digital writing includes writing code or posting content on a digital platform. Hinton explains how the digital platform works by first explaining the shift to creating information based on logic. He shows how information science led to machines based on logic, or computers. Computing machines can only work with information presented in an abstract form and require a set of rules to dictate their actions. Code tells the computer the meaning behind given information. Code also tells the computer what to do with the information depending on the meaning. Digital agents are capable of making autonomous decisions only because of code. Therefore, a person writing code must understand how computers interact with information.

Chapter 12 is about understanding the need for information translation. Efficiency in machines can be increased by translating human information into a logical form. Hinton states that computers “begin with abstraction”, meaning that computers accept information in semantic form, such as text. Humans accept information through perception rather than abstract symbols. Only with good translation can humans and computers interact. In the final section on ontologies, Hinton discusses the disaster of Google Buzz’s automatic friend lists. The translation failure with the word “friend” shows how important it is for digital writers to properly define the human world for digital agents. Knowing whom a person frequently emailed was not enough information for Buzz to define friendship.

In his piece “Writing as a Technology”, Bolter offers definitions that support Hinton’s while analyzing digital writing technology. He defines computers as machines that process information and replace skill, yet cannot “function as the writing space in the absence of human writers and readers.” Bolter’s statement emphasizes the need for translation. Digital writing and reading will only continue, and they require human-computer interaction. For that interaction to be fruitful, interface designs must properly translate information between humans and computers. In other words, a proper ontology must be used.