Technology is an element of the material culture shared by different societies [2]. As we will see, the introduction of new technologies is structurally managed by the culture to which an individual belongs. In this column we will explore both the truths of science and the beliefs of common sense, because the social meaning of information technology is closely related to the intentional states of individuals and groups. Before exploring how culture manages the introduction of new elements like IT, we have to better understand what intentionality is. The cognitive sciences define a state of a system as 'intentional' when it refers to something else: inorganic systems (like artificial computing) have no intentional states, because they are determined merely by a series of causes and effects; by contrast, organic systems (like human reasoning) are full of intentional states which produce semantic meanings (meanings completely absent from the syntax of computers) [3].
The AI researchers Stuart Russell and Peter Norvig observed that, although airplanes and pigeons fly by different methods, they both fly [4]. However, this similarity cannot be extended to activities that depend on intentionality, like computing and games: machines have no intentional states, hence they neither compute nor play. Nobody believes that a chessboard and its pieces form an intelligent system able to play, just as nobody believes that an abacus has the faculty of computing: they are merely devices used by human beings in order to play or calculate. Yet many people interpret the automatic functioning of digital machines as if they had intentional states, so it is not rare to read that "the program is playing against me" or that "my navigation system is calculating the distance". Alan Turing, the inventor of the computer, did not even consider digital machines (which he called 'discrete state machines') to really exist. Indeed, he wrote that "strictly speaking there are no such machines [...] but there are many kinds of machines which can profitably be thought of as being discrete state machines" [5]. So digital machines are just instruments that we use for computation and play, exactly like abacuses and chess sets. But as a matter of fact we do not normally perceive them as ordinary tools: rather, at first approach, we perceive them as quasi-biological, goal-oriented objects whose functioning is unknown. So the actual way a new technology is introduced depends both on how individual intentionality perceives novelties and on how the collective intentionality at the base of culture treats them.
Contact with new cultural elements involves fear and desire, two intentional states related to risk. Human culture confines the exploration of new and risky things to safe environments tied to the activities of learning, training, rites and games. The safety of learning and training depends on the physical safety of their environments. Indeed, mammals are able to adapt because they can learn and train freely. This ability derives from the fact that mammals come into life unfit for survival, so they need a span of time to grow up. In the absence of an eggshell, the protected environment must be social: a parent takes care of the newborn's growth, during which the young animal develops not only bones and muscles but also abilities and practiced behaviors. By contrast, other animals such as reptiles, fish and shellfish live their whole lives with the very same abilities inherited at birth and written in their DNA. Human culture takes a further step, amplifying the mammalian process of learning by introducing a way to transmit discoveries from individual to individual and from generation to generation: symbolic language. Hence, returning to our topic, human culture manages contact with risky things (like a new technology) by introducing environments that create safety not only through physical means but also through symbolic ones: rites and games put people in contact with new things in a symbolically safe environment. This symbolic solution is very useful for individuals who perceive automated, unknown machines as risky because, in the absence of a material risk, the protection operates only at the level of human cognition.
This is the first, short article in a series examining the role and nature of games in information technology.
In the next article we will see what games and rites have in common and what they do not share.
Author: Ivan Mosca
Ivan Mosca is a Ph.D. researcher in Philosophy at the University of Turin, Italy. His main research areas are Social Ontology, Game Studies, Bioethics and Theoretical Dialog. He is a member of Labont and the Consulta Nazionale di Bioetica, and he teaches Philosophy for Children. Among his latest articles: Fiction/Interaction, Ontology/Neurology and Computer Games (in "Stvar. Časopis za teorijske prakse", 3), The deConstruction of Social Ontology: the Capital of Palestine (in The Nature of Social Reality, Cambridge Scholars Publishing), and +10! Gamification and Degamification (in "G|A|M|E games as art, media, entertainment. The Italian Journal of Game Studies", 1). For more details: http://labont.it/people/ivan-mosca. Don't hesitate to contact him: email@example.com