Suppose a child is abandoned in the wild and has no contact with other people. Will they even learn a language? This is a question I was recently asked, along with the follow-up “could they possibly develop a language from scratch?” Although this seems intuitively possible, and popular culture provides us with examples such as The Jungle Book, I am afraid the answer is no.
I have already tried to answer the question in a series of tweets, but, for all its virtues, Twitter can be a limiting medium, and it precluded an elaborate answer. I will attempt to give such an answer in the paragraphs that follow, where I examine what we know about children who grow up with no linguistic input; what we currently think about the nature of language; and, finally, what we believe about the ways in which language is acquired.
What happens to children who don’t have exposure to language?
The first recorded case of children growing up without any language input is provided by the Greek historian Herodotus. In Book 2 of his Histories, Herodotus reports that the Egyptian pharaoh Psamtik I (whom he calls Psammetichus) devised an experiment to find out whether the Egyptians were the oldest people on earth. To that end, “he took two newborn children of the common people and gave them to a shepherd”, who was instructed to raise them in an isolated mountain hut, without ever speaking to them. At the age of two, the children reportedly began to utter the word “bekos”, which sounds unsurprisingly similar to the bleating of the shepherd’s goats. Herodotus reports:
Psammetichus then heard them himself, and asked to what language the word “Bekos” belonged; he found it to be a Phrygian word, signifying bread. Reasoning from this, the Egyptians acknowledged that the Phrygians were older than they.
Herodotus was the kind of historian who loved a nice story and couldn’t always be bothered to verify what he was told, but Psammetichus’ experiment seems cruel enough to be true, and the reported finding is entirely possible. Later in history, a similar experiment is said to have been carried out by King James IV of Scotland, and it is reported that the children “spak very guid Ebrew” (Fromkin & Rodman 1993: 24 [*]), proving that data fabrication predates modern science.
Feral children seem incapable of fully mastering a language after being found
These accounts aside, we know of no child who has developed language skills without linguistic input. On the contrary, there are several cases of ‘feral children’ who grew up in the wild, or children who grew up in extreme social isolation (e.g., Aitchison 1992: 85-89). In such cases, the children did not develop any language of their own, and they also seemed incapable of fully mastering a language after they had been found.
In short, it appears that there is no evidence of children spontaneously developing a ‘private’ or ‘personal’ language. But absence of evidence does not prove anything. After all, if people live in such isolation that they have no linguistic input at all, they would be very difficult to find, so maybe we just haven’t been able to find children with private languages. So in the next section, I will turn to the question of whether such a language is theoretically possible.
Can a ‘personal’ language exist?
There are two main arguments against the existence of ‘private’ languages: one relates to the timescales required for language to develop, and the other relates to the social nature of language.
Languages take time to develop
Fictional characters such as Mowgli or Tarzan apparently begin to develop language skills by associating items or people in their surroundings with specific sounds. After ‘naming’ various objects, they eventually form more elaborate strings of words (‘me Tarzan, you Jane’), which function like a language, albeit a primitive one. The problem with such narratives is that the developmental processes involved in creating a language from scratch operate on a timescale of millennia. George Yule describes this process rather nicely:
The human may have first developed the naming ability, producing a specific noise (e.g. bEEr) for a specific object. The crucial additional step which was then accomplished was to bring another specific noise (e.g. gOOd) into combination with the first to build a complex message (bEEr gOOd). A few hundred thousand years of evolution later, man has honed this message-building capacity to the point where, on Saturdays, watching a football game, he can drink a sustaining beverage and proclaim “This beer is good”. (Yule 2010: 5-6)
Unless there is some kind of short-cut (see below), it would be impossible for these processes to take place in the space of a single lifetime.
Languages are social behaviour
The second argument against a ‘private’ language is that language is, by definition, social behaviour. In fact, it is suggested that a language effectively dies out when there is only one speaker left, because they cannot communicate with anyone else (Crystal 2002: 2). Whether or not thought exists independently of language is a hugely controversial topic, but a language that is never articulated and only exists in the form of mental representations would be a very peculiar language indeed.
So, going back to the question of whether a ‘private’ language can exist, the simplest answer would be ‘no’. It is, perhaps, possible to stretch the definition of language enough so that it includes more primitive forms of communication, or non-communicative behaviour and mental activity. That, however, raises the question of how far one can deviate from shared assumptions about what language is, while still legitimately talking about language. Such a question would take us out of the domain of linguistics and into philosophy, so I will not pursue it further.
How is it possible to speak?
In the previous section, I argued that the development of language from scratch is a process that lasts millennia. How is it then that children manage to learn how to speak a language so well in the space of a few years?
The truth is that we don’t fully understand the process of (first) language acquisition, but we have (at least) two plausible theories that purport to explain it. In both cases, it is assumed that evolution has given humans a ‘short cut’ to mastering any language they are exposed to. Some people think that this short cut consists of innate language knowledge, whereas others think that we come biologically equipped with a special ability to process language signals very efficiently.
The first theory posits that, at birth, the human brain already contains a lot of information about language. This knowledge (a Universal Grammar) is latent, and it requires biological maturity and exposure to language input in order to be activated. When a child is at the right age, linguistic input begins to activate the Universal Grammar and shape the child’s language.
Maybe we are born with innate knowledge of linguistic principles and programmable parameters
In some ways, this theory views children like smartphones: when you first switch on a new smartphone, you may be asked for information about your location and preferred language; this input is then used across all applications to determine how information like the date, time, currency, telephone numbers and addresses is to be presented. The key to understanding this process is that users don’t need to manually configure how they prefer to view all this information; the smartphone already contains the preferred formats for every country – all it needs to know is where it is located. You configure the settings once, and all the parameters are set for you.
Similarly, the Universal Grammar theory posits that the human mind already ‘knows’ basic linguistic facts (‘principles’), and it just needs to figure out some specifics of the language it is trying to process (‘parameters’), such as whether objects follow verbs or come before them. Although there are a limited number of parameters, the theory holds, there are multiple combinations, which account for the linguistic diversity in the real world.
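To make the ‘limited parameters, many combinations’ point concrete, here is a toy sketch in Python. The parameter names (`head_direction`, `pro_drop`, `wh_movement`) are loosely borrowed from the literature, but the code is purely illustrative and is not an actual model of Universal Grammar:

```python
# Toy illustration of the 'principles and parameters' idea:
# a small, fixed set of parameters, each with few settings,
# combines into many distinct 'grammars' -- much as a handful
# of locale settings yields many phone configurations.
# (Illustrative only; not a real model of Universal Grammar.)

from itertools import product

# Hypothetical parameters, each with two possible settings.
PARAMETERS = {
    "head_direction": ["head-initial", "head-final"],  # verb before or after object?
    "pro_drop": ["yes", "no"],                         # can subjects be omitted?
    "wh_movement": ["fronted", "in-situ"],             # do question words move?
}

def all_grammars(params):
    """Enumerate every combination of parameter settings."""
    names = list(params)
    return [dict(zip(names, combo)) for combo in product(*params.values())]

grammars = all_grammars(PARAMETERS)
print(len(grammars))  # 2 ** 3 = 8 distinct 'grammars' from 3 binary parameters
```

Just three binary parameters already yield eight grammars; a few dozen would yield more combinations than there are attested languages, which is the theory’s way of squaring a small innate inventory with real-world diversity.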
The other theory is that humans are not born with linguistic knowledge as such, but rather with an ultra-efficient way of analysing linguistic information. We know very little about this ability (for instance, we are not sure whether it is a part of general intelligence, or something language-specific), but broadly speaking it seems to involve:
- An ability to notice grammatical features and prioritise them over semantic information. For example, a Greek child will effortlessly notice that «κορίτσι» (‘koritsi’, girl) is grammatically neuter, even though it refers to a female person.
- A highly flexible way of making connections between language items, storing them in the mental lexicon and retrieving them from it. This might involve storing together phrases such as savage man, wild man and wild flower, but not *savage flower.
- An ability to subconsciously form, test, and evaluate linguistic hypotheses while communicating. These hypotheses form the basis of an internal ‘grammar’ system, which keeps evolving as new input becomes available.
While both theories posit some kind of innate predisposition for mastering a language at an impressive speed, in both cases exposure to linguistic input is necessary. Either way, then, it would probably be impossible for a person to develop a new language from scratch.
To sum all this up, language acquisition is a social process. Although we are still uncertain about the actual cognitive processes involved, we do know that these are triggered by exposure to meaningful input. The evidence we have mainly comes from the so-called ‘feral children’, who grew up outside society. Maybe it’s just as well that we don’t know more.
[*] Sadly, this anecdote appears to be one of the few things that have been left out of the ever-expanding newer editions of the Fromkin and Rodman book. I think it’s a pity.
Featured Image Credit: Simon Blackley @ Flickr | CC BY-ND 2.0