Monday, May 17, 2021

The role of clothing in cultural evolution [agriculture, Rank 2]

Ian Gilligan, The clothing revolution, Aeon (edited by Sam Dresser).

My recent work shows that clothing wasn't just the unique adaptation of a more-or-less hairless mammal to changing natural environments. The development of clothing led to innovations with many repercussions for humanity, beyond survival in cold climates. A need for portable insulation from the cold in the Palaeolithic promoted major technological transitions. These include stone toolkits for working animal hides and, subsequently, bone tools such as pointed awls and needles to make tailored garments. Later, during the coldest stage of the last ice age, Homo sapiens in middle latitudes devised multi-layered outfits with an inner layer of underwear. Equipped with effective protection from wind chill, our species could penetrate the frigid Arctic Circle, further north than cold-adapted Neanderthals had managed to venture. From the northeastern corner of Siberia, modern humans strolled across an exposed land bridge to enter Alaska by 15,000 years ago, if not earlier, likely becoming the first hominins to set foot in the Americas. At the Broken Mammoth site in Alaska, archaeologists have unearthed the fragile technology that made the journey possible: a 13,000-year-old eyed needle.

Until recently, the scientific study of clothing was largely the work of physiologists who have explored its thermal properties, which are now well understood. The physiology of clothing allows us to say precisely how much clothing people must wear to survive at sub-freezing temperatures and at differing wind-chill levels. Early hominins in Africa had begun to harness fire between 1 and 2 million years ago, perhaps for cooking more than warmth. Fire was utilised as hominins spread into Europe and northern China, where Homo erectus retreated into caves to escape wind chill. However, even if earlier hominins were more hairy than modern humans, whenever they found themselves in cold conditions beyond certain well-defined survival thresholds, they needed to carry portable insulation while out in the open. For modern humans, exposure times for frostbite can be less than an hour, and life-threatening hypothermia can develop overnight, even in cities. From a thermal perspective, two aspects of clothing are important. First is the number of layers, with each extra layer increasing the total insulation value. The second aspect is whether garments are fitted, or tailored, to enclose the body, especially the limbs. Fitted garments offer superior protection from wind chill, a major risk factor for frostbite and hypothermia.
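
Side note from me: the wind-chill claims are concrete enough to compute. Below is a minimal Python sketch using the standard 2001 North American wind-chill index, plus the convention that clothing insulation, measured in clo units, is roughly additive across layers. The specific layer values are my own illustrative assumptions, not numbers from the article.

# Wind chill per the 2001 North American (NWS/Environment Canada) index.
# Valid for air temperature <= 10 C and wind speed >= 4.8 km/h.
def wind_chill_c(temp_c, wind_kmh):
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

# Total insulation: clo values are approximately additive across layers
# (1 clo is about 0.155 m2*K/W, roughly a business suit). The layer
# values used below are illustrative guesses, not data from the article.
def total_clo(layers):
    return sum(layers)

print(round(wind_chill_c(-10, 40), 1))   # -20.8: -10 C at 40 km/h feels far colder
print(total_clo([0.5, 1.5, 2.0]))        # underwear + inner fur + outer fur = 4.0 clo

That additivity is why the layering point matters: each extra layer buys insulation, but only fitted garments stop the wind from stripping it away.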

Alas, clothing doesn't survive well over time, so the archaeological record is poor.

All the evidence we have for ice-age clothing is indirect, but it nonetheless shows that people had tailored clothes in the last ice age. The world's oldest eyed needles, found in southern Russia, date to 40,000 years ago, and one needle from Denisova Cave is said to be 50,000 years old. Near Moscow, at a site called Sunghir, 30,000-year-old human burials contain thousands of beads neatly arranged on the skeletons.

But...

Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research: the study of clothing lice, or body lice. These blood-sucking insects make their home mainly on clothes, and they evolved from head lice when people began to wear clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head-dwelling ones. One advantage of the lice research is that its results are independent of other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, later revised to 100,000 years ago, early in the last ice age. The US team, led by David Reed at the University of Florida, reported a similar date of around 80,000 years ago, and possibly as early as 170,000 years ago, during the previous ice age. These findings from the lice research suggest that our habit of wearing clothes was established quite late in hominin evolution.
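
The logic behind those lice dates is worth spelling out: it's a molecular clock. Count the genetic differences separating clothing lice from head lice, divide by the rate at which differences accumulate, and you get a divergence date. Here's a toy Python version; the distance and rate numbers are made up for illustration (the Stoneking and Reed teams used far more sophisticated models of mutation and population history).

# Toy molecular clock: after a split, both lineages accumulate
# mutations independently, so divergence time is genetic distance
# divided by twice the per-site mutation rate.
def divergence_time_years(distance_per_site, rate_per_site_per_year):
    return distance_per_site / (2 * rate_per_site_per_year)

# Illustrative inputs only, not values from either study: 0.4% sequence
# divergence at a rate of 2e-8 substitutions per site per year.
print(divergence_time_years(0.004, 2e-8))   # 100000.0 years

The wide spread in the published dates (70,000 revised to 100,000; 80,000 and maybe 170,000) comes largely from uncertainty in that assumed rate and its calibration.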

I love it! Piecing things together through an understanding of causal linkage. Sherlock would love it, too.

And then we've got complex clothing:

My work draws on the known thermal physiology of clothes to distinguish two basic forms of clothing: simple and complex. Simple clothing is loose, not fitted, and consists of just a single layer. Examples of simple garments include capes or cloaks draped over the shoulders, and loincloths. Simple clothes can provide a certain amount of insulation in cold weather, although these loose garments can offer only limited protection from wind chill. Simple clothes made from thick furs were probably sufficient when hominins began to occupy northern Europe during colder glacial stages from half a million years ago. Complex clothes are closely fitted around the body and can have cylinders attached to enclose the limbs properly; additionally, they can have up to four or five layers. Complex clothes were a more recent development, and represent a quantum leap in clothing technology, allowing humans to defeat wind chill and survive in the coldest places on Earth. [...]

Complex clothes required not only scrapers but also hide-cutting tools, called blades, to cut the hides into regular shapes and make the cylinders for sleeves and leggings. The separate shapes had to be sewn together carefully, hence we start to find more dedicated hide-piercing tools, called awls, later refined into the iconic ice-age clothing tool: the eyed needle.

Technologies come and go:

Another clue pointing to the role of climate change is how these technologies sometimes disappeared during warm climate phases. We see this happen in southern Africa, for example, where stone blades and bone awls appear during a cold climate phase around 75,000 years ago. With a return to milder environmental conditions 60,000 years ago, hide-cutting blade tools and hide-piercing bone awls disappear from the archaeological record, only to reappear later towards the coldest phase 22,000 years ago, the Last Glacial Maximum (LGM). Apparently, Stone Age people wore clothes in cold weather, and went naked when clothes were not needed. Clothes became a social necessity for people more recently, perhaps after many generations of wearing clothes on a regular basis, mainly in the colder middle latitudes of the northern hemisphere from the LGM onwards. Once clothes replaced body painting for personal decoration and display, the need to wear clothes was uncoupled from climate.

Australia:

The pattern of clothing in Aboriginal Australia challenges a number of cherished theories about the origin of clothing. For one, routine Aboriginal nakedness implies that humans didn't invent clothes out of some inherent sense of modesty. Nor, as hunter-gatherers, did we need clothes for the sake of appearance. Along with African peoples such as the San, who used antelope cloaks for warmth, habitually naked foragers relied on traditional techniques such as body painting, tattooing and scarification to dress themselves, and they got dressed up more elaborately for ceremonies and other special occasions, all without clothes.

Australian evidence, or rather the absence of it, is likewise pertinent to the origins of agriculture. It's no coincidence that neither textile clothing nor agriculture featured in traditional Aboriginal lifestyles.

Clothing and the Anthropocene:

When the Pleistocene ended 12,000 years ago, there was a new development in clothing. Global temperatures increased dramatically and, along with the melting of continental ice sheets and the rise in sea levels, environments became more humid. Adapting to these moist conditions, people shifted to making their clothes with fabrics woven from natural fibres such as wool and cotton. Compared with leathers and furs, fabrics are better at managing moisture. The woven structure is permeable to air and moisture and, in warm climates, wind penetration can help to cool the body. Moisture from higher sweating rates could evaporate more easily from the skin and also from the fabric, adding to the cooling effect. The warm and wet period after the last ice age, called the Holocene, coincides with a momentous transition: the beginning of the Neolithic era, when people started to engage in agriculture.

The agricultural transition was a turning point in humanity's relationship to the natural world, altering the environment profoundly and enabling the rise of cities and civilisations. My surprising suggestion is that there was a connection between the textile revolution and the agricultural revolution. By implication, this technological change in clothes led to the Anthropocene, a phase of human-induced global warming that started with agriculture and was accelerated by the Industrial Revolution.

Paradigm shift?

The prevailing narrative in anthropology privileges food in the transition to agriculture, and the whole concept of hunting and gathering refers essentially, if not exclusively, to the food economy.

Nevertheless, anthropologists including Robert Kelly at the University of Wyoming have been saying for some time, and for a number of reasons, that the hunter-gatherer category has reached the end of its useful life. Yet we're still encumbered with outdated food-focused terms such as foragers and hunter-gatherers to denote pre-agricultural lifestyles. By definition, it would seem, the shift from foraging to farming must have begun in the food economy. Not necessarily so, in my view.

Aside from a paradigm shift, the textile hypothesis invites a critical re-evaluation of the evidence we have about early agriculture. Archaeological evidence for textile fibres in early agricultural contexts has been present all along but overlooked, and this evidence for fibres does provide food for thought. [...]

The popular notion of agriculture as a superior food strategy reflects anachronistic perceptions of foraging as a harsh, precarious lifestyle. In contrast, archaeologists have now recognised the serious risks of famine and malnutrition in early farming communities, and confirmed the relative ease of traditional foraging lifestyles, even in marginal environments such as Australian deserts. Anthropologists realise that foraging for food has many advantages, including the flexibility and security that derive from exploiting a wide resource base. These benefits highlight another enigma: many Neolithic societies continued to rely on foraging for much of their food supply, sometimes for thousands of years after they adopted agricultural practices. Similarly, evidence from northern Europe suggests that forager communities were inclined to resist the spread of agriculture.

It just keeps going and going.

Directly and indirectly, textiles tipped the balance in favour of agriculture. Food production did become a dominant feature, with more plant and animal species domesticated to feed humans. Even then, as the archaeologist Brian Hayden points out, agricultural foodstuffs often served as surplus products for feasting on special occasions, not as everyday staples. Food alone is an insufficient and problematic explanation for why agriculture was adopted in the first place. An alternative textile scenario might sound implausible, requiring a revolution in how we look at the agricultural revolution. Yet this scenario echoes the analogous role of textiles in the Industrial Revolution, a pivotal point in history when factory-style production of cotton textiles was an impetus for industrialisation. Ironically, if both agriculture and industrialisation were intertwined with clothes, it would mean that a human adaptation to cold weather ultimately made the whole world warmer.

Hey, GPT-3, analyze this!
