General ancient history – HistoryExtra

A brief history of Ramadan

Ramadan, Islam’s holy month of fasting, has been observed and celebrated by Muslims around the world for more than 14 centuries. In the seventh century, Prophet Muhammad stated that Islam is built upon five pillars and that fasting in Ramadan was one of them. Today, nearly a quarter of the world’s population mark or observe the fast during daylight hours, giving great respect to the Islamic month in which the holy book of Islam, the Quran, was revealed to the Prophet.

What was initially practiced by around a hundred early Muslims in the seventh century is now emulated by 1.8 billion people around the world. (Image by RF Getty Images)

 

What does Ramadan mean?

Ramadan literally means ‘intense heat’, denoting the scorching summer month to which it was originally ascribed. It formed part of the pre-Islamic Arab calendar well before Islam came to Mecca, the holy city in today’s Saudi Arabia, in the seventh century.

When is Ramadan?

Muslims embrace Ramadan as the ninth month of the Islamic lunar calendar. Moonsighting – the practice of spotting the new moon on the first night of each Islamic month with the naked eye – is a tradition that has endured to this day, as Muslims across the world wait in anticipation and excitement for the birth of the Ramadan moon.

Moonsighting is a tradition that has endured to this day. Here, a man in Indonesia prepares to sight the new moon that would signal the end of Ramadan in 2019. (Photo by Robertus Pudyanto/Getty Images)

The practice of fasting was familiar to the pre-Islamic Arabs, as the Quran mentions: “You who believe, fasting is prescribed for you, as it was prescribed for those before you, so that you may be mindful of God.” (Quran 2:183)

It was during Ramadan that the very first revelation of the Quran occurred. This took place in 610 AD, when Muhammad retreated to a cave on Mount Hira on the outskirts of Mecca for secluded contemplation. The timing of this initial revelation is given special significance as the “night of power”: “We have revealed it (Quran) in the night of power. And what will explain to you what the night of power is? The night of power is better than a thousand months.” (Quran 97:1–4)

Why do Muslims fast during Ramadan?

The Quran was revealed to Prophet Muhammad over a period of 23 years, and the verses instructing Muslims to fast the entire month of Ramadan came in the latter half of that period. During the first 12 years in Mecca, the Muslim minority faced torture, tyranny and persecution from the ruling pagan tribe of Quraysh, with many losing their lives. The surviving Muslims migrated to the city of Medina in 622 AD, over 300km away. Two years later, the verses about fasting in Ramadan were revealed, with Prophet Muhammad establishing the holy month’s practices in the sanctuary of their new home.

How did the practices of Ramadan begin?

The early Muslim community would awake for the pre-dawn meal, known as suhoor, and refrain from eating, drinking and marital relations until sunset, when they broke their fast (iftar), typically on dates. As well as spiritual discipline and increased worship, fasting placed a strong focus on improving behaviour, as Prophet Muhammad stated: “If a person does not avoid false talk and false conduct during the fast, then God does not care if he abstains from food and drink.”

An example of an iftar during Ramadan. (Image by RF Getty Images)

During the final years of his life, Prophet Muhammad began to perform extra night prayers in Ramadan called taraweh. His companions started joining him in the mosque and, as the numbers grew, the Prophet became concerned they would regard it as an obligation, so he continued his prayers alone at home. Ten years after the Prophet’s death, the leader of the Muslims, Caliph Umar, saw Muslims scattered around the mosque praying the extra night prayers in separate groups, and established a congregational prayer to unify their worship. Since that time, congregational taraweh has become a defining feature of Ramadan, and one through which the Quran is recited in its entirety.

One particular dietary development that aided worshippers to perform the long night prayers was coffee – possibly derived from the Arabic word qahwa originally meaning wine, or from quwwa, meaning power or strength. When coffee was cultivated in Yemen in the 14th or 15th century, it was first consumed in zawiyas, or spiritual centres, and helped Muslims stay alert for their nightly Ramadan vigil.

 

How has the observation of Ramadan changed through history?

While the core rituals and significance of Ramadan have remained unchanged since 624, the spread of Islam across the globe gave texture and diversity to the global Ramadan experience. In Ottoman times, drummers in Turkey woke people for the pre-dawn meal; similarly in Morocco, a nafar (town crier) dressed in a traditional Moroccan robe and leather slippers roamed the streets, rousing people with the sound of an instrument such as a horn, trumpet or daff. These special Ramadan callers were also to be found in Syria, and in what are now Tunisia and Algeria.


In Egypt, a Ramadan lantern or fanoos, probably originating during the Fatimid dynasty of the 10th-12th centuries, became a symbol of the sacred month – perhaps to symbolise the spiritual light and blessings that Ramadan brings. Today, intricate lanterns are seen lighting up homes, shops and lining the streets. Egypt was also said to have instigated the ‘iftar cannon’, or ‘midfa al-iftar’, where a cannon was fired to dramatically announce the time for breaking the fast. This tradition is said to have begun around 200 years ago, although some historians trace it further back to the Mamluk period of the 15th century, when the sultan in Cairo was test-firing a new cannon at the time of sunset prayers. Locals thought the sultan was signalling the time to break their fast, and seeing how much joy it brought his people, the sultan made it a daily Ramadan routine.

A Ramadan lantern, or fanoos, became a symbol of the sacred month, such as these lanterns in Cairo in 2018. (Image by RF Getty Images)

Ramadan also entered the sphere of poetry and captured the imagination of Sufi poets, acting as their muse while they penned love poems to the holy month. The famous 13th-century Persian Muslim mystic poet and scholar, Jalaluddin Rumi, wrote: “O moon-faced Beloved, the month of Ramadan has arrived. Cover the table and open the path of praise.”

Prophet Muhammad stipulated that the Muslims feed the poor in this month. Towards the end of Ramadan, zakat-al-fitr, or ‘alms of the breaking of the fast’, was a duty on every able Muslim, and equated to a portion of dates or barley given directly into the hands of the poor. Over time, payment in kind gave way to a monetary equivalent, and now a Muslim gives a minimum of £5 to mosques or charities for distribution to the poor on their behalf.


It was incumbent that the zakat-ul-fitr be distributed before the festival of breaking the fast, known as Eid ul-Fitr, which marks the close of Ramadan. Prophet Muhammad appointed it as a day of community and celebration, beginning with a special communal prayer. It was tradition to begin the day by eating something sweet, which has given rise to the fond nickname of the ‘Sweet Festival’, or ‘Sweet Eid’. In the Prophet’s time, Eid morning began with a simple breakfast of dates, but as Islam spread through different lands, these humble beginnings gave way to a variety of sweet dishes, such as sheer-kurma, a milky dessert of vermicelli, nuts and dates popular in the Indian subcontinent, or cambaabur, the Somali Eid bread covered with sugar and yoghurt.

What was initially practiced by around a hundred early Muslims in the seventh century is now emulated by 1.8 billion people around the world, who continue to follow the Prophetic tradition while marking Ramadan in their own culturally unique ways.

This article was first published on HistoryExtra in April 2020

Blood, sweat & marble: examining ancient bodies

Imagine an ancient Greek or Roman body, and the first picture that pops into your head is probably made of marble or stone – perhaps an austere bust, or a gleaming, musclebound sculpture, polished, cold and pale. But what about the experience of living in a real body, in all its pleasure, pain and flaws, during antiquity? Speaking with Elinor Evans, Caroline Vout presents the flesh and blood realities of life – and death – in ancient Greece and Rome.

 

Caroline Vout is the author of Exposed: The Greek and Roman Body (Profile Books, 2022)

Mavia of Tanukh: the warrior queen of Arabia

The summer of AD 378 was a turbulent time in the Roman east. Rumblings in Thrace – a historic region in the south-east Balkans – threatened to erupt into war as the Goths, hungry for imperial territory to settle, crossed the Danube and headed south towards Constantinople. The Roman emperor Valens, struggling to stem the rising tide of Germanic invasion from the north, began preparing for war.

The ranks of his army were bolstered by an unexpected source: Saracen cavalry from distant Arabia, dispatched by Queen Mavia of Tanukh. This was an unlikely turn of events, because Mavia had defeated Valens in battle just months earlier. But the Arabs and Romans had a common enemy now in the Goths, whose multitudes advanced upon the very doorstep of the Roman empire.

On 9 August, the emperor’s legions and allies faced the Germanic warriors at the battle of Adrianople. It was a massacre. The Goths slew Valens and all but annihilated his men. Mavia’s army, equipped for mobile assaults rather than pitched battle, were spared the worst and lived to fight another day. Later that year, and smelling blood, the Goths advanced on Constantinople itself. Outside the great walled city stood its Saracen guardians, determined that vengeance would be theirs. They confronted the Goths in wedge formation, breaking their enemy’s ranks – and, in doing so, helping to save the seat of the imperium.

Constantinople flourished as the epicentre of Roman identity and Orthodox Christianity for another millennium – in part thanks to Mavia and her men.

Who was Queen Mavia of Tanukh?

Mavia’s origins before her ascent to power are something of a mystery. We know that she was born in the mid-fourth century, a noblewoman of a semi-nomadic Arabian people whose powerbase lay between the deserts of Syria and Hijaz (a region in what’s now western Saudi Arabia), precariously sandwiched between the warring Roman and Persian empires. Hailing from the battle-hardened Kalb tribe, she was married to the king of the Tanukhids – possibly named al-Hawari – who ruled over a tribal confederation whose land stretched from Bosra (in southern Syria) to Aleppo in the north. When he died in AD 375 without an heir, Mavia took control.

The abrupt transfer of power – and the fact that their new leader was a woman – made the Tanukhids vulnerable to attack. Yet Mavia asserted her military dominance immediately and decisively. To the shock of male contemporaries and later chroniclers, she quickly proved to be a brilliant military general. The contemporary Latin church historian Rufinus of Aquileia recounts the moment when she burst into history: “Mavia, the queen of the Saracens, began to rock the towns and cities on the borders of Palestine and Arabia with fierce attacks, and to lay waste to the neighbouring provinces at the same time; she also wore down the Roman army in frequent battles, killed many and put the rest to flight.”

Within months, Mavia had conquered much of the Middle East, including Arabia, Palestine and the Sinai region of Egypt. She remained undefeated against a series of Roman legions and garrisons, vanquishing Valens before enacting terms for peace. Her breathtaking victories on the battlefield are nothing short of legendary. Mavia’s success in building a broad coalition of Arab tribes gave her the upper hand against a Roman army that was thinly spread across the region, and lacked the military fervour of their Saracen adversaries.

In the short term, she gained independence for the Arabs from Roman rule, setting a precedent for diplomacy and warfare with the Romans and Persia for centuries to come. In the long run, she guaranteed the Arabs a seat at the table, especially in discourse with the polities and churches of the near east.

Whatever her specific reasons for waging war against the Romans, Mavia was determined that her people should abandon paganism and adopt Christianity. In a world that was starting to turn towards the church, conversion would help build ties with the Christian Romans – a politically advantageous move for the Arabs. Ultimately, her goal was to form an alliance with the eastern Roman empire while maintaining independence. She achieved both on her own terms.




Writing a few decades later, the church historian Sozomen of Gaza recounts her negotiations with the Romans. “As the war was still pursued with vigour, the Romans found it necessary to send an embassy to Mavia to solicit peace. It is said that she refused to comply with the request of the embassy unless consent were given for the ordination of a certain man named Moses, who practised asceticism in a neighbouring desert, as bishop over her subjects… He reconciled them to the Romans and converted many to Christianity, and passed his life among them as a priest.”

By securing a religious leader from among her people, rather than accepting a foreign holy man appointed by the Romans, Mavia secured independence for the Arabs. The partnership she forged with the monk Moses planted the roots of an orthodox Arab national church and began the process of unifying Arabia through religion. Varieties of Christianity spread in the fifth and sixth centuries through the works of tribal chieftains and the missions of their holy men who introduced monotheism to the Arabs two centuries before the appearance of the Prophet Muhammad.

Mavia of Tanukh’s death in battle

Amid the turmoil of warfare, Mavia gave her daughter Chasidat in marriage to a Roman officer, Victor, making her a Roman citizen. Chasidat was also a warrior, and would fall in battle against the Goths in service to her mother’s revolt. But the peace between the Romans and their Arab allies, sealed by the first marriage in history between an Arab woman and a Roman man, lasted three centuries.

Although Mavia was the primary catalyst for Arabia’s turn towards Christianity, she likely never converted. She is, therefore, associated with pre-Islamic pagan customs throughout classical Arabic literature, remembered as a fiercely independent noblewoman who dispensed with money and men freely. She is explicitly called “queen of the Arabs”, sharing this honour with only one other woman – empress Zenobia of Palmyra, ruling a century earlier.

What we know of Mavia following her defeat of the Romans comes largely from literary sources, which perhaps tell us more about wider ideas of values and identity in the Arab world than about her historical life. But these tales are enlightening, nonetheless. In these writings, Mavia occupies the most opulent throne in Arabia – that of al-Hira in Mesopotamia, in what’s now south-central Iraq. Young men of her palace chambers serve her every need, and she is courted by the finest bachelors of all Arabia.

Having taken a worthy warrior-poet as a husband, she divorces him according to ancient Bedouin custom – that is, without consulting him, and indefinitely keeping possession of their home and children. Beyond these stories, Mavia is recalled as foremost among women who “married whomever she wished”, but also among those who “used to divorce their husbands in the jahiliyyah [pre-Islamic Arabia]”.

Little is known definitively about Mavia’s later life, though it seems she survived for many years after those early military triumphs. The date of her death is a matter of dispute; one Mavia is cited in a funerary inscription dated AD 425 in Khanasir (then known as Anasartha), a town south-east of Aleppo. If this dating is correct, it means that Mavia ruled for half a century following the death of her husband.

Whatever the truth of her life or death, this great Arab warrior queen certainly changed history. She was a rugged fighter and an undefeated military general worthy of Sozomen’s hyperbole: “[Mavia] regarded not the sex which nature had given her and displayed the spirit and courage of a man.”

Her pagan freedom demonstrates the practice of polyandry and divorce culture among upper-class Arab women before the imposition of patriarchal marriage. Her triumph against the Romans, and her success in establishing a unified Arab church, launched Arabia as a whole towards monotheism long before the dawn of Islam. Most importantly, it is thanks to this proud, strong and brilliant woman that the Arabs entered history as an independent, unified, diplomatically connected, God-fearing people.

Emran El-Badawi is associate professor of Middle Eastern Studies at the University of Houston. His latest book is Queens and Prophets: How Arabian Noblewomen and Holy Men Shaped Paganism, Christianity and Islam (Oneworld, 2022)

This article was first published in the January 2023 issue of BBC History Magazine

Were the pyramids built by aliens? The real history that debunks the conspiracy

More than 50 pyramids still stand in Egypt: colossal feats of architecture and engineering and lasting monuments to the ancient civilisation of the pharaohs. From the oldest – the step pyramid of Djoser, erected during the Third Dynasty in the 27th century BC – the period of pyramid building lasted around a millennium, and the structures took many forms. They could be made of mud brick or limestone, stepped or with smooth white casing.

Pyramids served as royal tombs and aids for pharaohs to reach the afterlife; heavy with symbolism, the shape evoked a stairway to heaven or the Sun’s rays coming down to Earth. They had a worldly purpose, too, as a status symbol for kings – who else could command Egypt, both its people and resources, to build something so wondrous?




Of all the examples in Egypt, the most famous is the Great Pyramid of Giza, built over several decades in the 26th century BC on the command of pharaoh Khufu. It was the oldest of the Seven Wonders of the Ancient World and the only one still surviving, and for nearly 4,000 years it stood as the tallest human-made structure in the world.

The conspiracy theory: an alien construction

So the theory has it, the pyramids were not the result of decades-long construction projects involving tens of thousands of people, gigantic logistical and administrative efforts, near-inexhaustible supplies of stone, and master architects and engineers. Instead, they were built by aliens, or at least aliens showed humans how to build them.

Extra-terrestrial interference has not been the only outlandish theory to explain how an ancient civilisation boasted such awesome monuments. The people of Atlantis have received similar credit. Much about the pyramids, from how they were made to what is inside them, remains unknown to this day, so perhaps unsurprisingly there are many misconceptions as well as conspiracy theories.

A common claim that still lingers is that they were built by slaves. “The Bible suggests that the Egyptians enslaved the Israelites, and classical authors visiting Egypt recorded that slaves must have been used. But if you came from a civilisation that used slaves, like Greece or Rome, you’re probably going to assume any massive monument was built by slaves,” says Joyce Tyldesley, Egyptologist at the University of Manchester. “While there certainly was what could be called forced labour, it is incorrect to imagine slaves being whipped and dragged from other countries. They were native Egyptians rather than prisoners.”


What is the source of the theory?

Although pyramids have existed for more than four millennia – and across many civilisations around the world – the introduction of aliens was a fairly recent idea. “I would say it really develops after HG Wells publishes The War of the Worlds in 1897. This starts a run of sci-fi books,” says Professor Tyldesley.

“There’s one in particular in 1898, Edison’s Conquest of Mars [by American astronomer and writer Garrett P Serviss], which reveals that the Great Pyramid and the Sphinx are Martian constructions. It’s not supposed to be a serious book; it’s fiction. But this idea that someone from outside Earth might have visited Egypt and built the pyramids took hold.”

In 1968, the Swiss author Erich von Däniken published his bestselling book, Chariots of the Gods? Unsolved Mysteries of the Past, which went a long way in popularising the theory that ancient astronauts visited Earth, were welcomed as gods, and greatly influenced the cultures, religions, technologies and, of course, architecture of ancient civilisations.

The reasons why the theory took hold

There is a compelling argument that the belief in ancient aliens visiting Earth and imparting their superior knowledge stemmed, initially at least, from a deep-rooted racism and prejudice against people from the past, a reluctance or refusal to believe that an ancient civilisation from somewhere else in the world – like Africa or, in the case of the Mesoamerican pyramids, Central America – could actually construct such impressive monuments. Any explanation, even aliens, would be preferable or more believable.

“It’s not appreciating the skills and abilities of the people of the past because we have sufficient evidence to show that these people could do this type of building,” says Professor Tyldesley. “But I think there’s more to it than that. Prior to the idea of aliens helping to build the pyramids, we had the idea that people from Atlantis might have helped, and prior to that we had the idea that God inspired the builders.”

The long succession of changing and evolving theories may be a result of the longevity of the pyramids themselves: they are still there, inspiring wonder and bewilderment. “The fact that we didn’t understand the Egyptian beliefs of death and the afterlife for a long time after the end of the dynastic age meant that we were forced to find explanations to try and understand why they were there.”

The evidence that debunks the conspiracy

“If we take the Great Pyramid of Giza, we know that it was built by gangs of workers summoned under a sort of national service or corvée system,” says Professor Tyldesley. “They had fairly basic but effective tools, and the workers were able to cut blocks of stone, transport them to the construction site and gradually erect the pyramid. There are parts of the technology we can’t see today – for example, the ramps – but basically, it was the sheer amount of person power that made it all possible.”

The existence of a limestone quarry near to the site of the Great Pyramid, plus evidence of the camp sites for the workers, reveals how extensive the project would have been. “We have evidence of how they were fed and for the cemeteries where they were buried,” adds Professor Tyldesley. But it is only in the past 50 to 70 years that our understanding of pyramid building has made significant strides, which for a long time left a vacuum in which conspiracy theories were able to take root. Egyptologists, archaeologists and historians could do more, according to Professor Tyldesley, to present the evidence and a credible alternative to the idea of alien visitors.

“The construction of the pyramids encouraged not only building techniques, it encouraged civil service to develop to coordinate all the workers; the development of boats to ship timber or stone; medical skills to deal with accidents; and possibly increased the sense of community. I think the building of the pyramids is absolutely fundamental to the building of Egypt.”

Writer and broadcaster Dr Joyce Tyldesley is honorary research fellow at the School of Archaeology, Classics and Egyptology at Liverpool University, and teaches Egyptology at Manchester University

What is the Sator Square, and how old is it?

Made up of 25 letters in a 5×5 grid, the Sator Square is no ordinary palindrome. Far more sophisticated than the average ‘racecar’ or ‘top spot’, this is made up of five words in Latin that all interact with each other. The words – Sator, Arepo, Tenet, Opera and Rotas – can be read horizontally and vertically, from the top and left, and the bottom and right.

Those words, especially the middle one, will be familiar to fans of Christopher Nolan’s time-reversing science-fiction thriller, Tenet (2020), as they all appear in the course of the movie. Sator is the name of the villain (played with menace by Kenneth Branagh); Arepo is a shadowy art forger; the opening scene takes place at the opera; Rotas is the name of an important security company; and Tenet, of course, is the title. By no means, however, was the Sator Square invented for the movie.

It is an ancient artefact, seen in myriad cultures over many centuries. Examples have been found all over Europe and beyond, from England to Syria and Sweden to North Africa, and dating from as early as the first century AD. Squares have been etched into clay tablets, roof tiles and walls, or written on papyrus, amulets and medieval textbooks.

Over the centuries and throughout the Middle Ages, the Sator Square was imbued with magical properties, seen as a charm to ward off evil or illness. It was believed it could cure anything from dog bites and rabies (by eating a piece of bread inscribed with the 25 letters) to toothaches, a fear of water, and insanity. In medieval Germany, a disc carved with the square was believed to be able to put out fires.

As to its meaning, breaking down the individual words offers a useful starting point. In Latin, ‘Sator’ means ‘sower’ or ‘seeder’, which could be a reference to a farmer. The meaning of ‘Arepo’ is unknown, but it could be a person’s name. ‘Tenet’ means ‘holds’; ‘Opera’ can mean either ‘with care’ or ‘work’; while ‘Rotas’ is thought to refer to ‘wheels’.

So, if read top to bottom – although most commonly known today for starting with ‘Sator’ at the top, there are existing examples beginning with ‘Rotas’ – it is possible to construct a sentence along the lines of: “The sower Arepo holds the wheels with care (or works the wheels).”

Of course, the Sator Square has not survived this long, across multiple civilisations, for an all-too-brief story about a farmer called Arepo. The studies and debates over its meaning go on, with the word ‘Arepo’ alone being subject to numerous theories and interpretations. Yet there is no question that it took on significant meaning in Christianity, since squares have been found in churches and in the pages of Bibles.

This may be down to the fact that the letters can be rearranged to make the words “Pater Noster” – meaning “Our Father”, the first words in the Lord’s Prayer – twice, with the two phrases intersecting at the central N to form a cross. This leaves two As and two Os, which could be used to stand for Alpha and Omega, the first and last letters of the Greek alphabet used to refer to God (the beginning and the end). Much like the fish symbol, the Ichthus, Christians may have co-opted the Sator Square as a secret form of communicating with each other at a time when they were being persecuted.
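For readers who want to check these structural claims for themselves, here is a minimal Python sketch (not from the original article; the variable names are our own). It verifies that the square reads the same along rows and columns, that the full 25-letter string is a palindrome, and that the letters rearrange into “Pater Noster” twice sharing a central N, with two As and two Os left over:

```python
from collections import Counter

# The five words of the Sator Square, read as rows of the 5x5 grid
words = ["SATOR", "AREPO", "TENET", "OPERA", "ROTAS"]
grid = [list(w) for w in words]

# Rows and columns spell the same words: the grid is symmetric about its diagonal
assert all(grid[r][c] == grid[c][r] for r in range(5) for c in range(5))

# Read as one continuous string, the whole square is a palindrome
flat = "".join(words)
assert flat == flat[::-1]

# The 25 letters rearrange into "PATERNOSTER" twice (sharing the central N),
# with two As and two Os left over
assert Counter(flat) == Counter("PATERNOSTER" * 2) - Counter("N") + Counter("AAOO")

print("All three properties hold.")
```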

It remains unlikely that the square originated among early Christians, though, since the earliest-known examples have been unearthed as part of the excavations at Pompeii. The city was destroyed by Vesuvius in AD 79, before Christianity had been firmly established. Theories therefore abound as to the true origins of the Sator Square, from it being a Jewish symbol to its association with a number of ancient religious cults. Or it may simply have been a Roman word puzzle.

Timeline: the pharaohs and dynasties that ruled ancient Egypt

Early dynastic period, c3000–2686 BC

1st Dynasty: c3000–2890 BC

First pharaoh: Aha (c3000 BC–unknown)

Last pharaoh: Qa‘a (unknown–2890 BC)

2nd Dynasty: 2890–2686 BC

First pharaoh: Hetepsekhemwy (2890 BC–unknown)

Last pharaoh: Khasekhemwy (unknown–2686 BC)

Key information: 

The era directly following the unification of Upper and Lower Egypt in c3100 BC, the Early Dynastic Period sees a capital city established at Memphis.

First pharaoh: Aha (c3000 BC–unknown). (Image by Getty Images)

Old Kingdom, 2686–2160 BC 

3rd Dynasty: 2686–2613 BC

First pharaoh: Nebka (2686–2667 BC)

Last pharaoh: Huni (2637–2613 BC)

4th Dynasty: 2613–2494 BC

First pharaoh: Sneferu (2613–2589 BC)

Last pharaoh: Shepseskaf (2503–2498 BC)

5th Dynasty: 2494–2345 BC

First pharaoh: Userkaf (2494–2487 BC)

Last pharaoh: Unas (2375–2345 BC)

6th Dynasty: 2345–2181 BC

First pharaoh: Teti (2345–2323 BC)

Last pharaoh: Nitiqret (2184–2181 BC)

7th and 8th Dynasties: 2181–2160 BC

Several ephemeral kings ruled in the 7th Dynasty, most of whom took the name of Neferkara – probably in imitation of the throne name of Pepy II (6th Dynasty, 2278–2184 BC). The short-lived 8th Dynasty was equally unstable and saw the collapse of the Old Kingdom system of control.

Key information: 

The first true, flat-sided pyramids are built during the reign of Sneferu (founder of the 4th Dynasty), in place of the step pyramids commonly found in the 3rd Dynasty. Sneferu’s son, Khufu (2589–2566 BC), builds the Great Pyramid of Giza, and the Great Sphinx follows, probably during the reign of Khafre (Khufu’s son and successor, 2558–2532 BC), together with a second pyramid. The third pyramid at Giza is probably built during the reign of Khafre’s successor, Menkaure (2532–2503 BC). Egypt splits into several political units after the 94-year reign of Pepy II (6th Dynasty).


First intermediate period, 2160–2055 BC

9th and 10th Dynasties: 2160–2025 BC

First pharaoh: Khety (2160 BC–unknown)

Last pharaoh: Merykara (unknown–2025 BC)

11th Dynasty (Thebes only): 2125–2055 BC

First pharaoh: Intef I (2125–2112 BC)

Last pharaoh: Intef III (2063–2055 BC)

Key information:

Centralised power weakens during this period, and Egypt is ruled by two competing dynasties. One is based at Heracleopolis in the north, with the other based at Thebes in the south.


Middle Kingdom, 2055–1650 BC

11th Dynasty (all Egypt): 2055–1985 BC

First pharaoh: Mentuhotep II (2055–2004 BC)

Last pharaoh: Mentuhotep IV (1992–1985 BC)

12th Dynasty: 1985–1773 BC

First pharaoh: Amenemhat I (1985–1956 BC)

Last pharaoh: Queen Sobekneferu (1777–1773 BC)

13th Dynasty: 1773–after 1650 BC

First pharaoh: Wegaf (1773 BC–unknown)

Last pharaoh: Ay (exact dates unknown)

14th Dynasty: 1773–1650 BC 

Believed to be minor rulers whose reigns were contemporary with the 13th or 15th Dynasties

Key information:

Upper and Lower Egypt are reunified under Mentuhotep II (pictured below). Evidence indicates a shift in the pharaoh’s role as political and spiritual leader during this period, as well as changes in the organisation of society, religious beliefs, and relations with neighbouring peoples. During the 12th Dynasty a new capital is established, at Idj Tawy.


Second intermediate period, 1650–1550 BC

15th Dynasty: 1650–1550 BC

First pharaoh: Salitis/Sekerher (1650 BC–unknown)

Last pharaoh: Khamudi (exact dates unknown)

16th Dynasty: 1650–1580 BC

Theban rulers contemporary with 15th Dynasty

17th Dynasty: c1580–1550 BC

First pharaoh: Rahotep (c1580 BC–unknown)

Last pharaoh: Kamose (1555–1550 BC)

Key information:

Egypt is once again ruled by competing dynasties, with the north ruled by the Hyksos – descendants of people from western Asia who had settled in the eastern Nile Delta – who ally with rulers of Kerma in Nubia against the Egyptian 16th Dynasty, based in Thebes.


New Kingdom, 1550–1069 BC

18th Dynasty: 1550–1295 BC

First pharaoh: Ahmose I (1550–1525 BC)

Last pharaoh: Horemheb (1323–1295 BC)

 

Dr Nasri Iskander restoring the mummy of Ahmose I, c2006. (Image by Getty Images)

Ramesside Period, 1295–1069 BC

19th Dynasty: 1295–1186 BC

First pharaoh: Ramesses I (1295–1294 BC)

Last pharaoh: Queen Tausret (1188–1186 BC)

20th Dynasty: 1186–1069 BC

First pharaoh: Sethnakht (1186–1184 BC)

Last pharaoh: Ramesses XI (1099–1069 BC)

Key information:

Ahmose I drives the Hyksos from the Delta and reunites Egypt, ushering in nearly 500 years of political and economic stability. The following Ramesside period is a high point in Egyptian history, with great building projects – particularly under Ramesses II (1279–1213 BC) – and military conquests in Syria, Libya and Nubia.


Third Intermediate Period, 1069–664 BC

21st Dynasty: 1069–945 BC

First pharaoh: Smendes (1069–1043 BC)

Last pharaoh: Psusennes II (959–945 BC)

22nd Dynasty: 945–715 BC

First pharaoh: Sheshonq I (945–924 BC)

Last pharaoh: Osorkon IV (unknown–715 BC)

23rd Dynasty: 818–715 BC (contemporary with late 22nd, 24th and early 25th dynasties)

Pharaohs include: Pedubastis I, Takelot III, Iuput II

24th Dynasty: 727–715 BC

Known pharaoh: Bakenrenef (720–715 BC)

25th Dynasty: 747–656 BC

First pharaoh: Piy (747–716 BC)

Last pharaoh: Tanutamani (664–656 BC)

Key information:

The death of Ramesses XI in 1069 BC sees Egypt descend into some 400 years of politically divided rule, with various centres of power and a loss of control over Nubia (Kush) in the south, which is ruled by an independent dynasty in the mid-eighth century BC. In the late eighth century BC, the Kushite ruler Piy (whose victory stele is pictured below), invades Egypt and lays the foundations for the 25th Dynasty.


Late Period, 664–332 BC

26th Dynasty: 664–525 BC

First pharaoh: Psamtek I (664–610 BC)

Last pharaoh: Psamtek III (526–525 BC)

27th Dynasty (1st Persian Period): 525–404 BC

First pharaoh: Cambyses (525–522 BC)

Last pharaoh: Artaxerxes II (405–359 BC)

28th Dynasty: 404–399 BC

Only pharaoh: Amyrtaios (404–399 BC)

29th Dynasty: 399–380 BC

First pharaoh: Nepherites I (399–393 BC)

Last pharaoh: Nepherites II (c380 BC)

30th Dynasty: 380–343 BC

First pharaoh: Nectanebo I (380–362 BC)

Last pharaoh: Nectanebo II (360–343 BC)

2nd Persian Period: 343–332 BC

First pharaoh: Artaxerxes III Ochus (343–338 BC)

Last pharaoh: Darius III Codoman (336–332 BC)

Key information:

Egypt sees considerable turmoil with foreign powers threatening throughout the period. In 525 BC, the Achaemenid Persian empire invades for the first time but will be ousted by Alexander the Great in 332 BC.

A fragment from the ‘Alexander Mosaic’, c100 BC, showing Alexander the Great in battle against the Persian king Darius III. (Photo by Universal History Archive/UIG via Getty Images)

Ptolemaic Period, 332–30 BC

Macedonian Dynasty: 332–305 BC

First pharaoh: Alexander the Great (332–323 BC)

Last pharaoh: Alexander IV (317–310 BC, nominal ruler 310–305 BC)

Ptolemaic Dynasty: 305–30 BC

First pharaoh: Ptolemy I Soter (305–285 BC)

Last pharaoh: Ptolemy XV Caesarion (44–30 BC)

Key information:

After Alexander the Great’s death, rule passes to one of his generals, Ptolemy. Antony and Cleopatra’s defeat at Actium by Octavian (future Roman Emperor Augustus) in 31 BC is followed by the murder of Egypt’s last pharaoh, Cleopatra’s son Caesarion (pictured with his mother, below), the following year.

*The dates used in this feature are derived from The Oxford History of Ancient Egypt, edited by Ian Shaw (OUP, 2000). Please note that dates will vary between different sources.

This timeline was first published in the June 2022 issue of BBC History Revealed

Ancient history podcast episodes

Browse our archive of podcast episodes on ancient history, from the innovation of Roman Britain to the ghosts of ancient Mesopotamia. Scroll down for interviews with Mary Beard, Simon Schama, Natalie Haynes and Daisy Dunn…

50 important historical events: from Sutton Hoo to Rosa Parks
1

c2667-2648 BC: The first pyramid is built

Constructed around 4,700 years ago, the Step Pyramid of Djoser in Saqqara is not only the first of the Egyptian pyramids to be built, but the world’s oldest intact large-scale stone monument. It was designed as a tomb for the Third Dynasty Pharaoh Djoser, and was completed in his lifetime. Previous large structures in Ancient Egypt consisted of mud bricks; the time and care taken to stack and sculpt the stone suggests that Djoser had substantial finance and resources – as well as a huge workforce – to underpin the project. It became the prototype for the 80 or so pyramids subsequently built across the kingdom.

2

776 BC: The first Olympics are held

Although bearing little resemblance to the scale and the circus that the Olympic Games represent in the 21st century, the very first Olympics, held in the Greek sanctuary of Olympia, set the template for multisport competition for millennia to come. The contest quickly became a pageant of athletic endeavour, triumph and despair, with many events still recognisable today – in particular, running and boxing. There were also some disciplines that have fallen by the wayside: chariot-racing has rather gone out of vogue, while pankration – a strenuous, violent cross-breed of boxing and wrestling – is another now-extinct event.

The Olympics only became a global affair from 1896, after Pierre, Baron de Coubertin, set up the International Olympic Committee and resurrected the Games, which hadn’t taken place since the fourth century AD. In the original incarnation, the events were only open to male athletes from Ancient Greece’s city-states and colonies, although this wasn’t as restrictive as it might appear: at that point, the Greek world stretched westwards from modern-day Ukraine to Spain.

The Greeks light the first Olympic torch. (Image by Getty Images)
3

507 BC: Democracy is conceived in Athens

If Athens is the cradle of democracy, then Cleisthenes was its midwife. Despite being born into a less-than-democratic lineage (his maternal grandfather was the tyrant Cleisthenes of Sicyon), Cleisthenes the lawgiver was the architect of a new system of government, one that valued equality over patronage. Having displaced a pro-Spartan oligarchy, Cleisthenes undertook a radical reshaping of the Athenian constitution.

He sought to break up entrenched alliances and to reduce the power of aristocratic families, attempting to replace the status quo with a pan-Athenian worldview that united all strata of society. Under this mindset, the three regions of Attica (the peninsula that projects into the Aegean Sea) worked together to run the city, cutting across previous notions of clan. But while all citizens enjoyed equal rights, there was a glass ceiling. Only men were deemed to be citizens.

4

27 BC – AD 180 Pax Romana: Peace reigns over the Roman Empire

The Pax Romana describes a two-century period when the early Roman Empire was largely defined by peace and stability. Off the back of the Final War of the Roman Republic (32–30 BC), Rome’s new emperor, Caesar Augustus, successfully persuaded his subjects that peace was a more attractive option than costly back-to-back wars.

The success of Augustus’s worldview – one inherited and upheld by the following 16 emperors – led to a buoyant empire. Incomes rose across the Mediterranean, while there was a substantial uplift in trade with the Far East. The period ended with the death of Marcus Aurelius, the last of the so-called ‘good emperors’. The 3rd and 4th centuries AD descended into frequent warfare, transforming – in the words of the statesman Cassius Dio – “a kingdom of gold into one of iron and rust”.

5

AD 150: Ptolemy maps out the future

Even in the 2nd century AD, when much of the world was unexplored, the decision to compile and describe all geographical knowledge was a Herculean task. But the Greek mathematician Ptolemy shouldered it admirably. His eight-volume Geographia includes a lengthy gazetteer of locations, organised one country at a time, and a collection of maps, mainly regional in nature. The work was originally written in Greek, but translations made centuries later into Arabic and Latin meant future explorers were indebted to this pioneering cartographer.

Ptolemy’s world map stretches from Europe to Sri Lanka. (Image by Getty Images)
6

cAD 206-220: The invention of the compass

Without navigational aids like a map or compass, sailors were reliant on what they could actually see to determine their passage, whether that be earthly landmarks or celestial bodies. However, such techniques were useless on days or nights that were particularly foggy or cloudy. The compass – invented in China during the Han Dynasty and originally known as a ‘south-governor’ – would come to revolutionise navigation, but its original use was for something else entirely.

Originally made from lodestone, a naturally magnetised iron ore, it was first deployed for the purposes of feng shui – for instance, in determining which direction a new house should face. The compass wasn’t used as an instrument of navigation until around the 10th century, during the time of the Song Dynasty.

7

AD 618-907: Paper money is first used

The exact date that paper was first used in transactions isn’t known, but the practice began under China’s Tang Dynasty.

With merchants finding copper coinage heavier the richer they became, an alternative system was devised whereby they would deposit coins with a dependable third party who would issue a note – a credit note, effectively – outlining how much they were holding for the merchant.

When copper became harder to come by during the later Song Dynasty, paper transactions became more popular. Paper money wasn’t introduced to Europe until the 17th century, when a Swedish bank began to issue banknotes. However, within three years, the bank had gone bankrupt, having devalued the currency by over-printing.

8

AD 690: China’s only female emperor takes control

If the accounts of Wu Zetian’s life are to be believed, hers is a story of raw, naked political ambition that respected few moral boundaries. To what extent these chronicles are historically accurate, though, has to be measured through the prism of male observers considering the behaviour and impact of the only female emperor in Chinese history. To reach those lofty heights surely required a well-defined ruthless streak on her part, but she was often portrayed as the devil incarnate.

What isn’t disputed is the upward passage that Wu’s life took, how she scaled social strata to become the most powerful individual across the empire. Well-born and educated, she joined the Imperial household as a low-ranking concubine with domestic chores, far removed from real power. One day she managed to catch Emperor Taizong’s eye – apparently while changing his bedsheets. Upon his death in AD 649, she was sent to a Buddhist nunnery.

Wu was having none of that. She escaped and attempted to regain her position in the Imperial court. One particularly unsavoury story suggests that she murdered her own baby and blamed it on the new empress, the wife of Taizong’s successor (his ninth son, Gaozong). When the Empress was exiled, Wu took her place as Gaozong’s wife. Another story has her ordering the limbs of her rivals to be cut off, before they were left to drown in vats of wine.

When Gaozong suffered a severe stroke in AD 660, Wu became the court’s administrator – a highly powerful position. Gaozong died in AD 683, whereupon the couple’s son Zhongzong became emperor, although he reigned for just two months before Wu demoted him and sent him into exile. Zhongzong was succeeded by his brother Ruizong – he lasted six years before suffering the same fate as his sibling.

Wu then assumed power for herself, ruling as Empress Regnant for the next 15 years until AD 705. For someone so apparently ruthless, Wu’s impact on Chinese society was significant. China greatly expanded into Central Asia during her rule, while she also declared Buddhism to be the state religion, replacing Daoism.

A strong supporter of meritocratic success over hereditary privilege, Wu built up the Chinese education system to help facilitate this (the fact that she could read and write was one of the attributes that initially attracted Taizong to her), and oversaw a substantial growth in China’s agricultural output. Despite these achievements, it’s her supposedly cold-blooded ways that history remembers.

9

AD 723: A Buddhist monk harnesses time

We might regard it as a mechanical clock, but its inventor chose to name it a ‘Water-Driven Spherical Birds-Eye-View Map Of The Heavens’.

An eighth-century Chinese Buddhist monk called I-Hsing was that inventor, a keen mathematician and astronomer who aimed to combine the two disciplines with his creation. As its name suggests, this water-powered clock was designed to trace celestial activity, and it proved a relatively reliable timepiece, accurate to within 15 minutes a day. However, over time, the water began to corrode its metal components and, on subzero days, would freeze.

More than 200 years later, another Chinese astronomer, Zhang Sixun, rebuilt the device using mercury instead.

10

1088: Europe’s first university opens its doors

The first English-speaking university was the University of Oxford, which began accepting students in 1096. However, by this point the University of Bologna had already been in existence for eight years.

This particular seat of learning was born out of the system of mutual aid societies, or universitates scholarium, which operated around the Italian city at the time. Divided by nationality, these societies helped to protect foreign students from city laws which declared them to be culpable for the sins, debts and misdemeanours of their fellow countrymen. Each society engaged the services of scholars to teach its members such subjects as law, theology and the arts. When these mutual aid societies decided to group together through common purpose and interests, the larger association they formed was effectively a university.

The university wasn’t formed with academia in mind – the provision of teaching came later. (Image by Getty Images)

This collective approach strengthened the position of foreign students in Bologna, who also determined the appointment and pay of their teachers, as well as electing a student committee, known as the Denouncers of Professors, to evaluate teaching methods and content – and, if necessary, recommend fines and other punishments. When it became a royally chartered university in 1158, the students became responsible for professors’ salaries.

The university is still rated as one of the leading Italian academic institutions, and its alumni over the centuries have included Archbishop of Canterbury Thomas Becket, Renaissance poet Petrarch, astronomer Nicolaus Copernicus, and popes Alexander VI, Innocent IX, Gregory XIII and Gregory XV.

11

1215: Magna Carta is sealed

During the early years of the 13th century, King John was facing more than a little local difficulty. Considered to be one of the most disastrous kings England had ever known, he had raised taxes on the country’s barons in order to finance expensive overseas wars.

The barons revolted, and once they took control of London they forced John’s hand. Their demands were articulated and then negotiated over, resulting in the document known the world over as Magna Carta – Latin for ‘Great Charter’. In June 1215, at Runnymede on the banks of the River Thames, the two sides agreed and sealed this charter in an attempt to broker an uneasy peace.

Despite the apparent agreement, the mistrust between the King and the rebel barons continued to simmer, effectively rendering the accord redundant. However, the agreement had a more lasting legacy: while responding to specific demands from the barons, Magna Carta also set out a framework for political and societal reform.

Areas that it addressed included the rights of man, limitations on taxation levels, and the protection of Church rights. As such, it has become something of a prototype for written constitutions across the world ever since, in particular influencing the founders of new republics from the United States in the 18th century to the newly independent India in the 20th. King John’s successor, his son Henry III, renewed the charter when he ascended the throne, as did subsequent monarchs – at least until the embryonic English Parliament truly established itself as the legitimate balance on the power of the Crown.

Magna Carta also continues to  underpin existing notions of justice on these shores and beyond, most significantly the right to a fair trial. Its words still resonate: “No free man shall be … imprisoned … except by the lawful judgment of his peers and by the law of the land.”

12

1439: Gutenberg’s printing press rolls into action

More than half a millennium before Tim Berners-Lee unveiled the internet, another inventor had already revolutionised mass communication, rapidly accelerating the gathering and dissemination of knowledge and information.

Johannes Gutenberg was born in Mainz, Germany, apprenticed as a goldsmith, but turned his attention to publishing, with the intention of making copies of the Bible more widely available. “Through it,” he declared, “God will spread His Word.” To describe the Gutenberg printing press as profound is a chronic understatement. Until then, printing had been undertaken by hand, using wooden blocks. Gutenberg’s press used mechanical movable type, which effectively introduced mass production to the publishing process. It had a huge effect on European society.

Mass production meant more books printed, and thus more people having the chance to read them. The resulting spread of ideas and knowledge empowered those previously denied such access.

13

1492: Columbus arrives in the New World

When, in October 1492, the Italian explorer Christopher Columbus, sailing under the flag of the Spanish crown, landed on an island in the Bahamas, the future of the Americas and the Caribbean would never be the same. The find was accidental: Columbus wasn’t intending to discover the New World; he was trying to find a western trade route to the East Indies. Indeed, believing he’d reached his target destination, he named the indigenous population ‘Indians’. Columbus moved on to Cuba and Hispaniola, establishing a settlement on the latter (present-day Haiti), putting in process what would become the mass colonisation of the New World.

But while his arrival was a great leap in European exploration, it was to have a devastating impact on the indigenous populations he – and later settlers – encountered. Violence, slavery and disease are among the many sources of controversy associated with the 15th-century explorer.

14

1512: Michelangelo redecorates the Sistine Chapel ceiling

As art criticism goes, it’s difficult to imagine more complimentary words than those of the German writer Johann Wolfgang von Goethe when he evaluated Michelangelo’s extensive ceiling painting in the chapel of the Apostolic Palace in Vatican City. “Without having seen the Sistine Chapel,” Goethe wrote, “one can form no appreciable idea of what one man is capable of achieving.”

Commissioned by Pope Julius II and having taken four years to complete, the ceiling is indeed an extraordinary achievement, arguably the high-water mark of Renaissance art. But Michelangelo – better renowned at that point as a sculptor and already engaged in creating sculptures for the tomb of Pope Julius II – initially declined the invitation.

Eventually he acquiesced and set about working on a fresco based on nine scenes from the Book of Genesis, one that would eventually feature no fewer than 343 figures. Working on a self-designed scaffold, Michelangelo didn’t, as myth would have it, paint while lying on his back. He stood as he worked, but the conditions were still difficult and uncomfortable. He later wrote a poem that explained how his body was strained “like a Syrian bow” and that his loins “into my paunch like levers grind”. The everlasting glory of the finished work justified his pain though: “the fruit of squinting brain and eye”.

15

1610: Galileo spots Jupiter’s moons

By the 17th century, there was a general acceptance that Earth wasn’t flat, but a sphere. But the Aristotelian idea that our planet was the centre of the universe, around which all other planets revolved, still held sway. Then the Italian Galileo Galilei came along, brandishing his homemade telescope. Through it, he observed that Jupiter is orbited by four moons, just as Earth is orbited by our solitary Moon. The conclusion he drew, which encountered great scepticism, was that the planets revolved around the Sun.

Italian Galileo Galilei. (Image by Getty Images)

 

16

1628: William Harvey makes a bloody revelation

In 1628, in the pages of the snappily titled Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus, English physician William Harvey explained the purpose of the one-way valves found in the cardiovascular system: that they are evidence that the human heart propels blood around the body in a circulatory fashion. A handful of other medical scientists had made the observation before Harvey, but it was the depth and detail of his description of the process – gained from extensive dissections of animals – that ensured the kudos came his way, albeit with a delay of around 20 years.

William Harvey explained the purpose of the one-way valves found in the cardiovascular system. (Image from Getty Images)
17

1687: Isaac Newton announces his findings about gravity

In 1687, a prolific British mathematician and astronomer called Isaac Newton published a book that would shape thinking about the cosmos for the next 200-plus years. His Philosophiæ Naturalis Principia Mathematica (aka Mathematical Principles of Natural Philosophy) set out his thoughts about gravity, tides and the movement of planets, all but confirming the heliocentric school of thought – that the Solar System’s planets revolve around the Sun.

Newton’s findings would become the dominant worldview for more than 200 years, until Albert Einstein and his theory of relativity came along, although the Englishman remained modest about how he had recalibrated people’s thinking. “If I have seen further than others,” he once confessed, “it is by standing upon the shoulders of giants.”

18

1773: The first published African American poet

In 1761, a young girl from West Africa – who had been sold into slavery – was bought by the Wheatley family of Boston. She was named Phillis by her new owners and, unusually for the time, taught to read and write.

Wheatley was freed shortly after her poems were published. (Image by Getty Images)

Noticing Phillis’s appreciation of and aptitude for poetry, the family actively encouraged her vocation. When, in 1773, Phillis’s anthology Poems on Various Subjects, Religious and Moral was published in London (Boston publishers having declined to do so), the response was affirmative. “When we consider them as the productions of a young, untutored African, who wrote them after six months careful study of the English language,” trumpeted The London Magazine, “we cannot suppress our admiration for talents so vigorous and lively.”

19

1776: The United States commits to human rights

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

As the 13 soon-to-be-former British colonies drew up the Declaration of Independence that gave birth to what would become one of the most powerful nations on Earth, they ensured that the preservation of rights would be a key tenet of the new country’s constitution.

A committee of five men drafted the Declaration, two future presidents – John Adams and Thomas Jefferson – among them. (Image by Getty Images)

Indeed, US historian Joseph Ellis has suggested the declaration’s most famous sentence to be “the most potent and consequential words in American history”. While its application was far from watertight (slavery in the US wasn’t abolished for another 89 years, and US women wouldn’t get the vote until 1920), this commitment did offer the blueprint for an advancement of human rights.

Thirteen years later, after its own revolution, France adopted the Declaration of the Rights of Man and of the Citizen, a document which greatly influenced the opening up of democracy across Europe. In 1948, the UN General Assembly passed Resolution 217, otherwise known as the Universal Declaration of Human Rights, a code confirming an individual’s rights, which 50 of its 58 member states had voted to uphold (the other eight abstained). The spirit of 1776 lived on.

20

1796: Edward Jenner administers the first vaccine

In 1980, smallpox became the first major disease to be eradicated. And it was all down to one 18th-century doctor from rural Gloucestershire. In 1796, using a local boy as his guinea pig, he tested an old piece of folklore: that if you’d caught cowpox, you couldn’t then be infected with smallpox. Rubbing cowpox pocks on the boy’s arm, Edward Jenner witnessed that, while the boy did come down with the lesser disease, he became immune to the much more dangerous smallpox. Taking its name from vacca, the Latin word for ‘cow’, vaccination transformed global death rates. It’s believed that the work of no other single person in the world has saved as many lives as that of Jenner.

21

1792: Mary Wollstonecraft publishes seminal book on women’s rights

“Would men but generously snap our chains, and be content with rational fellowship instead of slavish obedience, they would find us more observant daughters, more affectionate sisters, more faithful wives, more reasonable mothers – in a word, better citizens.”

Wollstonecraft’s reputation was almost destroyed by a posthumous biography. (Image by Getty Images)

The most famous work of proto-feminist and author Mary Wollstonecraft, A Vindication of the Rights of Woman, was a radical manifesto for its time, published in 1792, five years before her tragically early death at the age of 38. The importance of both Wollstonecraft and her writings was damaged by posthumous accounts of her premarital affairs and her illegitimate first daughter, but she was later hailed as a guiding spirit for the suffragist movement at the turn of the 20th century.

22

1811-12: Mary Anning makes an historic discovery

In geological circles, the name of Mary Anning is sainted. At least, it is now. She was a 19th-century fossil collector and palaeontologist who, on the beaches of Dorset’s Jurassic Coast, uncovered the first-ever skeleton of an ichthyosaur, an extinct marine reptile. This was but one of her achievements, but throughout her life she was denied the praise that her science demanded – and which she would have been afforded had she been a man.

23

1821: Faraday builds the first primitive electric motor

In 1821, a young, self-taught scientist from London called Michael Faraday made a breakthrough that modern civilisation would become dependent upon. Building on the discovery of electromagnetism by Danish physicist Hans Christian Ørsted, Faraday built two instruments that produced what he dubbed ‘electromagnetic rotation’. He had, in effect, invented the electric motor, an achievement that had eluded his mentor, the celebrated inventor and chemist Sir Humphry Davy.

Faraday’s achievement cannot be overstated, as confirmed by this tribute by the physicist Ernest Rutherford: “When we consider the magnitude and extent of his discoveries and their influence on the progress of science and industry, there is no honour too great to pay to the memory of Faraday, one of the greatest scientific discoverers of all time.”

24

1822: The Rosetta Stone is translated

Discovered in Egypt in 1799, the Rosetta Stone was not fully deciphered until 23 years later – and with this came the unlocking of hitherto mysterious Egyptian hieroglyphics.

The stone (known as a stele) bears inscriptions in three scripts: Greek, Egyptian demotic and hieroglyphics. Only when the first two were translated did it become apparent that they were saying the same thing: a decree issued in 196 BC on behalf of Ptolemy V. Thus it was correctly assumed that the hieroglyphics were a third transcription, allowing French scholar Jean-François Champollion to decode them.

25

1829: Stephenson’s Rocket wins the Rainhill trials

George Stephenson, engineer of the soon-to-open Liverpool and Manchester Railway, needed to prove to its directors that steam locomotives would be the best source of power for the line’s trains, rather than using stationary steam engines to pull the trains by cables. The Rainhill Trials, held over a mile of track in Lancashire, were contested by five locomotives, but only one – Stephenson’s own Rocket – made it to the finish. Although not the very first steam locomotive, the Rocket’s engineering made it the prototype for the locomotives that followed. The rail revolution could begin.

The Rocket won Stephenson the right to build locomotives for the new railway. (Image by Getty Images)
26

1854: Japan renews trade with the west

After two centuries of isolation, Japan warmed to the idea of opening its borders in 1853. With the US and China already enjoying extensive trade, US Commodore Matthew Perry sailed to Japan with a four-strong fleet and, endowed with “full and discretionary powers” by his Secretary of State, employed intimidatory tactics to get Japanese agreement. Such gunboat diplomacy worked. Perry returned the following year, whereupon Japan signed the Treaty of Kanagawa, which opened its ports to US ships.

27

1833: Britain passes the Abolition of Slavery Act

There’s some irony that the country that was once most active in the slave trade was also the one that led the campaign to outlaw slavery and servitude. In the last decade of the 18th century, 80 per cent of Britain’s foreign income came from the triangular route that the slave trade had established – British goods going to Africa to buy slaves; slaves being transported to the West Indies; cotton, sugar and tobacco coming back to Blighty. But this was also a time when abolitionist sentiment was starting to percolate.

The impetus came from the anti-slavery committees of the Quakers, who presented a petition to Parliament in the early 1780s. A few years later, MP William Wilberforce was asked to make representations for the cause from his seat in the House of Commons. Researching further into the subject, he declared that he “thought himself unequal to the task allotted to him, but yet would not positively decline it”. In fact, his name would be forever synonymous with abolitionism.

In 1807, the Slave Trade Act was passed by Parliament, prohibiting the buying and selling of slaves on British soil, but not slavery itself. That came 26 years later, in 1833, when the Abolition of Slavery Act became law. Certain caveats ensured the legislation wasn’t as absolute as it might have been. Not only were certain parts of the British Empire exempt, but only slaves aged six and under were officially freed. The remainder were classified as ‘apprentices’, with their emancipation staggered and delayed (although this clause was removed five years later). Slave owners were also paid generous amounts for the loss of their ‘property’. Twenty million pounds was set aside to recompense them, a figure equating to 40 per cent of Britain’s annual income at the time.

However flawed, the passing of the Act effectively freed around 800,000 slaves across the empire. It also marked an acceleration of worldwide anti-slavery feeling, though US President Abraham Lincoln wouldn’t make his Emancipation Proclamation for another 30 years.

28

1859: Charles Darwin publishes On the Origin of Species

“Whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.” Charles Darwin had developed his theory of evolution – that species adapt and evolve over time as a process of natural selection – during his travels in the 1830s, specifically to the Galápagos Islands.

But it would be 20 years before he published it in On The Origin Of Species. There was a reason why he had delayed. His ideas were in contravention of the creationist explanations of the natural world that were dominant at the time – a domination not unconnected to religious benefactors underwriting the work of scientists. Had Darwin published his theory as soon as he had shaped and sanded it, he would have been at the mercy and probable ridicule of the scientific community.

In the end, Darwin’s hand was forced. Another naturalist, Alfred Russel Wallace, had sent him a short overview of his research: Wallace’s theories mirrored Darwin’s own. He hurriedly edited his manuscript and published. The book’s reception from religious quarters was predictably scathing, but some of his scientific brethren stood in his corner, themselves liberated by Darwin’s bravery.

29

1861: Louis Pasteur publishes his germ theory

Throughout much of the 19th century, the dominant thinking about how disease was transmitted was explained by miasma theory. This explanation held that serious diseases, such as the plague and cholera, weren’t passed between people but were the result of some form of air pollution – specifically the apparently poisonous mist produced by decomposing natural matter. Whether you were struck down with a particular disease was thus determined not by who you had interacted or fraternised with, but by your location and the poor hygiene to be found there.

That miasma theory was superseded by another school of thought – germ theory – was largely down to one man: Louis Pasteur. The French biologist believed that microorganisms (germs) too small to be visible were responsible for causing disease. By invading a host’s body and reproducing, these germs increase the chance of disease taking hold.

Pasteur also found renown for creating the first rabies and anthrax vaccines. (Image by Getty Images)

The theory wasn’t Pasteur’s own. It had been proposed in the mid-16th century by Italian physician Girolamo Fracastoro. A couple of hundred years later, it was expanded upon by the Viennese Marcus von Plenciz. He believed that different organisms caused different diseases, but was unable to prove it.

That was where Pasteur came in. During the mid-19th century, he undertook a series of experiments in an attempt to put the link between germs and disease beyond question. One of them concerned the fermentation of beer and wine; he proved that these didn’t spoil as a result of spontaneous generation but because of active bacteria. He discovered the same was true of milk.

Not only did Pasteur find the cause, he also provided the solution. The process that would bear his name – pasteurisation – involved heating beverages to a temperature that would kill off microorganisms, thus making them safer to consume and prolonging their shelf life. For this achievement alone, Pasteur has been saluted as the father of microbiology.

30

1869: DNA is identified

When the Swiss biochemist Friedrich Miescher embarked on a pursuit to isolate the protein found in white blood cells, he instead encountered a substance with properties very unlike those of the protein he was researching.

He had effected the first purification of what he named ‘nuclein’ – what we now know as deoxyribonucleic acid, or DNA. Miescher believed his discovery to be an important one, although he remained unsure as to the exact function of nuclein. Initially, the scientific community didn’t take too much notice of it either, and it wasn’t until the last decade of the 19th century that nuclein’s hereditary properties began to be understood.

The German biochemist Albrecht Kossel successfully isolated and named the five compounds that provide molecular structure to DNA and in 1910 was awarded the Nobel Prize for Physiology or Medicine for his pioneering work in cell biology. Other scientists picked up the DNA baton. By the 1940s, the Canadian-American physician Oswald Avery and his colleagues identified that DNA was “the transforming principle” in genetics.

Their work inspired others to research further and deeper. In 1950, Erwin Chargaff made the discovery that DNA was species-specific, and two years later Rosalind Franklin advanced our understanding further still. Her high-resolution photographs of DNA were extraordinary, and pushed Franklin towards a belief that DNA took a helical structure. However, she was beaten to confirming the double helix structure of DNA by James Watson and Francis Crick in 1953. Along with their colleague Maurice Wilkins, they won the Nobel Prize for Physiology or Medicine in 1962. Despite her photographs being key to their breakthrough, Franklin wasn’t saluted with a share of the honour. Nor was the name of Friedrich Miescher remembered.

31

1893: New Zealand gives women the vote

During the 1990s, a new face replaced that of Queen Elizabeth II on the New Zealand ten-dollar note. It was that of another Englishwoman, albeit one who spent almost all her adult life in the southern hemisphere. This woman was Kate Sheppard, a figure largely unknown in the country of her birth, but whose actions and influence were felt right across the world.

Sheppard was the leading suffragist in New Zealand, a woman whose reasoned public speaking and writings – in publications such as Ten Reasons Why the Women of New Zealand Should Vote – successfully swung opinion towards universal suffrage. After a series of mass petitions had been collected by Sheppard and her fellow campaigners, on 19 September 1893, New Zealand governor Lord Glasgow signed the new Electoral Act into law. With neither the UK nor the US extending the vote to women until the other side of World War I, New Zealand blazed the trail, becoming the first self-governing nation to allow women to vote in parliamentary elections.

32

1898: The Curies discover Polonium and Radium

Marie Curie’s contribution to science is huge. In 1903, she became the first woman to be awarded a Nobel Prize; eight years later, she won her second. Marie shared the first with her husband Pierre, with whom she undertook pioneering work in radioactivity, and Antoine Henri Becquerel. In 1898, the Curies discovered two new elements – polonium and radium, both of which are more radioactive than uranium. Marie’s correct assumption was that radioactive rays could treat, reduce and even eradicate tumours, and her name remains synonymous with cancer treatment today.

33

1911: Roald Amundsen reaches the South Pole

The achievement of Norwegian polar explorer Roald Amundsen, who led the first expedition to reach the South Pole, has been somewhat overshadowed by the tragic tale of Captain Robert Scott. The Englishman’s own party, believing themselves to be the first to the Pole, arrived there in January 1912, only to be welcomed by the sight of a Norwegian flag. Amundsen and his men had beaten them by little more than a month. On their retreat, Scott and his four dejected compatriots perished, their bodies not found until the following November.

Scott’s party arrived at the Pole only to find the Norwegian flag planted by Amundsen and his expedition. (Image by Getty Images)

Following their successful mission, Amundsen’s party began their 11-week return journey, arriving in Hobart, Tasmania, in early March. He immediately despatched telegrams to inform the world of their achievement. And the world was impressed. King George V sent a congratulatory telegram, even though Amundsen reached the Pole ahead of his own subjects, as did former US President Theodore Roosevelt. Having broken the news, Amundsen then set about supplying the Daily Chronicle newspaper in London, which had bought exclusive rights, with the full story of the expedition – even if his had been notably less eventful than Scott’s devastating journey.

Some quarters were less than generous with their praise. Sir Clements Markham, the famous geographer, cast doubt on Amundsen’s news, huffily declaring: “We must wait for the truth until the return of the Terra Nova [which was Scott’s ship].” Ernest Shackleton, no stranger to Antarctica as a member of previous Scott expeditions, didn’t share Markham’s disdain, announcing that Amundsen’s achievement made him “perhaps the greatest polar explorer of today”.

34

1922: Howard Carter opens the tomb of Tutankhamun

“Can you see anything?” “Yes, wonderful things!” This famous snippet of conversation – between financier Lord Carnarvon and the Egyptologist Howard Carter – announced one of the greatest discoveries in archaeological history: the tomb of the pharaoh Tutankhamun. Here, in Egypt’s Valley of the Kings, Carter was peering, by candlelight, through a gap in an excavated doorway, and was met by the sight of extensive golden treasures twinkling back at him. The following three months were spent cataloguing the finds of this antechamber, before work moved into the burial chamber, revealing the sarcophagus of King Tut himself. It proved to be the best-preserved tomb ever found in the Valley of the Kings.

35

1928: Penicillin is discovered

“When I woke up just after dawn on 28 September 1928,” Alexander Fleming later admitted, “I certainly didn’t plan to revolutionise all medicine by discovering the world’s first antibiotic, or bacteria killer. But I suppose that was exactly what I did.” The Scottish microbiologist was only thinking about his impending holiday when he absentmindedly left an amount of Staphylococcus bacteria on a tray in his lab. On his return, he noticed that a patch of mould had stopped the bacteria’s spread. He realised that a substance in the mould, which Fleming called penicillin, had antibiotic qualities that could stem the spread of chronic infections. The lives subsequently saved are countless.

36

1930: Gandhi’s Salt March

When a slight, bespectacled man named Mahatma Gandhi strode out on a protest march one spring morning in 1930 in the Indian state of Gujarat, the reason for his discontent was clear and precise. He and the 78 protestors accompanying him were voicing their dissent over the tax levied by the British rulers on the Indian population for the purchase of salt. But the march took on a deeper significance: it was the first major example of non-violent direct action against colonial rule.

The 240-mile march was in protest at the British Raj’s weighty tax on salt. (Image by Getty Images)

The 24-day march grew in size and number as it made its passage towards the Arabian Sea. By journey’s end, Gandhi encouraged civil disobedience by suggesting Indians prepare their own salt, an act that was prohibited by law. The tide was turning. Seventeen years later, Indian independence was declared.

37

1939: The Sutton Hoo ship burial is discovered

In the late 1930s, self-taught amateur archaeologist Basil Brown accepted an invitation from a widowed landowner who wished to learn the secrets of the mysterious barrows that dotted her land.

What was subsequently found in the Suffolk soil hugely expanded what little had, until that point, been known about Anglo-Saxon society. The greatest discovery of all was undoubtedly the remarkably intact ship from around the early 7th century, 80 feet in length.

What the ship had been buried with was also of high value and plentiful, including an exceedingly well-preserved ceremonial helmet.

38

1945: WW2 finally comes to an end

When Prime Minister Neville Chamberlain announced in a solemn radio address on 3 September 1939 that Britain was at war with Germany, there was none of the flag-waving patriotism of August 1914.

Instead, the British people – many of whom had lived and fought through the horrors of World War I – were mostly resigned to the fact that Adolf Hitler, and his aggressive form of German territorial expansion, needed to be stopped.

The road to Allied victory was far from inevitable and the German army proved to be an efficient and effective fighting force. But a combination of events – from the US’s entry into the war in 1941, to D-Day (the largest seaborne invasion in history) in June 1944 – saw the conflict enter its endgame in April 1945. On 7 May 1945, Germany’s unconditional surrender was signed in Rheims and the following day – known as Victory in Europe (VE) Day – was celebrated as the war’s official end in Europe.

“This is not victory of a party or of any class. It’s a victory of the great British nation as a whole”, Chamberlain’s successor Winston Churchill announced in his VE Day address from the balcony of the Ministry of Health in London. But while the streets of Britain erupted in celebration in the wake of the German surrender, war continued to rage in the Far East as Imperial Japanese forces fought the Allies for control of eastern Asia and the western Pacific.

On 6 August, following continued Japanese refusals to surrender to the Allies, an atomic bomb was dropped on the city of Hiroshima, killing an estimated 140,000 people, 70,000 immediately and the remainder from the effects by the end of 1945. A few hours later, US President Harry S Truman again requested Japan’s surrender, stressing that the alternative was “a rain of ruin from the air, the like of which has never been seen on this Earth”.

Three days later, a second atomic bomb was dropped, this time on the city of Nagasaki, wreaking mass destruction on its civilian population. Japanese Emperor Hirohito ordered the Supreme Council for the Direction of the War to accept the Allies’ terms. On 15 August 1945, Truman declared the day Victory over Japan (VJ) Day, signalling the end of the war.

“Our hearts are full to overflowing, as are your own. Yet there is not one of us who has experienced this terrible war who does not realise that we shall feel its inevitable consequences long after we have all forgotten our rejoicings today”, said King George VI during his address to the nation and empire on VJ Day. Out of the blood and destruction of the six-year conflict, peace had finally been achieved, but at a terrible cost to human life. Figures vary, but up to 80 million lives were lost over the course of the conflict.

39

1954: The first four-minute mile

Sporting history was made on 6 May 1954, at a modest Oxford running track, when a junior doctor by the name of Roger Bannister did something no other human had ever managed before: to run a mile in under four minutes.

Aided by pacemakers Chris Chataway and Chris Brasher, and roared on by a 3,000-strong crowd, Bannister had undertaken very little training prior to the race but forged on to break the tape in a new world record. Not that the large crowd actually heard the official time. The cheers drowned out announcer Norris McWhirter. All they heard, and all they needed to hear, was: “A European record of three…”

Bannister’s triumph didn’t last: his record of three minutes and 59.4 seconds was beaten after 46 days. (Image by Getty Images)

 

40

1955: Rosa Parks refuses to give up her seat

“When that white driver stepped back toward us, when he waved his hand and ordered us up and out of our seats, I felt a determination cover my body like a quilt on a winter night.” In the early evening of Thursday 1 December 1955, a seamstress named Rosa Parks boarded a bus in downtown Montgomery, Alabama, on her way home after a long day at work. She took her place in the first row of the ‘colored’ section of the bus, located beyond the seating reserved for white passengers. As the bus continued its journey, it became increasingly busy, with a couple of white passengers being forced to stand as all the seats in their section were taken. Noticing this, driver James F Blake stopped the bus, walked down the aisle and moved the sign that marked the ‘colored’ section.

Rosa Parks rides on a Montgomery bus on 21 December 1956 – the first day of integrated seating. (Image by Bettmann/Getty Images)

He then ordered the four African-American passengers in that first row to move further down the bus in order that the standing white passengers could take those seats. After a brief stand-off, three of the African-Americans did as requested. Parks refused. In fact, she simply moved to the window seat of that row. Blake then threatened to call the police. “You may do that” was her calm but defiant response.

Arrested and charged with a violation of Montgomery’s city code, Parks was tried and found guilty the following Monday, by which time the seeds of a city-wide bus boycott had been sown. The boycott would last for a year and became a pivotal moment in the rise of the Civil Rights Movement, with the protest only ending when the US Supreme Court declared that Montgomery’s segregated buses were unconstitutional.

Despite the campaign’s success, Parks – at least in the short term – didn’t fare too well. Having lost her job as a seamstress, and facing continued harassment, she moved to Detroit in 1957 in order to find work. Congress would come to call her “the mother of the freedom movement”.

While history remembers Parks, she wasn’t the first Montgomery citizen to refuse to give up her seat to a white passenger. Nine months earlier, 15-year-old Claudette Colvin had made a similar act of defiance (“History had me glued to the seat,” she later said). But, with the unmarried teen becoming pregnant within the year, civil rights leaders chose not to promote her as a figurehead for the movement.

41

1960: The world’s first elected female head of government takes office

Sirimavo Bandaranaike didn’t formally enter the political arena until she was 54. When her husband, the Ceylon Prime Minister Solomon Bandaranaike, was assassinated in 1959, she stood for election the following year and became the first elected female head of government anywhere in the world.

Sirimavo Bandaranaike. (Image by Getty Images)

A dutiful wife until that point, Bandaranaike proved to be a formidable politician, serving three terms (1960–65, 1970–77 and 1994–2000) while adhering to a doggedly socialist agenda during some of the country’s most tumultuous years. Arguably her greatest achievement was overseeing the island’s transformation from British dominion Ceylon to independent republic Sri Lanka in 1972.

Bandaranaike was 84 when she resigned from her final term of office, and she died just two months later. Her pioneering legacy and inspiration for other female politicians lived on; at the time of her death, her daughter Chandrika Kumaratunga was serving as the country’s president.

42

1961: The first human goes into space

“Nothing will stop us. The road to the stars is steep and dangerous. But we’re not afraid.” Yuri Alekseyevich Gagarin wasn’t afraid on that April morning in 1961 when his Vostok spacecraft was launched into the (largely) unknown. More than three years after Laika the dog had been sent out of the Earth’s atmosphere, Gagarin completed a single orbit of our planet before, after 108 minutes, returning to Earth and touching down in Russia via parachute.

One of Gagarin’s qualifications was his height: the capsule was tiny, and he was only five feet and two inches tall. (Image by Getty Images)

During re-entry, Gagarin whistled the tune The Motherland Hears, The Motherland Knows, a song that contains the lines “the motherland hears, the motherland knows, where her son flies up in the sky.” For the time being, the Soviet Union was ahead in the Space Race.

43

1963: Martin Luther King’s “I have a dream” speech

It was one of the iconic speeches of the 20th century, one that saw how – as the writer Jon Meacham has noted – “with a single phrase, Martin Luther King Jr joined Jefferson and Lincoln in the ranks of men who have shaped modern America”. Delivered before an estimated crowd of 250,000 at the March On Washington For Jobs And Freedom in August 1963, the speech defined an era in US history. It was a poetically worded, brilliantly delivered demand for long-overdue freedom and equality.

Martin Luther King led the Montgomery Bus Boycott. (Image by Getty Images)

King never got to see his dream come true: he was assassinated in Memphis, Tennessee, on 4 April 1968. A week later, the Civil Rights Act 1968, which had been making slow progress in Congress, was rushed through the legislature and immediately signed into law by President Lyndon Johnson.

44

1967: The UK decriminalises homosexuality

During the so-called Summer of Love, a significant piece of legislation was passed by the House of Commons: the Sexual Offences Act 1967, which decriminalised private sexual acts between two consenting men over the age of 21.

The Act put gay rights both on the statute books and high on the political agenda, but it didn’t represent a tide of liberalism. The bill faced great opposition in Parliament. In the House of Lords, the Earl of Dudley voiced his disapproval of gay men. “Prison is much too good a place for them,” he said.

The fight for gay rights continued through PRIDE events. (Image by Getty Images)

Nor did the Act offer parity with heterosexual citizens. One of the bill’s co-sponsors, the Earl of Arran, said of gay men that “any form of ostentatious behaviour now or in the future, or any form of public flaunting, would be utterly distasteful”.

45

1969: Apollo 11 lands men on the moon

On a steamy July day in 1969, those gathered in the control room of what is now the Johnson Space Center in Houston, Texas, held their collective breath.

Hearts were pounding. Brows, perspiring. More than 380,000 kilometres away, close to the surface of the Moon, the object of their concern and anticipation – a strange-looking spacecraft named Eagle – was possibly in difficulty. Alarms were sounding from its in-flight computer as the crew attempted to land it amongst the strewn boulders of the Moon’s Mare Tranquillitatis. Fewer than 30 seconds’ worth of fuel remained.

The tension among NASA’s ground staff in the control room was absolute and unbearable, but eight words from mission commander Neil Armstrong punctured that anxiety. “Houston, Tranquility Base here. The Eagle has landed.”

The immediate response from one of the controllers back at base said it all. “You got a bunch of guys about to turn blue. We’re breathing again. Thanks a lot.”

As heroic as it sounds, “The Eagle has landed” wouldn’t be the most-quoted statement Armstrong would make that day. This he reserved for the moment at which he planted the first human foot on the loose lunar surface. “That’s one small step for man,” he was heard saying down an understandably crackly line, “one giant leap for mankind.” The 650 million TV viewers who were tuned in at home could forgive him for slightly fluffing his lines; he should have said “one small step for a man”. Armstrong later maintained he had said it. Armstrong and his colleague Buzz Aldrin then spent a couple of hours exploring the lunar surface, which the latter described as “magnificent desolation”. Before embarking on the return leg of their journey, the pair planted a US flag into the rocky ground, as well as affixing a plaque to one of the legs of the soon-to-be-abandoned Eagle: “Here men from the planet Earth first set foot upon the Moon. July 1969 AD. We came in peace for all mankind.”

“All mankind” might be debatable. There was a definite political edge to the US’s determination to put a man on the Moon, with the accelerating Space Race being a key (and conspicuous) tenet of the Cold War. Just a month after the Soviets successfully propelled Yuri Gagarin into space to take the advantage, US President John F Kennedy delivered his ‘moonshot’ speech to Congress, outlining his vision of landing men on the Moon and returning them to Earth “before this decade is out”. For him, the US needed to be the leading party in conquering this final frontier, these uncharted waters. “Only if the United States occupies a position of preeminence,” he observed during another speech, this one in September 1962 at Rice University in Houston, “can we help decide whether this new ocean will be a sea of peace or a new, terrifying theatre of war.”

Kennedy was also driven by the idea of creating history, of titanic accomplishment. “We choose to go to the Moon in this decade and do the other things, not because they are easy, but because they are hard; because that goal will serve to organise and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one we intend to win.”

The subsequent Apollo programme, which ran until 1972, consisted of both manned and preparatory unmanned missions. It wasn’t an unqualified success. In January 1967, the Apollo 1 mission ended in tragedy when a fire in the command module during a launch rehearsal killed the three-strong crew. Three years after the tragedy, and nine months after the successful Apollo 11 mission, the explosion of an oxygen tank on its outward journey denied the crew of Apollo 13 the opportunity to land on the Moon. Their safe passage back to Earth was a dramatic, touch-and-go affair.

For those first men on the Moon, their short walk was a profound one. Buzz Aldrin later recalled the experience of gazing back at his home planet. “From the distance of the Moon, Earth was four times the size of a full Moon seen from Earth. It was a brilliant jewel in the black velvet sky. Yet it was still at a great distance, considering the challenges of the voyage home.” The third member of the Apollo 11 mission, Michael Collins, never got to feel moondust under his feet. His experience was seen through the window of the command module Columbia, orbiting solo around the Moon while Armstrong and Aldrin got to stretch their legs. He would report that he was neither lonely nor disappointed by this, detailing his emotions as being “awareness, anticipation, satisfaction, confidence, almost exultation”.

But was this extraordinary achievement by these three men actually an achievement? Did the 1969 Moon landing really happen? Conspiracy theorists, seeking a new cause célèbre six years after John F Kennedy’s assassination, poured scorn on the idea that science was able to accomplish a feat as far-fetched as landing a spacecraft on this distant natural satellite.

These doubters believed NASA falsified the landings, filming fake footage to trick people into believing that the Space Race had been won. While up to a fifth of US citizens continue to subscribe to this notion half a century later, substantial third-party evidence has been produced to debunk the theory, including subsequent photographs showing the tracks made by various Apollo crews, as well as the flags that each mission left behind. The Apollo missions were far more than flag-planting, strength-showing exercises.

After Armstrong and Aldrin set foot on the lunar surface, ten more astronauts did likewise over the following three-and-a-half years as five further missions successfully reached their destination. They returned to Earth with the data gathered from extensive experiments – both geological and meteorological – along with an accumulated 382 kilograms of rock samples. But did their findings justify the stratospheric expense, the $25.4 billion outlay that was reported to Congress in 1973?

When Kennedy had announced the Apollo programme, his predecessor in the White House, Dwight Eisenhower, had dismissed it as “just nuts”. But the country wasn’t with old Ike. They were dreaming.

As Andrew Smith, author of Moondust: In Search Of The Men Who Fell To Earth, points out: “For one decade, and one decade only, Americans appeared happy, even eager, to place their trust and tax dollar on the collection plate of big government and its scientist priests”. And they got what Kennedy had promised them. Footprints on the Moon. And one giant leap.

46

1974: The Terracotta Army is unearthed

When, one March morning in 1974, farmers in the Chinese province of Shaanxi began digging a well, they had no idea what that day’s endeavours would uncover. Their spades encountered an extraordinary haul: life-sized terracotta figures of soldiers that had been buried to ‘protect’ the body of China’s first emperor, Qin Shi Huang, upon his death in around 210 BC.

The army was made to protect Qin Shi Huang in the afterlife. (Image by Getty Images)

The scale of what lay beneath was extraordinary, with estimates putting the number of ‘soldiers’ at around 8,000. But the figures weren’t just soldiers. Also buried with the emperor were 130 chariots, 520 horses and 150 cavalry horses, along with non-military figures such as public officials, musicians and acrobats. Qin Shi Huang’s tomb formed part of a much wider necropolis, one estimated by surveys to cover an area of almost 38 square miles – an extraordinary insight into life and death in the Qin dynasty.

47

1989: The fall of the Berlin Wall

From the first days of its construction in 1961 until its demolition at the turn of the 1990s, the Berlin Wall was a dominating feature of a city divided. It kept the West German-administered West Berlin separate from East Berlin, governed by the German Democratic Republic (aka East Germany). Governments on either side saw the wall differently: in the East, its official title was the Anti-Fascist Protection Rampart. Willy Brandt, the West Berlin mayor, labelled it the “wall of shame”.

Although not impregnable (an estimated 5,000 East Berliners successfully managed to cross into the West), around 20 times that number tried and failed. Thanks to the East Germans’ shoot-to-kill policy, around 200 escapees lost their lives in search of freedom. The Cold War began to thaw after Mikhail Gorbachev took power in the Soviet Union in 1985, and a new world order started to emerge. Global politics was coming out of the deep freeze. With it came a heightened discontent that the wall continued to divide Berlin.

Germans celebrate atop the wall on 12 November 1989. (Image by Getty Images)

In 1987, US President Ronald Reagan made a speech at the city’s Brandenburg Gate that contained a direct appeal to the Soviet Premier, a man whose policy of glasnost aimed to open up his country to outside influence. “General Secretary Gorbachev,” said Reagan, “if you seek peace, if you seek prosperity for the Soviet Union and Eastern Europe, if you seek liberalisation, come here to this gate. Mr Gorbachev, open this gate. Mr Gorbachev, tear down this wall!”

Six days earlier, David Bowie had held a concert near the wall in West Berlin. The music had been heard on the other side of the divide, prompting anti-wall rioting. The following summer, in an apparent attempt to assuage and placate its younger citizens, the East German government allowed Bruce Springsteen to play a show in East Berlin. Speaking in German, Springsteen announced that he held “the hope that one day all the barriers will be torn down”.

The pressure wasn’t just cultural. Across the Eastern Bloc, the Iron Curtain was fraying. Along with Gorbachev’s reforms in the USSR, the 1989 Polish elections had ousted its communist regime, while the Hungarian government started pulling down fences along its border with Austria. This prompted many East Germans to leave for the West via Hungary and, later, via Czechoslovakia.

To stem the tide, on 9 November 1989, East Germany made an abrupt announcement: that the gates of the wall’s border crossings – hitherto only accessible to foreigners – would be flung open for all to pass through that very evening.

That night was one of the most joyous in German history, with West Berliners climbing on top of the wall to mingle with those from the eastern side of the city. The guards had put down their guns and the most visible division between Eastern and Western Europe was now rendered meaningless. While formal demolition didn’t commence until the following year, citizens on both sides of the divide hacked away at the structure, both for souvenirs and for deeply symbolic reasons. The process reached its denouement in 1990 when Germany was formally reunified after 45 years.

48

1991: The World Wide Web is launched

In 1989, after noticing that the large number of global scientists he was working with at CERN (the large particle physics laboratory near Geneva, Switzerland) were having difficulty sharing information easily, Oxford graduate and software engineer Tim Berners-Lee developed an idea that would revolutionise the way we communicate.

His proposal was for a “hypertext project” called “WorldWideWeb”, which would enable “browsers” to view a “web” of “hypertext documents”, rather than logging onto a different computer every time they wanted to access new information.

Essentially, what Berners-Lee was suggesting was an application that uses the Internet (conceived in the late 1960s) to share information such as videos and text. By the end of 1990, the first web page had been served on the open Internet, and in 1991, this new web community was made available to people outside of CERN. The World Wide Web had been spun.

49

1994: Nelson Mandela becomes South Africa’s first black president

“Our country has arrived at a decision. Among all the parties that contested the elections, the overwhelming majority of South Africans have mandated the African National Congress to lead our country into the future. The South Africa we have struggled for, in which all our people – be they African, Coloured, Indian or White – regard themselves as citizens of one nation is at hand.”

On 10 May 1994 – four years, two months and 29 days after taking slow, deliberate steps to freedom on his release from jail, where he had been held for nearly three decades – Nelson Rolihlahla Mandela was sworn in as the first black president in South Africa’s history, after a life dedicated to fighting the Apartheid system.

Initially arrested, charged, tried and jailed in 1962 for inciting workers’ strikes and leaving the country without permission, Mandela was charged the following year with sabotage and conspiracy to violently overthrow the government. At the Rivonia Trial, Mandela delivered an extraordinary, three-hour speech from the dock. It closed with a chilling confirmation of his commitment to the cause of black majority rule. “I have cherished the ideal of a democratic and free society in which all people will live together in harmony,” he told the packed courtroom. “It is an ideal for which I am prepared to die.”

Supporters crowd to see Mandela during a 1994 election rally in Durban. (Image by Getty Images)

The starkness of his words resonated around the world, with even the United Nations calling for Mandela and his fellow accused to be released. Instead, they were found guilty and incarcerated; Mandela would spend 18 of his 27 prison years on the infamous Robben Island, in a damp cell with a straw mat for a bed. All the while, the global clamour for his release continued.

The Special AKA released the anthem ‘Free Nelson Mandela’, while the occasion of his 70th birthday in 1988 was marked by a concert at Wembley that drew an estimated global audience of 600 million.

Mandela had been offered his release in 1985 in return for denouncing violence as a political tool; he refused to leave jail while the African National Congress (ANC) political party remained banned. When FW de Klerk became president four years later, an unconditional release became a very real prospect.

On his release in 1990, Mandela began negotiations for a multiracial general election. The electorate would eventually return him as president, with the ANC taking 62 per cent of the vote.

This was a transformative era in what had been one of the world’s most conflicted countries – a land where black citizens had been denied a voice at the polling booth for generations. The last paragraph of Mandela’s “I am prepared to die” speech is now written on the wall of South Africa’s Constitutional Court building in Johannesburg.

 

50

1998: The Good Friday Agreement is signed

The Good Friday Agreement was a crucial turning point in the ongoing peace process in Northern Ireland. Following the decades of bloody conflict known as The Troubles, the two-part multilateral accord – the first signed by almost all Northern Ireland’s political parties, the second by the Irish and British governments – devolved political power to a new Northern Ireland Assembly and ended direct rule from Westminster.

Irish PM Bertie Ahern (left) and British PM Tony Blair (right) were signatories. (Image by Getty Images)

Confirmed by large majorities in referenda held on both sides of the Irish border, the agreement came into force the following year. As a mark of the agreement’s significance, its two main architects – David Trimble of the Ulster Unionist Party and John Hume of the Social Democratic and Labour Party – were jointly awarded the 1998 Nobel Peace Prize “for their efforts to find a peaceful solution to the conflict in Northern Ireland”.

Nige Tassel writes about sport and popular culture as both a journalist and author

This article first appeared in the August 2019 issue of BBC History Revealed

Roman Britain: what six objects can reveal about life for women at Hadrian’s Wall https://www.historyextra.com/period/roman/hadrians-wall-what-life-like-roman-women-britain-artefacts/ Wed, 24 Aug 2022 08:19:31 +0000 https://www.historyextra.com/?p=214370

As is so often the case when exploring the past, we have a clearer picture of the lives of men – particularly Roman soldiers – on Hadrian’s Wall than of the women who lived alongside them. Yet artefacts found on the frontier reveal much about the varied experiences and backgrounds of women in the society that emerged in Britain’s military zone.

Here, Bronwen Riley highlights six such objects…

1

The tombstone of a slave who married her master

An intriguing example of what an artefact can reveal about the lives of women at Hadrian’s Wall is an elaborate gravestone of the mid-to-late second century AD, from the cemetery outside the Roman fort of Arbeia (South Shields). It depicts a woman, dressed in all her finery, sitting with her spindle on her lap. The Latin inscription beneath this picture of domestic comfort and industry tells us this is Regina, a freedwoman and member of the Catuvellauni tribe, wife of Barates from Palmyra (Syria), who died at the age of 30.

Beneath, a line in Palmyrene laments: “Regina, freed woman of Barates, alas!” This tombstone reveals the complex and ambiguous nature of relationships in the Roman world, particularly in Britain, on the empire’s north-western frontier.

Here is a woman from the peaceful south of Britain – the Catuvellauni tribe were based around Verulamium, near modern-day St Albans – who was enslaved, and bought by a man from Syria. At the time of her death, he had granted her freedom and regarded her (in Latin, at least) as his wife. However, neither Regina nor Barates was a Roman citizen, so any form of marriage they may have contracted would not have been recognised under Roman law.

Barates may have been a merchant, and the style of the gravestone suggests that it was the work of a Palmyrene craftsman, implying the existence of a Syrian community in Arbeia. How did this enslaved southern British woman end up on Hadrian’s Wall? She may have been born into slavery or sold into slavery by her family, or was perhaps an orphan or foundling.

Regina is Latin for “queen”, so hers is likely a slave’s nickname – ironic or admiring, perhaps a mixture of both. Did the trader who sold her call her that, or was it Barates’ name for her?

Perhaps 10,000 men were stationed in the military zone on and around Hadrian’s Wall. It may have been thought politic to take women from outside the local area rather than from tribes nearby.

What is left of Regina? Her face is lost and we cannot tell how she wore her hair. She wears a Gallic coat over a tunic – fashionable in both Britain and Gaul. One possible mark of a British identity is the torc around her neck. Regarded as typically Celtic, torcs were worn under Roman occupation, but designed in new styles and materials.

There is a tendency to see in this tombstone a love story – that woeful “Alas!” of Barates who freed and married his “Queenie”. But the Palmyrene text is no more than the equivalent of “RIP” and reinforces Barates’ identity rather than Regina’s.

2

An altar that reveals the only priestess known by name in Britain

Diodora was a priestess in Corbridge, a busy Roman town on the banks of the Tyne 2.5 miles south of Hadrian’s Wall, situated at an important crossing point over the river at the intersection of key roads. Diodora’s name – and the Greek inscription she wrote on the altar that she dedicated to Herakles of Tyre – suggests that she came from the eastern part of the empire, where Greek was the predominant language.

The cult she served as priestess was rather niche, originating in the port city of Tyre in the province of Syria, and was one of several eastern religions for which there is evidence at Corbridge. Worship of the cult may have been brought to northern Britain by soldiers or merchants (such as Barates of Syria, husband of Regina, discussed above), or by troops who had become devotees of the god while serving in the eastern part of the empire in the 160s AD during the Parthian Wars.

The earliest known temples at Corbridge date from this period, when the fort became a supply base for the northern frontier, with detachments of legionaries stationed here. Naturally, such a depot attracted traders and merchants of all sorts.

How, from where and for what reason Diodora arrived in Corbridge and became a devotee of the god remains a mystery. Nor is it clear if she served the cult as a career priestess, or if her title was honorary – a sort of social distinction with a few light ceremonial duties attached. Priests and priestesses could be elected or sometimes inherited their roles; in some regions, they could buy their offices.

Women likely had a limited role in traditional Roman religion. They were largely excluded from playing any significant part in religious public life, with very few exceptions – for example, the Vestal Virgins (priestesses of the goddess Vesta) or in supporting roles to their husbands. They seem to have been forbidden to butcher animals or handle undiluted wine, both key components in animal sacrifice. Nor could they invoke prayers on behalf of a community.

By contrast, in Greek religion priestesses sometimes had key public responsibilities and were given honours and duties on a par with priests. So the appeal of many eastern or fringe cults that offered women more scope for participation – Christianity included – is understandable.




3

A discarded sandal: fast fashion for the elite

Among other extraordinary organic material found at Vindolanda is a rather exquisitely decorated sandal, stamped with the name of its Gallic maker, Lucius Aebutius Thales. It would have been worn as a house shoe, over a bare foot or with a slotted sock. It seems to have been discarded after the toe thong snapped, with no sign of any attempts at repair – indicating that its owner had access to other fine footwear.

We do not know who it belonged to but, because it was found in the commanding officer’s house, it is tempting to imagine Lepidina wearing it as she read one of Severa’s letters.

4

An invitation to a birthday party on Hadrian’s Wall

Claudia Severa’s letters to her friend Sulpicia Lepidina are among the most personal of more than 2,000 found at Vindolanda, where Lepidina’s husband was the prefect of the ninth cohort of Batavians.

In her letters, Severa invites her friend to celebrate her birthday, eagerly plans visits, and exchanges news about her husband and children. Severa dictated her letters to a secretary but wrote the affectionate endings herself: “I will hope for you, sister. Farewell, sister, my dearest soul, as I hope to prosper, and hail.” In another fragment she describes Lepidina as her desideratissima (most longed-for) soul.


Written at the beginning of the second century, Severa’s letters represent the earliest known examples of a woman’s handwriting in Latin. Her sign-offs may express the warmth and exuberance of her character but perhaps also reflect the intimate way women of her background customarily addressed each other (“sister” does not necessarily imply they were related). One also detects in them a sense of isolation, and Severa’s longing for the company of other women of her own age and background.

Severa appears to have had an affectionate relationship with her husband. However, as wives the women were subject to their husbands, and clearly needed their permission to travel. Living at a military base on the empire’s remote north-western frontier added numerous other constraints to these women’s lives. The roads would have been in a terrible state at certain times of the year, and there may have been restrictions in place for travelling around the frontier zone; in any case, the women would have needed some sort of escort.

Until AD 197, rank-and-file soldiers could not contract legal marriages. This did not preclude soldiers from having relationships or families in the settlements that inevitably grew up around the forts. Inside the forts, though, only the prefects and centurions could officially have their families with them, which must have greatly constrained the women’s social lives.

Two women with wax tablet and stylus from a fresco from Herculaneum. Surviving letters from Vindolanda show that officers’ wives stationed at forts along the wall corresponded affectionately (Azoor Photo / Alamy Stock Photo)
5

A spindle whorl to remind us of women’s labour

Spinning wool was seen as women’s work – the essence of female domesticity. Even women of the highest status were expected at least to supervise their household in the task. Spinning was a crucial domestic chore because a constant supply of thread was needed for weaving into cloth.

Beyond the immediate requirements of the individual household, there were also thousands of soldiers on the wall to be clothed. Britain was highly prized for the quality of its wool, and its blankets and cloaks were also exported.

Wooden spindles rarely survive, but spindle whorls – made from stone, bone, jet, lead or ceramics – are commonly found on archaeological sites. Acting as weights that added momentum to the spindle’s rotation, whorls varied in size depending on the type of wool and form of yarn required. This amber spindle whorl was found in the settlement that grew up around Old Penrith, a fort at the centre of key trade routes on the road to Carlisle.

6

How a young girl’s gravestone is evidence of the persistence of indigenous beliefs along the frontier

How sad Sudrenus must have been as he commissioned the gravestone for his daughter Ertola, “properly called Vellibia”, who “led the happiest of lives for four years and 60 days”.

Having died in the late third or early fourth century, Ertola was buried at Corbridge, a town that seems to have attracted a sizeable indigenous population: more Celtic stone heads have been found here than at any other site in the north. The little girl is depicted on the gravestone clutching a round object, commonly interpreted as a ball. The names Sudrenus and Ertola are Celtic, as is the decoration of the gravestone, far removed in style from the classical tradition.

A Roman girl would often take a feminine form of her father’s name (so a daughter of Claudius Severus could be called Claudia Severa), but Celtic girls’ names seem to have been less rigidly prescribed. The name Cartimandua (queen of the northern Brigantes tribes), for example, means “white filly”.

Before the occupation, British women seem to have enjoyed greater freedom than their Roman counterparts. Though we hear about the British queens Cartimandua and Boudicca only through a deeply prejudiced Roman filter, accounts show that they were independent rulers who owned property, led armies into battle and divorced their husbands.

Under Roman law, girls could be legally married from the age of 12, but there is evidence that in Britain girls married later. There are also persistent reports that British women had much freer sexual relationships with men. Outside the immediate Roman sphere of the towns and forts, British women may have remained subject to local rather than Roman law in terms of marriage or inheritance; the Romans generally thought it too inflammatory to meddle with such rules unless absolutely necessary.

Ertola’s gravestone provides an intimate insight into the grief of a father for his young daughter. It’s also a poignant reminder that women of all ages and backgrounds – and from all parts of Britain and the empire – lived, loved, worked and died along Hadrian’s Wall.

Bronwen Riley is a classicist and author, whose latest book is Journey to Britannia: From the Heart of Rome to Hadrian’s Wall, AD 130 (Head of Zeus, 2022)

This article was first published in the August 2022 issue of BBC History Magazine

Dragons: from mythological beasts of history to the fire breathers of fantasy https://www.historyextra.com/period/ancient-history/dragon-history-origin-evolution-mythology/ Sun, 21 Aug 2022 14:05:34 +0000 https://www.historyextra.com/?p=213286

Every small child of the modern west can describe a dragon: it is a broadly serpentine creature (colour of choice: green); it has an animalian head; between its longish neck and longish tail it has a fattish body; it has four legs; it has a pair of wings; it can be somewhat spiky. Adults might add the further observation that the dragon of this shape is a thing of beauty. The internet is awash with contemporary fantasy images of the creatures, lovingly tricked out in elaborate detail.

And this points up a paradox: although the function of dragons is to be creatures of ultimate terror, we just love them. Who cares about St George and his damsel in distress? It’s the dragon that makes the legend. And are the dragons not the cherries in the cakes baked by JRR Tolkien, JK Rowling and George RR Martin?

But the universality of this dragon-shape across the west should not blind us to the fact that it is artificial, a random collection of body parts drawn from different creatures of the natural world. So where does this amalgamation come from? How did the dragon so familiar to modern fans of fantasy fiction come to be?

Daniel Ogden gets up close to the six evolutionary stages of the dragon…

1

The massive snake dragons of antiquity

To start answering the question of where the ubiquitous image of the dragon came from, we must first return to classical Greece. The word “dragon” derives, via medieval French, from the Latin draco, which is itself a borrowing of the ancient Greek term drakōn.

So what was a drakōn? Its basic form was that of a snake of enormous proportions, and it’s worth remembering that this is the creature that lies at the heart of all dragons. (The convention of referring to dragons as “serpents” helps us to bear this in mind.) There was one curious exception to their pure-snake form, however: they sported beards. These seem to have been markers not of their sex but of their supernatural nature (beards being attached to males and females alike). Already, like our modern dragons, they were fiery, this being an imaginative extrapolation of the burning sensation caused by viper venom.

One such creature was the Dragon of Ares. When the hero Cadmus needed some pure water to make a sacrifice as he founded the city of Thebes, he sent his men to the spring of Dirce. But the spring was guarded by this terrible dragon, which destroyed them. It was now up to Cadmus to redress the situation. He took on the dragon himself and slew it – and then he hacked out the dragon’s teeth and sowed them.

Soon, these teeth had seeded a race of armed warriors: they were known as the Spartoi, the “Sown Men” (nothing to do with Sparta!). These provided Cadmus with his first generation of Thebans. In a mysterious final twist, both Cadmus and his wife, Harmonia, were reputed to have been transformed into dragons at the end of their lives. Perhaps this was a divinely organised compensation for his killing of the Dragon of Ares.

We find several similarly serpentine dragons elsewhere in Greek myth: Python, the Dragon of Delphi, slain by the god Apollo; the Dragon of Colchis, guardian of the golden fleece stolen by Jason; Ladon, the Dragon of the Hesperides; and the famous Hydra, slain by Heracles. She was of the same form as these other dragons, save that she was, of course, multi-headed – and multi-bearded!

2

The great creature lurking beneath the waves

A sea monster depicted in a mosaic from third-century BC Italy. This dragon-like figure may have been inspired by the humble seahorse (Photo by Alamy)

In another myth from classical Greece, the hero Perseus, flying home on his winged sandals after decapitating the Gorgon Medusa, passed over Joppa (Jaffa) and saw a beautiful girl pinned out on a sea-cliff below. This was the princess Andromeda.

Andromeda’s mother, Cassiepeia, had boasted foolishly that she herself was more beautiful than the Nereids (the nymphs of the sea). In anger, they had prevailed upon Poseidon, god of the sea, to send a sea monster (kētos) to ravage Joppa. King Cepheus, Andromeda’s father, learned from the Oracle of Ammon that the only way to bring an end to the creature’s depredations was to sacrifice his daughter to it.

On learning all this, Perseus struck a quick deal with Cepheus that, if he delivered Andromeda from the sea monster, he could take her in marriage. Perseus duly defeated the monster, either with his distinctive scythe-shaped sword, or with the super-weapon he had ready to hand, the head of Medusa, with which he was able to petrify it into a rock-formation.

This sea monster took the form familiar from ancient art: a massive, serpentine creature, but with a central body more bulbous than a snake’s; a rather dog- or horse-like head; a prominent pair of fore-flippers, which tended to mutate into clawed legs in the Roman era; a fish-tail; and spikes on its head and along the ridge of its back.

The origin of this mysterious configuration is lost in the mists of time: it may have originated in part in a gargantuan inflation of the innocent seahorse. At any rate, it can be seen at once that the creatures that have enchanted readers of Harry Potter and The Hobbit resemble more closely the classical sea monster than the snake-like classical dragon.

3

A locust-spewing beast becalmed by a Christian angel

For much of the classical Greek era, the dragon and the sea monster were regarded as distinct creatures. But, by the second century AD, the two had begun to merge in the western mind.

Evidence for this is provided by the Shepherd of Hermas, a Christian text (composed c130–50 AD), in which Hermas reports a number of visions he has experienced. In one, he relates how, as he was walking down the road to Campania in southern Italy, he saw a dust-cloud approach. Out of this a terrible beast charged at him with such force that it could have destroyed a city. From its mouth poured a stream of fiery locusts.

It made for a terrifying sight, but as the beast approached Hermas, it stretched itself out on the ground and let its tongue loll out. Hermas was then met by a lady in white, an embodiment of the church, who explained to him that, because of his faith, the Lord had sent the angel Thegri to bind (metaphorically) the beast’s mouth.


The description of the creature makes it clear that it is a dragon, as does the surrounding imagery: the church in the form of a lady and the binding angel are clear nods to the great dragon of the Book of Revelation, which attacks the church in the form of a parturient woman and is bound by the archangel Michael.

However, Hermas does not call the creature a “dragon” (drakōn), but a “sea monster” (kētos). This merging of leviathan and land-lubbing beast explains why the modern dragon boasts an animalian head, a bulbous central body – and, of course, the spikiness of a sea monster.

4

The dragon takes flight

In the Acts of Philip (written in the late fourth century AD), the superhero apostle takes on a dizzying array of dragons in and around Ophiorhyme (“Snake Town”). This is a pagan city identified with Hierapolis, now in Turkey, presided over by the wicked Echidna or “Viper Goddess”.

Philip has a team to help him that wouldn’t be out of place in a modern franchise like Guardians of the Galaxy. These include St Bartholomew, Mariamne the cross-dressing nun, and a leopard and a goat endowed with human speech. In one episode, Philip forces some problematic demons to come forth from the rocks beneath which they lurk. They initially manifest themselves in the form of 50 dragons, each 90 feet in length, and they are presided over by an even vaster dragon of 150 feet in length, covered in soot and belching forth fire and venom.

Philip compels them to build a church for him. At his behest, they are transformed into the shape of winged humans (the familiar shape for a demon), and each flies off in this form to bring back a column for the building.

Our anonymous author leaves it unclear whether the demons’ base-form is that of a dragon or that of a winged human. But it seems that for him the two entities belong fully together, two sides of the same coin. So here we see a further step on the road to the development of the modern dragon: this is the source of the creature’s wings.

5

The fire-starting, tail-swishing image is almost complete

The next stage in our supernatural journey consists not of a tale, but rather of a vignette. The Gospels of Hincmar is a beautifully illustrated manuscript of the later ninth century AD, now in the Bibliothèque Municipale in Reims. The book opens with the so-called Eusebian canons, lists of the parallel episodes across the four gospels, laid out in fine, architectural tables surmounted by pediments.

A range of playful serpents and dragons of different forms perch on and scamper over these pediments, but in among them are a pair of creatures of the following shape: an animalian head and bulbous body, with legs in the fore-flipper position (these courtesy of the sea monster); a pair of wings (these courtesy of humanoid, winged flying demons); and a coiling snake-tail (this, almost alone now, courtesy of the classical dragon).

The pair reassure us of their fundamentally dragon nature by blowing out blasts of fire in the direction of a harmless bird perching on the apex of the pediment.

These two creatures are fully formed modern dragons. Well, almost: all they’re missing are their back legs. As such, they might be designated more particularly as “wyverns” – the technical term for a winged, two-legged dragon.

6

Standing on its own legs: the dragon of today

A dragon depicted in a Persian scientific book (Photo by Universal History Archive/Universal Images Group via Getty Images)

Between the wyvern and the more familiar form of the dragon we know today, there stands only the second pair of legs. Examples of four-legged dragons are found in illustrated manuscripts and paintings of the saints from at least the beginning of the 12th century, but it is difficult to demonstrate that these are significant for the future of the creature: artists often like to freestyle with the forms of their dragons.

The real four-legged revolution falls at the turn of the 15th century. It is from c1400 that four-legged varieties come to predominate in representations of the Revelation dragon, confined to the abyss by St Michael. The same is true of images of the dragon of Lasia, slain by St George, and also of images of the dragon that swallows St Margaret of Antioch, only to have her burst forth from its belly (which permits this virgin martyr to become the patron saint of childbirth).

A good example is to be found in a fine Italian painting, now in New York’s Metropolitan Museum, dated to c1405. Here, the archangel Michael brandishes his sword over the supine body of the Revelation dragon: it has a coiling tail and a pair of wings, but is otherwise broadly crocodilian in configuration, not least in its four stumpy legs.

It is those legs that propelled the dragon along the final steps of its journey – via flightless serpent, spiky sea monster and flying fire-breather – to the creature we know and love today.

Daniel Ogden is professor of ancient history at the University of Exeter. His latest book is The Dragon in the West: From Ancient Myth to Modern Legend (Oxford University Press, 2021)

This content first appeared in the September 2022 issue of BBC History Magazine
