Majesty & A Fall from Grace: The Lives & Times of the Latter-Day Windsors – Part Two: Borders, Brexit & Bombings, 2002-17.

The Twin Causes of ‘Brexit’ – Migration & Asylum:

In 2015, David Cameron and the Conservatives surprisingly won an overall majority of MPs and were able to form a government by themselves. In their manifesto, they had promised to renegotiate the terms of Britain’s membership of the EU and then ask the electorate in a referendum whether they accepted these terms and wished Britain to remain in the EU, or whether they wanted to leave.

Now it was not De Gaulle or even the Brussels ‘Eurocrats’ who were asking the question, but the British Prime Minister, David Cameron, and the ‘Brexiteer’ Conservatives in his cabinet and on the back benches. The people themselves had not asked to be asked, but when they answered at the 2016 Referendum, they decided, by a very narrow majority, that they preferred the vision (some would say a blurred one) of a ‘global’ Britain to the ‘gold-card benefits’ available at the European table where it was already sitting. The Tory rebels’ ‘tenacious attachment’ to ‘bloody-minded liberty’ led, among other factors, to them expressing their desire to detach themselves from the European Union, though it was by no means clear whether they wanted to remain semi-detached or move to a detached property at the very end of the street, one which had not yet been planned, let alone built. All they had was a glossy prospectus of what might or might not be delivered, or even be deliverable.

An internet poster from the 2016 Referendum Campaign

Looking back to 2002, the same year in which Simon Schama published his BBC series book, The Fate of Empire, the latest census for England and Wales was published. Enumerated and compiled the previous year, it showed the extent to which the countries had changed in the decade since the last census was taken. Douglas Murray, in the first chapter of his book, The Strange Death of Europe, first published in 2017, challenged his readers to imagine themselves back in 2002, speculating about what England and Wales might look like in the 2011 Census. Imagine, he asks us, that someone in our company had projected:

“White Britons will become a minority in their own capital city by the end of this decade and the Muslim population will double in the next ten years.”

How would his readers have reacted in 2002? Would they have used words like ‘alarmist’, ‘scaremongering’, ‘racist’ and ‘Islamophobic’? In 2002, a Times journalist made far less startling statements about likely future immigration, which David Blunkett, then Home Secretary, denounced (using parliamentary privilege) as bordering on fascism. Yet, however much abuse they received for saying or writing it, anyone offering this analysis would have been proved absolutely right at the end of 2012, when the results of the 2011 Census were published. These showed that only 44.9 per cent of London residents identified themselves as ‘white British’. They also revealed far more significant changes: the number of people living in England and Wales who had been born overseas had risen by nearly three million since 2001, and nearly three million people in England and Wales were living in households where not one adult spoke English or Welsh as their primary language.

Parish Churches like the one shown above, in Framlingham, Suffolk, though picturesque, had dwindling congregations of mainly elderly worshippers.

These were major ethnic and linguistic changes, but there were equally striking findings on changing religious beliefs. The Census statistics showed that adherence to every faith except Christianity was on the rise. Since the previous census, the proportion of people identifying themselves as Christian had declined from seventy-two per cent to fifty-nine per cent. The number of Christians in England and Wales dropped by more than four million, from thirty-seven million to thirty-three million. While the Churches witnessed this collapse in their members and attendees, mass migration assisted a near doubling in the number of Muslims: between 2001 and 2011 the Muslim population of England and Wales rose from 1.5 million to 2.7 million. While these were the official figures, it is possible that they were underestimates, because many newly-arrived immigrants, not yet having a registered permanent residence, might not have filled in the forms at the beginning of April 2011 when the Census was taken.

The London Boroughs

The two local authorities whose populations were growing fastest in England, by twenty per cent in the previous ten years, were Tower Hamlets and Newham in London, and these were also among the areas with the largest non-response to the census, with around one in five households failing to return the forms. Yet the results of the census clearly revealed that mass migration was in the process of altering England completely. In twenty-three of London’s thirty-three boroughs (see map above) ‘white Britons’ were now in the minority. A spokesman for the ONS regarded this as demonstrating ‘diversity’, which it certainly did, but by no means all commentators saw it as something positive or even neutral. When politicians of all the main parties addressed the census results, they greeted them in wholly positive terms.

This had been the ‘orthodox’ political view since 2007, when the then Mayor of London, Ken Livingstone, spoke with pride about the fact that thirty-five per cent of the people working in London had been born in a foreign country. For years a sense of excitement and optimism about these changes in London and the wider country seemed the only appropriate tone to strike. This was bolstered by the sense that what had happened in the first decade of the twenty-first century was simply a continuation of what had worked well for Britain in the previous three decades. This soon turned out to be a politically-correct pretence, though what was new in this decade was not so much growth in immigration from Commonwealth countries and the Middle East, or from war-torn former Yugoslavia, but the impact of white European migrants from the new EU countries, under the terms of the accession treaties and the freedom of movement provisions of the single market.

Besides the linguistic and cultural factors, there were important economic differences between the earlier and the more recent migrations of Eastern Europeans. After 2004, young, educated Polish, Czech and Hungarian people moved to Britain to earn money to send home, or to take home with them in order to acquire relatively cheap, good homes, marry and have children in their rapidly developing countries. And for Britain, as the host country, the economic growth of the 2000s was fuelled by the influx of energetic, skilled and talented people who, working, for example, in the NHS, were also denying their own countries their much-needed skills for a period. But the UK government had seriously underestimated the number of these workers who wanted to come to Britain. Ministers suggested that the number arriving would be around twenty-six thousand over the first two years. This turned out to be wildly wrong, and in 2006 a Home Office minister was forced to admit that since EU expansion in 2004, 427,000 people from Poland and seven other new EU nations had applied to work in Britain. If the self-employed were included, he added, then the number might be as high as 600,000. There were also at least an additional thirty-six thousand spouses and children who had arrived, and twenty-seven thousand child benefit applications had been received. Even if most of these turned out to be temporary migrants, they still needed to find houses, schools and various services, including healthcare, for the period of their stay.

It has to be remembered, of course, that inward migration was partially offset by the annual outflow of around sixty thousand British people, mainly permanent emigrants to Australia, the United States, France and Spain. By the winter of 2006-07, one policy institute reckoned that there were 5.5 million British people living permanently overseas, nearly ten per cent of Britons, or more than the population of Scotland. In addition, another half a million were living abroad for a significant part of the year. By 2016, the number of ex-pats was estimated to have grown to over ten million. Aside from Europe, the Middle East and Asia were seeing rising ‘colonies’ of expatriate British. A worrying proportion of them were graduates; Britain was believed to be losing one in six of its graduates to emigration. Many others were retired or better-off people looking for a life in the sun, just as many of the newcomers to Britain were young, ambitious and keen to work. Government ministers tended to emphasise these benign effects of immigration, but their critics looked around and asked where all the extra people would go, where they would live, and where their children would go to school, not to mention where the extra hospital beds, road space and local services would come from, and how these would be paid for.

Members of the campaign group Citizens UK hold a ‘refugees welcome’ event outside Lunar House in Croydon. Photograph: John Stillwell/PA

A secondary issue to that of ‘numbers’ was the system for asylum seekers. In 2000, there were thirty thousand failed asylum seekers in the United Kingdom, a third of those who had applied in 1999, a year in which only 7,645 had been removed from the country. It was decided that it was impossible to remove more, and that to try to do so would prove divisive politically and financially costly. Added to this was the extent of illegal immigration, which had caught the eye of the British public. There were already criminal gangs of Albanians and Kosovars, operating from outside the EU, who, in the eyes of many, were undermining the legal migration streams from Central-Eastern Europe. The social service bill for these ‘illegal’ migrants became a serious burden for the Department of Social Security. Towns like Slough protested to the national government about the extra cost of housing, education and other services.

In addition, there was the sheer scale of the migration and the inability of the Home Office’s immigration and nationality department to regulate what was happening: to prevent illegal migrants from entering Britain, to spot those abusing the asylum system in order to settle in Britain, and to apprehend and deport people. Large articulated lorries filled with migrants, who had paid over their life savings to be taken to Britain, rumbled through the Channel Tunnel and the ferry ports. A Red Cross camp at Sangatte, near the French entrance to the ‘Chunnel’, was blamed by Britain for exacerbating the problem. By the end of 2002, an estimated 67,000 had passed through the camp to Britain. The then Home Secretary, David Blunkett, finally agreed a deal with the French to close the camp down, but by then many African, Asian and Balkan migrants, believing the British immigration and benefits systems to be easier than those of other EU countries, had simply moved across the continent and waited patiently for their chance to jump aboard a lorry to Britain.

Successive Home Secretaries from Blunkett to John Reid tried to deal with the trade, the latter confessing that his department was “not fit for purpose”. He promised to clear a backlog of 280,000 failed asylum seekers who were still in the country after five years. The historic Home Office was split up, creating a separate immigration and nationality service. Meanwhile, many illegal immigrants had succeeded in bypassing the asylum system entirely. In July 2005, the Home Office produced its own estimate of the number of these arrivals over the previous four years. It reckoned that this was between 310,000 and 570,000, or up to one per cent of the total population. A year later, unofficial estimates pushed this number up to 800,000. The truth was that no one really knew, but official figures showed the number applying for asylum was now falling, with the former Yugoslavia returning to relative peace. Thousands of refugees were also being returned to Iraq, though the signs were already apparent that further wars in the Middle East and the impact of global warming on sub-Saharan Africa would soon send more disparate groups across the continents.

To begin with, the arrival of workers from the ten countries that joined the EU in 2004 was a different issue, though it involved an influx of roughly the same size. By the government’s own figures, annual net inward migration had reached 185,000 and had averaged 166,000 over the previous seven years. This was significantly more than the average net inflow of fifty thousand New Commonwealth immigrants which Enoch Powell (pictured below) had referred to as ‘literally mad’ in his 1968 Rivers of Blood speech, though he had been criticising the immigration of East African Asians, of course. But although Powell’s speech was partly about race, colour and identity, it was also about the numbers of immigrants and the practical concerns of his Wolverhampton constituents in finding hospital and school places in an overstretched public sector. These concerns persisted, though they were largely ignored by senior politicians.

Enoch Powell

It seems not unreasonable, and not at all racist, to suggest that it is the duty of the central government to predict and provide for the number of newcomers it permits to settle in the country, though until 2015 this was largely left to local authorities, which were already struggling with centrally-imposed austerity cut-backs. In 2006, projections based on many different assumptions suggested that the UK population would grow by more than seven million by 2031. Of that increase, eighty per cent would be due to migration. The organisation Migration Watch UK, set up to campaign for tighter immigration controls, said this was equivalent to requiring the building of a new town the size of Cambridge each year, or five new cities the size of Birmingham over the predicted quarter century.

But such characterisations were, in fact, caricatures of the situation, since many of these new Eastern European migrants did not intend to settle permanently in the UK and could be expected to return to their countries of origin in due course. This eventually came to pass, after the UK finally left the EU in January 2020 and during the subsequent Covid-19 pandemic. However, before that happened, the massive underestimation of the scale of the inward migration was, of course, predictable to anybody with any knowledge of the history of post-war migration, which is replete with vast underestimates of the numbers expected. But it did also demonstrate that immigration control was simply not a priority for New Labour or the Con-Libs. They gave the impression that they regarded all immigration control, and even discussion of it, as inherently ‘racist’, which made any internal or external opposition to it hard to voice. The public response to the massive upsurge in immigration, and to the swift transformation of parts of Britain it had not really reached before, was exceptionally tolerant. There were no significant or sustained outbreaks of racist abuse or violence before 2016, and the only racist political party, the British National Party (BNP), was subsequently destroyed, especially in London.

In April 2006, Margaret Hodge, the Labour MP for Barking since 1996 (pictured right), commented in an interview with The Sunday Telegraph that eight out of ten white working-class voters in her constituency might be tempted to vote for the British National Party (BNP) in the local elections on 4 May 2006 because “no one else is listening to them” about their concerns over unemployment, high house prices and the housing of asylum seekers in the area. She said the Labour Party must promote…

Margaret Hodge MP

“… very, very strongly the benefits of the new, rich multi-racial society which is part of this part of London for me”.

There was widespread media coverage of her remarks, and Hodge was strongly criticised for giving the BNP publicity. The BNP went on to gain 11 seats in the local election out of a total of 51, making them the second-largest party on the local council. 

It was reported that Labour activists accused Hodge of generating hundreds of extra votes for the BNP and some local Labour members began to privately discuss the possibility of a move to deselect her. The GMB union wrote to Hodge in May 2006, demanding her resignation. The then Mayor of London, Ken Livingstone, later accused Hodge of “magnifying the propaganda of the BNP” after she said that British residents should get priority in council house allocations. In November 2009, the Leader of the BNP, Nick Griffin, announced that he intended to contest Barking at the 2010 general election, which saw New Labour finally defeated under Gordon Brown’s leadership. In spite of the union’s position, Hodge was returned as MP for Barking, doubling her majority to over 16,000, while Griffin came third behind the Conservatives. The BNP subsequently lost all of its seats on Barking and Dagenham Council.

Opinion polls and the simple, anecdotal evidence of living in the country showed that most people continued to feel no personal animosity towards immigrants or people of different ethnic backgrounds. But poll after poll also showed that a majority were deeply worried about what ‘all this’ migration meant for the country and its future. Yet even the mildest attempts to put these issues on the political agenda, such as the concerns raised by Margaret Hodge, were often met with condemnation from the established Labour left, especially in London, with the result that there was still no serious public discussion of them. Perhaps successive governments of all hues had spent decades putting off any real debate on immigration because they suspected that the public disagreed with them and that it was a matter they had lost control over anyway. Debate was instead deflected through charges of ‘racism’ and ‘bigotry’, exemplified by the accidental ‘caught-on-mic’ remark made by Gordon Brown while getting into his car during the 2010 election campaign, after being confronted by a lifelong Labour supporter in a northern English town about the sheer numbers of migrants. It is said to have represented a major turning point in the campaign.

A series of deflecting tactics became a replacement for action in the wake of the 2011 census, including the remark that the public should ‘just get over it’, which came back to haunt David Cameron’s ministers in the 2016 Referendum campaign. Even Boris Johnson, then Mayor of London, in his Daily Telegraph column of December 2012, titled Let’s not dwell on immigration but sow the seeds of integration, responded to the census results by writing…

We need to stop moaning about the dam-burst. It’s happened. There is nothing we can now do except make the process of absorption as eupeptic as possible … 

It did not seem to have occurred to Johnson that there were those who might be nursing a sense of righteous indignation about the fact that for years all the main parties had taken decisions that were so at variance with the opinions of their electors, or that there was something profoundly disenfranchising about such decisions, especially when addressed to a majority of the voting public. In the same month as Johnson’s admonition, a poll by YouGov found two-thirds of the British public believed that immigration over the previous decade had been a ‘bad thing for Britain’. Only eleven per cent thought it had been a ‘good thing’. This included majorities among supporters of all three main parties. Finally, the leaders of all three parties conceded that immigration was indeed too high. But none had any clear or proven policy on how to change course. By 2015, public opinion surveys were suggesting that a failure to do anything about immigration even while talking about it was one of the key areas of the breakdown in trust between the electorate and their political representatives.

At the same time, the coalition government of 2010-15 was fearful of the attribution of base motives if it got ‘tough on immigrants’. The Conservative leadership was trying to reposition itself as more socially ‘liberal’ under David Cameron. Nevertheless, at the election, they had promised to cut immigration from hundreds of thousands to tens of thousands per year, but they never succeeded in getting near that target. To show that she meant ‘business’, however, in 2013, Theresa May’s Home Office organised a number of vans with advertising hoardings to drive around six London boroughs where many illegal immigrants and asylum seekers lived. The posters on the hoardings read, In the UK illegally? Go home or face arrest, followed by a government helpline number. The posters became politically toxic immediately. The Labour Shadow Home Secretary, Yvette Cooper, described them as “divisive and disgraceful” and the campaign group Liberty branded them “racist and illegal”.

After some months it was revealed that the pilot scheme had successfully persuaded only eleven illegal immigrants to leave the country voluntarily. Theresa May admitted that the scheme had been a mistake and too “blunt”. Indeed, it was a ‘stunt’ designed to reassure the ‘native’ population that their government was getting tough, and it was not repeated, but the overall ‘hostile environment’ policy it was part of continued into the next majority Conservative government after the 2015 election, leading to the wrongful deportation of hundreds of ‘Windrush generation’ migrants from the Caribbean who had settled in Britain before 1973 and therefore lacked passports and papers identifying them as British subjects. In fact, under Cameron’s Conservative government, net immigration reached a record level of 330,000 per year, numbers which would fill a city the size of Coventry. The movement of people, even before the European migration crisis of 2015, was of an entirely different quantity, quality and consistency from anything that the British Isles had experienced before, even in the postwar period. Yet the ‘nation of immigrants’ mythology continued to be used to cover the vast changes of recent years and to pretend that history could provide precedents for what had happened since the turn of the millennium.

Brexit – The Turning of the Tide for British Tolerance?

Following the 2011 Census, net migration into Britain continued to be far in excess of three hundred thousand per year. The further rise in the population of the United Kingdom recorded in 2021 was almost entirely due to inward migration, and higher birth rates among the predominantly young migrant population. In 2014 women who were born overseas accounted for twenty-seven per cent of all live births in England and Wales, and a third of all newborn babies had at least one overseas-born parent, a figure that had doubled since the 1990s. However, since the 2016 Brexit vote, statistics have shown that many recent migrants to Britain from the EU have been returning to their home countries so it is difficult to know, as yet, how many of these children will grow up in Britain, or for how long. But based on the increases projected by the Office for National Statistics in 2017, Douglas Murray asks the following rhetorical questions of the leaders of the mainstream political parties:

All these years on, despite the name-calling and the insults and the ignoring of their concerns, were your derided average white voters not correct when they said that they were losing their country? Irrespective of whether you think that they should have thought this, let alone whether they should have said this, said it differently or accepted the change more readily, it should at some stage cause people to pause and reflect that the voices almost everybody wanted to demonise and dismiss were in the final analysis the voices whose predictions were nearest to being right.

One might retort with the observation that Murray seems to lay too much emphasis on the ‘average white voter’ and hints at the need for a policy based on reversing what he seems to see as the ‘racial replacement’ of the previous half-century. In the 2016 Referendum Campaign, the UKIP (United Kingdom Independence Party) campaign also seemed to confuse the question of asylum seekers with that of freedom of movement within EU borders by using the following photo, taken in 2015, on one of the eastern external borders:

A pro-EU campaign poster from 2016 countering an original UKIP poster which used a picture of Syrian refugees in 2015 to warn of ‘mass immigration’.

Indeed, the issue of immigration as it affected the 2016 Referendum in Britain was largely about the numbers of Eastern European migrants arriving in the country, rather than about illegal immigrants from outside the EU, or asylum seekers. Inevitably, all three issues became confused in the public mind, something that UKIP used to good effect in its campaigning posters. The original version of the poster above, featuring UKIP leader Nigel Farage, caused considerable controversy by using pictures from the 2015 Crisis in Central-Eastern Europe to suggest that Europe was at a ‘Breaking Point’ and that once in the EU, refugees and migrants would be able to enter Britain and settle there. This was untrue, as the UK was not, in any case, in the ‘Schengen’ area, as shown by the map below. Campaigners against ‘Brexit’ pointed out the facts of the situation in the adapted photo on the internet.

The Schengen Area in 2004. Applicant countries are marked in orange. Croatia joined in 2023.

In addition, during the campaign, Eastern European leaders, including the Poles and the Hungarians, complained about the misrepresentation of their citizens as ‘immigrants’ like many of those who had recently crossed the EU’s Balkan borders in order to get to Germany or Sweden. As far as they were concerned, their citizens were temporary internal migrants within the EU’s arrangements for ‘freedom of movement’ between member states. Because this was largely a one-way movement in numeric terms, however, the distinction was lost on many voters, and ‘immigration’ became the dominant factor in their backing of Brexit. On 23rd June 2016, the UK electorate, with a turnout of 72%, voted by a margin of 52% to 48% to leave the EU. Leave won the majority of votes in England and Wales, while every council in Scotland saw Remain majorities, and Northern Ireland voted to remain by 56% to 44%. (https://www.bbc.co.uk/news/politics/eu_referendum/results)

An Ipsos poll published after the referendum result was declared, in July 2016, surveyed public attitudes towards immigration across Europe. It revealed just how few people thought that immigration had had a beneficial impact on their societies. To the question – Would you say that immigration has generally had a positive or negative impact on your country? – very low percentages of people in each country thought that it had had a positive effect. In fact, Britain had a comparatively positive attitude, with thirty-six per cent of people saying that they thought it had had a very or fairly positive impact. Meanwhile, only twenty-four per cent of Swedes felt the same way, and just eighteen per cent of Germans. In Italy, France and Belgium only ten to eleven per cent of the population thought that it had made even a fairly positive impact on their countries. Despite the Referendum result, the British figure may well have been higher because, whatever UKIP’s posters implied, Britain had not experienced the same level of immigration from outside the EU as had occurred in the inter-continental migration crisis of the previous summer.

Migrants/ Asylum Seekers arriving on the shores of the Greek island of Lesbos during the refugee crisis of 2015.

In Britain, the issue of Calais remained foremost in discussion in the autumn of 2016. The British government announced that it was going to have to build a further security wall near the large migrant camp there. The one-kilometre wall was designed to further protect the entry point to Britain, and specifically to prevent migrants from trying to climb onto passing lorries on their way to the UK. Given that there were fewer than 6,500 people in the camp most of the time, a solution to Calais always seemed straightforward to some. All that was needed, argued these activists and politicians, was a one-time generous offer and the camp could be cleared. But the reality was that once the camp was cleared it would simply be filled again. For 6,500 was an average day’s migration to Italy alone.

In the meantime, while the British and French governments argued over who was responsible for the situation at Calais, both day and night migrants threw missiles at cars, trucks and lorries heading to Britain in the hope that the vehicles would stop and they could climb aboard as stowaways for the journey across the Channel. The migrants who ended up in Calais had already broken all the EU’s rules on asylum in order to get there. They had not applied for asylum in their first country of entry, Greece, nor even in Hungary. Instead, they pushed on through the national borders of the ‘Schengen’ free passage area (see map above) until they reached the north of France. If they were cold, poor or just worse off, they were seen as having the right to come and settle in a European Union that seemed no longer to have the heart, and/or will, to turn anyone away.

The Disintegration of Multiculturalism in Britain:

After the 9/11 attacks on the USA, the wars in Iraq and Afghanistan and the 7/7 London bombings, there was no bigger cultural challenge to the British sense of proportion and fairness than the threat of ‘militant Islam’, or rather ‘Islamist’ terrorism. There were plenty of angry young Muslim men prepared to listen to fanatical ‘imams’ and to act on their narrow-minded and bloodthirsty interpretations of ‘Jihad’. Their views, at odds with those of the well-established south Asian Muslim communities in Britain, referred to above, were those of the ultra-conservative ‘Wahhabi’ Arabs and Iranian mullahs who insisted, for example, on women being fully veiled. But some English politicians, like Norman Tebbit, felt justified in asking whether all Muslim communities throughout Britain really wanted to integrate fully. Would they, in Tebbit’s notorious ‘test’, support the English cricket team when it played against Pakistan?

Britain did not have as high a proportion of Muslims as France, and not many, outside London and parts of the South East, were of Arab and North African origin. But the large urban centres of the Home Counties, the English Midlands and the North of England had third-generation Muslim communities of hundreds of thousands. They felt like they were being watched in a new way and were perhaps right to feel more than a little uneasy. In the old industrial towns on either side of the Pennines and in areas of West London there were such strong concentrations of Muslims that the word ‘ghetto’ was being used by ministers and civil servants, not just, as in the seventies and eighties, by right-wing organisations and politicians. White working-class people had long been moving, quietly, to more semi-rural commuter towns in the Home Counties and on the South Coast, and in the cities of the Midlands, like Birmingham, to the ‘suburbs’.

But those involved in this ‘white flight’, as it became known, were a minority if polling was an accurate guide, and their motives for leaving the inner city areas were often complicated and not always linked to ‘race’ or culture. Only a quarter of Britons said that they would prefer to live in white-only areas. In retrospect, this may seem a significant minority. Yet even this measure of tolerance or ‘multiculturalism’, colloquially defined as ‘live and let live’, was being questioned. How much should the new Britons ‘integrate’ or ‘assimilate’, and how much was the retention of traditions a matter of their rights to a distinctive cultural identity? After all, Britain had a long heritage of allowing newcomers to integrate on their own terms, retaining customs and contributing elements of their own culture. Speaking in December 2006, Tony Blair cited forced marriages, the importation of ‘sharia’ law and the ban on women entering certain mosques as being on the wrong side of this line. In the same speech he used new, harder language. He claimed that, after the London bombings, …

“… for the first time in a generation there is an unease, an anxiety, even at points a resentment that our very openness, our willingness to welcome difference, our pride in being home to many cultures, is being used against us … Our tolerance is part of what makes Britain, Britain. So conform to it; or don’t come here. We don’t want the hate-mongers … If you come here lawfully, we welcome you. If you are permitted to stay here permanently, you become an equal member of our community and become one of us.”

His speech was not just about security and the struggle against terrorism. He was defining the duty to integrate. Britain’s strong economic growth over the previous two decades, despite its weaker manufacturing base, was partly the product of its long tradition of hospitality. The question now was whether the country was becoming so overcrowded that this tradition of tolerance was finally eroding. England, in particular, had the highest population density of any major country in the Western world. It would require wisdom and frankness from politicians, together with watchfulness and efficiency from Whitehall, to keep the ship on an even keel. Without these qualities, and without trust from the people, how could there be hope for meaningful interactions between Muslims, Christians, Jews and Humanists; between newcomers, sojourners, old-timers and exiles; between white Europeans, black Africans, south Asians and West Indians?

Map showing the location of Rotherham in South Yorkshire

In January 2011, a gang of nine Muslim men, seven of Pakistani heritage and two from North Africa, were convicted and sentenced at the Old Bailey in London for the sex trafficking of children between the ages of eleven and fifteen. One of the victims sold into a form of modern-day slavery was a girl of eleven who was branded with the initials of her ‘owner’ and abuser: ‘M’ for Mohammed. The court heard that he had branded her to make her his property and to ensure others knew about it. This did not happen in a Saudi or Pakistani backwater, nor even in one of the northern English towns that so much of the country had forgotten about until similar crimes involving Pakistani heritage men were brought to light. This happened in Oxfordshire between 2004 and 2012. Nobody could argue that gang rape and child abuse were the preserve of immigrants, but these court cases and the official investigations into particular types of child-rape gangs, especially in the case of Rotherham, identified specific cultural attitudes towards women, especially non-Muslim women, that are similar to those held by men in parts of Pakistan. These have sometimes extended into intolerant attitudes towards other religions, ethnic groups and sexual minorities. Such cultural attitudes are anathema to the teachings of the Qur’an and to mainstream Imams, but fears of being accused of ‘racism’ for pointing out such factual connections were at least partly responsible for these cases taking years to come to light.

British Muslims and members of the British-Pakistani community condemned both the abuse and the fact that it had been covered up. Nazir Afzal (pictured below), Chief Crown Prosecutor of the Crown Prosecution Service (CPS) for North West England from 2011 to 2015, himself a Muslim, made the decision in 2011 to prosecute the Rochdale child sex abuse ring after the CPS had turned the case down. Responding to the Jay report, he argued that the abuse had no basis in Islam:

Above: Nazir Afzal, Crown Prosecutor for North-West England.
Left: The front page of The Times, 24 September 2012.

“Islam says that alcohol, drugs, rape and abuse are all forbidden, yet these men were surrounded by all of these things. … It is not the abusers’ race that defines them. It is their attitude toward women that defines them.” 

Even then, however, in the Oxfordshire case, the gangs were described as ‘Asian’ by the media, rather than as men of Pakistani and Arabic origin or heritage. In addition, the fact that their victims were chosen because they were not Muslim was rarely mentioned in court or dwelt upon by the press. But despite sections of the media beginning to focus on Pakistani men preying on young white girls, a 2013 report by the UK Muslim Women’s Network found that British Asian girls were also being abused across the country in situations that mirrored the abuse in Rotherham. The unfunded small-scale report found thirty-five cases of young Muslim girls of Pakistani heritage being raped and passed around for abuse by multiple men. In the report, one local Pakistani women’s group described how Pakistani-heritage girls were targeted by taxi drivers and on occasion by older men lying in wait outside school gates at dinner times and after school. They also cited cases in Rotherham where Pakistani landlords had befriended Pakistani women and girls on their own for purposes of sex, then passed on their name to other men who had then contacted them for sex.

The Jay Report, published in 2014, acknowledged that the 2013 report of abuse of south Asian-heritage girls was ‘virtually identical’ to the abuse that occurred in Rotherham, and also acknowledged that British Asian girls were unlikely to report their abuse due to the repercussions on their family. South Asian girls were ‘too afraid to go to the law’ and were being blackmailed into having sex with different men while others were forced at knife-point to perform sexual acts on men. Support workers described how one teenage girl had been gang-raped at a ‘party’ her ‘boyfriend’ had taken her to:

“When she got there, there was no party, there were no other female members present. What she found was that there were five adults, their ages ranging between their mid-twenties going on to the late-forties and the five men systematically, routinely, raped her. And the young man who was supposed to be her boyfriend stood back and watched”.

Groups would photograph the abuse and threaten to publish it to their fathers, brothers, and in the mosques, if their victims went to the police.

In June 2013, the polling company ComRes carried out a poll for BBC Radio 1 asking a thousand young British people about their attitudes towards the world’s major religions. The results were released three months later and showed that of those polled, twenty-seven per cent said that they did not trust Muslims (compared with fifteen per cent saying the same of Jews, thirteen per cent of Buddhists, and twelve per cent of Christians). More significantly, perhaps, forty-four per cent said that they thought Muslims did not share the same views or values as the rest of the population. The BBC and other media in Britain then set to work to try to discover how Britain could address the fact that so many young people thought this way.

Part of the answer may have had something to do with the timing of the poll, the fieldwork being carried out between 7 and 17 June. Only a few weeks before, Drummer Lee Rigby, a young soldier on leave from Afghanistan, had been hit by a car in broad daylight outside an army barracks in South London, dragged into the middle of the road and hacked to death with machetes. The two murderers, Michael Adebolajo and Michael Adebowale, identified as Muslims of African origin and were carrying letters claiming justification for killing “Allah’s enemies”. It is therefore reasonable to suppose that, rather than making assumptions about a religious minority without any evidence, those who were asked their opinions connected Muslims with a difference in basic values because they had been very recently associated with an act of extreme violence on the streets of London.

Unfortunately, attempts to provide a more balanced view and to separate these acts of terrorism from Islam have been dwarfed by the growing public perception of a problem which will not simply go away through the repetition of ‘mantras’. The internet has provided multiple and diverse sources of information, but the simple succession of the various events related above, and of the many other available examples, meant that people were able to make their own judgements, and these were certainly not as favourable as they had been at the start of the current century. By 2015, one poll showed that only thirty per cent of the general public in Britain thought that the values of Islam were ‘compatible’ with the values of British society. The succession of terrorist events on the streets of Europe continued through 2016 and 2017.

On 22 March 2017, a 52-year-old British-born convert to Islam, Khalid Masood, ploughed his car across Westminster Bridge, killing two tourists, one American and the other Romanian, and two British nationals. Dozens more were injured as they scattered, some falling into the River Thames below. Crashing into the railings at the side of Parliament, Masood then ran out of the hired vehicle and through the gates of the palace, where he stabbed the duty policeman, PC Keith Palmer, who died a few minutes later. Masood was then shot dead by armed police, his last phone messages revealing that he believed he was “waging jihad.” Two weeks later, at an inter-faith Service of Hope at Westminster Abbey, its Dean, the Very Reverend John Hall, spoke for a nation he described as ‘bewildered’:

What could possibly motivate a man to hire a car and take it from Birmingham to Brighton to London, and then drive it fast at people he had never met, couldn’t possibly know, against whom he had no personal grudge, no reason to hate them and then run at the gates of the Palace of Westminster to cause another death? It seems that we shall never know.

Then on 22 May thousands of young women and girls were leaving a concert by the US pop singer Ariana Grande at Manchester Arena. Waiting for them as they streamed out was Salman Abedi, a twenty-two-year-old British-born man, whose Libyan parents had arrived in the UK in the early nineties after fleeing from the Gadaffi régime. In the underground foyer, Abedi detonated a bomb he was carrying which was packed with nuts, bolts and other shrapnel. Twenty-two people, including children and parents who had arrived to pick them up, were killed. Hundreds more were injured, many of them suffering life-changing wounds.

Next in what began to seem like a remorseless series of events, on 3 June three men drove a van into pedestrians crossing London Bridge. They leapt out of it and began slashing at the throats of pedestrians, appearing to be targeting women in particular. They then ran through the Borough Market area shouting “this is for Allah”. Eight people were murdered and many more were seriously injured before armed police shot the three men dead. Two of the three, all of whom were aged twenty to thirty, were born in Morocco. The oldest of them, Rachid Redouane, had entered Britain using a false name, claiming to be a Libyan, and was actually five years older than he had claimed. He had been refused asylum and had absconded. Khurram Butt had been born in Pakistan and arrived in the UK as a ‘child refugee’ in 1998 when his family moved to the UK to claim asylum from ‘political oppression’, although Pakistan was not on the UNHCR list of ‘unsafe’ countries. On the evening of 19 June, at the end of the day’s Ramadan observance, in what appeared to be a ‘reprisal’, a forty-seven-year-old father of four from Cardiff drove a van into a crowd of worshippers who were crossing the road from Finsbury Park Mosque to the nearby Muslim Welfare House. One man, who had collapsed on the road and was being given emergency aid, was run over and died at the scene. Almost a dozen more were injured.

Up to this point, all the Islamist terror attacks, from 7/7/2005 onwards, had been planned and carried out by ‘home-grown’ terrorists. Even the asylum seekers involved in the June attack in London had been in the country since well before the 2015 migration crisis. But in mid-September, an eighteen-year-old Iraqi who arrived in the UK illegally in 2015, and had been living with British foster parents ever since, left a crudely-manufactured bomb on the London Underground District line during rush hour when the carriages were also crowded with schoolchildren. The detonator exploded but failed to ignite the homemade device itself, leading to flash burns to the dozens of people in the carriage. A more serious blast would have led to those dozens being taken away in body bags, and many more injured in the stampede which would have followed at the station exit with its steep steps. As it was, the passengers remained calm during their evacuation.

Of course, it would have been difficult to predict and prevent any of these attacks, either by erecting physical barriers or by identifying individuals who might be at risk from ‘radicalisation’, much of which takes place online. Most of the attackers had been born and radicalised in the UK, so no reinforcements at the borders, either at Calais or in Kent, would have kept them from enacting their atrocities. But the need for secure borders is not simply a symbolic or psychological reinforcement for the British people if it is combined with a workable and efficient asylum policy. We are repeatedly told that one of the two main reasons for the 2016 referendum decision for Britain to leave the EU was to take back control of its borders and immigration policy, though it was never demonstrated how exactly Britain had lost control of these, or at least how EU membership had made it lose control over them.

‘Globule’ Britain, Exceptionalism & the Globalisation of Populism:

By 2017, there were already signs that, due to the fall in the value of the pound since the Referendum, many Central-Eastern European migrants were returning to their home countries, but the vast majority of them had already declared that they did not intend to settle permanently in the UK anyway. The fact that so many came from 2004 onwards was entirely down to the decision of the British government not to delay or derogate from the operation of the accession treaties. But, after ‘Brexit’ was finally done in January 2020, the reality remained that, even if they were to be replaced by other European ‘immigrants’ in future, the UK would still need to control, as ever, the immigration of people from outside the EU, including asylum seekers, and that returning failed or bogus applicants would become more difficult. So, too, would the sharing of intelligence information about any potential threats of terrorists attempting to enter Britain as bogus refugees. Other than these considerations, the home-grown threat from Islamist terrorists was unlikely to be affected by Brexit one way or another, and could only be dealt with by anti-radicalisation strategies, especially through education and more active inter-cultural community relations aimed at full integration, not ‘parallel’ development.


Since the Brexit referendum in 2016 and the election of Donald Trump, it seemed that ‘gutter’ journalists and the ‘global media’, especially ‘social media’, just couldn’t get enough of Populism in the form of tropes and memes. In 1998, the Guardian alone published about three hundred articles that contained the term. In 2015, it was used in about a thousand articles, and one year later this number had doubled to almost two thousand. Populist parties tripled their vote across Europe over the preceding twenty years, and more than a quarter of Europeans voted for populists in their last elections. So, in deciding to leave the EU, the British were, ironically, not demonstrating how exceptional they were, but becoming more like their continental European and American allies and partners in supporting nativist, populist and extremist policies and parties.

From the Ladybird book, The Story of Brexit (see below)
The Green & Pleasant Land under siege:

But, in reality, the biggest threat that Britain faced was not from the Brussels Eurocracy, nor from immigration. It was the threat of climate change, a physical threat rather than a demographic one, waves of water, not people. It promised to reshape the outline of Britain, as seen from space or on any map. Nothing was more sensitive to the British people than the shape of their island. Rising sea levels could make its entire coastline look different. They could eat into East Anglia, centuries after the wetlands were reclaimed with Dutch drainage, and submerge the concrete-crusted, terraced marshlands around London, and drown idyllic Scottish islands, forcing the abandonment of coastal towns which had grown up in Victorian and Edwardian times. Long-established wildlife would die out and be replaced by new species – these were already making their presence known in British gardens. All this was beyond the power of Britain to control alone since it was responsible for just two per cent of global emissions. Even if the British could be persuaded to give up their larger cars, their foreign holidays and their gadgets, would it make a real difference?

The village of Corton Denham, Somerset, with views across the Dorset countryside.

In 2009, Andrew Marr concluded, somewhat prophetically:

Without a frank, unheated conversation between the rest of us and elected politicians, who are then sent out into the world to do the bigger deals that must be done, what hope for action on climate change? It seems certain to involve the loss of new liberties, such as cheap, easy travel. It will change the countryside as grim-looking wind farms appear. It will change how we light and heat our homes and how we are taxed. All these changes are intensely political, in a way the British of the forties would have recognised. Politics is coming back as a big force in our lives, like it or not. …

Without this frankness, what hope is there for a sensible settlement between Muslim and Christian, incomer and old timer?

… The threats facing the British are large ones. But in the years since 1945, having escaped nuclear devastation, tyranny and economic collapse, we British have no reason to despair, or emigrate. In global terms, to be born British remains a wonderful stroke of luck.

Andrew Marr (2009), A History of Modern Britain, pp. 601-2.

Postscript – The Last Years, Platinum Jubilee & A Royal Tribute:

Great Britain, Northern Ireland and the Commonwealth of Nations celebrated HM The Queen’s Platinum Jubilee (the seventieth anniversary of her accession to the throne) over four days in June 2022. Queen Elizabeth had turned ninety-six on 21st April 2022. She had witnessed many triumphs and tragedies since she became heir presumptive to the throne at the age of ten in 1936, and on turning twenty-one in 1947 she told the Kingdom and the Commonwealth in a radio broadcast:

‘I declare before you all that my whole life, whether it be long or short shall be devoted to your service.’

She successfully adapted to a rapidly changing world, from 1952, when few households had television, to the digital age of the Internet. Her influence and inspiration were far-reaching, and she earned every ounce of the love and respect she received during the Platinum Jubilee celebrations. She was almost never late for anything or anyone and was extremely courteous and respectful towards everyone, whatever their position. She considered tardiness a sign of disrespect, and her deep respect for her employees was manifested in the way she treated all twelve hundred of them. She never saw them as ‘servants’ and never called them when they were off-duty, since she valued their privacy and family life as much as her own. She had a tremendous memory for detail and a great deal of compassion for each individual in her family and the royal household.

She exuded a serene authority, majesty and calmness in the faithful conduct of her duties and frequently expressed her faith in God. Although she never attended university, she was well educated and served as a confidante to fifteen Prime Ministers, from Winston Churchill to Liz Truss. Her charities and patronages dealt with a wide range of social problems, from youth opportunities to wildlife and environmental protection. Her Majesty balanced all her public responsibilities with a full family life, raising four children and welcoming grandchildren and great-grandchildren.

The four-day UK bank holiday weekend saw a series of magnificent activities held across the country, but especially in central London, to commemorate Her Majesty’s extraordinary milestone. Trooping the Colour, comprising fifteen hundred soldiers and commanders, four hundred musicians, two hundred and fifty horses and seventy aeroplanes, kicked off the festivities. Across the country, large jubilee street parties were held, with people coming together to honour the Queen and their local communities. As part of the Jubilee celebrations, a pageant was staged in the Mall on Sunday 5th June, bringing together five thousand people from across the UK and the Commonwealth to commemorate the Queen’s reign. It included street art, music, puppets and costume.

The bells of Westminster Abbey pealed as they did on Coronation Day, 1953. The Gold State Coach, pulled by eight Windsor Grey horses, was led by the mounted band of the Household Cavalry. There were four acts in the pageant. The first featured a military march with 1,750 participants and two hundred horses. The second depicted changes in culture, music and fashion during the last seven decades. The third act told the story of the Queen’s life in twelve chapters, and Ed Sheeran sang ‘God Save the Queen’ in the fourth act. The Queen watched the finale of the pageant from the balcony of the Palace, where she was joined by three generations of her family.

Elsewhere, a new underground line called the Elizabeth Line was inaugurated, operating twelve trains per hour between Paddington and Abbey Wood. Queen Elizabeth’s portraits were projected onto the standing stones of Stonehenge, each picture from one of the seven decades of her reign. A twelve-foot-tall floral crown was installed in St James’s Park, sparkling with the brilliant blooms of 13,500 plants in the colours of the precious stones in the crown worn on Coronation Day. The Queen also surprised and delighted millions of viewers by appearing in a special comic sketch, filmed in the Palace with Michael Bond’s Paddington Bear, to start the Jubilee Concert outside. The Queen and Paddington engaged in dialogue about their mutual love of marmalade sandwiches. As the long weekend’s events drew to a close, the Queen issued a statement:

“When it comes to how to mark seventy years as your Queen, there is no guidebook to follow. It really is a first. But I have been humbled and deeply touched that so many people have taken to the streets to celebrate my Platinum Jubilee. While I may not have attended every event in person, my heart has been with you all; and I remain committed to serving you to the best of my ability, supported by my family. I have been inspired by the kindness, joy and kinship that has been so evident in recent days, and I hope this renewed sense of togetherness will be felt for many years to come…”

Over the next few months, as the Queen’s health became increasingly fragile, Prince Charles stepped up to cover her duties. He had already become known as a thoughtful and caring champion of a wide range of worthy causes, as well as a hardworking, dutiful Prince and a kind, humorous man, happy to meet and chat with the public in the streets. After the Queen’s death, dealing with his own grief, supported by Camilla, the Queen Consort, he steadfastly followed his mother’s coffin, dutifully fulfilling all the ceremonial roles and providing comfort to vast numbers of the Queen’s mourning subjects. In his address to the Accession Council on 9th September, he pledged:

“… throughout the remaining time God grants me to uphold the constitutional principles at the heart of our nation.”

Text Sources:

Douglas Murray (2018), The Strange Death of Europe: Immigration, Identity, Islam. London: Bloomsbury.

Simon Schama (2002), A History of Britain III: 1776-2000, The Fate of Empire. London: BBC Worldwide.

Andrew Marr (2009), A History of Modern Britain. London: Pan Macmillan.

John Morrill (ed.), (2001), The Penguin Atlas of British and Irish History. Harmondsworth: Penguin Books.

Previous articles on the web:


Igloobooks.com (2013), The History of Britain. Sywell, NN6 0BJ: Igloo Books Ltd.

Christine Lindop (2013), Factfiles: William and Kate. Oxford: OUP (Oxford Bookworms Library):

The book contains a full list of photo acknowledgements.


Majesty & A Fall from Grace: The Lives & Times of the Latter-Day Windsors – Part One: Big Days, Budgets & Bigotry 2002-17

Another Royal Fairy Tale Begins – William & Kate, 2002-2011:

The sun was coming up over Westminster Abbey on Friday 29th April 2011, and on the Mall, some of the visitors were sleeping on chairs near the road, and others were standing and talking. They came from all over the capital city, as well as from other towns and cities all over Britain, and from other countries too. Later in the morning, Catherine Elizabeth Middleton was getting ready for her very special day. Her parents, sister and brother, were staying in the same hotel – the Goring Hotel in Belgravia. She would soon have to put on her specially designed dress.

At the same time, not far away, Prince William was getting ready too. He and his brother, Prince Harry, were putting on their uniforms – red for him, black for his brother.

Prince William was then second in line to the throne of the United Kingdom, after his father, Prince Charles. William and Kate came from two very different families, the Windsor-Mountbatten royal family, and the Middletons, an upper-middle-class family. Kate’s parents were Michael and Carole Middleton. They met when they worked for British Airways. Kate is their eldest child and she has a sister, Pippa, and a brother, James. When Kate was six, Carole Middleton began a business called Party Pieces and later Michael Middleton worked with her. It made a lot of money for the family.

But although she was from a perfectly respectable, wealthy family, Kate was a ‘commoner’: she had no title and was not therefore a member of the aristocracy. Before William’s great-grandmother Elizabeth Bowes-Lyon married the then Duke of York, the future George VI, marriages like this between princes and ‘commoners’ could not happen. When his grandmother, as heir to the throne, married Philip Mountbatten in 1947, she was marrying a prince of the Greek royal family. His father had originally married Lady Diana Spencer, who was from a ‘stately home’. But modern princes and princesses from countries around the world do not always, or even usually, marry people from other royal families, or even the aristocracy.

Carole, James, Michael and Pippa Middleton.

Kate was born in January 1982, six months before William, in Reading. The Middletons had lived in the (Home County) Berkshire village of Bradfield Southend since 1986, when they returned from two years in Amman, Jordan. Kate went to primary school in the village before the family moved to a large detached house in Bucklebury in 1995. She then went to Marlborough College in Wiltshire, where she played hockey and tennis for the school teams, and succeeded academically. When she left Marlborough in 2000, she took a gap year during which she went to Florence, Italy, and then to Chile in the New Year of 2001, arriving there, by coincidence, a few weeks after William had returned from there on his gap year, so they didn’t meet. She worked as a teaching assistant before returning to England to get ready to go to the University of St Andrews in Scotland as a student of art history. The University is the oldest in Scotland, first opened in 1413, and the third oldest in the UK after Oxford and Cambridge. It has a student population of over eight thousand. William and Kate lived in the same building, were both students of art history and had some of the same friends.

In March 2002, Kate was in a fashion show at the university, and William and some of his friends went to watch. First, Kate modelled a colourful jumper, and then she walked out in an exciting black dress, catching William’s attention. William and Kate soon became good friends and remained so for over a year. In the summer of 2003, Kate turned twenty-one (she was six months older than William), and he went to a party at her parents’ house in Berkshire, together with other friends from St Andrews. Later that year, Prince Charles held an ‘African’ party for William’s twenty-first, and Kate was one of the guests. In September 2003, they began their third year and William moved to a house in the country called Balgrove House, not far from the town and the university, but a quiet place, away from photographers and reporters. Then in March 2004, William and Kate were in the news together. They went on a skiing holiday to Klosters in Switzerland with some friends and William’s father. Soon, newspapers from all over the world had a photo of William and Kate. Now everybody wanted to know: who was Kate Middleton? The newspapers started to write about Will and Kate as a couple.

September 2004 was the beginning of their last year at St Andrews. There were often photos of William at parties and weddings, but without Kate. As she was not part of the royal family and its official entourage, she couldn’t go with him, but they were still a courting couple. In 2005, they graduated, and Her Majesty, Prince Philip and Prince Charles all came up for the ceremony. Kate’s mother and father were there too, but the two families did not meet.

Soon after graduation, William went to New Zealand for a second time, his first official visit overseas, which lasted eleven days. Then he went on a month’s holiday to Lewa Downs in Kenya, where he was later joined, for a short time, by Kate and some of their mutual friends. William then began to have a very busy time, working in the City of London and learning about banking. He also worked for the Football Association at Chatsworth House in Derbyshire. In December, he spent two weeks with a mountain rescue team and after another skiing holiday together at Klosters in January 2006, William and Kate agreed to ‘separate’ while they both established themselves in work. William went to Sandhurst to train as an army officer, where he stayed for almost the rest of the year, not seeing Kate very often. She was trying to find work, but as the girlfriend of the future monarch, she was plagued by photographers who waited near her house and ran after her in the street. For a lot of the time, she worked in her parents’ business; there she could get away from the newspapers. In November 2006, she began work with a women’s clothes business called Jigsaw, which had a chain of high-street shops throughout the UK. The next month, William graduated from Sandhurst, and Kate and her parents were invited to the ‘passing out’ ceremony.

But the media’s invasion of Kate’s privacy got worse in 2007 and at twenty-four, William was not ready to get married. He was still very busy in the army in Dorset, and Kate was working in London, a hundred miles away. Sometimes photos appeared in the papers of William with other girls. Couples in the public eye often stop being together in these circumstances, and of the Queen’s four children, three had divorced. Only Edward, her youngest, was still in his first marriage. In April 2007, they decided to announce that they were no longer a couple. Kate probably needed time to decide whether she wanted to be a future queen consort, and went back to her parents’ house, away from London and the cameras. But she did not stay at home for very long, and she was often seen out shopping with her sister, Pippa. Nor did she stay away from William for long, since on 1st July she accepted his invitation to be among his guests at a special concert organised by Harry and himself to honour their mother, Diana, on her birthday. She did not sit next to William but was not far away and seemed happy again. After two months, they were back together again and flew to The Seychelles for a week together.

Kate goes shopping with her sister, Pippa.

William next spent twelve weeks with the RAF, learning to fly, and he then had two months with the Royal Navy, before returning to the RAF to learn how to fly a helicopter, like his father and uncle, in the autumn of 2008. Meanwhile, Kate continued to work for her parents’ business, taking photographs for them, and people took an interest in her clothes. In January 2010, William finished his helicopter training and went to the island of Anglesey in North Wales to train with the RAF in search and rescue work. He lived in a little house on a farm, and Kate visited him there frequently. William’s training ended in September 2010, and soon after that he and Kate went to Kenya for a three-week holiday.

For some of the time they were in Kenya, William and Kate were with friends. But near the end of the holiday, they had some time alone as a couple again. William had taken his mother’s engagement ring with him, gold with a big blue sapphire surrounded by little diamonds. He carried it carefully all through the holiday, and on 19th October, he proposed to Kate with it. She agreed to marry him, and the couple returned to the UK, but they could not yet share the exciting news with their friends, and Kate could not wear the ring. Finally, on 16th November, the couple appeared on TV at St James’s Palace and announced their engagement to the public. Kate wore a stunning blue dress to match the sapphire in her ring. William had already asked the Queen and Kate’s father for their consent to the marriage. Both of them gave it.

Prudence takes a Back Seat – New Labour Spending:

In January 2000, Tony Blair announced on television that he would bring Britain’s health spending up to the European average within five years. That was a huge promise, because it meant spending a third as much again in real terms, and his ‘prudent’ Chancellor of the Exchequer, Gordon Brown, was unhappy both that Blair had said too little about the need for health service reform to accompany the money and that he had ‘stolen’ his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at more than six per cent above inflation every year, …

… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.       

The tilt away from Brown’s sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by a former banker. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. ‘Wanless’ also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget, in 2002, Brown broke with a political convention that had reigned since the mid-eighties: that direct taxes would not be raised. He imposed a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain’s health.

Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared with 1997. Hardly anyone was left waiting more than six months for an inpatient appointment. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006, and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr (2007-9) has attested,

…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.

By the autumn of 2004, hospitals were subject to more than a hundred inspections. War broke out between Brown’s Treasury and the ‘Blairite’ Health Secretary, Alan Milburn, over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown did not see how this was possible when most people had only one major local hospital. Polling suggested that Brown was making a popular point: most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent ‘foundation’ hospitals. By the 2005 general election, Michael Howard’s Conservatives were attacking Labour for wasting money and allowing people’s lives to be put at risk in dirty, badly run hospitals. Many newly and expensively qualified doctors, and even specialist consultants, could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still inadequate service.

As public spending began to flow during the second Blair administration, vast amounts of money went into pay rises, new bureaucracies and bills for outside consultants. After the initial period of ‘prudence’, ministries were unused to spending and did not always do it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression they were doing more for hospitals, schools and transport than they actually could. As Marr has pointed out, …

… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.  

The railways, after yet another disaster, were shaken up again. In very controversial circumstances Railtrack, the once-profitable monopoly company operating the lines, was driven to bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found that they were loaded with directives. Schools and hospitals had many more. Marr has commented, …

The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life. 

Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average; GDP per head was above that of France and Germany, and the country had the second-lowest jobless figures in the EU. The number of people at work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house price inflation soared, so owners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006, analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread evenly across the country, averaging sixty-eight thousand pounds in the southeast of England but a little over thirty thousand in Wales and northeast England.

The Click & Collect Economy – Buying & Selling Britain by the Acre:

The ‘World-Wide Web’, invented by the British computer scientist Tim Berners-Lee at the end of 1989 (pictured right in 2014), was taking the internet from colleges and institutions into everyday life by the mid-‘noughties’. It first began to attract popular interest in the mid-nineties: Britain’s first internet café and its first internet magazine, reviewing a few hundred early websites, were both launched in 1994.

The introduction of new forms of mail-order and ‘click and collect’ shopping in the mid-nineties quickly attracted significant adherents from different ‘demographics’. But the ‘dot-com’ bubble burst due to its own rapid and excessive expansion, and following a pause and a lot of ruined dreams, the ‘new economy’ roared on again. By 2000, according to the Office of National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were ‘online’. By 2004, the spread of ‘broadband’ connections had brought a new mass market in ‘downloading’ music and video. By 2006, three-quarters of British children had internet access at home.


Above: The Albert Dock in Liverpool was an example of a redundant industrial relic that took on a multitude of other uses. By the new millennium, its nineteenth-century buildings housed a maritime museum, an art gallery, a shopping centre and a television studio, becoming a major tourist attraction.

Simultaneously, the rich of America, Europe and Russia began buying up parts of London and other ‘attractive’ parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. ‘Executive houses’ with pebbled driveways, brick facing and dormer windows were growing across farmland and by rivers with no thought of flood-plain constraints. Parts of the country far from London, such as the English southwest and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Liverpool to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. For all the problems and disappointments, and the longer-term problems with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes, corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.

Supermarket chains exercised huge market power, offering cheap meat and dairy products within almost everyone’s budget. Factory-made ready meals were imported via the new global air-freight market, and refrigerated lorries moved freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a ‘Tesco’, a ‘Morrisons’, a ‘Sainsbury’s’ or an ‘Asda’. By the mid-noughties, the four supermarket giants owned more than fifteen hundred superstores throughout the UK. They spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students had to take out loans in order to go to university, but were far more likely than previous generations to do so, as well as to travel more widely on a ‘gap’ year, not just to study or work abroad.

Those ‘Left Behind’ – Poverty, Pensions & Public Order:

Materially, for the majority of people, this was, to use Marr’s term, a ‘golden age’, which perhaps helps to explain why real anger about earlier pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing ‘Middle Englanders’, the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these ‘golden years’ to the same extent. Gordon Brown, from the first, made much of New Labour’s anti-poverty agenda, especially on child poverty. Since the launch of the Child Poverty Action Group, this problem had become particularly emotive. Labour policies took a million children out of relative poverty between 1997 and 2004, though the numbers rose again later. Brown’s emphasis was on the working poor and the virtue of work, so his major innovations were the national minimum wage, the ‘New Deal’ for the young unemployed, the working families tax credit, and tax credits aimed at children. There was also a minimum income guarantee and, later, a pension credit for poorer pensioners.

The Tories, now under new management in the shape of a media-marketing executive and old Etonian, David Cameron, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled. A world of ‘black economy’ work also remained below the minimum wage, in private care homes, where migrant servants were exploited, and in other nooks and crannies. Some 336,000 jobs remained on ‘poverty pay’ rates. Yet ‘redistribution of wealth’, a socialist phrase which had become unfashionable under New Labour lest it should scare away Middle Englanders, was stronger in Brown’s Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were also immigrants anyway, overall equality increased in these years. One factor in this was the return to the means-testing of benefits, particularly for pensioners and through the working families tax credit, subsequently divided into a child tax credit and a working tax credit. Gordon Brown, as Chancellor, concluded that if he was to direct scarce resources at those in real poverty, he had little choice but to reintroduce means-testing.

Apart from the demoralising effect it had on pensioners, the other drawback to means-testing was that a huge bureaucracy was needed to track people’s earnings and to try to establish exactly what they should be getting in benefits. Billions were overpaid and as people did better and earned more from more stable employment, they then found themselves facing huge demands to hand back the money they had already spent. Compared with Mrs Thatcher’s Victorian Values and Mr Major’s Back to Basics campaigns, Labour was supposed to be non-judgemental about individual behaviour. But a form of moralism did begin to reassert itself. For the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first given out in 1998, and granted by magistrates to either the police or the local council. It became a criminal offence to break the curfew or other sanctions, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.


By the ‘mid-noughties’, more than four million closed-circuit television cameras were watching the British in what was fast becoming a surveillance society.

Although Asbos served a useful purpose in many cases, there were fears that for the really rough elements of society and their tough children they became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for behaviour that had not previously warranted it. But as they were refined in use and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been issued in England and Wales alone, and Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With Global Positioning System (GPS) chips, these could show exactly where their users were, and the use of such systems in cars and even out on the moors meant that Britons were losing their age-old prowess at map-reading.

The ‘Seven Seven’ Bombings – War on Terror & Home-grown ‘Jihadis’:

Despite these increasing means of mass surveillance, Britain’s cities have remained vulnerable to terrorist attacks, more recently by so-called ‘Islamic terrorists’ rather than by the Provisional IRA, who abandoned their bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The report into this worst such attack in Britain later concluded that they were not part of an al Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at a cost of a few hundred pounds. Despite the government’s insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by ‘British foreign policy’.

They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just after London’s great success in winning the bid to host the 2012 Olympic Games. The ‘Seven Seven’ bombings are unlikely to have been preventable by CCTV surveillance, of which there was plenty at the tube stations, nor by ID cards (which had recently been under discussion), since the killers were British subjects, nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, ‘MI5’ and ‘MI6’ as they are known, were already in receipt of huge increases in their budgets as they tracked down other murderous cells.

In August 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic. After many years of granting asylum in London to dissident clerics and activists from the Middle East, Britain had more than its share of inflammatory and dangerous extremists who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of supporting or fomenting terrorism. They could not be deported, because human rights legislation forbade sending anyone back to countries where they might face torture. Seventeen were picked up and held at the Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal.

Five weeks later, Home Secretary Charles Clarke hit back with ‘control orders’ to limit the movement of men he could not prosecute or deport. These orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006 a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day on six Iraqis were a deprivation of liberty and also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.

Britain found itself in a struggle between its ancient laws and liberties and a new, borderless world in which the hallowed principles of ‘habeas corpus’, free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the law-abiding home were all placed in increasing jeopardy. The new political powers seemed to government ministers the least they needed to deal with a threat that might last another thirty years, in order, paradoxically, to secure Britain’s liberties for the long term beyond that. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and Liberal Democrats to mount the barricades in defence of civil liberties.

As Gordon Brown eyed the premiership, his rhetoric was similarly tough, but as Blair was forced to turn to the ‘war on terror’ and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither of them could be bothered to disguise their mutual enmity, as pictured above. A gap seemed to open up between Blair’s enthusiasm for market ideas in the reform of health and schools, and Brown’s determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was ‘best when we are at our boldest’. Brown retorted that it was ‘best when we are Labour’.

Tony Blair’s legacy continued to be paraded on the streets of Britain, here blaming him and George Bush for the rise of ‘The Islamic State’ in Iraq.

Immigration & Identity at the Beginning of the New Millennium:

Immigration had always been a constant factor in British life; now it was also a fact of life that Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a ‘racialist’ backlash, riots and the rise of extreme right-wing organisations, and new laws had been passed to control both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The ‘Windrush’ arrivals from the Caribbean (see the photo below) and those from the Indian subcontinent were people who looked different but who spoke the same language and in many ways had had a similar education to that of the ‘native’ British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic, educational and cultural background.

As Simon Schama pointed out in 2002, it was a fact that even though half of the British-Caribbean population and a third of the British-Asian population were born in Britain, they continued to constitute only a small proportion of the total population. It was also true that any honest reckoning of the post-imperial account needed to take account of the appeal of separatist fundamentalism in majority Muslim communities. At the end of the last century, an opinion poll found that fifty per cent of British-born Caribbean men and twenty per cent of British-born Asian men had, or once had, white partners. In 2000, Yasmin Alibhai-Brown found that, when polled, eighty-eight per cent of white Britons between the ages of eighteen and thirty had no objection to ‘inter-racial’ marriage; eighty-four per cent of West Indians and East Asians and fifty per cent of those from Indian, Pakistani or Bangladeshi backgrounds felt the same way. Schama commented:

The colouring of Britain exposes the disintegrationalist argument for the pallid, defensive thing that it is. British history has not just been some sort of brutal mistake or conspiracy that has meant the steamrollering of Englishness over subject nations. It has been the shaking loose of peoples from their roots.

A Jewish intellectual expressing impatience with the harping on ‘roots’ once told me that “trees have roots; Jews have legs”. The same could be said of Britons who have shared the fate of empire, whether in Bombay or Bolton, who have encountered each other in streets, front rooms, kitchens and bedrooms.

Until the Summer of 2001, this ‘integrationist’ view of British history and contemporary society was the broadly accepted orthodoxy among intellectuals and politicians, if not more broadly in the population. At that point, however, partly as a result of riots in the north of England involving ethnic minorities, including young Muslim men, and partly because of events in New York and Washington, the existence of parallel communities began to be discussed more widely and the concept of ‘multiculturalism’ began to become subject to fundamental criticism on both the right and left of the political spectrum. In the ‘noughties’, the dissenters from the multicultural consensus began to be found everywhere along the continuum.

The Breaking of the Multicultural Consensus:

One result of the long Iraqi conflict, which President Bush finally declared to be over on 1 May 2003, as part of his war on terror, was the arrival of many Iraqi asylum-seekers in Britain; Kurds, as well as Shiites and Sunnis. This attracted little comment at the time because there had been both Iraqi and Iranian refugees in Britain since the late 1970s, especially as students. But soon there was a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. There were also sizeable inflows of western Europeans, though these were mostly academics and students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies.

At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of fifteen hundred a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the 2010s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.


Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Road signs even appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, reflecting the employment changes in the long-distance haulage industry. Even as far north as Cheshire (see below), such signs were intended to help monolingual truckers using trunk roads rather than local Polish residents, most of whom had enough English to understand road signs either upon arrival or shortly afterwards. Although specialist classes in English had to be provided in schools and community centres, there was little evidence that multi-lingual migrants had any negative long-term impact on local children and wider communities. In fact, schools were soon reporting a positive impact on migrant children’s attitudes toward learning and on general educational standards.


More serious problems were beginning to be posed, however, by the operations of people-smugglers and criminal gangs. Chinese migrants were involved in a particular tragedy when nineteen of them, picking cockles in Morecambe Bay, were caught by its notorious tides and drowned. Many more were working for ‘gang-masters’ as virtual and in some cases actual ‘slaves’. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves to be ‘islands in the stream’ of international migration, the chosen ‘sceptred isle’ destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country but had instead become, by the late twentieth century, a popular place to develop digital and financial products and services.

When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals of the new member states the right to freedom of movement and settlement. With average earnings three times higher in the UK, this was a benefit that Eastern Europeans were keen to take advantage of. Some member states, however, exercised their right to ‘derogation’ from the treaties, under which migrant workers would only be permitted to take a job if employers were unable to find a local candidate; in terms of European Union legislation, a derogation involved delaying the full implementation of the treaty for five years. The UK decided not to exercise this option.


Within the EU, however, British politicians maintained Thatcher’s earlier determination to resist the Franco-German federalist model, with its ‘social chapter’ involving ever tighter controls over international corporations and ever closer political union. Britain, it was argued, had always gone out into the world. Now, increasingly, the world came to Britain, whether poor migrants, rich corporations or Chinese manufacturers. The poorer of the new migrant groups were almost entirely unrepresented in politics, but radically changed the sights, sounds and scents of urban Britain, and even some of its market towns. The veiled women of the Muslim world or its more traditionalist Arab, Afghan and Pakistani quarters became common sights on the streets, from Kent to Scotland and across to South Wales.

An unspoken consensus had existed whereby immigration, while always gradually increasing, was controlled. With the advent of hundreds of thousands of migrant EU workers, however, politicians and commentators began to break with this consensus, and their views began to have an impact because, while those on the right were suspected of having ‘nativist’ if not ‘racist’ tendencies in the ‘Powellite’ tradition, those from the left could generally be seen as having less easily assailable motives. What happened after the accession treaties of 2004 was therefore a breaking of the multicultural consensus.

The journalist Douglas Murray, author of the recent (2017) book The Strange Death of Europe, has claimed that once in power in 1997, Tony Blair’s government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the ‘primary purpose rule’, which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as ‘skilled labourers’. As well as opening the door to the rest of the world, the government opened it to the new EU member states after 2004. It was the effects of all of this that created the changed picture of the country eventually revealed in the 2011 Census, published towards the end of 2012.

Trevor Phillips (pictured right), the first black President of the National Union of Students in 1978-79 (of Ghanaian parentage), became the Chair of the Commission for Racial Equality in 2003, opening up territory in discussion and debate that others had not dared to ‘trespass’ into.

His realisation that the race-relations ‘industry’ was part of the problem, and that, partly as a result of talking up diversity, the country was ‘sleepwalking to segregation’, was an insight that others began to share. Simon Schama had argued, in his influential 2002 BBC TV series on the history of Britain, looking back from the beginning of the new millennium, that Britain should not have to choose between its own multi-cultural, global identity and its place in Europe. Interestingly, he put the blame for the pressure to do so primarily on continental leaders and, latterly, on the EU bureaucracy in Brussels, suggesting that…

 … the increasing compulsion to make the choice that General de Gaulle imposed on us between our European and our extra-European identity seems to order an impoverishment of our culture. It is precisely the roving, unstable, complicated, migratory character of our history that ought to be seen as a gift for Europe.

It is a past, after all, that uniquely in European history combines a passion for social justice with a tenacious attachment to bloody-minded liberty, a past designed to subvert, not reinforce, the streamlined authority of global bureaucracies and corporations.

Our place at the European table ought to make room for that peculiarity or we should not bother showing up for dinner. What, after all, is the alternative? To surrender that ungainly, eccentric thing, British history, with all its warts and disfigurements, to the economic beauty parlour that is Brussels will mean a loss. But properly smartened up, we will of course be fully entitled to the gold-card benefits of the inward-looking club…

Nor should Britain rush towards a re-branded future that presupposes the shame-faced repudiation of the past. For our history is not the captivity of our future; it is, in fact, the condition of our maturity. 

The Royal Wedding of 2011 – A Gallery:

At the time of the royal wedding, Britain was not just becoming increasingly divided over immigration and EU membership; it was also in recession, following the international financial crash of 2008. Some businesses were forced to close, people were losing their jobs, and it was difficult for young people to afford to buy or rent their first home. The royal wedding cheered everyone up, but the royal family were concerned that it should not be too lavish. It was meant to be very traditional, but simple.

William and Catherine’s wedding day was set for 29th April 2011, in Westminster Abbey. The Queen invited about nineteen hundred people to attend the wedding. Many of these guests were family and friends, but she also invited kings and queens from around the world. The wedding ring was made of Welsh gold, a tradition within the royal family going back to William’s great-grandmother, the Queen Mother. William’s brother Harry was to be his best man, and Kate’s sister, Pippa, her chief bridesmaid. Then there were to be four little bridesmaids and two page boys. On the eve of the big day, thousands of people began to arrive in London, determined to camp near Westminster Abbey and on the Mall. The Duchess of Cornwall and Prince William himself came out to meet them. The day of the wedding was warm and dry. Hundreds of thousands waited on the streets, and five thousand police officers were deployed along the route. Spread over different parts of the capital, there were more than eight thousand radio and television reporters, ready to tell people around the world about the wedding.

At eight o’clock the news came from Buckingham Palace that William and Kate would be known as the Duke and Duchess of Cambridge. At mid-morning, William and Harry came into the abbey, William wearing an Irish Guards uniform and Harry the uniform of a captain in the Blues and Royals. Minutes later, Kate’s mother arrived and, soon after that, the Queen and Prince Philip. Then, just before eleven, people saw Kate leaving her hotel to get into a Rolls-Royce with her father. Unlike previous royal brides, Kate did not arrive at the wedding in a horse-drawn coach. Ten minutes later, she arrived at the abbey and everyone could see her dress, by the British designer Sarah Burton, for the first time. It was made from ivory and white satin, with a V-neck bodice with lace detailing, a full skirt and long lace sleeves. The train measured 270 cm, almost nine feet. Kate wore a white veil over her face, held in place by a diamond tiara lent to her by the Queen. It had originally been bought by King George VI for his wife, Queen Elizabeth (later the Queen Mother) in 1936, and they gave it to the then Princess Elizabeth on her eighteenth birthday. According to reports, a blue ribbon was stitched inside the dress. In her hands, Kate carried small white flowers.

As she walked in with her father, there were hundreds of white and green flowers lining the abbey nave, and eight tall trees. Her sister Pippa walked behind, carefully carrying the long train of the dress. Pippa almost stole the show, wearing a simple white shift dress with buttons down the back. The ceremony took a little more than an hour. Then the new Duke and Duchess walked out of the abbey with the little bridesmaids and page boys, Prince Harry, Pippa, Prince Charles and the Duchess of Cornwall, and Kate’s parents. The bells of Westminster Abbey were rung for three hours after the wedding.

The Duke & Duchess leaving the Abbey for the ‘breakfast’ party at the Palace.

The bride and groom got into an open-top gold and black coach, the 1902 State Landau, to go to Buckingham Palace, with the other important royal guests and Carole and Michael Middleton in the following carriages. After they arrived at the Palace, thousands of people began to walk up the Mall behind a cordon of police officers. Then everyone in the crowd watched the balcony and waited for the royal couple to appear. A royal bride and groom first did this in 1858, and William’s parents, Charles and Diana famously kissed there in 1981. So when Kate and William came out onto the balcony, the crowd expected them to do the same. The noise created by the crowd’s approval made one of the little bridesmaids put her hands over her ears!

Then there were the formal photographs inside the palace, with the bride and groom together, with their pages and bridesmaids, and with their families. After that, there was a party for 650 guests in nineteen rooms, and Prince Charles made a speech. There were further speeches at the dinner for three hundred guests, from Prince Harry, Michael Middleton and from Prince William. Two of the couple’s friends also spoke. In the ballroom, the guests talked and danced until 3 a.m., when the bride and groom left and the party ended. The Mayor of London, Boris Johnson, gave the couple a tandem bike as a wedding present.

The big wedding cake was made from seventeen different cakes, but Prince William also asked for a chocolate cake, to remind him of the one his grandmother would give him at Windsor when he visited her from Eton.

Of course, the party at the palace was not the only one in the Kingdom. The day of the wedding was a bank holiday, and there were more than five thousand street parties across the different countries. In Scotland, at St Andrews, more than two thousand people came together to watch the ceremony on a giant TV. David Cameron, the Coalition Government’s PM, also had a party in Downing Street, inviting elderly people and children to join him. His wife, Samantha, made the cakes. In cities and towns throughout the Kingdom, people closed their streets to traffic and came together for the day, watching the wedding together.

There were parties in many other places around the world, from Afghanistan to India to Canada. In a hundred and eighty countries, many millions of people, among them British service personnel stationed abroad, watched the pictures from London with families and friends. In the early morning in Times Square in New York City, three couples got married just after William and Kate. There was no immediate honeymoon, however: after three days away, the couple went back to Anglesey, as William had to return to work with the search and rescue team on the following Tuesday. Ten days after the wedding, they flew to the Seychelles for ten days, away from the prying eyes of press photographers.

After the Wedding – The Working Duke & Duchess:

After their honeymoon, the couple returned to Anglesey, and a new life for Kate in the royal family. Before long they made their first visit as the Duke and Duchess of Cambridge, to Canada, from 30th June to 8th July. Again, thousands of people came out to see them as they attended official ceremonies, and William gave speeches in both English and French. On leaving Canada, they went to California for three days, where they attended a big dinner in Los Angeles, meeting Jennifer Lopez, Nicole Kidman and other Hollywood celebrities. At these overseas functions, Kate wore dresses specially designed for her, but at home she usually wore clothes from British shops. When young women saw her wearing a new British-made dress, they went to their nearest high-street fashion store to see if they could get the same one. Often, the shops sold out of these ‘Kate’ dresses within hours of her appearing in them on TV, so she was good news for British clothes shops and fashion ‘houses’. While in California, the couple also spent time at the charitable foundation Inner-City Arts, where children from poorer families went for lessons in dance and the arts. The Duke and Duchess watched the dances and made pictures with the children.

Back home, too, many charities asked for ‘patronage’ from them as members of the royal family. The charity Centrepoint had been sponsored by William’s mother, and William gave his time to it as well, including sleeping out on the cold streets of London for a night to learn something, first-hand, about the experience of homeless people. The Duchess also began helping four charitable organisations. One of them is East Anglia’s Children’s Hospices, which helps seriously and terminally ill children and their families. Another, building on her background in art history, is the National Portrait Gallery.

When they got engaged, Kate said, “Family is very important to me,” and William said, “We want a family.” By 2022 they had three children: Prince George (b. 2013), Princess Charlotte (b. 2015) and Prince Louis (b. 2018). Following a change in the law of succession, which permits the eldest child to remain heir to the throne whether male or female, Charlotte keeps her place in the line of succession ahead of her younger brother Louis; had she been born first, she would have been third in line to the throne. So it seems that British subjects will have two more kings after Charles III, William and George, before they have another queen. In the meantime, after overcoming the difficult obstacles placed in their way in their courtship, the royal couple have been in the news for all the right reasons over the past dozen years, balancing their private family life with their public work for the Monarchy.

Together, while working in Canada

Prince William was a patron of a mountain rescue organisation and often did work for the Football Association, including joint bids for the British nations to stage the European Championships and World Cup. He also helped with the organisation of the summer 2012 London Olympic Games. It was a very busy time for the whole royal family, who greeted and talked with many famous visitors from around the world and went to the big opening and closing ceremonies (see the section below on the Olympics).

The Diamond Jubilee:

By 2012, Queen Elizabeth had reigned for almost as long as her great-great-grandmother Queen Victoria, until then the longest-reigning British monarch. In 2012, Elizabeth celebrated sixty years on the throne; by 2022, of course, she had also surpassed Victoria, reaching her Platinum Jubilee before her death later that year. There had already been a big celebration in 1977 to mark the Queen’s Silver Jubilee and another in 2002 for her Golden Jubilee. For 2012, a celebration was planned that would be even bigger than the earlier ones. Celebrations went on throughout the year, with the Queen and the Duke of Edinburgh making special visits around the country, but the focal point was London and the Jubilee Weekend in June. A special bank holiday was declared on Tuesday 5th June, so that everyone in the UK had a four-day weekend. Celebrations were held in Britain and throughout the Commonwealth:

The Jubilee Concert was held outside Buckingham Palace. It was a joint venture between the BBC and Gary Barlow, who, together with Andrew Lloyd Webber wrote a special anthem, Sing, which was performed by a choir from many Commonwealth countries. Other artists who appeared at the concert included Robbie Williams, the pianist Lang Lang, Tom Jones, Shirley Bassey and Elton John. Two thousand and twelve beacons were lit by communities and individuals throughout the Kingdom, as well as in the Channel Islands, the Isle of Man and the Commonwealth. Her Majesty herself lit the National Beacon in central London.

The River Thames Flotilla on Sunday was made up of nearly a thousand boats from around the UK, the Commonwealth and other parts of the world. The Queen and the Duke of Edinburgh travelled in the Royal Barge, which formed the centrepiece of the flotilla.

The 2012 London Olympics Games:

The spectacular opening ceremony was an unforgettable start to the Olympic Games. The whole event was orchestrated by the film director Danny Boyle and the writer Frank Cottrell Boyce, providing a unique journey through British history. The visual effects team created a thrilling animated journey down the river from its source to the Olympic stadium itself, passing sights real and imaginary. The sound of the shipping forecast and billowing blue sheets transformed the ‘meadow’ of ‘the Green and Pleasant Land’ into an Isle of Wonder, to the stirring sounds of Elgar’s Nimrod, played by the London Symphony Orchestra’s On Track Project. Frank Turner’s acoustic songs, performed in the stadium, perfectly captured the atmosphere of a long summer evening.

Above. The opening ceremony of the 2012 Olympics

The intense sporting action from the first ten days of the Games began with Team GB’s first medal, the silver Lizzie Armitstead won in the Road Race. Although she was beaten to the gold in a sprint finish by Marianne Vos, her medal was the focus of national attention. The team’s first gold came from the rowers Helen Glover and Heather Stanning. In cycling, Bradley Wiggins (“Wiggo”), who had just won the Tour de France, won gold in imperious style in the time trial, bringing his total of Olympic medals to seven, a British record shared with Chris Hoy, who went on to win gold on the track in the team sprint and keirin. Wiggins, like Hoy before him, was subsequently knighted, and he was named BBC Sports Personality of the Year for 2012. Many more thrilling moments followed, with US swimmer Michael Phelps’ record nineteenth Olympic medal in the pool and rower Katherine Grainger’s long-awaited gold for GB. In athletics, millions watched as the Jamaican sprinter Usain Bolt successfully defended his hundred-metres title.

In the middle came ‘Super Saturday’: with an electric atmosphere in the stadium, Jessica Ennis won gold in the heptathlon, a combination of seven track and field events, Mo Farah won gold in the ten thousand metres, and a third gold came from Greg Rutherford in the long jump. In tennis, Andy Murray beat Roger Federer in an emotional final, making up for his loss to Federer at Wimbledon just a few weeks before. In the final six days, the Brownlee brothers found triathlon glory, Usain Bolt completed a historic sprint treble, and Nicola Adams punched her way to the first women’s boxing gold. There was incredible tension in Greenwich Park for Team GB’s dressage victory, while the US sprinters broke the world record in the relay and Samantha Murray (no relation to Andy) secured the 65th and final medal for Team GB.

The London Olympics ended in style with a celebration of British musical and sporting achievements, created by Kim Gavin. It marked the end of an amazing chapter in London’s life and featured an array of British artists from the previous sixty years, including Eric Idle, The Kinks, The Spice Girls and Jessie J. As the whole event drew to a close, Britain seemed, despite the recession, to be once more at ease with itself.

(to be continued…)


Majesty & Grace X: The Reign of Elizabeth Windsor – Winter of Discontent to Golden Jubilee, 1979-2002; Part 2 – The Peace Process & The People’s Princess.

The Sectarian Divide in Belfast & the Peace Process, 1980-98:

After the Provisional IRA assassinated Lord Mountbatten on his boat off the west coast of Ireland in 1979, the mainland bombing campaign went on, with attacks on Chelsea Barracks and then the Hyde Park bombings, in which eight people were killed and fifty-three injured. With hindsight, the emergence of Sinn Féin, the political wing of the Provisional IRA, as a political party in the early 1980s can be seen as one element in the rethinking of British policies. Yet throughout the 1980s and 1990s, there were still periodic major incidents in various places across the province. After the Provisional IRA’s ceasefire in 1994, these were initiated by dissident republican splinter groups. However, much of the continuing street violence throughout these decades was concentrated in the areas of Belfast where the population was predominantly either republican or unionist, as shown on the map below.

Sectarian divisions in Belfast, c. 1985

Another preliminary element in the Peace Process was the belated interest successive Irish governments began to take in Northern Ireland during the same period. The fear that the disturbances in the North might spread south and destabilise their state lay behind the Dublin government’s decision to start talking directly to Downing Street from 1980 onwards. This led to the Anglo-Irish Agreement of 1985, which essentially represented a reiteration of British policy aims in the province, but now with an added all-Ireland dimension, which the Unionists continued to find difficult to accept.

The political violence in Belfast was largely confined to the confrontation lines where working-class unionist districts, such as the Shankill, and working-class nationalist areas, such as the Falls, Ardoyne and New Lodge, border directly on the city centre and/or on one another. The mixed middle-class areas did not experience any significant political violence.

It was in the wake of this agreement that an entirely new approach emerged. Sinn Féin’s dual strategy of ‘ballot box and bullet’, and its realisation that the war could not be won militarily, led to secret negotiations with both Dublin and London in the late 1980s in pursuit of a way into active participation in the political process. Britain’s recognition that, if the extremists could be brought into the search for a political solution, then the mainstream parties in Northern Ireland politics would follow suit was a complete reversal of its previous approach. Once the IRA had accepted the need for a ceasefire, the remaining difficulty was to persuade unionists that the reformed ex-paramilitaries could be trusted.

US President Bill Clinton addressed a peace rally in Belfast during his visit in 1995. Clinton played a significant role as a peace broker in negotiations leading up to the Good Friday Agreement.

The willingness of some unionists to give up previous policies – which had consisted of trying to recover the power they had lost in 1972 – sprang from their pragmatic realisation that only sharing power with nationalists would guarantee unionists some say in the future government of Northern Ireland. It was not until the late 1990s that the Irish constitution was amended, recognising British sovereignty over Northern Ireland. US President Bill Clinton acted as a peace broker, contributing to an uneasy cease-fire in the North from 1994.

Unionist rioters pelt police with stones after Orangemen are prevented from marching through nationalist Drumcree, near Portadown, in July 1996.

Still, the street violence persisted and intensified during the marching season in July, when the Protestant Orange Order held its traditional but often offensive or provocative parades near or through Catholic areas around Belfast.

The Good Friday Agreement & the Omagh Bombing of 1998:

After long negotiations, and with the help of former US senator George Mitchell, the Good Friday Agreement was signed between all the parties to the conflict in 1998. This sought to establish a power-sharing government at Stormont, as well as acknowledging both British and Irish interests in the future of the province. The impressive Parliament Buildings in Stormont, shown below, were built in 1921 to house the new Government of Northern Ireland after Partition, and since the Belfast Agreement, they have been the home to the Northern Ireland Assembly and its power-sharing executive. Despite recurrent crises and accusations of bad faith, the peace survived more or less intact.

Picture & graphic from BBC History Magazine.

Sadly, however, it did not immediately bring an end to the bombing. The Omagh bombing was a car bombing on 15 August 1998 in the town of Omagh in County Tyrone carried out by the Real Irish Republican Army (Real IRA), a Provisional Irish Republican Army (IRA) splinter group that opposed the IRA’s ceasefire and the Good Friday Agreement, signed earlier in the year. The bombing killed twenty-nine people and injured about 220 others, making it the deadliest single incident of the Troubles in Northern Ireland. There was a strong regional and international outcry against ‘dissident’ republicans and in favour of the Northern Ireland peace process. Prime Minister Tony Blair called the bombing an “appalling act of savagery and evil.” Queen Elizabeth expressed her sympathies to the victims’ families, while the Prince of Wales paid a visit to the town and spoke with the families of some of the victims. Pope John Paul II and President Bill Clinton also expressed their sympathies. Churches across Northern Ireland called for a national day of mourning. Church of Ireland Archbishop of Armagh Robin Eames said on BBC Radio that,

“From the Church’s point of view, all I am concerned about are not political arguments, not political niceties. I am concerned about the torment of ordinary people who don’t deserve this.”

The Return & Results of Widespread Mass Unemployment in Britain:

In February 1986, across the UK as a whole, there were over 3.4 million unemployed, although statistics were manipulated for political reasons and the real figure is therefore a matter of speculation. The socially corrosive effects of the return of widespread mass unemployment, not seen since the early thirties, were felt throughout the country, manifesting themselves in the further bouts of inner-city rioting that broke out in 1985. This was more serious for the government than the rioting against the Means Test of half a century before because it occurred in cities throughout the country, rather than in depressed mining areas. London was just as vulnerable as Liverpool, and a crucial contributory factor was the number of young men of south Asian and Caribbean heritage who saw no hope of ever entering employment: opportunities were minimal and they felt particularly discriminated against. The term underclass was increasingly used to describe those who felt themselves to be completely excluded from the return of prosperity to many areas in the late eighties.

By 1987, service industries were offering an alternative means of employment in Britain. Between 1983 and 1987 about one and a half million new jobs were created. Most of these were for women, many of whom were entering employment for the first time, and many of the jobs available were part-time and, of course, lower paid than the jobs lost in primary and secondary industries. By contrast, the total number of men in full-time employment fell still further. Many who had left mining or manufacturing for the service sector also earned far less. By the end of the century there were more people employed in Indian restaurants than in the coal and steel industries combined, but for much lower pay. The economic recovery that led to the growth of this new employment was based mainly on finance, banking and credit. Little was invested in home-grown manufacturing; far more was invested overseas, with British foreign investments rising from 2.7 billion pounds in 1975 to 90 billion in 1985.

At the same time, there was also a degree of re-industrialisation, especially in the Southeast, where new industries employing the most advanced technology were growing. In fact, many industries shed a large proportion of their workforce but, using new technology, maintained or improved their output. These new industries were certainly not confined to the M4 Corridor by the late eighties. By then, Nissan’s car plant in Sunderland had become the most productive in Europe, while Siemens established a microchip plant at Wallsend. However, such companies did not employ large numbers of local workers. Siemens invested more than a billion pounds, but only employed a workforce of about eighteen hundred.

Regionally based industries suffered a dramatic decline during this period. Coal mining, for example, was decimated in the decade following the 1984-85 miners’ strike, not least because of the shift of the electricity-generating industry to alternative energy sources, especially gas. During the period 1984-87 the coal industry shed a hundred and seventy thousand miners, and there was a further net loss of employment in the coalfields, with the exception of north Warwickshire and south Derbyshire, in the early 1990s. The economic effect upon local communities could be devastating, as the 1996 film Brassed Off accurately shows, with its memorable depiction of the social impact on the Yorkshire pit village of Grimethorpe of the 1992 closure programme. The trouble with the economic strategy followed by the Thatcher government was that South Wales, Lancashire, the West Riding of Yorkshire, Tyneside and Clydeside were precisely those regions that had risen to extraordinary prosperity as part of the British imperial enterprise. Now they were being written off as disposable assets, so what interest did the Scots in particular, but also the Welsh, have in remaining part of that enterprise, albeit a new corporation in the making?

The Two Britains & the Demise of Thatcher, 1987-1990:

The understandable euphoria over Thatcher and her party winning three successive general elections disguised the fact that this last victory was gained at the price of perpetuating a deep rift in Britain’s social geography. Without the Falklands factor to help revive the Union flag, a triumphalist English conservatism was increasingly imposing its rule over the other nations of an increasingly disunited Kingdom. Although originally from Lincolnshire, Thatcher drew her support, overwhelmingly, from the well-off middle and professional classes in the south of England, where her parliamentary constituency of Finchley lay. Meanwhile, the distressed northern zones of derelict factories, pits, ports and terraced streets were left to rot and rust. People living in these areas were expected to lift themselves up by their own bootstraps, retrain for work in the up-and-coming industries of the future and, if need be, get on the Tory Chairman Norman Tebbit’s proverbial bicycle and move to one of the areas of strong economic growth, such as Cambridge, Milton Keynes or Slough, where those opportunities were clustered.

However, little publicly funded retraining was provided and, where it was available, there was no guarantee of a job at the end of it. The point of the computer revolution was to save labour, not to expand it. In the late 1980s, the north-south divide seemed as intractable as it had at any period in the previous six decades, with high unemployment continuing to be concentrated in the declining manufacturing areas of the North and West of the British Isles. That the north-south divide increasingly had a political dimension as well as an economic one was borne out by the 1987 General Election in the UK. Margaret Thatcher’s third majority was largely based on the votes of the South and East of England. North of a line running from the Severn estuary through Coventry and on to the Humber estuary, the long decline of Toryism, especially in Scotland, where the party was reduced to only ten seats, was apparent to all observers. At the same time, the national two-party system seemed to be breaking down, so that south of that line the Liberal-SDP Alliance was the main challenger to the Conservatives in many constituencies.

In July 1988, 167 men died when a North Sea oil platform, Piper Alpha, blew up.

Even though the shift towards service industries was reducing regional economic diversity, the geographical distribution of regions eligible for European structural funds for economic improvement confirmed the continuing north-south divide. The pace of change quickened as a result of the 1987 Single European Act, as it became clear that the UK was becoming increasingly integrated with the European continent. The administrative structure of Britain also underwent major changes by the end of the nineties. The relative indifference of the Conservative ascendancy to the plight of industrial Scotland and Wales had transformed the prospects of the nationalist parties in both countries. In the 1987 election, Scottish and Welsh nationalists, previously confined mainly to middle-class, rural and intellectual constituencies, now made huge inroads into Conservative areas and even into the Labour heartlands of industrial south Wales and Clydeside.

Culturally, the Thatcher counter-revolution ran into something of a cul-de-sac, or rather the cobbled streets of Salford typified in the long-running TV soap opera, Coronation Street. Millions in the old British industrial economy had a deeply ingrained loyalty to the place where they had grown up, gone to school, got married and had their kids; to the pub, their park, and their football team. In that sense at least the Social Revolution of the fifties and sixties had recreated cities and towns that, for all their ups and downs, their poverty and pain, were real communities. Fewer people were willing to give up on Liverpool and Leeds, Nottingham and Derby than the pure laws of the employment marketplace demanded. For many working-class British people, it was their home which determined their quality of life, not the width of their wage packet.

The notoriously violent poll-tax riot of 1990 in Trafalgar Square.

Not everything that the Thatcher government did was out of tune with social reality, however. The sale of council houses created an owner-occupier class which, as Simon Schama has written, corresponded to the long passion of the British to be kings and queens of their own little castles. Sales of remaining state-owned industries, such as the public utility companies, were less successful, since the concept of stakeholdership was much less deeply rooted in British traditions, and the mixed fortunes of both these privatised companies and their stocks did nothing to help change customs. Most misguided of all was the decision to call a poll tax levied on every adult resident a ‘community charge’, and then to impose it first, as a trial run, in Scotland, where the Tories already had little support. The grocer’s daughter from Grantham thought that it would be a good way of creating a property-owning, tax-paying democracy, in which people paid according to the size of their household. This was another mistaken assumption.

The Tories had Michael Heseltine as their conference darling. He also became Thatcher’s nemesis when he challenged her for the leadership of the party in 1990. But although he wielded the knife, he did not ascend the throne.

Later in 1990, the Iron Lady was challenged for the leadership of the Party by Michael Heseltine, and although she beat him in the first ballot, the margin was insufficient to prevent a second ballot. With her supporters and cabinet fearing her defeat and humiliation, she felt obliged to step down from the contest. She was then replaced as PM by one of her loyal deputies, John Major, another middle-class anti-patrician, the son of a garden-gnome salesman, apparently committed to family values and a return to basics. In 1990, some EEC leaders thought that Britain was too slow in making decisions. In 1986, President Mitterrand had signed an agreement with Margaret Thatcher to build the long-planned Channel Tunnel between France and England. But while the French had pressed ahead with constructing road and high-speed train links to the tunnel, Britain had been slow to act.

In the cartoon above, President Mitterrand of France (left) and Helmut Kohl of Germany (right) try to force the new PM, John Major, to agree to the terms and conditions of the Maastricht Treaty, which came into force in 1993 and aimed to bring about closer economic and political union between member states. Britain refused to accept all of its terms.

Throughout his time in office, a significant number of Conservative back-bench MPs continued to oppose closer economic and political ties within the EEC, which then became the European Community (EC). Against all the odds, at Maastricht in 1991, he managed to slip Britain out of paying fealty to the EC on most of what was demanded. He and his Chancellor, Norman Lamont, negotiated a special opt-out from the monetary union and managed to have the social chapter excluded from the treaty altogether. For a man with a weak hand, under fire from his own side at home, it was quite an accomplishment. Briefly, he was a hero, hence the cartoons of him wearing his Y-fronts outside his trousers. He described his reception by his own party in the Commons as the modern equivalent of a Roman triumph, quite something for a boy from Brixton.

The Brixton Boy & New Growth, 1992-97:

Soon after his Maastricht triumph, flushed with confidence, Major called the election that most observers thought he must lose. The economy was still in a mess, the poll tax issue was still so fresh, and Neil Kinnock's Labour Party was now so well purged of Militants and so well organised, that there was a widespread belief that the 'Thatcher era' was at an end. But then, Lamont's pre-election budget came to the rescue of his PM. It proposed a new lower rate of income tax of twenty pence in the pound, which would help people on lower incomes, badly wrong-footing Labour. During the campaign, Major found himself returning to Brixton, complete with a soapbox, which he mounted to address raucous crowds through a megaphone.


On 9th April, Major's Conservatives won fourteen million votes, more than any party in British political history. It was a great personal achievement for Major, based on voters' fears of higher Labour taxes. It was also one of the biggest percentage leads since 1945, but the vagaries of the electoral system gave Major a majority of just twenty-one seats.


Never has such a famous victory produced such a rotten result for the winners. Major was back in number ten, but Chris Patten, tipped by many as a future PM, lost his seat in Bath to the Liberal Democrat candidate.

Unlike his predecessor and successor as Prime Minister, both of whom won three elections, in a parliamentary system under which greatness is generally related to parliamentary arithmetic, John Major has not gone down as a great leader of his country, though he was far more of a unifying figure than those who went before and came after. Despite its somewhat surprising victory in the 1992 General Election, the Major government ended up being ignominiously overwhelmed by an avalanche of sexual and financial scandals and blunders. Added to this, the ‘Maastricht rebels’ on the Tory back-benches now openly campaigned for Britain to leave the European Community.

Left: Labour leader Neil Kinnock attacking ultra-left members of the Militant Tendency at the Labour Party conference in 1985. Right: John Smith MP, Kinnock’s successor after Labour’s third successive defeat in the 1992 election, who would probably have become PM in 1997 but for a heart attack.

Kinnock was devastated by the 1992 result and quickly left front-line politics, destined to become little more than a footnote in parliamentary history, though deserving of an honourable chapter in the history of the Labour Party. Despite Major's popular mandate, the smallness of his majority meant that his authority in the Commons was steadily chipped away by the Labour Opposition under their less bellicose but more able parliamentary leader, Kinnock's Scottish Shadow Chancellor and successor, John Smith.

Deindustrialisation & Re-industrialisation into the Nineties:

The process of deindustrialisation continued into the nineties with the closure of the Swan Hunter shipyard on the Tyne in May 1993. The last working shipyard in the region, it had failed to secure a vital warship contract and was suffering the same long-term decline that reduced shipbuilding from an employer of two hundred thousand in 1914 to a mere twenty-six thousand by the end of the century. The closure devastated the local economy, especially as a bitter legal wrangle over redundancy payments left many former workers without any compensation at all for the loss of what they had believed was employment for life. As the map below shows, its effects spread far beyond Tyneside and the Northeast, which were themselves badly hit, with two hundred and forty suppliers losing their contracts. The results of rising unemployment were multiplied as the demand for goods and services declined.

Swan Hunter suppliers in 1993. The closure of the shipyard had a ‘knock-on’ effect as far afield as London and Glasgow.

As the map above shows, the closure of Swan Hunter had a widespread impact on suppliers as far afield as Southampton and Glasgow, as well as in the West Midlands and the Southeast. They lost valuable orders and therefore also had to make redundancies; forty-five suppliers in Greater London alone lost business. Thus, from the closure of one single large-scale engineering concern, unemployment resulted even in the most prosperous parts of the country. In the opposite economic direction, the growing North Sea oil industry, with its demands for drilling platforms and support ships, helped to spread employment more widely throughout the Northeast and the eastern side of Scotland, and this benefit was also felt nationally, both within Scotland and more widely throughout the UK. However, it did little in the short term to soften the blow of the Swan Hunter closure.

Oil rig in the North Sea, drilling for oil and then pumping it ashore.

The old north-south divide in Britain seemed to be eroding during the recession of the early 1990s, which hit southeast England relatively hard, but it soon reasserted itself with a vengeance later in the decade as young people moved south in search of jobs and property prices rose. Overall, however, the 1990s were years of general and long-sustained economic expansion. The continued social impact of the decline in coal, steel and shipbuilding was to some extent mitigated by inward investment initiatives, although across most of the British Isles there was a continuing decline in the number of manufacturing jobs throughout the nineties. Although there was an overall recovery in the car industry, despite the high pound in the export market, much of this was due to the new technology of robotics (shown below), which made the industry far less labour-intensive and therefore more productive.


The service sector expanded, however, and general levels of unemployment, especially in Britain, fell dramatically in the 1990s. Financial services saw strong growth, particularly in the London Docklands (pictured below), where former docks were transformed into offices and fashionable modern residential developments, with a new focus around the huge Canary Wharf scheme, to the east of the city on the previously isolated Isle of Dogs. On the right is an aerial view of the complex being built in the late 1980s. But although the new development was expected to boost commerce and local businesses, a decade later many of the buildings were still unoccupied.

In addition, by the end of the decade, the financial industry was the largest employer in northern manufacturing cities like Leeds, which grew rapidly, aided by its ability to offer a range of cultural facilities that helped to attract an array of UK company headquarters. Manchester, similarly, enjoyed a renaissance, particularly in music and football; Manchester United's commercial success made it one of the world's most valuable sports franchises. Edinburgh's banking and finance sector also expanded. Other areas of the country were helped by their ability to attract high-technology industry. Silicon Glen in central Scotland was, by the end of the decade, the largest producer of computer equipment in Europe. Computing and software design was also one of the main engines of growth along the 'silicon highway' of the M4 Corridor west of London. But areas of vigorous expansion were not necessarily dominated by new technologies. The economy of East Anglia, especially Cambridgeshire, had grown rapidly in the 1980s and continued to do so throughout the 1990s. While Cambridge itself, aided by the university-related science parks, fostered high-tech companies, especially in biotechnology and pharmaceuticals, expansion in Peterborough, for instance, was largely in low-tech areas of business services and distribution.

The Road and Airport Network 1973-2000.

Getting around Britain was, at last, getting easier. By 1980 there were nearly one and a half thousand miles of motorway in Britain. In the last twenty years of the century, the expansion of the congested motorway network to just over two thousand miles mostly involved the linking of existing sections. Motorway building and airport development were delayed by lengthy public enquiries and well-organised public protests. Improved transport links were seen as an important means of stimulating regional development as well as combating local congestion. Major road developments in the 1990s included the completion of the M25 orbital motorway around London, the Skye bridge and the M40 link between London and Birmingham. However, despite this construction programme, congestion remained a problem: the M25 was labelled 'the largest car park on the planet', while average traffic speeds in central London fell to only ten miles per hour in 2001; a famous poster on the Underground pointed out that this was the same average speed as in 1901. Environmental concerns became an important factor in limiting further expansion of both roads and runways and in a renewed focus on improving the rail network both between regions and within them.


Improvements to public transport networks tended to be concentrated in urban centres, such as the light rail networks in Manchester, Sheffield and Croydon. At the same time, the migration of some financial services and much of the Fleet Street national press to major new developments in London's Docklands prompted the development of the Docklands Light Railway and the Jubilee line extension, as well as some of the most expensive urban motorway in Europe. Undoubtedly, the most important transport development was the Channel Tunnel itself, running from Folkestone to Calais and completed in 1994. By the beginning of the new millennium, millions of people had travelled by rail from London to Paris in only three hours.

The Channel Tunnel was highly symbolic of Britain’s commitment to Europe in the 1990s. By the end of the decade, millions of people and vehicles had already travelled from London to Paris in a mere three hours.

The development of Ashford in Kent, following the opening of the Channel Tunnel rail link, provides a good example of the relationship between transport links and general economic development. The railway had come to Ashford in 1842 and railway works were established in the town. These were eventually run down and closed between 1981 and 1993, but this did not undermine the local economy. Instead, Ashford benefited from the Channel Tunnel rail link, which made use of the old railway lines running through the town, and its population actually grew by ten per cent in the 1990s. The completion of the Tunnel combined with the M25 London orbital motorway, with its M20 spur, gave the town an international catchment area of some eighty-five million people within a single day's journey.

Source: Penguin Books, 2001 (see list below).

This, together with the opening of Ashford International railway station as the main terminal for the rail link to Europe, attracted a range of engineering, financial, distribution and manufacturing companies. Fourteen business parks were opened in and around the town, together with a science park owned by Trinity College, Cambridge, and a popular outlet retail park on the outskirts of the town. By the beginning of the new millennium, the Channel Tunnel had transformed the economy of Kent. Ashford is closer to Paris and Brussels than it is to Manchester and Sheffield, both in time and distance. By the beginning of this century, it was in a position to be part of a truly international economy.

Ashford in relation to its British hinterland and the European Continent. Source: Penguin Books (see list below)

Even the historic market relocated to Orbital Park to the south of the town, reflecting in microcosm a national trend towards siting not only industrial estates but also retail parks on the edge of towns.

At the beginning of the new century, modern-day affluence was reflected in the variety of goods and services concentrated in shopping malls, which were often built on major roads outside cities and towns to make them accessible to the maximum number of people.

New Labour & The Renewal of the United Kingdom, 1997-2002:

In a 1992 poll in Scotland, half of those asked said that they were in favour of independence within the European Union. The enthusiasm the Scottish National Party discovered in the late 1980s for the supposed benefits that would result from independence in Europe may help to explain its subsequent revival. In the General Election of the same year, however, with Mrs Thatcher and her poll tax having departed the political scene, there was a minor Tory recovery north of the border. Five years later this was wiped out by the Labour landslide of 1997, when all the Conservative seats in both Scotland and Wales were lost. Nationalist political sentiment grew in Scotland and to a lesser extent in Wales. Almost twenty years after the first devolution referenda of 1979, further votes led to the setting up in 1999 of a devolved Parliament in Edinburgh and an Assembly in Cardiff, while Northern Ireland once again had a power-sharing Assembly at Stormont near Belfast. In 2000, an elected regional assembly with a directly elected Mayor was established for Greater London, the area covered by the inner and outer boroughs of the capital. This new authority, replacing the Greater London Council which had been abolished by the Thatcher Government in 1986, was given responsibility for local planning and transport.

In 1997, Tony Blair, who had rebranded his party as 'New Labour', won the first of his three elections.

The devolution promised and instituted by Tony Blair's new landslide Labour government did seem to take some of the momentum out of the nationalist fervour, but apparently at the price of stoking the fires of English nationalism among Westminster Tories, resentful of the Scots and Welsh having representatives in their own assemblies as well as in the UK Parliament. Only one Scottish seat was regained by the Tories in 2001. In Scotland at least, the Tories became labelled as a centralising, purely English party.

After the 1992 defeat, Blair had made a bleak public judgement about why Labour had lost so badly. The reason was simple: Labour had not been trusted to fulfil the aspirations of the majority of people in the modern world. As shadow home secretary he began to put that right, promising to be 'tough on crime and tough on the causes of crime'. He was determined to return his party to the common-sense values of Christian Socialism, and was also influenced by the mixture of socially conservative and economically liberal messages used by Bill Clinton and his New Democrats. So too was Gordon Brown, but as shadow chancellor his job was to demolish the cherished spending plans of his colleagues. His support for the ERM (the European Exchange Rate Mechanism) also made him ineffective when Major and Lamont suffered their great defeat on 'Black Wednesday' in September 1992, when sterling was forced out of the mechanism. By 1994, the Brown-Blair relationship was less strong than it had been, but they visited the States together to learn the new political style of the Democrats which, to the advantage of Blair, relied heavily on charismatic leadership. Back home, Blair pushed John Smith to reform the party rulebook, falling out badly with him in the process. Media commentators began to tip Blair as the next leader, and slowly but surely, the Brown-Blair relationship was turning into a Blair-Brown one.

The Blair-Brown partnership’ was to survive until 2007 when Tony Blair stepped down as PM and Gordon Brown replaced him unopposed.
This photo was taken at the Labour Conference in 2005, by which time neither could disguise their personal enmity.

When it arrived, the 1997 General Election demonstrated just what a stunningly efficient and effective election-winning team Tony Blair led, comprising those deadly masters of spin, Alastair Campbell and Peter Mandelson. 'New Labour', as it was now officially known, won 419 seats, the largest number ever for the party and comparable only with the seats won by the National Government in 1935. Its Commons majority of 179 was also a modern record, thirty-three more than Attlee's landslide majority of 1945. The swing of ten per cent from the Conservatives was another post-war record, roughly double that which the 1979 Thatcher victory had produced in the opposite direction. The turn-out, however, was very low: at seventy-one per cent, the lowest since 1935. Labour had won a famous victory but nothing like as many actual votes as John Major had won five years earlier. Yet Blair's party also won heavily across the south and in London, in parts of Britain that it had been unable to reach or represent in recent times.

It also fielded a far more diverse range of candidates, especially in terms of gender. A record number of women were elected to Parliament, 119, of whom 101 were Labour MPs, nicknamed 'Blair's Babes' at the time. Although Britain had been one of the first countries in the world to have a female prime minister, in 1987 women made up just 6.3% of MPs in the UK, compared with ten per cent in Germany and about a third in Norway and Sweden. Only France came below the UK, with 5.7%. Before the large group of female MPs joined her in 1997, Margaret Hodge (pictured below, c.1992) had already become MP for Barking in a 1994 by-election. While still a new MP, Hodge had endorsed the candidature of Tony Blair, a former Islington neighbour, for the Labour Party leadership, and she was appointed Junior Minister for Disabled People in 1998. Before entering the Commons, she had been Leader of Islington Council and had not been short of invitations from constituencies to stand in the 1992 General Election. She had turned these offers down, citing her family commitments:

“It’s been a hard decision; the next logical step is from local to national politics and I would love to be part of a Labour government influencing change. But it’s simply inconsistent with family life, and I have four children who mean a lot to me. 

“It does make me angry that the only way up the political ladder is to work at it twenty-four hours a day, seven days a week. That’s not just inappropriate for a woman who has to look after children or relatives, it’s inappropriate for any normal person.

“The way Parliament functions doesn’t attract me very much. MPs can seem terribly self-obsessed, more interested in their latest media appearance than in creating change.” 

Alastair Campbell, Tony Blair's Press Secretary, guards the door as Blair dictates a statement.

As the sun came up on a jubilant, celebrating Labour Party returning to power after an eighteen-year absence, there was a great deal of Bohemian rhapsodizing about a new dawn for Britain. Alastair Campbell (pictured above, right) had assembled crowds of party workers and supporters to stand along Downing Street waving union flags as the Blairs strode up to claim their victory spoils. Briefly, at least, it appeared that the whole country had turned out to cheer the champions. The victory was due to a small group of self-styled modernisers who had seized the Labour Party and made it a party of the 'left and centre-left', at least for the time being, though by the end of the following thirteen years, and after two more elections, they had taken it further to the centre-right than anyone expected on that balmy early summer morning. There was no room for cynicism amid all the euphoria: Labour was rejuvenated, and that was all that mattered.

Blair needed the support and encouragement of admirers and friends who would coax and goad him. There was Mandelson, the brilliant but temperamental former media boss, who had now become an MP. Although adored by Blair, he was so mistrusted by other members of the team that Blair's inner circle gave him the codename 'Bobby' (as in Bobby Kennedy). Alastair Campbell, Blair's press officer and attack dog, was a former journalist and natural propagandist who had helped orchestrate the campaign of mockery against Major. Then there was Anji Hunter, the contralto charmer who had known Blair as a young rock singer and was his best hotline to middle England. Derry Irvine was a brilliant Highlands lawyer who had first found a place in his chambers for Blair and his wife, Cherie Booth; he advised on constitutional change and became Lord Chancellor in due course. These people, with the Brown team working in parallel, formed the inner core. The young David Miliband, son of a well-known Marxist philosopher, provided research support. Among the MPs who were initially close were Marjorie 'Mo' Mowlam and Jack Straw, but the most striking aspect of 'Tony's team' was how few elected politicians it included.

The small group of people who put together the New Labour ‘project’ wanted to find a way of governing which helped the worse off, particularly by giving them better chances in education and to find jobs, while not alienating the mass of middle-class voters. They were extraordinarily worried by the press and media, bruised by what had happened to Kinnock, whom they had all worked with, and ruthlessly focused on winning over anyone who could be won. But they were ignorant of what governing would be like. They were able to take power at a golden moment when it would have been possible to fulfil all the pledges they had made. Blair had the wind at his back as the Conservatives would pose no serious threat to him for many years to come. Far from inheriting a weak or crisis-ridden economy, he was actually taking over at the best possible time when the country was recovering strongly but had not yet quite noticed that this was the case. Blair had won by being ruthless, and never forgot it, but he also seemed not to realise quite what an opportunity ‘providence’ had handed him.

How European were the British in the Nineties?:

The European Community in c. 1992 (Turkey did not join and still hasn’t)

Transport policy was only one of the ways in which the EU increasingly came to shape the geography of the British Isles in the 1990s. In 1992, British people in general were far more positive about membership of the European Community (EC) than they were twenty years later, especially those employed in transport. Roy Clementson, a fifty-nine-year-old who had been a long-distance truck driver for more than thirty years, was interviewed about his attitude to driving in the EC. He said that he preferred driving in Europe to the UK because the facilities were better:

“It’s easier to get a shower and there are far more decent places to eat, especially in France. I’m not one of those drivers who won’t eat anything except sausage, eggs, beans and chips. I like French food and I like the prices they charge. In France, if you look around, you can get a five-course meal and as much wine as you can drink for Ł5.50. In Italy, the food is very good, but it is more expensive: more like ten pounds per meal.

“I can assure you all the drivers would welcome a single European currency. It would be much better if you didn’t have to change money all the time. I’ve got nothing against the Royal Family, but not having coins with the Queen’s picture on doesn’t make any difference to me.

” I don’t see any disadvantages of going into Europe. I think the French and Germans have a better standard of living than we do. I recently saw a beautiful four-bedroomed house that cost about Ł75,000 in France. Now my house is worth more than that, but it’s only about one-third the size.

“People talk about our being ruled by Brussels; well, I think we could pick up a lot of good ideas. I wouldn’t say I feel European, but I don’t agree with people who have this attitude that we’re somehow better than they are. We have a lot to learn.”

The Independent.
Roy Clementson; photo by Nicholas Turpin.

Transport policy became a key factor in the creation of the new EU administrative regions of Britain between 1994 and 1999, as shown on the map below. At the same time, a number of British local authorities opened offices in Brussels for lobbying purposes. The European connection proved less welcome in other quarters, however. Fishermen, particularly in Devon and Cornwall and along the East coast of England and Scotland, felt themselves the victims of the Common Fisheries Policy quota system. A strong sense of Euroscepticism developed in parts of southern England in particular, fuelled by a mixture of concerns about sovereignty and economic policy. Nevertheless, links with Europe had been growing, whether via the Channel Tunnel, the connections between the French and British electricity grids, or airline policy, as had the number of policy decisions shaped by the EU.

Britain and Ireland in the 1990s, showing EU Regional Policy Areas, 1994-99.

With these better transport links, international travel became easier and cheaper for ordinary Britons. In particular, the cost of air travel began to fall with the establishment of budget airlines, package holidays became affordable for families, and more independent, longer-term cultural links were forged between individuals, families and organisations. Holiday consultant Suzi Stembridge, also featured in the article from The Independent, exemplified many of these developments and the changing attitudes towards a growing European community that they brought:

Suzi Stembridge was clearly enthusiastic about the vision of a new Europe that membership of the European Community offered. It seems that, with the exception of fishermen and some Tory Westminster politicians, the majority of British people were becoming more European and more enthusiastic about EC membership, especially those involved in transport and travel, education, entertainment and culture. In addition, they were no longer concerned about its effects on food prices and the cost of living, as many had been at the time of the 1975 Referendum. They could see from their own experience that, if anything, the standard of living on the continent was higher than that in Britain, and that the quality of life was, in many ways, better. On the other hand, they were not so enthusiastic about Britain joining a single currency and wanted each country to be able to keep its own identity. They may not have felt as 'truly European' as people on the continent, but neither were they, at that point (the time of the Maastricht Treaty negotiation and implementation), opposed to, or even sceptical of, membership of the European institutions.

The Growth of Celebrity Culture:

In 1997, Tony Blair arrived in power in a country with a revived fashion for celebrities, offering a few politicians new opportunities, though often at a high cost to their privacy and family lives. It was not until 1988 that the full shape of modern celebrity culture became apparent, when the first of the truly modern glossy glamour magazines, Hello!, was launched. Its successful formula was soon copied by OK! from 1993, and many other magazines followed suit, to the point where yards of coloured 'glossies' filled the newsagents' shelves in every town and village in the country. Celebrities were paid handsomely for being interviewed and photographed in return for coverage which was always fawningly respectful and never hostile. The rich and famous, no matter how flawed in real life, were able to shun the mean-minded sniping of the 'gutter press', the tabloid newspapers. In the real world, the sunny, airbrushed world of Hello! was inevitably followed by divorces, drunken rows, accidents and ordinary scandals. But people were happy to read good news about these beautiful people, even if they knew that there was more to their personalities and relationships than met the eye.

In the same year that Hello! went into publication, ITV also launched the most successful of its daytime television shows, This Morning, hosted from Liverpool by Richard Madeley and Judy Finnigan, providing television's celebrity breakthrough moment. A new form of television, 'reality television', was created that featured 'ordinary' people rather than actors or celebrities. The idea was developed from the fly-on-the-wall documentaries and talent shows of previous decades. The programme Big Brother, taking its title from the phrase in George Orwell's novel 1984, 'Big Brother is watching you', was first aired in 1999. Members of the public were locked in a house and observed on camera, which turned them into media stars overnight, though mostly only for a short time afterwards. The pop impresario Simon Cowell (b. 1959) created Pop Idol in 2001. Young singers were auditioned and then coached to become pop stars, eliminated one by one until a winner was chosen and given a recording contract. The format continued with The X Factor and Britain's Got Talent, which included acts other than pop singers and were open to people of all ages and backgrounds. These programmes were 'interactive', inviting viewers to vote by phone.

Just as in sixties Britain, in the nineties Britain again started producing music that became popular all over the world. Boy bands included Take That, whose members Robbie Williams and Gary Barlow later had solo singing careers. Unlike The Beatles, however, these 'bands' had to rely on backing instrumentalists. Girl bands like The Spice Girls promoted a fashion for 'girl power', and one of them, Victoria Adams, 'Posh Spice', married the footballer David Beckham. Together, they became a notable celebrity couple in Britain and around the world.

There had been a feeling that life under John Major’s government was rather dull, and that its politics was full of men in dark grey suits. Tony Blair’s victory in 1997 brought a change of mood in which it was ‘cool’ to be British again. The Blair government captured this mood, inviting pop stars, sporting heroes and other celebrities to Downing Street for ‘photo opportunities’ and encouraging a ‘rebranding’ of Britain as ‘Cool Britannia’.

Monarchy as the Indispensable Marker of British Identity, 1981-2002:
The Royal lineage in 1992

By the early 1990s, another indispensable marker of British identity, the monarchy, began to look tired, under the strain of being simultaneously a ceremonial and familial institution. Behind the continuing hard work and leadership of HM Queen Elizabeth and her Consort, Philip, and the Queen Mother, there were growing signs that the younger generation of the Royal Family, including the heir, Charles, Prince of Wales, was struggling to live up to the high standards set by the older ones. The seeds of these problems were sown at the beginning of the previous decade, if not earlier, with the deliberate development of the monarchy into a 'family' institution.

Top: The Duke of Edinburgh, Prince Philip, was given his title after his marriage to the then Princess Elizabeth in 1947. The Duke took a great deal of interest in the achievements of young people: in 1956 he founded the Duke of Edinburgh Award Scheme (through which awards are made to young people aged 14-21 for enterprise, initiative and achievement).

Below: The Queen’s Official Birthday, the second Saturday in June, is marked by the Trooping of the Colour, a ceremony during which the regiments of the Guards Division and the Household Cavalry parade (troop) the regimental flag (colour) before the sovereign.

Ever since the abdication of Edward VIII in 1936, which suddenly propelled the ten-year-old Princess Elizabeth into the spotlight as the heir apparent, the membership of this institution was thought to require standards of personal behaviour well above the general norm of late twentieth-century expectations.

A Page from a school textbook.

This celebrity fantasy world, which continued to open up in all directions throughout the nineties, served to alert politicians, broadcasting executives and advertisers to the considerable power of optimism. The mainstream media in the nineties were giving the British an unending stream of bleakness and disaster, so millions tuned in and turned over to celebrities instead. That they did so in huge numbers did not mean that they thought celebrities had universally happy lives. And in the eighties and nineties, no star gleamed more brightly in the firmament than the beautiful yet troubled Princess Diana. The fairy-tale wedding of the Prince of Wales and Lady Diana Spencer at St Paul’s in 1981, watched by a worldwide audience of at least eight hundred million viewers, meant that for the next fifteen years she was an ever-present figure in the media. Yet, as an aristocratic girl whose childhood had been blighted by the divorce of her parents, she found herself pledging her life to a much older man who shared few of her interests and did not even seem to be truly in love with her.

Above: Hello! looks back on the 1981 Royal Wedding from that of Catherine Middleton and William in 2011.

At first, Princess Diana seemed shy, naive and slightly awkward in front of the cameras. She suffered from an eating disorder, bulimia, and lost weight. But as she became more confident, she began to enjoy the attention she was attracting. In 1982, a little over a year after her fairy-tale wedding, Diana had her first son, William Arthur Philip Louis, who was born in London on the 21st of June. He lived at Kensington Palace with Prince Charles and Princess Diana. When he was only nine months old, his parents made an official visit to Australia and New Zealand. His mother did not want to be away from him for six weeks, so the little prince went with his parents.

Becoming ever more popular, she was soon dubbed The People’s Princess by the tabloid press. She also became the world’s most-photographed woman, and she was frequently seen on the covers of magazines and on television. She was even filmed dancing with the star of Saturday Night Fever and Grease, John Travolta.

In 1984, the royal couple had a second son, Prince Harry. William and Harry were always in the public eye as the new additions to the royal family. Photographers, newspaper reporters and TV cameras were part of their everyday life.

The two little boys enjoyed playing with each other at Kensington Palace. Young children in the royal family usually had nursery lessons at home, but Princess Diana, herself a former nursery assistant, changed things. When William was three, he began to go to school in London with other young children. When he was eight, he went to Ludgrove School, about forty miles away from London, so he lived at the school and only went home once a month. At first, he was unhappy, but he soon began to enjoy football and other sports. Two years later, his brother Harry joined him at the school.

Above: The State Opening of Parliament in November 1991, for the last session before the 1992 General Election.

As ever, the opening ceremony, shown above, was a mixture of pageantry and serious political business. Once the Queen had taken her seat on the throne in the House of Lords, she sent ‘Black Rod’ to the Commons chamber to summon the MPs to join her and the Lords to hear her read the speech outlining the new laws the Government was planning to make in the forthcoming parliamentary year. The Speaker of the Commons, Betty Boothroyd, led the Prime Minister and the Leader of the Opposition into the House of Lords, with the rest of the MPs following. Of course, it was not really the Queen’s Speech at all, but the Government’s, written by the Prime Minister of the time, in this case, John Major, and his colleagues to be read out by the Queen. In 1991, the Speech contained fewer proposals, or bills, than in previous years because the Government wanted to cut short the parliamentary year and call a General Election in the Spring (as it turned out, for 9th April). The main bills referred to in the Speech were on criminal justice and the environment, a ‘green’ bill. Other anticipated measures included privately financed toll roads and new rules to ease traffic congestion in London.

From a Profile of Charles III in an English Language magazine.

Just as the monarchy had gained from its ceremonies, especially its royal weddings, so it lost commensurately from the failure of those unions. As a young man, Charles had dated Camilla Shand, but when he was sent abroad with the Royal Navy, Camilla married Andrew Parker-Bowles. Even though he eventually married Diana, it became obvious that Charles was still in love with Camilla. When rumours spread of Diana’s affairs, they no longer had the moral impact that they might have had in previous decades. By the nineties, Britain was a divorce-prone country, in which ‘what’s best for the kids’ and ‘I deserve to be happy’ were phrases regularly heard in suburban kitchen-diners. Diana was not simply a pretty woman married to a king-in-waiting but someone people felt, largely erroneously, would understand them. There was an obsessive aspect to the admiration of her, something that the Royal Family had not seen before, and its leading members found it very uncomfortable and even, at times, alarming. They were being weighed in the balance as living symbols of Britain’s ‘family values’ and found wanting.

Yet as the two contemporary stories from The Daily Mail (below) show, even as they became increasingly estranged, the couple were able to support each other’s concerns, passions and charitable work. In the article on the left, Prince Charles pointed to the danger of teachers using teaching methods recommended by educational ‘experts’, leading to the teaching of only fashionable topics and the neglect of students’ cultural heritage. Though controversial with some in the profession, his call for more resources for schools was well received by most. Opening a London conference on children with AIDS, Princess Diana spoke about the confused understanding of HIV that existed and encouraged people, through her own actions as well as words, to overcome their fear of touching people, especially children, with the virus. She also outlined the tragic situation of children with HIV and the prejudice they faced.

From the Daily Mail.

Nevertheless, by the mid-1990s, the monarchy was looking shaky, perhaps even mortal. The year 1992, referred to by the Queen, in her Christmas Speech, as her annus horribilis, had seen not just the separations of Charles and Diana (of Wales) as well as Andrew and Sarah (the Yorks), but also a major fire at Windsor Castle in November. When it was announced that the Crown would only pay for the replacement and repair of items in the royal private collection and that repairs to the fabric would therefore come from the tax-paying public, a serious debate began about the state of the monarchy’s finances and its tax status. In a poll, eight out of ten people asked thought the Queen should pay taxes on her private income, hitherto exempt. A year later, Buckingham Palace was opened to public tours for the first time and the Crown did agree to pay taxes.

In December 1992, Diana went to see William and Harry at school. She told them that Charles and she were not a couple anymore. Naturally, William and Harry were very upset about this. Later the same month, John Major announced the separation of Charles and Diana to the House of Commons. After their separation, Diana continued her charity work, especially with AIDS victims and in the campaign against the use of land mines in wars, all of which kept her in the public eye and maintained her popularity in Britain and worldwide.

William with his mother in the early nineties.

The journalist Andrew Morton claimed to tell Diana’s True Story in a book which described suicide attempts, blazing rows, her bulimia and her growing certainty that Prince Charles had resumed an affair with his old love, Camilla Parker-Bowles, something he later confirmed in a television interview with Jonathan Dimbleby. There was a further blow to the Royal Family’s prestige in 1994 when it was announced that the royal yacht Britannia, the floating emblem of the monarch’s global presence, would be decommissioned. Mr and Mrs Parker-Bowles, Andrew and Camilla, divorced in 1995, and then came the revelatory (and now discredited) 1995 interview between Diana and Martin Bashir on BBC TV’s Panorama programme. Breaking every taboo left in Royal circles, she freely discussed the breakup of her marriage, claiming that there were three of us in it, attacked the Windsors for their cruelty and promised to be ‘queen of people’s hearts.’ When Charles and Diana finally divorced in 1996, she began a relationship with Dodi al-Fayed, the son of the owner of Harrods, Mohammed al-Fayed.

Meanwhile, William continued his education at Eton College, near Windsor Castle, so that he could visit his grandmother when she was at home. William worked hard at Eton and was also successful at sports. Then, on 31st August 1997, William and Harry were on holiday with their father and grandparents at Balmoral when the terrible news came of their mother’s death in a car accident in Paris. On 6th September, the boys walked behind their mother’s coffin through the streets of London to Westminster Abbey for her funeral. Prince Charles, Prince Philip and Diana’s brother, Earl Spencer, walked alongside them. Everyone felt heartbroken for the two young princes on what was a terrible day for the British royal family. William left Eton three years later, in 2000, and then went on a gap year, including a stint in Chile, where he worked as a resident teaching assistant.

To many in the establishment, Diana was a selfish, unhinged woman who was endangering the monarchy. To many millions more, however, she was more valuable than the formal monarchy, her readiness to share her pain in public making her even more fashionable. She was followed all around the world, her face and name selling many papers and magazines. By the late summer of 1997, Britain had two super-celebrities, Tony Blair and Princess Diana. It was therefore grimly fitting that Tony Blair’s most resonant words as Prime Minister, which brought him to the height of his popularity, came on the morning in August 1997 when Princess Diana was killed in a car accident in Paris with Dodi Fayed. Their car was speeding away from paparazzi photographers when it crashed into the pillars of an underpass. Tony Blair was woken at his Sedgefield constituency home, first to be told about the accident, and then to be told that Diana had died. Deeply shocked and worried about what his proper role should be, Blair spoke first to Alastair Campbell and then to the Queen, who told him that neither she nor any other senior member of the Royal Family would be making a statement. He decided, therefore, that he had to say something. Later that Sunday morning, standing in front of his local parish church in Trimdon, he spoke words which were transmitted live around the world:

“I feel, like everyone else in this country today, utterly devastated. Our thoughts and prayers are with Princess Diana’s family, in particular her two sons, her two boys – our hearts go out to them. We are today a nation in a state of shock…

“Though her own life was often sadly touched by tragedy, she touched the lives of so many others in Britain and throughout the world with joy and with comfort. How many times shall we remember her, in how many different ways: with the sick, the dying, with children, with the needy? With just a look or a gesture that spoke so much more than words, she would reveal to all of us the depth of her compassion and her humanity.

“People everywhere, not just here in Britain, kept faith with Princess Diana. They liked her, they loved her, they regarded her as one of the people. She was – the People’s Princess and that is how she will stay, how she will remain in our hearts and our memories for ever.”

Although these words seem, more than twenty-five years on, to be reminiscent of past tributes paid to religious leaders, at the time they were much welcomed and assented to. They were the sentiments of one natural charismatic public figure for another. Compared with other politicians, Tony Blair seemed very young and in tune with youth culture. Blair regarded himself as the People’s Prime Minister, leading the people’s party, beyond left and right, beyond faction or ideology, with a direct line to the people’s instincts. After his impromptu eulogy, his approval rating rose to over ninety per cent, a figure not normally witnessed in democracies.

Blair and Campbell then paid their greatest service to the ancient institution of the monarchy itself. The Queen, still angry and upset about Diana’s conduct and concerned for the welfare of her grandchildren, wanted a quiet funeral and to remain at Balmoral, away from the scenes of spontaneous public mourning in London. However, this was potentially disastrous for her public image. There was a strange mood in the country deriving from Diana’s charisma, which Blair had referenced in his words at Trimdon. If those words had seemed to suggest that Diana was a saint, a sub-religious hysteria responded to the thought. People queued to sign a book of condolence at St James’ Palace, rather than signing it online on the website of the Prince of Wales. Those queuing even reported supernatural appearances of the dead Princess’ image. By contrast, the lack of any act of public mourning by the Windsors and the suggestion of a quiet funeral seemed to confirm Diana’s television criticisms of the Royal Family as being cold if not cruel towards her.

A sea of flowers laid in tribute to Diana, Princess of Wales, outside Kensington Palace, London, August 1997

Also, with Prince Charles’ full agreement, Blair and his aides pressed the Palace into accepting, first, that there would have to be a huge public funeral so that the public could express their grief, and, second, that the Queen should return to London. She did, just in time to quieten the genuine and growing anger about her perceived attitude towards Diana. This was a generational problem as well as a class one. The Queen had been brought up in a land of buttoned lips, stoicism and private grieving. She now reigned over a country which expected and almost required exhibitionism. For some years, the deaths of children, or the scenes of fatal accidents, had been marked by little shrines of cellophane-wrapped flowers, soft toys and cards. In the run-up to Diana’s funeral, parts of central London seemed almost Mediterranean in their public grieving. There were vast mounds of flowers, people sleeping out, holding up placards and weeping in the streets, and strangers hugging each other.

The funeral itself was like no other before, bringing the capital to a standstill. In Westminster Abbey, campaigners stood alongside aristocrats, entertainers with politicians and rock musicians with charity workers. Elton John performed a hastily rewritten version of ‘Candle in the Wind’, originally his lament for Marilyn Monroe, now dedicated to ‘England’s Rose’, and Princess Diana’s brother Earl Spencer made a half-coded attack from the pulpit on the Windsors’ treatment of his sister. This was applauded when it was relayed outside and clapping was heard in the Abbey itself. Diana’s body was driven to her last resting place at the Spencers’ ancestral home of Althorp in Northamptonshire. Nearly a decade later, and following many wild theories circulated through cyberspace which reappeared regularly in the press, an inquiry headed by a former Metropolitan Police commissioner concluded that she had died because the driver of her car was drunk and was speeding in order to throw off pursuing ‘paparazzi’ photographers. The immense outpouring of public emotion in the weeks that followed seemed both to overwhelm and distinguish itself from the more traditional devotion to the Queen herself and to her immediate family. As Simon Schama has put it,

The tidal wave of feeling that swept over the country testified to the sustained need of the public to come together in a recognizable community of sentiment, and to do so as the people of a democratic monarchy.

Diana’s funeral was probably watched by as many people as had watched her wedding. For a short time, the Queen became very unpopular, as she did not immediately return from Balmoral to Buckingham Palace when Diana died, or even fly a flag at half-mast. But those who complained about this in the popular media did not seem to understand that Royal protocol dictates that the Royal Standard should only be flown above Buckingham Palace when the Monarch is in residence. Added to this, the Union Flag is only flown above the royal palaces and other government and public buildings on certain special days, such as the Princess Royal’s birthday, 15 August. Since it was holiday time for the Royal family, and they were away from London, there were no flags flying. Nor would the general public have known that flags are only flown at half-mast on the announcement of the death of a monarch until after the funeral, and on the day of the funeral only for the deaths of other members of the royal family.

The Queen, as the only person who could authorise an exception to these age-old customs, received criticism for not flying the union flag at half-mast in order to fulfil the deep need of her grief-stricken subjects. Although Her Majesty meant no disrespect to her estranged and now deceased daughter-in-law, the Crown lives and dies in such symbolic moments, and she duly relented. The crisis was resolved by a live, televised speech she made from the Palace, which was striking in its informality and obviously sincere expression of personal sorrow. Her people now understood that their way of grieving was very different from the more conventional but no less heartfelt mourning of the Queen and her immediate family. Her Majesty quickly rose again in public esteem and came to be seen as one of the most successful monarchs for centuries and the longest-serving ever. A popular film about her, including a sympathetic portrayal of these events, sealed this verdict.

Above: Her Majesty Queen Elizabeth in 2001, aged 75. She reigned for another twenty-one years,
and celebrated her Diamond Jubilee in 2012 and her Platinum (70th) Jubilee in 2022.

Tony Blair never again quite captured the mood of the country as he did in those sad late-summer days. It may be that his advice and assistance to the Queen in 1997, vital to her as they were, were also, in the view of Palace officials, thoroughly impertinent. His instinct for popular culture when he arrived in power was certainly uncanny. The New Age spiritualism which came out into the open when Diana died was echoed among Blair’s Downing Street circle. What other politicians failed to grasp, and what he did grasp, was the power of optimism expressed in the glossy world of celebrity, and the willingness of people to forgive their favourites not just once, but again and again. One of the negative longer-term consequences of all this was that charismatic celebrities discovered that, if they apologised and bared a little of their souls in public, they could get away with most things short of murder. For politicians, even charismatic ones like Tony Blair, life would prove a little tougher, and the electorate would be less forgiving of oft-repeated mistakes. Nevertheless, like Margaret Thatcher, he won three elections and was in office for more than ten years.

The monarchy was fully restored to popularity by the Millennium festivities, at which the Queen watched dancers from the Notting Hill Carnival under the famous Dome, and especially by the Golden Jubilee celebrations of 2002, which continued the newly struck royal mood of greater informality.


One of the highlights of the ‘Party in the Mall’ came when Brian May, the lead guitarist of the rock band Queen, began the pop concert at Buckingham Palace by playing his instrumental version of God Save the Queen from the Palace rooftop. Modern Britannia seemed, at last, to be at ease with its identity within a multi-national, multi-ethnic United Kingdom, in all its mongrel glory.

Also in 2002, George VI’s widow, Queen Elizabeth, known as the Queen Mother after her husband’s death and her daughter’s accession in 1952, died at the age of 102. After George VI’s death, she continued to carry out many public duties at home and abroad. Many thought that Charles would not marry the divorced Camilla while his grandmother was still alive. She had become Queen Consort in 1937 because her brother-in-law, Edward VIII, had not been allowed to marry a divorcée as king, choosing to abdicate the throne instead.

Prince Charles & the Duchess of Cornwall leaving their civil marriage ceremony at the Windsor Registry Office in 2005.

Charles and Camilla were able to marry in 2005, in a civil wedding at Windsor, but it was decided that Camilla should be known as Duchess of Cornwall, out of respect for Diana, Princess of Wales.

She is now (2022) also known as the Queen Consort.


Simon Schama (2002), A History of Britain; The Fate of Empire, 1776-2000. London: BBC Worldwide.

Robert McCrum, William Cran & Robert MacNeil (1987), The Story of English. London: Penguin Books.

John Haywood & Simon Hall, et al. (2001), The Penguin Atlas of British and Irish History. London: Penguin Books.

Gwyn A. Williams (1985), When Was Wales? Harmondsworth: Penguin Books.

Bill Lancaster & Tony Mason (eds.)(n.d.), Life and Labour in a Twentieth Century City: The Experience of Coventry. Coventry: University of Warwick Cryfield Press.

Appendix: A Breakdown of Census Statistics, 1901-1991, in Graphs:


Population: By the end of the twentieth century, there were twenty million more Britons than there had been at its beginning. In the first part of the century, the population grew fast due to high birth rates. From the 1970s, this growth slowed down as fewer babies were born. But people born since 1991 can expect to live much longer than those born in 1901. Women, as always, can expect to live longer than men. At the beginning of the century, only about four per cent of the population was over sixty-five; by its end, this had risen to thirteen per cent. At the same time, fewer babies were being born, so whereas in 1901 the average household included 4.6 people, in 1991 it was only 2.5. This was not just because there were fewer children, but also because there were many more single-parent families.

There was also a dramatic fall in the number of servants. Over the first half of the century, the number of households with live-in servants fell from over ten per cent to only about one per cent.

Work: The number of working women doubled over the course of the century. The number of unemployed people rose even faster, especially in the period between the two world wars, and again in the 1970s and ’80s. Those employed in manual labour or ‘manufacturing’ decreased significantly, by more than a quarter, from three-quarters of the workforce to less than half, so that in 1991 most people in work were in non-manual occupations based in shops and offices. On average, working hours were reduced by nine hours per week, more than one whole working day on average, leaving more family and leisure time.

Spending (left, below): Over the twentieth century, people were gradually able to spend an increasing proportion of their income on ‘consumer goods’ and a smaller proportion on food.


Majesty & Grace X: The Reign of Elizabeth Windsor – Winter of Discontent to Golden Jubilee, 1979-2002; Part 1 – Wars & Paupers.

Britain at the End of the Cold War World:
Britain, Ireland and the World, 1970-2000.

Britain had retreated from most of its empire by 1970. The largest remaining colony was Rhodesia, which had been ruled, illegally, by a white minority government since its unilateral declaration of independence in 1965. Britain briefly resumed control in 1979-80, and the country became independent as Zimbabwe in 1980. Various smaller island colonies in the Caribbean and the Pacific were granted independence in the seventies and eighties. The European Community, the Cold War and the NATO Alliance were Britain’s main concerns until 1990, while Ireland pursued a policy of neutrality. With the end of the Cold War, Britain took a more active role in Iraq, Bosnia and Kosovo. Hong Kong and its adjacent New Territories became an autonomous region of China in 1997 upon the expiry of the ninety-nine-year lease. The handover of that colony, with its large population, enabled the British government to offer citizenship to the inhabitants of the remaining small colonies that formed the British Overseas Territories.

A European solution, through membership of the EEC, twice vetoed by de Gaulle in the sixties, had finally been found in the early seventies under Heath and Wilson. Simultaneously, the third way, initiated in 1970 by free-enterprise, anti-collectivist Tories like Anthony Barber, Edward du Cann and Keith Joseph at the Selsdon Park conference, prepared the way for Margaret Thatcher’s attempt in the 1980s to liquidate what was left of the welfare state. Billed as a return to Victorian values, Thatcher’s Revolution was not, in fact, a return to Gladstonian liberalism, but a reversion to the hard-faced reactionary conservatism of the 1920s, leaving industries alone to survive and thrive or to go to the wall. As in the twenties, resistance to brutal rationalisation through closures and sell-offs of uneconomic nationalised enterprises, or through wage or job cuts, was met with determined opposition, which came to a head in the long-running coal dispute of 1984-85.

Britain continued to operate as a dominant centre of world finance, but even this advantage turned into a liability when the defence of sterling forced successive governments, especially the second Wilson government, into accepting humiliating conditions, either from the United States or from the International Monetary Fund, involving deep spending cuts. The shrinkage of sovereignty accelerated, with increasing battles between unions and government over the slice sizes of an ever-diminishing economic pie. Yet successive governments seemed determined to keep Britain as a substantial military power with a fully funded welfare state. All the alternative models of post-imperial power applied between the sixties and the eighties ran into trouble. Relying exclusively on the United States for its nuclear defence was ruled out as anathema by both Labour and the Conservatives: this was seen as an abdication not just of great-power status but of any power status, a kind of recolonisation in reverse.

The Assassination of Lord Mountbatten by the IRA, August 1979:

Of all the areas of the United Kingdom, it was Northern Ireland that continued to suffer the highest levels of unemployment in the eighties. This was mainly because the continuing sectarian violence discouraged inward investment in the six counties of the Province.

Lord Mountbatten of Burma

On August 27, 1979, in Mullaghmore, County Sligo, on the western coast of the Republic of Ireland (see the map above), a massive 50lb remote-controlled bomb exploded on board the fishing boat Shadow V, killing Lord Louis Mountbatten, his grandson and two others while they were boating on holiday off the coast. Lord Mountbatten was HM Queen Elizabeth’s second cousin and Prince Philip’s uncle. He was also, at that time, HRH Prince Charles’ great uncle, godfather and mentor. This was the height of the Provisional IRA’s bombing campaign across the British Isles.

Charles later described Lord Louis Mountbatten as the grandfather I never had. Mountbatten was a strong influence in the upbringing of his grand-nephew, and from time to time strongly upbraided the Prince for showing tendencies towards the idle pleasure-seeking dilettantism of his predecessor as Prince of Wales, King Edward VIII, whom Mountbatten had known well in their youth. Yet he also encouraged the Prince to enjoy the bachelor life while he could, and then to marry a young and inexperienced girl so as to ensure a stable married life.

Prince Charles in the late 1970s: the eligible Bachelor.

Mountbatten’s qualification for offering advice to this particular heir to the throne was unique; it was he who had arranged the visit of King George VI and Queen Elizabeth to Dartmouth Royal Naval College on 22 July 1939, taking care to include the young Princesses Elizabeth and Margaret in the invitation, but assigning his nephew, Cadet Prince Philip of Greece, to keep them amused while their parents toured the facility. This was the first recorded meeting of Charles’s future parents. But a few months later, Mountbatten’s efforts nearly came to nought when he received a letter from his sister Alice in Athens informing him that Philip was visiting her and had agreed to repatriate permanently to Greece. Within days, Philip received a command from his cousin and sovereign, King George II of Greece, to resume his naval career in Britain which, though given without explanation, the young prince obeyed.

In 1974, Mountbatten began corresponding with Charles about a potential marriage to his granddaughter, Amanda Knatchbull. It was about this time he also recommended that the 25-year-old prince get on with “sowing some wild oats”. Charles dutifully wrote to Amanda’s mother (who was also his godmother), Lady Brabourne, about his interest. Her answer was supportive but advised him that she thought her daughter was still rather young to be courted. Four years later, Mountbatten secured an invitation for himself and Amanda to accompany Charles on his planned 1980 tour of India. Their fathers promptly objected. Prince Philip also thought that the Indian public’s reception would more likely reflect a response to the uncle, the last Viceroy, than to the nephew. Lord Brabourne counselled that the intense scrutiny of the press would be more likely to drive Mountbatten’s godson and granddaughter apart than together. Charles was rescheduled to tour India alone, but Mountbatten did not live to the planned date of departure.

Mountbatten usually holidayed at his summer home, Classiebawn Castle, on the Mullaghmore Peninsula in County Sligo, in the northwest of Ireland. The village was only twelve miles from the border with County Fermanagh in Northern Ireland and near an area known to be used as a cross-border refuge by IRA members. In 1978, the IRA had allegedly attempted to shoot Mountbatten as he was aboard his boat, but poor weather had prevented the sniper from taking his shot. On 27th August 1979, Mountbatten went lobster potting and tuna fishing in his thirty-foot wooden boat, Shadow V, which had been moored in the harbour at Mullaghmore. IRA member Thomas McMahon had slipped onto the unguarded boat the previous night and attached a radio-controlled bomb weighing fifty pounds. When Mountbatten and his party had taken the boat just a few hundred yards from the shore, the bomb was detonated. The boat was destroyed by the force of the blast and Mountbatten’s legs were almost blown off. Mountbatten, then aged seventy-nine, was pulled alive from the water by nearby fishermen but died from his injuries before being brought to shore.

Also aboard the boat were Amanda Knatchbull’s mother Patricia, Lady Brabourne; her father, Lord Brabourne; her younger twin brothers Nicholas and Timothy Knatchbull; Lord Brabourne’s mother Doreen, Dowager Lady Brabourne; and Paul Maxwell, a young crew member from Enniskillen in County Fermanagh. Nicholas (aged fourteen) and Paul (fifteen) were killed by the blast and the others were seriously injured. Doreen, Dowager Lady Brabourne (eighty-three), died from her injuries the following day. The attack triggered outrage and condemnation around the world. The Queen received messages of condolence from leaders including US President Jimmy Carter and Pope John Paul II. Prime Minister Margaret Thatcher said:

His death leaves a gap that can never be filled. The British people give thanks for his life and grieve at his passing.

Admiral of the Fleet The Right Honourable
The Earl Mountbatten of Burma
Portrait by Allan Warren, 1976

On the day of the bombing, the IRA also ambushed and killed eighteen British soldiers at the gates of Narrow Water Castle, just outside Warrenpoint, in County Down in Northern Ireland, sixteen of them from the Parachute Regiment, in what became known as the Warrenpoint ambush. It was the deadliest attack on the British Army during the Troubles. Six weeks later, Sinn Féin vice-president Gerry Adams said of Mountbatten’s death:

“The IRA gave clear reasons for the execution. I think it is unfortunate that anyone has to be killed, but the furor created by Mountbatten’s death showed up the hypocritical attitude of the media establishment. As a member of the House of Lords, Mountbatten was an emotional figure in both British and Irish politics. What the IRA did to him is what Mountbatten had been doing all his life to other people; and with his war record I don’t think he could have objected to dying in what was clearly a war situation. He knew the danger involved in coming to this country. In my opinion, the IRA achieved its objective: people started paying attention to what was happening in Ireland.”

When Charles finally proposed marriage to Amanda later in 1979, the circumstances had changed and she refused him.

The Winter of Discontent:

The winter of our discontent, a phrase from Shakespeare’s play Richard III, was used to describe the industrial and social chaos of 1978-79 under James Callaghan, Labour Prime Minister from 1976. It has stuck in people’s memories as few economic or political events had done before or have done since. Through 1977 and 1978 there was an explosion of resentment, largely by poorly paid public employees, against a minority Labour government incomes policy they felt was discriminatory. The unrest began in 1978 but grew far worse as a series of strikes ran into winter, leaving rubbish piled up and rotting in the streets throughout the country. Added to this, the schools closed, the ports were blockaded and the dead went unburied. Left-wing union leaders and activists whipped up the disputes; individual union branches and shop stewards were reckless and heartless. Right-wing newspapers, desperate to see the end of Labour, exaggerated the effects and rammed home the picture of a nation no longer governable. The scenes on Britain’s streets provided convincing propaganda for the Conservatives in the subsequent election in May 1979.

Callaghan had opposed the legal restrictions on union power pleaded for by Wilson and Castle and then fought for vainly by Heath. His Chancellor, Healey, acting in good faith, had imposed a more drastic squeeze on public spending, and thus on the poorest families, than had been economically necessary. Together they had also tried to impose an unreasonably tough new incomes policy on the country. Finally, by dithering over the date of the general election, Callaghan destroyed whatever fragile calm he had managed to establish and enjoy since taking over from Wilson. Most observers and members of the cabinet assumed that he would call an autumn election in 1978. The economic news was still good and Labour was ahead in the polls. Two dates in October had already been pencilled in, but musing on his Sussex farm during the summer, Callaghan decided that he did not trust the polls and would wait until the following spring. Yet when he invited a dozen trade union leaders to his farm to discuss the decision, they left still thinking he was going in the autumn. Then, at the beginning of autumn, at the TUC conference, he confused the issue even more with a bizarre rendition of an old music-hall song, leaving his audience to interpret its meaning. When he finally came clean with the cabinet, they were shocked.

This might not have mattered so much had Callaghan not also promised a new five per cent pay limit to bring down inflation further. As a result of the 1974-75 cash limit on pay rises at a time of high inflation, take-home pay for most people had been falling ever since. Public sector workers, in particular, had been having a hard time, and there were stories of fat-cat directors and bosses awarding themselves high settlements. The union leaders and many ministers thought that a further period of pay limits would be impossible to sell to their members, while a five per cent limit was widely considered ludicrously tough. Had Callaghan gone to the country in October, Labour’s popularity among the general electorate might have been boosted by the pay restraint policy. But by postponing the election until the spring, Callaghan ensured that the five per cent limit would be tested in Britain’s increasingly impatient and dangerous industrial relations environment. The first challenge came from the fifty-seven thousand car workers employed by the US giant Ford. The TGWU called for a thirty per cent pay rise, on the back of high profits and an eighty per cent rise for the chairman and directors. Callaghan was severely embarrassed and when, after five weeks of lost production, Ford eventually settled for seventeen per cent, he became convinced that he would lose the coming election.

Oil tanker drivers, also TGWU members, came out for forty per cent, followed by haulage drivers, British Leyland workers, and then sewerage workers. BBC electricians threatened a blackout of Christmas TV. The docks were picketed and closed down with Hull, virtually cut off, becoming known as the second Stalingrad. The effects of this were being felt by government ministers as well as the rest of the country. Bill Rodgers, the transport minister, whose mother was dying of cancer, found that vital chemotherapy chemicals were not being allowed out of Hull. In the middle of all this, Callaghan went to an international summit in the Caribbean, from where the pictures of him swimming and sunning himself did not improve the mood back home. When he returned to Heathrow, confronted by news reporters asking about the industrial crisis, he replied blandly, “I don’t think other people in the world will share the view that there is mounting chaos.” This was famously translated into the Daily Mail’s headline, ‘Crisis? What Crisis?’


Above: Rubbish is left piled up in London’s Leicester Square in February 1979

In the centre of London and other major cities, rotting rubbish accumulated in huge piles, overrun with rats and becoming a serious health hazard. Most of the striking public sector workers were badly paid, lived in relative poverty and had no history of industrial militancy. Nor was the crisis quite as bad as the media portrayed it. There were no deaths in hospitals as a result of union action, there was no shortage of food in the shops and there was no violence. Troops were never used. This was chaos and a direct threat to the authority of the government, but it was not a revolutionary situation, nor even an attempt to overthrow a particular government. Yet that was the effect it had, and it led to what was to become Thatcherism, not socialism. A ‘St Valentine’s Day Concordat’ was eventually reached between the government and the TUC, which agreed on annual assessments and guidance, targeting long-term inflation and virtually admitting that the five per cent pay limit had been a mistake. By March most of the action had ended and various large settlements and inquiries had been set up. But by then irreparable damage had already been done to the Labour Government’s reputation as a manager of industrial disputes.

The St David’s Day Devolution Débâcle & General Election of 1979:

When the matter of devolution was put to the Welsh electorate in a referendum on St. David’s Day, 1979, it voted overwhelmingly against the planned assembly, by 956,000 votes to 243,000. Every one of the relatively new Welsh counties voted ‘No’. Support was strongest in the north-west (Gwynedd), at twenty-two per cent of those voting, and weakest in Glamorgan and Gwent at seven to eight per cent, but everywhere there was a crushing rejection of the Labour government’s proposal; only some twelve per cent of the total electorate voted in favour. The narrow failure of the Scottish referendum, due to the forty per cent clause, meant that under previously agreed rules the Devolution Act setting up a Scottish Assembly had to be repealed. In the Commons, the government was running out of allies. There was therefore no longer any reason for the SNP to continue supporting the Labour government, though Plaid Cymru, unlike the SNP, did not call on its MPs to vote to bring about the end of the government. As dying MPs were carried through the lobbies to keep the sinking Labour ship afloat, Michael Foot and the Labour whips continued to try to find any kind of majority, appealing to renegade Scots, Ulster Unionists and Irish nationalists simultaneously, but Callaghan, by now, was in a calmly fatalistic mood. He did not want to struggle on through another summer and autumn in the hope that something would turn up.

Finally, on 28th March, the government was defeated by a single vote, and Callaghan became the first Prime Minister since Ramsay MacDonald in 1924 to have to go to Buckingham Palace and ask for a dissolution of parliament because he had lost a vote in the Commons. The election campaign began in the shadow of the assassination of Mrs Thatcher’s leadership campaign manager, Airey Neave, murdered by an Irish National Liberation Army (INLA) car bomb as he drove out of the Commons car park. Thatcher took on Callaghan, who was still more popular than his party and who now emphasised stable prices and his new deal with the unions. Thatcher showed a fresh, authentic face in the media, working with television news teams and taking the advice of advertising gurus like the Saatchi brothers. Using the slogan ‘Labour Isn’t Working’, which appeared on huge hoardings showing long dole queues, the Tories came back to power with a clear and substantial majority, winning 339 seats. In the General Election, the Conservatives took sixty-one seats directly from Labour and gained nearly forty-three per cent of the vote.

At a time of heavy swings to the Tories everywhere, the heaviest swings of all outside London were in Wales, and in every part of Wales. More Tory MPs were returned in Wales than at any previous twentieth-century election, eleven in all. Nevertheless, Labour remained a massive presence, with twenty-one seats out of thirty-six and forty-seven per cent of the vote compared with thirty-two per cent for the Conservatives. In June, in the heartland of Labourism, there was also a heavy vote against the European Common Market. In the General Election of May, Wales located itself firmly in the political South of Britain, rather than in its traditional role as a Northern Labour stronghold. What had happened was that a third challenger had bypassed the old debates and left Labour and Plaid Cymru gasping. The latter’s President, Gwynfor Evans, came third behind the Tory in Carmarthen, and a Sussex solicitor ‘parachuted’ onto Anglesey, Mother of Wales, won the seat ahead of both Labour and Nationalist parties with a swing of twelve per cent. The Tories also ended generations of Liberal predominance in Montgomeryshire. Even in Labour’s industrial heartlands of the southern valleys, the swing to the Conservatives was the second strongest in Britain.

One effect of this abrupt reversal of a hundred years of history was to cut off, equally abruptly, an intelligentsia from its people. The ‘professional’ Welsh, blinded by their own desires, had misread Wales very badly in the 1970s. In the 1980s, historians Gwyn Williams (right) and Dai Smith (below) argued that the reasons for this were as much sociological as ideological. The decline of the Welsh coalfields entailed a withering of major political and cultural energies. History had been wilfully redefined so that it stood only as a commentary on what might, or should, have been.

Both Smith and Williams recognised that a more pragmatic, economic nationalism was on offer from the ‘National Left’ of Plaid Cymru, which took a libertarian socialist stance and tried to establish links with ecological, peace and feminist groups. Nevertheless, the broad understanding behind the resurgence of political, cultural and linguistic nationalism remained, implicitly, what it had been explicitly in the 1920s and ’30s: the theory that the anglicisation of Wales was an avoidable disaster, because the industrialisation of Wales had replaced the pure ‘old Welsh collier’ population with a conglomeration of ‘half-people’, the Cymry di-Gymraeg (‘the Welsh without Welsh’).

Of course, by the mid-1970s, Plaid Cymru was no longer advocating, with the cold logic of Saunders Lewis, the deindustrialisation of Wales. In any case, this was something that was to be largely accomplished by the Thatcher government after 1979, though for very different reasons. In the twenty years previous to 1979, what had been stressed by cultural and political nationalists was the extent to which the modern experience of Wales had been a ‘fall from grace.’ Even in the eighties, many traditional nationalists did still believe that Wales could only be saved through the restoration of the old ‘Welsh’ values, delivered by the vehicle of the ancient Welsh language, albeit in ‘living’ form. But by then, the votes of 1979 had already dramatically registered the end of an epoch in Wales in which intellectuals – liberal, nationalist and labourist – had articulated the consciousness of various social groups and classes in Welsh society.

After the Spring of 1979, managerial and administrative groups in Wales became increasingly integrated into broader, technocratic European contexts, without any specifically Welsh content. The most visible and creative formers of educated opinion among the Welsh were rejected by their people and viewed as irrelevant. Old Labourism in Wales, along with Gwynfor Evans’s Nationalism and Lloyd George’s Liberalism were all effectively dead. Michael Foot and Neil Kinnock (above) tried hard to resurrect the Bevanite tradition across Wales and England, supported by the Tribune Left in the early eighties, but they ultimately failed to provide an Alternative Economic Strategy to that of Thatcherism in the three successive general elections. Mrs Thatcher herself had an acronym for her monetarist policies, TINA (There is no alternative). Dai Smith summed up the extent to which the cultural battlefield had already shifted, writing in 1984:

Wales in the 1980s has become its own industry. … The production of history in Wales is now a battleground on which rival armies contend to dispel the confusion. … What we are witnessing is the invoking of the Welsh past in the disputed name of the Welsh future. … Social history that dips a toe in national waters is invariably accused of polluting intent. The purity of emotion is defiled. … for the Wales that is projected to the outside world is not a Wales most of the Welsh know or recognise as anything of their own. Perhaps our ambivalent condition is exemplary after all. We are already a long way down the road which England… has just begun to contemplate. After all, England, too, is a country of the mind.

Dai Smith (1984), Wales! Wales? Hemel Hempstead: George Allen & Unwin, pp. 165-68.

The Breaking of Consensus, Blitz on Jobs & Wales in the Wilderness:

Meanwhile, the players in the last act of Old British Labour and the broken post-war Consensus stumbled on. Callaghan continued as Labour leader until retiring in October 1980. Michael Foot took over as Labour leader after a contest with Denis Healey, who then fought a desperate struggle against Tony Benn and the Left to become Deputy Leader, as his party did its best to commit suicide in public. Numerous moderates formed the breakaway SDP. The Scottish Nationalists, derided by Callaghan as ‘turkeys voting for an early Christmas’ for voting him down, lost eleven of their thirteen MPs. The unions lost almost half their members and any real political influence they had briefly held.

More important than all that, mass unemployment would return to Britain. It was the one economic medicine so bitter that no minister in the seventies, Labour or Tory, had dared to uncork it, until those of the Thatcher government. In the election campaign, Margaret Thatcher promised a return to the values which had made Victorian Britain great. However, what the British people got was more of a return to the hard-nosed Toryism of the interwar years, as the Thatcher government set about the task of deliberately lengthening those dole queues. Wage rises were believed to be the main source of inflation, and heavy unemployment, it was often openly argued, would weaken trade union bargaining power and was a price worth paying. At the same time, an economic squeeze was introduced, involving heavy tax increases and a reduction in public borrowing to deflate the economy, thus reducing both demand and employment. In the 1980s, two million manufacturing jobs disappeared, most of them by 1982.

When Thatcher took on the moderate ‘wets’ in her cabinet, including Jim Prior (bottom of the cartoon), she could rely on the support of much of the press.

Almost immediately after the general election, Wales was fully exposed to the Conservative crusade and the radical restructuring of increasingly multinational capitalism in Britain. The Welsh working population reached a peak in 1979, when just over a million people were at work, fifty-five per cent of them in the service sector and forty-two per cent of them women. In the core industries, the run-down of coal continued and was followed by a sharp reduction in steel. A Guardian columnist wrote that, in economic terms, every time England catches a cold, Wales gets influenza. Between June 1980 and June 1982, the official working population fell by no fewer than 106,000. The most catastrophic losses were in steel, which lost half its workers, plummeting to 38,000. In addition, there were heavy losses in chemicals, textiles, engineering, construction and general manufacturing. The distributive trades, transport and communications also shed thousands of jobs. Public administration lost proportionately fewer, while a whole range of services in insurance, banking, entertainment, leisure, education and medical services gained four thousand workers.

By June 1983 the official working population of Wales was at 882,000, its lowest level in modern times. There was a high level of unemployment, particularly among a whole generation of young people. In the Thatcher years Wales, like Scotland, was dominated by the politics of resistance to Conservatism, but the Wales TUC was weakened by losing numbers and funds, seemingly incapable of adequate adjustment and response. Overall, the organised trade union movement seemed encased in a perception of a Welsh working class which had become a myth. There was a People’s March for Jobs, but it was a pale imitation not just of previous mass demonstrations in the valleys, but even of the contemporary ones in England. The exceptions were, again, the successful strikes and campaigns against closures by the South Wales Miners (South Wales Area, NUM) in 1981-82, but even there, the question was being asked at public meetings, Have the Miners Really Won? The answer came in 1984, by which time they were gaining support, and preparing to fight a struggle as hard and as dedicated as any in their history.

Meanwhile, much radical energy went into CND mass meetings and protests, which acquired much more weight across south Wales and the valleys, from Monmouthshire to Carmarthenshire. The protest camp at the Greenham Common missile base was started by a march of women from Cardiff. Around the language issue, the clamour and turmoil revived and continued in the campaign for the Welsh-medium television channel, S4C. Among Welsh nationalist students, support for constitutional nationalism plummeted after the Referendum result and calls for more radical direct action multiplied for the first time since the Investiture Crisis of 1969 and the botched bombing of Caernarfon Castle. This action was largely limited to student members of Cymdeithas yr Iaith (the Welsh Language Society) climbing transmitter masts and smashing old television sets at the National Eisteddfod. The friction between Welsh-speaking and English-speaking students led to a division in the student unions in both Bangor and Aberystwyth, allowing right-wing English conservatives to take control. Using Gandhian tactics, Gwynfor Evans threatened a hunger strike to the death to secure the Fourth Channel in Welsh; the government relented, and the channel was eventually launched, with the cartoon character SuperTed among its first programmes, in November 1982.

In 1981, after a successful campaign to establish a fourth television channel in Welsh, Gwynfor Evans (left) stood down as Plaid Cymru President after thirty-six years, to be replaced by Dafydd Wigley, MP for Caernarfon (right).

There was also a far more sinister campaign led by the shadowy organisations Meibion Glyndwr (Sons of Glyndwr) and the Workers’ Army of the Welsh Republic (whose Welsh initials spelt ‘Dawn’), which apparently acquired weapons from the former Official IRA. A major campaign of arson against holiday homes began in 1980 throughout western Wales, where passive support from local people went under the humorous slogan Strike a Light for Wales above a picture of an ‘England’s Glory’ matchbox. A major police action, Operation Tân (Fire), was launched by the Gwynedd and Dyfed-Powys police forces, producing a chorus of complaints over violations of civil rights, telephone tapping and the use of agents provocateurs. Several police officers were accused of fabricating evidence and confessions and of trying to falsely implicate the Meirionnydd MP Dafydd Elis Thomas in a bombing campaign against military, government and Conservative Party offices.

The Iron Lady on manoeuvres with a tank, union flag and Prince of Wales’s feathers. By 1983, she had well and truly parked her tank on the green, green grass of Wales, and was at the peak of her powers throughout the United Kingdom.
Sacrificing the Young – the Case of Coventry in the Recession:

In Coventry, nearly sixty thousand jobs were lost in the recession of 1979 to 1983. The Conservative policy of high interest rates tended to overvalue the pound, particularly against the dollar, and since the USA was the major market for Coventry’s specialist cars, demand declined rapidly. Also, the Leyland management embarked on a new rationalisation plan. The company’s production was to be concentrated at its Cowley and Longbridge plants. Triumph production was transferred to Cowley, and Rover models were to be produced at the new Solihull plant. The Coventry engine plant at Courthouse Green was closed and Alvis, Climax and Jaguar were sold off to private buyers. In the first three years of the Thatcher government, the number of Leyland employees in the city fell from twenty-seven thousand to eight thousand. One writer summarised the effects of Conservative policy on Coventry in these years as turning a process of gentle decline into a quickening collapse. Overall the city’s top manufacturing firms shed thirty-one thousand workers between 1979 and 1982. Well-known pillars of Coventry’s economic base such as Herbert’s, Triumph Motors and Renold’s all disappeared. Unemployment had stood at just five per cent in 1979, the same level as in 1971. By 1982 it had risen to sixteen per cent.


None of this had been expected locally when the Thatcher government came to power. After all, Coventry had prospered reasonably well during the previous Tory administrations. The last real boom in the local economy had been stimulated by the policies of Ted Heath’s Chancellor, Anthony Barber. However, the brakes were applied rather than released by the new government. Monetarist policies were quick to bite into the local industry. Redundancy lists and closure notices in the local press became as depressingly regular as the obituary column. The biggest surprise was the lack of resistance from the local Labour movement, given Coventry’s still formidable trade union movement. There was an atmosphere of bewilderment and an element of resignation characterised in the responses of many trades-union officials. It was as if the decades of anti-union editorials in the Coventry Evening Telegraph were finally being realised.

There were signs of resistance at Longbridge, but the British Leyland boss, Michael Edwardes, had introduced a tough new industrial relations programme which had seen the removal from the plant of the union convenor ‘Red Robbo’ (Derek Robinson), Britain’s strongest motor-factory trade union leader. Edwardes had also closed the Speke factory on Merseyside, demonstrating that he could and would close plants in the face of trade union opposition. Coventry’s car workers and their union leaders had plenty of experience in local wage bargaining in boom times but lacked strategies to resist factory closures during the recession. Factory occupation, imitating its successful use on the continent, had been tried at the Meriden Triumph motorcycle factory, but with disastrous results. The opposition from workers was undoubtedly diminished by redundancy payments, which in many cases promised to cushion families for a year or two from the still unrealised effects of the recession.

Above: Employment levels in Coventry to 1981, showing a sharp decline after 1979.

As experienced in Wales, young people were the real victims of these redundancies, as there were now no places or apprenticeships for them to fill. The most depressing feature of Coventry’s unemployment was that the most severely affected were the teenagers leaving the city’s newly-completed network of Community Comprehensives. As the recession hit the city, many of them joined the job market only to find that expected opportunities in the numerous factories had evaporated. By June 1980, forty-six per cent of the city’s sixteen to eighteen-year-olds were seeking employment and over half of the fourteen thousand who had left school the previous year were still unemployed. Much prized craft apprenticeships all but vanished and only ninety-five apprentices commenced training in 1981. In 1981-2, the Local Authority found posts for some 5,270 youths on training courses, work experience and community projects, but with limited long-term effects.

The early 1980s were barren years for Coventry youngsters, despite the emergence of their own ska group, The Specials, and their own theme song, Ghost Town, which also gave vent to what was becoming a national phenomenon. The lyric’s sombre comparison of boom time and bust was felt much more sharply in Coventry than elsewhere. Coventry paid a very heavy price in the 1980s for its over-commitment to the car industry, suffering more than nearby Midland towns such as Leicester and Nottingham, both of which had broader-based economies. Its peculiar dependence on manufacturing and its historically weak tertiary sector meant that it was a poor location for the so-called sunrise industries. These were high-tech enterprises based largely along the axial belt running from London to Slough, Reading and Swindon, so they had an insignificant initial impact on unemployment in Coventry and other Midland and Northern industrial centres. The growth in service industries was also, initially at least, mainly to the benefit of the traditional administrative centres, such as Birmingham, rather than of its West Midland neighbours.

While little development work took place in local factories, Nissan recruited hundreds of foremen from Coventry for its new plant in Sunderland, announced before the Thatcher government, and Talbot removed its Whitley research and development facility to Paris in 1983, along with its French-speaking Coventrians. Only at Leyland’s Canley site did research provide a service for plants outside the city. For the first time in a hundred years, Coventry had become a net exporter of labour. By the time of the 1981 Census, the city had already lost 7.5 per cent of its 1971 population. The main losses were among the young skilled and technical management sectors, people whom any town or city can ill afford to lose. Summing up the city’s position at this time, Lancaster and Mason (see source list) emphasised the dramatic transition in its fortunes from boomtown, a magnet for labour from the depressed areas, to a depressed district itself:

Coventry in the mid-1980s displays none of the confidence in the future that was so apparent in the immediate post-war years. The city, which for four decades was the natural habitat of the affluent industrial worker, is finding it difficult to adjust to a situation where the local authority and university rank amongst the largest employers. Coventry’s self-image of progressiveness and modernity has all but vanished. The citizens now largely identify themselves and their environment as part of depressed Britain.

The Falklands Factor – War in the South Atlantic:

One of the many ironies of the Thatcher story is that she was rescued from the political consequences of her monetarism by the blunders of her hated Foreign Office. In the great economic storms of 1979-81, and in the European budget battle, she had simply charged ahead, ignoring all the flapping around her in pursuit of a single goal. In the South Atlantic, she would do exactly the same and, with her good luck, she was vindicated. Militarily, it could so easily have gone wrong: the Falklands War could have been a terrible disaster, confirming the Argentinian dictatorship in power in the South Atlantic and ending Margaret Thatcher’s career after just one term as Prime Minister. Of all the gambles in modern British politics, the sending of a task force of ships from the shrunken and underfunded Royal Navy eight thousand miles away to take a group of islands by force was one of the most extreme.

On both sides, the conflict derived from colonial quarrels dating back to 1833, before the reign of Queen Victoria began, when the scattering of islands had been declared a British colony. In Buenos Aires, a newly installed ‘junta’ under General Leopoldo Galtieri was heavily dependent on the Argentine navy, itself passionately keen on taking over the islands, known in Argentina as the Malvinas. The following year would see the 150th anniversary of ‘British ownership’, which the Argentines feared would be used to reassert the Falklands’ British future. The junta misread Whitehall’s lack of policy as lack of interest and concluded that an invasion would be easy, popular and impossible to reverse. In March an Argentine ship tested the waters by landing on South Georgia, a small dependency south of the Falklands, disembarking scrap-metal dealers. Then on 2nd April, the main invasion began, a landing by Argentine troops which had been carefully prepared for by local representatives of the national airline. In three hours it was all over, and the eighty British marines surrendered, having killed five Argentine troops and injured seventeen with no losses of their own.

In London, there was mayhem. Thatcher had had a few hours’ warning of what was happening from the Defence Secretary, John Nott. At a hurried meeting in her Commons office, the First Sea Lord, Admiral Sir Henry Leach, gave her clarity and hope when her ministers were as confused as she was. He told her he could assemble a task force of destroyers, frigates and landing craft, led by Britain’s two remaining aircraft carriers. It could be ready to sail within forty-eight hours, and the islands could be retaken by force. She told him to go ahead. Soon after, the Foreign Secretary, Peter Carrington, tendered his resignation, accepting responsibility for the Foreign Office’s failings. But Margaret Thatcher was confronted by a moral question which she could not duck: many healthy young men were likely to die or be horribly injured in order to defend the ‘sovereignty’ of the Falkland Islanders. In the end, almost a thousand died, one for every two islanders, and many others were maimed and psychologically wrecked.

In the cabinet and the Commons, Thatcher argued that the whole structure of national identity and international law was at stake. Michael Foot, who had at first been bellicose in parliament, harking back to the appeasement of fascism in the thirties, urged her to find a diplomatic answer. Later she insisted that she had been vividly aware of the blood price that was waiting and not at all consumed by lust for conflict. Thatcher had believed from the start that caving in would finish her. The press, like the Conservative Party itself, was seething about the original diplomatic blunders. As it happened, the Argentine junta, even more belligerent, ensured that a serious deal was never properly put: it simply insisted that the British Task Force be withdrawn from the entire area, that Argentine representatives should take part in any interim administration, and that if talks failed Britain would simply lose sovereignty. The reality, though, was that the junta’s political position was even weaker than hers. She established a small war cabinet, and the Task Force, now some twenty vessels strong, was steadily reinforced. Eventually, it comprised more than a hundred ships and twenty-five thousand men. The world was both transfixed and bemused.

The Empire struck back, and by the end of the month South Georgia was recaptured and a large number of Argentine prisoners taken: Thatcher urged questioning journalists outside Number Ten simply to ‘rejoice, rejoice!’ Then came one of the most controversial episodes in the short war. A British submarine, HMS Conqueror, was following the ageing but heavily armed cruiser General Belgrano. The British task force was exposed and feared a pincer movement, although the Belgrano was later found to have been outside the exclusion zone announced in London, and steaming away from the fleet. With her military commanders at Chequers, Thatcher authorised the submarine attack. The Belgrano was sunk, with the loss of 321 sailors. The Sun newspaper carried the headline ‘Gotcha!’ Soon afterwards, a British destroyer, HMS Sheffield, was hit by an Argentine Exocet missile and later sank. Twenty died.

On 18 May 1982, the war cabinet agreed that landings on the Falklands should go ahead, despite the lack of full air cover and worsening weather. By landing in the unexpected bay of San Carlos under low cloud, British troops got ashore in large numbers. Heavy Argentine air attacks, however, took a serious toll. Two frigates were badly damaged, another was sunk, then another, then a destroyer, then a container ship with vital supplies. Nevertheless, three thousand British troops secured a beachhead and began to fight their way inland. Over the next few weeks, they captured the settlements of Goose Green and Darwin, killing 250 Argentine soldiers and capturing 1,400 for the loss of twenty British lives. Lieutenant-Colonel ‘H’ Jones became the first celebrated hero of the conflict when he died leading ‘2 Para’ against heavy Argentine fire.

The Royal Marines ‘yomp’ towards Port Stanley during the Falklands War, June 1982

The battle then moved to the tiny capital, Port Stanley, or rather to the circle of hills around it where the Argentine army was dug in. Before the final assault, on 8 June, two British landing ships, Sir Tristram and Sir Galahad, were hit by missiles, and the Welsh Guards suffered dreadful losses, many of the survivors being badly burned. Simon Weston was one of them: of his platoon of thirty men, twenty-two were killed. The Welsh Guards lost a total of forty-eight men killed and ninety-seven wounded aboard the Sir Galahad. Weston survived with forty-six per cent burns, which left his face barely recognisable. He recalled:

Simon Weston in 2008

“My first encounter with a really low point was when they wheeled me into the transit hospital at RAF Lyneham and I passed my mother in the corridor and she said to my gran, “Oh mam, look at that poor boy” and I cried out “Mam, it’s me!” As she recognised my voice her face turned to stone.”

Simon Weston later became a well-known spokesman and charity worker for his fellow injured and disabled veterans. The Queen’s second son, Prince Andrew, Duke of York, also saw active service in the Falklands War, as a helicopter pilot. For millions around the world, however, the War seemed a complete anachronism, a Victorian gunboat war in a nuclear age. But for millions in Britain, it served as a wholly unexpected and almost mythic symbol of rebirth. Margaret Thatcher herself lost no time in telling the whole country what she thought the war meant. It was more than simply a triumph of freedom and democracy over the Argentinian dictatorship. Speaking at Cheltenham racecourse in early July, she said:

We have ceased to be a nation in retreat. We have instead a newfound confidence, born in the economic battles at home and found true eight thousand miles away … Printing money is no more. Rightly this government has abjured it. Increasingly the nation won’t have it … That too is part of the ‘Falklands factor.’ … Britain found herself again in the South Atlantic and will not look back from the victory she has won. 

A 1982 cartoon: Britain was at war with Argentina over the Falkland Islands. The inhabitants of the islands, a dependent territory of the United Kingdom, wanted to remain under British rule, but Argentina invaded.

Of course, the Falklands War fitted into Margaret Thatcher’s personal narrative and merged into a wider sense that confrontation was required in public life and the country’s politics. Thatcher was victorious, but it was a costly war for the British. Across Wales, for example, where the atmosphere was already becoming unpleasant in many respects, the impact was direct, especially in the disaster which befell the Welsh Guards at Bluff Cove, but also in anxieties over the Welsh communities in Patagonia in Argentina, who, like the Falklanders, had founded their colonies there in the nineteenth century. Despite opposition to the War from Plaid Cymru, a traditionally pacifist party, there is little doubt that the War gave the same impetus to British patriotism and chauvinism, and to the Conservatives, in Wales as it did in other parts of Britain. The Tories had looked destined for defeat in the 1983 General Election, but following the Falklands War, the Iron Lady, also variously characterised as Boadicea (Boudicca) and Britannia, swept back to power on a tidal wave of revived jingoistic imperialism. Even in Labour heartlands, such as south Wales, the Tories made major gains.

The Demise of the Heartlands & Death of ‘Old King Coal’, 1983-87:

As the general election loomed, with Labour in visible disarray and the calamitous effects of Tory policies on the Welsh economy all too apparent, it was on the left wing of the national movement that awareness of the bankruptcy of traditional political attitudes seems to have registered. However, in the Wales of 1983, these could only be marginal movements. The great majority of the Welsh electorate remained locked within what was now essentially an unholy trinity of parties. The General Election of 1983 exposed the myth that the South Welsh valleys were still some kind of ‘heartland’ of Labour; it registered, even more visibly than 1979 had done, Wales’s presence within the South of Britain in terms of political geography. In Wales as a whole, the Labour vote fell by nearly ten per cent, a fall exceeded only in East Anglia and the South East; it ran level with London again. The Conservative vote fell by only twenty-four thousand (1.7 per cent), whereas the Labour vote fell by over 178,000. The great beneficiaries were the Liberal-SDP Alliance, whose vote rocketed by over two hundred thousand.

The Conservatives took the Cardiff West seat of George Thomas, the former Speaker, and the marginal seat of Bridgend, swept most of Cardiff and again pressed very hard throughout the rural west, ending up with thirteen seats out of thirty-eight. Plaid Cymru held its two seats in the northwest and moved into second place on Anglesey. It also registered significant votes in Carmarthen, Caerphilly, Ceredigion, Llanelli and the Rhondda. The success of the Liberal-SDP Alliance was spectacular. It more than doubled the former Liberal poll, reaching twenty-three per cent of the Welsh electorate, won two seats and came second in nineteen out of thirty-eight. Labour’s defeat came close to becoming a rout, but the party managed to retain a score of seats despite dropping nearly ten per cent in the poll. It was now a minority party again, at its lowest level since 1918. It held on by the skin of its teeth to Carmarthen and to Wrexham, a former stronghold. Even in the coalfield valleys, where it held all but one of its seats, six became three-way marginals without an overall majority. The Alliance came second in ten, and in the Rhondda won eight thousand votes without even campaigning. Only seven seats remained with overall Labour majorities, and only three of the old twenty-thousand majorities were left: Rhondda, Merthyr Tydfil and Ebbw Vale (Blaenau Gwent). Looking ahead (from c. 1984), Gwyn A. Williams wrote that Wales was becoming…

… a country which largely lives by the British State, whose input into it is ten per cent of its gross product, faces a major reconstruction of the public sector; … faces the prospect of a new depression or a recovery, either of which will intensify the process… faces the prospect of a large and growing population which will be considered redundant in a state which is already considering a major reduction in the financial burden of welfare.

Small wonder that some, looking ahead, see nothing but a nightmare vision of a depersonalised Wales which has shrivelled up into a Costa Bureaucratic in the south and a Costa Geriatrica in the north; in between, sheep, holiday homes burning merrily away and fifty folk museums where there used to be communities … the majority of the inhabitants are choosing a British identity which seems to require the elimination of a Welsh one.


Striking Yorkshire miners barrack moderate union leaders in Sheffield.

The government then took a more confrontational approach at home. As in the 1920s, brutal rationalisation, through the closure or selling off of uneconomic enterprises, or through wage and job cuts, was met by determined opposition, never tougher than in the confrontation of 1984-85 with the National Union of Mineworkers, led by Arthur Scargill. The National Coal Board, supported by the government, put forward a massive programme of pit closures. The bitter, year-long miners’ strike which followed, a fight for the survival of coalfield communities, was roundly defeated, amid scenes of mass picketing and some violence from both miners and the police. Armthorpe in South Yorkshire, for example, had become known for its coal mining during the late nineteenth and for most of the twentieth century, after a deep-seam colliery, Markham Main, was sunk there. In 1984, an Armthorpe dress-shop sales assistant was quoted as saying:

“If this pit closed down, the whole village would close down because of the number of men working there. This shop and a lot of other shops would have to close down.”

Markham Main Colliery: After the closure of the mine in 1996 the area went through a deep depression. The old colliery site is now a large housing estate, with a thriving community and parks and tracks for walking and cycling to the local wood. Today, Armthorpe remains one of the more affluent areas of Doncaster, with an IKEA warehouse providing local employment. Photo (circa 1980?) by Chris Allen, CC BY-SA 2.0, https://commons.wikimedia.org.

But ultimately the government proved equally determined and had, crucially, built up the resources needed to outlast the strike. In 1983, a team of researchers in south Wales published a pamphlet, Who Profits from Coal?, revealing that not only had the NCB been importing stocks of South African coal for some years, but that it was also investing in its production, even raiding the miners’ pension fund to do so. There were also major divisions within the NUM itself, with the Nottinghamshire area breaking away to form the UDM (Union of Democratic Mineworkers) and the South Wales area calling for an orderly return to work when it became apparent that the NUM could not win. However, the national executive, led by its president, Arthur Scargill, refused to call a country-wide ballot on this proposal, prolonging the struggle and the suffering unnecessarily.


Miners’ leader, Arthur Scargill.

The strike and the colliery closures left a legacy of bitterness and division in Britain which was only too apparent at the time of Margaret Thatcher’s funeral in 2013, and which is the subject or background of many recent films, some of which have distorted or trivialised our recollection of the reality. Among the better representations is Billy Elliot. Under the thirty-year rule, the government documents from 1984 have only just become available, so we can now look forward to more rounded perspectives from historians on these events. Already, politicians have called for government apologies to be given to the miners and their families.

Picketing miners were caught and hand-cuffed to lamp posts by police in 1986.

In the Durham Coalfield, pits were often the only real source of employment in local communities, so the economic and social impact of closures could be devastating. The 1984-5 Strike was an attempt to force a reversal of the decline. The pit closures went ahead and the severe contraction of the mining industry continued: it vanished altogether in Kent, while in Durham two-thirds of the pits were closed. The government had little interest in ensuring the survival of the industry, determined to break its militant and well-organised union. There was further resistance, as pictured above, but by 1987, the initial programme of closures was complete. The social cost of the closures, especially in places in which mining was the single major employer, as in many of the pit villages of South Yorkshire, Durham and the valleys of south Wales, was devastating. The entire local economy was crippled. On Tyneside and Merseyside, more general deindustrialisation occurred. Whole sections of industry, including coal, steel and shipbuilding, simply vanished from their traditional areas.


(to be continued)


Majesty & Grace IX: The Reign of Elizabeth Windsor, 1963-78: Part 2 – Multicultural Britons.

Present into Past – The Problem of Retrospection:

The closer social historians get to their own times, the harder it is for them to be sure they have hold of what is essential about the period in question: the more difficult it is to separate the rich tapestry of social life which appears on the surface of the woven fabric from its underlying patterns. This is the problem of perspective that historians have to try to overcome in their craft. The period from 1963 to 1978 was one of rapid social change, and one in which the pace and direction of change itself became a matter of concern in social discourse. The discussion in the sixties was about whether the surface evidence of change really added up to a social revolution for ordinary people; what happened to the standards of living of ordinary Britons in the seventies threw the depth of those changes into question. That argument is still unresolved: more than sixty years later we are still living out its contradictory legacy. Many witnesses to the period are still alive, each with their own memories, impressions and interpretations.

The Caernarfon Investiture of 1969 & the botched bombings:

The ‘Investiture Crisis’, as it was known to contemporaries in Wales, referred to an undercurrent of violence in Welsh nationalism and republicanism that had been growing stronger since the decision, taken by Act of Parliament in 1957 despite almost all Welsh MPs voting against it, to drown the Tryweryn valley to supply Liverpool with water. As the Welsh historian John Davies wrote in his 1990 History of Wales:

Liverpool’s ability to ignore the virtually unanimous opinion of the representatives of the Welsh people, confirmed one of the central tenets of Plaid Cymru – that the national Welsh community, under the existing order, was wholly powerless.

J. Davies (1990): Allen Lane.

Attacks on the Tryweryn reservoir followed, and the Free Wales Army was founded in 1963. Violent Welsh nationalism, directed against property, was, thankfully, almost as unpopular and badly organised as its Scottish counterpart. But the sabotage of reservoirs gave way to a bombing campaign in the run-up to the Investiture, during which two bombers blew themselves up while trying to blow up the Royal train, and a small child was maimed in an accident with explosives near the walls of Caernarfon Castle, where the ceremony was due to take place. Mudiad Amddiffyn Cymru (the Welsh Defence Movement) was behind the bombings, in league with the Free Wales Army.

In preparation for his Investiture in 1969, Charles spent some time at the University College of Wales, Aberystwyth, learning Welsh and studying the history and cultures of Wales under the tutelage of Edward ‘Tedi’ Millward. The initial purpose was that he should know enough of the ancient language to make the oath at the Investiture ceremony and, subsequently, to read speeches aloud with intelligible pronunciation. When a delegation of student leaders met him at Lampeter in 1980, he admitted that his conversational Welsh was limited and that, although he had retained much of the vocabulary Millward had taught him, he had had few opportunities to use it. Part of his problem was that the formal, classical register of the written language was very different from the Cymraeg Byw (Living Welsh) needed for everyday communication and simple conversation. Nevertheless, on his first visit to Cardiff as Charles III, following his mother’s death, he delivered a speech in Welsh with considerable accuracy and fluency. As Nelson Mandela once said, if you talk to people in a language they know, you speak to their minds; if you talk to them in their native language, you speak to their hearts. Charles seems to have understood, and sought to apply, the ‘Mandela principle.’

Tynged yr Iaith – the Fate of the Language:

By 1961, Welsh speakers made up only twenty-five per cent of the population, and by 1971 they were barely twenty per cent. The crisis appeared terminal, so in 1962 the writer and founder member of the Welsh Nationalist Party, Saunders Lewis (pictured below), returned to public affairs determined to prevent the death of the ancient language. In that year’s BBC Wales Annual Radio Lecture, Tynged yr Iaith (The Fate of the Language), he called upon the Welsh people to make the salvation of the language their central priority and to be prepared to use revolutionary means to achieve it. The response was remarkable. After the formation of Cymdeithas yr Iaith Gymraeg, the Welsh Language Society, its young activists stormed all over Wales, opening twenty years of direct-action campaigning against offices and road signs, staging sit-ins, wrecking TV masts, and generally making life hell for officials and the police. While Plaid Cymru remained somewhat aloof, there was much human overlap, and the heavy colonisation of Welsh institutions, particularly the media, by Welsh-speaking professional people (‘sons of ministers‘) proved to be of powerful assistance.

Around this campaign, which assumed the character of a crusade, all sorts of movements developed: a major drive to establish Welsh-medium nurseries, primary and secondary schools; Welsh-speaking University College hostels; special posts for the teaching of Welsh; demands for an all-Welsh-medium college, or Coleg Cymraeg; and calls for positive discrimination in favour of the use of Welsh. There were crash immersion programmes, or wlpan, for learners (dysgwyr), based on a modern form of the language, Cymraeg Byw (Living Welsh), and the grouping of learners into an organisation at local and national levels. The new Welsh Arts Council directed subsidies towards an ailing Welsh-language publishing world. Step by step a Labour government, followed by a Conservative one, was forced to yield: a Secretary of State for Wales in 1964, a major Language Act in 1967, and a whole series of autonomy measures. Within, around and distinct from this drive, a whole world of Welsh-language publishing, film production, radio and television broadcasting, and infinite varieties of pop, rock, folk and urban music mushroomed.

Many of these initiatives, like the Welsh-language fourth channel, S4C, did not come to full fruition until the early 1980s, but most originated in the mid-to-late sixties. These were the most visibly Welsh signals of the onset of outright revolt in the late sixties. The major factor in this uprising was disillusionment with the Welsh Labour establishment. The abrupt reversal of Labour government policy in 1966, after the high hopes of 1964, seemed a culminating disappointment after long years of diminishing relevance in the Party’s policies, internal debate and inner life. The Labour hegemony in Wales had hardened into an oligarchy. On the other hand, a long and dismal history of elitist and contemptuous attitudes towards the Anglo-Welsh had been a disturbing feature of Welsh nationalism, and of much language-focused Welsh national feeling, since the days of its founders.

Young people in Wales and Scotland generated a tide of nationalist protest on a scale which had until then only been experienced in the Basque regions of Spain or in French-speaking Quebec. Neither Wales nor Scotland had enjoyed the economic growth of the south and midlands of England in the fifties, despite the ‘Development Areas’ legislation and programmes introduced by Labour governments before 1951. Their national aspirations were hardly fulfilled by such formal institutions as the offices of the Westminster-appointed Secretaries of State; in the case of Wales, the post was created only in 1964, by the Wilson government. Scottish nationalists complained, with some justification, that the very title Elizabeth II was a misnomer in their country, since Elizabeth I had not been Queen of Scots.

Caernarfon Castle was set up for the Investiture of Prince Charles, 1969

As far as many of the Welsh were concerned, the title of ‘Prince of Wales’ (Tywysog Cymru), bestowed by the monarch by ‘right of conquest’ on the eldest son and heir to the English throne since the reign of Edward I, was also a misnomer. In Wales, there was the added theme of an ancient language and culture threatened with extinction in an unequal battle against an anglicising mass culture and media. A narrow victory for a Welsh Nationalist, Gwynfor Evans, in a parliamentary by-election in Carmarthen in 1966 was followed by a renewed civil disobedience campaign in support of the defence and propagation of the Welsh language. Under Evans’ leadership, a new Plaid Cymru (Party of Wales) began to emerge. In 1969, elaborate ceremonies greeted the formal Investiture of Prince Charles as Prince of Wales, to paraphrase Gwyn Williams, in a tumult of public acclaim, largely in English, and a tumult of public mockery, largely in Welsh.

The Anglo-Welsh historian Dai Smith, more than a decade later, drew a comparison between the Investiture of Edward, Prince of Wales (later Edward VIII) in 1911 and the more recent ceremony at Caernarfon:

Here was stage-managed the investiture of Edward, Prince of Wales, in a rich ceremony of dedication and loyalty. This patriotic event could be seen, like that of Charles in 1969, as a plot to dupe the ‘masses’. In this scenario, the chief Welsh politicians ‘deliver up’ the nation to English/British domination and, in due course, are suitably rewarded with titular baubles – Lloyd George becomes the Earl of Dwyfor and George Thomas becomes Viscount Tonypandy. … Neither event defused radical politics in Wales – after 1969 Plaid Cymru grew in influence and the miners’ unions began their five-year campaign…

Dai Smith (1984), Wales! Wales? Hemel Hempstead: George Allen & Unwin.

The Investiture Ceremony
Celtic Languages & Dialects in the British Isles:

After many decades of decline, most notably (and dangerously) in the number of monoglot native speakers, by 1981 a stable proportion of between one-fifth and one-quarter of Welsh residents, over half a million people, described themselves as Welsh speakers. The Celtic nations of the British Isles also preserved their own Celtic-influenced or Scots-influenced dialects or varieties of English, each of which can be subdivided further into localised varieties. For example, Welsh English, or Anglo-Welsh, has distinct northern and southern varieties, with parts of the south, specifically south Pembrokeshire and the Gower Peninsula, having had English and Flemish colonies since the twelfth century. A uniform spoken English across the British Isles is an unlikely prospect. In Wales, the same is true of spoken Welsh, with varieties in accent and vocabulary, if not in grammar.

As far as constitutional Welsh nationalism was concerned, Gwynfor Evans lost his Carmarthen seat in the 1970 general election, but two more striking Plaid Cymru by-election performances, in Rhondda West and Caerffili (Caerphilly) in 1967 and 1968, suggested that the Carmarthen victory had been no flash in the pan. At last, the complacent Welsh Labour Party was being challenged, and in the February 1974 election Plaid Cymru won two northern Welsh seats, Caernarfon and Meirionnydd, taking a third in the second general election that year. Welsh Labour was divided in its response. Michael Foot, MP for Aneurin Bevan’s old seat of Ebbw Vale, thought the nationalists should be ‘bought off’ through reform measures, including devolution, but Neil Kinnock, who declared that road-sign bilingualism (like that shown below) was a waste of funds, believed they should be fought. Either way, they held their seats, continuing to pose an electoral threat in Labour’s traditional heartlands in the southern valleys and refusing to ‘go away.’ Neither did the bilingual road signs.

A road sign in Powys.

The equality accorded to English and Welsh on road signs symbolised a victory for Welsh language activists, who had campaigned long and hard to gain official recognition of their native language and to ensure its survival. By 1974, both Cornish and Manx no longer had any native speakers, and attempts to revive them had produced only a very small number of learners of either Celtic language. The official figure for users of Irish Gaelic in the Republic was subsequently recorded as 1.4 million, equivalent to forty per cent of the population, but this reflected the status of the language in Irish schools; the habitual use of Irish in daily life in the Gaeltacht, the Irish-speaking communities, was estimated at 0.25 per cent of the population. In Scotland, the Scottish Gaelic community accounted for about one per cent of the population; according to the 1981 Census, only 473 children spoke Gaelic.

Broadford Primary School, Isle of Skye, where children are taught in Gaelic.

Skye was once thought of as the centre of the Gaeltachd, the land of the Gaels. The only large concentration of Gaelic speakers still left on the island is on its north-east tip, the Staffin peninsula, one of the last redoubts of the northern Gaels. All the children in Staffin in the 1980s knew Gaelic but seldom used it after transferring to secondary school in Portree, where the other children teased them, calling them teuchters, or bumpkins (uneducated people of the land). They also thought that the language would not get them anywhere: as Gaelic speakers, they would just stay on their crofts and stay poor. It would be better for them to develop their English and head south.

A Gaelic-speaking child at Broadford Primary School.
Wales, the Cymry & the Anglo-Welsh in the late seventies:

Meanwhile, around the Welsh language, the clamour and turmoil continued into the eighties, as did the many initiatives to arrest its decline in the previous decades. According to the 1981 Census returns, these seemed to have worked. Although the overall proportion of Welsh speakers had slipped further since 1971, to 18.9 per cent, there had been a dramatic slowing in the rate of decline over the decade, and it seemed almost to be coming to a halt. However, while there were marginal increases in Anglo-Welsh areas such as Gwent and Glamorgan, there was still a serious decline in the heartlands of the language, most notably in the southwest, where the fall was six per cent. There had been some ‘retrenchment’ in Ceredigion and parts of Gwynedd, and there were unmistakable signs that all the campaigning had begun to take effect among the younger age groups. Overall, out of a population of almost 2.8 million, over 550,000 identified as Welsh speakers. Nonetheless, the continuing threat to the bro or Welsh-speaking heartland had led to the growth of a new movement in the seventies, Adfer (‘restore’), with a swathe of intellectual zealots, dedicated to building a monoglot Welsh gaeltachd in the west and to the construction of an ethnically pure and self-sufficient economy and society there. This organisation viewed the true Cymry as the Welsh speakers and adopted a chauvinistic attitude towards the remainder of the people of Wales.

Yet Welsh speakers from the industrial south, like Dai Francis, the leader of the South Wales NUM (see part one), who became a Bard in the Order of the Gorsedd of the National Eisteddfod, remained leading figures in Welsh-speaking life. Francis was nominated by Undeb Cenedlaethol Myfyrwyr Cymru (NUS Wales) as an alternative candidate to Prince Charles for the role of Chancellor of the University of Wales in 1977/78. He lost the contest (the exact result of the vote was never declared), but only after the campaign had caused considerable embarrassment to the Welsh establishment in the University Court, one of the few all-Wales bodies at this time. Most importantly, it enabled UCMC to raise support among Welsh academics for the official recognition of the language in education on a nationwide basis in subsequent years.

A cogent and effective proposal for an elected Welsh assembly had been formulated and presented by the MP and cabinet minister Cledwyn Hughes in the 1960s, but it got nowhere. The proposal which emerged in the mid-seventies was partly an afterthought, a response to the growing strength of the nationalist challenge in Scotland. It was, though, an ineffective compromise for Wales, which won no real enthusiasm even among its supporters among the constitutionalist nationalists. After endless parliamentary agonies, it was to be submitted to a referendum on St David’s Day, 1st March 1979. However, it was already dead in the water by the end of 1978, thanks to an amendment at Westminster requiring forty per cent of both the Scottish and Welsh electorates to vote ‘yes.’

It was resisted by a huge bloc of opinion ranging from the representatives of multinational corporations and their British subsidiaries to the populist press and on to the Adferites in the north. It was also fiercely opposed by a bloc of South Wales Labour MPs led by Neil Kinnock and Leo Abse, who played on their Anglo-Welsh constituents’ fears of being taken over and dominated by a Welsh-speaking, mainly northern, ‘crach’ (élite). There were also those on the left who were suspicious of what they had increasingly come to view as a corrupt Welsh-British establishment. Nationalist supporters also found themselves in a difficult situation, especially in Caernarfon and Meirionydd, where their voters, not just the supporters of Adfer, were not keen on being ruled from Cardiff. Not surprisingly, faced with all these obstacles, the campaign lacked conviction. As Gwyn Williams commented:

… it was an unreal war over an unreal proposal. One major reality, however, was a wholesale political revolt against the kind of Wales being presented and created through the medium of the Welsh language campaigns. It was a revulsion wholly negative in content and style.

Gwyn A. Williams (1985), When Was Wales? Pelican (Penguin) Books.

Professor Gwyn Williams, Photograph by HTV Wales.

However, Williams also pointed out that far more than a Welsh-language Wales was being rejected. Opinion polls showed the Conservatives running at a level of support they hadn’t enjoyed for a century. After the economic crisis of 1976 and the government’s unnecessary surrender to the IMF (see below), Labour adopted the policies which the radical new leadership of the Conservatives had been developing. In the seventies, Labour ceased to be either socialist or social-democratic. Wales was plagued by economic difficulties, and its working population was being transformed at a pace too swift for its traditional institutions to handle. In retrospect, it is hardly surprising that this population took little interest in a public discourse, conducted mainly among its intellectuals, which focused so largely on language and culture.

The reality was that, by 1978, the client economy of Wales was in very deep trouble. Yet despite the success of Plaid Cymru in local elections during the final years of ‘old Labour’ rule, they did not seem to pose quite the same threat as the SNP. Of course, Welsh water was far less valuable and cheaper to extract than Scottish oil, which was fundamentally why the proposed Welsh assembly was to have fewer powers than the Scottish one. It was to oversee a large chunk of public expenditure, but it would not have law-making powers. The proposal was therefore unlikely to make anyone’s blood pound.

‘Black Gold’ – North Sea Gas & Oil Fields:

Certainly, living standards continued to rise, aided by the discovery in the North Sea of natural gas in 1965 and oil in 1969. The complex mosaic of yellow blocks in the 2016 map of the North Sea (below) illustrates the development of the UK oil industry which provided an economic boon to seventies Britain. The Forties field, shown on the map, was the biggest of the North Sea oil fields whose royalties made the country an unexpected hydrocarbon superpower for a limited period. In 1959, following a single find of natural gas at Groningen in the Netherlands, it became apparent that the geological structure of the North Sea meant that further discoveries were likely, although the prohibitive cost of drilling in offshore waters impeded initial exploration.

Section of the map below which gives a close-up of the Forties field & pipelines in Sea 2.

The first large find was also of gas, struck in the West Sole field, off the Yorkshire coast, by British Petroleum’s Sea Gem platform in September 1965, though celebrations were soon dampened when the rig sank three months later, with the loss of thirteen lives, in the North Sea’s first major disaster. By 1969 oil had been struck in the Montrose field east of Aberdeen, followed soon after by the giant Forties field in 1970 and the Brent field a year later. The system of licensing fees for exploration and royalties payable on gas and oil production provided an economic shot in the arm for a country which was struggling to restructure its traditional industries and facing increased competition from emerging industrial giants such as Japan. Production became even more profitable after the 1973 oil shock when the principal Middle Eastern oil-producing countries imposed oil embargoes in response to Western countries’ policies which they said favoured Israel in its struggle with the Palestinian Arabs. As oil importers sought alternative sources of supply, global prices rose and North Sea oil found new customers.

Large-scale map, showing the full extent of the North Sea oil fields.

By the time of the Queen’s Silver Jubilee, in 1977, North Sea Oil was coming ashore to the tune of more than half a million barrels a day, meeting a third of the country’s needs. Britain would be self-sufficient in oil by 1980 and already was in gas. Oil and gas were not the only new sources of power to be exploited in the sixties. A lonely stretch of coast near Leiston in Suffolk became the site of Britain’s second nuclear power station, built in the early 1960s. In 1966 power began surging out from the grey, cuboid plant into the national grid. By the mid-seventies, Sizewell’s five hundred and eighty thousand kilowatts were going a long way towards meeting the electricity needs of eastern England.

Sizewell Nuclear Power Station (2014)

Devaluation & Deindustrialisation:

But there were also disquieting signs that Britain was approaching an as yet undefined cultural and financial crisis. British economic growth rates did not match those of competitor states, and it was partly for this reason that Britain applied to join the European Economic Community, in 1961 and 1967, entry both times being vetoed by France.

New kinds of industry, based largely on the revolution in electronics, came into being alongside the old, without displacing them. There was a shift in the patterns of skills, and of work, and the composition of the labour force, with more workers involved in clerical, highly skilled or service occupations. At the same time, more workers were pushed down into the unskilled ranks of mass production. They became more mobile again, pulled to where the jobs were. The pattern of regional decline in the older industrial areas and rapid, unorganised growth in the new areas began to re-emerge. In some areas and industries, the long-term pattern of continuity from one generation to the next persisted, while in other, newer areas, this continuity was broken.

Long-term changes in patterns of employment, 1921-76.

Hitherto the social fabric across Britain had been kept intact, at least in part, because of high and advancing living standards for the population as a whole. But clear evidence mounted up in the late 1960s that increasing economic pressures were adding to new social tensions. Britain lurched from one financial expedient to another, with frequent balance-of-payments crises and repeated runs on the pound sterling. Devaluation of the currency in November 1967 did not produce any lasting remedy, and the real economic problems of the decade arose from both the devaluation and the deterioration in industrial relations. Employment in manufacturing nationally declined, until it accounted for less than a third of the workforce by 1973. The incomes policy declared by the Wilson Government was hard to swallow for engineering workers who had long enjoyed the benefits of free collective bargaining and wage differentials. By way of contrast, employment in the service sector rose, so that by 1973, over half of all workers in the UK were employed in providing services. The map below shows both the long-term and regional character of the decline of the British manufacturing industries. The resulting mass unemployment hurt the old industries of the Northwest badly, but the job losses were proportionately as high in the Southeast and Midlands, where there were newer manufacturers.

The second Wilson Labour administration that followed faced a huge balance-of-payments crisis and the tumbling value of the pound, and soon found itself under the control of the IMF (International Monetary Fund), which insisted on severe spending cuts. The contraction of manufacturing began to accelerate and inflation was also increasing alarmingly, reaching twenty-four per cent by 1975. Inflation came to be seen as a more urgent problem than unemployment, and there was a national and international move to the right and against high-taxing and high-spending governments. Demands were made that they should stop propping up lame-duck industries with public money or taking them, however temporarily, into public ownership.

By the mid-seventies, the dock area at Felixstowe covered hundreds of acres, many reclaimed, made up of spacious wharves, warehouses and storage areas equipped with the latest cargo handling machinery. The transformation had begun in 1956 as the direct result of foresight and careful planning. The Company launched a three-million pound project to create a new deep-water berth geared to the latest bulk transportation technique – containerisation. It calculated that changing trading patterns and Felixstowe’s proximity to Rotterdam and Antwerp provided exciting prospects for an efficient, well-equipped port. Having accomplished that, it set aside another eight million for an oil jetty and bulk liquid storage facilities. In addition, a passenger terminal was opened in 1975. The dock soon acquired a reputation for fast, efficient handling of all types of cargo, and consignments could easily reach the major industrial centres through faster road and rail networks.

There were many reasons for this unprecedented growth, which brought Suffolk a prosperity unknown since the expansion of the cloth trade in the mid-fourteenth century. As back then, Suffolk’s depression gave a boost to new development. Most of the county was within eighty miles of London and served by improving road and rail connections. Ports like Felixstowe were no further from the capital than those of Kent and they were a great deal closer to the industrial Midlands and the North.

An old milestone in the centre of Woodbridge, Suffolk

Some of Suffolk’s most beautiful countryside was no further from the metropolis than the stockbroker belt of the Home Counties, and yet land and property prices in Suffolk were less than half of what they were there. People were becoming more mobile and light industries were less tied to traditional centres. Companies escaping from high overheads found that they could find both the facilities and labour they needed in Ipswich, Bury, Sudbury and Haverhill. Executives also discovered that they could live in areas of great natural beauty and yet be within commuting distance of their City desks. Moreover, the shift in international trade focused attention once more on the east coast ports. As the Empire was being disbanded and Britain was drawn increasingly towards trade with the European Common Market, producers were looking for the shortest routes to the continent. More and more lorries took to the roads through Suffolk.

Meanwhile, the rate of migration into Coventry had undoubtedly slowed down by the mid-sixties. Between 1961 and 1971 the population rose by nearly six per cent compared with a rise of nineteen per cent between 1951 and 1961. The failure of Coventry’s manufacturing industry to maintain immediate post-war growth rates was providing fewer opportunities for migrant manual workers, while the completion of the city centre redevelopment programme and the large housing schemes reduced the number of itinerant building workers. Between 1951 and 1966 the local population increased by approximately four thousand every year, but in the following five years the net annual increase fell to about a thousand per annum. Moreover, the proportion of this increase attributable to migration had dramatically declined. Between 1951 and 1961, a Department of the Environment survey estimated that whereas migration accounted for about forty-five per cent of population growth in the Coventry belt, in the following five years it made up only eighteen per cent. In the following three years to 1969, the survey noted that the same belt had begun, marginally, to lose population through out-migration.

Between the census of 1961 and the mini-census of 1966, some major shifts in the pattern of migration into Coventry took place. There was a substantial increase in immigration from Commonwealth countries, colonies and protectorates during these five years. The total number of those born in these territories stood at 11,340. The expansion needs to be kept in perspective, however. Nearly two-thirds of the local population was born in the West Midlands, and there were still nearly twice as many migrants from Ireland as from the Commonwealth and Colonies. Indeed, in 1966 only 3.5 per cent of Coventry’s population had been born outside the British Isles, compared with the national figure of five per cent. The Welsh stream had slowed down, increasing by only eight per cent in the previous fifteen years, and similar small increases were registered among migrants from Northern England. There were significant increases from Scotland, London and the South East, but only a very small increase from continental Europe.

By the mid-seventies, Coventry was faced with a new challenge posed by changes in the age structure of its population. The city was having to care for its increasing numbers of elderly citizens, a cost which soon became difficult to bear, given its declining economy. By 1974, it was estimated that the local population was rising by two thousand per year, twice the rate of the late 1960s. By 1976, however, the youthfulness of the city’s population was being lost as the proportion of over sixty-five-year-olds rose above the national average. With its large migrant element, it began to lose population rapidly during this decline, from 335,238 to 310,216 between 1971 and 1981, a fall of 7.5 per cent. Nearly sixty thousand jobs were lost during the recession, and given the shallowness of the family structure of many Coventrians, this resulted in a sizeable proportion of its citizens being all too willing to seek their fortunes elsewhere. For many others, given the widespread nature of the decline in manufacturing in the rest of the UK, there was simply nowhere to go.

Migration, Immigration and Racialism – Rivers of Blood:

Elsewhere, however, the number of ‘New Commonwealth’ immigrants was proving a cause for concern, mainly due, as it seems in retrospect, to the skin colour of these immigrants, and partly, in the case of south Asians, to religious and cultural differences with the host country. These ‘concerns’ were not new, and nor were the active forms of prejudice and discrimination which had accompanied them since the mid-fifties in the general population. In addition to dilapidated housing, and racial discrimination in employment and sometimes at the hands of the police, there was the added hazard of racial bigotry in older urban areas. What was new was the way in which this was articulated and amplified from the early sixties onwards by Conservative MPs and parliamentary candidates, leading to the emergence of the National Front as a political force in the early seventies.

Harold Wilson was always a sincere anti-racist, but he did not try to repeal the Conservatives’ 1962 Act with its controversial quota system. One of the new migrations that arrived to beat the 1963 quota system just before Wilson came to power came from a rural area of Pakistan threatened with flooding by a huge dam project. The poor farming villages from the Muslim north, particularly around Kashmir, were not an entrepreneurial environment. They began sending their men to earn money in the labour-starved textile mills of Bradford and the surrounding towns. Unlike the West Indians, the Pakistanis and Indians were more likely to send for their families soon after arrival in Britain. Soon there would be large, distinct Muslim communities clustered in areas of Bradford, Leicester and other older manufacturing towns.


The photo above shows a south Asian immigrant in a Bradford textile factory. The decline of the textile industries in the 1970s led to high long-term unemployment in south Asian communities.

Unlike the Caribbean immigrants, who were largely Christian by background, these new streams of migration were bringing people who were religiously separated from the white ‘Christians’ around them and cut off from the main forms of working-class entertainment, many of which involved the consumption of alcohol, from which they abstained. Muslim women were expected to remain in the domestic environment and ancient traditions of arranged marriages carried over from the subcontinent meant that there was almost no intermarriage with the native population. To many of the ‘natives’, the ‘Pakis’, as they were then casually called, even to their faces, were less threatening than young Caribbean men, but they were also more culturally alien.

Wilson had felt strongly enough about the ‘racialist’ behaviour in the Tory campaign at Smethwick, to the west of Birmingham, in 1964, to publicly denounce its victor Peter Griffiths as a ‘parliamentary leper’. Smethwick had attracted a significant number of immigrants from Commonwealth countries, the largest ethnic group being Sikhs from Punjab in India, and there were also many Windrush Caribbeans settled in the area. There was also a background of factory closures and a growing waiting list for local council housing. Griffiths ran a campaign critical of both the opposition’s and the government’s immigration policies. The Conservatives were widely reported as using the slogan “if you want a nigger for a neighbour, vote Labour” but the neo-Nazi British Movement later claimed that its members had produced the initial slogan as well as led the poster and sticker campaign. However, Griffiths did not condemn the phrase and was quoted as saying:

“I should think that is a manifestation of popular feeling. I would not condemn anyone who said that.”

The 1964 general election saw a nationwide swing from the Conservatives to the Labour Party, which gave Labour a narrow five-seat majority. In Smethwick, however, the Conservative candidate Peter Griffiths bucked the trend, unseating the sitting Labour MP, Patrick Gordon Walker, who had served as Shadow Foreign Secretary for the eighteen months prior to the election. In these circumstances, the Smethwick campaign, which had already attracted national media coverage, and the outcome itself stood out as clearly the product of racialism.

Griffiths, in his maiden speech to the Commons, pointed out what he believed were the real problems his constituency faced, including factory closures and over four thousand families awaiting council accommodation. But in 1965, Wilson’s new Home Secretary, Frank Soskice, tightened the quota system, cutting down on the number of dependents allowed in, and giving the Government the power to deport illegal immigrants. At the same time, it offered the first Race Relations Act as a ‘sweetener’. This outlawed the use of the ‘colour bar’ in public places and by potential landlords, and discrimination in public services, also banning incitement to racial hatred like that seen in the Smethwick campaign. At the time, it was largely seen as toothless, yet the combination of restrictions on immigration and the measures to better integrate the migrants already in Britain did form the basis for all subsequent policies.

Birmingham’s booming postwar economy had not only attracted its ‘West Indian’ settlers from 1948 onwards, but had also ‘welcomed’ south Asians from Gujarat and Punjab in India, and East Pakistan (Bangladesh) both after the war and the partition of India and in increasing numbers from the early 1960s. The South Asian and West Indian populations were equal in size and concentrated in the inner city wards of the city and in west Birmingham, particularly Sparkbrook and Handsworth, as well as in Sandwell (see map above; then known as Smethwick and Warley). Labour shortages had developed in Birmingham as a result of an overall movement towards skilled and white-collar employment among the native population, which created vacancies in less attractive, poorly paid, unskilled and semi-skilled jobs in manufacturing, particularly in metal foundries and factories, and in the transport and healthcare sectors of the public services. These jobs were filled by newcomers from the Commonwealth.

Whatever the eventual problems thrown up by the mutual sense of alienation between natives and immigrants, Britain’s fragile new consensus and ‘truce’ on race relations of 1964-65 were about to be broken by another form of racial discrimination, this time executed by Africans, mainly the Kikuyu people of Kenya. After the decisive terror and counter-terror of the Mau Mau campaign, Kenya won its independence under the leadership of Jomo Kenyatta in 1963 and initially thrived as a relatively tolerant market economy. Alongside the majority of Africans, however, and the forty thousand whites who stayed after independence, there were some 185,000 Asians in Kenya.

They had mostly arrived during British rule, were mostly better off than the local Kikuyu, and were well-established as doctors, civil servants, traders, business people and police. They also had full British passports and therefore an absolute right of entry to Britain, which had been confirmed by meetings of Tory ministers before independence. When Kenyatta gave them the choice of surrendering their British passports and gaining full Kenyan nationality or becoming foreigners, dependent on work permits, most of them chose to keep their British nationality. In the generally unfriendly and sometimes menacing atmosphere of Kenya in the mid-sixties, this seemed the sensible option. Certainly, there was no indication from London that their rights to entry would be taken away.

The 1968 Commonwealth Immigrants Act was specifically targeted at restricting Kenyan Asians with British passports. As conditions grew worse for them in Kenya, many of them decided to seek refuge in the mother country of the Empire which had settled them in the first place. Throughout 1967 they were coming in by plane at the rate of about a thousand per month. The newspapers began to depict the influx on their front pages and the television news, by now watched in most homes, showed great queues waiting for British passports and flights. It was at this point that Enoch Powell, Conservative MP for Wolverhampton and shadow minister, in an early warning shot, said that half a million East African Asians could eventually enter, which was ‘quite monstrous’. He called for an end to work permits and a complete ban on dependants coming to Britain. Other prominent Tories, like Iain Macleod, argued that the Kenyan Asians could not be left stateless and that the British Government had to keep its promise to them. The Labour government was also split on the issue, with the liberals, led by Roy Jenkins, believing that only Kenyatta could halt the migration by being persuaded to offer better treatment. The new Home Secretary, Jim Callaghan, on the other hand, was determined to respond to the concerns of Labour voters about the unchecked migration.

By the end of 1967, the numbers arriving per month had doubled to two thousand. In February 1968, Callaghan decided to act. The Commonwealth Immigrants Act effectively slammed the door while leaving a ‘cat flap’ open for a very small annual quota, leaving some twenty thousand people ‘stranded’ and stateless in a country which no longer wanted them. The bill was rushed through in the spring of 1968 and has been described as among the most divisive and controversial decisions ever taken by any British government. Some MPs viewed it as the most shameful piece of legislation ever enacted by Parliament, the ultimate appeasement of racist hysteria. The government responded with a tougher anti-discrimination bill in the same year. For many others, however, the passing of the act was the moment when the political élite, in the shape of Jim Callaghan, finally woke up and listened to working-class voters. Polls of the public showed that 72% supported the act. Never again would the idea of free access to Britain be seriously entertained by mainstream politicians.

This was the backcloth to the notorious Rivers of Blood speech made in Birmingham by Enoch Powell, in which he prophesied violent racial war if immigration continued. Powell had argued that the passport guarantee was never valid in the first place. Despite his unorthodox views, Powell was still a member of Edward Heath’s shadow cabinet which had just agreed to back Labour’s Race Relations Bill. But Powell had gone uncharacteristically quiet, apparently telling a local friend,

I’m going to make a speech at the weekend and it’s going to go up “fizz” like a rocket, but whereas all rockets fall to earth, this one is going to stay up. 

The ‘friend’, Clem Jones, the editor of Powell’s local newspaper, The Wolverhampton Express and Star, had advised him to time the speech for the early evening television bulletins, and not to distribute it generally beforehand. He came to regret the advice. In a small room at the Midland Hotel in Birmingham on 20th April 1968, three weeks after the act had been passed and the planes carrying would-be Kenyan Asian immigrants had been turned around, Powell quoted a Wolverhampton constituent, a middle-aged working man, who told him that if he had the money, he would leave the country because, in fifteen or twenty years’ time, the black man will have the whip hand over the white man. Powell continued by asking rhetorically how he dared say such a horrible thing, stirring up trouble and inflaming feelings. He answered himself:

“The answer is I do not have the right not to do so. Here is a decent, ordinary fellow-Englishman, who in broad daylight in my own town says to me, his Member of Parliament, that this country will not be worth living in for his children. I simply do not have the right to shrug my shoulders and think about something else. What he is saying, thousands and hundreds of thousands are saying and thinking… ‘Those whom the Gods wish to destroy, they first make mad.’ We must be mad, literally mad, as a nation to be permitting the annual flow of some fifty thousand dependants, who are for the most part the material of the future growth of the immigrant-descended population. It is like watching a nation busily engaged in heaping up its own funeral pyre.” 

He then used a classical allusion to make a controversial prophecy:

As I look ahead, I am filled with foreboding. Like the Roman, I seem to see ‘the river Tiber foaming with much blood.’

Enoch Powell, the influential opponent of immigration.

In the context of his speech to this point and his earlier pronouncements as a maverick right-winger, most people considered this a prophecy of a violent inter-racial war if black immigration continued. His inflammatory rhetoric was taken as a prediction that rivers of blood would flow similar to those seen in the recent race riots in the United States, and the speech quickly became known as the ‘Rivers of Blood’ speech. He also repeated various accusations made by other constituents: that they had been persecuted by ‘Negroes’, having excrement posted through their letterboxes and being followed to the shops by children, charming wide-grinning pickaninnies chanting “Racialist.” If Britain did not begin a policy of voluntary repatriation, it would soon face the kind of race riots that were disfiguring America. Powell claimed that he was merely restating Tory policy. But the language used and his own careful preparation suggest that it was both a call to arms by a politician who believed he was fighting for white English nationhood and a deliberate provocation aimed at Powell’s enemy, Heath.

After horrified consultations when he and other leading Tories had seen extracts of the speech on the television news, Heath promptly ordered Powell to phone him, and summarily sacked him. Heath announced that he found the speech racialist in tone and liable to exacerbate racial tensions. As Parliament returned three days after the speech, a thousand London dockers marched to Westminster in Powell’s support, carrying ‘Enoch is right’ placards; by the following day, he had received twenty thousand letters, almost all in support of his speech, with tens of thousands still to come. Smithfield meat porters and Heathrow airport workers also demonstrated in support of him. Powell received death threats and needed full-time police protection for a while; numerous marches were held against him and he found it difficult to make speeches at or near university campuses. Asked whether he was a racialist by the Daily Mail, he replied:

‘We are all racialists. Do I object to one coloured person in this country? No. To a hundred? No. To a million? (A query). To five million? Definitely.’

Did most people in 1968 agree with him, as Andrew Marr has suggested? It’s important to point out that, until he made this speech, Powell had been a Tory ‘insider’, though also seen as a maverick and a trusted member of Edward Heath’s shadow cabinet. He had rejected the consumer society growing around him in favour of what he saw as a ‘higher vision’. This was a romantic dream of an older, tougher, swashbuckling Britain, freed of continental and imperial (now ‘commonwealth’) entanglements, populated by ingenious, hard-working white people rather like himself. For this to become a reality, Britain would need to become a self-sufficient island, which ran entirely against the great forces of the time. His view was fundamentally nostalgic, harking back to the energetic Victorians and Edwardians. He drew sustenance from the people around him, who seemed to be excluded from mainstream politics. He argued that his Wolverhampton constituents had had immigration imposed on them without being asked and against their will.

But viewed from Fleet Street or the pulpits of broadcasting, he was seen as an irrelevance, marching off into the wilderness. In reality, although immigration was changing small patches of the country, mostly in west London, west Birmingham and the Black Country, it had, by 1968, barely impinged as an issue in people’s lives. That was why, at that time, it was relatively easy for the press and media to marginalise Powell and his acolytes in the Tory Party. He was expelled from the shadow cabinet for his anti-immigration speech, not so much for its racialist content, which was mainly given in reported speech, but for suggesting that the race relations legislation was merely throwing a match on gunpowder. This statement was a clear breach of shadow cabinet collective responsibility. Besides, the legislation controlling immigration and regulating race relations had already been passed, so it is difficult to see what Powell had hoped to gain from the speech, apart from embarrassing his nemesis, Ted Heath.

Edward Heath, leader of the Conservatives from 1965 & Prime Minister, 1970-74.
Ted’s Grandfather Clause & the Ugandan Refugees:

Despite the dramatic increase in wealth, coupled with the emergence of distinctive subcultures, technological advances and dramatic shifts in popular culture, there was a general feeling of disillusionment with Labour’s policies nationally. In the 1970 General Election, the Conservative Party, under its new leader Edward Heath, was returned to power. Although Enoch Powell had been sacked from the shadow cabinet by Heath, more legislative action followed with the 1971 Immigration Act, which effectively restricted citizenship on racial grounds by enacting the Grandfather Clause, by which a Commonwealth citizen who could prove that one of his or her grandparents was born in the UK was entitled to immediate entry clearance. This operated to the disadvantage of Black and Asian applicants while favouring citizens of the old Commonwealth, descendants of white settlers from Australia, New Zealand, Canada and South Africa. Thus immigration control had moved away from primary immigration to restricting the entry of dependants, or secondary immigration.

Enoch Powell himself, from the back-benches, likened the distinction between ‘new’ and ‘old’ Commonwealth immigrants to a Nazi race purity law; he wanted a new definition of British citizenship instead. The grandparent rule was defeated by the right and left combining for opposite reasons but was restored two years later. In the meantime, the Kenyan crisis was replayed in another former East African colony, Uganda. Here, the swaggering, Sandhurst-educated Idi Amin had come to power in a coup. He announced that he had been told in a dream that he must expel the country’s Asians, just as the Kenyans had expelled theirs. Though Powell argued angrily that Britain had no obligation to allow the trapped Ugandan Asians into its cities, Heath acted decisively to bring them in. Airlifts were arranged, with a resettlement board to help them, and twenty-eight thousand arrived within a few weeks in 1972, eventually settling in the same areas as other East Africans, even though Leicester had published adverts in Ugandan newspapers pleading with them not to come there.

Those who knew Powell best claimed that he was not a racialist. The local newspaper editor, Clem Jones, thought that Enoch’s anti-immigration stance was not ideologically-motivated, but had simply been influenced by the anger of white Wolverhampton people who felt they were being crowded out; even in Powell’s own street of good, solid, Victorian houses, next door went sort of coloured and then another and then another house, and he saw the value of his own house go down. But, Jones added, Powell always worked hard as an MP for all his constituents, mixing with them regardless of colour:

We quite often used to go out for a meal, as a family, to a couple of Indian restaurants, and he was on extremely amiable terms with everybody there, ‘cos having been in India and his wife brought up in India, they liked that kind of food.

On the numbers migrating to Britain, however, Powell’s predicted figures were not totally inaccurate. Just before his 1968 speech, he had suggested that by the end of the century, the number of black and Asian immigrants and their descendants would be between five and seven million, about a tenth of the population. According to the 2001 census, 4.7 million people identified as black or Asian, equivalent to 7.9 per cent of the total population. Immigrants were, of course, far more strongly represented in percentage terms in English cities. Powell may have helped British society by speaking out on an issue that, until then, had remained taboo. However, the language of his discourse still seems so inflammatory and provocative, even fifty years later, that historians hesitate to quote his words. His speech also helped to make the neo-Nazis of the extreme-right National Front more acceptable. Furthermore, his core prediction of major civil unrest was not fulfilled, despite riots and street crime linked to disaffected youths from Caribbean immigrant communities in the 1980s. So, in the end, Enoch was not right, though he may have had a point.

Immigrants to Birmingham also tended to congregate in the western suburbs along the boundary with Smethwick, Warley, West Bromwich (now Sandwell), and Dudley, where many of them also settled. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner city wards and in north-west Birmingham, especially in Handsworth, Sandwell and Sparkbrook. Labour shortages had developed in Birmingham as a result of an overall movement towards more skilled and white-collar employment among the native population, which created vacancies in the less attractive, poorly paid, unskilled and semi-skilled jobs in manufacturing, particularly in metal foundries and factories, and in the transport and health care sectors of the public services. These jobs were filled by newcomers from the new Commonwealth. In the 1970s, poor pay and working conditions forced some of these workers to resort to strike action. Hostility to Commonwealth immigrants was pronounced in some sections of the local white population, and contributed to what became known as white flight, migration from the inner city areas to the expanding suburbs to the southwest and east of the city, though it is still unclear to what extent this migration was really due to concerns about immigrants.

In nearby Coventry, despite emerging signs of a stall in population growth by the end of the sixties, the authorities continued to view the city and its surroundings as a major area of demographic expansion. In October 1970 a Ministry of Housing representative predicted that the city’s population would rise by a third over the next twenty years. Immigration changed post-war Britain more than almost any other single social factor, more significant than the increase in life expectancy, birth control, the death of deference or the spread of suburban housing. The only change that eclipses it is the triumph of the car. It was not a change that the indigenous British cultures had asked for, though the terms and circumstances in which fifty million people might suddenly have chosen to ask and answer such a question, possibly in a referendum, are impossible to imagine.

The majority of British people did not want the arrival of large numbers of Irish, West Indians and south Asians, but neither did they want an end to capital punishment or membership in a federal European Union, or many other things that their political élite decided upon. Yet, at no stage was there a measured, rational and frank debate about immigration between party leaders in front of the electorate. And while allowing this change to take effect piecemeal and by default, the main parties did very little to ensure the successful integration of immigrants from the Caribbean or the Indian subcontinent. When help was given, in the case of the East African Asian refugees, integration was more successful. But even with these sudden influxes, there was no real attempt to nurture mixed communities, or to prevent the growth of mini-ghettoes such as those that developed in the East Midlands and South Yorkshire. Race relations legislation did come, but only as a counter-balance to further restrictions.

Primary children celebrate Diwali, the Hindu festival of lights, wearing their traditional dress.

As New Commonwealth immigrants began to become established in postwar Birmingham, community infrastructures, including places of worship, ethnic groceries, halal butchers and, most significantly, restaurants, began to develop. Birmingham became synonymous with the phenomenal rise of the ubiquitous curry house, and Sparkbrook in particular developed unrivalled Balti restaurants. These materially changed patterns of social life in the city, even among the native population. In addition to these obvious cultural contributions, the multilingual setting in which English exists today became more diverse in the sixties and seventies, especially due to immigration from the Indian subcontinent and the Caribbean. The largest of the community languages was Punjabi, with over half a million speakers by the late seventies. Still, there were also substantial communities of Gujarati speakers, as many as a third of a million, and up to a hundred thousand Bengali speakers. In some areas, such as East London, public notices recognised this.

A Bengali road sign in East London.

A new level of linguistic and cultural diversity was introduced by Commonwealth immigration. This manifested itself not just in the various ‘new’ languages that entered Britain, but also in the development of new dialects of English originating in different parts of the old Empire, especially in the West Indies.

Inter-cultural Diversity, Music & Integration:

Within the British West Indian community, Jamaican English, or the patois – as it is known – has had a special place as a token of identity. While there were complicated social pressures that frowned on Jamaican English in Jamaica, with parents complaining when their children ‘talk local’ too much, in London it became almost obligatory to speak it. One Jamaican schoolgirl who made the final passage to the Empire’s capital city with her parents in the seventies put it like this:

It’s rather weird ’cos when I was in Jamaica I wasn’t really allowed to speak it (Jamaican creole) in front of my parents. I found it difficult in Britain at first. When I went to school I wanted to be like the others in order not to stand out. So I tried speaking the patois as well… You get sort of a mixed reception. Some people say, ’You sound really nice, quite different.’ Other people say, ’You’re a foreigner, speak English. Don’t try to be like us, ’cos you’re not like us.’

Despite the mixed reception from her British West Indian friends, she persevered with the patois, and, as she put it, ‘after a year I lost my British accent, and was accepted.’ However, for many Caribbean visitors to Britain, the patois of Brixton and Notting Hill was a stylised form that was not, as they saw it, truly Jamaican, not least because British West Indians came from all parts of the Caribbean. Another West Indian schoolgirl, born in London and visiting Jamaica for the first time, was teased for her patois and told that she didn’t sound right. The experience convinced her that…

… in London the Jamaicans have developed their own language in patois, sort of. ’Cos they make up their own words in London, in, like, Brixton. And then it just develops into patois as well.

Researchers found that there were already white children in predominantly black schools who had begun using the British West Indian patois in order to be accepted by the majority of their friends, who were black:

I was born in Brixton and I’ve been living here for seventeen years, and so I just picked it up from hanging around with my friends who are mainly Black people. And so I can relate to them by using it, because otherwise I’d feel an outcast… But when I’m with someone else who I don’t know I try to speak as fluent English as possible. It’s like I feel embarrassed about it (the patois), I feel like I’m degrading myself by using it.

The unconscious racism of such comments pointed to the predicament of Black Britons. Not fully accepted by the established native population, for all the latter’s rhetoric, they felt neither fully Caribbean nor fully British. This was the poignant outcome of what the British Black writer Caryl Phillips called The Final Passage. Phillips, who came to Britain as a baby in the late 1950s, was one of the first of his generation to grapple with the problem of finding a means of literary self-expression that was true to his experience:

The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity crisis that mirrors the larger cultural confusion.

In his novel, The Final Passage, the narrative is in Standard English. But the speech of the characters is a rendering of nation language:

I don’t care what anyone tell you, going to England be good for it going to raise your mind. For a West Indian boy you just being there is an education, for you going see what England do for sheself… It’s a college for the West Indian.

The lesson of this college is, as Phillips puts it, that symptomatic of the colonial situation, the language has been divided as well. English – creole or standard – was the only available language in the British Black community and in the English-speaking islands of the Caribbean.

By the end of the seventies, Caribbean and Rastafarian reggae music was beginning to have a broad impact on British pop culture. In Birmingham, a multiracial group of out-of-work young men formed the band UB40, named after their benefit claim forms. The Handsworth reggae band Steel Pulse also became popular. On the other side of rock ‘politics’, there was an eruption of racist, skinhead rock, and a ‘casual’ but influential interest in the far right from among more established artists. At a concert at the Birmingham Odeon, in 1976, Eric Clapton, arriving on stage an hour late either drunk or stoned (or both), enquired as to whether there were any immigrants in the audience. He then said, to the shock and disgust of almost everyone there, Powell is the only bloke who’s telling the truth, for the good of the country. David Bowie was also heard to flirt with far-right ideas and Sid Vicious of the punk band, The Sex Pistols, contributed the following dubious lyrics to contemporary political thought:

‘Belsen was a gas/ I read the other day/ About the open graves/ Where the Jews all lay …’

Punk gets cheeky: Vivienne Westwood (centre), Chrissie Hynde (left) and Jordan advertise Westwood’s King’s Road punk shop, Sex, in 1976.

Reacting to the surrounding mood, as well as to concerns about these deliberately outrageous statements and actions (McLaren and Westwood produced clothing with swastikas and other Nazi emblems), Rock Against Racism was formed in August 1976, organising a series of charity concerts throughout Britain and helping to create the wider Anti-Nazi League a year later. Punk bands were at the forefront of the RAR movement, above all The Clash, whose lead singer Joe Strummer became more influential than Johnny Rotten and the rest of the Sex Pistols. Ska and soul music also had a real influence in turning street culture decisively against racism. Coventry’s ska revival band, The Specials, captured and expressed this new mood. The seventies produced, in the middle of visions of social breakdown, a musical revival which reflected the reality of a lost generation, whilst in turn reviving their sense of enjoyment of life. As one contemporary cultural critic put it:

‘A lifestyle – urban, mixed, music-loving, modern and creative – had survived, despite being under threat from the NF.’

Dave Haslam (2005), Not Abba, Fourth Estate.

Punk rock was in part a reaction against growing youth unemployment and also came to symbolise a rejection of commercialism. Ironically, as with earlier youth sub-cultures, it soon became highly commercialised.

Punk rockers

The streets might be dirty and living standards falling, but by the end of the seventies, the streets were also getting safer from racist thugs and, contrary to some stereotypes, the quality of life was improving. The integration of diverse cultures and sub-cultures was working at a ‘local’ street level and in parish schools, not directed from the top down.

The arrival of Islam, Sikhism and Hinduism transformed the celebration of religious faith, including Christianity, in many schools. In 1970 there were about 300,000 Muslims in Britain. By 1990 this number had grown to one million. There were more than three hundred mosques, the largest of which was in Central London. There were also Sikh and Hindu communities, each numbering around 300,000 members. For many Christians at Advent, the school Nativity play and carol service remained the main traditions, but many schools were developing new customs and practices to ensure that pupils from a range of backgrounds and faiths were included. It was increasingly recognised that a lack of thought or sensitivity on the part of schools at this time of year could negate much of the work done during the rest of the year to improve community relations. Parents seemed to be satisfied with the diverse menu on offer. They turned up in their thousands to see their children perform in celebrations which were primarily a source of fun for all.

Above: Christmas celebrations in the Scottish Highlands and at an inner-city comprehensive in London. Top: at Islington Green Secondary School, pupils of all ethnicities give traditional performances and learn each other’s national dances. Bottom: Christmas decorations at Kirkton Primary School.

Economic Decline & Deindustrialisation:

A great variety of explanations for the decline in British industrial competitiveness were put forward, and have continued to be debated since. None of these explanations has proved wholly satisfactory, however. One explanation suggests that there is a cultural obstacle, that the British have been conditioned to despise industry. This might be a relevant argument to apply to a new England, with an industrial heritage going back only two or three generations, and to the old England of the traditional rural areas, although even in these areas it would be something of a stereotype. It would be difficult to apply to industrial Britain, however, with its generations of coal miners, shipbuilders, foundry and factory workers.

During the depression years of the 1930s many of these workers, finding themselves unemployed, had, like the father of Norman Tebbit (later Margaret Thatcher’s Employment Secretary and Party Chairman), got on their bikes, or walked long distances, in their hundreds of thousands to find work in the new manufacturing areas. With no jobs to be found anywhere, these were pretty pointless words of advice. Pointless or not, Tebbit’s speech was picked up by the popular Tory press and appeared in the banner On Your Bike headlines which have since become so emblematic of the Thatcher era. Unfortunately, the same press used them to put forward a related argument that the British were not sufficiently materialistic to work hard for the rewards associated with improved productivity. Complacency from generations of national success has also been blamed, as has the Welfare State’s cosseting of both the workforce and those out of work.

Keynes’ argument had been that keeping workers in employment multiplied demand through the economy, as they spent part of their incomes on goods and services; the same multiplier was now shown to operate in the opposite direction through the effects of rising unemployment. However, the majority of people of working and voting age had no adult memory of their own of the 1930s, and radical politicians were able to exploit these demographics to argue the case for monetarism, with tight controls on public spending. In these circumstances, voters felt that spending public money on ailing industries was wasteful and inappropriate, especially as it raised their tax burden.

The story of 1970s Britain, whether viewed from an economic, social or cultural perspective can be summed up by one word, albeit a long one – deindustrialisation. As with the processes of industrialisation two centuries before, Britain led the way in what was to become a common experience of all the mature industrial nations. The so-called maturity thesis suggested that, as industries developed and became more technologically sophisticated, they required less labour. At the same time, rising living standards meant that more wealth was available, beyond what would normally be spent on basic necessities and consumer goods, giving rise to a growing demand for services such as travel, tourism and entertainment. By 1976, services had become the largest area of employment in all the regions of Britain.

Another problem faced by the manufacturing sector was the long-standing British taste for imported goods. Many observers noted that not only was the country failing to compete internationally, but British industry was also losing its cutting edge when competing with foreign imports in the domestic market. The problem of deindustrialisation, therefore, became entwined with the debate over Britain’s long decline as a trading nation, going back over a century. It was seen not only as an economic decline but as a national failure, one which, in speeches and election propaganda, even in education, struck deep within the collective British cultural psyche. By 1977, if not before, Britain’s role as the world’s first and leading industrial nation was finally over, just as its time as an imperial power had effectively ended fifteen years earlier, as Dean Acheson had commented. It was another question as to whether the British people and politicians were prepared to accept these salient facts and move on.

British industry’s share of world trade fell dramatically during these years, and by 1975 it was only half what it had been in the 1950s, falling to just ten per cent. Nor could it maintain its hold on the domestic market. A particularly extreme example of this was the car industry: in 1965, with Austin minis selling like hotcakes, only one car in twenty was imported, but by 1978 nearly half were. In addition, many of the staple industries of the nineteenth century, such as coal and shipbuilding, continued to decline as employers, surviving only, if at all, through nationalisation. In addition, many of the new industries of the 1930s, including the car industry, were seemingly in terminal decline by the 1970s, as we have seen in the case of Coventry. Therefore, deindustrialisation was no longer simply a problem for an old Britain, it was also one for a new England. It was even a problem for East Anglia because although it was not so dependent on manufacturing, and services were growing, agriculture had also declined considerably.

Alternatively, the government’s failure adequately to support research and development has been blamed for Britain’s manufacturing decline, together with the exclusive cultural and educational backgrounds of Westminster politicians and government ministers, and Whitehall civil servants. This exclusivity, it is argued, left them ignorant of, and indifferent to, the needs of industry. Employment in manufacturing reached a peak of nine million in 1966. After that, it fell rapidly, to four million by 1994. Much of this loss was sustained in the older industries of Northwest England, but the bulk of it was spread across the newer industrial areas of the Midlands and Southeast.

Between 1973 and 1975 there was the first of three severe recessions. When Wilson returned to number ten in February 1974, he faced a huge balance-of-payments crisis and the tumbling value of the pound. He was trying to govern without an overall majority at a time when the economy was still recovering from the effect of the oil price shock, with inflation raging and unemployment rising. Inflation reached twenty-four per cent by 1975, and it came to be seen as a far more urgent problem than unemployment. Furthermore, the fragile and implausible Social Contract now had to be tested. Almost the first thing Labour did was to settle with the miners for double what Heath had thought possible. The new Chancellor, Denis Healey, introduced an emergency Budget soon after the election, followed by another in the autumn, raising income tax to eighty-three per cent at the top rate, and ninety-eight per cent for unearned income, a level so eye-wateringly high it has been used against Labour ever since. Healey also increased help for the poorest, with higher pensions and subsidies on housing and food. He was trying to deliver for the unions by upholding his side of the social contract, as was Wilson when he abolished the Tories’ employment legislation.

In October 1974, a second general election gave Labour eighteen more seats and a workable overall majority of three. Much of Healey’s energy, continuing as Chancellor, was thrown into dealing with the unstable world economy, with floating currencies and inflation-shocked governments. He continued to devalue the pound against the dollar and to tax and cut as much as he dared, but his only real hope of controlling inflation was to control wages. Wilson insisted that his income policy must be voluntary, with no return or recourse to the legal restraints of the Heath government. The unions became increasingly worried that rampant inflation might bring back the Tories. So, for a while, the Social Contract did deliver fewer strikes, which halved and halved again the following year. Contrary to popular myth, the seventies were not all about mass meetings and walk-outs. The real trouble did not begin again until the winter of 1978-79.

But the other side of the social contract was not delivered. By the early months of 1975, the going rate for increases was already thirty per cent, a third higher than inflation. By June inflation was up to twenty-three per cent, and wage settlements were even further ahead. The government then introduced an element of compulsion, but this applied to employers who offered too much, rather than to trade unions. Nevertheless, Healey reckoned that two-thirds of his time was taken up in managing the inflationary effects of free collective bargaining. In his memoirs, he reflected:

‘Adopting a pay policy is rather like jumping out of a second-floor window: no one in his senses would do it unless the stairs were on fire. But in postwar Britain the stairs have always been on fire.’

Denis Healey (1989), The Time of My Life. Michael Joseph.

Healey did, however, manage to squeeze inflation downwards. He also reflected that, had the unions kept their promises, it would have been down to single figures by the autumn of 1975. At the same time, Healey continued to tax higher earners more, concentrating tax cuts on the worse off. Though notorious (somewhat unfairly) for promising that he would make the rich howl with anguish and that he would squeeze them until the pips squeak, Healey argued that it was the only way of making the country fairer. He never accepted the Tory tenet that higher taxes stopped people from working harder and instead blamed Britain’s poor industrial performance on low investment, and poor training and management, as others have done since.

What we now know is that Wilson wasn’t planning to stay long in his second premiership. There are many separate records of his private comments about retiring at sixty, after two more years in power. If he had not privately decided that he would go in 1976, he certainly acted as if he had. The question of who would succeed him, Jenkins or Callaghan, Healey or even Benn, had become one about the direction of the Labour government, rather than a personal threat to Wilson himself, so there was less rancour around the cabinet table. He seems likely to have known about the early stages of his Alzheimer’s disease, which would devastate him in retirement. He had already begun to forget facts, confuse issues and repeat himself. For a man whose memory and sharp wit had been so important, this must have taken a huge toll. It was Jim Callaghan who finally replaced Wilson at number ten after a series of votes by Labour MPs. But for three turbulent years, he ran a government with no overall majority in Parliament, kept going by a series of deals and pacts, and in an atmosphere of constant crisis.

Callaghan was the third and last of the consensus-seeking centrist PMs after Wilson and Heath, and the first postwar occupant of the office not to have gone to ‘Oxbridge.’ In fact, he had not been to university at all. The son of a Royal Navy chief petty officer who had died young and a devout Baptist mother from Portsmouth, he had known real poverty and had clawed his way up as a young clerk working for the Inland Revenue, and then as a union official, before wartime naval service. Like Healey, he was one of the 1945 generation of MPs, a young rebel who had drifted rightwards while always keeping his strong trade union instincts. He was a social conservative, uneasy about divorce and homosexuality, and vehemently pro-police, pro-monarchy and pro-military. He was also anti-hanging and strongly anti-racialist. As Home Secretary, he had announced that the permissive society had gone too far. On the economy, he became steadily more impressed by monetarists like the Tory MP Keith Joseph. He told the 1976 Labour Conference, used to Keynesian doctrines about governments spending their way out of recession, …

‘… that option no longer exists and that insofar as it ever did exist, it worked by injecting inflation into the economy … Higher inflation, followed by higher unemployment. That is the history of the last twenty years.’

Yet Callaghan is forever associated with failure in the national memory. This was due to the Labour government’s cap-in-hand begging for help from the International Monetary Fund (IMF). Healey had negotiated a six-pound-a-week pay limit with the unions that would eventually feed through into lower wage increases and thereby lower inflation. Cash limits brought in under Wilson would also radically cut public expenditure. But in the spring of 1976 inflation was still rampant and unemployment was rising fast. Healey told Callaghan that because of the billions spent by the Bank of England supporting sterling in the first months of the year, a loan from the IMF looked essential. In June, standby credits were arranged with the IMF and countries including the United States, Germany, Japan and Switzerland.

Healey had imposed tough cuts in the summer but by its end, the pound was under intense pressure again. On 27th September, he was meant to fly out to a Commonwealth finance ministers’ conference in Hong Kong with the Governor of the Bank of England, but so great was the crisis and so panicked were the markets that he decided he could not afford to be in the air and out of contact for so long. In full view of the television cameras, he turned around at Heathrow and headed for the Treasury, where he decided to apply to the IMF for a conditional loan, one which gave authority to the international banking officials above Britain’s elected leaders. Almost simultaneously, the Ford workers went on strike. Close to a nervous breakdown, and against Callaghan’s advice, Healey decided to dash to the Labour conference in Blackpool and make his case to an angry and anguished party. Many on the left were making a strong case for a siege economy; telling the IMF to ‘get lost,’ cutting imports and nationalising swathes of industry. Given just five minutes to speak from the floor, the Chancellor warned his party that this would risk a trade war, mass unemployment and the return of a Tory government. He shouted against a rising hubbub that he would negotiate with the IMF, which would mean…

‘… things we do not like as well as things we do like … it means sticking to the very painful cuts in public expenditure … it means sticking to the pay policy.’

Denis Healey, Chancellor of the Exchequer during the economic storm, made a characteristic point (or two) to his opponents at the Labour conference.

So, with the cabinet watching on nervously, the negotiations started with the IMF, which insisted on severe funding cuts. Callaghan and Healey naturally wanted to limit these as far as they could, but the IMF, with the US Treasury standing behind it, was under pressure to squeeze even harder. The British were in a horribly weak position, not least because the government was riven by arguments and threats of resignation, including from Healey. The cabinet was split over what levels of cuts were acceptable and whether there was any real alternative in the form of a leftist siege economy. Callaghan and the lead IMF negotiator held private talks in which the PM warned that British democracy itself would be imperilled by mass unemployment. But the IMF was still calling for an extra billion pounds worth of cuts, and it was only when Healey, without telling Callaghan, threatened the international bankers with yet another Who runs Britain? election that they gave way. The final package of cuts was announced in Healey’s budget, severe but not as grim as some had feared, yet still greeted with headlines about Britain’s shame.

As it turned out, however, the whole package was unnecessary from the start. The cash limits Healey had already imposed on Whitehall would cut spending far more effectively than anyone realised. More startling still, the public spending statistics, on which the cuts were based, and especially the estimates for borrowing, were wildly wrong. The public finances were stronger than they appeared. The IMF-directed cuts were therefore more savage than they needed to be. Britain’s balance of payments came back into balance long before the cuts could take effect, and Healey reflected later that had he been given accurate forecasts in 1976, he would never have needed to go to the IMF at all. In the end, only half the loan was used, all of which was repaid before Labour left office.

Factory workers strike and picket over low pay and closures in 1977.

Following the IMF affair, the pound recovered strongly, the markets recovered, inflation fell, eventually to single figures, and unemployment fell too. But the contraction of manufacturing began to accelerate and there was a national and international swing to the right as a reaction against perceived high-taxing and high-spending governments. Demands were being made that governments cease propping up ‘lame duck’ industries with public money. Attacks on trade union power continued, coming to a head in the ‘Winter of Discontent’ of 1978-79, when there was an explosion of resentment, largely by poorly paid public employees, against a government income policy they felt was discriminatory.

Rubbish piled up in London during the ‘Winter of Discontent’ of 1978/79.

Appendix II: HRHs Prince Charles & Princess Anne – Vision & Work:

Following his Investiture as Prince of Wales, Charles was sent around the world as the heir to the throne and the House of Windsor-Mountbatten’s new star. Returning to Cambridge the following year, he finished his studies and took his bachelor’s degree.

He then went to the Royal Naval College in Dartmouth, where his mother and father had first met and where Prince Philip had trained as an officer in the Royal Navy before the Second World War. Charles also trained at the Royal Air Force College, becoming a helicopter pilot. From 1971 to 1976 he served a tour of duty in the Royal Navy. As Prince of Wales, Charles wanted to make a difference. He possessed a personal vision of harmony between human society and the natural world. He was outspoken about the environment, then viewed as a niche interest and a marginal issue, and was therefore seen by many as quite eccentric. But he continued to develop his interests in gardening, sustainability and conservation, in addition to architecture, spirituality and social reform, sharing many of these interests with his father. On leaving the Navy in 1976, he set up The Prince’s Trust to help young people get into work and supported the Business in the Community scheme; he later founded The Prince of Wales’s Institute of Architecture and was involved with urban regeneration and development projects. In addition, he oversaw the management of the Duchy of Cornwall.

Throughout the 1970s, pressure grew on Charles to marry. He loved sports, especially playing polo, and was seen as one of the world’s most eligible bachelors, appearing on covers of magazines and tabloid newspapers, and with beautiful society women, including Camilla Shand, who later married and then divorced Andrew Parker-Bowles.

The Royal Family Tree, 1894-1990:


Princess Anne, known as the Princess Royal, is the Queen’s only daughter. She was given the title by the Queen in 1987. A keen and capable horsewoman, she won the individual European Eventing Championship in 1971, for which she was voted BBC Sports Personality of the Year by millions of television viewers. She represented Great Britain and Northern Ireland in the 1976 Olympics in Montreal. In 1970 she became President of the Save the Children Fund, gaining great admiration for her tireless work for the charity around the world, as seen in the photo below (the princess is in the centre, wearing a saffron top).


Andrew Marr (2007, ’08, ’09), A History of Modern Britain. Basingstoke: Pan Macmillan.

Philip Parker (2017), History of Britain in Maps. Glasgow: HarperCollins.

John Hayward & Simon Hall (eds.) (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

Gwyn A. Williams (1985), When Was Wales? Harmondsworth: Penguin Books.


Majesty & Grace IX: The Reign of Elizabeth Windsor, 1963-78: Part 1 – Rebellious Britons.

Protest & Planning, 1963-68 – Youth, Vietnam & Grosvenor Square:

The 1960s were dramatic years in Britain. Demographic trends, especially the increase in the proportion of teenagers in the population, coincided with economic affluence and ideological experimentation to reconfigure social mores to a revolutionary extent. In 1964, under Harold Wilson, the Labour Party came into power, promising economic and social modernisation. Economically, the main problems of the decade arose from the devaluation of the currency in 1967 and the increase in industrial action. These were symptoms of deeper issues in the economy, such as the decline of the manufacturing industry to less than one-third of the workforce. By contrast, employment in the service sector rose to over half of all workers.

Young people were most affected by the changes of the 1960s. Overall, the period from the early 1950s to the mid-1970s was a long period of economic expansion and demographic growth which helped to fuel educational development in England. Education gained new prominence in government circles and student numbers soared. By 1966, seven new universities had opened: Sussex (pictured below), East Anglia, Warwick, Essex, York, Lancaster and Kent.

From a pamphlet on the History of Architecture.

By 1972 there were forty-five universities, compared with just seventeen in 1945. More importantly, perhaps, students throughout the country were becoming increasingly radicalised, growing ever more hostile towards what they perceived as the political and social complacency of the older generation. They staged protests on a range of issues, from dictatorial university decision-making to apartheid in South Africa and the continuance of the Vietnam War. But not all members of the ‘older generation’ were ‘complacent’, and many joined in the protests.

Above: A Quaker ‘advertisement’ in the Times, February 1968.

The Vietnam War not only angered the young of Britain but also placed immense strain on relations between the US and British governments. Although the protests against the Vietnam War were less violent than those in the United States, partly because of more moderate policing in Britain, there were major demonstrations all over the country; the one which took place in London’s Grosvenor Square, home to the US Embassy, in 1968, involved a hundred thousand protesters. Like the world of pop, ‘protest’ was essentially an American import. When counter-cultural poets put on an evening of readings at the Albert Hall in 1965, alongside a British contingent which included Adrian Mitchell and Christopher Logue, the ‘show’ was dominated by the Greenwich Village guru, Allen Ginsberg.

It was perhaps not surprising that the American influence was strongest in the anti-war movement. When the Vietnam Solidarity Committee organised three demonstrations outside the US embassy in London’s Grosvenor Square, the second of them particularly violent, they were copying the cause and the tactics used to much greater effect in the United States. The student sit-ins and occupations at Hornsey and Guildford Art Colleges and Warwick University were pale imitations of the serious unrest on US and French campuses. Hundreds of British students went over to Paris to join what they hoped would be a revolution in 1968, until de Gaulle, with the backing of an election victory, crushed it. This was on a scale like nothing seen in Britain, with nearly six hundred students arrested in fights with the police on a single day and ten million workers on strike across France.

Modernising Britain, 1963-68:

Andrew Marr has commented that the term ‘Modern Britain’ does not simply refer to the look and shape of the country – the motorways and mass car economy, the concrete, sometimes ‘brutalist’ architecture, the rock music and the high street chains. It also refers to the widespread belief in planning and management. It was a time of practical men, educated in grammar schools, sure of their intelligence. They rolled up their sleeves and took no nonsense. They were determined to scrap the old and the fusty, whether that meant the huge Victorian railway network, the Edwardian, old-Etonian establishment in Whitehall, terraced housing, censorship, or the prohibitions on homosexuality and abortion.

The map (below) unveiled by British Railways Board Chairman Dr Richard Beeching in March 1963 marked the symbolic end of the great railway age. Taking an axe to great swathes of rural lines, Beeching tried to fend off the challenge posed by the growth of road transport, but left large areas of the countryside with no train services at all. The spider’s web of track shown in his report, The Reshaping of British Railways, with black indicating those routes which were to be closed and red the lines selected for survival (though not all with stopping services), was Beeching’s way of solving a problem which had been apparent for some time: the railways were simply not profitable. Ticket revenues and freight charges were hopelessly inadequate to defray the expenses of running a comprehensive network, particularly as successive governments had taken the path of least political resistance, acceding to demands for higher wages in the rail industry while keeping a ceiling on fare increases.

There had been some rationalisation already. Around 1,300 miles had been closed by 1939 and, after the nationalisation of the railways in 1948, the new British Transport Commission had pared down another 3,300 miles by 1962. But the salami-slicing of selected lines could not stem the losses, which had reached just over a hundred million pounds in 1962 alone. Many feared that the unspoken policy of combining unstoppable costs with an immovably large network would lead to the death of the railways. Beeching, on secondment from ICI, at the time Britain’s largest manufacturer, stepped in with a solution that was as unpalatable as it was logical. His report provided a stark analysis: fifty per cent of Britain’s rail routes provided only two per cent of its revenues, and half of the 4,300 stations had annual receipts of less than ten thousand pounds. Some lines covered barely ten per cent of their running costs. To save the arterial rail routes, Beeching proposed ripping out the veins, ruthlessly shutting down lines where there was no prospect of a profitable service, or where routes were duplicated. He earmarked for closure half of those stations still operating in 1962 and five thousand miles of track, about a third of the total remaining.

There were howls of protest and a few lines in Scotland, Wales and the southwest of England were reprieved. Some stretches were saved and turned into heritage railways, but most remained neglected and grassed over. The losses that Beeching had hoped to stem continued: by 1968 the savings were estimated at only thirty million pounds a year, against continuing costs of a hundred million. In the same year, a new Transport Act accepted that the railways would need a subsidy for a further three years; in the event, the British government never did rid itself of the need to subsidise the country’s rail system. Yet the railways avoided complete collapse and, in terms of passenger numbers, eventually prospered: journeys, which had declined from 965 million a year in 1962 to 835 million in 1965, later recovered. Made more efficient by the closure of almost six thousand miles of track and two thousand stations after the 1963 report, the railways concentrated on fast intercity services and bulk-freight transportation. Beeching’s axe may have wounded them, but the blood-letting ultimately allowed them to survive, and even, in many areas, to thrive.

The Troubles in Ireland & Terrorism in Britain, 1964-1974:

In 1963 Terence O’Neill had become Prime Minister of Northern Ireland. His government’s policies of economic modernisation coincided with and encouraged a growing self-confidence among the Catholic middle classes, who were willing to accept the continuing partition of Ireland provided that they were given equal status within Northern Ireland. This confidence found expression in the civil rights movement of the mid-sixties, which campaigned in particular on issues of discrimination against the Catholic minority in housing and on electoral gerrymandering. O’Neill’s Catholic-friendly rhetoric began to alienate the more conservative fringes of unionism, but it was the emergence of the radical People’s Democracy movement and its socialist anti-state wing that made the prime minister’s standing within his own party increasingly difficult.

The stump of Nelson’s Pillar on Sackville Street (now O’Connell Street).

When Nelson’s Pillar in the centre of Dublin was blown up by republicans on 8th March 1966, the biggest controversy about it was why it had taken 157 years for the demolition to happen. The Protestant Ascendancy class who had erected it, circa 1808, had celebrated its construction; for many others, resentment had run deep ever since. Almost fifty years after the Easter 1916 Rising in Dublin had blazed the trail towards Irish independence and the establishment of the Republic, an English colonialist still towered over every other notable in Ireland’s capital city, they groused. The reason it had taken those fifty years to remove the pillar was to be found within the capital itself, but the reason for it happening in 1966 had much to do with what had been taking place in Ireland’s partitioned northern city, Belfast. There, and in other towns in the North, nationalist politics had moved onto the streets, where demonstrations and counter-demonstrations frequently led to riots.

For Nationalists in Ireland, both North and South, Northern Ireland was an artificial state kept in being by the control of the Protestant majority from 1922 onwards. By the late 1960s it was in disarray. A powerful civil rights movement arose on behalf of the nationalist (and usually Roman Catholic) minority, but in practice attempts to maintain inter-religious and intercultural harmony broke down.

The radicals may only have wanted a fully democratic society, but the majority of the province’s population increasingly saw this as a return to the age-old struggle for power between unionists and nationalists. While the last unionist government at Stormont, from 1969 to 1972, was trying to create a consensus by granting most of the civil rights demands, the revival of latent violent sectarianism made the province ungovernable.

The Westminster government deployed troops in the province in 1969, moving into Belfast and Londonderry to preserve order. An alarming spate of bombing attacks on English cities soon signified that the Provisional IRA and Sinn Fein were taking the almost century-long struggle into a new and sinister phase. Then, in 1972, the most violent year of the Troubles, Westminster and Whitehall took over the government of Northern Ireland through Direct Rule. In that year, over four hundred people in the province lost their lives as a result of political violence.

A rioter throws a petrol bomb at British soldiers and police in Belfast in 1972.

The British government had only reluctantly become involved. Its subsequent policies were aimed at finding a political solution by creating a middle ground where the liberal wings of nationalism and unionism could find a consensus that would eventually make the militants of both sides redundant. This strategy proved unsuccessful, not least because of the nature and internal logic of direct rule. Because they were denied direct access to power, both sides could attack British policies as inappropriate and for failing to deliver their respective demands. At the same time, paramilitaries of both sides could drive the point home by violence that was, at least in part, justifiable in the eyes of their respective communities.

On the afternoon of May Day in 1971, John Evans, manager of the hugely popular Kensington boutique Biba, walked nervously downstairs into the store’s basement. There had been a series of outlandish phone calls warning about some kind of bomb, which to start with had simply been ignored by the assistant on the till. However, five hundred women and children had been evacuated by the time Evans pushed open the door of the stock room. There was an almighty bang, a flash and a flame and a billow of smoke. The Angry Brigade, mainland Britain’s own terror group, had struck again. They had chosen Biba, they said in their statement, because they saw boutiques as modern slave houses, for both their staff and customers. What they did not seem to realise was that Biba’s customers found it liberating, not oppressive. In some ways, the Biba bombing was the event that marked the end of the sixties dream: two of the main forces behind youth culture, the revolutionary fantasy of the ‘Angries’ and the benign hippie fantasy of a consumer culture of cool clothes, were at war with each other. The two subcultures, Biba and the Angries, were two sides of the same coin of sixties youth culture.

The small group of university dropouts who made up the Angry Brigade would go to prison for ten years after a hundred and twenty-three attacks, but they are little remembered now. Yet they were the nearest Britain came to an anarchist threat, both anti-Communist and anti-Capitalist. That also made them anti-liberal. Western democracy, Stalinism, the media, the drug-taking hippy culture, modern architecture and even tourism were all targets for attack. More broadly and seriously, in the later sixties and early seventies, with significant minorities on the march from Brixton to Belfast, the liberal consensus in the United Kingdom seemed to be breaking down just as it had almost done in 1910-14. Beneath the veneer of public contentment, there were, in reality, divisive forces deeply entrenched in British society in the sixties. In the period of Harold Wilson’s first premiership (1964-70), a wide range of radical groups were exploding into revolt. The Angries were an extreme, violent manifestation of this revolt, but many young Britons were finding the values of consumerism and conformity unappealing in a world whose ecology was being disturbed and whose very existence was threatened by weapons of unimaginable horror.

Clearly, then, in the early seventies violence, even in the form of anti-State terrorism, had become a common theme on both sides of the Irish Sea. Albeit reluctantly, Edward Heath had introduced internment without trial for suspected terrorists in August 1971, authorising the arrest and imprisonment in Long Kesh prison of 337 IRA suspects. In dawn raids, three thousand troops found three-quarters of the people they were looking for, though even among these were many old or inactive former ‘official’ IRA members; many of the active ‘Provo’ (Provisional IRA) leaders escaped south of the border. Protests came in from around the world, and there was an immediate upsurge of violence, with twenty-one people killed in three days. Bombings and shootings simply increased in intensity: in the first eight weeks of 1972, forty-nine people were killed and more than 250 were seriously injured. Amid this already awful background came the moment when the ‘troubles’ became unstoppable: 30th January 1972, ‘Bloody Sunday’, when troops from the Parachute Regiment killed thirteen unarmed civilians in Londonderry. It was an appalling day, on which Britain’s reputation around the world was damaged almost irrevocably. In Dublin, ministers reacted with fury and the British embassy was burned to the ground.

The tragic event in Londonderry made it easier for the ‘Provos’ to raise funds abroad, especially in the United States, emboldening them further. Within a month, republicans had hit back with a bomb attack on the Parachute Regiment’s Aldershot headquarters, the work of the Official IRA, which killed seven people, none of them soldiers. The initial escalation of violence in ‘the province’ led to the imposition, by degrees, of direct rule by Whitehall. But all British political and administrative initiatives encountered perennial problems: one side or the other, and sometimes both, was unwilling to accept what was proposed. Ted Heath believed that he needed to persuade Dublin to drop its longstanding constitutional claim to the North and, simultaneously, to persuade mainstream Unionists to work with moderate Nationalist politicians. His first Secretary of State for Northern Ireland, a new post made necessary by direct rule, William Whitelaw, met the Provisional IRA leaders, including Gerry Adams, for face-to-face talks, a desperate and risky gamble which, however, led nowhere. There was no compromise yet available that would bring about a ceasefire.

So, ignoring the Provos, the Sunningdale Agreement of 1973 proposed a power-sharing executive of six Unionists, four Nationalists (SDLP) and one non-sectarian Alliance Party member. It failed because the majority of Unionists would not accept an Irish dimension in the form of the proposed Council of Ireland, bringing together politicians from both sides of the border with powers over a limited range of issues. This was what nationalists demanded in return for Dublin renouncing its authority over Northern Ireland. Too many Unionists were implacably opposed to the deal, and the moderates were routed at the February 1974 general election. While the British government’s approach became subtler with regard to unionist concerns, a formula that was acceptable to both sides remained elusive and would do so for another quarter of a century. At the time, Heath concluded:

‘Ultimately it was the people of Northern Ireland who threw away the best chance of peace in the blood-stained history of the six counties’.

Nonetheless, the level of political violence subsided after 1972 within Northern Ireland itself. With hindsight, ‘Bloody Sunday’ had been an exceptional, tragic event that no one had anticipated, despite the presence of British troops on the streets of Belfast and Londonderry. In most subsequent years considerably more people died in road accidents. There had, however, been another horrific spasm in July 1972, when, on ‘Bloody Friday’, some twenty bombs went off in Belfast, killing nine people. Mainland Britain then became the key Provo target. In October 1974, five people were killed and sixty injured in attacks on Guildford pubs, and in November, twenty-one (mostly young) people were killed in Birmingham by bombs placed in two pubs in the city centre.

The South Wales Coalfield Tragedy of Aberfan, October 1966:

Aberfan in the days immediately after the disaster, showing the extent of the spoil slip.

Later in the same year as the blowing up of Nelson’s Pillar in Dublin, on 21st October, the people of the British Isles were devastated by the tragedy that befell the Welsh mining village of Aberfan, in the valley below the town of Merthyr Tydfil. Twenty-eight adults and a hundred and sixteen children were lost when a colliery spoil tip, soaked by heavy rain, slipped down the hillside above the village junior school, Pantglas, smothering classes of eight and nine-year-olds and their teachers who were just beginning their lessons for the day. Funds were raised in churches and schools across Britain and Ireland for the relief effort being led by local miners. Yet, though government ministers rushed to the scene, the Queen and Royal Family were advised to stay away while bodies were still being recovered, as it was thought that her entourage might get in the way of the search and recovery operation.

The rescue of a young girl from the school; no survivors were found after 11:00 am

She delayed her visit until nine days after the disaster, a delay that was misinterpreted by some as being callous. However, when she did visit, she was visibly moved to tears and overcome with grief, so much so that she was welcomed into a miner’s house to recover her composure. Many still remember their emotions at this time, as children and parents, especially those who were the same age as the lost children of Aberfan – almost a whole year group had been wiped out. Those visiting the Welsh coalfield today can catch sight of the 194 crosses in the memorial garden from across the Taff valley.

As Prince of Wales, Charles revisited Aberfan on the fiftieth anniversary of the disaster in 2016.

In May 1997, the Queen and the Duke of Edinburgh planted a tree at the Aberfan Memorial Garden. In February 2007, the Welsh Government announced a donation of £1.5 million to the Aberfan Memorial Charity and £500,000 to the Aberfan Education Charity, representing an inflation-adjusted repayment of the money that had been taken from the disaster fund to pay for securing the remaining tips. The money for the memorial charity was used for the upkeep of the memorials to the disaster. In October 2016, on the fiftieth anniversary of the disaster, commemorative events took place in the garden and at the cemetery; the Prince of Wales represented the Queen, and government ministers were present to pay tribute. At the time of the anniversary, Huw Edwards, the BBC News journalist and presenter, described the need to continue learning lessons from Aberfan:

“What we can do, however—in this week of the fiftieth anniversary—is try to focus the attention of many in Britain and beyond on the lessons of Aberfan, lessons which are still of profound relevance today. They touch on issues of public accountability, responsibility, competence and transparency.”

The dedication plaque at the Aberfan Memorial Garden

In January 2022, there was a call to find a permanent home for the artefacts salvaged from the disaster. These include a clock that had stopped when the tragedy occurred.

Harold’s Bright Young Things & The Technological Revolution:

According to Marr (2007-09), the country seemed to be suddenly full of bright men and women from lower-middle-class or upper-working-class families who were rising fast through business, universities and the professions. They were inspired by Harold Wilson’s talk of a scientific and technological revolution that would transform Britain. In his speech at Labour’s October 1963 conference, the most famous he ever made, Wilson, then leader of the opposition, predicted that Britain would be forged in the ‘white heat’ of that revolution, and pointed out that it would require wholesale social change:

‘The Britain that is going to be forged in the white heat of this revolution will be no place for restrictive practices or for outdated methods … those charged with the control of our affairs must be ready to think and speak in the language of our scientific age. … the formidable Soviet challenge in the education of scientists and technologists in Soviet industry (necessitates that) … we must use all the resources of democratic planning, all the latent and underdeveloped energies and skills of our people to ensure Britain’s standing in the world.’

Above: Grammar School Boy, Harold Wilson, Labour leader and PM

Dedicated Followers of Fashion:

In some ways, however, the new, swinging Wilsonian Britain was already out of date by the mid-sixties. In any case, his vision, though sounding ‘modern’, was essentially that of an old-fashioned civil servant. By 1965, Britain was already becoming a more feminised, sexualised, rebellious and consumer-based society. The political classes were cut off from much of this cultural undercurrent by their age and consequent social conservatism. They looked and sounded exactly what they were: people from a more formal, former time.

Barbara Hulanicki’s Kensington shop.

By 1971, sixty-four per cent of households had acquired a washing machine. This, in addition to the rapid and real growth in earnings of young manual workers, sustained over the past decade, had, by the mid-sixties, created a generation who had money to spend on leisure and luxury. The average British teenager was spending eight pounds a week on clothes, cosmetics, records and cigarettes. In London, their attitude was summed up by the fashion designer Mary Quant, whose shop, Bazaar, on King’s Road, provided clothes that allowed people to run, to jump, to leap, to retain their precious freedom.

Beatlemania had swept the British Isles in the early sixties, and by the middle of the decade the group had become a global phenomenon, playing all over Europe, as well as Australia, Japan and, of course, the USA. In 1967 they recorded their influential album Sgt. Pepper’s Lonely Hearts Club Band, with its famous art cover, and the following year made their much-publicised trip to India. Before breaking up in 1970, they gave up touring and concentrated on recording a string of albums at EMI’s Abbey Road studios in London, including Abbey Road, the ‘White Album’ and Let It Be.

Meanwhile, a more working-class sub-culture emerged, particularly in London and the South-East, as rival gangs of Mods and Rockers, successors to the Teddy Boys of the late fifties, followed bands like The Who and The Rolling Stones. In the summer of 1964, they rode their scooters and motorbikes from the London suburbs down to Brighton, where they met up on the beach and staged fights with each other. Pete Townshend and Roger Daltrey documented the social history of the period in The Who’s 1973 rock opera Quadrophenia, later made into a film in 1979.

‘The Who’ began life as a ‘Mod’ band in the mid-sixties.
By the early 1970s, their hairstyles, clothes and music had changed dramatically.
Over 250,000 went to the Rolling Stones’ open-air concert in the early seventies.
Education – The Binary Divide & Comprehensivisation:

By 1965, the post-war division of children, effectively, into potential intellectuals, technical workers and ‘drones’ – gold, silver and lead – was thoroughly discredited. The fee-paying independent and ‘public’ schools still thrived, with around five per cent of the country’s children ‘creamed off’ through their exclusive portals. For the other ninety-five per cent, ever since 1944, state schooling was meant to be divided into three types of schools. In practice, however, this became a binary divide between grammar schools, taking roughly a quarter, offering traditional academic teaching, and the secondary modern schools, taking the remaining three-quarters of state-educated children, offering a technical and/or vocational curriculum. The grandest of the grammar schools were the 179 ‘direct grant’ schools, such as those in the King Edward’s Foundation in Birmingham, which J.R.R. Tolkien had attended, and the Manchester Grammar School. They were controlled independently of both central and local government, and their brighter children would be expected to go to the ‘better’ universities, including Oxford and Cambridge, from where they would enter the professions.

Alongside the direct grant schools, also traditionalist in ethos but ‘maintained’ by the local authorities, were some 1,500 ordinary grammar schools. The division was made on the basis of the selective state examination known as the ‘eleven plus’ after the age of the children who sat it. The children who ‘failed’ this examination were effectively condemned as ‘failures’ to attend what were effectively second-rate schools, often in buildings which reflected their lower status. As one writer observed in 1965, ‘modern’ had become a curious euphemism for ‘less clever.’ Some of these schools were truly dreadful, sparsely staffed, crowded into unsuitable buildings and submitting no pupils for outside examinations before most were released for work at fifteen. At A Level, in 1964, the secondary moderns, with around seventy-two per cent of Britain’s children, had 318 candidates. The public schools, with five per cent, had 9,838. Many of those who were rejected at the eleven plus and sent to secondary moderns never got over the sense of rejection. The IQ tests were shown not to be nearly as reliable as first thought. Substantial minorities, up to sixty thousand children a year, were at the ‘wrong’ school and many were being transferred later, up or down.

In addition, the selective system was divisive of friendships, families and communities. Different education authorities had widely different proportions of grammar school and secondary modern places; division by geography, not even by examination. A big expansion of teachers and buildings was needed to deal with the post-war baby boom children who were now reaching secondary school. Desperately looking for money, education authorities snatched at the savings a simpler comprehensive system, such as that pioneered and developed in Coventry in the fifties, might produce. Socialists who had wanted greater equality, among whom Education Secretary Tony Crosland had long been prominent, were against the eleven-plus on ideological grounds. But many articulate middle-class parents who would never have called themselves socialists were equally against it because their children had failed to get grammar school places.

With all these pressures, education authorities had begun to move towards a one-school-for-all or comprehensive system during the Conservative years, Tory Councils as well as Labour ones. In 1964 the head of Whitley Abbey School in Coventry concluded that the city council now needed to choose between returning to the grammar and secondary modern school system or going fully comprehensive. Therefore, in the early 1960s at least, grammar schools and selection were still at the heart of Coventry’s so-called comprehensive revolution. There were also comprehensives elsewhere on the Swedish model, and they were much admired for their huge scale, airy architecture and apparent modernity. Crosland hastened the demise of the grammar schools by requesting local authorities to go comprehensive. He did not say how many comprehensives must be opened nor how many grammar schools should be closed, but by making government money for new school buildings conditional on going comprehensive, the change was greatly accelerated.

An early comprehensive school for 11-18-year-olds of differing abilities, taught together.
High-rise homes, Class & Communities:

New housing schemes, including estates and high-rise blocks of flats, plus the new town experiments, undermined the traditional urban working-class environments, robbing them of their intrinsic collective identities. The extended kinship network of the traditional prewar working-class neighbourhoods and communities was replaced by nuclear family life on the new estates. Rehousing, property speculation, the rise of the consumer society, market forces, urban planning and legislation all played their part in this reshaping of working-class culture. In 1972, Phil Cohen, a University of Birmingham sociologist, described these processes in a Working Paper:

The first effect of the high density, high-rise schemes was to destroy the function of the street, the local pub, the corner shop… Instead there was only the privatised space of the family unit, stacked one on top of another, in total isolation, juxtaposed with the totally public space which surrounded it, and which lacked any of the informal social controls generated by the neighbourhood.

The streets which serviced the new estates became thoroughfares, their users ‘pedestrians’, and by analogy so many bits of human traffic… The people who had to live in them weren’t fooled. As one put it – they might have hot running water and central heating, but to him they were still prisons in the sky… The isolated family unit could no longer call on the resources of wider kinship networks, or the neighbourhood, and the family itself became the sole focus of solidarity…

The working class family was… not only isolated from the outside but undermined from within. There is no better example of what we are talking about than the so-called ‘household mother’. The street or turning was no longer available as a safe play space, under neighbourly supervision. Mum, or Auntie, was no longer just round the corner to look after the kids for the odd morning. Instead, the task of keeping an eye on the kids fell exclusively to the young wife, and the only safe play space was the ‘safety of the home’.

However, away from the high-rise blocks, the stubborn continuities of working-class life and culture survived. Nevertheless, the theme of community became a matter of widespread and fundamental concern in the period. The question emerged as to whether, as the conditions and patterns of social life for working people changed, and as what surplus money there was began to pour into the new consumer goods on offer, people might be uprooted from a life they knew, and had made themselves, to another made partly for them by others. This might also involve a shift from the working-class values of solidarity, neighbourliness and collectivism to those of individualism, competition and privatisation.

Adding Colour to Real Life – From the World Cup on TV to Roads:

The BBC archive material from the period records how television played a role in this transition to more middle-class attitudes:

Nowadays, there’s a tremendous change, an amazing change, in fact, in just a few years. People have got television. They stay at home to watch it – husbands and wives. If they do come in at the weekend they’re playing bingo. They’ve now got a big queue for the one-armed bandit as well. They do have a lot more money, but what they’re losing is togetherness.

But TV did at least bring families together to watch major events and light entertainment. Many still remember the World Cup of 1966 as the most colourful event of the era, but although a colour cine film recording of the match was made and released later, people watched it live on TV in black and white. Only the hundred thousand at Wembley that day saw the red shirts of the England team raise the Jules Rimet trophy after the match.

For many in Britain, not just England, the event which marked the high point in popular culture was Alf Ramsey’s team’s victory over West Germany. The tournament was held in England for the first time. The team was built around two Manchester United players, Bobby Charlton, the key midfielder who had survived the Munich air crash of 1958, and Nobby Stiles, and was captained by Bobby Moore of West Ham United, which also supplied two skilful forwards, Martin Peters and striker Geoff Hurst. Most people remember (in colour, of course) Geoff Hurst’s two extra-time goals and Kenneth Wolstenholme’s commentary because they have watched them replayed so many times since. After the match, however, people dressed up in a bizarre, impromptu mixture of colourful sixties fashion and patriotic bunting and came out to celebrate with family, friends and neighbours just as if it were the end of the war again. The 1970 World Cup, from Mexico, was broadcast on TV in colour.

By the mid-sixties, there were also far more brightly coloured cars on the roads, most notably the Austin Mini, but much of the traffic still consisted of the boxy black, cream and toffee-coloured cars of the fifties. By 1967 motorways totalled 525 miles in length, at a cost of considerable damage to the environment. Bridges were built over the Forth and Severn between 1964 and 1966. The development of new industries and the growth of the east coast ports necessitated a considerable programme of trunk road improvement. This continued into the mid-seventies at a time when economic stringency was forcing the curtailment of other road-building schemes. East Anglia’s new roads were being given priority treatment for the first time. Most of the A12, the London-Ipswich road, was made into a dual carriageway. The A45, the artery linking Ipswich and Felixstowe with the Midlands and the major motorways, had been considerably improved. Stowmarket, Bury St Edmunds and Newmarket had been bypassed. By the end of the decade, the A11/M11 London-Norwich road was completed, bringing to an end the isolation of northern and central Suffolk. Plans to triple the 660 miles of motorway in use by 1970 were also frustrated by a combination of economic recession, leading to cutbacks in public expenditure, and environmental protest.

The continuing working-class prosperity of the Midlands was based on the last fat years of the manufacture of cars, as well as other goods. But until the mid-fifties, Coventry’s industrial over-specialisation had gone relatively unnoticed, except by a few economists writing in The Times and The Financial Times. This was compounded by the fact that, within the British motor industry as a whole, Coventry was steadily declining in importance as a source of output; coupled with relatively low profits and investment levels, the local economy’s stock was slowly ossifying and becoming increasingly inflexible. Yet other car centres, notably Birmingham, Cowley, Dagenham and Luton, were subjected to similar pressures but retained the bulk of their manufacturing capacity to the end of the seventies. It is no coincidence that most of what remained of the British motor industry was centred in towns which were dominated by a single large manufacturing plant.

The problem peculiar to Coventry was not only that the local economy became overdependent on the motor industry but that virtually all the automotive firms were, by the 1960s, ill-suited because of their size to survive the increasing competitiveness of the international market. A major reason for Coventry’s long boom was the multiplicity of firms in the motor industry, but in the seventies, this became the major cause of its decline. The only viable motor car establishment to survive this deep recession was Jaguar. The incentives to embark on a vast restructuring of industry, whether national or local, were simply not there, especially since the policy of successive governments was to divert industry away from the new industrial areas of the interwar period in favour of Britain’s depressed areas, or development areas, as they had been redesignated in the immediate postwar period.

Britain as the Sick Man of Europe – The Economy and EEC Membership:

Edward Heath campaigning, unsuccessfully, in 1966.

In the early 1970s, inflation began to rise significantly, especially after Edward Heath’s Conservative government recklessly expanded the money supply, a misguided version of Keynesianism. All the predictions of Keynesian economists were overturned as rising inflation was accompanied by rising unemployment. At first, this was once again confined to the older industrial areas of the northeast, Scotland and South Wales. The rise of more militant nationalisms in the two Celtic nations was now as much concerned with the closure of collieries and factories, and the laying off of labour, as with cultural issues. By 1973, it was clear that the economic problems of Britain were having far more general consequences. The nation’s capacity to generate wealth, along with its share of world trade and productivity, were all in serious, if not terminal, decline. Britain was seen as ‘the sick man of Europe.’

By 1970, after a decade during which Britain had grown much more slowly than the six original members of the Common Market, Heath was in some ways in a weaker position than Macmillan had been. On the other hand, he also had some advantages. He was trusted as a serious negotiator. Britain’s very weakness persuaded Paris that this time, les rosbifs were genuinely determined to join. Pompidou also thought the time was right and he wanted to get out of De Gaulle’s shadow. But France, like the rest of the Community, had for years been struggling to understand what Britain really wanted. This had been particularly difficult in the Wilson years, when the British left had been riven by the issue. Heath had only promised to negotiate, however, not to join. But his enthusiasm was in stark contrast to Wilson’s blowing hot and cold. Yet opinion polls suggested that Heath’s grand vision was alien to most British people.

With Heath in power, after over eighteen months of haggling in London, Paris and Brussels, a deal was thrashed out. It infuriated Britain’s fishermen, who would lose control over most of their traditional grounds to open European competition, particularly from French and Spanish trawlers. It was a second-best deal on the budget, later reopened by Margaret Thatcher. Above all, it left intact the original Common Market designed for the convenience of French farmers and Brussels-based bureaucrats, not for Britain. Vast slews of European law had to be swallowed whole, much of it objectionable to the British negotiators. There were some marginal concessions for Commonwealth farmers and producers, but these were granted in return for the bad deal on the budget. The reality was that the negotiators were directed to get a deal at almost any price. At a press conference at the Élysée Palace in 1971, Heath and Pompidou, after a long private session of talks between the two of them, revealed to general surprise that so far as France was concerned, Britain could now join the Community. Heath was particularly delighted to have triumphed over the press, who had expected another ‘Non.’

A national debate and vote in Parliament followed about the terms of entry, but although he had publicly supported British entry before negotiations began, Wilson now began sniping at Heath’s deal. Jim Callaghan, his potential successor, was already openly campaigning against the deal, partly on the grounds that the EEC threatened the language of Chaucer, Shakespeare and Dickens. The Labour left, too, was in full cry: a special Labour Conference in July 1971 voted by a majority of five to one against membership, and Labour MPs were also hostile by a majority of two to one. Wilson announced that he was now opposed to membership on the Heath terms. After the long and tortuous journey to reach this point, the pro-Europeans were disgusted: they defied the party whip in the Commons and voted with the Conservatives. They were led by Roy Jenkins. The left, led by Barbara Castle, Michael Foot and Tony Benn were livid with the sixty-nine rebels. For both sides, this was a matter of principle that would continue to divide the party until the present day. On the night of the vote in the Commons, there were screaming matches in the lobbies between the pro- and anti-marketeers among Labour MPs. After winning his Commons vote on British membership of the Community, Heath returned to Downing Street to play Bach on the piano in a mood of restrained triumphalism.

Tony Benn began to argue that on a decision of such importance the people should vote in a referendum, since a democratic country that denied its people the right to choose its future would lose all respect. To begin with, Benn had little support for this radical thought, since most on the Left despised referenda as fascist devices, subject to manipulation in a parliamentary democracy. Pro-Europeans also feared that this was a ploy to commit Labour to pull out. Harold Wilson had committed himself publicly against a referendum, but he came to realise that opposing Heath’s deal while promising to renegotiate and then offer a referendum could be the way out. The promise would also give him some political high ground. He would trust the people, even if the people were, according to the polls, already fairly bored and hostile. When Pompidou suddenly announced that France would hold its own referendum on British entry, Wilson snatched at the Benn plan. It was an important moment because a referendum would make the attitude of the whole country clear, at least for a generation. Referenda also became devices used again by politicians faced with difficult constitutional choices.

Wilson’s Renegotiation & Referendum, 1974-75:

After winning the two elections of 1974, Wilson carried out his promised renegotiation of Britain’s terms of entry to the EEC and then put the result to the country in the Benn-inspired referendum of 1975. The renegotiation was largely a sham, but the referendum was a rare political triumph for Wilson after the elections and before his retirement in 1976. On the continent, the renegotiation was understood to be more for Wilson’s benefit than anything else. Helmut Schmidt, the new German Chancellor, who travelled to London to help charm and calm the Labour conference, regarded it as a successful cosmetic operation. Wilson needed to persuade people he was putting a different deal to the country than the one Heath had negotiated. He was able to do this, but when the referendum campaign actually began, Wilson’s old evasiveness returned and he vaguely mumbled his support, rather than actively or enthusiastically making the European case. To preserve longer-term party unity, he allowed anti-Brussels cabinet ministers to speak from the ‘No’ platform, and Barbara Castle, Tony Benn, Peter Shore and Michael Foot were among those who took up this offer.

The ‘No’ campaign was all about prices, not about ‘sovereignty’. Top: Barbara Castle leading one of many cunning ‘stunts’.

They were joined on the platform by Enoch Powell, Rev. Ian Paisley of the Democratic Unionist Party (DUP), the Scottish Nationalists and others. But the ‘Yes’ campaign included most of the Labour cabinet, with Roy Jenkins leading the way, plus most of the Heath team and the popular Liberal leader Jeremy Thorpe. It seemed to many people a fight between wild-eyed ranters on the one hand and sound chaps on the other. More important, perhaps, was the bias of business and the press. A Confederation of British Industry (CBI) survey of company chairmen found that out of 419 interviewed, just four were in favour of leaving the Community. Almost all the newspapers were in favour of staying in, including the Daily Mail, Daily Telegraph, and Daily Express. So was every Anglican bishop. The Yes campaign, led by Britain in Europe, outspent the No camp by more than ten to one. In this grossly unequal struggle, both sides used scare stories. Yes warned of huge job losses if the country left the Community. The No camp warned of huge rises in food prices.

There were meetings with several thousand participants, night after night around the country, and the spectacle of politicians who usually attacked each other sitting down together and agreeing on something was a tonic to audiences. Television arguments were good, especially those between Jenkins and Benn. On the Labour side, there were awkward moments when the rhetoric got too fierce, and Wilson had to intervene to rebuke warring ministers. In the end, in answer to the simple question, ‘Do you think that the United Kingdom should stay in the European Economic Community (The Common Market)?’, just over sixty-seven per cent, around seventeen million people, voted Yes and nearly thirty-three per cent (8.5 million) voted No. Only Shetland and the Western Isles of Scotland voted No.

Decimalisation was seen as a huge change to daily life, unwelcome to many older people. Though the original decision had been taken by the first Wilson government in 1965, the disappearance in 1971 of a coinage going back to ‘Anglo-Saxon’ times was widely blamed on Heath as part of the move into Europe. Away flew the beloved florins and half-crowns, ha’pennies, farthings, threepenny and sixpenny bits and out went, to the relief of many schoolchildren, the intricate triple-column maths of pounds, shillings and pence. In came the more rational decimal currency. But ‘imperial’ measures remained for milk and beer, and miles were retained in preference to kilometres. By the 1970s, the behaviour of the cliques who ran the country had been replaced as the chief motivation for political cynicism by a more general sense of alienation from ‘the State’ and ‘the Establishment.’ For many older Britons, these were years when change spun out of control. Much of the loathing of Heath on the right of politics came from British membership of the Common Market after 1973, which seemed to many to be the emblem of the ‘rage’ for new, bigger systems to replace the traditional ones. Writing in his diary in 1975, Tony Benn recorded his reaction to the possibility of a Europe-wide passport, revealing how much the left’s instincts could chime with those of the right-wing opponents of European change:

‘That really hit me in the guts … Like metrication and decimalisation, this really strikes at our national identity.’

Reading it in retrospect, the comment seems almost ironic, but at the time these symbols were emblematic of their ‘sovereignty’ for many Britons, whether they were on the left or right wing. More than forty years later, when Britain was considering leaving what by then had become the European Union, the biggest question both about Heath’s triumph in engineering Britain’s entry and then about Labour’s referendum is whether the British were told the full story about what this would mean and whether they truly understood the supranational organisation they were signing up to. Ever since that first referendum, many of those among the 8.5 million who voted against, and younger people who share their views, have suggested that Heath and Jenkins and the rest lied to the country, at least by omission. Had it been properly explained that Europe’s law and institutions would sit above the Westminster Parliament, it was argued, they would never have agreed. The Britain in Europe campaigners can point to speeches and leaflets which directly mention the loss of sovereignty. One of the latter read:

‘Forty million people died in two European wars this century. Better lose a little national sovereignty than a son or daughter.’

Expanding the membership of the EC, 1952-93.

Yet both in Parliament and in the referendum campaign, the full consequences for national independence were mumbled, not spoken clearly enough. Geoffrey Howe, as he then was, who drafted Heath’s European Communities Bill, later admitted that it could have been more explicit about lost sovereignty. Heath talked directly about the ever-closer union of the peoples of Europe but was never precise about the effect on British law, unlike Lord Denning, who said the European treaty could be compared to…

‘… an incoming tide. It flows into the estuaries and up the rivers. It cannot be held back.’

Hugo Young, the journalist and historian who studied the campaign in great detail, wrote:

I traced no major document or speech that said in plain terms that national sovereignty would be lost, still less one that categorically promoted the European Community for its single most striking characteristic: that it was an institution positively designed to curb the full independence of the nation-state.

There were, of course, explicit warnings given by the No campaigners among the more populist arguments about food prices. They came, above all, from Enoch Powell, Michael Foot and Tony Benn. Foot wrote in The Times that the British parliamentary system had been made farcical and unworkable. Future historians, he said, would be amazed…

that the British people were urged at such a time to tamper irreparably with their most precious institution; to see it circumscribed and contorted and elbowed off the centre of the stage.

So it was not true that people were not told. The truth revealed by opinion polls is that sovereignty as an issue did not concern the public nearly as much as jobs and food prices. By later standards, the position of Parliament was not taken terribly seriously in public debates. As Andrew Marr put it in his 2009 book, A History of Modern Britain:

It may be that sovereignty is always of absorbing interest to a minority – the more history-minded, politically-aware – and of less interest to the rest, except when a loss of sovereignty directly affects daily life and produces resented laws. In the seventies, Britain’s political class was not highly respected, and Europe seemed to offer a glossier, richer future. Though the pro-Community majority in business and politics did not strive to ram home the huge implications of membership, they did not deceitfully hide the political nature of what was happening, either. It was just that, when the referendum was held, people cared less. The argument would return, screaming … to be heard.

Andrew Marr (2007-9), op.cit., p.351.

‘The Booze Cruise’: A popular home-grown cartoon view of the British attitude to Europe in the late seventies & eighties.

The ‘Barber Boom,’ Inflation & Industrial Relations, 1971-73:

If Heath is associated with a single action, it is British entry into ‘Europe,’ but throughout his time in office, it was the economy which remained the biggest issue facing his government. The country was spending too much on new consumer goods and not nearly enough on modernised and more efficient factories and businesses. British productivity was still pathetically low compared with the United States or Europe, never mind Japan. Prices were rising by seven per cent and wage earnings by double that. The short-lived economic boom under the Conservative Heath government and Anthony Barber’s Chancellorship greatly benefited the local motor industry, temporarily reversing the stall in population growth in manufacturing areas. But Britain’s falling competitiveness was making it difficult, in the early seventies, for governments to maintain high employment by intervening in the economy.

This was still the old, post-1945 world of fixed exchange rates, which meant that the Heath government, just like those of Attlee and Wilson, faced a sterling crisis and perhaps another devaluation. Since 1945 successive governments had followed the tenets of Keynesian economics, borrowing in order to create jobs if unemployment approached a figure deemed unacceptable (in the 1970s this was about six hundred thousand). During the decade, this became increasingly difficult to do as Edward Heath’s government (1970-74) struggled to follow such policies in the face of a global recession associated with the tripling of oil prices in 1973 by OPEC (the international cartel of oil producers). This caused an immediate recession and fuelled international inflation. Faced with declining living standards, the unions replied with collective industrial power. Strikes mounted up, most acutely in the coalfields.

The unions, identified by Heath as his first challenge, had just seen off Wilson and Barbara Castle. Heath had decided he would need to face down at least one major public sector strike, as well as remove some of the benefits that he thought encouraged strikes. Britain not only had heavy levels of unionisation through all the key industries but also, by modern standards, an incredible number of different unions, more than six hundred altogether. Added to this, leaders of large unions had only a wobbly hold on what actually happened on the ‘shop floor’ in factories. It was a time of politicised militancy there, well caught by the folk-rock band the Strawbs, who reached number two in the singles chart with their mock-anthem Part of the Union. Its lyrics included:

“Oh, you don’t get me I’m part of the union, till the day I die…

“As a union man I’m wise to the lies of the company spies…

“With a hell of a shout, it’s ‘out brothers, out!’ …

“And I always get my way, if I strike for higher pay…

“So, though I’m a working man, I can ruin the government’s plan … ”

Almost immediately after becoming PM, Heath faced a dock strike, followed by a big pay settlement for local authority dustmen, then a power workers’ go-slow which led to power cuts. Then the postal workers struck. Douglas Hurd, then Heath’s political secretary, recorded in his diary his increasing frustration:

‘A bad day. It is clear that all the weeks of planning in the civil service have totally failed to cope with what is happening in the electricity dispute: and all the pressures are to surrender’.

Later, Hurd confronted Heath in his dressing gown to warn him that the government response was moving too slowly, far behind events. At that stage, things in the car industry were so bad that Henry Ford II warned Heath that his company was thinking of pulling out of Britain altogether. Yet Heath’s Industrial Relations Bill of 1971 was meant to be balanced, giving new rights to trade unionists while at the same time trying to make deals with employers legally binding and enforceable through a new system of industrial courts. This was similar to the package offered by the Wilson government. There were also tax reforms, meant to increase investment, a deal with ‘business’ on keeping price increases to five per cent and even some limited privatisation, with the travel agents Thomas Cook, then state-owned, being sold off along with some breweries.

But the Tory messages and measures were confusing. Cuts in some personal taxes encouraged spending and therefore inflation. With European membership looming, Anthony Barber, Heath’s Chancellor, was dashing for growth, which meant further tax cuts and higher government spending and borrowing. Lending limits were removed for high street banks, resulting in a growth in lending from twelve per cent per year in 1971 to forty-three per cent per year in 1973. This obviously further fuelled inflation, particularly in the housing market. This led to a huge expansion of credit and capital sunk into bricks and mortar that became a feature of modern Britain.

At the same time, one of the historic constraints on successive post-war British governments was removed by President Nixon in the summer of 1971 when he suspended the convertibility of the dollar into gold and allowed exchange rates to ‘float.’ He was faced with the continuing high costs of the war in Vietnam, combined with rising commodity prices. The effect on Britain was that the government and the Bank of England no longer had to be quite so careful about maintaining sterling reserves. But it opened up fresh questions, such as how far down sterling could go and how far ahead industrialists could expect to plan. Heath’s instincts on state control were also tested when the most valuable parts of Rolls-Royce faced bankruptcy over the cost of developing new aircraft engines. He quickly nationalised the company, saving eighty thousand jobs and allowing it to regroup and survive, to the relief of the defence industry. Rolls-Royce did revive, returning to the private sector, providing one example of how nationalisation could work in future.

A campaign poster during the 1972 miners’ dispute.

In 1972, in their first strike for a generation, the miners fought a dramatic battle, putting the country on a three-day week and unhinging the Heath government. ‘This time we’ll win,’ they said. No one in South Wales needed to ask what last time they had in mind, especially those who still remembered the dark days of the twenties. The National Union of Mineworkers (NUM) submitted a demand for a forty-five per cent wage increase to the National Coal Board (NCB). When it was rejected, a national miners’ strike was called in February 1972. The Heath government experienced the full extent of the miners’ ability to disrupt national production and energy supplies, despite the contraction of the industry since the 1950s. The government was wholly unprepared, with modest coal stocks, and was surprised by the striking miners’ discipline and aggression.

Arthur Scargill, then a young, unknown militant and former Communist, organised fifteen thousand of his comrades from across South Yorkshire in a mass picket of the Saltley coke depot, on which Birmingham depended for much of its fuel. Scargill (below centre), a rousing speaker and highly ambitious union activist, later described the confrontation with West Midlands police as “the greatest day of my life.” However, the events of his greatest day represented, for the PM:

‘… the most vivid, direct and terrifying challenge to the rule of law that I could ever recall emerging from within our own country. … We were facing civil disorder on a massive scale.’

The Welsh miners picketed outside the Houses of Parliament during the 1972 Strike, led by their General Secretary Dai Francis (with spectacles).

Heath blamed the police for being too soft. It was clear to him that the intention was to bring down the elected government, but he decided that he could not counter-attack immediately. Confronted with the prospect of the country becoming ungovernable, or of having to use the armed forces to restore order, which public opinion would never have tolerated, Heath turned to a judge, Lord Wilberforce, for an independent inquiry into miners’ wages. Wilberforce reported that they should get at least twenty per cent, which was fifty per cent more than the average increase. The NUM settled for that, plus extra benefits, in one of the most clear-cut and overwhelming victories over a government of any British trade union to date. Their strategy and tactics were wholly successful. Scargill was quickly promoted to agent, then president of the Yorkshire miners.

A boy stands outside his school, which closed because of a lack of fuel during the miners’ strike of 1972.

The Oil Price Bust, the Coal Dispute & the Three-Day Week, 1973-74:

Obstructionist trade unions were a favourite target of many, particularly after the coal dispute, which had led to a series of power cuts throughout the country and a three-day working week. Attacks on trade union power were becoming more popular owing to a growing perception that the miners in particular had become too powerful and disruptive, holding the country to ransom. Heath and his ministers knew that they would have to go directly to the country with an appeal about who was running it, but before that, they tried a final round of negotiation to reach a compromise. Triggered by the prospect of unemployment reaching a million, there now followed the famous U-turn which so marred Heath’s reputation. It went by the name of ‘tripartism’, a three-way national agreement on wages and prices, investments and benefits, between the government, the Confederation of British Industry (CBI) and the Trades Union Congress (TUC). The Industry Act of 1972 gave the government unprecedented powers of intervention, which Tony Benn called ‘spadework for socialism’. Heath had leaned so far in trying to win the unions over that he was behaving like a Wilsonian socialist.

The unions, having defeated Wilson and Castle, were more self-confident than ever before or since. Many industrial workers, living in still-bleak towns far away from the fashionable big cities, did seem underpaid and left behind. The miners in these areas, certainly, were badly paid. Heath argued that he was forced to accept and apply consensual policies because in the seventies any alternative set of policies, such as the squeeze of mass unemployment which arrived in the Thatcher era, would simply not have been accepted by the country as a whole. Besides, the economic problems the government and the country at large faced were not primarily the result of high wage claims. Management incompetence or short-termism, leading to an abdication of responsibility and the failure to restructure factories and industries, was seen as a primary cause of economic stagnation. This, as seen in the case of manufacturing in Coventry, was an argument which had some local evidence to support it, although unions at a local, shop-floor level could be equally short-sighted.

What finally finished off the Heath government was the short war between Israel and its Arab neighbours, Egypt and Syria, in October 1973, the Yom Kippur War. Israel’s swift and decisive victory was a humiliation for the Arab world, which struck back, using oil as its weapon. OPEC, the organisation of oil-producing countries dominated by the Saudis, had seen the price of oil rising on world markets for some time. They decided to cut oil supplies to the West each month until Israel handed back its territorial gains and allowed the Palestinians their own state. There would be a total embargo on Israel’s most passionate supporters, the United States and the Netherlands. And those countries that were allowed oil would be made to pay more for it. In fact, prices rose fourfold. It was a global economic shock, pumping further inflation into the industrialised world, but in Britain, it arrived with added force. The miners put in yet another huge pay claim, which would have added twice as much again to many pay packets. Despite an appeal from the moderate NUM President, Joe Gormley, the NUM Executive rejected a thirteen per cent pay increase and voted to ballot for another national strike.

The country could survive high oil prices, even shortages, for a time, but these were the days before Brent Crude from the North Sea was being produced commercially. The same was true of natural gas. But Britain could not manage both the oil shock and a national coal strike at the same time. Barber, the Chancellor, called this the greatest economic crisis since the war. It certainly compared to that of 1947. Again, coal stocks had not been built up in preparation, so a whole series of panic measures had to be introduced. Plans were made for petrol rationing and coupons were printed and distributed. The national speed limit was cut by twenty miles per hour to fifty mph. Then in January 1974 came the announcement of a three-day working week. Ministers solemnly urged citizens to share baths and brush their teeth in the dark. Television broadcasting ended at 10.30 p.m. each evening. It was an embarrassing time in many ways, with people having to find other things to do in the dark or by candlelight, yet it also gave millions an enjoyable frisson, the feeling of taking a holiday from everyday routines. The writer Robert Elms recalled:

…this proud nation had been reduced to a shabby shambles, somewhere between a strife-torn South American dictatorship and a gloomy Soviet satellite… a banana republic with a banana shortage … The reality of course is that almost everybody loved it. They took to the three-day week with glee. They took terrific liberties.

This time Heath and his ministers struggled to find a solution to the miners’ demands, though the climate was hardly helped when Mick McGahey, the legendary Communist NUM leader, asked by Heath what he really wanted, answered ‘to bring down the government.’ When the miners voted, eighty-one per cent were for striking, including those in some of the traditionally most moderate coalfield areas. In February 1974, Heath asked the Queen to dissolve Parliament and went to the country on the election platform he had prepared two years earlier: ‘Who governs?’ The country’s answer, perhaps taking the question more literally than Heath had hoped, was ‘Not you, mate!’ Meanwhile, Harold Wilson had expected the Tories to win again. A year earlier he had prepared the Opposition’s answer to the questions of union power and inflation, which became known as the Social Contract. Observers saw it as a recipe for inflation which also offered the TUC a privileged place at the table in return for very little.

However, Wilson was somehow able to emerge as the calm bringer of reason and order in the election campaign, whereas Heath was hit by a slew of bad economic figures. Then Enoch Powell, Heath’s old nemesis, stepped back into the limelight to announce that he was leaving the Conservatives over their failure to offer the electorate a referendum on Europe. He therefore called on voters to vote Labour. This helped to produce a late surge for Wilson, as well as for the Liberals, led by the popular Jeremy Thorpe. Though Labour won the most seats by the slenderest of margins, 301 to the Tories’ 297 (the Liberals had fourteen MPs), no party had an overall majority, so Heath hung on, hoping to do a deal with Thorpe. But he eventually had to concede defeat, and Wilson returned to the Palace to kiss hands with Queen Elizabeth for the third time. As Andrew Marr put it:

So Mick McGahey and friends had brought down the Heath government, with a little help from the oil-toting Saudi Royal Family, the Liberals and Enoch Powell. A more bizarre coalition of interests is difficult to imagine.

Andrew Marr (2007), A History of Modern Britain, p. 342.

Other significant changes happened on Ted Heath’s watch. The school leaving age was raised to sixteen. To cope with international currency mayhem caused by the Nixon decision to suspend convertibility, the old imperial sterling area finally went in 1972. The Pill was made freely available on the NHS. Local government was radically reorganised, with no fewer than eight hundred English councils disappearing and huge new authorities, much disliked, being created in their place. Heath defended this on the basis that the old Victorian system could cope with ‘the growth of car ownership and of suburbia, which was undermining the distinction between town and country.’ Many saw it as bigger-is-better dogma. There was more of that when responsibility for NHS hospitals was taken away from hundreds of local boards and passed to new regional and area health authorities, at the suggestion of a new cult that was then just emerging – management consultancy. In the seventies, the familiar and the local seemed everywhere in retreat.

(to be continued; for sources, see part two)

Appendix One – The Queen & Tolkien:

In 1972, Queen Elizabeth II appointed J.R.R. Tolkien a Commander of the Order of the British Empire “for services to English Literature.” He is pictured outside Buckingham Palace with his daughter, Priscilla.

After the death of his wife, Edith, in 1971, Tolkien found some consolation in the honours that were conferred on him. He received several invitations to visit American universities and receive doctorates, but he couldn’t face the long journey. There were also many tributes within his homeland. He was profoundly moved when, in the spring of 1972, he was invited to Buckingham Palace to be presented with a C.B.E. by the Queen. She had been eleven when The Hobbit was published, and The Lord of the Rings had reached the bookshops two years into her reign. Tolkien wrote to his publisher Rayner Unwin about the day,

“… I was very deeply moved by my brief meeting with the Queen, & our few words together. Quite unlike anything that I had expected.”

Humphrey Carpenter (ed.) (1981), Letters of J. R. R. Tolkien. London: George Allen & Unwin, Letter 334.

After everything he had lived through and all the fairy stories he had written, meeting the Queen was a special moment for him. But perhaps the most gratifying of all was the award in June 1972 of an honorary Doctorate of Letters from his own University of Oxford; not, as was made clear, for The Lord of the Rings, but for his contribution to philology.

The following year, on 2nd September, J.R.R. Tolkien died, aged eighty-one. His requiem mass was held four days after his death, in the plain modern church in Headington, Oxford, which he had attended so often since his retirement. Born in South Africa during the reign of Queen Victoria and growing up in Birmingham during the reign of Edward VII, Tolkien researched, taught and wrote in England during the reigns of all four Windsor monarchs (before Charles III), over six decades, from George V to Elizabeth II.

(Appendix Two is in Part Two)