Canadian Gordon Lightfoot was arguably the greatest and most underrated folk artist of his generation, and possibly of all time. Many folk singers before and since have embodied the melancholic troubadour as a key component of their personas – most notably Pete Seeger – but Lightfoot took the archetype to a new level of artistic refinement.
His songs were as touching and moving as they were austere and elegant. Lightfoot was discovered by the American folk circuit and some of his songs were covered by established artists.
Lightfoot’s first album, Lightfoot!, released in 1966 when he was 27 years old, was notable for its delicate country ballads such as Early Morning Rain and Ribbon Of Darkness. The albums The Way I Feel (1967), Did She Mention My Name (1968), and Back Here On Earth (1968) continued in that melancholic style.
But it was the 1970 album, If You Could Read My Mind, particularly the title track, that proved to be a huge creative leap forward for the artist. The song is a mature testament to love lost, a romantic elegy full of symbolic meaning and tenderness and a singer-songwriter tour de force. Few artists can claim to evoke the universal themes of love and of the human condition in such a poignant and tender way as Lightfoot achieves in this song.
His richness of prose combined with such wonderful introspective story-telling drama is revelatory. The melancholy is enhanced by a sublime baritone voice which is offset by a light acoustic guitar counterpoint that few singer-songwriters have been able to equal.
The beautiful reflective melody of Summer Side Of Life and the realist triptych of Nous Vivons Ensemble, Same Old Loverman and Cotton Jenny, reiterated Lightfoot’s contemplative personal vision of modern life and his quest to probe the fragile recesses of the human psyche.
Lightfoot continued with these themes on the album Don Quixote (1972), cementing the musician’s stature as a dignified craftsman. Lightfoot’s follow-up, Sundown (1974), largely abandoned the depressed tone of his past glories for a melodic and passionate country sound, exemplified by the title track. However, Carefree Highway is a more solemn song of freedom, individualism, and fatalism. Lightfoot’s golden period continued with the albums Cold On The Shoulder (1975) and Summertime Dream (1976).
The tracks Ghosts Of Cape Horn and Sea Of Tranquility are the standouts on Dream Street Rose (1980), and even though the artist’s creative decline began with Shadows (1982), that album still boasts the brilliant Heaven Help The Devil and 14 Karat Gold.
After the creative disappointments of Salute (1983) and East of Midnight (1986), Lightfoot’s reputation as a majestic chronicler of the human condition returned with the albums Waiting For You (1993) and A Painter Passing Through (1998).
Gordon Lightfoot will be remembered as one of the greatest singer-songwriters of all time.
In my previous article, I rebutted the key claims of those who propose that WTC7 collapsed due to a controlled demolition. Despite this, many critics of the official narrative continue to maintain that a controlled demolition occurred, on the basis of what witnesses claimed they heard.
I explained why the controlled demolition thesis is illogical, but some critics on Twitter skate over this and have instead moved on to the issue of thermite as part of the ‘controlled demolition’ process, as an apparent explanation for what happened. I will endeavour to deal with this below.
That commercial aircraft laden with jet fuel travelling at high velocity struck the World Trade Center on 9/11 is incontrovertible. Commercial aircraft contain enormous amounts of aluminum, and WTC7 was an aluminum-clad, steel-cored building. Although no airliner crashed into WTC7, the enormous explosions and the collapse of the nearby towers that were struck inevitably impacted it.
As I explained in my article, debris from WTC 1, 370 feet away, struck WTC7, causing a gash in one corner facing Ground Zero, and by the time the order to evacuate the building was given it was visibly sagging. With debris strewn all around, we can expect to find aluminum, iron oxide, aluminum oxide and metallic iron in the rubble without any thermite charges being required to explain it.
Thermite would not be practical for the demolition of WTC7. It would have spontaneously ignited at under 1000°F and would not have been controllable; no signal-receiving device could have survived the fires and continued receiving the destruct command.
A more recent claim of critics is that traces of red-gray chips and iron-rich microspheres in the WTC rubble are best explained by thermite. This is held up as their “smoking gun”. A study of the dust from Ground Zero contradicts this: “There is no evidence of individual elemental aluminum particles of any size in the red/gray chips…”
The chips are epoxy resins. More specifically, after further study, the red/gray chips were found to be Laclede Standard Primer by late polymer chemist Ivan Kminek, who demonstrated that they have the same chemical composition, identical XEDS spectra, and nearly identical ignition point.
Some witnesses and critics of the official narrative claim that the sounds and the jets of air and dust bursting from the windows of WTC7 indicate an explosive demolition. There is a simple explanation for this phenomenon: what happens to the air inside a balloon when you squeeze it too hard? These side-jets of air and dust were not explosions; debris was expelled from the building as the floors pancaked onto each other. There is a lot of air in a quarter-mile-tall office building, and it has to go somewhere when compressed.
In addition, the WTC complex contained plenty of water in water mains, toilets, sinks, and beverage machines. Water heated to boiling expands violently, and explosively if contained. Water has a high heat capacity, which usually precludes rapid heating to boiling temperatures; still, the heat of burning jet fuel will bring water rapidly to the boil, which causes objects such as unopened soft-drink cans or whiskey bottles to explode. This explains many of the explosions that survivors heard.
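The scale of that expansion can be estimated from the ideal gas law. The following is a minimal illustrative sketch (my own arithmetic, with round-number inputs, not figures from any 9/11 investigation) of the roughly 1,700-fold volume increase when liquid water flashes to steam at atmospheric pressure:

```python
# Illustrative estimate (assumed round-number inputs) of the volume
# increase when liquid water becomes steam at 1 atm and 100 degrees C.
R = 8.314        # J/(mol*K), ideal gas constant
T = 373.15       # K, boiling point of water at 1 atm
P = 101_325      # Pa, atmospheric pressure

molar_volume_steam = R * T / P     # m^3 per mol of steam (ideal-gas estimate)
molar_volume_liquid = 18.07e-6     # m^3 per mol of liquid water near 100 C

expansion = molar_volume_steam / molar_volume_liquid
print(round(expansion))  # on the order of 1700-fold
```

A sealed container that cannot accommodate anything like that expansion will burst, which is consistent with the bangs survivors reported.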
Many things explode in fires: transformers, gas lines, water lines, air compressors, fire extinguishers, propane tanks, and refrigeration systems. An “explosion sound” is different from the high-brisance detonation necessary to cut even one 14″ x 22″ steel column of a major skyscraper (let alone 58–82 of them), which would exceed 140 decibels a half-mile away and be clearly audible from New Jersey.
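The half-mile figure follows from the standard spherical-spreading rule for sound, L2 = L1 - 20*log10(r2/r1). Below is a hedged sketch of that arithmetic; the 180 dB source level measured at 100 m is an assumed, illustrative figure for a column-cutting detonation, not a quoted measurement:

```python
import math

def spl_at(source_db, ref_m, dist_m):
    """Sound pressure level at dist_m metres, given a level of source_db
    measured at ref_m metres, assuming simple spherical spreading."""
    return source_db - 20 * math.log10(dist_m / ref_m)

half_mile_m = 804.7  # metres in half a mile
# Assumed source level: ~180 dB at 100 m (illustrative, not a measurement).
print(round(spl_at(180, 100, half_mile_m)))  # ~162 dB, still far above 140 dB
```

The point of the sketch is that sound level falls off only logarithmically with distance, so a genuine high-brisance detonation would remain deafening well beyond half a mile.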
Meanwhile, seismographs picked up the collapse of the interior (preceding the collapse of the exterior frame) but no detonations. Any detonation of explosives within WTC 7 would have been detected by the multiple seismographs monitoring ground vibration in the general area, yet no such telltale “spike” or vibratory anomaly was recorded by any monitoring instrument.
As I explained in my previous article, claims by witnesses that the ‘explosions’ they heard were the sound of controlled explosives are pure supposition. Testimonies from firefighters inside and outside of the building are consistent, and demolitions experts who saw WTC 7 collapse neither saw nor heard anything indicating an explosive demolition.
Explosive demolitions would not be very controlled, or likely to work at all, if they involved slamming tons of skyscraper debris through a building and then setting it on fire for seven hours. Precision explosives, timers, and wiring don’t like that sort of treatment. Regardless, such blasts would be loudly audible on the camera footage seconds before the collapse began; there’s nothing on the tapes.
Finally, the claim by critics that “pyroclastic flows” of dust indicate that explosives must have been used has no basis in fact. A pyroclastic flow is a movement of hot gas; in the context of a volcano, it is usually hot gases carrying hot dust and other debris as they spread out. Here, the term was applied to the cloud of dust that dispersed during the collapse and when WTC7 hit the ground.
Aside from not being hot enough to qualify as a pyroclastic flow (see volcanoes and shuttle launches), most claims try linking it with the controlled demolition theory. This debris flow indicates a fast vertical compression that caused air inside the building to push dust outward over a large area. The same flows can also be seen during controlled demolitions, but are usually much smaller than what happened at WTC7. It’s been estimated that the total mass of sheetrock in the internal walls was 1,000 tons (US). An enormous cloud of white dust is, therefore, not entirely surprising or unexplainable.
Over recent days and weeks on Twitter, I’ve witnessed the re-emergence of the theory that Building 7 at the World Trade Center was brought down on 9/11 by controlled explosives. Geopolitical analyst Patrick Henningsen and artist Daniel Fooks are among those commentators who appear to support this theory. I will challenge the theory by unpacking each claim in turn.
The first claim relates to the announcement in the media of the collapse of WTC 7 before it happened, which is cited as evidence of establishment collusion. Those who challenge the official narrative cite a BBC report in support of their thesis. However, the BBC and others were monitoring the news from different outlets, and that’s where they learned of WTC 7.
According to the fire department, by 2 p.m. there was a strong possibility the building would soon collapse, and it relayed this to reporters, so the building’s imminent demise was widely known. By the time the information reached outlets like the BBC and CNN it may simply have been mis-communicated from “about to collapse” to “has collapsed”.
The BBC reporter even starts out by saying “details are very, very sketchy.” This is a clear case of journalistic incompetence, and it wouldn’t be the first time for a BBC reporter.
The second claim in support of controlled demolition is that WTC 7 was said to have been in freefall and to have collapsed into its own footprint. However, the evidence supports the NIST contention that the building collapse progressed from the penthouse out as columns were weakened by the fires. The slow sinking of the penthouses, indicating the internal collapse of the building behind the visible north wall, took 8.2 seconds according to a NIST preliminary report.
The seismograph trace of the collapse of WTC 7 indicates that parts of the building were hitting the ground for 18 seconds. This interesting set of videos, shot from different angles, clearly shows that WTC7 does not fall straight down. Critics who assert that the building was in freefall for 2.25 seconds state this as if it were somehow significant to their argument, even though the fact was conceded by the official 2008 inquiry.
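The conceded freefall interval covers very little of the building’s height, which simple kinematics makes clear. This is my own illustrative arithmetic, not a calculation from NIST:

```python
# Distance an object falls from rest under gravity, ignoring air resistance.
g = 9.81  # m/s^2, standard gravity

def freefall_distance(t_seconds):
    """Freefall distance after t_seconds, from d = (1/2) * g * t^2."""
    return 0.5 * g * t_seconds ** 2

d = freefall_distance(2.25)          # the conceded freefall interval
print(round(d, 1))  # ~24.8 m, a small fraction of WTC7's roughly 186 m height
```

In other words, 2.25 seconds of true freefall accounts for only a handful of storeys, hardly the instantaneous top-to-bottom drop the controlled demolition thesis implies.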
The reality is, WTC7 did not fall into its own footprint but left substantial debris scattered across the entire WTC complex site. The damage to WTC 7 was actually caused by debris from WTC 1, 370 feet away. A controlled demolition would presumably try to avoid such behaviour. If one accepts that WTC 7 was burning for many hours, it’s illogical to also propose the controlled demolition thesis because the one precludes the other.
The third claim relates to the apparent sound of explosions. But such claims are pure supposition. Testimonies from firefighters inside and outside of the building are consistent, and demolitions experts who saw WTC 7 collapse neither saw nor heard anything indicating an explosive demolition. Nothing can be seen or heard in videos that resembles explosive charges going off before the collapse.
Seismic data from multiple sources indicates that, as with the Twin Towers, the collapse of WTC 7 began slowly, completely unlike an explosive demolition but consistent with internal failures leading to global collapse. Any detonation of explosives within WTC 7 would have been detected by multiple seismographs monitoring ground vibration in the general area. No such telltale “spike” or vibratory anomaly was recorded by any monitoring instrument.
Crucially, explosive demolitions would not be very controlled, or likely to work at all, if they involved slamming tons of skyscraper debris through a building and then setting it on fire for seven hours. Precision explosives, timers, and wiring don’t like that sort of treatment (Source: Brent Blanchard of Protec http://tinyurl.com/z6zyc).
Fourth, many of those who propose the controlled demolition thesis cite controlled demolition expert Danny Jowenko in support of their case. Jowenko was on record as asserting, definitively, that what he saw, at that moment for the first time, was a controlled demolition. However, this is the view of one man who was asked to comment on the spot.
The November 2008 NIST report into the collapse of WTC 7 proffers a more realistic explanation, namely, that fire was the main reason for the collapse, along with lack of water to fight the fire and falling debris which ruptured the oil pipes feeding its emergency generators. The reduction in pressure triggered the automatic pumping system, which poured thousands of gallons of diesel onto fires which continued to burn throughout the afternoon on the lower floors.
At 5.20 p.m. a critical column buckled, leading to the collapse of floor 13, which triggered a cascade of floor failures within the building, eventually leading to global collapse. The fires, fueled by office contents, burned for seven hours, and together with the lack of water to fight them were the key reasons for the collapse. Popular Mechanics magazine polled 300 experts who came to the same conclusions.
Although it wasn’t completely obvious to the untrained eye at the time, WTC 7 had been seriously compromised by a 20-story gash in one corner facing Ground Zero, and by the time the evacuation order of the building was given it was visibly sagging.
Fifth, critics of the official theory claim that “pull” is standard jargon within the demolition industry for firing off demolition charges within buildings. But demolition experts have denied this; the usual term would be “shoot it” or “blow it.” “Pulling” refers to a procedure of attaching hawser cables to a building and using heavy vehicles to pull it over, something that would have been fairly easy for observers to detect.
Sixth, some critics have remarked on the fact that some witnesses saw “flashes”, inferring that these indicated explosions. However, these witnesses are mistaken. Viewed in slow motion, the flashes are not explosions: what you see is window glass popping out as the floors collapse and compress the air inside, with the sun momentarily reflected in each pane of glass as it falls.
Despite all of this evidence, we are somehow expected to believe that either:
a) “Explosives” were planted when the buildings were erected. That would require the longest conspiracy planning in history.
b) They were planted later. In which case, who planted them? How did they do that in a building occupied by 50,000 people on a daily basis? Perhaps they did it on weekends, when the building only had about 5,000 visitors a day?
Finally, many critics cite the events surrounding Larry Silverstein as their ‘smoking gun’. However, the theory is roundly debunked here.
To clarify the main points: Silverstein (the new leaseholder for the WTC) had been going to the Twin Towers’ “Windows on the World” restaurant (there were no survivors on this level) to dine and meet with his new tenants; he had been doing this daily since July 26, 2001. But on 9/11 he didn’t go because, he claimed, his wife had made a dermatologist appointment for him.
Many critics of the official narrative point out that in the interview in which Silverstein is asked where he was on 9/11, he appears to show signs of lying. It is, however, very likely that he was indeed simply going to a dermatologist appointment.
Out of the thousands of people who worked at the site during the day, many dozens at any one time would have been on holiday, off sick or simply slacking on September 11th (a good half dozen well-known celebrities narrowly avoided being caught up in the attacks).
That one of these happened to be the owner isn’t remarkable. There are plenty of important traders who did die in the attack — by the logic that one escaped suggests a conspiracy, the fact that many died should discredit it, right?
Also going against the idea of advance knowledge is that Neil D. Levin, the head of the Port Authority of New York and New Jersey (which presumably would be “in” on any conspiracy), was killed on 9/11 – while dining in the Windows on the World, no less. If there was advance knowledge, why was Silverstein informed while Levin wasn’t?
It has been repeatedly reported that Silverstein had insured the Twin Towers a year earlier, and it is more than “coincidental” that this insurance covered terrorist attacks. Further, Silverstein had numerous legal disputes that aimed to increase the payout by arguing that there were two separate attacks. To a first approximation, this was successful and Silverstein managed to claim approximately $4.6 billion.
But what critics don’t mention about this is that the total cost of the towers was significantly in excess of this — the insurance value was way below what it should have been. Most of the legal wrangling after the fact was also due to the insurance contracts being incomplete. The total cost of the attack would be in the region of $7 billion or more, leaving a considerable loss once the relatively measly insurance payout was claimed.
With too low an insurance value and less-than-solid contracts, literally none of the insurance-based activities seem to point to the actions of people who knew exactly what was going to happen in advance. If it was an insurance scam, it was the worst ever.
The World Trade Center had already been bombed once before, in 1993, and several major terror plots against U.S. landmarks had been uncovered since then. In light of this, an anti-terrorism insurance policy would appear to be an entirely logical purchase.
Mark Stewart’s most significant contribution to rock music was as the vocalist for the Pop Group, arguably the quintessential experimental rock band. Around this time, the bands of comparable artistic importance overseas were probably Pere Ubu and the Contortions. But I would argue that the Pop Group, led by Stewart, were creating a far more dazzling and daring primitive musical mega-fusion than their contemporaries.
Hailing from Bristol, the group were formed in 1977 and produced two seminal 45 rpm singles, She is Beyond Good and Evil and We Are All Prostitutes. The former emerged in the wake of the punk explosion in the UK but in reality both recordings were about as close to punk rock as Captain Beefheart was to the blues.
The combo recorded two brilliant albums. The best of these, and one of the greatest artistic achievements in rock music history, is the 1979 album Y, a genuinely revolutionary intense tour de force fusion of agit-prop punk, funk, jazz and dub syncopation.
Stewart’s subversive and anarchic singing style was the perfect vehicle for the nihilistic lyrics and slogans that hinted at an apocalyptic future. Each piece on Y embodies a ferocious synthesis of primitive rhythms and dissonances.
An underlying tension unfolds which is heightened by Stewart’s violent, anxious metaphysical screams and politically-charged lyrics while the music evokes a scary vision of a barbarous post-apocalyptic humanity.
Y is the punk generation’s equivalent of Soft Machine’s Third, which emerged from the Canterbury school a generation before.
The group’s second masterpiece, For How Much Longer Do We Tolerate Mass Murder (Rough Trade, 1980), places a greater emphasis on the rhythmic and funk aspects of the music than its predecessor. The record’s heavy use of wind and percussion instruments leaves more space for the anger, the torrent of noise and the anarchic verbosity of the songs to emerge and, in some cases, dominate.
The record utilizes a dogged rhythm, accompanied by unhinged keyboards, that repeats from beginning to end. As with Y, it is music made of syncopation, screams and noise, a collapsing sound wiped out by high-voltage discharge.
Within three years of the release of this masterpiece, the band had dissolved. Four new, separate, and important musical happenings spawned in its place – Rip Rig and Panic, Maximum Joy, Pig Bag, and Mark Stewart’s solo career.
Stewart went on to be a key part of the influential collective Tackhead, who invented a unique and powerful style of rap, funk, dub and soul music and whose legacy can be found in the trip-hop genre of the nineties. Stewart also recorded as Mark Stewart and the Maffia, continuing the programme launched with the Pop Group.
Stewart’s creative high points during this period included the albums Learning To Cope With Cowardice (Rough Trade, 1983), which accentuated the experimentation, and Friendly As A Hand Grenade (TVT, 1989), whose sound often mimics the refined and explosive funky soul of Was (Not Was).
If the prevailing tone is that of a hybrid between rap and soul, the most significant innovations are the pressing, unhinged Demolition House and the brutal Airborn Ranger, achieved by layering the electronics to symphonic effect and grafting heavy-metal guitars on top.
Sadly, neither Stewart nor the reformed Pop Group were able to repeat the artistic highs of their past glories. Nevertheless, Stewart and the band he led made a significant mark on the cultural landscape of the UK.
More broadly, the significant contribution Stewart made to rock music history worldwide cannot be overstated. With For How Much Longer Do We Tolerate Mass Murder and particularly Y, Stewart helped pioneer a completely new militant musical language and he can be considered to be one of the main innovators of funk, trip-hop and dub music. His influence among successive generations of musicians will continue long after his passing.
I would like to get one thing straight from the outset. Until the emergence of the Covid event I, like millions of other people, was convinced of the veracity of man-made climate change. But around three years ago, I began to re-evaluate my position. After having researched the subject in great detail, I am now convinced that the purpose of the climate change narrative is to fulfill a political agenda.
As is the case with the Covid event, this agenda is about divesting more and more power away from nation states and their citizens to the bloated and corrupt United Nations bureaucracy, which is essentially controlled by the rich and powerful.
The global warming/climate change idea is a project of the (very) elitist Club of Rome, whose members have included Al Gore, Ted Turner, George Soros, Bill Gates and members of the Rockefeller and Rothschild families.
The Club of Rome is the active division of a group of entities serving a globalist agenda, which have played the major part in the establishment of the United Nations, the European Union and NATO. They include the World Economic Forum, the Committee on Foreign Relations (CFR) and the Trilateral Commission.
The global warming project enables further enrichment of the already very wealthy, through the carbon trading scheme (Al Gore was projected to become the first carbon billionaire). However, United Nations publications such as Agenda 21 make it very clear that climate alarmism has another purpose: to enable and justify expansion of UN bureaucracy, the empowerment of NGOs, inevitably controlled by the globalists, and to control and contain the populace, all in the name of the Earth and the ill-defined ‘sustainability’.
Far from being a conspiracy theory, man-made climate change is actually a provable conspiracy enacted by a criminal cabal. In the Club of Rome’s own words:
“The Global Warming debate… is a concept by the New World Order to justify the dismantling of the industrial society and returning the mass of humanity to obedient serfdom.”
“If ‘balance’ means giving voice to those who deny the reality of human-triggered climate change, we will not take part in the debate”, declared the signatories of a letter to the Guardian.
The reason for this step, we are told, is that on the one hand there is an overwhelming scientific consensus and on the other, that there is a lobby, heavily funded by vested interests, that exists simply to sow doubt to serve those interests.
Apparently, scepticism represents ‘fringe views’ which should be ignored. Giving AGW sceptics a platform is said to be akin to showcasing flat-earthers. This is despite the fact that the official position of the Flat Earth Society is that it supports the climate-alarmist narrative. (Of course, it was sceptics who first argued that the world wasn’t flat.)
The purpose of the Guardian letter was to justify the already well-established practice of refusing to engage in debate on man-made climate change, by marginalising and belittling opponents, and to deplatform them.
Because of the shortage of real scientists prepared to put their names to the letter, we had the unedifying spectacle of the likes of Clive Lewis and Peter Tatchell declaring that they are above debating atmospheric physics with scientists of the calibre of Eric Karlstrom or Nobel Laureate, Ivar Giaever.
This exercise in dishonest narcissism demeans all who have signed or lent their support to it. Academics who speak out against the globalist narrative on climate change do so at the expense of their careers.
As I will show below, counter-arguments from AGW sceptics such as Tim Ball and Mark Steyn have never been discredited. Of the 15 or so professors who signed the Guardian letter, the majority work in unrelated fields such as economics, law or psychotherapy.
The same applies to others with impressive-sounding qualifications – Dr Teresa Belton, for example, wrote her thesis on the effects of television and video on children. In the case of 90% of the signees – academics, journalists, politicians, activists – the very idea that they could sensibly debate with serious climate scientists is ludicrous.
The letter in question comes out of the University of East Anglia and was drafted by Dr Rupert Read, Green Party politician and Reader in Philosophy at the University of East Anglia. A large number of signees have connections to the University.
The UEA is notorious as the centre of the Climategate scandal, whereby emails between scientists at the University of East Anglia Climate Research Unit (CRU) and their colleagues around the world revealed a consistent, deliberate effort to skew, hide or destroy data.
“Three themes are emerging from the newly released emails: Prominent scientists central to the global warming debate are taking measures to conceal rather than disseminate underlying data and discussions. These scientists view global warming as a political ’cause’ rather than a balanced scientific inquiry; and many of these scientists frankly admit to each other that much of the science is weak and dependent on deliberate manipulation of facts and data.”
“Man-made climate change (Anthropogenic Global Warming or AGW) is a scam and a hoax and until the average joe and jane wakes up to the truth this nonsense will continue to corrupt the scientific community, which depends on grants from those same economic and political powers, and more importantly will corrupt politicians worldwide who too are dependent upon them for campaign contributions.”
The climate change project was officially launched in the US on June 23, 1988, when NASA’s James Hansen told a Congressional committee that global warming had begun, attributing the then-current heat wave in Washington to ‘the greenhouse effect and observed warming’.
To get the point across, Hansen and sponsor Senator Tim Wirth chose what promised to be an exceptionally hot day and then sabotaged the air conditioning in the meeting room the night before.
Man-made climate change is one of those plain sight conspiracies like the Covid ‘pandemic’, where the primary movers hardly bother to conceal the contrived nature of the project, or the vast sums of money they make from it.
The Club of Rome’s 1991 report The First Global Revolution states (p. 75): “In searching for a common enemy against whom we can unite, we came up with the idea that pollution, the threat of global warming, water shortages, famine and the like would fit the bill…”
The motivation, then, was not to solve an urgent problem, but to find a threat, real or not, that would ‘unite’ people. And divert them from real issues.
The Club of Rome, founded in 1968, has been described as being at the apex of the New World Order pyramid. It drives the global climate change project as well as being concerned with population control and vaccinations. Members are world leaders and captains of industry, and have included Al Gore, Tony Blair, George Soros and other people you’d buy a used car from.
Anthropogenic Climate Change: the Official Position
The Intergovernmental Panel on Climate Change (IPCC) was founded with the task of providing the world with an objective, scientific view of climate change. Major points of its 2007 report are as follows:
Warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice and rising global average sea level.
Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations. Continued GHG emissions at or above current rates would cause further warming.
Anthropogenic warming could lead to some impacts that are abrupt or irreversible, depending upon the rate and magnitude of the climate change.
Notable achievements of the UN and its Kyoto Protocol include the creation of an international carbon market.
Scientific rejection of the IPCC’s position
The IPCC’s findings were opposed by scientists worldwide. The New Zealand Climate Science Coalition, for example, slammed the IPCC report as ‘dangerous nonsense’ and produced a list of pillars of wisdom to counter the UN IPCC climate report.
Over the past few thousand years, the climate in many parts of the world has been warmer and cooler than it is now. Civilizations and cultures flourished in the warmer periods. A major driver of climate change is variability in solar effects, such as sunspot cycles, the sun’s magnetic field and solar particles. Evidence suggests warming involving increased carbon dioxide exerts only a minor influence.
Since 1998, global temperature has not increased. Projection of solar cycles suggests that cooling could set in and continue to about 2030. Most recent climate and weather events are not unusual; they occur regularly. For example, in the 1930s the Arctic experienced higher temperatures and had less ice than now. Stories of impending climate disaster are based almost entirely on global climate models. Not one of these models has shown that it can reliably predict future climate.
The Kyoto Protocol, if fully implemented, would make no measurable difference to world temperatures. The trillions of dollars that it will cost would be far better spent on solving known problems such as the provision of clean water, reducing air pollution, and fighting malaria.
Climate is constantly changing and the future will include coolings, warmings, floods, droughts, and storms. The best policy is to make sure we have in place disaster response plans that can deal with weather extremes.
In essence, proponents of the theory of significant anthropogenic climate change need to show two things: that there is significant and dangerous global warming, and that said warming is caused by human activity, i.e. greenhouse gas emissions, primarily CO2 emissions. Sceptics, by contrast, need only show one thing: that the global climate is not significantly or dangerously affected by human activity.
Like all narratives pushed by the powerful onto the masses, the global warming hoax is supported by relentless fallacious argument, so that the public are battered with endless ad hominem, cherry-picking and appeals to authority.
Much of the data is suspect, to put it mildly, and a very large part of the ‘debate’ consists of apocalyptic scenarios, with threats of doom unless the public pours more money into the coffers of those profiting from the carbon hoax.
The IPCC’s position is still that there has been significant anthropogenic warming over the past 50 years, increasing at an exponential rate as we pump more and more CO2 into the atmosphere. Many scientists disagree, pointing to higher temperatures in the 1930s and a cooling since 1998. Some 150 graphs from 122 scientific papers published in peer-reviewed journals indicate modern temperatures are not unprecedented, unusual, or hockey-stick-shaped — nor do they fall outside the range of natural variability.
Data promoting the idea of runaway global warming has been questioned; for example, graphs used by NOAA (the National Oceanic and Atmospheric Administration) and NASA have been shown to have been ‘updated’, as it were.
In 2015, German professor Dr. Friedrich Karl Ewert accused NASA of ‘massive temperature alterations’, i.e. of intentionally and systematically rigging the official government record of global temperatures: ‘A comparison of the data from 2010 with the data of 2012 shows that NASA-GISS had altered its own data sets so that especially after WWII a clear warming appears – although it never existed.’
In 2008, the Telegraph reported NASA as claiming October as the hottest on record, by using September figures. The name ‘hockey stick graph’ was coined for figures showing a long-term decline followed by an abrupt rise in temperature, specifically applied to the findings of ‘a little known climate scientist named Michael Mann and two colleagues’ as described here by the Atlantic Council.
Canadian scientists Stephen McIntyre and Ross McKitrick obtained part of the programme that Mann used, and they found serious problems. Not only does the programme not do conventional principal component analysis, but it handles data in such a way that whatever data is fed in, it produces a hockey stick.
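The centring issue McIntyre and McKitrick described can be illustrated with a toy simulation. This is only a sketch of the general ‘short-centring’ critique, not a reproduction of Mann’s actual code; the series length, AR(1) persistence and 50-step ‘calibration’ window are illustrative assumptions. Centring each proxy series on only the late calibration period, rather than on its whole record, tends to promote series whose late values sit far from their long-run mean, so the first principal component of pure red noise acquires a ‘hockey stick’ shape more often than under conventional full centring.

```python
import numpy as np

def ar1_series(length, phi, rng):
    """Persistent AR(1) 'red noise' series, a stand-in for a proxy record."""
    x = np.zeros(length)
    for t in range(1, length):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def pc1(data, centering):
    """First principal-component series of a (time x proxies) matrix.

    `centering` is the slice of rows whose mean is subtracted from each
    column: slice(None) is conventional full centring; slice(-blade, None)
    mimics 'short centring' on the late calibration period only.
    """
    centered = data - data[centering].mean(axis=0)
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]

def hockey_stick_index(series, blade):
    """How far the late 'blade' departs from the earlier 'shaft', in shaft std units."""
    shaft, tip = series[:-blade], series[-blade:]
    return abs(tip.mean() - shaft.mean()) / shaft.std()

rng = np.random.default_rng(0)
length, n_proxies, blade, trials = 300, 50, 50, 20
short_idx, full_idx = [], []
for _ in range(trials):
    # Independent red-noise 'proxies': no climate signal whatsoever.
    data = np.column_stack([ar1_series(length, 0.9, rng) for _ in range(n_proxies)])
    short_idx.append(hockey_stick_index(pc1(data, slice(-blade, None)), blade))
    full_idx.append(hockey_stick_index(pc1(data, slice(None)), blade))

print(f"mean hockey-stick index, short-centred: {np.mean(short_idx):.2f}")
print(f"mean hockey-stick index, fully centred: {np.mean(full_idx):.2f}")
```

Averaged over the trials, the short-centred PC1 shows a markedly larger blade-versus-shaft offset than the fully centred PC1, even though the inputs are noise with no trend at all, which is the essence of the published critique.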
Mann has queried their findings, but refused to provide necessary additional data (McIntyre and McKitrick’s adventures with Mann are described here). Michael Mann has been suing various critics for libel, including Mark Steyn, whose A Disgrace to the Profession is a compilation of scientific commentary on Michael Mann and his work. Steyn has also termed Mann a Big Climate huckster. Another target of Mann’s litigation is emeritus Professor Dr. Tim Ball, who likewise suggested Mann was guilty of data fraud. Mann has been reported as being in contempt of court in the Ball case for failing to provide essential data.
When the promised global warming failed to eventuate, the phrase ‘global warming’ gave way to ‘climate change’. So when cherry-picked claims of extreme heat are met with examples of low temperatures, they are countered with, ‘there you go, extreme climate change!’.

Carbon Dioxide
The cause of ‘runaway global warming’ is, according to alarmists, the production of CO2. Not carbon monoxide, note, the one that is poisonous (we’re not worried about that), but carbon dioxide, which is necessary for plant life, and which greenhouse owners often add to improve the growth of their vegetables.
Scientists have pointed out in vain that the level of carbon dioxide has been far higher in the past: during the Cambrian period it was about 18 times higher. Moreover, during the glaciation of the late Ordovician period, CO2 concentrations were nearly 12 times higher than today, according to one report. This study has similar results.
Winter is Coming
From the early fourteenth to the mid-nineteenth century, Europe and other parts of the world experienced what is called the Little Ice Age. It led to much misery, with cold and hunger from the failure of crops, political upheaval, and the abandonment of the Norse colonies in Greenland.
In 1484, Pope Innocent VIII recognized the existence of witches and echoed popular sentiment by blaming them for the cold temperatures and resulting misfortunes plaguing Europe. (N.B. Greenland still has not recovered from the Little Ice Age.) For some years, scientists have been predicting the coming of a new mini ice age.
The response of British institutions like the Met Office and the University of East Anglia has been interesting. In 2012 they released data showing that the warming trend ended in the late 1990s, but insisted that cooling from natural sources will be offset by carbon emissions.
Evidence of the Earth cooling has not given any pause to alarmist claims of dramatic warming, which have been present from the outset. In 1989 NASA’s James Hansen was predicting that global temperatures would rise by up to 9 degrees Fahrenheit by 2050.
In his film An Inconvenient Truth, Al Gore warned that increasing carbon dioxide emissions would spur catastrophic global warming that would cause more extreme weather, wipe out cities and cause ecological collapse. (The claims and predictions of An Inconvenient Truth were scrutinised 10 years on by Michael Bastasch in An Inconvenient Review).
In his review of the book that accompanied Gore’s film, Hansen claimed:
“As explained above, we have at most ten years—not ten years to decide upon action, but ten years to alter fundamentally the trajectory of global greenhouse emissions.”
To give a sense of urgency, the global warming threat has been described in the most extravagant terms. Hansen warned of a ‘global warming time bomb’ when he spoke to the Club of Rome in 2009. The concept of a ‘tipping point’ came into vogue, marking the peak of climate alarmism.
Marc Morano prepared a full list of apocalyptic declarations, exclaiming ‘Hours, days, months, years, millennium – the Earth is serially doomed’.
It is suggested that the only authentic climate ‘tipping point’ is the one proposed by New Zealand’s Augie Auer, who predicted in 2007 that it would all be shown to be a joke within five years’ time. (Auer reckoned without the powerful forces behind the climate hoax.)
Melting of the icecaps would be a truly dramatic event, a serious indication of warming. Accordingly, climate alarmists have seized on this ‘danger’, in defiance of all the evidence. In 2007 — during his Nobel Peace Prize acceptance speech — Al Gore mooted that the northern icecap could be gone by 2014.
One study estimated that [the North polar ice cap] could be completely gone during summer in less than 22 years. In 2015 NASA data indicated that the polar icecaps were not receding, but in fact growing.
This did not deter Peter Wadhams, Professor of Ocean Physics at Cambridge University, from predicting in 2016 that the icecap at the North Pole would be completely melted within the next year or two, i.e. by the end of summer 2018 at the latest. Nearly five years on, the polar caps are still here.
Others are sure that the icecaps will be gone by 2050 at the latest. This view is expounded in an article by Gilbert Mercier, who is sure that by 2100 the countryside will be parched earth and major cities like London and New York will be under water.
Another catastrophist who has repeatedly been proven wrong is Dr. Guy McPherson, Professor Emeritus of Natural Resources and Ecology and Evolutionary Biology at the University of Arizona. McPherson is described in his biography as an “award-winning scientist and the world’s leading authority on abrupt climate change.”
Recently, McPherson claimed in a podcast that ‘abrupt climate change’ will result in the extinction of humans by 2026. Six years ago, McPherson wrote an article making a similarly dramatic catastrophic prediction, including a timeline for virtual human extinction within 9–33 months of the article’s publication.
Conveniently, McPherson deleted the article. However, in a 2018 video McPherson predicted that humans would be extinct by 2028 and that the Arctic would be ice-free by 2019.
On March 20, 2000, the Independent reported that snowfalls were a thing of the past: global warming was simply making the UK too warm for heavy snowfalls. ‘Children just aren’t going to know what snow is’, they claimed.
The source of these claims was Dr. David Viner of the Climatic Research Unit (CRU) of the University of East Anglia, of Climategate fame. The Independent article appears to be gone from the Web, melted away as it were, but was well reported, and certainly criticised.
Similar false claims were made by arguably the world’s leading climate doomsayer, Al Gore, who has reportedly made $330m as a result of advocating on behalf of the alarmist cause. Gore made his fortune when he set up a green investment firm that is now said to be worth $36bn, paying him $2m a month.
In An Inconvenient Truth, Gore claimed that Kilimanjaro, Africa’s tallest peak, would be snow-free within a decade. In a recent speech at the World Economic Forum at Davos, Gore’s hyperbole was off the scale: he warned about “rain bombs” and “boiling oceans.”
Gore’s pseudo-scientific hyperbole, and the appeals to moral authority championed by his acolytes, have rarely been critiqued by journalists. Meanwhile, catastrophic warnings about the alleged impacts of ‘runaway climate change’, and the moral imperative to act against it, have become normalized across much of the panoply of social media.
Apocalyptic predictions are supported by relentless reporting of supposedly extraordinary events proving a trend towards global warming. The cherry-picking in many cases is both obvious and ludicrous, and often the actual facts are open to question.
Nonetheless, the consensus claim is a mantra repeated over and over again in the face of unwelcome factual evidence.
One might well ask, who cares?
The argument is an appeal to authority, a red herring fallacy, and the beliefs of a claimed 97% of ‘scientists’ don’t actually change the scientific facts. As often happens with the use of fallacious argument, the premise is completely false as well. It is clear that there has been concerted and substantial opposition from scientists to the AGW narrative and the carbon fraud. Essentially the 97% claim is a bare-faced lie, designed to make sceptics look like loonies.
Over 31,000 American scientists signed a petition in response to the 1997 Kyoto Accord: ‘There is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause catastrophic heating of the earth’s atmosphere and disruption of the Earth’s climate. Moreover, there is substantial scientific evidence that increases in atmospheric carbon dioxide produce many beneficial effects upon the natural plant and animal environments of the Earth.’
Attached to the petition is a summary of peer-reviewed research with 132 references. Marc Morano has given a breakdown of more than 1000 international scientists who dissented over man-made global warming claims from 2008 to 2010. Morano refers to, for example:
U.S. Senate Minority Report: More Than 700 International Scientists Dissent Over Man-Made Global Warming Claims: Scientists Continue to Debunk ‘Consensus’ in 2008 & 2009.
712 Prominent scientists from 40 countries signed the Manhattan Declaration on Climate Change, sponsored by the International Climate Science Coalition (ICSC). The 2008 declaration states in part, ‘Global climate has always changed and always will, independent of the actions of humans, and carbon dioxide (CO2) is not a pollutant but rather a necessity for all life’.
In 2009, more than 100 international scientists rebuked President Obama’s view of man-made global warming. The scientists wrote: ‘Mr. President, your characterization of the scientific facts regarding climate change and the degree of certainty informing the scientific debate is simply incorrect.’
On December 8, 2009, an Open Letter to the UN Secretary-General from 166+ scientists declared ‘the science is NOT settled’.
In 2010, 130 German scientists called climate fears a ‘pseudo religion’ and urged the Chancellor to ‘reconsider’ her views.
In 2010, more than 260 scientists who are members of the American Physical Society (APS) endorsed the efforts of skeptical Princeton University Physicist Dr. Will Happer to substantially amend the APS alarmist statement on man-made global warming.
A Japan Geoscience Union symposium survey in 2008 showed 90 per cent of the participants do not believe the IPCC report.
The prestigious International Geological Congress, dubbed the geologists’ equivalent of the Olympic Games, was held in Norway in August 2008. It prominently featured the voices of scientists sceptical of man-made global warming fears.
Professor Larry Bell of Houston University has also debunked the 97% claim, reporting:
A 2010 survey of media broadcast meteorologists conducted by the George Mason University Center for Climate Change Communication found that 63% of 571 who responded believe global warming is mostly caused by natural, not human, causes. Those polled included members of the American Meteorological Society (AMS) and the National Weather Association.
A more recent 2012 survey published by the AMS found that only one in four respondents agreed with UN Intergovernmental Panel on Climate Change claims that humans are primarily responsible for recent warming. And while 89% believe that global warming is occurring, only 30% said they were very worried.
A March 2008 canvass of 51,000 Canadian scientists with the Association of Professional Engineers, Geologists and Geophysicists of Alberta (APEGGA) found that although 99% of 1,077 replies believe climate is changing, 68% disagreed with the statement that ‘…the debate on the scientific causes of recent climate change is settled.’ Only 26% of them attributed global warming to ‘human activity like burning fossil fuels.’ Regarding these results, APEGGA’s executive director, Neil Windsor, commented, ‘We’re not surprised at all. There is no clear consensus of scientists that we know of.’
Retired senior NASA atmospheric scientist, and supervisor to James Hansen, Dr. John S. Theon has called Hansen an embarrassment, and added himself to the list of NASA scientists who dissent from man-made climate fears. Others include:
Aerospace engineer and physicist Dr. Michael Griffin, the former top administrator of NASA,
Atmospheric Scientist Dr. Joanne Simpson, the first woman in the world to receive a PhD in meteorology, and formerly of NASA,
Geophysicist Dr. Phil Chapman, an astronautical engineer and former NASA astronaut,
Award-Winning NASA Astronaut/Geologist and Moonwalker Jack Schmitt,
Award-winning NASA Astronaut and Physicist Walter Cunningham of NASA’s Apollo 7,
Chemist and Nuclear Engineer Robert DeFayette, formerly with NASA’s Plum Brook Reactor,
Hungarian Ferenc Miskolczi, an atmospheric physicist with 30 years of experience and a former researcher with NASA’s Ames Research Center,
Climatologist Dr. John Christy,
Climatologist Dr. Roy W. Spencer,
Atmospheric Scientist Ross Hays of NASA’s Columbia Scientific Balloon Facility.
Rather than there being a consensus of 97% of scientists who believe in climate alarmism, the opposite is more likely to be true: that 97% of scientists of integrity and without a financial interest believe that AGW alarmism is fraudulent.
The World Climate Declaration
The alarmist narrative took a huge hit in August last year when over 1,100 scientists and professionals put their names to the World Climate Declaration (WCD). The signatories, drawn from across the world and led by the Norwegian physics Nobel Prize laureate Professor Ivar Giaever, reject the claim that there is a ‘climate emergency’.
The WCD’s signatories posit that the ‘scientific consensus’ on man-made climate change is part of a politically driven media agenda, and that grant-dependent academics have degraded the discipline into a discussion based on beliefs rather than sound, self-critical science.
In particular, the WCD is critical of climate models, noting that they “are not remotely plausible as global policy tools.” The WCD contends that these models exaggerate the negative effects of carbon dioxide, and instead emphasizes that the gas is beneficial for nature and agriculture: it increases global crop yields, promotes growth in plant biomass and is essential to all life on Earth.
It is also the contention of the WCD that historic climate models have overstated the projected negative impacts of climate change compared to real-world events, and that insufficient emphasis is placed on the empirical scientific method. In addition, the WCD declares that there is no statistical evidence that climate change is intensifying hurricanes, floods and droughts, or making them more frequent.
Investigative journalist and researcher, Whitney Webb, argues that the prevailing AGW consensus is intimately tied to corporate interests embodied in the UN’s annual ‘COP’ gatherings.
Commenting on the recent COP26 event in Glasgow, Webb said:
”COP is about setting up the financial infrastructure for a completely new economic system based on CBDCs and the financialization of ‘natural capital’ and ‘human capital’ into new asset classes. It’s about complete economic domination of the planet, not about ‘saving’ it.”
What Webb evokes is the endless corporate drive to privatise the planet, the tendency for capitalists to seek control of ecosystems as ‘financial assets’, and to deny the rights of people around the world to benefit from nature.
Dr Yeadon’s credentials are impeccable. He has a degree in biochemistry and toxicology and a research-based PhD in respiratory pharmacology. He spent over 30 years leading new medicines research in some of the world’s largest pharmaceutical companies, leaving Pfizer in 2011 as Vice President & Chief Scientist for Allergy & Respiratory, the most senior research position in this field at Pfizer.
Dr Yeadon argues the main reason for the lies about the novel virus is a desire for total predictability and control, with the clearly articulated intention of transforming society.
Yeadon says the intention is to:
“dismantle the financial system through lockdowns and furlough, while the immediate practical goal of lockdown was to provide the casus belli for injecting as many people as possible with materials designed not to induce immunity, but to demand repeat inoculation, to cause injury and death, and to control freedom of movement.”
The 30-minute documentary film Killing Us Softly (1979), based on a lecture by Jean Kilbourne, focuses on the effects of advertising on women’s self-image and the objectification of women’s bodies. Kilbourne argues that the superficial and unreal portrayal of women in advertising lowers their self-esteem and that sexualized images of them are being used to sell a multitude of goods.
Kilbourne goes on to posit that these images degrade women, encourage abuse, and reinforce a patriarchal and sexist society. She also makes the connection between advertising and pornography, stating that “the advertisers are America’s real pornographers”.
More than four decades after the release of Killing Us Softly, Kilbourne discussed some of the issues in her film with a new audience of young people. Significantly, she says that since the film’s initial release in 1979 “things have got worse, not better.”
It can barely get any worse than the case of 21-year-old bulimia sufferer Eloise Perry. On April 12, 2015, Ms Perry tragically died at the Royal Shrewsbury Hospital one week after having swallowed eight unlicensed fat-burning pills that she had purchased from the internet.
The pills, which the Food Standards Agency describes as illegal to sell for human consumption, contained DNP, an industrial chemical historically used in the manufacture of explosives and fungicides. Website companies who sell this chemical depict DNP as a fat-burning product, and some even use the tag line “getting leaner through chemistry” as a marketing tool.
The social pressures for young women (and increasingly young men) to conform to certain expectations placed upon them by the media are immense. The upshot is that young people are involved in a constant psychological battle between myth and reality. In Britain, for example, the average size of a woman is 16 but the ‘aspirational’ size is zero – an unobtainable goal.
The contradiction between reality and aspiration, and the weaponization of feminism, is undermining many of the gains that women made in the debates of the 1960s and 1970s. What Ariel Levy terms “raunch culture” – the sexualisation of women as sex objects – is another symptom of the undermining of these gains.
The normalization of sexist imagery in pop videos and television commercials, and the sexualization of young girls’ clothes, are further illustrations of the raunch culture outlined by Levy, in which fantasies, desires and ambitions are transformed into commodities to make money.
The growth in cosmetic surgery is another factor that raises expectations of women’s appearance. Ninety-one per cent of cosmetic surgery is undertaken on women, of which the most popular procedure is breast enhancement. I was astounded to learn that in the US it is widely considered normal practice for girls to be given a breast enlargement as a graduation present.
A growing number of girls suffer low self-esteem, perpetuated by a media system that constantly portrays an ‘ideal’ body shape; the tendency is less common in the developing world.
This would seem to suggest that mental illness, of which eating disorders are a reflection, is to a large extent symptomatic of the growth of the consumerist capitalist society in which human relations are objectified. In Marxist terms, objectification is the process by which human capacities are transferred to an object and embodied in it.
Young females who read fashion magazines tend to have more bulimic symptoms than those females who do not – further demonstrating the impact the media has on the likelihood of developing the disorder.
As J. Kevin Thompson and Eric Stice have shown, individuals first accept and ‘buy into’ the ideals set by fashion magazines, and then attempt to transform themselves in order to reflect the societal ideals of attractiveness.
The thin fashion-model ideal is then reinforced by the wider media reflecting unrealistic female body shapes, leading to high levels of discomfort among large swaths of the female population, and to the drive towards thinness that this implies.
Consequently, dissatisfaction, coupled with a drive for thinness, is thought to promote dieting and its negative effects, which could eventually lead to bulimic symptoms such as purging or bingeing. Binges lead to self-disgust, which causes purging to prevent weight gain.
Thompson and Stice’s research highlights the extent to which the media affect what they term “thin ideal internalization”. The researchers used randomized experiments (more specifically, programmes dedicated to teaching young women how to be more critical of media imagery) in order to reduce thin ideal internalization. The results showed that by creating more awareness of the media’s control of the societal ideal of attractiveness, thin ideal internalization significantly dropped.
In other words, fewer thin-ideal images portrayed by the media resulted in less thin-ideal internalization. Thompson and Stice were therefore able to conclude that there is a direct correlation between the media’s portrayal of women and how women feel about themselves.
Social media also plays a part. A 2014 two-part study looking at social media sites such as Facebook and Twitter researched their influence on, and risk for, eating disorders. In the first part of the study, 960 women completed self-report surveys regarding Facebook use and disordered eating. In the second part, 84 women were randomly assigned either to use Facebook or to use an alternate internet site for 20 minutes.
What the cross-sectional survey illustrates is that more frequent Facebook use is associated with greater disordered eating. The survey indicates a close correlation between Facebook use and the maintenance of weight/shape concerns and state anxiety, compared to an alternate internet activity. Other research suggests an etiological link between eating disorders and the tendency towards self-harming, now referred to as Non-Suicidal Self-Injury (NSSI).
Over 1.6 million people in the UK are estimated to be directly affected by eating disorders. However, the Department of Health estimates that the figure is more likely to be 4 million, due to the huge level of unmet need in the community.
Studies suggest that as many as 8 per cent of women have bulimia at some stage in their life. The condition can occur at any age, but mainly affects women aged between 16 and 40 (on average, it starts around the age of 18 or 19). Reports estimate that up to a quarter of Britons struggling with eating disorders may be male.
1. Mabe AG, Forney KJ, Keel PK. ‘Do you “like” my photo?’ Int J Eat Disord. 2014 Jul;47(5):516-23.
2. Jacobson CM, Luik CC. Epidemiology and Sociocultural Aspects of Non-suicidal Self-Injury and Eating Disorders. 2014.
3. Joint Commissioning Panel for Mental Health (www.jcpmh.info/wp-content/uploads/10keymsgs-eatingdisorders.pdf).
For a very long time, I have strongly suspected that something wasn’t quite right with the media’s Greta Thunberg narrative. Then, on March 16, I was very much taken by a comment on Twitter by Dr Simon Goddek (@goddeketal).
Could the narrative constructed around Greta finally be crumbling?
Here’s the full text:
”The more I dug into @GretaThunberg‘s story, the more I realized that something stinks here. It’s no COINCIDENCE that her first appearance was on August 20, 2018, with a sit-in protest in front of the Swedish Parliament, followed COINCIDENTALLY four days later by the release of a book she co-authored with her mother.
But that’s not all – the PR machine for her was already in full swing on August 20, thanks to a man named Ingmar Rentzhog, who financed and drove the campaign through his company, @WeDontHaveTime.
And guess what? Rentzhog is also COINCIDENTALLY the chairman of the think tank “Global Challenge” (@ChallengesFnd), which is now COINCIDENTALLY fully financed by a billionaire named Kristine Person, a member of the Swedish Social Democratic Workers’ Party and former minister in the government under Stefan Löfven.
And if that’s not enough, Rentzhog purely COINCIDENTALLY happened to walk by the Swedish Parliament on August 20 and encounter Greta during her sit-in protest, taking a photograph of her.
But wait, there’s more – Rentzhog and Greta’s mother had already met before at a climate conference on May 4, 2018, which is COINCIDENTALLY the exact date when Rentzhog became CEO of the aforementioned think tank.
And here’s something interesting – both Kristine Person and Stefan Löfven happen to be members of Klaus Schwab’s @WEF.
It’s amazing how all these connections seem to come full circle, isn’t it?
It’s clear to me that something fishy is going on behind the scenes here. These people are manipulating the public and abusing their power for their own political gain. We need to be aware of their tactics and warn everybody we know about the Great Reset and the Fourth Industrial Revolution.”
Very well said, Dr Goddek.
Talking of coincidences:
Greta’s function, to warn of impending doom for the planet if we do not do something about CO2, is COINCIDENTALLY the same message pumped out by the WEF, a message intended to further empower the United Nations and to help pave the way for global government.
Greta’s social media accounts are completely focused on the task in hand: creating ‘climate panic’ in defiance of the facts. Her Facebook account is a series of self-promotional posts with no interaction with comments. The list of people that Greta follows on Twitter comprises world leaders and major political figures, climate and environment accounts like the Soros-funded Greenpeace, the WEF, and celebrities.
Over the past five years, the political and media establishment have helped galvanise the world’s youth in her support. ‘Independent analysts’ Media Lens, who falsely portray themselves as an alternative to the corporate media, heavily promote Greta’s climate catastrophism, as does the Guardian’s George Monbiot.
Greta works in tandem with Extinction Rebellion (XR), which appears to be the climate cult’s Antifa, promoting civil disobedience in order to force action on the ‘climate emergency’.
Within a few short months Greta’s stature was such that she was invited to address the UN’s Climate Change Conference at Katowice, making her plea for ‘climate justice’.
Greta’s very first tweet back in June 2018 – which has since been deleted – was to post an article (in English of course) which warned that climate change will wipe out all of humanity unless we stop using fossil fuels over the next five years.
This is what Greta said in her tweet, dated June 21, 2018:
“A top climate scientist is warning that climate change will wipe out all of humanity unless we stop using fossil fuels over the next five years.” — Greta Thunberg (@GretaThunberg)
For the first time in 33 million years, it seems, we are almost at a point where there is no ice at either pole:
“The chance that there will be any permanent ice left in the Arctic after 2022 is essentially zero […] with 75 to 80 percent of permanent ice having melted already in the last 35 years.”
How did that turn out, Greta?
Greta’s obvious claptrap underscores the fact that the Arctic was never anywhere near melting away. The earth has not even returned to the temperatures of the Medieval Warm Period, when Greenland was colonised – so how can we be approaching temperatures not seen in 33 million years?
The spectacle of the legacy media using a young girl in order to panic the world into giving more power to the WEF is, to say the least, bizarre. Thousands of the world’s scientists (see here and here) have called climate alarmism a hoax.
Ludicrous as it may seem, we are expected to ignore the facts about geological history, CO2 and global climate, and to follow the lead of a young woman who, since the age of 15, has parroted arrant nonsense embedded in unending cliché, on the say-so of the likes of George Monbiot and Media Lens.
This same media not only continue to parrot Thunberg’s nonsense, but seemingly have absolutely no qualms about having callously exploited and manipulated a child in order to further the goal of world government.
Investigative journalist and researcher, Whitney Webb summed-up what is behind the prevailing climate change orthodoxy and the media’s fetishization of Greta.
Webb highlighted the fact that the twin phenomena are intimately tied to corporate interests embodied in the UN’s climate change agenda:
“COP26 is about setting up the financial infrastructure for a completely new economic system based on CBDCs and the financialization of ‘natural capital’ and ‘human capital’ into new asset classes. It’s about complete economic domination of the planet, not about ‘saving’ it.”
Webb revealed the nefarious Malthusian underpinnings that drive the UN’s climate change agenda, arguing that legitimate environmental concerns have been usurped in its pursuit. That Malthusian outlook was set out plainly by the Club of Rome in its 1991 report The First Global Revolution:
”In searching for a common enemy against whom we can unite, we came up with the idea that pollution, the threat of global warming, water shortages, famine and the like, would fit the bill. In their totality and their interactions these phenomena do constitute a common threat which must be confronted by everyone together. But in designating these dangers as the enemy, we fall into the trap, which we have already warned readers about, namely mistaking symptoms for causes. All these dangers are caused by human intervention in natural processes, and it is only through changed attitudes and behavior that they can be overcome. The real enemy then is humanity itself.”
In 1978, the Australian social scientist, Alex Carey, pointed out that the twentieth century had been characterized by three developments of great political importance:
“the growth of democracy; the growth of corporate power; and the growth of corporate propaganda as a means of protecting corporate power against democracy.”
In order to defend their interests against the forces of democracy, the corporations that now dominate much of the domestic and global economies recognize the need to manipulate the public through media propaganda, manufacturing their consent. This is largely achieved through coordinated mass campaigns that employ sophisticated public relations techniques.
The result is that the media underplay, or even ignore, the economic and ideological motivations that drive the social policy decisions and strategies of governments.
Sharon Beder outlines the reasoning behind the coordinated political, corporate and media attacks on democracy:
“The purpose of this propaganda onslaught has been to persuade a majority of people that it is in their interests to eschew their own power as workers and citizens, and forego their democratic right to restrain and regulate business activity. As a result the political agenda is now largely confined to policies aimed at furthering business interests.”
This is the context in which the UK political and media establishment undermined former Labour leader, Jeremy Corbyn. The plot to oust Corbyn began the moment he became leader after a hardcore group that included shadow chancellor Chris Leslie, shadow education secretary Tristram Hunt, shadow communities secretary Emma Reynolds and shadow defence secretary Vernon Coaker, all refused to serve under him.
Others included shadow transport secretary Michael Dugher, shadow chief secretary to the Treasury Shabana Mahmood, shadow international development secretary Mary Creagh and shadow Cabinet Office minister Lucy Powell. These figures as well as the establishment in general were aware that Corbyn could not be bought off on their terms. The former Labour leader’s incorruptibility represented a potential threat to the gravy train that sustains them.
In other words, it’s not merely Corbyn whom the establishment regard as a democratic threat to their hold on power, but what he represents as an example to others following in his footsteps, which is why, even now, they want to shut him up. It’s the potential for breaking the iron-clad neoliberal consensus that underscores what has arguably been some of the most vitriolic and biased reportage ever witnessed against any British political figure in history.
What Media Lens accurately described as a “panic-driven hysterical hate-fest right across the corporate media spectrum,” began the moment the plotting against him by members of his own party began. As the media analysts noted at the time of the leadership election, “the full extent of media bias against Jeremy Corbyn can be gauged simply by comparing the tone and intensity of attacks on him as compared to those directed at the other three candidates: Andy Burnham, Yvette Cooper and Liz Kendall.”
The level of the media attacks against Corbyn continued after he secured ‘the largest mandate ever won by a party leader’. The focus of these attacks included what colour poppy he would wear, his refusal to sing the national anthem or whether he would wear a tie or do up his top button. All of this was granted national news headlines and incessant coverage.
Not to be outdone, in October 2015, the BBC’s political editor Laura Kuenssberg featured in an almost comically biased, at times openly scornful, attack on Corbyn’s reasonable stance on nuclear weapons. The BBC then broadcast five senior New Labour figures all opposing Corbyn, without any opportunity for an alternative viewpoint.
Kuenssberg followed up this hatchet-job three months later when she helped to orchestrate the live resignation of Labour shadow foreign minister Stephen Doughty on the BBC2 Daily Politics show, as a prelude to accusing Corbyn’s team of ‘unpleasant operations’ and ‘lies’. Then came the April 12, 2016 Telegraph article – a non-story about Corbyn’s state-funded salary and pension.
Eleven months later (March 5, 2017), the same rag continued with the smears by suggesting Corbyn had paid insufficient tax on his declared annual earnings – a claim subsequently debunked within hours on social media.
Meanwhile, the news that then Tory Chancellor, Philip Hammond, refused point-blank to publish his own tax returns after being prompted to do so by his opposition counterpart, John McDonnell, did not receive anything like the same kind of media scrutiny.
The implication was that Corbyn had misled the public. However, similar media outrage was not leveled at then PM Theresa May after it was revealed (March 7, 2017) that she had lied to parliament after having falsely claimed that Surrey Council had not engaged in a ‘sweetheart’ deal with the Conservative government.
Academic studies indicate that when it came to criticising Corbyn’s political opponents, a completely different set of media standards were applied: A major content analysis from Cardiff University revealed that the BBC is pro-business and Conservative-leaning in its coverage. The London School of Economics and Political Science found strong media bias against Corbyn, claiming the press had turned into an “attack dog” against the opposition leader.
According to content analysis from the Media Reform Coalition, the UK’s public service broadcaster gave double the airtime to Corbyn’s critics compared to his allies.
The anti-Corbyn propaganda was systematic and entrenched within both the legacy media and the Labour party hierarchy. Both were determined to topple Corbyn, using ‘anti-Semitism’ as a weapon to achieve it. Journalists Tony Greenstein and Asa Winstanley were among the first to highlight the politically-motivated smears of the pro-Israel lobby against Corbyn.
In an excellent piece published by the Electronic Intifada (April 28, 2016), Winstanley outlined the links between right-wing, anti-Corbyn and pro-Israel forces within the Labour party. Winstanley meticulously showed how this lobby manufactured an ‘antisemitism crisis’, pinpointing the individuals involved, the tactics and dirty tricks used and the connections to powerful individuals whose ties lead to pro-Israel groups both in London and Israel.
One of the key contrived ‘antisemitism’ accusations levelled at Corbyn during this period came from Labour MP Ruth Smeeth, whom WikiLeaks revealed to be a ‘strictly protected’ US informant. Smeeth staged a highly publicised walk-out on June 30, 2016 during Corbyn’s launch of a review into the Labour party’s supposed ‘anti-semitism crisis’ which, as Jonathan Cook pointed out, was in fact, “a crisis entirely confected by a toxic mix of the right, Israel supporters and the media.”
A few days earlier another manufactured and staged anti-Corbyn story made the headlines. This time it centred around a Corbyn ‘heckler’ at Gay Pride, who in fact, as Craig Murray observed, turned out to have been Tom Mauchline. At that time Mauchline worked for the public relations firm, Portland Communications, whose ‘strategic counsel’ is Alastair Campbell, Blair’s former media chief who helped to sell the illegal invasion-occupation of Iraq.
In addition, Corbyn’s pro-Remain position with respect to the EU referendum provided his critics with the ammunition they needed in their attempts to undermine him further. Chief among these critics was Angela Eagle.
Eagle was one of the many Oxford-educated Blairite plotters who resigned her post in order to position herself as a potential replacement for Corbyn, claiming to be dissatisfied with his performance during the EU referendum campaign. However, as the graphic below indicates, Corbyn did much better than Eagle in defending their respective Remain positions:
The Labour party gained 60,000 members in one week following the attempted coup against Corbyn. Membership levels were higher under Corbyn than the previous peak of 405,000 last seen under Tony Blair’s leadership. In his constituency of Islington North, Corbyn inherited a majority of 4,456, which increased to 21,194. He’s one of the few Labour MPs whose vote increased between 2005 and 2010, when he added 5,685 to his majority.
Furthermore, under Corbyn’s leadership, London, Bristol and Greater Manchester ushered in Labour mayors, rolling back years of Tory dominance, while Labour’s majorities in by-elections had generally increased.
It should also be remembered that pre-coup, Labour led the Tories in three polls in a row over 41 days. The long-term decline in Labour’s fortunes that preceded Corbyn can hardly be blamed on the then Labour leader. Nevertheless, these positive statistics didn’t stop the attempts by opportunistic and self-serving careerists within the party to undermine him.
Corbyn’s alleged weakness at the dispatch box was presented as evidence of ‘ineffectual opposition’, despite the fact that under his leadership the Tories had been forced into some thirty policy U-turns. On the core domestic policy issues, Corbyn maintained the support of the majority of the British public.
However, the establishment insisted he was ‘unelectable’. As one commentator on Twitter put it: “‘unelectable’ is media-political code for ‘likely to be highly electable but will not serve elite interests.’”
Following Theresa May’s surprise decision to call a snap election for June 8, 2017, the media bias against Corbyn ramped up another notch, particularly by, but not limited to, the gutter Murdoch press.
During the build-up to the General Election, the BBC for example, no longer even pretended to be impartial, as the Tweets below illustrate:
Laura Kuenssberg, more than any other BBC correspondent, appeared to have had a particular dislike for Corbyn that bordered on the outright contemptuous. This hatred was best summed up by Media Lens who critiqued Kuenssberg’s “subtle insidious use of language” in a BBC hit-piece.
It was hardly a surprise to learn that the kind of sustained attacks against Corbyn were the result of an increasingly concentrated foreign ownership of the UK media. This media made it clear they supported the Tories in the build-up to the General Election, not least because of Theresa May’s hard Brexit strategy at that time.
The mass media frequently depicted May’s stance as indicative of her ‘strong and stable’ leadership. Conversely, their antagonistic depiction of Corbyn as weak and calamitous was the opposite of the truth.
In a rare moment of honesty, The Guardian’s Roy Greenslade wrote:
“Mainstream media as a whole took its gloves off and Corbyn’s electoral hopes have been doomed from day one. He was “a great leap backwards”, said the Mail. Beware this “absurd Marxist”, said the Express, while the Daily Telegraph referred to his “divisive ideology” and “atavistic hostility to wealth and success”. And the Sun? It just called him “bonkers”. There was scepticism too from the liberal left. The Independent thought he would not persuade middle England to accept his policies.”
“Neither the Daily Mirror nor the Guardian greeted Corbyn with open arms. Support for him on social media made no impact. Meanwhile, the overall anti-Corbyn agenda, repeated week upon week, month after month, was one that broadcasters were unable to overlook, despite their belief in balance and adherence to impartiality. News bulletin reports reflected the headlines. Current affairs programmes picked up on the themes. That’s how media narratives are constructed.”
The election campaign strategies of the two leaders couldn’t have been more different. May’s lacklustre performance, overseen by Lynton Crosby’s single-issue Brexit strategy, was engineered to avoid public and media scrutiny, whereas Corbyn’s campaign was marked by a willingness to engage with the public. While Corbyn was open, transparent and accountable, May was robotic, secretive and aloof.
While May came across as cold, calculating and lacking in human empathy, Corbyn came across as being totally at ease with the public, smiling and relaxed in their company. Corbyn openly espoused his philosophy and numerous policy initiatives, many of them significant. May, by contrast, appeared to have no policies to discuss and came across as instinctively autocratic and awkward.
Whereas Corbyn’s campaigning was marked by spontaneity and a willingness to answer previously unseen questions in public meetings and press conferences, May’s series of highly evasive, stage-managed PR stunts was exemplified by a reliance on focus groups and a carefully selected media who put to her questions pre-vetted by the Tory Party.
The Tories’ attempts to prevent the media from asking May any probing questions were highlighted by Channel 4 News journalist Michael Crick, who admitted to being shocked that “reporters collaborate with May’s press team by agreeing to reveal their questions to them in advance.”
The BBC’s Eleanor Garnier, on the other hand, was clearly of the opinion that May was not subject to this kind of overt media censorship. Garnier tweeted: “I didn’t discuss question or topic of question with May’s team. If I was ever asked to give my question there is no way I would. Ever.”
Whatever is being taught on journalism courses these days, the work of Chomsky and Herman is clearly not on the syllabus. My advice to Garnier is to spend 30 minutes watching Chomsky’s demolition of Andrew Marr before taking on her next journalistic assignment.
That Garnier, as a BBC journalist, failed to recognise that access is granted in exchange for the absence of difficult or challenging questions, a dynamic indicative of how the media works, is frankly staggering.
What is equally staggering is the fact that this lack of access, and the closing of journalistic ranks with the government’s complicity, is not seen as an outrageous attack on civil liberties, democratic accountability and press freedom.
In Britain in 2017, arguably for the first time, the public faced a situation in which they were denied the information needed to make informed choices ahead of a General Election. Craig Murray succinctly expressed his outrage at the time:
“The idea that the head of the government both gets to choose what they have asked, and gets advance warning of every question so they can look sharp with their answer, is totally antithetical to every notion of democratic accountability. If we had anything approaching a genuine free media, there would be absolute outrage. All genuine media organisations would react by boycotting such events and simply refusing to cover them at all.”
It should be remembered that Theresa May, like Rishi Sunak, was not elected as PM. This was a period that mainstream political historians and journalists ought to reflect on with some degree of humility. The obtuseness, obfuscations, evasiveness and total disregard for democracy and public accountability often associated with Boris Johnson didn’t begin with him. Rather, Britain’s descent into authoritarianism began under the Cameron administration but mushroomed under May’s leadership.
If the UK media at that time had reported the British political and media system with honesty, then they would have acknowledged similarities to North Korea. What has happened in Britain from the Cameron, and particularly the May years, is that basic democratic norms have been trampled on.
Unfortunately, those who believe the situation will change for the better under a future Labour government led by Keir Starmer are sadly mistaken. It is not widely known that the Labour leader and establishment stooge, Sir Keir, is a member of the Trilateral Commission, an organisation that thinks the problems of governance “stem from an excess of democracy.”
As Britain’s descent into authoritarianism continues apace, hardly anybody, either within the political and media establishments or among the wider public more generally, appears to have batted an eyelid at the prospect.
There appears to be a serious dereliction of duty on the part of the panoply of economic analysts and commentators within the legacy media to discuss the limitations of economic neoliberalism. On the contrary, these commentators and analysts regard the existing growth model as a panacea rather than the death knell for society and the environment it undoubtedly is.
Economic ‘experts’ who extol the virtues of the prevailing orthodoxy are discussed by the media commentariat in reverential tones, and the discipline is viewed as if it were an exact science. Mainstream economists, chancellors of the exchequer, prime ministers, heads of the Bank of England and other ‘pillars of the establishment’ are widely regarded in this light.
What all these ‘experts’ agree on is their belief in the deluded notion that sustained economic growth is emblematic of societal progress. Very rarely are the premises upon which these ‘experts’ promote neoliberal economics challenged by commentators.
The UK’s current chancellor, Jeremy Hunt, is part of a political establishment that continues to perpetuate the myth that the neoliberal economic growth model is the best way to curtail the threat of further economic crisis, as opposed to recognizing that it is the major cause. Consequently, Hunt will continue to systematically push for policies that fly in the face of all available evidence.
The problems are as much to do with ideology and dogma as they are to do with incompetence. Rather than the global financial crisis of 2008 acting as a wake-up call, Hunt and the likes of Kwarteng, Zahawi, Hammond and Osborne who preceded him continue with the same poisonous model until the next crisis comes along, by which time they will continue with it until the one after that. And so it goes on. This is the economics of the madhouse.
Radical visions – development not growth
What is required is a radical alternative vision for society – a break from the concept by which everything has become a commodity to be bought and sold for profit. But who, other than a handful of creative thinkers in the academic sphere, are proposing alternative, imaginative visions?
One of the most ambitious ideas I’ve come across is that postulated by Pat Devine, whose thesis is closely aligned to that of the Chilean economist, Manfred Max-Neef. While recognizing the importance, geographically, of bringing production closer to consumption, Max-Neef argues that the root of the existing problem stems from how establishment economists perceive their academic discipline as being above, and separate from, nature and the biosphere.
According to Max-Neef, mainstream economists are ignorant about ecosystems, thermodynamics and biodiversity and regard nature as a subsystem of the economy.
Max-Neef argues that economics needs to be taught in a different way, based on five postulates and one fundamental value principle:
1) The economy is to serve the people and not the people to serve the economy.
2) Development is about people and not about objects.
3) Growth is not the same as development, and development does not necessarily require growth.
4) No economy is possible in the absence of ecosystem services.
5) The economy is a subsystem of a larger finite system, the biosphere, hence permanent growth is impossible.
The fundamental value to sustain a new economy, says Max-Neef, should be that no economic interest, under no circumstance, can be above the reverence of life.
For far too long, humanity and the natural world have been subordinated to the imperatives of an economic growth paradigm that’s perceived by mainstream economists and politicians as separate and distinct from them.
What Max-Neef is saying in the first point above is that the dialectical relationship between economy and people has to be restored in order for society and nature to function properly.
The distinction Max-Neef makes between growth and development in point three, is particularly significant. As the economist from Berkeley points out:
“Growth is a quantitative accumulation. Development is the liberation of creative possibilities. Every living system in nature grows up to a certain point and stops growing. You are not growing anymore, nor he nor me. But we continue developing ourselves… So development has no limits. Growth has limits. And that is a very big thing that economists and politicians don’t understand. They are obsessed with the fetish of economic growth.”
This fetishization of economic growth is arguably explained, in part, by the fact that the monetary offshoots that accrue as a consequence of this growth have, since the onset of ‘trickle-down’ neoliberalism, increasingly ‘gushed upwards’ towards the top of the socioeconomic pyramid.
Statistics indicate, for instance, that economic output (GDP) in the UK, adjusted for inflation, doubled during the peak of neoliberalism, from £687bn in 1979 to £1,502bn in 2011. However, over the same period, income inequality, as measured by the Gini coefficient, increased from 0.25 to 0.34.
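The arithmetic behind these two cited figures can be checked directly (a quick sketch using only the numbers quoted in the text; the underlying data are not independently verified here):

```python
# Figures as cited in the text (UK, inflation-adjusted GDP and Gini coefficient).
gdp_1979 = 687    # £bn
gdp_2011 = 1502   # £bn
gini_1979 = 0.25
gini_2011 = 0.34

# Output slightly more than doubled over the period...
growth_multiple = gdp_2011 / gdp_1979                 # ~2.19x

# ...while measured income inequality rose by over a third.
gini_increase = (gini_2011 - gini_1979) / gini_1979   # 0.36, i.e. +36%

print(f"Output x{growth_multiple:.2f}; Gini up {gini_increase:.0%}")
```

In other words, the quoted numbers imply output rose roughly 2.2-fold while the Gini coefficient climbed 36 per cent, which is the disjunction between growth and distribution the passage describes.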
In other words, during the peak era of neoliberalism, the working people who created the sustained increase in wealth in society have seen their slice of the pie reduced. Max-Neef understands that the ruling class obsession with the fetish of economic growth is underscored by the fact that this is the class that disproportionately benefits most from it.
The threshold hypothesis
One of Max-Neef’s later contributions was the famous threshold hypothesis. This states that in every society there is a period in which economic growth brings about an improvement in quality of life. But only up to a point – the threshold point – beyond which, if there is more growth, quality of life begins to decline.
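The shape of the hypothesis can be sketched as a toy inverted-U curve. This is purely illustrative: Max-Neef proposed no specific functional form, and the quadratic below is an assumption chosen only to make the threshold visible.

```python
def quality_of_life(growth, threshold=50.0):
    """Hypothetical inverted-U: quality of life rises with growth up to
    `threshold`, peaks there, then declines. Normalised so the peak is 1.0.
    (Illustrative only; not Max-Neef's own model.)"""
    return growth * (2 * threshold - growth) / threshold**2

# Below the threshold, more growth improves quality of life...
assert quality_of_life(25) < quality_of_life(50)
# ...beyond it, further growth makes things worse.
assert quality_of_life(75) < quality_of_life(50)
```

The point of the sketch is simply that "more growth" and "better quality of life" coincide only on the left-hand side of the curve; past the threshold the relationship inverts.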
According to Max-Neef, the U.S., which he terms an “undeveloping nation”, is already past that point, with the UK not far behind. The logic of diminishing returns applies to other parts of the system and eventually results in net costs over the long term.
These costs are quantified not only in strict monetary terms, but also involve human capital – something which the economic-growth fetishists rarely factor into their cost-benefit calculations.
The graph below, highlighting the impact of immigration on UK debt, is an example of how the mainstream economists of the OBR have failed to take into account Max-Neef’s threshold hypothesis:
It would appear that the OBR is suggesting the existence of a causal link between the reduction in government debt and the notion that immigration is a net economic benefit.
However, the OBR analysis doesn’t take into account the uneven distribution of wealth, which negates the benefits accrued. It also omits other indicators such as reduced quality of life resulting from, for example, a lack of school places or other pressures on public services that mass immigration potentially brings.
It’s the apparent inability of politicians to view the economic growth paradigm as destructive that opens up spaces for alternative narratives of the likes of Max-Neef to fill.
Max-Neef won the Right Livelihood Award in 1983, two years after the publication of his book, Outside Looking In: Experiences in Barefoot Economics. The ‘barefoot’ metaphor was inspired by the ten years he spent working in extreme poverty in the sierras, jungles and urban areas of different parts of Latin America.
It was during this period that the economist from Berkeley began to view his profession in a different light. What subsequently happened was to change his life for ever:
“I was one day in an Indian village in the Sierra in Peru”, recalls Max-Neef. “It was an ugly day. It had been raining all the time. And I was standing in the slum. And across me, another guy also standing in the mud…This was a short guy, thin, hungry, jobless, five kids, a wife and a grandmother. And I was the fine economist from Berkeley. We looked at each other, and then suddenly I realized that I had nothing coherent to say to that man in those circumstances, that my whole language as an economist was absolutely useless.”
“Should I tell him that he should be happy because the GDP had grown five percent or something? Everything was absurd. I discovered that I had no language in that environment and that we had to invent a new language. And that’s the origin of the metaphor of barefoot economics, which concretely means that is the economics that an economist who dares to step into the mud must practice.”
Max-Neef argues that economists are divorced from the kind of poverty that’s central to their theories:
“The point is, economists study and analyze poverty in their nice offices, have all the statistics, make all the models, and are convinced that they know everything that you can know about poverty. But they don’t understand poverty. And that’s the big problem. And that’s why poverty is still there. And that changed my life as an economist completely. I invented a language that is coherent with those situations and conditions.”
The ‘language’ Max-Neef alludes to relates to how human beings in developed countries have lost the capacity to understand. Despite our ability to accumulate knowledge, this capacity, in the absence of empathy, love and understanding is, according to Max-Neef, insufficient:
“You can only attempt to understand that of which you become a part. If we fall in love, as the Latin song says, we are much more than two. When you belong, you understand. When you’re separated, you can accumulate knowledge. And that is the function of science. Now, science is divided into parts, but understanding is holistic.”
For Max-Neef, poverty can only be understood by economists who live among people who are poor. Only then can they understand that in such an environment there exists a different set of values and principles, alien to the world of academia, that cannot be learned or understood there.
“What I have learned from the poor is much more than I learned in the universities. The first thing you learn…is you cannot be an idiot if you want to survive. Every minute, you have to be thinking, what next? What do I know? What trick can I do here? What’s this and that? And so, your creativity is constant. But very few people have that experience. They look at it from the outside, instead of living it from the inside”, says Max-Neef.
“In addition, you have networks of cooperation, mutual aid and all sorts of extraordinary things which you’ll no longer find in our dominant, individualistic, greedy and egotistical society. It’s the opposite of what you find there. And it’s sometimes so shocking that you may find people much happier in poverty than what you would find in your own environment. This also means that poverty is not just a question of money. It’s a much more complex thing.”
What underlies Max-Neef’s message, perhaps more than anything else, is that mainstream economists in the ‘developed’ world see themselves as sophisticated, educated and cultured, all while building walls and pushing the poor of the ‘developing’ world to the margins.
Ultimately, mainstream economists fail to acknowledge that the inherent contradiction of the neoliberal economic paradigm is such that it’s undermining the very foundations upon which ‘progress’ can be sustained in the long-term.
Groups of young parents huddle in a hallway, making plans. Old men nap on couches, waiting for dessert. It’s the extended family in all its tangled, loving, exhausting glory.
This particular family is the one depicted in Barry Levinson’s 1990 film, Avalon, based on the director’s own childhood in Baltimore. But it’s also one that could equally be played out in any post-war town or city in a smog-drenched Keynesian Britain.
Battle-fatigued young men who had fought for God and country were in no mood for the platitudes of politicians who had sent many of their comrades to battlefields where the stench of death still lingered. Hitler’s fascism, although defeated, had left indelible scars on the faces of generations of young men ravaged by its consequences.
The fraternity among men who returned to Blighty from war in the hope that society back home could be transformed for the better, was a hope not lost on the ruling class. The establishment were to suffer the torment of a disillusioned nation tired of being used as cannon fodder for the interests of Whitehall pen-pushers.
‘No more war’ was the cry of the huddled, restless masses. Churchill had underestimated the level of the nation’s discontent. The epithet ‘war hero’ would have to wait for the writings of historians as the war statesman was rejected at the polls by the very people who had put him into power seven months after Germany’s fascists invaded Poland.
By 1945 it was widely understood by the ruling class that the amelioration of the antagonistic relations between capital and labour was a necessary price to pay to stave off revolutionary levels of proletarian discontent. The working classes, who had fought a war ostensibly for the vanity of others and for whom Churchill was already a historical footnote, were now in a position to force the hand of their rulers.
The Red Army paid the heaviest price for fascism’s defeat. Russian revolutionary sentiment still lingered in the air on the streets of London, Birmingham and Belfast. The British working class hadn’t forgotten the toll the war had taken on the Soviet people, nor had generations before them forgotten the gains of the Bolshevik revolution. British proles were in no mood to genuflect at the feet of their ‘betters’.
Perhaps somewhat ironically, it was the poetic realism of the film Brief Encounter that managed to unify a nation distraught by war. David Lean’s 1945 masterpiece about a middle-class woman’s imagined confession of an extramarital love affair was an allegory for a society in a state of flux, hamstrung by conformity.
The key message of the film was that the ‘free time’ available to the film’s protagonists, Laura and Alec – their chance encounters at Boots chemist, the Palladium cinema and a railway station refreshment room – represented newly formed spaces in the public imagination that only an immediate post-war world borne out of servitude was capable of filling. Underlying the poetic sense of realism was a quest for balance and harmony in an otherwise fractured world.
In post-war Britain, nothing less than fair play was acceptable to the masses. Brits weren’t demanding the best cut steak, caviar or smoked salmon, but neither were they content with the breadcrumbs from their rulers’ banqueting table. People needed and demanded homes. Not dilapidated, damp-ridden, rat-infested hovels, but ‘homes fit for heroes’. And they got them, three hundred thousand of them, year on year.
But widespread access to post-war housing and the consumerism that followed in its wake, marked the beginning of a societal shift. Family bonds, once close and extended, began to split apart as the desire for more convenience, privacy and mobility represented a shrinking of time and space within the newly booming capitalism.
The timepiece, popularized at least a generation before, was now a means by which the ruling class would attempt to remind British workers that showing up at the factory gates on time was the necessary price to be paid in order to enjoy the exuberant excesses experienced by their counterparts across the Atlantic.
“You’ve never had it so good” was the cry of the Westminster hordes. The young had time on their hands and money to spend. Full employment and ‘jobs for life’ weren’t merely the mantras of dinner-partying socialists, but realities for millions of working class people. It wouldn’t be long before the public’s appetite for the convenience and increasing consumption that money and time implied would feed into the realm of leisure.
The young, bedraggled by war, became increasingly hungry for entertainment. They found it in restaurants, on football terraces, in shopping malls and in movie theatres. American consumerism in all its unbridled excess was depicted in the latter, and Brits demanded some of the action.
Food, entertainment and their corollary, leisure time, perfectly encapsulated the rapid pace of change that was beginning to take hold amid the rubble of bombed neighbourhoods. These shifts in society gave rise to new antagonisms across dinner tables the length and breadth of the country as family loyalties began to be questioned.
The violation of well-established protocols was not only seen as a sign of disrespect but more broadly as a metaphor for the beginning of the collapse of the entire family structure and the speeding-up of the pace of life.
As the 1960s beckoned, the extended family began to play an ever-diminishing role. In his film Avalon, Barry Levinson depicts its total destruction. What we witness is a young father and mother and their son and daughter eating turkey off trays in front of the television.
In the final scene, the main character is living alone in a nursing home, wondering what happened. “In the end, you spend everything you’ve ever saved, sell everything you’ve ever owned, just to exist in a place like this.”
The scene Levinson depicts is reminiscent of the famous parable about materialism and contrasting values, in which an American investment banker sits on the pier of a small coastal Mexican village. After meeting a local fisherman, the banker begins to reflect on the meaning of his life and to question the notions of success and the capitalist values to which he had become accustomed. The American comes to understand that the simpler, slow-paced life experienced by the fisherman is far more enriching and fulfilling than his own urban ‘rat race’ life as a banker.
The author, Milan Kundera, said:
“To sit with a dog on a hillside on a glorious afternoon is to be back in Eden, where doing nothing was not boring – it was peace.”
Kundera’s evocation of being at peace with life is like a dream in time and space. It’s akin to the days when people told family stories, of the stable, centralized family and the dense cluster of many siblings and extended kin. It’s like the phantasmagorical experience of watching a colourized film clip of an early twentieth-century Paris street scene and the sense of stepping into that world.
The serenity of time and place thus imbued is intensified by the imagination, prolonged by a multitude of echoes. But imagination is also an expression of pain for that which has been lost and can never be recaptured. Today, such visions of historical memory are routinely mocked as a nostalgia that, like rust, corrodes all it touches. The truth is, the world of reality has its limits; the world of imagination is boundless. It will take you everywhere.