Improving service delivery in government

PUTTING CITIZENS FIRST

Imagine if your local area had only one supermarket, one jeweller, and one bank. Without competition, the supermarket would have no incentive to lower prices, the jeweller would have no reason to offer a broad range, and the bank would have no motivation to provide outstanding service.

Competition is about price, selection, and service. It benefits consumers by keeping prices low, delivering product variety, and raising service standards. Robust competition also compels businesses to fight for customers by meeting needs, solving problems, and adding value.

In a free market economy, it is the consumer who decides which products and services succeed and which ones fail. We are all free to choose where to shop and what to buy and – if we are not satisfied with a service provider – we can vote with our feet and take our business elsewhere.

Unfortunately, choice does not exist when it comes to government departments and agencies. Governments are monopoly providers of their services, which means that they have no direct competitors. Nor do they need to worry about disruptive start-ups stealing their customers.

This is ironic, as competition is acknowledged by governments the world over as the best available tool for promoting consumer well-being. Competition keeps businesses on their toes and pushes them to be the best; however, no such force is exerted on governments.

We live in a world where people expect fast and efficient service – it doesn’t matter whether it’s a shop, restaurant, or government agency. Yet agencies have fallen behind the customer experience curve and need to move from a government-centric to a citizen-centric service model.

Unsurprisingly, governments at all levels claim that they are customer centric, even though many citizens would argue otherwise. This is certainly the case when it comes to online services and the inability of many government agencies and departments to deliver a seamless digital customer experience.

Today, we can use online channels to order our groceries, pay our bills, and even consult our doctor. All of these interactions with the private sector are frictionless offerings and none require us to leave home. Businesses have worked hard to be in sync with customers’ digital expectations.

In contrast, the public (government) sector lags behind leading companies in offering citizens what they want and need online. People are still required to deal with cumbersome forms and manual processes because government requirements are designed to fit agency silos, not citizen needs.

A personal example will help here. During the pandemic in early 2021, I uploaded some proof-of-identity documents (including a passport) to both a NSW Government agency and a Federal Government agency in support of my application to be appointed a volunteer ethics teacher at my local school.

To my delight, the Federal Government’s application process could be completed entirely online. In contrast, the NSW Government’s process ended by requesting that I visit a Service NSW centre to present for inspection the originals of the documents that I had just submitted online.

So, a few days later, I drove to my nearest Service NSW location and produced the relevant documents. It took mere seconds for the customer service officer (CSO) to verify each identity document. Yet, for me, the process – leaving home, signing in, queueing, and returning home again – took over an hour.

When I asked the CSO why the state government required me to present the original identity documents over the counter while the federal government had no such requirement, I was unhelpfully told “because it’s policy”. This lack of consistency between screening agencies is frustrating but not uncommon.

In Australia and around the world, each level of government (federal, state, and local) marches to the beat of its own drum, resulting in a lack of integration and coordination between agencies. This obstructs the synchronised delivery of frontline services.

My wife, Beverley, experienced this silo mentality when she asked the same CSO (whom I mentioned above) to arrange a copy of our marriage certificate in support of her application for a NSW photo identification card. He responded by informing her that she would need to apply to the NSW Registry of Births, Deaths, and Marriages for the certificate.

So, Beverley went home and applied online for the marriage certificate, which was posted to her around four weeks later. I then drove my wife back to Service NSW so that she could furnish them with the marriage certificate she had just received in the post from the NSW Registry of Births, Deaths, and Marriages.

That Service NSW is not a one-stop access point for all state government services and transactions is another example of governments lagging behind private enterprise. Because agencies work within functional silos rather than across them, citizens do not enjoy streamlined services.

Governments and their agencies urgently need to lift their game in mapping the complete end-to-end customer journey. When dealing with the public sector, such journeys can be long, stretching across multiple agencies, channels, and touchpoints.

To simplify customer journeys, governments must harness data and technology to become more efficient and effective. Furthermore, governments need to develop a single view of the citizen-customer and put in place a safe and secure public sector data sharing ecosystem.

A further drawback with government monopolies concerns innovation. The free market is an innovation machine which has delivered untold benefits to humanity. These mind-blowing advances have revolutionised the way we live and work and provided us with access to goods from around the world.

Meanwhile, not much has changed in many parts of the public sector. In a world where same-day delivery of goods and same-day approval of loans are commonplace, it is unacceptable that you have to wait up to four weeks to obtain a copy of a simple marriage certificate.

While competition forces private enterprises to continually innovate to gain and retain customers and achieve faster delivery times, the lack of competition in the provision of, say, marriage certificates gives governments no incentive to innovate and achieve quicker turnaround times.

Governments everywhere worry about the misuse of monopoly powers and the gobbling up of competitors. That’s why regulators, in all markets, like to see competition as this gives consumers choice. Having one dominant player is frowned upon, yet governments occupy this privileged position.

According to one Australian academic, state-owned enterprises do not act in the interest of the general public. Rather, they respond to their own interests and the interests of the government that owns them. As adjuncts of the state, government agencies largely operate free from normal competitive pressures.

■      ■      ■

I acknowledge that some government monopolies – such as critical services like public transport and public healthcare – need to exist. These services must be available to everyone and not subject to market forces or the ability to pay. Regardless, no government monopoly should provide substandard service.

The absence of vigorous rivalry and the existence of barriers to entry should not be used by governments and their agencies as an excuse to deliver service standards below those available in the private sector. Just because a citizen cannot switch to an alternative source of supply does not mean that a particular government service should be exempt from continuous improvement.

An online article published by consulting firm Deloitte – which highlighted the dissatisfaction of American citizens with the service provided by the US Federal Government – should resonate with Australian governments. To improve satisfaction, the article recommended that:

… what agencies need is a transformative breakthrough that delivers radical government customer service improvement. Agencies need to rethink their approach to customers, or even who their customers are. Agencies may need to reorganize to better deliver a leading customer experience and respond to customer needs and think beyond their walls to consider the range of stakeholders that can help reimagine the customer experience.

Around the world, there are stark differences between people’s service experiences as customers of private enterprises and their service experiences as customers of government agencies. It is incumbent on all governments to close this gap in service and to meet the rising expectations of citizens. Embedding a citizen-first mindset in the public sector would be a good first step.

Consumers win when businesses compete.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Why we need slow journalism in a fast world

Illustration: Golden Cosmos/The New Yorker
NEWS DELIVERED DIFFERENTLY

There’s always a lot going on in the world, which is why we are bombarded with news updates 24/7. As media readers, viewers, and listeners, we want to be informed about events and threats as soon as possible. Consequently, speed is a central part of contemporary journalism and has given rise to “fast journalism” and the thirst to be first with the latest news. But speed has morphed into a disease that has degraded the quality of journalism.

Accuracy is fundamental to good journalism, yet increasingly, it’s being sacrificed for expediency. This particularly applies to early reports of breaking news with unfolding minute-by-minute developments. To help media networks maintain rolling coverage in times of crisis, talking heads masquerading as experts are wheeled out and they invariably get essential facts wrong. One man’s expert is another man’s fool!

Flimsy or unsubstantiated reporting used as a basis for content runs counter to the long-held journalistic notions of truth-seeking and serving the public interest. News is meant to be a public good; however, as news cycles move ever faster in our digital age, the pressure to release stories more quickly often results in reporters publishing only basic information with little in-depth analysis.

In all walks of life, slow and right beats fast and wrong and the same holds true for journalism and news reporting. There is a growing slow news movement which operates outside the 24/7 news trap. Proponents of slow journalism take the time to get things right and turn out quality journalism, which is characterised by accuracy, depth, context, analysis, and expert opinion.

Delayed Gratification is the name of the world’s first slow journalism magazine. It promotes itself as “a quarterly publication which revisits the events of the last three months to offer in-depth, independent journalism in an increasingly frantic world”. The magazine swims against the tide in taking a stand against kneejerk reporting by providing slower but better news.

Commensurate with its tagline, “Last to breaking news,” Delayed Gratification employs the benefit of hindsight to report on events after the dust has settled and the news agenda has moved on. This enables the magazine to get to the heart of a story by soberly reflecting on what’s happened and then presenting a long-form story in its proper context.

Another slow news organisation is Tortoise Media. It was launched in 2019 by British journalist and former Director of BBC News, James Harding. He believes that “too many newsrooms chase the news but miss the story” and that we need to “slow down and wise up”. The Tortoise website states:

We don’t do breaking news, but what’s driving the news. We don’t cover every story, but reveal a few. We take the time to see the fuller picture, to make sense of the forces shaping our future, to investigate what’s unseen.

Someone else who believes that journalism needs reinvention is US journalism scholar, Professor Jennifer Rauch. In her book, Slow Media: Why “Slow” is Satisfying, Sustainable, and Smart, she makes the case for rethinking the way media is produced and consumed. Rauch is a fan of slow journalism as it requires reporters to spend weeks analysing the accuracy and perspectives of initial reports before publishing their own stories.

In an interview following the release of her book, Rauch stated that many journalists aren’t getting to do the kind of in-depth, thoughtful, accurate reporting that drew them to the profession in the first place. She went on to opine that:

Much news coverage is incomplete at best, inaccurate or misleading at worst. Too many news stories are dependent on press releases or official sources. You hear a lot about people getting news fatigue and avoiding the news because it makes them feel anxious.

Long before Rauch published her book in 2018, another academic – Professor Nassim Nicholas Taleb – called out the deficiencies of modern news reporting. In his 2007 global bestseller, The Black Swan: The Impact of the Highly Improbable, Taleb urged his readers “not to read newspapers, or follow the news in any way or form”.

The Black Swan illuminated the severe limitations of our thinking and the fragility of our knowledge*. Taleb was one of the first people to recognise news consumption as a serious problem. “The more information we absorb,” he cautioned, “the more difficult it becomes to discern the relevant from the irrelevant”. Taleb views journalism as “pure entertainment, not the search for the truth” and warned:

Remember that we are swayed by the sensational. Listening to the news on the radio every hour is far worse for you than reading a weekly magazine, because the longer interval allows information to be filtered a bit.

It is alleged that Mark Twain once asserted: “If you don’t read the newspaper, you are uninformed; if you do read the newspaper, you are misinformed”. Sadly, Twain’s quip is more relevant today than in his day and many would agree with his claim.

It’s my contention that to achieve one of journalism’s prime purposes – the creation of an informed citizenry – we should adopt the classic strategy of slow and steady wins the race by embracing slow journalism.

Slow journalism is seen as the antidote to social media’s need for speed. “If social media is a fast-food boxed meal”, says AdNews Australia, “then slow journalism is a high-quality degustation. In a world of same-day delivery, instant noodles and push notifications … it’s hardly surprising that there are some people who just want things to move a little more slowly”.

One media outlet which has embraced slow journalism is the Australian Broadcasting Corporation (ABC). In 2018, the ABC launched a slow journalism initiative called the Remote Communities Project (RCP). The RCP created stories that provided audiences with an insight into life outside of metropolitan cities.

Reporters were able to work without the normal time constraints associated with fast journalism. They benefited greatly from spending up to a fortnight in the bush learning first-hand about the issues and experiences of people living in isolated areas of Australia. Free from the tyranny of the 24-hour news cycle, reporters were able to uncover “the untold stories”.

■      ■      ■

Since my retirement from full-time work in mid-2019, I have naturally gravitated toward longer-form, slow journalism and feel much better informed. The journals and magazines that I read are devoid of sensational and superficial storytelling but replete with investigative and factual journalism. The well-rounded and balanced articles which are delivered direct to my inbox are a joy to read.

While fast media feeds us small bites of trivial matter, slow media delivers big chunks of meaningful matter with the latter content designed for those who take their news seriously and think deeply about the issues behind it.

Elephant in the Room is an example of slow media as each post is written after a period of reflection and research. This blog is a place for reasoned argument supported by corroborating evidence. Its contentions aim to be intellectually compelling without being academic, giving you a clear understanding of the forces shaping our world.

■      ■      ■

We know from Aesop’s classic fable, The Hare and the Tortoise, that being the fastest does not guarantee victory. When this lesson is applied to journalism, it’s clear that filling the news abyss is not a sprint but a marathon. We need to recalibrate our relationship with the media and rethink how we consume information. The days of rocket-fast news ruling the media roost are being challenged by those who believe that we need to hasten slowly.

In the race of life, the tortoise ultimately beats the hare.

* Black Swans are extremely rare and unpredictable events that have massive impacts on society. These include positive Black Swans, like the phenomenal rise of Google, as well as negative Black Swans, such as the devastating 9/11 terrorist attacks.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Why innovation and continuous improvement are different

Image credit: thinkinmeta.wordpress.com
DON’T CONFUSE NEW WITH OLD

Thomas Edison was a true innovator. During his lifetime, he amassed 1,093 US patents – a tally that long stood as the record for an American inventor. Edison was the driving force behind new technology with innovations such as the phonograph, the automatic telegraph, the electric generator, the movie camera, and, of course, the incandescent light bulb.

Like his good friend Henry Ford, Edison had an uncanny knack for recognising a consumer need and then creating a product to satisfy that need. This customer focus gave birth to the light bulb, which obviously was not invented by continuously improving the candle. Rather, Edison stood back, assessed the human need, and then invented something completely new.

For something to be classified as an innovation, I believe that it must be brand new. To state the blindingly obvious, this means that it must not have previously existed. Continuous improvement, on the other hand, deals with things that currently exist and strives to make them better. You can forever refine (improve) a candle; however, it will never become a light bulb. Innovation is not the same as improvement.

To my surprise, this difference is often lost on the media and business leaders. Companies typically trumpet a product or service as a new innovation when, in fact, it’s simply an improvement on an existing product or service. To be clear, continuous improvement is vital to the success of an organisation and is to be encouraged. Nonetheless, it should not be labelled as innovation.

Confusion about innovation and improvement extends to the academic world. Some years ago, a (then) colleague was completing his Doctor of Business Administration degree. After reading his thesis on innovation, I confidently told him that he would be awarded a doctorate for his scholarly inquiry. However, I also informed my colleague that I fundamentally disagreed with his definition of innovation as it equated to continuous improvement.

During our ensuing (and friendly) debate, he asserted that a manufacturer of blue pens could claim to be innovative if it decided to also produce red pens. I rebutted by pointing out that if the only thing that changed was the colour of the ink, producing a red pen could not possibly be classified as breakthrough innovation.

I then went one step further and explained that introducing a red pen was an example of product line extension, not innovation. Product line extension refers to the expansion of an existing product line such as a soft drink manufacturer introducing a diet variety to its cola line, or a toy manufacturer introducing new characters in its line of action figures.

Businesses are always being admonished to innovate or perish. While I accept that innovation is an essential ingredient of business success, most innovation cannot be classified as ground-breaking discovery. Rather, it is the product of incremental change. McDonald’s did not invent takeaway food but perfected the process for delivering fast food through franchisees.

Similarly, the car that I drive today is loaded with far more bells and whistles than the first car I owned over 40 years ago, yet they are fundamentally the same. Both have four doors, four wheels, a dashboard, and an engine. Only Henry Ford (Edison’s kindred spirit) can lay claim to inventing the first mass-produced car. Today’s cars are not brand-new inventions but the result of continuous, small-step improvements.

Even in the industry in which I worked for over four decades – financial services – I cannot think of any radical, game-changing products in recent years. Most innovation in banking lies in process and organisational change rather than in breakthrough new products.

Case in point: 40 years ago, it took over a week for a bank to approve a home loan because the approval process was centralised in head office. Over time, banks decentralised decision-making and empowered area offices to approve loans, cutting the approval time to about two days. Today, thanks to even more streamlined approval processes, some credit providers approve loans within two hours.

Experience has taught me that for an innovation to be truly successful, it must solve some human problem or need. Yet many technological solutions are developed for problems which do not exist. That’s why I believe we need to reframe the way we think about innovation. In essence, innovation is about problem solving. Smart innovators don’t look for a clever idea but for a pragmatic problem.

The best ideas come in response to making people’s lives easier and better – an approach that served Edison well. Still, many organisations make the mistake of developing products and services without reference to customers. This results in organisations supplying products for which there is no demand. Google Glass is an example of this – it was a solution in search of a problem.

When it comes to digital technologies, the modus operandi must be people first and technology second. In the digital age, the customer is paramount, and technology for technology’s sake is a recipe for disaster. Prioritising innovation above solutions is rarely effective, yet it happens with monotonous regularity. Successful innovation lies not in the technology itself but in the actual use of the technology.

Technology must help people solve the challenges they face. To illustrate, recent years have seen an explosion of “FinTech” companies – an umbrella term for businesses that apply technology to the delivery of financial services. As crazy as this sounds, FinTech is not about technology. Rather, it’s about finding new ways to solve old problems. The successful FinTech companies are giving customers greater control over how they spend, move, and manage their money.

Every business leader knows that innovation is important. Even so, there is a lack of agreement on what actually constitutes innovation as there is no universally accepted definition. Until we have common standards covering innovation, anyone will be able to claim that almost any development is innovative.

The well of human ingenuity may be bottomless, but it produces few genuine Eureka moments.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Should you always follow the orders of a superior?

Image credit: factmyth.com
THE PERILS OF OBEDIENCE

It remains the most heinous crime in modern history. The Nazi genocide of six million European Jews during World War II was unsparingly barbaric and unfathomably evil. The state-sponsored mass murders were not carried out by Hitler himself, but by his trusted lieutenants. One of these henchmen was the notorious Adolf Eichmann.

Eichmann was an SS Lieutenant Colonel and a key figure in the “Final Solution” – the Nazi plan to systematically exterminate the Jews of Europe. Eichmann was in charge of identifying, assembling, and transporting Jews to extermination camps in German-occupied Poland, including Auschwitz. As the architect of the Holocaust, Eichmann was effectively the chief executioner.

Eichmann evaded justice until 1960, when Israeli agents captured him in Argentina and brought him to Jerusalem to stand trial for managing the mass deportation of Jews to killing centres. His defence – like that of the Nazi leaders tried at Nuremberg after World War II – was based on obedience. Eichmann claimed that he was just following orders from his Third Reich superiors, portraying himself as an unwilling accomplice who was “forced to serve as a mere instrument” of the Nazi war machine.

Eichmann’s attempt to shift responsibility for the deaths of millions of Jews via this “obedience defence” helped spark the interest of Stanley Milgram, a Yale University psychologist, in the topic of obedience. In July 1961, at the same time that Eichmann was on trial, Milgram began a series of experiments focusing on the conflict between obedience to authority and personal conscience.

The fundamental aim of Milgram’s studies of obedience was to discover whether a person could be coerced into behaving heinously, like Eichmann. Specifically, Milgram tested whether “ordinary” folk would inflict harm on another person after following orders from an authority figure. Milgram dismayed the world when he revealed how easy it was to turn everyday people into torturers.

The volunteer participants recruited to the experiment were told that they were part of a study to investigate the effects of punishment on memory and learning ability. In reality, Milgram wanted to test how far his subjects (acting as “teachers”) would go in punishing other subjects (acting as “learners”) by administering potentially lethal electric shocks each time the learners made a mistake on a quiz*.

The teachers were instructed by a man in a white lab coat (the “experimenter” authority figure) to deliver a shock of increasing intensity to the learners after each incorrect answer. Unbeknown to the teachers, the learners were actors and the shocks were not real – Milgram had built a fake shock machine. The actors took the early shocks silently but screamed after each subsequent jolt, and the teachers believed that the shocks were genuine.

Milgram demonstrated that ordinary people could be induced to abandon their moral instincts by a malevolent authority. A high proportion of the teachers delivered the maximum electric shock – they felt disconnected from their actions when complying with orders. The shocking conclusion (pun intended) from the Milgram experiment was that people obeying commands feel less responsible for their actions even though they are the ones committing the act.

Today, the Milgram experiment is criticised on both ethical and scientific grounds. Still, many psychologists argue that, even with its ethical lapses and methodological holes, the basic finding of Milgram’s work holds up. Indeed, it has been used to explain atrocities from the Holocaust to the Vietnam War’s My Lai massacre to the abuse of prisoners at Abu Ghraib.

In August 1971, social psychologist Philip Zimbardo expanded upon Milgram’s research into the psychology of obedience. Zimbardo, a former classmate of Milgram, wanted to further investigate the impact of situational variables on human behaviour by using a two-week simulation of the rigid and unequal power structure of prison environments.

Zimbardo set up a mock prison in the basement of the Stanford University psychology department. He assigned his participants to play the roles of either “inmates” or “guards”, with Zimbardo himself acting as the “prison warden”. The behaviour of all involved was so extreme that the study had to be discontinued after a mere six days.

Soon after the experiment began, the guards began using authoritarian techniques to gain the obedience of the inmates, whom they humiliated and psychologically abused. The powerless prisoners, in turn, became submissive and accepted the abuse with little protest.

Like Milgram’s obedience experiment, Zimbardo’s Stanford Prison Experiment has become infamous and controversial for breaking ethical guidelines relating to the treatment of volunteers. Even so, the experiment demonstrated the power of roles and the way ordinary people can turn cruel under the wrong circumstances.

Many, if not all, of the greatest human atrocities can be described as “crimes of obedience”, according to scholars Herbert C. Kelman and V. Lee Hamilton in their 1989 book, Crimes of Obedience. The authors’ initial impetus for researching and writing the book was provided by the trial of Lieutenant William Calley for crimes committed during the My Lai massacre, one of the most infamous atrocities of the Vietnam War.

During his court-martial, Calley used the Nuremberg defence – a “good soldier” simply “following orders” – to justify his premeditated murder of 22 villagers (infants, children, women, and old men), and the assault of a child of about two years of age. All the killings and the assault took place on 16 March 1968. Calley’s plea was that he dutifully followed his captain’s order to kill everyone in My Lai.

Many Americans saw Calley as a scapegoat in the chain of command, and during his trial he received more than 10,000 letters of support. Following his conviction, the White House was inundated with mail objecting to the verdict. He was sentenced to life in prison but, after just three years of lax house arrest on military bases, was released on parole in 1974 following the intervention of President Richard Nixon, who reduced his sentence.

Like the volunteers in the experiments conducted by Milgram and Zimbardo, Calley was, by most accounts, an ordinary and decent man. A TIME magazine reporter who got to know Calley said of him:

There was nothing about Rusty Calley, as he was called, that would make you say that he was an explosion waiting to happen. He didn’t have killer instincts. He didn’t love guns. None of that was the case. He was a young guy from South Florida who loved being around people and going to parties. He was fun to be around.

■      ■      ■

It’s clear that people can succumb to the demands of authority, however immoral the consequences. Indeed, people have a propensity to obey the most abhorrent of orders – and not just on the battlefield. Obedience to authority is ingrained in all of us from an early age. Throughout our lives, we encounter authority figures to whom we are answerable, and this plays out in key relationships such as parent and child, teacher and student, boss and employee, and priest and parishioner.

Humans have a strong predisposition to obey authority and this is driven by a tendency to please authority figures. But when legitimate authority is abused, blind obedience to inappropriate orders can lead to destructive outcomes. Our challenge is to stand up against arbitrary or unjust authority and not allow another person to define for us what is right or wrong.

We all have a moral compass which guides our decision making and this can help us resist the urge to comply with commands that are at odds with our value system. As noted by the American Psychological Association:

Acquiescence to the commands of an authority that are only mildly objectionable is often, as in Milgram’s experiments, the beginning of a step-by-step, escalating process of entrapment. The farther one moves along the continuum of increasingly destructive acts, the harder it is to extract oneself from the commanding authority’s grip, because to do so is to confront the fact that the earlier acts of compliance were wrong.

In the business world, employees can sometimes face unethical demands from employers. Some bosses fail to provide appropriate moral leadership by encouraging unethical practices in the workplace in the interests of, say, improving the bottom line. With regard to corporate scandals, some white-collar criminals trace their downfall to an excessive obedience to authority.

It’s ironic that some of the greatest crimes in history have been committed by those who followed the rules, not by those who broke them. Blindly following orders can be a recipe for disaster, particularly when they conflict with common decency. While a civilised society needs rules, regulations, and policies, they should not be inviolable. In all walks of life, people must be allowed to use common sense and good judgment.

We must think for ourselves and resist malevolent orders.

*In one variant of Milgram’s experiment, 65 per cent of “teachers” went all the way in administering shocks, starting from 15 volts (labelled “slight shock” on the machine) and progressing to the maximum 450 volts (“Danger: severe shock”).

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

How to change someone’s mind

Image source: Shutterstock
FACTS ALONE ARE NOT ENOUGH

You are about to debate an important issue with a colleague whose perspective differs from yours. You have developed a strong case to support your viewpoint and it’s backed up by hard, irrefutable data. Even though you’re confident that your argument is watertight, you fail miserably to sway your opponent to your way of thinking.

You have just learned an invaluable lesson – you can’t change a person’s beliefs with facts and figures alone. How you present your case is just as important. Research shows that evidence and logic by themselves don’t win arguments. The ability to persuade others to change their minds requires a mix of communication skills, empathy, and respect.

Our opinions are often based on emotion. Humans have an innate tendency to hold on to pre-existing beliefs and convictions as our brains are wired to ensure the integrity of our worldview. Consequently, we seek out information that confirms what we already know (confirmation bias) and dismiss facts that are contrary to our core beliefs (the backfire effect).

So, berating another because they don’t like our ideas, recommendations, or proposals is a recipe for disaster. If you want someone to see eye-to-eye with you, then – in the words of one writer – you need to remember that:

When persuading someone to change their mind on a major topic, what’s being said isn’t always quite as important as how it’s said. If a person feels attacked or disrespected or condescended to, they’ll turn off their brain and block out the most rational, correct arguments on principle alone. Homo sapiens are odd, emotional creatures, more amenable to a convincing pitch than poorly presented rightness. It’s why we vote for the guy we’d gladly have as a drinking buddy over the somewhat alienating candidates with a firmer grasp on the issues.

Productive exchanges between people are more likely to occur when there’s mutual respect. Discussions, therefore, need to be held in an environment where no one is disparaged or shamed and both sides are open to changing their minds. In short, there must be a goal shift from winning to understanding and this requires empathy.

The late Stephen Covey wrote about the importance of empathy in his bestselling book, The 7 Habits of Highly Effective People. Habit 5 – seek first to understand, then to be understood – encourages us to alter the way we listen to others. To change someone’s mind, you need to address their emotional attachment to what they believe, and this, Dr Covey argued, requires empathic listening.

According to Covey, people “listen with the intent to reply, not to understand”. Most of us are so focussed on our own agenda that we don’t hear the other person; we talk at or over them. In contrast, empathic listening helps us get inside another person’s frame of reference with the intent of truly understanding how they see the world. Covey writes:

When another person speaks, we’re usually “listening” at one of four levels. We may be ignoring another person, not really listening at all. We may practice pretending. “Yeah. Uh-huh. Right.” We may practice selective listening, hearing only certain parts of the conversation. … Or we may even practice attentive listening, paying attention and focusing energy on the words that are being said. But very few of us ever practice the fifth level, the highest form of listening, empathic listening.

We spend years learning to read, write, and speak but receive scant training in the art of listening. Just think of all the times that you have debated or argued with someone. Did preaching to them about right and wrong change their mind? Did acting like a “logic bully” cause them to see the light? Did accusing them of being closed-minded or unreasonable help your cause?

I’ll bet that in each of these circumstances you faced the same outcome – a stalemate. Why? Because we all want to be understood, valued, and affirmed and this requires empathic listening. So, to change someone’s mind, we must stop talking and start listening. Listening is the key pathway to changing someone’s thinking and until your conversation partner feels heard, it’s almost impossible to change their mind.

Empathic listening is your secret weapon for influencing others and ensuring that you don’t butt heads. A columnist for the online publishing platform, Medium, put it this way:

When you come in guns blazing with all of your clear evidence, the other person will lock up. They’ll feel bullied and incapable of hearing you out. The best arguers are proven to use a small number of key points. They don’t rapid-fire or clap in the person’s face while they talk. They ask questions. They know changing someone’s mind is damn-near impossible. By asking questions, that person will change their own mind.

Great arguers stay calm, kind, and empathetic — no matter how ignorant or stupid their target is. They often open by acknowledging the things they agree on. Quite often, they compliment their opponent in the first minute. Opening soft is disarming. It’s unexpected. It highlights a desire for consensus rather than war and condescension.

Communications consultant and author, Lauren Schieffer, urges us to “get to know the person you are trying to influence. What matters to them? What brings them joy? What makes them angry? Understanding even a little bit about them helps you walk in their shoes with empathy”. You can then frame your message around the values of the other person, not your own.

In combination with empathic listening, another communication tool that you should consciously utilise is body language. Your non-verbal behaviours – facial expressions, gestures, posture, and tone of voice – send very clear messages which can be deciphered easily. If you roll your eyes or stamp your feet, for example, it’s blindingly clear that you’re not happy.

Your actions and mannerisms can speak louder than words, so remember that a genuine smile or tilt of the head will aid effective communication. Of course, it’s impossible to read body language and gauge sentiment if you are not communicating face-to-face. So, don’t try to resolve important matters via emails or messaging apps.

■      ■      ■

In my recent post, Has the world gone mad?, I mentioned that science deniers – whether on vaccines, climate, or evolution – cherry-pick evidence and draw on flawed reasoning techniques. Still, we should not give up on them, even though their detachment from verifiable reality is incomprehensible to science believers.

According to Lee McIntyre, a research fellow at Boston University, the only possible way to change the mind of science deniers is to talk to them calmly and respectfully. In his book, How to Talk to a Science Denier, McIntyre acknowledges that the truth is under assault, with feelings outweighing evidence.

Even so, he believes that for most science deniers, change is possible and that if we don’t try, things will only get worse. McIntyre states:

Science denial is not just about doubt, it’s about distrust. The way you overcome distrust is not through sharing accurate information, it’s through conversation, face to face, in which you’re calm and patient and show respect and listen. Having the right attitude is the only thing that gives hope of success.

The world is undeniably polarised and our sense of shared reality is under attack. Denialism is dangerous and unfathomable, but one thing is clear:

“The ability to hear is a gift. The willingness to listen is a choice.” – Mike Greene

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Has the world gone mad?

Source: aconsciousrethink.com
THINGS DON’T MAKE SENSE

Maybe it’s due to the persistent drumbeat of bad news. Or perhaps social media has messed up our brains. It could even be the fault of the pandemic which has pushed some of us over the edge. Whatever the cause, the world seems to have gone a bit bonkers. We have lost our collective minds and our ability to make intelligent judgements.

Humans, of course, have always been notorious for making irrational decisions. Regardless, poor choices have become a mental contagion that has infected normally sane people and fuelled a growing disconnect between fact and fiction. An increasing number of us embrace conspiracy theories, reject scientific consensus, elect populist leaders, and promote wacky cures.

Even the smartest among us have moments when common sense escapes them, but things have got out of hand. During these uncommon times, illogical thinking has come to the fore in the face of uncertainty. Uncertainty causes our brains to overreact, and many of us have capitulated to irrational fear. Fear, in turn, distorts our risk assessments, causing us to overestimate threats.

Fear can become problematic when it’s disproportionate to the actual risk faced, such as with COVID-19 vaccines. Despite irrefutable scientific evidence to the contrary, millions have embraced the misleading claims and outright lies about the safety of COVID inoculations. This misinformation has largely been spread on social media platforms including Facebook, Instagram, and Twitter.

Unquestionably, vaccinations are one of the greatest achievements of modern medicine and have turned many childhood diseases into distant memories. Like all vaccines, COVID shots were proven to be safe and effective through rigorous testing processes. Even so, anti-vaxxers have been unwilling to roll up their sleeves for a jab – because they are fearful.

Vaccine deniers have been spooked by the spurious and unsupported claims about COVID vaccines including that they: contain microchips for government tracking; include metals and other problematic ingredients; alter your DNA and stunt fertility; and have caused widespread death and disease. It’s even claimed that the pandemic is a ruse by big pharmaceutical companies to profiteer off a vaccine.

While these conspiracy theories might seem harmless, they demonstrate a detachment from verifiable reality that can cause someone to believe almost anything. To paraphrase a headline in The New York Times, the real horror of anti-vaxxers is that their behaviour isn’t just a public health crisis – it’s a public sanity one.

Another group that clings to beliefs at odds with conventional scientific thought is climate change sceptics. These sceptics hold a range of views, from outright denial (it’s a hoax) to interpretive denial (it’s not a threat). This latter form of denial causes people to reframe climate change as natural and climate action as unwarranted. Thus, they do not contest the facts but interpret them in ways that distort their importance.

Humans instinctively push back against or completely reject facts that are contrary to their beliefs and this cognitive bias (the backfire effect) impacts how new information is processed. On the other hand, humans look for evidence which supports what they already believe to be true and this causes them to give credence to data which confirms that their view is right (confirmation bias).

These two cognitive biases work in tandem and help explain why climate deniers (a) ignore the hundreds of studies which show that humans are responsible for climate change and (b) latch on to the one study they can find that casts doubt on the role of anthropogenic greenhouse gas emissions.

Many believe that the catastrophic framing of climate change is self-defeating as it alienates people. I agree that doomsday scenarios don’t inspire action among deniers and also accept that merely talking about evidence or data does not change the mind of a sceptic.

So, I was drawn to a story in The New York Times which is void of scare tactics. The feature story, The Science of Climate Change Explained: Facts, Evidence and Proof, is written by a journalist with a PhD in geology. She calmly and pragmatically explains what will happen if we fail to address climate change – well worth a read.

Beyond vaccines and climate change, large swaths of humanity still snub science when it comes to Darwin’s theory of evolution. The beginning of the Earth, along with the birth of humans, remains a contentious issue between creationists and evolutionists. These protagonists continue to debate whether life on Earth was created in the blink of an eye or whether it evolved over millions of years.

Creationists insist that everything in nature was created by a deity who formed all life over a period of six days, as described in the Book of Genesis. Evolutionists reject this assertion by biblical literalists, citing scientific evidence showing that the Earth is about 4.5 billion years old and that all life evolved from primitive, single cell organisms.

To any evolutionary biologist, creationism is ludicrous. But to millions of creationists, particularly those in America’s Bible Belt, God remains the supernatural “intelligent designer” of the universe. The clashes between creationists and biologists can be explained, as noted in one article, through the lens of confirmation bias.

The latter (biologists) use scientific evidence and experimentation to reveal the process of biological evolution over millions of years. The former (creationists) see the Bible as being true in the literal sense and think the world is only a few thousand years old. Creationists are skilled at mitigating the cognitive dissonance caused by factual evidence that disproves their ideas. Many consider the non-empirical “evidence” for their beliefs (such as spiritual experiences and the existence of scripture) to be of greater value than the empirical evidence for evolution.

Debating creationists is a fool’s errand as they do not adhere to facts or logic. What is scientific fact for evolutionists is irreverent blasphemy for creationists. As creationism argues that faith should take precedence over science, there is little hope for enlightenment – the scientific worldview is unlikely to ever supplant a creationist one. Well may we say “let there be light”!

Belief in ideas that have clearly been disproven by science remains widespread around the world. Rejecting scientific consensus has given rise to scientific denialism (dubbed the anti-enlightenment movement) and it has moved from the fringes to the centre of public discourse. An article in the international science journal, Nature, put it this way:

Science deniers – whether on vaccines, evolution or climate – all draw on the same flawed reasoning techniques: cherry-picking evidence, relying on conspiracy theories and fake experts, engaging in illogical reasoning and insisting that science must be perfect.

■      ■      ■

We view our ancestors as being blinkered by myth and superstition yet see ourselves as reasoned and enlightened. However, for all our advancement as a species, humans still behave irrationally. You just have to witness the global rise of a new political culture based on emotion and fear, in lieu of fact and policy, to know that something is wrong.

Perhaps there is no better example of this political irrationality than the election of Donald Trump which left millions of people around the world perplexed. His campaign – described in one critique as “a toxic mix of exaggerations, lies, fearmongering, xenophobia and sex scandal” – succeeded in elevating an unsuitable and unpopular nominee to the office of president.

Irrationality has defined much of human life and history and will continue to do so. We make irrational decisions with monotonous regularity, such as stripping supermarket shelves bare of toilet paper during a pandemic. As I explored in a recent post, How our lives are shaped by the choices we make, our reasoning processes are imperfect and this leads to poor choices.

To suggest that humans are rational is an irrational idea.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Do as I say, not as I do

Source of two-headed image: Quora
THE HYPOCRITE IN ALL OF US

How many of us practice what we preach? While it’s easy to dole out gratuitous advice to others, living by the principles and values we espouse is another matter. The hard truth is that many of us display glaring contradictions in our behaviour, adopting one pose in public and another persona in private.

In all domains of life, people put on false fronts. Examples of double standards range from pious politicians promoting family values while secretly having affairs to two-faced parents telling their children not to smoke while doing so themselves. Inconsistencies between what we say and do abound, as we often fail to meet our own moral code.

While most humans can be accused of duplicity, higher standards are expected of those who claim the moral high ground. Priests and other religious figures implore us to love our neighbour, yet some have committed unspeakable transgressions against children. Such unvirtuous behaviour is repugnant and has exposed the heinous moral hypocrisy of religious institutions.

Just as churches need to put their own houses in order before damning others, so do we. Everyone is prone to hypocrisy at one point or another in their life. Humans are not cold logical robots but fallible emotive beings, which is why we suffer from a misalignment between words and deeds, thereby making hypocrisy unavoidable.

High-status people are some of the worst hypocrites in society. These individuals are frequently admired by others and often occupy leadership roles. Yet, as author Peter Schweizer outlined in his 2006 book, Do as I Say (Not as I Do): Profiles in Liberal Hypocrisy, famous people are no holier than the rest of us and often fall short of living their professed beliefs.

Schweizer conducted an investigation into the private lives of a handful of prominent US citizens and found a long list of blatant contradictions. To quote the book’s promotional copy:

Michael Moore … claims to have no stock portfolio, yet he owns shares in Halliburton, Boeing, and Honeywell and does his postproduction film work in Canada to avoid paying union wages in the United States. Noam Chomsky opposes the very concept of private property and calls the Pentagon “the worst institution in human history,” yet he and his wife have made millions of dollars in contract work for the Department of Defense and own two luxurious homes. Barbra Streisand prides herself as an environmental activist, yet she owns shares in a notorious strip-mining company. Hillary Clinton supports the right of thirteen-year-old girls to have abortions without parental consent, yet she forbade thirteen-year-old Chelsea to pierce her ears and enrolled her in a school that would not distribute condoms to minors.

The business world is similarly guilty of hypocrisy with companies displaying a lack of coherence between talk and action. Consumer activists have long argued that most business models prioritise profits over people despite the assertion by firms to the contrary. The classic example is the rag trade where global clothing brands have been complicit in the exploitation of sweatshop workers.

Sweatshops are as old as the industrial age and were started by heartless businessmen. Modern-day consumers should be wary of being too sanctimonious about the plight of garment workers because they, as shoppers, have knowingly bought high-street brands supplied by factories that mistreat their workers.

One of the reasons that high-street clothing has been getting cheaper and cheaper for decades is that sweatshop workers do not receive a living wage. The suffering of these unknown workers on the other side of the world is easy for us as consumers to ignore, particularly as we have become accustomed to reaping the benefits of lower production costs.

Something else that we have become accustomed to is politics in sport and this was on full display during the Beijing Winter Olympics. The overwhelming message of the opening ceremony was about peace and togetherness. A giant LED snowflake sculpture was used to symbolise all people coming together and living in harmony.

Yet human rights organisations branded the 2022 Olympics as “the genocide games” and accused China of holding a million Uyghurs (a largely Muslim ethnic group) against their will in re-education centres. In response, many nations – including the US, Britain, Canada, and Australia – staged a diplomatic boycott of the games in protest at China’s repressive policies toward the Uyghur minority group native to Xinjiang.

Many saw the International Olympic Committee’s decision to award China the games as political hypocrisy. Having an alleged human rights abuser as host was called out as clashing with one of the fundamental principles contained in the Olympic Charter – a commitment to “the preservation of human dignity”.

Another international body which recently came in for criticism is the United Nations entity that supports and co-ordinates action on climate change. For nearly three decades, the UN has brought together almost every nation on Earth for global climate summits called Conferences of the Parties (COPs). The 26th annual summit – COP26 – took place in Glasgow last November.

As leaders from around the world made promises to tackle an existential threat to humanity, climate change activists and experts railed against the hypocrisy that accompanied it. As noted in a University of Southern California (USC) Annenberg Media report:

… a total of 400 private jets flew down to Glasgow from all over the world, carrying more than 100 leaders. This emitted 13,000 tonnes of carbon dioxide into the atmosphere. For comparison, the average person’s carbon footprint globally is 7 tonnes per year and the carbon footprint of an average American is 21 tonnes per year. The leaders have been called out by critics as “eco-hypocrites” for emitting a huge amount of CO2 while gathering for an event organized to curb greenhouse gas emissions.
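
A rough back-of-the-envelope check – using only the figures quoted above, and assuming they are accurate – puts the scale of the hypocrisy in perspective: 13,000 tonnes divided by the 7-tonne annual footprint of the average person is roughly 1,860. In other words, the flights to that single summit emitted about as much carbon dioxide as 1,860 average global citizens do in an entire year (or around 620 average Americans, at 21 tonnes each).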

Climate change hypocrisy also extends to members of the British royal family. On numerous occasions over recent years, Prince Harry and Meghan Markle have been criticised for flying around the world in private jets while lecturing the world about climate change. A former UK government minister told Newsweek:

It’s completely hypocritical for Prince Harry or other members of the royal family to lecture people about climate change when they’re emitting more carbon than almost everyone else on the planet. People using private jets are in the top one percent of carbon emitters in the world.

Many citizens understandably jump up and down about humanity’s need to take climate change seriously. These same people typically look to governments and businesses to find eco-friendly solutions, when the real power for change is in our collective hands. We support governments with votes and businesses with dollars, which means that we can choose who governs and where we spend our money. We need to put our votes and our money where our mouths are!

■      ■      ■

“Hypocrisy is the natural state of the human mind,” according to Robert Kurzban, author of Why Everyone (Else) Is a Hypocrite. Kurzban argues that our behavioural inconsistencies are caused by the mind’s design, which consists of many specialised modules. These modules don’t always work together seamlessly, resulting in contradictory beliefs and violations of our supposed moral principles.

Consequently, hypocrisy is everywhere and can manifest itself in countless ways. To pretend that we can live our lives without hypocrisy and contradiction is itself a form of deception. We must, therefore, exercise care before angrily lambasting others for their deeds, while doing the same ourselves. People who live in glass houses should not throw stones.

We’re all hypocrites; it’s just a matter of scale.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

How our lives are shaped by the choices we make

LIFE DOESN’T JUST HAPPEN

Ever since Adam and Eve’s original decision to eat fruit from the forbidden tree in the Garden of Eden, humans have made some spectacularly poor choices. History bears witness to these monumental mistakes, including the crew of the Titanic ignoring warnings of icebergs in their path, NASA proceeding with the space shuttle Challenger launch despite known problems with the solid rocket boosters, and engineers filling the Hindenburg with highly flammable hydrogen.

Bad decisions are part of life, though most do not have consequences that weigh as heavily as those just cited. Examples of non-fatal, flawed judgements include the 12 publishers who rejected J.K. Rowling’s first Harry Potter manuscript, the Decca Records executive who declined to sign The Beatles, and the Yahoo co-founder who turned down a US$44 billion takeover offer from Microsoft.

Hindsight may be 20/20, but risk is an inescapable part of every decision. We never know the outcome of a decision in advance – sometimes our choices turn out to be spot on, while on other occasions our judgements prove to be seriously flawed. In the words of the late French philosopher Albert Camus, “life is the sum of all our choices”. History, by extrapolation, equals the accumulated choices of all mankind.

Our lives are defined by the series of choices we make every single day. They play out over a lifetime and ultimately determine our destiny. Our choices not only change our lives but the lives of others. We are not alone in our choices as we are part of a bigger picture – there is a chain of events associated with every decision we make. Thus, an individual deciding to buy environmentally friendly products can help change the world and make it a better place for everyone.

Some of the life-changing decisions that we make include where to live, how many children to have, and what career to follow. More mundane and routine choices include what to wear and what to watch on Netflix. One of the paradoxes of life is that our bigger decisions are often less calculated than our smaller ones. We can agonise for weeks over what new car to buy but rapidly end a long-term relationship with little thought or deliberation.

Sometimes a snap judgement or instinctive choice is appropriate. Your emotions, though, can easily cloud your judgement, which is why most experts agree that the best decisions are made when there is a balance between logic and emotion. The invisible tug-of-war between the head and the heart is not a bad thing, as it makes you more likely to carefully weigh the pros and cons of each decision before choosing an alternative.

When your emotions are running high, your logic will be low, which can lead to irrational decisions. To illustrate, anger makes you vulnerable to high-risk, low-payoff choices, such as the rash decisions made during a bitter divorce. Happiness, on the other hand, makes you confident and optimistic about the future but can cause you to overestimate your chances of success, such as believing that your winning streak at the casino will continue indefinitely (the so-called hot-hand fallacy).

Knowing how to make good decisions is one of the most important skills we can possess. Many people look back at some of the terrible decisions they have made and ask themselves: What was I thinking? We make endless decisions, so we are bound to regret some of them. A Cornell University study estimated that the average adult makes thousands of remotely conscious decisions every day.

Each decision you make is a trade-off as everything you say, do, or pursue has a cost and a benefit. In the language of economists, this trade-off is called an opportunity cost. The term “opportunity cost” is defined as “the cost of an alternative that must be forgone in order to pursue a certain action”. Put simply, it’s what a person sacrifices when they choose one option over another. An example will help here.

Let’s say that you have $100 in your purse and you can spend it on a pair of jeans or a meal. You choose to buy the denim jeans, so the opportunity cost is the restaurant meal you cannot afford. For everything you choose to do, there’s something else you won’t be able to do. Every day as consumers, we are forced to make such choices due to “scarcity”. Scarcity and opportunity cost are two interlinking economic concepts.
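
To make the trade-off concrete, here is a minimal sketch in Python – illustrative only, with made-up utility scores for each option – showing that the opportunity cost of a choice is the value of the best alternative forgone:

    # Hypothetical utility scores for two ways to spend the same $100.
    options = {"denim jeans": 70, "restaurant meal": 60}

    # Choose the option you value most.
    chosen = max(options, key=options.get)

    # The opportunity cost is the value of the best alternative you forgo.
    forgone = [name for name in options if name != chosen][0]
    print(f"Chosen: {chosen}. Opportunity cost: the {forgone}, valued at {options[forgone]}.")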

Economists view the world through the lens of scarcity. Indeed, without scarcity, the science of economics would not exist. Scarcity arises because, as a society, we have unlimited wants but limited resources. We all know that you can’t have everything you want – we have to choose and make trade-offs. Economics examines how individuals, businesses, and governments deal with the limitations imposed by scarcity.

Broadly speaking, economics is the study of human behaviour as it relates to money. When it comes to financial decisions, economists erroneously claim that humans are rational and unemotional decision makers. Psychologists, on the other hand, correctly contend that economists’ models bear little relationship to actual human behaviour. The harsh reality is that humans do not obey the efficient, orderly principles espoused by free-market thinkers.

The Global Financial Crisis (GFC) confirmed that we are far too emotive for rational economic models to accurately predict our conduct. Many people in the US bought houses at grossly inflated prices and expected their value to keep rising. In the process, borrowers saddled themselves with loans that they could not afford, which led to the subprime mortgage meltdown and ultimately the catastrophic GFC.

This “irrational exuberance” was not confined to the household sector. Borrowers, bankers, and brokers were united in the delusional belief that house prices never go south. Post-GFC, many people turned to behavioural economics to understand what happened. Behavioural economics combines psychology and economics to explain how people really make decisions when they spend, invest, save, and borrow.

Unsurprisingly, few people reach the level of expertise necessary to rightfully claim that they are an expert decision-maker. The development of genuine expertise in any field requires years of struggle and sacrifice. Still, you can be a good decision-maker if you choose actions that produce the best outcome for yourself and others. The trick is to make each decision with an open mind and be aware of your unconscious biases.

Cognitive biases distort thinking, influence beliefs, and sway the decisions we make every day, yet most people are unaware of them. Over the course of our lives, we all develop cognitive biases. Just watch the daily news, listen to talkback radio, or scroll through social media posts to witness biases in action as people argue over politics, climate change, and other hot topics. Everyone, of course, claims that their position is the right one.

Differences of opinion occur because we all have our own perspectives based on our preconceptions, past experiences, and the information we draw on in forming judgements. When it comes to gathering information, many of us are guilty of confirmation bias – readily embracing information and conclusions which align with our views and largely ignoring anything which contradicts our beliefs.

■      ■      ■

The next time that you make a bad decision, just remember that it could have been worse. Imagine being the individual responsible for allowing the famous Trojan Horse to be brought inside the City of Troy, not realising it was full of Greek soldiers. And how would you have felt standing in Napoleon’s shoes after he invaded Russia, suffered a catastrophic defeat, and returned home with just a fraction of his once grand army?

You can minimise regrettable decisions by learning from your mistakes – history does not have to repeat itself. Humans have a tendency, however, to replicate the same blunders over and over (poor diets, dysfunctional relationships, impulsive buying, etc.), causing us to relive our errors. If you want a different result, you have to do something different – make better decisions!

We are what we choose to be.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Why we fail to detect subtle changes

PERILS OF SHORT-TERM THINKING

Humans sit at the apex of the evolutionary tree with the most complex brain of any animal, yet some believe that there is a design fault. Our brains have evolved to respond to immediate threats, so we are not wired to detect more gradual warning signs. That’s why we can duck out of the way of a cricket ball in a fraction of a second, but fail to react to repeated and serious threat assessments about a deadly new virus for which there is no treatment.

In the early phases of human existence, our ancestors faced an onslaught of daily challenges to their survival – from predators to natural disasters. Too much information can confuse our brains, leading us to inaction or poor choices, and this can place us in harm’s way. Consequently, our brains evolved to filter information rapidly and focus on what is most immediately essential to our survival.

Daniel Gilbert, a professor of psychology at Harvard University, argues that threats that develop over decades – rather than seconds – circumvent our brain’s alarm system. To illustrate, he says that we take alarm at terrorism but respond far less to global warming, even though the odds of a disgruntled shoe bomber attacking our plane are, he claims, far longer than the chances of the ocean swallowing parts of Manhattan.

Assessing and reacting to risk is one of the most important things we do as humans. Nonetheless, as Professor Gilbert points out, in our short-sighted world we fail to perceive the long-term challenges that threaten our existence, which is why he asserts that:

… if alien scientists were trying to design something to exterminate our race, they would know that the best offense is one that does not trigger any defense. And so, they would never send little green men in spaceships. Instead, they would invent climate change, which produces apathy not action.

“Humans are very bad at understanding statistical trends and long-term changes,” notes political psychologist, Conor Seyle. “We have evolved to pay attention to immediate threats. We overestimate threats that are less likely but easier to remember, like terrorism, and underestimate more complex threats, like climate change.”

Right now, humanity faces a number of risks, but they are not on our collective radar as they will not impact us for a long time – decades and longer. Some of these risks are called existential risks as they have the capacity to wipe out humanity. For instance, in about a billion years – give or take a few hundred million years – the increased brightness of the Sun will doom the Earth’s biosphere.

In the more immediate future – say, the next century – the greatest threat to humanity is ourselves. More specifically, according to an article published by online media outlet Quartz, the most dangerous threat to humanity is the human mind.

The defining characteristic of humans is our capacity for complex thinking and advanced reasoning. These abilities have allowed us to develop innovations that transform our lives and our world … (but these) … innovations have also created new problems, many of which threaten our existence .… Climate change, pollution, economic and social disruption due to emerging technologies, political polarization, misinformation, inequality, and large-scale conflict are all major challenges for humanity to overcome that have arisen from our own innovation.

We are unlikely to effectively solve these problems unless we truly understand their ultimate source: the human mind. In line with this thinking, the Centre for the Study of Existential Risk at the University of Cambridge believes that the four greatest threats to the human species are all man-made – artificial intelligence, global warming, nuclear war, and rogue biotechnology.

In his bestselling book, Sapiens: A Brief History of Humankind, historian and renowned author, Professor Yuval Noah Harari, states that humans “… have the dubious distinction of being the deadliest species in the annals of biology”. We are the most advanced and most destructive animal to have ever lived – making us brilliant and deadly. This lethal combination causes some to proffer that a man-made global pandemic should be added to the list of threats to humanity.

Experience has taught me that it would be wise to further augment the list with unknown unknowns. We humans are sometimes too clever by half in believing that we have covered all bases. In reality, no one can say with absolute certainty that there is not an unknown threat lurking around the corner which will take us by surprise. Consequently, the greatest risks in the years ahead may come not from threats we’ve identified, but from those we haven’t.

It’s clear that our short-term brains can’t cope with long-term perils. We are focussed on the here and now to the detriment of distant risks. Our inability to look beyond the current news cycle is reflected in a phenomenon called short-termism – the constant pressure to deliver instant results.

Short-termism has become endemic in society, and it pervades all aspects of our lives. We want quick-fix surgery to rectify imperfections IMMEDIATELY. We crave crash diets to lose weight FAST. We consume energy drinks to heighten alertness NOW. We expect politicians to respond to tracking polls TODAY. And we require companies to achieve a turnaround in earnings PROMPTLY.

In an article titled “The perils of short-termism: Civilisation’s greatest threat”, BBC journalist Richard Fisher paraphrases angel investor Esther Dyson: in politics the dominant time frame is a term of office, in fashion and culture it’s a season, for corporations it’s a quarter, on the Internet it’s minutes, and on the financial markets mere milliseconds.

The world is plagued by short-termism, and our challenge is to look at things through a longer lens. Perhaps we should remember this the next time we feel like shrieking in anger at having to don a face mask during a pandemic, while ignoring the long-term benefits to humanity of controlling a deadly virus.

COVID-19 is the latest example of long-term success being held hostage to short-term thinking. The pandemic influenced many people to focus on short-term outcomes and instant gratification. Clearly, we need to reframe our thinking and develop a longer game plan for society.

It’s time for humanity to see the bigger picture.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Why politicians are not qualified to run a country

SPECIFIC TRAINING NEEDED

In every sector of society, qualifications play a critical role in determining the minimum level of knowledge and skills required for a given profession or occupation. Qualifications define what a person needs to know – and be able to do – to carry out a certain activity, but no such entry requirements apply to the group of people who are elected to run a country.

Around the world, politicians need no formal training before entering parliament. Those running for public office are elected by citizens, many of whom rate charm and charisma over ability and competence. Most voters in modern democracies do not make informed decisions about the capability of candidates or the efficacy of their policies.

Plato, one of the founding figures of Western philosophy, was among the earliest to see democracy as a problem. He, like later critics, argued that democracy meant rule by the ignorant, or worse, rule by the charlatans who hoodwink the people. It would be much safer, Plato thought, to entrust power to carefully educated, wise guardians who would decide upon matters on behalf of citizens.

Fast forward to the present and John Hewson, the former Liberal opposition leader, believes that Australia’s contemporary parliamentarians aren’t qualified to run the country. Writing in The Sydney Morning Herald, Dr Hewson lamented that politics is dominated by career politicians who concentrate on winning points rather than delivering good policy and good government. He states:

Unfortunately, the skill sets and experience required of a career politician essentially make them incompetent to govern effectively. Their career path is often from university, community or union politics, through local government/party engagement, perhaps serving as a ministerial staffer, to pre-selection, then election, and so on.

We would not let an unqualified teacher educate our children nor allow a physician without a medical degree to perform surgery on us. Yet, we place no job prerequisites on those seeking to become members of parliament (MPs). In Australia and elsewhere, it is not mandatory for MPs to take part in initial or ongoing education and training programs specific to their role.

While I’m not advocating minimum education standards for aspiring politicians, there is a case for introducing compulsory Continuing Professional Development (CPD) for all parliamentarians once elected to office. This happens in other professions (and many occupations) to ensure continued proficiency and capability in one’s chosen field.

As lawmakers, politicians – either directly through regulation or indirectly via industry bodies – compel professionals like doctors, lawyers, engineers, and accountants to undertake CPD as a condition of their ongoing professional registration. Politicians the world over, however, are not subject to defined education and training standards for accreditation, thereby creating a double standard.

Like all CPD programs, the learning activities for politicians should include a combination of structured activities (such as training courses, online modules, and seminars) and unstructured activities (such as reading documents, articles, and publications). Any formal or informal learning activity which improves or broadens an MP’s knowledge or expertise can be included in the CPD toolbox.

MPs are involved in decisions that have far reaching consequences for the populace at large and deal with an almost unlimited range of subjects. As noted in one report, “those elected to public office are expected to possess indefinable qualities to accomplish an indescribable job”.

I am the first to acknowledge that developing CPD programs for politicians is a huge undertaking. Nonetheless, it should be done and the two subjects that I would place at the apex of CPD curricula are debunking economic fallacies and understanding game theory. Please let me explain each in turn.

One of the primary activities of modern governments is to determine economic policies. In every country, the government takes steps to help the economy achieve growth, full employment, and price stability. Given this, it’s vital that elected representatives be economically literate, even though many politicians display an astounding ignorance of economics.

Over recent years, for example, populist politicians have driven an anti-globalisation agenda by promoting protectionist and isolationist policies. Specifically, the peddlers of populism have challenged the undeniable economic benefits of free trade (which has lifted millions out of poverty) and immigration (which has increased the size of economies).

Economic ineptitude was also on display following the Global Financial Crisis (GFC). Politicians in many nations (including the UK and Greece) adopted fiscal austerity measures which proved catastrophic. Pleasingly, politicians in Australia and America implemented fiscal stimulus policies which gave their economies a much-needed boost.

One of the reasons that governments adopted fiscal austerity measures post-GFC was to keep a lid on government debt loads in the mistaken belief that government debt is bad. This old chestnut is arguably the single biggest economic myth of all and is called the household fallacy.

Politicians fuel this fallacy by constantly drawing false parallels between household budgets and government budgets. Our elected leaders love trotting out the familiar line that governments – like households – need to live within their means. Yet every time they espouse this untrue analogy, they engage in unnecessary fear-mongering about government debt.

Around the world, ill-informed politicians claim that governments should somehow balance their budgets year to year. Politicians show empathy with the electorate by promising to cut government spending in line with belt-tightening by households. Governments that run deficits are erroneously accused of being poor financial managers.

In Fifty Economic Fallacies Exposed, Geoffrey Wood – Professor Emeritus of Economics – examines a range of popular economic misconceptions and explains how these mistaken beliefs misinform economic discussion. It should be mandatory for all politicians to read Professor Wood’s book. My recent post, Why it’s important to understand economics, provides a precis of three of the most common economic myths that we encounter.

Let me now turn to the second subject that I would include in CPD curricula – understanding game theory. Game theory is a framework for examining competitive situations where “players” have conflicting interests. It models human actions on the presumption that everyone tries to maximise their potential gain against everyone around them.

At its core, game theory – which is a special branch of mathematics – is used to study decision-making in complex situations. It examines how our choices affect others and how the choices others make affect us (so-called “games”). These games often involve two or more opposing parties pursuing actions in their own best interest, resulting in an outcome for each that is worse than if they had cooperated.

Game theory is applied in a number of fields and a classic example can be found in the arms race between two superpowers. Both countries are clearly better off when they cooperate and avoid an arms race. Yet the dominant strategy is for each to arm itself heavily. If a new weapon is invented that is more destructive than any in existence, acquisition of this weapon is seen as enhancing the security of one’s country. But if both act accordingly, everyone’s security is jeopardised rather than improved.
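
The arms-race logic can be laid out as a simple payoff table. Below is a minimal sketch in Python – the payoff numbers are hypothetical, chosen only to illustrate the structure – showing why arming is each country’s dominant strategy even though mutual disarmament would leave both better off:

    # Hypothetical payoffs: payoffs[(a_move, b_move)] = (payoff to A, payoff to B).
    # Higher numbers are better.
    payoffs = {
        ("disarm", "disarm"): (3, 3),  # both save money and remain secure
        ("disarm", "arm"):    (1, 4),  # A is exposed, B gains an edge
        ("arm", "disarm"):    (4, 1),  # A gains an edge, B is exposed
        ("arm", "arm"):       (2, 2),  # costly stand-off for both
    }

    # Whatever the rival does, arming pays more for A, so "arm" is dominant.
    for rival_move in ("disarm", "arm"):
        best = max(("disarm", "arm"), key=lambda my_move: payoffs[(my_move, rival_move)][0])
        print(f"If the rival chooses to {rival_move}, A's best response is to {best}")

    # Yet mutual arming (2, 2) leaves both worse off than mutual disarmament (3, 3).

In this toy matrix, each side’s self-interested reasoning drives both to the (2, 2) stand-off – exactly the worse-than-cooperation outcome described above.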

Game theory was used most notably during the Cold War. Both the United States and the Soviet Union quickly saw its value for forming war strategies. A balance was struck in which neither nation could gain advantage through nuclear attack as the reprisals would be too devastating. This became known as the doctrine of Mutually Assured Destruction (MAD).

MAD is founded on the notion that a nuclear attack by one superpower would be met with an overwhelming nuclear counterattack such that both the attacker and the defender would be annihilated. Given this lose-lose outcome, it is said that only a madman would engage in such a self-defeating strategy.

By this logic, Vladimir Putin has been labelled a madman given his recent announcement that his nuclear forces are on “high-combat” alert. He is playing the madman card, threatening the use of nuclear weapons to get what he wants out of his war with Ukraine. With 6,000-plus nuclear warheads, Putin can end our world as we know it.

While many suspect that Putin is unhinged, I believe that he is a cunning expansionist with dreams/delusions of restoring Mother Russia to her former greatness. He is a megalomaniac who is obsessed with increasing his power and will do almost anything to get his way. The dilemma the West faces is how to deal with Russia without risking nuclear war. As the UN Secretary-General, António Guterres, recently warned: “The prospect of nuclear conflict, once unthinkable, is now back within the realm of possibility”.

Putin’s actions in brandishing the nuclear option are reckless. If Putin and every other politician on the planet truly understood game theory, they would not threaten the use of nuclear warfare. Nuclear weapons are an intolerable threat to humanity. Instructing politicians in the nuances of game theory would teach them that a nuclear war cannot be won and therefore must never be fought.

■      ■      ■

At the risk of sounding defeatist, I accept that studying game theory will not be made compulsory for world leaders, let alone rank-and-file politicians. Equally, I’m sure that politicians will not be forced to study economic fallacies. This is disappointing, as both of these educational initiatives would make the world a significantly better place.

One can only dream!

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting