Why innovation and continuous improvement are different

Image credit: thinkinmeta.wordpress.com
DON’T CONFUSE NEW WITH OLD

Thomas Edison was a true innovator. During his lifetime, he amassed a record 1,093 patents and remains the most prolific inventor in American history. Edison was the driving force behind new technology with innovations such as the phonograph, the automatic telegraph, the electric generator, the movie camera, and, of course, the incandescent light bulb.

Like his good friend Henry Ford, Edison had an uncanny knack for recognising a consumer need and then creating a product to satisfy that need. This customer focus gave birth to the light globe, which obviously was not invented by continuously improving the candle. Rather, Edison stood back, assessed the human need, and then invented something completely new.

For something to be classified as an innovation, I believe that it must be brand new. To state the blindingly obvious, this means that it must not have previously existed. Continuous improvement, on the other hand, deals with things that currently exist and strives to make them better. You can forever refine (improve) a candle; however, it will never become a light bulb. Innovation is not the same as improvement.

To my surprise, this difference is often lost on the media and business leaders. Companies typically trumpet a product or service as a new innovation when, in fact, it’s simply an improvement on an existing product or service. To be clear, continuous improvement is vital to the success of an organisation and is to be encouraged. Nonetheless, it should not be labelled as innovation.

Confusion about innovation and improvement extends to the academic world. Some years ago, a (then) colleague was completing his Doctor of Business Administration degree. After reading his thesis on innovation, I confidently told him that he would be awarded a doctorate for his scholarly inquiry. However, I also informed my colleague that I fundamentally disagreed with his definition of innovation as it equated to continuous improvement.

During our ensuing (and friendly) debate, he asserted that a manufacturer of blue pens could claim to be innovative if it decided to also produce red pens. I rebutted by pointing out that if the only thing that changed was the colour of the ink, producing a red pen could not possibly be classified as breakthrough innovation.

I then went one step further and explained that introducing a red pen was an example of product line extension, not innovation. Product line extension refers to the expansion of an existing product line such as a soft drink manufacturer introducing a diet variety to its cola line, or a toy manufacturer introducing new characters in its line of action figures.

Businesses are always being admonished to innovate or perish. While I accept that innovation is an essential ingredient for business success, most innovation cannot be classified as ground-breaking discovery. Rather, it is the product of incremental change. McDonald’s did not invent takeaway food but perfected the process of delivering fast food through franchisees.

Similarly, the car that I drive today is loaded with far more bells and whistles than the first car I owned over 40 years ago, yet they are fundamentally the same. Both have four doors, four wheels, a dashboard, and an engine. Only Henry Ford (Edison’s kindred spirit) can lay claim to producing the first mass-produced car. Today’s cars are not brand-new inventions but the result of continuous, small-step improvements.

Even in the industry in which I worked for over four decades – financial services – I cannot think of any radical, game-changing products over recent years. Innovation in banking stems more from process and organisational change than from breakthrough products.

Case in point: 40 years ago, it took over a week for a bank to approve a home loan as the approval process was centralised in head office. Over time, banks decentralised decision-making and empowered area offices to approve loans, cutting the approval time to about two days. Today, thanks to even more streamlined approval processes, some credit providers approve loans within two hours.

Experience has taught me that for an innovation to be truly successful, it must solve some human problem or need. Yet many technological solutions are developed for problems which do not exist. That’s why I believe we need to reframe the way we think about innovation. In essence, innovation is about problem solving. Smart innovators don’t look for a clever idea but for a pragmatic problem.

The best ideas come in response to making people’s lives easier and better – an approach that served Edison well. Still, many organisations make the mistake of developing products and services without reference to customers. This results in organisations supplying products for which there is no demand. Google Glass is an example of this – it was a solution in search of a problem.

When it comes to digital technologies, the modus operandi must be people first and technology second. The primacy of the customer in the digital age is paramount. Technology for technology’s sake is a recipe for disaster. Prioritising innovation above solutions is rarely effective, yet it happens with monotonous regularity. Successful innovation lies not in the technology itself but in the actual use of the technology.

Technology must help people solve the challenges they face. To illustrate, over recent years there has been an explosion of “FinTech” companies – a ubiquitous term for technology applied to conducting financial services activities. As crazy as this sounds, FinTech is not about technology. Rather, it’s about finding new ways to solve old problems. The successful FinTech companies are giving customers greater control over how they spend, move, and manage their money.

Every business leader knows that innovation is important. Even so, there is a lack of agreement on what actually constitutes innovation as there is no universally accepted definition. Until we have common standards covering innovation, anyone will be able to claim that almost any development is innovative.

The well of human ingenuity may be bottomless, but it produces few genuine Eureka moments.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Should you always follow the orders of a superior?

Image credit: factmyth.com
THE PERILS OF OBEDIENCE

It remains the most heinous crime in modern history. The Nazi genocide of six million European Jews during World War II was unsparingly barbaric and unfathomably evil. The state-sponsored mass murders were not carried out by Hitler himself, but by his trusted lieutenants. One of these henchmen was the notorious Adolf Eichmann.

Eichmann was an SS Lieutenant Colonel and a key figure in the “Final Solution” – the Nazi plan to systematically eradicate the Jews of Europe from the human race. Eichmann was in charge of identifying, assembling, and transporting Jews to extermination camps in German-occupied Poland, including Auschwitz. As the architect of the Holocaust, Eichmann was effectively the chief executioner.

At his 1961 war crimes trial in Jerusalem, Eichmann was charged with managing the mass deportation of Jews to killing centres. His defence – like that of the Nazis tried earlier at Nuremberg – was based on obedience. Eichmann claimed that he was just following orders from his Third Reich superiors. He portrayed himself as an unwilling accomplice, arguing that he was “forced to serve as a mere instrument” of the Nazi war machine.

In attempting to shift responsibility for the deaths of millions of Jews, Eichmann’s “obedience defence” helped spark the interest of Stanley Milgram, a Yale University psychologist, in the topic of obedience. In July 1961, at the same time that Eichmann was on trial, Milgram began a series of experiments focusing on the conflict between obedience to authority and personal conscience.

The fundamental aim of Milgram’s studies of obedience was to discover whether a person could be coerced into behaving heinously, like Eichmann. Specifically, Milgram tested whether “ordinary” folk would inflict harm on another person after following orders from an authority figure. Milgram dismayed the world when he revealed how easy it was to turn everyday people into torturers.

The volunteer participants recruited to the experiment were told that they were part of a study to investigate the effects of punishment on memory and learning ability. In reality, Milgram wanted to test how far his subjects (acting as “teachers”) would go in punishing other subjects (acting as “learners”) by administering potentially lethal electric shocks each time the learners made a mistake on a quiz*.

The teachers were instructed by a man in a white lab coat (the “experimenter” authority figure) to deliver a shock of increasing intensity to the learners after each incorrect answer. Unbeknown to the teachers, the learners were in fact undercover actors. The actors initially took the shocks silently but then screamed after each jolt even though the shocks were not real – Milgram had built a fake shock machine. Regardless, the teachers believed that the shocks were real.

Milgram proved that ordinary people could be induced to abandon their moral instincts by a malevolent authority. A high proportion of the teachers delivered electric shocks – they felt disconnected from their actions when complying with orders. The shocking conclusion (pun intended) from the Milgram experiment was that people obeying commands feel less responsible for their actions even though they are the ones committing the act.

Today, the Milgram experiment is criticised on both ethical and scientific grounds. Still, many psychologists argue that even with moral lapses and methodological holes, the basic finding of Milgram’s work still holds up. Indeed, it has been used to explain atrocities from the Holocaust to the Vietnam War’s My Lai massacre to the abuse of prisoners at Abu Ghraib.

In August 1971, social psychologist Philip Zimbardo expanded upon Milgram’s research into the psychology of obedience. Zimbardo, a former classmate of Milgram, wanted to investigate further the impact of situational variables on human behaviour by using a two-week simulation of the rigid and uneven power structure in prison environments.

Zimbardo set up a mock prison in the basement of the Stanford University psychology department. He assigned his participants to play the roles of either “inmates” or “guards”, with Zimbardo himself acting as the “prison warden”. The behaviour of all involved was so extreme that the study had to be discontinued after a mere six days.

Soon after the experiment began, the guards began utilising authoritarian techniques to gain the obedience of inmates who they humiliated and psychologically abused. The powerless prisoners, in turn, became submissive and accepted the abuse with little protest.

Like Milgram’s obedience experiment, Zimbardo’s Stanford Prison Experiment has become infamous and controversial for breaking ethical guidelines relating to the treatment of volunteers. Even so, the experiment demonstrated the power of roles and the way ordinary people can turn cruel under the wrong circumstances.

Many, if not all, of the greatest human atrocities have been described as “crimes of obedience”, according to scholars Herbert C. Kelman and V. Lee Hamilton in their 1990 book, Crimes of Obedience. The authors’ initial impetus for researching and writing the book was provided by the trial of Lieutenant William Calley for crimes committed at the My Lai massacre, one of the most infamous atrocities of the Vietnam War.

During his court-martial, Calley used the Nuremberg defence – a “good soldier” simply “following orders” – to justify his premeditated murder of 22 villagers (infants, children, women, and old men) and the assault of a child of about two years of age. All the killings and the assault took place on 16 March 1968. Calley’s plea was that he dutifully followed his captain’s order to kill everyone in My Lai.

Many Americans saw Calley as a scapegoat in the chain of command and during his trial he received more than 10,000 letters of support. Following his conviction, the White House was inundated with mail objecting to Calley’s conviction. He was sentenced to life in prison, but after just three years of lax house arrest on military bases, he was released on parole in 1974 following the intervention of President Richard Nixon who reduced his sentence.

Like the volunteers in the experiments conducted by Milgram and Zimbardo, Calley was a good and decent man. A TIME magazine reporter who got to know Calley said of him:

There was nothing about Rusty Calley, as he was called, that would make you say that he was an explosion waiting to happen. He didn’t have killer instincts. He didn’t love guns. None of that was the case. He was a young guy from South Florida who loved being around people and going to parties. He was fun to be around.

■      ■      ■

It’s clear that people can succumb to the demands of authority however immoral the consequences. Indeed, people have a propensity to obey the most abhorrent of orders – and not just on the battlefield. Obedience to authority is ingrained in all of us from an early age. Throughout our lives, we encounter authority figures to whom we are answerable and this plays out in key relationships such as those between parent and child, teacher and student, boss and employee, priest and parishioner, and so on.

Humans have a strong predisposition to obey authority and this is driven by a tendency to please authority figures. But when legitimate authority is abused, blind obedience to inappropriate orders can lead to destructive outcomes. Our challenge is to stand up against arbitrary or unjust authority and not allow another person to define for us what is right or wrong.

We all have a moral compass which guides our decision making and this can help us resist the urge to comply with commands that are at odds with our value system. As noted by the American Psychological Association:

Acquiescence to the commands of an authority that are only mildly objectionable is often, as in Milgram’s experiments, the beginning of a step-by-step, escalating process of entrapment. The farther one moves along the continuum of increasingly destructive acts, the harder it is to extract oneself from the commanding authority’s grip, because to do so is to confront the fact that the earlier acts of compliance were wrong.

In the business world, employees can sometimes face unethical demands from employers. Some bosses fail to provide appropriate moral leadership by encouraging unethical practices in the workplace in the interests of, say, improving the bottom line. With regard to corporate scandals, some white-collar criminals trace their downfall to an excessive obedience to authority.

It’s ironic that some of the greatest crimes in history have been committed by those who followed the rules, not by those who broke them. Blindly following orders can be a recipe for disaster, particularly when they conflict with common decency. While a civilised society needs rules, regulations, and policies, they should not be inviolable. In all walks of life, people must be allowed to use common sense and good judgment.

We must think for ourselves and resist malevolent orders.

*In one variant of Milgram’s experiment, 65 per cent of “teachers” went all the way in administering shocks, starting from 15 volts (labelled “slight shock” on the machine) and progressing to the maximum 450 volts (“Danger: severe shock”).

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

How to change someone’s mind

Image source: Shutterstock
FACTS ALONE ARE NOT ENOUGH

You are about to debate an important issue with a colleague whose perspective differs from your own. You have developed a strong case to support your viewpoint and it’s backed up by hard, irrefutable data. Even though you’re confident that your argument is watertight, you fail miserably to sway your opponent to your way of thinking.

You have just learned an invaluable lesson – you can’t change a person’s beliefs with facts and figures alone. How you present your case is just as important. Research shows that evidence and logic by themselves rarely win arguments. The ability to persuade others to change their minds requires a mix of communication skills, empathy, and respect.

Our opinions are often based on emotion. Humans have an innate tendency to hold on to pre-existing beliefs and convictions as our brains are wired to ensure the integrity of our worldview. Consequently, we seek out information that confirms what we already know (confirmation bias) and dismiss facts that are contrary to our core beliefs (the backfire effect).

So, berating another because they don’t like our ideas, recommendations, or proposals is a recipe for disaster. If you want someone to see eye-to-eye with you, then – in the words of one writer – you need to remember that:

When persuading someone to change their mind on a major topic, what’s being said isn’t always quite as important as how it’s said. If a person feels attacked or disrespected or condescended to, they’ll turn off their brain and block out the most rational, correct arguments on principle alone. Homo sapiens are odd, emotional creatures, more amenable to a convincing pitch than poorly presented rightness. It’s why we vote for the guy we’d gladly have as a drinking buddy over the somewhat alienating candidates with a firmer grasp on the issues.

Productive exchanges between people are more likely to occur when there’s mutual respect. Discussions, therefore, need to be held in an environment where no one is disparaged or shamed and both sides are open to changing their minds. In short, there must be a goal shift from winning to understanding and this requires empathy.

The late Stephen Covey wrote about the importance of empathy in his bestselling book, The 7 Habits of Highly Effective People. Habit 5 – seek first to understand, then to be understood – encourages us to alter the way we listen to others. To change someone’s mind you need to address their emotional attachment to what they believe and this, Dr Covey argued, requires empathic listening.

According to Covey, people “listen with the intent to reply, not to understand”. Most of us are so focussed on our own agenda we don’t hear the other person as we talk at or over them. In contrast, empathic listening helps us get inside another person’s frame of reference with the intent of truly understanding how they see the world. Covey writes:

When another person speaks, we’re usually “listening” at one of four levels. We may be ignoring another person, not really listening at all. We may practice pretending. “Yeah. Uh-huh. Right.” We may practice selective listening, hearing only certain parts of the conversation. … Or we may even practice attentive listening, paying attention and focusing energy on the words that are being said. But very few of us ever practice the fifth level, the highest form of listening, empathic listening.

We spend years learning to read, write, and speak but receive scant training in the art of listening. Just think of all the times that you have debated or argued with someone. Did preaching to them about right and wrong change their mind? Did acting like a “logic bully” cause them to see the light? Did accusing them of being closed-minded or unreasonable help your cause?

I’ll bet that in each of these circumstances you faced the same outcome – a stalemate. Why? Because we all want to be understood, valued, and affirmed and this requires empathic listening. So, to change someone’s mind, we must stop talking and start listening. Listening is the key pathway to changing someone’s thinking and until your conversation partner feels heard, it’s almost impossible to change their mind.

Empathic listening is your secret weapon for influencing others and ensuring that you don’t butt heads. A columnist for the online publishing platform Medium put it this way:

When you come in guns blazing with all of your clear evidence, the other person will lock up. They’ll feel bullied and incapable of hearing you out. The best arguers are proven to use a small number of key points. They don’t rapid-fire or clap in the person’s face while they talk. They ask questions. They know changing someone’s mind is damn-near impossible. By asking questions, that person will change their own mind.

Great arguers stay calm, kind, and empathetic — no matter how ignorant or stupid their target is. They often open by acknowledging the things they agree on. Quite often, they compliment their opponent in the first minute. Opening soft is disarming. It’s unexpected. It highlights a desire for consensus rather than war and condescension.

Communications consultant and author, Lauren Schieffer, urges us to “get to know the person you are trying to influence. What matters to them? What brings them joy? What makes them angry? Understanding even a little bit about them helps you walk in their shoes with empathy”. You can then frame your message around the values of the other person, not your own.

In combination with empathic listening, another communication tool that you should consciously utilise is body language. Your non-verbal behaviours – facial expressions, gestures, posture, and tone of voice – send very clear messages which can be deciphered easily. If you roll your eyes or stamp your feet, for example, it’s blindingly clear that you’re not happy.

Your actions and mannerisms can speak louder than words, so remember that a genuine smile or tilt of the head will aid effective communication. Of course, it’s impossible to read body language and gauge sentiment if you are not communicating face-to-face. So, don’t try to resolve important matters via emails or messaging apps.

■      ■      ■

In my recent post, Has the world gone mad?, I mentioned that science deniers – whether on vaccines, climate, or evolution – cherry-pick evidence and draw on flawed reasoning techniques. Still, we should not give up on them even though their detachment from verifiable reality is incomprehensible to science believers.

According to Lee McIntyre, a research fellow at Boston University, the only way to change the minds of science deniers is to talk to them calmly and respectfully. In his book, How to Talk to a Science Denier, McIntyre acknowledges that the truth is under assault, with feelings outweighing evidence.

Even so, he believes that for most science deniers, change is possible and that if we don’t try, things will only get worse. McIntyre states:

Science denial is not just about doubt, it’s about distrust. The way you overcome distrust is not through sharing accurate information, it’s through conversation, face to face, in which you’re calm and patient and show respect and listen. Having the right attitude is the only thing that gives hope of success.

The world is undeniably polarised and our sense of shared reality is under attack. Denialism is dangerous and unfathomable, but one thing is clear:

“The ability to hear is a gift. The willingness to listen is a choice.” – Mike Greene

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Has the world gone mad?

Source: aconsciousrethink.com
THINGS DON’T MAKE SENSE

Maybe it’s due to the persistent drumbeat of bad news. Or perhaps social media has messed up our brains. It could even be the fault of the pandemic which has pushed some of us over the edge. Whatever the cause, the world seems to have gone a bit bonkers. We have lost our collective minds and our ability to make intelligent judgements.

Humans, of course, have always been notorious for making irrational decisions. Nevertheless, poor choices have become a mental contagion which has infected normally sane people and fuelled a growing disconnect between fact and fiction. An increasing number of us embrace conspiracy theories, reject scientific consensus, elect populist leaders, and promote wacky cures.

Even the smartest among us have moments when common sense escapes them, but things have got out of hand. During these uncommon times, illogical thinking has come to the fore in the face of uncertainty. Uncertainty causes our brains to overreact, and many of us have capitulated to irrational fear. Fear, in turn, distorts our risk assessments, causing us to overestimate threats.

Fear can become problematic when it’s disproportionate to the actual risk faced, such as with COVID-19 vaccines. Despite irrefutable scientific evidence to the contrary, millions have embraced the misleading claims and outright lies about the safety of COVID inoculations. This misinformation has largely been spread on social media platforms including Facebook, Instagram, and Twitter.

Unquestionably, vaccinations are one of the greatest achievements of modern medicine and have turned many childhood diseases into distant memories. Like all vaccines, COVID shots were proven to be safe and effective through rigorous testing processes. Even so, anti-vaxxers have been unwilling to roll up their sleeves for a jab – because they are fearful.

Vaccine deniers have been spooked by the spurious and unsupported claims about COVID vaccines including that they: contain microchips for government tracking; include metals and other problematic ingredients; alter your DNA and stunt fertility; and have caused widespread death and disease. It’s even claimed that the pandemic is a ruse by big pharmaceutical companies to profiteer off a vaccine.

While these conspiracy theories might seem harmless, they demonstrate a detachment from verifiable reality that can cause someone to believe almost anything. To paraphrase a headline in The New York Times, the real horror of anti-vaxxers is that their behaviour isn’t just a public health crisis – it’s a public sanity one.

Another group that clings to beliefs at odds with conventional scientific thought is climate change sceptics. These sceptics hold a range of views, from outright denial (it’s a hoax) to interpretive denial (it’s not a threat). The latter form of denial causes people to reframe climate change as natural and climate action as unwarranted. Thus, they do not contest the facts but interpret them in ways that distort their importance.

Humans instinctively push back against or completely reject facts that are contrary to their beliefs and this cognitive bias (the backfire effect) impacts how new information is processed. On the other hand, humans look for evidence which supports what they already believe to be true and this causes them to give credence to data which confirms that their view is right (confirmation bias).

These two cognitive biases work in tandem and help explain why climate deniers (a) ignore the hundreds of studies which show that humans are responsible for climate change and (b) latch on to the one study that they can find which casts doubt on human culpability arising from anthropogenic emissions of greenhouse gases.

Many believe that the catastrophic framing of climate change is self-defeating as it alienates people. I agree that doomsday scenarios don’t inspire action among deniers and also accept that merely talking about evidence or data does not change the mind of a sceptic.

So, I was drawn to a story in The New York Times which is void of scare tactics. The feature story, The Science of Climate Change Explained: Facts, Evidence and Proof, is written by a journalist with a PhD in geology. She calmly and pragmatically explains what will happen if we fail to address climate change – well worth a read.

Beyond vaccines and climate change, large swaths of humanity still snub science when it comes to Darwin’s theory of evolution. The beginning of the Earth, along with the birth of humans, remains a contentious issue between creationists and evolutionists. These protagonists continue to debate whether life on Earth was created in the blink of an eye or whether it evolved over millions of years.

Creationists insist that everything in nature was created by a deity who formed all life over a period of six days, as described in the Book of Genesis. Evolutionists reject this assertion by biblical literalists, citing scientific evidence showing that the Earth is about 4.5 billion years old and that all life evolved from primitive, single cell organisms.

To any evolutionary biologist, creationism is ludicrous. But to millions of creationists, particularly those in America’s Bible Belt, God remains the supernatural “intelligent designer” of the universe. The clashes between creationists and biologists can be explained, as noted in one article, through the lens of confirmation bias.

The latter (biologists) use scientific evidence and experimentation to reveal the process of biological evolution over millions of years. The former (creationists) see the Bible as being true in the literal sense and think the world is only a few thousand years old. Creationists are skilled at mitigating the cognitive dissonance caused by factual evidence that disproves their ideas. Many consider the non-empirical “evidence” for their beliefs (such as spiritual experiences and the existence of scripture) to be of greater value than the empirical evidence for evolution.

Debating creationists is a slippery slope as they do not adhere to facts or logic. What is scientific fact for evolutionists is irreverent blasphemy for creationists. As creationism argues that faith should take precedence over science, there is little hope for enlightenment – the scientific worldview is unlikely to ever supplant a creationist one. Well may we say “let there be light”!

Belief in ideas that have clearly been disproven by science remains widespread around the world. Rejecting scientific consensus has given rise to scientific denialism (dubbed the anti-enlightenment movement) and it has moved from the fringes to the centre of public discourse. An article in the international science journal, Nature, put it this way:

Science deniers – whether on vaccines, evolution or climate – all draw on the same flawed reasoning techniques: cherry-picking evidence, relying on conspiracy theories and fake experts, engaging in illogical reasoning and insisting that science must be perfect.

■      ■      ■

We view our ancestors as being blinkered by myth and superstition yet see ourselves as reasoned and enlightened. However, for all our advancement as a species, humans still behave irrationally. You just have to witness the global rise of a new political culture based on emotion and fear, in lieu of fact and policy, to know that something is wrong.

Perhaps there is no better example of this political irrationality than the election of Donald Trump which left millions of people around the world perplexed. His campaign – described in one critique as “a toxic mix of exaggerations, lies, fearmongering, xenophobia and sex scandal” – succeeded in elevating an unsuitable and unpopular nominee to the office of president.

Irrationality has defined much of human life and history and will continue to do so. We make irrational decisions with monotonous regularity, such as stripping supermarket shelves bare of toilet paper during a pandemic. As I explored in a recent post, How our lives are shaped by the choices we make, our reasoning processes are imperfect and this leads to poor choices.

To suggest that humans are rational is an irrational idea.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Do as I say, not as I do

Source of two-headed image: Quora
THE HYPOCRITE IN ALL OF US

How many of us practice what we preach? While it’s easy doling out gratuitous advice to others, living by the principles and values we espouse is another matter. The hard truth is that many of us display glaring contradictions in our behaviour, adopting one pose in public and another persona in private.

In all domains of life, people put on false fronts. Examples of double standards range from pious politicians promoting family values while secretly having affairs to two-faced parents telling their children not to smoke while doing so themselves. Inconsistencies between what we say and do abound as we often fail to meet our own moral code.

While most humans can be accused of duplicity, higher standards are expected of those who claim the moral high ground. Priests and other religious figures implore us to love our neighbour, yet (some) have committed unspeakable transgressions against children. Such unvirtuous behaviour is repugnant and has exposed the heinous moral hypocrisy of religious institutions.

Just as churches need to put their own houses in order before damning others, so do we. Everyone is prone to hypocrisy at one point or another in their life. Humans are not cold logical robots but fallible emotive beings, which is why we suffer from a misalignment between words and deeds, thereby making hypocrisy unavoidable.

High-status people are some of the worst hypocrites in society. These individuals are frequently admired by others and often occupy leadership roles. Yet, as author Peter Schweizer outlined in his 2006 book, Do as I Say (Not as I Do): Profiles in Liberal Hypocrisy, famous people are not holier-than-thou and also fall short in living their beliefs.

Schweizer conducted an investigation into the private lives of a handful of prominent US citizens and found a long list of blatant contradictions. To quote the book’s promotional copy:

Michael Moore … claims to have no stock portfolio, yet he owns shares in Halliburton, Boeing, and Honeywell and does his postproduction film work in Canada to avoid paying union wages in the United States. Noam Chomsky opposes the very concept of private property and calls the Pentagon “the worst institution in human history,” yet he and his wife have made millions of dollars in contract work for the Department of Defense and own two luxurious homes. Barbra Streisand prides herself as an environmental activist, yet she owns shares in a notorious strip-mining company. Hillary Clinton supports the right of thirteen-year-old girls to have abortions without parental consent, yet she forbade thirteen-year-old Chelsea to pierce her ears and enrolled her in a school that would not distribute condoms to minors.

The business world is similarly guilty of hypocrisy with companies displaying a lack of coherence between talk and action. Consumer activists have long argued that most business models prioritise profits over people despite the assertion by firms to the contrary. The classic example is the rag trade where global clothing brands have been complicit in the exploitation of sweatshop workers.

Sweatshops are as old as the industrial age and were started by heartless businessmen. Modern-day consumers must be careful not to be too sanctimonious about the plight of garment workers because they (as shoppers) have knowingly bought high-street brands supplied by factories which mistreat their workers.

One of the reasons that high-street clothing has been getting cheaper and cheaper for decades is that sweatshop workers do not receive a living wage. The suffering of these unknown workers on the other side of the world is easy for us as consumers to ignore, particularly as we have become accustomed to reaping the benefits of lower production costs.

Something else that we have become accustomed to is politics in sport and this was on full display during the Beijing Winter Olympics. The overwhelming message of the opening ceremony was about peace and togetherness. A giant LED snowflake sculpture was used to symbolise all people coming together and living in harmony.

Yet human rights organisations branded the 2022 Olympics as “the genocide games” and accused China of holding a million Uyghurs (a largely Muslim ethnic group) against their will in re-education centres. In response, many nations – including the US, Britain, Canada, and Australia – staged a diplomatic boycott of the games in protest at China’s repressive policies toward the Uyghur minority group native to Xinjiang.

Many saw the International Olympic Committee’s decision to award China the games as political hypocrisy. Having an alleged human rights abuser as host was called out as clashing with one of the fundamental principles contained in the Olympic Charter – a commitment to “the preservation of human dignity”.

Another international body which recently came in for criticism is the United Nations entity that supports and co-ordinates action on climate change. For nearly three decades, the UN has brought together almost every nation on Earth for global climate summits called Conferences of the Parties (COPs). The 26th annual summit – COP26 – took place in Glasgow last November.

As leaders from around the world made promises to tackle an existential threat to humanity, climate change activists and experts railed against the hypocrisy that accompanied it. As noted in a University of Southern California (USC) Annenberg Media report:

… a total of 400 private jets flew down to Glasgow from all over the world, carrying more than 100 leaders. This emitted 13,000 tonnes of carbon dioxide into the atmosphere. For comparison, the average person’s carbon footprint globally is 7 tonnes per year and the carbon footprint of an average American is 21 tonnes per year. The leaders have been called out by critics as “eco-hypocrites” for emitting a huge amount of CO2 while gathering for an event organized to curb greenhouse gas emissions.
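The comparison in the USC report can be checked with a back-of-the-envelope calculation. The sketch below uses only the figures quoted above (13,000 tonnes from the flights, 7 and 21 tonnes per person per year) to express the summit's flight emissions in "person-years" of ordinary carbon footprints:

```python
# Quick sanity check on the USC Annenberg figures quoted above.
JET_EMISSIONS_T = 13_000   # tonnes of CO2 attributed to the summit flights
GLOBAL_AVG_T = 7           # average person's annual footprint, tonnes
US_AVG_T = 21              # average American's annual footprint, tonnes

# How many person-years of emissions the flights represent.
print(round(JET_EMISSIONS_T / GLOBAL_AVG_T))  # ~1857 years at the global average
print(round(JET_EMISSIONS_T / US_AVG_T))      # ~619 years at the US average
```

In other words, the flights alone emitted roughly what an average person would emit over eighteen centuries, which is the nub of the "eco-hypocrite" charge.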

Climate change hypocrisy also extends to members of the British royal family. On numerous occasions over recent years, Prince Harry and Meghan Markle have been criticised for flying around the world in private jets while lecturing the world about climate change. A former UK government minister told Newsweek:

It’s completely hypocritical for Prince Harry or other members of the royal family to lecture people about climate change when they’re emitting more carbon than almost everyone else on the planet. People using private jets are in the top one percent of carbon emitters in the world.

Many citizens understandably jump up and down about humanity’s need to take climate change seriously. These same people typically look to governments and businesses to find eco-friendly solutions, when the real power for change is in our collective hands. We support governments with votes and businesses with dollars, which means that we can choose who governs and where we spend our money. We need to put our votes and our money where our mouths are!

■      ■      ■

“Hypocrisy is the natural state of the human mind,” according to Robert Kurzban, author of Why Everyone (Else) Is a Hypocrite. Kurzban argues that our behavioural inconsistencies are caused by the mind’s design, which consists of many specialised modules. These modules don’t always work together seamlessly, resulting in contradictory beliefs and violations of our supposed moral principles.

Consequently, hypocrisy is everywhere and can manifest itself in countless ways. To pretend that we can live our lives without hypocrisy and contradiction is itself a form of deception. We must, therefore, exercise care before angrily lambasting others for their deeds, while doing the same ourselves. People who live in glass houses should not throw stones.

We’re all hypocrites, it’s just a matter of scale.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

How our lives are shaped by the choices we make

Source: smartholistics.com.uk
LIFE DOESN’T JUST HAPPEN

Ever since Adam and Eve’s original decision to eat fruit from the forbidden tree in the Garden of Eden, humans have made some spectacularly poor choices. History bears witness to these monumental mistakes including the crew of the Titanic ignoring warnings of icebergs in their path, NASA proceeding with the space shuttle Challenger launch despite known problems with the solid rocket boosters, and engineers filling the Hindenburg with highly flammable hydrogen.

Bad decisions are part of life, though most do not have consequences that weigh as heavily as those just cited. Examples of non-fatal, flawed judgements include the 12 publishers who rejected J.K. Rowling’s first Harry Potter manuscript, the Decca Records executive who declined to sign The Beatles, and the Yahoo co-founder who turned down a US$44 billion takeover offer from Microsoft.

Hindsight may be 20/20, but risk is an inescapable part of every decision. We never know the outcome of a decision in advance – sometimes our choices turn out to be spot on, while on other occasions our judgments prove to be seriously flawed. In the words of the late French philosopher, Albert Camus, “life is the sum of all our choices”. History, by extrapolation, equals the accumulated choices of all mankind.

Our lives are defined by the series of choices we make every single day. They play out over a lifetime and ultimately determine our destiny. Our choices not only change our lives but the lives of others. We are not alone in our choices as we are part of a bigger picture – there is a chain of events associated with every decision we make. Thus, an individual deciding to buy environmentally friendly products can help change the world and make it a better place for everyone.

Some of the life-changing decisions that we make include where to live, how many children to have, and what career to follow. More mundane and routine choices include what to wear, and what to watch on Netflix. One of the paradoxes of life is that our bigger decisions are often less calculated than our smaller ones. We can agonise for weeks over what new car to buy but rapidly end a long-term relationship with little thought or deliberation.

Sometimes a snap judgment or instinctive choice is appropriate. Your emotions, though, can easily cloud your judgment, which is why most experts agree that the best decisions are made when there is a balance between logic and emotion. The invisible tug-of-war between the head and the heart is not a bad thing, as it makes you more likely to carefully weigh the pros and cons of each decision before choosing an alternative.

When your emotions are running high, your logic will be low, which can lead to irrational decisions. To illustrate, anger makes you vulnerable to high-risk, low payoff choices such as the rash decisions made during a bitter divorce. Happiness, on the other hand, makes you confident and optimistic about the future but can cause you to overestimate your chances of success, such as believing that your winning streak at the casino will continue indefinitely (aka gambler’s fallacy).

Knowing how to make good decisions is one of the most important skills we can possess. Many people look back at some of the terrible decisions they have made and ask themselves: What was I thinking? We make endless decisions, so we are bound to regret some of them. A Cornell University study estimated that the average adult makes thousands of remotely conscious decisions every day.

Each decision you make is a trade-off as everything you say, do, or pursue has a cost and a benefit. In the language of economists, this trade-off is called an opportunity cost. The term “opportunity cost” is defined as “the cost of an alternative that must be forgone in order to pursue a certain action”. Put simply, it’s what a person sacrifices when they choose one option over another. An example will help here.

Let’s say that you have $100 in your purse and you can spend it on a pair of jeans or a meal. You choose to buy the denim jeans, so the opportunity cost is the restaurant meal you cannot afford. For everything you choose to do, there’s something else you won’t be able to do. Every day as consumers, we are forced to make such choices due to “scarcity”. Scarcity and opportunity cost are two interlinking economic concepts.
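The jeans-versus-meal trade-off is simple enough to express in a few lines of code. The function below is a toy sketch of the definition just given (the prices and option names are the illustrative ones from the example, not real data): the opportunity cost of a choice is whatever affordable alternative you gave up.

```python
def opportunity_cost(budget, options, chosen):
    """Return the forgone alternatives: options you could afford but gave up."""
    affordable = {name for name, price in options.items() if price <= budget}
    if chosen not in affordable:
        raise ValueError("Chosen option is outside the budget")
    # Everything affordable except the choice itself is sacrificed.
    return sorted(affordable - {chosen})

# With $100 in your purse, buying the jeans means forgoing the meal.
options = {"denim jeans": 100, "restaurant meal": 100}
print(opportunity_cost(100, options, "denim jeans"))  # ['restaurant meal']
```

The scarcity constraint is the `budget` parameter: remove it and nothing is forgone, which is exactly why economists say that without scarcity there would be no opportunity cost.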

Economists view the world through the lens of scarcity. Indeed, without scarcity, the science of economics would not exist. Scarcity arises because, as a society, we have unlimited wants but limited resources. We all know that you can’t have everything you want – we have to choose and make trade-offs. Economics examines how individuals, businesses, and governments deal with the limitations imposed by scarcity.

Broadly speaking, economics is the study of human behaviour as it relates to money. When it comes to financial decisions, economists erroneously claim that humans are rational and unemotional decision makers. Psychologists, on the other hand, correctly contend that economists’ models bear little relationship to actual human behaviour. The harsh reality is that humans do not obey the efficient, orderly principles espoused by free-market thinkers.

The Global Financial Crisis (GFC) confirmed that we are far too emotive for rational economic models to accurately predict our conduct. Many people in the US bought houses at grossly inflated prices and expected their value to keep rising. In the process, borrowers saddled themselves with loans that they could not afford, which led to the subprime mortgage meltdown and ultimately the catastrophic GFC.

This “irrational exuberance” was not confined to the household sector. Borrowers, bankers, and brokers were united in the delusional belief that house prices never go south. Post-GFC, many people turned to behavioural economics to understand what happened. Behavioural economics combines psychology and economics to explain how people really make decisions when they spend, invest, save, and borrow.

Unsurprisingly, few people reach the level of expertise necessary to rightfully claim that they are an expert decision-maker. The development of genuine expertise in any field requires years of struggle and sacrifice. Still, you can be a good decision-maker if you choose actions that produce the best outcome for yourself and others. The trick is to make each decision with an open mind and be aware of your unconscious biases.

Cognitive biases distort thinking, influence beliefs, and sway the decisions we make every day, yet most people are unaware of them. Over the course of our lives, we all develop cognitive biases. Just watch the daily news, listen to talkback radio, or scroll through social media posts to witness biases in action as people argue over politics, climate change, and other hot topics. Everyone, of course, claims that their position is the right one.

Differences of opinion occur because we all have our own perspectives based on our preconceptions, past experiences, and the information we draw on in forming judgements. When it comes to gathering information, many of us are guilty of confirmation bias – readily embracing information and conclusions which align with our views and largely ignoring anything which contradicts our beliefs.

■      ■      ■

The next time that you make a bad decision, just remember that it could have been worse. Imagine being the individual responsible for allowing the famous Trojan Horse to be brought inside the City of Troy, not realising it was full of Greek soldiers. And how would you have felt standing in Napoleon’s shoes after he invaded Russia, suffered a catastrophic defeat, and returned home with just a fraction of his once grand army?

You can minimise regrettable decisions by learning from your mistakes – history does not have to repeat itself. Humans have a tendency, however, of replicating the same blunders over and over (poor diets, dysfunctional relationships, impulsive buying, etc.) causing us to relive our errors. If you want a different result, you have to do something different – make better decisions!

We are what we choose to be.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Why we fail to detect subtle changes

Source: hu-hu.facebook.com
PERILS OF SHORT-TERM THINKING

Humans sit at the apex of the evolutionary tree with the most complex brain of any animal, yet some believe that there is a design fault. Our brains have evolved to respond to immediate threats, so we are not wired to detect more gradual warning signs. That’s why we can duck out of the way of a cricket ball in a fraction of a second, but fail to react to repeated and serious threat assessments about a deadly new virus for which there is no treatment.

In the early phases of human existence, our ancestors faced an onslaught of daily challenges to their survival – from predators to natural disasters. Too much information can confuse our brains, leading to inaction or poor choices that place us in harm’s way. Consequently, our brains evolved to filter information rapidly and focus on what is most immediately essential to our survival.

Daniel Gilbert, a professor of psychology at Harvard University, argues that threats that develop over decades – rather than seconds – circumvent our brain’s alarm system. To illustrate, he says that we take alarm at terrorism but react far less to global warming, even though the odds of a disgruntled shoe bomber attacking our plane are, he claims, far longer than the chances of the ocean swallowing parts of Manhattan.

Assessing and reacting to risk is one of the most important things we do as humans. Nonetheless, as Professor Gilbert points out, in our short-sighted world we don’t perceive long-term challenges which threaten our existence, which is why he asserts that:

… if alien scientists were trying to design something to exterminate our race, they would know that the best offense is one that does not trigger any defense. And so, they would never send little green men in spaceships. Instead, they would invent climate change, which produces apathy not action (bold text added).

“Humans are very bad at understanding statistical trends and long-term changes,” notes political psychologist, Conor Seyle. “We have evolved to pay attention to immediate threats. We overestimate threats that are less likely but easier to remember, like terrorism, and underestimate more complex threats, like climate change.”

Right now, humanity faces a number of risks, but they are not on our collective radar as they will not impact us for a long time – decades and longer. Some of these risks are called existential risks as they have the capacity to wipe out humanity. For instance, in about a billion years – give or take a few hundred million years – the increased brightness of the Sun will doom the Earth’s biosphere.

In the more immediate future – say, the next century – the greatest threat to humanity is ourselves. More specifically, according to an article published by online media outlet Quartz, the most dangerous threat to humanity is the human mind.

The defining characteristic of humans is our capacity for complex thinking and advanced reasoning. These abilities have allowed us to develop innovations that transform our lives and our world … (but these) … innovations have also created new problems, many of which threaten our existence .… Climate change, pollution, economic and social disruption due to emerging technologies, political polarization, misinformation, inequality, and large-scale conflict are all major challenges for humanity to overcome that have arisen from our own innovation.

We are unlikely to effectively solve these problems unless we truly understand their ultimate source: the human mind. In line with this thinking, the Centre for the Study of Existential Risk at the University of Cambridge believes that the four greatest threats to the human species are all man-made – artificial intelligence, global warming, nuclear war, and rogue biotechnology.

In his bestselling book, Sapiens: A Brief History of Humankind, historian and renowned author, Professor Yuval Noah Harari, states that humans “… have the dubious distinction of being the deadliest species in the annals of biology”. We are the most advanced and most destructive animal to have ever lived – making us brilliant and deadly. This lethal combination causes some to proffer that a man-made global pandemic should be added to the list of threats to humanity.

Experience has taught me that it would be wise to further augment the list with unknown unknowns. We humans are sometimes too clever by half in believing that we have covered all bases. In reality, no one can say with absolute certainty that there is not an unknown threat lurking around the corner which will take us by surprise. Consequently, the greatest risks in the years ahead may come not from threats we’ve identified, but from those we haven’t.

It’s clear that our short-term brains can’t cope with long-term perils. We are focussed on the here and now to the detriment of distant risks. Our inability to look beyond the current news cycle is reflected in a phenomenon called short-termism – the constant pressure to deliver instant results.

Short-termism has become endemic in society, and it pervades all aspects of our lives. We want quick-fix surgery to rectify imperfections IMMEDIATELY. We crave crash diets to lose weight FAST. We consume energy drinks to heighten alertness NOW. We expect politicians to respond to tracking polls TODAY. And we require companies to achieve a turnaround in earnings PROMPTLY.

In an article titled The perils of short-termism: Civilisation’s greatest threat, BBC journalist Richard Fisher paraphrases angel investor Esther Dyson: in politics the dominant time frame is a term of office, in fashion and culture it’s a season, for corporations it’s a quarter, on the Internet it’s minutes, and on the financial markets mere milliseconds.

The world is plagued by short-termism and our challenge is to look at things through a longer lens. Perhaps we should remember that the next time we feel like shrieking in anger at having to don a face mask during a pandemic, forgetting the long-term benefit to humanity of controlling a deadly virus.

COVID-19 is the latest example of long-term success being held hostage to short-term thinking. The pandemic influenced many people to focus on short-term outcomes and instant gratification. Clearly, we need to reframe our thinking and develop a longer game plan for society.

It’s time for humanity to see the bigger picture.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

“Night Before Christmas” 2021 – pandemic year in review

Source: pymnts.com
A COVID SPIN ON A CLASSIC POEM

‘Tis the week before Christmas after a year of precaution,
Masks are off, yet there is still much caution.
The holidays are approaching, but not for the pandemic,
The virus remains a threat, it’s not academic.

As people the world over nestle snugly in their beds,
Memories of lockdowns dance in their heads.
The hope of families, a Christmas that’s virus free,
The best sort of present, under the tree.

Stay-at-home restrictions generated such a clatter,
Yet keeping the sick isolated really did matter.
Containment measures, the order of the day,
Zoom meetings and home schooling, little time to play.

A surge in e-commerce, our behaviour shifted online,
Our lives became very different, we did just fine.
We found everything we needed, nothing to fear,
A fleet of Amazon trucks, delivering some cheer.

Long before Christmas my shopping was done,
No last-minute rushes, ’cause that’s not fun.
My grandchildren are fine, and exceptionally nice,
Their presents are coming, they won’t have to ask twice.

Now, EMMA! now, JESSICA! now, OSCAR and ELIAS!
On, HARRISON, on, ABRAHAM, on, NAYAH and EMILY, there’s no bias.
Granddad loves you all, you bring such joy,
You each deserve a gift, perhaps a big toy.

Santa’s arriving, with eight socially distanced reindeer,
And he’ll be kitted out, in personal protective gear.
His sleigh will be sanitised, and wiped thoroughly clean,
It will sparkle and shine, fit for a queen.

For a while it seemed gifts would be delivered by drones,
Without clearance to travel, Santa was to be replaced by drop zones.
The Christmas supply chain, held together by the elves,
They did a marvellous job, so we can enjoy our festive selves.

But before letting down our hair to celebrate another year,
Let’s remember those who suffered, and those no longer here.
Many succumbed to COVID, the pandemic’s tragic cost,
Celebrating Christmas without loved ones, makes us feel lost.

Infection rate suppression, remains the name of the game,
Until we defeat COVID, life won’t be the same.
Vaccines are our best hope, to keep the virus at bay,
Be sure to get a jab, so the world can come out and play.

Pfizer, Moderna, AstraZeneca, each a household name,
They’re also scientific heroes, a virus they did tame.
Prevention measures remain important, they’ve acted as a tether,
The dream of unrestricted movements, requires us to work together.

Christmas must not be, a super spreader event,
Let’s do the right thing, another outbreak to prevent.
In this season of goodwill and kindness to others,
Be on your guard, protect our sisters and brothers.

For now our thoughts turn to the season of goodwill,
And the excitement that comes from stockings to fill.
It won’t be long before Santa’s on his way,
If you listen carefully, you’ll soon hear his sleigh.

As I sign off for Christmas, I thank all readers of this blog,
I hope my fortnightly posts have left you agog.
May the spirit of the season fill your home with cheer,
As I say “Merry Christmas to all and to all a good New Year.”

Before you go …
This is my final blog post for the year. I hope that I’ve kept you informed and entertained during 2021. I’m taking a short break from my blogging duties and will be back on-line on Sunday, 30 January 2022. Have a great New Year.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Behind the scenes in the life of a blogger

Photo: Sean Boyd/In the Frame Productions
PERSONAL INSIGHTS

In the few short moments that we have together as you read this post, may I begin by thanking you for following my blog during 2021. In the lead up to Christmas, people give shout-outs to loyal clients and I wanted to let you know how much your continuing patronage of Elephant in the Room means to me.

The public comments and private feedback that I receive inspire me to continue as a blogger, and to work hard to curate great content – delivered straight into your inbox. I have a loyal community of readers who click on each fortnight to view the latest post that I have published in cyberspace.

In our rapidly changing digital world, we must always be learning, which is why the best blogs provide information that helps people in search of answers. Lifelong learning is now seen as an economic imperative and well-crafted blogs can assist online knowledge seekers.

This blog is a place for reasoned argument supported by corroborating evidence to give you a clear understanding of the forces shaping our world. My blog brings readers face-to-face with the issues that are shaping politics, impacting economies, transforming societies, and driving technology.

It is my enduring hope that this eclectic mix of topics will pique your interest and encourage you to read more extensively for yourself. Nelson Mandela believed, quite rightly, that “education is the most powerful weapon which you can use to change the world”.

All posts published under the Elephant in the Room banner are designed to be interesting and educational. They are replete with content which is topical and open to debate and discussion. I do my best to present both sides of an argument before outlining my own position on contentious issues.

While I’m not a journalist, I’m aware that a basic tenet of fair journalism is captured in the Latin phrase audi alteram partem meaning “let the other side be heard as well”. That maxim requires that any report should be balanced and fair towards all parties.

Unlike most blogs, I don’t focus on a single niche topic (e.g., dog training, gardening tips, and so on). Rather, I deliberately cast a broad net and publish posts that are wide in sweep – but that does not mean my blog is a hodgepodge of anything that interests me.

The assorted topics that I cover are grouped under four umbrella categories – Political, Economic, Social, and Technological. These categories work in unison to provide readers with fresh perspectives on the interplay between a range of PEST issues which are of national and international significance.

Elephant in the Room shines a light on some of humanity’s biggest challenges. In a world which is increasingly interdependent, the subliminal message in many of the posts is that we need to reframe our thinking and see ourselves as global citizens working together to create a more harmonious society.

The posts are deliberately designed to make you think as they tangle and weave through disparate but connected topics. By joining the dots, you will gain a helicopter view of where individual disciplines intersect and overlap, thereby enabling you to see more creative solutions to contemporary problems.

For my part, I have an inquiring mind and am always imagining how the world could be a better place. That’s why one of my all-time favourite quotes is by Robert Kennedy: “Some men see things as they are, and say ‘Why?’ I dream of things that never were, and say ‘Why not?’”

Each post ends with a pithy one-liner, often in the form of an aphorism. Aphorisms are pointed, witty statements which express a general truth and are sometimes paraphrased quotes. My closing one-liners are designed to pack a punch and leave you pondering.

Blogging helps me keep up with what’s happening in the world. It’s also a great way to become a thought leader, but it does require some effort. Unless you are a walking encyclopedia, most posts require you to conduct research and check facts and this increases your understanding of an issue.

When it comes to blogging, content is king and the seed of an idea for a post can come from anywhere. Some of the articles that I have written germinated when I grew curious about a subject and decided to explore it. Others have been penned in direct response to a contemporary issue.

Regardless, this blog has provided me with a creative outlet in which to share my ideas and opinions. In the process, it has enabled me to create a professional portfolio of “short papers” on important topics. This has required me to distil a lot of information into coherent and cohesive arguments.

The golden rule of blogging is that you have to be authentic as it’s an up close and personal writing medium. So, my relationship with my audience is built on being open, transparent, and factual. My blog is an online extension of my true personality – a real version of my “doubting Thomas” self.

I’m always intrigued as to what subject matter piques the interest of my readers. I still can’t explain what makes certain posts more popular than others. The reality for all bloggers is that some posts rank higher than others on Google and attract more social shares and “likes”.

The biggest thing that I have learned in researching and writing blogs is how often supposed experts are wrong. “Experts” who appear on television, get quoted in newspapers, and speak at conferences are often no better than the rest of us when it comes to the risky business of predictions.

I’m deeply indebted to my behind-the-scenes webmaster, Kieran Weston. Kieran is a family friend and one of nature’s gentlemen. He meticulously uploads and publishes each post and professionally maintains the blog site. He is a talented executive and web designer and I salute his unfailing support – on a voluntary basis.

Someone else who deserves praise is my wife, Beverley. She proofreads each post before publication and has developed an eagle eye for spotting grammatical and typographical errors. Beverley is also a volunteer but extracts payment in other ways! I have made a rod for my own back by encouraging her to point out my mistakes – which she happily does!

■      ■      ■

This post is my penultimate missive for 2021. In reflecting on the year that was, humanity faced looming threats and some hard truths. Yet, despite the dire warnings of the headline-grabbing doom-and-gloom merchants, we are still here. COVID-19 did not wipe us out, China did not start a nuclear war, and America did not implode. Even the Tokyo Olympics went ahead!

My next blog post on 19 December will be the final one published for 2021 and will take the form of a Christmas parody. It will be set to the rhyme scheme of Clement Clarke Moore's classic poem, The Night Before Christmas. It will broadly imitate the style and form of Moore's original lyric while addressing a different subject matter – a look back at the biggest news story of the year, COVID-19.

As we approach the season of goodwill to all, my Christmas wish is that we reflect as a nation on all that is good about Australia. In truth, we have little to complain about. There may be a place where the grass is greener, but in all my travels, I am yet to find it. May peace and happiness be yours during this holiday season.

Have a sparkling New Year!

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting

Is the era of the specialist over?

Source: thereluctantcfo.com
BECOME A DEEP GENERALIST

In 2018, the world held its collective breath as two Australian doctors spearheaded the rescue mission of 12 schoolboys and their soccer coach who were trapped underground in a flooded Thai cave. Both Aussie rescuers are proficient medicos and adept divers, and it was this atypical combination of skills that made the duo perfect for the daring operation.

Research has long shown that we can all gain from spending time outside of our specialism. With reference to the cave rescuers, they have formal training and acclaim in two unrelated domains – medicine and cave diving – and this qualifies each of them to be called a polymath*. The word polymath comes from the Greek polymathēs, meaning "having learned much", and entered the English language in the 17th century.

Throughout history, many notable individuals have pursued multiple interests. Albert Einstein was an accomplished violinist as well as a physicist. Leonardo da Vinci was an artist, inventor, scientist, architect, and engineer. Thomas Edison was a prolific inventor, entrepreneur, and businessman. History's most intriguing figures – the original Renaissance men among them – were all polymaths, or deep generalists.

The label “polymath” is often applied to Elon Musk as he excels in multiple fields and has used his cross-discipline expertise as a physicist, engineer, economist, and entrepreneur to tackle some of society’s most pressing challenges. He has built three multibillion-dollar companies in three disparate industries – aerospace (SpaceX), automotive (Tesla Inc.), and energy (SolarCity). But that’s not all!

In 2016, Musk co-founded a mind-computer interface company (Neuralink Corp.) which is developing brain implants that can communicate with computers. In the same year, he started a tunnel construction business (The Boring Company) to create fast-to-dig transportation tunnels. Musk also came up with the idea for an ultra-high speed, futuristic transportation system (The Hyperloop).

Throughout his life, Musk has displayed a relentless pursuit of knowledge and an unrivalled talent for applying his learnings across a range of industries. He has been called the quintessential modern polymath. His world-changing intellect has become a symbol of the power of being an expert generalist with the ability to generate breakthrough insights and innovations.

Yet conventional wisdom still frowns on being a “jack-of-all-trades, master of none”. From the time we enter school, we are constantly encouraged to specialise by choosing a clear path and then sticking with it. And once we enter the workforce, the pressure to specialise is ever present. Being a generalist has long been seen as the road to mediocrity.

Paradoxically, research shows that people with too many interests are more likely to succeed. This certainly holds true for the men behind five of the largest companies in the world – Bill Gates (Microsoft), Steve Jobs (Apple), Warren Buffett (Berkshire Hathaway), Larry Page (Google), and Jeff Bezos (Amazon). All are polymaths who follow the 5-hour rule – setting aside at least five hours each work week for deliberate learning.

Polymaths see the world differently and make connections that are otherwise ignored. A case in point is Francis Crick, who co-discovered the structure of DNA with James Watson. He began his scientific career in physics and later made the transition into biology. Crick claimed that this diverse background gave him the confidence to solve problems that other biologists couldn't.

Many of the world's other great inventions likewise arose from multifaceted thinking. Nikola Tesla was a pioneer in many fields and is best remembered for his work on alternating current (AC) power and his early contributions to radio technology. In his inventions, he drew on his skills as an electrical engineer, physicist, mathematician, and futurist. Elon Musk's electric car company is named in Tesla's honour.

Even though the world remains obsessed with specialisation, the evidence for deep generalists is growing. In his book, Range: Why Generalists Triumph in a Specialized World, David Epstein examines the world's most successful individuals across a range of human endeavours. He discovers that in most fields – especially those that are complex and unpredictable – generalists, not specialists, are the ones who excel.

Epstein reports that when researchers study great innovators, they typically find “systems thinkers” with an “ability to connect disparate pieces of information from many different sources” and who “read more than other technologists”. Simply put, generalists invariably do better than specialists in putting two and two together across domains.

Charles Darwin is considered by Epstein to be the ultimate example of someone whose breadth of training enabled him to remain open-minded and innovative. Prior to sailing to the Galápagos Islands, Darwin studied natural history, medicine, theology, and geology. This cross-training enabled him to build the intellectual firepower that he would later need to overturn centuries of dogma.

The Digital Age has made it easier for us to become polymaths. Today, information is everywhere, and more often than not, it’s free. Wannabe polymaths can become proficient in multiple fields by allocating at least one hour per day for deliberate learning and reading. The one habit that all high-performers share is reading lots of books across various disciplines.

My desire to continually learn new things and improve my knowledge is one of the reasons I maintain a blog. Without exception, the research I undertake in writing each post helps broaden my horizons and aids my self-development. Moreover, learning keeps my brain active and stimulated and this (hopefully!) helps boost my cognitive health.

For readers of this blog, my posts are deliberately designed to make you think, as they tangle and weave through disparate but connected topics. By joining the dots, you will gain a helicopter view of where individual disciplines intersect and overlap. And by cross-pollinating ideas from a range of fields, you will be able to make new connections and see more creative solutions to contemporary problems.

Notwithstanding this, Western educational systems still lean towards deep specialisation, which is why UK researchers argue that we need a radical shake-up of school curriculums to ensure arts and sciences are no longer taught separately. Educational experts believe that teaching children to think like Leonardo da Vinci would better prepare them for tackling complex issues.

When it comes to tertiary education, it has long been my contention that universities around the world will increasingly be challenged to turn out graduates with broader interdisciplinary degrees. The answers to the big global issues we face – like climate change – cannot be found within traditional single disciplines such as economics or science or politics on their own.

For this reason, I believe that one subject that should be embedded in most university degrees is the study of biomimicry. Biomimicry is the art and science of emulating nature’s best biological ideas and applying these solutions to product design, architecture, engineering, technology, business, and medicine. Biomimicry is relevant in every sector of society.

Velcro is probably the best-known example of innovation inspired by nature. The product’s inventor, George de Mestral, stumbled upon the idea by examining how burrs stuck to the hair of his dog. By mimicking the strong attachment forces of the burrs’ small hooks, he was able to develop Velcro straps and fasteners.

Similarly, Airbus observed how sea birds sense gust loads in the air with their beaks and adjust the shape of their wing feathers to suppress lift. As a result, Airbus installed probes on its A350 aircraft which detect gusts ahead of the wing and deploy moveable surfaces for more efficient flight. Airbus engineers continue to study the natural world for modern aircraft design solutions.

Mother Nature is by far the smartest “person” I know. She is the ultimate polymath and genius and we humans can learn much from her. She has been giving lessons in design and solving problems for billions of years, but only in recent times have we started “enrolling” in her classes. Using nature as a mentor, professionals from a range of fields are now studying biomimicry, but more of us need to look to nature for creative solutions.

■      ■      ■

It’s never too late to pick up a new area to add to your repertoire of skills. If you can combine unique skills in creative ways, you may well be one of tomorrow’s great problem-solvers and innovators. In an era of rapid-fire technological and social change, we all need to embrace our inner polymath because we are more than the sum of our parts.

May polymaths inherit the Earth.

*Some academics argue that only individuals proficient in three disparate areas can call themselves polymaths.

Regards

Paul J. Thomas
Chief Executive Officer
Ductus Consulting