VUCA: Blah! Blah! Blah!
Agility: Blah! Blah! Blah!
Shareholder activism: Blah! Blah! Blah!
Big Data, digitisation, robotics, artificial intelligence (AI) and cybernetic technologies overshadow these business terms and activities and remain the elephant in the room. In fact, technology is moving at such a pace that Moshe Vardi, a computer scientist at Rice University, argues that machines may be capable of doing almost any job a human can by 2045.
Just a couple of decades ago it was electronics and telephony that made us both nervous and excited at the speed of their progression. Today it is computing devices, the Internet, online retailing, social media platforms, 3D printing, drones, driverless cars and the like that, with their speed and scale, have changed the way we work and live.
For those with the knowledge and skills in this area, it provides a distinct advantage. For many others, however, it presents a huge challenge, one that may leave them behind, or even out of a job. Is there a chance that this digital landscape could bankrupt our future way of living and working?
Before panic sets in that the end of civilisation as we know it is nigh, let us explore the similarities between technology and humans, as well as the differences. These insights just might put us humans a little more at ease!
Contrary to past beliefs, the brain continues to undergo molecular change as we age, retaining its synaptic plasticity. In recent years, advances in neuroscience, neuropsychology and neurobiology, together with fMRI technology, have provided us with insights and evidence into how our brain structure and processes can change to mediate behaviour.
For context, a simple way of understanding the brain is as a biological computer consisting of three parts working as one. The neocortex (also known as the analytical mind) is responsible for higher-order functions: analytical thinking, decision-making and creativity. The limbic brain is the emotional command centre, running all basic social interactions; it is the seat of our value judgements and our habits. The reptilian brain, or instinctive mind, is non-conscious, geared for survival and regulating major body processes. It can be somewhat rigid and compulsive.
Additionally, human beings have the capacity to 'think'. Whether we are daydreaming or contemplating, we have the ability to interpret, and our brains are wired to connect socially.
Another way of understanding the brain is offered by the psychologist Daniel Kahneman, who describes 'System 1' and 'System 2' thinking - two systems that drive the way we think and make choices.
System 1 works easily and automatically and doesn't take much effort; it makes quick judgments based on familiar patterns. It is fast, intuitive and emotional. System 2 takes more effort; it requires intense focus and operates methodically. It is more deliberative and more logical. These two systems interact continually.
So what are the similarities? The neocortex (analytical mind) is somewhat similar to a computer in that it analyses and decodes information. And, like 'System 2' thinking, computers are more logical. Similarly, computers can run neural networks, sets of algorithms or logical operations, much as the brain is made up of some 100 billion neurons that communicate across trillions of connections called synapses. Computers work through an interaction of physical hardware components and software instructions and, like our own brain, can translate instructions and perform calculations.
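The analogy can be made concrete. In an artificial neural network, each 'neuron' sums its weighted inputs and fires when that sum crosses a threshold, loosely mirroring how a biological neuron integrates signals arriving across its synapses. A minimal sketch in Python (the weights, inputs and threshold below are purely illustrative, not drawn from any particular system):

```python
# A single artificial neuron: inputs arrive over weighted connections
# (loosely analogous to synapses) and a simple activation rule decides
# whether the neuron "fires".

def neuron(inputs, weights, threshold=1.0):
    """Return 1 when the weighted sum of inputs reaches the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Two example stimuli passed through the same hypothetical weights
print(neuron([1.0, 0.5], [0.8, 0.6]))  # weighted sum 1.1  -> fires: 1
print(neuron([0.2, 0.1], [0.8, 0.6]))  # weighted sum 0.22 -> silent: 0
```

Real networks chain thousands or millions of such units together and learn their weights from data, but the basic unit is no more than this weighted sum and decision rule.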
However, the difference is that computers lack the functioning of the limbic brain and of 'System 1': they do not have emotions and feelings. Although computers can be trained to read the emotions and reactions of individuals, they are not wired to connect socially and are not intuitive.
This is where the concept of cognitive diversity comes into its own. Each and every one of us thinks in different ways. Individuals have different ways of perceiving, interpreting, categorising, organising, processing, reflecting, inferring, adapting and communicating.
A robust argument for encouraging cognitive diversity is that it fosters and embeds innovation and creativity into our ways of working and living. It harbours a 'diversity mindset'. It goes without saying that if individuals with differing cognitive abilities work together, then differing perspectives and alternative proposals for other courses of action will follow. An increasing awareness of issues and options will result in more extensive discussion of strategic options, and more learning opportunities. Further, as the brain pool is diversified, it reduces the likelihood of a groupthink-type phenomenon occurring.
Further arguments for the merits of cognitive diversity relate to the market, customer and product mix. A diverse group allows industry to understand and respond better to the needs of ever more complex markets. A company that can match its own internal diversity with the external diversity of its customers is going to satisfy more people more of the time, and prosper in the process.
As stated, computers are logical and rational. They are programmed and their output is objective. Nevertheless, they do not hold the attributes that humans have, such as feelings, mood, nuance, emotion, needs, insight, intuition and ethics: the very hallmarks of humans. The mind is not a linear computer; it is richly paradoxical, leaning towards irrational rationality, and it produces a rich mosaic of competing thoughts, ideas and opinions. The downside of this richness is that, as humans, we lean towards subjectivity and biases, and of course we cannot be automated, which in some instances can lessen efficiency and increase costs.
However, even in areas where machines match or exceed human capabilities and challenge our creativity (e.g. Google's artificially intelligent Go-playing computer system beating South Korean Go master Lee Sedol, and Google's 'Deep Dream' art generator, which uses neural networks to recognise patterns), there will be insistence that certain tasks and decisions remain in human hands: juries in courts of law, face-to-face doctor/patient interaction, religious ceremonies, primary school teaching, police and detective work, nursing aides and so on. Processes and activities in these professions might be digitised, but interpreting and communicating their outputs remains a human activity.
There is no doubt that technology has created tremendous new avenues for growth and profitability that companies such as Amazon, Microsoft, Google and Facebook enjoy. Add into the equation the many more entrepreneurial businesses offering different ways of providing services, such as Uber, Airbnb, Alibaba, Netflix, Zappos and Snapchat, to mention a few. What they offer is sometimes called disruptive innovation or creative destruction. Nonetheless, these services still require a human brain to conceptualise and actualise such differentiated offerings.
Increasingly, there is also the rise of the artisan: somebody who does the work largely by themselves and does not typically belong to a large organisation. Another way of describing an artisan is as a creative person who produces non-standard items. In a world where consumers increasingly prefer high-end products that materially differentiate them from others, artisans will flourish.
I began this blog by asking: is there a chance that this digital landscape could bankrupt our future way of living and working? The fact is that digital is a journey of discovery with no clear destination, and humans should not be pessimistic in the face of advancing technology. Rather, humans and technology can complement each other. Evolution is the natural state of society and, from an optimistic standpoint, as computers to date do not have the capacity to be emotional or social, not only can we learn and evolve with them, but we also have the advantage of our brains evolving and adding humanness into our lives and work.