Scroll down for earlier posts
Brexit and Trump: Missed, the OTHER damning reason that needs urgent attention, or else ….
With social media, the newspapers and airwaves, the politicians, the ‘experts’ and the public full of it, there is one explanation for the Brexit and Trump revolutions that seems to have escaped everyone: the UK’s and the US’s obstinately low levels of productivity growth.
For the earthquakes, the explanations given have been raging and wide-ranging, everything from widespread economic exclusion, globalisation, immigration, xenophobia, overseas conflict, global warming, political elitism and terrorism. But the one issue, productivity growth – i.e. the ability of employees, the companies for which they work and the countries that host them to continue producing their special somethings of value at competitive prices – has a pervasive effect way beyond its perceived impact.
Against the background that productivity shrinkage is a worldwide trend that could have been spotted much earlier, both countries – like other large industrialised nations – have entered a long period of productivity stagnation that the “experts” can’t fully explain. For the US, which remains among the world’s leaders in this metric, the recent admission is that productivity is the country’s “central economic issue” (Forbes, October 29, 2016) whilst the UK’s position, according to Philip Hammond, the new post-Brexit Chancellor of the Exchequer, is that the country’s productivity gap is “shocking” (Autumn Statement, November 23, 2016).
Plain and simply put, our employers and our employees are not up to it any more. We’re not as good as we were. We’ve slipped and it’s been happening for some time. And the crazy thing is that, in the world of business, we’ve never been more educated and never been more experienced ….
The simple effect of this corporate inertia is that low or negative productivity growth becomes more and more difficult – and expensive – to sustain in an environment where growth is increasingly important. Without systemic growth, employment either flatlines or decreases – or, when artificially increased, further adds to costs – intensifying competition for jobs and social/political discontent, which is heightened in a globalised marketplace. Alongside all the other suggested reasons, then, come Brexit and Trump.
However unconsidered this explanation for what happened on both sides of the Atlantic may be, the big reason for the productivity growth decline (or at least a hefty part of it) is also largely unnoticed and only now becoming acknowledged. Equally surprising, it’s the widespread flexible labour market, the 1980s phenomenon that overturned workplace tenures from one or two employers in a working lifetime to around eight and rising in many developed countries. With some spikes, its advent and acceleration has broadly coincided with the decline in productivity.
Consider the consequence of this level of disruption across the entire modern workplace over more than three decades – i.e. every employee, including top decision-makers, replacing themselves repeatedly. Originally designed to allow employers to change their employee base to accommodate high levels of market change, flexible working’s expectation was that employees would additionally be able to broaden their experience and employers would then have that much more of a knowledge base from which to choose.
Unfortunately, this was a theory that didn’t take account of the swathe of unaddressed and harmful effects that an average four-to-five-year employee tenure would create, upsetting institutional productivity. The acceleration in jobs churn would, in fact, more than offset the perceived advantages and savage productive output to the core. The surprise is that the effects of high workplace turnover were unconsidered, unnoticed and completely misjudged, a classic example of one of those unintended consequences of Adam Smith’s invisible hand or, dare one mention it, another reason not to rely on those pesky experts.
In an environment where the corporate raison d’être is to ensure growth, continual jobs disruption brought the most damaging consequence of all – the rolling loss of employers’ unique knowledge, which interrupts the conventional organic way that most progress occurs, i.e. from the building of one experience on another. This ‘corporate amnesia’ also instantly removes another important component of good decision-making – the opportunity to factor in relevant historical perspective and precedent – an outcome that replaces the employer’s hard-won practice with someone else’s less-relevant knowledge and experience. Other casualties of this walkabout environment – either imposed by employers or, now, foisted on them as a ploy to boost individuals’ wages – include a loyalty deficit and the constant dilution of difficult-to-build individual corporate cultures. Ever wondered why there are so many expensive mistakes that are repeated, wheels that are reinvented and lessons unlearned …? Simply because the current generation isn’t aware of what the previous generation did or didn’t do and for which, anyway, they don’t see themselves responsible. It’s called experiential NON-learning.
All of which contributes to commerce/industry exposing itself to the corporate equivalent of Alzheimer’s, where the result is that EVERY employee and EVERY employer automatically becomes less productive. For another insensitive medical analogy, we’ve unnecessarily amputated the most important of our limbs.
Just imagine if the natives WERE more productive. For one, employers would not be so keen to employ non-natives, a very politically incorrect suggestion that would have helped to impede at least one of the perceived Brexit/Trump causes. And for two, commerce and industry would be able to provide the missing growth to rebuff some of the others. In any event, wouldn’t better productivity make it easier for the UK to get through its intended Brexit?
The effect on productivity of flexible working and so-called corporate amnesia is only now being acknowledged, albeit diffidently. In his time as the UK’s Business Secretary in the recent coalition Government, Vince Cable conceded that flexible working could be “too flexible” and that it “was” contributing to low productivity (Resolution Foundation, May 13, 2014). More recently, an academic study in the US looking into the ageing workforce conceded that the loss of baby boomers’ knowledge and experience was a part-contributor to the country’s productivity decline (National Bureau of Economic Research, October 2016). Significantly, the more widespread knowledge loss in the rest of the working population was not considered.
This late acknowledgement of the problem has meant that declining productivity growth has gone unaddressed for, literally, decades. Employers’ efforts to address their knowledge loss have focussed mainly on vain attempts to reduce staff turnover and the token introduction of mentoring and internships, whilst business educators have done virtually nothing to adapt the one-size-fits-all decision-making instruction that was designed for the old workplace when employees rarely moved. Alongside this, Government efforts have been restricted to top-down strategies, mainly involving emphasis on infrastructure and education which, as the record shows, have also been less than successful.
It should be noted that not all workplace turnover is unhelpful or, indeed, unnecessary. But given the requirement to compensate for unusually high employee churn and knowledge loss – i.e. to enable ALL replacement employees to have a wider awareness of their new employer – institutions need to employ much better knowledge-capture systems that include important tacit knowledge which can be efficiently shared across the corporate hierarchy. Thereafter, this needs to be complemented by more specific reflective instruction to enable replacement employees to APPLY tried-and-tested practice to their employers’ new circumstances and environments. This is the discipline known as Experiential Learning (EL), whose methodology has been refined for modern usage by its main proponent David Kolb, Professor of Organizational Behavior at Case Western Reserve University’s Weatherhead School of Management. The solution is not rocket science, rather an expectation that modern market economies should have accommodated its needs instinctively.
The point being made here is that endemically high staff turnover, which includes the higher-placed individuals who plot and plan coalface strategy, is affecting every single employer. On this basis, it is obviously pandemic, structural, and thus part-responsible for its indelible effect on today’s productivity – and with it, I believe, carrying the knock-on consequences reflected in the startling populist political changes in the UK and the US.
To some, this observation might still seem frivolous but, in truth, no other explanation has been proffered that could produce such a pervasively robust effect on broad-spectrum productivity. More importantly, the efforts to date by both employers and Governments to directly address the problem have been conspicuously ineffective. For deniers, the question must be asked: what have they got against addressing evident huge corporate knowledge losses and pursuing better decision-making?
One more point to rattle the sensibilities. Is productivity primarily a problem for the Government, employers or for the managers/politicians who steer them? Over the years it would seem that it is Government that has taken the lead. The post-Brexit/Trump announcements of further huge programmes of infrastructure improvements are another of these customary stratagems but the outcome of similar fiscal efforts in the past suggests that this is still just part of the solution. Better roads, rail, telecoms and, in the UK’s case, affordable housing, will undoubtedly help to further build the productivity wheel but employees still have to be told how, when and where to apply the grease to keep it turning sweetly. For grease read good and better decision-making on the ground ……
For this I quote the wisdom of the late management guru, Peter Drucker: “It is only managers – not nature or laws of economics or government – that makes resources productive” (Managing in Turbulent Times, 1980). And for productivity’s underlying importance, I further quote Drucker’s challenge, written in 1991, that: “The urgency of the productivity challenge is great. The country that does this first will dominate the twenty-first century economically” (Harvard Business Review, November-December 1991).
It’s obviously too late to do something to avoid Brexit and Trump but there are several other productivity-stressed countries about to hold THEIR elections. There’s Holland, France, Germany and Norway in 2017, and Italy and Austria in 2018. Again, just imagine ….!
Verdict on the 2008 recession: “Could have done better if we had known history better.”
Professor Barry Eichengreen’s new book – Hall of Mirrors (New York: Oxford University Press, 2015) – is important if you have any regard for the idea that the evidence of history is useful for present-day decision-making, whether for political policy making, economic decisions or even at the individual business level.
This University of California economist’s book is a rigorous account of the uses and misuses of historical precedent, and in particular how the decisions behind the Great Depression of 1929 compared with – and impacted on – those of the Great Recession of 2008, the two biggest economic crises of the past 100 years.
He assesses that contemporary policy makers were “powerfully informed by received wisdom about the mistakes of their predecessors.” Important lessons were learned, enabling policy makers to prevent the worst. But with the crisis still costing an arm and a leg, he determines that they could have done better. Economists’ reading of the 1930s was “incomplete” and “often erroneous”, limitations that led to outcomes after 2008 of weak or no growth and too-timid reforms of the financial systems. Interestingly, he is especially tough on officials in Europe for flirting with austerity that has resulted in serial recession and, recently, deflation.
But the equally interesting point is that the book is being published while the dust of the crisis is still choking some parts of the world. Significantly, many of Eichengreen’s reflective conclusions of how decision-makers might have better learned from their predecessors are STILL outstanding, with some notable observers now agreeing with him.
Comparing outcomes, he warns that because 2008 was less severe than 1929, implemented reforms were watered down, risking a deeper crisis in the future. It is counsel that has just been echoed by the former Governor of the Bank of England, Mervyn King, who believes the failure to adequately reform banking presages another crash “sooner rather than later” (February 27, 2016). Eichengreen’s observed lesson about austerity also gets late support from the OECD which, after first agreeing with the UK’s deficit reduction programme, is now backing less reliance on monetary policy and more public investment across the richer countries (February 18, 2016).
These conclusions aside, the fact that relevant comparisons between the two crises were recognised and that identified lessons were learned and also acknowledged as not learned, confirms the underlying importance of history as a learning tool.
Yet the discipline is conspicuously undervalued as a mechanism for good decision-making in both business education and the organisations that depend on their employees making good and better judgements.
In business education, for example, why is the teaching of economic history declining? And why, except for a mention of the industrial revolution in some classrooms, is business history non-existent in the wider business curricula? Then, why has business education not adapted its decision-making instruction to accommodate the single biggest change in workplace practice – the flexible labour market? One of the traditional justifications for short-tenure working is that it provides employers with employees with a wider experience of work that would otherwise be unavailable. Whilst new blood is often invaluable, what is not considered is the effect on an evidential base that excludes the awareness of actual employer-specific experience. Today, high employee turnover loses the distinctive instructional awareness of employers’ prior practice, allowing decision-making to take place with only a proportion – arguably the less important – of the institutions’ singular core essence. As such, the one-size-fits-all set of taught rules, which used to be sufficient for a more stable workforce, is lacking, both in content and, importantly, process. Widespread corporate amnesia is short-changing the modern decision-making process.
The explanation for this has its roots in attitudes at the employer level, where individual organisations still treat their unique knowledge and experience with similar disregard, their knee-jerk explanation being that change in the marketplace is so fast, any focus on the historical is largely unrelated. As such, they feel justified in allowing much of their expensively acquired experience to walk out of the front door on a regular basis, arguing that replacements will compensate. What actually happens is that they sacrifice the traditional way most progress occurs – organically, i.e. one experience leading into another. High employee turnover also affects innovative learning, one source of which comes from the same rolling churn of new blood that also moves on. This pervasive disconnect is the single largest contributor to the pandemic of repeated mistakes, re-invented wheels and other unlearned lessons that litter the workplace.
Whether the decisions to be made belong to a nation’s President or Prime Minister, a CEO, a departmental manager or, indeed, anyone with an important determination to make for a current employer, the outlook for zero inheritance is little or no ability to learn from any hard-won experience …..
My own reading of the way the Great Recession and other recent economic and financial fiascos were handled largely reflects this evidential oversight which, I conclude, is unworthy of supposedly intelligent educationalists, the students they teach and the progeny – i.e. the decision-making employees – they harvest. The fact that 2008 occurred at all indicates that the people whose job it was to properly manage it were ill prepared and/or unqualified. Moreover, whatever history was used was typically applied only defensively – i.e. after the event. More of its proactive usage would – to borrow Eichengreen’s words – avoid the “mother of failure”.
To be competent in today’s changed workplace, economists and the policy makers they advise need, first, to be better business historians and then better experiential learners, a skill that requires the reflective deliberation of a more comprehensive evidential base. The principle also extends to the many micro-level misadventures, where coalface managers also need to be better versed in business history, their own employer’s corporate history and, likewise, the formalised skill of Experiential Learning (EL) that realigns outdated all-purpose decision-making towards employer contexts.
To enable this, employers need to be better depositories of their short-, medium- and long-term experience, including their overlooked tacit knowledge, and business historians need to be more prolific, incisive and communicative in their endeavours; alongside this, business educators have to teach business history better and managers need to apply business history more effectively, asks that would fire up the sleepy worlds of business history and business education. It would also help to address the Henry Mintzberg-like criticism that business schools prefer the stratagem of using quick responses to packaged versions of business problems rather than understanding real-world experience.
In the world of work, it’s little good for a country to have just a handful of specialist historians. It’s more beneficial for history to be a part of everyone’s DNA – and especially decision-makers.
Whether at the macro or micro level, the underlying value of economic, business and corporate history is that, whether its bygone outcome was successful or unsuccessful, it is the ONLY practical repository of tried-and-tested evidence outside of random and poor memory recall. And for commerce and industry itself, such valuable organisational memory is all the more important in today’s walkabout labour market. With all that corporate amnesia out there, where does it leave the quality of decision-making for the next unanticipated (or even anticipated) event?
How many times does one hear the infamous justification ‘Hindsight is a wonderful thing,’ and the equally ineffectual resolution ‘We must learn the lessons’ (my last search on Google UK news found 1.72 million hits). Given the cost alone of the Great Depression and the Great Recession, what price the value of making this a more realistic aspiration? In which case, does it make any sense NOT to make better use of history?
Postscript. Putting history in an occupational context, many professions – among them architecture, art, music, the military, medicine, politics, science, the clergy and so on – contain an element of their generic history in their education. If this is considered constructive for them, history’s widespread absence in business education and the workplace must represent one of the biggest neglects in how the rest of us are taught to make our livings, a circumstance that suggests that educators, politicians and businessmen themselves don’t consider business, and least of all the management role, a profession. For societies wholly dependent on mercantile endeavour, odd, very odd ….
History is dead! Long live history! The missing link to KM, EL and decision making ….
There is a curious convention following the death of some British and Danish Monarchs. The eight-word proclamation has read: “The King is dead. Long live the King”, last properly used in the UK in 1936 when King George V was succeeded by King Edward VIII, who later abdicated and became the Duke of Windsor. Coming from a tradition started in 1422 in France, it signifies continuity, stability and the expected wisdom that comes from Monarchical experience. Unfortunately, such virtues are not reflected in British business and – strangely – many other places.
My purpose in mentioning this quaint ritual is to highlight the acknowledgement of continuity’s importance to a major institution and the visible role it doesn’t play in business education. Continuity is the element of experience that allows experiential learning to take place seamlessly, while the observable hole in business education is the discipline known as business history. As the memoir of how we did business, it’s the only portable, reliable mouthpiece that can provide experience comprehensively and – incidentally – cheaply. It is a subject that is widely neglected – and which could provide an answer to the huge problem of experiential NON-learning as presented by the banking crisis and the developing health catastrophe (see https://biggernumbers.wordpress.com/growth-how/). In addition to being titanic, these examples are particularly difficult to address managerially because poor decision-making spans several business sectors that operate independently and whose combined effects are inter-related and larger than them all by a huge margin.
In both cases, individual businesses within their speciality activity were, and/or still are, either unaware, unconcerned with and/or not responsible for other sectors’ behaviour. In health care, for example, the soil sector, food producers, food manufacturers, doctors and the pharmaceutical industries all operate independently of each other, and ALL have contributed to the problem now harmfully affecting the health of more than a quarter of the world’s population. The soil scientists have given farmers a nutrient-deficient way of increasing their production. This critical shortfall has encouraged a food manufacturing industry to use chemical additives to substitute but – would you believe it – the nutrients used are not all absorbable into the human body. Competition has also encouraged the food manufacturers to introduce into their products sugar, salt and trans-fats – the manufacturers call these additions ‘bliss factors’ to sell their wares – many of which have hosted a range of opportunistic diseases to fill doctors’ surgeries and hospitals. To this double whammy comes the pharmaceutical industry, whose experts then skilfully invent pills for all reasons, extending the life of many, expensively. Metaphorically, the result in the First World has been generations of elderly people dying unkindly on their Zimmer frames. In the Third World, they just die.
However one assesses the outcome of this domino effect of daisy-chain companies in separate business sectors, many major decision makers and their experts are making poor decisions, remedial actions around which would normally fall into the disciplines of KM, EL and decision-making. Conventionally these disciplines can be more easily applied to individual companies. But for the evidence-based decision-making that takes their consequences to another level, the established practices are deficient; they don’t normally address the bigger issues and, anyway, at a sector level, it’s none of their business ….
The overview of business history WILL do the job. It will give the next generations of businessmen the perspectives that the current generations lack. Perhaps, then, they’ll be able to apply their wares more responsibly.
I know brevity is King when it comes to busy-busy managers but indulge me a few more paragraphs to illustrate how resistant to change the main characters in this dramatic tragedy have been. Over the past half decade I’ve been preoccupied with how to better apply KM to really big issues, the banking misdemeanours and the healthcare holocaust being the inducement. I’ve argued that the Cinderella discipline of KM should be able to handle a straightforward problem like experiential non-learning, however it fits in the business pyramid.
When I first took an interest in KM, the discipline was still nine years away from being conceived in its modern format out of a Boston conference in 1993. Then, all I was aware of was that so many managers and employees had little historical awareness of their products, their companies or their industry. Without this, they were unable to benefit from any hindsight they might otherwise have acquired so, given that the newly-arrived flexible labour market was stripping companies of their unique organisational memory, I took to offering them a service to produce their corporate histories in a form that was readable and suitable for both induction and in-house learning.
It wasn’t an easy sell in an environment where the fourth generation of ‘educated’ businessmen and women were emerging from the newly created business schools. In quite short measure, the corporate history industry died on its feet, no thanks to the imposed departure of older, experienced hands in favour of waves of new bloods who, literally, thought they knew better. They didn’t, because the incidence of repeated mistakes escalated, wheels were re-invented and other lessons went unlearned, all reflected in their more difficult-to-maintain productivity growth scores.
For me, it opened up another opportunity, this time to introduce the oral debriefing of important, low-tenure employees to companies who were finding themselves in various stages of corporate amnesia. It was an attempt to give to business what US historian Studs Terkel gave to social history after the invention of the tape recorder. He was followed by the US academic Professor Allan Nevins, who persuaded many US educationalists to introduce oral history as a tool for serious scholarship.
In the process of a near-dead demand for corporate histories and my launching of oral debriefing, I came across the academic world of business history. It, too, was leaving its heyday, with fewer companies wanting to embark on worthy scholastic tomes for audiences of a mere handful. As a teaching discipline, the wider subject of business history was unknown except at Harvard Business School, where all first-year students were compulsorily exposed to the subject (apparently temporarily, as the specialty was dropped in the 2000s). In the UK the academic stars of the field took flight to the US and Japan and the discipline locally also almost died on its feet. Its cousin, economic history, additionally took a dive as it was reduced or subsumed into social history curricula.
Over all this time my own specific efforts to persuade academia to introduce corporate history and its bigger brother, business history, fell on deaf ears. To the suggestion, for example, that business schools trying to educate students to enter, say, the textile industry should put several corporate histories of textile companies on their reading list, the typical response was: “We know better.” I came away with the feeling that business schools and the business history community were – if I’m being kind – kind of resistant to outside ideas and especially change.
Twenty years later I am of the same opinion towards an even smaller community of business historians, but with the belief that their ownership of a discipline could yet provide a solution to a management problem that has – and is – costing us more than any other. For one, it could provide missing perspectives and recall the precedents and vital awareness of past business activity to help avoid tomorrow’s banking and health catastrophes, which – as history confirms – will reoccur without adequate suitable defensive measures. In banking, for example, there are at least a dozen precedents in recent history that could have helped avoid the 2008 crash and provide the evidence to apply workable solutions. Then there are all the other unaddressed business-related fiascos ….
To achieve continuity in a short-tenure jobs environment – as even top decision-makers and experts experience – I would prefer that corporate connectivity should come from a good helping of business history to support locally-relevant and conventional KM, EL and decision-making processes. This means that business history needs to be introduced as a prominent feature of business education and training. In many other professions, for example in music, architecture, art, soldiering and politics, there is a generic component of their history in their education. So why not business? That business history is not already an established learning tool – as general history is elsewhere in the academic biosphere – has always been a puzzle. Business is, after all, our flagship breadwinner.
But what if business historians don’t, won’t or can’t put their toes into KM, or KM practitioners don’t, won’t or can’t put their toes into business history? Given the scale and importance of nothing happening, there’s always the scarier option – the politicians. Someone has to do it!
But hang on, historical precedence points to one other alternative. In the early 1970s, when Japan was on its productivity roll, the then Crown Prince instructed that every faculty of business and commerce should have its own business historian. Perhaps Monarchy does have a role in the business world? What’s Buckingham Palace’s email address? Ma’am, c.c. The Prince of Wales. Will you front up? Respectfully, on behalf of British business, Knowledge Management.
Does ‘Rhodesgate’, the campaign to remove the statues of a Victorian colonialist from public view, help to explain why Southern Africa, indeed Africa itself, finds it so hard to progress?
At first sight, the question might seem far-fetched but the underlying premise is more acknowledged intellectually than recognised in practice. It is that removing a reminder of existent history – otherwise known as experience and/or actual past practice – restricts the ability to learn from that experience. Experiential Learning (EL), the skill required to build one experience on another, is notably absent in most of Africa’s social, political and educational mind-sets. Widespread political dogma and its other non-philosophic convictions have, quite simply, removed much of the continent’s ability to apply former reality to new circumstances and environments, what used to be known as organic learning and which is acknowledged to be the most efficient form of making progress.
The bottom line is that if you blue-pencil the evidence, whether that evidence is good or bad, the ability to adapt is irretrievably compromised. Said another way, it removes continuity. And without continuity, development retreats to the bottom of the learning curve and imposes the inevitability of having to re-invent many wheels and repeat mistakes unnecessarily – as Africa’s post-colonial political and economic record visibly demonstrates. It’s known as experiential NON-learning.
Is actual history and/or experience – however it turned out – a component of learning? And if so, how damaging is its deliberate destruction to the acquisition of wisdom?
In South Africa, the University of Cape Town’s black students recently succeeded in removing Rhodes’s statue from their campus, a move welcomed by the SA government as a way for the country to deal with its past. In Zimbabwe, the African designation for Rhodesia, which was named after Rhodes himself, the issue is still ongoing, with opportunistic calls to exhume Rhodes’s body from its resting place in the Matopos Hills and return it to the UK. Generally, though, the underlying argument is that Rhodes was variously a “racist colonial thief” and that his physical depiction reinforces the symbol of “historical oppression”. It is a rallying call that #RhodesMustFall students in Oxford, one of the world’s best-known hosts of academic excellence, are now also taking up for one of their own statues of Rhodes.
Is this not the antithesis of Knowledge Management and Experiential Learning theory, and the result of institutional amnesia, whether arrived at deliberately or inadvertently?
Interestingly, there is at least one African voice in Zimbabwe that concurs with these detractors. Godfrey Mahachi, one of the country’s foremost archaeologists and the director of Zimbabwe’s National Museums and Monuments, said Rhodes’s grave was an important reminder of the country’s colonialist past that should not be airbrushed out. It is a view apparently supported by President Robert Mugabe who, perversely, rates as one of the biggest experiential NON-learners of modern Africa. On this basis, Mr Mugabe is implementing good Knowledge Management but poor Experiential Learning.
Read how the developed world is practicing its own version of the same dysfunction. Follow this blog ….