John Barrington, FAICD, a leading not-for-profit chairman and strategy consultant, is unequivocal about the impact of technology-led disruption and the preparedness of boards to govern through the coming fourth industrial revolution.
“Boards, collectively, are not ready for the effect of advanced technologies on their organisation,” says Barrington. “When it arrives in full force, growth in artificial intelligence (AI), automation and the Internet of Things (connected devices) will be exponential. That is, the longer AI change occurs, the faster it becomes.”
Barrington says boards that govern strategy, risk management and organisational culture in a linear or sequential fashion will be left behind. “The scale of change ahead from AI is unprecedented. Boards that are naturally change resistant, risk averse or unable to tolerate failure will not be agile enough to govern through AI uncertainty.”
Directors who are well read on advanced technologies, of which AI is the connecting force, will have encountered much hyperbole on these trends. A widely quoted 2013 Oxford University study, for example, estimated almost half of all jobs are susceptible to computerisation – a trend that would shake industry and society to the core if proved correct.
Barrington has a more practical view. As Chairman of Anglicare WA, an innovative NFP that helps the disadvantaged, and the Perth International Arts Festival, Barrington has seen first-hand the challenges that confront boards when assessing AI opportunities and threats.
AI, he says, is as much an issue for NFPs and government boards as it is for those governing commercial organisations. “AI, in one form or another, will affect all organisations across industry. It’s not just an issue for start-up entrepreneurs who are attacking large organisations. Every board should be focused on technology-enabled disruption.”
As a consultant who specialises in AI, Barrington works at the intersection of technology, strategy and governance. He has advised dozens of executive teams and boards on strategy and governance and recently co-founded a technology start-up, AlphaIntell, which is partnering with IBM to provide AI services and rapid prototyping of AI technologies for organisations.
Barrington was awarded the AICD PwC Director Award for Excellence in the NFP Sector in October 2017. Here is an edited extract of his interview with the Governance Leadership Centre on the impact of AI on boards:
GLC: John, there’s a lot of hype about AI. Is there a risk that boards over-react?
John Barrington: There’s always a risk that boards over-emphasise their focus on advanced technologies, distract the executive team and take the organisation in too many unfocused directions. That said, AI represents profound change. We have historically seen technology as a tool for workers. Now, technology is becoming the worker. As more industries become digitised, the effect on organisations and boards will be immense. Directors have a responsibility to their organisation and to their employees to ensure that appropriate re-training, re-skilling and re-deployment programs are in place to cope with this change.
GLC: You argue that boards are collectively unprepared for AI. Why is this the case?
JB: Partly it’s because some directors are underestimating the speed of change. They see AI as an issue for the ‘future’ when the reality is it’s here now and rapidly getting bigger every day. It’s already on our smartphones, for example, and in search engines. Also, some boards don’t have the people or processes to deal with AI and other technology-led disruption.
That is no criticism: understanding emerging technologies and their effect on organisations, and integrating that into strategy, risk-management frameworks, compliance processes and organisation culture is complex. AI is very challenging for boards.
GLC: Should more boards appoint directors with strong technology skills and experience?
JB: Yes and no. A board that lacks tech skills might benefit from recruiting a younger, tech-savvy director who is well versed in these trends. But I believe AI and disruption are issues for every director, not something that should be delegated to a director or two who have a leaning towards this field. The strategic governance challenge is ensuring all directors are capable of questioning how the organisation responds to AI opportunities and threats.
GLC: Is boardroom diversity the answer to AI governance?
JB: Absolutely. Boards need directors who can approach issues through different lenses and avoid groupthink. That is true of all board issues, not only advanced technologies. Most of all, boards need directors who are capable of challenging assumptions behind strategy and are willing to ‘rock the boat’. It takes a special director who is prepared to prove management wrong and can do so in a way that is productive, collegiate and strengthens the board/executive relationship.
Boards also need a culture where directors feel safe to present divergent views and not always converge quickly to the consensus decision. It’s a question of balance: you don’t want directors shooting off on random tangents and distracting management. Equally, you don’t want a boardroom culture where directors are hesitant to express left-field views about emerging technologies and their potential impact.
GLC: Should boards develop special processes, such as technology sub-committees or working groups, to deal with disruption from AI and other emerging trends?
JB: Generally, I’m wary of working groups or sub-committees on technology issues because the risk is they delegate the issue to a few directors and take the full board’s focus off it. More committees or working groups can also add to board complexity and workloads.
AI must be an issue for the entire board. It’s dangerous to be prescriptive on how each board should develop processes to consider it; there’s no one-size-fits-all approach. I’ve found presentations from management or outsiders on AI topics, before some board meetings, can be effective, as is having AI and strategy as a structured discussion point within some meetings.
GLC: John, how do boards find the balance between focusing on today’s business and commercial realities, and the future impact of disruption?
JB: It’s a good question. A simple model is thinking about strategy in three horizons. The first horizon is the here and now and what the organisation needs to do to keep performing. Horizon two is about building on that through business development, and horizon three is about buying options for the future through new ideas and experimentation.
In every boardroom, you’ll find directors who gravitate to different horizons. Some directors are very good at governing today’s business; others are more forward-looking and creative. The key is ensuring the board has the collective skill to cover all three timeframes. Horizon three requires organisations to seed several innovations, knowing only a small percentage of them will succeed. It’s in this horizon that AI experiments should be conducted. Here, boards need to tolerate failure and ensure executives are incentivised and rewarded for failures that deliver positive outcomes, such as new organisational capabilities developed by learning from a failed innovation. The learning outcome is fundamental to success in innovation.
GLC: How can boards create structured thinking about industry disruption?
JB: The key is to be relentlessly customer-centric. The executive team cannot go off on wild tangents that distract them from the main game. Rather, they should focus on a small number of experiments in horizon three that the board approves. A relatively small amount of capital is devoted to each and the board acknowledges that several innovations will fail. The trick for a board is to encourage the organisation to engage in radical innovation, in a way that is part of a carefully considered strategy and does not detract from other strategic horizons.
GLC: How can directors use AI within the boardroom to create a stronger culture of evidence-based decision making?
JB: The use of AI by directors has not had much consideration. Done well, it could make a vast contribution to boardroom decision making. AI, for example, can allow organisations to run thousands of decision-making scenarios in real time, rather than focus on just a handful.
In time, we’ll see more boards encouraging, even requiring, executive teams to use AI to test decisions by running a much larger number of strategic scenarios and outcomes. That would create a culture of evidence-based decision-making in many organisations. We’re seeing that now in the energy sector, for example.
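The scenario-testing idea Barrington describes can be illustrated with a minimal sketch. This is a hypothetical Monte Carlo simulation, not any specific tool he references: a strategic decision is evaluated across thousands of randomly sampled market scenarios, and the distribution of outcomes, rather than a single forecast, becomes the evidence the board weighs. All figures and parameter names below are illustrative assumptions.

```python
import random
import statistics

def simulate_scenario(base_revenue, invest_cost):
    """One hypothetical market scenario with uncertain demand growth and cost drift."""
    growth = random.gauss(0.05, 0.10)      # assumed annual demand growth: mean 5%, sd 10%
    cost_drift = random.gauss(0.02, 0.05)  # assumed annual cost inflation: mean 2%, sd 5%
    revenue = base_revenue * (1 + growth) ** 5
    costs = invest_cost * (1 + cost_drift) ** 5
    return revenue - costs                 # five-year net outcome for this scenario

def run_scenarios(n=10_000, base_revenue=100.0, invest_cost=80.0):
    """Run thousands of scenarios and summarise the outcome distribution."""
    outcomes = sorted(simulate_scenario(base_revenue, invest_cost) for _ in range(n))
    return {
        "median": statistics.median(outcomes),
        "p5": outcomes[int(0.05 * n)],     # downside case (5th percentile)
        "p95": outcomes[int(0.95 * n)],    # upside case (95th percentile)
        "loss_probability": sum(o < 0 for o in outcomes) / n,
    }

summary = run_scenarios()
print(summary)
```

The point of the sketch is the shape of the output: instead of asking management "what do you forecast?", the board can ask "in what share of plausible futures does this decision lose money, and how bad is the downside case?"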
GLC: John, you have developed the PRIMER model to help boards assess advanced technologies and the impact on strategy and governance. How does that work?
JB: The P in PRIMER stands for Purpose. It’s critical that the board understands why the organisation is developing an AI strategy. Is it to serve customers, take cost out of the supply chain and improve production efficiencies, create vast amounts of real-time data for analysis, or a combination thereof?
The first R in PRIMER stands for Resources. Having identified AI’s purpose, the board must understand the human and financial resources needed to implement it. Does the organisation have sufficient AI skills, can it get them from outside, and how does it upskill staff through real-time learning and development systems?
I recommend that organisations have an integrated AI strategy, linked to the main strategic plan, which outlines resources required, delivery timeframes, milestones, targets and expected return on investment.
The I in PRIMER stands for Insight. That is, how will the organisation use AI to learn more about current and prospective customers?
The M in PRIMER stands for Move. Big data, AI and robotics are all about speed. The winners in the data wars will be organisations that are able to spot trends in vast amounts of real-time data and react before the competition.
The E in PRIMER stands for Experiment. In some ways, AI will allow larger organisations to work like smaller entrepreneurial ventures through rapid experimentation.
The second R in PRIMER stands for Reputation. Every organisation that embarks on AI must consider its ethical implications and effect on the organisation’s corporate social responsibility. The board must be clear on the line that the organisation will not cross with technology, which is not always clear-cut. For example, how far will the organisation go with analysing customer data and encroaching on privacy? How will it protect that data and use it ethically?
GLC: How can AI affect the NFP sector?
JB: Anglicare WA, for example, could use AI to scan thousands of phone calls from people who experience financial distress around the payment of utility bills. It’s early days, but a form of robo-advice could help the organisation deal with escalating volumes of calls and provide a high standard of advice that is complemented by human interaction in more complex cases. At this time, such considerations are hypothetical. But that is the point – boards must be actively considering hypotheticals if they are to remain relevant in the future.
Some banks are using robo-advice to screen mortgage applications and the like; there’s no reason why the NFP sector cannot use this technology to serve more people and augment advice from human counsellors. Research shows some people experiencing distress prefer to deal with an automated service because it does not provide subjective opinions or, in some cases, judge them.
In the arts, AI could be used to extend the patron relationship by providing more information before, during and after an event – and innovating the overall experience. In time, more arts organisations will analyse big data to better tailor arts events for stakeholders.
GLC: John, what advice would you give directors who want to focus more on AI?
JB: The obvious first step is to be well read on the topic. Directors should ensure their information set covers emerging technology developments relevant to their industry and others. They should read well beyond their industry; AI will increasingly blur sectors, creating opportunity for organisations to move rapidly into new fields.
As part of this information, directors should actively look for opportunities to hear from leading AI thinkers, either by attending events on their own or through board-organised events. Tours to innovation clusters, such as Silicon Valley, can be beneficial.
Networking is equally important. More than ever, directors need to have diverse networks; knowing successful start-up entrepreneurs or tech innovators can challenge your perspectives.
Again, it’s about being prepared to listen to different views and, if needed, prove things wrong in a way that fits within your broader director role and responsibilities – and is part of a collegiate boardroom decision-making process.