An increased number of AI experts on boards raises questions about the realities of board composition and the dynamics of skills inflation.
At a recent conference I attended in the US, attendees were polled on whether they had a dedicated AI expert on their board. The results surprised me.
A significant number of boards now claim a dedicated AI expert – 25 per cent in 2024, up from 5 per cent in 2023. A further 36 per cent said they had board members with some knowledge of AI (down from 49 per cent in 2023), while 39 per cent said they had no AI expertise (down from 46 per cent in 2023).
Having a dedicated AI expert is a high bar for a board to reach, in my view. A fivefold increase in claimed dedicated AI experts in only 12 months raises important questions about the realities of board composition and the dynamics of skills inflation.
Such a dramatic increase, without corresponding board turnover, is indeed surprising. It implies that nearly every new board seat would have had to be filled by an AI expert, which is highly unlikely given the limited number of board vacancies annually.
This raises the possibility that directors are rebranding their existing digital skills or overestimating the depth of their AI expertise, perhaps after attending a few courses or briefings. This phenomenon is known as "skills inflation", where individuals amplify their credentials or knowledge to meet evolving market expectations. In the case of AI, a technology that has rapidly ascended the strategic agenda, directors may feel pressured to portray themselves as more knowledgeable than they actually are, especially as the importance of AI governance becomes more pronounced.
There is an increasing trend for directors to add new competencies, particularly in AI, to their skill sets. However, completing a short course or attending a workshop is not equivalent to deep expertise. This tendency towards "rebranding", or skills inflation, can lead to significant gaps in board oversight.
AI is a complex and evolving field requiring not just knowledge of how the technology works, but also a deep understanding of its risks, ethical considerations and regulatory frameworks.
If directors are overstating their AI knowledge, boards could find themselves in a precarious situation where they believe they have the expertise required to govern AI technologies effectively but actually lack the depth needed to address the nuanced challenges AI presents. This could lead to blind spots in governance – particularly in areas such as data governance, AI ethics, and algorithmic bias – which could expose the organisation to significant risks.
The Institute of Directors' AI guide released this year highlights the growing importance of good governance. Boards are expected to not only integrate AI into strategic decision making but also to understand its ethical implications, manage AI-related risks and ensure robust data governance. This requires more than just surface-level knowledge; it demands a deep, technical understanding of how AI works and the broader regulatory landscape surrounding its use.
Boards that do not have genuine AI expertise may struggle to make informed decisions, resulting in missed opportunities or unchecked risks.
The dramatic shift in survey results suggests there might be a disconnect between how boards perceive their AI capability and the actual expertise present. If boards are claiming to have dedicated AI experts without having made corresponding board appointments, it could point to a tendency to overstate capabilities to appear more future focused or innovative.
This is a crucial governance issue. Boards have a duty to be transparent about their skills and capabilities, particularly in high-stakes areas like AI governance.
To address these concerns, boards should:

- be transparent about the actual level of AI capability around the board table
- distinguish general digital literacy, or completion of a short course, from genuine AI expertise when assessing skill matrices
- invest in substantive director education covering AI risks, ethical considerations and regulatory frameworks
- draw on good-governance guidance, such as the Institute of Directors' AI guide, when integrating AI into strategic decision making.
While the survey results suggest a growing recognition of AI’s importance, the significant rise in AI expertise claimed by boards may be inflated. Directors must ensure that they are not overstating their AI capabilities and should take proactive steps to build genuine expertise in this area. This is critical for effective governance in an increasingly AI-driven world and to avoid governance blind spots that could expose organisations to undue risks.
For resources and webcasts exploring expert views on AI governance, search "AI" on our website.
AI assisted in the creation of this article.