Taking comfort from art and history in the age of Artificial Intelligence
The Gallery Companion on Apple Podcasts
Shortlisted for the Independent Podcast Awards 2023, The Gallery Companion is hosted by writer and historian Dr…
Last week, as I watched the British Prime Minister Rishi Sunak in discussion with Elon Musk at the first government-level AI Safety Summit, held in the UK, I was struck once again by the short-term thinking of politicians. Sunak needs all the good news coverage he can get at the moment, what with his government’s popularity sinking in the polls and an election on the horizon. So the attempt to position the UK as a leading player in the field of Artificial Intelligence was a real win for him as global leaders convened in the English countryside for the summit. It was also a big win to score an interview with one of the world’s richest and most powerful men. With an audience of millions, Sunak was able to tick his way through his key political messages, pointing out all the good stuff that AI will bring to healthcare, education and investment in the UK tech industry.
But the other, more important point of the summit was to address the potential dangers of AI, and the lack of urgency to take action felt familiar: a bit like the COP climate and biodiversity summits, where governments sign vague agreements and promise to think about thinking about doing something in the future. Sunak said something nebulous about how we shouldn’t rush to regulate AI without fully understanding the risks. But given the breakneck speed at which AI is developing, do we have the luxury of time? Since open-source AI (code that is freely available for anyone to access) is only about a year behind the capabilities of the leading tech companies, government regulation already seems a bit irrelevant. AI is arguably beyond anyone’s control.
I try not to think very much about the dangers of Artificial Intelligence anymore, because I find the potential scenarios that could emerge even within…