On the 20th and 21st of June we were invited to attend the CogX conference in London, all about AI's growing impact on our society.
We took ten of our girls who were keen to learn more about artificial intelligence, which feels like the imminent future across so many fields, including the creative industry.
Not quite knowing what to expect, or whether it would all be way over our heads, we walked straight into a keynote speech given by Siavash Mahdavi, Chief Executive Officer & Founder of AI Music. Kenzie was soon tapping her feet to the 'trap' AI remix of an Imagine Dragons song, whilst also feeling unnerved that her songs could potentially be manipulated this way. It was followed by a panel discussion on whether AI could compose music like humans and, if so, whether it could create theatre better than Andrew Lloyd Webber. And what of journalism, when AI can aggregate data and information faster than any number of interns and string together impactful and informative news reports?
This session on News, Media and Entertainment was closest to our hearts as a creative agency, and though the developments in AI seemed both fascinating and terrifying, the consensus seemed to be that it still had a way to go before it replaced humans in terms of true imagination, creativity or opinion. Though it may be able to create thousands of remixes at once or steal some of the jobs of Bloomberg-style newsrooms (The Washington Post used AI to cover nearly 500 races on Election Day), journalists writing long-form opinion pieces can't be replaced... yet.
We then ventured into the Data Privacy, Data Ethics and GDPR session. Dave Eggers' novel The Circle, whose timely film adaptation is released this year, comes to mind when I think of sharing personal data, or how we might be obliged to share personal data on everything, and who has the rights to that data. Was data to be the currency of the future, or was it already? Will the new EU General Data Protection Regulation (GDPR) be enough?
As with most of the topics covered at the conference, I walked in with quite strong beliefs and fears, though, to be fair, they were mostly based on works of fiction like The Circle or I, Robot; I walked away demystified and somehow less troubled by the concept of AI. It seemed that sharing data freely, rather than having big companies own it, could make our lives even easier and get us what we need faster. Even Sophia, the world's most human-like robot, was kind, considerate and had a comedic spark I wasn't anticipating.
Another highlight of the first day for everyone was the session on mental health, a subject of importance to all of us no matter what our backgrounds. We discussed sophisticated chatbots that teenagers can speak to when they need answers or information, but also how to create a 'happier' city through urban planning - it seemed AI could help both with treating the conditions people develop and with addressing their root causes. Overall, however, it again seemed that we were a long way off from creating AI that could substitute for good psychiatrists, though it could potentially supplement their work, as people can often feel more comfortable sharing information with a computer because of its inability to judge them. The panel discussion raised some interesting questions on accountability - if a chatbot misinformed someone and it led to their suicide, for example, who was to blame? The bot? The engineer making it? The creators behind it?
That evening we attended the CogX Awards Dinner, where one of the awards was presented by Sophia and we met more of the diverse people working in the world of AI.
On the second day of the conference, during the Science and The Environment keynote presentation, we heard about the applications of AI to weather prediction from Alberto Arribas (Head of the Informatics Lab at the Met Office). With large sets of weather data, AI can be applied not only to everyday weather forecasting but also to predicting natural disasters, so that communities can better prepare for their effects. More interestingly, what started to emerge during the panel discussion was the ethics of making AI available 'for the many and not the few'. How do we stop AI from widening the social divide? How do you provide access to the enormous data sets needed for building AI predictions and algorithms not just to the big corporations but to everybody? A big problem is that even when you do make the data freely available, as Alberto had done in the past, nobody seems to use it because of the sheer hardware requirements needed to cope with it. From this discussion further questions emerged about whose responsibility it becomes to conduct the AI industry morally. Panellist Gemma Milne from the organisation Science Disrupt also argued that AI is great at finding correlations in data sets, but that it is much harder to find the scientific causes behind the results, and that we need to be clearer about the goals we are trying to achieve with AI. A smart city is not a 'goal'. It seems there is a problem with how well many of our scientific and environmental challenges are defined for AI to solve, and that the ethical regulation of AI needs to go hand in hand with the progress it makes in the sciences.
Overall, the problem of accountability and ethics has so far deterred any large AI operation from going full swing or taking over any particular industry. It's hard to point the finger of blame at an AI system, which is why we need people beside it at all times - a bit like how David Hanson is never far from his robot, Sophia.
However, AI is already, and will continue to be, a part of our lives, which is why everyone should be informed about its developments in our society - even ten female creatives.
Thank you to CogX for having us.