
2023/08/07

On 23 June 2023, the AKADEMIA Forum presented "Life at the Edge of the Future: Human Culture and Artificial General Intelligence (AGI)"

     

    In the AKADEMIA CineForum event on 23 June, Anne Mette Fisker-Nielsen borrowed the title of Sarah Pink’s new book, “Life at the Edge of the Future” (2023), to raise existential questions about living with ‘culture’ that is now being constructed by AGI. Humans could be said to have been haunted by language: those who could control language and tell the story could control people. She drew on Yuval Noah Harari, who points out that the fear of being trapped in illusion has characterised thousands of years of human conversation. Now, however, we face the issue of AGI gaining mastery over language – and with it the ability to tell stories.

     

    In fact, AI and algorithms are in many ways serving as our new oracle: we seek answers in ChatGPT. Yet even with relatively primitive AI, Harari points out, the impact of social media demonstrates how chaos can be instigated – “we do not need killer robots, humans will do it when they believe.” Millions of people are willing to believe illusions about COVID, about vaccines, that climate change is a hoax, and so on. How is the AI revolution, for all its convenience, also confronting us with this very human condition of how to deal with illusions? Will increasingly powerful AI be able to create a curtain of illusions so real that we no longer know it is an illusion? Harari points to the concern that if AI takes over culture (language, feelings, thinking, the construction of reality), it could create whole new cultures that may not reflect human concerns. Fisker-Nielsen drew on such ideas from Harari and Sarah Pink (“Yuval Noah Harari – Humanity is not that simple”, https://www.youtube.com/watch?v=4hIlDiVDww4) before we watched “Yuval Noah Harari – The Oppenheimer Moment of AI” (https://www.youtube.com/watch?v=Bpy6X7kF7-s).

     

    One of the key issues (which we also study in the Global Japan/Social Anthropology Seminar) is intimacy, which has for some time been a subject of study in the anthropology of Japan. What happens when entities that are not human build relationships with you, like haptic creatures – but this time, what if you don’t even know they are not human? Harari points out in his talk linked above: imagine you go online and begin a discussion about a social issue you care about, and you interact with a super-nice ‘person’ who knows just what you think (based on algorithms that track your activity online). It then starts building an intimate relationship with you (something we know happened increasingly during the COVID pandemic as people were stuck at home), and after a few weeks it begins to convince you of different ways of thinking and feeling. Intimacy is the most powerful way to persuade people; what happens when AGI competes for your attention at the level of intimate relations?

     

    We started our first AKADEMIA CineForum in April this semester by considering the question of ‘attention’ and the way algorithms compete for it – they are designed to do so. This is now referred to as the attention economy. Algorithms have discovered that the best way to grab your attention is not with kindness or complexity of feelings and ideas, nor with compassion and love; they are far more successful when they capture your attention with outrage, fear and hatred, and we can clearly observe the impact on our world. The battle is now shifting from attention to intimacy: AI will be competing for intimacy because it is the best way to persuade you (please see the link to Harari’s talk above for these arguments).

     

    Anthropology has long insisted that complex social issues require historical, political and moral awareness and structural change, but it also shows that this requires a commitment to understanding a continuously emerging world – a point made by Sarah Pink, who argues that we need to rethink the role of technology. How can we, rather than being caught up in profit-making, make emerging technologies serve a different agenda, one that works towards an ethic of care?

     

    In the discussion groups and open forum that followed, we addressed the questions:

     

    1. Do you think your view of the world is influenced by algorithms? How might ChatGPT (a large language model based on word probability) play a role in influencing your views?

    2. Why is control of language so significant? Daisaku Ikeda (founder of Soka University) says human history plays out as a battle for words. If we seek answers from our new omnipresent oracle, the database, what do you think our future culture might look like?

    3. What would you like AI technology to look like? How can citizens play a constructive rather than merely adaptive role in the development of AI technology?

     

    This event sparked a great deal of interest among the Soka University students, faculty and guests who attended. A follow-up event in the autumn will be organized in cooperation with researchers in AI technology at the Soka University Faculty of Engineering. We look forward to welcoming you as we continue to engage with these highly significant issues that will deeply affect the kind of future that evolves.

     

    This event was organized by Anne Mette Fisker-Nielsen, Faculty of Letters, and the AKADEMIA Forum team.

     

     

    Page published: 2023/08/07