(Text provided by the organizers)
The Creative-Ai (AI and the Artistic Imaginary – WASP-HS) and MUSAiC project teams at KTH warmly welcome you to the third seminar in our series “dialogues: probing the future of creative technology” on Thursday 2 February, 15:00-16:00 (CET).
In this seminar (held on Zoom, https://kth-se.zoom.us/j/67706212115), we talk about “Artistic and legal-philosophical perspectives on deep fakes”. We start with two presentations from our invited guests (see below), followed by a discussion between the guests and then with the audience.
Our guests are Ania Catherine and Dejha Ti, and Katja de Vries:
Ania Catherine and Dejha Ti are an award-winning experiential artist duo who founded their collaborative art practice, known as Operator, in 2016. Referred to as “the two critical contemporary voices on digital art’s international stages” (Clot Magazine), their complementary areas of expertise collide in large-scale conceptual works recognizable for their poetic approach to technology. Ti’s background as an immersive artist and HCI technologist, and Catherine’s as a choreographer, performance artist and gender scholar, make for a uniquely medium-fluent output, bringing together environments, technology and the body.
Operator has been awarded a Lumen Prize (Immersive Environments), ADC Award (Gold Cube), S+T+ARTS Prize (Honorary Mention), and MediaFutures (a European Commission funded programme). They’ve been speakers at Christie’s Art+Tech Summit, Art Basel, MIT Open Doc Lab, BBC Click, Bloomberg ART+TECHNOLOGY, Ars Electronica, Contemporary Istanbul, and CADAF. Ti and Catherine are originally from Los Angeles and currently based in Berlin.
Title: Soft Evidence – Synthetic cinema as contemporary art
Abstract:
Art has always explored notions of truth and fiction, and the relationship between image and reality. Synthetic media’s capability to depict events that never happened makes that relationship more complex than ever. How can artists use synthetic media/deepfakes creatively and start conversations about ethics and the social implications of unreliable realities? In this presentation, artist duo Ania Catherine and Dejha Ti of Operator discuss their work Soft Evidence, a slow synthetic cinema series created as part of MediaFutures in 2021. They will detail how research and interviews with experts on media manipulation in law, education, and activism informed their creative and technical processes. As experiential artists, Ti and Catherine plan to exhibit Soft Evidence as an installation: a site for the public to learn about and process a rapidly changing media landscape through immersion and feeling states.
Katja de Vries is an assistant professor in public law at Uppsala University. Her work operates at the intersection of IT law and philosophy of technology. Her current research focuses on the challenges that AI-generated content (‘deepfakes’ or ‘synthetic data’) poses to data protection, intellectual property and other fields of law.
Title: How can law deal with the counterfactual metaphysics of synthetic media?
Abstract:
How can law deal with deep fakes and synthetic media? Law is influenced by the politics, norms and ontologies of the society in which it operates, but is never exhausted by them. Law always first and foremost obeys an already existing system of parameters, rules, concepts and ontologies, to which new elements can only be added incrementally. This contributes to legal certainty and foreseeability, as well as to law’s slowness to adapt.
The EU legislator is trying to adapt to new digital challenges and opportunities by creating a true avalanche of legislation. In the case of deep fakes and other synthetic media, however, the question is whether operative concepts such as transparency and informed consent, and dichotomies such as fact v. fiction, human v. machine, etc., work well with the counterfactual metaphysics of synthetic media, namely the articulation of what is possible into digital mathematical spaces of seemingly endless alternative realities, and extensions in time and space. More concretely: is it important to simply flag that we are interacting with a synthetic work? Can we consent to living on forever in disseminating digital alter-egos?