How can artificial intelligence and speculative design help to strengthen biodiversity?
The ongoing anthropogenic loss of biodiversity, often called a “sixth extinction” of species, will impact Earth forever, threatening natural ecosystems and jeopardizing our own species’ wellbeing. Consequently, lessening this loss of life must be a prime responsibility of humankind. Could artificial intelligence (AI) technologies help in this effort? Extrapolating from today’s possibilities in AI and the life sciences, Artificial Affinity speculates on how we could help wild species adapt more rapidly to anthropocentric environments, and vice versa, increasing non-human animals’ chances of survival and giving their perspective a greater presence in the human world.
Taking inspiration from a phenomenon called ‘industrial melanism’, observed in the peppered moth (Biston betularia), the artists chose moths as their example and imagined what it would mean to consider the natural world the ‘user’ of our designs. Crucially, Artificial Affinity partially replaces the cognitive-creative work of a human, for instance an urban planner wishing to make an environment more amenable to insects, with that of a machine learning algorithm that might ‘see’ patterns we cannot.
The resulting system exchanges deep features between sets of images of moths and of the city of Berlin, making the two aesthetically adapt to one another in the process of machine learning. The first set of results shows individual moths that have taken on features of the city; the second shows textures of the city that have taken on a moth-like character.
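The translation between the two image domains rests on CycleGAN’s cycle-consistency idea: a generator maps moths toward the city’s visual style, a second generator maps back, and training penalizes any loss of the original content along the round trip. The sketch below illustrates only that objective, using toy random linear maps in place of the project’s actual convolutional generators; the names `G`, `F`, and all dimensions are illustrative assumptions, not the authors’ code.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_generator(dim):
    """Toy 'generator': a fixed near-identity linear map (stand-in for a CNN)."""
    W = np.eye(dim) + 0.1 * rng.normal(size=(dim, dim))
    return lambda x: x @ W

dim = 8                   # stand-in for flattened image features
G = make_generator(dim)   # X -> Y  (moth features -> city features)
F = make_generator(dim)   # Y -> X  (city features -> moth features)

x = rng.normal(size=(4, dim))   # a batch of "moth" feature vectors

# Cycle-consistency loss: translating X -> Y -> X should recover X.
# In CycleGAN this L1 term is minimized alongside adversarial losses,
# so the style transfer keeps the underlying content intact.
cycle_loss = np.mean(np.abs(F(G(x)) - x))
print(f"cycle-consistency L1 loss: {cycle_loss:.4f}")
```

In the real system this constraint is what lets unpaired moth and cityscape photos be mapped onto one another without any one-to-one correspondence between the two image sets.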
The artwork contemplates that humans may not always be the best or most reliable advocates of nature. The artists therefore suggest that the potential powers of AI could well play a role in the future of natural history.
Photos of moths: ImageNet
Photos of cityscape: Jan-Henning Raff
Algorithm: CycleGAN by Aitor Ruano based on research at UC Berkeley by Jun-Yan Zhu, Taesung Park, Phillip Isola and Alexei A. Efros
The project was developed as a cooperation between STATE and the application laboratory Mediasphere For Nature with the Museum for Natural History Berlin.