During self-isolation I’ve been revisiting past work, expanding my knowledge set, and gradually building new work in TouchDesigner, Processing, Blender, Unity, and other tools.
Working with OSC and fungi, I’ve been exploring how mycelium might influence visuals and a multiplayer game, similar to my work with Mycelium Music through bio-sonification.
Symbiosis +/- Dysbiosis is a multiplayer experience. The first iteration involves Human players and the choices they make within the world of Nanotopia. Nanotopia is very similar to Earth, with multiple species and unusual organisms that dwell in dirt, air, and water: lichens and mosses, flowers, trees, corals, sponges, mud teeming with life. How will the Human players navigate this world? The Humans must pair with various microbes in order to survive, and also to evolve.
The Fungi and other frilly, floofy, tentacled ones are the ‘wild card’ players, if you will. These fungi and microbes might pair with Humans, offering help, special abilities, or food, or they might create illness and disease. Ultimately, it is up to the Humans, and how they move within this world, to decide how the non-Humans react and respond, and whether or not the non-Humans choose to pair for Symbiosis, which can amount to evolution.
Dysbiosis is the condition of having imbalances in the microbial communities in or on the body: internal, external, and in the environment.
COVID-19, for instance, could be seen as a global dysbiosis for humankind.
Dysbiosis/The Air We Breathe is about our collective environment: our microbiome, collectively and individually speaking; how our presence within our shared environments is felt, and what we leave behind; how environments affect us; and how, on a microbial level, everything connects and everything interacts. Some of it for good, some not-so-good, some deadly.
With COVID-19 and other potential hazards in our air streams, the concept here is to visualize our multi-species entanglements, our interactions, and our awareness (or unawareness) of how we impact the nonhuman world.
The installation can be a purely VR/AR environment, web-based, and/or include physical elements that employ haptic sensors within a space alongside the mixed reality. The physical elements are soft-circuit soft sculptures.
Everything in the environment is visualized in its microscopic forms: the bacteria and fungi that populate our tissue, hair, exhales, and the floors; other organisms in this environment, such as birds and cats; and how our microbes intermingle when we come close to one another, when we touch. In order to make the subjects in the space more legible, human and other life forms are outlined; however, the biomes spill over and out. They are not confined (as in reality); the outlines here are used as a marker of sorts.
Air readings are taken from within the space as well as the surrounding area in which the installation takes place. This data is shown as percentages as well as within the virtual space. In some circumstances, some of this data will be sonified to create a soundscape within the installation. Kinect sensors map visitors in the space in order to project their microbiome; TouchDesigner will be used for some of these visuals.
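As a rough illustration of the sonification step, here is a minimal sketch of how air-quality readings might be mapped to sound parameters. The sensor names, value ranges, and pitch/amplitude mappings are all my assumptions for illustration, not the installation's actual calibration:

```python
def normalize(value, low, high):
    """Clamp a raw sensor reading into the 0.0-1.0 range."""
    if high == low:
        return 0.0
    return max(0.0, min(1.0, (value - low) / (high - low)))

def sonify_reading(voc_ppb, pm25_ugm3):
    """Map VOC and fine-particulate readings to pitch and amplitude.

    Higher VOC load raises pitch (a more agitated tone); higher
    particulate load raises amplitude. Ranges are illustrative only.
    """
    voc = normalize(voc_ppb, 0, 1000)    # assumed VOC range, ppb
    pm = normalize(pm25_ugm3, 0, 150)    # assumed PM2.5 range, ug/m3
    pitch_hz = 110 + voc * (880 - 110)   # sweep roughly A2 to A5
    amplitude = 0.2 + pm * 0.8           # never fully silent
    return pitch_hz, amplitude
```

In practice these values would feed a synth voice (for example, over OSC into Ableton or a CHOP network in TouchDesigner) rather than being computed in isolation like this.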
In 2012-14 I started building large- and small-scale phytoplankton, fungi, and bacteria out of glass and sometimes metal: copper and sterling silver.
In 2013/14 I received grants from the Glass Art Association of Canada and the Ontario Arts Council towards creating the working space to do so, supplies, and time to create sketches in flame-worked and cast glass.
The concept was to create macro versions of pollens, fungi, bacteria, dust/dirt, phytoplankton, and other particulate found in our airstreams. Using a Vitrigraph kiln arrangement, I started building up murrine, the glass cane that would create the textural elements. It was amazing to create custom colour palettes and to work in my super-small studio much as one would in a hot shop (a glassblowing studio). For the most part this initial stage was successful; however, the kiln and space required to create large renderings were out of my means, and the only available studio is in Brooklyn, NY, while I am in Toronto. So I created a foundation that could potentially be utilized at a later date.
Since the 90’s I have been fascinated with the concept of virtual worlds and creating them! The work of Jaron Lanier only added to this. Plus, he coined the term (or maybe William Gibson did? Who knows!), and it sounded amazing. Add to that my dreams as a small child in the 70’s of cars that had windscreens like televisions (for lack of a better word then) and motorcycles that turned into robotic suits around the rider (think Transformers, but in ’71/’72, when I don’t think that even existed yet!). I have always been a child of SciFi.
In 2015/16 I participated in workshops that taught Unity game development and AR. I now believe that mixed reality, perhaps working with Magic Leap, HoloLens, or similar (like an AR contact lens!), is the way to create the visualizations I am considering.
I have included information below from the first iteration of The Air We Breathe.
Imagine if you could actually see, touch, feel and perhaps hear everything around you on a microscopic level. What would that be like? Would you want to consciously interact with these organisms, and molecular structures? Would your conscious interaction(s) change the environment in any way, positive or negative?
The air we breathe invites participation through several non-linear sensory components.
Auditory: highly sensitive microphones record, filter and play back (live) breathing within the installation.
Touching and feeling the Phytoplankton, Zooplankton, Radiolarians, and Melethallia creates generative sounds and displays immediate air quality (VOCs, particulates, biogenic compounds, etc.).
As the night progresses, and the air quality diminishes/increases, the structures within the space appear to grow or decrease in number. Mirrored floor and walls create the illusion of infinite space. Infinite air.
Significance: To draw a direct link between the Oceans’ health, air quality, and Humanity; to open a positive dialogue concerning the climate crisis and how events like Fukushima, war, and fossil fuel use/production all contribute to the air we breathe, and perhaps even to our shared microbiome.
Between the Earth and the Sky IS the possibility of everything: our air, our oxygen. Without it there is no possibility.
Project description: The human body is inhabited by trillions of microorganisms such as bacteria, viruses, and fungi, which we call the microbiome. Boreal forests, or taiga, represent the largest terrestrial biome. Forests occupy approximately one-third of Earth’s land area, account for over two-thirds of the leaf area of land plants, and contain about 70% of the carbon present in living things. They have been held in reverence in folklore and worshipped in ancient religions. However, forests are becoming major casualties of civilization as human populations have increased over the past several thousand years, bringing deforestation, pollution, and industrial usage problems to this important biome.

For ‘Life As We Know It’, Terán and Vogl propose a fully immersive, mixed reality environment that incorporates living mycelium, soft tactile interfaces, and bio-sonification in a virtual reality environment. The virtual environment is very similar to Earth, with multiple species and unusual organisms that dwell in dirt, air, and water within a Boreal forest, under a tree. How will the Human visitors navigate this world? The Humans must pair with various microbial and mycelial life in order to survive, in order to evolve. The fungi (mycelium) and other frilly, tentacled ones are the ‘wild card’ players, if you will. The non-Humans might pair with the Visitors, offering help, special abilities, or food; they might create illness and disease, or not appear at all. Ultimately, it is up to the Humans, and how they respond to and within this environment, to determine how the non-Humans react and respond: whether the non-Humans choose to pair for Symbiosis, which can lead to evolution, or Dysbiosis, which can end in a COVID-19 scenario, or worse.
In the physical gallery environment: Tactile interfaces are suspended from servo motors and attached to the wall. Depending on COVID-19 related restrictions, a tactile, bean-bag-like seating situation might exist, or a Subpac vest; visitors are able to stand and walk around regardless. The tactile interfaces utilize soft-circuit switches, buttons, and haptics that trigger input tied directly into the virtual environment. The servo motors lower individual mycelial organisms when triggered by a Visitor’s bio-data*. *Terán’s bio-sonification modules and/or a heart rate sensor that can read blood-oxygen levels are placed onto Visitors, sending realtime biodata that triggers the servo motors to lower a particular tactile interface. A special housing hosts living mycelium cultures, which send bio-data/sonification into the virtual environment. Depending upon the Visitor’s bio-feedback, the mycelium might influence the Visitor’s experience within the VR environment. Because the mycelium is alive and experiencing the environment throughout the installation, it, too, will be reacting and changing over this course of time.
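To make the bio-data-to-servo chain concrete, here is a minimal sketch of the decision logic that might sit between the heart rate/blood-oxygen sensor and the servo motors. The thresholds, interface names, and angle mapping are illustrative assumptions on my part, not the piece's actual calibration:

```python
def select_interface(heart_rate_bpm, spo2_percent):
    """Return the name of the mycelial interface to lower, or None.

    A calm visitor (lower heart rate, good oxygenation) draws down a
    different organism than an agitated one. Thresholds are assumed.
    """
    if spo2_percent < 90:          # implausibly low reading: ignore it
        return None
    if heart_rate_bpm < 65:
        return "calm_mycelium"     # hypothetical interface identifier
    if heart_rate_bpm < 100:
        return "neutral_mycelium"
    return "agitated_mycelium"

def servo_angle_for(interface, lowered=True):
    """Map a selection to a servo target angle in degrees (0 = raised)."""
    return 180 if lowered and interface else 0
```

In the installation itself, the returned angle would be sent on to the motor controller (for example, over serial or OSC) rather than just computed.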
From past bio-sonification performances and installations, I have seen people grow empathic, and even compassionate, towards non-human organisms such as fungi and trees, where prior to ‘hearing’ the fungus or tree they might have admired its beauty or viewed it as deadly, poisonous, or even without life. People have literally been moved to tears when experiencing mycelium bio-sonification; others are no longer able to cut flowers they once harvested for bouquets. The emotional power of sonification is palpable.
I propose we humans pair with nature and bio-mimic mycelial nets and, perhaps, their modes of thinking and being.
Embodiment is not only a source of knowledge about the outside world; it represents a major way people acquire information. Interactive virtual worlds open new ways of self-exploration and widen users’ minds through unseen perspectives and dimensions. The power of VR to influence our minds through direct audiovisual input opens a new playground for arts and culture to flourish, and for systemic problems to be overcome through knowledge and awareness gathered by perceiving and discovering other perspectives and dimensions of being.
Looking at research on how our environment influences us and affects our emotional state, and how our emotional state affects us at a deep and long-term level, we are now realizing what a powerful tool we have in our hands. I am using this information as a key point in imagining virtual worlds that support our healing systems: working with the mind-body connection to respect and enhance one’s capacity for self-knowledge, self-healing, and self-care through evidence-based techniques. Virtual Reality, I believe, can enrich humanity and enhance the positive aspects of our culture and civilization through empathy, exploration, and learning. When defining this new virtual landscape and society, we need to bring our roots: include rituals and techniques to ground ourselves and stay in touch with the energy we share with nature. Where does spiritual growth lie in a digital future, where we can mirror ourselves in whole new ways and make the invisible visible, or even tangible and audible? Understanding that all of these powerful factors can be used to enable negative as well as positive impact is crucial in creating meaningful, transformative VR content.
Sara Lisa Vogl is an immersive media artist and futurist who collaboratively explores and constructs new virtual and augmented realities to inspire, unite, and enrich humanity. With a background in communication arts & interactive media, and in love with the idea of new worlds, Sara’s mission is to go beyond the status quo of what immersive virtual realities are and to explore their diverse potentials for the future. Besides directing and developing new XR experiences, Sara curates long-term VR trips and guides people on these trips as the world’s first VR Shaman. In her current engineering studies, learning about humanoid robotics, the Berlin-based futurist is deepening her knowledge of sensors, haptics, and the intersection between algorithms and real worlds.
I’ve been trying to get back up to speed with my basic-to-intermediate knowledge here. The primary issue is really my computer (a circa-2013 model), which enables or inhibits my ability within TouchDesigner; for now I am unable to work in GLSL. Here is a video of a reactive piece using fractals and some Python. I’ve set channels from Ableton Live (where my Eurorack plays in) to affect the X and Z axes of a sphere with noise and fractals, which creates the insect, fungi, and avian morphs. I am experimenting with the numbers here towards making more delineated insect and fungi formations. Perhaps this is not the correct approach, but again, my Mac is old. Regardless, for now I am satisfied with the reactive imagery I am able to create with TouchDesigner. For this output I had to lay the audio into the video in Final Cut; during the output from TD, however, the music was playing at the same time, and the visuals were responding to the sounds live.
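The audio-to-geometry mapping described above can be sketched in plain Python. In the actual patch this happens inside TouchDesigner's CHOP/SOP networks; this standalone version only illustrates the idea, and the noise function, channel names, and scaling are my assumptions:

```python
import math

def value_noise(t, seed=0.0):
    """Cheap deterministic 1-D noise, bounded roughly within [-1, 1]."""
    return math.sin(12.9898 * (t + seed)) * 0.5 + math.sin(4.1414 * t) * 0.5

def displace_sphere_point(x, y, z, level_x, level_z, time):
    """Push one sphere-surface point along X and Z by audio-driven noise.

    level_x and level_z stand in for two Ableton channel levels (0.0-1.0)
    arriving over OSC; Y is left untouched, as in the piece described.
    """
    dx = level_x * value_noise(time + x)        # channel 1 drives X
    dz = level_z * value_noise(time + z, 7.3)   # channel 2 drives Z
    return (x + dx, y, z + dz)
```

Applied per-vertex over a sphere each frame, silent channels leave the sphere intact, while louder channels push its surface into the lumpier, organism-like forms.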