hypernatural gardening


360° video for Biosfera (4m spherical display)

awarded ‘Special Mention’ Biosfera Prize
Italian Capital of Culture 2024 Pesaro
Piazza del Popolo, Pesaro, IT
10 01 25 – ?


Hypernatural Gardening
360° video, 4-channel spatialized sound
6:00, 2240×1084 px
2025


Hypernatural Gardening explores the intersection of digital hypernaturalism, more-than-human perspectives, and postdigital glitch-feminist thought. The work draws inspiration from James Bridle’s Ways of Being, which introduces a more-than-human framework of intelligence rooted in unknowing, diversification, and coexistence with non-human systems. The work rejects anthropocentric analysis, instead embracing nature’s intrinsic randomness to shape its artistic process. Algorithms based on natural phenomena, such as fractals and Fibonacci sequences, intertwine with nature-simulating constructs like Perlin noise and Voronoi diagrams to form a complex digital ecosystem.
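As an illustrative sketch only (not the work’s actual generative code), the Fibonacci and Voronoi constructs mentioned above can be combined in a few lines of Python: golden-angle (phyllotaxis) placement scatters points in a sunflower-like spiral, and a nearest-seed test assigns each point to a Voronoi region. The function names are hypothetical.

```python
import math

# The golden angle, derived from the Fibonacci ratio (~137.5 degrees).
GOLDEN_ANGLE = math.pi * (3.0 - math.sqrt(5.0))

def phyllotaxis(n):
    """Scatter n points in a sunflower-like spiral using the golden angle."""
    points = []
    for i in range(n):
        r = math.sqrt(i)              # radius grows so point density stays even
        theta = i * GOLDEN_ANGLE      # each point rotated by the golden angle
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

def voronoi_region(p, seeds):
    """Return the index of the nearest seed, i.e. the Voronoi cell p falls in."""
    return min(range(len(seeds)),
               key=lambda k: (p[0] - seeds[k][0]) ** 2 + (p[1] - seeds[k][1]) ** 2)
```

Feeding the spiral points through the Voronoi test partitions the organic scatter into cells, a small instance of the natural-algorithm/nature-simulation interplay the text describes.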

The visual narrative unfolds through a blend of imperfect 3D scans from an archive built over the last 5 years. These 3D scans, created using photogrammetry, capture the transience of natural environments, where the movement of wind and other natural forces introduce glitches and artifacts into the digital representation. These imperfections—holes, overlaps, and distortions—are not corrected but celebrated, becoming a part of the work’s aesthetic language. The resulting landscapes are both familiar and alien: fragmented ecosystems collaged from various European environments, reconstructed as dynamic, pulsating point clouds.

By working with the “glitches” introduced by the process and technologies used, the artwork reflects on the fluid boundaries between simulation and representation. It embraces the multiplicity of particles and fragments, creating a visual experience where meaning arises not from isolated units but from their collective interrelations within the spherical display.

Hypernatural Gardening leverages photogrammetry and 3D scanning techniques to turn natural environments into detailed point clouds. Because each scan is assembled from multiple photographs taken over time, the process inherently integrates time as a medium, recording subtle shifts in light, wind, and movement, as well as the algorithms with which the reconstruction software interprets the data. These temporal variations introduce artifacts and glitches, which are transformed into visual features. The point cloud data serves as a foundation for constructing immersive 3D worlds.
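A hypothetical sketch of this attitude toward scanning artifacts: where a conventional pipeline averages repeated captures of a scene into one stable surface, the approach described here keeps every divergent sample, so wind-shifted geometry survives as visible glitches. The data layout and function name below are illustrative, not the actual pipeline.

```python
def merge_captures(captures, keep_glitches=True):
    """Merge repeated scans (lists of (x, y, z) points) into one cloud.

    keep_glitches=True retains every sample, so points displaced between
    captures (wind, moving light) remain as overlapping 'ghost' geometry.
    keep_glitches=False averages point-by-point, smoothing the motion away.
    """
    if keep_glitches:
        return [p for capture in captures for p in capture]
    n = len(captures)
    return [tuple(sum(c[i][axis] for c in captures) / n for axis in range(3))
            for i in range(len(captures[0]))]
```

The `keep_glitches` default encodes the aesthetic choice: duplication and distortion are data, not noise.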

Using Blender, these point clouds are composited into immersive digital environments. Algorithms simulate natural behaviors, adding layers of complexity and motion to the static scans. The 360-degree animated video is then rendered specifically for a spherical screen. The spherical format’s unique geometry complements the point cloud aesthetic, where individual particles appear as luminous spheres, contributing to a larger, cohesive image only perceivable in its entirety.
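The 360° render for such a display typically relies on an equirectangular projection, in which every view direction maps to a pixel of the frame. The sketch below shows the core of that standard mapping (a common formula, assumed here; not code from the work itself):

```python
import math

def direction_to_equirect(x, y, z):
    """Map a 3D view direction to (u, v) in [0, 1]^2 on an equirectangular
    360-degree frame: longitude spans the horizontal axis, latitude the
    vertical. The direction (0, 0, -1) is treated as 'forward' and lands
    at the centre of the frame."""
    length = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, -z)          # -pi..pi around the viewer
    lat = math.asin(y / length)      # -pi/2..pi/2 up/down
    u = 0.5 + lon / (2.0 * math.pi)
    v = 0.5 - lat / math.pi
    return u, v
```

On the spherical screen the same mapping runs in reverse: each physical point of the sphere reads the pixel for its own latitude and longitude, which is why the image only resolves when seen as a whole.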

The audio design combines software synthesizers and field recordings processed through granular synthesis; the final versions will be mixed in ambisonics for the 4-channel spatialized output. The result is a multidirectional soundscape that mirrors the visual complexity, guiding viewers through the fragmented yet interconnected ecosystems.
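Granular synthesis, as referenced above, slices source audio into short overlapping “grains” and reassembles them. A minimal pure-Python sketch of the core overlap-add step (illustrative only, not the work’s sound engine):

```python
import math

def hann(n):
    """Periodic Hann window; adjacent 50%-overlapped windows sum to 1."""
    return [0.5 - 0.5 * math.cos(2.0 * math.pi * i / n) for i in range(n)]

def granulate(samples, grain_len=256, hop=128):
    """Chop `samples` into windowed grains and overlap-add them back.

    With hop = grain_len // 2 the windows tile seamlessly; shrinking the
    hop, scrambling grain order, or drawing grains from different sources
    produces the shimmering clouds of sound typical of granular synthesis.
    """
    window = hann(grain_len)
    out = [0.0] * (len(samples) + grain_len)
    pos = 0
    while pos + grain_len <= len(samples):
        for i in range(grain_len):
            out[pos + i] += samples[pos + i] * window[i]
        pos += hop
    return out
```

Each grain would then be panned independently in the ambisonic field, scattering the sound around the listener the way the point cloud scatters geometry around the sphere.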

Together, the visual and auditory elements construct a meditative space where the interplay of technology and nature invites contemplation of the fragile, interconnected systems that sustain life.