/ Third stone from the sun suggests a 501-page idea about (changing) form, color and material: ideas arising from ideas.
/ Third stone from the sun
2024
The formation of a simple idea takes roughly 500 milliseconds to a few seconds. That means there are over 500 pages, or milliseconds, of ideas inside /third stone from the sun.
The algorithms used to process the images include several parameters and prompts that produce a 'blind', not necessarily recognizable outcome. This idea became the central organ of our imagination: believing, defining, thinking. Somewhere in space, as the /third stone from the sun.
*Made for humans.
For Rob Hameka (Deventer).
Album Cover Direction
2021
Rob Hameka, better known as Rob Dekay, is a Dutch singer-songwriter known for his soulful country-pop style. We designed the updated album cover and established the visual direction for 'Rode Wijn'.
Generative works portraying the last flights of radio-controlled objects, as a narrative telling a story about Pa (Dad).
Untitled (Pa)
2023
The original footage was recorded with a Canon Legria FS200. What was meant to be the first flight in a long time unexpectedly became the last. While the batteries for the radio-controlled planes were charging, our house caught fire — along with the other planes stored in the shed. This event became the core focus of the project, as the anticipated beginning turned into an unintended goodbye.
Fortunately, there was footage from that first and last flight. These records were used as a starting point to tell a narrative about complex relationships and the feeling of being deeply connected to someone else. Some of the recorded plane movements were recreated and integrated into the visual narrative, reflecting both the flight and its symbolic end.
A framework in TouchDesigner was used to choreograph the motion and structure of the imagery, guiding how the story unfolded visually.
Art direction and character development for Quest 1607 (an upcoming board game), generated from curated datasets and algorithms. Source images are from the MET Open Access collection and Rijksstudio.
Quest 1607 Development
2020
Quest 1607 is a proposal for a board game developed in collaboration with a creative partner. The game is set in the early 17th century, in the fictional land of Quest 1607. Here, the Netherlands, Portugal, Spain, and England are engaged in a battle for prestige. This cluster of islands is a place of knowledge, discovery and opportunity for the nations involved.
Delegates from each country have been sent on a mission to return home with as much new knowledge and insight as possible. The first to do so will gain — and maintain — a crucial advantage over the others. The goal is simple: Complete as many Quests as possible and prove your ability to lead your country into the future. A challenging and highly responsible task.
For the design, we focused heavily on character development to enhance the narrative of the story. We used thousands of open-source scans of early 17th-century paintings and converted them into isolated faces. This dataset was then used to develop custom algorithms capable of generating entirely new faces. The character's face shown in the small square to the left of 'Quest 1607 Development' represents the first strong result of this process. Since 2020, it has demonstrated the potential of combining historical data with algorithms and has taken the direction of this proposal to a new level.
Quest1607
How the policies of generative AI can be bypassed to produce content they are not supposed to produce, according to the companies behind the algorithms and their stated guidelines.
Policy intimacy
2023
Policy intimacy is a project that investigates the limitations, contradictions, and biases within the policy frameworks of generative AI models. While these systems are said to be incapable of producing certain types of visual content—particularly explicit or sensitive material—the data on which they are trained tells a different story. The internet, their primary source, overflows with such content. As a result, the latent layers of these models are infused with patterns, structures, and associations that echo this reality.
Though official policies claim to impose strict restrictions on what generative models are allowed to create, these boundaries are often superficial. These rules can sometimes be bypassed quite easily by slightly changing the context or the words used. Even a simple and suggestive word like 'wet' can, in the right sentence, lead the model to create biased or unexpected images—ones it technically shouldn't be able to make.
This project is not about violating terms or seeking provocation, but rather about exposing the fragility and inconsistency of these protective measures. It reveals how the supposed ethical safeguards can be undermined by the very data they are built upon. Every aspect of this investigation was conducted with care and intention. At no point were terms of use or platform policies explicitly breached.
ESP32 installation setup combining XY data from a specific moment in time with a film narrative, producing a new moving translation.
We're not really strangers
2024
We extracted XY data from a specific moment in time to translate the exact physical motion of the dune seed into a mechanical movement. Using the base of the seed as a fixed reference point, we tracked its trajectory frame by frame. These Cartesian coordinates were then converted into polar coordinates, enabling a clearer understanding of the motion in relation to rotation and angle.
This polar data was used to create a stream of numerical values over time, each corresponding to the seed's exact position and movement within the timeframe. These values were then interpreted by a servo motor, connected to an ESP32 microcontroller, which transformed the data into physical rotation. Through this process, the natural motion of the seed was faithfully reconstructed as a mechanical translation — bridging recorded organic movement and kinetic output.
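The conversion and servo mapping described above can be sketched in a few lines. This is an illustration only, not the project's own code: the function names, the 0–180° servo range, and the 1–2 ms pulse mapping are all assumptions (on the ESP32 itself, this logic would drive a PWM channel, for instance via MicroPython or the Arduino core).

```python
import math

def to_polar(x, y):
    """Convert a tracked tip position (relative to the fixed seed base)
    from Cartesian coordinates into polar: radius and angle in degrees."""
    r = math.hypot(x, y)
    theta = math.degrees(math.atan2(y, x))
    return r, theta

def angle_to_duty(theta, freq_hz=50, min_ms=1.0, max_ms=2.0):
    """Map an angle in [0, 180] degrees to a hobby-servo duty cycle.
    Typical servos expect a 1-2 ms pulse within a 20 ms (50 Hz) frame;
    these figures are assumptions, not the project's actual hardware spec."""
    theta = max(0.0, min(180.0, theta))
    pulse_ms = min_ms + (theta / 180.0) * (max_ms - min_ms)
    return pulse_ms / (1000.0 / freq_hz)  # fraction of the PWM period

# Example: one tracked frame with the tip at (3, 4) relative to the base
r, theta = to_polar(3.0, 4.0)   # r = 5.0, theta ~ 53.13 degrees
duty = angle_to_duty(theta)
```

Fed a frame-by-frame stream of such angles, the servo reproduces the seed's recorded rotation over time.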
Part of 'Its always the binary (Red)'. Two red flags were featured during the exhibition.
Red R(255), G(0), B(0)
2021
If two identical reds (R255, G0, B0) appear visually the same to the human eye, but the underlying binary data differs — what does that mean?
Imagine copying a single red pixel a million times to create a 1000 × 1000 image that perfectly matches your original in appearance.
Although it looks identical, the data reveals it is not the same file.
If an identical visual can be reconstructed from entirely different binary code, can true copyright ever apply to pixel-based images?
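The thought experiment above is easy to reproduce. The sketch below is illustrative only: zlib stands in for any lossless image compressor, and no specific file format is implied. One red pixel is copied a million times into a 1000 × 1000 buffer, and two losslessly compressed "files" of that buffer come out byte-for-byte different while decoding to identical pixels.

```python
import hashlib
import zlib

# One red pixel (R=255, G=0, B=0), copied a million times
# into a raw 1000 x 1000 RGB buffer.
red_pixel = bytes([255, 0, 0])
image = red_pixel * (1000 * 1000)

# Two lossless compressions of the same image: different settings
# produce different bytes on disk (even the headers differ).
file_a = zlib.compress(image, level=1)
file_b = zlib.compress(image, level=9)

print(file_a == file_b)   # False: the binary data is not the same
print(hashlib.sha256(file_a).hexdigest() == hashlib.sha256(file_b).hexdigest())
                          # False: by any checksum, these are different files
print(zlib.decompress(file_a) == zlib.decompress(file_b) == image)
                          # True: the pixels, and the visual, are identical
```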
The definitive 501-page idea (book) about (changing) form, color and material.
/ Third stone from the sun
2024
The book consists of 501 risoprints — direct copies of the sketch pages I typically work with. Each page was first risoprinted and then layered with laser-printed content. Together, they create a frame-by-frame animation of 501 moments of thought, exploring ideas around form, color, and material. The pages flow into one another, forming a continuous loop where each image evolves from the last, and old ideas make space for new ones.
The book is glue-bound, finished with a sturdy back cover, and features a silkscreened front — serving as the cover for /third stone from the sun.
Short film with generated voice data. Our world, our bodies, our very essence are all cloaked in the shroud of our own beliefs. We're left with the ability to define and predict. We don't know a single thing.
We know Nothing
2024
This film explores the fragility and impermanence of knowledge, and how variable our understanding truly is—especially considering that a significant percentage of our world remains unexplored, perhaps indefinitely. We question whether anything can ever be truly known, or if we're simply navigating a world through predictions and assumptions. In a reality flooded with data and information, it becomes increasingly difficult to define anything with certainty. This leads to a kind of analysis paralysis: we're overwhelmed by input, yet struggle to determine what's real.
The script for this film was translated into hundreds of fragmented audio pieces composed from data, using the voice of Sir David Attenborough. Typically known for defining life and explaining the natural world, his voice here becomes a vessel of doubt—questioning the very knowledge he once affirmed. The final film is an overload of thoughts about the things we cannot know: a flood of information about our inability to possess true information. In the face of everything we cannot grasp, we are left only with questions — and the passing thought that perhaps, in all our seeking, we know nothing at all.
Part of 'We're not really strangers'.
Dune seed short film
2024
The short film captures the wind as it flows toward the dune seed, setting the object in motion. The seed was buried, which became the project’s point of origin. From there, we tracked the movement of the seed’s tip over time. This motion data was initially recorded in Cartesian coordinates and later converted into polar coordinates to develop the code for the ESP32. The risoprints of this specific moment in time — preserving the seed’s movement as a visual record — can be found further up in the section ‘We’re not really strangers’.
A series of lightweight silkscreen prints featuring black letters on a completely black background. The visibility of the prints is highly influenced by the available light, due to the layered use of identical pigments.
Tabula Rasa
2021
The prints rest on sharp wooden puncture sticks, evoking a sense of weight through darkness and raw color: ultra-thin paper suspended delicately above something so unforgiving.
Drones performing choreographed movements with letters as indicators of language and sound.
Somehow everywhere
2025
Somehow Everywhere is a project in which drones perform choreographed movements, using suspended letters as visual indicators of language and sound. These letters move through space, composing and decomposing. The project investigates how language can be spatially and temporally experienced.
In a second iteration of the work, sound becomes linked to motion. The drones not only carry letters but also translate their altitude into pitch, functioning as a vertical musical scale. Each position determines the tone it produces, generating a real-time audio composition — effectively transforming the space into a performative score where drones act as both instruments and composers, deconstructing and reconstructing communication through choreography, mechanics, and sonic data.
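The altitude-to-pitch idea can be sketched as a simple mapping. Everything here is an assumption for illustration: the altitude range, the note range, and the linear mapping are not specified by the project; the sketch only shows one plausible way a vertical position could become a tone.

```python
def altitude_to_pitch(altitude_m, min_alt=0.0, max_alt=10.0,
                      low_note=48, high_note=72):
    """Map a drone's altitude linearly onto a MIDI note number
    (C3 to C6 here, an assumed range), then convert that note to a
    frequency using equal temperament with A4 = 440 Hz."""
    t = (altitude_m - min_alt) / (max_alt - min_alt)
    t = max(0.0, min(1.0, t))                    # clamp to the flight envelope
    note = low_note + t * (high_note - low_note)
    return 440.0 * 2 ** ((note - 69) / 12)       # MIDI note 69 = A4 = 440 Hz

# A drone hovering halfway up the range sounds middle C (~261.6 Hz)
freq = altitude_to_pitch(5.0)
```

Streaming each drone's live altitude through such a function yields the real-time audio composition described above, with the airspace acting as the score.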
Questioning binary code and the copyright of color. By altering the final bits in the binary data behind color, computers can generate entirely different binary strings that produce the exact same visual output — including an identical R(255), G(0), B(0) result.
Its always the binary (Red)
2021
This project explores how digital systems interpret images through binary data, while humans perceive only the visual result. A small shift in binary code — for example, from 000101 to 000100 — may result in a different file, yet the displayed color remains visually identical.
This raises a fundamental question: if an identical visual can be produced from different underlying code, can true copyright ever apply to pixel-based images?
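One concrete way the bit-flip idea plays out, sketched here under an assumption the project does not itself state: a source stored at 16 bits per channel, quantized down to the 8 bits a typical display shows. Flipping the last bit changes the stored binary data but not the displayed color.

```python
def display_8bit(value_16bit):
    """Quantize a 16-bit channel value (0-65535) down to the 8-bit
    value (0-255) a typical display actually shows."""
    return round(value_16bit * 255 / 65535)

a = 0b1111111111111111  # 65535: full-intensity red channel
b = 0b1111111111111110  # 65534: the final bit flipped

print(a == b)                             # False: different binary data
print(display_8bit(a), display_8bit(b))   # 255 255: visually identical
```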