
Quantum Computational Creativity (QCC) is an emerging research program pioneering the direct transduction of quantum hardware execution into quantum computer music and time-based spatial visualization. Developed through a three-way creative symbiosis between human artistic vision, AI collaboration, and quantum physical systems, this work positions the quantum computer not as a calculator but as a generative artistic instrument — one whose interference patterns, entanglement topologies, and hardware signatures become the raw material of composition and visual art. Crucially, the creative material produced by this methodology is physically impossible to replicate through classical computing — the decoherence, gate errors, and entanglement-specific quantum signatures embedded in the output are authentic products of real quantum hardware execution, unavailable by any other means.
The research documented here represents the foundational phases of what may become a broader collaborative research group, inviting composers, visual artists, computer scientists, and quantum physicists to explore this emerging territory together.
The phases of the research and creative praxis, as it continues to evolve, are outlined below.
Phase I — Foundations

Phase I explores the fundamental feasibility of quantum hardware as a creative instrument, sonifying quantum interference patterns through a hybrid methodology combining local statevector simulation with IBM quantum hardware execution. Utilizing 10-qubit parametric circuits and Csound synthesis, this phase established the core workflow and validated the approach on real hardware — documenting the striking difference between classically simulated and physically executed quantum results, and laying the conceptual and technical groundwork for everything that followed.
Sonifying Quantum Interference Patterns Through
Hybrid Simulation and Hardware Execution
Michael Rhoades, PhD (PI)
Introduction
This research approaches musical composition as the generation of complex audio waveforms through computational processes, fundamentally departing from traditional notation-based methodologies. While acknowledging the significant artistic achievements of conventional compositional practice, this work pursues an alternative paradigm wherein generative algorithms, data sonification, machine learning, and quantum computing form the basis for sophisticated waveform synthesis. The compositional process is conceptualized as a collaborative interaction between artistic sensibilities and expanding computational capabilities, in which technological advancements open novel territories for sonic exploration.
This project investigates the sonification of quantum interference patterns generated through quantum computing (QC), utilizing both IBM quantum hardware and local Qiskit simulations. The research aims to create spatially distributed multichannel audio works wherein quantum mechanical phenomena, specifically quantum superposition, entanglement, and measurement collapse, directly inform both spectral content and spatial parameters. This approach positions quantum computing not merely as a tool for calculation, but as a generative system capable of producing acoustic material embodying fundamental quantum uncertainty and probability distributions.
Background and Motivation
Quantum computing offers exponential computational advantages over classical computing for specific problem domains, particularly quantum simulation, optimization, and interference pattern generation. These capabilities are directly relevant to generative audio synthesis. Unlike classical bits that exist in discrete states (0 or 1), quantum bits (qubits) exist in superposition: simultaneous linear combinations of basis states represented as complex-valued probability amplitudes. When multiple qubits interact through quantum gates, they become entangled, creating interference patterns wherein probability amplitudes reinforce or cancel according to their phase relationships. Upon measurement, this superposition collapses to a single classical outcome, with probabilities determined by the squared magnitudes of the quantum amplitudes.
This fundamental quantum behavior, the transition from coherent superposition to measured classical state, presents a unique opportunity for sonification. The pre-measurement quantum state contains rich interference patterns encoded in both amplitude and phase information across an exponentially large state space (2^n states for n qubits).
However, measurement fundamentally destroys phase information, collapsing the quantum superposition to yield classical probability distributions—this collapse mechanism is precisely what enables quantum computers to produce usable computational results. For the purposes of this sonification research, this characteristic necessitates a hybrid approach: extracting complete statevector information (amplitude and phase) from simulation while capturing real quantum hardware characteristics (noise, decoherence, measurement statistics) from physical execution, which provides optimal access to both the coherent quantum interference patterns and the stochastic measurement outcomes.
Technical Framework
Quantum Computing Environment
The research utilizes IBM’s Qiskit framework, an open-source software development kit for quantum computing, interfacing with IBM’s 156-qubit Heron-series quantum processors via cloud access. Development occurs within Jupyter notebooks, enabling iterative algorithm design, local simulation, and hardware execution within a unified environment.

AI-Assisted Development
Initial project estimates anticipated months of study to achieve sufficient proficiency in quantum algorithm development. This timeline was significantly compressed through collaboration with Claude (Anthropic’s large language model), which served as a technical intermediary translating conceptual sonification goals into executable Qiskit code. This AI-assisted methodology enabled rapid iteration between conceptual design and implementation, though all algorithmic decisions, quantum circuit designs, and compositional choices remained under direct authorial control. The use of large language models in this capacity represents an emerging paradigm in computational creativity research, warranting acknowledgment as both a methodological tool and a subject for critical examination.



Quantum Circuit Design
The research employs parametric quantum circuits designed to generate varied interference patterns through systematic variation of gate parameters. The initial implementation focuses on 10-qubit circuits (generating 2^10 = 1,024 quantum states), utilizing circuit families characterized by:
1. Superposition generation: Hadamard gates applied to all qubits
2. Entanglement structure: CNOT gates creating controlled correlations between qubits
3. Parametric rotation: RY and RZ gates with systematically varied angles across circuit variations
This circuit architecture creates dense entanglement and complex interference patterns while remaining tractable for near-term quantum hardware with limited coherence times.
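The statevector produced by a circuit of this kind can be sketched in plain NumPy without a Qiskit installation. This is a minimal illustrative stand-in for the actual Qiskit implementation; the gate ordering, rotation angles, and CNOT chain layout here are assumptions for demonstration, not the circuits used in the research:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(theta):
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

def apply_1q(state, gate, q, n):
    # Contract a single-qubit gate against index q of the n-index state tensor.
    psi = state.reshape([2] * n)
    psi = np.moveaxis(np.tensordot(gate, psi, axes=([1], [q])), 0, q)
    return psi.reshape(-1)

def apply_cnot(state, control, target, n):
    # Flip the target bit on the slice where the control qubit is |1>.
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1
    axis = target if target < control else target - 1  # slicing drops one axis
    psi[tuple(idx)] = np.flip(psi[tuple(idx)], axis=axis)
    return psi.reshape(-1)

def circuit_statevector(n, thetas):
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    for q in range(n):                     # 1. superposition: H on every qubit
        state = apply_1q(state, H, q, n)
    for q in range(n - 1):                 # 2. entanglement: CNOT chain
        state = apply_cnot(state, q, q + 1, n)
    for q in range(n):                     # 3. parametric RY/RZ rotations
        state = apply_1q(state, ry(thetas[q]), q, n)
        state = apply_1q(state, rz(0.5 * thetas[q]), q, n)
    return state

psi = circuit_statevector(10, np.linspace(0.1, 1.0, 10))  # 1,024 amplitudes
```

The resulting |ψ(i)| and arg(ψ(i)) values correspond to the amplitude and phase data extracted from statevector simulation.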

Hybrid Simulation-Hardware Methodology
The core technical innovation involves combining complementary data from two sources:
Statevector Simulation (local execution):
• Provides complete quantum state information pre-measurement
• Extracts amplitude values: |ψ(i)| for each basis state i
• Extracts phase values: arg(ψ(i)) for each basis state i
• Represents ideal quantum behavior without hardware imperfections
• Execution time: <1 second per circuit variation
Hardware Execution (IBM quantum processors):
• Measurement collapses superposition to classical outcomes
• Provides probability distributions across measured states
• Captures real quantum hardware characteristics: decoherence, gate errors, readout errors
• 4,096 repeated measurements per circuit to statistically sample the probability distribution across 1,024 quantum states
• Execution time: ~12 seconds per batch of 10 circuit variations
The hybrid approach combines simulation-derived phases (capturing quantum interference structure) with hardware-derived amplitudes (capturing physical quantum behavior), yielding data that reflects both ideal quantum mechanics and real-world quantum device characteristics.
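In code, this combination step amounts to pairing hardware-derived magnitudes with simulation-derived phases. A minimal sketch, assuming the hardware counts arrive as a mapping from basis-state index to measurement count (the function name and counts format are this sketch's own):

```python
import numpy as np

def hybrid_state(sim_statevector, hw_counts, shots):
    """Combine simulation phases with hardware-derived amplitudes.

    sim_statevector : complex array of 2**n ideal amplitudes
    hw_counts       : dict mapping basis-state index -> measurement count
    """
    probs = np.array([hw_counts.get(i, 0) for i in range(len(sim_statevector))],
                     dtype=float) / shots
    hw_amplitude = np.sqrt(probs)           # |amplitude| = sqrt(probability)
    sim_phase = np.angle(sim_statevector)   # coherent interference structure
    return hw_amplitude * np.exp(1j * sim_phase)
```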
Data Extraction and Processing
For each circuit variation, the following data is extracted and combined:
• State index (0-1023): Identifies which quantum basis state
• Amplitude (simulation): |ψ(i)|, representing ideal quantum amplitude
• Amplitude (hardware): √(p(i)), derived from measurement probability p(i)
• Phase: arg(ψ(i)) normalized to [0,1], from simulation statevector
• Measurement count: Raw number of times each state was measured (out of 4,096 shots)
• Probability: Normalized measurement frequency from hardware

This data structure preserves both the coherent quantum interference patterns (via phase) and the stochastic measurement outcomes (via hardware probabilities).
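The per-state record described above can be assembled as follows. The field names are illustrative assumptions; the project's actual data layout is not shown in the text:

```python
import numpy as np

def state_records(sim_psi, hw_counts, shots=4096):
    # One record per basis state, combining simulation and hardware data.
    records = []
    for i, amp in enumerate(sim_psi):
        count = hw_counts.get(i, 0)
        p = count / shots
        records.append({
            "state_index": i,                              # which basis state
            "amplitude_sim": abs(amp),                     # |psi(i)|
            "amplitude_hw": p ** 0.5,                      # sqrt(p(i))
            "phase": (np.angle(amp) / (2 * np.pi)) % 1.0,  # arg, normalized to [0,1)
            "count": count,                                # raw measurement count
            "probability": p,                              # normalized frequency
        })
    return records
```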
Three-Dimensional Spatial Mapping
Quantum states are mapped to three-dimensional Cartesian coordinates for multichannel spatial audio diffusion. The mapping translates abstract quantum state properties to acoustic spatial properties as follows:
• X-axis (left-right, [0,1]): Quantum phase angle
• Y-axis (lower-upper, [0,1]): State index, providing vertical distribution
• Z-axis (front-rear, [0,1]): Inverted amplitude (1 – amplitude)
• High amplitude → near listener (front) → minimal reverberation
• Low amplitude → distant from listener (rear) → increased reverberation
This mapping is calibrated for an eight-speaker cubic array positioned at the vertices of a normalized unit cube, with the listener at the center. The coordinate system directly correlates quantum phase relationships in Hilbert space to acoustic spatial relationships in physical space, while the amplitude-to-depth mapping integrates with convolution reverberation algorithms to create coherent distance cues.
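Assuming phase and amplitude have already been normalized to [0, 1], the mapping reduces to a few lines. This is a sketch of the stated rules, not the production spatialization code:

```python
def spatial_coords(phase, state_index, amplitude, n_states=1024):
    """Map one quantum state to (x, y, z) in the unit-cube speaker array."""
    x = phase                          # left-right: quantum phase angle
    y = state_index / (n_states - 1)   # lower-upper: vertical distribution
    z = 1.0 - amplitude                # front-rear: high amplitude -> front
    return x, y, z
```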
Implementation and Results
Initial Output Characteristics
The first-generation audio output exhibits several distinctive characteristics:
1. Spectral density: High information density reflecting 1,024 quantum states per circuit variation
2. Stochastic texture: Granular, non-periodic structure reflecting quantum measurement statistics
3. Sparse amplitude distribution: Most quantum states exhibit near-zero amplitude due to destructive interference, creating pronounced dynamic contrast
4. Phase-derived spatial motion: Continuous spatial trajectories derived from phase evolution across quantum states
Listen to the first-generation audio waveform:


The audio material does not conform to conventional musical organization (tonal harmony, rhythmic periodicity, melodic contour). Rather, it exhibits characteristics aligned with quantum stochasticity—probability distributions, interference-driven amplitude modulation, and measurement-induced discretization. This suggests that while the sonification successfully captures quantum mechanical behavior, additional compositional structuring will be necessary to bridge quantum-generated material with aesthetic intelligibility and musical discourse.
Validation of Quantum Signatures
The output successfully demonstrates several quantum mechanical signatures:
• Interference patterns: Amplitude distributions showing constructive and destructive interference
• Measurement collapse: Discrete state occupancy reflecting wavefunction collapse
• Hardware noise: Perceptible differences between simulation-based and hardware-based amplitude distributions, indicating successful capture of decoherence and gate errors
• Entanglement effects: Correlated amplitude structures across quantum state indices
These characteristics confirm that the sonification methodology preserves quantum mechanical information in the acoustic domain.
Discussion and Future Directions
Compositional Challenges
The primary compositional challenge lies in constraining quantum stochasticity while preserving quantum signatures. This represents a balance between chaos (inherent quantum randomness and exponential state-space complexity) and order (algorithmic structure and compositional intent). Future work will explore:
1. Algorithm families: Designing quantum circuits with specific interference characteristics (sparse vs. dense patterns, periodic structures, controlled symmetries)
2. Temporal organization: Mapping quantum state evolution to musical time through interpolation strategies and temporal grain structures
3. Parametric control: Systematic variation of circuit parameters to create compositional sections with distinct sonic identities
4. Multi-scale structure: Hierarchical organization where individual quantum circuits generate local texture while circuit sequences create larger formal structures
Expansion to Multichannel Spatial Composition
The immediate next phase involves expanding from initial proof-of-concept to full-scale multichannel works:
• Eight primary waveforms (expandable to 32 channels)
• Target duration: 8+ minutes per composition
• Integration with Csound for waveform synthesis via GEN08 cubic spline interpolation
• Real-time spatial processing utilizing the three-dimensional coordinate mappings
Quantum Computing Resource Considerations
Hardware execution on IBM quantum processors requires careful resource management. Initial benchmarking indicates:
• 10-qubit circuits, 10 variations: 12 – 31 seconds quantum execution time
• Current IBM pricing: $1.60/second, $96/minute, $5760/hour
• Estimated full-project requirement: 30 – 60 minutes quantum time
• Total projected cost: $2880 – $5760
• This figure does not include the numerous sub-renders required for testing and proof of concept
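The projected cost follows directly from the quoted per-second rate; a quick sanity check of the arithmetic:

```python
rate_per_second = 1.60                      # IBM quoted rate, USD
minutes_low, minutes_high = 30, 60          # estimated quantum time
low = minutes_low * 60 * rate_per_second    # about 2880 USD
high = minutes_high * 60 * rate_per_second  # about 5760 USD
```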

The research strategy prioritizes extensive local simulation for algorithm development and aesthetic evaluation, reserving hardware execution for final production where quantum noise characteristics provide unique acoustic signatures unavailable through simulation.
Theoretical Implications
This work operates at the intersection of quantum mechanics, sonification methodology, and electroacoustic composition. By sonifying quantum interference patterns, the research creates acoustic material that embodies fundamental quantum uncertainty and measurement-induced state collapse. The hybrid methodology—combining ideal quantum interference (simulation) with physical quantum behavior (hardware)—mirrors the relationship between theoretical and empirical physics, translating this epistemological duality into aesthetic experience.
The spatial mapping extends this conceptual framework by translating quantum phase relationships in abstract Hilbert space to acoustic relationships in physical space, creating a direct perceptual analogy between quantum mechanical structure and spatial acoustic structure.
Conclusion
This research demonstrates the feasibility of sonifying quantum interference patterns through a hybrid simulation-hardware methodology, successfully capturing both ideal quantum mechanical behavior and real-world quantum device characteristics. Initial results confirm that quantum signatures are preserved in the acoustic domain, though substantial compositional development remains necessary to create aesthetically coherent musical works.
Phase II — Quantum Computer Music Through Transduction

Phase II moves beyond sonification into direct transduction — the quantum computer itself becomes the instrument. Eight novel entanglement schemas executed on Quantum Inspire’s Tuna-9 processor each produce unique interference patterns and waveform signatures that are inseparable from the physical hardware on which they were generated. The result is seven completed quantum computer music compositions totaling 76 minutes, representing a fundamental departure from any compositional approach achievable through classical computing alone. The collection is titled “Reflections from Hilbert Space”.
This work is the subject of a paper submitted to the Computer Music Journal’s special issue on Quantum Computer Music, where it is currently under review. View the submitted version below.
Track 1: Looking Toward Hilbert Space, 9:53
Track 2: Quantum Etude_00, 00:24
Track 3: the Superstring Octet, 10:49
Track 4: Collapsing Waves, 10:19
Track 5: Entangled Ambience, 15:09
Track 6: Song of Interference (in three movements), 11:05
Track 7: The Infinite and the Infinitesimal (in four movements), 18:23
Reflections From Hilbert Space
In quantum mechanics, Hilbert space is an infinite-dimensional, complex vector space that serves as the mathematical basis for describing all possible states of a quantum system. Each state is represented by a vector symbolized as |ψ⟩, where ψ (psi) denotes the wave function. Since the wave functions of quantum states collapse when measured, we can only observe them as inferred from measurements. In other words, we observe quantum wave field interference through its reflections.
Dear Listener,
The compositions in this project represent the results of Michael’s research and creative practice in the area of quantum computing (QC) and artificial intelligence (AI). With the exception of Track 1, each track was derived directly from this research. Michael has been studying quantum physics for many years, as a layman, and it was a natural progression of these studies to begin working with QC and AI toward musical composition.
There is an arc to the order of the compositions here. Track 1 set a general target for the level of musical expression for the project. The subsequent tracks begin by investigating workflows and timbral relationships. As the research and creative praxis progressed, the pieces became less about the research and more about the music. The last four pieces are truly musical compositions that express quantum wave fields interacting during circuit execution. What you are actually hearing are the qubits interacting with one another, and this interaction is composed into a musically relevant form.
Track 1: “Looking Toward Hilbert Space” was inspired and influenced by Michael’s research into this area but was not directly derived from it. It is the result of a generative MAX patch titled “an Eternal Song version 5”, the fifth iteration of the patch in as many years.
Track 2: “Quantum Etude_00” represents the first step in this direction of research and creative practice in the area of QC. It is the result of a sonification of quantum waveform interference patterns during circuit execution using 10 qubits. It was achieved using one of IBM’s 156-qubit Heron-series quantum computers and their suite of Python libraries, the Quantum Information Science Kit (Qiskit). As such, it established a workflow and proof of concept for future work in this area. Can you hear the quantum quanta buzzing around and interacting? You can read more about this first phase of the project here.
Track 3: The “Superstring Octet” is the result of direct transductions from wave field interference patterns, during specifically designed circuit executions in Hilbert space, to the audio listening space. It was an important tenet of this research and creative practice to employ QC in a manner that is only possible with QC. Therefore, sonifications and quasi-random number generations, which can be accomplished with classical computing, were not within the scope. Conversely, implementing direct correlations between QC and the audio listening space is precisely the goal. This composition is thus composed of humanly perceivable quantum wave function activities captured during specifically constrained quantum circuit executions. The results are truly novel and establish an unprecedented musical syntax. Listen carefully. What do you hear within these waveforms?
Track 4: “Collapsing Waves” comprises 64 samples, each 1/100th of a second in length, transduced directly from wave field interference patterns during quantum circuit execution. The samples were time-stretched numerous times and their frequencies drastically raised to bring them into the range of human hearing. They are the same samples used in the creation of Track 3; here, however, they were further manipulated by a generative MAX patch. The results were edited into the final composition.
Track 5: “Entangled Ambience” is a further extension of Track 4. However, the results were unedited. What you are hearing are the raw waveforms being playfully manipulated, while maintaining their original quantum-based relationships. While listening to this track, allow yourself to be enveloped by this novel quantum audio environment.
Track 6: “Song of Interference” was composed from samples in the purest forms of the timbres that resulted from waveform interference and entanglement schemas during quantum circuit execution. The samples were processed through a generative MAX patch created specifically for this composition. Once again, the results were edited into the final 32-channel composition, mixed down to stereo, and presented here.
Track 7: Finally, “The Infinite and the Infinitesimal” is a fully quantum-derived work. Using the Quantum Inspire Tuna-9 backend, each of the 64 audio waveforms, each 0.01 seconds in length, was transduced directly from quantum interference over a 12-hour quantum circuit execution in five batches. This composition is the crowning achievement of this research project. Though the waveforms differed from the simulations due to random noise and momentary decoherence, appearing more complex as a result, they sounded “smoother” and more refined. This composition demonstrates that the initial intuition that quantum computing opens exciting new worlds of quantum computational creativity was not misguided, and it certainly justified the extreme depth and breadth of the endeavor involved.
Research and Creative Practice in Progress
This ongoing research and creative practice is in its early stages. The results of the first phase are posted here. Further expressions of uniquely quantum-derived musical compositions are in the pipeline, and new projects will be posted as they develop. This new frontier is utterly fascinating… future facing… and there are no roadmaps. The depth and breadth of potential in the area of quantum computational creativity is truly boundless, and it is the future occurring NOW!
Thank you for listening!
Phase III — Visual Music Through Tomography

Phase III extends quantum tomographic data into the visual domain, asking whether the same quantum hardware execution that drives the music can simultaneously generate coherent visual art. Full quantum state tomography across three measurement bases extracts Bloch sphere trajectories across 600 parameter steps per schema, animating quantum filaments in a stereoscopic VR environment rendered across a high-performance GPU cluster. The organic coherence between the visual and audio material — both rooted in identical quantum hardware execution — validates the core premise of the research.
This video is the result of quantum circuit execution on a quantum computer. Eight animated filaments, each encapsulated within their own energetic spheroid, display each qubit’s trajectory within the quantum processor. The confinement seen here is real — these are actual quantum states bouncing around within real quantum hardware. Because the qubits are entangled, they are not independent paths but eight expressions of a single unified quantum system. The entanglement relationships are visible as the qubits come together and move apart. This is not a metaphor for quantum physics. It is quantum physics, rendered directly into perceptual space.
This is an extremely rough cut of the first video sequence render for the quantum computer music composition “the Infinite and the Infinitesimal”. No editing is involved; it is simply a check of the viability of the audio and video working together. Even though the music evolves slowly while the visual objects move quickly, there is some indefinable cohesion between the two. This is likely because they are derived from the same basic quantum computer circuit executions.

Michael Rhoades, PhD (PI)
Introduction
Phase III of the Quantum Computational Creativity research program extends the transduction methodology established in Phase II into the visual domain, asking a fundamental question: can the same quantum hardware execution that generates music simultaneously generate coherent visual art? The answer, as this phase demonstrates, is not only affirmative but reveals an unexpected and profound organic coherence between the two sensory domains — one that could only emerge from a shared quantum physical origin.
Where Phase II transduced quantum circuit evolution directly into audio waveforms, Phase III employs full quantum state tomography to extract the complete three-dimensional trajectory of each qubit’s quantum state as it evolves across the parameter space. These trajectories — expressed as coordinates on the Bloch sphere — are then translated into the spatial and temporal parameters of a stereoscopic VR animation, rendered across a high-performance GPU cluster. The result is a time-based visual artwork whose motion, rhythm, and structure are as authentically quantum-derived as the compositions that accompany it.
Background and Motivation
The Bloch sphere is the natural geometric representation of a single qubit’s quantum state. Any pure qubit state can be represented as a point on the surface of a unit sphere, with the north pole representing |0⟩, the south pole representing |1⟩, and all points on the surface representing superposition states. The X, Y, and Z coordinates of this point — derived from measurements in three complementary bases — provide a complete description of the qubit’s state. As circuit parameters evolve, each qubit traces a continuous trajectory across this sphere, encoding the full dynamics of the quantum system in geometric form.
This geometric richness makes quantum state tomography a natural foundation for visual art. The trajectories are not arbitrary — they are shaped by entanglement topology, gate sequences, and hardware-specific noise characteristics. Different circuit schemas produce fundamentally different trajectory geometries, reflecting the unique quantum physical processes occurring during execution. Like the audio transductions of Phase II, these visual trajectories carry the authentic signature of real quantum hardware — impossible to replicate through classical simulation.
Technical Framework
Quantum State Tomography
Full quantum state tomography was performed across all eight entanglement schemas from Phase II, executed on Quantum Inspire’s Tuna-9 processor. For each schema, 600 parameter steps were measured across three complementary bases:
- Z-basis: Standard computational basis measurement — reveals the north/south position on the Bloch sphere
- X-basis: Hadamard gate applied before measurement — reveals the east/west position
- Y-basis: S† gate followed by Hadamard applied before measurement — reveals the front/back position
Each measurement basis required a complete set of 600 circuits executed at 2048 shots per circuit, yielding three full datasets per schema. Combined, the three basis measurements provide the complete Bloch sphere coordinates (X, Y, Z) for each of the eight qubits at every parameter step — 600 points tracing each qubit’s trajectory through the full parameter evolution.
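The expectation value behind each Bloch coordinate can be recovered from raw counts with a few lines. A sketch, assuming little-endian bitstring counts; Quantum Inspire's actual count format may differ:

```python
def expectation(counts, qubit, shots):
    """Single-qubit expectation value from bitstring counts.

    Each measured bit contributes +1 if it is 0 and -1 if it is 1, so the
    result lies in [-1, 1]. Applying the X- or Y-basis rotation before
    measurement turns this same formula into <X> or <Y>.
    """
    total = 0
    for bitstring, count in counts.items():
        bit = int(bitstring[-(qubit + 1)])  # little-endian bit order assumed
        total += count * (1 - 2 * bit)
    return total / shots

# Bloch coordinates for one qubit at one parameter step (2048 shots per basis):
# x = expectation(x_basis_counts, q, 2048)
# y = expectation(y_basis_counts, q, 2048)
# z = expectation(z_basis_counts, q, 2048)
```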
Data Processing Pipeline
Raw measurement counts from the Tuna-9 hardware are processed into expectation values for each qubit at each parameter step. From these expectation values, three derived quantities are calculated:
- Bloch X, Y, Z coordinates: The three-dimensional position of each qubit’s state on the Bloch sphere, normalized to the range [−1, 1]
- Bloch radius (purity): The distance from the center of the sphere, calculated as √(X² + Y² + Z²). A radius of 1.0 indicates a pure quantum state; values less than 1.0 indicate mixed states resulting from decoherence and entanglement
- Rate of change (dZ): The frame-to-frame velocity of the Z coordinate, used to drive vibrational and dynamic parameters in the animation
These processed coordinates are exported as CSV files — one per schema — serving as the direct input to the Blender animation pipeline.
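The two derived quantities follow directly from the coordinate arrays. A sketch under the definitions above (the research's actual pipeline code is not shown in the text):

```python
import numpy as np

def derived_quantities(X, Y, Z):
    """Per-qubit coordinate arrays of shape (600,): one value per parameter step."""
    radius = np.sqrt(X**2 + Y**2 + Z**2)  # purity: 1.0 = pure state, <1.0 = mixed
    dZ = np.diff(Z, prepend=Z[0])         # frame-to-frame Z velocity
    return radius, dZ
```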


Circuit Schemas — Eight Entanglement Topologies
The eight schemas from Phase II each produce a distinct tomographic dataset, reflecting their unique entanglement architectures. Each schema’s visual character in the animation is a direct expression of its quantum physical structure:
Schema 01 — Star
A central hub qubit entangled with all seven peripheral qubits simultaneously. Produces radially symmetric trajectory patterns with the hub qubit exhibiting maximum decoherence from collective entanglement pressure.

Schema 02 — Chain
Sequential entanglement cascading from qubit 0 through qubit 7 with accumulating phase shifts. Produces wave-like trajectory propagation across the qubit array, with later qubits showing increasingly complex phase relationships.

Schema 03 — Ring
Circular entanglement topology with the final qubit connected back to the first. Produces rotationally symmetric trajectories with periodic interference patterns arising from the closed loop structure.

Schema 04 — Pairs
Independent entangled pairs (0-1, 2-3, 4-5, 6-7) with no inter-pair coupling. Produces four distinct paired trajectory families, each evolving independently — creating visual counterpoint between the pairs.

Schema 05 — Tree
Hierarchical binary tree entanglement structure. Produces trajectories organized by depth level, with qubits at different levels of the hierarchy exhibiting characteristically different trajectory geometries.

Schema 06 — GHZ
Greenberger-Horne-Zeilinger maximally entangled state — all eight qubits in a single maximally entangled superposition. Produces the most dramatically correlated trajectories of all schemas, with collective quantum behavior dominating individual qubit dynamics.

Schema 07 — Gradient
Progressive entanglement strength increasing across the qubit array. Produces a smooth gradient of trajectory complexity from the weakly entangled end to the strongly entangled end of the chain.

Schema 08 — Islands
Two isolated entangled clusters with no inter-cluster coupling. Produces two visually distinct trajectory families coexisting within the same animation, creating a natural visual dialogue between independent quantum subsystems.

The Blender VR Animation Pipeline
Coordinate Mapping
The processed Bloch sphere coordinates are mapped to three-dimensional animation parameters within Blender. Each of the eight qubits is represented as a quantum filament — a luminous, volumetric strand whose spatial position, orientation, and dynamic behavior are governed entirely by the tomographic data:
- Spatial position: The X, Y, Z Bloch coordinates drive the three-dimensional position of each filament’s control points across the 600-frame animation sequence
- Bloch radius (purity): Maps to filament luminosity and thickness — high purity states produce bright, well-defined filaments while mixed states produce dimmer, more diffuse forms
- Rate of change (dZ): Maps to vibrational frequency and amplitude — rapid state evolution produces agitated, energetic filament behavior while slow evolution produces smooth, flowing motion
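Outside of Blender, the mapping above can be expressed as a plain function. The parameter names and scale factors here are illustrative assumptions, not the production animation drivers:

```python
def filament_params(x, y, z, radius, dz,
                    base_luminosity=1.0, base_thickness=0.02):
    """Map one frame of tomographic data to one filament's animation parameters."""
    position = (x, y, z)                   # Bloch coordinates -> control point
    luminosity = base_luminosity * radius  # purity -> brightness
    thickness = base_thickness * radius    # purity -> strand definition
    vibration = abs(dz)                    # rate of change -> agitation
    return position, luminosity, thickness, vibration
```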

Scene Architecture
The animation environment is a chrome reflective VR cuboid — an infinite mirrored interior that multiplies and propagates the quantum filaments across the visual field, creating an immersive environment that suggests the infinite-dimensional nature of Hilbert space itself. Eight quantum filaments occupy the interior, their trajectories animated across 2400 frames at 30 frames per second, producing an 80-second stereoscopic VR animation.
The chrome environment was chosen deliberately — its reflective properties mean that the quantum filaments are never seen in isolation but always in relation to their own reflections and to each other, creating a visual analogy for quantum entanglement: each element simultaneously independent and inseparable from the whole.

Rendering
The animation is rendered across a high-performance GPU cluster equipped with multiple NVIDIA RTX 4090 graphics cards, producing stereoscopic 3D frames at full resolution. The render pipeline was optimized for the specific requirements of the quantum filament geometry, with subdivision surface modifiers and volumetric glass materials producing the characteristic luminous quality of the final output.

Results — The Organic Coherence Discovery
The most significant and unexpected finding of Phase III emerged during review of the initial render output. When the first 689 frames of the Schema 01 Star topology animation were played back at quarter speed alongside the Phase II compositions, a striking organic coherence was observed with “The Infinite and the Infinitesimal” — the final and most fully realized composition of Phase II.
This coherence was not designed. No explicit synchronization was attempted between the visual and audio material. Both derive independently from the same quantum hardware execution — the audio from direct waveform transduction in Phase II, the visual from tomographic trajectory extraction in Phase III. Yet when experienced together, the slowly evolving four-movement structure of the composition and the fluid, continuous motion of the quantum filaments appear to inhabit the same temporal and expressive world.
This finding carries significant implications for the research program as a whole. It suggests that authentic quantum hardware execution produces a kind of perceptual unity across sensory domains — a coherence that emerges naturally from shared physical origin rather than from deliberate compositional synchronization. Classical simulation, which produces idealized and deterministic results, could not generate this quality. It is a property of real quantum hardware execution specifically, and it validates the core premise of the Quantum Computational Creativity research program.

The Phase III visual work achieves something that pure audio cannot — a direct, unmediated view of quantum entanglement in action. Each qubit’s trajectory, traced within the quantum processor itself, is rendered as an animated filament; what appears as confinement is real confinement, imposed by the physics of the device. And because the qubits are entangled, the eight filaments are not eight independent paths. They are eight expressions of a single unified quantum system, their movements structured by the correlations built into the entanglement topology, visibly drawing together and moving apart. The visualization does not interpret the physics through metaphor or artistic mediation. It is the physics, rendered directly into perceptual space.
Hardware Signature — Tuna-9 vs ibm_fez
The Quantum Inspire Tuna-9 processor was instrumental in establishing and validating the core methodology of this research program, providing the hardware access that made Phases II and III possible. As the research evolves toward the dramatically expanded scale of Phase IV — sixteen schemas, nine qubits each, engaging 144 qubits simultaneously — IBM’s hardware architecture is the natural and necessary next step. To establish accurate compute time estimates for the Phase IV work and in support of an IBM Quantum Credits Program grant application, a direct benchmark comparison between Tuna-9 and IBM’s ibm_fez backend was conducted using identical 60-step Chain schema tomographic circuits.
The comparison revealed that ibm_fez completed execution in 1.94 minutes versus 9.31 minutes on Tuna-9 — a 4.8× speed advantage — while also producing significantly lower noise and higher qubit purity values. The smoother, more coherent trajectories of ibm_fez reflect its superior hardware fidelity, while Tuna-9’s characteristic noise signature gives the Phase II and III material its own distinctive and authentic quantum character.
Both hardware signatures are valid artistic voices. Phase IV simply requires the scale and fidelity that IBM’s architecture makes possible.

Validation
Phase III validates several key propositions of the Quantum Computational Creativity research program:
Quantum tomography as artistic medium: Full quantum state tomography, conventionally a scientific measurement technique, functions equally as a generative artistic process when its outputs are transduced into visual parameters. The scientific and creative applications are not merely compatible — they are identical.
Hardware authenticity as artistic value: The noise, decoherence, and hardware-specific signatures present in the tomographic data are not imperfections to be corrected but essential components of the visual character. Different hardware produces different art from identical circuits.
Cross-modal quantum coherence: Audio and visual material derived independently from the same quantum hardware execution exhibit organic coherence across sensory domains — a coherence unavailable through classical simulation and attributable specifically to shared quantum physical origin.
Scalability: The tomographic pipeline executed reliably across all eight schemas, producing complete 600-step datasets for each, demonstrating that the methodology is robust and scalable to the expanded circuit architectures planned for Phase IV.
Discussion and Future Directions
Phase III establishes time-based spatial visualization as a natural and essential component of the Quantum Computational Creativity methodology. The organic coherence discovery in particular opens new compositional possibilities — if audio and visual material derived from the same quantum execution naturally cohere, then a unified quantum audiovisual work becomes not merely possible but perhaps inevitable.
Phase IV will expand this visual methodology alongside the audio work, applying the tomographic pipeline to 16 circuit schemas across 9 qubits each on IBM’s hardware. The expanded qubit count, novel circuit topologies, and IBM’s superior hardware fidelity will produce visual material of substantially greater complexity and variety. The non-uniform parameter sampling strategies planned for Phase IV will give each schema’s visual trajectory an internal rhythmic life — phrases, accelerations, and clumping — that the evenly spaced Phase III trajectories could not produce.
The longer term vision is a fully integrated quantum audiovisual performance work: a single quantum hardware execution generating both the compositional material and the visual environment simultaneously, unified by their shared physical origin in Hilbert space.
Conclusion
Phase III demonstrates that quantum state tomography, executed on real quantum hardware, generates not only scientifically meaningful data but visually compelling artistic material whose character is inseparable from the physics of its origin. The eight entanglement schemas produce eight distinct visual languages, each reflecting its unique quantum topology. The chrome VR environment situates these trajectories within an immersive spatial context that invites the viewer into Hilbert space itself.
Most significantly, the organic coherence between the Phase III visual material and the Phase II audio compositions — both derived from identical quantum hardware execution — validates the foundational premise of this research: that quantum hardware is a generative artistic instrument capable of producing unified creative experiences across sensory domains, in ways that classical computing fundamentally cannot replicate.
Quantum Computational Creativity — Phase III Documentation — March 2026
Phase IV — Expansion (in development)
Phase IV represents a dramatic scaling of the methodology, expanding to sixteen circuit schemas utilizing nine qubits each — engaging 144 of IBM’s 156 available qubits in a single large-scale hardware execution. Novel circuit topologies, an expanded native gate vocabulary, and non-uniform parameter sampling will produce a unified body of stratified waveforms of far greater complexity and variety than previous phases — raw material for multiple distinct quantum computer music compositions and stereoscopic visualizations. Direct benchmark testing on IBM’s ibm_fez processor confirms the methodology executes at 4.8× the speed of Tuna-9, making this ambitious expansion genuinely feasible. An IBM Quantum Credits Program application is currently in preparation to support this work.
Purpose of This Document
This document records the conceptual discussion between Dr. Michael Rhoades and Claude (Anthropic) regarding Phase IV of the Quantum Computational Creativity research project. It is intended as a reference for when active development begins, preserving the creative and technical intentions formulated during this early planning stage. It also serves as supporting context for the IBM Quantum Credits Program grant application.
Previous Phases Summary — Where We Are Now
Completed Work
“Reflections from Hilbert Space” comprises seven completed quantum computer music compositions totaling 76 minutes, including “The Infinite and the Infinitesimal” created using Quantum Inspire’s Tuna-9 hardware processor. A research paper has been submitted to Computer Music Journal’s special issue on Quantum Computer Music.
Core Methodology Validated
The fundamental approach of Phases II and III — using quantum circuit evolution directly as waveform generation, with entanglement topology determining compositional relationships — has been validated on real quantum hardware. Key findings include:
- Entanglement correlation structures survive real hardware execution
- Quantum interference patterns in Hilbert space translate meaningfully to acoustic interference patterns in Ambisonics listening space
- Hardware noise and decoherence are not errors but authentic creative signatures
- Moderate asymmetric entanglement produces the richest compositional output
- Eight distinct circuit schemas (Star, Chain, Ring, Pairs, Tree, GHZ, Gradient, Islands) each produce unique sonic character
Tomographic Data Collection — Complete
Full quantum state tomography (X, Y, Z measurement bases, 2048 shots, 600 parameter steps) has been executed on Tuna-9 for all eight schemas. Each schema produces a 600-row Bloch coordinate CSV used for both audio synthesis and Blender VR visualization. All eight schemas are complete.
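The Bloch-vector extraction at the heart of this step can be sketched as follows, assuming Qiskit-style count dictionaries (bitstrings with qubit 0 rightmost); the helper name is hypothetical:

```python
def bloch_from_counts(counts_x, counts_y, counts_z, qubit, n_qubits):
    """Estimate one qubit's Bloch vector (x, y, z) from measurement counts
    taken in the X, Y, and Z bases.

    Each counts_* argument maps bitstrings to shot counts, with qubit 0
    as the rightmost character (Qiskit's ordering convention).
    """
    def expectation(counts):
        total = sum(counts.values())
        acc = 0
        for bits, n in counts.items():
            bit = int(bits[n_qubits - 1 - qubit])  # this qubit's outcome
            acc += n * (1 - 2 * bit)               # 0 -> +1, 1 -> -1
        return acc / total

    return expectation(counts_x), expectation(counts_y), expectation(counts_z)
```

Repeating this for every qubit at each of the 600 parameter steps yields the 600-row Bloch coordinate CSV described above.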
Blender VR Visualization — Work in Progress
A parallel visual art project renders the quantum tomographic data as animated 3D filaments in a chrome reflective VR environment. Each of the eight qubits is represented as a luminous filament whose spatial position, motion, and dynamic behavior are governed entirely by the Bloch sphere trajectory data extracted during tomographic execution.
Schema 01 — Star topology — was rendered on a high-performance cluster equipped with multiple NVIDIA RTX 4090 GPUs, producing 2400 frames at 30fps for an 80-second stereoscopic VR animation. Early review of the initial frames has already yielded a significant finding: an unexpected organic coherence between the visual animation and the Phase II composition “The Infinite and the Infinitesimal” — both derived independently from the same quantum hardware execution. This cross-modal coherence, emerging naturally from shared quantum physical origin rather than deliberate synchronization, validates a core premise of the research.
Blender scenes for the remaining seven schemas are planned, with potentially multiple scenes per schema exploring different visual interpretations of the same quantum data. Given the extensive rendering times involved, this work represents a long-term visual art project extending well into 2026 and beyond. Phase IV development will proceed in parallel with the ongoing Phase III rendering work.
Phase IV Vision — Conceptual Framework
Core Departure from Previous Phases
Phase IV is not an extension of the previous phases — it is a fundamental reimagining of the methodology. The goal is to produce material that is sonically and compositionally distinct from anything previously produced. This requires changes at multiple levels: circuit architecture, gate vocabulary, parameter sampling strategy, and compositional interpretation.
Scale — Toward Maximal Qubit Utilization
Phase IV targets 16 circuit schemas, each using 9 qubits, for a total of 144 qubits — approaching the full 156-qubit capacity of IBM’s Heron-series processors. This represents an 18× increase in total qubit utilization over the 8-qubit architecture of previous phases. This scale is directly aligned with IBM’s stated interest in projects that maximize qubit utilization and push hardware limits.
The Unified Composition Concept
Unlike previous phases where each schema produced an independent composition, Phase IV envisions the 16 waveforms as raw material for a single, larger compositional ecosystem. The waveforms are not movements — they are layers. Their value lies in combinatorial flexibility:
- Layered simultaneously in varying combinations and densities
- Cut into sections and recombined in non-linear arrangements
- Interpreted differently across multiple distinct compositions from the same source material
- The same quantum data yielding many different musical outcomes depending on editorial choices
This approach positions the quantum hardware execution as the generative foundation and the composer as the interpreter — a true creative symbiosis between quantum physics and human artistic judgment.
Non-Uniform Parameter Spacing — A Key Innovation
Previous phases used evenly spaced theta values across the parameter range (0 to 2π). Phase IV will use deliberately structured non-uniform spacing to give each waveform internal rhythmic and textural life — phrases, breaths, accelerations, decelerations, and clumping. This is not random (which would be as uninteresting as even spacing) but compositionally intentional.
Possible spacing strategies under consideration:
- Logarithmic spacing — gradual acceleration through the parameter space
- Fibonacci-derived intervals — naturally proportioned irregular spacing
- Harmonically related proportions — spacing derived from musical interval ratios
- Acceleration and deceleration curves — the feeling of quantum states rushing toward or receding from configurations
- Clumped phrasing — dense clusters separated by longer gaps, creating phrase-like structures
Different schemas may use different spacing strategies, so that when layers are combined their phrasings drift in and out of alignment — creating emergent polyrhythmic structures that were never explicitly composed.
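The spacing ideas above can be expressed as warp functions applied to a uniform grid. The formulas below are illustrative sketches of the listed strategies, not the finalized Phase IV algorithms:

```python
import numpy as np

def theta_spacing(n=600, strategy="even"):
    """Return n theta values in [0, 2*pi) under one spacing strategy."""
    u = np.linspace(0.0, 1.0, n, endpoint=False)  # uniform reference grid
    if strategy == "even":
        warped = u
    elif strategy == "log":
        # Nonlinear logarithmic warp concentrating steps toward one end
        # of the parameter range.
        warped = np.log1p(9.0 * u) / np.log(10.0)
    elif strategy == "clumped":
        # Sinusoidal modulation bunches steps into phrase-like clusters
        # separated by longer gaps; sorting keeps the sequence monotonic.
        warped = np.sort(np.clip(u + 0.04 * np.sin(16 * np.pi * u), 0.0, 1.0))
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return 2.0 * np.pi * warped
```

Assigning different strategies to different schemas would let the resulting layers drift in and out of phase alignment when combined, as described above.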
Circuit Architecture — Beyond Previous Phases
Previous phases relied primarily on RZZ, Ry, Rz, and CNOT gates. Phase IV will expand the gate vocabulary to include native IBM gate families not available or practical on Tuna-9, creating fundamentally different interference geometries in Hilbert space. Different gate types create different quantum interference structures which translate directly to different sonic character.
The 16 schemas will likely include:
- Variations on Phases II and III topologies (Star, Chain, Ring, etc.) with expanded gate sets
- Entirely new topologies exploiting 9-qubit connectivity
- Schemas specifically designed to produce complementary or contrasting sonic material
The specific 16 schemas will be developed collaboratively once IBM hardware access is confirmed and the native gate set is known precisely.
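As a toy illustration of why gate choice matters, the following numpy sketch (not the production Qiskit pipeline) shows that the RZZ interaction used in earlier phases genuinely entangles a product state — the reduced purity of one qubit drops below 1.0 after the interaction:

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rzz(theta):
    """Two-qubit ZZ interaction, the workhorse entangler of Phases II/III."""
    p = np.exp(-1j * theta / 2)
    return np.diag([p, p.conj(), p.conj(), p])

theta = np.pi / 3
state = np.zeros(4, dtype=complex)
state[0] = 1.0                                   # start in |00>
state = np.kron(ry(theta), ry(theta)) @ state    # independent rotations
state = rzz(theta) @ state                       # entangling interaction

# Reduced purity of qubit A: 1.0 for a product state, < 1.0 once entangled.
rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)
rho_a = np.trace(rho, axis1=1, axis2=3)          # trace out qubit B
purity = float(np.real(np.trace(rho_a @ rho_a)))
```

Substituting a different two-qubit entangler for `rzz` imprints a different phase structure on the state — the "different interference geometry in Hilbert space" that the expanded gate vocabulary is intended to exploit.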
IBM Quantum Credits Program — Grant Application
Application Strategy
The application is being submitted proactively due to IBM’s stated multi-week review backlog. The one-year credit validity means there is no urgency to begin immediately — applying now ensures access is available when Phase IV development is ready to begin, estimated to be several months after submission.
Grant strategy follows standard academic practice: request more than the minimum needed to ensure sufficient compute budget, with willingness to reduce scope if IBM awards a smaller allocation.
Compute Requirements — Based on Direct Benchmark Testing
A direct benchmark comparison between Quantum Inspire’s Tuna-9 processor and IBM’s ibm_fez backend was conducted using identical 60-step Chain schema tomographic circuits (180 circuits, 2048 shots). ibm_fez completed execution in 1.94 minutes versus 9.31 minutes on Tuna-9 — a 4.8× speed advantage. Scaling to Phase IV — 16 schemas × 600 steps × 3 bases = 28,800 circuits — yields a linear extrapolation of approximately 5 hours for full execution on ibm_fez. Adding overhead for iterative circuit development, testing, and the expanded 9-qubit circuit complexity yields an estimated total of 10–15 hours of compute time; the application will request the upper end of this range.
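The linear extrapolation can be reproduced directly from the benchmark figures quoted above:

```python
# Benchmark: 60-step Chain schema tomography on ibm_fez.
bench_circuits = 60 * 3           # 60 steps x 3 measurement bases = 180
bench_minutes = 1.94              # measured ibm_fez wall time

# Phase IV workload: 16 schemas x 600 steps x 3 bases.
phase4_circuits = 16 * 600 * 3    # = 28,800 circuits
est_hours = phase4_circuits / bench_circuits * bench_minutes / 60
# est_hours is roughly 5.2 -- before development and testing overhead
```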
Key Application Arguments
- Established methodology: Phases I, II, and III validated on real quantum hardware with published research outcomes
- Novel application domain: quantum computing as direct creative instrument — no classical equivalent
- State-of-the-art utilization: 144 of 156 available qubits — maximal hardware engagement
- Methodological innovation: non-uniform parameter sampling as compositional tool
- Concrete deliverables: compositions, VR visualizations, and peer-reviewed publications
- Documented prior work: quantumcomputationalcreativity.com and Computer Music Journal submission
Supporting Documentation
The following resources are available as supporting links in the application:
- IDIA Lab profile: https://idialab.org/michael-rhoades/
- Overall research phases documentation: https://quantumcomputationalcreativity.com
- Computer Music Journal submission (pending publication)
Open Questions for Phase IV Development
The following questions remain to be resolved when active development begins:
- Final selection of 16 circuit topologies and their specific gate architectures
- Specific non-uniform spacing algorithms for each schema
- Target qubit count confirmation based on IBM hardware availability
- Whether the 16 schemas can be executed in parallel on a single large circuit or must run as sequential jobs
- Transduction methodology — how Bloch coordinates map to audio parameters in ways that produce fundamentally different material than previous phases
- VR visualization approach for Phase IV — same chrome environment with different objects, or entirely new environment per schema
- Whether Phase IV visual and audio work will be synchronized to create unified audiovisual compositions
Notes on AI Collaboration
This project has been developed through an ongoing collaborative process between Dr. Rhoades and Claude (Anthropic). Claude has served as technical intermediary, code generator, and conceptual discussion partner throughout all phases of the research. All creative decisions, circuit designs, compositional choices, and research directions have remained under Dr. Rhoades’ sole authorship and control.
For grant application purposes, Dr. Rhoades is listed as sole PI. The AI collaboration is acknowledged as a methodological tool in the research documentation at quantumcomputationalcreativity.com.
Quantum Computational Creativity — Phase IV Initial Conceptualization — Updated March 2026

