Hello.
I am Ivaylo Getov.
I am an Interactive Producer and Technical Director leading teams and productions at the intersection of technology and creative vision. I have over a decade of experience building public experiences that use technology to bring people together in museums, galleries, and live events around the world.
I take responsibility for complex, technically demanding workflows, and I am as comfortable writing a server backend or prototyping a sensor system as I am working with stakeholders or translating a technical constraint into a creative opportunity. That range is the job.
From 2015 to 2019 I was Cofounder and Technology Director at Luxloop, a creative technology studio in NY and LA, where I led technical development on interactive storytelling projects.
From 2019 to 2024 I was Head of Interactive at dandelion+burdock, a London-based company creating content and systems for LED, projection, and real-time environments.
Since 2019, I have also been a Technical Director and Interactive Producer at Pixels Pixels, producing and supporting interactive installations in the fine-art world for globally renowned artists and institutions.
These are some of the things I have made:
A permanent LED installation at the restaurant bar of the New Museum featuring an AI-driven autonomous character tending to objects in his virtual space. Through hidden cameras he can see visitors at the bar, recognizing and acknowledging returning guests.
- Oversaw a distributed team of six developers and artists across multiple technology stacks, including Python AI developers, Unity artists, and AV designers.
- Served as primary technical liaison among construction vendors, exhibition designers, AV integrators, and software contractors, translating physical design constraints into technical requirements.
- Developed on-prem AI infrastructure and a C# tools backend to expose Unity behaviors to local LLMs, eliminating cloud dependencies and per-use charges.
- Designed, prototyped, and oversaw manufacturing for custom sensor enclosures.
A neural network pushed beyond its limits endlessly attempts to generate a coherent face on a large LED wall. Custom furniture measures visitors' breath and heart rates to affect the output.
- Directed an accelerated technical installation period, managing the LED vendor, sound technicians, and museum staff under tight external time constraints.
- Researched and evaluated a range of non-contact biometric sensors to identify the right approach for live audience interaction.
- Collaborated with the exhibition designer and sound engineer to design custom furniture integrating subwoofer transducers, heart rate sensors, and breath sensors.
- Built a bespoke TouchDesigner media server and control system to drive and modulate a live Neural Network (GAN) in response to viewer presence and biometrics.
The interactive component of The Shed's retrospective on Christo and Jeanne-Claude's "The Gates." Visitors use a custom iPad app above an 18-foot physical map to explore an augmented-reality recreation of the original 2005 Central Park installation.
- Researched, prototyped, and oversaw development of the custom iPad AR app.
- Managed on-site deployment and software tuning during the opening installation period.
An early-stage startup building AI-driven family experiences, including "Dragon Time" — an animated character capable of open-ended real-time conversation over FaceTime video calls.
- Advised on DevOps strategy, technical infrastructure, and cost optimization.
- Prototyped and evaluated systems across the full AI stack: dynamic LLM tool calls, AI-driven analytics, on-prem inference, computer vision, and audio models (STT and TTS).
A real-time simulation of the artist himself waiting in a featureless void for 100 years of actual time. Commissioned by HTC VIVE Arts, this VR artwork is driven by performance capture and a nonlinear game loop; the work runs continuously, and multiple synchronized instances can exist simultaneously in different locations.
- Developed a budget and technical plan anticipating a sustained 100-year runtime.
- Authored technical documentation designed to preserve the artwork's operational knowledge across its proposed 100-year lifespan.
- Partnered with move.ai to develop novel performance capture workflows for a wheelchair user, enabling full-body capture without a standard T-pose.
- Designed and developed a multi-client synchronization architecture using a central authoritative Node.js server and redundant sockets to keep remote Unity instances in sync.
A major museum-wide solo exhibition at the Venice Biennale in which 15 networked artworks form a single interconnected system, including four new nonlinear or interactive works.
- Developed a comprehensive system plan across the full exhibition, comprising 15 networked artworks connected to each other and to live external data streams.
- Managed a multi-week installation period in Venice, coordinating construction, networking, sensor systems, and AV across the logistical constraints of moving supplies via boat.
- Directly led a team of 11 developers and animators working across Unreal Engine, TouchDesigner, and Python backends on the four new works created for the exhibition.
- Led development of a TouchDesigner backend for a self-editing film, including systems used during production as well as the editing itself.
- Led development for a set of wearable masks using computer vision to catalog their surroundings and generate a fictional language based on what they observe.
- Partnered with the AV vendor to design and implement centralized system controls for exhibition-wide control and maintenance.
An animated film playing as a real-time simulation, featuring a "worldwatching" mode allowing viewers to pause any scene and tap on any character, artifact, or entity in the frame to investigate its lore and mythology.
- Built remote production pipelines to coordinate and sustain an 80+ person team through the pandemic, enabling uninterrupted delivery across distributed locations.
- Directly oversaw a 12-person team in Unity, integrating keyframe animation, procedural animation, live interaction, and performance capture workflows.
- Designed hybrid pipelines mixing traditional keyframe animation with real-time game engine simulation.
| Skills | |
| --- | --- |
| Production | Technical direction, technical producing, research & strategy, team management |
| Real-Time & Media Servers | TouchDesigner, Unity, Unreal Engine, XR virtual production, AR/VR |
| AI & Machine Learning | LLMs, on-prem inference, custom tools, MCP, computer vision |
| Physical Computing | Sensors (environmental and biometric), person tracking, custom hardware design and prototyping |
| Languages & Backends | C#, Node.js, Python, C++ |
| Misc | D&D dungeon master, a little too obsessed with karaoke, EU & US citizen |
Selected Exhibitions/Screenings/Publications
- NY Times - October 2016 - "Technology Invites a Deep Dive Into Art" (as Luxloop)
- Media Lounge NYC - November 2015 - Group show featuring "If the Walls Had Eyes" (as Luxloop)
- Internet Yami-Ichi - 12 September 2015 - "The Internet Yami-Ichi is a free-to-attend flea market where people gather and exchange 'Internet-ish' things in real life." (as Luxloop)
- El Confidencial - 23 August 2015 - Technology feature article about Luxloop and "Social Sound"
- VICE: The Creators Project - 26 July 2015 - Feature article about "Social Sound" (as Luxloop)
- Prosthetic Knowledge - 25 July 2015 - Feature article about "Social Sound" (as Luxloop)
- SELECT FAIR, NYC - 13-17 May 2015 - Show of interactive work "If the Walls Had Eyes" (as Luxloop)
- ONE NIGHT STAND, LA - 2015 - Group show of interactive work (as Luxloop)
- Lift Conference, Geneva - 2015 - New Techniques in Science Storytelling (with "Axion")
- Refest - 2014 - Show of interactive work at CultureHub, La MaMa Theater, NYC (as Luxloop)
- MIT Hacking Arts - 2014 - Winner: Best Hack in Film/TV/VR (as Luxloop)
- Filmteractive Festival - 6 October 2014 - NDPC Special Award for Best Project at Filmteractive Market (for "Axion")
- BBC News - 11 April 2014 - Feature about the CERN Story Matter Hackathon, featuring "Axion"
- BBC World Service - 19 March 2014 - Episode of "Click" featuring "Axion" at the CERN Story Matter Hackathon
- CineGlobe - 19 March 2014 - International Film Festival of Science held at CERN (with "Axion")
- You Are Here - 18 July 2013 - Group show at 3rd Ward, Brooklyn, NY (with "Untitled (Triptych)")
- Duality - July 2013 - Group show at Linus Galleries, Los Angeles (with "Untitled (Triptych)")
- FRAMERATES - 16-18 November 2012 - Video art group show at The Brewery, Downtown Los Angeles (role: curator)
- PSFK - 7 December 2012 - Photo essay: "Bulgarian Squat Shops"
- Brooklyn Film Festival - June 2011 - Winner: Audience Choice Award for "Pose"
Teaching/Lectures/Presentations
CalArts - Music Technology: Interaction, Intelligence & Design
- Guest Forum Speaker
- March 2017
Eyeo Festival 2016
- Ignite Talk: "The UX of Story" (as Luxloop)
- June 2016
Agencia de Apoyo a la Arquitectura de Barcelona (Agency for the Support of Architecture, Barcelona)
- Guest Lecturer: "Urbanism in Media"
- March 2012
University of Arizona, Tucson - College of Architecture, Planning, and Landscape Architecture
- Guest Lecturer: "Urbanism in Media"
- November 2010
NYU Tisch Summer High School Film Workshop
- Professor: Richard Litvin
- Head TA (2009) - Responsible for managing 11 TAs and 28 students, as well as lecturing on technical and aesthetic aspects of film production.
- Assistant TA (2007-08) - Responsible for managing groups of four students, producing individual film projects, and teaching aspects of producing and directing.
Education
New York University Tisch School of the Arts
- Bachelor of Fine Arts, Film & Television (2009); Dean's List
- Minor Concentration in Philosophy and Ethics
- Documentary Production Program - Ludwig Foundation, Havana, Cuba - 2008