Responsive 3D Printing
ABOUT THE PROJECT
This project introduces a responsive, stimulus-driven approach to 3D printing inspired by phototropism, the process by which plants grow toward light. Rather than following fixed CAD paths, the system uses live camera sensing to detect the direction of a light source and guides the placement of each bead of material in real time. An agent-based model (ABM) translates this sensory input into immediate deposition targets, while Discrete Deposition Modelling generates G-code on the fly for each cycle, enabling truly adaptive fabrication.
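The sense-decide-deposit cycle described above can be illustrated with a minimal sketch. The function names (`brightest_point`, `step_toward`, `gcode_move`) and the grid-of-brightness input are assumptions for illustration, not the project's actual implementation; a real system would read camera frames and stream commands to the printer.

```python
import math

def brightest_point(frame):
    """Locate the brightest cell in a 2D brightness grid (stand-in for a camera frame)."""
    _, x, y = max((v, x, y) for y, row in enumerate(frame) for x, v in enumerate(row))
    return (x, y)

def step_toward(pos, target, step=1.0):
    """Move the deposition agent one fixed-length step toward the light target."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    d = math.hypot(dx, dy)
    if d == 0:
        return pos
    return (pos[0] + step * dx / d, pos[1] + step * dy / d)

def gcode_move(pos, extrude=0.05, feed=1200):
    """Emit one G-code extrusion move for the current cycle."""
    return f"G1 X{pos[0]:.2f} Y{pos[1]:.2f} E{extrude:.3f} F{feed}"

# One cycle: sense the light, step the agent, generate the deposition command.
frame = [[0, 0, 0],
         [0, 0, 9],   # bright spot on the right edge
         [0, 0, 0]]
target = brightest_point(frame)
pos = step_toward((0.0, 0.0), target)
print(gcode_move(pos))
```

Because the G-code is generated one move per cycle rather than planned in advance, the toolpath can follow the light source wherever it moves.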
Using off-the-shelf hardware (a 3D printer, a webcam, and an LED lamp), the system coordinates sensing, decision-making, and printing within a ROS2 workflow. Local agent interactions such as branching and merging enrich the primary phototropic response, adjusting the printed geometry as it emerges. Tests showed reliable adaptation to shifting light positions and emergent multi-agent behaviors.
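The branching and merging interactions can be sketched as simple operations on a population of agents, here represented as 2D positions. The names, the distance threshold, and the symmetric branch offset are illustrative assumptions, not the project's actual rules.

```python
import math

def merge_agents(agents, merge_dist=1.0):
    """Merge agents closer than merge_dist into one agent at their centroid."""
    merged, used = [], set()
    for i, a in enumerate(agents):
        if i in used:
            continue
        group = [a]
        for j in range(i + 1, len(agents)):
            if j not in used and math.hypot(a[0] - agents[j][0], a[1] - agents[j][1]) <= merge_dist:
                group.append(agents[j])
                used.add(j)
        used.add(i)
        merged.append((sum(p[0] for p in group) / len(group),
                       sum(p[1] for p in group) / len(group)))
    return merged

def branch_agent(agent, spread=0.5):
    """Split one agent into two offset agents, forking the deposition path."""
    x, y = agent
    return [(x - spread, y), (x + spread, y)]

# Two nearby agents collapse into one; the distant agent survives unchanged.
print(merge_agents([(0.0, 0.0), (0.5, 0.0), (10.0, 10.0)]))
```

Running such rules every deposition cycle lets the printed structure fork and rejoin in response to local conditions, rather than following a single predetermined path.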
This behavioral 3D printing method allows each material placement to respond directly to live environmental data. Its modular design scales easily from desktop setups to large-format clay or concrete printing, marking a significant step toward practical, adaptive fabrication.
PROJECT CREDITS
CREATE Group - Led by Assoc. Prof. Dr. Roberto Naboni
Research Team: Roberto Naboni, Lucas Helle, Joseph Naguib
Funded by: Villum Foundation