INTERACTIVE PROJECTION


Syllabus

Interactive Projection is a hands-on seminar where students explore human-in-the-loop fabrication using dynamic projection and sensing. Participants will design interactive systems that combine projectors and sensors (e.g., depth cameras, motion trackers) to guide users through physical tasks or immersive experiences. The focus is on creating visual symbols, rule-based feedback, and real-time adaptations—projecting instructions, corrections, or dynamic visuals that respond to user actions. Students will prototype workflows where sensors validate interactions and trigger projected cues, blending digital precision with tangible engagement. The seminar encourages experimentation with tools like TouchDesigner, Unity, or custom code, though no prior expertise is required. Projects could range from practical fabrication aids (e.g., woodworking guides, interactive manuals) to artistic installations or collaborative games. By the end, attendees will have built a functional system that rethinks how projection can mediate between humans and machines—enhancing creativity, accuracy, and play in making processes.
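The sense-validate-project workflow described above can be sketched in a few lines. This is a minimal illustration, not course material: the function names (read_sensor, project, guidance_step), the normalized coordinates, and the 0.1 tolerance are all hypothetical stand-ins for real hardware bindings such as a depth-camera SDK or a TouchDesigner/Unity render target.

```python
# Minimal sketch of one human-in-the-loop guidance cycle:
# sense -> validate -> project a cue.
# All names here are invented stand-ins for real hardware bindings.

def read_sensor():
    """Stand-in for a depth-camera read: a normalized hand position."""
    return {"x": 0.5, "y": 0.5}

def validate(reading, target, tol=0.1):
    """Check whether the sensed action matches the projected instruction."""
    return (abs(reading["x"] - target["x"]) < tol
            and abs(reading["y"] - target["y"]) < tol)

def project(cue):
    """Stand-in for rendering a projected cue (arrow, outline, checkmark)."""
    print(f"projecting: {cue}")

def guidance_step(target):
    """One loop iteration: read the sensor, check, and project feedback."""
    reading = read_sensor()
    if validate(reading, target):
        project("checkmark: step complete")
        return True
    project("arrow: move toward target")
    return False
```

In a real installation, guidance_step would run continuously per frame, with the sensor read and the projection draw replaced by the chosen toolkit's APIs.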

Learning Objectives 

Through prototyping and critique, students will leave with a framework for designing interactive projection as a mediating tool between humans and machines.

  • Conceptualize & Design
  • Integrate Sensing & Projection
  • Prototype Human-in-the-Loop Workflows
  • Model Behavior with Finite State Machines & Behavior Trees
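The last objective can be made concrete with a toy example. The sketch below shows a finite state machine driving a projected assembly guide, where sensor events advance the machine and each state corresponds to a different projected cue; the state and event names are invented for illustration and do not come from any course project.

```python
# Toy finite state machine for a projected assembly guide.
# Each state names a projected cue; sensor-derived events drive transitions.
# States and events are illustrative, not from a real sensing pipeline.

TRANSITIONS = {
    ("show_step", "part_detected"): "confirm",
    ("confirm", "placement_ok"): "show_step",    # advance to the next step
    ("confirm", "placement_bad"): "show_error",
    ("show_error", "part_removed"): "show_step",
}

class GuideFSM:
    def __init__(self):
        self.state = "show_step"

    def handle(self, event: str) -> str:
        """Advance the machine; unknown events leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

A transition table like this keeps the projection logic declarative: adding a new cue means adding states and rows, not rewriting control flow, which is why the seminar pairs FSMs with the more hierarchical behavior trees.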

Faculty


Projects from this course

Rock the Rock

A Real-Time Audio-Visual Stone Symphony

Rock the Rock is an interactive audio-visual installation that identifies and tracks rocks in real time, generating dynamic sound and projection overlays. By leveraging computer vision and finite state machines, it transforms geological forms into a sensory experience.

Concept and Context

This project served as our introduction to Finite State … Read more

PalmPilot 2000

No mouse. No keyboard. Just you.

GitHub link: https://github.com/Adronegenius/palmpilot2000.git

Problem/Opportunity

Current 2D CAD layout tools rely heavily on complex UI layers and precise mouse manipulation. These become barriers for:

PalmPilot2000 addresses this by introducing a body-centered interaction system, where gestures drive the logic of selecting, moving, and placing elements on a plan — with visual … Read more

Hardware III _ LEGO_GUIDE

GitHub: https://github.com/Clarrainl/Lego_AR_Interface

Introduction

What if LEGO instructions could appear right on your table, adapting to the bricks you have? Lego_AR_Interface turns the simple act of LEGO building into a smart and interactive experience. Combining computer vision, real-time projection, and gesture recognition, the system guides users through constructing custom models — no screen touches required.

Concept: Build from Random Bricks … Read more