This project documents a one-week workshop that introduced students to the COMPAS framework. Our goal was to design a tile mosaic as part of a collaborative pick-and-place workflow with the UR5, incorporating COMPAS Fab, Kangaroo (Grasshopper), and ROS. Our design took a bottom-up approach, embracing randomness in tile size and color, and ultimately finding order by using vector fields to align the tiles in a specific direction. Stepping back from the work, we saw its value as a potential wayfinding solution for the built environment.

Figure 01: Wayfinding Examples

Our design was simple, using circles in three different sizes and colors, cut out of Aludibond. Each circle has a small slit cut into it, which we wanted to use to indicate some kind of information. In the image below, the three tiles we are using are shown on the left; on the right, the same three tiles are rotated in a direction, which we want to use for wayfinding.

The slits in the circles can always point in a desired direction, helping lead people somewhere or guiding the eye to a certain position. It is a subtle cue that can be implemented in many different locations.

Figure 03: Splitting path application
Figure 04: Potential Wall Design Feature
Figure 05: Wall Design Feature

Digital Workflow

The tiles are packed into a bounding area, where the Kangaroo circle-packing simulation is applied. Once the tiles are nested, each tile's orientation is pivoted to align with the vector field applied over the bounding area. Once the design stage is complete, the tiles are sorted by distance from the origin (0,0); this sorting order corresponds to the placing order in the robotic workflow. Each tile output contains metadata about the tile's color, position, and orientation.
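
The sorting step can be illustrated with a short Python sketch. The tile dictionary layout below is our assumption for illustration, not the exact Grasshopper output:

```python
import math

# Hypothetical tile records mirroring the Grasshopper output:
# each tile carries its color, center position, and orientation angle.
tiles = [
    {"color": "red",   "position": (0.42, 0.10), "rotation": 1.57},
    {"color": "blue",  "position": (0.05, 0.08), "rotation": 0.78},
    {"color": "white", "position": (0.20, 0.35), "rotation": 2.10},
]

def distance_from_origin(tile):
    """Euclidean distance of the tile center from the (0, 0) origin."""
    x, y = tile["position"]
    return math.hypot(x, y)

# Sort tiles by distance from the origin; this order becomes the
# placing sequence consumed by the robotic workflow.
placing_order = sorted(tiles, key=distance_from_origin)

for i, tile in enumerate(placing_order):
    print(i, tile["color"], tile["position"], tile["rotation"])
```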

Figure 06: Grasshopper Workflow

Robotic Picking Approach

We aimed to enhance the scanning process to compute the orientation of randomly placed tiles at the picking station. This required developing a computer vision methodology capable of feature detection and determining object orientation.
Picking station

Macro workflow: design, OpenCV, and COMPAS Fab

Our overarching workflow combines Grasshopper, ROS, Docker, and COMPAS Fab. As shown in the previous step, Grasshopper outputs the count of each circle type and its associated color. We feed this information into COMPAS Fab, where it is merged with data received from a Python script running OpenCV. The script, detailed later, detects the color, position, and rotation of each circle in the pickup area in front of the robot. This information is sorted, and the color, picking-plane position, and rotation are transmitted to COMPAS Fab. COMPAS Fab then identifies the first circle required in the design, finds its location in the OpenCV data, picks it up, and places it where needed.
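
The matching between the design sequence and the detected tiles can be sketched as follows. This is a minimal illustration of the logic only; the data structures, the `next_pick` helper, and the sample values are our assumptions, and the real workflow hands the resulting pick data to COMPAS Fab, which builds the pick frame and drives the UR5 via ROS:

```python
design_sequence = ["red", "blue", "red"]  # colors in placing order (from Grasshopper)

# Detection results as reported by the OpenCV script (illustrative values).
detected_tiles = [
    {"color": "blue", "pick_position": (0.31, 0.12), "rotation": 0.40},
    {"color": "red",  "pick_position": (0.18, 0.25), "rotation": 1.90},
    {"color": "red",  "pick_position": (0.44, 0.07), "rotation": 2.60},
]

def next_pick(color, available):
    """Return (and consume) the first detected tile of the given color."""
    for i, tile in enumerate(available):
        if tile["color"] == color:
            return available.pop(i)
    raise ValueError(f"no {color} tile left at the picking station")

for color in design_sequence:
    tile = next_pick(color, detected_tiles)
    # In the real workflow these values feed COMPAS Fab, which converts
    # position and rotation into a pick frame and sends motion commands.
    print("pick", tile["color"], "at", tile["pick_position"], "rot", tile["rotation"])
```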

Computer Vision Logic

Exploiting the main feature of our tile, the notch, we were able to determine each tile's orientation by finding and clustering intersection points between the detected edge and an offset circle. This allowed us to define a direction vector that can be read by Grasshopper and COMPAS Fab during the robotic picking process.
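
A minimal OpenCV sketch of this idea is shown below. It assumes a grayscale image with a single tile roughly in frame; the file name, Hough parameters, and the 0.8 · r offset radius are illustrative guesses, and a simple circular mean stands in for the clustering of intersection points described above:

```python
import cv2
import numpy as np

# Load a grayscale image of one tile (placeholder path) and extract edges.
img = cv2.imread("tile.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 50, 150)

# Detect the tile's outer circle with the Hough transform (parameters are guesses).
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                           param1=150, param2=40, minRadius=30, maxRadius=200)
assert circles is not None, "no circle detected"
cx, cy, r = circles[0][0]  # center and radius of the strongest circle

# Sample an offset circle slightly inside the tile boundary. Away from the
# notch this ring crosses no edges; at the notch, the slit edges intersect it.
hits = []
for theta in np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False):
    x = int(cx + 0.8 * r * np.cos(theta))
    y = int(cy + 0.8 * r * np.sin(theta))
    if 0 <= y < edges.shape[0] and 0 <= x < edges.shape[1] and edges[y, x] > 0:
        hits.append(theta)

# Collapse the intersection angles (here: a simple circular mean) into the
# notch direction, expressed as a unit vector for Grasshopper / COMPAS Fab.
if hits:
    angle = np.arctan2(np.mean(np.sin(hits)), np.mean(np.cos(hits)))
    direction = (np.cos(angle), np.sin(angle))
    print("notch direction:", direction)
```
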
Image processing diagram
Image processing developments in OpenCV
OpenCV results