I'm calling the visual representations of each color track
drones. Like worker bees tracking pollen, these will track the RGB elements across the UV space. Currently the
makeDrone subprocess creates:
- visible geometry
- RGB attributes from a colorEditor call
- unique object name
- colorSeek inclusion toggle (include in batch color tracking or not)
- RGB swatch via canvas command
- target rig attribute (which rig attribute this drone drives)
- "edit RGB" and delete options
Possible future features:
- per-drone area constraints
- initial search pattern
- per-frame census (count of frames in which the drone finds its target RGB)
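The per-frame census idea might look something like the sketch below: given a per-frame list of sampled colors, count how many frames land within a tolerance of the drone's target RGB. The function name, tolerance value, and input format are all assumptions for illustration.

```python
def census(samples, target_rgb, tol=0.05):
    """Count frames whose sampled RGB is within tol of the target (hypothetical helper)."""
    def close(a, b):
        return all(abs(x - y) <= tol for x, y in zip(a, b))
    return sum(1 for frame_rgb in samples if close(frame_rgb, target_rgb))

# Two of these three sampled frames sit within tolerance of pure red:
hits = census([(1.0, 0.0, 0.0), (0.98, 0.02, 0.01), (0.2, 0.2, 0.2)],
              (1.0, 0.0, 0.0))
# hits == 2
```

A census like this would also be a cheap way to evaluate per-drone area constraints and search patterns: a drone that rarely finds its target is a candidate for a wider search area.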