Vibe-Based Art Discovery
AI Prototyping & Apps
For this project at LUMAS, I used AI to build a mood-driven discovery interface for a large portfolio of limited-edition art prints. The core idea: instead of searching by keyword or category, users express what they're looking for through five aesthetic sliders — Subtle vs. Intense, Nature vs. Urban, Abstract vs. Figurative, Minimalist vs. Detailed, and Geometric vs. Organic.
My contribution was the scoring pipeline behind it. I used a multimodal vision model — a transformer architecture that combines a ViT-style vision encoder with a language model for joint visual-semantic reasoning — to analyze each artwork and assign it a position along every slider axis. This happens once, at indexing time.
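A minimal sketch of what that indexing step can look like. Everything here is illustrative: the axis names mirror the five sliders above, but the prompt wording, JSON schema, and `parse_scores` helper are assumptions, not the actual LUMAS pipeline. The model call itself is omitted; the sketch shows only how a model's reply could be validated into per-axis scores.

```python
import json

# The five slider axes; each score lives in [-1.0, 1.0], where -1.0 is
# the left pole (e.g. Subtle) and +1.0 the right pole (e.g. Intense).
AXES = [
    ("subtle", "intense"),
    ("nature", "urban"),
    ("abstract", "figurative"),
    ("minimalist", "detailed"),
    ("geometric", "organic"),
]

def build_prompt() -> str:
    """Hypothetical prompt asking the vision-language model for one
    JSON object with a score per axis."""
    keys = ", ".join(f'"{a}_{b}"' for a, b in AXES)
    return (
        "Rate this artwork on each aesthetic axis from -1.0 to 1.0. "
        f"Reply with a single JSON object with keys: {keys}."
    )

def parse_scores(model_reply: str) -> dict:
    """Parse the model's JSON reply and clamp every score to [-1, 1],
    so a slightly out-of-range reply never corrupts the index."""
    raw = json.loads(model_reply)
    scores = {}
    for a, b in AXES:
        key = f"{a}_{b}"
        scores[key] = max(-1.0, min(1.0, float(raw[key])))
    return scores

# Example reply a model might produce for a calm forest print:
reply = (
    '{"subtle_intense": -0.7, "nature_urban": -0.9, '
    '"abstract_figurative": 0.4, "minimalist_detailed": -0.2, '
    '"geometric_organic": 0.8}'
)
print(parse_scores(reply)["nature_urban"])  # -0.9
```

Because this runs once per artwork at indexing time, the cost of the model call is paid up front and never affects query latency.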
The filtering itself is intentionally classical and deterministic: no neural inference at query time, just fast lookups against the precomputed scores. The result is an interface that feels expressive and intuitive, backed by logic that is fully predictable and auditable.
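The query-time side can be sketched as plain arithmetic over the precomputed vectors — here as squared-Euclidean-distance ranking in the five-dimensional slider space. The catalog entries and distance metric are illustrative assumptions; the point is that nothing here requires a model.

```python
# Precomputed scores from indexing time, one 5-tuple per artwork
# (hypothetical catalog; axis order matches the five sliders).
CATALOG = {
    "misty_forest": (-0.7, -0.9,  0.4, -0.2,  0.8),
    "neon_skyline": ( 0.8,  0.9, -0.1,  0.3, -0.6),
    "color_fields": ( 0.2, -0.1, -0.9, -0.8, -0.4),
}

def rank(slider_values, catalog, top_k=3):
    """Rank artworks by squared Euclidean distance between their
    precomputed scores and the user's slider settings (each in [-1, 1]).
    Deterministic and auditable: same sliders, same ranking, every time."""
    def dist(scores):
        return sum((s - q) ** 2 for s, q in zip(scores, slider_values))
    return sorted(catalog, key=lambda art_id: dist(catalog[art_id]))[:top_k]

# Sliders pushed toward Subtle and Nature:
print(rank((-0.5, -0.8, 0.0, 0.0, 0.5), CATALOG, top_k=1))  # ['misty_forest']
```

Swapping in cosine similarity or hard per-axis range filters is a one-line change, since the scores are just numbers in a table.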
