Published: 2026/1/8 13:13:44

A design game-changer is born! "OnomaCompass" makes texture hunting insanely fast, for real!?

Ultra-short summary: Search for textures using onomatopoeia (sound-symbolic words) ✨ A magical tool that supercharges design work is born!

Gyaru-style sparkle points ✨

● It links words (onomatopoeia) with textures! Texture search gets blazing fast. Simply divine! 🙏
● Even design beginners can turn their mental image into something concrete. Isn't that the best? 💖
● Big potential to shake up the IT industry! New services could keep popping up one after another! 🚀

Detailed Explanation


OnomaCompass: A Texture Exploration Interface that Shuttles between Words and Images

Miki Okamura / Shuhey Koyama / Li Jingjing / Yoichi Ochiai

Humans can finely perceive material textures, yet articulating such somatic impressions in words is a cognitive bottleneck in design ideation. We present OnomaCompass, a web-based exploration system that links sound-symbolic onomatopoeia and visual texture representations to support early-stage material discovery. Instead of requiring users to craft precise prompts for generative AI, OnomaCompass provides two coordinated latent-space maps, one for texture images and one for onomatopoeic terms, built from an authored dataset of invented onomatopoeia and corresponding textures generated via Stable Diffusion. Users can navigate both spaces, trigger cross-modal highlighting, curate findings in a gallery, and preview textures applied to objects via an image-editing model. The system also supports video interpolation between selected textures and re-embedding of extracted frames to form an emergent exploration loop. We conducted a within-subjects study with 11 participants comparing OnomaCompass to a prompt-based image-generation workflow using Gemini 2.5 Flash Image ("Nano Banana"). OnomaCompass significantly reduced workload (NASA-TLX overall, mental demand, effort, and frustration; p < .05) and increased hedonic user experience (UEQ), while usability (SUS) favored the baseline. Qualitative findings indicate that OnomaCompass helps users externalize vague sensory expectations and promotes serendipitous discovery, but also reveals interaction challenges in spatial navigation. Overall, leveraging sound symbolism as a lightweight cue offers a complementary approach to Kansei-driven material ideation beyond prompt-centric generation.
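To make the "cross-modal highlighting" idea concrete: the abstract describes two coordinated embedding spaces, one for texture images and one for onomatopoeic terms, where selecting an item in one space highlights its nearest neighbors in the other. The sketch below is not the authors' code; it is a minimal illustration assuming both modalities have already been embedded into a shared vector space (all names and data are hypothetical), with highlighting implemented as a cosine-similarity nearest-neighbor lookup.

```python
import numpy as np

def cosine_sim(a, b):
    # Pairwise cosine similarity between rows of a (n x d) and b (m x d).
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def highlight_terms(texture_vecs, term_vecs, selected_texture, k=3):
    # Cross-modal highlighting: return indices of the k onomatopoeic
    # terms whose embeddings are closest to the selected texture.
    sims = cosine_sim(texture_vecs[selected_texture:selected_texture + 1],
                      term_vecs)
    return np.argsort(-sims[0])[:k]

# Toy data: 4 texture embeddings and 5 term embeddings in a shared
# 8-dimensional space (random stand-ins for real model embeddings).
rng = np.random.default_rng(0)
textures = rng.normal(size=(4, 8))
terms = rng.normal(size=(5, 8))

top = highlight_terms(textures, terms, selected_texture=2, k=3)
print(top)  # indices of the 3 terms to highlight for texture #2
```

The same lookup run in the opposite direction (term to textures) would cover the other half of the coordination; the paper's re-embedding loop would then feed frames extracted from interpolated videos back through the embedding step as new rows of `textures`.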

cs / cs.HC