Published: 2025/10/23 9:38:50

Robot tactile control "NeuralTouch" has arrived! ✨

Super-short summary: robots grasping objects by touch gets a huge accuracy boost, and it works across all kinds of shapes!

Gyaru-style sparkle points

● Vision (camera) alone won't cut it 🙅‍♀️ the trick is using touch (tactile) too!
● It handles all kinds of shapes, so it can shine in all kinds of settings 💖
● It takes what it practiced in simulation (virtual) and makes it work in the real world too, pure genius 👏


Read the rest in the らくらく論文 app

NeuralTouch: Neural Descriptors for Precise Sim-to-Real Tactile Robot Control

Yijiong Lin / Bowen Deng / Chenghua Lu / Max Yang / Efi Psomopoulou / Nathan F. Lepora

Grasping accuracy is a critical prerequisite for precise object manipulation, often requiring careful alignment between the robot hand and object. Neural Descriptor Fields (NDF) offer a promising vision-based method to generate grasping poses that generalize across object categories. However, NDF alone can produce inaccurate poses due to imperfect camera calibration, incomplete point clouds, and object variability. Meanwhile, tactile sensing enables more precise contact, but existing approaches typically learn policies limited to simple, predefined contact geometries. In this work, we introduce NeuralTouch, a multimodal framework that integrates NDF and tactile sensing to enable accurate, generalizable grasping through gentle physical interaction. Our approach leverages NDF to implicitly represent the target contact geometry, from which a deep reinforcement learning (RL) policy is trained to refine the grasp using tactile feedback. This policy is conditioned on the neural descriptors and does not require explicit specification of contact types. We validate NeuralTouch through ablation studies in simulation and zero-shot transfer to real-world manipulation tasks, such as peg-out-in-hole and bottle lid opening, without additional fine-tuning. Results show that NeuralTouch significantly improves grasping accuracy and robustness over baseline methods, offering a general framework for precise, contact-rich robotic manipulation.
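To make the pipeline described in the abstract concrete, below is a minimal conceptual sketch, not the authors' code: an NDF-style module proposes a coarse grasp pose plus a neural descriptor from a point cloud, and a tactile-conditioned policy iteratively nudges the pose using contact feedback. All names (NDFGraspProposer, TactileRefinementPolicy, refine_grasp) and the toy correction rule are hypothetical placeholders; the real system would use a trained NDF network and an RL policy trained in simulation.

```python
"""Conceptual sketch of the NeuralTouch idea (hypothetical placeholder code):
vision-based NDF proposal -> tactile-feedback refinement loop."""

import numpy as np


class NDFGraspProposer:
    """Stand-in for an NDF: maps an object point cloud to a coarse 6-DoF grasp
    pose and a neural descriptor summarizing the target contact geometry."""

    def propose(self, point_cloud: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
        centroid = point_cloud.mean(axis=0)
        pose = np.concatenate([centroid, np.zeros(3)])  # [x, y, z, roll, pitch, yaw]
        descriptor = np.random.randn(32)                # placeholder latent descriptor
        return pose, descriptor


class TactileRefinementPolicy:
    """Stand-in for the RL policy: conditioned on the descriptor and the current
    tactile reading, it outputs a small pose correction (no explicit contact-type
    labels needed)."""

    def __init__(self, gain: float = 0.1):
        self.gain = gain

    def act(self, descriptor: np.ndarray, tactile: np.ndarray) -> np.ndarray:
        # Toy rule: nudge the gripper against the sensed contact imbalance.
        correction = np.zeros(6)
        correction[:3] = -self.gain * tactile[:3]
        return correction


def refine_grasp(point_cloud: np.ndarray, read_tactile, steps: int = 10) -> np.ndarray:
    """Closed-loop refinement: start from the NDF proposal, then apply
    tactile-feedback corrections for a fixed number of steps."""
    proposer = NDFGraspProposer()
    policy = TactileRefinementPolicy()
    pose, descriptor = proposer.propose(point_cloud)
    for _ in range(steps):
        tactile = read_tactile(pose)  # e.g. contact forces/shear from a fingertip sensor
        pose = pose + policy.act(descriptor, tactile)
    return pose


if __name__ == "__main__":
    cloud = np.random.rand(500, 3)                     # fake partial point cloud
    fake_sensor = lambda pose: np.random.randn(6) * 0.01
    print("refined grasp pose:", refine_grasp(cloud, fake_sensor))
```

The sketch only shows the control flow; how the descriptor is learned, how the RL policy is trained in simulation, and how the zero-shot sim-to-real transfer is achieved are all detailed in the paper itself.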

cs / cs.RO