I am interested in how robots can learn from humans to perform dexterous, everyday tasks in the real world.
EgoVerse
Ryan Punamiya1, Simar Kareer1, Zeyi Liu2, Josh Citron2, Ri-Zhao Qiu3, Xiongyi Cai3, Alexey Gavryushin4, Jiaqi Chen4, Davide Liconti4, Lawrence Y. Zhu1, Patcharapong Aphiwetsa1, Baoyu Li1, Aniketh Cheluva1, Pranav Kuppili1, Yangcen Liu1, Dhruv Patel1, Aidan Gao1, Hye-Young Chung1, Ryan Co1, Renee Zbizika2, Jeff Liu2, Xiaomeng Xu2, Haoyu Xiong2, Geng Chen2, Sebastiano Oliani2, Chenyu Yang2, Xi Wang2, James Fort2, Richard Newcombe2, Josh Gao2, Jason Chong2, Garrett Matsuda2, Aseem Doriwala2, Marc Pollefeys4, Robert Katzschmann4, Xiaolong Wang3, Shuran Song2, Judy Hoffman1, Danfei Xu1
1Georgia Tech, 2Stanford, 3USC, 4ETH Zurich
arXiv 2025
Webpage • Paper • Code
TL;DR: A data collection and policy learning framework that learns whole-body mobile manipulation directly from robot-free human demonstrations.
Wensi Ai, Ruohan Zhang, Karthik Dharmarajan, Nishal P. Shah, Chaofei Fan, Tasha Kim, Yunfan Jiang, Chen Wang, Alex Hodges, Renee Zbizika, Mattia Rigotti-Thompson, Donald Avansino, Nick Hahn, Akansha Singh, Foram Kamdar, Payton Bechefsky, Alexander Acosta, Lucille Panagos, Kushaal Rao, Bayardo E. Lacayo, Carlos E. Vargas-Irwin, Nicholas Au Yong, Chethan Pandarinath, Leigh Hochberg, Francis R. Willett, Jaimie M. Henderson, Li Fei-Fei, and Jiajun Wu
In submission to Science Robotics (2025)
TL;DR: A brain-robot interface system enabling dexterous robot manipulation via neural signals from intracortical recordings.
Music making: see portfolio.