Portrait of Binghao Huang

Binghao Huang

黄秉豪
Robot Learning

CS Ph.D. student
A robotic hand demonstration

Intro

I'm a third-year Ph.D. student in Computer Science at Columbia University, advised by Prof. Yunzhu Li. I received my M.S. in Mechanical and Aerospace Engineering from UC San Diego, advised by Prof. Xiaolong Wang. I have also had the pleasure of working at the NVIDIA Seattle Robotics Lab.

My research interests lie in Robot Learning, Dexterous Manipulation, and Multi-Modal Perception.

Email / CV_25.07 / Google Scholar / Twitter / Bilibili Video Channel

News

Learning with Flexible Tactile Skin
• [2025/11] Invited Talk at New York University, General-purpose Robotics and AI Lab.
• [2025/11] Invited Talk at Amazon, Frontier AI & Robotics.
• [2025/10] Our paper VT-Refine receives the Best Paper Award at IROS 2025 AHFHR Workshop. [Link]
• [2025/10] Invited Talk at Duke Robotics.
• [2025/10] Invited Talk at UPenn GRASP SFI Seminar Series.
• [2025/08] Keynote Speaker at UW AI & Robotics Data Summit.
• [2025/08] One paper is accepted by CoRL 2025.
• [2025/07] Released our new work Touch in the Wild and its code.
• [2025/07] Invited Talk at Facebook AI Research (FAIR). [Slides]
• [2025/06] Our paper Touch in the Wild is awarded the Best Demo Award at RSS 2025 Workshop on Robot Hardware-Aware Intelligence. [Link]
• [2025/06] Invited Talk at University of Washington, Mechanical Engineering Department. [Slides]
Featured Publications

    * Equal contribution, + Equal advising

    Touch in the Wild: Learning Fine-Grained Manipulation with a Portable Visuo-Tactile Gripper
Xinyue Zhu*,1, Binghao Huang*,1, Yunzhu Li1
Conference on Neural Information Processing Systems (NeurIPS), 2025
Best Demo Award at RSS 2025 Workshop on Robot Hardware-Aware Intelligence [Link]
[Webpage] [Paper] [Video] [Code]

    3D-ViTac project preview

    3D-ViTac: Learning Fine-Grained Manipulation with Visuo-Tactile Sensing
    Binghao Huang, Yixuan Wang, Xinyi Yang, Yiyue Luo, Yunzhu Li
    Conference on Robot Learning (CoRL), 2024
    [Webpage] [Paper] [Hardware Tutorial] [Video]

    Dynamic Handover project preview

    Dynamic Handover: Throw and Catch with Bimanual Hands
Binghao Huang*, Yuanpei Chen*, Tianyu Wang, Yuzhe Qin, Yaodong Yang, Nikolay Atanasov, Xiaolong Wang
    Conference on Robot Learning (CoRL), 2023
    [Webpage] [Paper] [Code]

    Robot Synesthesia project preview

    Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing
    Ying Yuan*, Haichuan Che*, Yuzhe Qin*, Binghao Huang, Zhao-Heng Yin, Kang-Won Lee, Yi Wu, Soo-Chul Lim, Xiaolong Wang
    International Conference on Robotics and Automation (ICRA), 2024
[Webpage] [Paper] [Code]

    Rotating without Seeing project preview

    Rotating without Seeing: Towards In-hand Dexterity through Touch
Zhao-Heng Yin*, Binghao Huang*, Yuzhe Qin, Qifeng Chen, Xiaolong Wang
    Robotics: Science and Systems (RSS), 2023
    [Webpage] [Paper] [Code]

    AnyTeleop project preview

    AnyTeleop: A General Vision-Based Dexterous Robot Arm-Hand Teleoperation System
Yuzhe Qin, Wei Yang, Binghao Huang, Karl Van Wyk, Hao Su, Xiaolong Wang, Yu-Wei Chao, Dieter Fox
    Robotics: Science and Systems (RSS), 2023
    [Webpage] [Paper]

    DexPoint project preview

    DexPoint: Generalizable Point Cloud Reinforcement Learning for Sim-to-Real Dexterous Manipulation
Yuzhe Qin*, Binghao Huang*, Zhao-Heng Yin, Hao Su, Xiaolong Wang
    Conference on Robot Learning (CoRL), 2022
    [Webpage] [Paper] [Code]

    Robot Systems

    FlexiTac Tactile Platform

FlexiTac is an open-source, scalable tactile sensing solution designed to make touch sensing easier to build, customize, and deploy across robotic systems. The platform supports fast hardware fabrication, tactile simulation for robot learning workflows, and full system integration from manipulation hardware to real-world learning pipelines. [Project] [Hardware Repo] [Hardware Tutorial] [Simulation]

    Open-Source Hardware

    Tactile Simulation

    System Designs


    Tactile Bimanual Manipulation System

We propose 3D-ViTac, a multi-modal sensing and learning system for dexterous bimanual manipulation. The system features flexible, scalable, low-cost tactile sensors, with each finger equipped with a 16 × 16 sensor array. If you would like to know more about how tactile sensors can benefit your robot system, feel free to contact me. [Hardware Tutorial] [Project] [Paper]
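As a purely illustrative sketch (not the 3D-ViTac API — the function name, threshold, and feature choices here are hypothetical), one reading from a 16 × 16 tactile pad can be treated as a pressure grid, from which a binary contact map and simple contact features are easy to extract:

```python
import numpy as np

def contact_features(frame: np.ndarray, threshold: float = 0.1):
    """Hypothetical example: extract a contact map, contact area,
    and pressure-weighted contact centroid from a 16x16 tactile frame."""
    assert frame.shape == (16, 16)
    contact = frame > threshold            # binary contact map
    area = int(contact.sum())              # number of active taxels
    if area == 0:
        return contact, area, None
    ys, xs = np.nonzero(contact)
    w = frame[ys, xs]                      # pressures at active taxels
    centroid = (float((ys * w).sum() / w.sum()),
                float((xs * w).sum() / w.sum()))
    return contact, area, centroid

# Example: a synthetic 3x3 press near the middle of the pad
frame = np.zeros((16, 16))
frame[6:9, 7:10] = 0.5
cmap, area, center = contact_features(frame)
print(area, center)  # 9 (7.0, 8.0)
```

Features like these (contact area, centroid) are a common low-dimensional summary of tactile frames before feeding them into a learned policy.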

    Tactile Hardware

    Flexible Grasping


    Tactile Dexterous Hand System

    We propose Touch Dexterity, a new dexterous manipulation system to perform in-hand object rotation with only touch sensing. On the left, we show our hardware setup with 16 FSR sensors attached to an Allegro hand.

    Hardware Setup

    In-hand Object Rotation

    Contact Signal Simulation


    Bimanual Hand Robot System

    We propose Dynamic Handover, a new bimanual dexterous hands system designed for throwing and catching tasks. The system consists of two Allegro Hands, each individually attached to a separate XArm robot, arranged in a facing configuration.

    Bimanual hand robot system pipeline

    Hardware Setup

    Throw and Catch in Real

    System in Simulation


Humanoid Robot with Perception and Navigation

I developed a ROS-based control pipeline for a navigation system that uses 2D LiDAR and depth cameras. I also designed a vision-based tracking method that leverages object detection algorithms to enable obstacle avoidance for mobile robots. [Paper] [Code]

    Humanoid robot mobile platform

    Hardware Setup

    Navigation in Simulation

    Navigation in Real World


    Work Experience

    Professional Service

    • Conference Reviewer: CoRL, ICLR, RSS, IROS, ICRA
    • Journal Reviewer: IEEE T-RO, IEEE RA-L, IEEE Signal Processing Letters
    • Workshop Organizer: Learning Dexterous Manipulation @RSS 2023

    Interests

In addition to my research in robotics, I am a content creator with a strong passion for sharing my knowledge of the field. I currently manage a Robotics Video Channel with over 63,000 followers and 4 million total views. My Most Popular Video, which discusses robots combined with brain-computer interfaces, has garnered over 1.84 million views and is widely recognized within the field.

    Motor Augmentation

    Atlas&MPC

    Soft Robot




Last updated: 2026.02

Credit: website template from Dr. Songfang Han