Patrick Rim
I am a first-year CS Ph.D. student at Yale University in Prof. Alex Wong’s Yale Vision Laboratory. Previously, I completed my B.S. in Computer Science with a minor in Information and Data Sciences at the California Institute of Technology (Caltech).
I am currently also a Research Scientist Intern at Meta Reality Labs, working with Kun He and Shoou-I Yu on the Extended Reality (XR) team.
Research Interests
My research centers on building embodied AI agents with adaptive, efficient, and robust egocentric perception and multimodal capabilities (e.g., vision combined with range sensing, language, or audio). Within this broader goal, my recent work focuses on sensor fusion, that is, combining inputs from cameras and range sensors such as LiDAR or radar, in challenging and dynamic settings. I am also interested in recognition tasks, such as 3D object detection for applications like AR/VR and autonomous driving, and in generation tasks, such as text-conditional depth map generation with diffusion models. Currently, I am exploring in-the-wild egocentric motion capture (hand and body tracking).
Recent Publications
ProtoDepth: Unsupervised Continual Depth Completion with Prototypes
Patrick Rim, Hyoungseob Park, S. Gangopadhyay, Ziyao Zeng, Younjoon Chung, Alex Wong
CVPR 2025
SparseFusion: Fusing Multi-Modal Sparse Representations for Multi-Sensor 3D Object Detection
Yichen Xie, Chenfeng Xu, MJ Rakotosaona, Patrick Rim, Federico Tombari, Kurt Keutzer, Masayoshi Tomizuka, Wei Zhan
ICCV 2023
Quadric Representations for LiDAR Odometry, Mapping and Localization
Chao Xia*, Chenfeng Xu*, Patrick Rim, Mingyu Ding, Nanning Zheng, Kurt Keutzer, Masayoshi Tomizuka, Wei Zhan
RA-L 2023