EgoFish3D: Egocentric 3D Pose Estimation from a Fisheye Camera via Self-Supervised Learning

Authors

Abstract

Egocentric vision has gained increasing popularity recently, opening new avenues for human-centric applications. Egocentric fisheye cameras offer wide-angle coverage, but they introduce image distortion along with strong self-occlusion of the human body, posing significant challenges for data processing and model construction. Unlike previous work that relies only on synthetic training data, this paper presents a real-world EgoCentric Human Pose (ECHP) dataset. To tackle the difficulty of collecting 3D ground truth with motion capture systems, we simultaneously collect images from a head-mounted camera as well as two third-person-view cameras, circumventing environmental restrictions. Through self-supervised learning under multi-view constraints, we propose a simple yet effective framework, namely EgoFish3D, for egocentric 3D pose estimation from a single fisheye camera in different scenarios. The proposed EgoFish3D incorporates three main modules: 1) the exocentric module takes third-person-view images as input and estimates 3D poses represented in the third-person frame; 2) the egocentric module predicts 3D poses directly from the fisheye image; 3) the interactive module estimates the rotation matrix between the two views. Experimental results on our ECHP dataset and existing benchmark datasets demonstrate the effectiveness of EgoFish3D, which achieves performance superior to state-of-the-art methods.
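The self-supervised multi-view constraint described above can be sketched as a consistency loss: the egocentric 3D pose, mapped through the predicted inter-view rotation, should agree with the exocentric 3D pose. The snippet below is a minimal illustration of that idea only; all function and variable names are hypothetical and do not come from the paper's implementation.

```python
import numpy as np

def consistency_loss(pose_ego, pose_exo, R):
    """Mean per-joint distance between the egocentric pose rotated into the
    exocentric frame and the exocentric pose.

    pose_ego, pose_exo: (J, 3) arrays of root-relative joint coordinates.
    R: 3x3 rotation matrix mapping egocentric to exocentric coordinates
       (illustrative stand-in for the interactive module's prediction).
    """
    pose_ego_in_exo = pose_ego @ R.T  # rotate each joint (row vector convention)
    return np.mean(np.linalg.norm(pose_ego_in_exo - pose_exo, axis=1))

# Toy check: with a perfectly consistent pose pair the loss vanishes.
rng = np.random.default_rng(0)
pose_exo = rng.standard_normal((17, 3))  # 17 joints, e.g. a Human3.6M-style layout
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])  # rotation about the z-axis
pose_ego = pose_exo @ R  # synthesize an egocentric pose consistent with R
loss = consistency_loss(pose_ego, pose_exo, R)
```

In training, such a loss would let the two single-view pose estimates supervise each other without motion-capture ground truth, which is the core of the multi-view self-supervision the abstract describes.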


Similar articles

3D Face pose estimation and tracking from a monocular camera

In this paper, we describe a new approach for estimating and tracking the three-dimensional (3D) pose of a human face from face images obtained from a single monocular view with full perspective projection. We assume that the shape of a 3D face can be approximated by an ellipse and that the aspect ratio of the 3D face ellipse is given. Given a monocular image of a face, we first perform an ellipse d...


3D Reconstruction from Full-view Fisheye Camera

In this report, we propose a 3D reconstruction method for a full-view fisheye camera. The camera we used is a Ricoh Theta (Fig. 1), which captures spherical images and has a wide field of view (FOV). The conventional stereo approach based on a perspective camera model cannot be directly applied, so we instead use a spherical camera model to depict the relation between a 3D point and its correspond...


Joint Camera Pose Estimation and 3D Human Pose Estimation in a Multi-camera Setup

In this paper we propose an approach to jointly perform camera pose estimation and human pose estimation from videos recorded by a set of cameras separated by wide baselines. Multi-camera pose estimation is very challenging in case of wide baselines or in general when patch-based feature correspondences are difficult to establish across images. For this reason, we propose to exploit the motion ...


Body Pose Tracking From Uncalibrated Camera Using Supervised Manifold Learning

We present a framework to estimate 3D body configuration and viewpoint from a single uncalibrated camera. We model shape deformations corresponding to both viewpoint and body configuration changes through the motion. Such observed shapes form a product space (different configurations × different views) and therefore lie on a two-dimensional manifold in the visual input space. The approach ...


Camera Pose Estimation from a Stereo Setup

This thesis addresses the problem of estimating camera poses with respect to a rigid object, which is equivalent to the problem of three-dimensional registration of a moving rigid object in front of fixed cameras. Matching, tracking, and 3D reconstruction of feature points with a stereoscopic vision setup allow the computation of the homogeneous transformation matrix linking two consecutive scene ...



Journal

Journal title: IEEE Transactions on Multimedia

Year: 2023

ISSN: 1520-9210, 1941-0077

DOI: https://doi.org/10.1109/tmm.2023.3242551