XR Reality Check: What Commercial Devices Deliver for Spatial Tracking
Author: Astrid · Posted 2025-09-18 15:12
Inaccurate spatial tracking in extended reality (XR) devices leads to virtual object jitter, misalignment, and user discomfort, fundamentally limiting immersive experiences and natural interactions. In this work, we introduce a novel testbed that enables simultaneous, synchronized evaluation of multiple XR devices under identical environmental and kinematic conditions. Leveraging this platform, we present the first comprehensive empirical benchmarking of five state-of-the-art XR devices across sixteen diverse scenarios. Our results reveal substantial intra-system performance variation, with individual devices exhibiting up to 101% increases in error when operating in featureless environments. We also demonstrate that tracking accuracy strongly correlates with visual conditions and motion dynamics. Finally, we explore the feasibility of substituting a motion capture system with the Apple Vision Pro as a practical ground-truth reference (0.387), highlighting both its potential and its constraints for rigorous XR evaluation. This work establishes the first standardized framework for comparative XR tracking evaluation, providing the research community with reproducible methodologies, comprehensive benchmark datasets, and open-source tools that enable systematic analysis of tracking performance across devices and conditions, thereby accelerating the development of more robust spatial sensing technologies for XR systems.
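The error metric reported above is relative pose error (RPE). As a minimal sketch of how a translation-only RPE could be computed from two synchronized position tracks, assuming (N, 3) arrays of positions at matched timestamps (the paper's exact interval length and rotation handling are not specified here):

```python
# Hedged sketch of translation-only relative pose error (RPE).
# Assumes est/ref are (N, 3) position arrays at synchronized timestamps.
import numpy as np

def translational_rpe(est: np.ndarray, ref: np.ndarray, delta: int = 1) -> float:
    """Mean norm of the difference between per-interval displacements."""
    est_rel = est[delta:] - est[:-delta]   # estimated motion over each interval
    ref_rel = ref[delta:] - ref[:-delta]   # reference motion over each interval
    return float(np.mean(np.linalg.norm(est_rel - ref_rel, axis=1)))

# Toy check: a constant offset between trajectories cancels in relative error.
ref = np.cumsum(np.random.default_rng(0).normal(size=(100, 3)), axis=0)
est = ref + 0.01                       # uniform bias, no drift
print(translational_rpe(est, ref))     # ≈ 0 (bias cancels out)
```

Because RPE compares relative displacements rather than absolute poses, it is insensitive to a fixed offset between coordinate frames, which is why it suits cross-device comparison.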
The rapid advancement of extended reality (XR) technologies has generated significant interest across research, development, and consumer domains. However, inherent limitations persist in visual-inertial odometry (VIO) and visual-inertial SLAM (VI-SLAM) implementations, particularly under challenging operational conditions including high rotational velocities, low-light environments, and textureless spaces. A rigorous quantitative evaluation of XR tracking systems is essential for developers optimizing immersive applications and for users choosing devices. However, three fundamental challenges impede systematic performance evaluation across commercial XR platforms. First, major XR manufacturers do not disclose critical tracking performance metrics, sensor (tracking camera and IMU) interfaces, or algorithm architectures. This lack of transparency prevents independent validation of tracking reliability and limits decision-making by developers and end users alike. Second, existing evaluations focus on trajectory-level performance but omit timestamp-level correlation analyses that link pose errors to camera and IMU sensor data. This omission limits the ability to analyze how environmental factors and user kinematics influence estimation accuracy.
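A timestamp-level correlation analysis of the kind described here can be illustrated with synthetic data; the signal names below (per-frame tracked feature count, per-frame pose error) are assumptions for illustration, not the paper's actual measurements:

```python
# Illustrative timestamp-level correlation between a sensor-derived signal
# (tracked visual feature count per frame, assumed name) and pose error.
import numpy as np

rng = np.random.default_rng(1)
n = 500
feature_count = rng.integers(20, 200, size=n).astype(float)  # visual richness per frame
# Synthetic error model: accuracy degrades as features become scarce, plus noise.
pose_error_cm = 50.0 / feature_count + rng.normal(0.0, 0.05, size=n)

# Pearson correlation at the timestamp level, frame-by-frame.
r = np.corrcoef(feature_count, pose_error_cm)[0, 1]
print(f"feature-count vs. error correlation: r = {r:.2f}")  # strongly negative here
```

The same computation, applied per device and per sensor channel, is what lets an evaluation attribute error growth to specific environmental or kinematic factors rather than reporting only trajectory-level aggregates.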
Finally, most prior work does not share testbed designs or experimental datasets, limiting reproducibility, validation, and follow-up research, such as efforts to model, predict, or adapt to pose errors based on trajectory and sensor data. In this work, we propose a novel XR spatial tracking testbed that addresses all of the aforementioned challenges. The testbed enables the following functionalities: (1) synchronized multi-device tracking performance evaluation under diverse motion patterns and configurable environmental conditions; (2) quantitative analysis relating environmental characteristics, user motion dynamics, multi-modal sensor data, and pose errors; and (3) open-source calibration procedures, data collection frameworks, and analytical pipelines. Furthermore, our analyses reveal that the Apple Vision Pro's tracking accuracy (with an average relative pose error (RPE) of 0.52 cm, the best among all devices tested) enables its use as a ground-truth reference for evaluating other devices' RPE without a motion capture system. We release these materials to promote reproducibility and standardized evaluation in the XR research community. We designed a novel testbed enabling simultaneous evaluation of multiple XR devices under identical environmental and kinematic conditions.
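Synchronized multi-device evaluation requires resampling each device's track onto a common clock. A minimal sketch of that step, assuming linear interpolation of positions is adequate between frames (the testbed's actual synchronization and clock-offset estimation are more involved):

```python
# Minimal sketch: resample one device's (N, 3) position track onto another
# device's timestamps via per-axis linear interpolation, so that poses can be
# compared frame-by-frame. Sample rates below are illustrative.
import numpy as np

def resample_positions(t_src, pos_src, t_dst):
    """Interpolate an (N, 3) position track from t_src onto timestamps t_dst."""
    return np.stack([np.interp(t_dst, t_src, pos_src[:, k]) for k in range(3)], axis=1)

t_a = np.linspace(0.0, 1.0, 11)            # device A sampled at ~10 Hz
t_b = np.linspace(0.0, 1.0, 31)            # device B sampled at ~30 Hz
pos_b = np.stack([t_b, 2.0 * t_b, np.zeros_like(t_b)], axis=1)  # straight-line motion
pos_b_on_a = resample_positions(t_b, pos_b, t_a)  # B expressed on A's clock
print(pos_b_on_a.shape)                    # (11, 3)
```

Orientation tracks would need spherical interpolation (slerp) rather than per-axis linear interpolation; positions suffice to show the resampling idea.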
This testbed achieves accurate evaluation through precise time synchronization and extrinsic calibration. We conducted the first comparative evaluation of five state-of-the-art commercial XR devices (four headsets and one pair of glasses), quantifying spatial tracking performance across 16 diverse scenarios. Our evaluation reveals that average tracking errors vary by up to 2.8× between devices under identical challenging conditions, with errors ranging from sub-centimeter to over 10 cm depending on the device, motion type, and environmental conditions. We performed correlation analysis on the collected sensor data to quantify the influence of environmental visual features, SLAM internal status, and IMU measurements on pose error, demonstrating that different XR devices exhibit distinct sensitivities to these factors. Finally, we presented a case study evaluating the feasibility of using the Apple Vision Pro as a substitute for conventional motion capture systems in tracking evaluation. Given the observed result (0.387), the Apple Vision Pro offers a reliable reference for local tracking accuracy, making it a practical tool for many XR evaluation scenarios despite its limitations in assessing global pose precision.
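Before one device's trajectory can serve as a reference for another's, the two tracks must be expressed in a common frame. A generic way to do this, not necessarily the paper's exact pipeline, is the SVD-based Kabsch/Umeyama rigid alignment:

```python
# Sketch: rigidly align a reference trajectory (e.g. a track standing in for
# motion capture) to a target device's coordinate frame via Kabsch alignment.
import numpy as np

def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Return R, t minimizing sum ||R @ src_i + t - dst_i||^2 (no scale)."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

rng = np.random.default_rng(2)
ref = rng.normal(size=(50, 3))                     # reference-device positions
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
target = ref @ Rz.T + np.array([0.5, -0.2, 1.0])   # same path, rotated and shifted
R, t = rigid_align(ref, target)
residual = np.linalg.norm(ref @ R.T + t - target, axis=1).max()
print(f"max alignment residual: {residual:.2e}")   # near machine precision
```

Such an alignment recovers local agreement between tracks but cannot correct a reference device's own global drift, which is consistent with the limitation noted above for assessing global pose precision.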