Search Results for author: Yi-Ting Shen

Found 6 papers, 0 papers with code

Diversifying Human Pose in Synthetic Data for Aerial-view Human Detection

no code implementations · 24 May 2024 · Yi-Ting Shen, Hyungtae Lee, Heesung Kwon, Shuvra S. Bhattacharyya

We present a framework for diversifying human poses in a synthetic dataset for aerial-view human detection.

Exploring the Impact of Synthetic Data for Aerial-view Human Detection

no code implementations · 24 May 2024 · Hyungtae Lee, Yan Zhang, Yi-Ting Shen, Heesung Kwon, Shuvra S. Bhattacharyya

Synthetic data can be a good resource for expanding training data, but the domain gap with real-world data is the biggest obstacle to its use in training.

Progressive Transformation Learning for Leveraging Virtual Images in Training

no code implementations · CVPR 2023 · Yi-Ting Shen, Hyungtae Lee, Heesung Kwon, Shuvra Shikhar Bhattacharyya

To effectively interrogate UAV-based images for detecting objects of interest, such as humans, it is essential to acquire large-scale UAV-based datasets that include human instances with various poses captured from widely varying viewing angles.

Archangel: A Hybrid UAV-based Human Detection Benchmark with Position and Pose Metadata

no code implementations · 31 Aug 2022 · Yi-Ting Shen, Yaesop Lee, Heesung Kwon, Damon M. Conover, Shuvra S. Bhattacharyya, Nikolas Vale, Joshua D. Gray, G. Jeremy Leong, Kenneth Evensen, Frank Skirlo

Learning to detect objects, such as humans, in imagery captured by an unmanned aerial vehicle (UAV) usually suffers from tremendous variations caused by the UAV's position relative to the objects.

Tasks: Human Detection · Model Optimization · +4

What Synthesis is Missing: Depth Adaptation Integrated with Weak Supervision for Indoor Scene Parsing

no code implementations · ICCV 2019 · Keng-Chi Liu, Yi-Ting Shen, Jan P. Klopp, Liang-Gee Chen

Our proposed two-stage integration more than halves the gap to fully supervised methods when compared to the previous state-of-the-art in transfer learning.

Tasks: Scene Parsing · Transfer Learning
