Search Results for author: Ozge Mercanoglu Sincan

Found 7 papers, 1 paper with code

Using an LLM to Turn Sign Spottings into Spoken Language Sentences

no code implementations • 15 Mar 2024 • Ozge Mercanoglu Sincan, Necati Cihan Camgoz, Richard Bowden

Sign Language Translation (SLT) is a challenging task that aims to generate spoken language sentences from sign language videos.

Language Modelling · Large Language Model · +2

Giving a Hand to Diffusion Models: a Two-Stage Approach to Improving Conditional Human Image Generation

1 code implementation • 15 Mar 2024 • Anton Pelykh, Ozge Mercanoglu Sincan, Richard Bowden

Our approach not only enhances the quality of the generated hands but also offers improved control over hand pose, advancing the capabilities of pose-conditioned human image generation.

Anatomy · Image Generation

Is context all you need? Scaling Neural Sign Language Translation to Large Domains of Discourse

no code implementations • 18 Aug 2023 • Ozge Mercanoglu Sincan, Necati Cihan Camgoz, Richard Bowden

Sign Language Translation (SLT) is a challenging task that aims to generate spoken language sentences from sign language videos, both of which have different grammar and word/gloss order.

Machine Translation · NMT · +3

Gloss Alignment Using Word Embeddings

no code implementations • 8 Aug 2023 • Harry Walsh, Ozge Mercanoglu Sincan, Ben Saunders, Richard Bowden

As a result, research has turned to TV broadcast content as a source of large-scale training data, which consists of both the sign language interpreter and the associated audio subtitles.

Word Alignment · Word Embeddings

Using Motion History Images with 3D Convolutional Networks in Isolated Sign Language Recognition

no code implementations • 24 Oct 2021 • Ozge Mercanoglu Sincan, Hacer Yalim Keles

In this paper, we propose an isolated sign language recognition model trained on Motion History Images (MHI) generated from RGB video frames.

Sign Language Recognition