no code implementations • COLING 2022 • Ryoko Tokuhisa, Keisuke Kawano, Akihiro Nakamura, Satoshi Koide
Pre-trained language models (PLMs) such as BERT and RoBERTa have dramatically improved the performance of various natural language processing tasks.
no code implementations • 21 Feb 2024 • Yasushi Esaki, Akihiro Nakamura, Keisuke Kawano, Ryoko Tokuhisa, Takuro Kutsuna
We propose an accuracy-preserving calibration method using the Concrete distribution as the probabilistic model on the probability simplex.
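The abstract names the Concrete (Gumbel-Softmax) distribution as the probabilistic model on the simplex. As a point of reference only — not the authors' calibration method — a minimal sketch of sampling from a Concrete distribution, which perturbs log-locations with Gumbel noise and pushes them through a tempered softmax so every sample lies on the probability simplex:

```python
import numpy as np

def sample_concrete(alpha, temperature, rng):
    """Draw one sample from a Concrete (Gumbel-Softmax) distribution.

    alpha: positive location parameters, shape (K,)
    temperature: > 0; lower values concentrate mass near simplex vertices
    Returns a point on the K-dimensional probability simplex.
    """
    # Gumbel(0, 1) noise via the inverse-CDF trick
    gumbel = -np.log(-np.log(rng.uniform(size=alpha.shape)))
    logits = (np.log(alpha) + gumbel) / temperature
    # softmax (numerically stabilized) maps the perturbed logits onto the simplex
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
x = sample_concrete(np.array([2.0, 1.0, 0.5]), temperature=0.5, rng=rng)
```

The temperature controls how close samples sit to the simplex corners, which is what makes the distribution a smooth relaxation of a categorical one.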
no code implementations • 9 Mar 2023 • Keisuke Kawano, Takuro Kutsuna, Ryoko Tokuhisa, Akihiro Nakamura, Yasushi Esaki
One major challenge in machine learning applications is coping with mismatches between the datasets used during development and those encountered in real-world applications.
no code implementations • 3 Sep 2021 • Yasuyuki Suzuki, Keigo Togame, Akihiro Nakamura, Taishin Nomura
The intermittent on-off switching of feedback control is considered a major mechanism of postural stabilization during human quiet standing, which can be modeled by switched-type hybrid stochastic delay differential equations with unstable subsystems.
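To illustrate the class of model the abstract describes — not the paper's actual equations or parameters — a hypothetical toy sketch: a scalar state with an unstable open-loop drift, delayed feedback that switches on only outside a dead zone (the intermittent control), and additive noise, integrated with Euler-Maruyama. All parameter values below are illustrative assumptions.

```python
import numpy as np

def simulate(T=10.0, dt=1e-3, delay=0.2, a=1.0, k=2.0,
             dead=0.1, sigma=0.05, seed=0):
    """Toy switched stochastic delay differential equation:
    dx = (a*x + u(x_delayed)) dt + sigma dW, with on-off delayed feedback."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    d = int(delay / dt)          # delay expressed in time steps
    x = np.zeros(n)
    x[0] = 0.05
    for t in range(n - 1):
        x_del = x[t - d] if t >= d else x[0]          # delayed observation
        u = -k * x_del if abs(x_del) > dead else 0.0  # feedback OFF inside dead zone
        drift = a * x[t] + u                          # unstable open-loop subsystem
        x[t + 1] = x[t] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

traj = simulate()
```

Inside the dead zone the dynamics are unstable and the state drifts away; once the delayed state exceeds the threshold, the corrective feedback switches on, producing the bounded intermittent oscillation characteristic of these models.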
no code implementations • 1 Oct 2019 • Akihiro Nakamura, Tatsuya Harada
On both tasks, we show that our method achieves higher accuracy than common few-shot learning algorithms.
no code implementations • 4 Apr 2019 • Akihiro Nakamura, Michihiro Kobayashi
We propose a simple method for estimating noise level from a single color image.
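The abstract does not describe the proposed method itself, so as background only, a sketch of a classical single-image noise estimator (Immerkær's fast method, not the authors' approach): convolve with a Laplacian-difference kernel that suppresses image structure, then scale the mean absolute response. For a color image one would apply it per channel.

```python
import numpy as np

def estimate_noise_sigma(gray):
    """Estimate the std of additive Gaussian noise from a single grayscale
    image using Immerkaer's fast method (1996)."""
    H, W = gray.shape
    # Difference-of-Laplacians kernel: near-zero response on smooth/edge
    # structure, strong response on pixel-level noise.
    kernel = np.array([[ 1, -2,  1],
                       [-2,  4, -2],
                       [ 1, -2,  1]], dtype=float)
    # 'valid' 2-D convolution via stacked shifts (no SciPy dependency)
    resp = sum(kernel[i, j] * gray[i:H - 2 + i, j:W - 2 + j]
               for i in range(3) for j in range(3))
    # E|N(0, 36*sigma^2)| = 6*sigma*sqrt(2/pi); invert that expectation
    return np.sqrt(np.pi / 2) / (6 * (H - 2) * (W - 2)) * np.abs(resp).sum()
```

On a pure-noise image the estimate recovers the true standard deviation; on natural images the kernel's structure suppression keeps the bias small, which is why this estimator is a common baseline for the task.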