no code implementations • EcomNLP (COLING) 2020 • Ryo Shimura, Shotaro Misawa, Masahiro Sato, Tomoki Taniguchi, Tomoko Ohkuma
Previous laboratory studies have indicated that the ratings recorded by these systems differ from the actual evaluations of the users, owing to the influence of historical ratings in the system.
no code implementations • EcomNLP (COLING) 2020 • Shotaro Misawa, Yasuhide Miura, Tomoki Taniguchi, Tomoko Ohkuma
To generate a slogan, we apply an encoder–decoder model which has shown effectiveness in many kinds of natural language generation tasks, such as abstractive summarization.
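As an illustrative sketch of the encoder–decoder pattern this work applies (not the paper's actual architecture), a minimal untrained RNN encoder–decoder forward pass might look like the following; all sizes, parameters, and the greedy decoding loop are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 12, 8  # toy vocabulary size and hidden size (hypothetical)

# Randomly initialised parameters -- an untrained sketch, not a fitted model.
E = rng.normal(size=(V, H)) * 0.1      # token embeddings
W_enc = rng.normal(size=(H, H)) * 0.1  # encoder recurrence
W_dec = rng.normal(size=(H, H)) * 0.1  # decoder recurrence
W_out = rng.normal(size=(H, V)) * 0.1  # hidden state -> vocabulary logits

def encode(tokens):
    """Fold the input sequence (e.g. a product description) into a context vector."""
    h = np.zeros(H)
    for t in tokens:
        h = np.tanh(E[t] + W_enc @ h)
    return h

def decode(h, max_len=5):
    """Greedily emit output tokens (e.g. a slogan) conditioned on the context."""
    out, t = [], 0  # assume token id 0 is a <bos> marker
    for _ in range(max_len):
        h = np.tanh(E[t] + W_dec @ h)
        t = int(np.argmax(W_out.T @ h))
        out.append(t)
    return out

slogan = decode(encode([3, 1, 4, 1, 5]))
```

The same encode-then-decode shape underlies abstractive summarization and most other sequence-to-sequence generation tasks the entry mentions; a real system would train these weights and decode with beam search rather than greedily.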
no code implementations • IJCNLP 2019 • Toru Nishino, Shotaro Misawa, Ryuji Kano, Tomoki Taniguchi, Yasuhide Miura, Tomoko Ohkuma
The results show that our model generates more consistent headlines, key phrases and categories.
no code implementations • COLING 2018 • Yasuhide Miura, Ryuji Kano, Motoki Taniguchi, Tomoki Taniguchi, Shotaro Misawa, Tomoko Ohkuma
We propose a model that integrates discussion structures with neural networks to classify discourse acts.
no code implementations • IJCNLP 2017 • Yasuhide Miura, Tomoki Taniguchi, Motoki Taniguchi, Shotaro Misawa, Tomoko Ohkuma
We propose a hierarchical neural network model for language variety identification that integrates information from a social network.
no code implementations • WS 2017 • Shotaro Misawa, Motoki Taniguchi, Yasuhide Miura, Tomoko Ohkuma
The contributions of this work are (1) verifying the effectiveness of a state-of-the-art NER model on Japanese and (2) proposing a neural model that predicts a tag for each character using both word and character information.
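To make the word-plus-character idea concrete: since Japanese text has no whitespace word boundaries, a per-character tagger can pair each character with the (segmented) word containing it and condition on both. A minimal sketch of that input representation, with a hypothetical pre-segmented sentence:

```python
def char_word_pairs(words):
    """Pair every character with the word that contains it, so a
    per-character tagger can use both word- and character-level features."""
    pairs = []
    for w in words:
        for c in w:
            pairs.append((c, w))
    return pairs

# Hypothetical segmented sentence: 東京 / に / 住む ("lives in Tokyo")
pairs = char_word_pairs(["東京", "に", "住む"])
# pairs[0] is ("東", "東京"): the character together with its containing word;
# a downstream model would embed both and emit one NER tag per character.
```

In the paper's setting each such pair would feed a neural tagger producing one tag per character; this snippet only shows the representation, not the model itself.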