no code implementations • 15 Mar 2024 • Namiko Saito, Joao Moura, Hiroki Uchida, Sethu Vijayakumar
Recognising the characteristics of objects while a robot handles them is crucial for adjusting its motions to ensure stable and efficient interaction with containers.
no code implementations • 26 Sep 2023 • Namiko Saito, Mayu Hiramoto, Ayuna Kubo, Kanata Suzuki, Hiroshi Ito, Shigeki Sugano, Tetsuya Ogata
We tackled the task of cooking scrambled eggs with real ingredients, in which the robot must perceive the state of the egg and adjust its stirring movements in real time as the egg is heated and its state changes continuously.
no code implementations • 8 Sep 2023 • Marina Y. Aoyama, João Moura, Namiko Saito, Sethu Vijayakumar
We validate the approach on a wiping task using sponges of varying stiffness and surface friction.
no code implementations • 4 Jun 2021 • Namiko Saito, Tetsuya Ogata, Satoshi Funabashi, Hiroki Mori, Shigeki Sugano
We also examine the contributions of image, force, and tactile data, and show that learning a variety of multimodal information results in rich perception for tool use.
no code implementations • 23 Sep 2018 • Namiko Saito, Kitae Kim, Shingo Murata, Tetsuya Ogata, Shigeki Sugano
We confirm that the robot can detect features of tools, objects, and actions by learning their effects and executing the task.