Miraikan Accessibility Lab.
Miraikan Accessibility Lab. is a research consortium that works with partner companies and institutions to invent technologies supporting the future lives of visually impaired people.

Introduction video of the work of the Accessibility Lab.
Vision
Science and technology have transformed the lives of people with disabilities. Speech synthesis and mobile devices are now indispensable to the daily lives of visually impaired people in areas such as education and employment. Miraikan Accessibility Lab. is a research consortium that collaborates with companies and institutions holding advanced AI and robotics technologies, striving to create technologies that help visually impaired people move freely about the city, access the wealth of information that surrounds them, and live independently. We will accelerate the social implementation of these technologies by having visitors experience them throughout Miraikan and by examining their issues and possibilities together.
Research
AI Suitcase

The AI Suitcase is an autonomous navigation robot that guides visually impaired people. It looks like a commercially available suitcase, yet it can recognize obstacles and people and guide its user safely to their destination. We are improving its functions through collaboration with our partners.
Cooperation: Advanced Assistive Mobility Platform
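To make the idea concrete, here is a minimal, hypothetical sketch of the kind of obstacle-aware control loop a guide robot runs. It is not the AI Suitcase's actual implementation; every name, signature, and threshold below is an illustrative assumption.

# A toy control loop for an obstacle-aware guide robot (illustration only;
# not the AI Suitcase's implementation). All names and thresholds are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # position in meters
    y: float
    heading: float  # orientation in radians

def guide_step(pose, goal, scan, safe_dist=0.8):
    """Return (linear, angular) velocity commands for one control cycle.

    `goal` is an (x, y) target in meters; `scan` holds range readings in
    meters across the robot's forward arc. The robot stops whenever an
    obstacle or pedestrian enters the safety distance, and otherwise
    steers toward the goal with a clamped proportional controller.
    """
    if min(scan) < safe_dist:
        return 0.0, 0.0  # something is too close: stop and wait
    # Heading error toward the goal, wrapped into [-pi, pi].
    desired = math.atan2(goal[1] - pose.y, goal[0] - pose.x)
    error = math.atan2(math.sin(desired - pose.heading),
                       math.cos(desired - pose.heading))
    linear = 0.6                                # cruise speed, m/s
    angular = max(-1.0, min(1.0, 1.5 * error))  # rad/s, clamped
    return linear, angular

# Example: robot at the origin facing +x, goal 5 m ahead, clear laser scan.
print(guide_step(Pose(0.0, 0.0, 0.0), (5.0, 0.0), [2.5, 3.0, 4.0]))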
Publications
2025
Hironobu Takagi, Kakuya Naito, Daisuke Sato, Masayuki Murata, Seita Kayukawa, and Chieko Asakawa. 2025. Field Trials of Autonomous Navigation Robot for Visually Impaired People. In Extended Abstracts of the 2025 CHI Conference on Human Factors in Computing Systems (CHI 2025 Case Study).
Masaki Kuribayashi, Kohei Uehara, Allan Wang, Shigeo Morishima, and Chieko Asakawa. 2025. WanderGuide: Indoor Map-less Robotic Guide for Exploration by Blind People. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI 2025).
Rie Kamikubo, Seita Kayukawa, Yuka Kaniwa, Allan Wang, Hernisa Kacorri, Hironobu Takagi, and Chieko Asakawa. 2025. Beyond Omakase: Designing Shared Control for Navigation Robots with Blind People. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI 2025).
Masaki Kuribayashi, Kohei Uehara, Allan Wang, Daisuke Sato, Renato Ribeiro, Simon Chu, and Shigeo Morishima. 2025. Memory-Maze: Scenario Driven Visual Language Navigation Benchmark for Guiding Blind People. In IEEE Robotics and Automation Letters (RA-L).
2024
Masaya Kubota*, Masaki Kuribayashi*, Seita Kayukawa, Hironobu Takagi, Chieko Asakawa, and Shigeo Morishima (* equal contribution). 2024. Snap&Nav: Smartphone-based Indoor Navigation System for Blind People via Floor Map Analysis and Intersection Detection. In Proceedings of the 26th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2024).
Yuka Kaniwa*, Masaki Kuribayashi*, Seita Kayukawa, Daisuke Sato, Hironobu Takagi, Chieko Asakawa, and Shigeo Morishima (* equal contribution). 2024. ChitChatGuide: Conversational Interaction Using Large Language Models for Assisting People with Visual Impairments to Explore a Shopping Mall. In Proceedings of the 26th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2024).
2023
Seita Kayukawa, Daisuke Sato, Masayuki Murata, Tatsuya Ishihara, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa. 2023. Enhancing Blind Visitor’s Autonomy in a Science Museum Using an Autonomous Navigation Robot. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI 2023).
2022
Seita Kayukawa, Daisuke Sato, Masayuki Murata, Tatsuya Ishihara, Akihiro Kosugi, Hironobu Takagi, Shigeo Morishima, and Chieko Asakawa. 2022. How Users, Facility Managers, and Bystanders Perceive and Accept a Navigation Robot for Visually Impaired People in Public Buildings. In Proceedings of the 31st IEEE International Conference on Robot & Human Interactive Communication (IEEE RO-MAN 2022).
"Touchable" Exhibits

Telescopes and microscopes are essential scientific instruments that allow us to see what would otherwise remain unseen. To enable visually impaired people to grasp such visual information by touch, we are developing "touchable" exhibits paired with interactive audio guides.
Publications
2025
Xiyue Wang, Seita Kayukawa, Hironobu Takagi, and Chieko Asakawa. 2025. Engaging Blind People in Science Museums Through an Immersive Workshop: Practices, Challenges, and Technological Opportunities. In The 27th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2025).
Ayaka Tsutsui*, Xiyue Wang*, Hironobu Takagi, and Chieko Asakawa (* equal contribution). 2025. Investigating "Touch and Talk" for Blind and Low Vision People: Science Communication Assistance Through Exploring Multiple Tactile Objects. In The 27th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2025).
2024
Xiyue Wang, Seita Kayukawa, Hironobu Takagi, Giorgia Masoero, and Chieko Asakawa. 2024. Direct or Immersive? Comparing Smartphone-based Museum Guide Systems for Blind Visitors. In The 21st International Web for All Conference (W4A 2024).
2023
Xiyue Wang, Seita Kayukawa, Hironobu Takagi, and Chieko Asakawa. 2023. TouchPilot: Designing a Guidance System That Assists Blind People in Learning Complex 3D Structures. In The 25th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2023).
2022
Xiyue Wang, Seita Kayukawa, Hironobu Takagi, and Chieko Asakawa. 2022. BentoMuseum: 3D and Layered Interactive Museum Map for Blind Visitors. In The 24th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2022).
Researcher

Xiyue Wang
Ph.D. (Information Science) / Researcher, Miraikan - The National Museum of Emerging Science and Innovation
My current research focuses on developing tangible and tactile user interfaces that provide barrier-free museum experiences to people with vision impairments and diverse needs.
Keywords: Human-Computer Interaction, 3D Printing, Touch Sensing, Behavioral Data Analysis

Allan Wang
Ph.D. (Robotics) / Researcher, Miraikan - The National Museum of Emerging Science and Innovation
My research interest is social navigation: how a robot can navigate smoothly while conforming to the various social rules of pedestrian-rich environments. It has many application scenarios, such as navigation for the AI Suitcase robot.
Keywords: Social Navigation, Visual Navigation, Motion Prediction, Human-Robot Interaction

Renato Ribeiro
Researcher, Miraikan - The National Museum of Emerging Science and Innovation
I have researched methods for making navigation in virtual environments accessible for people with visual impairments. I currently explore ways to enable robots to navigate smoothly in pedestrian-rich environments.
Keywords: Human-Computer Interaction, Virtual Reality, Navigation

Masaki Kuribayashi
Part-time Researcher, Miraikan - The National Museum of Emerging Science and Innovation
My research focuses on developing robots that assist visually impaired people in navigating and exploring unfamiliar environments, such as places they are visiting for the first time.
Keywords: Human-Computer Interaction, Map-less Navigation, Visual Language Navigation

Ayaka Tsutsui
Part-time Researcher, Miraikan - The National Museum of Emerging Science and Innovation
My research explores how people with visual impairments use tactile information to understand and engage with the world, with a focus on designing interactive systems that support tactile exploration through the lens of interaction design and assistive technology.
Keywords: Human-Computer Interaction, Assistive Technology, Multimodal

Kengo Tanaka
Part-time Researcher, Miraikan - The National Museum of Emerging Science and Innovation
I am conducting research on interfaces to enhance accessibility to cultural, artistic, and educational content for people with visual impairments.
Keywords: Human-Computer Interaction, Generative AI, 3D Printing
Alumni
Open Data / Open Source
- Miraikan Accessibility Lab. GitHub
- AI Suitcase Project GitHub
- Miraikan 360-degree Video Dataset
This is a dataset of omnidirectional images of Miraikan captured using a 360-degree camera. Anyone can use it for research and development purposes (a brief usage sketch follows this list).
- Model data for 3D printing
The model data for 3D printing the "touchable" exhibits is published on Thingiverse:
https://www.thingiverse.com/accessibilitylabmiraikan/designs
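As a rough illustration of how the 360-degree dataset might be used, the Python sketch below extracts an ordinary perspective view from an equirectangular panorama. It assumes the dataset provides equirectangular frames (please verify against the dataset's documentation), and the file name is a placeholder.

# Sample a perspective view from an equirectangular 360-degree image.
# Assumes equirectangular input; the file name below is a placeholder.
import numpy as np
import cv2

def equirect_to_perspective(pano, yaw_deg, pitch_deg, fov_deg=90.0,
                            out_size=(640, 480)):
    """Render a pinhole-camera view (yaw/pitch in degrees) from a panorama."""
    h, w = pano.shape[:2]
    out_w, out_h = out_size
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2.0)  # focal length, px

    # Ray directions through each output pixel, camera looking down +z.
    xs = np.arange(out_w) - out_w / 2.0
    ys = np.arange(out_h) - out_h / 2.0
    xv, yv = np.meshgrid(xs, ys)
    dirs = np.stack([xv, yv, np.full_like(xv, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Rotate rays by pitch (about x) then yaw (about y).
    p, q = np.radians(pitch_deg), np.radians(yaw_deg)
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    ry = np.array([[np.cos(q), 0, np.sin(q)], [0, 1, 0], [-np.sin(q), 0, np.cos(q)]])
    dirs = dirs @ rx.T @ ry.T

    # Convert rays to longitude/latitude, then to panorama pixel coordinates.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])        # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))   # [-pi/2, pi/2]
    map_x = ((lon / np.pi + 1.0) * 0.5 * w).astype(np.float32)
    map_y = ((lat / (np.pi / 2.0) + 1.0) * 0.5 * h).astype(np.float32)
    return cv2.remap(pano, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_WRAP)

# Hypothetical usage:
# pano = cv2.imread("miraikan_360_sample.jpg")
# view = equirect_to_perspective(pano, yaw_deg=30, pitch_deg=0)
# cv2.imwrite("view.jpg", view)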
Participating Organizations
Sponsorship
Miraikan (Accessibility Lab.) sponsors the following events and organizations.
- IPSJ (Information Processing Society of Japan), SIG AAC (Assistive and Accessible Computing)
Recruitment of Researchers, Experimental Participants
Miraikan Accessibility Lab. is seeking researchers who aspire to work with us at Miraikan, as well as people who can participate in experiments as users.
- If you wish to work with us as a researcher: Please see the "About Recruitment of Researchers" link below.
- If you wish to participate in experiments as a user: Please send us your request via the "Contact through the Internet" link below.
Contact
Miraikan - The National Museum of Emerging Science and Innovation