Joint RI / HCII / QoLT Seminar - Chieko Asakawa

Can We Make Our World Accessible?

Abstract
Computers have been changing the lives of the blind. For example, OCR (optical character recognition) allowed them to access printed documents, and synthesized-voice Web access helped them reach online services. Now, new cognitive computing technologies are reaching the point where computers can help in sensing, recognizing, and understanding our living world. Crowd-driven accessibility is an important new approach that combines human and machine intelligence for real-world accessibility. Dr. Asakawa will try to predict future directions, beginning with a review of the technologies that have given people with disabilities, and especially blind people, increasingly better access to a world where vision is taken for granted. She will review how cognitive assistance was introduced via science fiction, leading to a discussion of a possible assistance application. Finally, she will discuss paths toward the grand challenge of making the world accessible.

Bio Sketch
Chieko Asakawa is an IBM Fellow and Chief Technology Officer for Accessibility Research and Technology at IBM Research. Her contributions to Web accessibility include IBM Home Page Reader, one of the first voice browsers for the visually impaired; the aDesigner tool for accessibility evaluation; ai-browser for multimedia content accessibility; and Social Accessibility, a system that brings the power of crowdsourcing to improving Web accessibility. Recently she began focusing on making the real world more accessible through cognitive computing strategies and on expanding the use of accessibility technologies to help senior citizens. Her many awards include the Anita Borg Institute Women of Vision Award in 2011, the Medal of Honor with Purple Ribbon from the Government of Japan in 2013, and the ACM SIGACCESS 2013 Impact Award. She is a member of ACM, IEICE, IPSJ, and the IBM Academy of Technology. Her Ph.D. in engineering is from the University of Tokyo, Japan.