Abstract

This document outlines various accessibility-related user needs, requirements and scenarios for real-time communication (RTC). These user needs should drive accessibility requirements in various related specifications and the overall architecture that enables them. The document first introduces a definition of RTC as used throughout, and outlines how RTC accessibility can support the needs of people with disabilities. It then defines the term 'user needs' as used throughout the document and lists a range of these user needs with their related requirements. Following that, some quality-related scenarios are outlined, and finally a data table maps the user needs contained in this document to related use case requirements found in other technical specifications.

This document is most explicitly not a collection of baseline requirements. It is also important to note that some of the requirements may be implemented at a system or platform level, and some may be authoring requirements.

Status of This Document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

This is a First Public Working Draft of RTC Accessibility User Requirements by the Accessible Platform Architectures Working Group. It is developed by the Research Questions Task Force (RQTF) who work to identify accessibility knowledge gaps and barriers in emerging and future web technologies. The requirements outlined here come from research into user needs that then provide the basis for any technical requirements. This version is published to collect public feedback on the requirements prior to finalization as a Working Group Note.

Publication as a First Public Working Draft does not imply endorsement by the
W3C Membership. This is a draft document and may be updated, replaced or
obsoleted by other documents at any time. It is inappropriate to cite this
document as other than work in progress.

1. Introduction

What is Real-time communication (RTC)?

The traditional data exchange model is client to server. Real-time communication (RTC) is game-changing because it is enabled in part by specifications like WebRTC, which provides real-time peer-to-peer audio, video and data exchange directly between supported user agents. This enables instantaneous applications for video and audio calls, text chat, file exchange, screen sharing and gaming, all without the need for browser plugins. However, WebRTC is not the sole specification with responsibility for enabling accessible real-time communications, as the use cases and requirements are broad, as outlined in the IETF RFC 7478 'Web Real-Time Communication Use Cases and Requirements' document. [ietf-rtc]

1.1 Real-time communication and accessibility

RTC has the potential to allow improved accessibility features that will support a broad range of user needs for people with a wide range of disabilities. These needs can be met through improved audio and video quality, audio routing, captioning, improved live transcription, transfer of alternate formats such as sign-language, text-messaging / chat, real time user support and status polling.

Accessible RTC is enabled by a combination of technologies and specifications such as those from the Media Working Group, Web and Networks IG, Second Screen, and Web Audio Working Group, as well as AGWG and ARIA. APA hopes this document will inform how these groups meet their various responsibilities for enabling accessible RTC, as well as inform updates to related use cases in various groups. For an example, see the current work on the WebRTC Next Version Use Cases First Public Working Draft. [webrtc-use-cases]

1.2 User needs definition

This document outlines various accessibility related user needs for Accessible RTC. These user needs should drive accessibility requirements for Accessible RTC and its related architecture.

User needs are presented here with their related requirements, some in a range of scenarios (which can be thought of as similar to user stories). User needs and requirements are being actively reviewed by RQTF/APA.

2. User needs and requirements

The following outlines a range of user needs and requirements. The user needs have also been compared to existing use cases for Real-time text (RTT), such as the IETF Framework for Real-time Text over IP Using the Session Initiation Protocol (SIP) (RFC 5194) and the European procurement standard EN 301 549. [rtt-sip] [EN301-549]

2.1 Incoming calls and caller ID

User Need 1: A screen-reader user or user with a cognitive impairment needs to know a call is incoming and needs to recognise the ID of the caller.

REQ 1a: Provide indication of incoming calls in an unobtrusive way via a symbol set or other browser notification.

REQ 1b: Alert assistive technologies via relevant APIs.

2.2 Routing and communication channel control

User Need 2: A blind user of both screen reader and braille output devices may need to manage audio and text output differently, e.g. once a call has been accepted, the user may wish to route notifications to a separate braille device while continuing the call on Bluetooth headphones.

REQ 2a: Provide or support a range of browser level audio output and routing options.

REQ 2b: Allow controlled routing of alerts and other browser output to a braille device or other hardware.
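As a sketch of how an application might support such routing preferences, the helper below picks an audio output device from the list returned by navigator.mediaDevices.enumerateDevices() and applies it with HTMLMediaElement.setSinkId(). The label-matching helper and its device-list shape are illustrative assumptions; the browser wiring is guarded so it is a no-op outside a browser.

```javascript
// Sketch: route call audio to a chosen output device (e.g. Bluetooth
// headphones) while alerts go elsewhere. Assumes a browser exposing
// navigator.mediaDevices and HTMLMediaElement.setSinkId.

// Pure helper: pick the first audio output whose label matches a hint.
function pickOutputDevice(devices, labelHint) {
  return devices.find(
    (d) => d.kind === "audiooutput" &&
           d.label.toLowerCase().includes(labelHint.toLowerCase())
  ) || null;
}

// Browser-only wiring (no-op outside a browser):
async function routeAudio(mediaElement, labelHint) {
  if (typeof navigator === "undefined") return;
  const devices = await navigator.mediaDevices.enumerateDevices();
  const device = pickOutputDevice(devices, labelHint);
  if (device && typeof mediaElement.setSinkId === "function") {
    await mediaElement.setSinkId(device.deviceId); // reroute this element's audio
  }
}
```

A separate media element per output (one for call audio, one for alerts) would then let each be routed independently.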

User Need 3: A deaf user needs to move parts of a live teleconference session (as separate streams) to one or more devices for greater control.

REQ 3a: Allow the separate routing of video streams such as a sign language interpreter to a separate high resolution display.

User Need 4: Users with cognitive disabilities or blind users may have relative volume levels set as preferences that relate to importance, urgency or meaning.

REQ 4a: Allow the panning or setting of relative levels of different audio output.

REQ 4b: Support multichannel audio in the browser.
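A minimal sketch of how relative levels and panning could be applied per stream with the Web Audio API follows; the preference categories and their gain values are hypothetical examples, not part of any specification.

```javascript
// Sketch: apply user-preferred relative levels to different audio sources.
// The category names and gain values below are illustrative assumptions.
const LEVEL_PREFS = { urgent: 1.0, normal: 0.7, ambient: 0.3 };

// Pure helper: look up the gain for a preference category.
function gainFor(category) {
  return LEVEL_PREFS[category] ?? LEVEL_PREFS.normal;
}

// Browser-only wiring with the Web Audio API (not called outside a browser):
// each incoming MediaStream gets its own gain and stereo position.
function wireStream(audioCtx, stream, category, pan = 0) {
  const source = audioCtx.createMediaStreamSource(stream);
  const gain = audioCtx.createGain();
  gain.gain.value = gainFor(category);
  const panner = new StereoPannerNode(audioCtx, { pan }); // -1 left … 1 right
  source.connect(gain).connect(panner).connect(audioCtx.destination);
}
```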

2.3 Dynamic audio description values in live conferencing

User Need 5: A user may struggle to hear audio description in a live teleconferencing situation.

2.7 Video relay services (VRS) and remote interpretation (VRI)

User Need 9: A deaf, speech impaired, or hard of hearing user needs to communicate on a call using a remote video interpretation service to access sign language and interpreter services.

REQ 9a: Provide or ensure support for video relay and remote interpretation services. This user need may relate to interoperability with third-party services; IETF has looked at standardizing a way to use Session Initiation Protocol (SIP) with VRS services. [ietf-relay] VRS calls may be made between ASL (American Sign Language) users and hearing persons speaking either English or Spanish.

2.8 Distinguishing sent and received text with RTT

User Need 10: A deaf or deaf blind user needs to tell the difference between incoming text and outgoing text.

REQ 10a: Ensure that, when used with RTT functionality, WebRTC handles the routing of this information to a format or output of the user's choosing.

2.9 Call participants and status

User Need 11: In a teleconference a blind screen-reader user needs to know what participants are on the call, as well as their status.

REQ 11a: Ensure participant details, such as name and status (whether the person is muted or talking), are accessible to users of assistive technologies.
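One way an application could surface this is to build a plain-text summary of participants and push it into an ARIA live region; the sketch below shows only the summary logic, with a hypothetical participant shape.

```javascript
// Sketch: build a screen-reader-friendly summary of call participants that
// an application could write into an ARIA live region. The participant
// object shape ({ name, muted, talking }) is an illustrative assumption.
function participantSummary(participants) {
  return participants
    .map((p) => `${p.name}: ${p.talking ? "talking" : p.muted ? "muted" : "idle"}`)
    .join("; ");
}
```

In a browser, the resulting string could be assigned to the textContent of an element with aria-live="polite" so that changes are announced without interrupting the call.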

2.10 Live transcription and captioning support

User Need 12: A deaf user or user with a cognitive disability needs to access a channel containing live transcriptions or captioning during a conference call or broadcast.

REQ 12a: Honor user preferences relating to live transcription and captioning e.g. provide support for signing or a related symbol set.
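Honouring such a preference could be as simple as selecting which live text channel to enable based on a stored setting; the helper below is a sketch with a hypothetical preference scheme and track shape, falling back to captions when the preferred channel is unavailable.

```javascript
// Sketch: choose a live text channel per user preference. The "kind"
// values ("captions", "transcript", "sign") and the track object shape
// are illustrative assumptions, not a standardized vocabulary.
function chooseTextChannel(tracks, pref) {
  return tracks.find((t) => t.kind === pref) ||
         tracks.find((t) => t.kind === "captions") ||
         null;
}
```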

2.11 Assistance for users with cognitive disabilities

User Need 13: Users with cognitive disabilities may need assistance when using audio or video communication.

REQ 13a: Ensure a WebRTC video call can host a technical or user support channel.

REQ 13b: Provide support that is customised to the needs of the user.

2.12 Personalized symbol sets for users with cognitive disabilities

User Need 14: Users with cognitive disabilities may need to use symbol sets for identifying functions available in a WebRTC enabled client for voice, file or data transfer.

REQ 14a: Provide personalization support for symbol set replacements of the existing user interface rendering of current functions or controls.

REQ 15a: Preserve an IRC-style interface as a configuration option in agents that implement WebRTC, as opposed to having only the Real-time text (RTT) type interface. While RTT is very important, it is favoured by users who are deaf or hearing impaired. For screen reader users, TTS cannot reasonably translate text into comprehensible speech unless the characters to be pronounced are transmitted in very close timing to one another. Typical gaps will result in stuttering and highly unintelligible output from the TTS engine.

Braille users and RTT

Some braille users will also prefer the RTT model. However, braille users who want text displayed in standard contracted braille might better be served in the same manner as users relying on text-to-speech (TTS) engines: by buffering the data to be transmitted until an end-of-line character is reached.
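The buffering behaviour described above can be sketched as a small helper that accumulates characters and flushes only on end-of-line, so that a TTS engine or braille display receives whole lines rather than a character-by-character stutter; the helper name and callback shape are illustrative assumptions.

```javascript
// Sketch: buffer outgoing characters and flush only at end-of-line, so a
// braille display or TTS engine receives whole lines. Hypothetical helper,
// not a WebRTC or RTT API.
function makeLineBuffer(onFlush) {
  let buffer = "";
  return function push(ch) {
    buffer += ch;
    if (ch === "\n") {
      onFlush(buffer); // deliver the complete line
      buffer = "";     // start the next line
    }
  };
}
```

An RTT-style interface would instead transmit each character as it is typed; making both modes available, as REQ 15a suggests, lets the user pick the one their assistive technology handles best.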

2.14 Deaf users: Video resolution and frame rates

User Need 15: A deaf user watching a signed broadcast needs sufficient video resolution and frame rate to maintain the legibility and clarity required to understand what is being signed.

3. Quality of service scenarios

3.1 Bandwidth for audio

Scenario: A hard of hearing user needs better stereo sound to have a quality experience in work calls or meetings with friends or family. Transmission aspects, such as the decibel range for audio, need to be of high quality. For calls, the industry allows higher audio resolution, but still mostly in mono only.

3.2 Bandwidth for video

Scenario: A hard of hearing user needs better stereo sound so they can have a quality experience when watching HD video or having an HD meeting with friends or family. Transmission aspects, such as frames per second for video quality, need to be of high quality.

NOTE: WebRTC lets applications prioritise bandwidth dedicated to audio, video and data streams; there is also some experimental work on signalling these needs to the network layer, as well as support for prioritising frame rate over resolution in case of congestion. [webrtc-priority]
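A sketch of how an application might use these hooks for a sign language stream follows: RTCRtpSender.setParameters() accepts a degradationPreference of "maintain-framerate" and a per-encoding priority of "high" (the latter defined in the WebRTC priority control work cited above). The helper that builds the parameters object is illustrative; setParameters expects back the full object obtained from getParameters().

```javascript
// Sketch: keep frame rate (for sign language legibility) and raise stream
// priority when bandwidth drops. "maintain-framerate" and "high" are values
// defined for RTCRtpSender parameters; the helper itself is an assumption.

// Pure helper: copy existing sender parameters with accessibility-friendly
// settings applied.
function framerateFirstParams(params) {
  return {
    ...params,
    degradationPreference: "maintain-framerate",
    encodings: (params.encodings || []).map((e) => ({ ...e, priority: "high" })),
  };
}

// Browser-only usage, given an RTCRtpSender from an RTCPeerConnection:
async function preferFramerate(sender) {
  await sender.setParameters(framerateFirstParams(sender.getParameters()));
}
```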

4. Data table mapping user needs with related specifications

The following table maps the user needs and requirements presented in this document with other related specifications such as those defined in RFC 5194 - Framework for Real-time text over IP Using SIP and EN 301 549 - the EU Procurement Standard. [rtt-sip] [EN301-549]

Overview of what specifications may address some of the use cases outlined above:

User need: Incoming calls
Related specs or groups: WCAG/AGWG, ARIA.
Mapping to RFC 5194 - Framework for Real-time text over IP Using SIP: Similar to 6.2.4.2 Alerting - RFC 5194 / pre-session set up with RTT 6.2.1
Mapping to EN 301 549 - EU procurement standard: