There are three main types of multimedia conferencing systems: point-to-point, multipoint, and multicast.
The notion of conferencing is changing to include such features as shared windows and whiteboards enabled by distributed computing. In addition, the use of computer mediation and integration is increasing.
A shared application or conferencing system permits two or more users at separate workstations to simultaneously view and interact with a common instance of an application and content. With such applications, users working on a report, for example, can collectively edit a shared copy. In general, documents used in groupware are active (i.e., the document displayed on the screen is connected to content in a database or spreadsheet).
Groupware provides such features as support for group protocols and control, including round-robin or on-demand floor-control policy, and both symmetric and asymmetric views of changes. In symmetric view, a change that is made is immediately shown to other users. In asymmetric view, applicable in applications involving teacher-student or physician-patient interactions, the changes made in one window may not be shown in another window. Groupware systems also support such issues as membership control (i.e., latecomers) and media control (i.e., synchronization of media).
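To make the floor-control idea concrete, the following sketch shows one way a round-robin policy with on-demand requests might be implemented; the class and method names are illustrative assumptions, not a reference to any particular groupware product.

```python
# Illustrative sketch of a round-robin floor-control policy for a shared
# application; names and structure are assumptions, not a product's API.
from collections import deque


class RoundRobinFloorControl:
    """Grants the floor (write access) to one participant at a time, in turn."""

    def __init__(self, participants):
        self._queue = deque(participants)
        self.holder = None

    def grant_next(self):
        """Pass the floor to the next participant in round-robin order."""
        if self.holder is not None:
            self._queue.append(self.holder)   # current holder goes to the back
        self.holder = self._queue.popleft()
        return self.holder

    def request_floor(self, participant):
        """On-demand request: queue the participant if someone else holds the floor."""
        if self.holder is None:
            self.holder = participant
            return True
        if participant != self.holder and participant not in self._queue:
            self._queue.append(participant)
        return False


# Example: three users editing a shared report take turns holding the floor.
floor = RoundRobinFloorControl(["alice", "bob", "carol"])
for _ in range(4):
    print("floor granted to:", floor.grant_next())
```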
The immediate multimedia applications discussed (i.e., video-on-demand, multimedia conferencing, groupware, and web browsing) have several technical requirements.
Latency refers to the delay between the transmission of data at the source and its reception at the destination. Associated with delay is the notion of jitter, the variation (or uncertainty) in the arrival time of data. In the case of multimedia conferencing systems, practical experience has shown that a maximum delay of 150 milliseconds is appropriate.3 Synchronous communications involve a bounded transmission delay.
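As a simple illustration of these terms, the sketch below computes per-packet latency and jitter from hypothetical send and receive timestamps and checks them against the 150-millisecond bound mentioned above.

```python
# Minimal sketch of measuring latency and jitter against the 150-ms bound cited
# above; the packet timestamps are hypothetical values for illustration.
send_times = [0.000, 0.020, 0.040, 0.060]      # seconds at the source
recv_times = [0.110, 0.128, 0.155, 0.172]      # seconds at the destination

latencies = [r - s for s, r in zip(send_times, recv_times)]
# Jitter is taken here as the variation in latency between consecutive packets
# (other definitions, such as RFC 3550's smoothed estimate, also exist).
jitter = [abs(b - a) for a, b in zip(latencies, latencies[1:])]

MAX_DELAY = 0.150  # 150 ms, the conferencing bound noted in the text
print("latencies (ms):", [round(d * 1000, 1) for d in latencies])
print("jitter (ms):   ", [round(j * 1000, 1) for j in jitter])
print("within bound:  ", all(d <= MAX_DELAY for d in latencies))
```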
Exhibit 2. Storage and Communication Requirements for Multimedia Traffic Streams.

Media | Storage | Communication |
---|---|---|
Text | 2k bits per page | 1k bps |
Graphics | 20k bits per page | 10k bps |
Audio | 20k bits per signal | 20k bps |
Image | 300k bits per image | 100k bps (20k bits compressed) |
Motion Video | 27M bytes for NTSC quality | 150k bps (compressed) for MPEG-1; 0.42M bps for MPEG-2 |
Animation | 15k bps | 15k bps |
Existing networks and computing systems treat individual traffic streams (i.e., audio, video, data) as completely independent and unrelated units. When these streams take different routes, they must be synchronized at the receiving end through effective and expeditious signaling.
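One way the receiving end might perform such synchronization is sketched below: media units buffered from independent audio and video streams are paired for playout only when their timestamps fall within a skew tolerance. The buffers, timestamps, and the 80-millisecond tolerance are assumed values for illustration.

```python
# Sketch of intermedia synchronization at the receiver: audio and video units
# arriving on independent streams are released for playout only when their
# timestamps fall within a skew tolerance. Values and names are illustrative.
SKEW_TOLERANCE = 0.080  # assumed 80-ms lip-sync skew tolerance

audio_buffer = {0.00: "A0", 0.04: "A1", 0.08: "A2"}   # timestamp -> audio unit
video_buffer = {0.00: "V0", 0.04: "V1", 0.09: "V2"}   # timestamp -> video frame


def playout_pairs(audio, video, tolerance):
    """Pair each video frame with the closest audio unit inside the tolerance."""
    for v_ts, frame in sorted(video.items()):
        a_ts = min(audio, key=lambda t: abs(t - v_ts))
        if abs(a_ts - v_ts) <= tolerance:
            yield frame, audio[a_ts], round(abs(a_ts - v_ts) * 1000, 1)


for frame, sample, skew_ms in playout_pairs(audio_buffer, video_buffer, SKEW_TOLERANCE):
    print(f"play {frame} with {sample} (skew {skew_ms} ms)")
```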
Bandwidth requirements for multimedia are steep, because high data throughput is essential for meeting the stream demands of audio and video traffic. A minimum of 1.5M bps is needed for MPEG-2, the emerging standard for broadcast-quality video from the Moving Picture Experts Group. Exhibit 2 depicts the storage and communication requirements for multimedia traffic streams.
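A back-of-the-envelope calculation shows why these figures demand compression; the frame size, color depth, and frame rate below are assumed example values rather than figures from the text.

```python
# Back-of-the-envelope bandwidth estimate showing why compression is essential.
# The frame size, color depth, and frame rate are assumed example values.
width, height = 640, 480          # pixels per frame
bits_per_pixel = 24               # uncompressed true color
frames_per_second = 30            # NTSC-like frame rate

uncompressed_bps = width * height * bits_per_pixel * frames_per_second
print(f"uncompressed: {uncompressed_bps / 1e6:.0f} Mbps")   # roughly 221 Mbps

mpeg2_bps = 1.5e6                 # the minimum MPEG-2 rate cited in the text
print(f"compression ratio needed: about {uncompressed_bps / mpeg2_bps:.0f}:1")
```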
The high data-presentation rate associated with uncompressed video means that errors such as a single missed frame are not readily noticeable. Most digital video is compressed, however, and dropped frames are easily noticeable. In addition, the human ear is sensitive to the loss of audio data. Hence, error controls (such as checksums) and recovery mechanisms (e.g., retransmission requests) need to be built into the network. Adding such mechanisms raises a new complexity, because retransmitted frames may arrive too late for real-time processing.
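The sketch below illustrates this trade-off: a frame that fails its checksum is retransmitted only if the repaired copy can still arrive before its playout deadline; otherwise it is dropped. The function and parameter names are hypothetical.

```python
# Sketch of the trade-off described above: a frame that fails its checksum can be
# retransmitted, but the retransmission is useful only if it arrives before the
# frame's playout deadline. Function and field names are illustrative.
import time
import zlib


def is_corrupted(payload: bytes, checksum: int) -> bool:
    """Detect transmission errors with a CRC-32 checksum."""
    return zlib.crc32(payload) != checksum


def handle_frame(payload, checksum, playout_deadline, request_retransmit,
                 now=time.monotonic):
    """Request retransmission only if the repaired frame can still arrive in time."""
    if not is_corrupted(payload, checksum):
        return payload                      # frame is usable as-is
    if now() < playout_deadline:
        return request_retransmit()         # there is still time to recover
    return None                             # too late for real-time playout: drop
```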
Quality-of-service guarantees aim to deliver acceptable performance while conserving resources. In a broad sense, quality of service enables an application to state what peak bandwidth it requires, how much variability it can tolerate in that bandwidth, the propagation delay it is sensitive to, and the connection type it requires (e.g., permanent, connectionless, or multipoint). The principle of quality of service states that the network must reliably achieve a level of performance that the user or application finds acceptable, but no better than that. Network systems can either guarantee the requested quality of service, not respond to it, or negotiate a level of service that they can guarantee.
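The following sketch shows what such a quality-of-service request might look like, together with a simple admission decision that mirrors the three outcomes just described (guarantee, no response, or negotiation). The field names and thresholds are assumptions for illustration.

```python
# Sketch of a quality-of-service request and a simple admission decision that
# mirrors the three outcomes above: guarantee, ignore, or negotiate a lower level.
# The dataclass fields and thresholds are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class QoSRequest:
    peak_bandwidth_bps: float     # peak bandwidth the application asks for
    jitter_tolerance_s: float     # how much variability it can tolerate
    max_delay_s: float            # propagation delay it is sensitive to
    connection_type: str          # e.g. "permanent", "connectionless", "multipoint"


def admit(request: QoSRequest, available_bps: float):
    """Return ('guaranteed', rate), ('negotiated', rate), or ('best-effort', None)."""
    if request.peak_bandwidth_bps <= available_bps:
        return "guaranteed", request.peak_bandwidth_bps
    if available_bps > 0:
        return "negotiated", available_bps      # counter-offer at what is left
    return "best-effort", None                  # network does not respond to the request


print(admit(QoSRequest(1.5e6, 0.02, 0.150, "multipoint"), available_bps=1.0e6))
```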
Exhibit 3. Quality-of-Service Components in Networked Multimedia Applications.
Quality of service has several components, which are depicted in Exhibit 3 and described in the sections that follow.4
Application quality-of-service parameters describe requirements for applications, such as media quality and media relations. Media quality refers to source/sink characteristics (e.g., media data-unit rate) and transmission characteristics (e.g., end-to-end delay). Media relations specifies media conversion and inter- and intrastream synchronization.
System quality-of-service requirements are specified in qualitative and quantitative terms for communication services and the operating system. Qualitative parameters define the expected level of service in descriptive terms.
Quantitative parameters are more concrete measures that include specifications such as bits per second, number of errors, job processing time, and data-unit size.
Network quality-of-service parameters describe requirements on the network, such as network load (i.e., ongoing traffic requirements such as interarrival time) and performance or guarantee requirements in terms of latency and bandwidth. In addition, traffic parameters such as peak data rate and burst length or jitter are specified, along with a traffic model. Traffic models describe the arrival of connection requests or a traffic contract based on the calculated expected traffic parameters.
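As an illustration of how declared traffic parameters might be enforced, the sketch below polices packets against a token bucket whose rate and burst size stand in for the peak data rate and burst length of a traffic contract; the specific values are assumptions.

```python
# Sketch of policing network-level traffic parameters (rate, burst length) with a
# token bucket; the parameter values are illustrative assumptions.
class TokenBucket:
    def __init__(self, rate_bps: float, burst_bits: float):
        self.rate = rate_bps          # sustained rate credited over time
        self.capacity = burst_bits    # maximum burst the contract allows
        self.tokens = burst_bits
        self.last = 0.0

    def conforms(self, packet_bits: float, now: float) -> bool:
        """Return True if the packet fits the traffic contract at time `now`."""
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bits <= self.tokens:
            self.tokens -= packet_bits
            return True
        return False                  # violates the declared burst/rate parameters


bucket = TokenBucket(rate_bps=1.5e6, burst_bits=12_000)
print([bucket.conforms(8_000, t) for t in (0.000, 0.001, 0.010)])
```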
Device quality-of-service parameters typically include timing and throughput demands for media data units.