RICE: A Reliable and Efficient Remote Instrumentation Collaboration Environment
Published in: Advances in Multimedia 2009, Vol. 2008 (2008), p. 1-17
Main Authors: , , , ,
Format: Article
Language: English
Online Access: Full text
Abstract: Remote access to scientific instruments over the Internet (i.e., remote instrumentation) demands high-resolution (2D and 3D) video image transfers with simultaneous real-time mouse and keyboard controls. Consequently, user quality of experience (QoE) is highly sensitive to network bottlenecks. Further, improper user control while reacting to video impaired by network bottlenecks could result in physical damage to expensive instrument equipment. Hence, it is vital to understand the interplay between (a) user keyboard/mouse actions toward the instrument and (b) corresponding network reactions for the transfer of instrument video images toward the user. In this paper, we first present an analytical model that characterizes the user and network interplay during remote instrumentation sessions in terms of the demand and supply principles of traditional economics. Next, we describe the trends of the model parameters using subjective and objective measurements obtained from QoE experiments. Thereafter, we describe our Remote Instrumentation Collaboration Environment (RICE) software, which leverages our experience from the user and network interplay studies and provides functionality that facilitates reliable and efficient remote instrumentation, such as (a) network health awareness to detect network bottleneck periods and (b) collaboration tools that let multiple participants interact during research and training sessions.
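The record itself gives no implementation details for the "network health awareness" feature. As a rough illustration only, the following minimal Python sketch shows one plausible shape such a monitor could take: periodically probe round-trip time to the instrument host and flag bottleneck periods when latency or probe loss crosses a threshold. The probing method, the thresholds, the window size, and the host name are all assumptions for illustration, not taken from the paper.

```python
# Minimal sketch (not the authors' implementation) of detecting network
# bottleneck periods by probing the path to the instrument host.
import socket
import time
from collections import deque

RTT_THRESHOLD_S = 0.150  # assumed: RTTs above 150 ms hurt interactivity
LOSS_THRESHOLD = 0.10    # assumed: >10% failed probes in the window
WINDOW = 20              # sliding window of recent probes

def probe_rtt(host: str, port: int = 80, timeout: float = 1.0):
    """Measure one TCP-connect round-trip time; None on failure."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

def monitor(host: str, interval: float = 1.0) -> None:
    """Continuously classify the path as HEALTHY or BOTTLENECK."""
    samples = deque(maxlen=WINDOW)
    while True:
        samples.append(probe_rtt(host))
        rtts = [s for s in samples if s is not None]
        loss = 1.0 - len(rtts) / len(samples)
        avg_rtt = sum(rtts) / len(rtts) if rtts else float("inf")
        state = ("BOTTLENECK"
                 if loss > LOSS_THRESHOLD or avg_rtt > RTT_THRESHOLD_S
                 else "HEALTHY")
        print(f"{state}: avg_rtt={avg_rtt:.3f}s loss={loss:.0%}")
        time.sleep(interval)

if __name__ == "__main__":
    monitor("example.org")  # hypothetical instrument host
```

In a remote instrumentation client, a BOTTLENECK classification could be used to warn the user before they issue mouse/keyboard commands against a stale video frame, which is the failure mode the abstract describes.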
ISSN: 1687-5680, 1687-5699