A simulation study of bursty data traffic with hybrid source smoothing in an ATM node
Published in: Computers & industrial engineering, 1993-09, Vol. 25 (1), p. 151-154
Main authors: ,
Format: Article
Language: English
Online access: Full text
Abstract: In this paper, we study the mean delay and maximum buffer requirements at different levels of burstiness for highly bursty data traffic in an ATM node. This performance study is carried out with an event-driven simulation program that considers both real-time and data traffic. We assume that data traffic is loss-sensitive: a large buffer (fat bucket) is allocated to data traffic to accommodate sudden long bursts of cells. Real-time traffic is delay-sensitive, so we impose input traffic shaping on it using a leaky-bucket based input rate control method. Channel capacity is allocated according to the average arrival rate of each input source to maximize channel utilization. Simulation results show that both the maximum buffer requirement and the mean node delay for data traffic are directly proportional to the burstiness of its input traffic. Results for the mean node delay and cell loss probability of real-time traffic are also analyzed. The simulation program is written in C++ and has been verified using the zero-mean-statistics concept, by comparing simulation results against known theoretical or observed results.
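The leaky-bucket input rate control mentioned in the abstract can be sketched as below. This is a minimal illustrative version in C++ (the paper's implementation language); the class name, parameters, and the policy of rejecting non-conforming cells are assumptions for illustration, not details taken from the paper:

```cpp
#include <algorithm>

// Illustrative leaky-bucket rate controller (hypothetical sketch, not the
// paper's code). The bucket leaks at a fixed drain rate; each arriving cell
// adds one unit of "fluid". A cell that would overflow the bucket is
// non-conforming and is rejected.
class LeakyBucket {
public:
    // drain_rate: cells drained per unit time; depth: bucket capacity in cells
    LeakyBucket(double drain_rate, double depth)
        : drain_rate_(drain_rate), depth_(depth), level_(0.0), last_time_(0.0) {}

    // Returns true if a cell arriving at time t conforms and is admitted.
    bool admit(double t) {
        // Leak the bucket for the time elapsed since the last arrival.
        level_ = std::max(0.0, level_ - (t - last_time_) * drain_rate_);
        last_time_ = t;
        if (level_ + 1.0 > depth_) return false;  // non-conforming cell
        level_ += 1.0;                            // account for this cell
        return true;
    }

private:
    double drain_rate_;  // long-term admitted rate (cells per unit time)
    double depth_;       // burst tolerance (cells)
    double level_;       // current bucket fill
    double last_time_;   // time of the previous arrival
};
```

For example, a bucket with drain rate 1 cell per unit time and depth 2 admits two back-to-back cells, rejects an immediate third, and admits again once enough time has passed for the bucket to drain.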
ISSN: 0360-8352, 1879-0550
DOI: 10.1016/0360-8352(93)90243-Q