Informatics in radiology (infoRAD): introduction to the language of three-dimensional imaging with multidetector CT
Saved in:
Published in: Radiographics 2005-09, Vol. 25 (5), p. 1409-1428
Main authors: , , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: The recent proliferation of multi-detector row computed tomography (CT) has led to an increase in the creation and interpretation of images in planes other than the traditional axial plane. Powerful three-dimensional (3D) applications improve the utility of detailed CT data but also create confusion among radiologists, technologists, and referring clinicians when trying to describe a particular method or type of image. Designing examination protocols that optimize data quality and radiation dose to the patient requires familiarity with the concepts of beam collimation and section collimation as they apply to multi-detector row CT. A basic understanding of the time-limited nature of projection data and of the need for thin-section axial reconstruction for 3D applications is necessary to use the available data effectively in clinical practice. The axial reconstruction data can be used to create nonaxial two-dimensional images by means of multiplanar reformation. Multiplanar images can be thickened into slabs with projectional techniques such as average, maximum, and minimum intensity projection; ray sum; and volume rendering. By assigning a full spectrum of opacity values and applying color through the tissue classification system, volume rendering provides a robust and versatile data set for advanced imaging applications.
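To make the projectional vocabulary in the abstract concrete, the following is a minimal sketch (not taken from the article) of how a slab of thin-section axial data can be collapsed into maximum, minimum, or average intensity projections or a simple ray sum, plus a toy piecewise-linear opacity ramp of the kind used in volume rendering. The synthetic `volume` array and the helper names `coronal_slab` and `opacity` are illustrative assumptions, and the Hounsfield thresholds are arbitrary.

```python
import numpy as np

# Hypothetical stand-in for a stack of thin-section axial reconstructions,
# indexed as (z, y, x) and expressed in Hounsfield units.
rng = np.random.default_rng(0)
volume = rng.integers(-1000, 2000, size=(120, 256, 256)).astype(np.int16)

def coronal_slab(vol, y_start, thickness, mode="mip"):
    """Collapse a coronal slab of axial data into a single 2D image.

    mode: 'mip'    -> maximum intensity projection
          'minip'  -> minimum intensity projection
          'avg'    -> average intensity projection
          'raysum' -> unweighted sum along the slab (ray sum)
    """
    slab = vol[:, y_start:y_start + thickness, :]  # (z, thickness, x)
    if mode == "mip":
        return slab.max(axis=1)
    if mode == "minip":
        return slab.min(axis=1)
    if mode == "avg":
        return slab.mean(axis=1)
    if mode == "raysum":
        return slab.sum(axis=1)
    raise ValueError(f"unknown projection mode: {mode}")

def opacity(hu, low=-300.0, high=300.0):
    """Toy piecewise-linear opacity ramp: fully transparent below `low` HU,
    fully opaque above `high` HU, linear in between."""
    return np.clip((hu - low) / (high - low), 0.0, 1.0)

mip_image = coronal_slab(volume, y_start=100, thickness=20, mode="mip")
print(mip_image.shape, opacity(np.array([-500.0, 0.0, 500.0])))
```

In this sketch the slab thickness plays the role of the reformatted-slab thickness described in the abstract, while the opacity ramp illustrates, in simplified form, how volume rendering assigns a full spectrum of opacity values rather than the all-or-nothing selection used by intensity projections.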
ISSN: 1527-1323
DOI: 10.1148/rg.255055044