A New Algorithm for Displaying Images With High Resolution Using a Directional Volumetric Display With Threads and a Projector


Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, pp. 15288-15297
Main Authors: Imamura, Tomoya; Baba, Mitsuru; Hoshikawa, Naoto; Nakayama, Hirotaka; Ito, Tomoyoshi; Shiraki, Atsushi
Format: Article
Language: English
Online Access: Full text
Abstract: Using threads and a projector, a directional volumetric display capable of showing moving, full-color images was developed in our previous work. However, the horizontal resolution of that display reached only 20 pixels with 400 threads, because the conventional algorithm requires P squared threads to represent P horizontal pixels of the input image, and placing a very large number of threads is impractical. Therefore, a new algorithm for generating the projected images was proposed to improve the display's resolution. With this technique, an image of P pixels can be displayed with as few as P threads. However, the higher the resolution, the lower the image quality under the proposed algorithm, so we verified how many threads per pixel are needed to display high-resolution images without degrading image quality. By allotting 5-6 threads per horizontal pixel of the input image, high-resolution images can be displayed while maintaining image quality. The proposed technique can display 64-pixel images with 384 threads, whereas the conventional method can display only 20-pixel input images with 400 threads.
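The thread counts quoted in the abstract can be checked with a short sketch. The Python snippet below is not from the paper; it only encodes the relationships stated above, with 6 threads per pixel assumed from the 5-6 range given for the proposed algorithm.

# Minimal sketch of the thread budgets implied by the abstract (not the authors' code).

def conventional_threads(pixels: int) -> int:
    # Conventional algorithm: P squared threads for P horizontal pixels.
    return pixels ** 2

def proposed_threads(pixels: int, threads_per_pixel: int = 6) -> int:
    # Proposed algorithm: roughly 5-6 threads per pixel (6 assumed here).
    return pixels * threads_per_pixel

if __name__ == "__main__":
    print(conventional_threads(20))  # 400 threads for a 20-pixel image
    print(proposed_threads(64))      # 384 threads for a 64-pixel image

Under these stated relationships, doubling the horizontal resolution roughly doubles the required thread count for the proposed algorithm but quadruples it for the conventional one.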
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3148391