Image and dataset based nutrition estimation for hospitalized patients using AI

Bibliographic Details
Main Authors: Sathana, V., Sathya, M., Swathi, R., Swetha, J., Vaishnavi, S.
Format: Conference Proceedings
Language: English
Subjects:
Online Access: Full text
Description
Summary: The food nutrition intake calculator detects the nutrition level of a hospitalized patient's food. The objective of the system is to analyze images of the food that patients take in the hospital and to determine whether each nutrient is present at a higher or a lower level. Images of the food the patients have eaten in the hospital are collected and analyzed with image-processing methods: the images are classified with an SVM algorithm, and a Gaussian smoothing algorithm is used to detect the outline of the food in each image. The predicted nutrition level of the food is measured in grams. Food information collected from the hospital is stored in a database, and each patient's food-intake image is compared against this database. The result of the image-processing stage is then converted into a dataset, which is classified with a few-shot learning algorithm; the system also reports the patient's recovery details. In the future, the food nutrition intake process will need further refinement. Nonetheless, the difficulty lies in the fact that current annotation requirements inherently limit the quality and the size of the food-image database available for nutrient intake assessment.
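
The abstract names Gaussian smoothing for outline detection followed by SVM classification of the food images. Below is a minimal Python sketch of that kind of pipeline, assuming scipy and scikit-learn; the image sizes, the random stand-in data, the class labels, and the choice of flattened edge maps as features are illustrative assumptions, not details from the paper.

# Hedged sketch of the image-processing stage the abstract describes:
# Gaussian smoothing to suppress noise before outline (edge) detection,
# then an SVM trained on flattened outline maps.
import numpy as np
from scipy import ndimage
from sklearn.svm import SVC

def outline(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Smooth a grayscale image with a Gaussian kernel, then return the
    gradient magnitude as a simple outline/edge map."""
    smoothed = ndimage.gaussian_filter(image, sigma=sigma)
    gx = ndimage.sobel(smoothed, axis=0)
    gy = ndimage.sobel(smoothed, axis=1)
    return np.hypot(gx, gy)

# Toy training data standing in for grayscale food photos (assumption).
rng = np.random.default_rng(0)
images = rng.random((20, 32, 32))
labels = rng.integers(0, 2, size=20)   # e.g. 0 = "rice", 1 = "bread" (assumed classes)

features = np.stack([outline(img).ravel() for img in images])
clf = SVC(kernel="rbf").fit(features, labels)

query = rng.random((32, 32))           # a new food-intake photo
print(clf.predict(outline(query).ravel()[None, :]))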
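
The abstract also states that food information collected from the hospital is stored in a database, that the recognized intake is compared against it, and that nutrition levels are measured in grams. A hedged sketch of such a lookup follows; the table layout, the nutrient columns, and the protein threshold are assumptions made for illustration.

# Hedged sketch: matching a predicted food label against a hospital
# nutrition database and reporting nutrient amounts in grams.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE foods (name TEXT, protein_g REAL, carbs_g REAL)")
conn.executemany("INSERT INTO foods VALUES (?, ?, ?)",
                 [("rice", 2.7, 28.0), ("bread", 9.0, 49.0)])

def nutrient_level(food: str, protein_threshold_g: float = 5.0) -> str:
    """Look up a food's nutrients and flag a higher/lower protein level
    relative to an assumed threshold (not from the paper)."""
    row = conn.execute(
        "SELECT protein_g, carbs_g FROM foods WHERE name = ?", (food,)
    ).fetchone()
    if row is None:
        return "unknown food"
    protein, carbs = row
    level = "higher" if protein >= protein_threshold_g else "lower"
    return f"{food}: {protein} g protein, {carbs} g carbs ({level} protein level)"

print(nutrient_level("bread"))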
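
Finally, the dataset derived from the image-processing results is said to be classified with a few-shot learning algorithm, but the abstract does not specify which one. A nearest-class-prototype rule is one minimal instance of few-shot classification; the sketch below uses it with illustrative per-meal nutrient features in grams.

# Hedged sketch of a few-shot classifier: nearest class prototype over a
# small labelled support set. Feature vectors and class names are assumed.
import numpy as np

# Support set: a handful of labelled examples per class (few-shot regime).
support = {
    "adequate intake":  np.array([[20.0, 60.0], [18.0, 55.0]]),  # [protein_g, carbs_g]
    "deficient intake": np.array([[6.0, 25.0], [8.0, 30.0]]),
}
prototypes = {label: feats.mean(axis=0) for label, feats in support.items()}

def classify(meal: np.ndarray) -> str:
    """Assign a meal to the class whose prototype is nearest (Euclidean)."""
    return min(prototypes, key=lambda c: np.linalg.norm(meal - prototypes[c]))

print(classify(np.array([7.0, 28.0])))   # -> "deficient intake"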
ISSN: 0094-243X (print); 1551-7616 (online)
DOI: 10.1063/5.0173223