Food-Agnostic Dish Detection: A Simple Baseline
Published in: IEEE Access, 2021, Vol. 9, pp. 125375-125383
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Summary: At present, mainstream automatic restaurant pricing systems rely on deep-learning-based object detection to locate each plate and identify its category. Reaching accuracy adequate for practical use requires collecting and labeling large numbers of plate images containing many kinds of food, which increases labor and cost. This paper notes that detecting and identifying a plate differs from detecting a conventional object: a plate is a container in which arbitrary, unknown foods and items may be placed, so fine-grained detection and identification must rely on the plate's shape, edge, and color. To improve the plate recognition rate, this paper releases, for the first time, a dataset named EP-20 containing images of empty plates without food, along with a test dataset named EP-Test collected in real usage scenes. In addition, this paper proposes a data augmentation method based on the attention mechanism, in which the interiors of the empty-plate images are filled. This guides the neural network to attend to and learn more of the features of the plate's edge and color, and to learn less from the food inside the plate, so that the edge and color features serve as the plate's features. The method achieves a highest accuracy of 89.63%, an improvement of up to 58% over not using it.
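The abstract only sketches the augmentation ("images of the empty plate are filled"), so the following is a minimal illustrative sketch, assuming "filling" means pasting arbitrary content into the plate interior while keeping the plate's edge visible. The function name `fill_plate_interior`, the elliptical interior mask, and the `shrink` parameter are assumptions for illustration, not the authors' released code.

```python
import numpy as np
import cv2


def fill_plate_interior(empty_plate_img, filler_img, shrink=0.8):
    """Paste arbitrary 'filler' content into the interior of an
    empty-plate image, leaving the plate's edge visible, so a detector
    trained on the result must rely on edge and color cues rather than
    on the plate's contents.

    The elliptical mask below is a crude placeholder; a real pipeline
    would segment the actual plate region.
    """
    h, w = empty_plate_img.shape[:2]

    # Crude interior mask: an ellipse shrunk away from the image border.
    mask = np.zeros((h, w), dtype=np.uint8)
    cv2.ellipse(mask, (w // 2, h // 2),
                (int(w * shrink / 2), int(h * shrink / 2)),
                0, 0, 360, 255, thickness=-1)

    # Resize the filler to the plate image and composite it inside the mask.
    filler = cv2.resize(filler_img, (w, h))
    out = empty_plate_img.copy()
    out[mask == 255] = filler[mask == 255]
    return out
```

Under this reading, each EP-20 empty-plate image would be paired with many random filler crops during training, so the same plate appears with many different contents and the network is pushed toward the plate's own edge and color features.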
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3108184