Modeling collective behaviors from optic flow and retinal cues
Published in: Physical Review Research, 2024-04, Vol. 6 (2), p. 023016, Article 023016
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: Animal collective behavior is often modeled with self-propelled particles, assuming each individual has “omniscient” knowledge of its neighbors. Yet, neighbors may be hidden from view, and we do not know the effect of this information loss. To address this question, we propose a visual model of collective behavior where each particle moves according to bioplausible visual cues, in particular the optic flow. This visual model successfully reproduces three classical collective behaviors: swarming, schooling, and milling. This model offers a potential solution for controlling artificial swarms visually.
ISSN: 2643-1564
DOI: 10.1103/PhysRevResearch.6.023016
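
The abstract describes a class of self-propelled-particle models in which each agent steers from retinal and optic-flow cues rather than from omniscient knowledge of neighbor positions. The sketch below is a minimal illustration of that class of model, not the model published in the article: agents turn toward the weighted mean bearing of visible neighbors, and the visibility radius, the 1/distance weighting (a crude stand-in for optic-flow salience), and all gains are assumptions chosen for illustration.

```python
# Minimal sketch of a bearing-driven self-propelled-particle simulation.
# The steering rule and all parameters are illustrative assumptions,
# not the published model of Phys. Rev. Research 6, 023016.
import numpy as np

N = 100          # number of particles
v0 = 1.0         # constant self-propulsion speed
dt = 0.05        # time step
L = 20.0         # periodic box size
k_turn = 0.5     # gain on the visual turning cue (assumed)
noise = 0.05     # rotational noise amplitude (assumed)
rng = np.random.default_rng(0)

pos = rng.uniform(0.0, L, size=(N, 2))
theta = rng.uniform(-np.pi, np.pi, size=N)

def step(pos, theta):
    # Pairwise displacements with periodic boundary conditions
    d = pos[None, :, :] - pos[:, None, :]
    d -= L * np.round(d / L)
    dist = np.linalg.norm(d, axis=-1)
    np.fill_diagonal(dist, np.inf)

    # Bearing of neighbor j in the body frame of agent i: the kind of
    # retinal (angular) information a purely visual agent has access to.
    bearing = np.arctan2(d[..., 1], d[..., 0]) - theta[:, None]
    bearing = (bearing + np.pi) % (2 * np.pi) - np.pi

    # Crude optic-flow stand-in: weight nearer visible neighbors more
    # strongly (1/distance), within an assumed visibility radius.
    weight = np.where(dist < 5.0, 1.0 / dist, 0.0)

    # Turn toward the weighted mean bearing of visible neighbors.
    wsum = weight.sum(axis=1)
    cue = (weight * bearing).sum(axis=1) / np.maximum(wsum, 1e-12)
    dtheta = k_turn * cue * dt + noise * np.sqrt(dt) * rng.standard_normal(N)

    theta = theta + dtheta
    vel = v0 * np.column_stack((np.cos(theta), np.sin(theta)))
    pos = (pos + vel * dt) % L
    return pos, theta

for _ in range(2000):
    pos, theta = step(pos, theta)

# Polar order parameter: near 1 for aligned (schooling-like) motion,
# near 0 for disordered swarming or rotating milling states.
phi = np.linalg.norm(np.column_stack((np.cos(theta), np.sin(theta))).mean(axis=0))
print(f"polar order = {phi:.2f}")
```

With only bearing information and a short-range visibility cutoff, rules of this kind can settle into cohesive or rotating states depending on the gains; the polar order parameter printed at the end distinguishes aligned motion (near 1) from disordered motion (near 0).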