Retaining the lessons from past for better performance in a dynamic multiple task environment
Format: Conference paper
Language: English
Abstract: Human beings learn to do a task and then go on to learn other tasks. However, they do not forget their previous learning; if the need arises, they can call upon it and do not have to relearn from scratch. In this paper, we build upon our earlier work, in which we presented a mechanism for learning multiple tasks in a dynamic environment where the tasks can change arbitrarily, without any warning to the learning agents. The main feature of the mechanism is that a percentage of the learning agents is periodically made to reset its previous learning and restart learning from scratch. Thus there is always a sub-population that can learn a new task, whenever a task change occurs, without being hampered by previous learning; the new learning then spreads to the other members of the population. In our current work we experiment with incorporating an archive that preserves strategies which have performed well. The archived strategies are tested from time to time in the current environment; if the current task is the same as the task for which a strategy was first discovered, that strategy rapidly comes into use across the whole population. We present the criteria by which strategies are selected for storage in the archive, the policy for deleting strategies when the archive's limited space is exhausted, and the mechanism for selecting archived strategies for use in the current environment.
ISSN: 1089-778X, 1941-0026
DOI: 10.1109/CEC.2009.4983062
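
The abstract above describes two algorithmic ingredients: periodic resetting of a fraction of the learning agents, and an archive of well-performing strategies that is re-tested against the current environment. The following Python sketch illustrates one way these pieces could fit together. All names (Strategy, Archive, probe, generation_step, evaluate) and the concrete policies shown (a fitness threshold for storage, evicting the weakest entry when the archive is full, re-injecting a revived strategy over the worst population member) are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the reset-and-archive mechanism described in the abstract.
# Names and policies are assumptions for illustration, not the authors' design.
import random
from dataclasses import dataclass


@dataclass
class Strategy:
    params: list[float]            # learned strategy representation
    archived_fitness: float = 0.0  # fitness recorded when the strategy was archived


class Archive:
    """Fixed-capacity store of strategies that have performed well."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries: list[Strategy] = []

    def maybe_store(self, strategy: Strategy, fitness: float, threshold: float) -> None:
        # Storage criterion (assumed): archive a strategy only if its fitness
        # on the task it was discovered for exceeds a threshold.
        if fitness < threshold:
            return
        if len(self.entries) >= self.capacity:
            # Deletion policy (assumed): drop the weakest entry when the archive is full.
            self.entries.remove(min(self.entries, key=lambda s: s.archived_fitness))
        self.entries.append(Strategy(list(strategy.params), fitness))

    def probe(self, evaluate) -> Strategy | None:
        # Re-test archived strategies in the *current* environment; the best
        # scorer is a candidate for re-injection into the population.
        return max(self.entries, key=evaluate, default=None)


def generation_step(population: list[Strategy], archive: Archive, evaluate,
                    generation: int, reset_fraction: float = 0.1) -> None:
    # Periodic reset (from the authors' earlier mechanism): a fraction of the
    # agents restarts learning, so a naive sub-population is always available
    # to pick up a new task after an unannounced change.
    if generation % 10 == 0:
        for agent in random.sample(population, int(reset_fraction * len(population))):
            agent.params = [random.uniform(-1.0, 1.0) for _ in agent.params]

    # Archive probing: if an archived strategy still scores well on the current
    # task (i.e. the task has recurred), seed it back into the population so it
    # can spread to the other members.
    revived = archive.probe(evaluate)
    if revived is not None and evaluate(revived) > max(evaluate(a) for a in population):
        worst = min(population, key=evaluate)
        worst.params = list(revived.params)
```

In this sketch the archive is probed every generation; the abstract only says archived strategies are tested "from time to time", so the probing schedule, like the reset period and the storage threshold, is a free parameter of the mechanism.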