A general-purpose organic gel computer that learns by itself


Bibliographic Details
Published in: Neuromorphic Computing and Engineering, 2023-12, Vol. 3 (4), p. 44007
Main authors: Sahoo, Pathik; Singh, Pushpendra; Saxena, Komal; Ghosh, Subrata; Singh, R P; Benosman, Ryad; Hill, Jonathan P; Nakayama, Tomonobu; Bandyopadhyay, Anirban
Format: Article
Language: English
Description
Abstract: To build energy-minimized superstructures, self-assembling molecules explore an astronomical number of options, colliding at ∼10⁹ molecules s⁻¹. Thus far, no computer has fully exploited this process to optimize choices and execute advanced computational theories solely by synthesizing supramolecules. To realize this, we first remotely re-wrote the problem in a language that supramolecular synthesis comprehends. Then, an all-chemical neural network synthesizes one helical nanowire for each periodic event. These nanowires self-assemble into gel fibers that map intricate relations between periodic events in any data type; the output is read instantly from an optical hologram. For each problem, the number of self-assembling layers, i.e. the neural-network depth, is optimized to chemically simulate theories that discover invariants for learning. Subsequently, synthesis alone solves classification and feature-learning problems instantly with single-shot training. The reusable gel initiates general-purpose computing that chemically invents suitable models for problem-specific unsupervised learning. Irrespective of problem complexity, and with computing time and power held fixed, the gel promises a toxic-hardware-free world. One-sentence summary: fractally coupled deep-learning networks revisit Rosenblatt's 1950s theorem on deep learning networks.
ISSN: 2634-4386
DOI: 10.1088/2634-4386/ad0fec