Rapid prediction of lab-grown tissue properties using deep learning
Published in: | Physical Biology, 2023-11, Vol. 20 (6), p. 66005 |
Main authors: | , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | The interactions between cells and the extracellular matrix are vital for the self-organisation of tissues. In this paper we present a proof-of-concept for using machine learning tools to predict the role of this mechanobiology in the self-organisation of cell-laden hydrogels grown in tethered moulds. We develop a process for the automated generation of mould designs with and without key symmetries. We create a large training set of N = 6400 cases by running detailed biophysical simulations of cell–matrix interactions, using the contractile network dipole orientation model for the self-organisation of cellular hydrogels within these moulds. These are used to train an implementation of the pix2pix deep learning model, with an additional 100 cases, unseen during training, reserved for testing of the trained model. Comparison between the predictions of the machine learning technique and the reserved predictions from the biophysical algorithm shows that the machine learning algorithm makes excellent predictions. The machine learning algorithm is significantly faster than the biophysical method, opening the possibility of very high-throughput rational design of moulds for pharmaceutical testing, regenerative medicine and fundamental studies of biology. Future extensions for scaffolds and 3D bioprinting will open additional applications. |
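The abstract describes training a pix2pix model to map mould-design images to predicted self-organisation fields. The following is a minimal sketch of a pix2pix-style conditional image-to-image training step in PyTorch, not the authors' implementation: the network sizes, channel counts, image resolution and the `TinyGenerator`/`PatchDiscriminator` classes are illustrative assumptions; the record only states that an implementation of pix2pix was trained on the simulated cases.

```python
# Sketch of a pix2pix-style conditional image-to-image setup (adversarial + L1 loss).
# All shapes and hyperparameters are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Small encoder-decoder standing in for the pix2pix U-Net generator."""
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, out_ch, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class PatchDiscriminator(nn.Module):
    """PatchGAN-style discriminator on concatenated (input, output) image pairs."""
    def __init__(self, in_ch=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=1, padding=1),  # per-patch real/fake logits
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1))

def train_step(gen, disc, opt_g, opt_d, mould, target, l1_weight=100.0):
    """One pix2pix-style update: adversarial loss plus L1 reconstruction loss."""
    bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

    # Discriminator update: real pairs vs. detached generator output.
    fake = gen(mould).detach()
    d_real, d_fake = disc(mould, target), disc(mould, fake)
    loss_d = 0.5 * (bce(d_real, torch.ones_like(d_real)) +
                    bce(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: fool the discriminator and stay close to the simulation target.
    fake = gen(mould)
    d_fake = disc(mould, fake)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * l1(fake, target)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

if __name__ == "__main__":
    gen, disc = TinyGenerator(), PatchDiscriminator()
    opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4, betas=(0.5, 0.999))
    # Random tensors stand in for (mould design, simulated self-organisation) pairs.
    mould = torch.rand(4, 1, 64, 64)
    target = torch.rand(4, 1, 64, 64) * 2 - 1  # in [-1, 1] to match the Tanh output
    print(train_step(gen, disc, opt_g, opt_d, mould, target))
```

In the workflow the abstract describes, the simulated pairs would replace the random tensors, with roughly 6400 pairs used for training and 100 held out for testing against the biophysical predictions.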
ISSN: | 1478-3967, 1478-3975 |
DOI: | 10.1088/1478-3975/ad0019 |