Parallel implementation of the time-evolving block decimation algorithm for the Bose–Hubbard model
Published in: Computer Physics Communications, February 2016, Vol. 199, pp. 170–177
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: A system of ultracold atoms in an optical lattice represents a powerful experimental setup for testing the fundamentals of quantum mechanics. While its microscopic interaction mechanisms are well understood, the system's behavior is difficult to simulate even for a moderate number of particles due to the high dimension of its many-body space. This article presents TEBDOL, a parallel implementation of the time-evolving block decimation (TEBD) algorithm that can efficiently simulate the time evolution of a one-dimensional chain of atoms in an optical lattice. We investigate the parallelization strategy and the strong and weak scaling with the number of processes.
Program title: TEBDOL
Catalogue identifier: AEYN_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEYN_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: GNU General Public License v3.0
No. of lines in distributed program, including test data, etc.: 9060
No. of bytes in distributed program, including test data, etc.: 54338
Distribution format: tar.gz
Programming language: Common Lisp.
Computer: x86-64.
Operating system: Linux.
Has the code been vectorized or parallelized?: Parallelized using MPI
RAM: 1–64 GB
Classification: 7.7.
External routines: Basic Linear Algebra Subprograms (BLAS), Linear Algebra Package (LAPACK), Message Passing Interface (MPI)
Nature of problem: A system of neutral atoms in an optical lattice is a many-body quantum system that can be described by the Bose–Hubbard model. The Hilbert space dimension of such many-body models grows exponentially with the number of particles. Simulating time evolution in the Bose–Hubbard model is therefore hard even for a moderate number of particles.
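For orientation (the formulas below are standard textbook material, not part of the program summary itself), the single-species Bose–Hubbard Hamiltonian and the dimension of the Hilbert space of N bosons on L lattice sites are

$$
H = -J \sum_{\langle i,j \rangle} \left( \hat b_i^\dagger \hat b_j + \hat b_j^\dagger \hat b_i \right)
  + \frac{U}{2} \sum_i \hat n_i \left( \hat n_i - 1 \right),
\qquad
\dim \mathcal{H} = \binom{N + L - 1}{N}.
$$

For N = L = 20 this already gives roughly 7 × 10^10 basis states, which is why a direct treatment of the full many-body space becomes infeasible at moderate particle numbers.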
Solution method: A system state is represented by a tensor network. Its time evolution is then simulated using the time-evolving block decimation (TEBD) algorithm.
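TEBDOL itself is written in Common Lisp; the following Python/NumPy sketch (the function and variable names are my own, not TEBDOL's) only illustrates the elementary TEBD step the summary refers to: a two-site Trotter gate is applied to neighbouring matrix-product-state tensors, and the pair is re-split with a truncated singular value decomposition.

```python
import numpy as np

def apply_two_site_gate(A1, A2, gate, chi_max):
    """Apply a two-site gate to neighbouring MPS tensors and re-split them.

    A1 has shape (chiL, d, chiM), A2 has shape (chiM, d, chiR), and `gate`
    is the (d*d, d*d) matrix exp(-i*H_bond*dt) of one Trotter step.
    Illustrative sketch only; the bond dimension is capped at chi_max.
    """
    chiL, d, _ = A1.shape
    chiR = A2.shape[2]
    # contract the shared bond: theta has legs (chiL, d, d, chiR)
    theta = np.tensordot(A1, A2, axes=[2, 0])
    # apply the gate on the two physical legs
    theta = np.tensordot(gate.reshape(d, d, d, d), theta, axes=[[2, 3], [1, 2]])
    # reorder to (chiL, d, d, chiR) and fuse into a matrix for the SVD
    theta = theta.transpose(2, 0, 1, 3).reshape(chiL * d, d * chiR)
    U, S, Vh = np.linalg.svd(theta, full_matrices=False)
    # truncate: keep at most chi_max singular values, renormalise the rest
    chi = min(chi_max, int(np.sum(S > 1e-12)))
    U, S, Vh = U[:, :chi], S[:chi], Vh[:chi, :]
    S /= np.linalg.norm(S)
    A1_new = U.reshape(chiL, d, chi)
    A2_new = (S[:, None] * Vh).reshape(chi, d, chiR)
    return A1_new, A2_new
```

Sweeping such updates over alternating even and odd bonds yields one Trotter time step; updates on disjoint bonds are independent of each other, which is what makes the algorithm amenable to the MPI parallelization mentioned above.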
Restrictions: TEBDOL is limited to one-dimensional systems. The times accessible in the simulations are restricted by the growth of entanglement in the system.
Unusual features: Tensor networks in TEBDOL support a global Abelian symmetry, i.e., the program conserves the total number of particles. Models with multiple particle species are supported as well. TEBDOL is implemented in Common Lisp and can run in parallel on a computer cluster.
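To illustrate why the particle-number symmetry can be exploited (again a hypothetical Python/NumPy sketch, not TEBDOL code): the two-site Bose–Hubbard Trotter gate never mixes sectors of different total occupation, so the gate and the tensors it acts on decompose into blocks labelled by particle number.

```python
import numpy as np

def bond_gate(J, U, dt, n_max):
    """Two-site Bose-Hubbard Trotter gate exp(-i*dt*H_bond) with at most
    n_max bosons per site.  Illustrative sketch only."""
    d = n_max + 1
    b = np.diag(np.sqrt(np.arange(1, d)), 1)       # annihilation operator
    n = np.diag(np.arange(d, dtype=float))         # number operator
    eye = np.eye(d)
    H = (-J * (np.kron(b.conj().T, b) + np.kron(b, b.conj().T))
         + 0.5 * U * (np.kron(n @ (n - eye), eye) + np.kron(eye, n @ (n - eye))))
    w, V = np.linalg.eigh(H)                       # H is Hermitian
    return (V * np.exp(-1j * dt * w)) @ V.conj().T

def gate_conserves_total_number(gate, n_max):
    """Matrix elements between sectors of different total occupation vanish,
    so the gate is block-diagonal and each block can be handled separately."""
    d = n_max + 1
    total = np.add.outer(np.arange(d), np.arange(d)).ravel()
    return np.allclose(gate[total[:, None] != total[None, :]], 0.0)
```

For example, `gate_conserves_total_number(bond_gate(1.0, 2.0, 0.05, 3), 3)` returns `True`.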
Running time: Running time depends on the lattice size, on the number of particles, and on the maximal allowed tensor dimension.
ISSN: 0010-4655, 1879-2944
DOI: 10.1016/j.cpc.2015.10.016