Higher Order Transformers: Enhancing Stock Movement Prediction On Multimodal Time-Series Data

In this paper, we tackle the challenge of predicting stock movements in financial markets by introducing Higher Order Transformers, a novel architecture designed for processing multivariate time-series data. We extend the self-attention mechanism and the transformer architecture to a higher order, effectively capturing complex market dynamics across time and variables. To manage computational complexity, we propose a low-rank approximation of the potentially large attention tensor using tensor decomposition and employ kernel attention, reducing complexity to linear with respect to the data size. Additionally, we present an encoder-decoder model that integrates technical and fundamental analysis, utilizing multimodal signals from historical prices and related tweets. Our experiments on the Stocknet dataset demonstrate the effectiveness of our method, highlighting its potential for enhancing stock movement prediction in financial markets.
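
The record contains no code, but the abstract names two concrete mechanisms: attention applied along both the time axis and the variable axis of a multivariate series, and kernel feature maps that make each attention pass linear in sequence length. The sketch below illustrates that combination in PyTorch. It is a minimal illustration only, not the authors' implementation: the class name KernelAxialAttention, the elu-plus-one feature map, and all shapes are assumptions, and the paper's actual tensor decomposition and encoder-decoder fusion of prices and tweets are not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

def feature_map(x):
    # elu(x) + 1: a positive feature map commonly used for kernelized (linear) attention.
    return F.elu(x) + 1

class KernelAxialAttention(nn.Module):
    # Hypothetical sketch: linear-time attention applied along one axis of a
    # (batch, time, variables, dim) tensor. Attending over each axis separately
    # stands in for the factorized higher-order attention the abstract describes.
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x, axis):
        x = x.movedim(axis, -2)                    # bring the attended axis next to the feature dim
        shape = x.shape
        x = x.reshape(-1, shape[-2], shape[-1])    # (batch', length, dim)
        q, k, v = feature_map(self.q(x)), feature_map(self.k(x)), self.v(x)
        kv = torch.einsum('bld,ble->bde', k, v)    # aggregate keys/values first: O(length * dim^2)
        z = 1.0 / (torch.einsum('bld,bd->bl', q, k.sum(dim=1)) + 1e-6)  # attention normalizer
        out = torch.einsum('bld,bde,bl->ble', q, kv, z)
        return out.reshape(shape).movedim(-2, axis)

# Toy usage: batch of 4, 32 time steps, 8 stocks (variables), 16-dim features.
x = torch.randn(4, 32, 8, 16)
attn = KernelAxialAttention(dim=16)
h = attn(x, axis=1)   # attention over time
h = attn(h, axis=2)   # attention over variables
print(h.shape)        # torch.Size([4, 32, 8, 16])

Running attention separately over each axis keeps the cost linear in both the number of time steps and the number of variables, which is the practical payoff of the low-rank and kernel-attention combination described in the abstract.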

Bibliographic Details
Main Authors: Omranpour, Soroush; Rabusseau, Guillaume; Rabbany, Reihaneh
Format: Article
Language: English
Published: 2024-12-13
Subjects: Computer Science - Learning; Quantitative Finance - Statistical Finance
DOI: 10.48550/arxiv.2412.10540
Source: arXiv.org
Online Access: https://arxiv.org/abs/2412.10540