A Hybrid Quantum-Classical Model for Stock Price Prediction Using Quantum-Enhanced Long Short-Term Memory
Figure 1. Schematic of the internal structure of a classical LSTM cell.
Figure 2. Four different methods for integrating ML with QC.
Figure 3. Angle-encoding quantum circuit.
Figure 4. The overall architecture of the proposed QLSTM for stock closing-price prediction.
Figure 5. A QLSTM cell, in which VQCs replace the LSTM gates.
Figure 6. The general architecture of a single variational quantum circuit (VQC): U(x) denotes the quantum operations that encode the classical data x, and U(θ) denotes the variational layers, repeated 1 to i times, each with tunable parameters θ. The final layer is a measurement layer used to obtain the VQC probability distribution.
Figure 7. The variational quantum circuit in the QLSTM architecture, as utilized in [12,38]. H, R_y, and R_z denote quantum gates, while x represents the classical input data vector, serving as the data-encoding layer. The parameters (α_i, β_i, γ_i) are adjustable and require optimization. The line connecting • and ⊗ represents a CNOT gate. The circuits conclude with a measurement layer.
Figure 8. Efficient gradient computation on a VQC using the parameter-shift rule.
Figure 9. Selected stock-price data from 1 January 2022 to 1 January 2023. The training data lie to the left of the blue dashed line; the testing data lie to the right.
Figure 10. Training losses of the Noiseless and Noisy QLSTM, the classical LSTM, and other models on the stock-price dataset over 50 epochs; each loss descends to a minimum near zero.
Figure 11. Comparison of the prediction performance of QLSTM in various quantum environments with the classical LSTM and other models, using 20 stock-price data points.
Figure 12. Training accuracy and loss of QLSTM models for qubit counts from 4 to 15. Green bars represent accuracy; red bars denote loss in RMSE.
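The data-encoding, variational, and measurement layers described in the captions above, together with the parameter-shift rule, can be illustrated with a minimal pure-NumPy statevector sketch. The two-qubit size, gate ordering, and parameter values below are illustrative assumptions, not the authors' exact configuration:

```python
import numpy as np

# Two-qubit VQC sketch: Hadamard + R_y angle encoding, a CNOT-entangled
# variational layer with per-qubit (alpha, beta, gamma) rotations, and a
# Pauli-Z measurement. The gradient is then obtained with the
# parameter-shift rule using only two extra circuit evaluations.
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Z = np.diag([1.0, -1.0])
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

def ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rz(t):
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

def vqc_expval(x, theta):
    """Return <Z> on qubit 0 after encoding x and one variational layer."""
    psi = np.zeros(4, dtype=complex)
    psi[0] = 1.0                                 # start in |00>
    psi = np.kron(H, H) @ psi                    # Hadamard layer
    psi = np.kron(ry(x[0]), ry(x[1])) @ psi      # angle encoding R_y(x_i)
    psi = CNOT @ psi                             # entangling CNOT
    rot = lambda a, b, g: rz(g) @ ry(b) @ rz(a)  # rotation with (alpha, beta, gamma)
    psi = np.kron(rot(*theta[0]), rot(*theta[1])) @ psi
    Z0 = np.kron(Z, I2)                          # measurement layer on qubit 0
    return float(np.real(psi.conj() @ Z0 @ psi))

x = np.array([0.4, -0.7])                        # encoded classical features
theta = np.array([[0.1, 0.5, -0.3], [0.2, -0.4, 0.6]])

# Parameter-shift rule for theta[0][1]: the exact gradient equals
# (f(theta + pi/2) - f(theta - pi/2)) / 2.
tp, tm = theta.copy(), theta.copy()
tp[0][1] += np.pi / 2
tm[0][1] -= np.pi / 2
grad_ps = 0.5 * (vqc_expval(x, tp) - vqc_expval(x, tm))
```

Because the rotation generators have eigenvalues ±1/2, this two-point shift gives the exact analytic gradient, which is what makes the rule suitable for gradient computation on real quantum hardware.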
Abstract
1. Introduction
- We introduced a specifically designed hybrid quantum-classical computing framework for predicting the stock price data of a single company.
- We detailed and utilized the quantum circuit for encoding classical data into quantum states, potentially leading to more efficient representations that better capture the underlying patterns in stock price data.
- We also offered a thorough discussion on limitations and proposed future works to extend its applicability beyond stock price prediction, encompassing other domains, particularly focusing on time-series data analysis.
- Our experiments illustrate the superior performance of the proposed QLSTM model over the classical LSTM and other specialized time-series models. It achieves a significantly lower root mean square error (RMSE) and improved accuracy throughout both the training and prediction phases.
- We rigorously evaluated the QLSTM model by executing it on an IBM quantum simulator and further validated its capabilities by conducting predictions on actual IBM quantum hardware. Additionally, we conducted tests under various quantum environments, including noiseless and noisy quantum simulators, showcasing its remarkable performance superiority.
2. Background
2.1. Classical Machine Learning
2.2. Quantum Computing
2.3. Quantum Machine Learning
2.4. Quantum Encoding
3. Related Works
3.1. Classical Machine Learning
3.2. Quantum Machine Learning
4. Stock Price Prediction Using QLSTM
4.1. Overall Architecture
4.2. QLSTM Structure
- Forget Gate: VQC_1 processes the concatenated hidden state and input data (v_t = [h_{t-1}, x_t]) to produce a vector f_t via the sigmoid function. This vector contains values between 0 and 1. The role of f_t is to dictate whether to preserve or eliminate corresponding elements of the cell state c_{t-1} from the previous time step, executed through element-wise multiplication with c_{t-1}. A value of 1 indicates full retention of the corresponding element of the cell state, while a value of 0 signifies forgetting. In QLSTM, however, the operations on the cell state are not limited to 0 or 1 but span the continuum between them, making QLSTM suitable for efficiently learning temporal dependencies.
- Input and Update Gates: These gates determine the new information to be added to the cell state. First, VQC_2 processes the concatenated input v_t, passing the output through the sigmoid function to produce i_t, which determines which values will be incorporated into the cell state. Concurrently, VQC_3 processes the same concatenated input and applies a hyperbolic tangent (tanh) transformation, generating a new cell-state candidate C̃_t. The vector i_t is then element-wise multiplied by C̃_t, and the resulting vector is used to update the cell state.
- Output Gate: First, VQC_4 processes the input v_t through the sigmoid function to produce o_t, which determines the relevance of values in the cell state c_t. The cell state is then transformed via the hyperbolic tangent (tanh) function and multiplied element-wise by o_t. Optionally, the resulting value can undergo further processing through VQC_5 to generate the hidden state h_t, or through VQC_6 to yield the output y_t. In general, the dimensions of the cell state c_t, the hidden state h_t, and the output y_t are not identical. To ensure correct dimensions, we utilize VQC_5 to transform o_t ⊙ tanh(c_t) into h_t, and VQC_6 to transform it into y_t, respectively.
4.3. Variational Quantum Circuits
4.3.1. Data Encoding Layer
4.3.2. Variational Layer
4.3.3. Measurement Layer
4.4. Parameter Learning
5. Experiments and Results
5.1. Experimental Settings
5.1.1. Dataset
5.1.2. Evaluation Metrics
5.1.3. Model Hyperparameters
5.2. Experimental Results
5.2.1. Accuracy and Loss
5.2.2. Prediction Performance
5.2.3. Number of Qubits
6. Discussion
- Dataset Scope: Given the limitations of quantum simulations on classical computers and the prolonged queuing time for accessing actual IBM quantum machines, we were restricted to utilizing only 251 stock-price data points. Even though the proposed model delivered promising performance, further experimentation with larger datasets is warranted.
- Model Design: Our model architecture is configured with specific hyperparameters, a specific quantum data-encoding layer, quantum rotation gates, and a particular type of variational layer to refine the prediction task. However, extensive investigation is required for in-depth understanding, including diverse quantum circuit designs, variations in gate types and quantities, and exploration into the depth of the variational layer.
- Simulation Limitations: Our evaluation employed only a few qubits. Given the current availability of quantum devices with hundreds of qubits, future work should evaluate the QLSTM model with larger datasets and qubit counts to provide a more comprehensive assessment of its real-world performance.
- Possibility of Classical Simulation: While variational quantum circuits are promising candidates for quantum advantage, recent studies suggest that certain types of VQCs, both noisy and noiseless, can be simulated on classical computers in polynomial time. For small to medium problem sizes, classical simulation may achieve comparable results without the overhead of quantum hardware. However, for larger and more complex tasks, the scaling behavior of quantum models is expected to surpass classical simulation capabilities, particularly when combined with specialized quantum hardware. In this study, we focus on cases where quantum circuits maintain an edge in efficiency and predictive accuracy. Nevertheless, we acknowledge the importance of identifying the boundaries at which quantum computation significantly outperforms classical simulation in order to fully validate the advantage of our approach [59,60].
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Adebiyi, A.A.; Ayo, C.K.; Adebiyi, M.; Otokiti, S.O. Stock price prediction using neural network with hybridized market indicators. J. Emerg. Trends Comput. Inf. Sci. 2012, 3. [Google Scholar]
- Long, W.; Gao, J.; Bai, K.; Lu, Z. A hybrid model for stock price prediction based on multi-view heterogeneous data. Financ. Innov. 2024, 10, 48. [Google Scholar] [CrossRef]
- Vijh, M.; Chandola, D.; Tikkiwal, V.A.; Kumar, A. Stock closing price prediction using machine learning techniques. Procedia Comput. Sci. 2020, 167, 599–606. [Google Scholar] [CrossRef]
- Esteva, A.; Chou, K.; Yeung, S.; Naik, N.; Madani, A.; Mottaghi, A.; Liu, Y.; Topol, E.; Dean, J.; Socher, R. Deep learning-enabled medical computer vision. NPJ Digit. Med. 2021, 4, 5. [Google Scholar] [CrossRef] [PubMed]
- Shah, N.; Bhagat, N.; Shah, M. Crime forecasting: A machine learning and computer vision approach to crime prediction and prevention. Vis. Comput. Ind. Biomed. Art 2021, 4, 9. [Google Scholar] [CrossRef] [PubMed]
- Mhlanga, D. Open AI in education, the responsible and ethical use of ChatGPT towards lifelong learning. In FinTech and Artificial Intelligence for Sustainable Development: The Role of Smart Technologies in Achieving Development Goals; Springer: Cham, Switzerland, 2023. [Google Scholar]
- White, H. Economic prediction using neural networks: The case of IBM daily stock returns. In Proceedings of the IEEE 1988 International Conference on Neural Networks, San Diego, CA, USA, 24–27 July 1988; Volume 2, pp. 451–458. [Google Scholar]
- Kolarik, T.; Rudorfer, G. Time series forecasting using neural networks. ACM SIGAPL APL Quote Quad 1994, 25, 86–94. [Google Scholar] [CrossRef]
- Chen, K.; Zhou, Y.; Dai, F. A LSTM-based method for stock returns prediction: A case study of China stock market. In Proceedings of the 2015 IEEE International Conference on Big Data (Big Data), Santa Clara, CA, USA, 29 October–1 November 2015; pp. 2823–2824. [Google Scholar]
- Fischer, T.; Krauss, C. Deep learning with long short-term memory networks for financial market predictions. Eur. J. Oper. Res. 2018, 270, 654–669. [Google Scholar] [CrossRef]
- Benedetti, M.; Fiorentini, M.; Lubasch, M. Hardware-efficient variational quantum algorithms for time evolution. Phys. Rev. Res. 2021, 3, 033083. [Google Scholar] [CrossRef]
- Cao, Y.; Zhou, X.; Fei, X.; Zhao, H.; Liu, W.; Zhao, J. Linear-layer-enhanced quantum long short-term memory for carbon price forecasting. Quantum Mach. Intell. 2023, 5, 26. [Google Scholar] [CrossRef]
- Kim, Y.; Eddins, A.; Anand, S.; Wei, K.X.; Van Den Berg, E.; Rosenblatt, S.; Nayfeh, H.; Wu, Y.; Zaletel, M.; Temme, K.; et al. Evidence for the utility of quantum computing before fault tolerance. Nature 2023, 618, 500–505. [Google Scholar] [CrossRef] [PubMed]
- Endo, S.; Cai, Z.; Benjamin, S.C.; Yuan, X. Hybrid quantum-classical algorithms and quantum error mitigation. J. Phys. Soc. Jpn. 2021, 90, 032001. [Google Scholar] [CrossRef]
- McClean, J.R.; Boixo, S.; Smelyanskiy, V.N.; Babbush, R.; Neven, H. Barren plateaus in quantum neural network training landscapes. Nat. Commun. 2018, 9, 4812. [Google Scholar] [CrossRef] [PubMed]
- Selvin, S.; Vinayakumar, R.; Gopalakrishnan, E.; Menon, V.K.; Soman, K. Stock price prediction using LSTM, RNN and CNN-sliding window model. In Proceedings of the 2017 International Conference on Advances in Computing, Communications and Informatics (Icacci), Udupi, India, 13–16 September 2017; pp. 1643–1647. [Google Scholar]
- Ribeiro, A.H.; Tiels, K.; Aguirre, L.A.; Schön, T. Beyond exploding and vanishing gradients: Analysing RNN training using attractors and smoothness. In Proceedings of the International Conference on Artificial Intelligence and Statistics, PMLR, Palermo, Sicily, Italy, 26–28 August 2020; pp. 2370–2380. [Google Scholar]
- Ceschini, A.; Rosato, A.; Panella, M. Design of an LSTM Cell on a Quantum Hardware. IEEE Trans. Circuits Syst. II Express Briefs 2021, 69, 1822–1826. [Google Scholar] [CrossRef]
- Vedral, V.; Plenio, M.B. Basics of quantum computation. Prog. Quantum Electron. 1998, 22, 1–39. [Google Scholar] [CrossRef]
- Lloyd, S.; Mohseni, M.; Rebentrost, P. Quantum algorithms for supervised and unsupervised machine learning. arXiv 2013, arXiv:1307.0411. [Google Scholar]
- Du, Y.; Hsieh, M.H.; Liu, T.; Tao, D. Expressive power of parametrized quantum circuits. Phys. Rev. Res. 2020, 2, 033125. [Google Scholar] [CrossRef]
- Yamasaki, H.; Isogai, N.; Murao, M. Advantage of Quantum Machine Learning from General Computational Advantages. arXiv 2023, arXiv:2312.03057. [Google Scholar]
- Butler, K.T.; Davies, D.W.; Cartwright, H.; Isayev, O.; Walsh, A. Machine learning for molecular and materials science. Nature 2018, 559, 547–555. [Google Scholar] [CrossRef] [PubMed]
- Zeguendry, A.; Jarir, Z.; Quafafou, M. Quantum machine learning: A review and case studies. Entropy 2023, 25, 287. [Google Scholar] [CrossRef]
- Khan, M.A.; Aman, M.N.; Sikdar, B. Beyond Bits: A Review of Quantum Embedding Techniques for Efficient Information Processing. IEEE Access 2024, 12, 46118–46137. [Google Scholar] [CrossRef]
- Cortese, J.A.; Braje, T.M. Loading classical data into a quantum computer. arXiv 2018, arXiv:1803.01958. [Google Scholar]
- Ovalle-Magallanes, E.; Alvarado-Carrillo, D.E.; Avina-Cervantes, J.G.; Cruz-Aceves, I.; Ruiz-Pinales, J. Quantum angle encoding with learnable rotation applied to quantum–classical convolutional neural networks. Appl. Soft Comput. 2023, 141, 110307. [Google Scholar] [CrossRef]
- Chen, W.; Zhang, H.; Mehlawat, M.K.; Jia, L. Mean–variance portfolio optimization using machine learning-based stock price prediction. Appl. Soft Comput. 2021, 100, 106943. [Google Scholar] [CrossRef]
- Mehtab, S.; Sen, J.; Dutta, A. Stock price prediction using machine learning and LSTM-based deep learning models. In Proceedings of the Machine Learning and Metaheuristics Algorithms, and Applications: Second Symposium, SoMMA 2020, Chennai, India, 14–17 October 2020; Revised Selected Papers 2. Springer: Singapore, 2021; pp. 88–106. [Google Scholar]
- Khan, W.; Ghazanfar, M.A.; Azam, M.A.; Karami, A.; Alyoubi, K.H.; Alfakeeh, A.S. Stock market prediction using machine learning classifiers and social media, news. J. Ambient. Intell. Humaniz. Comput. 2022, 13, 3433–3456. [Google Scholar] [CrossRef]
- Hamayel, M.J.; Owda, A.Y. A novel cryptocurrency price prediction model using GRU, LSTM and bi-LSTM machine learning algorithms. AI 2021, 2, 477–496. [Google Scholar] [CrossRef]
- Emmanoulopoulos, D.; Dimoska, S. Quantum machine learning in finance: Time series forecasting. arXiv 2022, arXiv:2202.00599. [Google Scholar]
- Kutvonen, A.; Fujii, K.; Sagawa, T. Optimizing a quantum reservoir computer for time series prediction. Sci. Rep. 2020, 10, 14687. [Google Scholar] [CrossRef] [PubMed]
- Mujal, P.; Martínez-Peña, R.; Giorgi, G.L.; Soriano, M.C.; Zambrini, R. Time-series quantum reservoir computing with weak and projective measurements. npj Quantum Inf. 2023, 9, 16. [Google Scholar] [CrossRef]
- Kornjača, M.; Hu, H.Y.; Zhao, C.; Wurtz, J.; Weinberg, P.; Hamdan, M.; Zhdanov, A.; Cantu, S.H.; Zhou, H.; Bravo, R.A.; et al. Large-scale quantum reservoir learning with an analog quantum computer. arXiv 2024, arXiv:2407.02553. [Google Scholar]
- Srivastava, N.; Belekar, G.; Shahakar, N. The Potential of Quantum Techniques for Stock Price Prediction. In Proceedings of the 2023 IEEE International Conference on Recent Advances in Systems Science and Engineering (RASSE), Kerala, India, 8–11 November 2023; pp. 1–7. [Google Scholar]
- Paquet, E.; Soleymani, F. QuantumLeap: Hybrid quantum neural network for financial predictions. Expert Syst. Appl. 2022, 195, 116583. [Google Scholar] [CrossRef]
- Chen, S.Y.C.; Yoo, S.; Fang, Y.L.L. Quantum long short-term memory. In Proceedings of the ICASSP 2022–2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 23–27 May 2022; pp. 8622–8626. [Google Scholar]
- Hassan, E.; Hossain, M.S.; Saber, A.; Elmougy, S.; Ghoneim, A.; Muhammad, G. A quantum convolutional network and ResNet (50)-based classification architecture for the MNIST medical dataset. Biomed. Signal Process. Control 2024, 87, 105560. [Google Scholar] [CrossRef]
- Schuld, M.; Sweke, R.; Meyer, J.J. Effect of data encoding on the expressive power of variational quantum-machine-learning models. Phys. Rev. A 2021, 103, 032430. [Google Scholar] [CrossRef]
- Mitarai, K.; Negoro, M.; Kitagawa, M.; Fujii, K. Quantum circuit learning. Phys. Rev. A 2018, 98, 032309. [Google Scholar] [CrossRef]
- Cerezo, M.; Sone, A.; Volkoff, T.; Cincio, L.; Coles, P.J. Cost function dependent barren plateaus in shallow parametrized quantum circuits. Nat. Commun. 2021, 12, 1791. [Google Scholar] [CrossRef] [PubMed]
- Crooks, G.E. Gradients of parameterized quantum gates using the parameter-shift rule and gate decomposition. arXiv 2019, arXiv:1905.13311. [Google Scholar]
- Wang, H.; Li, Z.; Gu, J.; Ding, Y.; Pan, D.Z.; Han, S. Qoc: Quantum on-chip training with parameter shift and gradient pruning. In Proceedings of the 59th ACM/IEEE Design Automation Conference, San Francisco, CA, USA, 10–14 July 2022; pp. 655–660. [Google Scholar]
- Kea, K.; Chang, W.D.; Park, H.C.; Han, Y. Enhancing a Convolutional Autoencoder with a Quantum Approximate Optimization Algorithm for Image Noise Reduction. arXiv 2024, arXiv:2401.06367. [Google Scholar]
- Audibert, J.; Michiardi, P.; Guyard, F.; Marti, S.; Zuluaga, M.A. Usad: Unsupervised anomaly detection on multivariate time series. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Virtual Event, CA, USA, 6–10 July 2020; pp. 3395–3404. [Google Scholar]
- Zong, B.; Song, Q.; Min, M.R.; Cheng, W.; Lumezanu, C.; Cho, D.; Chen, H. Deep autoencoding gaussian mixture model for unsupervised anomaly detection. In Proceedings of the International Conference on Learning Representations, Convention Center, Vancouver, BC, Canada, 30 April–3 May 2018. [Google Scholar]
- Zhang, C.; Song, D.; Chen, Y.; Feng, X.; Lumezanu, C.; Cheng, W.; Ni, J.; Zong, B.; Chen, H.; Chawla, N.V. A deep neural network for unsupervised anomaly detection and diagnosis in multivariate time series data. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 1409–1416. [Google Scholar]
- Zhao, H.; Wang, Y.; Duan, J.; Huang, C.; Cao, D.; Tong, Y.; Xu, B.; Bai, J.; Tong, J.; Zhang, Q. Multivariate time-series anomaly detection via graph attention network. In Proceedings of the 2020 IEEE International Conference on Data Mining (ICDM), Sorrento, Italy, 17–20 November 2020; pp. 841–850. [Google Scholar]
- Bergholm, V.; Izaac, J.; Schuld, M.; Gogolin, C.; Ahmed, S.; Ajith, V.; Alam, M.S.; Alonso-Linaje, G.; AkashNarayanan, B.; Asadi, A.; et al. Pennylane: Automatic differentiation of hybrid quantum-classical computations. arXiv 2018, arXiv:1811.04968. [Google Scholar]
- Hur, T.; Kim, L.; Park, D.K. Quantum convolutional neural network for classical data classification. Quantum Mach. Intell. 2022, 4, 3. [Google Scholar] [CrossRef]
- Ravi, G.S.; Smith, K.N.; Murali, P.; Chong, F.T. Adaptive job and resource management for the growing quantum cloud. In Proceedings of the 2021 IEEE International Conference on Quantum Computing and Engineering (QCE), Broomfield, CO, USA, 17–22 October 2021; pp. 301–312. [Google Scholar]
- Wilkens, S.; Moorhouse, J. Quantum computing for financial risk measurement. Quantum Inf. Process. 2023, 22, 51. [Google Scholar] [CrossRef]
- Wang, H.; Gu, J.; Ding, Y.; Li, Z.; Chong, F.T.; Pan, D.Z.; Han, S. Quantumnat: Quantum noise-aware training with noise injection, quantization and normalization. In Proceedings of the 59th ACM/IEEE Design Automation Conference, San Francisco, CA, USA, 10–14 July 2022; pp. 1–6. [Google Scholar]
- Qin, D.; Chen, Y.; Li, Y. Error statistics and scalability of quantum error mitigation formulas. npj Quantum Inf. 2023, 9, 35. [Google Scholar] [CrossRef]
- Resch, S.; Karpuzcu, U.R. Benchmarking quantum computers and the impact of quantum noise. ACM Comput. Surv. (CSUR) 2021, 54, 1–35. [Google Scholar] [CrossRef]
- Buonaiuto, G.; Gargiulo, F.; De Pietro, G.; Esposito, M.; Pota, M. The effects of quantum hardware properties on the performances of variational quantum learning algorithms. Quantum Mach. Intell. 2024, 6, 9. [Google Scholar] [CrossRef]
- Cerezo, M.; Verdon, G.; Huang, H.Y.; Cincio, L.; Coles, P.J. Challenges and opportunities in quantum machine learning. Nat. Comput. Sci. 2022, 2, 567–576. [Google Scholar] [CrossRef] [PubMed]
- Angrisani, A.; Schmidhuber, A.; Rudolph, M.S.; Cerezo, M.; Holmes, Z.; Huang, H.Y. Classically estimating observables of noiseless quantum circuits. arXiv 2024, arXiv:2409.01706. [Google Scholar]
- Schuster, T.; Yin, C.; Gao, X.; Yao, N.Y. A polynomial-time classical algorithm for noisy quantum circuits. arXiv 2024, arXiv:2407.12768. [Google Scholar]
| Models | Training Acc | Training RMSE | Prediction Acc | Prediction RMSE |
|---|---|---|---|---|
| Noiseless QLSTM | 1.00 | 0.0371 | 0.9736 | 0.0602 |
| Noisy QLSTM | 0.9714 | 0.0511 | 0.9210 | 0.0648 |
| Actual QLSTM | N/A | N/A | 0.7619 | 0.1401 |
| LSTM | 0.92 | 0.0567 | 0.8815 | 0.0693 |
| QSVM [36] | N/A | N/A | 0.5894 | N/A |
| USAD [46] | 0.9342 | 0.0708 | 0.8874 | 0.0672 |
| DAGMM [47] | 0.8947 | 0.0768 | 0.8410 | 0.0721 |
| MSCRED [48] | 0.9342 | 0.0720 | 0.8828 | 0.0680 |
| MTAD_GAT [49] | 0.9473 | 0.0668 | 0.8857 | 0.0624 |
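For reference, the RMSE values reported in the table are the root mean square error between the actual and predicted (normalized) closing prices; a minimal implementation follows (the normalization procedure and the paper's specific accuracy definition are not reproduced here):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between actual and predicted series."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```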
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kea, K.; Kim, D.; Huot, C.; Kim, T.-K.; Han, Y. A Hybrid Quantum-Classical Model for Stock Price Prediction Using Quantum-Enhanced Long Short-Term Memory. Entropy 2024, 26, 954. https://doi.org/10.3390/e26110954