Let Machines Read Candlestick Charts Like Human Beings
Master’s Program, Department of Computer Science and Information Engineering (English-taught Program)
Candlestick charts have been a very important tool for human traders making trading decisions since the 18th century. Inspired by how people read candlestick charts when making decisions, this thesis proposes a deep network framework, Deep Candlestick Predictor (DCP), which forecasts price movements by reading candlestick charts rather than the numerical data in financial reports. DCP consists of a chart decomposer that splits a given candlestick chart into several sub-charts, a CNN-Autoencoder that derives the best representation of each sub-chart, and an RNN that forecasts the price movement of the (k+1)-th day. Extensive experiments are conducted on daily prices from a real dataset of six futures merchandises on stock indices in the Taiwan Futures Exchange, totaling 21,819 trading days. The experimental results show that the proposed DCP framework achieves higher accuracy than the traditional index-based model, which demonstrates the effectiveness of designing a deep network that reads candlestick charts like human beings.
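The three-stage pipeline summarized above (decompose the chart into sub-charts, encode each sub-chart, feed the sequence of codes to a recurrent forecaster) can be sketched as follows. This is a minimal illustrative sketch only: the 3-day sub-chart length, the summary-statistic "encoder" standing in for the trained CNN-Autoencoder, and the fixed-weight recurrence standing in for the trained RNN are all hypothetical stand-ins, not the thesis implementation.

```python
import numpy as np

def decompose_chart(ohlc, sub_len=3):
    # Chart decomposer: slide a window over the k-day OHLC series,
    # yielding overlapping sub-charts (sub-chart length assumed here).
    return [ohlc[i:i + sub_len] for i in range(len(ohlc) - sub_len + 1)]

def encode(sub_chart):
    # Stand-in for the trained CNN-Autoencoder: per-column means of
    # (open, high, low, close) serve as the latent code of the sub-chart.
    return sub_chart.mean(axis=0)

def predict_movement(codes):
    # Stand-in for the trained RNN: a single fixed-weight recurrent pass;
    # predict "up" if the final hidden state sums positive.
    h = np.zeros(codes[0].shape)
    for c in codes:
        h = np.tanh(0.5 * h + 0.5 * (c - codes[0]))  # drift vs. first sub-chart
    return "up" if h.sum() > 0 else "down"

# 5 synthetic trading days of OHLC data (open, high, low, close)
ohlc = np.array([[100, 102,  99, 101],
                 [101, 103, 100, 102],
                 [102, 105, 101, 104],
                 [104, 106, 103, 105],
                 [105, 108, 104, 107]], dtype=float)

subs = decompose_chart(ohlc)        # 3 overlapping 3-day sub-charts
codes = [encode(s) for s in subs]   # one latent code per sub-chart
print(len(subs), predict_movement(codes))
```

In DCP the encoder and the recurrent forecaster are learned from data; the stand-ins above only make the data flow between the three components concrete.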
Table of Contents
List of Contents
List of Figures
List of Tables
List of Formulas
Chapter 1 Introduction
Chapter 2 Related Works
2.1 Candlestick Chart Analysis
2.2 Time Series Forecasting
2.2.1 RNN
2.2.2 GRU
2.2.3 CNN
2.3 CNN-Autoencoder
Chapter 3 Problem Formulation
Chapter 4 DCP
4.1 Chart Decomposer
4.2 CAE
4.3 RNN
Chapter 5 Experiment Results
5.1 Settings and Dataset
5.1.1 Dataset
5.1.2 Experiment Workflow
5.2 Feature Efficiency
5.2.1 IEM
5.2.2 Performance Evaluation
5.3 Model Efficiency
5.3.1 1-D CNN
5.3.2 2-D CNN
5.3.3 Performance Evaluation
Chapter 6 Conclusion
List of Figures
Figure 1: 20-day candlestick chart
Figure 2: Candlesticks
Figure 3: Method summary
Figure 4: Architecture of RNN
Figure 5: General representation of RNN
Figure 6: Calculation overview of GRU
Figure 7: CNN overview
Figure 8: Workflow of CAE
Figure 9: An illustrative example of a 3-day sub-chart
Figure 10: CAE overview
Figure 11: RNN overview
Figure 12: IEM overview
Figure 13: 1-D CNN overview
Figure 14: 1-D tensor
Figure 15: 1-D CNN model
Figure 16: 2-D CNN overview
Figure 17: Original sub-chart
Figure 18: Concatenated sub-charts for 2-D CNN
Figure 19: 2-D CNN model
List of Tables
Table 1: Comparison between different methods
Table 2: Futures merchandises
Table 3: Distribution of trends in each year
Table 4: Nested-CV results of the 10 best-performing models of DCP
Table 5: Nested-CV results of 3 different classifiers of IEM
Table 6: Scores of DCP (Experiment 10) and IEM (SVM) tested on the TX merchandise in 2016
Table 7: Details of Nested-CV for each year for DCP (Experiment 10) and IEM (SVM)
Table 8: Details of Nested-CV of the 10 best models of DCP
Table 9: Number of data and cumulative total number of data
Table 10: Nested-CV results of the 10 best-performing models of 1-D CNN
Table 11: Nested-CV results of the 10 best-performing models of 2-D CNN
Table 12: Performance comparison of the best models of RNN, 1-D CNN and 2-D CNN