§ Thesis Bibliographic Record

System ID U0002-2402201919490400
DOI 10.6846/TKU.2019.00753
Title (Chinese) 仿生自動化K線圖閱讀系統
Title (English) Let machine read candlestick charts like human beings
Title (third language)
University Tamkang University (淡江大學)
Department (Chinese) 資訊工程學系全英語碩士班
Department (English) Master's Program, Department of Computer Science and Information Engineering (English-taught program)
Foreign degree school
Foreign degree college
Foreign degree institute
Academic year 107
Semester 1
Year of publication 108
Author (Chinese) 郭修志
Author (English) Siou-Jhih Guo
Student ID 606780012
Degree Master's
Language English
Second language
Defense date 2018-12-28
Number of pages 52
Committee Advisor - 洪智傑 (oshin@mail.tku.edu.tw)
Member - 彭文智 (wcpeng@gmail.com)
Member - 林莊傑 (josephcclin@gmail.com)
Keywords (Chinese) 深度學習
量化交易
K線圖
Keywords (English) Deep Learning
Quantitative Trading
Candlestick Chart
Keywords (third language)
Subject classification
Abstract (Chinese)
Since the 18th century, candlestick charts have been regarded as an important aid in market analysis. The principle of candlestick charting is to split a historical price chart into intervals of varying sizes, interpret the pattern within each interval, and combine each pattern's implication for the future trend into a final decision. Compared with other methods, the strength of candlestick charts lies in their ability to digest a large amount of historical information, extract the parts critical to the future trend, and still achieve high accuracy. Using the deep convolutional networks widely adopted in the recent rise of deep learning, this study builds an automated decision system that reads candlestick charts and predicts future price trends. The design mimics the process a human trader follows when reading candlestick charts: combining the bullish or bearish implications of the pattern in each small interval to produce a final prediction of the price trend. The system consists of three components: a chart decomposer that splits the price information of a large interval into small intervals; an Autoencoder that maps each small-interval chart to a low-dimensional representation and extracts its visual features; and an RNN that infers the final price trend from the features of the small intervals. The system was trained and tested on six instruments traded on the Taiwan Futures Exchange (TX, MTX, TE, TF, XIF, and GTF) and compared with existing methods based on traditional indicators such as SMA and the K/D lines. The proposed system achieves higher accuracy, demonstrating its effectiveness and feasibility.
Abstract (English)
Candlestick charts have been a very important tool for human traders making trading decisions since the 18th century. Inspired by how people read candlestick charts to make decisions, this paper proposes a deep network framework, Deep Candlestick Predictor (DCP), to forecast price movements by reading candlestick charts rather than the numerical data from financial reports. DCP contains a chart decomposer, which decomposes a given candlestick chart into several sub-charts; a CNN-Autoencoder, which derives the best representation for the sub-charts; and an RNN, which forecasts the price movement of the (k+1)-th day. Extensive experiments are conducted on daily prices from a real dataset of six stock-index futures contracts on the Taiwan Futures Exchange, totaling 21,819 trading days. The experimental results show that the proposed framework, DCP, achieves higher accuracy than the traditional index-based model, which demonstrates the effectiveness of designing a deep network to read candlestick charts like human beings.
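The first stage of the pipeline described above, the chart decomposer, amounts to a sliding-window split of a k-day price window into overlapping sub-charts. The sketch below is only illustrative: the function name `decompose_chart`, the 3-day sub-chart length, and the 5-day example window are assumptions for demonstration, not values taken from the thesis record.

```python
import numpy as np

def decompose_chart(ohlc, sub_len=3):
    # Split a k-day OHLC window into overlapping sub-charts of sub_len days.
    # ohlc has shape (k, 4): open/high/low/close per day.
    # Returns an array of shape (k - sub_len + 1, sub_len, 4).
    k = ohlc.shape[0]
    return np.stack([ohlc[i:i + sub_len] for i in range(k - sub_len + 1)])

# A 5-day window of toy OHLC values, decomposed into 3-day sub-charts.
window = np.arange(20, dtype=float).reshape(5, 4)
subs = decompose_chart(window, sub_len=3)
print(subs.shape)  # (3, 3, 4)
```

In DCP, each such sub-chart would then be rendered as a candlestick image, compressed by the CNN-Autoencoder, and the sequence of compressed features fed to the RNN for the day-(k+1) forecast.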
Abstract (third language)
Thesis outline
Table of Contents

Abstract I
List of Contents IV
List of Figures VI
List of Tables VII
List of Formulas VIII
Chapter 1 Introduction 1
Chapter 2 Related Works 6
2.1 Candlestick Charts Analysis 6
2.2 Time Series Forecasting 9
2.2.1 RNN 9
2.2.2 GRU 11
2.2.3 CNN 13
2.3 CNN-Autoencoder 15
Chapter 3 Problem Formulation 17
Chapter 4 DCP 19
4.1 Chart Decomposer 19
4.2 CAE 21
4.3 RNN 23
Chapter 5 Experiment Result 25
5.1 Settings and Dataset 25
5.1.1 Dataset 25
5.1.2 Experiment Workflow 30
5.2 Feature-Efficiency 31
5.2.1 IEM 31
5.2.2 Performance Evaluation 33
5.3 Model-Efficiency 37
5.3.1 1-D CNN 38
5.3.2 2-D CNN 40
5.3.3 Performance Evaluation 42
Chapter 6 Conclusion 44
References 47

List of Figures

Figure 1: 20-day candlestick chart 2
Figure 2: Candlesticks 2
Figure 3: Method summary 4
Figure 4: Architecture of RNN 10
Figure 5: General representation of RNN 10
Figure 6: Calculation overview of GRU 13
Figure 7: CNN overview 14
Figure 8: Workflow of CAE 16
Figure 9: An illustrative example of a 3-day sub-chart 20
Figure 10: CAE overview 22
Figure 11: RNN overview 24
Figure 12: IEM overview 33
Figure 13: 1-D CNN overview 38
Figure 14: 1-D tensor 38
Figure 15: 1-D CNN model 39
Figure 16: 2-D CNN overview 40
Figure 17: Original sub-chart 40
Figure 18: Concatenated sub-charts for 2-D CNN 40
Figure 19: 2-D CNN model 41

List of Tables

Table 1: Comparison between different methods 8
Table 2: Futures contracts 28
Table 3: Distribution of trend in each year 29
Table 4: Nested-CV results of the 10 best-performing DCP models 34
Table 5: Nested-CV results of 3 different IEM classifiers 34
Table 6: Scores of DCP (Experiment 10) and IEM (SVM), tested on the TX contract in 2016 34
Table 7: Details of nested CV per year for DCP (Experiment 10) and IEM (SVM) 35
Table 8: Details of nested CV for the 10 best DCP models 36
Table 9: Number of data points and cumulative total 37
Table 10: Nested-CV results of the 10 best-performing 1-D CNN models 43
Table 11: Nested-CV results of the 10 best-performing 2-D CNN models 43
Table 12: Performance comparison of the best RNN, 1-D CNN, and 2-D CNN models 43
Thesis full-text usage permissions
On campus
Printed thesis released immediately on campus
Electronic full text authorized for open access on campus
On-campus electronic thesis released immediately
Off campus
Authorization granted
Off-campus electronic thesis released immediately
