
Bidirectional Encoder Representations from Transformers for Modelling Stock Prices
Author(s) -
Parinnay Chaudhry
Publication year - 2022
Publication title -
International Journal for Research in Applied Science and Engineering Technology
Language(s) - English
Resource type - Journals
ISSN - 2321-9653
DOI - 10.22214/ijraset.2022.40406
Subject(s) - transformer , computer science , encoder , architecture , artificial intelligence , sentence , natural language processing , artificial neural network , sentiment analysis
Bidirectional Encoder Representations from Transformers (BERT) is a transformer neural network architecture designed for natural language processing (NLP). The model’s architecture allows for an efficient, contextual understanding of words in sentences. Empirical evidence has shown that BERT achieves a high degree of accuracy on NLP tasks such as sentiment analysis and next-sentence classification. This study utilises BERT’s sentiment analysis capability and proposes and tests a framework to model a quantitative relation between the news coverage and reports of a company and the movement of its stock price. This study also aims to explore the nature of human psychology in terms of modelling risk and opportunity, and to gain insight into the subjectivity of the human mind.
Keywords: natural language processing, BERT, sentiment analysis, stock price modelling, transformers, neural networks, self-attention
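The quantitative relation the abstract describes could be sketched, in the simplest case, as a correlation between per-day news sentiment and daily stock returns. The sketch below assumes the sentiment scores (in [-1, 1]) have already been produced by a BERT sentiment classifier over each day's news about the company; the numbers here are illustrative, not from the paper.

```python
import math

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Hypothetical daily BERT sentiment scores for a company's news (-1 to 1)
sentiment = [0.8, -0.3, 0.5, -0.6, 0.2]
# Hypothetical same-day stock returns (fractional change)
returns = [0.012, -0.004, 0.009, -0.011, 0.003]

print(f"sentiment/return correlation: {pearson(sentiment, returns):.3f}")
```

A high positive correlation on real data would support the framework's premise that news sentiment tracks price movement; the paper's actual model and evaluation are more involved than this toy statistic.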