
Using Artificial Intelligence to Predict Market Trends

Marius Mindroc

Why Artificial Intelligence?

The FinTech industry is rapidly evolving and reshaping commerce, trading, insurance and risk management by automating and speeding up decision-making. How can Artificial Intelligence (AI) play a part in this process? There are many applications for AI in finance, including fraud detection, virtual financial assistants, chatbots, and forecasting the financial market’s response to news and social media. This is reflected in the huge market value that has accrued since the AI race began back in 2010. Although the field only really heated up in 2017, its yearly growth has remained at a healthy 50%.

Everything-As-A-Service

The use of cloud technology has increased exponentially in recent years, but many companies have failed to adopt a true ‘cloud mindset’, and so are not exploiting it to its full potential. The tendency, in fact, has been simply to move IT systems to the cloud, without putting further transformational strategies in place.

Many businesses have changed direction of late, and have started to move from the On-Premise to the Software-as-a-Service (SaaS) paradigm. This means that their customers are buying not just a piece of software but the underlying infrastructure used to serve it, setting aside concerns regarding scalability, integration, uptime, security and upgrades in favour of cost predictability (pay as you go), reduced time to market, rapid prototyping, and the ability to run the product anywhere.

It seems only logical that APIs will be the next target for this approach, given their ability to create new work channels between partners. Could the SaaS API be the result? It looks increasingly likely. Such a move would be closely related to cloud computing, and would open up the possibility of integrating with other SaaS providers, and expanding product ranges and business capabilities.

What is Natural Language Processing?

Natural language processing (NLP) is an emerging technology, and one of the many forms of AI. Whilst information is now dispersed and broadcast 24/7 on the internet, it arrives in an unstructured form that computers find difficult to understand. To be more specific, such data either does not have a data model or is not organised in a pre-defined manner. Humans, for their part, can only absorb so much information at a time, even though it is available continuously. NLP can solve this problem by allowing computers to process large amounts of natural language data.

Who are the major players in NLP?

When building an application that can work with natural language, there are a number of easy-to-use SaaS APIs available from various providers, notably Google Cloud Natural Language, Amazon Comprehend, Microsoft Azure Text Analytics and IBM Watson NLU. They are quite similar in terms of entity extraction, key phrase extraction, sentiment analysis and multiple language support, although there are some differences in support for syntax analysis and topic modelling and, of course, in pricing plans. Before choosing one over another, customers should consider their existing ecosystems and APIs, and the required features of the NLP API.

Market Price Predictions Using Sentiment Analysis

Creating a service to help forecast the financial market’s response to news and social media is a challenge, but one that could be made easier with AI. First of all, things need to be broken down into smaller building blocks to reduce the overall complexity. Open-source tools and scripting should then be used to glue everything together, and simultaneously to unleash the power of the commercial NLPs. I am also assuming that a mathematical model capable of indicating market trends has already been implemented. Now let’s take a look at the various components that are necessary, and how they can function side-by-side as a complete product.

I have chosen the Google Cloud Platform solution as my case study, but any other provider should do the job. This being said, this particular model requires extra inputs, which should come from relevant news, blogs and social media channels. The list of such sources, along with that of significant topics from within the financial world and beyond, needs to be constructed gradually. Let’s take the oil market as an example, and note down the factors that could influence future prices: ‘oil price’, ‘oil future contracts’, ‘OPEC’, ‘oil supply’, ‘oil demand’, ‘future oil supply’, ‘future oil demand’, ‘oil crisis’, ‘natural disasters’, ‘threat of war’, and so on. This exercise would need to be repeated as each new market domain is analysed.
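The per-domain topic list described above can be kept in a simple registry that grows as each new market is analysed. A minimal sketch (the `MARKET_TOPICS` structure and `register_domain` helper are illustrative names, not part of any library):

```python
# Illustrative per-domain topic registry; the oil-market entries are
# taken directly from the example above.
MARKET_TOPICS = {
    "oil": [
        "oil price", "oil future contracts", "OPEC", "oil supply",
        "oil demand", "future oil supply", "future oil demand",
        "oil crisis", "natural disasters", "threat of war",
    ],
}

def register_domain(domain, topics):
    """Add or extend the topic list for a market domain, skipping duplicates."""
    existing = MARKET_TOPICS.setdefault(domain, [])
    for topic in topics:
        if topic not in existing:
            existing.append(topic)
```

Repeating the exercise for a new domain is then just another call to `register_domain`.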

The web content can be found via various open-source crawlers, which extract text and export it to JSON or CSV files. Apache Nutch is one such crawler, with several useful features, e.g. automatically monitoring a site, and crawling and exporting the updates to a text file. It can also be used in combination with Apache Solr to enable indexing, and to work as a search engine. Social media websites generally have dedicated APIs that can be queried for new posts or comments using scripting tools. These queries can be configured and run as batches, scheduled on a regular basis (daily, hourly...) to extract unstructured data and save it to cloud storage for further analysis. For real-time scenarios, news and social media feeds can be used in combination with Google Cloud Pub/Sub as an input channel and event trigger.
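A batch step of this kind mostly amounts to loading the crawler's export and pulling out the text for analysis. A hedged sketch, assuming the crawler exports one JSON object per line with "url" and "text" fields (the field names are illustrative; Nutch's actual export schema depends on its configuration):

```python
import json

def load_documents(json_lines):
    """Parse a line-delimited JSON export into a list of documents."""
    docs = []
    for line in json_lines:
        line = line.strip()
        if not line:
            continue  # tolerate blank lines in the export
        record = json.loads(line)
        docs.append({"url": record.get("url", ""),
                     "text": record.get("text", "")})
    return docs

# Two sample records standing in for a real crawler export file.
sample = [
    '{"url": "https://example.com/a", "text": "OPEC announced supply cuts."}',
    '{"url": "https://example.com/b", "text": "Oil demand may fall next year."}',
]
```

In production the same function would read from the cloud storage bucket the scheduled batch writes into.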

By this stage, the inputs and relevant topics are ready for the Google Cloud Natural Language API. The numerous blocks of text, culled from a multiplicity of sources, are fed into the Google NLP API so that it can identify the sentences and split them into known entities. The identified text entities are matched against the supplied list of topics, with NLP able to match related topics even if they have different names. The API can provide scores on how relevant these matched topics are in the context of the overall text, to determine whether any of them is the main topic of the text. This makes it possible to filter the information even further, if sections of an article, or an article in its entirety, are vague, or treat a given issue as a side topic.
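The matching and filtering step can be sketched in pure Python. Here the API's output is assumed to be a list of (entity name, salience) pairs, where salience in [0, 1] indicates how central the entity is to the text, as Google Cloud Natural Language reports it; exact string matching stands in for the API's richer ability to link related topics with different names, and the threshold value is an illustrative choice:

```python
def match_topics(entities, topics, min_salience=0.1):
    """Keep entities that appear in the topic list and are salient
    enough to count as a main topic of the text."""
    topic_set = {t.lower() for t in topics}
    return [
        (name, salience)
        for name, salience in entities
        if name.lower() in topic_set and salience >= min_salience
    ]
```

Entities below the salience threshold, or outside the topic list, are filtered out exactly as the paragraph above describes for side topics and vague passages.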

NLP then runs sentiment analysis on the sentences that have survived the process, and can classify the whole text as positive, negative, neutral or mixed. Depending on how relevant the authors or sources of information are judged to be, their significance to the overall result can be weighted accordingly. Lastly, the mathematical model can use this information to make fine adjustments to the final analysis.
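The weighting step above reduces to a weighted average of sentiment scores. A minimal sketch, assuming sentiment scores in [-1, 1] (negative to positive, matching the Google NLP convention) and source weights chosen editorially to reflect how relevant each author or source is judged to be; the neutral band used for classification is an illustrative parameter:

```python
def weighted_sentiment(scored_items):
    """scored_items: iterable of (sentiment_score, source_weight) pairs."""
    items = list(scored_items)
    total_weight = sum(w for _, w in items)
    if total_weight == 0:
        return 0.0  # no weighted evidence at all
    return sum(s * w for s, w in items) / total_weight

def classify(score, neutral_band=0.25):
    """Map an aggregate score onto the positive/negative/neutral labels."""
    if score > neutral_band:
        return "positive"
    if score < -neutral_band:
        return "negative"
    return "neutral"
```

The resulting aggregate score is what the mathematical model would consume when making its fine adjustments.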

Conclusions

AI might sound complicated, but it accelerates progress to the point where the distant future feels as though it has already arrived. It is so generic that the same principles can be applied to many domains, not just the financial markets. It also offers a number of potential competitive advantages, even if many people find it unnerving. In summary, used in the right way, it can help a business become smarter, more resilient to change, faster to react, and able to stand out from the crowd.
