Abstract:
The development of new technologies has confronted science and industry with the challenges of big-data scalability and of integrating big data into the predictive-analytics life cycle. In predictive analytics, nowcasting, the forecasting of the near future and recent past, is the continuous study of real-time events, with estimates constantly updated as new data arrive. It is therefore necessary to adopt highly data-driven technologies and new analytical methods, such as machine learning and visualization tools, capable of interacting with and connecting to diverse data sources covering the many varieties of big data, with the aim of reducing the investment risks of policy-making institutions in the field of IT. The main scientific contribution of this article is a new policy-making approach to the nowcasting of economic indicators that improves forecasting performance by combining deep networks and deep learning methods for data and feature representation. To this end, a network called P-V-L Deep (Predictive Variational Autoencoder - Long Short-Term Memory Deep Neural Network) was designed, in which a variational autoencoder architecture is used for unsupervised learning, data representation, and data reconstruction, and a long short-term memory network is adopted to evaluate the nowcasting performance of deep networks on time series of macro-econometric variables. To assess the LSTM's forecasting performance on the economic indicators, the data represented and reconstructed by the generative network of the variational autoencoder were compared with the original input data. The findings show that the reconstructed data derived from the variational autoencoder require shorter training time and yield better prediction performance in the LSTM than the original data.
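To make the P-V-L Deep data flow concrete, the following is a minimal structural sketch, not the authors' implementation: a VAE-style encode/reparameterize/decode pass over a multivariate macro series, followed by an LSTM cell stepping through the reconstructed series. All weights are random and untrained, and every dimension (number of indicators, latent size, hidden size) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- VAE-style encoder, reparameterization trick, and decoder (untrained sketch) ---
def vae_encode(x, W_mu, W_logvar):
    # Linear projections standing in for the encoder network.
    return x @ W_mu, x @ W_logvar

def reparameterize(mu, logvar):
    # z = mu + sigma * eps, with eps ~ N(0, I)
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def vae_decode(z, W_dec):
    # Linear map from latent space back to the indicator space (reconstruction).
    return z @ W_dec

# --- one forward step of a standard LSTM cell ---
def lstm_step(x_t, h, c, W, U, b):
    gates = x_t @ W + h @ U + b
    i, f, g, o = np.split(gates, 4, axis=-1)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c = f * c + i * g          # update cell state
    h = o * np.tanh(c)         # emit hidden state
    return h, c

# Illustrative sizes: 8 indicators, 3 latent dims, 16 hidden units, 24 periods.
n_feat, n_lat, n_hid, T = 8, 3, 16, 24

W_mu     = rng.standard_normal((n_feat, n_lat)) * 0.1
W_logvar = rng.standard_normal((n_feat, n_lat)) * 0.1
W_dec    = rng.standard_normal((n_lat, n_feat)) * 0.1
W = rng.standard_normal((n_feat, 4 * n_hid)) * 0.1
U = rng.standard_normal((n_hid, 4 * n_hid)) * 0.1
b = np.zeros(4 * n_hid)

series = rng.standard_normal((T, n_feat))        # synthetic macro-indicator series
mu, logvar = vae_encode(series, W_mu, W_logvar)
z = reparameterize(mu, logvar)
recon = vae_decode(z, W_dec)                     # VAE-reconstructed series

# Feed the reconstructed (rather than original) series to the LSTM,
# mirroring the comparison described in the abstract.
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(T):
    h, c = lstm_step(recon[t], h, c, W, U, b)
```

In the paper's setup the two branches would be trained (the VAE on a reconstruction plus KL objective, the LSTM on a forecasting loss); the sketch only shows how reconstructed data replace the original series as LSTM input.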
Machine summary:
The approach of most central banks over the last few decades can be divided into three periods, with the 2008-2009 crisis as the turning point:

Pre-crisis:
- Lagging publication of macro-economic indicators and the traditional use of simple forecasting models by policy-making institutions
- Concentration on aggregated data at the level of central bank balance sheets
- Focus on the deductive/inference approach

Crisis:
- Economic distortions and loss of equilibrium
- Exposure of the deficiency of deductive models, as accepted financial models could not cope with huge amounts of data, eventually decreasing forecasting accuracy
- Complexity in data analytics because of non-linear relations
- A data tsunami and the emergence of the big-data paradigm, driven by the rapid development of computers and internet networks and the formation of new sources of information and digital data
- Focus on topics such as cause and effect, as well as contractual reward, in models that are easy to interpret, thanks to access to massive data resources
- Focus on the inductive approach, with little attention to theoretical generalization

Post-crisis:
- Real-time, immediate assessment of the economic situation
- Focus on the data-driven approach, including technologies, techniques, methods, and tools
- Focus on the abductive approach (a hybrid of the inductive and deductive approaches)

Big data is a transformative paradigm: analysis of its trends and turning points improves the outline of real-time economic conditions through data-driven decisions.