Abstract

Certain variables of interest in agriculture may preferably be mapped to distributions rather than to real numbers, e.g. the variable "apple (fruit) weight" on an apple tree at sampling times; hence, a stochastic time-series emerges. Since a cumulative probability distribution can be represented by a Type-1 Intervals' Number (IN) in the mathematical lattice (F1, ⪯), this work considers a stochastic time-series F: R → F1 and interprets it as a parametric Type-2 IN, symbolically F(x; t). Given samples F(x; t_i), i ≤ k, the problem here is to predict F(x; t_i) for i > k, assuming a deterministic relation among the samples F(x; t_i), i ∈ {1,…,n}. Prediction is pursued by a novel neural architecture, namely the meta-Statistical IN Neural Network, or metaStatINNN for short, consisting of two neural modules in series: the first one, namely lowerNN, induces INs from data using deep learning, whereas the second one, namely upperNN, is a well-known INNN (IN Neural Network) that implements a function f: F1^N → F1. This preliminary work considers a benchmark time-series of temperature distributions as well as artificial data sets of increasing cardinality. Computational experiments are presented in comparison with a conventional neural network. The results of the metaStatINNN are superior not only in terms of computational speed and prediction accuracy, but also because a conventional neural network typically predicts only first-order data statistics, whereas the metaStatINNN may predict all-order data statistics; in addition, the metaStatINNN can explain its answers; moreover, an IN can represent big data.
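
To make the two-module pipeline concrete, the following is a minimal, hypothetical Python (PyTorch) sketch, not the authors' implementation: a lowerNN-style module maps the raw samples collected at each sampling time t_i to an IN encoded by L alpha-cut intervals, and an upperNN-style module maps a window of N such INs to a predicted IN, i.e. a function f: F1^N → F1. The class names LowerNN and UpperNN, the sizes L_CUTS, WINDOW and SAMPLES, the plain multilayer-perceptron layers, and the endpoint encoding of an IN are all illustrative assumptions introduced here.

# Hypothetical sketch of a two-module metaStatINNN-style pipeline (assumptions only).
# An IN is encoded here by L_CUTS alpha-cut intervals, i.e. 2 * L_CUTS endpoint values.
import torch
import torch.nn as nn

L_CUTS = 8    # number of alpha-cuts per IN (assumption)
WINDOW = 4    # N: how many past INs the upper module sees (assumption)
SAMPLES = 64  # raw measurements per sampling time t_i (assumption)

class LowerNN(nn.Module):
    """Induces an IN (as L_CUTS interval endpoint pairs) from raw samples at one time t_i."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SAMPLES, 64), nn.ReLU(),
            nn.Linear(64, 2 * L_CUTS),
        )
    def forward(self, samples):                 # samples: (batch, SAMPLES)
        ends = self.net(samples).view(-1, L_CUTS, 2)
        # Enforce a_l <= b_l per alpha-cut; nesting across alpha levels is not enforced here.
        return torch.sort(ends, dim=-1).values

class UpperNN(nn.Module):
    """Maps a window of N past INs to one predicted IN, i.e. f: F1^N -> F1."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(WINDOW * 2 * L_CUTS, 64), nn.ReLU(),
            nn.Linear(64, 2 * L_CUTS),
        )
    def forward(self, ins):                     # ins: (batch, WINDOW, L_CUTS, 2)
        ends = self.net(ins.flatten(1)).view(-1, L_CUTS, 2)
        return torch.sort(ends, dim=-1).values

# Toy forward pass: WINDOW past distributions -> predicted IN for the next sampling time.
lower, upper = LowerNN(), UpperNN()
raw = torch.randn(WINDOW, SAMPLES)              # raw samples at t_1..t_N (one series)
past_ins = lower(raw).unsqueeze(0)              # shape (1, WINDOW, L_CUTS, 2)
predicted_in = upper(past_ins)                  # shape (1, L_CUTS, 2)
print(predicted_in.shape)

In this sketch both modules are trained end-to-end like ordinary neural networks; the actual upperNN of the paper is an INNN operating in the lattice (F1, ⪯), which the plain MLP above only stands in for.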

Citation

C. Bazinas, C. Lytridis, V. G. Kaburlasos, "Meta-statistical deep learning for stochastic time-series prediction in agricultural applications", International Joint Conference on Neural Networks (IJCNN 2023), Gold Coast, Queensland, Australia, 18-23 June 2023; presented at the 2023 International Neural Network Society (INNS) Workshop on Deep Learning Innovations and Applications (DLIA).