The labour-oriented, collective intelligence of ours: economic systems seen through the eyes of a neural network

This article explores and substantiates the hypothesis that capital investment and real output in national economies are instrumental to the optimization of labour markets, i.e. that human societies are oriented, most of all, on shaping their labour markets so as to assure an equilibrium between demographic growth and the set of social roles available. An original method, largely based on the Interface Theory of Perception, studies economic systems as manifestations of collective intelligence, learning by evolutionary tinkering with itself. An artificial neural network is used to study economic systems as Markov chains of states, and to discover their structuring σ-algebras, which turn out to be oriented mostly on optimizing the compensation of labour and the average workload per person per year. JEL: E01, E17, J01, J11


Introduction
The impact of technological change on the labour market is one of the hottest topics in economics. The extent of human labour made obsolete by robotization and artificial intelligence is still to be explored and discovered. Even the author of this article asks himself when such pieces of scientific writing will be routinely ghost-written by artificial neural networks. The latest World Development Report (WDR) by the World Bank (World Bank 2019) brings interesting observations in that respect. Apparently, at least so far, the fears of robots and digital technologies sweeping millions of jobs out of existence are unfounded. WDR 2019 brings evidence that not only do those new technologies not destroy jobs, but they also create whole new job categories and new sub-markets for human labour. Still, developing human capital is crucial: new jobs can appear, accompanying new technologies, when people acquire new skills. The most likely scenario for the global labour market is that, whilst new technologies can indeed create new jobs, people will have to adapt deeply in their lifestyles, including education and cross-job mobility. The already proverbial 'uberization' of the labour market is a fact.
Human societies can be studied as factories of social roles. When we grow demographically, the rising headcount of humans needs to be accommodated with new slots to fill in the social structure.
When there are many more new humans than social slots to fill, bad things, such as violent revolutions and wars, tend to happen. Over millennia, we have invented social contrivances supposed to make new social roles: cities, with their intense social interactions; educational systems; or, for example, the specifically European invention of quickly changing fashions in the way we dress, which fuelled some 400 years of intense technological change through the garment and textile industry, and thus created a stream of new occupations.
That generally anthropological assumption translates into a strong claim in the realm of economics: capital investment and real output in national economies are instrumental to the optimization of labour markets, i.e. human societies are oriented, most of all, on shaping their labour markets so as to assure an equilibrium between demographic growth and the set of social roles available. This is the working hypothesis of the present article, which distantly echoes the Keynesian doctrine of attaching the utmost importance to the proper health of the labour market. As formulated here, this hypothesis goes one step further and assumes that not only should governments take care of the labour market, but that entire societies do so as a matter of fact, just with varying efficacy. As one scratches the surface, controversial ramifications emerge. If desired social outcomes sum up to a balance in the labour market, the capital market could be considered subservient and instrumental to that end. That, in turn, goes against much of the established economic (or wannabe-economic) science, which claims a systemic bias of economic systems to the benefit of large capital-holders, and, correspondingly, to the detriment of the working class.
Underneath that working hypothesis as regards economics dwells another one, more generally anthropological. The strictly economic hypothesis of general economic equilibrium is generalized into the hypothesis of collective intelligence in human societies, with general economic equilibrium being one among many manifestations thereof. Human societies are studied in this article as intelligent structures, i.e. complex wholes able to learn and evolve by experimenting with many alternative versions of themselves, whilst staying internally coherent. Each such alternative version of the coherent whole is a one-mutation neighbour to the other versions, and collectively intelligent learning occurs through the selection of the fittest mutants, as regards collectively pursued social outcomes. In other words, societies are assumed to possess the capacity for social evolutionary tinkering (Jacob 1977) through tacit coordination, such that the given society displays social change akin to an adaptive walk in a rugged landscape (Kauffman & Levin 1987; Nahum et al. 2015). The headcount of the professionally active population (Table 9.1) grows steadily, more or less in correlation with total population. Yet, when that headcount is denominated over said population, the resulting average national coefficient of professional activity is strongly and positively Pearson-correlated with the share of aggregate amortization in GDP (r = 0.8902).
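The adaptive walk invoked here can be given a minimal computational form. The sketch below is a hedged illustration, not the article's model: it implements a Kauffman-style walk on a rugged, NK-type fitness landscape, where a binary "social configuration" of N traits repeatedly moves to its fittest one-mutation neighbour until a local optimum is reached. The fitness function and all names are assumptions made for illustration.

```python
import random

# Hedged sketch of an adaptive walk in a rugged, NK-style fitness landscape
# (after Kauffman & Levin 1987). The fitness table and parameter values are
# illustrative assumptions, not taken from the article.

N = 12  # number of traits in the binary configuration

random.seed(42)
# With K = 1, each trait's fitness contribution depends on itself and its
# right-hand neighbour, which makes the landscape rugged.
table = {(i, a, b): random.random()
         for i in range(N) for a in (0, 1) for b in (0, 1)}

def fitness(cfg):
    return sum(table[(i, cfg[i], cfg[(i + 1) % N])] for i in range(N)) / N

cfg = [random.randint(0, 1) for _ in range(N)]
for _ in range(1000):
    # Generate all one-mutation neighbours and keep the fittest one.
    neighbours = [cfg[:i] + [1 - cfg[i]] + cfg[i + 1:] for i in range(N)]
    best = max(neighbours, key=fitness)
    if fitness(best) <= fitness(cfg):
        break  # local optimum of the rugged landscape reached
    cfg = best
```

The walk ends on a local, not necessarily global, optimum, which is precisely why a collectively intelligent structure has to keep experimenting with many alternative versions of itself.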
What if we, as a civilization, were developing digital technologies in order to accommodate a growing population with new social roles in the job market? What if the observably growing burden of aggregate amortization over the current real output of the economy was one among many manifestations of that collective struggle? For many centuries, cities served as factories of new social roles. Still, urban land needs to stay in balance with its food base, i.e. with agricultural land: cities have limited hosting capacity. What if digital technologies were progressively taking over that traditional function of cities?

The coefficient of hours worked per person per year is informative both about the technologies used by working people, and about the workstyles built around those technologies. Interestingly, that coefficient seems to be inversely correlated with income; still, the correlation is international rather than intranational. When developing countries are compared with developed ones, the average adult in the former category works about 50% more per week than the average adult in the latter group. However, inside countries, the same type of correlation reverses, and one sees richer people working more than the relatively poorer ones (Bick et al. 2018).
Somewhat in the background of research on the coefficient of hours worked, strictly speaking, another question arises: do we adapt our balance between workload and leisure to our expected consumption, or do we do something else, i.e. form a balance between consumption and savings on the grounds of what the local labour market imposes on us? Interestingly, both approaches may be quantitatively robust (Koo et al. 2013). Research focused on the impact of institutional changes, e.g. new environmental regulations, upon the labour market suggests that the latter is much more adaptable than was assumed in the past, and has the capacity to offset the disappearance of jobs in some sectors by the creation of jobs in other sectors (Hafstead & Williams 2018).
Since David Ricardo, all the way through the works of Karl Marx, John Maynard Keynes, and those of Kuznets, economic sciences seem to have been treating the labour market as easily transformable in response to an otherwise exogenous technological change. The assumption is that technological change brings greater productivity, and that technology has the capacity to bend social structures. In this view, work means executing instructions coming from the management of business structures; in other words, human labour is supposed to be subservient and executive in relation to technological change. Still, the interaction between technology and society seems to be mutual, rather than unidirectional (Mumford 1964; McKenzie 1984; Kline and Pinch 1996; David 1990; Vincenti 1994). The relation between technological change and the labour market can thus be restated in the opposite direction. There is a body of literature which perceives society as an organism, and social change as a complex metabolic adaptation of that organism. This channel of research is applied, for example, in order to apprehend the energy efficiency of national economies. The so-called MuSIASEM model is an example of that approach, claiming that complex economic and technological change, including transformations in the labour market, can be seen as a collectively intelligent change towards the optimal use of energy (see for example: Andreoni 2017; Velasco-Fernández et al. 2018). Work can be seen as a fundamental human activity, crucial for the management of energy in human societies. The amount of work we perform creates the need for a certain caloric intake, in the form of food, which, in turn, shapes the economic system around us, so as to produce that food. This is a looped adaptation, as, in the long run, the system supposed to feed humans at work relies on this very work.

The method and the dataset
The here-introduced methodology attempts at testing the general hypothesis of collective intelligence, and the specific hypothesis of labour-oriented general equilibrium. The central methodological thread consists in doing something which Bosanquet was extremely sceptical about, i.e. assessing what people collectively want (Bosanquet 1920). If human societies are intelligent structures, i.e. if they are able to learn by experimenting with many alternative versions of themselves and by selecting the fittest ones, those same societies need a gauging criterion to assess fitness. The here-presented method aims at finding that criterion, equating it to a collectively pursued social outcome. The intermediate goal is to reconstruct, out of quantitative macroeconomic data, the process of collective learning as regards the working hypothesis stated in the introduction. That general methodological thread is unwrapped by using an artificial neural network as a mathematical representation of an intelligent structure. This methodological path requires clearing two hurdles, namely the extent to which quantitative socio-economic data can be considered as an empirical manifestation of collective intelligence, and the possibility for an artificial neural network to be a logical representation of such a collective intelligence.
When we do economics, we essentially study human behaviour; at the bottom line, economic sciences are behavioural sciences. Yet, when we have a careful look at the type of data we work with in empirical economics, we deal with descriptive information about aggregate outcomes of human behaviour, rather than with manifestations of behaviour strictly speaking. The economist faces cognitive limitations akin to those of quantum physics. Individual behaviour can be observed in great detail, which, in terms of physics, corresponds to a known position. Yet, the more detailed the observation of idiosyncratic properties, the harder it is to locate those individual cases in a spectrum of probability. We can forego some of the idiosyncrasies, which we know are there, and simplify the observed cases so as to fit them into a standardized distribution of probability.
When we observe, through the lens of quantitative variables, a body of collective human behaviour, do we observe many distinct phenomena or just one structured phenomenon, which we epistemically split into separate observables? The question digs into the foundations of empirical sciences, as outlined by William James (James 1912). How does phenomenological unity, or, conversely, diversity, matter for empirical economic research? When we assume that quantitative variables correspond to rigorously distinct phenomena, the baseline hypothesis is the null one, i.e. the absence of correlation. Should phenomenological unity be assumed, it is reasonable to postulate correlation, and the null hypothesis is just a methodological trick in statistics. Thus, the hypothesis of phenomenological unity further leads to assuming the working of a collective intelligence behind the many distinct, quantitative observables of social sciences.

To what extent, and how exactly, does the intelligent logical structure of an artificial neural network represent the collective intelligence of human societies? Artificial intelligence can be approached as the general science of intelligence, as it allows testing the abstract concepts we make as regards our own mind (Sloman 1993). At this point, it is useful to demystify the logic of artificial neural networks. Data scientists and programmers like to say that artificial intelligence is a black box: we know it does something, we like the outcomes, yet we never quite understand how those outcomes happen, and that forced ignorance makes the engineering of AI so experimental and laborious. Yet, a few patterns are firmly drawn. An artificial neural network always performs an adaptive walk in a rugged landscape: it learns by producing many alternative versions of itself, which, whilst staying internally coherent, demonstrate various degrees of fitness as projected against a desired outcome.
Besides defining the exact structure and sequence of equations in a piece of AI, and in the presence of any empirical dataset, the programming of neural networks passes through the fundamental distinction between input variables and output ones. In other words, we distinguish between the valuable outcomes pursued, and the adaptive changes instrumentally serving those outcomes. That distinction can be made arbitrarily or empirically. In the latter case, before using a neural network to optimize a given variable, one can empirically select the variable which the network is the most capable of optimizing with the given structure and sequence of equations. This is precisely the property of artificial neural networks which the present methodology utilizes to check the working hypothesis stated in the introduction.

Whilst seeming pertinent as regards the core issue of this article, i.e. the labour market, the 'No-Empty-Cells' set is representative of relatively big economies, both GDP-wise and population-wise, with a lower-than-average propensity to consume, and slightly faster inflation. It is also to be kept in mind that, timewise, this set is slightly sloping towards relatively recent years and, for example, does not cover the earliest years of the source database.

In each S_i, data is standardized, one variable is selected as the output one, whilst the remaining 40 variables are aggregated into m = 3008 perceptual vectors 'h', as in equation (2). In the j-th perceptual vector h_j, the standardized value of the i-th input variable x_i is incremented with the error e_(j-1), propagated from the preceding, (j-1)-th experimental round of learning, and the incremented value is then weighed with two parameters. The E(x_i,j) parameter is endogenous to the network: it is the Euclidean distance between the input variable x_i and the remaining 39 input variables in the same j-th experimental round.
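The round-by-round mechanics just described can be sketched numerically. The snippet below is a hedged reconstruction: the exact functional form of equation (2), the form of E(x_i,j), the activation function, and all variable names are assumptions made for illustration, not taken from the article's workbook.

```python
import numpy as np

# One experimental round j of the perceptual-vector scheme, as a sketch.
# The multiplicative combination of (x + e_prev), E and R is an assumed
# reading of equation (2), not the article's exact formula.

rng = np.random.default_rng(0)

n_inputs = 40
x = rng.normal(size=n_inputs)   # standardized input variables in round j
e_prev = 0.1                    # error propagated from round j-1

# E(x_i, j): Euclidean distance between x_i and the remaining 39 inputs,
# representing the internal coherence of the system.
E = np.array([np.sqrt(np.sum((x[i] - np.delete(x, i)) ** 2))
              for i in range(n_inputs)])

# R ~ U([0,1]): quasi-random weights forcing the network to experiment
# with different magnitudes of importance attached to input variables.
R = rng.uniform(0.0, 1.0, size=n_inputs)

h = (x + e_prev) * E * R        # the j-th perceptual vector h_j

# A neuron would then aggregate h_j, compare the activation with the
# output variable, and propagate the new error e_j into round j+1.
output_target = 0.5             # illustrative standardized output value
activation = np.tanh(h.mean())  # assumed activation function
e_j = output_target - activation
```

Each of the m = 3008 rounds would repeat this computation, so the network's successive states form exactly the chain of one-mutation neighbours discussed earlier.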
The presence of E(x_i,j) in equation (2) represents the internal coherence of the social system studied: as has already been stated before, all the variables studied are assumed to show different quantitative aspects of essentially the same set of behavioural patterns in humans. The R ~ U([0,1]) parameter is exogenous and quasi-random, drawn from the interval between 0 and 1. It forces the network to experiment with different magnitudes of importance attached to the input variables. The fit between the original dataset X and each given set S_i is measured as the Euclidean distance between their respective vectors of mean expected values, as in equation (4).

Results and final discussion

Table 2 and Figure 1, in the Appendix, document the results of the simulation run with the above-described method. Table 2 presents the Euclidean distances, as computed with equation (4), between the original dataset X and the 41 transformations S_i thereof. Figure 1 visualizes those distances, which can be interesting to the extent that some among the 41 sets S_i display apparently absurd mean values, e.g. negative aggregate real output. The link to that file can be found in the reference list. The author attempts to give as exhaustive an interpretation of those results as possible, yet the reader is welcome to study them directly.
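The selection logic behind equation (4) can be sketched as follows. Synthetic data stand in for PWT 9.1, and the way each S_i is generated here, as a perturbed copy of X, is a placeholder assumption for "run the network with variable i pegged as output":

```python
import numpy as np

# Sketch of the selection step: each candidate output variable i yields a
# transformed dataset S_i, and the transformations are ranked by the
# Euclidean distance between the vector of mean expected values of the
# original set X and that of S_i (equation 4). All data are synthetic.

rng = np.random.default_rng(1)

n_obs, n_vars = 3008, 41
X = rng.normal(size=(n_obs, n_vars))   # standardized original dataset

def distance_to_original(X, S_i):
    # Equation (4): Euclidean distance between vectors of column means.
    return float(np.linalg.norm(X.mean(axis=0) - S_i.mean(axis=0)))

distances = {}
for i in range(n_vars):
    # Placeholder for "run the network with variable i as the output":
    # here S_i is simulated as a perturbed copy of X.
    S_i = X + rng.normal(scale=0.1 * (i + 1) / n_vars, size=X.shape)
    distances[i] = distance_to_original(X, S_i)

# The variable whose S_i stays closest to X is the one the network
# optimizes best, i.e. the candidate collectively pursued outcome.
best_variable = min(distances, key=distances.get)
```

In the article's actual results, that ranking is what singles out LABSH and AVH as the variables the network reproduces most faithfully.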
On the other hand, the workbook entitled 'PWT 9_1 Perceptron pegged AVH', referenced as a link in the bibliographical list further below, introduces the database used for this empirical research, as well as the logical structure of the neural network, according to equations (1) to (4). Among the regressions used to cross-check those results, the one explaining the PL_K variable is relatively the weakest.
As multiple regressions cross-check the results yielded by the neural network, the structuring power of LABSH and AVH, as indicated by the latter, seems to be something essentially different from the explanatory power, strictly speaking, of regressions built around these metrics. There is something else in the game, and that something could be the collective intelligence of human societies acting as factories of social roles. The macroeconomic metrics of labour compensation (LABSH) and average workload (AVH) translate into a broader social balance between the energy required to play social roles, and the socially sanctioned rewards paid to those roles.
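A regression cross-check of this kind can be sketched in a few lines. This is a hedged illustration with synthetic data, not the article's actual regressions; the column standing in for LABSH and all index choices are assumptions:

```python
import numpy as np

# Cross-checking sketch: regress a candidate outcome variable (the column
# standing in for LABSH) on the remaining variables by OLS and report
# R-squared. Synthetic data; column indices are illustrative assumptions.

rng = np.random.default_rng(2)
data = rng.normal(size=(3008, 41))      # synthetic stand-in for the dataset

y = data[:, 0]                          # stand-in for LABSH
# Intercept column plus the 40 remaining variables as predictors.
Z = np.column_stack([np.ones(len(data)), data[:, 1:]])

beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
resid = y - Z @ beta
r_squared = 1.0 - resid.var() / y.var()
```

The point of the contrast drawn above is that a high or low r_squared for LABSH says nothing, by itself, about whether the network reproduces the whole dataset best when pegged on LABSH; the two measures answer different questions.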
Should the here-presented research be taken at face value as a basis for economic policies, the Keynesian imperative of stabilizing the job market takes a different shade. Apparently, labour markets optimize themselves anyway. Whatever capital resources are put at the disposal of the here-studied national labour markets, whether those resources are channelled fiscally or monetarily, they seem to be aligned with the basic imperative of balancing workload and work compensation.

Temporal scope of the 'No-Empty-Cells' database selected from PWT 9.1 for processing with the neural network (hyphenated values are the respective numbers of national observations for each given year): 1954 - 25; 1955 - 28; 1956 - 28; 1957 ÷ 1963 - 29 each; 1964 ÷ 1969 - 32 each; 1970 ÷ 1979 - 42 each; 1980 - 43; 1981 ÷ 1985 - 44 each; 1986 - 45; 1987 ÷ 1989 - 46 each; 1990 - 47; 1991 ÷ 1992 - 49 each; 1993 - 50; 1994 - 54; 1995