In order to accompany the considerable advances in this field made by the Decision Analysis Group at SEEL-Systems Engineering Economics Lab, all sections of this website will undergo updates during July 2021.
Locational State Theory is based on the analysis of datasets of object properties, taking into account the divergence (variance) and convergence (equivalence) of property values associated with the specific location of objects in space-time.
"The components of Location-State Theory are not particularly original but it pulls together already existing knowledge from a wide range of disciplines and the output of countless practitioners, into a single system."
"It provides a model that integrates the interactions of human, social, economic, financial, technological, environmental, ecosystem, climatic and sustainability factors."
"The particular focus on the significance of the space-time dimension to important relationships can contribute to a better identification of options of significance in solving pressing practical problems facing the human population..."
The term locational-state refers to a location in space-time; the state is the corresponding value of a property at that location. LST is concerned with a general canonical form: a natural or unique representation of an object that includes the space-time dimension.
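As an illustrative sketch only (the source does not define a formal schema, so every field name below is an assumption), a locational-state record can be modelled as a property value tied to space-time coordinates:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LocationalState:
    """One observation: a property value tied to a space-time location.

    All field names are illustrative, not an official LST schema.
    """
    longitude: float   # degrees
    latitude: float    # degrees
    altitude: float    # metres above sea level
    timestamp: str     # ISO-8601 date of observation
    prop: str          # property name, e.g. "crop_yield"
    value: float       # the state: the property's measured value

# Two observations of the same property at different locations diverge:
a = LocationalState(-0.12, 52.2, 15.0, "1968-07-01", "crop_yield", 4.1)
b = LocationalState(-122.17, 37.43, 30.0, "1968-07-01", "crop_yield", 5.6)
divergence = abs(a.value - b.value)  # divergence of state across locations
```

Equal values at distinct coordinates would, in the same spirit, represent the convergence (equivalence) side of the analysis.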
This advancing domain developed from an initial concern with the exact specification and communication of information requirements as well as the validation that information, being sent in response to a requirement, is what had been requested.
This work was initiated by Hector McNeill1, a Senior Scientific Officer with the Information Technology & Telecommunications Task Force (ITTTF) of the European Commission in Brussels, in 1985.
McNeill was concerned with systems strategies associated with the evolution of a global network and the development of applications of benefit to society. This was a decade before the World Wide Web became a reality, but it was already clear at that time that the Internet would become the foundation network for this transition. McNeill foresaw the current issues related to the circulation of poor-quality information, as well as intentionally biased information, making decision-making less effective, erroneous and therefore risky. On the side of risk, the antagonism that can arise from misleading information, and the recriminations arising from poor decisions by policy-makers affecting millions of constituents, are states of affairs that should and can be avoided. At the extreme, misleading information can lead to conflict2
However, Locational State Theory did not start out with any intent to establish a new theory; it was only intended to provide a simple way to establish unambiguous specifications of the datasets required to satisfy specific information requirements. In fact, this process started out to answer a simple question: "How can we safeguard against misrepresentation in information received over a global network in order to be sure that we can base decisions on the information?"
Hector McNeill was trained as an agronomist at Cambridge University and holds post-graduate degrees in economics and systems engineering from Cambridge and Stanford Universities. McNeill's development of this domain relates to work he carried out in 1968 at Stanford concerning earth resources and the relationships between agriculture, forestry and bioclimatic zones. He noted, during this research, that the majority of published agricultural experimentation results had tended to restrict analysis to a limited number of factors to establish simple relationships, such as between the availability of water and crop yields, or fertilizer inputs and crop yields. Invariably, these papers excluded consideration of the natural complexity of agricultural systems and the impact of natural resource factors such as temperature, altitude and actual water availability, and of how the natural environmental setting and seasons alter these relationships significantly from year to year. Therefore most experimental results were partial analyses of reality.
He also noted the same issue in relation to macroeconomics and development economics, where the application of the "ceteris paribus" ("all other factors remaining at a fixed value") condition severely limited sight of the full operational complexity of economies. He considered this inadequate as a basis for developing economic theory used to design economic policy decision-making affecting the lives of constituents. McNeill realized that statistical methods were compromised by the reality of researchers having to trade off data precision, the number of factors analyzed and the costs of carrying out surveys or experimentation; this invariably resulted in a limitation on the number of factors investigated, leading to only a limited insight into reality, and this undermined the practical utility of the results. In statistical terms, the analysis of variance generated high levels of "explained variance", and therefore significance or low error, within samples, while at the same time generating unacceptable levels of "unexplained variance" between samples when the experimental plots or survey targets were distributed over geographic distances, or even when the experimental plots were limited to one location but the data sets were collected in different years (production years). Therefore, as a result of a lack of resources and time, or of interest on the part of the researcher, the data required to explain significant systemic effects, arising from the effects of variations in nature linked to locational-states, were not commonly collected.
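The within- versus between-sample effect described above can be sketched with a standard one-way analysis-of-variance decomposition. This is a generic textbook illustration with invented yield figures, not McNeill's own computation:

```python
# One-way ANOVA decomposition: total variation splits into a
# between-group part (across locations or years) and a within-group
# part (replicates at one location). Yield data are invented.
groups = {
    "site_A": [4.0, 4.2, 4.1, 3.9],   # tight within-site spread...
    "site_B": [6.1, 6.3, 6.0, 6.2],   # ...but the sites differ strongly
}
all_obs = [y for ys in groups.values() for y in ys]
grand_mean = sum(all_obs) / len(all_obs)

ss_between = sum(len(ys) * (sum(ys) / len(ys) - grand_mean) ** 2
                 for ys in groups.values())
ss_within = sum((y - sum(ys) / len(ys)) ** 2
                for ys in groups.values() for y in ys)
ss_total = sum((y - grand_mean) ** 2 for y in all_obs)

# Large between-site variation relative to within-site variation is the
# signature of location-linked determinants missing from the data set:
# the experiment "explains" each site well but not the gap between them.
```

Here `ss_between` dominates `ss_within`, so most of the variance is left unexplained by the measured factors alone, which is precisely the between-sample problem described in the text.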
Based on McNeill, H. W., "3DPF-Three dimensional production function", TP, Food Research Institute, Stanford University, 1968.
As a result of this reasoning, McNeill proposed the existence of "complete" and "incomplete" data sets, according to the degree to which the data elements within the set can explain all of the observed variance. This has significant implications for statistical analysis and, with the advance of knowledge, facilitates the calculation of the opportunity costs of not collecting complete data sets. Even today, researchers limit budgets according to the ability to investigate a limited range of interactions of interest which are the subject of some hypothesis to be tested. However, in terms of the practical utility of experimental results, the consideration of completeness or incompleteness with respect to dominant determinants is of fundamental importance.
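The completeness criterion can be stated as a simple ratio. The figures and the classification threshold below are assumptions for illustration, not values given in the source:

```python
def explained_fraction(ss_explained: float, ss_total: float) -> float:
    """Share of observed variance that the data set's factors account for."""
    return ss_explained / ss_total

# Hypothetical sums of squares from a factor analysis:
fraction = explained_fraction(ss_explained=72.0, ss_total=100.0)
is_complete = fraction >= 0.95  # illustrative threshold, not from the source

# fraction == 0.72, so this data set would count as "incomplete": 28% of
# the observed variance arises from uncollected, likely location-linked,
# determinants, and that gap is the opportunity cost of the missing data.
```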
As a result of these facts, the critical information necessary to take rational decisions was often not available.
McNeill therefore attempted to develop a decision analysis approach based on a decision analysis model, 3DPF (three-dimensional production function), as opposed to a statistical approach. In 1968 he created a macroeconomic model and, later, a version for agricultural production at the farm level (see left).
Although starting with just three determinants, this generic approach has since been extended to include any number of determinants, becoming the MFPF (multi-factor production function). It can include learning curve properties and time-based growth, as well as modifications in technologies, and can accommodate changes in the proportions of inputs as technologies change. The model maintains the correct relationship between increasing inputs and diminishing marginal returns but also relates this to crop genetics as well as machine capacity design and maintenance history. These models generate copious tabulations, somewhat like Monte Carlo simulations. Subsequent computer-generated tabulations created the first demonstrations of the foundation of dynamic decision analysis models able to simulate outputs as well as to create coherent linkages between microeconomic units and the macroeconomy. The shape of the 3D projections is very similar to the Cobb-Douglas production function, but the formula is completely different, being based on reciprocal mathematical logic developed by McNeill for this purpose.
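McNeill's actual reciprocal formula is not reproduced in the source, so the sketch below uses a generic saturating (reciprocal-form) response per factor purely to illustrate the behaviour the text says the model preserves: output rises with each input but with diminishing marginal returns. All names, parameters and values are hypothetical:

```python
def response(x: float, y_max: float, k: float) -> float:
    """Saturating response: output rises with input x with diminishing
    marginal returns, approaching the ceiling y_max. This reciprocal
    form is an illustrative stand-in, NOT McNeill's 3DPF/MFPF formula,
    which is not given in the source text."""
    return y_max * x / (k + x)

def mfpf_sketch(inputs: dict, params: dict, potential: float) -> float:
    """Multi-factor sketch: combine per-factor responses, each expressed
    as a fraction of its ceiling, multiplicatively against an overall
    yield potential (hypothetical structure)."""
    frac = 1.0
    for name, x in inputs.items():
        y_max, k = params[name]
        frac *= response(x, y_max, k) / y_max
    return potential * frac

params = {"water": (1.0, 50.0), "fertilizer": (1.0, 30.0)}
low = mfpf_sketch({"water": 50, "fertilizer": 30}, params, potential=10.0)
high = mfpf_sketch({"water": 500, "fertilizer": 300}, params, potential=10.0)
# Diminishing marginal returns: tenfold inputs yield far less than
# tenfold output, and output never exceeds the potential ceiling.
```

Tabulating such a function over grids of inputs is one way to generate the kind of copious output tables the text describes.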
Later, in 1975 he initiated the work leading to the so-called real incomes approach to economics, now recognized as RIO-real incomes objective theory and practice. This is a fully-fledged macroeconomic theory founded on microeconomic principles and associated policies supported by policy instruments. McNeill continues to lead the development of this alternative school of economic thought.
More recently, locational-state theory has been justified as a general theory in terms of its applicability to most human activities and natural ecosystems, giving rise to the recognition of specific laws of relationships. Locational-state mathematical logic is able to integrate animate-inanimate and man-machine relationships, which helps in analyzing the limits to productivity and points to strategies for progressing beyond current productivity limitations. Locational-state theory is one of the few approaches that combine explicit and tacit knowledge into a single system designed to drive technical and economic innovation.
This approach has important implications for decision analysis by creating a close relationship between determinant decision analysis models and the current state of knowledge on relevant cause-and-effect relationships, probabilities of events and information quality.
The word Locational is not an English word in common usage, but it acts as an adjective with the syntactic role of qualifying the properties that describe an object, or State, as being dependent on space-time coordinates, expressed as geographic (longitude, latitude and altitude) and time coordinates.
1 Hector McNeill is director of SEEL-Systems Engineering Economics Lab. SEEL was established by McNeill in 1983 and it is now a specialized unit of the George Boole Foundation Limited. He is a graduate of Cambridge University and completed post-graduate studies at Cambridge and Stanford Universities.
In his own words, McNeill explained: "My work at the time was concerned with the development of initiatives for information applications. My own brief was to identify ways to establish broad civilian benefits from a wider use of a global communications network. The basic theme was the convergence of all then-current analog and digital applications into a single digital standard. I was also asked to develop 'learning systems' initiatives. However, I am afraid that the concept generally held was that 'learning systems' consisted of course preparation modules and online university tuition, like the Open University model in the UK. However, by that time (1985) I had been developing the Real Incomes Approach to economics for more than a decade. This approach to macroeconomics is designed to promote technological change and the development of human technique, learning, the accumulation of tacit knowledge and improvement in the quality of explicit knowledge. This is the very human foundation for the dissemination of innovation. These human traits account for 60% of economic growth and the improvement of human welfare. Therefore, for me, a learning system is the organization of society so as to maximize the acquisition and use of advancing knowledge by all people at any life stage. This learning system is an ongoing organism that supports us throughout our life."
"This approach, however, faces many challenges. We all know that part of the process of learning is trial and error but the better the quality of information and sharing of experience not only cautions us on what to avoid but also helps us deploy decision analysis leading to more successful outcomes. If we receive defective information the process of human advance is debased and corrupted. Indeed in many transactions, the art of lying by misrepresentation or omission of key facts is assumed, by some, to be part of the way they gain advantage over others. This creates problems for those who for philosophical or religious reasons believe that a better state to be in is one where all can take decisions based on an ability to trust that supplied information is factual. This, of course, would lead to a better condition for mankind supported by mutual consent to be established within constitutional frameworks that are designed to uphold this condition by protecting society from the abuse of misrepresentation and imposing sanctions on those whose behavior prejudices others".
2 Today this has become more obvious in the political sphere and in the so-called "Fake News" battle that erupted (2016-2017) between social, alternative and "mainstream" media, all of which have significant problems with information quality and reliability.
3 Recent (November 2017) US Congressional hearings concerning the use of social media services such as Google, Facebook and Twitter by "foreign agents" are a case study of a completely structureless interaction: the posing of wholly deficient questions by Congress members, reflecting pre-conceived ideas of what was happening and a lack of understanding of how such systems work, combined with deficient responses from these organizations.