
A New Mathematical Tool For Artificial Intelligence Borrowed From Physics

This research aims to deepen our understanding of, and our mathematical control over, the “natural” (i.e. spontaneous/emergent) information-processing skills shown by Artificial Intelligence (AI), namely by neural networks and learning machines. AI is currently experiencing a “magic moment”: theorists finally have access to the “big data” needed to train these networks, so their capabilities can be checked concretely.

Among a plethora of variations on the theme, a family of algorithms collectively termed “Deep Learning” is showing impressive successes in several fields, ranging from scientific applications (e.g. statistical learning and feature extraction from high-dimensional data in health care) to more applied ones (e.g. image and video processing, or natural language processing). As an immediate consequence of these recent triumphs (see e.g. [1]), the quest for deeper (mathematical) control of these systems keeps growing, and we aim to contribute to a “rationale” for Deep Learning by taking advantage of methods and techniques typical of Theoretical Physics.
Indeed, over the past decades Theoretical Physics has been heavily involved in the mathematical formalization of the emergent/spontaneous properties shown by neural networks, such as distributed memory, pattern classification, feature extraction, multitasking capabilities, and much more. The bulk of contributions came from Statistical Mechanics and Stochastic Processes: the former has been used to paint the “phase diagrams” of crucial networks in statistical learning (e.g. restricted Boltzmann machines) and in pattern recognition (e.g. Hopfield neural networks), while the latter has been naturally adapted to describe the dynamical evolution of the (artificial) neurons and synapses that build up these networks.

We have proved that the mathematical frameworks stemming from Theoretical Physics can be enlarged to include classical and relativistic mechanics as well. In particular, for these models (i.e. Boltzmann machines and Hopfield networks), the variational principle underlying the minimization of a cost function (in their learning/retrieval algorithms) can be shown to coincide exactly with the Least Action Principle. As a natural consequence, the equations governing the evolution of the order parameters (e.g. the Mattis overlaps with the stored patterns) in the space of the tunable parameters (e.g. noise level, load of the network) coincide with the equations of motion prescribed by Lagrangian Mechanics, and this allows us to draw a number of conclusions.
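To make the order parameter mentioned above concrete: in a standard Hopfield network, the Mattis overlap m_mu = (1/N) Σ_i ξ_i^mu σ_i measures how closely the current neural configuration σ matches the stored pattern ξ^mu, and retrieval means one overlap approaches 1. The following minimal sketch (not the paper’s code; sizes, seed, and the simple zero-noise synchronous dynamics are illustrative assumptions) builds a small Hopfield network with Hebbian couplings and tracks the overlaps during retrieval:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 400, 3  # N neurons, P stored patterns (illustrative, low-load regime)

# Random binary patterns xi^mu with entries in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def mattis_overlaps(sigma):
    """Order parameters m_mu = (1/N) sum_i xi_i^mu sigma_i, one per pattern."""
    return xi @ sigma / N

# Start from a corrupted copy of pattern 0 (15% of spins flipped) ...
sigma = xi[0].copy()
flip = rng.random(N) < 0.15
sigma[flip] *= -1

# ... and relax with noiseless synchronous updates: sigma_i <- sign(sum_j J_ij sigma_j)
for _ in range(20):
    sigma = np.sign(J @ sigma)
    sigma[sigma == 0] = 1

m = mattis_overlaps(sigma)  # m[0] should be close to 1: pattern 0 is retrieved
```

At low load (P much smaller than N) the dynamics flows to the stored pattern, and m[0] rises from roughly 0.7 at the corrupted start to essentially 1; it is the trajectory of such overlaps that the paper recasts as Lagrangian equations of motion.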

First, this bridge between the mathematics underlying a rationale for AI and Lagrangian mechanics allows researchers working in machine learning and neural networks to import an arsenal of ready-to-use mathematical tools: for instance, dynamical instabilities of the network’s evolution can now be inspected by classical Hopf bifurcation theory, and conserved quantities (if present) can be studied by inspecting symmetries à la Noether.
Then, focusing on the current priority in AI research, namely Deep Learning, the perspective we offer makes it clear that Boltzmann machines and Hopfield networks play only the role of the “classical limit” (storing just the pairwise correlation functions of the learned/retrieved patterns of information) of a much broader theory (i.e., the relativistic extension), in which all the higher-order correlation functions are properly accounted for. It is worth pointing out that the relativistic generalization shows several “deep-learning-like” skills: beyond developing the general mechanical approach to neural networks, in the paper we show extensively (both analytically and numerically) how the relativistic extension outperforms its “classical limit”.
The next steps in this branch of research will be a systematic exploration of the proposed relation between AI and Lagrangian Mechanics, with the hope that this analogy can act as a little Pandora’s box: we plan to report our findings soon.
These findings are described in the article entitled “A new mechanical approach to handle generalized Hopfield neural networks,” recently published in the journal Neural Networks. This work was conducted by Adriano Barra from the Università del Salento, INFN (Istituto Nazionale di Fisica Nucleare), and GNFM-INdAM (Gruppo Nazionale per la Fisica Matematica), and by Matteo Beccaria and Alberto Fachechi from the Università del Salento and INFN.
Reference:
  1. LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. “Deep learning.” Nature 521.7553 (2015): 436-444.
