## Math Calendar

Vertex models are models of statistical mechanics on a lattice in which the Boltzmann weights are associated with the vertices of the lattice. Quantities associated with such a system, such as the energy, may diverge when the size of the lattice becomes infinite, and renormalisation techniques are then used to try to overcome this difficulty. The problems occurring for models of statistical mechanics are similar in nature to divergence problems in quantum field theory. Kontsevich has suggested an approach to the renormalisation question in quantum field theory that would in particular imply the following statement, which remains not fully precise since there is as yet no satisfactory definition of what a QFT is: ‘For any QFT on R^d that is invariant under the action of the group of parallel translations and dilations, translation-invariant forms with values in local fields form an algebra over the little discs operad E_d.’ Today we shall describe a more precise conjecture made by Kontsevich in the case of vertex models. This work is joint with Damien Calaque.

Following a paper by Singer & Sternberg, I will present an application of the framework of Exterior Differential Systems (and of the Cartan–Kähler Theorem, in particular) to the theory of G-structures. I will prove that, in the real-analytic category, if two G-structures have constant and equal intrinsic torsion, then they are locally equivalent (i.e. isomorphic). As a corollary, we will conclude that the so-called transitive G-structures are precisely those with constant intrinsic torsion.

Abstract:

Artificial Neural Networks (ANNs) are used as universal approximators and are being widely adopted in several fields, such as finance.

One of the problems that can be addressed using ANNs is the forecasting of time series. Time series forecasting is known to be a difficult problem, often requiring expert knowledge, and arises in applications including stock-value prediction, sales forecasting, and inventory management. We explore how the sparsity that occurs in trained ANNs can be used to generalize the network topology to any Directed Acyclic Graph (DAG). We show how both the Feed-forward Neural Network (FNN) and Recurrent Neural Network (RNN) topologies can be generalized in both training and prediction. Finally, we train these network architectures on benchmark problems and use them to forecast time series.
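As an illustration only (this is not the speakers' implementation, and every name in it is hypothetical), the DAG generalization described above amounts to evaluating a feed-forward pass in topological order, so that edges need not respect a layered structure:

```python
import math

def dag_forward(values, edges, weights, biases, order, activation=math.tanh):
    """Evaluate a feed-forward pass over an arbitrary DAG.

    Each non-input node sums the weighted values of its predecessors,
    adds its bias, and applies the activation.  `order` must be a
    topological ordering of the nodes, so every predecessor is
    evaluated before its successors.
    """
    for node in order:
        preds = [u for (u, v) in edges if v == node]
        if not preds:
            continue  # input node: its value is already given in `values`
        total = biases[node] + sum(weights[(u, node)] * values[u] for u in preds)
        values[node] = activation(total)
    return values

# Toy DAG with a skip connection x -> y that a strictly layered FNN lacks:
edges = [("x", "h1"), ("x", "h2"), ("h1", "y"), ("h2", "y"), ("x", "y")]
weights = {e: 0.5 for e in edges}
biases = {"h1": 0.0, "h2": 0.0, "y": 0.1}
out = dag_forward({"x": 1.0}, edges, weights, biases, ["x", "h1", "h2", "y"])
```

A layered FNN is the special case where edges only connect consecutive layers; pruning weights to zero (the sparsity mentioned above) and dropping the corresponding edges yields a general DAG of this kind.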