BEGIN:VCALENDAR
VERSION:2.0
CALSCALE:GREGORIAN
PRODID:UW-Madison-Physics-Events
BEGIN:VEVENT
SEQUENCE:2
UID:UW-Physics-Event-7952
DTSTART:20221102T160000Z
DTEND:20221102T171500Z
DTSTAMP:20230529T185715Z
LAST-MODIFIED:20221031T034555Z
LOCATION:Online Seminar: Please sign up for our mailing list at www.physicsmeetsml.org for zoom link
SUMMARY:Bayesian Updating and dynamical flows\, Physics ∩ ML Seminar\, David Berman\, Queen Mary University
DESCRIPTION:Statistical inference is the process of determining a probability distribution over the space of parameters of a model given a data set. As more data becomes available\, this probability distribution is updated via the application of Bayes’ theorem. We present a treatment of this Bayesian updating process as a continuous dynamical system. Statistical inference is then governed by a first-order differential equation describing a trajectory or flow in the information geometry determined by a parametric family of models. We solve this equation for some simple models and show that when the Cramér-Rao bound is saturated\, the learning rate is governed by a simple 1/T power law\, with T a time-like variable denoting the quantity of data. We illustrate this with both analytic and numerical examples based on Gaussians and the inference of the coupling constant in the Ising model. Finally\, we compare the qualitative behaviour exhibited by Bayesian flows to the training of various neural networks on benchmarked data sets such as MNIST and CIFAR10 and show that for networks exhibiting small final losses the simple power law is also satisfied.
URL:https://www.physics.wisc.edu/events/?id=7952
END:VEVENT
END:VCALENDAR