Events at Physics


Events During the Week of May 31st through June 7th, 2020

Monday, June 1st, 2020

Thesis Defense
Stability of Diffuse Optical Tomography in the Bayesian Framework
Time: 1:00 pm
Place: Virtual
Speaker: Kit Newton, Physics PhD Graduate Student
Abstract: Join Zoom Meeting. Meeting ID: 827 9599 3976; Password: thesis. One tap mobile: +13126266799,,82795993976#,,1#,240150# US (Chicago) or +16465588656,,82795993976#,,1#,240150# US (New York)
Host: Lisa Everett and Qin Li, Faculty Co-Advisors

Tuesday, June 2nd, 2020

Network in Neutrinos, Nuclear Astrophysics, and Symmetries (N3AS) Seminar
Exposing the astrophysical conditions of r-process events through observable signatures of lanthanide and actinide production
Time: 3:30 pm - 4:30 pm
Place:
Speaker: Nicole Vassh, Notre Dame
Host: Baha Balantekin

Wednesday, June 3rd, 2020

Physics ∩ ML Seminar
Why do neural networks generalise in the overparameterised regime?
Time: 11:00 am - 12:00 pm
Place: Please register for this online event:
Speaker: Ard Louis, University of Oxford
Abstract: One of the most surprising properties of deep neural networks (DNNs) is that they typically perform best in the overparameterised regime. Physicists are taught from a young age that having more parameters than datapoints is a terrible idea. This intuition can be formalised in standard learning theory approaches, based for example on model capacity, which also predict that DNNs should heavily over-fit in this regime, and therefore not generalise at all. So why do DNNs work so well? We use a version of the coding theorem from Algorithmic Information Theory to argue that DNNs are generically biased towards simple solutions. Such an inbuilt Occam’s razor means that they are biased towards solutions that typically generalise well. We further explore the interplay between this simplicity bias and the error spectrum on a dataset to develop a detailed Bayesian theory of training and generalisation that explains why and when SGD trained DNNs generalise, and when they should not. This picture also allows us to derive tight PAC-Bayes bounds that closely track DNN learning curves and can be used to rationalise differences in performance across architectures. Finally, we will discuss some deep analogies between the way DNNs explore function space, and biases in the arrival of variation that explain certain trends observed in biological evolution.
Host: Gary Shiu
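
For context on the two ingredients the abstract names (textbook forms; the exact statements used in the talk may differ): Levin's coding theorem from Algorithmic Information Theory says that a universal prior m assigns an output x a probability governed by its prefix Kolmogorov complexity K(x), so simple outputs are exponentially favoured,

\[
  m(x) = 2^{-K(x) + O(1)},
\]

and a McAllester-style PAC-Bayes bound controls generalisation: with probability at least 1 - delta over an i.i.d. sample of size n, for every posterior Q over hypotheses and any prior P fixed before seeing the data,

\[
  \mathbb{E}_{h \sim Q}\!\left[\mathrm{err}(h)\right]
  \le
  \mathbb{E}_{h \sim Q}\!\left[\widehat{\mathrm{err}}(h)\right]
  + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\big(2\sqrt{n}/\delta\big)}{2n}}.
\]

A prior strongly biased toward simple functions keeps the KL term small for posteriors concentrated on simple, low-error functions, which is how such bounds can remain tight even in the overparameterised regime the abstract describes.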

Thursday, June 4th, 2020

Department Coffee Hour
Time: 3:00 pm - 4:00 pm
Place: Virtual
Abstract: Join us weekly for an informal virtual coffee hour! Catch up with others in the department, tell us how things are going, and impress everyone with your Zoom background skills.

Coffee hour is open to any and all faculty, staff, and students in the department.

Link:
Host: Department

Friday, June 5th, 2020

No events scheduled