
Events at Physics


Events During the Week of May 31st through June 7th, 2020

Monday, June 1st, 2020

Thesis Defense
Stability of Diffuse Optical Tomography in the Bayesian Framework
Time: 1:00 pm
Place: Virtual
Speaker: Kit Newton, Physics PhD Graduate Student
Abstract: Join Zoom Meeting: https://us02web.zoom.us/j/82795993976?pwd=U3BFMjFyQU5kRng4UlVSRytldU5hQT09
Meeting ID: 827 9599 3976
Password: thesis
One tap mobile: +13126266799,,82795993976#,,1#,240150# US (Chicago); +16465588656,,82795993976#,,1#,240150# US (New York)
Host: Lisa Everett and Qin Li, Faculty Co-Advisors

Tuesday, June 2nd, 2020

Network in Neutrinos, Nuclear Astrophysics, and Symmetries (N3AS) Seminar
Exposing the astrophysical conditions of r-process events through observable signatures of lanthanide and actinide production
Time: 3:30 pm
Place: https://berkeley.zoom.us/j/91922781599
Speaker: Nicole Vassh, Notre Dame
Host: Baha Balantekin

Wednesday, June 3rd, 2020

Physics meets ML (virtual seminar series)
Why do neural networks generalise in the overparameterised regime?
Time: 11:00 am
Place: Please register for this online event: http://physicsmeetsml.org
Speaker: Ard Louis, University of Oxford
Abstract: One of the most surprising properties of deep neural networks (DNNs) is that they typically perform best in the overparameterised regime. Physicists are taught from a young age that having more parameters than datapoints is a terrible idea. This intuition can be formalised in standard learning theory approaches, based for example on model capacity, which also predict that DNNs should heavily over-fit in this regime, and therefore not generalise at all. So why do DNNs work so well? We use a version of the coding theorem from Algorithmic Information Theory to argue that DNNs are generically biased towards simple solutions. Such an inbuilt Occam’s razor means that they are biased towards solutions that typically generalise well. We further explore the interplay between this simplicity bias and the error spectrum on a dataset to develop a detailed Bayesian theory of training and generalisation that explains why and when SGD trained DNNs generalise, and when they should not. This picture also allows us to derive tight PAC-Bayes bounds that closely track DNN learning curves and can be used to rationalise differences in performance across architectures. Finally, we will discuss some deep analogies between the way DNNs explore function space, and biases in the arrival of variation that explain certain trends observed in biological evolution.
Host: Gary Shiu

Thursday, June 4th, 2020

Department Coffee Hour
Time: 3:00 pm
Place: Virtual
Abstract: Join us weekly for an informal virtual coffee hour! Catch up with others in the department, tell us how things are going, and impress everyone with your Zoom background skills.

Coffee hour is open to any and all faculty, staff, and students in the department.

Link: https://cern.zoom.us/j/98625439661?pwd=NmNMeit4aGNLOSs3eHl0RHk0ZHRKUT09
Host: Department

Friday, June 5th, 2020

No events scheduled

©2013 Board of Regents of the University of Wisconsin System