BEGIN:VCALENDAR
VERSION:2.0
CALSCALE:GREGORIAN
PRODID:UW-Madison-Physics-Events
BEGIN:VEVENT
SEQUENCE:2
UID:UW-Physics-Event-6385
DTSTART:20210324T160000Z
DTEND:20210324T171500Z
DTSTAMP:20240414T072818Z
LAST-MODIFIED:20210322T042031Z
LOCATION:Online Seminar: Please sign up for our mailing list at www.ph
ysicsmeetsml.org for the Zoom link
SUMMARY:Algebraic Neural Networks\, Physics ∩ ML Seminar\, Alejandro
Ribeiro\, University of Pennsylvania
DESCRIPTION:We study algebraic neural networks (AlgNNs) with commutati
ve algebras\, which unify diverse architectures such as Euclidean conv
olutional neural networks\, graph neural networks\, and group neural n
etworks under the umbrella of algebraic signal processing. An AlgNN i
s a stacked layered structure where each layer is composed of an alge
bra\, a vector space\, and a homomorphism between the algebra and the sp
ace of endomorphisms of the vector space. Signals are modeled as elem
ents of the vector space and are processed by convolutional filters d
efined as the images of the elements of the algebra under the actio
n of the homomorphism.
\n
\nWe analyze the stability of algebraic filters and AlgNNs to deforma
tions of the homomorphism and derive conditions on filters that lead t
o Lipschitz-stable operators. We conclude that stable algebraic filte
rs have frequency responses – defined as eigenvalue-domain represent
ations – whose derivative is inversely proportional to the frequenc
y – defined as the eigenvalue magnitude. It follows that for a given le
vel of discriminability\, AlgNNs are more stable than algebraic filte
rs\, thereby explaining their better empirical performance. The same ph
enomenon has been proven for Euclidean convolutional neural network
s and graph neural networks. Our analysis shows that this is a deep al
gebraic property shared by a number of architectures.
\n
\nFurther details in arxiv.org/abs/2009.01433 and gnn.seas.upenn.
edu/lecture-12.
URL:https://www.physics.wisc.edu/events/?id=6385
END:VEVENT
END:VCALENDAR