BEGIN:VCALENDAR
VERSION:2.0
CALSCALE:GREGORIAN
PRODID:UW-Madison-Physics-Events
BEGIN:VEVENT
SEQUENCE:0
UID:UW-Physics-Event-4897
DTSTART:20181113T180500Z
DTEND:20181113T190000Z
DTSTAMP:20260409T212202Z
LAST-MODIFIED:20180911T192822Z
LOCATION:4274 Chamberlin (Refreshments will be served)
SUMMARY:Artificial general intelligence\, Chaos & Complex Systems Semi
 nar\, Bill Hibbard\, UW Space Science and Engineering Center
DESCRIPTION:In the early 1960s Ray Solomonoff combined Turing's theory
  of computation with Shannon's information theory to create algorithmi
 c information theory. The Kolmogorov complexity of a binary string is 
 defined as the length of the shortest program that computes the string
 . Solomonoff used a related measure as the basis for an (uncomputable 
 but approximable) universal induction algorithm for predicting arbitra
 ry binary strings. In the early 2000s Marcus Hutter combined this ind
 uction algorithm with sequential decision theory to define his univers
 al AI that maximizes expected rewards from arbitrary environments\, an
 d to define a formal measure of intelligence. This work led to confere
 nces and journals dedicated to the mathematical study of properties of
  artificial general intelligence (AGI) systems\, including ways that t
 hey may fail to conform to the intentions of their designers and ways 
 to design systems that do conform to those intentions. These pr
 oblems remain unresolved and research is very active. While some mainst
 ream AI developers criticize AGI theory\, the creators of some of the 
 most successful AI systems (e.g.\, Google DeepMind) are also deeply in
 volved in this AGI research. Practical versions of Hutter's universal 
 AI are called Bayesian program learning and in some ways they outperfo
 rm the deep learning algorithms that are revolutionizing AI.\n
URL:https://www.physics.wisc.edu/events/?id=4897
END:VEVENT
END:VCALENDAR
