BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Penn Engineering Events - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Penn Engineering Events
X-ORIGINAL-URL:https://seasevents.nmsdev7.com
X-WR-CALDESC:Events for Penn Engineering Events
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20210314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20211107T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20230312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20231105T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20220120T100000
DTEND;TZID=America/New_York:20220120T120000
DTSTAMP:20260406T070807Z
CREATED:20220114T165505Z
LAST-MODIFIED:20220114T165505Z
UID:6028-1642672800-1642680000@seasevents.nmsdev7.com
SUMMARY:ESE Ph.D. Thesis Defense: "Statistical Learning for System Identification\, Prediction\, and Control"
DESCRIPTION:Despite the recent widespread success of machine learning\, we still do not fully understand its fundamental limitations. Going forward\, it is crucial to better understand learning complexity\, especially in critical decision-making applications\, where a wrong decision can lead to catastrophic consequences. In this thesis\, we focus on the statistical complexity of learning unknown linear dynamical systems\, with a focus on the tasks of system identification\, prediction\, and control. We are interested in sample complexity\, i.e.\, the minimum number of samples required to achieve satisfactory learning performance. Our goal is to provide finite-sample learning guarantees\, explicitly highlighting how the learning objective depends on the number of samples. A fundamental question we are trying to answer is how system-theoretic properties of the underlying process can affect sample complexity. \nUsing recent advances in statistical learning\, high-dimensional statistics\, and minimax theory\, we provide finite-sample guarantees in the following settings. i) System Identification. We provide the first finite-sample guarantees for identifying a stochastic partially-observed system; this problem is also known as the stochastic system identification problem. ii) Prediction. We provide the first end-to-end guarantees for learning the Kalman Filter\, i.e.\, for learning to predict\, in an offline learning architecture. We also provide the first logarithmic regret guarantees for the problem of learning the Kalman Filter in an online learning architecture\, where the data are revealed sequentially. iii) Difficulty of System Identification and Control. Focusing on fully-observed systems\, we investigate when learning linear systems is statistically easy or hard\, in the finite-sample regime. Statistically easy-to-learn linear system classes have sample complexity that is polynomial in the system dimension. Statistically hard-to-learn linear system classes have worst-case sample complexity that is at least exponential in the system dimension. We show that there actually exist classes of linear systems which are hard to learn. Such classes include indirectly excited systems with a large degree of indirect excitation. Similar conclusions hold for both the problem of system identification and the problem of learning to control.
URL:https://seasevents.nmsdev7.com/event/ese-ph-d-thesis-defense-statistical-learning-for-system-identification-prediction-and-control/
LOCATION:Zoom – Meeting ID 949 5950 4530
CATEGORIES:Seminar,Dissertation or Thesis Defense
ORGANIZER;CN="Electrical and Systems Engineering":MAILTO:eseevents@seas.upenn.edu
END:VEVENT
END:VCALENDAR