BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Penn Engineering Events - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Penn Engineering Events
X-ORIGINAL-URL:https://seasevents.nmsdev7.com
X-WR-CALDESC:Events for Penn Engineering Events
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20200308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20201101T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20210314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20211107T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20210709T090000
DTEND;TZID=America/New_York:20210709T100000
DTSTAMP:20260406T185627Z
CREATED:20210706T160206Z
LAST-MODIFIED:20210706T160206Z
UID:5055-1625821200-1625824800@seasevents.nmsdev7.com
SUMMARY:ESE PhD Dissertation Defense: "Balancing Fit and Complexity in Learned Representations"
DESCRIPTION:Thesis Title: Balancing Fit and Complexity in Learned Representations \nAbstract: This dissertation is about learning representations of functions while restricting complexity. In machine learning\, maximizing the fit and minimizing the complexity are two conflicting objectives. Common approaches to this problem involve solving a regularized empirical risk minimization problem\, with a complexity-measure regularizer and a regularizing parameter that controls the trade-off between the two objectives. The regularizing parameter has to be tuned by repeatedly solving the problem and does not have a straightforward interpretation. This work instead formulates the problem as a minimization of the complexity measure subject to fit constraints. \nThe issue of complexity is tackled in reproducing kernel Hilbert spaces (RKHSs) by introducing a novel integral representation of a family of RKHSs that allows arbitrarily placed kernels of different widths. The functional estimation problem is then written as a sparse functional problem\, which\, despite being non-convex and infinite-dimensional\, can be solved in the dual domain. This approach achieves representations of lower complexity than traditional methods because it searches over a family of RKHSs rather than a subspace of a single RKHS. \nThe integral representation is used in a federated classification setting\, in which a global model is trained from a federation of agents. This is possible because the dual optimal variables identify the samples that are fundamental to the classification. Each agent therefore learns a local model and sends only the fundamental samples over the network\, yielding a federated learning method that requires only one round of network communication. Its solution is proven to converge asymptotically to that of traditional classification. \nNext\, a theory for constraint specification is established.
  An optimization problem with a constraint for each sample point can easily become infeasible if the constraints are too tight. Conversely\, relaxing all constraints can cause the solution to fit the data poorly. The constraint specification method relaxes the constraints until the marginal cost of changing a constraint equals the marginal complexity measure. This problem is proven to be feasible and solvable\, and is shown empirically to be resilient to outliers and corrupted training data. \nFor the Zoom link\, please email Elizabeth Kopeczky at: kopeczky@seas.upenn.edu.
URL:https://seasevents.nmsdev7.com/event/phd-dissertation-defense-maria-peifer/
CATEGORIES:Dissertation or Thesis Defense
END:VEVENT
END:VCALENDAR