Title:
Interactive Scalable Discovery Of Concepts, Evolutions, And Vulnerabilities In Deep Learning

dc.contributor.advisor Chau, Duen Horng
dc.contributor.author Park, Haekyu
dc.contributor.committeeMember Hoffman, Judy
dc.contributor.committeeMember Hao, Cong (Callie)
dc.contributor.committeeMember Yang, Diyi
dc.contributor.committeeMember Zhang, Chao
dc.contributor.department Computational Science and Engineering
dc.date.accessioned 2024-01-10T18:48:21Z
dc.date.available 2024-01-10T18:48:21Z
dc.date.created 2023-12
dc.date.issued 2023-12-05
dc.date.submitted December 2023
dc.date.updated 2024-01-10T18:48:21Z
dc.description.abstract Deep Neural Networks (DNNs) are increasingly prevalent, yet deciphering how they operate remains challenging. This lack of clarity undermines trust and hampers problem-solving during deployment, highlighting the urgent need for interpretability. How can we efficiently summarize the concepts a model learns? How do these concepts evolve during training? When models are at risk from potential threats, how do we explain their vulnerabilities? We address these questions with a human-centered approach, developing novel systems to interpret learned concepts, their evolution, and potential vulnerabilities within deep learning. This thesis focuses on three key thrusts:
(1) Scalable Automatic Visual Summarization of Concepts. We develop NeuroCartography, an interactive system that scalably summarizes and visualizes the concepts learned by a large-scale DNN, such as InceptionV1 trained on 1.2M images. A large-scale human evaluation with 244 participants shows that NeuroCartography discovers coherent, human-meaningful concepts.
(2) Insights to Reveal Model Vulnerabilities. We develop scalable interpretation techniques to visualize and identify the internal elements of DNNs that are susceptible to potential harms, aiming to understand how these defects lead to incorrect predictions. We develop first-of-their-kind interactive systems such as Bluff, which visually compares the activation pathways of benign and attacked images in DNNs, and SkeletonVis, which explains how attacks manipulate human joint detection in human action recognition models.
(3) Scalable Discovery of Concept Evolution During Training. Our first-of-its-kind ConceptEvo unified interpretation framework holistically reveals the inception and evolution of learned concepts and their relationships during training. ConceptEvo enables powerful new ways to monitor model training and discover training issues, addressing critical limitations of existing post-training interpretation research. A large-scale human evaluation with 260 participants demonstrates that ConceptEvo identifies concept evolutions that are both meaningful to humans and important for class predictions.
This thesis contributes to information visualization, deep learning, and, crucially, their intersection. We have developed open-source interactive interfaces, scalable algorithms, and a unified framework for interpreting DNNs across different models. Our work impacts academia, industry, and government; for example, it has contributed to the DARPA GARD program (Guaranteeing AI Robustness against Deception). It has also been recognized with a J.P. Morgan AI PhD Fellowship and selection as a 2022 Rising Star in IEEE EECS, and NeuroCartography was highlighted as a top visualization publication (top 1%) invited to SIGGRAPH.
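The second thrust describes comparing activation pathways of benign versus attacked inputs. The following is a minimal illustrative sketch of that comparison idea, not the dissertation's actual Bluff implementation: it assumes a torchvision GoogLeNet (InceptionV1), a hypothetical choice of the inception4d layer, and placeholder inputs in place of a real adversarial image.

```python
# Illustrative sketch only (assumptions noted above): compare which channels of
# one intermediate layer respond most strongly to a benign image versus a
# perturbed one -- the kind of per-neuron comparison Bluff visualizes at scale.
import torch
import torchvision.models as models

model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1).eval()

activations = {}
def hook(_module, _inp, out):
    # Cache the layer's output so we can inspect it after the forward pass.
    activations["mixed4d"] = out.detach()

# Hypothetical layer choice; any intermediate layer works for this sketch.
model.inception4d.register_forward_hook(hook)

def top_channels(image, k=5):
    """Return the k channels with the highest mean activation for `image`."""
    with torch.no_grad():
        model(image.unsqueeze(0))                    # expects (3, 224, 224) input
    per_channel = activations["mixed4d"].mean(dim=(0, 2, 3))
    return torch.topk(per_channel, k).indices.tolist()

benign = torch.rand(3, 224, 224)                     # placeholder; real use would
attacked = benign + 0.05 * torch.randn(3, 224, 224)  # apply an attack such as PGD

print("benign  :", top_channels(benign))
print("attacked:", top_channels(attacked))
```

Channels that appear only in the attacked list hint at neurons the perturbation exploits; systems like Bluff extend this idea to whole activation pathways across layers.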
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri https://hdl.handle.net/1853/73145
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject Machine Learning Interpretability, Visualization
dc.title Interactive Scalable Discovery Of Concepts, Evolutions, And Vulnerabilities In Deep Learning
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.advisor Chau, Duen Horng
local.contributor.corporatename College of Computing
local.contributor.corporatename School of Computational Science and Engineering
relation.isAdvisorOfPublication fb5e00ae-9fb7-475d-8eac-50c48a46ea23
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
relation.isOrgUnitOfPublication 01ab2ef1-c6da-49c9-be98-fbd1d840d2b1
thesis.degree.level Doctoral
Files
Original bundle (showing 1 of 1)
Name: PARK-DISSERTATION-2023.pdf
Size: 13.04 MB
Format: Adobe Portable Document Format

License bundle (showing 1 of 1)
Name: LICENSE.txt
Size: 3.86 KB
Format: Plain Text