Title:
Pruning Deep Neural Networks with Net-Trim: Deep Learning and Compressed Sensing Meet
dc.contributor.author | Aghasi, Alireza | |
dc.contributor.corporatename | Georgia Institute of Technology. Machine Learning | en_US |
dc.contributor.corporatename | Georgia State University. J. Mack Robinson College of Business | en_US |
dc.date.accessioned | 2018-03-26T18:42:58Z | |
dc.date.available | 2018-03-26T18:42:58Z | |
dc.date.issued | 2018-03-14 | |
dc.description | Presented on March 14, 2018 at 12:00 p.m. in the Marcus Nanotechnology Building, Room 1116. | en_US |
dc.description | Alireza Aghasi is currently an assistant professor in the Institute for Insight at the Robinson College of Business at Georgia State University. His research fundamentally focuses on optimization theory and statistics, with applications to various areas of data science, artificial intelligence, modern signal processing and physics-based inverse problems. | en_US |
dc.description | Runtime: 58:09 minutes | en_US |
dc.description.abstract | We introduce and analyze a new technique for model reduction in deep neural networks. Our algorithm prunes (sparsifies) a trained network layer-wise, removing connections at each layer by solving a convex problem. We present both parallel and cascade versions of the algorithm along with the mathematical analysis of the consistency between the initial network and the retrained model. We also discuss an ADMM implementation of Net-Trim, easily applicable to large scale problems. In terms of the sample complexity, we present a general result that holds for any layer within a network using rectified linear units as the activation. If a layer taking inputs of size N can be described using a maximum number of s non-zero weights per node, under some mild assumptions on the input covariance matrix, we show that these weights can be learned from O(s log(N/s)) samples. | en_US |
dc.format.extent | 58:09 minutes | |
dc.identifier.uri | http://hdl.handle.net/1853/59442 | |
dc.language.iso | en_US | en_US |
dc.relation.ispartofseries | Machine Learning @ Georgia Tech (ML@GT) Seminar Series | |
dc.subject | Compressed sensing | en_US |
dc.subject | Deep learning | en_US |
dc.subject | Pruning neural networks | en_US |
dc.title | Pruning Deep Neural Networks with Net-Trim: Deep Learning and Compressed Sensing Meet | en_US |
dc.type | Moving Image | |
dc.type.genre | Lecture | |
dspace.entity.type | Publication | |
local.contributor.corporatename | Machine Learning Center | |
local.contributor.corporatename | College of Computing | |
local.relation.ispartofseries | ML@GT Seminar Series | |
relation.isOrgUnitOfPublication | 46450b94-7ae8-4849-a910-5ae38611c691 | |
relation.isOrgUnitOfPublication | c8892b3c-8db6-4b7b-a33a-1b67f7db2021 | |
relation.isSeriesOfPublication | 9fb2e77c-08ff-46d7-b903-747cf7406244 |
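The abstract above describes pruning a trained network layer by layer through a convex program. As a rough illustration only (not the authors' exact Net-Trim formulation, which constrains the post-ReLU responses of each layer and is solved with ADMM), a single layer's weights can be sparsified by L1-regularized least-squares regression on the layer's pre-activations, solved here with plain ISTA. All names, sizes, and parameters below are illustrative:

```python
import numpy as np

def prune_layer_ista(X, Y, lam=0.05, n_iter=500):
    """Find a sparse W_hat so that W_hat @ X approximates the layer
    responses Y, by minimizing 0.5*||W X - Y||_F^2 + lam*||W||_1
    with ISTA (proximal gradient). Simplified stand-in for Net-Trim,
    which instead constrains the post-ReLU outputs and uses ADMM."""
    n_out, n_in = Y.shape[0], X.shape[0]
    # Step size 1/L, where L = sigma_max(X)^2 is the Lipschitz
    # constant of the least-squares gradient.
    lr = 1.0 / np.linalg.norm(X, 2) ** 2
    W = np.zeros((n_out, n_in))
    for _ in range(n_iter):
        grad = (W @ X - Y) @ X.T                  # gradient of the quadratic term
        Z = W - lr * grad                          # gradient step
        W = np.sign(Z) * np.maximum(np.abs(Z) - lr * lam, 0.0)  # soft threshold
    return W

# Toy demo: a layer whose true weights are sparse (~10% non-zero),
# matching the abstract's "s non-zero weights per node" setting.
rng = np.random.default_rng(0)
n_in, n_out, n_samples = 50, 10, 200
W_true = rng.normal(size=(n_out, n_in)) * (rng.random((n_out, n_in)) < 0.1)
X = rng.normal(size=(n_in, n_samples))
Y = W_true @ X                                     # noiseless pre-activations
W_hat = prune_layer_ista(X, Y)
sparsity = np.mean(W_hat == 0)
err = np.linalg.norm(W_hat @ X - Y) / np.linalg.norm(Y)
print(f"fraction of zero weights: {sparsity:.2f}, relative fit error: {err:.4f}")
```

With noiseless responses and a small L1 weight, the recovered matrix is sparse while the layer's input-output map is nearly preserved, which is the consistency property the talk analyzes between the initial network and the retrained model.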
Files

Original bundle (3 files)

- Name: aghasi.mp4
- Size: 467.08 MB
- Format: MP4 Video file
- Description: Download video

- Name: aghasi_videostream.html
- Size: 985 B
- Format: Hypertext Markup Language
- Description: Streaming video

- Name: transcription.txt
- Size: 41.93 KB
- Format: Plain Text
- Description: Transcription
License bundle (1 file)

- Name: license.txt
- Size: 3.13 KB
- Format: Plain Text
- Description: Item-specific license agreed upon to submission