Person:
Vempala, Santosh S.


Publication Search Results

  • Item
    Professor Debate on the Topic - Do We Live In a Simulation?
    (Georgia Institute of Technology, 2019-11-12) Cvitanović, Predrag; Holder, Mary; Klein, Hans; Rocklin, D. Zeb; Turk, Gregory; Vempala, Santosh S.
    Do we live in a simulation? The School of Physics and the Society of Physics Students will host a public debate between faculty from the College of Science and the College of Computing to answer this question. This event is free and open to all. There will be time at the conclusion of the debate for audience members to direct questions to the faculty panel.
  • Item
    A Computer Science View of the Brain
    (Georgia Institute of Technology, 2017-03-15) Vempala, Santosh S.
    Computational perspectives on scientific phenomena have often proven to be remarkably insightful. Rapid advances in computational neuroscience, and the resulting plethora of data and models, highlight the lack of an overarching theory for how the brain accomplishes perception and cognition (the mind). Taking the view that the answer must surely have a computational component, we present a few approachable questions for computer scientists, along with some recent work (with Christos Papadimitriou, Samantha Petti and Wolfgang Maass) on mechanisms for the formation of memories, the creation of associations between memories, and the benefits of such associations.
  • Item
    The Joy of PCA
    (Georgia Institute of Technology, 2010-09-17) Vempala, Santosh S.
    Principal Component Analysis is the most widely used technique for high-dimensional or large data. For typical applications (nearest neighbor, clustering, learning), it is not hard to build examples on which PCA "fails." Yet, it is popular and successful across a variety of data-rich areas. In this talk, we focus on two algorithmic problems where the performance of PCA is provably near-optimal, and no other method is known to have similar guarantees. The problems we consider are (a) the classical statistical problem of unraveling a sample from a mixture of k unknown Gaussians and (b) the classic learning theory problem of learning an intersection of k halfspaces. During the talk, we will encounter recent extensions of PCA that are noise-resistant, affine-invariant and nonviolent.
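    The abstract above does not include code; as a minimal illustration of the technique it discusses, the following sketch computes PCA via the singular value decomposition of the centered data matrix, projecting onto the top-k principal directions. The function name `pca` and the toy data are assumptions for illustration, not part of the talk.

    ```python
    import numpy as np

    def pca(X, k):
        """Project an n x d data matrix X onto its top-k principal components."""
        Xc = X - X.mean(axis=0)  # center each feature
        # SVD of the centered data: rows of Vt are the principal directions
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T  # n x k projection

    # Toy data: points near a line in 3-D, plus small isotropic noise,
    # so a single principal component captures almost all of the variance.
    rng = np.random.default_rng(0)
    t = rng.normal(size=(200, 1))
    X = t @ np.array([[3.0, 2.0, 1.0]]) + 0.05 * rng.normal(size=(200, 3))
    Y = pca(X, 1)
    ```

    On data like this, the one-dimensional projection `Y` retains nearly all of the total variance of `X`, which is the regime in which PCA-based algorithms for the mixture and halfspace problems above are analyzed.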