Series
IDEaS Seminar Series

Series Type
Event Series
Publication Search Results

  • Item
    Logical Neural Networks: Towards Unifying Statistical and Symbolic AI
(2021-01-15) Gray, Alexander
Recently there has been renewed interest in the long-standing goal of somehow unifying the capabilities of both statistical AI (learning and prediction) and symbolic AI (knowledge representation and reasoning). We introduce Logical Neural Networks, a new neuro-symbolic framework which identifies and leverages a 1-to-1 correspondence between an artificial neuron and a logic gate in a weighted form of real-valued logic. With a few key modifications of the standard modern neural network, we construct a model which performs the equivalent of logical inference rules such as modus ponens within the message-passing paradigm of neural networks, and utilizes a new form of loss, contradiction loss, which maximizes logical consistency in the face of imperfect and inconsistent knowledge. The result differs significantly from other neuro-symbolic ideas in that 1) the model is fully disentangled and understandable since every neuron has a meaning, 2) the model can perform both classical logical deduction and its real-valued generalization (which allows for the representation and propagation of uncertainty) exactly, as special cases, as opposed to approximately as in nearly all other approaches, and 3) the model is compositional and modular, allowing for fully reusable knowledge across tasks. The framework has already enabled state-of-the-art results in several problems, including question answering.
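The neuron-as-logic-gate correspondence described in the abstract can be illustrated with a weighted real-valued AND gate. This is a minimal sketch assuming Lukasiewicz-style semantics; the class name, weights, and exact activation are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class WeightedAndGate:
    """A neuron acting as a weighted real-valued AND gate (a sketch of the
    Lukasiewicz-style semantics assumed here, not the LNN implementation).

    Truth values live in [0, 1]; with all weights and bias equal to 1, the
    gate reduces to classical conjunction on {0, 1} inputs.
    """
    weights: list
    bias: float = 1.0

    def forward(self, inputs):
        # Penalize each input's degree of falsity (1 - x), weighted, then
        # clamp the result back into the truth-value interval [0, 1].
        total = self.bias - sum(w * (1.0 - x) for w, x in zip(self.weights, inputs))
        return max(0.0, min(1.0, total))

gate = WeightedAndGate(weights=[1.0, 1.0])
print(gate.forward([1.0, 1.0]))            # classical True AND True -> 1.0
print(gate.forward([1.0, 0.0]))            # classical True AND False -> 0.0
print(round(gate.forward([0.9, 0.8]), 3))  # uncertain inputs -> 0.7
```

Because every gate has a fixed logical meaning, a network built from such units stays interpretable in the way the abstract describes: each neuron's activation is a truth value, not an opaque feature.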
  • Item
    Challenges and Opportunities at the Nexus of Synthetic Biology, Machine Learning, and Automation
(2020-11-13) Zhao, Huimin
Inspired by the exponential growth of the microelectronics industry, my lab has been attempting to build a biofoundry that integrates biology, automation, and artificial intelligence (AI)/machine learning for rapid prototyping and manufacturing of biological systems for synthesis of bioproducts ranging from chemicals to materials to therapeutic agents. In this talk, I will discuss the challenges and opportunities at the nexus of synthetic biology, machine learning, and automation and highlight a few of our accomplishments and the recently launched NSF AI research institute for molecular synthesis. Specifically, I will introduce three interconnected stories: (1) development of the Illinois Biological Foundry for Advanced Biomanufacturing (iBioFAB) for next-generation synthetic biology applications; (2) development of genome-scale engineering tools for rapid metabolic engineering applications; and (3) integration of biocatalysis and chemical catalysis for synthesis of value-added chemicals, which necessitates the development of AI-enabled synthesis planning and catalyst design tools.
  • Item
    Collision Course: Artificial Intelligence meets Fundamental Interactions
(2020-10-30) Thaler, Jesse
    Modern machine learning has had an outsized impact on many scientific fields, and fundamental physics is no exception. What is special about fundamental physics, though, is the vast amount of theoretical, experimental, and observational knowledge that we already have about many problems in the field. Is it possible to teach a machine to “think like a physicist” and thereby advance physics knowledge from the smallest building blocks of nature to the largest structures in the universe? In this talk, I argue that the answer is “yes”, using the example of particle physics at the Large Hadron Collider to highlight the fascinating synergy between theoretical principles and machine learning architectures. I also argue that by fusing the “deep learning” revolution with the time-tested strategies of “deep thinking” in physics, we can galvanize research innovation in artificial intelligence more broadly.
  • Item
Curating a COVID-19 data repository and forecasting county-level death counts in the United States
(2020-10-23) Yu, Bin
As the COVID-19 outbreak evolves, accurate forecasting continues to play an extremely important role in informing policy decisions. In this paper, we present our continuous curation of a large data repository containing COVID-19 information from a range of sources. We use this data to develop predictions and corresponding prediction intervals for the short-term trajectory of COVID-19 cumulative death counts at the county level in the United States up to two weeks ahead. Using data from January 22 to June 20, 2020, we develop and combine multiple forecasts using ensembling techniques, resulting in an ensemble we refer to as Combined Linear and Exponential Predictors (CLEP). Our individual predictors include county-specific exponential and linear predictors, a shared exponential predictor that pools data together across counties, an expanded shared exponential predictor that uses data from neighboring counties, and a demographics-based shared exponential predictor. We use prediction errors from the past five days to assess the uncertainty of our death predictions, resulting in generally applicable prediction intervals, Maximum (absolute) Error Prediction Intervals (MEPI). MEPI achieves a coverage rate of more than 94% when averaged across counties for predicting cumulative recorded death counts two weeks in the future. Our forecasts are currently being used by the non-profit organization Response4Life to determine the medical supply needs of individual hospitals and have directly contributed to the distribution of medical supplies across the country. We hope that our forecasts and data repository can help guide necessary county-specific decision-making and help counties prepare for their continued fight against COVID-19.
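The MEPI construction described above can be sketched as follows. This simplified version assumes symmetric additive intervals built from raw absolute errors over the recent window; the published method may normalize or combine errors differently, and the numbers below are hypothetical.

```python
def mepi_interval(point_prediction, recent_predictions, recent_actuals):
    """Maximum (absolute) Error Prediction Interval: widen the point
    prediction by the largest absolute error the predictor made over a
    recent window (the abstract uses the past five days)."""
    max_err = max(abs(p - a) for p, a in zip(recent_predictions, recent_actuals))
    return (point_prediction - max_err, point_prediction + max_err)

# Hypothetical 5-day window of cumulative-death predictions vs. recorded counts.
preds = [100, 105, 110, 118, 121]
actuals = [98, 107, 112, 115, 125]
print(mepi_interval(130, preds, actuals))  # -> (126, 134)
```

The appeal of this construction is that it is model-agnostic: any point predictor, including each member of the CLEP ensemble, can be wrapped with an interval calibrated to its own recent track record.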
  • Item
    Understanding Human Functioning & Enhancing Human Potential through Computational Methods
(2020-10-08) D'Mello, Sidney K.
It is generally accepted that computational methods can complement traditional approaches to understanding human functioning, including thoughts, feelings, behaviors, and social interactions. I suggest that their utility extends beyond a mere complementary role. They serve a necessary role when data is too large for manual analysis, an opportunistic role by addressing questions that are beyond the purview of traditional methods, and a promissory role in facilitating change when fully-automated computational models are embedded in closed-loop intelligent systems. Multimodal computational approaches provide further benefits by affording analysis of disparate constructs emerging across multiple types of interactions in diverse contexts. To illustrate, I will discuss a research program that uses linguistic, paralinguistic, behavioral, and physiological signals for the analysis of individual, small group, multi-party, and human-computer interactions in the lab and in the wild with the goals of understanding cognitive, noncognitive, and socio-affective-cognitive processes while improving human efficiency, engagement, and effectiveness. I will also discuss how these ideas align with our new NSF National AI Institute on Student-AI Teaming and how you can get involved in the research.
  • Item
    Building Trustworthy AI for Environmental Science
(2020-09-25) McGovern, Amy
As climate change affects weather patterns and sea levels rise, the world’s need for accurate, usable predictions of the weather, the oceans, and their impacts has never been greater. At the same time, the quantity and quality of Earth observation and modeling systems are increasing dramatically, offering a deluge of data so rich that only automated intelligent systems can fully exploit it. In this talk, I will introduce our new NSF AI Institute for Research on Trustworthy AI in Weather, Climate, and Coastal Oceanography and discuss our approach to developing trustworthy AI methods for environmental science. I will also present preliminary results for creating trustworthy AI for two high-impact weather prediction tasks: hail and tornado prediction.
  • Item
    The Science of Stories: Measuring and Exploring the Ecology of Human Stories with Lexical Instruments
(2019-11-06) Dodds, Peter S.
    I will survey our efforts at the Computational Story Lab to measure and study a wide array of social and cultural phenomena using “lexical meters” — online, interactive instruments that use social media and other texts to quantify population dynamics of human behavior. These include happiness, public health, obesity rates, and depression. I will explain how lexical meters work and how we have used them to uncover natural language encodings of positivity biases across cultures, universal emotional arcs of stories, links between social media posts and health, measures of fame and ultra-fame, and time compression for news. I will offer some thoughts on how fully developing a post-disciplinary, collaborative science of human stories is vital in our efforts to understand the evolution, stability, and fracturing of social systems.
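A lexical meter of the kind described above can be sketched as a frequency-weighted average of per-word scores. The tiny `HAPPINESS` lexicon and its scores below are hypothetical stand-ins for a full instrument such as the labMT word list, which scores roughly ten thousand words on a 1-9 scale.

```python
from collections import Counter

# Hypothetical per-word happiness scores on a 1-9 scale; a real instrument
# such as the labMT lexicon covers roughly 10,000 frequently used words.
HAPPINESS = {"love": 8.4, "happy": 8.3, "the": 4.98, "pandemic": 2.1, "sad": 2.4}

def text_happiness(text):
    """Frequency-weighted average happiness of the scored words in a text.

    Words missing from the lexicon are ignored; returns None if no word
    in the text is scored at all.
    """
    counts = Counter(w for w in text.lower().split() if w in HAPPINESS)
    total = sum(counts.values())
    if total == 0:
        return None
    return sum(HAPPINESS[w] * n for w, n in counts.items()) / total

print(round(text_happiness("happy happy love"), 2))  # -> 8.33
```

Applied to a rolling window of social media posts, a score like this yields the population-level time series the talk describes, with shifts attributable to the specific words whose frequencies changed.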
  • Item
    Second Order Machine Learning
    (Georgia Institute of Technology, 2017-09-22) Mahoney, Michael
A major challenge for large-scale machine learning, and one that will only grow in importance as models become more and more domain-informed, is moving beyond high-variance first-order optimization methods to more robust second-order methods. Here, we consider the problem of minimizing the sum of a large number of functions over a convex constraint set, a problem that arises in many data analysis, machine learning, and more traditional scientific computing applications; we also consider non-convex variants of this basic problem. While this is of interest in many situations, it has received renewed attention recently due to the challenges associated with training so-called deep neural networks. We establish improved bounds for algorithms that incorporate sub-sampling as a way to improve computational efficiency, while maintaining the original convergence properties of these algorithms. These methods exploit recent results from Randomized Linear Algebra on approximate matrix multiplication. Within the context of second-order optimization methods, they provide quantitative convergence results for variants of Newton's method, where the Hessian and/or the gradient is uniformly or non-uniformly sub-sampled, under much weaker assumptions than prior work.
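The sub-sampled Newton idea can be sketched for L2-regularized logistic regression, a sum of per-example losses as in the abstract. The function name, the regularization constant, and the choice to keep the gradient exact while uniformly sub-sampling only the Hessian are illustrative assumptions for this sketch, not the algorithms or bounds from the talk.

```python
import numpy as np

def subsampled_newton(X, y, iters=20, sample_frac=0.1, lam=1e-2, seed=0):
    """L2-regularized logistic regression via Newton's method, with the
    Hessian estimated from a uniform sub-sample of the rows.

    The gradient is kept exact here; the talk also covers sub-sampled
    gradients and non-uniform sampling schemes.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    m = max(d + 1, int(sample_frac * n))          # Hessian sub-sample size
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))          # sigmoid predictions
        grad = X.T @ (p - y) / n + lam * w        # exact regularized gradient
        idx = rng.choice(n, size=m, replace=False)
        Xs, ps = X[idx], p[idx]
        D = ps * (1.0 - ps)                       # per-sample curvature
        H = (Xs * D[:, None]).T @ Xs / m + lam * np.eye(d)
        w -= np.linalg.solve(H, grad)             # Newton step with sampled H
    return w

# Synthetic data: 500 noisy labels from a 3-feature logistic model.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
y = (1.0 / (1.0 + np.exp(-X @ np.array([2.0, -1.0, 0.5]))) > rng.random(500)).astype(float)
w = subsampled_newton(X, y)
```

The point of the sub-sampling is that forming the full Hessian costs O(n d^2) per iteration, while the sampled estimate costs only O(m d^2), which is the efficiency-versus-convergence trade-off the abstract's bounds quantify.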
  • Item
    Assembly of Big Genomic Data
    (Georgia Institute of Technology, 2017-09-15) Medvedev, Paul
    As genome sequencing technologies continue to facilitate the generation of large datasets, developing scalable algorithms has come to the forefront as a crucial step in analyzing these datasets. In this talk, I will discuss several recent advances, with a focus on the problem of reconstructing a genome from a set of reads (genome assembly). I will describe low-memory and scalable algorithms for automatic parameter selection and de Bruijn graph compaction, recently implemented in two tools: KmerGenie and bcalm. I will also present recent advances in the theoretical foundations of genome assemblers.
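The de Bruijn graph compaction step mentioned above can be sketched as merging non-branching paths of k-mers into single "unitig" strings. This toy version holds the whole k-mer set in memory and ignores isolated cycles, unlike bcalm's low-memory algorithm; it illustrates only the core graph operation.

```python
from collections import defaultdict

def compact_debruijn(kmers):
    """Build a node-centric de Bruijn graph from a k-mer set and merge each
    non-branching path into one unitig string (the core idea behind
    compaction; real tools like bcalm do this with far less memory).

    Caveat: paths forming an isolated cycle have no start node and are
    skipped by this sketch.
    """
    kmers = set(kmers)
    k = len(next(iter(kmers)))
    succ, pred = defaultdict(list), defaultdict(list)
    for km in kmers:
        for c in "ACGT":                # an edge exists when (k-1) bases overlap
            nxt = km[1:] + c
            if nxt in kmers:
                succ[km].append(nxt)
                pred[nxt].append(km)
    unitigs, visited = [], set()
    for km in kmers:
        # A unitig starts wherever the path cannot be extended backward:
        # in-degree != 1, or the unique predecessor branches forward.
        if len(pred[km]) == 1 and len(succ[pred[km][0]]) == 1:
            continue
        if km in visited:
            continue
        path, cur = [km], km
        visited.add(km)
        while len(succ[cur]) == 1 and len(pred[succ[cur][0]]) == 1:
            cur = succ[cur][0]
            if cur in visited:
                break
            visited.add(cur)
            path.append(cur)
        unitigs.append(path[0] + "".join(p[-1] for p in path[1:]))
    return unitigs

print(compact_debruijn(["AAG", "AGA", "GAT", "ATT"]))  # -> ['AAGATT']
```

Compaction matters for scale: a mammalian read set yields billions of k-mer nodes, but after merging non-branching paths the graph shrinks by orders of magnitude, which is what makes downstream assembly steps tractable.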