CS Events
Computer Science Department Colloquium
Scalable Learning and Reasoning for Large Knowledge Graphs
Tuesday, March 29, 2016, 10:30am
Learning to reason about and understand the world's knowledge is a fundamental problem in Artificial Intelligence (AI). While it has long been hypothesized that both symbolic and statistical approaches are necessary to tackle complex problems in AI, in practice, bridging the two in a combined framework often brings intractability—most probabilistic first-order logics are simply not efficient enough for real-world-sized tasks.
With the vast amount of relational data available in digital form, now is a good opportunity to close the gap between these two paradigms. The core research question that I will address in this talk is the following: how can we design scalable statistical learning and inference methods that operate over rich knowledge representations? I will describe some examples of my work advancing the state of the art in the theory and practice of statistical relational learning, including: 1) ProPPR, a scalable learning and reasoning framework whose inference time does not depend on the size of the knowledge graph; 2) an efficient structural-gradient-based meta-reasoning approach that learns formulas from relational data; and 3) an application of joint information extraction and relational reasoning in NLP. I will conclude by describing my other research interests and my future research plans in the interdisciplinary field of data science.
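The claim that inference time can be independent of knowledge-graph size is typically achieved by local, personalized-PageRank-style proof search, which touches only a neighborhood around the query rather than the whole graph. The sketch below is illustrative only (not the speaker's actual ProPPR code): a minimal "push"-style approximate personalized PageRank over a toy adjacency-list graph, where `alpha` and `eps` are assumed illustrative parameters.

```python
from collections import defaultdict, deque

def approx_ppr(graph, seed, alpha=0.15, eps=1e-4):
    """Approximate personalized PageRank from `seed`.

    Only nodes whose residual mass exceeds eps * degree are processed,
    so the work is local to the seed's neighborhood and does not grow
    with the total size of the graph.
    """
    p = defaultdict(float)   # PageRank estimate per node
    r = defaultdict(float)   # residual (unpropagated) mass per node
    r[seed] = 1.0
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        deg = len(graph.get(u, [])) or 1
        if r[u] < eps * deg:
            continue
        # Push step: keep an alpha fraction, spread the rest to neighbors.
        p[u] += alpha * r[u]
        share = (1 - alpha) * r[u] / deg
        r[u] = 0.0
        for v in graph.get(u, []):
            r[v] += share
            if r[v] >= eps * (len(graph.get(v, [])) or 1):
                queue.append(v)
    return dict(p)

# Toy graph: scores concentrate near the seed node "a".
toy = {"a": ["b", "c"], "b": ["a"], "c": ["a"]}
scores = approx_ppr(toy, "a")
```

Because the push loop stops once every residual falls below the degree-scaled threshold, runtime depends on `alpha`, `eps`, and the seed's local connectivity rather than the number of nodes—this locality is the property the abstract alludes to.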
Speaker: William Wang
Bio
William Wang is a final-year PhD student at the School of Computer Science, Carnegie Mellon University, working with William Cohen. He has broad interests in machine learning approaches to data science, including statistical relational learning and information extraction.
Location: Core A (Room 301)
Committee:
Dimitris Metaxas
Event Type: Computer Science Department Colloquium
Organization: Carnegie Mellon University