We build models that answer varied types of questions requiring understanding and reasoning over a given context.
We develop algorithms to extract knowledge from unstructured text, with applications to downstream tasks like knowledge base construction and text generation.
Our work in dialog and text generation focuses on producing natural language text that is both engaging and correct, whether conversational, educational, or technical.
We develop new transformations and neural architectures that allow models to learn richer representations effectively and efficiently across different domains, including visual and sequence modeling tasks.
Hannaneh Hajishirzi's lab page, University of Washington, 2021