Additive deductive embeddings for planning of natural language proofs

dc.contributor.advisor: Durrett, Greg
dc.creator: Sprague, Zayne
dc.creator.orcid: 0009-0000-6887-8029
dc.date.accessioned: 2023-08-25T17:12:53Z
dc.date.available: 2023-08-25T17:12:53Z
dc.date.created: 2023-05
dc.date.issued: 2023-04-21
dc.date.submitted: May 2023
dc.date.updated: 2023-08-25T17:12:54Z
dc.description.abstract: Constructing a logical argument to support a claim entails selecting relevant evidence from a collection of facts and formulating a sequence of reasoning steps. Current natural language systems designed for claim validation employ large language models to reason and plan deductions. In this thesis, we investigate whether embedding spaces can adequately represent natural language deductions for use in proof-generating systems. We introduce a novel method for planning natural language proof generation called Additive Deduction, which relies on simple arithmetic operations performed exclusively in an embedding space. We explore multiple sources of off-the-shelf dense embeddings as well as sparse embeddings from BM25. We devise three experiments to evaluate the effectiveness of embedding models for natural language planning. The first consists of two intrinsic evaluations of Additive Deduction using two off-the-shelf sentence encoders. The second incorporates an embedding-based heuristic into planning for proof generation on two natural language proof datasets, EntailmentBank and Everyday Norms: Why Not? Lastly, we create a dataset that benchmarks various reasoning categories and common reasoning failures. Our findings suggest that while standard embedding methods frequently embed conclusions near the sums of their premises, they fail to fully capture certain categories of reasoning, which hurts their proof generation performance.
dc.description.department: Computer Science
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/2152/121237
dc.identifier.uri: http://dx.doi.org/10.26153/tsw/48065
dc.language.iso: en
dc.subject: NLP
dc.subject: Natural Language Inference
dc.subject: Textual reasoning
dc.subject: Natural Language Deduction
dc.subject: NLI
dc.subject: Reasoning in text
dc.title: Additive deductive embeddings for planning of natural language proofs
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Computer Sciences
thesis.degree.discipline: Computer Science
thesis.degree.grantor: The University of Texas at Austin
thesis.degree.level: Masters
thesis.degree.name: Master of Science in Computer Sciences
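
The abstract describes planning by simple arithmetic over sentence embeddings: embed each premise, add the embeddings, and check how close a candidate conclusion's embedding lies to that sum. The Python sketch below illustrates this additive scoring under assumptions of my own (the sentence-transformers library, the all-MiniLM-L6-v2 encoder, and cosine similarity as the closeness measure); it is not the thesis's exact configuration.

import numpy as np
from sentence_transformers import SentenceTransformer

# Assumed off-the-shelf sentence encoder; the thesis may use different models.
encoder = SentenceTransformer("all-MiniLM-L6-v2")

def additive_score(premises, conclusion):
    """Cosine similarity between the sum of premise embeddings and the conclusion embedding."""
    prem_vecs = encoder.encode(premises)          # shape: (n_premises, dim)
    concl_vec = encoder.encode([conclusion])[0]   # shape: (dim,)
    combined = prem_vecs.sum(axis=0)              # additive combination in embedding space
    return float(np.dot(combined, concl_vec) /
                 (np.linalg.norm(combined) * np.linalg.norm(concl_vec)))

# Higher scores indicate the conclusion lies near the premises' sum.
print(additive_score(
    ["All birds can fly.", "A sparrow is a bird."],
    "A sparrow can fly.",
))

In a proof-planning loop, such a score could serve as the embedding-based heuristic the abstract mentions, ranking which facts to combine next when searching for a deduction that reaches the goal conclusion.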

Access full-text files

Original bundle
  SPRAGUE-THESIS-2023.pdf (1008.35 KB, Adobe Portable Document Format)

License bundle
  PROQUEST_LICENSE.txt (4.45 KB, Plain Text)
  LICENSE.txt (1.84 KB, Plain Text)