Social commonsense reasoning with multi-head knowledge attention

Bibliographic Details
Main Authors: Paul, Debjit (Author), Frank, Anette (Author)
Format: Conference Paper (Chapter/Article)
Language: English
Published: November 2020
In: Findings of the Association for Computational Linguistics: EMNLP 2020
Year: 2020, Pages: 2969-2980
DOI: 10.18653/v1/2020.findings-emnlp.267
Online Access: Publisher, license required, full text: https://doi.org/10.18653/v1/2020.findings-emnlp.267
Publisher, license required, full text: https://aclanthology.org/2020.findings-emnlp.267
Author Notes: Debjit Paul, Anette Frank
Description
Summary: Social Commonsense Reasoning requires understanding of text, knowledge about social events and their pragmatic implications, as well as commonsense reasoning skills. In this work we propose a novel multi-head knowledge attention model that encodes semi-structured commonsense inference rules and learns to incorporate them in a transformer-based reasoning cell. We assess the model's performance on two tasks that require different reasoning skills: Abductive Natural Language Inference and Counterfactual Invariance Prediction as a new task. We show that our proposed model improves performance over strong state-of-the-art models (i.e., RoBERTa) across both reasoning tasks. Notably, we are, to the best of our knowledge, the first to demonstrate that a model that learns to perform counterfactual reasoning helps predict the best explanation in an abductive reasoning task. We validate the robustness of the model's reasoning capabilities by perturbing the knowledge and provide a qualitative analysis of the model's knowledge incorporation capabilities.
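The abstract gives no implementation details, so the following is only a minimal sketch of the general mechanism it names: multi-head attention from a task encoding over a set of encoded knowledge rules, fused back into the task representation. All names, dimensions, and the fusion step are hypothetical assumptions, not the authors' actual model.

```python
import torch
import torch.nn as nn

class KnowledgeAttention(nn.Module):
    """Hypothetical sketch: multi-head attention over encoded knowledge rules.

    A task encoding (e.g., a [CLS]-style sentence vector) attends over K
    rule encodings; the attended knowledge is fused back into the task
    representation. Dimensions and fusion are illustrative assumptions.
    """

    def __init__(self, hidden: int = 768, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden, num_heads, batch_first=True)
        self.fuse = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.Tanh())

    def forward(self, task_enc: torch.Tensor, rule_encs: torch.Tensor) -> torch.Tensor:
        # task_enc: (batch, 1, hidden) -- encoded task context
        # rule_encs: (batch, K, hidden) -- K encoded commonsense inference rules
        attended, _ = self.attn(query=task_enc, key=rule_encs, value=rule_encs)
        # Concatenate the original encoding with the attended knowledge, project back
        return self.fuse(torch.cat([task_enc, attended], dim=-1))

# Usage with random placeholder encodings
batch, K, hidden = 2, 5, 768
layer = KnowledgeAttention(hidden)
out = layer(torch.randn(batch, 1, hidden), torch.randn(batch, K, hidden))
print(out.shape)  # torch.Size([2, 1, 768])
```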
Item Description: Viewed on 10.07.2023
Physical Description: Online Resource
ISBN: 9781952148903