LLMs4Implicit-knowledge-generation

Bibliographic Details
Main Author: Becker, Maria (Author)
Format: Database; Research Data
Language: English
Published: Universität Heidelberg, 2024-02-26
DOI: 10.11588/data/5VTJ26
Online Access: Publisher, free of charge, full text: https://doi.org/10.11588/data/5VTJ26
Publisher, free of charge, full text: https://heidata.uni-heidelberg.de/dataset.xhtml?persistentId=doi:10.11588/DATA/5VTJ26
Author Notes: Maria Becker
Description
Summary: Code for equipping pretrained language models (BART, GPT-2, XLNet) with commonsense knowledge for generating implicit knowledge statements between two sentences, by (i) finetuning the models on corpora enriched with implicit information, and (ii) constraining the models with key concepts and commonsense knowledge paths connecting them.
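The record only points to the code, but the summary is concrete enough for a rough illustration. Below is a minimal sketch, not the authors' implementation: it assumes a finetuned BART checkpoint (the path is hypothetical) and an input format that joins the two sentences with BART's separator token, and it uses Hugging Face transformers' force_words_ids constrained beam search as a stand-in for the key-concept constraint described in (ii).

# Minimal sketch; checkpoint path, input format, and example concept are assumptions.
from transformers import BartForConditionalGeneration, BartTokenizer

MODEL_PATH = "path/to/finetuned-bart"  # hypothetical finetuned checkpoint

tokenizer = BartTokenizer.from_pretrained(MODEL_PATH)
model = BartForConditionalGeneration.from_pretrained(MODEL_PATH)

# The two sentences between which an implicit knowledge statement is generated.
sentence_a = "She grabbed an umbrella before leaving."
sentence_b = "Her clothes stayed dry on the way to work."

# Assumed input format: both sentences joined by the model's separator token.
inputs = tokenizer(f"{sentence_a} </s> {sentence_b}", return_tensors="pt")

# Key concept the output must contain; force_words_ids triggers constrained
# beam search (requires num_beams > 1) in recent transformers versions.
concept_ids = [tokenizer("umbrella", add_special_tokens=False).input_ids]

output_ids = model.generate(
    **inputs,
    num_beams=5,
    max_length=32,
    force_words_ids=concept_ids,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Constrained beam search only approximates the concept constraint; the knowledge-path conditioning in (ii) would additionally need the connecting commonsense paths encoded in the input, for which the linked repository should be consulted.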
Item Description: Viewed on 30.01.2025
Funded by: DFG: SPP-1999; DFG: FR 1707/4-1; Leibniz-Gesellschaft and Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg: SAS-2015-IDS-LWC
Physical Description: Online Resource
DOI: 10.11588/data/5VTJ26