This study investigates the accuracy of the BART transformer model used as a sequence-to-sequence parser that maps English sentences to EDS semantic graphs.
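A minimal sketch of this setup, assuming the HuggingFace `facebook/bart-base` checkpoint and a hypothetical linearized EDS target format (neither is specified by the study itself):

```python
# Illustrative sketch only: a pretrained BART model used as a sequence-to-sequence
# parser that emits a linearized EDS graph for an English sentence. The target
# linearization shown in the comment below is a hypothetical example.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

sentence = "The dog barked."
inputs = tokenizer(sentence, return_tensors="pt")

# After fine-tuning on (sentence, linearized-EDS) pairs, generation would yield a
# string such as "( e2 / _bark_v_1 :ARG1 ( x4 / _dog_n_1 ) )" -- hypothetical format.
output_ids = model.generate(**inputs, max_length=128, num_beams=4)
linearized_graph = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(linearized_graph)
```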
This study compares BiLSTMs and transformers for node prediction in semantic graphs constructed under the EDS framework.
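One way such a comparison could be framed is with two interchangeable encoders behind a shared per-token node-label classifier; the sketch below is an assumption about the setup, not the study's implementation, and the model sizes are arbitrary.

```python
# Illustrative sketch: BiLSTM vs. transformer encoders compared for node prediction,
# with a shared head that assigns an EDS node label to each input token.
import torch
import torch.nn as nn

class NodePredictor(nn.Module):
    def __init__(self, vocab_size, num_node_labels, d_model=256, encoder_type="bilstm"):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        if encoder_type == "bilstm":
            # Bidirectional LSTM; hidden size d_model // 2 so outputs stay d_model wide.
            self.encoder = nn.LSTM(d_model, d_model // 2, batch_first=True, bidirectional=True)
        else:
            # Transformer encoder with the same output dimensionality.
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.encoder_type = encoder_type
        self.classifier = nn.Linear(d_model, num_node_labels)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        if self.encoder_type == "bilstm":
            x, _ = self.encoder(x)
        else:
            x = self.encoder(x)
        return self.classifier(x)  # per-token node-label logits

# Example: node-label logits for a batch of two 5-token sentences.
model = NodePredictor(vocab_size=1000, num_node_labels=50, encoder_type="bilstm")
logits = model(torch.randint(0, 1000, (2, 5)))
print(logits.shape)  # torch.Size([2, 5, 50])
```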
This study further investigates the effectiveness of the BERT transformer and a maximum entropy loss in the edge prediction component of a semantic graph parser.
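A sketch of one plausible form of such a component follows; it assumes the edge scorer concatenates BERT embeddings of the two candidate node anchors, and it realises the maximum entropy objective as the softmax cross-entropy loss of a log-linear scorer. The checkpoint name and scorer design are assumptions, not the study's architecture.

```python
# Illustrative sketch: a BERT-based edge prediction component. Contextual embeddings
# of the head and dependent node anchors are concatenated and scored over edge labels;
# training uses softmax cross-entropy, the maximum-entropy objective of a log-linear model.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class EdgePredictor(nn.Module):
    def __init__(self, num_edge_labels, bert_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        self.scorer = nn.Linear(2 * hidden, num_edge_labels)

    def forward(self, input_ids, attention_mask, head_idx, dep_idx):
        # Contextual token embeddings from BERT.
        states = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        batch = torch.arange(states.size(0))
        head = states[batch, head_idx]  # embedding of the head node's anchor token
        dep = states[batch, dep_idx]    # embedding of the dependent node's anchor token
        return self.scorer(torch.cat([head, dep], dim=-1))  # edge-label logits

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = EdgePredictor(num_edge_labels=10)
enc = tokenizer("The dog barked.", return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"],
               head_idx=torch.tensor([3]), dep_idx=torch.tensor([2]))
loss = nn.CrossEntropyLoss()(logits, torch.tensor([1]))  # maximum-entropy / cross-entropy objective
```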