Preprint Article, Version 1 (this version is not peer-reviewed)

Research on Machine Reading Comprehension Integrating Long-Distance Semantic Relations and Linguistic Features

N. Ma * and W. Di
Version 1 : Received: 18 July 2024 / Approved: 19 July 2024 / Online: 19 July 2024 (09:49:47 CEST)

How to cite: Ma, N.; Di, W. Research on Machine Reading Comprehension Integrating Long-Distance Semantic Relations and Linguistic Features. Preprints 2024, 2024071590. https://doi.org/10.20944/preprints202407.1590.v1

Abstract

Machine reading comprehension is a core task in natural language processing: it aims to enable machines to read and understand text as humans do, and to answer questions about the text's content. Pre-trained language models, represented by BERT, have outperformed traditional models on many NLP tasks, establishing the pre-training paradigm in the field. This paper addresses the limited ability of pre-trained language models to capture long-distance semantic relations and to make efficient use of linguistic features. First, recent developments in pre-trained language models are reviewed. Then, two feature graphs are used to represent structured long-distance semantic correlations explicitly, and these are integrated with traditional sequence-structure features; the influence of different graph-construction methods on machine reading comprehension is compared. Finally, the application of pre-trained language models to machine reading comprehension is summarized and future directions are discussed. By fusing graph-structure features, the model learns richer linguistic knowledge and thereby further improves its reasoning ability.
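To make the fusion idea concrete, the following is a minimal PyTorch sketch of one way such a model might combine BERT token representations with a graph of long-distance relations. It assumes the relations arrive as a token-level adjacency matrix (for example, from a dependency parse or coreference links); the class name GraphFusionLayer, the toy adjacency matrix, and the single graph-convolution step are illustrative assumptions, not the paper's actual architecture.

```python
# Sketch: fuse graph-structured long-distance features with BERT token states.
# The adjacency matrix, layer name, and single-layer GCN are assumptions for
# illustration; the paper's concrete architecture may differ.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class GraphFusionLayer(nn.Module):
    """One graph-convolution step over BERT token embeddings."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)

    def forward(self, token_states, adjacency):
        # Symmetrically normalize A + I so each token also keeps its own state.
        a_hat = adjacency + torch.eye(adjacency.size(-1))
        deg = a_hat.sum(-1)
        d_inv_sqrt = torch.diag_embed(deg.pow(-0.5))
        a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
        # Propagate states along long-distance edges, then mix channels.
        fused = torch.relu(self.linear(a_norm @ token_states))
        # Residual connection preserves the original sequence features.
        return token_states + fused

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

text = "The treaty, which was signed in 1990, ended the conflict."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    token_states = bert(**inputs).last_hidden_state  # (1, seq_len, 768)

# Hypothetical adjacency: a single long-range edge standing in for a
# dependency or coreference link between distant tokens.
seq_len = token_states.size(1)
adjacency = torch.zeros(seq_len, seq_len)
adjacency[0, seq_len - 1] = adjacency[seq_len - 1, 0] = 1.0

fusion = GraphFusionLayer(hidden_size=768)
fused_states = fusion(token_states, adjacency.unsqueeze(0))
print(fused_states.shape)  # torch.Size([1, seq_len, 768])
```

The residual connection keeps the sequence-structure features intact while the graph term injects the long-distance signal, mirroring the integration of graph and sequence features described in the abstract.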

Keywords

Machine reading comprehension; Long-distance semantic relations; Linguistic features; Pre-trained language models

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
