Neural Language Models for the Multilingual, Transcultural, and Multimodal Semantic Web

Tracking #: 2244-3457

This paper is currently under review
Authors: 
Dagmar Gromann

Responsible editor: 
Guest Editor 10-years SWJ

Submission type: 
Other
Abstract: 
A vision of a truly multilingual Semantic Web has found strong support in the Linguistic Linked Open Data community. Standards such as OntoLex-Lemon highlight the importance of explicit linguistic modeling in relation to ontologies and knowledge graphs. Nevertheless, there is room for improvement in terms of automation, usability, and interoperability. Neural language models have achieved breakthroughs and successes well beyond standard Natural Language Processing (NLP) tasks, and recently also for multimodal representations. Several paths naturally open up to port these successes to the Semantic Web, from automatically translating the linguistic information associated with structured knowledge resources, to multimodal question answering with machine translation, to multilingual text-video knowledge representation with embeddings. Language is also an important vehicle for culture, an aspect that deserves considerably more attention. Building on existing approaches, this article envisions joining forces between Neural Language Models and Semantic Web technologies for multilingual, transcultural, and multimodal information access.