Visual Analysis of Sentiment and Stance in Social Media Texts
2018 (English). In: EuroVis 2018 - Posters / [ed] Anna Puig and Renata Raidou, Eurographics - European Association for Computer Graphics, 2018, p. 49-51
Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]
Despite the growing interest in visualizing sentiments and emotions in textual data, the task of detecting and visualizing various stances is not well addressed by existing approaches. The challenges associated with this task include the development of the underlying computational methods and the visualization of the corresponding multi-label stance classification results. In this poster abstract, we describe ongoing work on a visual analytics platform called StanceVis Prime, designed for the analysis of sentiment and stance in temporal text data from various social media sources. Our approach consumes documents from several text stream sources, applies sentiment and stance classification, and provides end users with both an overview of the resulting data series and a detailed view for close reading and examination of the classifiers' output. The intended use case scenarios for StanceVis Prime include social media monitoring and research in sociolinguistics.
Place, publisher, year, edition, pages
Eurographics - European Association for Computer Graphics, 2018. p. 49-51
Keywords [en]
visual analytics, visualization, information visualization, interaction, sentiment analysis, stance analysis, text mining, natural language processing
National Category
Computer Sciences; Human Computer Interaction; Language Technology (Computational Linguistics)
Research subject
Computer Science, Information and software visualization
Identifiers
URN: urn:nbn:se:lnu:diva-73397
DOI: 10.2312/eurp.20181127
ISBN: 978-3-03868-065-9 (print)
OAI: oai:DiVA.org:lnu-73397
DiVA, id: diva2:1200539
Conference
The 20th EG/VGTC Conference on Visualization (EuroVis '18), Brno, Czech Republic, 4-8 June 2018
Projects
StaViCTA
Funder
Swedish Research Council, 2012-5659
Available from: 2018-04-24 Created: 2018-04-24 Last updated: 2020-10-26 Bibliographically approved