lnu.se Publications

Publications (8 of 8)
Reski, N., Alissandrakis, A. & Tyrkkö, J. (2019). Collaborative exploration of rich corpus data using immersive virtual reality and non-immersive technologies. In: ADDA: Approaches to Digital Discourse Analysis – ADDA 2, Turku, Finland, 23-25 May 2019; Book of abstracts. Paper presented at the 2nd International Conference: Approaches to Digital Discourse Analysis (ADDA 2), 23-25 May 2019, Turku, Finland (p. 7). Turku: University of Turku
Collaborative exploration of rich corpus data using immersive virtual reality and non-immersive technologies
2019 (English). In: ADDA: Approaches to Digital Discourse Analysis – ADDA 2, Turku, Finland, 23-25 May 2019; Book of abstracts. Turku: University of Turku, 2019, p. 7. Conference paper, oral presentation with published abstract (Other academic)
Abstract [en]

In recent years, large textual data sets, comprising many data points and rich metadata, have become a common object of investigation and analysis. Information Visualization and Visual Analytics provide practical tools for visual data analysis, most commonly as interactive two-dimensional (2D) visualizations displayed on normal computer monitors. At the same time, display technologies have evolved rapidly over the past decade. In particular, emerging technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) have become affordable and more user-friendly (LaValle 2016). Under the banner of “Immersive Analytics”, researchers have started to explore the novel application of such immersive technologies for the purpose of data analysis (Marriott et al. 2018).

By using immersive technologies, researchers hope to increase motivation and user engagement in the overall data analysis activity, as well as to provide different perspectives on the data. This can be particularly helpful in exploratory data analysis, when the researcher attempts to identify interesting points or anomalies in the data without prior knowledge of what exactly they are searching for. Furthermore, the data analysis process often involves the collaborative sharing of information and knowledge between multiple users, with the goal of interpreting and making sense of the explored data together (Isenberg et al. 2011). However, immersive technologies such as VR are often single-user-centric experiences, where one user wears a head-mounted display (HMD) and is thus visually isolated from the real-world surroundings. Consequently, new tools and approaches for co-located, synchronous collaboration in such immersive data analysis scenarios are needed.

In this software demonstration, we present our VR system that enables two users to explore data at the same time, one inside an immersive VR environment and one outside VR using a non-immersive companion application. The demonstrated data analysis activity centers on exploring the language variability in tweets from the perspectives of multilingualism and sociolinguistics (see, e.g., Coats 2017 and Grieve et al. 2017). Our primary data come from the Nordic Tweet Stream (NTS) corpus (Laitinen et al. 2018, Tyrkkö 2018), and the immersive VR application visualizes in three dimensions (3D) the clustered Twitter traffic within the Nordic region as stacked cuboids positioned geospatially, where each stack represents a color-coded language share (Alissandrakis et al. 2018). Using 3D gestural input, the VR user can interact with the data through hand postures and gestures to move through the virtual 3D space, select clusters to display more detailed information, and navigate through time (Reski and Alissandrakis 2019) ( https://vrxar.lnu.se/apps/odxvrxnts-360/ ). A non-immersive companion application, running in a normal web browser, presents an overview map of the Nordic region as well as other supplemental information about the data that is more suitable for display using non-immersive technologies.
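
As a rough illustration of the aggregation behind the stacked-cuboid view (a sketch, not the actual NTS implementation), the following Python fragment bins geotagged tweets into coarse geographic cells and counts tweets per language; each (language, count) pair would correspond to one color-coded segment of a stack. The field names and grid size are assumptions made for illustration.

```python
from collections import Counter, defaultdict

def aggregate_language_shares(tweets, grid=0.5):
    """Group tweets into lat/lon grid cells and count tweets per language."""
    clusters = defaultdict(Counter)
    for tweet in tweets:
        # Snap coordinates to a coarse grid cell (cell size in degrees).
        cell = (round(tweet["lat"] / grid) * grid,
                round(tweet["lon"] / grid) * grid)
        clusters[cell][tweet["lang"]] += 1
    return clusters

# Tiny made-up sample; the real corpus holds millions of geotagged tweets.
tweets = [
    {"lat": 59.33, "lon": 18.07, "lang": "sv"},
    {"lat": 59.31, "lon": 18.05, "lang": "en"},
    {"lat": 60.17, "lon": 24.94, "lang": "fi"},
]
for cell, langs in aggregate_language_shares(tweets).items():
    # Each (language, count) pair would become one segment of the stack.
    print(cell, langs.most_common())
```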

We will present two complementary applications, each with a different objective within the collaborative data analysis framework. Dedicated connectivity and collaboration features within these applications facilitate co-located, synchronous exploration and sensemaking. For instance, the VR user’s position and orientation are displayed and updated in real time on the overview map of the non-immersive application; conversely, the cluster selected by the non-immersive user is highlighted for the user in VR. Initial tests with pairs of language students validated the proof of concept of the collaborative system and encourage further investigations in this direction.
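
The two-way exchange described above can be sketched as a pair of typed messages, one per direction. This is a minimal, hypothetical Python illustration of such a protocol; the abstract does not specify the actual message format, so every name below is an assumption.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class VRPose:
    x: float          # position in the shared scene (hypothetical units)
    y: float
    z: float
    yaw: float        # heading, drawn on the companion's overview map

@dataclass
class ClusterSelection:
    cluster_id: str   # cluster picked in the browser, highlighted in VR

def encode(kind: str, payload) -> str:
    """Wrap a payload in a typed JSON envelope for the other client."""
    return json.dumps({"type": kind, "data": asdict(payload)})

# VR -> companion: pose updates; companion -> VR: selections.
print(encode("vr_pose", VRPose(12.0, 1.7, -3.5, yaw=90.0)))
print(encode("selection", ClusterSelection("cluster-se-stockholm")))
```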

Place, publisher, year, edition, pages
Turku: University of Turku, 2019
Keywords
virtual reality, Nordic Tweet Stream, digital humanities, immersive analytics
National Category
Human Computer Interaction; General Language Studies and Linguistics; Language Technology (Computational Linguistics)
Research subject
Computer and Information Sciences Computer Science, Computer Science; Computer Science, Information and software visualization; Humanities, Linguistics
Identifiers
urn:nbn:se:lnu:diva-83858 (URN)
Conference
2nd International Conference: Approaches to Digital Discourse Analysis (ADDA 2), 23-25 May, 2019, Turku, Finland
Projects
DISA-DH; Open Data Exploration in Virtual Reality (ODxVR)
Available from: 2019-05-28 Created: 2019-05-28 Last updated: 2019-06-03. Bibliographically approved
Reski, N. & Alissandrakis, A. (2019). Open data exploration in virtual reality: a comparative study of input technology. Virtual Reality
Open data exploration in virtual reality: a comparative study of input technology
2019 (English). In: Virtual Reality, ISSN 1359-4338, E-ISSN 1434-9957. Article in journal (Refereed). Published
Abstract [en]

In this article, we compare three different input technologies (gamepad, vision-based motion controls, room-scale) for an interactive virtual reality (VR) environment. The overall system visualizes (open) data from multiple online sources in a unified interface, enabling the user to browse and explore the displayed information in an immersive VR setting. We conducted a user interaction study (n=24; n=8 per input technology, between-group design) to investigate experienced workload and perceived flow of interaction. Log files and observations provided further insights and allowed comparison of the conditions. We identified trends indicating user preference for a visual (virtual) representation, but no clear trends regarding the use of physical controllers (over vision-based controls), in a scenario that encouraged exploration with no time limitations.

Place, publisher, year, edition, pages
Springer, 2019
Keywords
Comparative study, Gamepad, Room-scale virtual reality, Virtual reality, Vision-based motion controls, 3D gestural input
National Category
Computer Sciences; Human Computer Interaction
Research subject
Computer and Information Sciences Computer Science; Computer and Information Sciences Computer Science, Computer Science; Computer Science, Information and software visualization
Identifiers
urn:nbn:se:lnu:diva-79974 (URN); 10.1007/s10055-019-00378-w (DOI)
Projects
Open Data Exploration in Virtual Reality (ODxVR)
Funder
Knowledge Foundation, 2016/0174
Available from: 2019-01-28 Created: 2019-01-28 Last updated: 2019-04-17
Reski, N. & Alissandrakis, A. (2018). Using an Augmented Reality Cube-like Interface and 3D Gesture-based Interaction to Navigate and Manipulate Data. In: VINCI '18: Proceedings of the 11th International Symposium on Visual Information Communication and Interaction. Paper presented at the 11th International Symposium on Visual Information Communication and Interaction (VINCI '18), Växjö, Sweden, August 13-15, 2018 (pp. 92-96). New York: Association for Computing Machinery (ACM)
Using an Augmented Reality Cube-like Interface and 3D Gesture-based Interaction to Navigate and Manipulate Data
2018 (English). In: VINCI '18: Proceedings of the 11th International Symposium on Visual Information Communication and Interaction. New York: Association for Computing Machinery (ACM), 2018, pp. 92-96. Conference paper, published paper (Refereed)
Abstract [en]

In this paper we describe our work in progress on an interface that enables users to browse and select data within an Augmented Reality environment, using a virtual cube object that can be interacted with through 3D gestural input. We present the prototype design (including the graphical elements), describe the interaction possibilities of touching the cube with the hand/finger, and put the prototype into the context of our Augmented Reality for Public Engagement (PEAR) framework. An interactive prototype was implemented and runs on a typical off-the-shelf smartphone.
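
To make the touch interaction concrete, here is a hedged Python sketch of the kind of hit test such a cube interface needs: deciding whether a tracked fingertip is in contact with the cube and, if so, which face it touches. The cube pose, touch tolerance, and coordinates are illustrative assumptions, not details taken from the paper.

```python
def touched_face(finger, center, half=0.5, tol=0.05):
    """Return the name of the touched cube face, or None if no contact."""
    dx, dy, dz = (finger[i] - center[i] for i in range(3))
    if max(abs(dx), abs(dy), abs(dz)) > half + tol:
        return None  # fingertip is outside the cube plus touch tolerance
    offsets = (dx, dy, dz)
    # The dominant axis of the offset determines which face is touched.
    axis = max(range(3), key=lambda i: abs(offsets[i]))
    positive = offsets[axis] >= 0
    faces = [("left", "right"), ("bottom", "top"), ("back", "front")]
    return faces[axis][positive]

# Fingertip just right of a unit cube centred at the origin -> "right".
print(touched_face((0.52, 0.10, 0.00), (0.0, 0.0, 0.0)))
```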

Place, publisher, year, edition, pages
New York: Association for Computing Machinery (ACM), 2018
Keywords
human-computer interaction, augmented reality, interaction design, 3D user interface, 3D gesture-based interaction
National Category
Human Computer Interaction
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-77132 (URN); 10.1145/3231622.3231625 (DOI); 978-1-4503-6501-7 (ISBN)
Conference
11th International Symposium on Visual Information Communication and Interaction (VINCI '18), Växjö, Sweden, August 13 - 15, 2018
Projects
Augmented Reality for Public Engagement (PEAR)
Funder
Knowledge Foundation
Available from: 2018-08-15 Created: 2018-08-15 Last updated: 2018-09-11. Bibliographically approved
Alissandrakis, A., Reski, N., Laitinen, M., Tyrkkö, J., Levin, M. & Lundberg, J. (2018). Visualizing dynamic text corpora using Virtual Reality. In: ICAME 39: Tampere, 30 May – 3 June 2018: Corpus Linguistics and Changing Society: Book of Abstracts. Paper presented at the 39th Annual Conference of the International Computer Archive for Modern and Medieval English (ICAME39): Corpus Linguistics and Changing Society, Tampere, 30 May – 3 June 2018 (p. 205). Tampere: University of Tampere
Visualizing dynamic text corpora using Virtual Reality
2018 (English). In: ICAME 39: Tampere, 30 May – 3 June 2018: Corpus Linguistics and Changing Society: Book of Abstracts. Tampere: University of Tampere, 2018, p. 205. Conference paper, oral presentation with published abstract (Refereed)
Abstract [en]

In recent years, data visualization has become a major area in Digital Humanities research, and the same holds true in linguistics. The rapidly increasing size of corpora, the emergence of dynamic real-time streams, and the availability of complex and enriched metadata have made it increasingly important to facilitate new and innovative approaches to presenting and exploring primary data. This demonstration showcases the use of Virtual Reality (VR) in the visualization of geospatial linguistic data from the Nordic Tweet Stream (NTS) project (see Laitinen et al. 2017). The NTS data for this demonstration comprise a full year of geotagged tweets (12,443,696 tweets from 273,648 user accounts) posted within the Nordic region (Denmark, Finland, Iceland, Norway, and Sweden). The dataset includes over 50 metadata parameters in addition to the tweets themselves.

We demonstrate the potential of using VR to efficiently find meaningful patterns in vast streams of data. The VR environment allows an easy overview of any of the features (textual or metadata) in a text corpus. Our focus will be on the language identification data, which provide a previously unexplored perspective on the use of English and other non-indigenous languages in the Nordic countries alongside the native languages of the region.

Our VR prototype utilizes the HTC Vive headset for a room-scale VR scenario and is developed using the Unity3D game engine. Each node in the VR space is displayed as a stacked cuboid, the equivalent of a bar chart in three-dimensional space, summarizing all tweets at one geographic location for a given point in time (see: https://tinyurl.com/nts-vr). Each stacked cuboid represents the three most frequently used languages at that location, appropriately color-coded, enabling the user to get an overview of the language distribution at each location. The VR prototype further encourages users to move between different locations and inspect points of interest in more detail (overall location-related information, a detailed list of all languages detected, the most frequently used hashtags). An underlying map outlines country borders and facilitates orientation. In addition to spatial movement through the Nordic areas, the VR system provides an interface to navigate the Twitter data based on time (days, weeks, months, or predefined special events), enabling users to explore the data over time (see: https://tinyurl.com/nts-vr-time).
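
The time-based navigation can be thought of as slicing the same geotagged data into consecutive time windows, so that stepping through days or weeks swaps which slice drives the cuboids. The following Python fragment is an assumption-laden sketch of that idea, not the prototype's Unity3D code.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def slice_by_window(tweets, start, window=timedelta(days=1)):
    """Bucket tweets into consecutive time windows starting at `start`."""
    slices = defaultdict(list)
    for tweet in tweets:
        index = (tweet["time"] - start) // window   # window 0, 1, 2, ...
        slices[index].append(tweet)
    return slices

# Two made-up tweets on consecutive days.
tweets = [
    {"time": datetime(2016, 1, 1, 9), "lang": "sv"},
    {"time": datetime(2016, 1, 2, 14), "lang": "en"},
]
slices = slice_by_window(tweets, datetime(2016, 1, 1))
print({window: len(items) for window, items in slices.items()})  # {0: 1, 1: 1}
```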

In addition to demonstrating how the VR methods aid data visualization and exploration, we will also briefly discuss the pedagogical implications of using VR to showcase linguistic diversity.

Place, publisher, year, edition, pages
Tampere: University of Tampere, 2018
Keywords
virtual reality, Nordic Tweet Stream, digital humanities
National Category
General Language Studies and Linguistics; Human Computer Interaction; Language Technology (Computational Linguistics)
Research subject
Computer Science, Information and software visualization; Humanities, Linguistics
Identifiers
urn:nbn:se:lnu:diva-75064 (URN)
Conference
The 39th Annual Conference of the International Computer Archive for Modern and Medieval English (ICAME39): Corpus Linguistics and Changing Society. Tampere, 30 May - 3 June, 2018
Projects
DISA-DH; Open Data Exploration in Virtual Reality (ODxVR)
Available from: 2018-06-05 Created: 2018-06-05 Last updated: 2018-07-23. Bibliographically approved
Alissandrakis, A. & Reski, N. (2017). Using Mobile Augmented Reality to Facilitate Public Engagement. In: Koraljka Golub, Marcelo Milrad (Eds.), Extended Papers of the International Symposium on Digital Humanities (DH 2016). Paper presented at the International Symposium on Digital Humanities (DH 2016), Växjö, Sweden, November 7-8, 2016 (pp. 99-109). CEUR-WS, Vol. 2021
Using Mobile Augmented Reality to Facilitate Public Engagement
2017 (English). In: Extended Papers of the International Symposium on Digital Humanities (DH 2016) / [ed] Koraljka Golub, Marcelo Milrad, CEUR-WS, 2017, Vol. 2021, pp. 99-109. Conference paper, published paper (Refereed)
Abstract [en]

This paper presents our initial efforts towards the development of a framework for facilitating public engagement through the use of mobile Augmented Reality (mAR), falling under the overall project title "Augmented Reality for Public Engagement" (PEAR). We present the concept and implementation, and discuss the results from the deployment of a mobile phone app (PEAR 4 VXO). The mobile app was used for a user study in conjunction with a campaign carried out by Växjö municipality (Sweden) exploring how to get citizens more engaged in urban planning actions and decisions. These particular activities took place during spring 2016. One of the salient features of our approach is that it combines novel ways of using mAR together with social media, online databases, and sensors to support public engagement. In addition, the data collection process and audience engagement were tested in a follow-up limited deployment. The analysis and outcomes of our initial results validate the overall concept and indicate the potential usefulness of the app as a tool, but also highlight the need for an active campaign on the part of the stakeholders. Our future efforts will focus on addressing some of the problems and challenges that we identified during the different phases of this user study.

Place, publisher, year, edition, pages
CEUR-WS, 2017
Series
CEUR Workshop Proceedings, ISSN 1613-0073
Keywords
Augmented Reality, public engagement, crowdsourcing
National Category
Human Computer Interaction
Research subject
Computer and Information Sciences Computer Science, Media Technology; Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-69265 (URN)
Conference
International Symposium on Digital Humanities (DH 2016), Växjö, Sweden, November 7-8, 2016
Projects
Augmented Reality for Public Engagement (PEAR)
Available from: 2017-12-13 Created: 2017-12-13 Last updated: 2019-01-10. Bibliographically approved
Yousefi, S., Kidane, M., Delgado, Y., Chana, J. & Reski, N. (2016). 3D Gesture-Based Interaction for Immersive Experience in Mobile VR. In: 2016 23rd International Conference on Pattern Recognition (ICPR), Cancún Center, Cancún, México, December 4-8, 2016. Paper presented at the 23rd International Conference on Pattern Recognition (ICPR): Image Analysis and Machine Learning for Scene Understanding, Cancún, 4-8 December 2016 (pp. 2122-2127). Cancún: IEEE Press
3D Gesture-Based Interaction for Immersive Experience in Mobile VR
2016 (English). In: 2016 23rd International Conference on Pattern Recognition (ICPR), Cancún Center, Cancún, México, December 4-8, 2016. Cancún: IEEE Press, 2016, pp. 2122-2127. Conference paper, published paper (Refereed)
Abstract [en]

In this paper we introduce a novel solution for real-time 3D hand gesture analysis using the embedded 2D camera of a mobile device. The presented framework is based on forming a large database of hand gestures, including ground-truth information about hand poses and details of finger joints in 3D. For a query frame captured by the mobile device's camera in real time, the gesture analysis system finds the best match from the database. Once the best match is found, the corresponding ground-truth information is used for interaction in the designed interface. The framework performs extremely efficient gesture analysis (more than 30 fps) under flexible lighting conditions and complex backgrounds with dynamic movement of the mobile device. The introduced work is implemented in Android and tested on a Gear VR headset.
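
The database-matching step can be illustrated with a simplified nearest-neighbour sketch: a query frame is reduced to a feature descriptor and compared against the stored descriptors, and the ground-truth pose of the closest entry is returned. The descriptor size, random placeholder data, and names below are assumptions for illustration; the paper's actual pipeline (which runs on-device at over 30 fps) is certainly more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for the offline-built gesture database: one descriptor per
# stored hand image, each paired with ground-truth 3D pose information.
database_features = rng.random((10_000, 64))
database_poses = [f"pose_{i}" for i in range(10_000)]

def best_match(query_descriptor):
    """Return the stored pose whose descriptor is closest to the query."""
    distances = np.linalg.norm(database_features - query_descriptor, axis=1)
    return database_poses[int(np.argmin(distances))]

# A random vector stands in for a descriptor extracted from a camera frame.
print(best_match(rng.random(64)))
```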

Place, publisher, year, edition, pages
Cancun: IEEE Press, 2016. p. 6
Series
International Conference on Pattern Recognition, ISSN 1051-4651
Keywords
Gesture and Behavior Analysis, Image and video analysis and understanding, Human Computer Interaction
National Category
Signal Processing
Research subject
Computer and Information Sciences Computer Science, Media Technology
Identifiers
urn:nbn:se:lnu:diva-61447 (URN); 10.1109/ICPR.2016.7899949 (DOI); 000406771302019 (ISI); 2-s2.0-85019102175 (Scopus ID); 9781509048465 (ISBN); 9781509048472 (ISBN)
Conference
23rd International Conference on Pattern Recognition (ICPR): Image Analysis and Machine Learning for Scene Understanding, Cancún, 4-8 December 2016
Available from: 2017-03-16 Created: 2017-03-16 Last updated: 2019-06-11Bibliographically approved
Reski, N. & Alissandrakis, A. (2016). Change Your Perspective: Exploration of a 3D Network Created from Open Data in an Immersive Virtual Reality Environment. In: Alma Leora Culén, Leslie Miller, Irini Giannopulu, Birgit Gersbeck-Schierholz (Eds.), ACHI 2016: The Ninth International Conference on Advances in Computer-Human Interactions. Paper presented at ACHI 2016: The Ninth International Conference on Advances in Computer-Human Interactions (pp. 403-410). International Academy, Research and Industry Association (IARIA)
Change Your Perspective: Exploration of a 3D Network Created from Open Data in an Immersive Virtual Reality Environment
2016 (English). In: ACHI 2016: The Ninth International Conference on Advances in Computer-Human Interactions / [ed] Alma Leora Culén, Leslie Miller, Irini Giannopulu, Birgit Gersbeck-Schierholz, International Academy, Research and Industry Association (IARIA), 2016, pp. 403-410. Conference paper, published paper (Refereed)
Abstract [en]

This paper investigates how to naturally interact with and explore information (based on open data) within an immersive virtual reality environment (VRE) using a head-mounted display and vision-based motion controls. We present the results of a user interaction study that investigated the acceptance of the developed prototype, estimated the workload, and examined the participants' behavior. Additional discussions with experts provided further feedback on the prototype's overall design and concept. The results indicate that the participants were enthusiastic about the novelty and intuitiveness of exploring information in a VRE, and were challenged (in a positive manner) by the applied interface and interaction design. The presented concept and design were well received by the experts, who valued the idea and implementation and encouraged us to be even bolder in making use of the available 3D environment.

Place, publisher, year, edition, pages
International Academy, Research and Industry Association (IARIA), 2016
Keywords
human-computer interaction, virtual reality, immersive interaction, information visualization
National Category
Media and Communication Technology
Research subject
Computer and Information Sciences Computer Science, Media Technology; Computer Science, Information and software visualization
Identifiers
urn:nbn:se:lnu:diva-52379 (URN); 978-1-61208-468-8 (ISBN)
Conference
ACHI 2016: The Ninth International Conference on Advances in Computer-Human Interactions
Available from: 2016-05-04 Created: 2016-05-04 Last updated: 2018-01-10. Bibliographically approved
Reski, N., Nordmark, S. & Milrad, M. (2014). Exploring New Interaction Mechanisms to Support Information Sharing and Collaboration Using Large Multi-touch Displays in the Context of Digital Storytelling. In: Proceedings of the 14th IEEE International Conference on Advanced Learning Technologies (ICALT 2014). Paper presented at the 14th IEEE International Conference on Advanced Learning Technologies (ICALT 2014): Advanced Technologies for Supporting Open Access to Formal and Informal Learning, July 7-9, 2014, Athens, Greece (pp. 176-180). IEEE Press
Exploring New Interaction Mechanisms to Support Information Sharing and Collaboration Using Large Multi-touch Displays in the Context of Digital Storytelling
2014 (English). In: Proceedings of the 14th IEEE International Conference on Advanced Learning Technologies (ICALT 2014), IEEE Press, 2014, pp. 176-180. Conference paper, published paper (Refereed)
Abstract [en]

A wide range of Information and Communication Technologies (ICT) have been used to support teaching and enhance the learning process in recent decades. With the introduction of large interactive tabletops, multi-touch interaction is taken to the next level, since large displays allow and invite not just one but multiple users to interact and collaborate at the same time. This presents designers and developers with new challenges in terms of interaction possibilities to promote active collaboration and information sharing. This paper evaluates the use of novel Tangible User Interface (TUI) approaches for the design of an interactive tabletop application conceived to support co-located collaborative learning in the particular context of Digital Storytelling (DS). We present the results of a user interaction study, which considers the users' subjective reaction to and acceptance of these User Interface (UI) paradigms, as well as their level of collaboration and communication while working together. The results of the study indicate that the users very quickly adapted to working in close collaboration using the provided multi-touch functionalities. Furthermore, users appreciated the possibility of closely discussing, conversing, and exchanging information with their peers through simultaneous interactions on the multi-touch display.

Place, publisher, year, edition, pages
IEEE Press, 2014
Series
IEEE International Conference on Advanced Learning Technologies, ISSN 2161-3761
Keywords
interactive tabletops, mobile digital storytelling, collaborative learning, tangible user interfaces
National Category
Media and Communication Technology
Research subject
Computer and Information Sciences Computer Science, Media Technology
Identifiers
urn:nbn:se:lnu:diva-34113 (URN); 10.1109/ICALT.2014.59 (DOI); 000347713100054 (ISI); 2-s2.0-84910051534 (Scopus ID); 978-1-4799-4038-7 (ISBN)
Conference
The 14th IEEE International Conference on Advanced Learning Technologies - ICALT2014, Advanced Technologies for Supporting Open Access to Formal and Informal Learning, July 7-9, 2014, Athens, Greece
Available from: 2014-05-07 Created: 2014-05-07 Last updated: 2019-02-27. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0001-7485-8649