1 - 5 of 5
  • 1.
    Alissandrakis, Aris
    et al.
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Reski, Nico
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Using Mobile Augmented Reality to Facilitate Public Engagement. 2017. In: Extended Papers of the International Symposium on Digital Humanities (DH 2016) / [ed] Koraljka Golub, Marcelo Milrad, CEUR-WS, 2017, Vol. 2021, p. 99-109. Conference paper (Refereed)
    Abstract [en]

    This paper presents our initial efforts towards the development of a framework for facilitating public engagement through the use of mobile Augmented Reality (mAR), which falls under the overall project title "Augmented Reality for Public Engagement" (PEAR). We present the concept and implementation, and discuss the results from the deployment of a mobile phone app (PEAR 4 VXO). The app was used in a user study in conjunction with a campaign carried out by Växjö municipality (Sweden) exploring how to get citizens more engaged in urban planning actions and decisions. These activities took place during spring 2016.

    One of the salient features of our approach is that it combines novel ways of using mAR with social media, online databases, and sensors to support public engagement. In addition, the data collection process and audience engagement were tested in a follow-up limited deployment.

    The analysis and outcomes of our initial results validate the overall concept and indicate the potential usefulness of the app as a tool, but also highlight the need for an active campaign on the part of the stakeholders. Our future efforts will focus on addressing some of the problems and challenges identified during the different phases of this user study.

  • 2.
    Alissandrakis, Aris
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Reski, Nico
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Laitinen, Mikko
    University of Eastern Finland, Finland.
    Tyrkkö, Jukka
    Linnaeus University, Faculty of Arts and Humanities, Department of Languages.
    Levin, Magnus
    Linnaeus University, Faculty of Arts and Humanities, Department of Languages.
    Lundberg, Jonas
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Visualizing dynamic text corpora using Virtual Reality. 2018. In: ICAME 39: Tampere, 30 May – 3 June, 2018: Corpus Linguistics and Changing Society: Book of Abstracts, Tampere: University of Tampere, 2018, p. 205. Conference paper (Refereed)
    Abstract [en]

    In recent years, data visualization has become a major area in Digital Humanities research, and the same holds true in linguistics. The rapidly increasing size of corpora, the emergence of dynamic real-time streams, and the availability of complex and enriched metadata have made it increasingly important to facilitate new and innovative approaches to presenting and exploring primary data. This demonstration showcases the uses of Virtual Reality (VR) in the visualization of geospatial linguistic data, using data from the Nordic Tweet Stream (NTS) project (see Laitinen et al. 2017). The NTS data for this demonstration comprises a full year of geotagged tweets (12,443,696 tweets from 273,648 user accounts) posted within the Nordic region (Denmark, Finland, Iceland, Norway, and Sweden). The dataset includes over 50 metadata parameters in addition to the tweets themselves.

    We demonstrate the potential of using VR to efficiently find meaningful patterns in vast streams of data. The VR environment allows an easy overview of any of the features (textual or metadata) in a text corpus. Our focus will be on the language identification data, which provides a previously unexplored perspective into the use of English and other non-indigenous languages in the Nordic countries alongside the native languages of the region.

    Our VR prototype utilizes the HTC Vive headset for a room-scale VR scenario, and it is being developed using the Unity3D game development engine. Each node in the VR space is displayed as a stacked cuboid, the equivalent of a bar chart in a three-dimensional space, summarizing all tweets at one geographic location for a given point in time (see: https://tinyurl.com/nts-vr). Each stacked cuboid represents information of the three most frequently used languages, appropriately color coded, enabling the user to get an overview of the language distribution at each location. The VR prototype further encourages users to move between different locations and inspect points of interest in more detail (overall location-related information, a detailed list of all languages detected, the most frequently used hashtags). An underlying map outlines country borders and facilitates orientation. In addition to spatial movement through the Nordic areas, the VR system provides an interface to explore the Twitter data based on time (days, weeks, months, or time of predefined special events), which enables users to explore data over time (see: https://tinyurl.com/nts-vr-time).

    In addition to demonstrating how the VR methods aid data visualization and exploration, we will also briefly discuss the pedagogical implications of using VR to showcase linguistic diversity.
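    The per-location aggregation behind the stacked cuboids described above — one stack per geographic location, one color-coded segment per language, keeping only the most frequent languages — can be sketched roughly as follows. This is a minimal illustrative Python sketch, not the project's actual code; the function name and the `location`/`lang` field names are assumptions.

```python
from collections import Counter, defaultdict

def build_language_stacks(tweets, top_n=3):
    """Aggregate tweets into per-location language counts and keep the
    top_n most frequent languages per location, mirroring one stacked
    cuboid (bar-chart stack) per location in the VR scene."""
    counts = defaultdict(Counter)
    for tweet in tweets:
        counts[tweet["location"]][tweet["lang"]] += 1
    return {
        location: counter.most_common(top_n)
        for location, counter in counts.items()
    }

# Hypothetical sample of geotagged tweets with language identification.
tweets = [
    {"location": "Stockholm", "lang": "sv"},
    {"location": "Stockholm", "lang": "sv"},
    {"location": "Stockholm", "lang": "en"},
    {"location": "Helsinki", "lang": "fi"},
]
stacks = build_language_stacks(tweets)
print(stacks["Stockholm"])  # [('sv', 2), ('en', 1)]
```

    Filtering the input list by day, week, or month before aggregating would give the time-based exploration the abstract mentions.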

  • 3.
    Reski, Nico
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Change your Perspective: Exploration of a 3D Network created with Open Data in an Immersive Virtual Reality Environment using a Head-mounted Display and Vision-based Motion Controls. 2015. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Year after year, technologies evolve at an incredibly rapid pace, becoming faster, more complex, more accurate, and more immersive. Looking back just a decade, interaction technologies in particular have made a major leap. In 2013, after being researched for quite some time, virtual reality (VR) aroused renewed enthusiasm and finally reached mainstream attention, as so-called head-mounted displays (HMDs), devices worn on the head to grant a visual peek into the virtual world, gained more and more acceptance among end-users. Currently, humans interact with computers in a very counter-intuitive, two-dimensional way. The ability to experience digital content in the most natural human manner, by simply looking around and perceiving information from our surroundings, has the potential to be a major game changer in how we perceive and eventually interact with digital information. However, this confronts designers and developers with new challenges: how to apply these exciting technologies and support interaction mechanisms that let users naturally explore digital information in the virtual world, ultimately overcoming real-world boundaries. Within the virtual world, the only limit is our imagination.

    This thesis investigates an approach to naturally interacting with and exploring information based on open data within an immersive virtual reality environment, using a head-mounted display and vision-based motion controls. For this purpose, an immersive VR application visualizing information as a network of European capital cities has been implemented, offering interaction through gesture input. The application focuses on the exploration of the generated network and the consumption of the displayed information. A user interaction study with eleven participants investigated acceptance of the developed prototype, estimated workload, and examined explorative behaviour, while additional explorative discussions with five experts provided further feedback on the prototype's design and concept. The results indicate the participants' enthusiasm and excitement towards the novelty and intuitiveness of exploring information in a less traditional way, while challenging them with the applied interface and interaction design in a positive manner. The design and concept were also well received by the experts, who valued the idea and implementation. They provided constructive feedback on the visualization of the information and encouraged us to be even bolder by making more use of the available 3D environment. Finally, the thesis discusses these findings and proposes recommendations for future work.

  • 4.
    Reski, Nico
    et al.
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Alissandrakis, Aris
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Change Your Perspective: Exploration of a 3D Network Created from Open Data in an Immersive Virtual Reality Environment. 2016. In: ACHI 2016: The Ninth International Conference on Advances in Computer-Human Interactions / [ed] Alma Leora Culén, Leslie Miller, Irini Giannopulu, Birgit Gersbeck-Schierholz, International Academy, Research and Industry Association (IARIA), 2016, p. 403-410. Conference paper (Refereed)
    Abstract [en]

    This paper investigates an approach to naturally interacting with and exploring information (based on open data) within an immersive virtual reality environment (VRE) using a head-mounted display and vision-based motion controls. We present the results of a user interaction study that investigated acceptance of the developed prototype, estimated workload, and examined the participants' behavior. Additional discussions with experts provided further feedback on the prototype's overall design and concept. The results indicate that the participants were enthusiastic about the novelty and intuitiveness of exploring information in a VRE, and were challenged (in a positive manner) by the applied interface and interaction design. The presented concept and design were well received by the experts, who valued the idea and implementation and encouraged us to be even bolder, making more use of the available 3D environment.

  • 5.
    Reski, Nico
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Alissandrakis, Aris
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Using an Augmented Reality Cube-like Interface and 3D Gesture-based Interaction to Navigate and Manipulate Data. 2018. In: VINCI '18: Proceedings of the 11th International Symposium on Visual Information Communication and Interaction, New York: Association for Computing Machinery (ACM), 2018, p. 92-96. Conference paper (Refereed)
    Abstract [en]

    In this paper we describe our work-in-progress to create an interface that enables users to browse and select data within an Augmented Reality environment, using a virtual cube object that can be interacted with through 3D gestural input. We present the prototype design (including the graphical elements), describe the interaction possibilities of touching the cube with the hand/finger, and put the prototype into the context of our Augmented Reality for Public Engagement (PEAR) framework. An interactive prototype was implemented and runs on a typical off-the-shelf smartphone device.
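    One way to picture a cube-like interface of this kind is as a mapping from touched cube faces to data-browsing actions. The Python sketch below is purely illustrative: the face names, the actions bound to them, and the `DataCube` class are assumptions for exposition, not the paper's actual design.

```python
class DataCube:
    """Toy model of a cube interface for browsing a list of items,
    assuming the AR framework reports which face a finger touches."""

    def __init__(self, items):
        self.items = items   # the data set being browsed
        self.index = 0       # currently highlighted item

    def on_touch(self, face):
        """React to a 3D touch on one of the cube's faces."""
        if face == "right":      # step forward through the data
            self.index = (self.index + 1) % len(self.items)
        elif face == "left":     # step backward
            self.index = (self.index - 1) % len(self.items)
        elif face == "top":      # confirm/select the current item
            return ("selected", self.items[self.index])
        return ("browsing", self.items[self.index])

cube = DataCube(["park", "square", "bridge"])
cube.on_touch("right")
print(cube.on_touch("top"))  # ('selected', 'square')
```

    In the actual prototype the touch events would come from hand-tracking hits against the rendered cube rather than from method calls.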
