Robust correction of 3D geo-metadata in photo collections by forming a photo grid
Umeå University. ORCID iD: 0000-0003-2203-5805
Umeå University.
Umeå University.
2011 (English). In: WCSP2011: IEEE International Conference on Wireless Communications and Signal Processing, IEEE Press, 2011, pp. 1-5. Conference paper, Published paper (Refereed)
Abstract [en]

In this work, we present a technique for efficient and robust estimation of the exact location and orientation of a photo capture device in a large data set. The data set comprises a collection of photos and the associated information from GPS and orientation sensors. This attached metadata is noisy and imprecise. Our strategy for correcting this uncertain data is based on fusing a measurement model, derived from the sensor data, with a signal model given by computer vision algorithms. Based on the information retrieved from multiple views of a scene, we form a grid of images. Robust feature detection and matching between images yields a reliable transformation between views; consequently, the relative locations and orientations across the data set constitute the signal model. Information extracted from the individual images, combined with the sensor measurements, forms the measurement model. Finally, a Kalman filter fuses these two models iteratively to refine the estimate of the ground-truth (GT) location and orientation. In practice, this approach supports the design of a photo-browsing system over a large collection of photos, enabling 3D navigation and exploration of the data set.
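The fusion step described in the abstract can be pictured as a standard linear Kalman measurement update that combines a vision-derived pose prediction with a noisy GPS/compass reading. The sketch below is illustrative only and not the authors' implementation; the state layout ([x, y, heading]), the identity observation model, and all noise covariances are assumptions chosen for the example.

```python
# Minimal sketch (assumed, not the paper's code) of fusing a vision-based pose
# prediction (signal model) with a noisy GPS/orientation measurement
# (measurement model) via a linear Kalman update.
import numpy as np

def kalman_update(x_pred, P_pred, z_meas, R_meas):
    """Standard Kalman measurement update with an identity observation model."""
    H = np.eye(3)                               # we observe the full state directly
    S = H @ P_pred @ H.T + R_meas               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x_pred + K @ (z_meas - H @ x_pred)  # corrected pose estimate
    P_new = (np.eye(3) - K @ H) @ P_pred        # corrected covariance
    return x_new, P_new

# Signal model: pose predicted by chaining the previous corrected pose with the
# relative transform recovered from feature matching (hypothetical values).
x_vision = np.array([12.3, 45.1, 0.62])         # [east m, north m, heading rad]
P_vision = np.diag([0.5, 0.5, 0.01])            # assumed vision uncertainty

# Measurement model: raw GPS position plus orientation-sensor heading (noisy).
z_sensor = np.array([13.8, 44.0, 0.70])
R_sensor = np.diag([4.0, 4.0, 0.05])            # assumed GPS/compass noise

x_corr, P_corr = kalman_update(x_vision, P_vision, z_sensor, R_sensor)
print("corrected pose:", x_corr)
```

Applied iteratively across neighbouring photos in the grid, this kind of update lets the low-noise relative geometry from image matching pull the absolute but noisy sensor poses toward the ground truth.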

Place, publisher, year, edition, pages
IEEE Press, 2011. pp. 1-5
National subject category
Media Technology
Research subject
Computer and Information Sciences, Media Technology
Identifiers
URN: urn:nbn:se:lnu:diva-40994
DOI: 10.1109/WCSP.2011.6096689
ISBN: 978-1-4577-1008-7 (print)
OAI: oai:DiVA.org:lnu-40994
DiVA, id: diva2:796175
Conference
IEEE International Conference on Wireless Communications and Signal Processing (WCSP2011), Nanjing, China, 9-11 November 2011
Available from: 2012-03-02. Created: 2015-03-18. Last updated: 2017-04-19. Bibliographically reviewed.
Part of thesis
1. 3D Gesture Recognition and Tracking for Next Generation of Smart Devices: Theories, Concepts, and Implementations
2014 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The rapid development of mobile devices during the recent decade has been greatly driven by interaction and visualization technologies. Although touchscreens have significantly enhanced interaction technology, it is foreseeable that with future mobile devices, e.g., augmented-reality glasses and smart watches, users will demand more intuitive inputs such as free-hand interaction in 3D space. Specifically, for manipulation of digital content in augmented environments, 3D hand/body gestures will be essential. Therefore, 3D gesture recognition and tracking are highly desired features for interaction design in future smart environments. Due to the complexity of hand/body motions, and the limited capacity of mobile devices for expensive computations, 3D gesture analysis remains an extremely difficult problem to solve.

This thesis aims to introduce new concepts, theories and technologies for natural and intuitive interaction in future augmented environments. The contributions of this thesis support the concept of bare-hand 3D gestural interaction and interactive visualization on future smart devices. The introduced technical solutions enable effective interaction in the 3D space around the smart device. Highly accurate and robust 3D motion analysis of hand/body gestures is performed to facilitate 3D interaction in various application scenarios. The proposed technologies enable users to control, manipulate, and organize digital content in 3D space.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2014. pp. xii, 101
Series
TRITA-CSC-A, ISSN 1653-5723 ; 14:02
Keywords
3D gestural interaction, gesture recognition, gesture tracking, 3D visualization, 3D motion analysis, augmented environments
National subject category
Media Technology
Research subject
Media Technology
Identifiers
URN: urn:nbn:se:lnu:diva-40974
ISBN: 978-91-7595-031-0
Public defence
2014-03-17, F3, Lindstedtsvägen 26, KTH, 13:15 (English)
Opponent
Supervisors
Note

QC 20140226

Available from: 2014-02-26. Created: 2015-03-18. Last updated: 2018-01-11. Bibliographically reviewed.

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text

Person records

Yousefi, Shahrouz
