映维网(Paper) https://paper.yivian.com — an information and data platform covering the virtual reality (VR) and augmented reality (AR) industries. Feed generated Fri, 03 Apr 2020.

Link: https://paper.yivian.com/425


Title: An augmented reality interface to contextual information

Teams: Microsoft

Writers: A. Ajanki, M. Billinghurst, H. Gamper, T. Järvenpää, M. Kandemir, S. Kaski, M. Koskela, M. Kurimo, J. Laaksonen, K. Puolamäki, T. Ruokolainen, T. Tossavainen

Publication date: January 2011

Abstract

In this paper, we report on a prototype augmented reality (AR) platform for accessing abstract information in real-world pervasive computing environments. Using this platform, objects, people, and the environment serve as contextual channels to more information. The user’s interest with respect to the environment is inferred from eye movement patterns, speech, and other implicit feedback signals, and these data are used for information filtering. The results of proactive context-sensitive information retrieval are augmented onto the view of a handheld or head-mounted display or uttered as synthetic speech. The augmented information becomes part of the user’s context, and if the user shows interest in the AR content, the system detects this and provides progressively more information. In this paper, we describe the first use of the platform to develop a pilot application, Virtual Laboratory Guide, and early evaluation results of this application.
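The core loop the abstract describes — accumulating implicit gaze evidence per object and progressively revealing more information as interest grows — can be sketched as follows. The dwell-time scoring and the one-second-per-level threshold are illustrative assumptions, not the authors' actual inference model:

```python
from collections import defaultdict

# Illustrative sketch: infer interest from accumulated gaze dwell time per
# object, then map dwell to a progressively deeper level of detail.
INTEREST_THRESHOLD = 1.0   # seconds of fixation per detail level (assumed)

def rank_interest(fixations):
    """fixations: list of (object_id, duration_s) gaze events."""
    dwell = defaultdict(float)
    for obj, dur in fixations:
        dwell[obj] += dur
    return sorted(dwell.items(), key=lambda kv: kv[1], reverse=True)

def detail_level(dwell_s):
    """More accumulated gaze -> progressively more information."""
    return int(dwell_s // INTEREST_THRESHOLD)

fixations = [("poster", 0.4), ("person_A", 1.3), ("poster", 0.9), ("person_A", 0.8)]
ranking = rank_interest(fixations)   # person_A attracts the most gaze
```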

An augmented reality interface to contextual information first appeared on 映维网(Paper).

Link: https://paper.yivian.com/423


Title: Spatialisation in Audio Augmented Reality using Finger Snaps

Teams: Microsoft

Writers: Hannes Gamper, T. Lokki

Publication date: 2011

Abstract

In audio augmented reality (AAR), information is embedded into the user’s surroundings by enhancing the real audio scene with virtual auditory events. To maximize their embeddedness and naturalness, these events can be processed with the user’s head-related impulse responses (HRIRs). HRIRs that include early (room) reflections, referred to as instant binaural room impulse responses (BRIRs), can be obtained from transients in the signals of ear-plugged microphones worn by the user. These can be applied on the fly to virtual sounds played back through the earphones. With the presented method, clapping or finger snapping allows for the instant capture of BRIRs, and thus for intuitive positioning and reasonable externalisation of virtual sounds in enclosed spaces, at low hardware and computational cost.
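The capture-and-render idea can be sketched in a few lines: locate the snap transient in the two-channel ear-microphone signal, cut a short window after the onset as an instant BRIR, and convolve a dry virtual sound with it. The sample rate, window length, and peak-picking onset detector are simplifying assumptions, not the paper's actual processing:

```python
import numpy as np

FS = 16000          # sample rate (Hz), assumed
BRIR_LEN = 2048     # impulse-response window length in samples, assumed

def capture_brir(binaural, ir_len=BRIR_LEN):
    """binaural: (n, 2) array from ear-plugged microphones."""
    energy = np.abs(binaural).sum(axis=1)
    onset = int(np.argmax(energy))          # crude transient detector
    seg = binaural[onset:onset + ir_len]
    # normalise so the spatial cues, not the snap loudness, dominate
    return seg / (np.max(np.abs(seg)) + 1e-12)

def spatialise(dry, brir):
    """Render a mono virtual sound through the captured left/right BRIRs."""
    left = np.convolve(dry, brir[:, 0])
    right = np.convolve(dry, brir[:, 1])
    return np.stack([left, right], axis=1)

# Synthetic demo: a 'snap' reaching the left ear earlier and louder.
rec = 0.001 * np.random.default_rng(0).standard_normal((FS, 2))
rec[5000, 0] += 1.0    # direct sound at the left ear
rec[5003, 1] += 0.6    # same wavefront at the right ear: later and weaker
brir = capture_brir(rec)
out = spatialise(np.ones(64), brir)   # virtual sound rendered to the left
```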

Link: https://paper.yivian.com/421


Title: Audio Augmented Reality in Telecommunication through Virtual Auditory Display

Teams: Microsoft

Writers: Hannes Gamper, T. Lokki

Publication date: June 2010

Abstract

Audio communication in its most natural form, the face-to-face conversation, is binaural. Current telecommunication systems often provide only monaural audio, stripping it of spatial cues and thus deteriorating listening comfort and speech intelligibility. In this work, the application of binaural audio to telecommunication through audio augmented reality (AAR) is presented. AAR aims at augmenting auditory perception by embedding spatialised virtual audio content. Used in a telecommunication system, AAR enhances intelligibility and the user’s sense of presence. As a sample use case of AAR, a teleconference scenario is devised. The conference is recorded through a headset with integrated microphones, worn by one of the conference participants. Algorithms are presented to compensate for head movements and restore the spatial cues that encode the perceived directions of the conferees. To analyse the performance of the AAR system, a user study was conducted. Processing the binaural recording with the proposed algorithms places the virtual speakers at fixed directions, which significantly improved the ability of test subjects to segregate the speakers compared to an unprocessed recording. The proposed AAR system outperforms conventional telecommunication systems in terms of speaker segregation by supporting spatial separation of binaurally recorded speakers.
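The head-movement compensation step reduces to rendering each conferee at the world azimuth minus the current head yaw, so the virtual speaker stays fixed in the room as the head turns. The sketch below pairs that with a Woodworth-style spherical-head ITD approximation as a stand-in for real HRIR processing; the head radius and the ITD model are assumptions, not the paper's algorithm:

```python
import math

HEAD_RADIUS = 0.0875  # metres, a typical assumed head radius
C = 343.0             # speed of sound in air, m/s

def render_azimuth(world_az_deg, head_yaw_deg):
    """Head-relative direction that keeps the source fixed in the world,
    wrapped to (-180, 180] degrees."""
    return (world_az_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

def itd_seconds(az_deg):
    """Approximate interaural time difference for a spherical head."""
    az = math.radians(az_deg)
    return (HEAD_RADIUS / C) * (az + math.sin(az))

# A conferee fixed at 60 deg to the user's right in the world: once the
# user turns their head 60 deg toward them, the source is rendered straight
# ahead with zero ITD, exactly as a real talker would be heard.
az = render_azimuth(60.0, 60.0)
```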

Link: https://paper.yivian.com/419


Title: Audio augmented reality in telecommunication

Teams: Microsoft

Writers: Hannes Gamper

Publication date: February 2010

Abstract

Telecommunication systems have evolved to allow users to communicate and interact over distance. Audio communication in its most natural form, the face-to-face conversation, is binaural. Current telecommunication systems often provide only monaural audio, stripping it of spatial cues and thus deteriorating listening comfort and speech intelligibility. In this work, the application of binaural audio to telecommunication through audio augmented reality (AAR) is presented. AAR aims at augmenting auditory perception by embedding spatialised virtual audio content. Used in a telecommunication system, AAR enhances intelligibility and the user’s sense of presence. As a sample use case of AAR, a teleconference scenario is devised. The conference is recorded through a headset with integrated microphones, worn by one of the conference participants. Algorithms are presented to compensate for head movements and restore the spatial cues that encode the perceived directions of the conferees. To analyse the performance of the AAR system, a user study was conducted. Processing the binaural recording with the proposed algorithms places the virtual speakers at fixed directions, which significantly improved the ability of test subjects to segregate the speakers compared to an unprocessed recording. The proposed AAR system outperforms conventional telecommunication systems in terms of speaker segregation by supporting spatial separation of binaurally recorded speakers.

Link: https://paper.yivian.com/417

Title: Development and Evaluation of Mixed Reality Interaction Techniques

Teams: Microsoft

Writers: Edward Ishak, Hrvoje Benko, Steven Feiner

Publication date: March 2005

Abstract

We present our thoughts on the development and evaluation of novel interaction techniques for mixed reality (MR) systems, particularly those consisting of a heterogeneous mix of displays, devices, and users. Interaction work in MR has predominantly focused on two areas: glove- or wand-based virtual reality (VR) interactions and tangible user interfaces. We believe that MR users should not be limited to these approaches, but should rather be able to utilize all available devices and interaction methods in their environment, choosing the most relevant ones for the task at hand. Furthermore, we discuss the difficulty of finding appropriate ways to evaluate new techniques, primarily due to the lack of standards of comparison.

Link: https://paper.yivian.com/415


Title: Collaborative Mixed Reality Visualization of an Archaeological Excavation

Teams: Microsoft

Writers: Hrvoje Benko, Edward W. Ishak, Steve Feiner

Publication date: November 2004

Abstract

We present VITA (Visual Interaction Tool for Archaeology), an experimental collaborative mixed reality system for offsite visualization of an archaeological dig. Our system allows multiple users to visualize the dig site in a mixed reality environment in which tracked, see-through, head-worn displays are combined with a multi-user, multi-touch, projected table surface, a large screen display, and tracked hand-held displays. We focus on augmenting existing archaeological analysis methods with new ways to organize, visualize, and combine the standard 2D information available from an excavation (drawings, pictures, and notes) with textured, laser range-scanned 3D models of objects and the site itself. Users can combine speech, touch, and 3D hand gestures to interact multimodally with the environment. Preliminary user tests were conducted with archaeology researchers and students, and their feedback is presented here.

Link: https://paper.yivian.com/413


Title: Interaction Management for Ubiquitous Augmented Reality User Interfaces

Teams: Microsoft

Writers: Otmar Hilliges

Publication date: January 2004

Abstract

One of the major challenges of current computer science research is to provide users with suitable means of interacting with increasingly powerful and complex computer systems. In recent years several concepts in user interface technology and human-computer interaction have evolved, among them augmented, mixed, and virtual reality, as well as tangible, ubiquitous, and wearable user interfaces. All of these technologies are increasingly converging into a new user interface paradigm which we call Ubiquitous Augmented Reality. Ubiquitous Augmented Reality user interfaces incorporate a wide variety of concepts, such as multi-modal, multi-user, and multi-device aspects, and they include new input and output devices. In contrast to classic 2D user interfaces, no standardization has taken place for the input and output devices of ubiquitous augmented reality user interfaces, nor for the interaction techniques utilized in them. This thesis presents a method that handles interaction management for ubiquitous augmented reality user interfaces, consisting of flexible integration of I/O devices at runtime and information-flow control. The presented solution allows user interfaces to be assembled very quickly and their behavior to be changed at runtime, enabling researchers to experiment with and identify appropriate interaction techniques, metaphors, and idioms. The presented component for interaction management has been prototypically implemented and tested within the project CAR, conducted at the augmented reality research group of the Technische Universität München. The project CAR is part of an interdisciplinary research project that aims at developing user interfaces for automobiles of the near future (five to ten years). Its main goal is to provide a collaboration platform for researchers of different disciplines to discuss and develop new concepts for human-computer interaction in automotive environments.
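The runtime-integration idea — devices and UI components attach to an interaction manager while the system runs, and the manager controls how event information flows between them — can be sketched as a small routing registry. The class and method names here are illustrative, not taken from the thesis:

```python
# Minimal sketch: input devices publish events to an interaction manager,
# which routes them to whichever handlers are currently connected. Handlers
# can be attached and detached at runtime, changing the UI's behaviour
# without rebuilding it.

class InteractionManager:
    def __init__(self):
        self._routes = {}   # event type -> list of handlers

    def connect(self, event_type, handler):
        """Attach a handler (e.g. a UI component) at runtime."""
        self._routes.setdefault(event_type, []).append(handler)

    def disconnect(self, event_type, handler):
        self._routes.get(event_type, []).remove(handler)

    def publish(self, event_type, payload):
        """Called by any input device; information-flow control happens here."""
        for handler in self._routes.get(event_type, []):
            handler(payload)

mgr = InteractionManager()
log = []

def pan_map(p):
    log.append(("map_pan", p))

def menu_nav(p):
    log.append(("menu", p))

mgr.connect("gesture", pan_map)
mgr.publish("gesture", "swipe_left")   # routed to the map component
mgr.disconnect("gesture", pan_map)     # rerouting at runtime
mgr.connect("gesture", menu_nav)
mgr.publish("gesture", "swipe_left")   # the same gesture now drives the menu
```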

Link: https://paper.yivian.com/411


Title: On-demand, In-place Help for Augmented Reality Environments

Teams: Microsoft

Writers: Desney Tan, Ivan Poupyrev, Mark Billinghurst, Hirokazu Kato, Holger Regenbrecht, Nobuji Tetsutani

Publication date: August 2001

Abstract

In many help systems, users are either distracted with a constant barrage of help or have to stop working on the task at hand and explicitly search for help. In this paper, we propose two methods to present on-demand, in-place help in augmented reality environments. In the interfaces we describe, users interact with virtual objects that are superimposed on the real world by manipulating physical cards. We describe Tiles, a prototype application for designing aircraft instrument panels, from which our work on help systems grew. In Tiles, users manipulate special ‘help’ cards in combination with data cards to invoke detailed help. This technique, which we call Tangible Bubble Help, may be multi-modal, taking the form of text, audio, graphics, and animations. We also present Tangible Tooltips, a lightweight technique in which users control the display of textual help by tilting data cards. In both cases, users can seamlessly transition between performing the main task and acquiring help.
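The Tangible Tooltips behaviour — tilt a data card further to reveal more help — can be sketched as a threshold on the angle between the card's normal and the viewing direction. The two-level scheme and the threshold angles below are illustrative assumptions, not values from the paper:

```python
import math

TOOLTIP_TILT_DEG = 20.0   # tilt beyond this shows the short tooltip (assumed)
DETAIL_TILT_DEG = 45.0    # tilting further shows detailed help (assumed)

def help_level(card_normal, view_dir=(0.0, 0.0, 1.0)):
    """Return 0 (no help), 1 (tooltip), or 2 (detailed) from card tilt."""
    dot = sum(a * b for a, b in zip(card_normal, view_dir))
    norm = math.sqrt(sum(a * a for a in card_normal))
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    if tilt >= DETAIL_TILT_DEG:
        return 2
    if tilt >= TOOLTIP_TILT_DEG:
        return 1
    return 0

# Card facing the viewer: no help. Tilted 30 degrees: the tooltip appears.
flat = help_level((0.0, 0.0, 1.0))
tilted = help_level((math.sin(math.radians(30)), 0.0, math.cos(math.radians(30))))
```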

Link: https://paper.yivian.com/409


Title: On-demand, In-place Help for Augmented Reality Environments

Teams: Microsoft

Writers: Ivan Poupyrev, Mark Billinghurst, Hirokazu Kato, Holger Regenbrecht, Nobuji Tetsutani

Publication date: April 2001

Abstract

In many help systems, users are either distracted with a constant barrage of help or have to stop working on the task at hand and explicitly search for help. In this paper, we propose two methods to present on-demand, in-place help in augmented reality environments. In the interfaces we describe, users interact with virtual objects that are superimposed on the real world by manipulating physical cards. We describe Tiles, a prototype application for designing aircraft instrument panels, from which our work on help systems grew. In Tiles, users manipulate special ‘help’ cards in combination with data cards to invoke detailed help. This technique, which we call Tangible Bubble Help, may be multi-modal, taking the form of text, audio, graphics, and animations. We also present Tangible Tooltips, a lightweight technique in which users control the display of textual help by tilting data cards. In both cases, users can seamlessly transition between performing the main task and acquiring help.

Link: https://paper.yivian.com/407


Title: Mutual Disambiguation of 3D Multimodal Interaction in Augmented and Virtual Reality

Teams: Microsoft

Writers: Ed Kaiser, Alex Olwal, David McGee, Hrvoje Benko, Andrea Corradini, Xiaoguang Li, Phil Cohen, Steven Feiner

Publication date: November 2003

Abstract

We describe an approach to 3D multimodal interaction in immersive augmented and virtual reality environments that accounts for the uncertain nature of the information sources. The resulting multimodal system fuses symbolic and statistical information from a set of 3D gesture, spoken language, and referential agents. The referential agents employ visible or invisible volumes that can be attached to 3D trackers in the environment, and which use a time-stamped history of the objects that intersect them to derive statistics for ranking potential referents. We discuss the means by which the system supports mutual disambiguation of these modalities and information sources, and show through a user study how mutual disambiguation accounts for over 45% of the successful 3D multimodal interpretations. An accompanying video demonstrates the system in action.
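The referent-ranking idea — dwell-time statistics over a time-stamped history of objects intersecting a selection volume, fused with evidence from another modality such as speech — can be sketched as follows. The normalised dwell scores and the multiplicative fusion rule are illustrative assumptions, not the system's actual statistics:

```python
from collections import defaultdict

def dwell_stats(history):
    """history: list of (t_start, t_end, object_id) intersection intervals
    recorded by a selection volume; returns normalised dwell scores."""
    dwell = defaultdict(float)
    for t0, t1, obj in history:
        dwell[obj] += t1 - t0
    total = sum(dwell.values()) or 1.0
    return {obj: d / total for obj, d in dwell.items()}

def fuse(gesture_scores, speech_scores):
    """Mutual disambiguation: combine evidence from both modalities."""
    objs = set(gesture_scores) | set(speech_scores)
    fused = {o: gesture_scores.get(o, 0.0) * speech_scores.get(o, 0.0)
             for o in objs}
    z = sum(fused.values()) or 1.0
    return {o: s / z for o, s in fused.items()}

history = [(0.0, 0.5, "lamp"), (0.0, 1.0, "chair")]
gesture = dwell_stats(history)            # the chair dwells longer in the volume
speech = {"lamp": 0.7, "chair": 0.3}      # but speech favours 'lamp'
ranked = fuse(gesture, speech)
```

In this toy example the gesture evidence alone would pick the chair, but fusing with the speech scores flips the top-ranked referent to the lamp — the kind of cross-modality correction the abstract calls mutual disambiguation.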
