Workshop on Novel Input Devices and Interaction Techniques (NIDIT) 2024

Workshop on Novel Input Devices and Interaction Techniques (NIDIT) at IEEE VR 2024 – March 17, 2024, Orlando, Florida

Virtual Reality (VR) has become a mainstream technology. Recent advances in VR display technologies have led to high-resolution, ergonomic, and – critically – low-cost head-mounted displays (HMDs). Commercial input devices and interaction techniques, however, have arguably not kept pace with these display systems. For instance, most HMDs ship with tracked input devices, but these “controllers” remain fairly similar to the earliest 3D controllers used in the VR systems of the 1980s. Interaction in commercial VR systems has similarly lagged: despite three decades of research advances in 3D interaction, commercial systems still largely rely on classical techniques such as the virtual hand or ray-casting.

Topics

This full-day workshop will bring together researchers and industry practitioners to discuss and experience the future of input devices for Virtual/Augmented/Mixed/Extended Reality and 3D User Interfaces, and to help chart a course for the next generation of 3D interaction techniques. We invite authors to submit 4-6 page papers on any of the following topics:

§   Form factors and ergonomics of input devices,

§   Hardware design and prototyping,

§   Mapping of input to varying degrees of freedom,

§   Haptic/tactile feedback,

§   Novel input devices,

§   Repurposing of existing devices (e.g., smartphones, tablets) for immersive contexts,

§   Tracked passive or custom props,

§   Novel interaction techniques supported by custom devices,

§   User studies evaluating the above topics.

Related but unlisted topics are also welcome. In addition to a presentation at the workshop, authors of all accepted submissions are strongly encouraged to demonstrate their novel input devices and interaction techniques in an interactive demo format following their presentation.

Submission Information

NIDIT 2024 accepts short papers of 4-6 pages (references included).

Papers should be submitted via PCS: https://new.precisionconference.com/

Submissions must be anonymized and submitted as PDF files using the VGTC format: https://tc.computer.org/vgtc/publications/conference/

All submissions will be reviewed by experts in the areas listed above. At least one author of each accepted submission must register for the workshop and at least one day of the IEEE VR 2024 conference. 

IMPORTANT NOTE: Authors of accepted papers are expected to give a 10-minute presentation at the workshop and are strongly encouraged to subsequently give a hands-on demonstration of their research.

In cases where demonstrations are not possible, a video may be provided. The workshop organizers will be able to provide limited quantities of standard equipment (e.g., head-mounted displays, controllers) to help authors demonstrate their work. Authors of accepted submissions should contact the organizers early to determine what is available. Proceedings will be submitted for inclusion in the IEEE Xplore Digital Library. We will also host the papers on the NIDIT website.


Important Dates

▪ IEEE VR Conference Papers Author Notification: December 15, 2023, AoE

▪ Workshop Submission Deadline: January 15, 2024, AoE (extended from January 12)

▪ Workshop Notification Deadline: January 19, 2024, AoE

▪ Workshop Camera-Ready Deadline: January 26, 2024, AoE

Organizers

Anil Ufuk Batmaz, ufuk.batmaz@concordia.ca  

Kristen Grinyer, kristengrinyer@cmail.carleton.ca

Mayra D. Barrera Machuca, mbarrera@dal.ca

Francisco R. Ortega, fortega@colostate.edu

Robert J. Teather, rob.teather@carleton.ca 

Wolfgang Stuerzlinger, w.s@sfu.ca 

Contact

Please send any questions to Anil Ufuk Batmaz (ufuk.batmaz@concordia.ca).


Keynote talk by Kiyoshi Kiyokawa, Sunday, March 17, 2024, 09:00-10:00

Title: Unleashing the Potential of Multi-Modal Interaction in Virtual Reality


Abstract: Virtual Reality (VR) has the power to transform how we interact with digital environments, but its true potential can only be realized through the development of natural and intuitive interaction techniques. In this talk, I will present a series of recent research projects from our laboratory that showcase the promise of multi-modal interaction in VR. By leveraging physiological sensing, such as eye-gaze tracking, electroencephalography (EEG), and heart rate variability (HRV), alongside hand gestures and haptic feedback, we can develop novel interaction paradigms that enhance user experiences and unlock new possibilities for VR applications. Through these examples, I will demonstrate how multi-modal interaction can make VR experiences more engaging, immersive, and accessible. Finally, I will present future research directions and opportunities for further enhancing interaction in VR.


Bio: Kiyoshi Kiyokawa has been a Professor at the Nara Institute of Science and Technology (NAIST), Japan, since 2017. He received his M.S. and Ph.D. degrees in information systems from NAIST in 1996 and 1998, respectively. He was a Research Fellow of the Japan Society for the Promotion of Science in 1998 and worked for the Communications Research Laboratory (now the National Institute of Information and Communications Technology (NICT)) from 1999 to 2002. He was a visiting researcher at the Human Interface Technology Laboratory of the University of Washington from 2001 to 2002, and an Associate Professor at the Cybermedia Center, Osaka University, from 2002 to 2017. His research interests include virtual reality, augmented reality, human augmentation, 3D user interfaces, CSCW, and context awareness. He is a Board Member and Fellow of the Virtual Reality Society of Japan, a member of the Steering Committees of IEEE VR and ISMAR, and an Associate Editor-in-Chief of IEEE TVCG. He is an inductee of the IEEE VGTC Virtual Reality Academy (Inaugural Class).

Schedule (Sunday, March 17, 2024)

8:00 – 9:00 Setting up demos

9:00 – 10:00 Keynote Talk

10:00 – 10:30 Coffee break

10:30 – 11:15 First presentation session

11:15 – 12:00 Demos for the first presentation session

12:00 – 13:30 Lunch break

13:30 – 14:45 Second presentation session

14:45 – 15:30 Demos for the second presentation session

15:30 – 16:00 Coffee break

16:00 – 17:30 Panel (Potential Future Research Directions in New Input Devices and Interaction Techniques)


First Presentation Session (10:30 – 11:15)

#1001 - Subtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environments

Voisant Laurent, Amal Hatira, Mohammad Raihanul Bashar, Mucahit Gemici, Marta Kersten-Oertel, Mine Sarac, Anil Ufuk Batmaz.

#1010 - A novel framework for hand visualization in web-based collaborative XR
Lovis Schwenderling, Wilhelm Herbrich, Fabian Joeres, Christian Hansen.

#1011 - Optimization of a Tether-Handle Object Retrieval Technique for VR
David Michael Broussard, Christoph W Borst.


Second Presentation Session (13:30 – 14:45)

#1003 - Meet Me Half Way: Concerted Physical and Virtual World Manipulations for Effective Haptic Feedback in VR
Yuqi Zhou, Voicu Popescu.

#1004 - Evaluating Voxel-Based Graphical Passwords for Virtual Reality
Prashant Rawat, Rumeysa Turkmen, Chukwuemeka Nwagu, Kissinger Sunday, Mayra Donaji Barrera Machuca.

#1006 - VRNConnect: Towards more intuitive interaction of 3D brain connectivity data in virtual environments
Sepehr Jalayer, Yiming Xiao, Marta Kersten-Oertel.

#1008 - Stiffness Simulation with Haptic Feedback Using Robotic Gripper and Paper Origami as End-Effector
Khrystyna Vasylevska, Mohammad Ghazanfari, Kiumars Sharifmoghaddam, Soroosh Mortezapoor, Emanuel Vonach, Hugo Brument, Georg Nawratil, Hannes Kaufmann.

#1009 - An Approach to Pitch Based Implementation of Non-verbal Vocal Interaction (NVVI)
Samuel Williams, Denis Gračanin.