[Infovis] CFP: AVI Workshop on Multimodal Interaction for Data Visualization
John Stasko
john.stasko at cc.gatech.edu
Thu Feb 1 21:29:55 CET 2018
Call For Participation
AVI 2018 Workshop on Multimodal Interaction for Data Visualization
https://sites.google.com/view/multimodalvis/
May 29, 2018
Workshop Overview
Multimodal interaction offers many potential benefits for data
visualization, helping people stay in the flow of their visual analysis
and presentation. Often, the strengths of one interaction modality can
offset weaknesses of another. However, existing visualization tools and
interaction techniques have mostly explored a single input modality such
as mouse, touch, pen, or more recently, natural language and speech.
Recent interest in deploying data visualizations on diverse display
hardware, including mobile, AR/VR, and large displays, creates an urgent
need to develop natural and fluid interaction techniques that can work
in these contexts. Multimodal interaction offers strong promise for such
situations, but its unique challenges for data visualization have yet to
be deeply investigated.
This workshop will bring together researchers with expertise in
visualization, interaction design, and natural user interfaces. We aim
to build a community of multimodal visualization researchers, explore
synergies and challenges in our research, and establish an agenda for
research on multimodal interactions for visualization.
Important Dates
March 9, 2018: Position paper submission deadline
March 16, 2018: Notification of acceptance
May 11, 2018: Final position papers due
May 29, 2018: Workshop
Submissions
We invite 2-4 page position papers (in the CHI Extended Abstracts
format, with the page limit including references) on any topic related
to multimodal interaction for data visualization. Position papers should
outline experiences, interests, and challenges around multimodal
interaction for visualization, spanning modalities such as pen, touch,
gesture, speech, and natural language. Topics may include, but are not
limited to:
- Visualization on various displays beyond the desktop, including
mobile, large screen, and AR/VR.
- Visualization designs that leverage specific interaction modalities,
including natural language interaction (text and voice), pen, touch, and mouse.
- Libraries and toolkits to support specific interaction modalities.
- Evaluation methods.
- Use cases and motivating scenarios of multimodal interaction for data
visualization.
Organizers
Bongshin Lee, Microsoft Research
Arjun Srinivasan, Georgia Institute of Technology
John Stasko, Georgia Institute of Technology
Melanie Tory, Tableau Software
Vidya Setlur, Tableau Software