Jeremy R. Cooperstock - Director, Shared Reality Lab
Photo credit: M. Mostyn

My lab is broadly concerned with human-computer interaction technologies, emphasizing multimodal sensory augmentation for communication in both co-present and distributed contexts. Our research tackles the full pipeline of sensory input, analysis, encoding, data distribution, and rendering, as well as interaction capabilities and quality of user experience. Applications of these efforts include distributed training of medical and music students, augmented environmental awareness for the blind community, treatment of amblyopia (lazy eye), low-latency uncompressed HD videoconferencing, and a variety of multimodal immersive simulation experiences. Most of our research takes place within the Shared Reality Environment, a facility that includes two different configurations of multi-projector displays, camera and loudspeaker arrays, and a high-fidelity floor with vibrotactile sensing and actuation.

BIO · Jeremy Cooperstock (Ph.D., University of Toronto, 1996) is an associate professor in the Department of Electrical and Computer Engineering, a member of the Centre for Intelligent Machines, and a founding member of the Centre for Interdisciplinary Research in Music Media and Technology at McGill University. He directs the Shared Reality Lab, which focuses on computer mediation to facilitate high-fidelity human communication and the synthesis of perceptually engaging, multimodal, immersive environments. He led the development of the Intelligent Classroom, the world's first Internet streaming demonstrations of Dolby Digital 5.1, multiple simultaneous streams of uncompressed high-definition video, a high-fidelity orchestra rehearsal simulator, a simulation environment that renders graphic, audio, and vibrotactile effects in response to footsteps, and a mobile game treatment for amblyopia. Cooperstock's work on the Ultra-Videoconferencing system was recognized by an award for Most Innovative Use of New Technology from ACM/IEEE Supercomputing and a Distinction Award from the Audio Engineering Society. The research he supervised on the Autour project earned the Hochhausen Research Award from the Canadian National Institute for the Blind and an Impact Award from the Canadian Internet Registration Authority, and his Real-Time Emergency Response project won the Gold Prize (brainstorm round) of the Mozilla Ignite Challenge. Cooperstock has worked with IBM at the Haifa Research Center in Israel and the T.J. Watson Research Center in Yorktown Heights, New York, with the Sony Computer Science Laboratory in Tokyo, Japan, and was a visiting professor at Bang & Olufsen, Denmark, where he conducted research on telepresence technologies as part of the World Opera Project. He led the Enabling Technologies theme of the Graphics, Animation, and New Media (GRAND) Network of Centres of Excellence and is an associate editor of the Journal of the AES. (FULL CV AVAILABLE)

research projects
Autour is an eyes-free mobile system designed to give blind users a better sense of their surroundings. We are presently adding new functionality to support intersection crossing, indoor exploration, and dialogue, as well as advancing towards an Android release of the platform.
SenseProxy: A mobile remote implicit communication system in which vibrotactile patterns continuously convey background information between two people.
Wearable Haptics: We are exploring the design space for wearable haptics as an interaction paradigm in everyday conditions. This requires energy-efficient, wireless devices that can be attached to the body or inserted into regular clothing, and that are capable of both sensing human input and delivering richly expressive output to the wearer.
Social Media Analytics: We are developing tools for scraping social media feeds for posts of relevance to public safety, to facilitate early detection of events including flooding, highway accidents, road closures, fires, and downed electrical lines.
Multimodal Medical Alarms: To reduce the problem of auditory sensory overload in the clinical environment, we are exploring the use of a multimodal alarm system in operating rooms and intensive care units.
MORE PROJECTS
selected publications
Millet, G., Otis, M., Horodniczy, D., and Cooperstock, J.R. (2017).
Design of Variable-Friction Devices for Shoe-Floor Contact
Mechatronics
design of variable-friction devices for human walking, in which the shoe-floor coefficient of friction can be controlled dynamically
Horodniczy, D. and Cooperstock, J.R. (2017).
Free the Hands! Enhanced Target Selection via a Variable-Friction Shoe
Human Factors in Computing Systems (CHI)
we confirm that variable-friction foot-controlled pointing can achieve throughput competitive with a range of hand-controlled devices
Aguilera, E., Lopez, J. J., and Cooperstock, J.R. (2016).
Spatial Audio for Audioconferencing in Mobile Devices: Investigating the Importance of Virtual Mobility and Private Communication and Optimizations
Journal of the Audio Engineering Society
analysis of the utility of avatar movement and sidebar whisper-mode functionality within an audioconferencing application
Blum, J., Frissen, I., and Cooperstock, J.R. (2015).
Improving Haptic Feedback on Wearable Devices through Accelerometer Measurements
User Interface Software and Technology (UIST)
measurement of user motion immediately prior to delivery of a haptic stimulus can help predict the likelihood of that stimulus being perceived
Blum, J., Eichhorn, A., Smith, S., Sterle-Contala, M., and Cooperstock, J.R. (2014).
Real-Time Emergency Response: Improved Management of Real-Time Information During Crisis Situations
Multimodal User Interfaces
describes our video streaming and management architecture, a prototype for NG-911 systems intended to enhance situational awareness in crisis scenarios
Panëels, S., Olmos, A., Blum, J., and Cooperstock, J. R. (2013).
Listen to It Yourself! Evaluating Usability of "What's Around Me?" for the Blind
Human Factors in Computing Systems (CHI)
task-based and long-term deployment studies of In-Situ Audio Services
Olmos, A., Bouillot, N., Knight, T., Mabire, N., Redel, J., and Cooperstock, J. R. (2012).
A High-Fidelity Orchestra Simulator for Individual Musicians' Practice
Computer Music Journal
simulator architecture to provide an immersive audiovisual experience of ensemble rehearsal or performance
Blum, J., Greencorn, D., and Cooperstock, J.R. (2012).
Smartphone Sensor Reliability for Augmented Reality Applications
Mobile and Ubiquitous Systems (MobiQuitous)
an extensive analysis of GPS, compass, and gyroscope sensor reliability in current generation smartphones
Blum, J., Bouchard, M., and Cooperstock, J.R. (2011).
What's Around Me? Spatialized Audio Augmented Reality for Blind Users with a Smartphone
Mobile and Ubiquitous Systems (MobiQuitous)    BEST PAPER AWARD
design of a mobile system to provide environmental awareness to the visually impaired community
Visell, Y., Giordano, B.L., Millet, G., and Cooperstock, J.R. (2011).
Vibration Influences Haptic Perception of Surface Compliance During Walking
PLoS ONE
investigation of how the perception of ground surface compliance is altered by plantar vibration feedback
Cooperstock, J.R. (2011).
Multimodal Telepresence Systems
IEEE Signal Processing Magazine, Special Issue on Immersive Communications
overview of multimodal signal acquisition, processing, transport, and rendering technologies for next generation telepresence systems
To, L., Thompson, B., Blum, J.R., Maehara, G., Hess, R., and Cooperstock, J.R. (2011).
A Game Platform for Treatment of Amblyopia
IEEE Transactions on Neural Systems and Rehabilitation Engineering
describes development of a prototype device for take-home use that can be used in the treatment of amblyopia; despite training being relatively intermittent throughout the week, significant improvements were observed
Visell, Y. and Cooperstock, J.R. (2010).
Design of a Vibrotactile Display via a Rigid Surface
IEEE Haptics Symposium    BEST PAPER AWARD
analysis, optimized redesign and evaluation of a high fidelity vibrotactile interface integrated in a rigid surface
Cooperstock, J.R. (2008)
Human-Computer Interaction
Chapter in Wiley Encyclopedia of Computer Science and Engineering, ISBN 978-0-471-38393-2, Benjamin W. Wah, ed., Vol. 3, 1529-1542.
A tutorial overview of the field of HCI, including history, use and context, human characteristics, usability principles, design and evaluation principles, interaction paradigms, related domains, risks of computer technology, and challenges ahead.
FULL PUBLICATIONS LIST
"I do remember students who took your classes. They were clearly divided as those who complained saying that you were a hard grader and expected them to do work, and those who were appreciative for the hard work and what they learned. All those who were willing to do work thought you were a great educator."
fall 2017-2018
ECSE424/542 · Human-Computer Interaction
Tuesday & Thursday, 8:35 – 9:55 ·  ENGTR 0070
ECSE526 · Artificial Intelligence
Tuesday & Thursday, 11:35 – 12:55 ·  ENGTR 0060
winter 2017-2018
ECSE421 · Embedded Systems
Tuesday & Thursday, 11:35 – 12:55 ·  ENGTR 0100
EARLIER COURSES
recent media
Regarding air passenger rights
CBC News, August 11, 2017
CBC News, August 2, 2017
Op ed in Montreal Gazette, June 1, 2017
National Consumers League, Flyers Rights, and Alliance for Aviation Across America Press Conference Remarks, July 12, 2017
Washington Post, June 1, 2017
CTV News, April 15, 2017
CTV News, April 13, 2017
AM 640 Toronto, April 13, 2017
Global News, April 12, 2017
Mashable, April 11, 2017
Zoomer Radio, August 8, 2016
Financial Post April 14, 2016
Regarding Autour
CBC News, August 4, 2016
La Presse, July 31, 2016
CTV News, July 29, 2016
Regarding Real-time Emergency Response
GCN, July 24, 2013
National Science Foundation press release, June 25, 2013
Regarding Air Canada's overbooking compensation
Radio-Canada, June 11, 2013
News 1130 Vancouver, June 7, 2013
La Presse, June 7, 2013
Toronto Star, June 6, 2013
Regarding Montreal's transit system
Montreal Gazette, September 11, 2010
Montreal Gazette, March 17, 2010
Global National News, January 29, 2010
Montreal Gazette, January 27, 2010
National Post, January 27, 2010
CBC As It Happens, January 27, 2010 (audio begins at 21:43)
Regarding research activities
Associated Press, Future tech on show at 36th SIGGRAPH, Aug. 3, 2009
Montreal Gazette, January 19, 2009
PREVIOUS MEDIA
videos


UltraVideo
EcoTile
Real-Time Emergency Response
mosaicing
interpolation
projection