CREST Symbiotic Interaction Research Group


Towards visualization of the sensing data of learners' activities and their contexts

Sensing, Modeling, and Visualizing Pedagogical Environments


Abstract


CREST Symbiotic Interaction Research Group is advancing technologies for sensing, modeling, and visualizing the sounds of learners' interactions and their surrounding environments. Currently, we are focusing on the development of two systems: a system for visualizing human interactions using a microphone array, and a mobile auditory sensing system using smartwatches. These systems are being developed as part of the CREST program "Pedagogical information infrastructure based on mutual interaction, Symbiotic Interaction."


Member



  • Mitsuru Kawamoto
  • Akio Sashima
  • Shuichi Tomita



Interaction Analysis based on Auditory Data Sensed by Microphone Array


We are developing interaction analysis technology for sensing, modeling, and visualizing the status of mutual interactions in pedagogical environments such as classrooms and coaching sessions. In this technology, a microphone array is used to sense the interactions in the auditory environment.


Sensing
Sensing pedagogical environments
Modeling
Extracting features to detect interaction statuses (stages) and their changes, and constructing a model for visualizing the interaction statuses
Design
Developing an evaluation method based on auditory sensing data by comparing it with conventional QA-based evaluation methods
Intervention
Applying the auditory evaluation method to pedagogical processes
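
As a toy illustration of the sensing and modeling steps above, the following sketch computes frame-level energy for each channel of a multichannel recording and takes the loudest channel per frame as a crude proxy for which speaker (direction) is currently active. This is a minimal, assumed example with synthetic data, not the group's actual pipeline; real microphone-array processing would use beamforming or time-difference-of-arrival localization.

```python
import numpy as np

def frame_energy(signal, frame_len=1024, hop=512):
    """Frame-level RMS energy for one microphone channel."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len]
                       for i in range(n_frames)])
    return np.sqrt((frames ** 2).mean(axis=1))

def dominant_channel(multichannel, frame_len=1024, hop=512):
    """Index of the loudest microphone per frame: a crude proxy
    for which direction (speaker) is currently active."""
    energies = np.stack([frame_energy(ch, frame_len, hop)
                         for ch in multichannel])
    return energies.argmax(axis=0)

# Synthetic 4-channel example: channel 2 carries a loud 440 Hz tone,
# the other channels contain only low-level noise.
rng = np.random.default_rng(0)
audio = rng.normal(0, 0.01, size=(4, 16000))
audio[2] += np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)
print(dominant_channel(audio))  # channel 2 dominates every frame
```

A time series of dominant channels like this could then be summarized (e.g., turn-taking counts, speaking-time ratios) as a feature for modeling interaction stages.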








Context Analysis based on Mobile Auditory Sensing


We are researching mobile auditory sensing technology that enables us to understand the auditory contexts of users. In this technology, a smartwatch is used to sense a user's context.


Prototype System
Mobile auditory sensing system implemented on a smartwatch
Dimension Reduction
Applying Non-negative Matrix Factorization (NMF) for dimension reduction of auditory sensing data
Cluster Analysis
Clustering the reduced-dimensional auditory data to segment the user's contexts

Contact


If you have any questions, please contact us at the following address.