


Teaching subjects that involve practical demonstrations presents major challenges when performed online. Such subjects include physical computing, digital fabrication, electronics, chemistry, and traditional crafts. An overhead camera system goes some way towards addressing these challenges, but in our experience of teaching physical computing and hardware hacking, this setup also introduces new problems. Using an overhead camera requires switching between camera inputs, which interrupts the flow of the lecture; moreover, when additional video-capture applications (such as smartphone mirroring apps) are used simultaneously, juggling extra windows on a single shared screen makes matters even more cumbersome. A commercial-grade overhead camera providing multiple sources of illumination (vertical, lateral, diffuse, etc.) would solve the illumination problem, but even with a large field of view a single camera would be insufficient: a critical aspect of teaching by demonstration is that students can observe the task dynamically, self-selecting an appropriate point of view at any time so that they can replicate the demonstration from their own remote location.
We propose to build a multi-camera rig that records different viewpoints simultaneously, each of which can be selected by every student independently, while allowing the teacher to override the viewing angle when something requires the attention of the whole class. Aspects of illumination (position, intensity, color) could also be controlled by students independently. With multiple camera views that students and instructors can navigate freely, we can investigate new ways of improving students' attention during lectures and study the relationship between attention allocation and view transitions. We also record how long each camera view is watched, so that a personalised learning experience can be generated by prioritising the most relevant footage.
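To illustrate the intended interaction model, the following is a minimal sketch of per-student view selection with a teacher override and independent illumination preferences. All class, method, and field names (ViewSession, select_view, override_view, LightSettings, etc.) are hypothetical assumptions for this sketch, not the actual system's API.

from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class LightSettings:
    """Per-student illumination preferences (light source, intensity, colour temperature)."""
    position: int = 0          # index of the light source (e.g. vertical, lateral, diffuse)
    intensity: float = 1.0     # 0.0 .. 1.0
    color_temp_k: int = 5000   # colour temperature in kelvin


@dataclass
class ViewSession:
    """Tracks which camera each student watches and an optional teacher override."""
    camera_ids: tuple                      # identifiers of the available cameras
    student_view: Dict[str, int] = field(default_factory=dict)
    student_light: Dict[str, LightSettings] = field(default_factory=dict)
    teacher_override: Optional[int] = None

    def select_view(self, student_id: str, camera_id: int) -> None:
        """A student freely picks one of the available cameras."""
        if camera_id not in self.camera_ids:
            raise ValueError(f"unknown camera {camera_id}")
        self.student_view[student_id] = camera_id

    def set_light(self, student_id: str, settings: LightSettings) -> None:
        """A student adjusts their own illumination preference independently."""
        self.student_light[student_id] = settings

    def override_view(self, camera_id: Optional[int]) -> None:
        """Teacher forces every student onto one camera; None releases the override."""
        if camera_id is not None and camera_id not in self.camera_ids:
            raise ValueError(f"unknown camera {camera_id}")
        self.teacher_override = camera_id

    def resolve_view(self, student_id: str) -> int:
        """Camera actually shown: the teacher override wins, otherwise the student's choice."""
        if self.teacher_override is not None:
            return self.teacher_override
        return self.student_view.get(student_id, self.camera_ids[0])


if __name__ == "__main__":
    session = ViewSession(camera_ids=tuple(range(13)))
    session.select_view("alice", 4)
    print(session.resolve_view("alice"))   # 4 (student's own choice)
    session.override_view(7)
    print(session.resolve_view("alice"))   # 7 (teacher override in effect)

The sketch keeps all state in memory for clarity; in a deployed system this logic would sit behind the streaming server so that override changes propagate to every connected student.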
We developed a customised streaming system equipped with 13 cameras that gives students semi-free control over viewpoints. The system can also suggest the best view for understanding an object, based on a prior user study that characterised students' perception of physical tasks in terms of object recognition. Students' attention allocations are also recorded and used to derive an optimal view-switching sequence for the recorded class. To study how embodied experience can facilitate the understanding of 3D objects, an AR rig on the student side provides an in-situ simulation, as if the students were standing around the instructor's workspace.
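As one possible way to turn recorded attention allocations into a single view-switching track for the recorded class, the sketch below picks, for each time bin, the camera most students were watching. The log layout and the function name dominant_view_track are illustrative assumptions, not the system's actual implementation.

from collections import Counter
from typing import Dict, List, Tuple

# One attention-log entry: (timestamp_seconds, student_id, camera_id)
AttentionLog = List[Tuple[float, str, int]]


def dominant_view_track(log: AttentionLog, bin_seconds: float = 5.0) -> List[Tuple[float, int]]:
    """Return (bin_start_time, camera_id) pairs, one per time bin that has data."""
    bins: Dict[int, Counter] = {}
    for t, _student, cam in log:
        # Group attention samples into fixed-length time bins and count views per camera.
        bins.setdefault(int(t // bin_seconds), Counter())[cam] += 1
    track = []
    for b in sorted(bins):
        camera, _votes = bins[b].most_common(1)[0]
        track.append((b * bin_seconds, camera))
    return track


if __name__ == "__main__":
    demo_log = [
        (0.5, "alice", 3), (1.0, "bob", 3), (2.0, "carol", 5),
        (6.0, "alice", 8), (7.5, "bob", 8), (8.0, "carol", 8),
    ]
    print(dominant_view_track(demo_log))  # [(0.0, 3), (5.0, 8)]

A majority vote per bin is only one heuristic; the best-view suggestions derived from the user study could equally be blended in, for example by weighting votes by each camera's measured suitability for object recognition.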