Automatic Generation of Motion Effects

We develop algorithms that automatically synthesize motion effects from diverse sources, including camera movement, object motion, and sound. These algorithms can greatly accelerate motion effect design.

Publications

CHI 2023 / WHC 2023 (Demo)

Generating Real-Time, Selective, and Multimodal Haptic Effects from Sound for Gaming Experience Enhancement
Gyeore Yun, Minjae Mun, Jungeun Lee, Dong-Geun Kim, Hong Z Tan, and Seungmoon Choi

We propose an algorithm that generates a vibration, an impact, or a combined vibration+impact haptic effect by processing a sound signal in real time. Our algorithm is selective in that it matches the most appropriate type of haptic effect to the sound using a machine-learning classifier (random forest) built on expert-labeled datasets. Our algorithm is tailored to enhance user experiences for video game play, and we present two examples for the RPG (role-playing game) and FPS (first-person shooter) genres. We demonstrate the effectiveness of our algorithm through a user study comparing it to other state-of-the-art (SOTA) methods for the same cross-modal conversion. Our system elicits better multisensory user experiences than the SOTA algorithms for both game genres.
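As a rough sketch of the classification stage, the snippet below extracts a few assumed per-frame audio features (RMS energy, spectral centroid, zero-crossing rate; the paper's actual feature set and trained model are not given here). A random forest fit on expert-labeled frames would then map such features to an effect type (vibration, impact, or vibration+impact).

```python
import numpy as np

def extract_features(frame: np.ndarray, sr: int) -> np.ndarray:
    """Hypothetical per-frame audio features for effect-type classification:
    RMS energy, spectral centroid (Hz), and zero-crossing rate."""
    rms = np.sqrt(np.mean(frame ** 2))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    return np.array([rms, centroid, zcr])

# A classifier such as scikit-learn's RandomForestClassifier would be fit on
# expert-labeled feature vectors offline and queried per frame at runtime.
```

The feature names and the three-feature design are illustrative assumptions, not the published pipeline.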

CHI 2023

Generating Haptic Motion Effects for Multiple Articulated Bodies for Improved 4D Experiences: A Camera Space Approach
Sangyoon Han, Jaejun Park, and Seungmoon Choi

Motion effects are indispensable for improving 4D experiences in highly interactive applications, such as amusement parks, 4D theaters, and virtual reality games. Their recent emergence calls for effective algorithms generating motion effects synchronized with audiovisual content. This paper presents an automatic algorithm for synthesizing object-based motion effects that inclusively express the movements of multiple articulated bodies when the objects' motion trajectories are available in the 3D camera space. Taking the visual velocities and sizes of all object parts as input, our method computes a motion proxy that represents the objects' movements by one point and converts the motion proxy to a motion command through a motion cueing algorithm. The motion proxy is determined by linearly combining the velocities, and the best combination was selected from several candidates through user studies. The results of user studies indicate that our algorithm can produce compelling object-based motion effects that enhance the multisensory experience.
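One plausible reading of the linear combination is a size-weighted average of the parts' camera-space velocities; this is a minimal sketch only, since the actual weighting rule was chosen among several candidates through the paper's user studies.

```python
import numpy as np

def motion_proxy(velocities, sizes):
    """Combine per-part camera-space velocities (N x 3) into a single proxy
    velocity, weighting each articulated part by its visual size.
    Assumed rule for illustration: normalized-size weighted average."""
    velocities = np.asarray(velocities, dtype=float)
    weights = np.asarray(sizes, dtype=float)
    weights = weights / weights.sum()  # normalize so the weights sum to 1
    return weights @ velocities       # one point summarizing all parts
```

The resulting proxy velocity would then be passed to a motion cueing algorithm as described above.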

TOH 2022

Data-Driven Rendering of Motion Effects for Walking Sensations in Different Gaits
Hyoseung Lee, Seungjae Oh, and Seungmoon Choi

Motion effects are a vital component in 4D interactive applications, where special physical effects, such as motion, vibration, and wind, are provided with audiovisual stimuli. In 4D films and VR games, scenes that show human locomotion appear frequently, and motion effects emphasizing such movements can enhance the viewers' immersive experiences. This paper proposes a data-driven framework for the automatic generation of motion effects that provide users with walking sensations. Measurements are made using motion sensors attached to the human body during locomotion in different gaits, e.g., walking, running, and stomping. The captured data are processed and converted to multiple degree-of-freedom commands for a motion platform. We demonstrate that the data-driven motion commands can be represented in a much lower-dimensional space by principal component analysis. This finding leads to an algorithm for the synthesis of new motion commands that can elicit the target gait's walking sensations. The perceptual performance of our method is validated by two user studies. This work contributes to investigating the feasibility of mimicking walking sensations using a motion platform based on human locomotion data and to developing an algorithm for the automatic generation of motion effects conveying the impressions of different gaits.
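The dimensionality-reduction step can be sketched with standard PCA via SVD: center the matrix of motion-command samples and project it onto the top-k principal components. This is a generic PCA sketch, not the paper's processing pipeline.

```python
import numpy as np

def pca_reduce(X, k):
    """Project motion-command samples X (n_samples x n_dofs) onto the top-k
    principal components, computed from the SVD of the centered data.
    Returns the low-dimensional scores and the component directions."""
    Xc = X - X.mean(axis=0)                       # center each DoF channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]                  # scores, components
```

New motion commands could then be synthesized by recombining components in this low-dimensional space, in the spirit of the abstract.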

IEEE VR 2021

Camera Space Synthesis of Motion Effects Emphasizing a Moving Object in 4D Films
Sangyoon Han, Gyeore Yun, and Seungmoon Choi

Four-dimensional (4D) films, which provide special physical effects to the audience along with audiovisual stimuli, are gaining popularity and acceptance. One of the most frequent 4D effects is the object-based motion effect, which refers to the vestibular stimulus generated by a motion chair to emphasize a moving object of interest, e.g., a flying Iron Man, displayed on the screen. In this paper, we present an algorithm for automatically synthesizing convincing object-based motion effects from a given object motion trajectory. While previous approaches use the 2D object position on the screen as input, our method takes the 3D position and orientation of the object in the camera space and computes a motion proxy that reflects both the object's translation and rotation, as well as its size to the viewers' eyes. The proxy is determined based on the results of a perceptual experiment that presents an optimal additive rule of the translation and rotation information scaled by the object's visual size. The motion proxy is fed to a motion cueing algorithm (MCA) that computes the motion command using a washout filter or model predictive control. The most appropriate MCA for our purpose is selected from six candidates by a user study. We also consider the effects of visual perception by incorporating two types of motion field equations into the computation of the visually perceived velocity. The results of a user study indicate that our algorithm can generate compelling object-based motion effects that better enhance the 4D film viewing experience than the previous methods.
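The washout idea can be illustrated with a toy first-order high-pass filter, a minimal stand-in for the full MCA: transient components of the proxy signal pass through as motion commands, while sustained components are washed out so the chair drifts back toward its neutral pose within the workspace.

```python
def washout_highpass(signal, dt, tau):
    """Discrete first-order high-pass (washout) filter, as a toy sketch of
    the washout principle (not the paper's MCA). signal: input samples,
    dt: sample period in seconds, tau: washout time constant in seconds."""
    alpha = tau / (tau + dt)
    y = [0.0]
    for k in range(1, len(signal)):
        # Pass changes in the input; geometrically decay sustained levels.
        y.append(alpha * (y[-1] + signal[k] - signal[k - 1]))
    return y
```

Applied to a step input, the output jumps initially (the transient cue) and then decays toward zero (the washout), which is the behavior a motion chair needs to stay inside its limited workspace.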

CHI 2021

Improving Viewing Experiences of First-Person Shooter Gameplays with Automatically-Generated Motion Effects
Gyeore Yun, Hyoseung Lee, Sangyoon Han, and Seungmoon Choi

Millions of people now enjoy watching video gameplays at eSports stadiums or at home. We seek a method that improves gameplay spectator or viewer experiences by presenting multisensory stimuli. Using a motion chair, we provide viewers watching a first-person shooter (FPS) gameplay with motion effects automatically generated from the audiovisual stream. The motion effects express the game character's movement and gunfire actions. We describe algorithms for computing such motion effects, developed using computer vision techniques and deep learning. Through a user study, we demonstrate that our method of providing motion effects significantly improves the viewing experience of FPS gameplay. The contributions of this paper are the motion synthesis algorithms integrated for FPS games and the empirical evidence for the benefits of experiencing multisensory gameplays.
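As a purely hypothetical illustration of mapping detected gunfire events to motion commands (the paper uses computer vision and deep learning; the brightness-spike heuristic and the pitch magnitude below are assumptions for the sketch), a per-frame event could trigger a recoil-like pitch impulse on the motion chair:

```python
import numpy as np

def recoil_commands(frame_brightness, threshold=0.8, recoil_pitch=-2.0):
    """Hypothetical gunfire-to-motion mapping: treat a mean-brightness spike
    as a crude muzzle-flash proxy and emit a pitch impulse (deg) per event.
    frame_brightness: per-frame mean brightness normalized to [0, 1]."""
    b = np.asarray(frame_brightness, dtype=float)
    return [recoil_pitch if v > threshold else 0.0 for v in b]
```

In a real system the event detector would be a learned model and the impulse would be shaped and rate-limited before being sent to the chair.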

Interaction Lab. and Experience Lab. @ Pohang University of Science and Technology (POSTECH)
