Body2Particles: Designing Particle Systems Using Body Gestures

Body2Particles is an interactive design system that allows users to create complex particle systems using body gestures. Even for skilled animators, adjusting the many parameters of a complex physics animation is a tedious task, and common users may feel frustrated when trying to understand these parameters and their proper usage. To address these issues, Body2Particles provides an embodied design interface for the animation of complex particle systems. This work focuses especially on the animation of fireworks, as it requires the hierarchical construction of multiple particle systems. The proposed user interface takes the user's body movements as input from a depth sensor and outputs firework animations. In a preliminary study, we identified the relationships between the animation parameters and skeleton joints. The proposed system encodes these relationships as predefined parameter constraints and estimates the simulation parameters from the skeleton captured by a depth sensor. User studies were also conducted to verify the effectiveness and potential uses of the system, such as exercise promotion.


Introduction
Particle systems are used to simulate complex phenomena in virtual worlds, such as fluid, smoke, and fireworks, which are common in computer animations and video games [12]. Although various sophisticated Eulerian and Lagrangian fluid simulation approaches are used in computer graphics nowadays, particle systems remain convenient for common users due to their low computation cost and the plugins embedded in popular game engines such as Unity3D. However, it is not easy for common users to achieve the desired animation results by adjusting the various control parameters. Even for professional users, controlling the parameters through traditional graphical user interfaces (GUIs) is a tedious task. A user who wanted to create a firework animation would have to study the flow dynamics of actual fireworks and master the appropriate simulation software, neither of which is an appealing task for common users.
We have also observed that lack of exercise is a serious social issue: daily study and work leave people with little motivation to exercise, leading to a variety of health problems. An embodied design interface for content creation can facilitate bodily self-expression, in contrast to common design interfaces that use a keyboard and mouse as system input. To make users exercise while designing interesting content, we aim to present a gesture-based embodied interface for designing complex particle systems. Through such an interface, common users can design particle systems using body gestures that exercise their bodies.
To achieve these goals, we propose Body2Particles, an interactive system that allows any user to easily design complex particle systems using full-body gestures as system input. Body2Particles provides an experience similar to playing games with the Microsoft Kinect during the design process for particle systems. In this work, we focus especially on firework animations, a common and fascinating phenomenon in daily life that can be represented by particle systems. Unlike the conventional design system for firework animations [5], the proposed system tracks human gestures as input.
Body2Particles enables the procedural design of particle systems without requiring professional knowledge. As shown in Figure 1, we map body posture onto the control parameters of a particle system, so users are not required to understand simulation principles or master any simulation skills. The main contribution of this work is to provide an intuitive and playful user experience in animation design using body gestures. We believe that the proposed system can be applied to general procedural modeling techniques in computer graphics, especially physical animation designs such as cloth and smoke animations.

Related Work
In this section, we review related work on firework animation and design interfaces. We also discuss training systems that support daily exercise.
Firework Animation Particle systems are usually used to model fuzzy objects, such as clouds, water, and fireworks [12]. It is difficult to use particle systems to design fuzzy objects under animation constraints, such as forming particular shapes. A GPU-based firework simulation pipeline has been proposed to animate shape-constrained fireworks [16]. Recently, a similar approach was proposed for a virtual-reality environment with a head-mounted display [5], providing a sketch-based design system for shape-constrained firework animations. All of these approaches target computers with a traditional GUI; in contrast, our work aims to provide an embodied design interface for firework animation.
Design Interface It is challenging to create a design interface for novice users, especially for tasks requiring high-level skills, such as aerodynamics [14] and fluid dynamics [17]. Common solutions fall into two categories: sketch-based design and gesture-based design. An interactive sketch system usually adopts a data-driven approach to provide real-time feedback to users. Sketch2Photo composites realistic pictures from freehand sketches [4]. Sketch2VF helps novice users design a 2D flow field with conditional generative adversarial networks [6]. Sketch2Domino provides spatial sketch design for block management and guidance in real-world tasks [10]. As a gesture-based design system, BodyAvatar enables the design of freeform 3D avatars using human gestures [15]. Embodied interaction has also been used in 3D furniture design, with body posing and acting as system input [9]. Unlike these gesture-based design systems, we provide an embodied design interface for dynamic animation rather than 3D modeling.
Training System Training systems usually use depth sensors, such as the Microsoft Kinect, to support sports training [2]. A more complex motion capture system has been used to guide tai chi exercises in a virtual environment [7], and a similar 3D virtual coaching system has been proposed to improve golf swing technique [8]. In addition, a VR training system with motion capture devices has been proposed for learning basketball free-throw gestures [11]. Owing to the rapid development of deep-learning-based approaches, the human skeleton [3] and 3D meshes [13] can now be reconstructed interactively to guide users in sports training. In contrast to previous training systems, our work aims to design particle systems while providing a game-like experience that promotes user exercise.

System Configuration
The proposed system framework of Body2Particles is illustrated in Figure 2. To explore the relationship between body gestures and the control parameters of particle systems, we conducted a preliminary study to collect motion data, in which users were asked to perform gestures freely while watching recorded firework animations generated with different control parameters. In our embodied design interface for firework animation, we explicitly designed body gestures to correspond to different launching angles and heights and different blooming sizes and shapes. The final firework animations are created from the user's gestures in real time.

Body Tracking
In this study, we used an RGB-D depth camera for body tracking (Kinect V2 with Kinect SDK 2.0) and obtained the coordinates, velocities, and angles of the captured skeleton joints. As shown in Figure 3(a), we used 21 of the 25 joints captured by the Kinect configuration; the ankle and hand-tip data were ignored because these joints were not clearly visible in this study. We then defined the distances between the two hands D_h, elbows D_e, shoulders D_s, knees D_k, and feet D_f for particle system control. To control blooming shapes, the shape made by the two arms is estimated from the angles between the forearms and upper arms, where θ_l and θ_r denote the left and right arm joint angles, as shown in Figure 3(b).
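The joint metrics above can be sketched as follows. This is a minimal illustration, not the system's actual implementation; the function names are our own, and joint positions are assumed to be (x, y, z) tuples taken from a Kinect skeleton frame.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D joint positions,
    e.g. D_h between the left and right hand joints."""
    return math.dist(a, b)

def arm_angle(shoulder, elbow, wrist):
    """Angle (degrees) at the elbow between the upper arm and
    the forearm, i.e. theta_l or theta_r in the text."""
    upper = [s - e for s, e in zip(shoulder, elbow)]
    fore = [w - e for w, e in zip(wrist, elbow)]
    dot = sum(u * f for u, f in zip(upper, fore))
    norm = math.hypot(*upper) * math.hypot(*fore)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

A fully extended arm gives an angle near 180°, and a tightly bent arm gives a small angle, matching the shape criteria used later.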

Firework Animation
A particle system represents fuzzy objects, such as fireworks, with a collection of many small particles [12]. Each particle system specifies various parameters to control the dynamical behavior of the fireworks, as shown in Figure 4(a). A particle system normally goes through three stages over its lifetime: creation of particles, change of particle attributes, and disappearance. After each particle in the system is rendered, the firework animation is generated. In this study, we generate a firework animation by dividing the whole life cycle of a firework, from launch to extinction, into two phases. As shown in Figure 4(b), the firework to be launched is the parent particle, and the firework after blooming is a sub-emitter particle system. Different control parameters are defined in the two phases:
- Launching phase: launching height and angle, and number of fireworks;
- Blooming phase: blooming sizes and shapes, and the number of sub-emitter particles from blooming to extinction.
Note that our prototype supports both single and multiple fireworks modes in terms of the number of fireworks.
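The two-phase parameter set above can be sketched as a small data structure. This is an illustrative grouping only; the field names and default values are our own assumptions, not the prototype's actual Unity3D property names.

```python
from dataclasses import dataclass, field

@dataclass
class LaunchingPhase:
    height: float = 10.0       # launching height (m)
    angle: float = 0.0         # launching angle (degrees from vertical)
    num_fireworks: int = 1     # number of parent particles launched

@dataclass
class BloomingPhase:
    size: float = 5.0          # blooming radius (m)
    shape: str = "hemisphere"  # "hemisphere", "ring", or "weeping-willow"
    num_particles: int = 200   # sub-emitter particle count after blooming

@dataclass
class FireworkParams:
    """Parent particle (launching) plus sub-emitter (blooming) parameters."""
    launch: LaunchingPhase = field(default_factory=LaunchingPhase)
    bloom: BloomingPhase = field(default_factory=BloomingPhase)
```

The gesture-mapping rules in the next section fill in one such parameter set per launched firework.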

User Interface Design
Body2Particles provides an embodied design interface for particle systems that takes users' body gestures as input. In this section, we discuss the preliminary study conducted to map body gesture parameters onto control parameters, and define the gesture controls used in our interface to increase immersion and improve the user experience.

Preliminary Study
We conducted a preliminary study to explore how common users express fireworks with their bodies while observing firework animations. All the firework animations were generated by modifying the control parameters manually in Unity3D. In total, we collected 18 basic styles of firework animations, as shown in Figure 5. To cover the feature space of firework design, we varied the main parameters of the particle system. Five participants (three male, two female) joined our preliminary study. The experiment setup is shown in Figure 2. Each participant stood in front of a Kinect depth sensor and a display screen that showed video clips of the generated firework animations, and was asked to express the fireworks using body gestures. The distance between the participant and the screen was 3.0 m.

Parameter Mapping
Figure 6 shows the motion trajectories tracked during our preliminary study; each subfigure corresponds to one video clip of firework animation. We found that all participants preferred hand movements to express the firework dynamics in both the launching and blooming phases. For the different launching angles shown in Figures 5 and 6 (a ∼ c), the participants were most likely to tilt their bodies to the left or right. In interviews, participants said that it was difficult to express the differences among the blooming numbers of the firework animations. In addition, there was no obvious difference between the male and female participants' movement data. With reference to the results of the preliminary study, we explicitly define the parameter mapping between body gestures and control parameters as follows: Launching Angles are calculated from the tilting angle of the user's body θ_B.
θ_B = arctan((x_neck − x_sb) / (y_neck − y_sb)), where (x_neck, y_neck, z_neck) are the three-dimensional coordinates of the neck joint and (x_sb, y_sb, z_sb) are the coordinates of the spine base joint. Launching Heights are defined by the relative positions of the user's head, spine shoulder, and spine middle joints, as shown in Figure 3. Because it is difficult to distinguish between launching heights with small variations, we defined three levels: high y_head ∈ (y_spine-shoulder, ∞), medium y_head ∈ (y_spine-mid, y_spine-shoulder), and low y_head ∈ (−∞, y_spine-mid), where y_head, y_spine-shoulder, and y_spine-mid denote the heights (y-coordinates) of the user's head, spine shoulder, and spine middle joints.
Launching Numbers of parent particles are decided based on the acceleration of the hand joints: the larger the acceleration of the user's hands, the greater the number of launched fireworks.
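The launching-phase mapping can be sketched as below. The tilt formula and height thresholds follow the text; the acceleration-to-count scaling is an illustrative assumption, since the paper does not give an explicit formula for it.

```python
import math

def launching_angle(neck, spine_base):
    """Body tilt: lean of the neck relative to the spine base,
    in degrees (0 = upright, positive = leaning right)."""
    dx = neck[0] - spine_base[0]
    dy = neck[1] - spine_base[1]
    return math.degrees(math.atan2(dx, dy))

def launching_height_level(y_head, y_spine_shoulder, y_spine_mid):
    """Three discrete launching heights from the head's y-coordinate."""
    if y_head > y_spine_shoulder:
        return "high"
    if y_head > y_spine_mid:
        return "medium"
    return "low"

def launching_number(hand_acceleration, base=1, scale=0.5):
    """Larger hand acceleration -> more fireworks (hypothetical scaling)."""
    return base + int(scale * hand_acceleration)
```

For example, an upright posture yields a 0° launching angle, and ducking the head below the spine middle joint yields a low launch.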
Blooming Sizes are calculated as the maximum of the distances between the two hands, elbows, shoulders, knees, and feet, i.e., max(D_h, D_e, D_s, D_k, D_f). We then adjust the life cycle of the sub-emitter particles to modify the blooming size. During the preliminary study, the largest fireworks were generated by the maximum distance between the two hands. We argue that these size constraints promote exercise, since users must stretch their bodies to achieve larger fireworks.
Blooming Shapes are set to hemispheres, rings, or weeping-willow shapes in our prototype design, which are typical and common firework shapes.
The blooming shapes are determined by the hand distance D_h and the arm angles θ_l and θ_r. If D_h > δ (δ is a threshold value of the shape criterion; δ = 40 cm in this study) and θ_l,r > 115°, the blooming shape is set to be hemispherical. If D_h < δ and 60° < θ_l,r ≤ 115°, the blooming shape is set to be a closed shape (ring or weeping-willow). The specific closed shape is decided by the relative position of the hands to the body core: a ring shape if y_hand ∈ (y_spine-mid, ∞), and a weeping-willow shape if y_hand ∈ (y_spine-base, y_spine-mid), where y_hand and y_spine-base denote the heights (y-coordinates) of the user's hands and spine base joint. Figure 7 shows the firework animations generated by different body gestures with our proposed parameter mapping approach. Please refer to the accompanying video for the details of the gesture controls.
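The shape rules above can be sketched as a small classifier, assuming the hand distance and joint heights are in centimeters and the angles in degrees; the fallback behavior when no rule matches is our assumption.

```python
def blooming_shape(d_hands, theta_l, theta_r,
                   y_hand, y_spine_mid, y_spine_base, delta=40.0):
    """Classify the blooming shape from hand distance (cm) and arm
    angles (degrees), following the thresholds in the text."""
    if d_hands > delta and theta_l > 115 and theta_r > 115:
        return "hemisphere"
    if d_hands < delta and 60 < theta_l <= 115 and 60 < theta_r <= 115:
        # Closed shape: ring if the hands are above the spine middle
        # joint, weeping willow if between spine base and spine middle.
        if y_hand > y_spine_mid:
            return "ring"
        if y_spine_base < y_hand <= y_spine_mid:
            return "weeping-willow"
    return None  # gesture not recognized as a blooming shape
```

Wide-open arms thus map to a hemisphere, while a closed arm pose maps to a ring or weeping-willow shape depending on hand height.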

Gesture Control
In addition to the parameter mapping between body gestures and control parameters, Body2Particles provides global settings for firework animations, including initial launching conditions, particle colors, and single or multiple fireworks modes. Firework animations can be designed with different colors and numbers using the proposed embodied design interface, as shown in Figure 8.

Launching Conditions
To increase the recognition rate of body gestures, the design system launches fireworks only when initial conditions are met. Based on our preliminary study, we set the initial conditions as follows: the hands must be at least 20 cm above the spine middle joint, and the velocity of the user's hands must exceed 1.8 m/s.

Color Selection
The user can select the firework colors with the right hand in our design system, as shown in Figure 8(a1, a2). The height of the right hand y_hr determines the color, from bottom to top: orange if y_hr ∈ (y_head, ∞), yellow if y_hr ∈ (y_spine-shoulder, y_head), purple if y_hr ∈ (y_spine-mid, y_spine-shoulder), blue if y_hr ∈ (y_spine-base, y_spine-mid), dark blue if y_hr ∈ (y_knee-right, y_spine-base), and green if y_hr ∈ (0, y_knee-right).
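These height-to-color zones can be sketched as a chain of comparisons checked from top to bottom, assuming all y-coordinates share the same units:

```python
def firework_color(y_hr, y_head, y_spine_shoulder, y_spine_mid,
                   y_spine_base, y_knee_right):
    """Map the right-hand height to a firework color using the
    joint-height boundaries given in the text."""
    if y_hr > y_head:
        return "orange"
    if y_hr > y_spine_shoulder:
        return "yellow"
    if y_hr > y_spine_mid:
        return "purple"
    if y_hr > y_spine_base:
        return "blue"
    if y_hr > y_knee_right:
        return "dark blue"
    return "green"
```

Reaching overhead selects orange, while dropping the hand below the right knee selects green.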

Mode Selection
The user can select the single or multiple fireworks mode in our design system using the knees, as shown in Figure 8(b1, b2). To diversify the gestures and promote exercise, the system selects the single fireworks mode if y_foot-left > 20 cm and (y_knee-left − y_hip-left) > 10 cm with the user's left leg, and the multiple fireworks mode if y_foot-right > 20 cm and (y_knee-right − y_hip-right) > 10 cm with the user's right leg.
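The leg-raise mode switch can be sketched as follows, assuming heights in centimeters and that no mode change occurs when neither leg is raised (the paper does not state the fallback behavior):

```python
def select_mode(y_foot_left, y_knee_left, y_hip_left,
                y_foot_right, y_knee_right, y_hip_right):
    """Left-leg raise -> single fireworks mode; right-leg raise ->
    multiple fireworks mode, per the thresholds in the text."""
    if y_foot_left > 20 and (y_knee_left - y_hip_left) > 10:
        return "single"
    if y_foot_right > 20 and (y_knee_right - y_hip_right) > 10:
        return "multiple"
    return None  # keep the current mode
```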

User Study
In our user study, the prototype system of Body2Particles was developed with Unity3D on Windows, using a Kinect V2 as the depth sensor and a HUAWEI Honor Band 5 for calorie consumption measurement. The experiment setup and participants are shown in Figure 9. To verify the user experience of the proposed system, we conducted a subjective evaluation through a questionnaire and an objective evaluation by measuring activity intensity. Twelve graduate students (6 male, 6 female) joined this user study. All participants were asked to use the proposed system for a total of 20 minutes: 10 minutes each in the single and multiple fireworks modes. During the user study, the participants wore the wristband device to record calorie expenditure, which was used to calculate the intensity of the 20-minute activity (metabolic rate).

Results
In this section, we discuss the subjective and objective evaluation results from our user study. Subjective Evaluation We confirmed the effectiveness of the proposed system in three aspects through the subjective evaluation: system usage, system effects, and user experience, as shown in Figure 10. We adopted a 5-point Likert scale for all questions (5 for strongly agree, 1 for strongly disagree) and received quite positive feedback on all aspects, with mean values of 4.4 for system usage, 4.5 for system effects, and 4.1 for user experience. For system usage, the results verify that the proposed system is easy to operate and provides adequate interactive feedback to users. For system effects, the proposed system encouraged users to exercise, and the participants felt relaxed after using the system and wanted to use it again. For user experience, the participants felt satisfied and interested while using both the single and multiple fireworks modes.
In the interviews with participants after they used the proposed system, the following comments were most common: "More types of fireworks shapes", "Allow two users to battle against each other", "Generate fireworks with footprints while walking", and "Design cute shapes for female users".The participants also pointed out that the system sometimes recognized gestures incorrectly.There was a delay between body gestures and fireworks generation because the system had to generate new particle systems after the completion of previous ones.
Objective Evaluation Because our research aims to combine particle system design with exercise promotion through gesture-based control, we evaluated the effect of using the proposed system on exercise. Figure 11 shows the correlation between participant weight and calorie consumption. In this study, we measured activity intensity using the metabolic rate [1], computed as the measured calorie expenditure divided by the participant's body weight and the activity duration (1 MET = 1 kcal/kg/h).
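Under the standard convention that 1 MET equals 1 kcal per kilogram of body weight per hour, the activity intensity can be computed as below; note this is the common textbook definition and the exact formula used in [1] may differ.

```python
def mets(calories_kcal, weight_kg, duration_hours):
    """Metabolic equivalent of task: kcal burned per kg of body
    weight per hour of activity (1 MET = 1 kcal/kg/h)."""
    return calories_kcal / (weight_kg * duration_hours)
```

For example, a 60 kg participant who burns 120 kcal in the 20-minute session reaches 6.0 METs, the moderate-to-high intensity level reported below.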
In our user study, both male and female participants achieved more than 6.0 METs, as shown in the right subfigure of Figure 11. This shows that the proposed system provides the effect of moderate- to high-intensity physical activity.

Conclusion
In this study, we proposed Body2Particles, an interactive system for designing particle systems using body gestures. To explore the relationship between firework animations and users' body movements, we conducted a preliminary study to investigate the parameter mapping from body gestures to the control parameters of particle systems. We then analyzed the body movements of the participants and clarified the correspondence between body gestures and control parameters, including launching angles and heights and blooming sizes and shapes. In our user study, we verified that the proposed system is intuitive and enjoyable for common users. Furthermore, the proposed system had a clear effect of promoting exercise, equivalent to other moderate-intensity physical activities.
The prototype implementation of the proposed system can be improved in multiple ways. Although the current prototype uses a fixed list of built-in skeletal parameters for particle system design, design flexibility would be improved by providing an interactive authoring tool for visual control of the simulation parameters. To provide a continuously stimulating environment, we would also loosen the constraints on control gestures and give users more freedom. The system could be made more interesting by increasing the types of fireworks and adding gamification elements, such as scores and awards. The current prototype supports only a single user at a time, so we plan to make it possible for many people to participate in designing multiple fireworks together. In future work, we would like to explore deep learning approaches to improve gesture recognition and automate the parameter mapping.

Fig. 2. System framework with parameter mapping in the preliminary study and the runtime embodied design interface with controlled parameters.

Fig. 3. Skeleton tracked by the depth camera (a) and the defined metrics (b).

Fig. 5. Video clips of firework animations used in the preliminary study.

Fig. 6. Motion trajectories tracked in the preliminary study, corresponding to the firework animations (a) to (r).

Fig. 8. Gesture controls for color selection (a1, a2) and transition between single and multiple fireworks modes (b1, b2) in the proposed design user interface.

Fig. 9. Experiment setup (a) and participant performances (b) in our user study.

Fig. 11. Results of the objective evaluation based on calorie consumption (left) and metabolic rates (right) from our user study.