Article

3D Hand Motion Generation for VR Interactions Using a Haptic Data Glove

by Sang-Woo Seo 1, Woo-Sug Jung 1 and Yejin Kim 2,*

1 Defense and Safety Convergence Research Division, Electronics and Telecommunications Research Institute, Daejeon 34129, Republic of Korea
2 School of Games, Hongik University, Sejong 30016, Republic of Korea
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2024, 8(7), 62; https://doi.org/10.3390/mti8070062
Submission received: 10 May 2024 / Revised: 14 June 2024 / Accepted: 28 June 2024 / Published: 15 July 2024
(This article belongs to the Special Issue 3D User Interfaces and Virtual Reality)

Abstract

Recently, VR-based training applications have become popular and promising, as they can simulate real-world situations in a safe, repeatable, and cost-effective way. For immersive simulations, various input devices have been designed and proposed to increase the effectiveness of training. In this study, we developed a novel device that generates 3D hand motion data and provides haptic force feedback for VR interactions. The proposed device can track 3D hand positions in real time using a combination of global position estimation from ultrasonic sensors and hand pose estimation from inertial sensors. For haptic feedback, shape–memory alloy (SMA) actuators were designed to provide kinesthetic forces and efficient power control without overheating. Our device improves upon the shortcomings of existing commercial devices in tracking and haptic capabilities such that it can track global 3D positions and estimate hand poses in a VR space without an external suit or tracker. For better flexibility in handling and feeling physical objects compared to exoskeleton-based devices, we introduced an SMA-based actuator to control haptic forces. Overall, our device was designed and implemented as a lighter and less bulky glove that provides comparable accuracy and performance in generating 3D hand motion data for a VR training application (i.e., the use of a fire extinguisher), as demonstrated in the experimental results.

1. Introduction

In virtual reality (VR), the introduction of an artificial environment that substitutes physical surroundings convincingly is a crucial element of providing a realistic experience to users. Recently, VR-based training has become popular and promising as it can simulate real-world situations in a safe, repeatable, and cost-effective way. With interactive and immersive simulations, various input devices have been designed and proposed to increase the effectiveness of training [1,2]. A comprehensive review of VR training in various application domains is discussed in [3].
Among several VR elements (i.e., visual feedback, free movement, physical interaction, narrative engagement, and others) that increase the immersion of the user experience, the use of physical tools has gained substantial attention in virtual training [4]. For example, using data gloves, a user can interact with virtual objects in a tangible way, allowing precise hand movements. Using haptic feedback, these hand motions feel like real-life ones and make the user fully engaged with the VR environment.
Studies on hand motion tracking or haptic feedback have been actively conducted to improve the effectiveness of user interaction in VR training. A virtual training system using haptic tools allows users to know the feeling, size, and weight of the tools [5]. For example, in a training model for learning the use of a fire extinguisher, a hand-held VR controller was modified by adding tactile sensations like air pressure and vibration [6]. The perceptions of this type of training were further evaluated using a VR application with a physical object [7]. Arora et al. proposed modular tools that facilitate physical operation in a VR environment [8]. Zhu et al. developed haptic tools that are comparable to real ones and demonstrated the effectiveness of tangible interactions [9]. Shigeyama et al. introduced a portable VR controller that can change the mass properties on a two-dimensional plane and provide a realistic tool feel [10]. Similarly, Zenner and Krüger proposed a device that changes the drag and rotational inertia for sensing the surface area [11]. Baek et al. proposed a car-washing system in VR using multiple sensors [12]. In the medical field, Talhan and Jeon conducted a review on the various haptic effects of pneumatic actuation [13].
From design to deployment, VR training applications are developed for specific learning purposes. Thus, previous systems modified hand-held VR controllers or built special tools for user interactions during training. Data gloves are used to track hand motions and provide user interactions with virtual objects in a tangible way [14,15,16,17,18,19]. However, these commercial devices rely on vibrotactile sensation, which only provides the textural feeling of a real object, or a mechanical system which consists of a bulky exoskeleton and motors. Moreover, they require a special suit or an external tracker to track the global positions of hands in a VR space. In contrast to the development of tracking methods with visual feedback, techniques for 3D hand motion generation with haptic force feedback for flexible and precise interactions need further improvement.
In this study, we developed a novel device that generates 3D hand motion data and provides haptic force feedback for VR interactions. The proposed device can track 3D hand positions using a combination of the global position estimation of ultrasonic sensors and the hand pose estimation of inertial sensors in real time. Multichannel ultrasonic sensors in a 2D plane localize hand positions at a relatively low speed without a drift problem, whereas a set of inertial sensors accurately estimates hand positions at a relatively high speed. For haptic feedback, the shape–memory alloy (SMA) actuators were designed to provide kinesthetic forces and efficient power control without an overheating problem. The proposed device improves upon the shortcomings of existing commercial devices by designing and implementing the tracking and haptic capabilities in an integrated device.
The proposed device makes several contributions to the interactions in VR training. First, we have designed and implemented a flexible haptic glove that can track 3D global positions and estimate hand poses in a VR space without using an external suit or tracker. Unlike existing devices, we integrated two different types of sensors (ultrasonic and inertial) into one system for hand position and pose estimation, which overcomes the magnetic disturbance and occlusion problems. Also, the proposed device provides the kinesthetic feeling of hand interactions, such as gripping and releasing a real object (i.e., the use of a fire extinguisher in our experiment), through haptic force feedback. For better flexibility in handling and feeling physical objects compared to exoskeleton-based devices, we introduced an SMA-based actuator to control haptic forces. Overall, the proposed device was designed and implemented as a lighter and less bulky glove while providing comparable accuracy and performance in generating 3D hand motion data for a VR training application.
This paper is organized as follows: In Section 2, we review previous research on the hand motion tracking and haptic feedback of data gloves. Section 3 describes the design and implementation of hand motion tracking and haptic force feedback in the proposed device. We present the experimental results in Section 4 and our conclusions for the proposed device and its limitations in Section 5.

2. Related Works

2.1. Hand Motion Tracking

The hand motion tracking performance of data gloves is a key factor in providing a realistic experience in a VR environment. Flexible strain-sensor-based techniques estimate the entire hand motion from the resistance variation of the sensors as the finger joints bend [17,20,21]. However, it is difficult to track the accurate positions of the finger movements with these sensors. Currently, commercial data gloves use a set of inertial measurement units (IMUs) attached to the target fingers to track the motion of individual joints in real time [14,15,16,18,19]. O’Flynn et al. proposed an IMU-based glove that minimizes the calibration process and provides data analytics to track joint movement [22]. In these devices, the tracking performance depends on the number of IMUs attached to the glove, which increases the complexity and cost of the device. A simplified version with fewer IMUs is available as an off-the-shelf device [16]; however, it is difficult to recognize accurate hand poses for training purposes. Moreover, these devices rely on an external tracker such as the VIVE tracker [23] to track the global positions of hands in a 3D space.
Head-mounted displays (HMDs) for VR, such as the HTC VIVE and Meta Quest series, apply vision-based and constellation tracking techniques to locate the 3D positions of their hand-held controllers [24,25]. However, particular attention must be paid to the configuration of the base stations when tracking global positions in a VR environment, because body parts and physical objects often cause occlusion problems [26]: the tracking performance is constrained by the viewing range and degraded by in-between obstacles. Moreover, hand recognition fails if the hand leaves the view of the cameras mounted on an HMD.

2.2. Haptic Feedback

Haptic feedback is another key factor in providing users with an immersive experience in VR training. In VR, this is particularly important when handling a physical proxy for a virtual object [27]. The MANUS Prime Glove provides vibrotactile sensation using flexible sensors and voice coil actuators [17]; however, it only conveys the textural feeling of a real object. The Teslasuit Glove and CyberGlove are haptic devices that adopt a motor-driven technique to provide finger force feedback [18,19]. These devices use rotation sensors to capture hand motion; however, they constrain the degrees of freedom (DOF) in finger movements due to the bulky mechanism mounted on the dorsal hand and fingers. Hinchet et al. proposed a wearable haptic feedback device that uses electrostatic brakes to generate torque forces [28]. However, this device generates a strong magnetic field and is not suitable for use with an IMU-based tracking device. Bouzit et al. developed hand–palm-based force feedback using pneumatic actuators [29]. In their device, an air compressor or a continuous supply of CO2 cans with a regulator is required to generate actuation force feedback, which is bulky and impractical in VR training. Fujimoto et al. introduced a force feedback method using shape–memory alloy (SMA) wires [30]. However, this type of device relies on natural cooling; hence, an additional cooling method should be considered for frequent use. Moreover, a user can sense a kinesthetic effect even when the actuator is not activated, owing to the frictional force of the SMA wires mounted on the fingers. Patterson et al. applied SMA wires to locomote a soft robot and used water to boost the cooling rate [31]. However, applying water to a wearable device like a glove is difficult owing to leakage and maintenance issues.
For these reasons, most previously studied methods of kinesthetic haptic feedback have been impractical because the devices are too large and bulky to wear and carry in a VR environment.

3. Design and Implementations

3.1. Hand Position Estimation

The proposed device tracks hand positions in 3D space using a combination of ultrasound-based position estimation and inertial sensor-based 3D acceleration and pose estimation. However, each sensor type has well-known problems: ultrasound-based devices have high computational costs, IMU sensors are prone to drift, and it is difficult to estimate 3D positions continuously without noise. To address these issues, we combined the two types of sensors in the proposed device, which consists of four ultrasonic source generators, two ultrasonic receiver arrays, two 3D ultrasonic source direction estimators, an ultrasonic sensor-based 3D position estimator, a 9-axis IMU sensor, an attitude and heading reference system (AHRS) processor, and a 3D position estimator, as shown in Figure 1.
Initially, the ultrasonic generators attached to the back of the glove generated ultrasonic waves of 50 kHz with a radiation pattern of ±30°. To minimize the shadow area of ultrasonic reception, four ultrasonic transducers were attached to the generators, such that the two multichannel ultrasonic receiver arrays could receive signals from all directions without restricting the range of the hand rotations. The ultrasonic receiver arrays obtained ultrasonic signals in the time domain at a sampling rate of 2 MHz and sent them to the 3D direction estimators, which estimated the direction between the ultrasonic source generators and receiver arrays using the delay-and-sum beamforming (DSBF) algorithm. Using the source localization method [32], we estimated the 3D position of the ultrasonic sources by calculating the closest intersection point of the two ultrasonic direction vectors obtained from those two estimators, as shown in Figure 2.
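The closest-intersection step can be sketched as follows. This is a minimal illustration of triangulating the ultrasonic source from two DSBF direction estimates, not the paper's implementation; the array positions and direction vectors in the usage example are made-up inputs.

```python
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Estimate the ultrasonic source position as the midpoint of the
    shortest segment between two direction rays, one per receiver array.

    p1, p2: 3D positions of the two ultrasonic receiver arrays.
    d1, d2: source direction vectors estimated by the DSBF algorithm.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:                # near-parallel rays: no unique answer
        t, s = -d / a, 0.0
    else:                                # standard closest-points solution
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
    q1, q2 = p1 + t * d1, p2 + s * d2    # closest point on each ray
    return (q1 + q2) / 2.0

# Illustrative inputs: one array at the origin looking along x,
# the other offset, looking along y.
src = closest_point_between_rays(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                                 np.array([2.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0]))
```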
The inertial sensor attached to the ultrasonic transducers provided 9-axis data: 3-axis acceleration, 3-axis gyro, and 3-axis geomagnetism. These data include gravitational acceleration, which must be removed to measure the accurate acceleration of hand movements, $a_R$, from the inertial sensor, as follows:

$$a_R = R(a_s - g),$$ (1)

where $a_s$ is the acceleration measured by the inertial sensor, $g$ is the gravitational acceleration, and $R$ is the rotation matrix of the inertial sensor, estimated using the AHRS algorithm [33]. Thus, the AHRS processor in the proposed device filters out the gravitational component from the acceleration data measured by the inertial sensor on the glove.
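Equation (1) can be illustrated with a short gravity-compensation routine. The sign convention (a static, level accelerometer reading +9.81 m/s² on its z axis) is an assumption for this sketch, not something stated in the paper:

```python
import numpy as np

def motion_acceleration(R, a_s):
    """Gravity compensation per Equation (1): a_R = R(a_s - g), with gravity
    expressed in the sensor frame via the AHRS rotation estimate.

    R   : 3x3 rotation matrix from sensor frame to world frame (AHRS output).
    a_s : raw 3-axis specific force measured by the accelerometer.
    Assumed convention: a static, level sensor reads +9.81 m/s^2 on z.
    """
    g_world = np.array([0.0, 0.0, 9.81])
    g_sensor = R.T @ g_world            # gravity rotated into the sensor frame
    return R @ (a_s - g_sensor)         # pure motion acceleration, world frame
```

A stationary sensor then yields zero motion acceleration regardless of its orientation, which is exactly the property the AHRS processor exploits.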
The inertial sensor-based 3D position was estimated by integrating Equation (1) twice; however, the noise errors caused by drift must be corrected during position estimation. Assuming a linear system for the state-space model of the Kalman filter, the state prediction model is defined as follows:

$$\bar{x}_t = A x_{t-1} + B a_t,$$ (2)

where $t$ is the time index, $a_t$ is $a_R$ at time $t$, $\bar{x}_t$ is the predicted state vector, $x_t$ is the state vector, $A = \begin{bmatrix} 1 & dt \\ 0 & 1 \end{bmatrix}$, and $B = \begin{bmatrix} dt^2/2 \\ dt \end{bmatrix}$. During the correction step, $\bar{x}_t$ and the Kalman gain matrix, $K_t$, are used to update $x_t$ as follows:

$$x_t = \bar{x}_t + K_t (z_t - H \bar{x}_t), \quad K_t = \bar{P}_t H^{\top} (H \bar{P}_t H^{\top} + R)^{-1}, \quad P_t = (I - K_t H) \bar{P}_t, \quad \bar{P}_t = A P_{t-1} A^{\top} + Q,$$ (3)

where $z_t$ is the 3D position and velocity estimated by the ultrasonic estimator, $H$ is the measurement matrix, $P_t$ is the posterior covariance matrix, $\bar{P}_t$ is the predicted covariance matrix, $Q$ is the covariance matrix for the IMU sensors, and $R$ is the covariance matrix for the ultrasonic estimators. We defined the initial state variables at $t = 0$ and used $\bar{x}_t$ and $P_t$ as inputs to estimate the predicted state at $t + 1$. In the proposed system, $\sigma^2$ in $Q$ and $R$ was set to 0.01 and 1.0, respectively [34].
The 3D hand positions were roughly estimated at a relatively low rate (10 Hz) using the DSBF algorithm on the ultrasonic receiver arrays in a 2D plane and accurately estimated at a high rate (187.5 Hz) using the data from the AHRS processor. It is noteworthy that the inertial sensor-based AHRS processor and the ultrasound-based localization module operated independently during position estimation. Finally, the 3D position estimator fused the 3D acceleration and rotation data from the inertial sensor with the 3D position and velocity data from the ultrasonic receiver arrays using the Kalman filter.
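The fusion described above can be sketched as a per-axis Kalman filter. The structure follows Equations (2) and (3) and the stated noise settings, while details such as observing both position and velocity through $H = I$ are assumptions made for this illustration:

```python
import numpy as np

class AxisKalman:
    """Per-axis position/velocity Kalman filter fusing IMU acceleration
    (prediction at 187.5 Hz) with ultrasonic position/velocity
    (correction at 10 Hz). Sketch of Equations (2)-(3); noise settings
    follow the paper (sigma^2 = 0.01 for Q, 1.0 for R)."""

    def __init__(self, dt=1.0 / 187.5):
        self.x = np.zeros(2)                        # state [position, velocity]
        self.P = np.eye(2)
        self.A = np.array([[1.0, dt], [0.0, 1.0]])  # constant-acceleration model
        self.B = np.array([dt * dt / 2.0, dt])
        self.H = np.eye(2)                          # assume z = [pos, vel]
        self.Q = 0.01 * np.eye(2)                   # IMU process noise
        self.R = 1.0 * np.eye(2)                    # ultrasonic measurement noise

    def predict(self, a):
        """High-rate step driven by gravity-compensated acceleration a (Eq. 2)."""
        self.x = self.A @ self.x + self.B * a
        self.P = self.A @ self.P @ self.A.T + self.Q
        return self.x

    def correct(self, z):
        """Low-rate step with ultrasonic measurement z = [pos, vel] (Eq. 3)."""
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x
```

In practice one filter instance runs per spatial axis, with roughly 18 IMU-driven predictions between consecutive ultrasonic corrections (187.5 Hz versus 10 Hz).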

3.2. Finger Rotation Estimation

To obtain the rotation data of the finger joints, a set of 12 IMU sensors was placed on the glove, as shown in Figure 3. We omitted five sensors on the proximal interphalangeal (PIP) joints to reduce the complexity of the device. The rotations of the PIP joints, $q_{PIP}$, were instead estimated by linear interpolation (LERP) between the distal interphalangeal (DIP) and metacarpophalangeal (MCP) joint rotations as follows:

$$q_{PIP} = \frac{LERP(q_{MCP}, q_{DIP}, \alpha)}{\left\| LERP(q_{MCP}, q_{DIP}, \alpha) \right\|}, \quad LERP(q_{MCP}, q_{DIP}, \alpha) = \begin{cases} q_{MCP}(1-\alpha) + q_{DIP}\,\alpha, & \text{if } q_{MCP} \cdot q_{DIP} \ge 0 \\ q_{MCP}(1-\alpha) - q_{DIP}\,\alpha, & \text{if } q_{MCP} \cdot q_{DIP} < 0 \end{cases},$$ (4)

where $\alpha$ is the normalized position on the straight line between $q_{MCP}$ and $q_{DIP}$ in the quaternion space. We set $\alpha$ to 0.4–0.6 based on the anatomical model of hand motion [35,36].
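Equation (4) translates directly into code. A small sketch, assuming quaternions stored as (w, x, y, z) NumPy arrays; the sign flip on a negative dot product keeps the interpolation on the shorter arc:

```python
import numpy as np

def pip_rotation(q_mcp, q_dip, alpha=0.5):
    """Estimate the PIP joint rotation by normalized LERP between the
    MCP and DIP quaternions (Equation 4). Quaternions are (w, x, y, z)
    arrays; alpha is typically 0.4-0.6 per the anatomical hand model."""
    if np.dot(q_mcp, q_dip) >= 0.0:
        q = (1.0 - alpha) * q_mcp + alpha * q_dip
    else:                                   # opposite hemispheres: flip sign
        q = (1.0 - alpha) * q_mcp - alpha * q_dip
    return q / np.linalg.norm(q)            # renormalize to a unit quaternion

# Illustrative joints: identity at MCP, 90 deg flexion about x at DIP.
q_mcp = np.array([1.0, 0.0, 0.0, 0.0])
q_dip = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])
q_pip = pip_rotation(q_mcp, q_dip, 0.5)     # midpoint: 45 deg flexion
```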

3.3. Haptic Force Feedback

In the proposed device, we introduced haptic force feedback using SMA wires that are lightweight to wear and compatible with the hand motion estimation described in the previous section. Unlike motor-based kinesthetic feedback, a major challenge in SMA-based devices is the low cooling rate of the heated SMA when kinesthetic senses are transmitted to users. Generally, seconds to several tens of seconds are required to cool down the SMA-based devices, depending on the activation temperature, diameter, and surrounding conditions. We adopted a flexible thermoelectric device [37] and thermal grease with a high thermal conductivity to resolve the cooling problem.
The limited motion direction is another problem with SMA-based devices. Because an SMA contracts in only one direction when heated, it cannot generate a negative force for kinesthetic feedback. Thus, the proposed device places SMA actuators on both the dorsal and palmar sides of the hand to provide bidirectional feedback. Figure 4 shows an overview of the haptic force feedback used in the VR interactions.

3.3.1. SMA Actuator

We deformed an SMA wire (Dynalloy 70 °C Flexinol®, 375 µm diameter) into an actuator spring, as shown in Figure 5. To memorize the initial shape of the straight SMA wire in the form of a spring, it was placed in a furnace at 475 °C for one hour. To obtain a mathematical model of the pull force as a function of the supplied power, the tensile force was measured using a weight scale with the SMA spring attached to the wooden hand of a mannequin, as shown in Figure 6. Actuation of the stretched spring started at approximately 0.3 W. When a power of 3 W was applied, the measured pulling force reached approximately 1 kg. When more than 3 W was applied, the wire overheated and suffered a permanent strain, after which it could not return to its initial spring shape. To provide precise force feedback, we estimated a polynomial fit from the measured data as follows:
$$F = (6.3 \times 10^{-9}) w^3 + (6.2 \times 10^{-6}) w^2 + 0.0027\,w + 0.0814,$$ (5)

where $F$ is the pulling force and $w$ is the electric power.
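The calibration step can be sketched as a cubic fit plus inversion. The (power, force) samples below are illustrative placeholders, not the measured data of Figure 6, so the fitted coefficients will differ from Equation (5):

```python
import numpy as np

# Illustrative (power in W, force in kgf) samples standing in for the
# measured calibration data; actuation starts near 0.3 W and reaches
# about 1 kgf at the 3 W limit, as described in the text.
power = np.array([0.3, 1.0, 1.5, 2.0, 2.5, 3.0])
force = np.array([0.02, 0.20, 0.38, 0.58, 0.80, 1.00])

# Fit a cubic force model F(w), analogous to Equation (5).
coeffs = np.polyfit(power, force, 3)
F = np.poly1d(coeffs)

def power_for_force(target, lo=0.3, hi=3.0):
    """Invert the model: find a supply power yielding the target force.
    Roots outside the calibrated 0.3-3.0 W range are rejected, since
    above 3 W the spring overheats and strains permanently."""
    roots = (F - target).roots
    real = [r.real for r in roots if abs(r.imag) < 1e-9 and lo <= r.real <= hi]
    return min(real) if real else None
```

Keeping the inversion bounded to the calibrated range is what lets the controller translate a desired feedback force into a safe power command.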

3.3.2. SMA Cooling

As noted in Section 2, the slow cooling rate is a critical problem for the use of SMA materials in haptic devices that generate force feedback frequently. In our experiment, several seconds were required for the SMA spring to cool below its actuation temperature in static air. To expedite cooling, we applied a flexible thermoelectric pad based on the Peltier effect, as shown in Figure 7. Because the spring and thermoelectric pad are electrically conductive, the spring was wrapped with a silicone tube to avoid short circuits. To transfer the low temperature of the thermoelectric pad to the spring effectively, a hardened thermal grease with a thermal conductivity of 8.5 W/m·K was applied between the insulating-film-covered pad and the copper foam, and a permanently uncured silicone grease with a thermal conductivity of 4.8 W/m·K was applied around the spring inside the tube. Moreover, several holes were drilled into the silicone tube to improve heat transfer. When an electric current flows through a thermoelectric device, one side becomes cool while the other becomes hot; if the hot side is not cooled quickly, its heat reaches the cold side, heating both sides of the device. To solve this problem, heat sinks were attached to the hot side. Finally, the cooling device was closed with a silicone cap to prevent the injected grease from leaking.

4. Experimental Results

The proposed device is best understood through examples of its use, as described in the following subsections. The accompanying video can be found in the Supplementary Materials.

4.1. Hand Motion Generation

Figure 8 shows the prototype of the data glove for 3D hand motion generation. The proposed device was implemented in two main parts: the ultrasonic source position estimator and the hand pose estimator using a set of inertial sensors. We used Hagisonic HG-(M/L)40(T/R)B for the ultrasonic transducers [38], Knowles SPU0410LR5H-1 for ultrasonic receivers [39], and Linear Technology AD7386BCPZ-RL for the analog-to-digital (AD) converter [40], which was sampled at a rate of 2 MHz. We adopted our previous system for the DSBF analyzer and the communication component [32].
The hand pose estimation part consisted of 12 IMU sensors, micro-processing units (MPUs), an ultrasonic source generator, and a sensor data acquisition and communication module. We used InvenSense ICM-20948 for the IMU sensors [41], which acquired 9-axis data at 187.5 Hz, and STMicroelectronics STM32L432 for the MPUs [42], which processed the AHRS algorithm via serial peripheral interface (SPI) communication. An STMicroelectronics STM32F767 processor acquired 12 quaternions from small MPUs for the finger rotations and transmitted data to the PC via a Silicon Labs WGM110 Wi-Fi communication module [43].
Figure 9 compares the hand position tracking performance of the proposed device and the VIVE tracker [23]. Both devices were tracked over approximately 30 m in an indoor room with dimensions of 4 m × 6 m × 2.5 m (width × length × height). The total distance offset of the proposed device relative to the tracker was less than 5 cm, with no drift. Furthermore, Table 1 compares the finger rotations in various hand poses between the proposed device and a commercial glove [17], using a marker-based motion capture system [44] as ground truth. In this comparison, a total of 1000 joint rotations were collected at 120 Hz from the hand poses. As shown in the table, the angular differences of the MCP, PIP, and DIP joints for the proposed device were smaller than those of the commercial one (0.51–1.24° versus 2.27–2.82°, 1.46–2.60° versus 2.07–2.78°, and 0.95–2.69° versus 2.3–3.44°, respectively), even though the proposed device estimated the PIP joints by interpolation. For the third and fourth poses in Table 1, the commercial device showed larger differences because it relies on flexible sensors to estimate finger rotations, which are less precise than the multiple inertial sensors used in the proposed device.

4.2. Haptic Feedback with Cooling Rate

To obtain the desired amount of force feedback and a constant cooling temperature from the thermoelectric pads, constant power was controlled by a temperature sensor and supplied to each device. As shown in Figure 10, the SMA springs and thermoelectric pads operated at the desired current and voltage using an 8-bit digital-to-analog (DA) converter (Texas Instruments DAC5311) [45], a low-voltage operational amplifier (OP-Amp) as a comparator (Texas Instruments LMV321) [45], and MOSFETs (Infineon Technologies IPP026N10NF2S) [46].
The desired amount of current, $I_D$, which traverses the SMA springs, MOSFET1, and a shunt resistor $R_{Shunt}$, was controlled by adjusting the voltage at the negative input of the OP-Amp. The voltage for $I_D$ was estimated as follows:

$$V_{Shunt} = I_D \times R_{Shunt},$$ (6)

where $R_{Shunt}$ is 200 mΩ. The digital input value of the DAC that supplies $V_{Shunt}$ to the negative input of the OP-Amp was estimated as follows:

$$DAC = \frac{V_{Shunt}}{V_{In}} \times (2^n - 1),$$ (7)

where the system input voltage $V_{In}$ was 3.3 V and $n$ is the bit resolution of the DAC. Here, MOSFET1 was used to supply stable power to the SMA springs, whereas R2, R3, and MOSFET2 were used to turn the power of the module on and off [46].
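Equations (6) and (7) combine into a short routine for computing the DAC code that sets a desired SMA current. The clamping to the 8-bit range is an added safeguard for this sketch, not a step described in the paper:

```python
def dac_code(i_d_amps, r_shunt_ohms=0.2, v_in=3.3, n_bits=8):
    """DAC input for a desired SMA current (Equations 6-7):
    V_shunt = I_D * R_shunt, then code = V_shunt / V_in * (2^n - 1).
    Defaults follow the paper: 200 mOhm shunt, 3.3 V input, 8-bit DAC."""
    v_shunt = i_d_amps * r_shunt_ohms          # Eq. (6)
    code = round(v_shunt / v_in * (2 ** n_bits - 1))  # Eq. (7)
    return max(0, min(code, 2 ** n_bits - 1))  # clamp to the DAC range
```

For example, a 1 A target drops 0.2 V across the shunt, which maps to code 15 of 255 at the 3.3 V full scale.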
Figure 11 shows the prototype of the proposed device and its controller, which had five channels (two for the SMA springs on the index finger, two for those on the middle finger, and one for the thermoelectric pad). All the power control modules received current values through SPI communication from the STMicroelectronics STM32F070 microprocessor [42].
Figure 12 compares the cooling rates of static cooling and the proposed method. In static air, the spring took approximately 4 s to cool below the actuation temperature (70 °C), whereas the proposed method decreased the temperature within 1 s. It is noteworthy that the proposed device maintained a temperature of 15 °C at a low power of 3 W. Table 2 summarizes the physical comparison between the proposed device and the commercial gloves [14,15,16,17,18,19]. The proposed device is lighter and smaller in height than the commercial gloves with force feedback [14,18,19], which are bulkier owing to their motor-driven structures.

4.3. User Test

We applied the proposed device to a VR training application for the use of a fire extinguisher, as shown in Figure 13. In this training, a user extinguished fires at several spots in a VR environment using an emptied real fire extinguisher. During the exercise, various kinds of kinesthetic feedback were provided through the handle and nozzle of the extinguisher. For example, the reaction force parameters were estimated from the collision characteristics between the glove and the handle or nozzle, and then transmitted to the haptic force feedback controller via Wi-Fi. The controller applied a current to the SMA springs to generate a force corresponding to the parameter input. When the user squeezed or released the handle, the controller increased or reduced the current flowing through the SMA springs, as shown in Figure 4. The controller generated force feedback similarly for the nozzle vibration (i.e., a vibration caused by the spraying pressure) and reaction.
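The force-to-current conversion in this feedback loop might be sketched as follows. Both the inverse force model callable and the spring resistance value are illustrative assumptions, not values reported in the paper:

```python
import math

def sma_current_for_force(target_force_kgf, power_for_force, r_sma_ohms=8.0):
    """Convert a collision-derived target force to an SMA drive current.

    power_for_force: callable inverting the force model of Equation (5),
    returning a supply power in watts (or None if unreachable).
    The 8-ohm spring resistance is a hypothetical placeholder; the current
    then follows from P = I^2 * R.
    """
    w = power_for_force(target_force_kgf)
    if w is None:                       # force outside the actuator's range
        return 0.0
    return math.sqrt(w / r_sma_ohms)    # I = sqrt(P / R)
```

The controller would then feed this current target to the per-channel power module via the DAC described in Section 4.2.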
For the user test, 12 participants (6 male, 6 female, aged 25–35) who had not previously experienced the haptic glove application took part in the training. Prior to the training, they were instructed on how to use a real fire extinguisher outdoors and then repeated the same procedure in the virtual environment. Each participant conducted the virtual exercise over several sessions lasting approximately 30 min in total. Figure 13 shows the survey results evaluating the force feedback and VR immersion against the real extinguisher on a radar chart. In this survey, responses on a five-point Likert scale were averaged, where a larger number indicates higher similarity. Overall, the participants felt that the force feedback from the handle (i.e., squeeze and release) was more similar to the real extinguisher's than that from the nozzle (i.e., vibration and reaction), mainly because excessive force feedback was generated for the nozzle reactions compared to the real extinguisher. In addition, the participants felt the squeezing force was more similar to the real extinguisher's than the releasing force. Our device recognized the user's hand poses and controlled the SMA actuators precisely to generate the right amount of squeezing force; however, a residual force acting on the springs made the releasing force feel less natural. All participants felt the overall training was highly immersive.

5. Conclusions

In this study, we introduced a novel device that can generate hand motion data and provide haptic force feedback for VR interactions. The proposed device used a combination of ultrasonic and inertial sensors and tracked 3D hand positions and poses with high accuracy without using an external tracker. By applying efficient power control to the actuators, the proposed device provided kinesthetic feedback without an overheating problem. Compared to existing commercial gloves, the overall design of the proposed device is lighter and less bulky, so it is suitable for use in various VR training applications, as shown in the user test for the virtual training of handling a fire extinguisher. Synchronized with the visual feedback of HMDs, this immersive haptic experience will be helpful in responding to fires ranging from small scales in houses and apartments to large scales in urban infrastructure facilities (i.e., underground utility tunnels and factories) in a safe, repeatable, and cost-effective way. For other applications, our device can be applied to training situations that require the kinesthetic feeling of hand interactions using a tool in medical, manufacturing, and sporting simulations.
There are several ongoing improvements to the proposed device. The current design consists of various sensors and control modules attached to the back of the hand, which are not resistant to collision and shock, particularly in intensive training situations. With SMA-based actuators for haptic force feedback, a user can feel a residual force acting on an unactuated spring even when the springs are completely cooled down. This can interfere with the user's natural finger movement and with the force feedback of the SMA being actuated. Currently, we are looking for a substitute material that completely removes this force when the wire is bent in a non-driven state.

Supplementary Materials

The following supporting information can be downloaded. The experiment results can be seen on the video located at https://drive.google.com/file/d/16jKbycoB1u1D3alSV4snTid9e_9ZgxUL/view?usp=sharing (accessed on 1 May 2024).

Author Contributions

Conceptualization, S.-W.S., W.-S.J. and Y.K.; methodology, S.-W.S., W.-S.J. and Y.K.; software, S.-W.S.; validation, S.-W.S.; formal analysis, S.-W.S., W.-S.J. and Y.K.; investigation, S.-W.S., W.-S.J. and Y.K.; resources, S.-W.S.; data curation, S.-W.S.; writing—original draft preparation, S.-W.S. and Y.K.; writing—review and editing, S.-W.S. and Y.K.; visualization, S.-W.S. and Y.K.; supervision, W.-S.J. and Y.K.; project administration, W.-S.J. and Y.K.; funding acquisition, W.-S.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT, MOIS, MOLIT, and MOTIE) (No. 2020-0-00061, Development of integrated platform technology for fire and disaster management in underground utility tunnel based on digital twin), a grant (2019-MOIS34-001) from Preventive Safety Service Technology Development Program funded by Korean Ministry of Interior and Safety (MOIS), and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1F1A1046513).

Institutional Review Board Statement

Ethical review and approval were waived for this study because the user testing was conducted solely for assessment purposes.

Informed Consent Statement

Written informed consent has been obtained from all subjects involved in the study to publish this paper.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Chao, C.-J.; Wu, S.-Y.; Yau, Y.-J.; Feng, W.-Y.; Tseng, F.-Y. Effects of Three-Dimensional Virtual Reality and Traditional Training Methods on Mental Workload and Training Performance. Hum. Factors Ergon. Manuf. Serv. Ind. 2017, 27, 187–196.
  2. Tan, Y.; Xu, W.; Li, S.; Chen, K. Augmented and Virtual Reality (AR/VR) for Education and Training in the AEC Industry: A Systematic Review of Research and Applications. Buildings 2022, 12, 1529.
  3. Xie, B.; Liu, H.; Alghofaili, R.; Zhang, Y.; Jiang, Y.; Lobo, F.D.; Li, C.; Li, W.; Huang, H.; Akdere, M.; et al. A Review on Virtual Reality Skill Training Applications. Front. Virtual Real. 2021, 2, 645153.
  4. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A Systematic Review of Immersive Virtual Reality Applications for Higher Education: Design Elements, Lessons Learned, and Research Agenda. Comput. Educ. 2020, 147, 103778.
  5. Loch, F.; Ziegler, U.; Vogel-Heuser, B. Integrating Haptic Interaction into a Virtual Training System for Manual Procedures in Industrial Environments. IFAC-PapersOnLine 2018, 51, 60–65.
  6. Seo, S.-W.; Kwon, S.; Hassan, W.; Talhan, A.; Jeon, S. Interactive Virtual-Reality Fire Extinguisher with Haptic Feedback. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Parramatta, NSW, Australia, 12–15 November 2019; pp. 1–2.
  7. Saghafian, M.; Laumann, K.; Akhtar, R.S.; Skogstad, M.R. The Evaluation of Virtual Reality Fire Extinguisher Training. Front. Psychol. 2020, 11, 3137.
  8. Arora, J.; Saini, A.; Mehra, N.; Jain, V.; Shrey, S.; Parnami, A. VirtualBricks: Exploring a Scalable, Modular Toolkit for Enabling Physical Manipulation in VR. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12.
  9. Zhu, K.; Chen, T.; Han, F.; Wu, Y.-S. HapTwist: Creating Interactive Haptic Proxies in Virtual Reality Using Low-Cost Twistable Artefacts. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–13.
  10. Shigeyama, J.; Hashimoto, T.; Yoshida, S.; Narumi, T.; Tanikawa, T.; Hirose, M. Transcalibur: A Weight Shifting Virtual Reality Controller for 2D Shape Rendering Based on Computational Perception Model. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–11.
  11. Zenner, A.; Krüger, A. Drag:on: A Virtual Reality Controller Providing Haptic Feedback Based on Drag and Weight Shift. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12.
  12. Baek, S.; Gil, Y.-H.; Kim, Y. VR-Based Job Training System Using Tangible Interactions. Sensors 2021, 21, 6794.
  13. Talhan, A.; Jeon, S. Pneumatic Actuation in Haptic-Enabled Medical Simulators: A Review. IEEE Access 2017, 6, 3184–3200.
  14. VRgluv. Available online: https://kickstarter.com (accessed on 1 May 2024).
  15. BeBop Data Glove. Available online: https://www.youtube.com/@BeBopSensors/featured (accessed on 1 May 2024).
  16. Home Hi5 VR Glove. Available online: https://hi5vrglove.com/ (accessed on 1 May 2024).
  17. MANUS Prime Glove. Available online: https://www.manus-meta.com (accessed on 1 May 2024).
  18. Teslasuit Glove. Available online: https://teslasuit.io (accessed on 1 May 2024).
  19. CyberGlove Systems. Available online: http://www.cyberglovesystems.com (accessed on 1 May 2024).
  20. Shen, Z.; Yi, J.; Li, X.; Mark, L.H.P.; Hu, Y.; Wang, Z. A Soft Stretchable Bending Sensor and Data Glove Applications. In Proceedings of the 2016 IEEE International Conference on Real-Time Computing and Robotics (RCAR), Angkor Wat, Cambodia, 6–10 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 88–93.
  21. Atalay, A.; Sanchez, V.; Atalay, O.; Vogt, D.M.; Haufe, F.; Wood, R.J.; Walsh, C.J. Batch Fabrication of Customizable Silicone-Textile Composite Capacitive Strain Sensors for Human Motion Tracking. Adv. Mater. Technol. 2017, 2, 1700136.
  22. O’Flynn, B.; Sanchez, T.; Connolly, J.; Curran, K.; Gardiner, P.; Ireland, N.; Downes, B. Integrated Smart Glove for Hand Motion Monitoring. In Proceedings of SENSORDEVICES 2015, Venice, Italy, 23 August 2015; pp. 45–50.
  23. VIVE Tracker. Available online: https://www.vive.com/kr/accessory/tracker3/ (accessed on 1 May 2024).
  24. HTC VIVE. Available online: https://www.vive.com (accessed on 1 May 2024).
  25. Meta Quest. Available online: https://www.meta.com (accessed on 1 May 2024).
  26. Kuhlmann de Canaviri, L.; Meiszl, K.; Hussein, V.; Abbassi, P.; Mirraziroudsari, S.D.; Hake, L.; Potthast, T.; Ratert, F.; Schulten, T.; Silberbach, M.; et al. Static and Dynamic Accuracy and Occlusion Robustness of SteamVR Tracking 2.0 in Multi-Base Station Setups. Sensors 2023, 23, 725.
  27. Simeone, L.; Velloso, E.; Gellersen, H. Substitutional Reality: Using the Physical Environment to Design Virtual Reality Experiences. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015; pp. 3307–3316.
  28. Hinchet, R.; Vechev, V.; Shea, H.; Hilliges, O. DextrES: Wearable Haptic Feedback for Grasping in VR via a Thin Form-Factor Electrostatic Brake. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 11 October 2018; ACM: New York, NY, USA, 2018; pp. 901–912.
  29. Bouzit, M.; Popescu, G.; Burdea, G.; Boian, R. The Rutgers Master II-ND Force Feedback Glove. In Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS 2002), Orlando, FL, USA, 24–25 March 2002; IEEE Computer Society: Piscataway, NJ, USA, 2002; pp. 145–152.
  30. Fujimoto, K.; Kobayashi, F.; Nakamoto, H.; Kojima, F. Development of Haptic Device for Five-Fingered Robot Hand Teleoperation. In Proceedings of the 2013 IEEE/SICE International Symposium on System Integration, Kobe, Japan, 15–17 December 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 820–825.
  31. Patterson, Z.J.; Sabelhaus, A.P.; Chin, K.; Hellebrekers, T.; Majidi, C. An Untethered Brittle Star-Inspired Soft Robot for Closed-Loop Underwater Locomotion. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 8758–8764.
  32. Seo, S.-W.; Yun, S.; Kim, M.-G.; Sung, M.; Kim, Y. Screen-Based Sports Simulation Using Acoustic Source Localization. Appl. Sci. 2019, 9, 2970.
  33. Madgwick, S.O.H.; Harrison, A.J.L.; Vaidyanathan, R. Estimation of IMU and MARG Orientation Using a Gradient Descent Algorithm. In Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 1–7.
  34. Hong, S.; Kim, Y. Dynamic Pose Estimation Using Multiple RGB-D Cameras. Sensors 2018, 18, 3865.
  35. Hume, M.C.; Gellman, H.; McKellop, H.; Brumfield, R.H. Functional Range of Motion of the Joints of the Hand. J. Hand Surg. 1990, 15, 240–243.
  36. Bullock, I.M.; Borras, J.; Dollar, A.M. Assessing Assumptions in Kinematic Hand Models: A Review. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 139–146.
  37. TEGWAY. Available online: https://tegway.cafe24.com/tegway (accessed on 1 May 2024).
  38. Hagisonic. Available online: http://hagisonic.com (accessed on 1 May 2024).
  39. Knowles. Available online: http://knowles.com (accessed on 1 May 2024).
  40. Analog Devices. Available online: http://analog.com (accessed on 1 May 2024).
  41. TDK. Available online: http://invensense.tdk.com (accessed on 1 May 2024).
  42. STMicroelectronics. Available online: http://st.com (accessed on 1 May 2024).
  43. Silicon Labs. Available online: http://silabs.com (accessed on 1 May 2024).
  44. OptiTrack. Available online: http://optitrack.com (accessed on 1 May 2024).
  45. Texas Instruments. Available online: http://ti.com (accessed on 1 May 2024).
  46. Infineon Technologies. Available online: https://www.infineon.com (accessed on 1 May 2024).
Figure 1. Overview of 3D hand position estimation using ultrasonic and inertial sensors.
Figure 2. The closest point, Pc, to the intersections, PE1 and PE2, of the two ultrasonic direction vectors, v1 and v2, which are obtained from the source localization methods [32], is estimated as the 3D location of the ultrasonic waves.
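As a sketch of the estimation described in Figure 2, the source position can be taken as the midpoint of the shortest segment between the two ultrasonic direction rays. The helper below is illustrative (the names P1, v1, etc., are assumptions, not the paper's implementation) and uses the standard closed-form solution for the closest points of two non-parallel 3D lines.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def closest_point(P1, v1, P2, v2):
    """Midpoint Pc of the shortest segment between lines P1 + t*v1 and P2 + s*v2."""
    w0 = [p - q for p, q in zip(P1, P2)]
    a, b, c = dot(v1, v1), dot(v1, v2), dot(v2, v2)
    d, e = dot(v1, w0), dot(v2, w0)
    denom = a * c - b * b          # zero only when the two rays are parallel
    t = (b * e - c * d) / denom    # parameter of PE1 on ray 1
    s = (a * e - b * d) / denom    # parameter of PE2 on ray 2
    PE1 = [p + t * x for p, x in zip(P1, v1)]
    PE2 = [p + s * x for p, x in zip(P2, v2)]
    return [(p + q) / 2 for p, q in zip(PE1, PE2)]

# Example with two skew rays: the closest points are (0,0,0) and (0,0,1),
# so the estimated source location is their midpoint.
print(closest_point([0, 0, 0], [1, 0, 0], [0, 1, 1], [0, 1, 0]))  # [0.0, 0.0, 0.5]
```

Averaging PE1 and PE2 rather than picking either intersection hedges against measurement noise, since the two rays rarely intersect exactly.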
Figure 3. The placements of the IMU sensors (blue) on a hand: the distal phalange (DP) bones, the distal interphalangeal (DIP) joints, the medial phalange (MP) bones, the proximal interphalangeal (PIP) joints, the proximal phalange (PP) bones, the metacarpophalangeal (MCP) joints, and the intermetacarpal (IMC) joints.
Figure 4. Overview of the haptic force feedback using SMA wires (springs) for VR interactions.
Figure 5. SMA actuator: (a) the SMA wire and (b) the deformed wire in the form of a spring (a diameter of 5 mm and a helical pitch of 0.5 mm).
Figure 6. SMA actuation forces: (a) the measurement setup, (b) the pulling forces with the powers supplied to the spring from 0 to 3.2 W, and (c) the temperature variation of the pulling forces.
Figure 7. SMA cooling: (a) the thermoelectric pad with a size of 25 × 80 mm and (b) overall cooling structure.
Figure 8. Prototype of the implemented device for 3D hand motion generation.
Figure 9. Comparison of tracking hand positions in 3D space between the proposed device (green) and the VIVE tracker (red).
Figure 10. Circuit design for the power control to the SMA springs and thermoelectric pad.
Figure 11. Prototype of the implemented device for haptic force feedback.
Figure 12. Comparison of the SMA cooling rate between the static air and proposed method.
Figure 13. Training the use of a fire extinguisher using the haptic glove: (a) VR application and (b) radar chart (force feedback for squeezing and releasing the handle, nozzle vibration and reaction, and VR immersion) surveyed from user experiences.
Table 1. Comparison of finger rotations: Average of angular differences (in degrees) was compared for each hand pose between the proposed device and commercial glove [17] using the marker-based motion capture system [44].
Hand Pose (pose images Mti 08 00062 i001–i006 omitted)

| Finger | Joint | Pose 1: MANUS | Ours | Pose 2: MANUS | Ours | Pose 3: MANUS | Ours | Pose 4: MANUS | Ours |
|--------|-------|------|------|------|------|------|------|------|------|
| Thumb  | MCP | 0.74 | 1.63 | 1.85 | 1.64 | 1.18 | 0.79 | 1.02 | 0.67 |
|        | PIP | 1.02 | 0.56 | 0.70 | 0.34 | 0.92 | 2.37 | 1.63 | 0.63 |
|        | DIP | 0.93 | 0.71 | 1.59 | 1.09 | 1.27 | 0.80 | 1.45 | 0.93 |
| Index  | MCP | 3.09 | 1.32 | 3.29 | 1.24 | 3.08 | 0.16 | 3.50 | 1.00 |
|        | PIP | 1.13 | 3.68 | 1.77 | 0.82 | 1.23 | 2.64 | 1.78 | 1.05 |
|        | DIP | 3.31 | 0.09 | 1.28 | 1.18 | 4.68 | 5.65 | 3.57 | 1.75 |
| Middle | MCP | 4.42 | 0.90 | 2.09 | 1.16 | 3.74 | 0.31 | 2.28 | 0.20 |
|        | PIP | 3.57 | 2.25 | 2.63 | 3.42 | 3.54 | 0.12 | 2.29 | 2.15 |
|        | DIP | 1.82 | 1.63 | 1.75 | 0.19 | 1.75 | 1.89 | 3.62 | 0.74 |
| Ring   | MCP | 3.58 | 0.57 | 2.39 | 1.79 | 3.52 | 0.42 | 4.21 | 1.61 |
|        | PIP | 2.63 | 3.15 | 3.93 | 1.59 | 3.22 | 1.21 | 3.44 | 3.79 |
|        | DIP | 2.95 | 0.92 | 3.09 | 1.96 | 4.76 | 3.46 | 3.61 | 3.05 |
| Little | MCP | 2.28 | 1.79 | 1.72 | 0.16 | 2.32 | 0.85 | 2.08 | 0.84 |
|        | PIP | 3.59 | 3.36 | 1.56 | 1.10 | 1.43 | 1.82 | 4.78 | 1.54 |
|        | DIP | 4.54 | 1.71 | 3.81 | 0.31 | 4.76 | 0.26 | 1.40 | 6.97 |
| Mean   | MCP | 2.82 | 1.24 | 2.27 | 1.20 | 2.77 | 0.51 | 2.62 | 0.86 |
|        | PIP | 2.39 | 2.60 | 2.12 | 1.46 | 2.07 | 1.63 | 2.78 | 1.83 |
|        | DIP | 2.71 | 1.01 | 2.30 | 0.95 | 3.44 | 2.41 | 2.73 | 2.69 |
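The "Mean" rows of Table 1 are the per-joint averages of the angular differences over the five fingers. A quick arithmetic check for the pose-1 MANUS column (values transcribed from the table; the dict layout is only illustrative):

```python
# Per-joint angular differences (degrees) for Thumb, Index, Middle, Ring,
# Little in the pose-1 MANUS column of Table 1.
pose1_manus = {
    "MCP": [0.74, 3.09, 4.42, 3.58, 2.28],
    "PIP": [1.02, 1.13, 3.57, 2.63, 3.59],
    "DIP": [0.93, 3.31, 1.82, 2.95, 4.54],
}

# Average over the five fingers, rounded to two decimals as in the table.
means = {joint: round(sum(v) / len(v), 2) for joint, v in pose1_manus.items()}
print(means)  # {'MCP': 2.82, 'PIP': 2.39, 'DIP': 2.71}
```

The results match the reported Mean row (2.82, 2.39, 2.71), confirming how the summary values were computed.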
Table 2. Physical comparison between the proposed device and commercial gloves [14,15,16,17,18,19].
|                | VRgluv [14] | BeBop [15] | Home Hi5 [16] | MANUS [17] | Teslasuit [18] | CyberGlove [19] | Ours |
|----------------|-------------|------------|---------------|------------|----------------|-----------------|------|
| Force feedback | Yes (motor) | No | No | No | Yes (motor) | Yes (motor) | Yes (SMA) |
| Weight 1 (g)   | 450 | 200 | 105 | 134 | 450 | 340 | 200 |
| Size 1 (length × width × height, mm) | 170 × 100 × 50 | 170 × 90 × 8 | 170 × 90 × 5 | 170 × 90 × 5 | 200 × 100 × 60 | 195 × 125 × 70 | 170 × 90 × 20 |

1 Approximately estimated with sensors attached.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Seo, S.-W.; Jung, W.-S.; Kim, Y. 3D Hand Motion Generation for VR Interactions Using a Haptic Data Glove. Multimodal Technol. Interact. 2024, 8, 62. https://doi.org/10.3390/mti8070062
