research-article · Open Access · CHI Conference Proceedings
DOI: 10.1145/3544548.3581172

PumpVR: Rendering the Weight of Objects and Avatars through Liquid Mass Transfer in Virtual Reality

Published: 19 April 2023

Abstract

Perceiving the weight of objects and avatars in Virtual Reality (VR) is important for understanding their properties and interacting with them naturally. However, commercial VR controllers cannot render weight, and the controllers presented in previous work are single-handed, slow, or render only a small mass. In this paper, we present PumpVR, which renders weight by varying the controllers’ mass according to the properties of virtual objects or bodies. Using a bi-directional pump and solenoid valves, the system changes the controllers’ absolute weight by transferring water in or out with an average error of less than 5%. We implemented VR use cases with objects and avatars of different weights to compare the system with standard controllers. A study with 24 participants revealed significantly higher realism and enjoyment when using PumpVR to interact with virtual objects. Using the system to render body weight had significant effects on virtual embodiment, perceived exertion, and self-perceived fitness.

1 Introduction

When we hold or move objects, we sense their weight to understand their physical properties, which in turn affects how we interact with them [22]. Perceiving the weight of our own body through the proprioceptive system, on the other hand, shapes our body awareness [12] and affects motor accuracy [32]. Providing realistic weight sensations is therefore a key challenge for the progress of haptic virtual reality (VR) technology. Moreover, aspects of the game experience in VR can diminish when the weight of the handheld VR controller does not match the visual properties of the object held in the virtual environment [64]. As the handling of objects is inherently linked to their perception, VR simulations should be able to render weight to accurately reproduce real sensorimotor behavior. This is particularly vital for VR training, which relies on the transfer of motor skills acquired in VR to safety-critical tasks in the real world. It is, for example, important to feel the weight of a chainsaw when learning to use it through VR training [65].
In addition to simulating the weight of virtual objects, it also remains a challenge to simulate the body weight of avatars in VR games or therapeutic applications. Research on virtual body perception and its use in therapeutic contexts in particular would benefit from a system for adjusting the body weight perceived in VR. Embodying avatar bodies can elicit behavioral [29], attitudinal [7], or perceptual [27] changes based on the avatar’s appearance. Due to these effects, researchers create embodied VR experiences of avatars of different body weights to support body image-related behavioral therapy [11, 63]. As matching visual characteristics have been shown to enhance virtual embodiment [60], it has been hypothesized that congruent body weight can amplify the illusion as well [59]. Previous research using a short-arm human centrifuge and parabolic flight revealed that the perception of body weight is malleable [12]. Other researchers attached weights to the inside of suits aiming to enhance the sensation of owning an overweight body [18]. However, a system to systematically simulate body weight in VR is still missing.
To perceive weight in the physical world, complex sensory mechanisms respond to the gravitational force that originates from the mass of a lifted physical object. Gravity pulling downwards and the forces applied to resist gravity are the main forces at work in human weight perception [10, 34]. When humans pick up an object, grip and lift forces are increased simultaneously until the object lifts off, after which the grip force is no longer increased [15]. To estimate weight, both kinaesthetic information from stretch receptors and cutaneous information from pressure receptors are taken into account [4, 36, 48]. Pressure receptors are located in the skin and are sensitive to pressure, whereas stretch receptors are located in the muscles and are sensitive to changes in muscle length [48]. The brain also receives information about weight from the vestibular system, which estimates the strength of gravity via the otolith organs to create the sense of balance [12]. Other factors that influence the perception of weight include the material, size, and shape of a lifted object [5].
Lim et al. [34] systematically reviewed the state of the art in weight perception in VR. The authors noted that most current approaches to weight simulation fall into four types of haptic techniques: devices that apply or manipulate forces (e.g., [17]), devices that deform (finger) skin (e.g., [14]), devices that use vibration feedback (e.g., [1]), and devices that shift the center of mass within the controller (e.g., [68]). Additionally, some systems combine two of these techniques [57]. In contrast, there are techniques that do not use physical manipulation at all and rely on visual cues to communicate different weights (pseudo-haptic feedback) [33, 45, 49]. However, for VR users to perceive weight realistically, the gravitational forces to which human senses normally respond need to be exerted [34]. Lim et al. [34] conclude in their review that current approaches have not yet solved the challenge of simulating weight due to limitations regarding accuracy, discriminability of the displayed weight, amount of rendered weight, double-handed interaction, asynchrony, and applicability to different scenarios.
Due to the complexity of the sensory mechanisms involved in weight perception, simulating weight in the absence of real mass requires synthesizing a multitude of sensory inputs. To accurately render weight, it thus seems more effective to adjust the mass outside of VR rather than to simulate the variety of required forces. A method to flexibly change the mass of objects was introduced by Niiyama et al. [41], who proposed the use of liquid mass transfer to adjust the weight of objects. Their system uses a bi-directional pump to transfer liquid in or out of a bladder located in a shell, yielding an object with adjustable mass. However, as they only demonstrated the mechanism at a small scale, it is unclear whether the device can be configured for use in VR. Cheng et al. [6] adapted the concept of liquid mass transfer to implement a haptic feedback device for VR. For their device, GravityCup, the researchers installed pumps in a handheld and in a waist-worn container to transfer water between both components. However, due to the perceptible inertia of the water and a slow pumping speed of 19.62 ml/s, GravityCup’s use cases were limited to simulating containers that can be filled with liquids. Similarly, Wang et al. employed liquid mass transfer in their VR controller extension [61]. Their system combines vibrotactile feedback with changes in the center of mass and changes in absolute mass. The weight of the handheld device could be adjusted using stepper motors actuating syringes. Using this mechanism, they demonstrated weight changes of up to 50 g. However, to be applicable to a wider range of applications, rendering the weight of mid-weight and heavier objects needs to be supported.
In this work, we present PumpVR (Providing Users with Mass Perception in Virtual Reality), a device that builds on the concept of liquid mass transfer to automatically adjust the weight of two handheld VR controllers. The system uses a mobile high-performance water pump to induce distinct levels of weight within a range of 500 g per controller, with the controller’s maximum weight being reached within 3100 ms. In a study, we demonstrate that the device enhances perceived realism and enjoyment in a game that requires players to interact with virtual objects of different weights that they can pull from an inventory. We additionally show how the device can amplify the level of virtual embodiment and perceived exertion while reducing self-perceived fitness when used to render body weight. We implemented two use cases to demonstrate PumpVR’s capability of rendering the weight of virtual objects and avatars. In addition, we propose further use cases that utilize PumpVR’s influence on perceived weight and exertion.
The contribution of this paper is threefold:
(1)
It describes a novel liquid-based weight interface for VR. The system can provide a wider range of weight at a higher speed than previous approaches, and features bi-manual weight feedback.
(2)
We validate the use of our device to simulate object weight in VR and provide evidence that this can enhance the VR experience in terms of realism and enjoyment.
(3)
We empirically demonstrate that weight-changing controllers can also be a tool to strengthen virtual embodiment and show how this affects perceived fitness and exertion.

2 Related Work

In this section, we discuss previous research on providing weight sensations in VR. A broader overview of weight interfaces and weight perception in VR can also be found in the review by Lim et al. [34] and the survey by Ye [67].

2.1 Simulating weight of virtual objects

Most research on rendering the weight of virtual objects has focused on simulating the sensory input that is required to sense an object’s weight. In their review on weight perception in VR, Lim et al. [34] systematically analyzed 65 research papers and classified them into five different types of haptic cues: force, skin deformation, vibration, weight shifting, and others.
Force relates to approaches that exert or manipulate forces to simulate weight. Heo et al. [17], for example, used propeller propulsion to create force feedback. Regarding weight simulations, they demonstrated the gravitational pulls on different planets. However, as the direction of force depends on the orientation of the device, the force to be exerted must be continuously computed, which causes a latency of 300 ms. Hence, rotating virtual objects can cause visual-haptic asynchrony. Also, due to the employment of six motors and propellers, their system produced noise of up to 80.7 dB, exceeding safe noise levels [13].
Skin deformation refers to mimicking the cutaneous pressure of holding weights. For instance, Girard et al. [14] employed DC motors to exert downward pressure on the fingertips, deforming the skin on the fingers to indicate how heavy an object is. These approaches, however, do not simulate forces essential for human weight perception, such as grip or other kinaesthetic forces. Due to these missing cues, weight feedback indicated by skin deformation is rather metaphorical [14].
A different approach is to rely on vibration to display different weights. Amemiya and Maeda [1] systematically varied oscillation patterns of a handheld device, which affected the perceived weight of that object. However, this haptic illusion vanishes for downward movements, or when the device is not held upright.
Other researchers employed mechanisms that shift the center of mass within a handheld device (weight shifting). For instance, Zenner and Krüger [68] developed Shifty, a rod-shaped VR controller that uses a pulley to shift a weight of 127 g along its own axis to provide different weight sensations. As a result of the changed internal weight distribution of the device, the force that users need to exert to manipulate the controller changes, creating the illusion of a different mass. Inherently, the device can also be used to render shifted centers of mass. While these weight-shifting devices can provide weight sensations associated with distinct masses, they are restricted to a small range of weight, as their absolute mass cannot be changed, and installing a heavier moving weight would equally increase the empty weight. Additionally, the generated weight sensation is highly dependent on the type of interaction and object manipulation. Users can, for example, discriminate weights when lifting virtual objects using Shifty, but this effect vanishes when the device is held upright. When users swing the device, they can perceive the object’s center of mass, breaking the illusion of an absolute mass change.
The category others subsumes haptic feedback cues based on electrical muscle stimulation, head rotations, or liquid inertia. Lopes et al. [35] explored the use of electrical muscle stimulation to create tension in the muscles involved in holding an object or touching a wall. However, the system’s effectiveness regarding weight perception and discrimination has not been evaluated. Similarly experimental, Teo et al. [53] stimulated receptors in the vestibular system to force head rotations. The strength of stimulation was increased or decreased based on the weight of the displayed object. As no evaluation was carried out, it is unclear whether this method could successfully communicate weight changes. Regarding liquid inertia, Lim et al. [34] refer to the work by Cheng et al. [6], which we discuss in section 2.2.
In their review, Lim et al. [34] identify inaccuracy, limited distinctiveness and amount of rendered weight, one-handed-only interaction, asynchrony, and lack of adaptability to different scenarios as the main problems of current approaches. Consequently, in line with Ye [67], they regard the challenge of simulating weight without real gravitational forces as unsolved.

2.2 Liquid mass transfer

Niiyama et al. [41] presented a promising method to provide real weight by changing the absolute mass of a proxy through the transfer of liquid metal (a gallium-indium-tin eutectic). Their system uses a bi-directional pump to move liquid in or out of a bladder inside the object. For this, the weight-changing object is connected to a second liquid container via hoses. The researchers provided application scenarios of simulating the density of materials, using different weights to balance a lever, and exhibiting the relation of planet properties in miniature. Thus, the focus of the device was on small scales rather than on meeting the performance requirements of a VR system (achieving the largest possible range of weights within the smallest possible time frame). Wang et al. [61] incorporated liquid metal transfer into their VR controller extension. The prototype combines actuators providing vibrotactile feedback with two liquid metal reservoirs providing gravitational feedback (changes in the center of gravity and changes in absolute mass). The liquid metal is contained in two balloons, each connected to a separate syringe via tubing. The syringes are operated by stepper motors to fill or empty the balloons on demand. In an evaluation, this mechanism was shown to transfer up to 50 g, which is sufficient to emulate shifted centers of mass or to support the visual information that an object has been lifted. However, it limits the accurate rendering of absolute mass changes to very lightweight objects, such as picking up two alkaline AA batteries (46 g) in an assembly line simulation.
Likewise, Cheng et al. [6] developed GravityCup, a system that transfers liquid between two units through hoses. Their approach uses two water pumps that regulate the water level of both units to achieve the weight change. Each pump is located in a water bag, one of which is worn at the waist, while the other is placed in a cup-shaped handheld device. The latter has a capacity of 330 g, which the system can fill in 16.8 seconds, equaling a flow rate of 19.62 ml/s. As the liquid inertia in the cup is noticeable, GravityCup’s use case is limited to inducing the perception of the inertia of virtual liquids or containers. Hence, the researchers demonstrate the system in the VR scenarios of watering a plant, filling a cup of coffee, scooping up dog food, and holding an empty cup during changes in gravity.
Overall, previous work employing liquid mass transfer to render weight has revealed several challenges that need to be tackled [23]. First of all, the delay between the input action and reaching the targeted weight needs to be decreased to increase user experience and realism. With studies repeatedly showing that a latency of several milliseconds is already perceptible [9, 39], it is unrealistic to perform liquid mass transfer without a noticeable delay. However, longer waiting times negatively affect the user experience and can, depending on the context, cause frustration. For instance, the waiting time for GravityCup to fill its container exceeds the threshold of 10 seconds, which is regarded as the limit above which users stop paying attention to their task [40]. Inherently, the matter of speed is a trade-off with the range of mass that the system can provide. Consequently, a further challenge is the ability to emulate the weight of commonly used heavier items like virtual hammers or swords. Thirdly, the design of the liquid-transfer mechanism must not generate unintended forces. Perceiving liquid inertia or other forces that do not match the characteristics of the virtual counterpart could break the illusion. Additionally, Lim et al. [34] highlight the lack of double-handed interaction as a problem of most present interfaces for weight perception. In line with this, previous liquid-based systems are designed for one-handed interaction, and their design does not scale well if the system is extended to provide weight sensations on different parts of the body. Adapting the design of Wang et al. [61] to support double-handed interaction, for example, would require mirroring the whole system for the other hand.

2.3 Body weight simulation

Research on virtual body weight simulation builds on investigations of the body ownership illusion (BOI), which refers to the illusory sensation of experiencing a foreign body as one’s own [25]. For BOIs to occur, the foreign body’s visual appearance, postural and anatomical features, as well as multisensory information, e.g., visuo-motor or visuo-tactile synchrony [31], have to be plausible [56]. To simulate body weight, designers and researchers of VR applications therefore provide participants with a first-person view of avatar bodies whose visual characteristics represent a certain body composition and body weight. In addition, visuo-motor or visuo-tactile synchrony is typically applied while embodying the avatar. Visuo-motor synchrony can be established by tracking the participants’ movements and transferring them onto the avatar in real time. To intensify the experienced BOI during experiments, participants are therefore frequently asked to perform simple tasks in front of a virtual mirror [11, 26, 30, 66]. Other approaches rely on visuo-tactile synchrony, where a certain part of the real body is touched while participants see a congruent touch on the virtual body [24, 28, 31, 52]. This allows the perceived body size to be targeted more specifically [42, 43]. For instance, Normand et al. [42] found that participants overestimated their belly size after synchronous visuo-tactile stimulation of the avatar’s oversized belly and the participant’s own belly.
Users estimate the weight of avatars based on visual cues, such as the shape and texture of the avatar [27, 43, 55]. Sikström et al. [51] also demonstrated that body weight estimations of a virtual avatar can be altered by applying different audio filters to the avatar’s footstep sounds. These findings imply that auditory cues also affect the extent of the BOI. Other studies found the participants’ body mass index to predict their perceived avatar weight [54, 66]. Consequently, it is not only the avatar’s visual and auditory appearance but also users’ own body weight that affects how they perceive the weight of their virtual avatar. Ferrè et al. [12] showed through altered gravitational fields (using parabolic flight and a short-arm human centrifuge) that the perception of one’s own body weight is rooted in the brain’s estimation of gravitational strength. Hence, it has been hypothesized that sensing weight forces congruent with the target weight could enhance body weight simulations [59]. However, it is still unknown whether manipulating weight forces improves the perception of avatar weight. Furthermore, technical solutions to provide body weight sensations in the context of virtual embodiment are currently unexplored. Innovative ways to foster BOIs, however, would be valuable for researchers and designers of VR applications as they seek to create intense and believable BOIs and VR experiences.

2.4 Summary

A substantial body of research has explored the induction of weight sensations in VR. Previous work on haptic weight interfaces for VR has focused exclusively on the weight of virtual objects rather than the weight of virtual bodies. Overall, robust simulation of different weights without physically changing the exerted weight has not yet been proven feasible using current approaches [34, 67]. Adjusting the weight through liquid mass transfer is promising, however, provided that the limitations regarding the amount of mass, disturbing forces such as liquid inertia, the flow rate, and double-handedness can be tackled [6, 41, 61].

3 Development of PumpVR

Considering the limitations of previous work, we developed PumpVR to enable users to perceive the weight of objects and avatars in VR. The system provides distinguishable weight sensations of up to 500 g per hand. To reach that weight, it is capable of transferring mass at a rate of 150.8 g/s into a reservoir inside the controller. PumpVR allows double-handed use by featuring two handheld controllers that can be filled from the same reservoir. For other researchers to build on this work, we have made PumpVR’s hardware and software design available on GitHub1.
Figure 1:
Figure 1: A single PumpVR Controller (top) with three fill levels (left: empty, center: partially filled, right: full) and respective exemplary virtual counterparts (bottom). The outline of the water bag is highlighted.

3.1 Design goals

Based on the challenges summarized in section 2.4, we identified the following design goals that guided the conceptualization and development of PumpVR. We derived that the system should:
Be capable of performing weight changes as fast as possible
Provide distinguishable levels of weight
Render a wide range of mass
Have a wearable size and weight
Allow double-handed interaction
Prevent unintended forces, i.e., liquid inertia and air resistance
Operate with safety extra-low voltage
Figure 2:
Figure 2: Components of PumpVR. (1) Solenoid valves. (2) Control unit, consisting of an Arduino microcontroller, an HC-05 Bluetooth transceiver, and six relay modules. (3) Bi-directional pump. (4) Power cords. (5) 1 l water bag. (6) Handheld controllers (0.5 l water bag with plastic housing). (7) HTC Vive Tracker.

3.2 Concept

The device comprises two handheld controllers whose weight can be adjusted dynamically through liquid mass transfer. This is accomplished by pumping water into or out of each controller through a hose connection to an external water reservoir. The system can change the weight of the controllers simultaneously or independently of each other. The external water reservoir, as well as the system’s other parts, can be placed on a table or worn on the back. The controllers are bottle-shaped and meant to be grabbed at the bottleneck to keep the center of mass distant, which amplifies the perception of weight [68]. Alternatively, straps could be used to attach the controllers to other body parts, such as the forearm, which leaves the fingers free to operate other input devices, or to the feet to exert weight there in sports applications, such as ski simulations. Figure 1 shows PumpVR at three different weight levels and virtual items that could be represented by them.

3.3 Implementation

An overview of PumpVR and its components can be seen in Figure 2. The liquid mass transfer is achieved by pumping water in or out of a 0.5 l water bag located in each controller. Sealed foldable drinking bags (Recreatio, 247Goods) were chosen, as they contract under negative pressure, avoiding the perception of liquid inertia. Each of them is placed inside a plastic housing to prevent sensing the bag’s air resistance. The water transfer is enabled through a hose connection that leads from each controller via a reversible electrical pump to a third water bag (1 l capacity). The flexible hoses have an inner diameter of 10 mm. We used a Marco UP1-JR pump, as it meets our requirements for performance, weight, and safety, with a flow rate of 460 ml/s, a weight of 1.8 kg, and a safety extra-low voltage of 12 V (DC) [50]. The pump features a flexible rubber impeller, which allows the direction of water flow to be reversed. Four two-way solenoid valves, two between each controller and the pump, are added to the water circuit. This allows the weight of the controllers to be changed independently of one another or in tandem. Figure 3 schematically shows the operation of the hydraulic system.
Figure 3:
Figure 3: Schematic ISO 1219-1 diagram of the hydraulic system (left) and an overview of its different modes (right)
The weight change mechanism is controlled by an Arduino Micro (Adafruit Industries) microcontroller coupled to a Bluetooth transceiver module (HC-05, Major Brands) and six single-pole double-throw relay modules. Two of the relay modules are wired in a polarity-switching circuit connected to the pump’s DC motor, enabling the Arduino to reverse the direction of water flow. The other four relays switch one solenoid valve each. The relays isolate the Arduino and the Bluetooth module from the higher current of the pump. In line with this protective partition, the microcontroller and the pump are powered by separate power supplies. A 10 A circuit breaker further protects the components. The hardware, along with the water tank and the pump, is assembled on a board that can either be positioned stationary or worn on the back.
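The switching logic described above, one relay pair for pump polarity plus one relay per solenoid valve, can be sketched as a small state model. This is a minimal illustrative sketch; the valve names and the mode interface are our assumptions and are not taken from the PumpVR firmware.

```python
from dataclasses import dataclass

# Hypothetical sketch of the relay/valve switching logic; names and interface
# are illustrative assumptions, not actual PumpVR firmware code.

@dataclass(frozen=True)
class ActuatorState:
    pump_on: bool
    pump_reversed: bool        # polarity-switching relays reverse flow (drain)
    open_valves: frozenset     # which of the four solenoid valves are open

VALVES = {
    "left": frozenset({"left_in", "left_out"}),
    "right": frozenset({"right_in", "right_out"}),
    "both": frozenset({"left_in", "left_out", "right_in", "right_out"}),
}

def set_mode(target: str, direction: str) -> ActuatorState:
    """target: 'left', 'right' or 'both'; direction: 'fill' or 'drain'."""
    if direction not in ("fill", "drain"):
        raise ValueError("direction must be 'fill' or 'drain'")
    return ActuatorState(pump_on=True,
                         pump_reversed=(direction == "drain"),
                         open_valves=VALVES[target])

def idle() -> ActuatorState:
    """All valves closed, pump off: the controllers' weight is held constant."""
    return ActuatorState(False, False, frozenset())
```

Opening only one controller's valve pair yields the independent mode, opening all four yields the tandem mode shown in Figure 3.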
To enable the system to be used in the virtual environment, the controllers’ positions and rotations are tracked via HTC Vive Trackers screwed onto both controllers’ housings. As all user input required in our test scenarios can be provided via positional and rotational tracking, the present prototype does not feature any further interaction components, such as buttons. To establish communication between the device and a game engine, either a serial Bluetooth connection or the Arduino’s micro USB port can be used. We used the .NET SerialPort class within a Unity3D environment to send messages to the Arduino depending on in-application events. Based on this input, the Arduino switches the corresponding relays to set the flow direction, starts the pump, and opens the associated valves to achieve the desired weight.
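Such event-driven messaging can be illustrated with a simple one-line command encoder on the host side. The message format shown here ('L3' meaning left controller to weight level 3) is a hypothetical example; the paper does not specify PumpVR's actual wire protocol.

```python
# Hypothetical one-line command protocol for the game-engine-to-Arduino link;
# the format ('L3' = left controller to level 3) is our own example, not the
# actual PumpVR protocol.

VALID_HANDS = ("L", "R", "B")    # left, right, both
VALID_LEVELS = range(0, 6)       # weight levels 0 (empty) to 5 (full)

def encode_command(hand: str, level: int) -> bytes:
    """Encode a weight-change request as an ASCII line, e.g. b'L3\\n'."""
    if hand not in VALID_HANDS or level not in VALID_LEVELS:
        raise ValueError(f"invalid command: {hand}{level}")
    return f"{hand}{level}\n".encode("ascii")
```

In Unity, the same bytes would be written with SerialPort.Write; a matching parser on the Arduino would translate the requested level into a flow direction and pumping duration.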

3.4 Performance

As we require the prototype to produce distinct levels of weight, we measured the relative weight of a single controller for different pumping durations. We tested five fill intervals in steps of one fifth of the time required to fill the controller’s reservoir. As its maximum weight of 500 g can be reached in 3100 ms, we derived the time intervals of 620 ms (level 1), 1240 ms (level 2), 1860 ms (level 3), 2480 ms (level 4), and 3100 ms (level 5). We temporarily connected a weigh scale and a load cell amplifier module (HX711, SparkFun Electronics) to the Arduino to map weight data to the Arduino’s states. In the test loop, the water bag was filled and drained 30 times per fill interval.
We found a mean weight of 94.7 g for level 1 (SD = 3.8), 188.1 g for level 2 (SD = 10.5), 270.2 g for level 3 (SD = 14.6), 381.5 g for level 4 (SD = 20.4), and 465.7 g for level 5 (SD = 12.6). The values are relative to the controller’s empty weight of 245 g. Thus, PumpVR fills the reservoir at an average rate of 150.8 g/s and with an average deviation of less than 5%. Figure 4 shows the weight change for each interval. The results demonstrate that, by varying the pumping duration, PumpVR can be used to target different weights. The drop in flow rate compared to the manufacturer’s specification of 460 ml/s is expected, as the flow is limited by the valves’ internal diameters as well as the size and bending of the tubing.
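Given the approximately linear fill behavior measured above, mapping a target weight to a pumping duration reduces to dividing by the average fill rate. The helper below is an illustrative sketch based on the reported measurements, not code from the PumpVR implementation.

```python
# Illustrative duration-to-weight mapping based on the measured average fill
# rate of 150.8 g/s (section 3.4); not actual PumpVR code.

FILL_RATE_G_PER_S = 150.8     # measured average fill rate
MAX_WEIGHT_G = 500            # capacity of one controller's reservoir

def pumping_duration_ms(target_g: float) -> int:
    """Pumping time needed to add target_g grams of water starting from empty."""
    if not 0 <= target_g <= MAX_WEIGHT_G:
        raise ValueError("target outside the 0-500 g range of one controller")
    return round(target_g / FILL_RATE_G_PER_S * 1000)
```

For example, the measured level-5 mean of 465.7 g maps to roughly 3.1 s, consistent with the maximum fill time reported above.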
Figure 4:
Figure 4: Relative controller weight by pumping duration

4 Test Applications

We developed two test applications using Unity3D. The focus of the first application (hereinafter referred to as Inventory Game) is to demonstrate and test the system’s ability to render the weight of different virtual objects. It can be seen in Figure 5 (top right). The purpose of the second application (hereinafter referred to as Embodiment Scene) is to explore the use of PumpVR to render body weight. It is shown in Figure 5 (bottom right).
Figure 5:
Figure 5: User perceiving weight through PumpVR (left) and the users’ view (right) for the inventory game (top) and the embodiment scene (bottom).

4.1 Inventory Game

The aim of the Inventory Game is to use common game mechanics that allow players to switch items in their inventory, such as weapons, tools, or resources. Hence, we implemented the Inventory Game in the style of fantasy role-playing games to demonstrate the use of PumpVR for object interaction. Players find themselves in a virtual forest, where they embody virtual hands. Players can freely pull a variety of items from their back and use them to combat approaching enemies and block arrows. Based on the items’ real-world characteristics, we assigned each item one of the five weight levels defined in section 3.4. An additional weight level of zero was set for phases when no object is held in the hand. With each hand, players can grab a torch (weight level 1), a spear (weight level 3), a sword (weight level 4), a shield (weight level 5), or a hammer (weight level 5) from a virtual backpack by touching a box collider at their back. They can put the object away by reaching into that backpack again. Which object appears is determined by the game. There is no mechanic that obscures the delay between the visual event and the weight change, in order to test users’ acceptance of the delay. This means that as soon as colliders register that the player is grasping an item, it is rendered in the virtual hand and the process of changing the controller’s weight starts. This also applies to putting the item away. Players go through six game levels with individual enemies, each requiring a specific item to be defeated. In sequence, they have to defend themselves against arrows with a shield, wolves with a sword, wasps with a torch, rhinos with a hammer, and iguanas with a spear. In the final level, players have to block arrows with a shield in one hand while fending off wolves with a sword in the other.
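The item-to-level assignment above can be captured in a simple lookup; combined with the level durations from section 3.4, the pumping time for an item switch is the difference between the two level durations (negative meaning drain). The item assignments follow the text, while the helper itself is a hypothetical sketch.

```python
# Illustrative item-to-weight-level lookup for the Inventory Game; the item
# assignments follow the paper, the transition helper is a hypothetical sketch.

ITEM_LEVEL = {"none": 0, "torch": 1, "spear": 3, "sword": 4,
              "shield": 5, "hammer": 5}
LEVEL_DURATION_MS = {0: 0, 1: 620, 2: 1240, 3: 1860, 4: 2480, 5: 3100}

def transition_ms(current_item: str, next_item: str) -> int:
    """Pumping time when switching items; positive = fill, negative = drain."""
    return (LEVEL_DURATION_MS[ITEM_LEVEL[next_item]]
            - LEVEL_DURATION_MS[ITEM_LEVEL[current_item]])
```

For instance, pulling a sword from the backpack with an empty hand requires filling for 2480 ms, while swapping a shield for a torch requires draining for the same duration.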

4.2 Embodiment Scene

In the Embodiment Scene, users first embody a slim female or male avatar and can then transform into a bulky one. Avatars were created using the avatar modeling software Daz 3D. During the transformation, the avatar’s body fat and the weight of both controllers are increased simultaneously until the controllers reach their maximum weight. Through the use of inverse kinematics, users see the body parts of the avatar at the estimated positions of their own body parts. The virtual room includes a mirror in which users can observe their virtual upper body. Figure 6 shows the users’ view and the reflection in the mirror during the transformation of the avatar. The scene features an activity wall to engage users while embodying the avatar. It is modeled after a commercial training device2 and consists of a 4 × 5 matrix of light panels, one of which randomly lights up blue. The wall challenges the user to touch the lit panel as quickly as possible, which turns the light off and randomly switches on a different panel. Figure 5 depicts the users’ interaction with the activity wall.
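The activity wall behavior amounts to a small piece of game logic: exactly one lit panel at a time, replaced by a different random panel on each hit. A minimal sketch, with the flat panel indexing as our own assumption:

```python
import random

# Minimal sketch of the activity wall logic (4 x 5 light matrix, exactly one
# panel lit at a time); the flat 0..19 panel indexing is an assumption.

ROWS, COLS = 4, 5

def next_panel(current: int, rng: random.Random) -> int:
    """On a hit, turn the current panel off and light a different random one."""
    candidates = [i for i in range(ROWS * COLS) if i != current]
    return rng.choice(candidates)
```

Excluding the current panel guarantees that the lit panel visibly moves on every hit, which keeps the reaction task going.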

5 Evaluation

We conducted a study to test how PumpVR affects the experience when rendering the weight of objects and avatars. In this section, we describe the experiment, present and interpret our results, and discuss their implications.

5.1 Method

Figure 6: User (left) experiencing the transformation from slim (center) to bulky (right) for the female (top) and male (bottom) avatar.

5.1.1 Participants.

Twenty-four subjects (11 identifying as male, 13 as female) participated in the study. They were recruited via our department's mailing list and public forums at our institution. Their age ranged from 20 to 27 years (M = 23.48, SD = 2.13). The only exclusion criterion was uncorrected vision; thus, all participants had either normal or corrected-to-normal vision. According to the international classification of adult underweight, overweight, and obesity [38], their weight was normal, except for 4 participants who were classified as pre-obese. Participants were compensated with 10€ or credit points for their study program. The study received ethics clearance according to our institution's ethics and privacy regulations. All participants completed all tasks.

5.1.2 Apparatus.

The VR applications ran on a PC with Windows 10, equipped with an Intel i7-8750H processor, 16 GB RAM, and an NVIDIA GeForce GTX 1060 graphics card. Participants perceived the virtual world through an HTC Vive Pro head-mounted display (HMD). They had a play area of approximately 1 × 1 m. The HMD, the PumpVR controllers, and the HTC Vive controllers were tracked by SteamVR Base Stations 2.0.

5.1.3 Study Design.

In the experiment, all participants performed each task twice, once using PumpVR and once using standard VR controllers (HTC Vive controllers) with static weight. We thus used Controller with the two levels standard and PumpVR as the independent within-subjects variable. The order of the Controller conditions was counterbalanced to avoid sequence effects. We hypothesized that using PumpVR in the Inventory Game leads to the experience being perceived as more realistic and more enjoyable than using the standard controllers. Additionally, we hypothesized that using PumpVR in the Embodiment Scene increases the level of embodiment and perceived exertion, whereas it decreases fitness ratings. This resulted in the dependent variables realism, enjoyment, virtual embodiment, perceived exertion, and self-perceived fitness.

5.1.4 Measures.

To quantify the perceived realism of the experience, we used the reality judgment subscale of the Reality Judgment and Presence Questionnaire [2]. It consists of eight items that address realism, presence, and reality judgment. The questions are answered on ten-point scales (from 1 to 10), which are summed to obtain the total score. We assessed differences in enjoyment via the seven items of the interest/enjoyment dimension of the Intrinsic Motivation Inventory [47], the subscale measuring intrinsic motivation. Participants indicate their agreement with the seven statements on seven-point Likert scales. To investigate the level of avatar embodiment, we used the Virtual Embodiment Questionnaire (VEQ) [46]. It assesses the three dimensions ownership, agency, and change (in the perceived body schema) via twelve items scored on scales from 1 to 7. In addition to the dimensions' individual scores, a general virtual embodiment score was calculated by summing the scores of the three components. Self-perception of fitness was assessed using the questionnaire by Delignières et al. [8], with the single-item subscales endurance, strength, flexibility, body composition, and fitness rated on 13-point scales. Prior to the experiment, all dimensions were assessed to screen for outliers regarding participants' overall physical capacities. During the experiment, only the fitness dimension was assessed. As the questionnaires were answered outside of VR, their wording was modified to refer to the experience in the virtual world (e.g., from "I am exceptionally fit." to "In the virtual world I was exceptionally fit."). Participants rated their perceived exertion on the psychophysical Borg Rating of Perceived Exertion (RPE) scale [3]. The scale ranges from 6 (perceiving "no exertion at all") to 20 (perceiving "maximal exertion"). The question referred to participants' current perceived exertion in general, not specifically to exertion from completing the task or using the controller. Apart from the RPE scale, which allowed participants to state their score orally during the experience, all questionnaires were answered outside of VR, as interacting with a VR questionnaire could be affected by the Controller used.
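The questionnaire scoring can be summarized in a short sketch (hypothetical Python; it assumes the VEQ dimension scores are item means and the general score is their sum, and that the reality judgment subscale is a plain sum of its eight items, consistent with the description above):

```python
def subscale_mean(items):
    """Mean of a dimension's item ratings (1-7 scales). Assumed scoring."""
    return sum(items) / len(items)

def veq_scores(ownership, agency, change):
    """Dimension scores plus a general virtual embodiment score,
    computed as the sum of the three dimension scores."""
    scores = {
        "ownership": subscale_mean(ownership),
        "agency": subscale_mean(agency),
        "change": subscale_mean(change),
    }
    scores["general"] = (scores["ownership"] + scores["agency"]
                         + scores["change"])
    return scores

def realism_score(items):
    """Reality judgment subscale: eight 1-10 items, summed."""
    return sum(items)
```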

5.1.5 Procedure and tasks.

Participants were welcomed and seated at a table, where the experimenter explained the procedure and handed out a consent form. After giving informed consent in accordance with the Declaration of Helsinki (2013), they filled out the demographic questionnaire. Participants subsequently performed three tasks and were informed about the boundaries of their play area beforehand. Each task was followed by one repetition of the same task using the other Controller. To ensure that participants could not see the controllers, the experimenter handed them over once participants were already wearing the HMD. PumpVR was placed stationary on a table.
To investigate the level of realism and enjoyment for PumpVR during the interaction with virtual objects, we engaged the participants in the Inventory Game as the first task (Task 1). Prior to entering VR, it was explained to participants that they could retrieve objects by reaching into their virtual backpack and pulling them out. They then completed the series of six game levels described in section 4.1. Each level had a fixed duration of 40 seconds, except for the final level, which lasted 80 seconds. In-game text instructions were shown between the levels to guide participants through the game and to avoid diverging game events between the conditions. These instructions informed them about the upcoming enemy and requested that they take out the required object. Between levels, the text instructions prompted participants to put away the currently held object. The game was completed in approximately five minutes. After finishing the game, participants completed the questionnaires assessing realism and enjoyment.
For the second task (Task 2), participants experienced the Embodiment Scene through an avatar. They were assigned an avatar of their own gender to avoid gender mismatches between participant and avatar. At the start of the task, subjects embodied the slim version of their assigned avatar. To familiarize themselves with their avatars, participants were instructed to adopt two simple standing postures (first a T-pose and then stretching the arms forward). An illustration of the pose to perform was presented on a poster that was visible in the mirror. Until the second pose was completed, the embodied avatar remained slim. After adopting the second pose, the transformation started and the slim avatar morphed into a bulky one (see Figure 6). In the PumpVR condition, the weight of the controllers was increased in synchrony with the visual transformation. Embodying the now bulky avatar, participants were then instructed to perform another sequence of simple poses involving their torso and arms. As soon as participants adopted the pose displayed, the experimenter faded in the subsequent pose via a keyboard command. Completing the sequence of 18 poses took approximately two minutes. The order of the poses was the same for all participants and conditions. After finishing Task 2, participants filled in the VEQ.
Participants then continued with Task 3, which addressed the effects of PumpVR on self-perceived fitness and exertion. The procedure was identical to that of Task 2 until the transformation to the bulky avatar was completed. Participants were then instructed to switch off whichever light panel of the activity wall lit up blue by touching it (as described in section 4.2). Every 30 seconds, the Borg scale appeared to their right, on which participants indicated their perceived exertion. They had 10 seconds to respond. The task finished after three trials of 30 seconds, yielding three values from which the mean perceived exertion score was calculated. After completing the task, self-perceived fitness was assessed.

5.2 Results

We conducted paired-samples t-tests to compare the standard controller and PumpVR regarding realism, enjoyment, virtual embodiment, perceived exertion and self-perceived fitness. The assumption of normal distribution was verified for all data using Shapiro-Wilk tests.
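The core of this analysis can be sketched in a few lines (illustrative Python only; the paper's statistics were presumably computed with standard statistical software):

```python
import math

def paired_t_and_d(standard, pumpvr):
    """Paired-samples t statistic and Cohen's d for paired data
    (mean difference divided by the SD of the differences).

    The p-value would be obtained from the t distribution with
    n-1 degrees of freedom, after checking the differences for
    normality (e.g., with a Shapiro-Wilk test)."""
    diffs = [b - a for a, b in zip(standard, pumpvr)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in diffs) / (n - 1))
    t = mean / (sd / math.sqrt(n))
    d = mean / sd
    return t, d
```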

5.2.1 Realism and Enjoyment.

In the Inventory Game, we found an effect of Controller on the realism scores, t(23) = 4.896, p < .001, d = .999, which were 43.63 for the standard controllers (SD = 13.81) and 50.3 for PumpVR (SD = 13.32). There was also an effect of Controller on enjoyment, t(23) = 3.578, p = .002, d = .730, with scores of 21.92 for the standard controllers (SD = 9.21) and 25.63 for PumpVR (SD = 6.95). Figure 7 depicts the mean ratings of realism and enjoyment.
Figure 7: Mean scores of realism (left) and enjoyment (right) for the standard controllers and for PumpVR. The error bars show the standard deviation. Asterisks indicate significant differences (**p < .01, ***p < .001).

5.2.2 Embodiment.

Regarding the Embodiment Scene, we found an effect of Controller on the virtual embodiment scores, t(23) = 4.126, p < .001, d = .842, which were 10.83 for the standard controllers (SD = 3.3) and 13.19 for PumpVR (SD = 3.46). There was no effect of Controller on the VEQ dimension agency, t(23) = 1.926, p = .067, d = .393. However, there was an effect of Controller on the dimension ownership, t(23) = 3.102, p = .005, d = .633, as well as on the dimension change, t(23) = 4.518, p < .001, d = .922. The standard controllers had an ownership score of 2.87 (SD = 1.27), a change score of 2.77 (SD = 1.71), and an agency score of 5.20 (SD = 1.15). PumpVR had an ownership score of 3.71 (SD = 1.66), a change score of 3.87 (SD = 1.81), and an agency score of 5.62 (SD = .96). Figure 8 depicts the mean scores of the embodiment subscales.
Figure 8: Mean questionnaire scores of virtual embodiment (top left), change (top right), ownership (bottom left) and agency (bottom right) for the standard controllers and PumpVR. The error bars show the standard deviation. Asterisks indicate significant differences (**p < .01, ***p < .001).

5.2.3 Perceived fitness and exertion.

There was an effect of Controller on self-perceived fitness, t(23) = 5.755, p < .001, d = 1.175, with scores of 8.17 for the standard controllers (SD = 2.43) and 5.17 for PumpVR (SD = 3.19). Furthermore, we found an effect of Controller on perceived exertion, t(23) = 6.417, p < .001, d = 1.310, with scores of 8.88 for the standard controllers (SD = 2.05) and 11.56 for PumpVR (SD = 2.14). Figure 9 shows the mean ratings of self-perceived fitness and perceived exertion.
Figure 9: Mean scores of self-perceived fitness (left) and perceived exertion (right) for the standard controllers and for PumpVR. The error bars show the standard deviation. Asterisks indicate significant differences (**p < .01, ***p < .001).

5.3 Discussion

Overall, our evaluation shows that PumpVR successfully enhanced the experience both for rendering object weight and for rendering body weight. In line with our hypotheses, we found that PumpVR has significant effects on realism, enjoyment, virtual embodiment, self-perceived fitness, and perceived exertion.

5.3.1 Rendering weight of virtual objects.

Our analysis revealed that users experienced the VR scene as more realistic when using PumpVR to grab, hold, and swing virtual objects of different weights. This demonstrates the capability of PumpVR to enhance realism in games, simulations, or VR training. Interestingly, PumpVR enhanced perceived realism despite previous findings showing negative effects of latency on presence in VR [37]. This indicates that PumpVR's rendering speed, i.e., the delay between grasping a virtual object and rendering the target weight, is sufficient to increase realism. This is in line with Plaisier et al. [44], who suggest that perceptual decision-making about the weight of an object is still in progress at least 300 ms after lifting it. The authors point out that even after that decision is made, new direct sensory information can lead to re-evaluating it. It is possible that PumpVR's continuous weight change led to such re-evaluations of weight perception rather than to intersensory discrepancy. The type of movement performed to grasp objects may have had an impact as well. As participants grasped objects from their backs, the first contact with the grabbed object was occluded. For the phase when the object was already grasped yet still behind the back, this might have decreased sensitivity to discrepancies between perceived and expected weight. On the other hand, occluded vision implies relying more on the other senses, which in turn suggests increased sensitivity to weight changes. Hence, the sensitivity to visual-weight asynchronies for different ways of picking up or releasing objects could be the subject of future work.
Our results showed that PumpVR positively affected the enjoyment ratings, indicating that users had more fun using PumpVR for object interaction than with standard controllers. This demonstrates that rendering weight through PumpVR can enhance the experience in games, despite players having to exert more physical effort due to the increased weight. In the experiment, players were between 20 and 27 years old and played the game for 5 minutes per condition. Due to the increased physical effort, outcomes regarding enjoyment could vary for other age groups or when playing for a longer period of time. In addition, it cannot be disregarded that we only examined stationary experiences. Our outcomes regarding enjoyment and realism could vary for larger room-scale setups, as we have not yet tested PumpVR in mobile use.
Even though we found that users enjoyed interacting with objects in games more when using PumpVR, there may be use cases in which perceiving static weight during object interaction is preferred. For instance, the reception to perceiving the weight of 3D tools in VR graphic editors (e.g., [58]) might differ from our outcome. Future studies could investigate the acceptance of weight perception in different contexts.

5.3.2 Simulating body weight.

Regarding the Embodiment Scene, we found higher levels of virtual embodiment when using PumpVR compared to standard controllers. This indicates that the weight feedback exerted by PumpVR amplified the feeling of embodying the bulky avatar. PumpVR positively affected the VEQ dimension ownership, meaning that the virtual body was more strongly experienced as one's own when PumpVR was used. This points to the rendered weight being interpreted as body weight. PumpVR also had a significant impact on the VEQ dimension change, which demonstrates that PumpVR fostered the participants' feeling of a change in their own body schema. We did not find effects on the VEQ dimension agency, indicating that PumpVR did not restrict the feeling of control over one's own body movements. To perceive body weight, participants received two kinds of feedback: active feedback during the transformation (i.e., continuously increasing weight) and the passive feedback of the added weight. The relative contributions of these feedback types to the effect on embodiment need further investigation.
Mean perceived exertion scores for the activity in Task 3 corresponded to "very light" (standard controllers) and "light" (PumpVR). This generally low exertion was to be expected considering the short duration of the three trials of 30 seconds each. Nevertheless, when embodying the bulky avatar, using PumpVR significantly increased the perceived exertion and decreased self-perceived fitness ratings. This shows that the additional weight was noticeable and resulted in more exertion. The negative effect on self-perceived fitness suggests that participants attributed this additional weight to their own fitness being reduced, rather than to the extra weight they had to lift. Combined with PumpVR's impact on virtual embodiment, these findings indicate that participants associated the weight rendered by PumpVR with the increased body fat of the embodied avatar. This is in line with neuroscientific findings that perceived body weight can be altered by experimentally exerting forces in the direction of gravity [12]. Our findings can be explained by the unity assumption, according to which the human brain interprets two different sensory signals as originating from the same multisensory event [62]. PumpVR therefore contributes to research on bodily illusions and demonstrates exerting weight forces as a promising approach to simulating body weight in VR.

6 Use Cases

We identified a variety of application scenarios for PumpVR. The most evident use case is rendering the weight of virtual objects in VR applications. PumpVR could thus be applied to perceive the weight of different game items, construction tools, or sports equipment such as bats or clubs. This could be used to acquire psychomotor skills in VR training [21], to enhance the experience in VR games, or to make objects in simulations tangible. In addition, PumpVR can simulate body weight during avatar embodiment. This could enhance the sensation of owning an overweight body in the context of body-image-related research and therapy. In VR games, the body weight of different player characters could be simulated. A further use case for PumpVR is its employment in fitness applications and exergames. As demonstrated, PumpVR can increase effort during physical tasks. Hence, it could be used to add challenges in exergames [19], helping users reach their fitness goals.
Furthermore, we envision multiple other applications for PumpVR in the context of VR games. First of all, it could be used to dynamically adjust the difficulty in games [16, 20], to, for example, handicap players who have an unintended advantage due to having high endurance. Varying weight levels may be used to equalize the effort of players of different fitness levels. Moreover, PumpVR could be utilized for other game mechanics, such as indicating loss of health points or stamina. These attributes, usually showing that the player character takes damage, could be made physically perceptible by increasing the weight and thus the exertion. Targeting a single controller might be a method to display localized damage. Weight could, for example, be transferred to the right controller to indicate a right shoulder fracture of the player character. PumpVR’s weight adjustment could also be used to display different armor classes or to represent individual attributes of the selected player character, such as strength, constitution or agility.

7 Conclusion

In this paper, we presented PumpVR, a system that adjusts the absolute weight of two handheld controllers according to the properties of objects or avatars in VR. To generate distinct weight sensations, the device utilizes a bi-directional pump and a set of solenoid valves to transfer water in or out of the controllers. It is capable of pumping up to 500 g at an average rate of 150.8 g/s into each controller.
A study comparing PumpVR with a standard controller demonstrated that PumpVR increased perceived realism and enjoyment when interacting with virtual objects of different weights. Thus, PumpVR could be used to improve game experiences and to enable a more natural depiction of real-world tasks in VR training or simulations. The study also showed that PumpVR successfully simulated body weight, which is a novelty in the development of weight interfaces. These findings also confirm that extra weight affects the feeling of embodying a bulky avatar. Our work therefore contributes to the progression of embodied VR experiences that support therapies treating body image disorders. In addition, PumpVR was found to affect perceived fitness and exertion, which suggests further use cases in the context of fitness applications.
In the next steps, the system could be extended with additional water reservoirs and valves to render different centers of mass. With consideration of safety and toxicity factors, as well as of viscosity, future systems could also explore reducing the volume of the controllers by using liquids denser than water. In terms of body weight simulation, effects of extra weight on additional body parts could be studied. In addition, it could be investigated whether using PumpVR to exert weight on users’ feet affects their perception of different ground properties in VR. Finally, incorporating tactile feedback could be explored by attaching PumpVR’s controllers to the forearms and interacting through haptic gloves.

Acknowledgments

This work is funded by the German Ministry of Education and Research (BMBF) within the HapticIO project (16SV8758).

Footnotes

2
www.twall.de

Supplementary Material

MP4 File (3544548.3581172-video-preview.mp4)
Video Preview
MP4 File (3544548.3581172-talk-video.mp4)
Pre-recorded Video Presentation
MP4 File (3544548.3581172-video-figure.mp4)
Video Figure

References

[1]
Tomohiro Amemiya and Taro Maeda. 2008. Asymmetric Oscillation Distorts the Perceived Heaviness of Handheld Objects. IEEE Transactions on Haptics 1, 1 (2008), 9–18. https://doi.org/10.1109/TOH.2008.5
[2]
Rosa María Baños, Cristina Botella, Azucena García-Palacios, Helena Villa Martín, Concepción Perpiñá, and Mariano Luis Alcañiz Raya. 2000. Presence and Reality Judgment in Virtual Environments: A Unitary Construct?Cyberpsychology Behav. Soc. Netw. 3 (2000), 327–335.
[3]
Gunnar A V Borg. 1982. Psychophysical bases of perceived exertion.Medicine and science in sports and exercise 14 (1982), 377–81. Issue 5. https://doi.org/10.1249/00005768-198205000-00012
[4]
Eric E Brodie and Helen E Ross. 1984. Sensorimotor mechanisms in weight discrimination. Perception & Psychophysics 36 (1984), 477–481. Issue 5. https://doi.org/10.3758/BF03207502
[5]
Gavin Buckingham. 2014. Getting a grip on heaviness perception: a review of weight illusions and their probable causes. Experimental Brain Research 232 (2014), 1623–1629. Issue 6. https://doi.org/10.1007/s00221-014-3926-9
[6]
Chih-Hao Cheng, Chia-Chi Chang, Ying-Hsuan Chen, Ying-Li Lin, Jing-Yuan Huang, Ping-Hsuan Han, Ju-Chun Ko, and Lai-Chung Lee. 2018. GravityCup: A Liquid-Based Haptics for Simulating Dynamic Weight in Virtual Reality. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology(VRST ’18). Association for Computing Machinery, New York, NY, USA, 2 pages. https://doi.org/10.1145/3281505.3281569
[7]
Maria Christofi and Despina Michael-Grigoriou. 2017. Virtual reality for inducing empathy and reducing prejudice towards stigmatized groups: A survey. In 2017 23rd International Conference on Virtual System & Multimedia (VSMM). IEEE, Dublin, 1–8. https://doi.org/10.1109/VSMM.2017.8346252
[8]
D. Delignières, A. Marcellini, J. Brisswalter, and P. Legros. 1994. Self-Perception of Fitness and Personality Traits. Perceptual and Motor Skills 78, 3 (1994), 843–851. https://doi.org/10.1177/003151259407800333 arXiv:https://doi.org/10.1177/003151259407800333PMID: 8084701.
[9]
Massimiliano Di Luca and Arash Mahnan. 2019. Perceptual Limits of Visual-Haptic Simultaneity in Virtual Reality Interactions. In 2019 IEEE World Haptics Conference (WHC). IEEE, Tokyo, 67–72. https://doi.org/10.1109/WHC.2019.8816173
[10]
Anton J M Dijker. 2014. The role of expectancies in the size-weight illusion: A review of theoretical and empirical arguments and a new explanation. Psychonomic Bulletin & Review 21 (2014), 1404–1414. Issue 6. https://doi.org/10.3758/s13423-014-0634-1
[11]
Nina Döllinger, Carolin Wienrich, Erik Wolf, Mario Botsch, and Marc Erich Latoschik. 2019. ViTraS - Virtual Reality Therapy by Stimulation of Modulated Body Image - Project Outline. In Mensch und Computer 2019 - Workshopband. Gesellschaft für Informatik e.V., Bonn, 606–611. https://doi.org/10.18420/muc2019-ws-633
[12]
E R Ferrè, T Frett, P Haggard, and M R Longo. 2019. A gravitational contribution to perceived body weight. Scientific Reports 9(2019), 11448. Issue 1. https://doi.org/10.1038/s41598-019-47663-x
[13]
Daniel Fink and Jan Mayes. 2021. Too loud! Non-occupational noise exposure causes hearing loss. Proceedings of Meetings on Acoustics 43 (2021), 13 pages. Issue 1. https://doi.org/10.1121/2.0001436
[14]
Adrien Girard, Maud Marchal, Florian Gosselin, Anthony Chabrier, François Louveau, and Anatole Lécuyer. 2016. HapTip: Displaying Haptic Shear Forces at the Fingertips for Multi-Finger Interaction in Virtual Environments. Frontiers in ICT 3(2016), 15 pages. https://doi.org/10.3389/fict.2016.00006
[15]
James M Goodman and Sliman J Bensmaia. 2002. The Neural Basis of Haptic Perception. John Wiley & Sons, Ltd, Hoboken, NJ, 537–584. https://doi.org/10.1002/9781119170174.epcn205
[16]
David Halbhuber, Jakob Fehle, Alexander Kalus, Konstantin Seitz, Martin Kocur, Thomas Schmidt, and Christian Wolff. 2019. The Mood Game - How to Use the Player’s Affective State in a Shoot’em up Avoiding Frustration and Boredom. In Proceedings of Mensch Und Computer 2019 (Hamburg, Germany) (MuC’19). Association for Computing Machinery, New York, NY, USA, 867–870. https://doi.org/10.1145/3340764.3345369
[17]
Seongkook Heo, Christina Chung, Geehyuk Lee, and Daniel Wigdor. 2018. Thor’s Hammer: An Ungrounded Force Feedback Device Utilizing Propeller-Induced Propulsive Force. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI ’18). Association for Computing Machinery, New York, NY, USA, 1–11. https://doi.org/10.1145/3173574.3174099
[18]
Anne Herrmann-Werner, Teresa Loda, Lisa M Wiesner, Rebecca Sarah Erschens, Florian Junne, and Stephan Zipfel. 2019. Is an obesity simulation suit in an undergraduate medical communication class a valuable teaching tool? A cross-sectional proof of concept study. BMJ Open 9, 8 (2019), 7 pages. https://doi.org/10.1136/bmjopen-2019-029738 arXiv:https://bmjopen.bmj.com/content/9/8/e029738.full.pdf
[19]
Han-Chung Huang, May-Kuen Wong, Ju Lu, Wei-Fan Huang, and Ching-I Teng. 2017. Can Using Exergames Improve Physical Fitness? A 12-Week Randomized Controlled Trial. Comput. Hum. Behav. 70, C (may 2017), 310–316. https://doi.org/10.1016/j.chb.2016.12.086
[20]
Robin Hunicke. 2005. The Case for Dynamic Difficulty Adjustment in Games. In Proceedings of the 2005 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology(Valencia, Spain) (ACE ’05). Association for Computing Machinery, New York, NY, USA, 429–433. https://doi.org/10.1145/1178477.1178573
[21]
Lasse Jensen and Flemming Konradsen. 2018. A review of the use of virtual reality head-mounted displays in education and training. Education and Information Technologies 23 (2018), 1515–1529. Issue 4. https://doi.org/10.1007/s10639-017-9676-0
[22]
Lynette A. Jones. 1986. Perception of force and weight: Theory and research.Psychological bulletin 100 (1986), 29–42. Issue 1.
[23]
Alexander Kalus, Martin Kocur, and Niels Henze. 2022. Towards Inducing Weight Perception in Virtual Reality Through a Liquid-based Haptic Controller. In AVI ’22 - Workshop on Visuo-Haptic Interaction. 5 pages. https://doi.org/10.5283/epub.53628
[24]
Alexander Kalus, Martin Kocur, Niels Henze, Johanna Bogon, and Valentin Schwind. 2022. How to Induce a Physical and Virtual Rubber Hand Illusion. In Proceedings of Mensch Und Computer 2022 (Darmstadt, Germany) (MuC ’22). Association for Computing Machinery, New York, NY, USA, 580–583. https://doi.org/10.1145/3543758.3547512
[25]
Konstantina Kilteni, Antonella Maselli, Konrad P Kording, and Mel Slater. 2015. Over my fake body: body ownership illusions for studying the multisensory basis of own-body perception. Frontiers in Human Neuroscience 9 (2015), 141. https://doi.org/10.3389/fnhum.2015.00141
[26]
Martin Kocur, Johanna Bogon, Manuel Mayer, Miriam Witte, Amelie Karber, Niels Henze, and Valentin Schwind. 2022. Sweating Avatars Decrease Perceived Exertion and Increase Perceived Endurance While Cycling in Virtual Reality. In Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology (Tsukuba, Japan) (VRST ’22). Association for Computing Machinery, New York, NY, USA, Article 29, 12 pages. https://doi.org/10.1145/3562939.3565628
[27]
Martin Kocur, Florian Habler, Valentin Schwind, Paweł W. Woźniak, Christian Wolff, and Niels Henze. 2021. Physiological and Perceptual Responses to Athletic Avatars While Cycling in Virtual Reality. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan) (CHI ’21). Association for Computing Machinery, New York, NY, USA, Article 519, 18 pages. https://doi.org/10.1145/3411764.3445160
[28]
Martin Kocur, Alexander Kalus, Johanna Bogon, Niels Henze, Christian Wolff, and Valentin Schwind. 2022. The Rubber Hand Illusion in Virtual Reality and the Real World - Comparable but Different. In Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology (Tsukuba, Japan) (VRST ’22). Association for Computing Machinery, New York, NY, USA, Article 31, 12 pages. https://doi.org/10.1145/3562939.3565614
[29]
Martin Kocur, Melanie Kloss, Valentin Schwind, Christian Wolff, and Niels Henze. 2020. Flexing Muscles in Virtual Reality: Effects of Avatars’ Muscular Appearance on Physical Performance. Association for Computing Machinery, New York, NY, USA, 193–205. https://doi.org/10.1145/3410404.3414261
[30]
Martin Kocur, Philipp Schauhuber, Valentin Schwind, Christian Wolff, and Niels Henze. 2020. The Effects of Self- and External Perception of Avatars on Cognitive Task Performance in Virtual Reality. In 26th ACM Symposium on Virtual Reality Software and Technology (Virtual Event, Canada) (VRST ’20). Association for Computing Machinery, New York, NY, USA, Article 27, 11 pages. https://doi.org/10.1145/3385956.3418969
[31]
Elena Kokkinara and Mel Slater. 2014. Measuring the Effects through Time of the Influence of Visuomotor and Visuotactile Synchronous Stimulation on a Virtual Body Ownership Illusion. Perception 43, 1 (2014), 43–58. https://doi.org/10.1068/p7545
[32]
James R Lackner and Paul Dizio. 1993. Multisensory, Cognitive, and Motor Influences on Human Spatial Orientation in Weightlessness. Journal of Vestibular Research 3 (1993), 361–372. https://doi.org/10.3233/VES-1993-3315
[33]
Jun Lee, Jee-In Kim, and HyungSeok Kim. 2019. Force Arrow 2: A Novel Pseudo-Haptic Interface for Weight Perception in Lifting Virtual Objects. In 2019 IEEE International Conference on Big Data and Smart Computing (BigComp). IEEE, Kyoto, 1–8. https://doi.org/10.1109/BIGCOMP.2019.8679400
[34]
Woan Ning Lim, Kian Meng Yap, Yunli Lee, Chyanna Wee, and Ching Chiuan Yen. 2021. A Systematic Review of Weight Perception in Virtual Reality: Techniques, Challenges, and Road Ahead. IEEE Access 9 (2021), 163253–163283. https://doi.org/10.1109/ACCESS.2021.3131525
[35]
Pedro Lopes, Sijing You, Lung-Pan Cheng, Sebastian Marwecki, and Patrick Baudisch. 2017. Providing Haptics to Walls & Heavy Objects in Virtual Reality by Means of Electrical Muscle Stimulation. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI ’17). Association for Computing Machinery, New York, NY, USA, 1471–1482. https://doi.org/10.1145/3025453.3025600
[36]
D.I. McCloskey. 1974. Muscular and cutaneous mechanisms in the estimation of the weights of grasped objects. Neuropsychologia 12, 4 (1974), 513–520. https://doi.org/10.1016/0028-3932(74)90081-5
[37]
M. Meehan, S. Razzaque, M.C. Whitton, and F.P. Brooks. 2003. Effect of latency on presence in stressful virtual environments. In IEEE Virtual Reality, 2003. Proceedings. IEEE, Los Angeles, 141–148. https://doi.org/10.1109/VR.2003.1191132
[38]
Sarah Messiah. 2013. Body Mass Index. In Encyclopedia of Behavioral Medicine, Marc D Gellman and J Rick Turner (Eds.). Springer New York, New York, NY, 247–249. https://doi.org/10.1007/978-1-4419-1005-9_729
[39]
Albert Ng, Michelle Annett, Paul Dietz, Anoop Gupta, and Walter F. Bischof. 2014. In the Blink of an Eye: Investigating Latency Perception during Stylus Interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Toronto, Ontario, Canada) (CHI ’14). Association for Computing Machinery, New York, NY, USA, 1103–1112. https://doi.org/10.1145/2556288.2557037
[40]
Jakob Nielsen. 1994. Usability Engineering. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.
[41]
Ryuma Niiyama, Lining Yao, and Hiroshi Ishii. 2014. Weight and Volume Changing Device with Liquid Metal Transfer. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction (TEI ’14). Association for Computing Machinery, New York, NY, USA, 49–52. https://doi.org/10.1145/2540930.2540953
[42]
Jean-Marie Normand, Elias Giannopoulos, Bernhard Spanlang, and Mel Slater. 2011. Multisensory stimulation can induce an illusion of larger belly size in immersive virtual reality. PloS one 6, 1 (2011), e16128. https://doi.org/10.1371/journal.pone.0016128
[43]
Ivelina V. Piryankova, Hong Yu Wong, Sally A. Linkenauger, Catherine Stinson, Matthew R. Longo, Heinrich H. Bülthoff, and Betty J. Mohler. 2014. Owning an overweight or underweight body: distinguishing the physical, experienced and virtual body. PloS one 9, 8 (2014), e103428. https://doi.org/10.1371/journal.pone.0103428
[44]
Myrthe A. Plaisier, Irene A. Kuling, Eli Brenner, and Jeroen B. J. Smeets. 2019. When Does One Decide How Heavy an Object Feels While Picking It Up? Psychological Science 30, 6 (2019), 822–829. https://doi.org/10.1177/0956797619837981 PMID: 30917092.
[45]
Michael Rietzler, Florian Geiselhart, Jan Gugenheimer, and Enrico Rukzio. 2018. Breaking the Tracking: Enabling Weight Perception Using Perceivable Tracking Offsets. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI ’18). Association for Computing Machinery, New York, NY, USA, 1–12. https://doi.org/10.1145/3173574.3173702
[46]
Daniel Roth and Marc Erich Latoschik. 2020. Construction of the Virtual Embodiment Questionnaire (VEQ). IEEE Transactions on Visualization and Computer Graphics 26, 12 (2020), 3546–3556. https://doi.org/10.1109/TVCG.2020.3023603
[47]
Richard Ryan and Edward Deci. 2000. Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being. American Psychologist 55, 1 (2000), 68–78. https://doi.org/10.1037/0003-066X.55.1.68
[48]
T. A. Ryan. 1940. Interrelations of sensory systems in perception. Psychological Bulletin 37 (1940), 659–698. https://doi.org/10.1037/h0060252
[49]
Majed Samad, Elia Gatti, Anne Hermes, Hrvoje Benko, and Cesare Parise. 2019. Pseudo-Haptic Weight: Changing the Perceived Weight of Virtual Objects By Manipulating Control-Display Ratio. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (Glasgow, Scotland Uk) (CHI ’19). Association for Computing Machinery, New York, NY, USA, 1–13. https://doi.org/10.1145/3290605.3300550
[50]
Natascha Sanders. 2022. Marco UP1-JR Reversible impeller pump 7.4 gpm – 28 l/min with on/off integrated switch (12 Volt). Retrieved January 19, 2023 from marco-pumps.shop/marco-up1-jr-reversible-impeller-pump-28-l-min-with-on-off-integrated-switch-12-volt-16201112
[51]
Erik Sikström, Amalia de Götzen, and Stefania Serafin. 2015. Self-characteristics and sound in immersive virtual reality — Estimating avatar weight from footstep sounds. In 2015 IEEE Virtual Reality (VR). IEEE, Arles, 283–284. https://doi.org/10.1109/VR.2015.7223406
[52]
Mel Slater, Daniel Pérez Marcos, Henrik Ehrsson, and Maria Sanchez-Vives. 2008. Towards a digital body: the virtual arm illusion. Frontiers in Human Neuroscience 2 (2008), 1–8. https://doi.org/10.3389/neuro.09.006.2008
[53]
Theophilus Teo, Fumihiko Nakamura, Maki Sugimoto, Adrien Verhulst, Gun A. Lee, Mark Billinghurst, and Matt Adcock. 2020. Feel It: Using Proprioceptive and Haptic Feedback for Interaction with Virtual Embodiment. In ACM SIGGRAPH 2020 Emerging Technologies (Virtual Event, USA) (SIGGRAPH ’20). Association for Computing Machinery, New York, NY, USA, Article 2, 2 pages. https://doi.org/10.1145/3388534.3407288
[54]
Anne Thaler, Michael N. Geuss, Simone C. Mölbert, Katrin E. Giel, Stephan Streuber, Javier Romero, Michael J. Black, and Betty J. Mohler. 2018. Body size estimation of self and others in females varying in BMI. PLOS ONE 13, 2 (02 2018), 1–24. https://doi.org/10.1371/journal.pone.0192152
[55]
Anne Thaler, Ivelina Piryankova, Jeanine K. Stefanucci, Sergi Pujades, Stephan de la Rosa, Stephan Streuber, Javier Romero, Michael J. Black, and Betty J. Mohler. 2018. Visual Perception and Evaluation of Photo-Realistic Self-Avatars From 3D Body Scans in Males and Females. Frontiers in ICT 5 (2018), 1–14. https://doi.org/10.3389/fict.2018.00018
[56]
Manos Tsakiris. 2010. My body in the brain: A neurocognitive model of body-ownership. Neuropsychologia 48, 3 (2010), 703–712. https://doi.org/10.1016/j.neuropsychologia.2009.09.034
[57]
Femke E van Beek, Raymond J King, Casey Brown, Massimiliano Di Luca, and Sean Keller. 2021. Static Weight Perception Through Skin Stretch and Kinesthetic Information: Detection Thresholds, JNDs, and PSEs. IEEE Transactions on Haptics 14, 1 (2021), 20–31. https://doi.org/10.1109/TOH.2020.3009599
[58]
Anurag Sai Vempati, Harshit Khurana, Vojtech Kabelka, Simon Flueckiger, Roland Siegwart, and Paul Beardsley. 2019. A Virtual Reality Interface for an Autonomous Spray Painting UAV. IEEE Robotics and Automation Letters 4, 3 (2019), 2870–2877. https://doi.org/10.1109/LRA.2019.2922588
[59]
Adrien Verhulst, Jean-Marie Normand, Cindy Lombart, Maki Sugimoto, and Guillaume Moreau. 2018. Influence of Being Embodied in an Obese Virtual Body on Shopping Behavior and Products Perception in VR. Frontiers in Robotics and AI 5 (2018), 1–20. https://doi.org/10.3389/frobt.2018.00113
[60]
Thomas Waltemate, Dominik Gall, Daniel Roth, Mario Botsch, and Marc Erich Latoschik. 2018. The Impact of Avatar Personalization and Immersion on Virtual Body Ownership, Presence, and Emotional Response. IEEE Transactions on Visualization and Computer Graphics 24, 4 (2018), 1643–1652. https://doi.org/10.1109/TVCG.2018.2794629
[61]
Xian Wang, Diego Monteiro, Lik-Hang Lee, Pan Hui, and Hai-Ning Liang. 2022. VibroWeight: Simulating Weight and Center of Gravity Changes of Objects in Virtual Reality for Enhanced Realism. In 2022 IEEE Haptics Symposium (HAPTICS). IEEE, Santa Barbara, 1–7. https://doi.org/10.1109/HAPTICS52432.2022.9765609
[62]
Robert B. Welch and David H. Warren. 1980. Immediate perceptual response to intersensory discrepancy. Psychological Bulletin 88, 3 (1980), 638–667.
[63]
Robin Welsch, Heiko Hecht, David R. Kolar, Michael Witthöft, and Tanja Legenbauer. 2020. Body image avoidance affects interpersonal distance perception: A virtual environment experiment. European Eating Disorders Review 28, 3 (2020), 282–295. https://doi.org/10.1002/erv.2715
[64]
Michael White, James Gain, Ulysse Vimont, and Daniel Lochner. 2019. The Case for Haptic Props: Shape, Weight and Vibro-Tactile Feedback. In Motion, Interaction and Games (Newcastle upon Tyne, United Kingdom) (MIG ’19). Association for Computing Machinery, New York, NY, USA, Article 7, 10 pages. https://doi.org/10.1145/3359566.3360058
[65]
Mieszko Wodzyński. 2020. Design of a Research and Training Platform for Operating Portable Chainsaws Using Virtual Reality Technology. Problems of Mechatronics Armament Aviation Safety Engineering 11 (12 2020), 83–92. https://doi.org/10.5604/01.3001.0014.5646
[66]
Erik Wolf, Nina Döllinger, David Mal, Carolin Wienrich, Mario Botsch, and Marc Erich Latoschik. 2020. Body Weight Perception of Females using Photorealistic Avatars in Virtual and Augmented Reality. In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, Porto de Galinhas, 462–473. https://doi.org/10.1109/ISMAR50242.2020.00071
[67]
Xupeng Ye. 2021. A Survey on Simulation for Weight Perception in Virtual Reality. Journal of Computer and Communications 9 (2021), 1–24. https://doi.org/10.4236/jcc.2021.99001
[68]
André Zenner and Antonio Krüger. 2017. Shifty: A Weight-Shifting Dynamic Passive Haptic Proxy to Enhance Object Perception in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 23, 4 (2017), 1285–1294. https://doi.org/10.1109/TVCG.2017.2656978

      Published In

      CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
      April 2023
      14911 pages
      ISBN: 9781450394215
      DOI: 10.1145/3544548
      This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 19 April 2023

      Badges

      • Honorable Mention

      Author Tags

      1. haptic controllers
      2. virtual embodiment
      3. virtual reality
      4. weight interface
      5. weight perception

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Acceptance Rates

      Overall Acceptance Rate 6,199 of 26,314 submissions, 24%
