nbody: Non-Anthropomorphic Embodiment in Virtual Reality

Statement

If a person were to become a Mars rover in Virtual Reality, what kinds of interactions would the user develop when their body is constrained to a non-human form (a Mars rover)? Does the user identify as a human, as a rover, or as both? Can this perception of self change if the tools through which the user experiences the virtual world change?
The project investigates these notions through a series of design experiments and prototypes in which the user's physical body is mapped to a non-anthropomorphic avatar. Using physical immobilizers as tools for translating the physical body to the virtual, the project examines the relationship between the physical anthropomorphic form and the virtual non-anthropomorphic form.
Changing the physiology in the virtual environment by changing the tools of interaction exposes the contribution of these controllers to the embodiment of the human body in the virtual anatomy. This research investigates the ontological discourses around embodiment, presence, and self-definition with a non-anthropomorphic avatar in virtual environments.

Paper

1. Introduction

Virtual Reality, or Immersive Virtual Environments (IVE), allows the practice of representation to be used to its fullest ability. Avatars are traditionally understood as representing their human counterparts in virtual contexts by incorporating many aspects of a person's real-world physical characteristics within the virtual world. An alternate approach, in which avatars are instead imbued with non-human characteristics, challenges the limitations of purely anthropomorphic principles and expands the potential of avatars for virtual-world interaction and communication, especially in an Immersive Virtual Environment (IVE). This paper explores virtual reality controllers as a tool for non-anthropomorphic avatars in Immersive Virtual Reality (IVR), commonly known as Virtual Reality. It argues that these non-anthropomorphic forms in IVEs create new modes of communication that may point us in exciting directions for future research on transhumanistic interactions and behaviors. To encourage the communication and interaction behaviors we exhibit in the real world, avatars are usually created to resemble human body forms, with means to manifest familiar human behaviors. An avatar need not display only one's real-world characteristics: many people choose to engage in transformed social interaction by decoupling and augmenting their avatar-based self-representation and behavior[2]. This augmentation can be a physical augmentation of digital physiology, if the medium allows it. Traditionally, the term anthropomorphism has been used when animals or objects are imbued with human qualities, such as reasoning and emotion, so that we relate better to the creature or object[1]. Following this line of reasoning, if we look into non-anthropomorphic avatars we can find communicative behaviors that are non-human. Both an avatar's form and its interaction mechanisms depend on its affordances, the environment it exists in, and the tools through which we, as users, communicate with or control our avatars in the virtual realm. These tools are part of the medium that these avatars inhabit. In recent years, many controllers and wearables have been developed for virtual reality that map the human body to the virtual avatar. In Virtual Reality games and simulations, the focus on designing around human physical principles is increasingly complemented by various non-anthropomorphic avatars such as dragons, objects,
or tiny creatures that neither look nor act human[3]. Part of interacting with such a being may require role playing that focuses on new modalities and mores. We know that such role playing establishes a link between performance and identity inside the virtual environment, but how would identity change if these interactions were based on non-anthropomorphic principles that stretch human interactions into a non-human realm? Interacting with and inhabiting non-anthropomorphic avatars inside virtual reality may define new ways of being and interacting that go beyond human-centered principles. And what could be the emergent behaviors[3] of those interactions? For example, inside virtual reality, if the user is a table, what does it mean for the user when a glass of water is placed on them? How would they handle it? Do they use one of their four limbs to steady it, or do they simply wait for someone to place the glass on them? Does that mean they are working? Is this how work might be defined in a world where the user is a table? Such questions might help us go beyond human interactions and embrace the emerging modalities of interaction in these virtual worlds. Embodiment in virtual reality lends itself as the most promising form of media through which to explore non-anthropomorphic physiologies.
2. Use of non-anthropomorphic avatars in Digital Media
Computer graphics have long been used as a tool for personal representation and communication. Lucasfilm's Habitat (1986)[5], the first 2D online virtual world, allowed players to customize the design of their avatars. Players had the option to create non-human avatars, like animals and other non-human cartoon figures. When the game was released, many users mixed the physiologies of humans, animals, and objects to create unusual creatures. Traveler used low-fidelity 3D faces to represent the inhabitants of its virtual world. In this game too, players chose non-human avatars, like flamingos, cats, and other animals. The online world of LambdaMOO (1990) perhaps had the most flexibility in defining the avatar, as it was text based. It had an unlimited range of avatar types, since the intricately written descriptions, limited only by the power of the participant writer, were presented in a chat-based interface[3]. With developments in computer graphics and computing power, the computers of today can support 3D avatars with highly modifiable physiognomies: gender attributes, hair styles, and body shapes. The ability to create custom avatars at such a level of detail has allowed people to represent themselves in many virtual contexts. This flexibility in design has afforded not only more detailed avatars but also non-human identities with new physical affordances, kinematics, and gestures of interaction.
Second Life is a hugely popular virtual world that allows participants to create an extensive range of personal avatars[4]. Most people take great pride and pleasure in customizing their avatars in various ways. While most Second Life users customize their avatars to look like their real-world selves, others take advantage of the opportunity to create weird and absurd avatars from legend (dragons, vampires), mythology (devils, seraphim), science fiction (robots, aliens), or pure imagination (spaghetti monsters). One group of avatars falls into the category of "tinies," which, through clever manipulation of the underlying skeleton and Second Life forms, manifest a smaller size than would otherwise be possible.
Recently, researchers have developed different IVEs in which the participant controls a non-anthropomorphic form. One such example is Birdly[6], a multisensory virtual reality experience crafted by Max Rheiner and his team at the Zurich University of the Arts. The player lies atop a custom table that supports their chest, hips, legs, and arms. The player's head is left free to accommodate the Rift headset. The arms are placed across boards that act as wings and slotted underneath a bracket that allows the player to pull up on the wing; the start button is also positioned on the bracket. The simulation runs atop the Unity engine, with the Birdly rig itself appearing as a USB device exposed to the game. This project is a perfect example of how we can alter human physiology in virtual space by giving the user a designed tool to perform with.
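To make this kind of mapping concrete, here is a minimal sketch, in Python, of how raw readings from such a rig might be translated into an avatar's wing state. The read_rig() stub and all of its channel names are hypothetical stand-ins, not Birdly's actual USB interface, which is not documented here.

```python
# A minimal sketch of mapping physical rig input to a virtual avatar's wings.
# read_rig() is a hypothetical stand-in for the real Birdly USB interface;
# the channel names and value ranges here are assumptions, not an actual API.

from dataclasses import dataclass

@dataclass
class WingState:
    pitch: float   # forward/backward tilt of the wing, in degrees
    flap: float    # normalized flap position, 0.0 (down) to 1.0 (up)

def read_rig() -> dict:
    """Stub: a real system would poll the rig hardware over USB."""
    return {"left_board_angle": 12.0, "right_board_angle": 9.5,
            "left_lift": 0.4, "right_lift": 0.6}

def map_rig_to_wings(raw: dict) -> tuple[WingState, WingState]:
    """Translate raw board angles and lift readings into virtual wing states."""
    left = WingState(pitch=raw["left_board_angle"], flap=raw["left_lift"])
    right = WingState(pitch=raw["right_board_angle"], flap=raw["right_lift"])
    return left, right

if __name__ == "__main__":
    left, right = map_rig_to_wings(read_rig())
    print(f"left wing: {left}, right wing: {right}")
```

The point of the sketch is the separation of concerns: the rig reports raw physical quantities, and a mapping layer decides what those quantities mean for the virtual body.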
I imagine us using physical instruments that would help us augment our bodies in physical reality, improving our feeling of embodiment in virtual reality. These instruments would translate motion in physical reality to virtual reality. I imagine users would be able to add and subtract these instruments depending on their personal embodiment requirements. The following vignette shows how one might, in the future, use such instruments to embody a virtual avatar.
3. Vignette: Becoming a Rover.
Samuel is a design engineer at NASA's Jet Propulsion Laboratory. His job is to design a new Mars rover that will accompany astronauts to Mars on an upcoming mission. The rover has three essential functions for this mission. One, provide first-hand medical assistance to a fellow astronaut. Two, provide the functionality for breaking, digging, or doing reconnaissance of an area. Three, provide the functionality of a storage unit: the rover has to have the capacity and strength to store any and all evidence found during a survey task.
Samuel looks at his desk. There on the table is his virtual reality gear. The equipment includes a headset that shows the visuals and projects the audio of virtual Mars; a face-worn device that emits the different smells of Mars; and a wearable similar to a jacket which also resembles, by the looks of it, a solar panel. The other gadgets and wearables correspond functionally to the parts of a typical Mars rover. Samuel's job is to wear these devices and pretend to be a Mars rover on virtual Mars. Once he completely embodies the rover, he starts adding, subtracting, or replacing different parts on his body, which then define the structure of the rover.
Samuel had made a sketch of a mantis-like robot for this mission. It had four legs, two hands, a head structure, and a body that extended toward the back. Just like a mantis, this robot had a large abdomen, which would be used to store the evidence. Each of the four legs had two kinds of feet. One was a flat stepping foot, used for walking and, if need be, for stepping on a stone to crush it. The other was a rolling wheel, which would allow the rover to move to any position quickly. Both feet were hinged at the ankle and would snap onto the end of the leg, depending on which kind of foot was required. The hands of the rover had a claw-like structure at the end, with three fingers. One can imagine the claw as a human hand with only the index finger, middle finger, and thumb. The head of the mantis had two big fisheye-lens cameras providing a stereo image of the surroundings. At the top of the head, the rover had two sets of antennas that would receive and send radio signals to the HAB (the habitat where astronauts can stay comfortably without space suits). Between the abdomen and the metathorax of the mantis, where one would expect a mantis to have wings, the rover had solar panels shaped like mantis wings. Though the solar panels were wing-shaped, the rover could not fly.
To immerse completely into the virtual body and world, Samuel has to calibrate his body to that of the virtual self. The process of calibration allows him to be a rover and to recognize himself as the rover and not as Samuel; only then can he understand and feel like a rover and make proper judgments about the rover's function and design.
The calibration for getting into a virtual self starts with defining the "REST" state of the virtual body, in this case the rover. Once the "REST" state is set, the user has to position and adjust his body into the pose closest to the REST state of the virtual object. The REST state of the mantis rover was standing straight with its head looking forward, hands bent at the elbows, and legs apart, bent outward at the knees. Samuel stood up from his desk and went to the open space next to his table. He shaped his body into the REST state of the mantis and stood still. When the body is calibrated to another body, the process is meditative in nature. The user has to be convinced that the body he is inhabiting is not the one he had before calibration but the one he is in right now. Samuel stands in the position and meditates, thinking of himself as a mantis-like robot. In his meditation, he thinks about how he would move his legs and where he would feel his other two legs. He also imagines how his solar-panel wings are placed on his body. He imagines that when sun rays hit the solar-panel wings, it would feel similar to the warmth of sunlight falling on his skin. Similarly, he thinks about how the ground would feel under his stepping feet, how it would be different on wheels, and how he would have to adjust for the weight if anything were put into the storage unit at the end of his body. He could feel that the increase in weight at the end would put strain on his shoulders and on the virtual joints where the wings are attached, to compensate for the weight. Samuel has not yet worn any of the virtual reality controllers. He is simply meditating and imagining how it would feel to be a rover.
After half an hour of meditatively calibrating his body, Samuel starts to feel like the rover. The calibration time differs from user to user and depends on how long it takes them to convince themselves that they are in a different body, without any external input. Without opening his eyes, Samuel stretches out his hand to the desk and grabs the VR headset on top of it. The room, and his own position in the 3D space around him, is familiar; he does not need to open his eyes to locate the different VR instruments. He puts on the VR headset and opens his eyes. He can see the Martian surface and his two arms folded in front of him. As he looks around, he can see his all-white body with light-green accents at the joints and surface separations.
Samuel has a basic understanding of his newfound body because of the calibration process he went through. He will now use this understanding to actually place the different physical parts onto himself and give shape to the rover. After wearing the helmet, he picks up the arm controllers. The controllers were designed to match the physical shape of the mantis rover. They had a clay-like hand, worn like a glove over the human hand, and the arms of the controller mapped to the human arms, with elbows bending inward. As soon as Samuel wore the controllers, he could feel that the position of the rover claws relative to his body was not correct. He matched this experience against the one from calibration, and he sensed that the elbow joint on the rover had to bend outward instead of inward. The position in which Samuel had been meditating and calibrating was with the elbows bent outward: the arms positioned as if a person were holding a big cylindrical object from above, along its curvature, with hunched shoulders. Samuel inverted the joint lever of the controller, and now the arms bent outward. In the virtual world, Samuel now has two parts of his physiology, a head and a pair of arms. To place the other parts on the body, Samuel starts to move around as the rover would and tries to extrapolate the feeling to the function. He tries to stretch his body upward so that he can estimate how the back of the body should feel. He stretches like a werewolf howling at the moon in a stereotypical werewolf movie. As he does so, he can tell that his physical body is at an offset from his virtual body. He puts on the VR jacket and sets the elasticity of its spine and its height to match the offset to his physical body. The jacket also has haptic controllers on it that simulate the feeling of limbs coming out of the main body. The positioning of these sensors and motors is adjustable. Samuel now has to adjust his body in a way that makes him feel as if he has two extra limbs. In a human body, the leg attaches to the hip in a ball-and-socket joint. Samuel wanted to feel as if the same hip had two ball-and-socket joints, so he placed a sensor and a motor next to the psoas major muscle at the hip, on both the left and right sides of his waist. Now he has four legs in the virtual world. The controller on the VR jacket behaves in such a way that if he flexes the vastus lateralis muscle on the side of his thigh in an almost circular fashion, the system simulates the virtual leg moving: the circular flexing maps to a ball-and-socket rotation, which in turn performs a stepping action of the virtual leg. The sensor can distinguish when Samuel flexes the muscle from when he moves his real leg. Together, these let Samuel design the gait for the rover. When the rover is walking, at any given time one real leg and the diagonally opposite virtual leg are touching the ground. The left virtual leg pairs with the real right leg, and the real left leg pairs with the virtual right leg. For each step the rover takes, it lifts one pair and keeps the other pair on the ground; as one pair of legs touches the ground and the other lifts up, the rover completes a step. The sensor at the hip detects when Samuel raises his virtual leg, and the actuator gives haptic feedback at the hip that emulates the feeling of moving a limb. So even though there is no actual limb in physical reality, Samuel can feel one in virtual reality because of the haptic sensation. Now that his feeling of having four legs is convincing, he puts on the remaining leg controllers, which simulate haptic sensations when a virtual object touches a leg or when a foot steps on something.
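Stepping outside the vignette for a moment: the diagonal pairing Samuel designs can be captured in a few lines. The Python sketch below is purely illustrative of the gait logic described above; none of the names come from a real system.

```python
# A sketch of the diagonal gait described above: each real leg is paired
# with the diagonally opposite virtual leg, and the pairs alternate between
# touching the ground and lifting. All names here are illustrative.

REAL_LEFT, REAL_RIGHT = "real_left", "real_right"
VIRT_LEFT, VIRT_RIGHT = "virtual_left", "virtual_right"

# Diagonal pairing: real right with virtual left, real left with virtual right.
PAIRS = [(REAL_RIGHT, VIRT_LEFT), (REAL_LEFT, VIRT_RIGHT)]

def gait_cycle(steps: int):
    """Yield, for each step, which pair is planted and which pair lifts."""
    for step in range(steps):
        grounded = PAIRS[step % 2]       # one pair stays on the ground...
        lifted = PAIRS[(step + 1) % 2]   # ...while the other pair lifts
        yield step, grounded, lifted

for step, grounded, lifted in gait_cycle(4):
    print(f"step {step}: grounded={grounded}, lifted={lifted}")
```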
Samuel moves around in the virtual world using his virtual limbs and claws. He imagines himself as this robot and how it would move, and as he moves, he identifies more and more as the rover. His body is still not complete. He puts on the storage-unit controller. It goes on the back of the VR jacket and extends backward like a tailcoat, except that it is solid and does not flow like cloth. Samuel can feel it on his back, especially because, the way he is standing, the majority of the weight falls on his lower vertebrae. It has been a while since he put on a controller with a tail-like feature, but because of the calibration he is able to imagine how it would feel to have a large body stretching out from the end of his spine. Samuel adjusts the sensors and actuators on the controller so that he can feel the storage unit the way he feels his virtual legs. He moves in a tail-swiping fashion to feel exactly how the sensation of the tail should be. He dials in the right values for the controller's settings and moves around the virtual Martian surface. Finally, it feels right; now he is a complete rover.
A 3D rendering of the rover from a third-person perspective was showing on a separate screen, recording all the kinematics of the controllers mapped onto the virtual rover. The computer recorded the body movements Samuel made while figuring out where each body part should be and what its sensitivity should be. There was a kind of choreography to this rover's movement. The computer created a .bvh (Biovision Hierarchy) file containing the bone structure of the rover and the various animations Samuel performed as he was creating it. The animation looked similar to the 3D rendering of a motion-capture creature in the behind-the-scenes footage of a Peter Jackson movie. The computer prompted Samuel: "File name?" Samuel responded: "Mantis Dance." The computer saved the file and distributed it to all the program managers at NASA. The Mantis Dance is now the visual grammar used whenever anyone refers to the rover. These performative references are used at NASA whenever they decide what the rover should do and how it should do it.
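As an aside, BVH is a real, plain-text motion-capture format: a HIERARCHY section defines the skeleton, and a MOTION section lists per-frame channel values. The sketch below writes a minimal file of this kind; the joint names and numbers are invented for illustration, not taken from any actual rover rig.

```python
# A minimal, illustrative BVH (Biovision Hierarchy) file written from Python.
# The HIERARCHY section defines the skeleton; the MOTION section lists the
# per-frame channel values. The joints and values below are invented.

bvh = """\
HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0.0 10.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 5.0 0.0
        }
    }
}
MOTION
Frames: 2
Frame Time: 0.0333333
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 1.0 0.0 0.0 0.0 0.0 5.0 0.0 0.0
"""

with open("mantis_dance.bvh", "w") as f:
    f.write(bvh)  # 9 values per frame: 6 root channels + 3 spine channels
```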
fin.
4. Presence and embodiment in VR
Researchers in Human-Computer Interaction use the principle of presence to explain human recognition of, and relations to, other beings in the virtual world. When the participant feels presence, the role of the computer in presenting the virtual world is no longer noticed, and the experience feels authentic and natural.
Progressive embodiment in virtual environments concerns VR interfaces that embody the user in virtual reality[7]. Biocca's article provides an example of the technological construction of the self in virtual reality. The goal, or effect, of progressive embodiment is the sensation of physical, social, and self-presence in the virtual environment. Progressive embodiment is defined as "advancing toward immersion of the sensorimotor channels to computer interfaces through a tighter coupling of the body to interface sensors and displays"[8]. Biocca argues that the body is at the center of all communication; new communication media engage the body in new ways. Progressive embodiment involves the human body as an information channel, an expressive communication device, and a simulator for the mind. The body represented by the interface design (the graphical image) is referred to as "the avatar of the user." Immersive VR interfaces define the shape and boundaries of the avatar in the virtual world. The avatar and the virtual world are perceptual illusions generated by a head-mounted display.
The psychological effect of progressive embodiment is presence, which users describe as a "compelling sense of being in a mediated space"[7]. Biocca argues that presence in virtual reality exemplifies the desire to use media as a means of physical transcendence. Presence in the virtual environment is unstable and changing; users mentally oscillate between the physical, virtual, and imaginal environments. A shape, behavior, or sensory experience indicates the presence of another intelligence in the virtual world. The intelligence may be human or non-human, friend or alien. Self-presence is the user's mental image or model of the self in the virtual world[8]. Besides the graphic representation of the self, the user's internal self is represented in virtual reality as models of the self's body and identity. Biocca sees progressive embodiment in advanced VR technologies (headset, data gloves, and bodysuit systems) as a form of cyborg coupling. The user becomes a cyborg when the user's body is coupled with technological sensors and displays, which Biocca calls the cyborg's dilemma: the more natural the VR interface, the more it adapts to the user and the user to it, so that the user becomes unnatural, a cyborg.
The relationship between the body and experience is direct and immediate, even entwined. Our body becomes the vehicle for sensory experience, that body which has itself been formed by experience. The body shapes who we become by compelling our neurons to form their intricate and scintillating patterns of connectivity. Experience affects how we think, feel, and understand our place in the external world, and it does this by forming the mind by which we make sense of it.
Experience discloses beneath objective space, in which the body eventually finds its place, a primitive spatiality of which experience is merely the outer covering and which merges with the body's very being[9]. To be a body is to be tied to a certain world. An early yet pivotal example is the Placeholder project, created in the early 1990s by Brenda Laurel, Rachel Strickland[10], and their team, which is arguably one of the most embodied virtual experiences ever made.
Placeholder directly recalls Donna Haraway’s notion of our relationship to other gendered creatures[11]. In Placeholder you are embodied, but not as a human being. You take on the persona and characteristics of one of four totemic animals: spider, crow, snake or fish, performing from their point of view, speaking in their voice, seeing with their eyes and even leaving messages in the virtual world for others to find.
VR critics have described how participants enter the world of the virtual and leave their bodies 'behind.' I believe that participants do not leave their bodies behind, even though to a bystander or spectator the physical body may seem to be a form of shed detritus in the room. The body of the participant is synchronously subsumed into the virtual self that enters the world within the screen, a self created in the mind from what the body experiences[12].
At present, VR controllers are in their infancy. To achieve immersion in the environment, the first attempts were to design these controllers for humans, with an anthropomorphic design philosophy. The same reasoning applies to the use of virtual human bodies in VR systems: cognitively, we can immerse ourselves in environments more easily if the virtual point of view is similar to that of our physical self.
I imagine that in the future, VR controllers would not only map the human body onto a human body in the virtual world but also allow the user to embody a physiology different from a human's. The controllers need a sensor to sense the physiology and an actuator system to give feedback to the user when the sensor is activated. The form of the controller can vary depending on the use case it has to address. Taking an example from the vignette above, one can imagine that the controllers Samuel used were made specifically for designing a rover. In the future, if 3D printing becomes as ubiquitous as 2D printing is today, users might 3D print a specific controller for a particular environment.
We can also imagine these controllers having modular properties: parts that can be physically attached to or detached from the controller, each adding specific functionality. Having the controller in a modular form would allow us to map different functionality to different body parts. It would also give users the freedom to experiment with the local embodiment of their physiology. For example, in the vignette above, if Samuel wanted to keep the same proprioception (the sensed position of parts relative to the body) of the hand but feel more metallic, that is, for the hands to feel like touching a metal surface, he could have attached a metal-property controller to the hands that imitated the sensory behavior of a metal surface.
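A minimal sketch of this modular idea, in Python, might look like the following. The classes and method names are invented for illustration; no real SDK is being described.

```python
# A sketch (hypothetical API) of composing modular controller parts:
# a sensing/actuating armature plus attachable property modules, as
# described above. None of these classes exist in any real SDK.

class PropertyModule:
    def __init__(self, name: str):
        self.name = name

    def render_sensation(self, contact_force: float) -> str:
        return f"{self.name} sensation at force {contact_force:.1f}"

class ArmatureController:
    """A body-mounted controller that senses motion and plays back haptics."""
    def __init__(self, body_part: str):
        self.body_part = body_part
        self.modules: list[PropertyModule] = []

    def attach(self, module: PropertyModule):
        self.modules.append(module)

    def detach(self, module: PropertyModule):
        self.modules.remove(module)

    def on_virtual_contact(self, contact_force: float):
        # Each attached property module shapes how the contact feels.
        for m in self.modules:
            print(f"[{self.body_part}] {m.render_sensation(contact_force)}")

hand = ArmatureController("right_hand")
hand.attach(PropertyModule("metal-surface"))
hand.on_virtual_contact(2.5)   # the hand now "feels" like touching metal
```

The design choice worth noting is that the armature knows nothing about metal or any other material; properties are composed onto it, so the same body part can feel like metal, clay, or anything else by swapping modules.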
To explore different iterations and use cases for these controllers, I present a speculative case study of future controllers that a user might work with in a virtual environment. I imagine these controllers being able to map the human body and also to induce or aid the user's sense of embodiment. At the same time, I envision that these controllers would allow users to change their virtual physiology in the virtual world.
5. Speculative VR Controller Case Study
The Nimtom VR controllers are the latest addition to the virtual reality controller market. They introduce new mechanisms and sensors that increase embodiment in the virtual body you inhabit. The controllers are provided as a premade bundle, and you can also download the circuits and 3D-model blueprints to custom 3D print them at home, for $3.99 a blueprint. This case study is divided into two parts: form and design, and setup and usage.
5.1 Form and Design.
The Nimtoms, or "toms" as they are popularly known, come in three functional categories: actions, properties, and sensations. The form of these controllers varies depending on what part of the body you wish to attach them to. In the purchased bundle you get two sets of armatures, eight sets of soft-body controllers that you can mold into any shape, and four sets of adapters, which you use to join one or more controllers to another controller.
The armature controllers are solid. Their shape is similar to a human bone with muscle on top, almost teardrop-like. These controllers are to be put on the arms, legs, neck, and other body structures that protrude in some form and have some degree of motion. They carry muscle sensors, heat sensors, and dielectric sensors, along with muscle actuators, heat dissipators, and haptic actuators. Together, these controllers can give the user the sensation of an organ or limb different from a human's. The user can program and design the corresponding virtual appendage for each controller in the provided, easy-to-use SDK from Nimtoms. The soft-body controllers work as attachments on the armatures and also stand alone. They are primarily used as flesh cover over a body part; one way to imagine them is as the silicone-rubber prosthetics one sees in the movies. The material of the soft-body controller is similar to clay. One can hand-mold it into any shape or form one likes; it is soft enough to take any shape, but at the same time it does not break easily. Underneath the soft-body material is a solid structure, similar to an armature, with the same sensors and actuators. The soft body also allows users to create custom forms for their creations inside the virtual world. Once inside the virtual world, the user can open the SDK in VR and hand-mold the soft body in physical reality; depending on how you have programmed the mapping from the soft body to the virtual appendage, its form in the virtual world changes as you change its shape in the real world. The adapters are not controllers by themselves; they are modules for attaching other controllers together. One can use a given action armature with a properties soft body. This gives the user a great deal of freedom in designing and defining the kind of physiology they would like to embody and create.
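As a sketch of what such a soft-body-to-appendage mapping might look like (the Nimtom SDK is fictional, and every function below is invented for illustration):

```python
# A sketch of mapping a hand-molded soft-body controller onto a virtual
# appendage. The "Nimtom SDK" is fictional; every function and name here
# is invented purely to illustrate the mapping idea described above.

Point = tuple[float, float, float]

def scan_soft_body() -> list[Point]:
    """Stub: a real controller would report the molded shape's control points."""
    return [(0.00, 0.00, 0.00), (0.00, 0.10, 0.02), (0.00, 0.20, 0.05)]

def map_to_virtual_appendage(points: list[Point], scale: float = 3.0) -> list[Point]:
    """Scale physical control points up into the virtual appendage's mesh space."""
    return [(x * scale, y * scale, z * scale) for (x, y, z) in points]

# Re-molding the clay re-runs the scan, so the virtual form tracks
# the physical form in (near) real time.
virtual_shape = map_to_virtual_appendage(scan_soft_body())
print(virtual_shape)
```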
5.2 Setup and Usage.
The important difference between the Nimtoms and other controllers is the calibration process. Where other products have the user calibrate through meditation without wearing any controllers, the Nimtoms encourage users to wear the controllers while calibrating to a virtual physiology. Their argument is that the controllers are designed so that, instead of distracting, they let the user enter a state of cognitive flow, which allows faster calibration. To calibrate, first attach all the controllers to the body and take the REST stance. Then, as you begin to imagine yourself in the virtual self, remove controllers or change their settings. Departing from popular practice, the Nimtoms encourage a subtractive approach, in which you remove the parts you don't need rather than adding the parts you do. On their website, they claim that in their research, removing controllers helped users calibrate faster than adding controllers to the body.
The controllers are divided not only by form; they also have different functions. Each type, whether armature, soft body, or adapter, has a function relating to actions, properties, or sensations. Action toms are controllers that can perform volumetric and mechanical actions. If you attach the attachment action tom, you can attract or attach other virtual objects to it. An example would be using it as a hand tool in VR: once you attach the controller to your body, you can use its virtual representation to hold things. As the action of this controller is to simulate tenacious behavior, it can be used in various ways. Some user-created designs shared in the online community use the attachment controller as a source of attractive force, similar to a magnet, to create a hand that works like a virtual electric motor. When that user had to hold something, he would rotate his virtual hand, which would create a rotating, magnetic-field-like force, and anything made of virtual metal would start orbiting the hand. Another creative example was the use of a property controller with the mirror property. The mirror property simply copies the simulated physical properties of whatever it interacts with in the virtual world. In one user-submitted story, the design used these controllers to create dynamic embodiment: the whole virtual self was built from them, and whatever the user saw or interacted with in the virtual world, he would embody it immediately.
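A toy sketch of that magnet-like attraction behavior, with deliberately simplified physics and illustrative names only:

```python
# A toy sketch of the magnet-like attraction behavior described above:
# virtual metal objects are pulled toward (and end up orbiting) the
# rotating hand. The physics is deliberately simplified and illustrative.

import math

def attract(obj_pos, hand_pos, strength=1.0, dt=0.016):
    """Step an object's 2D position toward the hand with inverse-square pull."""
    dx, dy = hand_pos[0] - obj_pos[0], hand_pos[1] - obj_pos[1]
    dist = math.hypot(dx, dy) or 1e-6          # avoid division by zero
    pull = strength / dist**2
    return (obj_pos[0] + dx / dist * pull * dt,
            obj_pos[1] + dy / dist * pull * dt)

obj = (1.0, 0.0)
for _ in range(3):
    obj = attract(obj, hand_pos=(0.0, 0.0))
    print(obj)   # the virtual-metal object creeps toward the hand each frame
```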
All controller types are open to the public for creating more types of interactions and virtual behaviors. In the Asset Store, one can find many such user-submitted controller designs, some free and some at a premium.
I think it is safe to say the Nimtom controllers are among the most progressive technologies out there, pushing the limits of VR creation through embodiment. Unlike other systems, they have an open system for creating controllers, which does lead to quality-control issues in the market. Embodying a virtual body affects the human sense of proprioception and the kinematics of the body.
The performance of these controllers is seamless, and setting their sensitivity after calibration is easy. The variety of controllers and functions sometimes adds confusion about which one to choose and when, but with some use it is easy to get accustomed to them.
fin.
Through this case study, my attempt was to show how VR controllers can enable us to embody another physiology and, at the same time, make that physiology. The VR controllers of the future will be designed with the understanding that the person using them is not just spectating a new physiology but actively creating it.
The body is also an expressive device[13], a social semiotic vehicle for representing mental states (e.g., emotions, observations, plans, and so on) to others.
If the body is the primal communication hardware, a simulator for the mind, what is its relationship to media made of steel, plastic, or silicon? Rather than pulsing blood, pulses of electrons and light animate these media[14]. McLuhan long ago pointed out that modern communication interfaces attach themselves to the body; in McLuhan's words, "media are extensions of the faculties."
McLuhan's vision of media environments is a somewhat different vision from the one advanced by Licklider[15] in his article on "man-computer symbiosis." For him, "man-computer symbiosis" is a subclass of "man-machine systems." The computer was not to be treated like other machines, since it was "intelligent." The disembodied human mind would be coupled to a machine brain rather than to cognitive environments.
The evolution of these devices (VR controllers) is the evolution of the progressive coupling of sensors and display devices to the body[16]. The vision of such a system anticipates applications in which the body of the user is to be fully immersed in the interface, and the mind is set floating in the telecommunication system, in cyberspace. Like a body entering a sink, a shower, or a pool, communication demands and settings will determine how much the body should be submerged in the electric-cool waters of cyberspace.
Virtual reality technology has the unique feature of enabling its users to embody a different physiology. I believe this feature is a dramatically underused aspect of the technology. The ability to inhabit another body gives us an altogether new playground to explore[17]. Traditionally, as makers, whether designers or engineers, we have been constrained to create from a third-person perspective. Because of this constraint, we create through a trial-and-error process, in which we make something and then test it. This process has never been in question because of the physical limitations of the human body and the physicality of the reality we live in. Through virtual reality, we can embody the object itself[18] and create it while embodying it. This phenomenon opens up a whole new way of creating and designing. Being embodied in a body other than the self can enable us to make objects with a sense of animism in them. Having the object be sensed as it is being created would enable us as creators to build in the subtle attributes and affordances that, once the object is made, would connect better with a human actor.
Embodiment as a medium has already been used extensively in the motion-capture industry[19]. Instead of creating digital animations of creatures and other animated entities by hand, actors are put into a digital sensing room that records all of the actor's body motion, including facial features. The challenge lies in the performance of the actor, who is imagining himself or herself as a digital creature. In the Planet of the Apes films[20], the actor Andy Serkis plays the role of a chimpanzee. To give a convincing performance that the other actors could react to, he was given special props to help match his body proportions to those of a chimp. Serkis used these props to walk and jump like a chimpanzee, and the feeling of embodying this digital character allowed him to imagine how a chimp would show emotion on screen.
6. Conclusion
Through this discussion and these speculative descriptions, I hope I have been able to show the unique opportunity we have in Virtual Reality. VR is a medium that has only been around since the 1980s, and it does not yet have the full language, standards, and conventions that more traditional mediums like film enjoy. There have been many representations of the human form as human, but by creating non-human representations we have opened a Pandora's box. I believe that through this research we can push forward toward becoming better transhumans. A variety of questions are yet to be explored, and it is clear that more work needs to be done studying interactions that go beyond the human form in virtual realities.
References
[1] Ihde, Don. Bodies in Technology. Vol. 5. U of Minnesota Press, 2002.
[2] Bailenson, Jeremy N., and Andrew C. Beall. "Transformed social interaction: Exploring the digital plasticity of avatars." Avatars at Work and Play. Springer Netherlands, 2006. 1-16.
[3] Morie, Jacquelyn Ford, and Gustav Verhulsdonck. "Body/persona/action!: Emerging non-anthropomorphic communication and interaction in virtual worlds." Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology. ACM, 2008.
[4] Kaplan, Andreas M., and Michael Haenlein. "The fairyland of Second Life: Virtual social worlds and how to use them." Business Horizons 52.6 (2009): 563-572.
[5] Morningstar, Chip, and F. Randall Farmer. "The Lessons of Lucasfilm's Habitat." Journal For Virtual Worlds Research 1.1 (2008).
[6] Rheiner, Max. "Birdly an attempt to fly." ACM SIGGRAPH 2014 Emerging Technologies. ACM, 2014.
[7] Biocca, Frank. "The Cyborg's Dilemma: Progressive Embodiment in Virtual Environments." Journal of Computer-Mediated Communication 3.2 (1997).
[8] Lee, Kwan Min. "Presence explicated." Communication Theory 14.1 (2004): 27-50.
[9] Morie, Jacquelyn Ford. "Performing in (virtual) spaces: Embodiment and being in virtual environments." International Journal of Performance Arts and Digital Media 3.2-3 (2007): 123-138.
[10] Laurel, Brenda, Rachel Strickland, and Rob Tow. "Placeholder: Landscape and Narrative in virtual environments." ACM SIGGRAPH Computer Graphics 28.2 (1994): 118-126.
[11] Haraway, Donna Jeanne. A manifesto for cyborgs: Science, technology, and socialist feminism in the 1980s. Center for Social Research and Education, 1985.
[12] Doyle, Denise. "The body of the avatar: rethinking the mind-body relationship in virtual worlds." Journal of Gaming & Virtual Worlds 1.2 (2009): 131-141.
[13] Polhemus, Ted, and Jonathan Benthall. "The body as a medium of expression." J. Benthall and J. Polhemus, eds. (1975): 13-35.
[14] McLuhan, Marshall. "Effects of the improvements of communication media." The Journal of Economic History 20.4 (1960): 566-575.
[15] Licklider, Joseph CR. "Man-computer symbiosis." IRE transactions on human factors in electronics 1 (1960): 4-11.
[16] Mayernik, Matthew S., Jillian C. Wallis, and Christine L. Borgman. "Unearthing the infrastructure: Humans and sensors in field-based scientific research." Computer Supported Cooperative Work (CSCW) 22.1 (2013): 65-101.
[17] Ryan, Marie-Laure. Narrative as Virtual Reality 2: Revisiting Immersion and Interactivity in Literature and Electronic Media. JHU Press, 2015.
[18] Redström, Johan. "Towards user design? On the shift from object to user as the subject of design." Design Studies 27.2 (2006): 123-139.
[19] Beier, Klaus-Peter, et al. "Method, system and computer program product for automatically creating an animated 3-D scenario from human position and path data." U.S. Patent Application No. 10/408,884.
[20] Wells, Paul. "Where the rubber hits the road: The illusion of animation." Animation Practice, Process & Production 1.2 (2012): 197-207.

Shiveesh Fotedar