Following a 1991 study showing that kinesthetic information affects visual processing when moving an arm in extrapersonal space (Helms Tillery et al.) [1], this research suggests that utilizing virtual-reality (VR) technology will lead to more accurate and faster data acquisition. Previous methods for such research used ultrasonic systems, pairing ultrasound emitters with microphones to compute position from the travel time of sound. This approach made the experimentation process long and the spatial data difficult to synthesize. The purpose of this paper is to show the progress I have made toward capturing spatial data using VR technology to enhance previous research in the field of neuroscience. The experimental setup used the Oculus Quest 2 VR headset and its hand controllers. The experiment simulation was created with the Unity game engine, which was used to build an interactive 3D VR world for the Oculus. The simulation allows the user to interact with a ball in the VR environment without seeing their own body. It can be used in combination with real-time motion-capture cameras to capture live spatial data of the user during trials, though spatial data from the VR environment itself has not yet been collected.
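The spatial-data capture described above amounts to logging timestamped 3D positions each frame for later analysis. As a minimal illustration only (the thesis itself does not include code, and in the actual setup this logic would live in a Unity C# script reading the Oculus controller transforms), the logging step can be sketched in Python; the function names, CSV layout, and sample positions here are hypothetical:

```python
import csv
import time

def log_sample(writer, position):
    """Append one timestamped (x, y, z) sample to an open CSV writer."""
    x, y, z = position
    writer.writerow([time.time(), x, y, z])

def record_trial(path, samples):
    """Write one trial's worth of position samples to a CSV file.

    `samples` stands in for a live tracking source (e.g. controller
    positions polled once per frame); here it is just a list of tuples.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "x", "y", "z"])  # header: timestamp + 3D position
        for pos in samples:
            log_sample(writer, pos)

# Example: three fabricated positions standing in for live tracking data
record_trial("trial01.csv", [(0.0, 1.2, 0.5), (0.1, 1.2, 0.6), (0.2, 1.3, 0.6)])
```

A per-frame log of this shape is also what would let the VR-side data be aligned with the motion-capture cameras' output, since both streams share a common time axis.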
Details
- A Starting Guide to Capturing Visual & Kinesthetic Information from Virtual-Reality Technology
- Syed, Anisa (Author)
- Helms-Tillery, Stephen (Thesis director)
- Tanner, Justin (Committee member)
- Barrett, The Honors College (Contributor)
- Harrington Bioengineering Program (Contributor)