My Research

    Usability Study of Fabric-Based Sensors as Interactive Input Devices in Augmented and Virtual Reality Experiences.

    Keywords: Fabric-Based Sensor, Capacitive Sensor, UI, Unity, AR and VR.

The Capacitive Touch Sensor (CTS)

    The fabric-based sensor I am currently using is the Capacitive Touch Sensor (CTS). It was created by Richard Vallett, an electrical engineering PhD student at Drexel University, as his master's project. He currently works at the Center for Functional Fabrics, which has some pretty cool projects, such as a haptic glove. Here is the link to the website: https://drexel.edu/functional-fabrics/research/projects/

    Here is the description of the sensor from the website: "The Capacitive Touch Sensor (CTS) is a gesture sensitive functional textile touch-pad interface for physical devices. The CTS is produced as a single piece of fabric requiring only two electrodes to connect it to a microcontroller. The CTS offers a solution for a flexible touch interface with consistent location detection, responsiveness, comfort and unobtrusiveness." https://drexel.edu/functional-fabrics/research/projects/capacitive-touch-sensors/

    It works very similarly to a touch pad: we simply touch the black areas with our fingers. Since it is made of fabric, it is very light and highly customizable, and it is easy to change the layout of the pattern (input keys) to create different user interfaces. In Figure 1, the pad works much like the keyboards we use every day. The red line indicates how the touch areas are connected to each other and how the pattern was knitted by the machine. The pad in Figure 2 works like a slider.
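To make the keyboard vs. slider idea concrete, here is a minimal sketch of how one analog reading could drive both layouts. This is my own illustration, not the sensor's actual firmware: it assumes the CTS (with its two electrodes) produces a single normalized reading whose value varies with touch location, and the function names are hypothetical.

```python
def reading_to_key(reading, num_keys, min_val=0.0, max_val=1.0):
    """Quantize a normalized sensor reading into a discrete key index,
    as in the keyboard-style pad (Figure 1)."""
    if not (min_val <= reading <= max_val):
        return None  # no touch, or reading out of range
    span = (max_val - min_val) / num_keys
    index = int((reading - min_val) / span)
    return min(index, num_keys - 1)  # clamp the top edge into the last key

def reading_to_slider(reading, min_val=0.0, max_val=1.0):
    """Map the same reading onto a continuous 0..1 position,
    as in the slider-style pad (Figure 2)."""
    return (reading - min_val) / (max_val - min_val)
```

The only difference between the two layouts, under this assumption, is whether the position is quantized into bins or kept continuous.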

                             Figure 1                                                                    Figure 2



    The second part of my research is AR/VR. AR/VR is good at displaying user interfaces and giving visual feedback to the user. I combined these two technologies, so when users touch the pad, they receive corresponding visual feedback from the AR/VR application.
    I use Unity to develop my project. Unity is a very powerful tool for developing applications. Vuforia is an add-on for Unity that is commonly used to develop AR/VR applications, and it has a very convenient feature: the image target. You simply upload an image of good enough quality for image recognition and place it into Unity's scene. Congratulations! You have a simple AR application.
    I made a demo using these technologies. In the demo video, I used a piece of cardboard (in the future, the sensor's pad itself will be the image target) as the placeholder for the image target in the real world, and used the sensor pad as the control tool. When the camera detects the cardboard, it displays a set of piano keys next to the cardboard; if the cardboard moves, the keys follow. When I touch the pad, it plays the piano sound and changes the color of one key. However, there are some problems. The biggest problem is the sensitivity of the sensor: the range of readings for a single button is fairly wide, so when I press a button, its reading may jump into another button's range and trigger that button's script. This problem can be solved in the future.
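One possible software-side fix for the jumping-button problem is debouncing: only accept a button change after it has been read several times in a row, so a one-sample jump into a neighboring button's range is ignored. Here is a minimal Python sketch of that idea (the actual demo is a Unity script; the class and its parameters are my own hypothetical names, and it assumes the microcontroller streams one detected button index per sample).

```python
class ButtonDebouncer:
    """Report a button change only after it has been seen `required`
    consecutive times, filtering out brief jumps into another
    button's reading range."""

    def __init__(self, required=3):
        self.required = required
        self.candidate = None   # button currently being confirmed
        self.count = 0          # consecutive samples of `candidate`
        self.active = None      # last confirmed button

    def update(self, button):
        # Count consecutive identical samples.
        if button == self.candidate:
            self.count += 1
        else:
            self.candidate = button
            self.count = 1
        # Promote the candidate once it is stable enough.
        if self.count >= self.required:
            self.active = self.candidate
        return self.active
```

With `required=3` and a sensor sampled at, say, 60 Hz, a single noisy sample cannot trigger the wrong key, at the cost of a few samples of input latency.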
    Here is the video of my project demo.

   
Related Works
   "Smarter Objects" is a very cool AR project done by MIT. It connects everyday electrical objects through the Internet and uses AR to display their UI (Figure 3 and Figure 4). The way the UI is displayed in AR is very similar to this project, especially at 0:37 of the second video.




                        Figure 3                                                                             Figure 4
             Smarter Objects Video 1                                                      Smarter Objects Video 2
   




