5 Conclusions

This paper presents a system for non-invasive remote control of a robotic hand using low-cost acquisition devices. The system recognizes human hand poses and sends them over the Internet to a robotic hand, which reproduces them in real time. The system does not require any tuning phase. Although further optimizations are still required, our approach shows high accuracy in discriminating even similar poses and achieves real-time performance.
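As a minimal illustration of the transmission step described above, the sketch below serializes a recognized pose and pushes it to a remote hand controller over a TCP socket. The host, port, JSON message layout, and the send_pose helper are assumptions introduced for this example only; they are not the protocol actually used by the system.

# Minimal sketch of the pose-transmission step (hypothetical names and
# message format, not the system's actual protocol): the recognized pose
# is serialized as JSON and sent to the hand controller over TCP.
import json
import socket

ROBOT_HOST = "192.168.1.50"   # assumed address of the robotic-hand controller
ROBOT_PORT = 9000             # assumed listening port

def send_pose(label, joint_angles):
    """Serialize a recognized hand pose and push it to the remote robotic hand."""
    message = json.dumps({"pose": label, "angles": joint_angles}) + "\n"
    with socket.create_connection((ROBOT_HOST, ROBOT_PORT), timeout=1.0) as sock:
        sock.sendall(message.encode("utf-8"))

# Example: transmit the pose recognized for the current frame.
send_pose("A", [0.0, 1.2, 1.3, 1.1, 0.9])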

Such a system can be useful in many different fields, for example human-machine interaction, and allows easy and intuitive interaction with 3D virtual environments.

The paper also presents an early set of experiments demonstrating the efficiency of the system. The preliminary results show that in more than 90% of the trials, signs are correctly sent over the network and subsequently recognized by the test subjects touching the robotic hand. Note that recognition errors made by the subjects do not penalize the validation, since they are not Tactile LIS experts. The system will be further evaluated in future experiments involving deaf-blind persons as well. Nevertheless, the experiments performed so far were very useful for a preliminary assessment of how the subjects felt when touching the haptic interface while performing the sign recognition task.


 