SA-SEMNET: A NETWORK FUSED TACTILE INFORMATION AND SEMANTIC ATTRIBUTES FOR OBJECT RECOGNITION, 383-393.

Shengjie Qiu, Baojiang Li, Haiyan Ye, and Haiyan Wang

References

[1] M. Bauza, A. Bronars, and A. Rodriguez, Tac2Pose: Tactile object pose estimation from the first touch, The International Journal of Robotics Research, 42(13), 2023, 1185–1209.
[2] C. Wang, C. Liu, F. Shang, S. Niu, L. Ke, N. Zhang, B. Ma, et al., Tactile sensing technology in bionic skin: A review, Biosensors and Bioelectronics, 220, 2023, 114882.
[3] Z. Wang, D. Zhang, L. Yang, and J. An, Three-dimensional force detection using PVDF and room temperature-vulcanized silicone rubber layers, Measurement Science and Technology, 34(4), 2023, 045111.
[4] Y. Xu, S. Zhang, S. Li, Z. Wu, Y. Li, Z. Li, X. Chen, C. Shi, P. Chen, P. Zhang, M.D. Dickey, and B. Su, A soft magnetoelectric finger for robots’ multidirectional tactile perception in non-visual recognition environments, NPJ Flexible Electronics, 8(1), 2024, 2.
[5] N. Li, Z. Yin, W. Zhang, C. Xing, T. Peng, B. Meng, J. Yang, and Z. Peng, A triboelectric-inductive hybrid tactile sensor for highly accurate object recognition, Nano Energy, 96, 2022, 107063.
[6] S.-K. Yeh and W. Fang, Molding/encapsulation/integration approach for tactile-bump and sensing-interface of inductive tactile sensor, in Proceeding 20th International Conf. on Solid-State Sensors, Actuators and Microsystems & Eurosensors XXXIII, Berlin, 2019, 285–288.
[7] M. Zhao, Y. Huang, H. Zhang, N. Guo, and Y. Zheng, Micro-force sensing techniques and traceable reference forces: A review, Measurement Science and Technology, 33(11), 2022, 114010.
[8] W. Yuan, S. Dong, and E.H. Adelson, GelSight: High-resolution robot tactile sensors for estimating geometry and force, Sensors, 17(12), 2017, 2762.
[9] L. Cao, F. Sun, X. Liu, W. Huang, R. Kotagiri, and H. Li, End-to-end ConvNet for tactile recognition using residual orthogonal tiling and pyramid convolution ensemble, Cognitive Computation, 10, 2018, 718–736.
[10] J.M. Gandarias, A.J. Garcia-Cerezo, and J.M. Gomez-de-Gabriel, CNN-based methods for object recognition with high-resolution tactile sensors, IEEE Sensors Journal, 19(16), 2019, 6872–6882.
[11] C. Guo, K. Huang, Y. Luo, H. Zhang, and W. Zuo, Object-oriented semantic mapping and dynamic optimization on a mobile robot, International Journal of Robotics and Automation, 37(4), 2022, 321–331.
[12] Y. Zhang, Y. Zhu, H. Hu, and H. Wang, Automatic hyperspectral image classification based on deep feature fusion network, International Journal of Robotics and Automation, 36(5), 2021, 363–375.
[13] S. Woo, J. Park, J.-Y. Lee, and I.S. Kweon, CBAM: Convolutional block attention module, in Proceeding European Conf. on Computer Vision (ECCV), Cham, 2018, 3–19.
[14] H. Chen, A. Gallagher, and B. Girod, Describing clothing by semantic attributes, in Proceeding 12th European Conf. on Computer Vision (ECCV), Florence, 2012.
[15] J.M. Gandarias, J.M. Gómez-de-Gabriel, and A.J. García-Cerezo, Tactile sensing and machine learning for human and object recognition in disaster scenarios, in Proceeding 3rd Iberian Robotics Conf., Cham, 2017, 165–175.
[16] Q.T. Lai, Z.H. Zhao, Q.J. Sun, Z. Tang, X.G. Tang, and V.A.L. Roy, Emerging MXene-based flexible tactile sensors for health monitoring and haptic perception, Small, 19(27), 2023, 2300283.
[17] X. Zhi, S. Ma, Y. Xia, B. Yang, S. Zhang, K. Liu, M. Li, S. Li, W. Peiyuan, and X. Wang, Hybrid tactile sensor array for pressure sensing and tactile pattern recognition, Nano Energy, 125, 2024, 109532.
[18] X.A. Nguyen and S. Chauhan, Characterization of flexible and stretchable sensors using neural networks, Measurement Science and Technology, 32(7), 2021, 075004.
[19] S. Funabashi, G. Yan, F. Hongyi, A. Schmitz, L. Jamone, T. Ogata, and S. Sugano, Tactile transfer learning and object recognition with a multifingered hand using morphology specific convolutional neural networks, IEEE Transactions on Neural Networks and Learning Systems, 35(6), 2024, 7587–7601.
[20] S. Sundaram, P. Kellnhofer, Y. Li, J.-Y. Zhu, A. Torralba, and W. Matusik, Learning the signatures of the human grasp using a scalable tactile glove, Nature, 569(7758), 2019, 698–702.
[21] J. Bai, B. Li, H. Wang, and Y. Guo, Tactile perception information recognition of prosthetic hand based on DNN-LSTM, IEEE Transactions on Instrumentation and Measurement, 71, 2022, 1–10.
[22] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, L. Kaiser, and I. Polosukhin, Attention is all you need, in Proceeding 31st Conf. on Neural Information Processing Systems, Long Beach, CA, 2017, 1–11.
[23] H. Liu, S. Ren, D. Ren, and X. Liu, Automatic extraction of orchards from remote sensing image based on category attention mechanism, International Journal of Robotics and Automation, 37(1), 2022, 20–28.
[24] G. Cao, Y. Zhou, D. Bollegala, and S. Luo, Spatio-temporal attention model for tactile texture recognition, in Proceeding IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Las Vegas, NV, 2020, 9896–9902.
[25] Q. Wang, B. Wu, P. Zhu, P. Li, W. Zuo, and Q. Hu, ECA-Net: Efficient channel attention for deep convolutional neural networks, in Proceeding IEEE/CVF Conf. on Computer Vision and Pattern Recognition, Seattle, WA, 2020, 11534–11542.
[26] B. Fang, X. Long, F. Sun, H. Liu, S. Zhang, and C. Fang, Tactile-based fabric defect detection using convolutional neural network with attention mechanism, IEEE Transactions on Instrumentation and Measurement, 71, 2022, 1–9.
[27] R. Shi, S. Yang, Y. Chen, R. Wang, M. Zhang, J. Lu, and Y. Cao, CNN-transformer for visual-tactile fusion applied in road recognition of autonomous vehicles, Pattern Recognition Letters, 166, 2023, 200–208.
[28] F. Wei, J. Zhao, C. Shan, and Z. Yuan, Alignment and multi-scale fusion for visual-tactile object recognition, in Proceeding International Joint Conf. on Neural Networks (IJCNN), Padua, 2022, 1–8.
[29] Z. Abderrahmane, G. Ganesh, A. Crosnier, and A. Cherubini, A deep learning framework for tactile recognition of known as well as novel objects, IEEE Transactions on Industrial Informatics, 16(1), 2019, 423–432.
[30] A. Creswell, T. White, V. Dumoulin, K. Arulkumaran, B. Sengupta, and A.A. Bharath, Generative adversarial networks: An overview, IEEE Signal Processing Magazine, 35(1), 2018, 53–65.
[31] Z. Lin, S. Yu, Z. Kuang, D. Pathak, and D. Ramanan, Multimodality helps unimodality: Cross-modal few-shot learning with multimodal models, arXiv:2301.06267, 2023.
[32] X. Wu, D. Hong, and J. Chanussot, Convolutional neural networks for multimodal remote sensing data classification, IEEE Transactions on Geoscience and Remote Sensing, 60, 2021, 1–10.
[33] K. He, X. Zhang, S. Ren, and J. Sun, Deep residual learning for image recognition, in Proceeding IEEE Conf. on Computer Vision and Pattern Recognition, Las Vegas, NV, 2016, 770–778.
[34] A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, M. Dehghani, M. Minderer, G. Heigold, S. Gelly, J. Uszkoreit, and N. Houlsby, An image is worth 16x16 words: Transformers for image recognition at scale, arXiv:2010.11929, 2020.
[35] V. Chu, I. McMahon, L. Riano, C.G. McDonald, Q. He, J.M. Perez-Tejada, M. Arrigo, T. Darrell, and K.J. Kuchenbecker, Robotic learning of haptic adjectives through physical interaction, Robotics and Autonomous Systems, 63, 2015, 279–292.
[36] R. Li, R. Platt Jr., W. Yuan, A. ten Pas, N. Roscup, M.A. Srinivasan, and E. Adelson, Localization and manipulation of small parts using GelSight tactile sensing, in Proceeding IEEE/RSJ International Conf. on Intelligent Robots and Systems, Chicago, IL, 2014, 3988–3993.
[37] A. Yamaguchi and C.G. Atkeson, Combining finger vision and optical tactile sensing: Reducing and handling errors while cutting vegetables, in Proceeding IEEE-RAS 16th International Conf. on Humanoid Robots (Humanoids), Cancun, 2016, 1045–1051.
[38] N. Wettels and G.E. Loeb, Haptic feature extraction from a biomimetic tactile sensor: Force, contact location and curvature, in Proceeding IEEE International Conf. on Robotics and Biomimetics, Karon Beach, 2011, 2471–2478.
[39] J. Park, M. Kim, Y. Lee, H.S. Lee, and H. Ko, Fingertip skin-inspired microstructured ferroelectric skins discriminate static/dynamic pressure and temperature stimuli, Science Advances, 1(9), 2015, e1500661.
[40] F. Pastor, J. García-González, J.M. Gandarias, D. Medina, P. Closas, A. Garcia, and J. Gomez-de-Gabriel, Bayesian and neural inference on LSTM-based object recognition from tactile and kinesthetic information, IEEE Robotics and Automation Letters, 6(1), 2020, 231–238.
[41] S. Luo, W. Mou, K. Althoefer, and H. Liu, Novel tactile-SIFT descriptor for object shape recognition, IEEE Sensors Journal, 15(9), 2015, 5001–5009.
[42] A.-M. Cretu, T.E. Alves de Oliveira, V.P. da Fonseca, B. Tawbe, E. Petriu, and V. Groza, Computational intelligence and mechatronics solutions for robotic tactile object recognition, in Proceeding IEEE 9th International Symposium on Intelligent Signal Processing (WISP), Siena, 2015, 1–6.
[43] S. Luo, W. Mou, K. Althoefer, and H. Liu, Iterative closest labeled point for tactile object shape recognition, in Proceeding IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Daejeon, 2016, 3137–3142.
[44] T. Mi, D. Que, S. Fang, Z. Zhou, C. Ye, C. Liu, Z. Yi, and X. Wu, Tactile grasp stability classification based on graph convolutional networks, in Proceeding IEEE International Conf. on Real-time Computing and Robotics (RCAR), Xining, 2021, 875–880.
[45] E. Ayodele, T. Bao, S.A.R. Zaidi, A. Hayajneh, J. Scott, Z. Zhang, and D. McLernon, Grasp classification with weft knit data glove using a convolutional neural network, IEEE Sensors Journal, 21(9), 2021, 10824–10833.
[46] X. Zhang, S. Li, J. Yang, Q. Bai, Y. Wang, M. Shen, R. Pu, and Q. Song, Target classification method of tactile perception data with deep learning, Entropy, 23(11), 2021, 1537.
