IEEE Transactions on Robotics, Journal Year: 2023, Volume and Issue: 39(6), P. 4973 - 5022
Published: Dec. 1, 2023
Language: English
Science Advances, Journal Year: 2025, Volume and Issue: 11(4)
Published: Jan. 22, 2025
Tactile interfaces are essential for enhancing human-machine interactions, yet achieving large-scale, precise distributed force sensing remains challenging due to signal coupling and inefficient data processing. Inspired by the spiral structure of Aloe polyphylla and the processing principles of neuronal systems, this study presents a digital channel–enabled decoding strategy, resulting in a phygital tactile system named PhyTac. This innovative system effectively prevents marker overlap and accurately identifies multipoint stimuli across up to 368 regions from coupled signals. By integrating physics into model training, we reduce the dataset size to just 45 kilobytes, surpassing conventional methods that typically exceed 1 gigabyte. Results demonstrate PhyTac's impressive fidelity of 97.7% across a range of 0.5 to 25 newtons, enabling diverse applications in medical evaluation, sports, virtual reality, and robotics. This research not only enhances our understanding of hand-centric actions but also highlights the convergence of the physical and digital realms, paving the way for advancements in AI-based sensor technologies.
Language: English
Citations: 1

Published: April 3, 2023
Robots have been brought to work close to humans in many scenarios. For coexistence and collaboration, robots should be safe and pleasant for humans to interact with. To this end, robots could be both physically soft and equipped with multimodal sensing/perception, so that they have better awareness of the surrounding environment and can respond properly to humans' actions/intentions. This paper introduces a novel robotic link, named ProTac, that possesses multiple sensing modes: tactile and proximity sensing, based on computer vision and a functional material. These modalities come from a layered structure of a transparent silicone skin, a polymer-dispersed liquid crystal (PDLC) film, and reflective markers. Here, the PDLC film can switch actively between transparent and opaque states, by which both sensing modes are obtained using cameras solely built inside the ProTac link. In this paper, inference algorithms for both modes of perception are introduced. Evaluation results of the two modes demonstrated that, with a simple activation strategy, the link can effectively perceive useful information about approaching and in-contact obstacles. The proposed device is expected to bring ultimate solutions for robot design with softness, whole-body sensing, and safety control strategies.
Language: English
Citations: 12

Sensors, Journal Year: 2025, Volume and Issue: 25(1), P. 256 - 256
Published: Jan. 5, 2025
Transferring knowledge learned from standard GelSight sensors to other visuotactile sensors is appealing for reducing data collection and annotation. However, such cross-sensor transfer is challenging due to the differences between sensors in internal light sources, imaging effects, and elastomer properties. By treating the data collected from each type of sensor as distinct domains, we propose a few-sample-driven style-to-content unsupervised domain adaptation method to reduce the cross-sensor gaps. We first propose a Global and Local Aggregation Bottleneck (GLAB) layer to compress features extracted by an encoder, enabling the extraction of features containing key information and facilitating unlabeled cross-sensor learning. We then introduce a Fourier-style transformation (FST) module and a prototype-constrained learning loss to promote global conditional domain-adversarial adaptation, bridging style-level gaps. We also propose a high-confidence guided teacher–student network, utilizing a self-distillation mechanism to further reduce content-level gaps between the two domains. Experiments on three real-world robotic shape recognition tasks demonstrate that our method outperforms state-of-the-art approaches, particularly achieving 89.8% accuracy on the DIGIT dataset.
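The Fourier-style transformation mentioned in this abstract follows a well-known idea in Fourier-based domain adaptation: the amplitude spectrum of an image carries much of its "style" (illumination, sensor coloration), while the phase carries content. A minimal sketch of that idea, assuming a standard low-frequency amplitude swap (the paper's exact FST module may differ):

```python
import numpy as np

def fourier_style_transfer(src_img, ref_img, beta=0.1):
    """Re-style src_img toward ref_img by swapping the centered
    low-frequency band of its amplitude spectrum, keeping src phase.
    beta controls the fraction of the spectrum treated as 'style'."""
    fft_src = np.fft.fft2(src_img, axes=(0, 1))
    fft_ref = np.fft.fft2(ref_img, axes=(0, 1))
    amp_src, pha_src = np.abs(fft_src), np.angle(fft_src)
    amp_ref = np.abs(fft_ref)

    # Center the spectra so low frequencies sit in the middle.
    amp_src = np.fft.fftshift(amp_src, axes=(0, 1))
    amp_ref = np.fft.fftshift(amp_ref, axes=(0, 1))
    h, w = src_img.shape[:2]
    bh, bw = int(h * beta), int(w * beta)
    ch, cw = h // 2, w // 2
    # Replace the low-frequency amplitude block with the reference's.
    amp_src[ch - bh:ch + bh, cw - bw:cw + bw] = \
        amp_ref[ch - bh:ch + bh, cw - bw:cw + bw]
    amp_src = np.fft.ifftshift(amp_src, axes=(0, 1))

    # Recombine swapped amplitude with the original (content) phase.
    styled = np.fft.ifft2(amp_src * np.exp(1j * pha_src), axes=(0, 1))
    return np.real(styled)
```

Applied to visuotactile images, this lets labeled GelSight frames be rendered in the "style" of an unlabeled target sensor (e.g. DIGIT) before training, which is one way such a module can bridge style-level gaps.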
Language: English
Citations: 0

Materials Today Physics, Journal Year: 2025, Volume and Issue: unknown, P. 101740 - 101740
Published: April 1, 2025
Language: English
Citations: 0

IEEE Transactions on Robotics, Journal Year: 2024, Volume and Issue: 40, P. 1509 - 1526
Published: Jan. 1, 2024
Visuotactile sensors can provide rich contact information, giving them great potential in contact-rich manipulation tasks with reinforcement learning (RL) policies. The Sim2Real technique tackles the challenge of RL's reliance on a large amount of interaction data. However, most Sim2Real methods for visuotactile sensors rely on rigid-body physics simulation, which fails to simulate real elastic deformation precisely. Moreover, these methods do not exploit the characteristics of tactile signals when designing the network architecture. In this paper, we build a general-purpose Sim2Real protocol for policy learning with marker-based visuotactile sensors. To improve simulation fidelity, we employ an FEM-based simulator that can simulate the sensor's deformation accurately and stably for arbitrary geometries. We further propose a novel feature extraction network that directly processes the set of pixel coordinates of the markers, along with a self-supervised pre-training strategy, to improve the efficiency and generalizability of RL policies. We conduct extensive experiments on a peg-in-hole task to validate the effectiveness of our method, and show its applicability to additional tasks including plug adjustment and lock opening. The protocol and framework will be open-sourced for community usage.
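A network that consumes the set of marker pixel coordinates directly must be invariant to marker ordering. A minimal PointNet-style sketch of that property (my assumption for illustration, not the paper's exact architecture): a shared per-point MLP followed by a symmetric max pool yields one feature vector regardless of how the markers are listed.

```python
import numpy as np

def point_set_features(markers, w1, w2):
    """Permutation-invariant embedding of marker coordinates.

    markers: (N, 2) array of pixel coordinates.
    w1, w2:  weight matrices of a shared two-layer MLP, applied
             identically to every marker (hence order-independent).
    """
    h = np.maximum(markers @ w1, 0.0)  # shared layer + ReLU, (N, d1)
    h = np.maximum(h @ w2, 0.0)        # second shared layer, (N, d2)
    return h.max(axis=0)               # symmetric max pool over points
```

Because the pooling is symmetric, shuffling the rows of `markers` leaves the output unchanged, which matches the set-valued nature of marker observations.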
Language: English
Citations: 3

IEEE Transactions on Robotics, Journal Year: 2024, Volume and Issue: 40, P. 2373 - 2389
Published: Jan. 1, 2024
Robots have taken the place of human operators in hazardous and challenging jobs requiring high-dexterity manipulation, and robots with skin for force and tactile sensing that mimics the mechanoreception function of animals will be highly dexterous in performing complex tasks. In this study, we propose the design of a modular robotic skin capable of detecting the magnitude and location of a contact force simultaneously. Each module needs three degrees of freedom in order to estimate the horizontal and vertical locations of a contact as well as its magnitude. Force sensing in the proposed skin is enabled by a custom-designed triangular beam structure underneath the cover. A force applied to the cover causes bending of the beam, which is detected by fiber-optic strain sensors. The result shows resolutions of 1.45 N in force estimation and 1.85 mm and 1.91 mm in localization in the horizontal and vertical directions, respectively. We also demonstrate how the skin can be used for remote and autonomous control of commercial robot arms equipped with an array of modules.
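The three degrees of freedom per module suggest a simple decoding scheme: for small deflections each strain reading is roughly linear in the applied force and its moments about the beam, so a calibration matrix fitted offline can map the three readings to force and contact location. A hedged sketch under that linearity assumption (the calibration matrix and decomposition here are hypothetical, not from the paper):

```python
import numpy as np

def estimate_contact(strains, calib_matrix):
    """Map three strain readings to (force F [N], x [mm], y [mm]).

    Assumes a linear model fitted offline: C @ strains ~= [F, x*F, y*F],
    where x*F and y*F are the force moments about the beam axes.
    """
    f, xf, yf = calib_matrix @ np.asarray(strains, dtype=float)
    if abs(f) < 1e-9:            # below noise floor: no contact
        return 0.0, 0.0, 0.0
    return f, xf / f, yf / f     # divide out F to recover location
```

Dividing the moment terms by the recovered force is what makes a single 3-channel module report both magnitude and 2-D location simultaneously.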
Language: English
Citations: 3

Scientific Data, Journal Year: 2025, Volume and Issue: 12(1)
Published: March 18, 2025
Hand contact data, reflecting the intricate behaviours of human hands during object operation, exhibit significant potential for analysing hand operation patterns to guide the design of hand-related sensors and robots, and for predicting object properties. However, these applications are hindered by the constraints of low resolution and incomplete capture of contact data. Leveraging a non-contact, high-precision 3D scanning method for surface capture, a high-resolution whole-hand contact dataset, named Ti3D-contact, is constructed in this work. The dataset, with an average resolution of 0.72 mm, contains 1872 sets of texture images and 3D models. The contact area is painted on gloves and captured as original data through the 3D scanner. Reliability validation of Ti3D-contact is conducted via hand-movement classification, and 95% precision is achieved using the acquired dataset. Its high-resolution and complete contact capturing make the dataset exhibit promising applications in hand-posture recognition and object-property prediction.
Language: English
Citations: 0

2022 IEEE/SICE International Symposium on System Integration (SII), Journal Year: 2024, Volume and Issue: unknown, P. 369 - 375
Published: Jan. 8, 2024
Safe and pleasant physical human-robot interaction (pHRI) is essential in robotic-skin-related control paradigms. Among various approaches, integrating tactile sensing with thermal regulation techniques enables the robot to provide gentle touch perceptions, thus enhancing likability and trustworthiness during interactions with humans. However, achieving both accurate tactile sensing and desired thermal comfort perceptions poses significant challenges. In response to these challenges, this paper presents the development of a novel soft robotic skin named ThermoTac. This innovative skin not only possesses the ability to sense touches effectively but also provides thermal sensations. The integration of both capabilities is realized through a carefully designed strategic system. The developed skin integrates vision-based tactile sensing and water circulation systems, and its performance and effectiveness are evaluated in realistic scenarios. The results of the experiments highlight the potential of ThermoTac as a promising solution for safe and stable interaction, paving the way for advanced robotics.
Language: English
Citations: 2

Information Fusion, Journal Year: 2024, Volume and Issue: 108, P. 102390 - 102390
Published: March 28, 2024
Language: English
Citations: 2

Published: May 13, 2024
Language: English
Citations: 2