Abstract
Integrating sensors into knitted input devices traditionally imposes considerable constraints on textile and UI design freedom. In this work, we demonstrate a novel, minimally invasive method for fabricating knitted sensors that overcomes this limitation. We integrate copper wire with piezoresistive enamel directly into the fabric using weft knitting to establish strain- and pressure-sensing cells that consist of only single pairs of intermeshed loops. The result is unobtrusive and potentially invisible, which provides tremendous latitude for visual and haptic design. Furthermore, we present several variations of stitch compositions, resulting in loop meshes with distinct responses depending on the direction of the exerted force. Utilizing this property, we are able to infer actuation modalities and considerably expand the device's input space; in particular, we discern strain directions and surface pressure. Moreover, we provide an in-depth description of our fabrication method and demonstrate our solution's versatility in three exemplary use cases.
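The abstract notes that the loop-mesh cells are resistive and respond differently depending on force direction, which is what allows strain directions to be separated from surface pressure. The following Python sketch is purely illustrative and not from the paper: it assumes a hypothetical readout where two such cells are sampled through simple voltage dividers, and a coarse actuation modality is inferred from the pattern of their relative resistance changes. All function names, circuit values, and thresholds are invented for illustration.

```python
# Illustrative sketch only; the paper does not publish readout code.
# Assumed setup: two loop-mesh sensing cells with direction-dependent
# piezoresistive responses, each read through a voltage divider.

def divider_resistance(v_out: float, v_in: float = 3.3, r_ref: float = 1_000.0) -> float:
    """Estimate a sensing cell's resistance from a voltage-divider reading."""
    return r_ref * v_out / (v_in - v_out)

def classify_actuation(r_a: float, r_b: float,
                       r_rest_a: float, r_rest_b: float,
                       threshold: float = 0.05) -> str:
    """Infer a coarse actuation modality from relative resistance changes of two cells."""
    da = (r_a - r_rest_a) / r_rest_a
    db = (r_b - r_rest_b) / r_rest_b
    if abs(da) < threshold and abs(db) < threshold:
        return "idle"
    if da > threshold and db > threshold:
        return "pressure"                      # both cells respond to surface pressure
    if da > threshold and abs(db) <= threshold:
        return "strain (course direction)"     # hypothetical directional response
    if db > threshold and abs(da) <= threshold:
        return "strain (wale direction)"
    return "mixed / unclassified"

if __name__ == "__main__":
    # Example ADC readings (volts) and rest resistances (ohms), all invented.
    r_a = divider_resistance(v_out=1.9)
    r_b = divider_resistance(v_out=1.8)
    print(classify_actuation(r_a, r_b, r_rest_a=1_200.0, r_rest_b=1_150.0))
```

A simple threshold scheme like this is only meant to convey the idea of exploiting direction-dependent responses; any real deployment would need per-sensor calibration.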
Original language | English |
---|---|
Pages | 1-17 |
DOIs | |
Publication status | Published - 11 May 2024 |
Event | CHI'24 ACM CHI Conference on Human Factors in Computing Systems, Honolulu, United States. Duration: 11 May 2024 → 16 May 2024. https://chi2024.acm.org/ |
Conference
Conference | CHI'24 ACM CHI Conference on Human Factors in Computing Systems |
---|---|
Abbreviated title | CHI'24 |
Country/Territory | United States |
City | Honolulu |
Period | 11.05.2024 → 16.05.2024 |
Internet address | https://chi2024.acm.org/ |
Keywords
- e-textiles
- fabrication
- force sensor
- knitting
- resistive sensor
- smart textiles
- textile interface
Fingerprint
Dive into the research topics of 'Loopsense: low-scale, unobtrusive, and minimally invasive knitted force sensors for multi-modal input, enabled by selective loop-meshing'. Together they form a unique fingerprint.
Prizes
- CHI'24 Best Paper Honorable Mention
Wintersberger, P. (Recipient), Lingler, A. (Recipient) & Talypova, D. (Recipient), 11 May 2024
Prize