Artificial skin material applied to various electronics, allowing them to recognize new input gestures like twists and pinches.
(Photo : ACM Digital Library)

When mobile phones first came into use, keypads of physical buttons were the way we interacted with them. In recent years these devices have evolved, eventually doing away with buttons altogether in favor of easier-to-use touchscreens. We can now interact with our electronic devices like never before, even controlling them with our own voices.

But what if we could go further still?

What if electronics could "feel" our touch the same way we do?

Pursuing an answer to this question, researchers from Institut Polytechnique de Paris, Sorbonne Université, and the University of Bristol have developed an artificial skin that, when applied to common electronics like computers, phones, and smartwatches, opens up new ways of interacting with them through natural human gestures like twisting, pinching, and more.

Published in the Proceedings of the 32nd Annual ACM (Association for Computing Machinery) Symposium on User Interface Software and Technology, the study looks at the whole development process of what they call "Skin-On": from simulating the human skin interface, to sensing human gestures and converting them to electronic signals, to fabricating the material itself.

To replicate actual human skin as closely as possible, the researchers took a bio-driven approach, incorporating a hypodermis (fat) layer that lets the Skin-On interface receive these new gestures and provides kinesthetic feedback similar to real skin. Pairing this approach with user-feedback studies, they also aimed for a material with realistic skin-like pores, wrinkles, and color.

The interface itself was made of silicone, lined with conductive threads that form a system of electrodes able to detect touch and pressure.
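
To give a rough sense of how software might read such an electrode grid, here is a minimal sketch in Python. It is not the researchers' code: the 6x6 layout, the touch threshold, and the read_electrode() helper are assumptions standing in for the actual sensing hardware.

```python
# A minimal sketch, not the researchers' code: scanning a grid of row/column
# electrodes like the one described above into a pressure map and checking it
# for contact. The 6x6 layout, threshold, and read_electrode() stub are assumed.

GRID_ROWS, GRID_COLS = 6, 6   # hypothetical electrode layout
TOUCH_THRESHOLD = 0.15        # hypothetical normalized pressure threshold

def read_electrode(row: int, col: int) -> float:
    """Stand-in for a hardware read; returns normalized pressure in [0.0, 1.0]."""
    return 0.0  # replace with the actual sensor driver call

def read_pressure_map() -> list[list[float]]:
    """Scan every row/column crossing and return the full pressure map."""
    return [[read_electrode(r, c) for c in range(GRID_COLS)]
            for r in range(GRID_ROWS)]

def contact_points(pressure_map: list[list[float]]) -> list[tuple[int, int, float]]:
    """Return (row, col, pressure) for every crossing above the touch threshold."""
    return [(r, c, p)
            for r, row in enumerate(pressure_map)
            for c, p in enumerate(row)
            if p > TOUCH_THRESHOLD]

print(contact_points(read_pressure_map()))  # prints [] until a real driver is wired in
```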

The researchers tested this Skin-On interface on a smartphone messaging application. Tickling the material sent a 'laughing' emoji, while simply tapping it sent a 'surprised' emoji. The intensity of these gestures was correlated with the size of the emoji sent. This could also be a way for users to express emotions through the device itself (e.g. hard, intense grips conveying anger).
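
To make that mapping concrete, the sketch below (again in Python, and not the study's actual implementation) shows one way recognized gestures could be turned into emoji messages; the gesture names, emoji shortcodes, and size scaling are assumptions for illustration.

```python
# A minimal sketch, not the study's implementation: mapping recognized Skin-On
# gestures to emoji in a messaging app, with gesture intensity scaling the size
# of the emoji sent. Gesture names, shortcodes, and the size scaling are assumed.

EMOJI_FOR_GESTURE = {
    "tickle": ":joy:",        # tickling the skin sends a laughing emoji
    "tap": ":open_mouth:",    # tapping sends a surprised emoji
    "hard_grip": ":angry:",   # an intense grip could convey anger
}

def emoji_message(gesture: str, intensity: float) -> tuple[str, int]:
    """Return the emoji shortcode and a font size scaled by intensity (0.0-1.0)."""
    emoji = EMOJI_FOR_GESTURE.get(gesture, ":slightly_smiling_face:")
    size = int(16 + intensity * 48)  # assumed scaling: 16 pt up to 64 pt
    return emoji, size

print(emoji_message("tickle", 0.8))  # (':joy:', 54)
```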

In a video posted by ACM SIGCHI (ACM Special Interest Group on Computer–Human Interaction), the material is shown in action, with a computer detecting signals as a person presses, pinches, and twists it.

This development in human–computer interaction opens up a whole new way for people to interface with electronics, and applications for the technology are far-ranging.

Robots that can "feel" touch the same way we do could sense and interact with their environment, and provide feedback, much as humans do. This could allow robots to take on more complex and delicate tasks, perhaps in fields like healthcare and surgery.

Aside from robotics, the technology could also add a whole new depth of experience to video games, allowing users to interact with in-game characters on a more realistic level.

Only time will tell what impact this technology will have on the electronics industry.