A Sensor That Might Someday Enable ‘Mind-Controlled’ Robots

The work could help expand the applications of brain-machine interfaces.

[Image: the dry sensor. Adapted from ACS Applied Nano Materials, 2023, DOI: 10.1021/acsanm.2c05546]

It sounds like something from science fiction: don a specialized electronic headband and control a robot with your mind. Recent research published in ACS Applied Nano Materials has taken a step toward making this a reality. By designing a 3D-patterned structure that doesn’t rely on sticky conductive gels, the team has created “dry” sensors that can measure the brain’s electrical activity, even amidst hair and the bumps and curves of the head.

Physicians monitor electrical signals from the brain with electroencephalography (EEG), in which specialized electrodes are either implanted into the brain or placed on the surface of the head. EEG helps diagnose neurological disorders, but it can also be incorporated into “brain-machine interfaces,” which use brain waves to control an external device, such as a prosthetic limb, robot or even a video game.

Most non-invasive versions involve the use of “wet” sensors, which are stuck onto the head with a gloopy gel that can irritate the scalp and sometimes trigger allergic reactions. As an alternative, researchers have been developing “dry” sensors that don’t require gels, but thus far none have worked as well as the gold-standard wet variety.

Although nanomaterials like graphene could be a suitable option, their flat and typically flaky nature makes them incompatible with the uneven curves of the human head, particularly over long periods. So, Francesca Iacopi and colleagues wanted to create a 3D sensor based on polycrystalline graphene that could accurately monitor brain activity without any stickiness.

The team created several 3D graphene-coated structures with different shapes and patterns, each around 10 µm thick. Of the shapes tested, a hexagonal pattern worked the best on the curvy, hairy surface of the occipital region — the spot at the base of the head where the brain’s visual cortex is located. The team incorporated eight of these sensors into an elastic headband, which held them against the back of the head.

When combined with an augmented reality headset displaying visual cues, the electrodes could detect which cue was being viewed, then work with a computer to translate those signals into commands that controlled the motion of a four-legged robot, completely hands-free.
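The article does not describe the decoding pipeline itself, but a minimal sketch of how such a cue-to-command step could work, assuming the visual cues flicker at distinct frequencies (a common steady-state visually evoked potential approach) and that each frequency maps to a hypothetical robot command, might look like this in Python. The function names, cue frequencies, and sampling rate below are illustrative assumptions, not details from the study.

```python
import numpy as np

# Hypothetical cue frequencies (Hz) mapped to robot commands; the actual
# paradigm and decoding method used in the study may differ.
CUE_FREQS = {8.0: "forward", 10.0: "left", 12.0: "right", 15.0: "stop"}
FS = 256  # assumed EEG sampling rate in Hz

def band_power(signal, fs, target_hz, half_width=0.5):
    """Power of the signal near a target frequency, via an FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = np.abs(freqs - target_hz) <= half_width
    return spectrum[mask].sum()

def decode_command(eeg_window):
    """Pick the cue frequency with the strongest response in the recording.

    eeg_window: array of shape (n_sensors, n_samples) from the headband.
    """
    # Average the eight dry sensors to boost the visually evoked signal.
    averaged = eeg_window.mean(axis=0)
    scores = {f: band_power(averaged, FS, f) for f in CUE_FREQS}
    best_freq = max(scores, key=scores.get)
    return CUE_FREQS[best_freq]

# Example: two seconds of simulated data from eight sensors, with a weak
# 10 Hz component standing in for the response to a flickering cue.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
simulated = rng.normal(size=(8, t.size)) + 0.5 * np.sin(2 * np.pi * 10.0 * t)
print(decode_command(simulated))  # expected to print "left"
```

Real systems typically use more robust decoders (for example, canonical correlation analysis) plus filtering and artifact rejection, but the basic idea of scoring each cue frequency and picking the strongest is the same.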

Though the new electrodes didn’t yet work quite as well as the wet sensors, the researchers say that this work represents a first step toward developing robust, easily implemented dry sensors to help expand the applications of brain-machine interfaces.

The authors acknowledge funding from the Defence Innovation Hub of the Australian Government and support from the Australian National Fabrication Facility of the University of Technology Sydney and the Research & Prototype Foundry at the University of Sydney Nano Institute.
