Scientists hail digital retina heat management breakthrough


It’s common knowledge that managing heat is one of the biggest problems in computer chip design – but it’s an even bigger issue when designing devices intended to work inside delicate biological systems.

Finding ways to stop digital implants from overheating and frying the tissue that surrounds them is a huge technical challenge.

Nowhere is this more true than with artificial retinas, which must handle large amounts of data that generate large amounts of heat, while also interfacing with the nervous system.

But members of Stanford University’s Electrical Engineering and Computer Science departments think they have devised a way to solve the problem.

Discussing their advance in a study published in IEEE Transactions on Biomedical Circuits and Systems, the Artificial Retina Team from Stanford said the biggest issue is that neurons from the retina send electrical signals in large “spikes” of information to the brain.

All of this creates heat, even in prototype retinas with a few hundred electrodes – let alone the tens of thousands of electrodes that will be needed in the final versions.

According to Boris Murmann, professor of electrical engineering on the project at Stanford, the team is working on a way to extract the same amount of useful visual information but with less data.

They are finding ways to decide which parts of the spikes can be ignored, reducing the amount of data to be processed and, with it, the heat generated by the artificial retina.

Summarising the team's work in an article on Stanford’s website, Murmann said: “We compress the data by being more selective, ignoring the noise and baseline samples and digitising only the unique spikes.”
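The selective approach Murmann describes can be illustrated with a toy sketch. The function name and the threshold value below are illustrative assumptions, not details from the Stanford paper: the idea is simply that baseline and noise samples never reach the digitiser, and only samples whose amplitude clears a noise threshold are kept.

```python
def select_spikes(samples, threshold):
    """Toy illustration of selective digitisation: keep only samples
    whose amplitude exceeds a noise threshold, so baseline and noise
    readings are discarded before digitisation. The name and the
    threshold are hypothetical, not taken from the paper."""
    return [(i, v) for i, v in enumerate(samples) if abs(v) > threshold]

# A trace that is mostly baseline noise, with two clear spikes
trace = [0.1, -0.2, 5.0, 0.0, 0.3, -4.8, 0.1]
kept = select_spikes(trace, threshold=1.0)
# Only the two spike samples (at indices 2 and 5) are retained
```

In this sketch, seven raw samples shrink to two digitised ones – a crude stand-in for the kind of data reduction the team reports.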

Previously, digitisation and compression were done separately, leading to a lot of data storage and transfer, the team said.

Compression is now integrated into the digitisation process, so that the most useful information is retained and the hardware is easier to implement.

The compression process is surprisingly simple too: whenever two or more electrodes in the artificial retina record identical signal samples, these signal “collisions” are ignored.

But whenever a unique signal is recorded by a single electrode, this is prioritised, an approach that the team says is far more efficient.
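A minimal sketch of that collision rule, as the article states it, might look like the following. The per-frame dictionary layout and the function name are assumptions for illustration, not the team's implementation:

```python
from collections import Counter

def discard_collisions(frame):
    """Sketch of the collision rule described above: if two or more
    electrodes in a sampling frame report an identical value, those
    "colliding" samples are ignored; unique readings are kept.
    `frame` maps electrode id -> sampled value (hypothetical layout)."""
    counts = Counter(frame.values())
    return {e: v for e, v in frame.items() if counts[v] == 1}

frame = {"e1": 7, "e2": 7, "e3": 3, "e4": 9}
unique = discard_collisions(frame)
# e1 and e2 collide (both read 7) and are dropped; e3 and e4 survive
```

The appeal of such a rule is that it needs no model of the signal – a simple equality check decides what to throw away, which keeps the hardware cheap.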

Only 5% of cells are missed as a result, but the volume of acquired data is reduced by a factor of 40, and the principle could also be applied to other brain-machine interfaces that turn nerve impulses into computer signals.

Applications could include restoring motion to paralysed patients, or restoring hearing to the deaf, the team said.

Dante Muratore, one of the team’s postdoctoral researchers, said: “This is an important step that might someday allow us to build a digital retina with over 10,000 channels.”