A Swiss firm aims to transform the traditional approach to building artificial intelligence (AI) models: it holds that the world needs biological processors, which consume far less energy than digital chips.
FinalSpark, a company founded in 2014 in Vevey by Martin Kutter and Fred Jordan, claims to have tested ten million live neurons and is actively developing living, thinking devices using human neurons derived from skin cells.
To demonstrate self-sustaining computing capability for future AI models, the startup is cultivating neurons in cell cultures.
How does it work?
The startup is teaching human neurons to process information much as the human brain does, stimulating them through electrical connections. Co-founder Fred Jordan told Quartz that his team has been experimenting with cutting-edge AI techniques, such as genetic programming and in silico spiking neural networks. The goal is artificial general intelligence.
In contrast to existing AI models, which imitate human thought only after months of training on vast datasets, FinalSpark aims for true human reasoning: the capacity to analyze emotions and generate new ideas and concepts beyond its own experience. "This is what a 'real' thinking machine should do," Jordan remarked, "considering that the most well-known information processor is a human neuron."
The company hopes to spearhead the transition from artificial to biological engineering, and says DNA data storage may eventually prove more efficient and sustainable than cloud storage. Its lab website displays real-time, live examples of its biochips in action.
In February, scientists led by Johns Hopkins University in Baltimore unveiled a blueprint for "organoid intelligence." The idea is to build a thinking system from microscopic, three-dimensional brain structures grown from human stem cells, which would be trained with machine learning and connected to sensors and output devices. Following this trend, biological neural networks could eventually replace artificial neural networks in many computing applications, including artificial intelligence (AI), Jordan said.
Still, synthetic biological intelligence is in its infancy and faces several challenges, according to research in Frontiers in Cellular Neuroscience. The paper notes that the "reproducibility of the synthetic biological models themselves, as well as the accuracy and efficiency of the AI algorithms used to analyze the data," need to be improved. Provided the field continues to advance, the article states, "synthetic biological intelligence has the potential to revolutionize the field of medicine."
Moving away from the energy-hungry training of large language models
It should come as no surprise that continuously training AI algorithms on billions of data points consumes a significant amount of energy, and the data centers involved need large amounts of water to stay cool during the process. In fact, global data center emissions surpass those of the commercial aviation sector. Training a single AI model or chatbot, for example, can consume more electricity in a year than 100 US homes do.
The human brain contains at least 86 billion neurons and has an estimated storage capacity of 2,500 gigabytes, yet it requires only about 10 watts of power for its daily computation.
It runs on the energy of a dim lightbulb while executing roughly 1,000 trillion operations per second. Matching the brain's organic computing capability would take about ten megawatts of silicon AI chips; in other words, human neurons use a million times less energy.
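The million-fold claim follows directly from the figures quoted above. A quick back-of-the-envelope check (a sketch using only the article's own numbers; the per-operation figure is a derived illustration, not a cited measurement):

```python
# Figures quoted in the article.
brain_power_w = 10        # human brain: ~10 W for daily computation
silicon_power_w = 10e6    # ~10 MW of silicon AI chips for comparable capability
ops_per_second = 1e15     # ~1,000 trillion operations per second

# Ratio of silicon power draw to brain power draw.
ratio = silicon_power_w / brain_power_w
print(f"Silicon uses {ratio:,.0f}x more power than the brain")  # 1,000,000x

# Implied energy cost per operation for the brain (joules per op).
joules_per_op = brain_power_w / ops_per_second
print(f"Brain: ~{joules_per_op:.0e} J per operation")
```

At these numbers the brain spends on the order of 10^-14 joules per operation, which is the efficiency gap biological-computing proponents point to.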
According to Jordan, the best way to keep AI models from driving up carbon emissions is to run them on alternative hardware, such as living neurons, while achieving the same or better results.