The Project

Main Objectives

A long-term objective is to use RCN as a substitute (a backup, hence the name of the project) for damaged neural tissue. In this case, the RCN must be designed with a topology suited to the final task it has to accomplish. Such optimization would boost the use of RCNs in therapeutic applications.

High gain/high-risk balance. In BACKUP, I am working at the forefront of integrated photonic technology and of the interdisciplinary science of the interface between living and condensed matter. Only now are large networks of photonic components being integrated with electronic driving circuits. A few successful optogenetic experiments have been carried out in which the light signal is provided by remote lasers through optical fibres.

RCNs with a few photonic components have only recently been demonstrated. Therefore, all the activities I am planning carry a very high risk. However, the success of BACKUP will definitely open new scenarios in computing by allowing the concurrent, simultaneous use of three profoundly different platforms for computation (photonics, electronics and neuronics). I will take a leap forward in the understanding of our brain by unveiling the relationship between the connectivity (topology) of nodes and the functions they perform as a single complex network. I will unfold new possibilities in therapies for neurological disorders by supplementing the failing functions with artificial networks that replace or supplement the nerves or the brain tissue.

Artificial Intelligence

Artificial Intelligence (AI) is a computer science discipline whose goal is to create artificial systems (e.g., algorithms) able to perform intelligent tasks such as image understanding, language translation or problem solving. AI systems are based on many different paradigms, some of which are directly inspired by the human brain. For instance, Artificial Neural Networks (ANNs) are a mathematical simulation of a biological neural network. Broadly speaking, an ANN is a graph whose nodes simulate neurons (and are associated with activation functions) and whose edges simulate synaptic connections. Importantly, the ANN weights, associated with the edges, simulate the synaptic strength of the connections, which may vary over time.
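As a minimal illustration of this graph picture, a single artificial neuron can be written in a few lines of Python. All the numbers below (weights, bias, inputs) are invented for the sketch:

```python
import math

# Hypothetical two-input neuron: each edge carries a weight (the synaptic
# strength); the node applies an activation function to the weighted sum.
def neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid activation function

output = neuron([0.5, -1.0], weights=[0.8, 0.3], bias=0.1)
```

A full ANN is just many such nodes wired together, with the weights on the edges playing the role of the synaptic strengths described above.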

The main characteristic of ANNs is their ability to learn automatically from a set of training samples. For instance, in order to learn to recognize a chair in an input image, a collection of images of chairs is progressively presented to the ANN. Specific training algorithms modify the ANN weights, trying to minimize the recognition error on the training set. When training is done, the implicit knowledge contained in the training set has been acquired by the ANN and memorized in its weight values. The ANN can then recognize new, unseen images of chairs.
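The training loop can be sketched on a toy problem. Here, instead of chair images, a single neuron learns the logical AND function by gradient descent; the dataset, learning rate and number of passes are all invented for the example:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy training set: input pairs and target labels (logical AND).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):                       # samples are presented repeatedly
    for x, target in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = y - target                    # recognition error on this sample
        # adjust the weights to reduce the error (gradient descent step)
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

# after training, the knowledge of the samples is stored in w and b
predictions = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
```

After training, the weights encode the AND rule and the network classifies all four inputs correctly, just as a trained image classifier stores the concept "chair" in its weight values.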

Recently, many AI fields have significantly improved thanks to the introduction of Deep ANNs (DNNs), which are basically ANNs composed of a (deep) cascade of layers, each layer being characterized by a set of homogeneous neurons which embed part of the ANN's general knowledge.
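The "cascade of layers" structure can be made concrete with a small sketch: each layer feeds its output to the next one. The layer sizes and all weight values below are arbitrary, chosen only to show the mechanism:

```python
# A deep network is a cascade of layers: each layer applies its weights
# and an activation function to the output of the previous layer.
def relu(values):
    return [max(0.0, v) for v in values]

def layer(inputs, weights, biases):
    # weights holds one row of input weights per neuron in the layer
    return relu([sum(i * w for i, w in zip(inputs, row)) + b
                 for row, b in zip(weights, biases)])

# Hypothetical 2 -> 3 -> 1 cascade of homogeneous layers.
h = layer([1.0, 2.0], [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1, 0.0])
y = layer(h, [[1.0, -1.0, 0.5]], [0.0])
```

Stacking more such layers (making the cascade "deeper") is what distinguishes a DNN from a shallow ANN.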

In BACKUP, we will use DNNs to study and simulate real biological networks. The idea is sketched in Fig. 1. B is a (small) biological network grown in vitro. Using specific photonic circuits and optogenetics, we can stimulate each individual neuron in B and read its activation state. We can thus collect a virtually unlimited dataset of stimulus-response pairs of B. Once this dataset has been collected, it will be used to train the artificial network A, which learns to predict the response of B to a given stimulus. Basically, A is trained to “think” as B “thinks”. If successful, this experiment will show, for the first time, that the “memories” of a biological network can be artificially reproduced using external hardware and software.
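The idea can be illustrated with a toy experiment. Here B is a stand-in black box (an invented stimulus-response mapping, not a real biological network), and a simple artificial model A is fitted to reproduce its responses; the grid of stimuli and the learning rate are likewise invented:

```python
# Hypothetical stand-in for the biological network B: a fixed, unknown
# stimulus -> response mapping that we can only probe experimentally.
def B(stimulus):
    return 0.7 * stimulus[0] - 0.2 * stimulus[1]

# Step 1: collect a dataset of stimulus-response pairs from B.
dataset = [((s0 / 4.0, s1 / 4.0), B((s0 / 4.0, s1 / 4.0)))
           for s0 in range(5) for s1 in range(5)]

# Step 2: train the artificial network A to predict B's response
# to each stimulus (here A is a minimal linear model).
wa = [0.0, 0.0]
for _ in range(500):
    for (x0, x1), r in dataset:
        pred = wa[0] * x0 + wa[1] * x1
        err = pred - r
        wa[0] -= 0.1 * err * x0
        wa[1] -= 0.1 * err * x1
```

After training, A's weights reproduce B's input-output behaviour: A has learned to "think" as B "thinks" on the collected stimuli, which is the spirit of the experiment (the real project uses DNNs and optogenetic read-out rather than this linear toy).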

Even more intriguing and challenging is the possibility of building a hybrid computational system, which will be explored in the second part of the project. Specifically, parts of B will be replaced with a DNN, and we will address this question: “Can a computational task be jointly performed by the cooperation of two networks, an artificial and a biological one, connected to each other?”

 


Figure 1: A schematic representation of an ANN (A) and a biological network (B)

Building an in vitro memory engram using light

The brain is an intricate network formed by many neurons highly interconnected via synaptic contacts (synapses). Once formed during development, the neuronal circuitry can be modified by neuronal activity, and these changes are directly correlated with cognitive processes such as learning and memory. The term memory refers to the storage of information in the brain, a cognitive function essential for learning and for interaction with the external world.

In particular, information is believed to be stored in the brain as persistent physical modifications of ensembles of neurons (engrams). These permanent changes involve a reinforcement of the synaptic connections among the specific neurons that are activated during the encoding of a memory trace.
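This reinforcement of connections between co-activated neurons can be illustrated with a classic Hebbian-style update rule. This is only a conceptual sketch (the network size, activity pattern and learning rate are invented, and the project's actual biological mechanisms are far richer):

```python
# Illustrative Hebbian sketch: connections between neurons that fire
# together during an encoding event are strengthened, leaving a
# persistent trace (an "engram") in the weight matrix.
n = 4
weights = [[0.0] * n for _ in range(n)]

active = [1, 0, 1, 0]   # one encoding event: neurons 0 and 2 co-activate
lr = 0.1
for i in range(n):
    for j in range(n):
        if i != j:
            weights[i][j] += lr * active[i] * active[j]
```

After the event, only the connections among the co-activated neurons (0 and 2) are reinforced; the rest of the network is unchanged, mirroring the engram/non-engram distinction discussed below.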

Only recently, with the availability of optogenetic techniques, has it become possible to identify single memory traces in the brain. Optogenetics is a biological method that genetically modifies neurons to express light-sensitive membrane channels (e.g. channelrhodopsin), allowing neuronal activity to be controlled by light. Channelrhodopsin can be expressed specifically in the neurons activated during the learning of a memory trace, and scientists have demonstrated that it is possible to retrieve a memory in mice by re-activating these neurons with light.

The aim of BACKUP is to create a memory engram in a small in vitro neuronal network using a photonic chip (Fig. 2). This artificial engram will be created using optogenetic strategies, in which patterned-light illumination activates the groups of interconnected neurons expressing channelrhodopsin along the light path. In this sense, patterned light will act as an artificial learning event that generates the memory engram.

This reductionist in vitro system will allow us to compare the activity and morphological changes of “engram neurons” (cells activated by light) with those of “non-engram neurons” (cells not activated by light), in order to study the basic mechanisms of engram-cell connectivity and to establish a link between neuronal activity and changes in the connectivity among neurons.

It is well established that some pathological conditions, such as memory loss (amnesia), can be associated with a damaged memory engram or with the inability to retrieve it. The hybrid system developed in BACKUP will also give us the opportunity to easily modify or eliminate the artificial engram, in order to better understand the basic mechanisms underlying memory dysfunctions.


Figure 2: In vitro neuronal network grown on a photonic chip.

Photonics for neural networks

Artificial Neural Networks (ANNs) are computational network models that mimic how biological neurons process data. These models have dramatically improved the performance of many learning tasks, including speech and object recognition. However, today’s computing hardware is inefficient at implementing neural networks, mainly because it was designed around the standard von Neumann architecture.

The scientific community has therefore developed specific electronic architectures that directly behave as ANNs, aiming to improve computational speed and energy efficiency.

Optics has already boosted the telecom field to a new performance level by exploiting the huge data-handling capability, speed and flexibility of optical fibres, and it is now doing the same with ANNs.

The BACKUP project fits in this context, where optics will be exploited to find new ways to implement ANN schemes directly inspired by biology. The brain is composed of a huge number of deeply interconnected neurons; we will therefore exploit integrated optics to pack several thousand optical artificial neurons, with a specific interconnection topology, in a microchip smaller than a 1-euro coin. This packing capability allows scaling up the number of artificial neurons, which is directly related to the network’s “intelligence”.

The aim of BACKUP is to use the unique advantages of optics to create an ANN able to process ultrafast optical signals and to learn from external optical stimuli. The learning process, called “deep learning”, will be mediated by the electronic part of the chip, directly connected to the optical ANN. The optical ANN will provide a huge enhancement of computational speed over the state of the art, together with increased power efficiency.

Since the optical ANN is directly inspired by its biological counterpart, real neurons can be integrated onto the optical chip so that they compute together. The final goal is to replace part of a damaged biological neural network with the optical ANN, in order to treat diseases like Alzheimer’s, Parkinson’s and epilepsy.


Figure 3: (a) Integrated optical chip containing both the optical ANN and the control electronics (right: the optical fibres used to train the network with external optical stimuli and to collect the processed signals). (b) Schematic of the optical ANN, where the artificial neurons (circles) are interconnected with a recursive topology.

The logo

The BACKUP project logo recaps the interaction between the biological brain networks and the artificial networks we are going to study and make interact. On the one hand, BACKUP aims to learn from biological systems how to design efficient networks of interacting neurons; on the other hand, BACKUP wants to create artificial networks able to substitute damaged tissue.