UCSC Introduces SpikeGPT

Jason Eshraghian, an assistant professor of electrical and computer engineering at UC Santa Cruz, has developed a new language-generation model built on an alternative approach called a spiking neural network (SNN), which uses less energy than conventional deep learning. Eshraghian and two of his students have released the open-source code for SpikeGPT, the largest language-generating SNN to date. Large language models such as ChatGPT rely on a technique called self-attention: given a sequence of data, such as a string of words, the model computes how strongly each element relates to every other element. The mathematics behind this requires matrix-matrix multiplication, which is computationally expensive and grows quadratically with the length of the sequence.
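
To make that cost concrete, the sketch below implements standard scaled dot-product self-attention in plain NumPy (illustrative only; it is not SpikeGPT's code). The (T, T) score matrix is the expensive part: both the compute and the memory it requires grow quadratically with the sequence length T.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project the input sequence into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                 # each (T, d)
    # Score every position against every other: a (T, T) matrix product.
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # O(T^2 * d) work
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    # A second matrix product, (T, T) x (T, d), mixes the values.
    return weights @ V

rng = np.random.default_rng(0)
T, d = 8, 16                                         # sequence length, width
X = rng.standard_normal((T, d))
Wq, Wk, Wv = [rng.standard_normal((d, d)) for _ in range(3)]
print(self_attention(X, Wq, Wk, Wv).shape)           # (8, 16)
```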

Spiking neural networks, however, present their own training challenges. Many of the optimization strategies developed for conventional neural networks and modern deep learning, such as gradient descent via backpropagation, cannot be applied directly to SNNs: the discrete, all-or-nothing spikes that carry information through the network are non-differentiable, so gradients cannot flow through them. Eshraghian has pioneered methods that circumvent this problem, allowing the optimization techniques of traditional deep learning to be used to train SNNs.
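
One widely used workaround in the field, implemented in training libraries such as Eshraghian's snnTorch, is the surrogate gradient: keep the hard, non-differentiable spike in the forward pass, but substitute a smooth stand-in for its derivative in the backward pass so gradients can flow. The PyTorch sketch below illustrates the general idea on a single leaky integrate-and-fire neuron; the names and constants here are illustrative assumptions, not SpikeGPT's actual implementation.

```python
import torch

class SpikeSurrogate(torch.autograd.Function):
    """Heaviside step forward; smooth surrogate gradient backward."""
    @staticmethod
    def forward(ctx, membrane):
        ctx.save_for_backward(membrane)
        return (membrane > 0).float()        # non-differentiable spike

    @staticmethod
    def backward(ctx, grad_output):
        (membrane,) = ctx.saved_tensors
        # Fast-sigmoid surrogate: 1 / (1 + |u|)^2 stands in for the true
        # derivative (a Dirac delta), letting backpropagation proceed.
        return grad_output / (1.0 + membrane.abs()) ** 2

def lif_step(x, mem, beta=0.9, threshold=1.0):
    """One leaky integrate-and-fire update: decay, integrate, fire, reset."""
    mem = beta * mem + x                     # leaky integration of input
    spk = SpikeSurrogate.apply(mem - threshold)
    mem = mem - spk * threshold              # soft reset where a spike fired
    return spk, mem

x = torch.randn(4, requires_grad=True)
spk, mem = lif_step(x, torch.zeros(4))
spk.sum().backward()                         # works despite the step function
print(x.grad)                                # nonzero surrogate gradients
```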

SpikeGPT offers benefits for data security, privacy, and accessibility, and it advances green computing and energy efficiency within the field. However, this transition will require the development of brain-inspired hardware, a significant investment. Eshraghian hopes to work with a hardware company such as Intel to host these models, which would allow him to further demonstrate the energy-saving benefits of his SNN.