Building Smarter, Scalable Hardware For Artificial Intelligence

About The Author

I'm an engineer working to bridge the fields of neuroscience, artificial intelligence, and nanomaterials to create new computing architectures modeled after the brain. I'm the technical founder of Rain Neuromorphics, where I get to work on some of the most exciting problems facing these fields today.

I also have a deep passion for mathematics and physics. Most of my interests revolve around complex systems such as quantum computing, self-organization, and deep learning.

The current hardware of choice for training neural networks, the backbone of modern artificial intelligence, is the graphics processing unit (GPU). As its name suggests, the GPU was originally designed for rendering images at high speeds; the realization that it could also accelerate neural network training was serendipitous. When a neural network is trained on a GPU, its mechanisms are simulated in software rather than realized physically, and this gives rise to various limitations that are not present in biological neural systems. For example, in GPUs,...
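To make the point concrete, here is a toy sketch of what "simulating a neural network in software" means. This is a hypothetical illustration, not code from Rain Neuromorphics; it uses NumPy on the CPU, but a GPU runs the same kind of computation, just in parallel. The "neurons" and "synapses" are nothing more than arrays and matrix multiplications stepped through by a program:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: the XOR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A two-layer network (2 -> 8 -> 1). The "synaptic weights" are
# just floating-point arrays held in memory.
W1 = rng.normal(0, 1, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
lr = 1.0
for step in range(2000):
    # Forward pass: each "neuron" is simulated as arithmetic on arrays.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass (backpropagation), likewise simulated in software.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # Gradient-descent weight updates.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```

Every step here, from the forward pass to the weight updates, is a sequence of instructions executed by a general-purpose processor, whereas in a biological brain the corresponding dynamics unfold directly in the physical substrate.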
