
Neuromorphic computing

An intro to building a brain~

Neha Singh
24.04.2025
BCA 4th Semester
Roll no. : 231912033349
Submitted to: Mrs Amita Verma
Contents

Neuromorphic computing
INTRODUCTION
~Carver Mead
“PRINCIPLES OF NEUROMORPHIC TECHNOLOGY”
“Spiking Neural Networks & Event-Driven Computing”
Intel Loihi: On-Chip Learning in a 14 nm Neuromorphic Processor
Von Neumann Architecture
“APPLICATION OF NEUROMORPHIC COMPUTING”
● Edge AI & IoT
● Robotics & Autonomous Vehicles
● Healthcare & Biomedical
● Industrial Automation & Smart Manufacturing
Scientific Research & Cloud Integration
“Challenges of neuromorphic computing”
INTRODUCTION

1. Neuromorphic computing is a new computing paradigm inspired by the workings of the human brain.

2. It involves the use of artificial neural networks that mimic the structure and function of biological neurons.

3. These networks are implemented in specialized hardware that is designed to optimize the performance of neural computations.

4. Neuromorphic computing is particularly well-suited for tasks such as pattern recognition, classification, and sensory processing.

5. Unlike traditional digital computing, which relies on fixed algorithms, neuromorphic computing is more flexible and adaptable, making it ideal for handling complex and dynamic data.

What is Neuromorphic computing?


Neuromorphic engineering, also known as neuromorphic computing, started as a
concept developed by Carver Mead in the late 1980s, describing the use of
very-large-scale integration (VLSI) systems containing electronic analogue
circuits to mimic neurobiological architectures present in the nervous system.

Neuro: “to do with neurons i.e. neurally inspired”.

Morphic: “structure or form”

~Emulates the functional structure of neurobiological systems.


~Carver Mead

Carver A. Mead is a Caltech professor who first showed that silicon transistors could
behave like the neurons and synapses in our brains.

In the late 1980s and early 1990s, his lab built analog VLSI chips that mimic neural
dynamics and even created sensors—like a silicon “retina” and “cochlea”—that only send
spikes when the input changes, cutting power use and data rates. By using event-driven,
massively parallel designs and exploring on-chip learning rules, Mead laid the
groundwork for today’s neuromorphic chips, such as IBM TrueNorth and Intel Loihi, which run tiny, always-on AI tasks in robots, self-driving cars, medical implants and other low-power edge applications.
~Moore’s Law

In 1965, Gordon Moore made a prediction that would set the pace for our modern digital revolution. From careful observation of an emerging trend, Moore extrapolated that computing would dramatically increase in power, and decrease in relative cost, at an exponential pace. The insight, known as Moore’s Law, became the golden rule for the electronics industry and a springboard for innovation.

“PRINCIPLES OF NEUROMORPHIC TECHNOLOGY”

• Build machines that have perception capabilities similar to human perception.
• Adaptable and self-organizing.
• Robust to changing environments.

~Realisation of future “THINKING machines (intelligent and interactive systems)”.

“Neuromorphic Architecture”

• Computer architectures that are similar to biological brains, i.e. computer architectures that implement artificial neural networks in hardware.

• Functional units are composed of neurons, axons, synapses, and dendrites.

• Synapses are connections between two neurons: a synapse remembers its previous state, updates to a new state, and holds the weight of the connection.

• Axons and dendrites connect to many neurons/synapses, like long-range buses (a minimal sketch of these units follows).
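
To make the roles of these functional units concrete, here is a minimal Python sketch (the class and field names are illustrative assumptions, not tied to any particular chip) in which neurons hold membrane state and synapses hold connection weights, with the dendrite and axon lists standing in for the incoming and outgoing wiring:

    from dataclasses import dataclass, field

    @dataclass
    class Synapse:
        pre: int        # index of the presynaptic neuron
        post: int       # index of the postsynaptic neuron
        weight: float   # connection strength; remembered and updated during learning

    @dataclass
    class Neuron:
        potential: float = 0.0   # membrane potential (the neuron's internal state)
        threshold: float = 1.0   # firing threshold
        dendrites: list = field(default_factory=list)   # incoming synapses
        axon: list = field(default_factory=list)        # outgoing synapses (fan-out to many targets)

In real neuromorphic hardware these are physical circuits rather than software objects, but the mapping of state to functional units is the same idea.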
“Spiking Neural Networks & Event-Driven Computing”
Traditional ANNs vs. Spiking Neural Networks:

Traditional Artificial Neural Networks (ANNs) process information in synchronized steps. In an ANN, every neuron multiplies its inputs by weights, sums them, and applies a continuous activation (like 0.23 or 0.87) on each clock cycle—even if there is no new data—wasting energy on check-ups that do nothing. Spiking Neural Networks (SNNs), by contrast, model each neuron as an integrator that stays silent until inputs pile up and cross a threshold; only then does it fire a brief, all-or-nothing “spike” and reset itself. Because information is carried by when spikes occur rather than by smooth values, SNNs only compute at actual events, whereas ANNs must crunch numbers on every tick regardless of activity.
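
The contrast can be made concrete with a leaky integrate-and-fire neuron, the standard SNN building block. The following is a minimal Python sketch (the leak, threshold, and weight values are arbitrary choices for the example, not parameters of any specific chip):

    import numpy as np

    def lif_neuron(input_spikes, weights, threshold=1.0, leak=0.9):
        """Leaky integrate-and-fire neuron: integrates weighted input spikes,
        fires an all-or-nothing spike when the threshold is crossed, then resets."""
        potential = 0.0
        output_spikes = []
        for spikes_t in input_spikes:                    # one entry per time step
            potential = leak * potential + np.dot(weights, spikes_t)
            if potential >= threshold:
                output_spikes.append(1)                  # emit a spike
                potential = 0.0                          # reset after firing
            else:
                output_spikes.append(0)                  # stay silent; nothing to compute downstream
        return output_spikes

    # Example: 3 input lines, 5 time steps of sparse binary spikes
    spikes = np.array([[1, 0, 0], [0, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]])
    print(lif_neuron(spikes, weights=np.array([0.4, 0.5, 0.3])))   # -> [0, 0, 0, 1, 0]

The neuron stays silent on most time steps and emits a single spike only once enough weighted input has accumulated, whereas an ANN unit would have produced a continuous output value at every step.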

Event-Driven Computing:

Like a motion-sensor light: it stays off until something happens.

Traditional chips: run on a constant clock, making every part check for work all the time (wasting energy when idle).
Neuromorphic approach: most circuits nap in ultra-low-power mode until they receive a spike or sensor event.
Selective wake-up: only the exact neuron/synapse circuit that needs to fire turns on, processes the event, then goes back to sleep (a small code sketch of this pattern follows the list below).
Massive savings: uses 10×–100× less energy than always-on AI accelerators.

Real-world wins:

● Wearables that monitor heartbeats or fall detection continuously for days on one charge
● Smart cameras that record only motion events at milliwatt power
● Drones and robots that avoid obstacles in real time without draining batteries
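
In software, the wake-on-spike behaviour described above maps naturally onto an event queue: nothing runs on a clock, and a neuron's state is touched only when a spike addressed to it arrives. The sketch below is purely illustrative (the function and variable names are made up for this example and are not the API of any neuromorphic toolkit):

    from collections import deque

    def run_event_driven(events, neurons, fanout, threshold=1.0):
        """Event-driven simulation sketch: neurons stay idle until a spike event
        addressed to them arrives; only then is their state updated."""
        queue = deque(events)                        # (time, target_neuron, weight) sensor events
        while queue:
            t, target, weight = queue.popleft()      # wake up only the addressed neuron
            neurons[target] += weight                # integrate the incoming spike
            if neurons[target] >= threshold:
                neurons[target] = 0.0                # fire and reset
                for post, w in fanout.get(target, []):   # propagate new events downstream
                    queue.append((t + 1, post, w))
        return neurons

    # Example: two sensor events drive neuron 0, which fans out to neuron 1
    state = {0: 0.0, 1: 0.0}
    fanout = {0: [(1, 0.8)]}
    print(run_event_driven([(0, 0, 0.6), (1, 0, 0.6)], state, fanout))   # -> {0: 0.0, 1: 0.8}

Idle neurons cost nothing in this loop, which is the software analogue of the selective wake-up described above.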

“IBM TrueNorth: A Million-Neuron, Ultra-Low-Power Neuromorphic Chip”

Who: IBM Research is the advanced R&D arm of IBM, tackling tomorrow’s computing challenges.

What: TrueNorth is a digital neuromorphic chip that packs:

● 4,096 independent “neuron cores” (think of each core as a tiny brain region)
● 1,000,000 silicon neurons and 256,000,000 programmable synapses (the connections that carry information)
● Ultra-low power draw: around 65 milliwatts—comparable to a dim night-light.

Demo:

● Ran a handwritten-digit recognition task in real time at just 70 mW, whereas a typical GPU would need tens of watts for the same job.
● Proved that large, spiking-neuron systems can match AI accuracy while sipping orders-of-magnitude less energy.

Why it matters:

● TrueNorth demonstrated at scale that brain-inspired architectures need not sacrifice performance for power.
● It opened the door to always-on vision and audio applications—surveillance cameras, hearing aids, mobile devices—that can run complex AI on very small batteries.

Intel Loihi:
On-Chip Learning in a 14 nm Neuromorphic Processor
Who: Intel Labs is the research division of Intel, best known for making the CPUs inside
most PCs and servers.

What: Loihi is Intel’s first neuromorphic research chip, built on a 14 nm process, featuring:

● 128 neuromorphic cores (each core simulates thousands of spiking neurons)
● ~130,000 total neurons with on-chip learning rules (synapses adjust themselves in hardware).

Demo:

● A small robot equipped with Loihi learned to navigate a new maze in minutes, without any pre-training or cloud-based computation.
● The chip adjusted its own connection strengths on the fly, showing true on-device learning (a simplified sketch of such a rule appears at the end of this section).

Why it matters:

● Proved that adaptive, real-time learning needn’t rely on big data centers—robots and edge devices can teach themselves right where they are.
● Highlights a practical path for spiking hardware to join mainstream robotics and autonomous systems.
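
Loihi’s on-chip learning relies on local rules in which each synapse adjusts its own weight from the timing of the spikes it sees. As an illustration, here is a simplified spike-timing-dependent plasticity (STDP) style update in Python; the constants and the exact form are assumptions for this sketch, not Intel’s actual learning-rule implementation:

    import math

    def stdp_update(weight, t_pre, t_post, a_plus=0.05, a_minus=0.05, tau=20.0):
        """Simplified STDP: if the presynaptic spike precedes the postsynaptic spike,
        strengthen the synapse; if it follows, weaken it. The change decays
        exponentially with the timing difference."""
        dt = t_post - t_pre
        if dt > 0:                                    # pre before post: potentiate
            weight += a_plus * math.exp(-dt / tau)
        elif dt < 0:                                  # post before pre: depress
            weight -= a_minus * math.exp(dt / tau)
        return max(0.0, min(1.0, weight))             # keep the weight in [0, 1]

    print(stdp_update(0.5, t_pre=10.0, t_post=15.0))  # causal pairing -> weight increases
    print(stdp_update(0.5, t_pre=15.0, t_post=10.0))  # anti-causal pairing -> weight decreases

Because such a rule needs only the pre- and post-synaptic spike times, it can run inside each synapse circuit, which is what allows the maze-learning demo to happen entirely on the chip without a cloud connection.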
“Von Neumann architecture and Neuromorphic
computing”

Von Neumann Architecture:

Stored-program model: Instructions and data share the same memory space and are loaded into the CPU for execution.
Central Processing Unit (CPU): Fetches, decodes, and executes one instruction at a time.
Memory: A single RAM module holds both program instructions and data.
I/O System: Interfaces with peripherals via the same bus architecture.

Neuromorphic Computing:

Brain-inspired design: Models silicon “neurons” and “synapses” to mimic how biological neural networks compute via spikes.

Event-driven: Circuits only activate when an input “spike” arrives, rather than on every clock tick.

Memory+Compute Co-location: Synaptic weights are stored in the same physical location as the neuron circuits, eliminating massive data shuttling.
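
The co-location point can be seen in a deliberately simplified software contrast: in the Von Neumann style every weight must be fetched from a separate memory before it can be used, while in the neuromorphic style the weight lives as local state next to the element that uses it. Both snippets below are illustrative sketches, not models of real hardware:

    # Von Neumann style: weights live in a separate memory array; every step
    # fetches a weight across the shared bus before the CPU can multiply-accumulate.
    def cpu_step(inputs, weight_memory):
        total = 0.0
        for i, x in enumerate(inputs):
            w = weight_memory[i]   # explicit fetch from memory
            total += w * x         # compute happens in the CPU, away from the data
        return total

    # Co-located style: the weight is captured as local state of the "synapse",
    # so an arriving spike is handled in place, with no separate weight fetch.
    def make_synapse(weight):
        def on_spike(potential, spike=1.0):
            return potential + weight * spike   # the weight sits next to the computation
        return on_spike

    bump = make_synapse(0.7)
    print(cpu_step([1, 0, 1], [0.2, 0.5, 0.3]))   # -> 0.5
    print(bump(0.2))                              # -> 0.9

In hardware, removing the constant shuttling of weights over a shared bus is where much of the energy saving comes from.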
“APPLICATION OF NEUROMORPHIC COMPUTING”

● Edge AI & IoT

Neuromorphic chips are ideal for edge devices because they process sensor data locally, reducing cloud uploads and latency. Their event-driven operation keeps them in a micro-watt “sleep” until real events occur, enabling always-on monitoring in smart homes, industrial sensors, and environmental monitors. Wearables such as fitness trackers and hearables leverage spiking cores to analyze biometric signals (heart rate, EEG) continuously for days on a single battery charge.

● Robotics & Autonomous Vehicles

Spiking neural networks give robots and drones human-like reflexes: they integrate visual, tactile, and inertial data in real time and adapt on the fly, all at millisecond latencies. In autonomous cars, neuromorphic accelerators handle event-based vision and lidar spikes to detect obstacles and lane markings faster and with lower power than conventional GPUs. This on-chip learning capability lets robots navigate unknown terrain without constant cloud retraining.

● Healthcare & Biomedical

Neuromorphic processors power implantable neurostimulators that adjust electrical pulses in response to neural activity, improving therapies for Parkinson’s and epilepsy while minimizing battery replacements. Wearable brain–machine interfaces decode EEG signals on-device, enabling hands-free control for paralyzed patients with strict privacy and energy constraints.

● Industrial Automation & Smart Manufacturing

In Industry 4.0 settings, neuromorphic accelerators embedded in machinery enable predictive maintenance by spotting vibration or acoustic anomalies in real time, avoiding costly downtime. Cloud-connected “cloud robotics” systems use neuromorphic nodes on factory robots to learn assembly tasks while sharing updates via a central server—combining local speed with global coordination. This hybrid approach boosts flexibility for small-batch, high-mix production lines.

Cybersecurity & Finance

Neuromorphic chips excel at anomaly detection in network traffic by spotting rare “spike” patterns in data streams, enabling low-latency intrusion detection with minimal power draw. In finance, spiking-based pattern recognition accelerates fraud detection and high-frequency trading strategies by adapting to market fluctuations on-chip, rather than relying on power-hungry server farms.

Scientific Research & Cloud Integration

Wafer-scale neuromorphic platforms like BrainScaleS accelerate neuroscience simulations by running cortical microcircuits 10,000× faster than real time, slashing experiment turnaround from days to minutes. Hybrid cloud-neuromorphic systems distribute computation: event-driven edge nodes preprocess data, while the cloud handles heavy analytics, optimizing energy use and data privacy.
“Challenges of neuromorphic computing”

1. Algorithm & Software Complexity: The lack of standardized spiking-network models makes it hard to design and compare neuromorphic algorithms effectively. Additionally, without clearly established benchmarks or challenge problems, progress is anecdotal and uneven across platforms.

2. Device & Materials Variability: Emerging memristive and phase-change synapse technologies show large device-to-device variations that disrupt predictable neural behavior. Moreover, integrating these novel materials into standard CMOS flows faces yield, reliability, and aging issues that slow down fabrication.

3. Architecture & Scalability: Scaling to millions or billions of neurons demands complex spike-routing networks, which suffer IR-drop and leakage in large interconnect arrays. Wafer-scale analog systems, while fast, struggle with calibration complexity and low manufacturing yields due to analog variability.

4. Programming Models & Toolchains: Frameworks like Lava and PyNN are still maturing and lack the polish and ecosystem of TensorFlow/PyTorch, making neuromorphic development cumbersome. Debugging and profiling asynchronous, spike-based code is also far harder than tracing synchronous deep-net layers, slowing developer productivity.
