PropelGrad

Neuromorphic Computing Engineer Jobs & Internships 2026

Neuromorphic computing engineers design hardware and software for brain-inspired architectures that process information fundamentally differently from conventional von Neumann computers. Neuromorphic chips such as Intel's Loihi and IBM's TrueNorth use spiking neural network (SNN) paradigms and massively parallel, event-driven processing to achieve dramatic energy-efficiency advantages on specific AI workloads. The field is highly research-oriented, with applications in ultra-low-power edge AI, sensory processing, and theoretical neuroscience.

Intern monthly pay: $8,500–$13,000/mo
Entry-level salary: $125,000–$185,000

What Does a Neuromorphic Computing Engineer Do?

Neuromorphic computing engineers implement spiking neural network models that translate the temporal spike timing of biological neurons into computational operations on neuromorphic hardware. They develop programming tools and frameworks — often proprietary SDKs for specific chips — that abstract the hardware details of neuromorphic processors into usable ML development environments. Converting trained conventional deep learning models into SNN equivalents that preserve accuracy while gaining the efficiency advantages of event-driven computation is a core technical challenge. They benchmark neuromorphic hardware against GPU baselines to characterize the energy efficiency and latency trade-offs, identifying which workloads benefit most from the neuromorphic paradigm. Collaboration with neuroscience researchers informs hardware design with biological insights about how real neural circuits achieve efficient computation.
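The spike-timing dynamics these engineers work with can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron, the workhorse model on most neuromorphic hardware. This is a toy NumPy sketch; the time constant, threshold, and input values are illustrative and not tied to any particular chip or SDK.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential decays
    toward rest with time constant tau, integrates the input current,
    and emits a binary spike whenever it crosses the threshold."""
    v = 0.0
    spikes = []
    for i in input_current:
        v += dt / tau * (-v + i)   # leaky integration step
        if v >= v_th:              # threshold crossing -> spike
            spikes.append(1)
            v = v_reset            # reset membrane after spiking
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant drive above threshold produces a regular spike train.
train = lif_neuron(np.full(100, 1.5))
print(train.sum(), "spikes in 100 steps")
```

Information is carried by when and how often the neuron fires, rather than by a continuous activation value.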

Required Skills & Qualifications

  • Spiking neural network (SNN) design and simulation with NEST, Brian2, or Norse
  • Intel Loihi, IBM TrueNorth, or BrainChip Akida programming and optimization
  • ANN-to-SNN conversion techniques: rate coding and temporal coding approaches
  • Event-driven computation paradigms and asynchronous hardware design
  • Power and latency profiling for neuromorphic vs. GPU comparisons
  • Computational neuroscience fundamentals: Hodgkin-Huxley model, STDP learning
  • FPGA programming (VHDL or Verilog) for custom neuromorphic hardware development
  • PyTorch with SNN-specific libraries for neuromorphic model training
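The spike-timing-dependent plasticity (STDP) listed above can be sketched as a pair-based weight update: synapses strengthen when the presynaptic spike precedes the postsynaptic one and weaken otherwise. The amplitudes and time constants below are illustrative defaults, not values from any specific chip.

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window. dt_ms = t_post - t_pre: potentiate (LTP)
    when the pre spike precedes the post spike (dt > 0), depress (LTD)
    when it follows (dt < 0); both effects decay exponentially with lag."""
    if dt_ms > 0:
        return a_plus * np.exp(-dt_ms / tau_plus)    # LTP
    return -a_minus * np.exp(dt_ms / tau_minus)      # LTD

# Pre fires 5 ms before post -> weight increases; 5 ms after -> decreases.
print(stdp_dw(5.0), stdp_dw(-5.0))
```

Because the update depends only on local spike timing, rules like this are attractive for on-chip learning without backpropagation.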

A Day in the Life of a Neuromorphic Computing Engineer

Morning begins with running SNN inference benchmarks on Intel Loihi 2 hardware, comparing energy consumption and latency against an equivalent ConvNet on an edge GPU. The neuromorphic implementation shows 10x lower energy consumption on a keyword-spotting task, promising for the target battery-powered sensor application. Late morning is spent debugging an ANN-to-SNN conversion in which the converted model shows an 8% accuracy drop versus the original floating-point model, which points to the rate-coding configuration and threshold calibration. After lunch, a research meeting discusses a new learning rule inspired by biological spike-timing-dependent plasticity that could enable on-chip learning without backpropagation. The afternoon goes to implementing a Python wrapper that makes the Loihi SDK more accessible to ML researchers unfamiliar with neuromorphic programming.
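The rate-coding and threshold-calibration issues mentioned above can be shown in miniature: a toy ReLU layer is re-run as integrate-and-fire neurons whose firing threshold is balanced to the layer's maximum activation, so each neuron's spike rate approximates its activation. The weights and inputs here are random stand-ins, not a real converted model.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))            # hypothetical trained ReLU layer
x = rng.random(8)
relu_out = np.maximum(w @ x, 0.0)

# Rate coding: run the layer as integrate-and-fire neurons for T steps.
# With the threshold calibrated to the max activation ("threshold
# balancing"), spike_count / T * v_th approximates the ReLU output.
T = 1000
v_th = relu_out.max()                  # threshold calibration
v = np.zeros(4)
counts = np.zeros(4)
for _ in range(T):
    v += w @ x                         # constant input integrated each step
    fired = v >= v_th
    counts += fired
    v[fired] -= v_th                   # soft reset keeps the residual charge
rates = counts / T * v_th
print(np.abs(rates - relu_out).max())  # approximation error shrinks as T grows
```

A miscalibrated threshold or too few timesteps produces exactly the kind of accuracy gap described above.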

Career Path & Salary Progression

Neuromorphic Research Intern → Neuromorphic Engineer I → Senior Neuromorphic Engineer → Principal Research Engineer → Research Director

| Level | Base Salary | Total Comp (with equity) |
| --- | --- | --- |
| Intern | $8,500–$13,000/mo | n/a |
| Entry-Level (0–2 yrs) | $125,000–$185,000 | +20–40% in equity/bonus |
| Mid-Level (3–5 yrs) | $185,000–$259,000 | +30–60% in equity/bonus |
| Senior (5–8 yrs) | $259,000–$362,000 | +50–100% in equity/bonus |

Salary data sourced from Levels.fyi, Glassdoor, and company disclosures. 2026 estimates.

Top Companies Hiring Neuromorphic Computing Engineers

Intel

IBM

Samsung

BrainChip

SynSense

Apply for Neuromorphic Computing Engineer Roles

Submit your profile and a PropelGrad recruiter will help you land an interview for neuromorphic computing engineer internships and entry-level positions at top companies.

Neuromorphic Computing Engineer — Frequently Asked Questions

What is a spiking neural network and how is it different from a conventional neural network?

Conventional neural networks communicate continuous activation values between layers. Spiking neural networks instead communicate discrete spikes (binary events) in time, more closely mimicking biological neurons. Because computation only occurs when spikes arrive, this event-driven design offers potential energy savings for sparse, temporally structured inputs like audio and sensor streams. The trade-off is that the discrete spike function is non-differentiable, so SNNs cannot be trained directly with standard backpropagation and typically rely on surrogate gradients or ANN-to-SNN conversion.
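A toy comparison makes the event-driven point concrete: a dense layer performs a multiply-accumulate for every input on every step, while a spiking layer only touches the weights of inputs that actually fired. The layer sizes and spike probability below are arbitrary illustration values.

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=(16, 100))         # 16 neurons, 100 inputs

# Dense ANN layer: every input costs a multiply-accumulate, always.
dense_ops = w.shape[0] * w.shape[1]    # 1600 MACs regardless of the input

# Event-driven SNN layer: only columns where a spike arrived are touched.
spikes = (rng.random(100) < 0.05).astype(int)   # sparse binary events
active = np.flatnonzero(spikes)
snn_out = w[:, active].sum(axis=1)     # accumulate weights of spiking inputs only
snn_ops = w.shape[0] * active.size

print(dense_ops, snn_ops)              # work scales with spike count, not layer size
```

The sparse accumulation gives the same result as the dense product `w @ spikes`, but the amount of work tracks the number of events rather than the size of the layer.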

Is neuromorphic computing commercially viable in 2026?

For specific ultra-low-power edge applications, such as keyword spotting, simple gesture recognition, and sensor anomaly detection, neuromorphic chips like BrainChip Akida are shipping in commercial products. For general-purpose AI inference, conventional GPU and NPU architectures remain dominant, so the technology's commercial relevance is currently confined to niches where extreme power efficiency is the primary constraint.

How many job openings exist for neuromorphic engineers?

The field is small — total global openings at any given time are likely a few hundred, concentrated at Intel Labs, IBM Research, Samsung's Research America, and a handful of startups. This is a very specialized niche that attracts engineers motivated by scientific exploration rather than job market breadth. The skills developed (SNN modeling, event-driven computing, neuroscience-inspired architecture) are transferable to broader AI hardware and computational neuroscience roles.

What academic programs best prepare someone for neuromorphic engineering?

Computer architecture, electrical engineering, and computational neuroscience programs with research experience in brain-inspired computing are ideal. Intel's Neuromorphic Research Community offers collaboration opportunities with universities around Loihi. The CapoCaccia Neuromorphic Engineering Workshop is a premier annual gathering for the field. Reading the biological neuroscience literature alongside ML research builds the cross-disciplinary background that the most effective neuromorphic engineers share.

How does Intel's Loihi 2 differ from conventional AI accelerators?

Loihi 2 is a research chip with up to 1 million spiking neurons and 120 million synapses per chip, designed for event-driven computation. Unlike NVIDIA's A100, which is optimized for dense matrix multiplication at high throughput, Loihi 2 targets sparse, asynchronous computation at extremely low power. It is not competitive with GPUs for training large neural networks, but for suitable sparse inference workloads it can run at orders of magnitude (up to roughly 1,000x) lower energy consumption.