Kickoff
In October 2025, a few dozen machine learning nerds, a handful of venture capitalists, some very generous team members from Weights & Biases, and the Perforated AI team gathered on the top floor of Frontier Tower in San Francisco during Open Source Week to kick off our second-ever hackathon, held alongside the PyTorch Conference. In the months that followed, the scale of the event exploded, and by judgment day we saw some of the best results we've ever had, with teams attacking a wide range of problem spaces and using our technology in exciting and novel ways. Here's a video of the results announcement. If you'd prefer to learn about it in blog form, read on.
The Premise
Our hackathons are set up a little differently than your average, run-of-the-mill one-weekend competition. ML engineers get free access to our patented dendritic optimization technology, Perforated Backpropagation™. They can then apply the tech to any model they like, optimizing for model compression, enhanced accuracy, or both. Participants covered a wide range of applications for their dendritic models, from medical imaging to robotics to exoplanet detection, and even to winning another hackathon.
Next, the participants integrate the tech into their training pipelines (this part is very quick; many of the in-person participants did it successfully at the 3-hour kickoff event) and then spend the next couple of months training, fine-tuning, and generating results to present to the judges. This time around, judging took place in mid-January 2026. Several thousand dollars were on the line, and a couple of one-year Weights & Biases Pro Memberships were set to be awarded to the top projects. Along with the traditional first-, second-, and third-place prizes, five honorable mentions each won $250, and we also awarded $500 for "Hackathon Inception", a prize for using our tech to win another hackathon!
When the dust settled, this hackathon had seen 693 participants, 71 submissions, 10 novel use cases, and 1 hackathon inception winner.
The Winners
Some hackathon blogs would save the best for last, but I can't wait, so here's the first-place submission:
First Place - $3,000
Team Neuron AI
Team Neuron AI applied dendrites to YOLO on the Pascal VOC dataset. This ubiquitous object detection framework runs on edge devices around the world, but it has been a particularly hard nut to crack for dendritic optimization. Successfully applying dendrites to YOLO had so far eluded previous hackathon participants and even Dr. Rorry Brenner (CEO and inventor of dendritic optimization). The team solved this problem and achieved a 5.2% reduction in error rate on the project. This valiant effort earned a well-deserved top prize of $3,000 and a one-year Weights & Biases Pro Membership.
Second Place - $2,000
Abhinit Mahajan
Second place went to Abhinit Mahajan for the first-ever dendritic language modeling project. Mahajan trained a transformer on the WikiText-2 natural language dataset and showed an 11% improvement in test scores from adding dendrites to the model. This netted them a cool $2,000 and the other one-year Weights & Biases Pro Membership.
Third Place - $1,000
Nicholas Mesa-Cucalon
The bronze medal went to Nicholas Mesa-Cucalon for Quadruped Robot Control. Using the DeepMind Control Suite, which exercises continuous control with simulated physics, they applied dendrites to a TD3 reinforcement learning algorithm and showed a 13% improvement, rivaling the performance of other, much more complicated TD3 architectures. Nico was rewarded with $1,000 for the project.
[Figure: a traditional neural network compared with a dendritic neural network]
The Honorable Mentions
Five projects won a $250 honorable mention prize. Here they are!
Best Image Classification - $250
Rairo Mukamuri - Produce Labeling for Point of Sale
Working with a fruit and vegetable image dataset, they showed a 47.1% relative error reduction with dendrites.
Best Financial - $250
Prashaman Pokharel - Credit Risk Prediction
Working with tabular credit card default and income datasets, they showed a 3% improvement in accuracy scores while employing a 33% more efficient model.
Best Medical - $250
Harit Dey - Medical Brain Tumor Labeling
Using 3D image segmentation on MRI brain scans, they showed a 47.3% improvement over baseline using dendrites.
Best Graph Neural Network - $250
Abhishek Nandy - Drug Screening
Using a graph isomorphism network on the MoleculeNet Blood-Brain Barrier Penetration dataset for molecular property prediction, they demonstrated a 47% reduction in relative error.
Most Hacker-est - $250
Most Hacker-est (yes, that's a real category) went to Parth Agrawal, who modified the PAIDendrite module in his project to use his own custom dendrites instead of the stock PAI dendrites, showing a 23.6% reduction in relative error.
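Several of these results are reported as a "relative error reduction". Assuming the standard definition (the drop in error rate expressed as a fraction of the baseline error, which is our reading of these numbers, not something the submissions spell out in this post), the calculation looks like this:

```python
def relative_error_reduction(baseline_error: float, new_error: float) -> float:
    """Percent reduction in error relative to the baseline error rate."""
    return (baseline_error - new_error) / baseline_error * 100

# Hypothetical illustration (not a team's actual numbers): a baseline
# error rate of 10% that drops to 5.29% after adding dendrites is a
# 47.1% relative error reduction.
print(round(relative_error_reduction(0.10, 0.0529), 1))  # → 47.1
```

The point of reporting the relative figure is that a few absolute percentage points can be a huge win when the baseline error is already small.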
Hackathon Inception
While the hackathon's specific goal was for participants to show the best application of dendritic optimization, the goal of dendritic optimization itself is to make neural networks smarter, smaller, and just better. This means that a well-implemented dendritic neural net should be able to outperform traditional networks in the real world, enabling participants to win other hackathons. This is the idea behind our $500 Hackathon Inception prize.
Hackathon Inception - $500
Vishy Gopal
Vishy Gopal (with a little help from Dr. Brenner to get the dendrites humming just right) was able to win the Edge Impulse "Best Model" award by adding dendrites to Edge Impulse's neural net Impulse Block and consistently producing better results. Check out Edge Impulse's hackathon results announcement featuring Vishy's win here!
You can't have a hackathon without judges, and we're so grateful to all the judges who helped us make the tough decisions to find the very best projects from the ones we received.
These Hackers Did Cool Stuff With Dendrites, And You Can Too!
Innovation is at the core of Perforated AI, and running hackathons like this is such a cool way to both get our tech into the hands of brilliant engineers and to show even more results that verify how powerful dendritic optimization really is. If you're a curious ML engineer, check us out on GitHub, and if you're interested in applying dendritic optimization to your model to make it smarter, smaller, and just better, get in touch!
Ready to Join Our Journey?
See how Perforated Backpropagation™ can transform your AI projects
Get Started Today