The History of NAM

2023 was quite the year for NAM. With the success it's had, I think it's worth putting together a post to go back through NAM's history.


This started as a "year in review post", but as I started working on it, I realized that it might be valuable to go even farther back and re-trace the steps from "the beginning of recorded history" to see how things got to today.


This shaped up to be a lot of text and a lot of videos, but I hope folks find the story interesting.


Here we go!


The early days

NAM started as a little for-fun project that I open-sourced as a GitHub repository back in 2019. As a guitarist who was also a scientist at my day job, getting into machine learning for science applications, it seemed like a nice for-fun project to apply those skills to my musical hobby and see whether it was possible to model the sound of guitar amps using neural networks. Excitingly, the very first attempt worked quite well!



I did a few tweaks on top of that initial version, but life happened and it sat dormant for a few years.


2022


"Second wind"

I didn't do a lot of music for a few years. Then, in early 2022 (February?), I had a "second wind" and picked NAM up again, and really started to work on getting it usable. I started using iPlug2, and, with a proper plugin (but just barely), it was a lot easier and more rewarding to make more models of gear I had around...



...as well as some friends' gear...



To me, from a technical point of view, these were some of the most rewarding moments--I finally had a program that I could use somewhat efficiently, and that worked well enough for me to enjoy making music and hearing the sounds that came out of it.


The comparison videos

I remember the early 2010s on the internet, when there was a lot of skepticism about whether digital modeling could sound like a real tube amp, so I was moving forward cautiously. But I really wanted to see how NAM held up against the existing digital "snapshot modeling" tech.


I've never owned a Kemper, but I was curious to see how NAM compared to it, given its historical significance in black-box "snapshot" amp modeling. I reached out to a stranger on the internet, who graciously agreed to reamp some of their gear. This was the first real "comparison video" I did in what I consider the "current days" of NAM:



Talking with them over DMs, I think we agreed that NAM was "not bad"--which was thrilling to me! As I drilled into the data we'd collected, though, I got a quantitative comparison between the two and found that NAM was actually more accurate than the Kemper, which was a sincere surprise to me--this was a for-fun project that, sure, I wanted to do a good job with, but in the same way you want to try your best on a crossword puzzle: you're not really expecting much out of it.


After that, I tracked down a Quad Cortex owner and, in May 2022, did the same thing...



This was where I started to get...nervously excited? I had heard that the QC was in a league of its own, so I expected that this would be where things ended--"Better than the decade-old Kemper, but not quite where a modern product is." Neural DSP has an incredible reputation for their technology, so to find out that NAM was better was another sincere surprise--and meant that, as far as I could tell, NAM was the state of the art in terms of neural modeling. This was also the first video where I showed what I called the "mono test"--where the model and the source tone are panned hard-left and hard-right, and the result sounds like a mono track if the two are close enough. (I later learned of the "null test", which accomplishes a similar thing and seems pretty popular.)
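As a side note for anyone who wants to try this at home: a null test can be run offline in a few lines of Python. The sketch below is just my own illustration--the file names are placeholders, and it assumes two time-aligned mono WAV files and the soundfile library for I/O. It subtracts the model's take from the real amp's take and reports an error-to-signal ratio; a perfect match nulls to silence.

```python
import numpy as np
import soundfile as sf  # pip install soundfile

# Load the real amp's reamped take and the model's rendering of the same DI.
# File names are placeholders for illustration.
target, sr = sf.read("real_amp.wav")
model, _ = sf.read("nam_model.wav")

# Trim to a common length in case the takes differ by a few samples.
n = min(len(target), len(model))
target, model = target[:n], model[:n]

# The "null" signal: if the model is a perfect match, this is silence.
residual = target - model

# One common way to quantify the match: the error-to-signal ratio (ESR),
# i.e., the energy of the residual relative to the energy of the target.
esr = np.sum(residual**2) / np.sum(target**2)
print(f"ESR: {esr:.6f} (lower is better; 0 means a perfect null)")

# Write the residual out to listen to what's left over.
sf.write("null_residual.wav", residual, sr)
```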


Three months later, when IK Multimedia announced TONEX, I felt like I had an obligation to make another video, so I bought it and...



Two things surprised me about this comparison. First, TONEX appeared to be more accurate than Neural Capture (though I hadn't directly compared the two); second, NAM was still better. Others have also replicated my findings about this "hierarchy" of modeling.


As of my writing this, to my knowledge, NAM is the state of the art for modeling accuracy.


Improvements and new things for NAM in 2022

While the comparison videos were fun to do, the meat of what was happening with NAM for me was in the improvements I was making. I was having fun building new things with NAM and making videos to show them off.


I released the new version:



The next thing I did was to implement parametric models, which allow one to model the behavior of the amp over the range of its knobs and controls:
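To give a rough picture of what "parametric" means here: the knob settings are fed to the network along with the audio, so a single model covers the whole range of the controls. The sketch below is my own toy illustration in PyTorch, not NAM's actual parametric architecture--the knob values are simply broadcast across time and stacked with the input signal as extra channels.

```python
import torch
import torch.nn as nn

class ParametricAmpModel(nn.Module):
    """Toy conditional model: audio plus knob settings in, audio out.
    Illustrative only; NAM's real parametric models are more involved."""

    def __init__(self, num_knobs: int, hidden_size: int = 32):
        super().__init__()
        # 1 audio channel plus one channel per knob (gain, tone, etc.)
        self.net = nn.Sequential(
            nn.Conv1d(1 + num_knobs, hidden_size, kernel_size=3, padding=1),
            nn.Tanh(),
            nn.Conv1d(hidden_size, 1, kernel_size=3, padding=1),
        )

    def forward(self, audio: torch.Tensor, knobs: torch.Tensor) -> torch.Tensor:
        # audio: (batch, 1, samples); knobs: (batch, num_knobs), each in [0, 1].
        # Broadcast the knob values across time and stack them with the audio.
        cond = knobs[:, :, None].expand(-1, -1, audio.shape[-1])
        return self.net(torch.cat([audio, cond], dim=1))

# Example: one second of audio at 48 kHz with two knobs (say, gain and tone).
model = ParametricAmpModel(num_knobs=2)
audio = torch.randn(1, 1, 48_000)
output = model(audio, torch.tensor([[0.75, 0.4]]))
```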



As an aside, this might be the most under-the-radar part of the open-source project. Something of a theme with this project has been that what people know about it is largely driven by social media and what others tend to emphasize. The "social phenomenon" side of NAM has been very interesting to observe.


Next, I added some more neural net architectures. Something that surprises people about NAM is that, for a long time, I developed it "closed-book"--instead of looking at what others were doing in machine learning for audio, I decided to figure things out for myself, using my general intuition for machine learning to guide my approach. I've compared it to a crossword puzzle--something that's fun to do because you're figuring it out for yourself. Sure, there's probably an "answer key" out there for this, but using it sort of robs you of the experience of going through it yourself. It wasn't until September 2022 that I "opened the book" and implemented stacked LSTM and WaveNet models in NAM.*
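For the curious, here's a rough idea of what the stacked-LSTM approach can look like. This is a minimal PyTorch sketch of my own, not NAM's actual implementation--just a stack of LSTM layers reading the DI signal one sample per step, with a linear layer mapping the hidden state to an output sample.

```python
import torch
import torch.nn as nn

class StackedLSTMAmpModel(nn.Module):
    """Toy recurrent amp model: stacked LSTM layers plus a linear output head.
    A rough sketch for illustration, not NAM's actual implementation."""

    def __init__(self, hidden_size: int = 16, num_layers: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=1,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
        )
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, samples, 1) -- the DI signal, one sample per time step.
        h, _ = self.lstm(x)
        return self.head(h)  # (batch, samples, 1): the modeled amp output

model = StackedLSTMAmpModel()
di = torch.randn(1, 4_800, 1)  # 0.1 s of input at 48 kHz
prediction = model(di)         # same shape as the input
```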



One of the most meaningful improvements to the plugin's user experience was the "universal snapshot loader" I shared in November 2022. Until that point, if you wanted to play a NAM of a different piece of gear, you had to train the model, paste some code into the plugin's source, and re-compile the plugin. With this release, users could take a folder containing a "config.json" file and a "weights.npy" file and point the plugin at those files via a file picker dialog to change sounds.
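For a peek at what such a folder held, here's a small Python sketch that reads the two files. It's illustrative only--the key I look up in config.json is my own guess, not the project's documented schema.

```python
import json
from pathlib import Path

import numpy as np

def inspect_snapshot(model_dir: str) -> None:
    """Print a quick summary of an exported model folder containing
    config.json and weights.npy. Key names are assumptions for
    illustration, not the project's documented schema."""
    folder = Path(model_dir)

    with open(folder / "config.json") as f:
        config = json.load(f)
    weights = np.load(folder / "weights.npy")

    # The config describes the architecture; the .npy file holds the
    # trained parameters as an array of floats.
    print("Architecture:", config.get("architecture", "<unknown>"))
    print("Weights:", weights.size, "parameters,", weights.dtype)

inspect_snapshot("exported_models/my_amp")  # placeholder path
```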



2023


Like I said at the top, 2023 was a big year for NAM. I'm going to do my best to recap the highlights.


The NAM plugin gets some help

Around the turn of the year, I was contacted by Oli Larkin, the lead of the iPlug2 framework. Oli had noticed NAM and did a ton of work refactoring the code I had written to really improve the plugin's UI/UX. This was the basis for the plugin that most folks are now familiar with. This is also where the EQ section and IR loader were added. Before this, the plugin had just input and output knobs and a model loader.



A really exciting moment for NAM's UI came with a re-skin in April. This brought the plugin to the look that folks recognize today.



From PR#202


There were also a ton of improvements to all sorts of aspects of the plugin along the way--memory usage was drastically reduced, CPU usage was halved thanks to a contributor's pull request, and other convenience features like resampling were added to make the plugin fit more users' workflows.


Training gets better

As a "coder's" project, NAM came from a place that wasn't the most accessible to musician end-users. Over the course of 2023, I worked to improve the trianing process by making a more streamlined browser-based trainer as well as a locally-installable GUI trainer. I also made videos for how to install for Windows and Mac as well as how to make your first model.


NAM goes viral

Starting at the end of February, the Facebook page started seeing a lot more attention. Over the next few months, the community size grew from a couple hundred to thousands, ending the year at over 15,000 members. To help deal with the amount of attention that the community now required, I onboarded a team of moderators. I'm incredibly grateful to them for their many hours of help and attention reviewing new member requests, helping answer questions, and helping shape the community into something I'm quite proud of.


I've admittedly been reluctant to play up NAM, but I owe a lot to my early users, whose enthusiastic praise and shared experiences spread NAM across the internet. What started as a few comments here and there turned into a few blog posts, a few forum threads, and eventually uncountably many YouTube videos, including a pair of sit-down interviews in the early half of the year.




One of the coolest parts of all of this was seeing artists using NAM in their work. Some folks (myself included) have incorporated it into their gigging rigs, while many others have integrated it into their studios. It was an unbelievable experience to have people reach out to tell me about the projects they were using NAM in, and it's hard to communicate how rewarding it's been for NAM to have such an impact.


And of course, 2023 saw the launch of this website! Because of NAM's special role as an open-source project, the website tries to serve both users who are looking to start having fun with what I've built and fellow builders who are curious about using NAM in their own products.


The world, powered by NAM

With NAM being an open-source project, I'm excited about its potential to be integrated into all sorts of products and for an ecosystem to grow around it. In 2023, I was thrilled to see a variety of product and service launches that have given vibrancy and color to the future NAM might enable:


2024

In the new year, things are looking bright for NAM. There are a lot of things in motion, and I can't wait to share them with you when they're ready.


Stay tuned!


Footnotes

*Some folks have noted that, technically, NAM's WaveNet isn't actually a WaveNet, but includes a few things I changed and added based on my own intuition. I'm not really into naming architectures, but I suppose this would have been a good opportunity!

