Hi, for clarification: it does indeed use the initial Spleeter training, but goes beyond what the public Spleeter package does. SL7 actually uses zero lines of the Spleeter code, removes several of its technical limitations (such as sample rate, channels, frequency range, length, bit depth...), and produces more optimized results.
I also happen to know that team personally (we're both based in Paris), and we've reviewed how the training could (or could not) be improved.
The other unmixing and restoration processes in SL7 were trained entirely in-house.
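For anyone who hasn't tried it, this is roughly how the public Spleeter package is driven from Python (a minimal sketch for comparison only; the file names are placeholders, and again, SL7 does not run any of this code, it only shares the initial training):

```python
# Minimal sketch of the public Spleeter package (pip install spleeter).
# Shown purely to illustrate what the stock package does; SL7 uses none of this code.
from spleeter.separator import Separator

# Load the pretrained 2-stem model (vocals / accompaniment).
separator = Separator('spleeter:2stems')

# Writes vocals.wav and accompaniment.wav into the output directory.
# 'mix.wav' and 'output/' are placeholder paths.
separator.separate_to_file('mix.wav', 'output/')
```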

As for the use of the term AI: "AI" is commonly associated with machine learning, and deep learning in particular, these days. All the crazy stuff you can read in the press about AI these past few years is actually achieved with deep learning. All the AI processes in SL7 use deep learning. You can argue whether it's appropriate to talk about AI or not in this case, but it's the closest we have right now in terms of machine intelligence, at least in regard to the capability of a machine to analyze and produce meaning from its environment.

Deep learning is actually inspired by biological neurons and by how, connected to each other, they produce meaning from raw signals. While there's no clear understanding of how they are biologically structured in our brain, some models of virtual neurons (convolutional neural networks) can now achieve impressive results in signal processing, approaching the level of human capabilities.
Still, it's "dumb": while achieving more and more impressive results, these models are trained for only one specific task at a time. The next level would be AGI: Artificial General Intelligence. But it's still science fiction at this point.
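For anyone curious what a convolutional neural network looks like in practice, here's a tiny hypothetical sketch (PyTorch, with arbitrary layer sizes; it has nothing to do with SL7's actual models) of 1-D convolutions applied to a raw audio buffer:

```python
# A minimal, illustrative 1-D convolutional network over a raw audio signal.
# Layer sizes and kernel widths are arbitrary; this is not SL7's architecture.
import torch
import torch.nn as nn

class TinyAudioCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Each Conv1d layer slides small learned filters over the signal,
        # the rough analogue of neurons responding to local patterns.
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.Conv1d(16, 1, kernel_size=9, padding=4),  # back to one channel
        )

    def forward(self, x):
        return self.net(x)

# One second of fake mono audio at 44.1 kHz: shape (batch, channels, samples).
signal = torch.randn(1, 1, 44100)
model = TinyAudioCNN()
output = model(signal)
print(output.shape)  # torch.Size([1, 1, 44100])
```

Each such network is trained end-to-end for its one task (unmixing, de-noising, etc.), which is exactly why it stays "dumb" outside that task.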