Get Real Time

tl;dr: machine learning has to be real time. TensorFlow Lite devs need a little history.

So much for my determination to get back into regular blogging. Reboot, again.

So, like most people who work around technology and are past their first flush, and like pragmatists everywhere, I’m not a specialist. That’s not to say delving deep into any subject hasn’t got its merits. Any one subject, you pick up the necessary skill set, I’d argue: basic comprehension of written language, basic mathematical thinking, the constraints of the material world, an ability to Google your unknowns.

Any more than one subject, the human mind starts pattern matching. Analogies are evident; more so contradictions.

I’ve dipped into practical signal processing and bleepy music, and now I’m fascinated by recent advances in machine learning. There’s a major overlap. Chart the line of one against the other.

Look back at any work about digital music synthesis from before, say, 1980. At the back of virtually every author’s mind you will find an eye on hardware capability. I have a book on DSP in front of me, published in 2001. Most of it relies on the generalised maths. But then it gets constipated. Forays into the practical realisation of, say, a digital filter are hobbled by the desire to show it’s doable. Here’s some reasonable pseudocode. Here’s an implementation in assembler for a Texas Instruments chip that probably found its way into the bin a year after design.

Well-motivated, yes: this stuff is not only possible, it’s actually quite easy if you follow the meta-algorithm. But wrong-minded, imho. The approach blocks you from a wide open space.

There is one simple point I want to make here, but I’m dancing around it because it is trivial.
A 2001 digital filter will calculate its coefficients offline. Why? Because of the technical limitations of the day, not for any rule of the universe.
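
To make that concrete, here’s a minimal sketch in Python of the opposite approach: recompute the filter coefficients inside the real-time loop, every time the cutoff moves. The formulas are the standard RBJ audio-EQ-cookbook low-pass biquad; the class name, the block size and the cutoff sweep are mine, purely for illustration.

```python
import math
import random

class OnlineLowpass:
    """Low-pass biquad whose coefficients are recomputed inside the audio loop."""

    def __init__(self, fs):
        self.fs = fs
        self.x1 = self.x2 = self.y1 = self.y2 = 0.0   # filter state

    def _coeffs(self, fc, q=0.7071):
        # Standard RBJ cookbook low-pass biquad, normalised by a0.
        w0 = 2.0 * math.pi * fc / self.fs
        alpha = math.sin(w0) / (2.0 * q)
        cw = math.cos(w0)
        a0 = 1.0 + alpha
        b = ((1.0 - cw) / 2.0 / a0, (1.0 - cw) / a0, (1.0 - cw) / 2.0 / a0)
        a = (-2.0 * cw / a0, (1.0 - alpha) / a0)
        return b, a

    def process_block(self, samples, fc):
        # Coefficients are calculated here, online, for whatever fc is right now.
        (b0, b1, b2), (a1, a2) = self._coeffs(fc)
        out = []
        for x in samples:
            y = b0 * x + b1 * self.x1 + b2 * self.x2 - a1 * self.y1 - a2 * self.y2
            self.x2, self.x1 = self.x1, x
            self.y2, self.y1 = self.y1, y
            out.append(y)
        return out

# Sweep the cutoff while the audio keeps flowing: no offline coefficient table anywhere.
filt = OnlineLowpass(fs=48000.0)
for block in range(4):
    fc = 200.0 + 500.0 * block
    noise = [random.uniform(-1.0, 1.0) for _ in range(64)]
    _ = filt.process_block(noise, fc)
```

On 2001 hardware the trig calls per block were the expensive bit; today they cost next to nothing, which is the whole point.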

So why, on this tiny blue dot, do people insist on a training phase to make a model, then an application phase for that model?
This, in the vernacular, is arse-backwards.
Start your design with an end-to-end, real-time processing chain.
A very crude but very real example of the error is facial recognition: you train the system on a corpus of white faces, deploy it, and it’s broken in the real world.
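
For contrast, here’s a toy sketch in Python of what the right-way-round version looks like: a tiny logistic-regression classifier that is updated inside the same loop that serves predictions, one sample at a time, on whatever the deployed system actually sees. Everything here is invented for illustration: the drifting stream, the learning rate, the feature layout. It’s plain per-sample SGD, not any particular framework.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class OnlineClassifier:
    """Logistic regression trained by per-sample SGD, inside the serving loop."""

    def __init__(self, n_features, lr=0.05):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        z = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return sigmoid(z)

    def update(self, x, y):
        # One gradient step on the sample we just saw: learn and serve in one chain.
        err = self.predict(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Fake "real world" stream: the input distribution drifts while we run.
def stream(n):
    for i in range(n):
        shift = 0.0 if i < n // 2 else 2.0          # the street is not the lab
        x = [random.gauss(shift, 1.0), random.gauss(-shift, 1.0)]
        y = 1.0 if x[0] - x[1] > shift else 0.0
        yield x, y

model = OnlineClassifier(n_features=2)
for x, y in stream(2000):
    p = model.predict(x)     # answer first: the chain is end-to-end, real time
    model.update(x, y)       # then fold the new evidence straight back in
```

On day one this will be worse than a model trained offline on a curated corpus; it will also keep tracking the drift halfway through the stream, which the frozen model never even sees.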

Yes, you might have to lower your expectations. But even a poor system that responds to the actual requirements is, in the long term, better than one that works perfectly in the lab and dies out on the street.

TensorFlow Lite, edge computing: they only work for the hype curve.

– I enjoy a rambling rant, but I’ve now run out of steam.

danny
