Hasn’t AI Been the Wrong Kind of Edgy for Too Long?


Recently my friend Dan McKinlay, who is far more accomplished than I am in Machine Learning (ML) and Neural Networks (NN), blogged his distaste for the project name “AI on the Edge”, arguing that the edge-vs-centre model of “cloud computing” is misapplied here and that there’s nothing inherently centralised about AI.

It tends to imply that large NN models in data centres are the default type of ML

Dan McKinlay

Here I argue that this implication may be unjustified, but only in the idealised future we both presumably yearn for, and I, for one, celebrate the notable application of this worthy distinction to our new robot overlords (whom, let the record show to any such overlords reading this, I also welcome).

“AI on the Edge” is an example of snappy project branding for computer vision on small, battery-powered microcontroller chips, the kind typically used in products like smart home doorbells and other internet-connected embedded devices, and notably available to DIY hobbyists, as featured on their GitHub page. It’s so unlike the sinister Cambridge Analytica stuff we have come to expect from Facebook and others. What’s wrong with calling out this distinction between edge and centre?

Hasn’t AI been edgy in all the wrong ways for too long?

The AI on the Edge project came to my attention as a way to internet-enable old-school gas, water and electricity meters that show mechanical digits and dials. A $5 microcontroller with a camera can now read your meter, without the help of Siri or Alexa, and let you track your resource consumption like it’s 2021.
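To make the idea concrete, here is a minimal sketch of the kind of pipeline such a device runs: photograph the meter face, classify each mechanical digit with a tiny model, stitch the digits into a reading and publish it to your home automation setup. This is plain desktop Python rather than the project’s actual ESP32 firmware, and the model file, digit crop boxes, broker hostname and MQTT topic are all placeholders invented for illustration.

    # Rough sketch only: photograph one mechanical meter, classify each digit
    # with a small TFLite model, then publish the combined reading over MQTT.
    # The model file, crop boxes, broker and topic are illustrative placeholders.
    import numpy as np
    from PIL import Image
    from tflite_runtime.interpreter import Interpreter   # pip install tflite-runtime
    import paho.mqtt.publish as publish                   # pip install paho-mqtt

    MODEL_PATH = "digit_classifier.tflite"   # hypothetical 0-9 digit classifier
    DIGIT_BOXES = [(10, 40, 40, 90), (42, 40, 72, 90)]   # (left, top, right, bottom) per digit
    MQTT_HOST, MQTT_TOPIC = "homeassistant.local", "meters/gas/reading"

    def read_meter(photo_path: str) -> int:
        interpreter = Interpreter(model_path=MODEL_PATH)
        interpreter.allocate_tensors()
        inp = interpreter.get_input_details()[0]
        out = interpreter.get_output_details()[0]
        _, h, w, _ = inp["shape"]   # assumes the model wants (1, h, w, 1) greyscale crops

        image = Image.open(photo_path).convert("L")
        digits = []
        for box in DIGIT_BOXES:
            crop = image.crop(box).resize((int(w), int(h)))
            x = np.asarray(crop, dtype=np.float32)[None, :, :, None] / 255.0
            interpreter.set_tensor(inp["index"], x)
            interpreter.invoke()
            digits.append(int(np.argmax(interpreter.get_tensor(out["index"]))))
        return int("".join(str(d) for d in digits))

    if __name__ == "__main__":
        reading = read_meter("meter.jpg")
        publish.single(MQTT_TOPIC, str(reading), hostname=MQTT_HOST)
        print("meter reading:", reading)

On the real device the equivalent loop runs on-chip, so only the final number ever leaves the house; that, rather than any cleverness in the model, is the point of “the edge”.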

Despite being a perfectly usable title for a direct-to-VHS docudrama, AI on the Edge fails to capture Dan’s otherwise perfectly functioning sense of drama. Perhaps ironically, and for the same reasons, I do care about an edge-centre distinction. It’s fundamental to mass innovation and technology-dependent democratisation. Surely it’s defensible to claim “the default type of ML” has long been large models in data centres, at least in commercial projects over the past decade. It’s heartening to see a qualitatively different innovation zone characterised by cheap, low-power deployment targets. I imagine startup technology could shortly flood the low-power compute space with practical ML for business and consumer alike.

Maybe this “edge” shift is not new: after all, we had the Furby, and what more do we want? But my observation has been that ML has been synonymous with big data in the startup space. Apparently, many use cases are relevant and business models viable only once the datas are sufficiently embiggened. But perhaps we are at an inflection point.

Chipageddon & Unobtainium

What is that noise? The cry of a million raging gamers echoing across the world because they cannot afford an Nvidia RTX 3090, an accelerator card featuring GPU (Graphics Processing Unit) chips that are somewhat accidentally able to crunch neural network workloads orders of magnitude faster than CPUs; as a result, demand drives prices towards US$4,000 per unit. A similar demand spike a few years earlier resulted from similarly unanticipated performance advantages for cryptocurrency mining. If you’re a gamer, these high-end graphics cards might as well be hewn from solid unobtainium.

Since 2020, the knock-on effects of GPU demand spikes have been magnified by chipageddon, the ongoing global computer chip shortage born of factory retooling delays, themselves prompted by mass order cancellations from flocks of car manufacturers who, in the wake of the Covid-19 pandemic, anticipated collapsing demand, incorrectly. It turns out cars are becoming computers with wheels and people still want to buy them. Cloud providers update the GPU farm section of their service offerings with “coming soon” as they struggle to fill their data centres with would-be gaming rigs and beef up their machine-room aircon to deal with the extra thermal exhaust. Google and Apple tape out their own silicon. I expect Nvidia has segmented its product engineering and sales divisions as it recognises a business opportunity in a bifurcated target market.

One of my personal turn-offs with ML-as-startup-tech is that I expected the business economics to collapse into a capital-intensive Big Tech play, incompatible with the more satisfying kind of bootstrapped startup whose costs are dominated by coherent software-development effort. Software development can clearly be scaled by throwing money at hiring, but it scales with much more severely diminishing returns: it requires that teams and their products be split into isolated components that integrate frictionlessly, which, in the general case, is so hard to accomplish that this meta-problem becomes a self-reinforcing brake on, and feedback function of, demand for the software alphanerds who can thread the needle. Certainly so when compared to the more business-palatable option of buying racks and racks of GPUs.

Venture Capital Loves Big AI

Maybe the ML scale meme is merely the result of VC culture and unicorn exit mania. With typical software startups of today otherwise requiring so little up-front capital, VCs struggle to add value except where large capital requirements are critical to the business model. If ML is such a case, it explains why VCs froth about it. If a problem space is tractable with only a gradual investment of engineering time, and the investment/return function is smooth such that incremental effort validates incremental results with incremental profit, excess money cannot be put to work because it doesn’t help validate the business. And after all, what is a startup but a yet-to-be-validated business?

Bigger neural network models, trained faster, plus subscription software that does all the compute, remind me of 1970s time-sharing and data-processing services, which ossified into bulk laziness and ultimately fertilised the soil for a more democratised “PC” revolution made viable by mass-market dynamics. A thousand flowers bloomed in the 1980s as the home computer revolution sprang from humble DIY roots, like the two Steves who founded Apple with 1960s counterculture ideals and stars in their eyes.

What we might be seeing is a shift away from the kind of centralised big-compute infrastructure that harks back to the golden days of IBM, just as the home computer revolution, the internet, smartphones and bitcoin each shifted things in their day. Facebook, Google and the other big tech monoliths hoard and run their own hardware, with users on the edge as suckling dependents running nothing more than dumb terminals, albeit with more pixels than the 70s green-screen edition that few are old enough to remember. Having said all this, I do expect the pendulum to keep swinging between centralisation and decentralisation, as the delayed impacts of accreting inefficiency in each approach pump harmonically against each other, neither being the total answer to everything.

For now, though, perhaps all the nerds soldering and 3D printing their own gas meter readers will give birth to the next phase of AI, and in turn to the next generation of unimaginable megaliths.


Do Really Big Crazy Things


Interview with the radically successful venture capital investor and ex-Facebook exec Chamath Palihapitiya about going big and crazy when it comes to technology startups. Oh, and learn how to code.


Samurai or Rice Picker?


If you were in an entrepreneurship class and a successful millionaire asked you whether you want to be a samurai, being truly free to execute on your dreams and achieve greatness, or to pick rice all day and only make a small part of someone else’s dream come true, what would you choose?

This is the opening question Jason Calacanis, TechCrunch 50 co-founder and CEO of Mahalo, put to a class of 50 Penn State University students when prompted, unprepared, to speak on entrepreneurship.

Jedi Knight or Drone?

… I train people to be Jedi Knights. I need my light sabre. Emily, get my light sabre please.

Jason Calacanis annoyed me when I first started watching his video podcast This Week In Startups. He seemed too self-absorbed and often spoke over his guests, telling his own story of success and sharing his own thoughts more than letting his guests speak.

But after watching for a while I’ve warmed to him. He is a bit grating sometimes, but it’s his shoot-from-the-hip style that ultimately endears him to me.

This 30-minute bonus episode is the best one yet, not to mention the shortest. Jason covers the ups and downs of his past startups, including Weblogs, Inc., publishers of Engadget and Joystiq, which he sold to AOL for a rumored 30 megabux.

Actually, though it is a little rough and unrehearsed, it’s pretty inspiring!