AGI is not an engineering problem but a research problem. John Carmack is good at putting stuff together, but how good he is at coming up with novel concepts for an open research problem remains to be seen. Even the rocketry example that is hailed here as a success mostly wasn't one. That doesn't make me happy; it would have been far nicer if Armadillo had succeeded, since more competition in that space is better. But for all the work done, it was more of an advanced hobby project along the lines of those guys in the Nordics than something that moved the needle scientifically.
Carmack is someone who has proven to be an almost unequalled productivity machine when working on medium-difficulty problems... Now, for the first time, we'll see if his approach to problem solving can also work on a truly difficult problem. I agree it's very much an open question.
Is that really true though? It seems more like he's good at medium-difficulty problems in a narrow subdomain of software development, which is saying something a bit different. I might even say he's good at hard problems within that subdomain. How transferable those skills are is the most salient point. Assuming peak genius-level intellect (which, I don't know, maybe?), it would still take something like 4 or 5 years to reach expert-level knowledge in such a complex domain.
Agreed. Deep learning has revolutionized AI, and anyone hoping to contribute to AGI is going to have to master DL first, and probably a lot more AI besides, like a variety of probabilistic methods.
That's a challenging learning curve, not much different from earning a PhD. And then, to stand out in AGI, you're going to have to integrate a dozen kinds of cutting-edge components, none of which is anywhere near ready for prime time.
At this moment in time, I think any attempt at implementing AGI is going to be half-baked at best. For now, a Siri / Alexa that can do more than answer single questions will be challenging enough.
I actually don't think mastering deep learning is very difficult. There's a gazillion papers and ideas floating around, but the core concepts that actually work, things like batch normalization, gradient descent, dropout, etc., are all relatively simple. Most of the complexity comes from second-rate scientists pushing their flawed research out into the public in some form of a status game.
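To make that concrete, here's a minimal sketch (assuming PyTorch; the layer sizes, learning rate, and dropout probability are made up for illustration) showing that each of those three concepts is essentially a one-liner:

    import torch
    from torch import nn

    # A small classifier wiring together the three "core concepts":
    # batch normalization, dropout, and plain gradient descent.
    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.BatchNorm1d(256),  # batch norm: normalize activations per mini-batch
        nn.ReLU(),
        nn.Dropout(p=0.5),    # dropout: randomly zero activations during training
        nn.Linear(256, 10),
    )
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # vanilla gradient descent
    loss_fn = nn.CrossEntropyLoss()

    # One training step on a (hypothetical) random mini-batch x, y:
    x, y = torch.randn(64, 784), torch.randint(0, 10, (64,))
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()   # backprop computes the gradients
    optimizer.step()  # gradient descent updates the weights

Whether that counts as "mastery" is another question, but the mechanics really are that small.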
For anyone familiar with only the most trivial details, do you have some good papers to recommend, to save us from wading through all the rest?