I feel as if it stands to reason that someone without practice in the fundamentals should do worse at a subject when they encounter it professionally or academically. The base mathematical principles behind programming, like lambda calculus, recursion, and boolean logic, haven't changed; they've just been masked.
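To make that "masked, not changed" point concrete: boolean logic can be built out of nothing but lambda calculus, even though no modern tool ever shows you this layer. A minimal sketch in Python, using the standard Church encoding of booleans (the names `TRUE`, `FALSE`, `AND`, `NOT` are just illustrative):

```python
# Church-encoded booleans: boolean logic expressed purely with lambdas.
# TRUE selects its first argument, FALSE selects its second.
TRUE = lambda a: lambda b: a
FALSE = lambda a: lambda b: b

# AND: if p is TRUE, the result is q; otherwise it's p (i.e. FALSE).
AND = lambda p: lambda q: p(q)(p)

# NOT: swap the two branches.
NOT = lambda p: lambda a: lambda b: p(b)(a)

def to_bool(p):
    """Convert a Church boolean to a native Python bool."""
    return p(True)(False)

print(to_bool(AND(TRUE)(FALSE)))  # False
print(to_bool(NOT(FALSE)))        # True
```

Every language and AI assistant sitting on top of this hides it completely, which is exactly the masking I mean: the principle is still there underneath, you just never have to touch it.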
I feel like the use of many AI sub-systems really just trains someone in the use of that sub-system rather than any of the underlying principles. This feels obvious to me, am I missing something?
Also: I don't think that specific 'AI sub-system' training is useless; that's the way our world is heading. I just also happen to believe it makes it hard to see the forest for the trees.
It feels like the app-ification of programming. It's more accessible, but it leaves users less able to handle problems the tool can't solve, and unequipped to debug them.
Hopefully I'm wrong.
Also hopefully it doesn't put me out of a job in 5-10 years.
This is normal. The industry has many times more front-end web developers than full stack. It has many more full-stack developers than systems programmers. It has many more systems programmers than chip engineers. Each layer of abstraction makes things easier to learn (or at least more visible and therefore approachable) and therefore makes the number of people trained in any given discipline higher.