Hell, much of AI is referred to as a "black box", especially Deep Learning and large Neural Networks, where massive datasets flow through the model run after run, producing a long chain of mathematical transformations. All the ML practitioner does, for the most part, is keep an eye on the gradient descent so that it doesn't slow down or get "stuck" in a local minimum.
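To make the "stuck in a local minimum" idea concrete, here is a minimal sketch of plain gradient descent on a toy one-variable function. The function and all names are my own illustration, not any framework's API; the function has two minima, so where you start determines where you land.

```python
def gradient_descent(grad, x0, learning_rate=0.01, epochs=500):
    """Plain gradient descent: repeatedly step downhill along the gradient.
    Too small a learning rate stalls progress; too large a one overshoots;
    and on a non-convex surface the walk simply settles in whichever
    minimum its starting point leads to."""
    x = x0
    for _ in range(epochs):
        x = x - learning_rate * grad(x)
    return x

# f(x) = x**4 - 3*x**2 + x has two minima: a shallow one near x = 1.13
# and a deeper (global) one near x = -1.30. Its gradient:
grad_f = lambda x: 4 * x**3 - 6 * x + 1

# Starting on the right gets "stuck" in the shallow local minimum,
# even though a better minimum exists on the left:
print(gradient_descent(grad_f, x0=2.0))   # settles near 1.13 (local)
print(gradient_descent(grad_f, x0=-2.0))  # settles near -1.30 (global)
```

Nothing in the loop "knows" it missed the better minimum, which is exactly why the practitioner watches the descent rather than trusting it blindly.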
The world is developing successful learning models based on algorithms that structure data, and all the "engineer" does is change the parameters, weights, learning rate, amount of data, etc. until the Cost Function decreases and convergence is reached. Think about that for a moment, lol. It is beautiful, but peculiar at the same time.
The results can then be replicated by anyone using the same parameters, yet no expert could explain how it worked epoch by epoch, though intuitively, and even mathematically as a whole, they understand what the end result should be. Many pre-made models do exactly that for others.
This intuition is where experience and plenty of hands-on work are a HUGE deal. I personally worked on hundreds of datasets and models on my laptop, and I began to get a feel for what parameters needed to be altered (even a code line added, in some cases) to achieve successful learning after only a few epochs, depending on the data size. If I were doing it full time with a company, I have no doubt my experience would have taken me to a highly proficient level. ML can be learned by most people, even absent the underlying linear algebra etc., if they have the time and interest.
Especially today, when algorithms and parameters are little more than modules you import in Python or R and then tweak. It's more efficient and cost-effective, depending on the application of course, than running an entire training set for days and THEN working through the failure.
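As one example of the "algorithms as importable modules" point, here is roughly what that looks like with scikit-learn (my choice of library and dataset for illustration; the post doesn't name a specific one). The algorithm is an import, and the tweaking is a handful of constructor parameters.

```python
# A sketch of the modern workflow: the learning algorithm is just an
# imported module, and the "engineering" is mostly parameter tweaking.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# max_iter and C are the kind of knobs you tweak until it learns well.
model = LogisticRegression(max_iter=200, C=1.0)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # held-out accuracy
```

Compare that to implementing the optimizer and cost function by hand: the module hides every epoch of the descent, which is exactly the black-box trade the post describes.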
When I taught myself Machine Learning, I did it the hard way: I worked through the math and equations, because that is how that particular course taught it (the original one from Andrew Ng; he has long since updated it with Python and more use of modules). It was a hell of a road to climb at first, especially as I was learning Python at the same time.
Every course after that felt so much easier for having started with that one raw course. I even found myself catching errors in presenters' code as they wrote it, and they would say in the recording, "oh, this is supposed to be this," etc. It was quite inspiring to me just a year or so ago.
It will only become more mastered over time, its applications more apparent, and potentially more dangerous.