It might have enough predictive power to be accepted. It might be predicting the wrong thing, following correlations it shouldn't, but if it looks good, people won't bother trying to understand it. I agree it won't lead to new science, but I certainly believe people will follow any stupid thing The Numbers say as long as it's presented with a veneer of competence.

There's always an underlying model; the only question is whether the analysts know what that model is. If you don't understand why your results are what they are, your analysis has no predictive power!
This is what gets me about people who believe AI will lead to scientific results beyond the ability of humans to comprehend. What's the point? Humans have a hard enough time listening to other humans, even when their arguments are well-reasoned and easy to understand! How are we going to get society to blindly follow what machines say?