I don't think the subjects should even be separated. Programming is math, essentially, just a bit more abstract. It's fun, it's relevant, and it can even give kids a creative outlet as they become more proficient. Assign a kid the task of creating a simple video game from scratch and you can teach a huge array of skills: logical thinking, math, computer literacy, etc. Or maybe if a kid runs into a math problem they don't understand, encourage them to write a program to help solve it. As they're starting calculus, walk them through writing a program that estimates the area under a curve by calculating the areas of a bunch of rectangles, then show them how increasing the variable denoting the number of rectangles gives a more accurate answer. Maybe I'm just projecting myself onto these kids. I loved making dumb little games growing up, and that let me understand subjects like trigonometry (how do I get this character to look at my mouse pointer?) in a much more intimate way than if I had only learned from the homework problems in my textbooks.
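For the curious, that exercise can be tiny. Here's a toy sketch of a left Riemann sum (the function and parameter names are just made up for illustration); cranking up the rectangle count visibly pushes the estimate toward the true area:

```python
def area_under_curve(f, a, b, rectangles):
    """Estimate the area under f between a and b with a left Riemann sum."""
    width = (b - a) / rectangles
    return sum(f(a + i * width) * width for i in range(rectangles))

# More rectangles -> a better estimate (the true area of x^2 on [0, 1] is 1/3).
print(area_under_curve(lambda x: x * x, 0.0, 1.0, 10))      # ~0.285
print(area_under_curve(lambda x: x * x, 0.0, 1.0, 10000))   # ~0.33328
```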
If you learn a particular language, you can only express yourself (and therefore think) in that one language. When it comes to languages such as C, the only way you can approach a problem is through the lens of the portable-assembly paradigm. When your language is not expressive enough to encapsulate the abstractions you intend to use, it gets harder to overcome the limits of the current paradigm. Some areas of mathematics (particularly category theory) have been applied to programming languages to create abstractions that capture ideas without contradictions through a type system. I guess when we talk of mathematics, we tend to think of the curriculum that shows up in public schools (the progression of algebra->geometry->trig->calculus), the stuff where you calculate an answer with little concept of what it means or how it's applied. We might not be aware of areas of advanced algebra or formal logic which provide the means to create abstractions and reason about them.
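To make that a bit concrete, the textbook example is an optional/Maybe container, which is a functor in the category-theory sense: the type makes "no value here" explicit, so missing values can't be used by accident. A toy sketch (nothing here comes from any particular library):

```python
from typing import Callable, Generic, Optional, TypeVar

A = TypeVar("A")
B = TypeVar("B")

class Maybe(Generic[A]):
    """A toy optional container: the type makes 'no value here' explicit."""

    def __init__(self, value: Optional[A], present: bool):
        self._value = value
        self._present = present

    @classmethod
    def just(cls, value):
        return cls(value, True)

    @classmethod
    def nothing(cls):
        return cls(None, False)

    def map(self, f: Callable[[A], B]) -> "Maybe[B]":
        # Functor-style map: f is only applied when a value is present,
        # so downstream code never dereferences a missing value.
        return Maybe.just(f(self._value)) if self._present else Maybe.nothing()

print(Maybe.just(3).map(lambda x: x + 1)._value)       # 4
print(Maybe.nothing().map(lambda x: x + 1)._present)   # False
```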
I feel like a lot of these algorithms are exactly what many computer science students are learning, and I don't think they are being forgotten in most situations. While many self-taught programmers might not be aware of the knowledge this article is talking about, I hardly think math is going anywhere. There is still plenty of it being taught, imo.
What we learn as 'Math' up through college is hardly the math that anyone in the future will actually need to understand. Most engineering that requires calculus can be approximated through real-world measurements or computer simulations. The parts of math that we don't actually learn in school, such as algebraic structure and vectors, completely change how you think about numbers and the rules of math in the first place. There are some amazing things you can do with modular arithmetic that would be impossible with ordinary algebra, but they still won't come up in most people's lives. So the question we should be asking is: why teach math at all, beyond what most people will ever run into? Currently there isn't a good reason, because the way we teach math only covers its applications instead of the reasons those applications matter.

Here's an example. We currently teach base 10 by explaining that each column in a number is the tens place, or the hundreds place, and so on. That makes it very difficult to later teach that in binary the second digit is not a tens place. Instead we could teach that in a base-10 system each column represents 10 to a power: n(10^0) for the ones place, n(10^1) for the tens place, and so on. Then when you switch to a base-2 (binary) system, it is infinitely easier to say, "All we do is put 2 where 10 used to be: n(2^0) for the ones place, n(2^1) for the twos place, and so on." That teaches a whole new understanding of what makes a number system tick, and it's infinitely more important than calculus. It teaches that numbers are part of a system which can be manipulated in amazing ways to represent new systems, instead of more perfunctory, memorized crap that you're never going to use.
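If it helps, the "each column is the base to a power" rule is literally a few lines of code; this throwaway sketch (names invented for the example) evaluates digits in any base with the exact same loop:

```python
def digits_to_value(digits, base):
    """Read a list of digits (most significant first) in the given base.

    Each column contributes digit * base**position, so base 10 and base 2
    follow exactly the same rule -- only the base changes.
    """
    value = 0
    for d in digits:
        value = value * base + d
    return value

print(digits_to_value([3, 0, 7], base=10))  # 307: 3*10^2 + 0*10^1 + 7*10^0
print(digits_to_value([1, 0, 1], base=2))   # 5:   1*2^2  + 0*2^1  + 1*2^0
```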
It's astounding that we are so rigidly structured by a common curriculum which does not actually consider future or mathematically advanced learning. Math can teach critical thinking, but not the way it's taught now.
To refine your example: since it's hard to talk about powers with children, you have to be careful not to introduce how to do something without illustrating why it's necessary and what they are actually doing. When it comes to teaching numbers, I'd start with unary (base-1, or tally) representations of real-world things and the operations on them, and then talk about how those operations are isomorphic (without using that word) to the ones on a decimal system. To put the utility of a representation in context, you can bring up binary as an example (learning binary alongside the standard curriculum definitely added missing context to Arabic numerals in my childhood), but once you teach how unary behaves the same as decimal, kids can carry that reasoning over to whatever else they encounter.
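As a throwaway illustration of that "behaves the same" idea (names made up for the example), adding two tallies is just putting the piles together, and the result lines up with ordinary decimal addition:

```python
def to_tally(n: int) -> str:
    """Represent a non-negative integer as a unary tally, e.g. 3 -> '|||'."""
    return "|" * n

def tally_add(a: str, b: str) -> str:
    """Adding tallies is just combining the two piles."""
    return a + b

# The same addition in two representations:
print(tally_add(to_tally(2), to_tally(3)))        # '|||||'
print(len(tally_add(to_tally(2), to_tally(3))))   # 5, matching 2 + 3
```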
While I don't disagree that the "everyone needs to be a programmer" fad is overblown, I don't see anything resembling justification for this opinion. It has never been important for most of the population to be heavily invested in math, and I don't think that's a bad thing or something that is likely to change.
It's also worth noting that literally everyone spends a large proportion of their education learning maths (although most never really get anywhere interesting), compared to absolutely no computer science or programming in most places.