Thanks for commenting. I was hoping you'd weigh in! My understanding is that ANY transmission line is susceptible to the chaotic waveforms they were showing in their graphs, and, by extension, that installing a device to "tune" the stream to be less chaotic is what produced the increase in efficiency and the drop in overall power draw. The extrapolation I took away from his data center power-tuning example was that the same method could work anywhere in the system - generation, transmission, distribution - to increase efficiency at every point, thereby decreasing the overall load on the system.
I took away that while they could correct at the transmission or distribution level, both the losses and any efficiency gains still occur at the load level. There could be science here that's beyond me, so I might be missing where the efficiency gain in transmission and distribution would come from; there just aren't many active components there to suffer losses from bad power quality. Most power quality issues, including, as far as I can tell, the ones the article discusses, don't travel far on the wires. Transmission and distribution lines have a lot of series inductance and shunt capacitance. The distortion sits at higher frequencies, and series inductive impedance rises with frequency while shunt capacitive impedance falls with frequency. This acts to both block the distortion and bleed it off. They probably have a point on the load, though, and I don't mean to be completely skeptical.
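To put rough numbers on that blocking/bleeding effect, here's a minimal sketch of one lumped line section. The per-section L and C values are made up purely for illustration, not taken from any real line data:

```python
import math

def line_section_impedances(f_hz, L_series_h=0.5e-3, C_shunt_f=10e-9):
    """Impedance magnitudes of one lumped line section at frequency f_hz.

    L_series_h and C_shunt_f are illustrative placeholder values,
    not parameters of any actual transmission or distribution line.
    """
    w = 2 * math.pi * f_hz
    z_l = w * L_series_h       # series inductive impedance: rises with frequency
    z_c = 1 / (w * C_shunt_f)  # shunt capacitive impedance: falls with frequency
    return z_l, z_c

# Fundamental vs. higher-frequency distortion
for f in (60, 300, 3000):
    z_l, z_c = line_section_impedances(f)
    print(f"{f:>5} Hz: series |Z_L| = {z_l:8.3f} ohm, shunt |Z_C| = {z_c:12.1f} ohm")
```

At 60 Hz the series path is nearly transparent and the shunt path is nearly open, so the fundamental passes through; by a few kHz the ratio has swung orders of magnitude the other way, so high-frequency distortion gets blocked by the series inductance and bled off by the shunt capacitance, which is why it mostly stays local to the load.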