While the bleeding edge of processors keeps advancing, GPUs especially as applied to supercomputing, less and less of that advance is coming from the chips themselves.
Moore's Law is petering out as we get closer to the quantum limits of electron conduction in semiconductor paths. Quantum tunneling takes over and the electrons just teleport wherever the hell they want to go, though there may be ways to exploit that effect intentionally.
So to a degree, new supercomputer records are gradually becoming more about the will and the funds to stack ever more server blades together than about the power of the chips themselves, whose price-performance is beginning to level off.
One can hope for significant improvements in actual CPU design, architecture, and logic, though that will run into institutional barriers, because such improvements may require discarding much of the prior design work and processes that have been perfected up till now.
"Memristor" memory that can hold many distinguishable states per element, rather than transistor memory that holds only two, "on" and "off" aka "1" and "0", may be possible soon, creating a bonanza in solid-state storage densities.
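The density win from multi-state cells is just information theory: a cell with n distinguishable states stores log2(n) bits, versus 1 bit for a binary transistor cell. A quick sketch (the 16-level cell here is a hypothetical figure for illustration, not a real device spec):

```python
import math

def bits_per_cell(levels: int) -> float:
    """Bits of information one memory cell can hold,
    given `levels` distinguishable states."""
    return math.log2(levels)

# Ordinary binary cell: 2 states -> 1 bit.
print(bits_per_cell(2))   # 1.0

# Hypothetical 16-level memristor cell: 16 states -> 4 bits,
# i.e. 4x the storage density for the same number of elements.
print(bits_per_cell(16))  # 4.0
```

The catch, of course, is reliably telling those states apart against noise and drift, which is why real multi-level cells stop well short of "any number" of states.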
And there's still much ongoing work on optical and quantum computing as well, though compared to silicon semiconductors, those fields remain in their infancy even after years of effort.
However, the "end of silicon" from a basic physics standpoint, and with it the end of Moore's Law, is in sight. So improvements in computing power will increasingly mean "piling up more silicon" rather than "cramming more into the existing silicon".