IMO, the slowdown in Moore's Law is almost needed. There are a LOT of back-end computing problems and optimizations that have gotten swept under the rug because you could always just "throw more flops at the problem".
Moore's Law slowing to a crawl is going to force a number of technology industries to dig deeper into improvements in architecture, in code, in more elegant formulas, and in how problems, data sets, and inputs are represented. We'll also see improvements in compression technology.
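As a toy illustration of that "better representation" point (my own example, not tied to any particular product): multiplying a mostly-zero matrix by a vector takes far less arithmetic if you store only the non-zero entries instead of the full grid. The sizes and density here are made up just to show the gap.

```python
import random

N = 1_000          # matrix dimension (hypothetical workload size)
DENSITY = 0.01     # fraction of entries that are non-zero

# Store the sparse matrix as {(row, col): value} instead of an N x N grid.
nonzeros = {
    (random.randrange(N), random.randrange(N)): random.random()
    for _ in range(int(N * N * DENSITY))
}
vector = [random.random() for _ in range(N)]

# Dense approach: touch every one of the N*N cells, zeros included.
dense = [[0.0] * N for _ in range(N)]
for (r, c), v in nonzeros.items():
    dense[r][c] = v
dense_result = [sum(dense[r][c] * vector[c] for c in range(N)) for r in range(N)]
dense_ops = N * N

# Sparse approach: only touch the cells that actually hold data.
sparse_result = [0.0] * N
for (r, c), v in nonzeros.items():
    sparse_result[r] += v * vector[c]
sparse_ops = len(nonzeros)

print(f"dense multiplications:  {dense_ops:,}")   # ~1,000,000
print(f"sparse multiplications: {sparse_ops:,}")  # ~10,000
```

Same answer, roughly 100x less arithmetic, purely from picking a representation that matches the data instead of throwing more flops at it.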
Also, even if the doubling of chip performance stops, chips will continue to get cheaper, and spreading the intelligence around, creating a de facto distributed computing model even within individual devices, will continue to have an impact.
It could even be a boon for AI, since some of the most promising results seem to come from this distributed model. For example, in a self-driving car or a legged robot, there's a processor or subsystem for the legs, one for balance, one for navigation, and one for the vision system; they all communicate in shorthand and make decisions collectively, or on a predetermined hierarchical basis. That's a closer parallel to the way actual human intelligence, or even the self-directed activity of lower animals, works.
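A minimal sketch of what that hierarchical-subsystem idea could look like (all names and messages are hypothetical, not any real robotics stack): each subsystem sends a terse status report upward, and a coordinator resolves them with a fixed priority order.

```python
from dataclasses import dataclass

@dataclass
class Report:
    """Shorthand message a subsystem sends upward."""
    source: str
    proposal: str   # what this subsystem wants the platform to do
    urgency: int    # higher wins in the hierarchy

# In the real design each of these would run on its own processor;
# here they're just functions returning Reports.
def vision() -> Report:
    return Report("vision", "obstacle_ahead_slow_down", urgency=2)

def balance() -> Report:
    return Report("balance", "stable_no_action", urgency=0)

def navigation() -> Report:
    return Report("navigation", "turn_left_at_waypoint", urgency=1)

def coordinator(reports: list[Report]) -> str:
    """Predetermined hierarchy: the most urgent report wins outright.
    A tie is where more collective logic (e.g. voting) could slot in."""
    winner = max(reports, key=lambda r: r.urgency)
    return f"{winner.source} -> {winner.proposal}"

if __name__ == "__main__":
    print(coordinator([vision(), balance(), navigation()]))
    # vision -> obstacle_ahead_slow_down
```

No single subsystem needs a huge chip; the intelligence comes from lots of cheap, specialized parts plus a simple rule for combining them.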
We also have a lot of technologies nearing maturity, self-driving cars, speech recognition, facial recognition, etc., that were once thought impossible without AI and are now everyday items, or about to be.
I don't think we'll see a huge decrease in the rate of technological progress because Moore's Law has stalled. We've gotten this far with massive vertical growth in CPU performance. Now we can see what's possible with a broader, more parallel and distributed approach.