I'm not so sure about that. The predictions of the end of Moore's Law have come regularly every two or three years since I was in college, and at that time everyone had his panties in a bunch over 70 micron processes, which yields a feature size about 5,000 times bigger than the current standard of 14 nanometers. There's a reason you can buy a microcontroller for $40 which can be used as a basic desktop computer.
The article presents a compelling argument, but so did all the similar articles before it, and they only really lasted until you started looking at them carefully. It's very expensive to carve silicon at 14 nm, yes. There are challenges to making it efficient, yes. The most recent foofaraw surrounding Meltdown and Spectre (overblown, IMHO) demonstrates that designers should be more careful about performance enhancements, yes.
"Wait just a damned minute," you say. "What do you mean, 'overblown'?"
I understand how the malware works; it's basically a statistical method of inferring cache contents by measuring read times. It has the potential to cause trouble, because it gives an attacker a back door into protected memory.
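The inference step can be sketched with a toy simulation. This is not real Spectre or Meltdown code--there's no actual speculation or CPU cache here, and all the names and latency numbers are made up for illustration--but it shows the flush-and-reload idea: a victim's secret-dependent memory access leaves one line warm in the cache, and an attacker recovers the secret by timing reads and picking the fast one.

```python
import random

CACHE_HIT, CACHE_MISS = 1, 100  # made-up latencies, in arbitrary time units

class ToyCache:
    """Crude stand-in for a CPU cache: just remembers which lines were touched."""
    def __init__(self):
        self.lines = set()

    def access(self, addr):
        # Report a simulated latency, then mark the line as cached.
        latency = CACHE_HIT if addr in self.lines else CACHE_MISS
        self.lines.add(addr)
        return latency

def victim(cache, secret):
    # The victim's access pattern depends on a secret byte,
    # leaving a footprint in the cache.
    cache.access(("probe", secret))

def attacker(cache):
    # Time a read of every possible probe line; the one that comes
    # back fast is the line the victim touched -- i.e., the secret.
    timings = {i: cache.access(("probe", i)) for i in range(256)}
    return min(timings, key=timings.get)

cache = ToyCache()
secret = random.randrange(256)
victim(cache, secret)
assert attacker(cache) == secret
```

A real attack has to do this statistically over many runs, because real cache timings are noisy; the toy version gets it in one shot only because the simulated latencies are clean.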
But no one has explained to me how that affects a desktop machine, even one connected to the Internet. Someone can read data off your computer--well, there are probably still half a million ways to do that, ways that are a lot less complex and technically finicky than Spectre and Meltdown are. As far as I've seen, the "proof of concept" programs haven't done much of anything significant; you show me a program based on either of those two bits of malware that causes a significant problem in real time and I'll change my mind.
...but back to Moore's Law.
The major stories surrounding CPUs tell the tale. We were told in 1992 that 70 microns was it, that going smaller was impossible and Moore's Law was over. We were told in 1998 that clock speeds higher than 4 GHz were problematic and Moore's Law was over. A few years after that, we were told that big multicore chips consumed too much power and Moore's Law was over. And on, and on, and on, with a dozen little stories in between, about process sizes and heat dissipation and quantum interference and-and-and.
And feature sizes kept getting smaller, and chips kept getting more powerful and more efficient the entire time, the better part of three decades since I started paying attention to it. What was said about the processes before 70 microns? What happened in 1990, 1989, 1988 that presaged the doom of Moore's Law? Was it the fact that chip designers could no longer lay out etch masks on huge sheets of acetate? Or was it something else?
The main reason that Meltdown and Spectre don't herald the end of Moore's Law, however, is that the thing they exploit--speculative execution--has nothing to do with feature size. As has been noted elsewhere, speculative execution has been a feature of Pentium processors for twenty-three years. We can now fit about a hundred Pentium processors in the space one occupied two decades ago.
I don't buy it. I mean, logically, at some point Moore's Law ought to end, because there is a practical limit to how small you can make a circuit--but we won't know where that end is until we hit it. Exactly the same way we learned that you can't scale clock speeds to the heavens in an economically reasonable fashion (as Intel discovered with the Pentium 4, which had originally been meant to scale to 10 GHz), we'll find the end of Moore's Law abruptly and without warning. We'll try to make a chip that simply won't work, and the reason will be, "Going smaller than this is too expensive to be worthwhile," exactly as happened with the P4 and its 10 GHz clock speed.
And furthermore, the end of Moore's Law isn't the catastrophe these guys make it out to be, either. Moore's Law predicts a doubling of transistor density--roughly, a halving of feature area--every eighteen months to two years. What if that cycle stretches to 24 or 36 months, or even longer? So what? It doesn't mean the end of progress; it merely means progress happens at a slower pace. It might be a good thing if programmers can't rely on Moore's Law to make up for bloated code; maybe a longer cycle would result in better software.
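A quick back-of-the-envelope run makes the "so what" concrete. Assuming one density doubling per cycle (a simplification--real process gains have never been that regular), stretching the cycle still leaves plenty of progress:

```python
# Sanity check from earlier in the post: 70 microns down to 14 nm
# is a 5,000x linear shrink (70,000 nm / 14 nm).
assert 70_000 / 14 == 5000

def density_gain(years, cycle_months):
    """Density multiplier after `years`, assuming one doubling per cycle."""
    return 2 ** (years * 12 / cycle_months)

# Over a decade: ~102x at 18 months, 32x at 24, ~10x at 36.
for cycle in (18, 24, 36):
    print(f"{cycle}-month cycle, 10 years: {density_gain(10, cycle):.0f}x denser")
```

Even a tripled cycle time means an order of magnitude in a decade--slower, not stopped.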
So, reports of the death of Moore's Law, greatly exaggerated, blah blah blah etcetera. Oh, well.
* * *
Only a Democrat could say something that friggin' stupid with a straight face. Yep, our immigration enforcement people are rounding up "law-abiding" illegal aliens, and it's a travesty.
Next up: complaining about the mistreatment of law-abiding drug smugglers.
* * *
So, on my day off, an ice storm.
Mrs. Fungus had to get up early today. I got up to hit the can, saw her scraping windows, and promptly hied myself outside to help her. The driveway was a skating rink.
She startled me by coming home again--"I forgot my phones"--but then set out again. I'm not sure how long it was--fifteen, twenty minutes--and she was back.
In that time, she'd gotten as far as the high school, then realized it was too slick to drive and came home again.
Mind you, the high school isn't that far from here--about a mile, maybe a bit more--and if it takes you that long to make that round trip, there is something wrong. Apparently it was all over the radio that the highways were a mess, and Mrs. Fungus had people from work calling and/or texting to tell her they'd be late, because traffic. One reported that she'd been sitting in traffic for two hours.
If I had been going to work today--if it were not my day off--I would have soldiered on and gotten to work however late--but this shit is ridiculous.
I'm not kidding. There's no reason for the roads to be like that, not for the thin crust of ice that had fallen. It wasn't even 1/16th of an inch, and we pay plenty in taxes for a vast array of equipment designed to deal with exactly that.
Where the hell were the salt trucks?
The kind of bad weather we've had this winter has been nothing compared to past years. Why are we driving on slush when it's snowing at the blistering rate of a quarter inch per hour? Why do the trucks not go out until after the precipitation has ceased? Why are the roads such a mess?
You want to know why? Illinois is broke, that's why.
* * *
I had intended, today, to scan at least some of the old artwork into the computer. With my resurgence in drawing, I've realized that not all the good stuff is computer-accessible, so I wanted to remedy that. But I guess it's not happening today, after all.