Giga Goodbye
Intel says it will stop using gigahertz numbers to market its PC processors. It turns out that such numbers really do not tell consumers much anymore. Just about every processor on the market is plenty fast enough to handle most PC tasks.
More important performance factors these days might be video memory, overall system memory, and battery life for laptops.
I don't think this is a good idea. Yes, most PCs can handle the applications out there, but there is still one key program that does demand a certain minimum processor and speed: the operating system itself.
Not all computers could make the jump from Windows 95/98/ME to Windows XP. Can you imagine what kind of confusion will go on when MS Bill unleashes the next OS? Tech support people will be getting endless calls asking whether or not certain computers can upgrade.
(On the plus side, that also means certain people will be calling me for help, so I guess it works to my advantage too.)
I can understand Intel's decision, though. AMD and Apple have sort of abandoned the whole GHz standard with their chips, and the Intel boys don't want to appear to be anal-retentive.
Yes, we don't care about processor speed right now... but only until the next generation of goodies comes out.
"A chip's clock speed is almost irrelevant in determining the overall performance of a computer."
... in precisely the same way that the size and configuration of an auto engine is irrelevant in determining the performance of a car. Do you really need to know what's under the hood when you buy a car? V8, V6, N-liter engine ... what's the diff? You can't tell exactly how fast a car will go by its engine size, so why do you need to know? Do you really care?
If you say "well, yeah, OF COURSE I care!", then I've made my point.
For the past decade, I've bought only AMD -- they're cheaper, faster, and better designed than Intel (and Cyrix are total crap). Now I have another reason: AMD will actually tell me what I'm buying.
"Intel: You Don't Need to Know What's Inside"
Of course, the processor clock-speed specification was next-to-useless even back when we were still clocking in tens of megahertz: on the one hand, many important parts of the system could not operate at anywhere near the CPU clock speed; on the other, different processors (especially those with different architectures, such as ARM and PowerPC RISC chips) got more or less work done with each clock cycle. I applaud the sunset of the clock-speed specification; its demise was long overdue. Then again, perhaps Intel decided to bow out of the clock-speed wars early because its research showed it wasn't going to be able to produce significantly faster processors at reasonable prices for much longer anyway.
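To make that concrete, here's a back-of-the-envelope model in Python. The chips and all figures are invented purely for illustration, not real specs:

    # Crude model: useful work per second = clock speed x instructions per clock (IPC).
    # Both "chips" and all numbers below are made up for illustration only.
    chips = {
        "Chip A (high clock, low IPC)": {"clock_ghz": 3.0, "ipc": 1.0},
        "Chip B (low clock, high IPC)": {"clock_ghz": 2.0, "ipc": 1.8},
    }
    for name, spec in chips.items():
        # Effective throughput, in billions of instructions per second.
        gips = spec["clock_ghz"] * spec["ipc"]
        print(f"{name}: {gips:.1f} billion instructions/sec")
    # Prints 3.0 for Chip A and 3.6 for Chip B: the chip that "loses" on GHz
    # actually wins on work done.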
I have a Palm Tungsten T3 that, for slightly more than $300 brand new ($375 if you include the 256MB memory expansion card), runs rings around some of the best systems I could get for $3000 just ten years ago: these were powerful, near top-of-the-line desktop systems that supported software development or multimedia applications easily, or could crunch data for small and medium-size businesses. I don't use the T3 for such heavyweight tasks of course, but if pushed, I could. The screen is a bit on the smallish side (though large for a handheld), so the PDA may never be a direct replacement for a desktop workstation, but even so, it offers more processing power and I/O flexibility than most people need now or may need in the foreseeable future -- certainly far more than they needed five or ten years ago. The processor in this little unit, a variant of the ARM RISC chip, runs at only 400MHz -- less than half a GHz. For all but the most computationally-intensive tasks (which could, I'll admit, include photo-realistic, real-time video games that the PDA can't handle), people won't perceive sufficient added value from faster processors to justify paying a premium. I'm sure that this truth is one factor that motivates the Intel decision.
Another interesting milestone is hard disk size. It used to be that I could count on exceeding the main mass-storage unit's capacity on my personal computer within 12 months -- sometimes within six months -- assuming I was the system's only user. My current desktop unit at home is now almost two years old. We bought the whole thing for half what I used to pay for a sufficiently powerful system ten years ago. Our current system came with a 120GB HD. In nearly two years, my entire family of 3 has yet to come close to filling the HD to capacity, although we have laden it with a vast amount of disk-hungry audio and video multimedia content. 120GB seems to be the "sweet spot" for HD capacity, just as 2-2.5GHz is the apparent sweet spot for (desktop system) processor speed. The good news is that, if we ever do hit the wall, we can go out and buy another 120GB or so in an outboard storage unit for less than $200.
We crossed the threshold into a golden age of mass-market personal computing a couple of years ago, by my reckoning, so it seems only reasonable that Intel and the other major players start paying more attention to user experience and to user-relevant specifications (probably no more valid than clock speed, alas) than to sheer hardware muscle. But we should be careful: when the automotive industry reached this point, we entered the age of tail fins.
I'm interested to see how this affects the way the Gateways and Dells of the world market their computers. For those of us who use our computers to the fullest (i.e., scientific computation, gaming, enthusiast builds), how a chip is labeled has always meant little. It seems the pre-assembled computer market needs to come up with some standard measure of everyday-task performance to replace the old frequency ratings. Any thoughts on how this will shake out?
Good news for AMD.
And it's because Intel's "NetBurst" architecture runs hot as hell, and they realize they're not going to be able to ramp up the MHz much higher anyway. They're going the way of Banias/Pentium M, and the plant in Israel must be throwing a party.
I too wondered about what MS puts on the box when it ships Longhorn. Maybe we'll see a move to PC "lines" that don't put the processor front and center in the marketing. Quite a reversal there. Also, I know of no one who predicted the "case modding" craze, which kinda leaves the people who really care about processors going the custom-build route.
Code Monkey, Intel is just following in AMD's footsteps. AMD decoupled model numbers from clock speed when the Athlon XP came out -- as in, the Athlon XP "3000+" that you're supposed to compare to the 3.0GHz Pentiums but which really only runs at 2.17GHz.
No doubt you'll still be able to find an Intel chip's actual clock speed deeply buried in the fine print, just like with AMD. Benchmarks for the applications you use matter more anyway.
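(Do the arithmetic on AMD's own labeling and you can see what they're claiming: a "3000+" rating on a 2.17GHz part works out to 3000 / 2170 ≈ 1.38, i.e., roughly 38% more work per clock cycle than the Pentium it's measured against. The division is mine; the figures are from the post above.)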
Megahertz only reflects how many times per second a processor can do its thang. It does not tell you anything about how effective that thang is. For the last 4 years or so, a Mac using a Motorola or IBM chip would perform equally with a PC that has a chip running 25% faster. Intel is dropping megahertz as a marketing ploy because they are no longer winning the race, however meaningless the race was. AMD is out-playing them, and IBM's next-generation Power chips, which will go into Macs this summer or fall, will also surpass Intel's raw clock speed.
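(To put that rule of thumb in numbers -- the specific chips are my own illustration, not a claim from the post above: a 1.6GHz PowerPC would roughly match a 2.0GHz Pentium, since 1.6 x 1.25 = 2.0.)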
"For the last 4 years or so, a Mac using a Motorola or IBM chip..."
Just how long ago did Macs stop using Motorola 680x0-family processors? 😉
As Jim McMannis put it, "MHz sells."
MHz became an absolute myth with the launch of the Pentium 4 (a hot thingie that was often outperformed by the Pentium III... the chip that's still in most servers).
Intel ran with the promise of an 8GHz PIV for a while, and to be fair, when AMD's 32-bit Athlon ran into production difficulties, Intel started beating up on the little guy again performance-wise. But now, by most accounts, the PIV can't go any higher, so they're finally abandoning the myth.
Waiting for the big guy to warn, this quarter or next.