Jonathan's Hardware Thread 4.0 (Q3 2004)
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
Well, Q2 was pretty quiet. But Q3 (though we're not even quite there) is looking to be pretty exciting.
Nvidia's NV40 and ATI's R420 launched within several weeks of each other. The Nvidia line is called GeForce 6800 and the ATI line is X800. There is a confusing proliferation of suffixes and modifiers to denote the various speed grades of products based on these chips. The battle for Most Ridiculous Product continues unabated, as Nvidia will be selling a $600 card requiring a 550W power supply and two four-pin power connectors.
The important takeaway is that neither company will be selling a card worth purchasing for another month or two (i.e. the middle of Q3). All the announced products start at $400 and go up from there. The real battle is for the mainstream $200-$300 segment, which will get differently configured chips, if not completely different designs. At the high end, ATI has the advantage with their single slot design drawing only 65W of power. Nvidia actually has an advantage in OpenGL games, though at this point those are restricted to games written by John Carmack and his licensees.
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
The Inquirer has what appears to be the Intel party line, here:
http://www.theinquirer.net/?article=15777
Their story jibes with the Reuters article here:
http://www.reuters.com/newsArticle.jhtm ... ID=5076395
Reuters wrote:The chips being canceled include the fourth-generation Pentium 4 chip, code-named Tejas, which was to be sold next year. Also being dropped is a new Xeon processor for low-end computer servers, code-named Jayhawk and believed to be based on a similar architecture to Tejas.
Intel plans to introduce dual-core chips for desktop computers in 2005 and plans to start shipments of dual-core chips for notebook computers the same year, spokeswoman Laura Anderson said.
-
- Tenth Dan Procrastinator
- Posts: 4891
- Joined: Fri Jul 18, 2003 3:09 am
- Location: San Jose, CA
http://www.nytimes.com/2004/05/17/busin ... ner=GOOGLE
And then the rest of the world catches up...
Kinda funny how they allude to this being a prelude to the end of CISC near the end of the article. The processors are already mostly RISC internally anyway. With the shift to CMP, ISA is going to matter less and less. A better question is what this means for microarchitectural research. There's not much more we can do with out-of-order superscalar processors; most researchers acknowledge this. We already see research on new computing models like ALU grids, cell processors, and even stream processors from Bill Dally. CMPs won't scale forever either, but maybe they'll at least take us to the end of silicon.
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
We're already living in a post-RISC world, at least in the narrow "reduced" sense. With AltiVec extensions and SSE, we've come full circle and recognized that making new instructions that compilers can use to generate efficient code is a big win. I've seen several things refer to modern ISAs as either "load-store" or "register-memory" which does seem like a more meaningful distinction nowadays.
But yes, the last couple of paragraphs in that article were just idle speculation.
Another way to look at uarch research is to realize that there isn't much new in the state of the art in industry right now, and hasn't been for a while. All the great advances of the '90s were simply taking ideas from mainframes and scaling them to VLSI designs. There are definitely some new improvements and features, but right now there isn't any idea as powerful as pipelining, superscalar, and out-of-order were in their time.
I work for Intel, but I do not speak for Intel. My opinions are not necessarily those of Intel Corporation.
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
http://theinquirer.net/?article=15916
http://www.firingsquad.com/hardware/nvidia_mxm/
Here's a coming trend, and you heard it here first: DIY laptops. They're coming.
Dwindlehop wrote:http://theinquirer.net/?article=15916
http://www.firingsquad.com/hardware/nvidia_mxm/
Here's a coming trend, and you heard it here first: DIY laptops. They're coming.
DIY?
-
- Tenth Dan Procrastinator
- Posts: 4891
- Joined: Fri Jul 18, 2003 3:09 am
- Location: San Jose, CA
Jason wrote:DIY?
Dwindlehop wrote:http://theinquirer.net/?article=15916
http://www.firingsquad.com/hardware/nvidia_mxm/
Here's a coming trend, and you heard it here first: DIY laptops. They're coming.
Design it yourself? I had the same question, but thought that was pretty likely to be what he meant, or close.
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
http://anandtech.com/showdoc.html?i=2070&p=8
BTX (as distinct from ATX) form factor systems will launch on June 21. The press has concentrated on the higher thermal envelope of these systems allowing for hot, hot Prescotts to be clocked ever higher, and indeed, they will. But there is an important aspect of the BTX form factor which has nothing to do with the extreme performance market segment. Anandtech.com has the pictures.
Here's a trend, and you heard it here first: someone will be selling these set-top boxes pictured at the bottom of this page. Maybe standalone, perhaps bundled with cable service or with a new TV purchase, but someone will sell them. People will buy them, not instead of a computer like WebTV was supposed to be, but in addition to a regular desktop or laptop, and they'll use them in ways that currently only Vinny has the patience to execute.
I work for Intel, but I do not speak for Intel. My opinions are not necessarily the opinions of Intel Corporation.
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
I haven't really looked into the matter, but I believe dual monitor support depends upon having more than just two outputs. That is, a card with a DVI output and an analog output is going to put out the same picture on both ports in most cases. I'd look for a non-3D accelerator given the constraints you have mentioned. Perhaps something from Matrox.
Are you in the situation where you already own a CRT, are purchasing an LCD, and wish to utilize both on the same machine? Or are you going to purchase the monitors to plug into this hypothetical video card at the same time?
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
Anandtech wrote:In our first Computex graphics article, we mentioned ATI’s plans going forward this summer. One additional tidbit we picked up on during our conversations with board manufacturers that we forgot to report on was ATI’s plans for dual DVI.
With NV40, NVIDIA integrated dual DVI connectors as standard equipment on its reference design and has been adopted by all of NVIDIA’s board partners on the Ultra line. This was welcome news to users with high-end LCD displays, as more flexible configurations are possible: with the standard DVI/VGA combo, you’re limited to supporting one DVI display, but with dual DVI you can run dual DVI displays, two VGAs, or one DVI and one VGA.
Rumor has it that ATI will also adopt dual DVI on its PCI Express RADEON X800 XT Platinum cards, just like NV40. This should come as welcome news to those of you with flat panel displays. We’ve received no word on plans for a dual DVI X800 PRO card however.
With that news out of the way, lets get on to the rest of the report.
Implicit in all this is that DVI/VGA cards are really single head, despite having two outputs. You'll need to fork over serious cash for a dual head 3D card. For a dual head 2D card, I really don't know how much they cost.
I wonder if there's software that can take advantage of a couple cheap PCI video cards and turn it into dual monitor support. Might be worth a look.
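On the Linux side at least, this already exists: XFree86/X.Org can stitch two separate cards into one desktop with Xinerama. A rough sketch of the relevant config sections follows; the driver names and BusIDs here are placeholders, and you'd substitute whatever `lspci` reports for the actual cards.

```
Section "ServerLayout"
    Identifier  "DualHead"
    Screen  0   "Screen0"  0 0
    Screen  1   "Screen1"  RightOf "Screen0"
    Option  "Xinerama" "on"
EndSection

# One Device section per PCI card; BusIDs are placeholders.
Section "Device"
    Identifier  "Card0"
    Driver      "ati"
    BusID       "PCI:1:0:0"
EndSection

Section "Device"
    Identifier  "Card1"
    Driver      "ati"
    BusID       "PCI:2:0:0"
EndSection
```

Each Screen section then binds one Device to one Monitor. Windows has supported the same trick natively since 98, so two cheap PCI cards should work there too.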
-
- Minion to the Exalted Pooh-Bah
- Posts: 2790
- Joined: Fri Jul 18, 2003 2:28 pm
- Location: Irvine, CA
Thinking of getting a 19" LCD to go with my 17". The 17" is analog, and the 19" (if I get it) will be digital. However, my current video card doesn't have DVI. Also, doesn't XP support dual monitors?
Hmm, didn't think about whether the two outputs of the video card produce the same signal. I guess I should look into it.
Last edited by Peijen on Wed Jun 09, 2004 9:24 pm, edited 1 time in total.