GameReplays.org > Hardware News > NVIDIA GeForce™ GTX 400 Series Launched

Posted by: GiDeoN Mar 28 2010, 14:25 PM

NVIDIA has finally released the GeForce GTX 480 and GTX 470 DirectX 11-compliant graphics cards.

The GeForce GTX 480 features the GF100 GPU, built on TSMC's 40 nm process. The core packs 480 CUDA Cores, with a Graphics Clock of 700 MHz and a Processor Clock of 1401 MHz. It pairs this with 1536 MB of GDDR5 memory on a 384-bit memory bus at 3696 MHz (effective), for 177.4 GB/s of bandwidth.

The GeForce GTX 470 uses the same GF100 GPU on TSMC's 40 nm process, but with 448 CUDA Cores, a 607 MHz Graphics Clock and a 1215 MHz Processor Clock. It carries 1280 MB of GDDR5 memory on a 320-bit memory bus at 3348 MHz (effective), for 133.9 GB/s of bandwidth.
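
For reference, the quoted bandwidth figures follow directly from the bus width and the effective memory clock; here is a quick back-of-the-envelope check (a minimal Python sketch with the numbers above plugged in, not something from NVIDIA's material):

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective data rate)
def bandwidth_gbs(bus_width_bits, effective_mhz):
    """Peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8           # e.g. a 384-bit bus moves 48 bytes per transfer
    return bytes_per_transfer * effective_mhz / 1000  # MB/s -> GB/s

print(f"GTX 480: {bandwidth_gbs(384, 3696):.1f} GB/s")  # ~177.4 GB/s
print(f"GTX 470: {bandwidth_gbs(320, 3348):.1f} GB/s")  # ~133.9 GB/s
```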

Reviews of the GeForce GTX 400 Series paint a mixed picture of the cards' power draw, performance and pricing. The power required, and the heat it produces, may leave some questioning the longevity of cards running the stock cooler:

Finally, with this data in hand we went to NVIDIA to ask about the longevity of their cards at these temperatures, as seeing the GTX 480 hitting 94C sustained in a game left us worried. In response NVIDIA told us that they have done significant testing of the cards at high temperatures to validate their longevity, and their models predict a lifetime of years even at temperatures approaching 105C (the throttle point for GF100). Furthermore as they note they have shipped other cards that run roughly this hot such as the GTX 295, and those cards have held up just fine.

Source: AnandTech


When we focus solely on the GTX 480 for a minute, the topic of discussion obviously has to be the TDP -- the card when stressed will utilize nearly 250 Watts and that certainly is pretty steep power consumption, especially compared to ATI's Radeon 5870 with a TDP of only 188W. You are going to need a decently ventilated PC as the heat the GeForce GTX 480 produces is plentiful. Make no mistake, the card will get hot, very hot. So that's definitely something you need to keep in the back of your head.

Source: Guru3D


The elaborate cooling mechanism does allow for some overclocking, although don't expect to set records with it. Anyone looking to top leaderboards even at the tech-forum level with the GTX 480 should spare some dough on at least water-cooling. This is where a slightly scary flip-side of the GTX 480 starts. The GPU runs extremely hot! NVIDIA set its thermal limits at a vaporizing 105 degrees Celsius. It's a pleasant 22 degrees this time of the year being Spring, but the GTX 480 still scores a scorching 96 degrees Celsius on typical gaming load. Crossing the 100 degrees mark won't be tough for this GPU in even slightly hotter places, especially with Summer coming up.

Source: TechPowerUp


You can read more reviews and view the GeForce GTX 400 Series benchmark performance results below.

Keep yourself updated on the latest Hardware and Technology news here at GameReplays.org.


[Attached image]

Posted by: Guardian Mar 28 2010, 15:35 PM

They really timed this release badly. People need space heaters in the winter, not the spring sad.gif

Posted by: GiDeoN Mar 28 2010, 15:47 PM

I guess it's the downside of the monolithic approach NVIDIA take with their GPU design.

It'll keep you nice and warm in the winter months though! tongue.gif

Posted by: Frode789 Mar 28 2010, 15:55 PM

According to many of the tests, they reached over 100 °C during some games, which made the GPU fan exceed 70 dB in noise level. And of course, the card draws up to 300 W...

I mean seriously, what a flop. I expected more after all the hyping from NVIDIA.

Posted by: Species8472 Mar 28 2010, 16:17 PM

For the 480 it's 250 watts.
Too bad they're this late with such a bad card. Given these temperatures, a dual-GPU card won't be possible; a die shrink would be very welcome here.
Until then, ATI will keep the performance crown.

Posted by: iAmNoTiEnGi Mar 28 2010, 16:20 PM

I wonder if it will even lower the prices of cards like the 5970.

Posted by: |elder|Ap0C Mar 28 2010, 20:32 PM

What a fail.

Nearly as bad as the 2900xt, but not quite there.

Posted by: GiDeoN Mar 28 2010, 21:12 PM

QUOTE(iAmNoTiEnGi @ Mar 28 2010, 16:20 PM) *

I wonder if it will even lower the prices of cards like the 5970.


I seriously doubt it. I'm sure AMD/ATI will have a refresh soon, which might lower prices a bit to make room for a 2 GB 5870, etc.

QUOTE(|elder|Ap0C @ Mar 28 2010, 20:32 PM) *

What a fail.

Nearly as bad as the 2900xt, but not quite there.


Lol'd.

Posted by: Guardian Mar 28 2010, 21:44 PM

What I want to see is a game that actually needs a new gen DX11 card to realize its full potential. None of this AvP or Dirt BS.

Posted by: GiDeoN Mar 28 2010, 22:43 PM

BFBC2

Posted by: |elder|Ap0C Mar 28 2010, 23:31 PM

QUOTE
BFBC2


I personally believe the Frostbite engine is a heap of fail as well. Technically it does about 80% of what CryEngine 2 does, at much greater cost.

This is because it is tuned towards consoles and their old GPUs. In BFBC2, shadows are actually rendered by the CPU, even on PCs, so it is quite CPU-limited. A GPU performance increase as a result of DX11 will probably not be noticed.

If you want to actually take advantage of DX11, I would recommend the new S.T.A.L.K.E.R. It takes advantage of tessellation, which is DX11-specific, and gives much higher-detail models at lower cost.

Posted by: ^Dr_PhiL Mar 29 2010, 01:36 AM

Well, at least now I know I won't hesitate to buy the Radeon 5870, as NVIDIA clearly won't offer any worthwhile competition for a while.

Posted by: seithon Mar 29 2010, 01:59 AM

This new card reminds me of the FX 5x00 range of NVIDIA cards... the good ol' dustbusters sleep.gif

Posted by: Kibatsu Mar 29 2010, 07:18 AM

Hmm, I had also been waiting for months for the launch of the 400 series, since I want to upgrade from my NVIDIA 8800 GTX; currently I'm mostly playing Battlefield: Bad Company 2.

Although I'm usually an NVIDIA fanboi, I'm quite disappointed by the new cards. Now I'm toying with the idea of getting either an HD 5850 or an HD 5870 ...

Posted by: maverick199 Mar 29 2010, 10:08 AM

Though the specs are solid, especially with AA on, it hasn't obliterated the 5800s. I'm glad now that I didn't wait for Fermi and went with the 5850.

Nice article btw. thumb.gif

Posted by: GiDeoN Mar 29 2010, 12:18 PM

QUOTE(Kibatsu @ Mar 29 2010, 07:18 AM) *

Now I'm toying with the idea of getting either an HD 5850 or an HD 5870 ...


Wait a little bit longer; the Eyefinity 6 edition of the 5870 should be coming soon, plus there was talk of a 2 GB refresh version, which may push down the prices of the 1 GB cards.

Posted by: Kibatsu Mar 29 2010, 12:21 PM

QUOTE(GiDeoN @ Mar 29 2010, 14:18 PM) *

Wait a little bit longer; the Eyefinity 6 edition of the 5870 should be coming soon, plus there was talk of a 2 GB refresh version, which may push down the prices of the 1 GB cards.

Thanks GiDeoN, I was waiting for a tip like that. thumb.gif smile.gif

Posted by: GiDeoN Mar 29 2010, 14:28 PM

I'm giving them until July to drop the prices before I decide to increase my GPU count regardless of pricing.

Posted by: Kibatsu Mar 29 2010, 14:29 PM

Meh, you think ATI will wait that long before reducing their prices? sad.gif

Posted by: GiDeoN Mar 29 2010, 16:27 PM

I hope not sad.gif

Posted by: Kibatsu Mar 29 2010, 17:05 PM

Hmm, until they lower their prices, I might try to overclock my Intel E6750 to 3.0 GHz, which shouldn't be a big deal.

I've heard Battlefield: Bad Company 2 needs a slightly faster CPU to smoothly run everything maxed out.

Posted by: GiDeoN Mar 29 2010, 21:12 PM

It's certainly more friendly to CPUs with more cores, AFAIK.

Posted by: STCAB Mar 29 2010, 21:44 PM

Not surprising that they weren't able to beat ATI this time... I bet they're pushing the 400 REALLY hard because of that, which is why it's getting super hot and stuff. And the price? WTF, it's almost the same as a 5970, which is just way ahead of the 400's main competitor, the 5870.

Is that bankruptcy I smell?

Posted by: Guardian Mar 30 2010, 00:53 AM

QUOTE(STCAB @ Mar 29 2010, 17:44 PM) *
Is that bankruptcy I smell?

Lol no, that's the popcorn I'm making with my 480.

ATI had their problems when they transitioned to the 40 nm process, and now NVIDIA is going through its growing pains. They lose this round, sure, but all this talk of doom and gloom is just unfounded imo.

Posted by: Kibatsu Mar 30 2010, 05:58 AM

QUOTE(Guardian @ Mar 30 2010, 02:53 AM) *

Lol no, that's the popcorn I'm making with my 480.

happy.gif

QUOTE(Guardian @ Mar 30 2010, 02:53 AM) *

ATI had their problems when they transitioned to the 40 nm process, and now NVIDIA is going through its growing pains. They lose this round, sure, but all this talk of doom and gloom is just unfounded imo.

Agreed; NVIDIA will probably win the next "round" with the next generation of GFX cards, so no need to worry ...

Posted by: |elder|Ap0C Mar 30 2010, 14:31 PM

[Image]

Posted by: -AwfuL^^ Mar 30 2010, 15:53 PM

Love that picture tongue.gif

BTW, current rumours indicate ATI will refresh its 40 nm range with a 67xx part at the end of this year. It will supposedly feature 1600 or fewer stream processors, but with increased efficiency leading to ~20% better performance than the GTX 480. It should also have better tessellation performance and much improved GPGPU performance thanks to a revamped cache hierarchy. It will supposedly be followed up in 2010 by the 68xx series, which is the 67xx times two.

That or it's another one of ATI's infamous smokescreen rumours smile.gif

Posted by: GiDeoN Mar 30 2010, 17:58 PM

QUOTE(Guardian @ Mar 30 2010, 00:53 AM) *

ATI had their problems when they transitioned to the 40 nm process, and now NVIDIA is going through its growing pains. They lose this round, sure, but all this talk of doom and gloom is just unfounded imo.


I think you really need to look at the overall picture with NVIDIA. Fermi isn't an architecture designed with the gamer in mind; it's much more about GPGPU, and their growth in that sector reflects this.

They're aiming for a different set of markets if you look closely at their products.

Posted by: Guardian Mar 30 2010, 20:56 PM

And maybe this new architecture will be beneficial for gamers. You saw from the benchmarks that AA and tessellation took less of a performance hit on the Fermi cards than on ATI's. Do you really think they are going to abandon that segment of the market?

I'm not a fanboy, but let's be realistic here....

Posted by: Species8472 Mar 30 2010, 23:21 PM

NVIDIA has way too much knowledge to go bankrupt, and, like AMD, if they have trouble with money there are way too many avenues open to them; the only bad outcome would be being sold to Intel.
Anyway, in the workstation market they don't have any competition, and there's probably more to gain there than in the consumer market.

Posted by: ^Dr_PhiL Mar 31 2010, 13:48 PM

What is it with them? The GTX 295 costs upwards of 500 dollars, whereas the Radeon 5970, which owns it in almost every single way, costs around 600-700. And the Radeon card below that, the 5870, can still match up to it for a hundred dollars less...

They always think they can charge whatever they want and ask ridiculous prices for their GPUs.

Posted by: GiDeoN Mar 31 2010, 17:48 PM

They have to factor in production costs. They got forced into a bad situation last round when ATI released the 4xxx series and seriously cut pricing to compete, and there is only so far they can lower prices before it becomes a question of profit versus market-share retention.

This is where ATI win with their silicon design: they fit more onto each wafer thanks to a smaller GPU die, which in turn gives them a much better yield percentage. Compare this to NVIDIA, whose much larger die gives a lower yield percentage and also means fewer chips per wafer.

A wafer of silicon costs the same for both of them; ATI just get more dies off each wafer, which equates to more cards to sell and a stronger position from which to lower pricing if they need to compete.
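
To put rough numbers on that, here's a quick Python sketch using the classic dies-per-wafer approximation and a simple Poisson yield model; the die sizes and defect density below are illustrative assumptions for the sake of the example, not real figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross dies-per-wafer approximation (ignores scribe lines and edge exclusion)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dies with zero defects under a simple Poisson model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

# Assumed numbers: a large ~530 mm^2 GPU vs a smaller ~330 mm^2 GPU on a 300 mm wafer,
# with an assumed defect density of 0.4 defects/cm^2.
for name, area in [("large die (GF100-class)", 530), ("smaller die (Cypress-class)", 330)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, 0.4)
    print(f"{name}: {gross} candidate dies per wafer, ~{good:.0f} good dies")
```

With these assumed numbers the smaller die gets roughly 70% more candidate dies per wafer and more than double the yield fraction, which is exactly the cost advantage described above.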

I doubt either of them will lower pricing much at the moment; the yields from TSMC's 40 nm production have been problematic for both ATI and NVIDIA, and have cost them both. At least ATI are six months ahead here and have had time to refine the process and iron out problems, plus they'll likely switch to GlobalFoundries at some point soon enough.

The downside is that there has been no competition in the market for six months, which has left ATI room to inflate pricing, let alone what NVIDIA want to charge us because of their GPU design choices and poor production yields.

Posted by: ^Dr_PhiL Mar 31 2010, 21:27 PM

True, the new ATI cards will probably carry inflated pricing, just like NVIDIA is doing right now with the 400 series.

Posted by: Guardian Apr 1 2010, 00:10 AM

QUOTE(GiDeoN @ Mar 31 2010, 13:48 PM) *
They have to factor in production costs. They got forced into a bad situation last round when ATI released the 4xxx series and seriously cut pricing to compete, and there is only so far they can lower prices before it becomes a question of profit versus market-share retention.

This is where ATI win with their silicon design: they fit more onto each wafer thanks to a smaller GPU die, which in turn gives them a much better yield percentage. Compare this to NVIDIA, whose much larger die gives a lower yield percentage and also means fewer chips per wafer.

A wafer of silicon costs the same for both of them; ATI just get more dies off each wafer, which equates to more cards to sell and a stronger position from which to lower pricing if they need to compete.

I doubt either of them will lower pricing much at the moment; the yields from TSMC's 40 nm production have been problematic for both ATI and NVIDIA, and have cost them both. At least ATI are six months ahead here and have had time to refine the process and iron out problems, plus they'll likely switch to GlobalFoundries at some point soon enough.

The downside is that there has been no competition in the market for six months, which has left ATI room to inflate pricing, let alone what NVIDIA want to charge us because of their GPU design choices and poor production yields.


Wait so you do agree with me ;d

Posted by: GiDeoN Apr 1 2010, 18:05 PM

QUOTE(Guardian @ Apr 1 2010, 00:10 AM) *

Wait so you do agree with me ;d


Didn't say that tongue.gif
