SUBSIM Radio Room Forums

SUBSIM Radio Room Forums (https://www.subsim.com/radioroom/index.php)
-   PC Hardware/Software forum (https://www.subsim.com/radioroom/forumdisplay.php?f=235)
-   -   First benchmark (https://www.subsim.com/radioroom/showthread.php?t=152702)

SUBMAN1 06-22-09 07:32 PM

Both of you should do yourselves a favor and skip NVidia till next spring. NVidia has never implemented DX10.1, and they barely support DX10. Where is their tessellation?

-S

Arclight 06-23-09 02:43 AM

It's been given a swirly and stuffed in a locker by PhysX. :roll:

Nvidia is focusing on improving current tech, while ATI is forging ahead, quickly implementing new tech. Thing is, if Nvidia did hardware tessellation too, it could become a standard technique. ATI has been putting tessellators in their GPUs since the R600, giving them a head start in this field.

Gonna be interesting to see what the future holds. If Nvidia keeps ignoring this feature of DX11 and game devs start using it, ATI could take the lead and keep it until Nvidia makes a hardware-tessellation-capable GPU.

I'm sure they will conjure up some solution before this happens, though. Simulating it seems viable given the sheer horsepower available in today's GPUs and CPUs. :hmmm:
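Very roughly, what I mean by simulating it: you could do the subdivision yourself with a compute kernel instead of a fixed-function tessellator. The sketch below is just something I cooked up for illustration, in CUDA, with made-up names; a real DX11 pipeline uses hull/domain shaders and does a lot more. It splits every triangle into four at the edge midpoints:

Code:

#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };
struct Tri  { Vec3 a, b, c; };

__device__ Vec3 midpoint(Vec3 p, Vec3 q)
{
    Vec3 m = { (p.x + q.x) * 0.5f, (p.y + q.y) * 0.5f, (p.z + q.z) * 0.5f };
    return m;
}

// One thread per input triangle: split it into four smaller triangles
// at the edge midpoints (one level of uniform subdivision).
__global__ void subdivide(const Tri *in, Tri *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Tri t   = in[i];
    Vec3 ab = midpoint(t.a, t.b);
    Vec3 bc = midpoint(t.b, t.c);
    Vec3 ca = midpoint(t.c, t.a);

    Tri t0 = { t.a, ab, ca };
    Tri t1 = { ab, t.b, bc };
    Tri t2 = { ca, bc, t.c };
    Tri t3 = { ab, bc, ca };
    out[i * 4 + 0] = t0;
    out[i * 4 + 1] = t1;
    out[i * 4 + 2] = t2;
    out[i * 4 + 3] = t3;
}

int main()
{
    Tri tri = { {0, 0, 0}, {1, 0, 0}, {0, 1, 0} };
    Tri *d_in, *d_out;
    cudaMalloc(&d_in, sizeof(Tri));
    cudaMalloc(&d_out, 4 * sizeof(Tri));
    cudaMemcpy(d_in, &tri, sizeof(Tri), cudaMemcpyHostToDevice);

    subdivide<<<1, 1>>>(d_in, d_out, 1);

    Tri result[4];
    cudaMemcpy(result, d_out, 4 * sizeof(Tri), cudaMemcpyDeviceToHost);
    printf("4 triangles out; the center one starts at (%.2f, %.2f, %.2f)\n",
           result[3].a.x, result[3].a.y, result[3].a.z);

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}

Run a couple of levels of that over a few hundred thousand triangles every frame and you see why the raw horsepower matters.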

SUBMAN1 06-23-09 08:23 AM

Quote:

Originally Posted by Arclight (Post 1122145)
It's been given a swirly and stuffed in a locker by PhysX. :roll:

Nvidia is focusing on improving current tech, while ATI is forging ahead, quickly implementing new tech. Thing is, if Nvidia did hardware tessellation too, it could become a standard technique. ATI has been putting tessellators in their GPUs since the R600, giving them a head start in this field.

Gonna be interesting to see what the future holds. If Nvidia keeps ignoring this feature of DX11 and game devs start using it, ATI could take the lead and keep it until Nvidia makes a hardware-tessellation-capable GPU.

I'm sure they will conjure up some solution before this happens, though. Simulating it seems viable given the sheer horsepower available in today's GPUs and CPUs. :hmmm:

Try not till spring. And since their 520s are so complex, NVidia is taking a loss on each one they sell right now. There is a good chance of a buyout or bankruptcy for them at the moment.

And they aren't focusing on current tech. They are focusing on a worthless feature called CUDA that no one wants.

-S

stabiz 06-23-09 02:45 PM

13560 for me.

E8500
4GB DDR2
HD 4870 512MB
Win XP

Time for a reinstall, me thinks.

Task Force 06-23-09 02:48 PM

I myself was thinking about ATI, but I hate their control panel, all the fancy bullcrap. And I've heard they don't like getting too heated up / the fans break down often.:hmmm:

And what the hell is this CUDA stuff? I've heard of it and still don't have a clue what it is.:hmmm:

Arclight 06-23-09 02:59 PM

Ehr, basically using the GPU as a CPU. Think of a CPU with 256 cores, something like that, if I understand correctly. Of course, you need specially designed applications.

For example, you could use a GPU as a search engine. Instead of 2 or 4 threads, there are 256 running at once, making it possible to cover a lot of ground very quickly. Not sure if the analogy is accurate though; I'm a bit sketchy on the whole thing myself. :-?
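Something like this, anyway. A tiny CUDA sketch just for illustration (the kernel name, the array size and the value 42 are all made up): every element of a big array gets its own thread to check for a match, instead of one CPU loop walking the whole thing.

Code:

#include <cstdio>
#include <cuda_runtime.h>

// Every thread checks one element of the array for the target value
// and bumps a shared counter when it finds a match.
__global__ void find_matches(const int *data, int n, int target, int *count)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
    if (i < n && data[i] == target)
        atomicAdd(count, 1);
}

int main()
{
    const int N = 1 << 20;                           // ~1 million elements
    int *h_data = new int[N];
    for (int i = 0; i < N; ++i) h_data[i] = i % 1000;

    int *d_data, *d_count;
    cudaMalloc(&d_data, N * sizeof(int));
    cudaMalloc(&d_count, sizeof(int));
    cudaMemcpy(d_data, h_data, N * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemset(d_count, 0, sizeof(int));

    // Enough 256-thread blocks to cover the whole array in one launch.
    int threads = 256;
    int blocks  = (N + threads - 1) / threads;
    find_matches<<<blocks, threads>>>(d_data, N, 42, d_count);

    int h_count = 0;
    cudaMemcpy(&h_count, d_count, sizeof(int), cudaMemcpyDeviceToHost);
    printf("matches: %d\n", h_count);

    cudaFree(d_data);
    cudaFree(d_count);
    delete[] h_data;
    return 0;
}

The kernel gets launched with 256 threads per block, which is roughly where the "256 at once" picture comes from; how many actually run in parallel depends on the card.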

Task Force 06-23-09 03:01 PM

They really need to look towards DX 11, or possibly DX 11.x or 12.:hmmm: What is the average lifespan of an 8800 that is overclocked to 576 MHz?

Arclight 06-23-09 03:18 PM

IMO, don't worry about raising the clocks too much; it's increasing the volts that will get ya.

Personally, when it comes to PC stuff, I expect something to last as long as it's covered under the warranty; anything after that is just a bonus. So: 2 years. :DL

Task Force 06-23-09 03:44 PM

This thing came with a lifetime warranty, and I just realised I forgot to register it. (It's 2 years old this December.) :yep: Oh well, I'll get a better card eventually.

