View Full Version : Realtime Ray Tracing = Goodbye GPU?
:huh: Quake4 in 1024 res at 100fps with NO GPU!
Better start saving now for that Octa-core machine!
Graphics Coming Back to the CPU (http://www.pcper.com/comments.php?nid=4388)
Real Time Ray-Tracing May Replace GPU Rasterization (http://www.cdrinfo.com/Sections/News/Details.aspx?NewsId=21608)
FIREWALL
10-15-07, 09:52 AM
I know the article said it would be a few years, but it's kool to know that in the near future gaming could go to another level. :D
I know the article said it would be a few years, but it's kool to know that in the near future gaming could go to another level. :D
I think I'm most excited about the scalability... whereby each additional core would increase graphics performance by almost a 1:1 factor.
And of course, the thought that we'd no longer have to fiddle with nuances among all the different competing GPUs. It just seems to me the programming would become more stable.
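That near-1:1 scaling comes from ray tracing being "embarrassingly parallel": every pixel's ray is computed independently, so scanlines can be handed to separate cores with no shared state. A minimal illustrative sketch in Python (hypothetical scene: a single sphere, not anything from the articles) showing how rows map onto worker processes:

```python
from concurrent.futures import ProcessPoolExecutor
import math

WIDTH, HEIGHT = 64, 48  # tiny image, just for illustration

def trace_row(y):
    """Trace one scanline; it depends on no other row."""
    row = []
    for x in range(WIDTH):
        # Camera ray through pixel (x, y), looking down -z from the origin
        dx = (x / WIDTH) * 2 - 1
        dy = (y / HEIGHT) * 2 - 1
        rx, ry, rz = dx, dy, -1.0
        norm = math.sqrt(rx*rx + ry*ry + rz*rz)
        rx, ry, rz = rx/norm, ry/norm, rz/norm
        # Ray vs. unit sphere centered at (0, 0, -3):
        # solve |o + t*r - c|^2 = 1 for t; a hit exists if the
        # discriminant of the quadratic is non-negative.
        lx, ly, lz = 0.0 - 0.0, 0.0 - 0.0, 0.0 - (-3.0)
        b = 2 * (rx*lx + ry*ly + rz*lz)
        c = lx*lx + ly*ly + lz*lz - 1.0
        disc = b*b - 4*c
        row.append(255 if disc >= 0 else 0)  # hit -> white pixel
    return row

def render(workers):
    # Each scanline is an independent task, which is why adding
    # cores scales throughput almost linearly.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(trace_row, range(HEIGHT)))

if __name__ == "__main__":
    image = render(workers=4)
    print(len(image), len(image[0]))  # 48 64
```

Doubling `workers` (up to the physical core count) roughly halves render time, since the rows never touch each other's data.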
FIREWALL
10-15-07, 10:50 AM
You're more technical than my noggin can absorb.
But does this mean you think it's kool too? :D
Yeah... I think it's very cool. :up:
Kinda reminiscent of what I thought was a bygone era...
...when everyone sorta had the same standard graphics hardware and the faster the CPU, the smoother the graphics (like the good ole Amiga).
SUBMAN1
10-15-07, 11:22 AM
Don't discount the usefulness of the GPU no matter what they do. I see what they are saying, but I don't expect that to happen for some time. And when the CPU (with its lousy floating point capability) replaces the GPU, the GPU will still be in your box. The reason? The GPU is capable of massive GFLOPs. This is extremely useful for anything floating point. Physics is an example.
An Ageia physics card, for example, is extremely limited. It has a pre-defined set of rules for object physics, and only those pre-defined rules. That is what limits their attractiveness. A GPU, however, can not only do the same as the physics card, it has none of the physics card's boundaries. You can reprogram the physics calculations on the fly!
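The "reprogram on the fly" point is the heart of GPGPU-style physics: the simulation is just a data-parallel kernel applied to every particle at once, and swapping the rules means swapping code, not hardware. A hypothetical CPU-side sketch of that computation shape in Python/NumPy (one Euler integration step over all particles in lockstep; the constants and particle count are made up for illustration):

```python
import numpy as np

def step(pos, vel, dt=0.01, gravity=-9.81):
    """One Euler step applied to every particle simultaneously --
    the same per-element, no-branching shape a GPU runs per-thread."""
    vel = vel + np.array([0.0, 0.0, gravity]) * dt
    pos = pos + vel * dt
    # Replacing this function body is the "reprogrammable" part:
    # unlike a fixed-function physics card, the rules are just code.
    return pos, vel

pos = np.zeros((1000, 3))  # 1000 particles at the origin
vel = np.zeros((1000, 3))  # all initially at rest
pos, vel = step(pos, vel)
```

A fixed-function card bakes its rule set into silicon; here, changing gravity to a spring force or a fluid approximation is a one-function edit applied uniformly to the whole particle array.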
The GPU is not going anywhere as far as I can see. Its duties, however, may change. Is this why ATI saw the writing on the wall and joined AMD? Maybe a super CPU with impressive floating point performance coupled with its already excellent integer performance is what they are trying to create... Just a thought.
-S
The GPU is not going anywhere as far as I can see. Its duties, however, may change. Is this why ATI saw the writing on the wall and joined AMD? Maybe a super CPU with impressive floating point performance coupled with its already excellent integer performance is what they are trying to create... Just a thought.
Well... we'll see what happens.
I know I'd be content to see the mandatory GPU add-on become a thing of the past. I'd much rather focus my attention on multi-core expansion options. I hate holding my breath every time I update the stupid GPU drivers.
antikristuseke
10-15-07, 04:18 PM
Both Intel and AMD are looking to fuse the GPU and CPU into a one-unit package; if I'm not mistaken, it is only three generations down the road map to a so-called Fusion processor, which combines GPU and CPU cores in a single package. Further integration is planned to follow, finally ending up with a general-purpose processing unit which would be excellent at both highly parallelized and serial workloads. Of course this is at least a couple of years in the future, and I doubt that this generation of graphics technology will gain widespread use in the gaming sector before that.
SUBMAN1
10-15-07, 05:13 PM
Both Intel and AMD are looking to fuse the GPU and CPU into a one-unit package; if I'm not mistaken, it is only three generations down the road map to a so-called Fusion processor, which combines GPU and CPU cores in a single package. Further integration is planned to follow, finally ending up with a general-purpose processing unit which would be excellent at both highly parallelized and serial workloads. Of course this is at least a couple of years in the future, and I doubt that this generation of graphics technology will gain widespread use in the gaming sector before that.
Well, as I said here in this forum before, the Fusion CPU will be superior to an Intel offering, so maybe this is Intel's response to that? The AMD multicore design is light-years ahead of Intel's, and removing the PCI-E bus from the equation should greatly speed things up. Direct access between CPU and GPU with minimal interference (there is still a bit of a bus, but not one in the classic sense), plus a direct link to system memory, might alleviate some of the slowness associated with system memory.
Someone answer something for me: what is the fascination with Intel and buses? Is this because the Intel engineers are aging? Buses = bottleneck. They need to do away with them already, especially in their multi-core architecture. Why did they feel the need to put a bus in between cores?? I guess this was a hack to keep up with AMD on this technology.
-S
antikristuseke
10-16-07, 02:18 AM
To be honest, I have wondered the same thing about Intel: what's with the buses?
As for the Fusion being superior, I can't really take a stand on this point since I don't know much about Intel's plans; I hope there will be stiff competition keeping the prices down, tbh. But as for this being Intel's response to Fusion, I seriously doubt that, since that would mean games incompatible with an AMD system, and I see no reason for developers to go that route. (Of course I could be wrong here; correct me if this is so.)
vBulletin® v3.8.11, Copyright ©2000-2025, vBulletin Solutions Inc.