physx with a radeon gpu?
sora3 
26/8/08 6:30:17 PM
Immortal

Quote by layzi
sora
stupid is as stupid does, ahur hur hur. I don't think very highly of you either, and I think you are missing the point totally. You make it sound as if Havok cannot run on Nvidia GPUs and only on ATI. Havok FX was cancelled not long after Intel acquired Havok; Havok and Havok FX are two separate things, so don't confuse yourself. Any benefits, you ask? Yes, huge IMO.


So you're saying, 'You're right and I'm fucking wrong, because I have an opinion'?

I didn't say that. You only assumed it, which is why you miss the point of Baner86 and others. I don't doubt that PhysX on GPUs will be huge, but having the technology now is a bit pointless given that there isn't any software that will take advantage of it. Possibly in simulation for high-end or high-performance computing, but not gaming ATM.

Also, if Havok FX was cancelled, there has to be a reason why. Possibly because of multi-core CPUs or the like? Until we know, I'll reserve my judgement.

-----
Embrace your dreams and never forget your honour.

Quote by Jen-Hsun Huang
"How much faster can you render the blue screen of death?"



illdrift 
26/8/08 6:31:54 PM
Overlord

I've used LFS and GTR2 a fair bit. They are by far the most realistic race sims, but it still seems they have a fair way they can improve. It seems the only physics calculations they make are as basic as the main forces in a few certain directions. It no doubt covers the basics.

I agree that the current scenario is adequate for most people's gaming needs. They're slowly improving realism as typical hardware develops, but if they can change to a better system which provides exponential improvements, I'd prefer they do so. Ray tracing could be considered just as unnecessary for the average gamer, but I would much prefer to use it.

The main reason I mentioned wind resistance is because it has a large effect on the car's handling when sliding at high speeds (150 km/h+). Naturally you won't notice it in a typical racing sim, since the wind resistance is always front-on.
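
To put rough numbers on the point (a back-of-envelope sketch; all constants are invented for illustration and nothing here comes from any sim's source):

```cuda
// Toy drag estimate, compilable as plain host code with nvcc.
// Drag scales with the square of speed and acts along the relative wind,
// so in a big slide a decent chunk of it becomes a sideways force.
#include <cstdio>
#include <cmath>

int main()
{
    const float PI   = 3.14159265f;
    const float rho  = 1.225f;              // air density (kg/m^3)
    const float CdA  = 0.70f;               // drag coefficient * frontal area (m^2)
    const float v    = 150.0f / 3.6f;       // 150 km/h in m/s
    const float slip = 30.0f * PI / 180.0f; // 30-degree slide

    const float drag = 0.5f * rho * CdA * v * v;  // F = 0.5 * rho * CdA * v^2

    // With zero slip the whole force is longitudinal; in a slide it splits.
    printf("total drag: %.0f N (%.0f N longitudinal, %.0f N lateral)\n",
           drag, drag * cosf(slip), drag * sinf(slip));
    return 0;
}
```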

Though with the number of CPU cores increasing, I can see how CPUs could also be adequate.

-----
Desktop|xfx 790i|q6600@450x8(h2o)|2x2GB@1600-777|3xVX2640W|8800gt(h2o)| 2x8600gt(passive)|LG GGC-H20L|CoolMax 480w(passive)|

Nas|iCute Super18|msi 680i|e4300|2x640gb Rd1|HighPoint 2320|5x1tb Rd5|TX750W|

elmo198 
26/8/08 7:28:59 PM
Champion

Aren't ATI in bed with Havok, and Nvidia with PhysX?

I also read somewhere that in some Middle Eastern country, could have been Israel, they modded the drivers to do PhysX stuff. Google it, I can't remember the link.

-----
http://users.tpg.com.au/elmie/linux/
http://users.tpg.com.au/elmie/windows/

SceptreCore 
26/8/08 8:00:40 PM
Guru

Quote by Baner86
EVGA only makes Nvidia parts at the moment, so their forums would be full of Nvidia owners. They have come to the conclusion that running physics with CUDA on their Nvidia graphics cards doesn't help performance at all because:

Frame rate is decreased because stream processors are diverted to calculating physics, which completely defeats the purpose of hardware-accelerating physics in the first place. After all, you want the shit to run faster, right?

It doesn't run faster because your graphics card is only half processing graphics and half processing physics. It ends up slower.

All that Nvidia has achieved is running physics on a GPU. It's an interesting development, but no one cares at this point in time.


But as the number of GPU processor clusters increases, so does the amount of grunt available. Could you imagine how well a 4870X2 could run a game with physics calculated on the GPU, with its 1600-SP madness? Really, physics could be hugely popular in the coming future of electronic games if ATI had a physics engine in their drivers.
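
To make the trade-off in that quote concrete, here is a minimal, purely illustrative CUDA sketch (not NVIDIA's actual PhysX code; every name in it is made up): a particle step launched each frame on the same GPU that has to render the frame. Every stream processor busy here is one that isn't shading pixels, and the more SPs a card has, the less that slice hurts.

```cuda
// Minimal sketch, NOT NVIDIA's PhysX implementation: a per-frame particle
// update run on the same GPU that renders. SMs spent on this kernel can't
// rasterize, which is the frame-rate cost described above.
#include <cuda_runtime.h>

struct Particle { float3 pos; float3 vel; };

__global__ void stepParticles(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    p[i].vel.y -= 9.81f * dt;          // gravity
    p[i].pos.x += p[i].vel.x * dt;     // explicit Euler integration
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;

    if (p[i].pos.y < 0.0f) {           // crude bounce off the ground plane
        p[i].pos.y  = 0.0f;
        p[i].vel.y *= -0.5f;
    }
}

int main()
{
    const int n = 1 << 20;             // ~1M particles
    Particle* d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));

    // One simulated "frame": in a real game the render pass would queue
    // up behind this kernel on the very same stream processors.
    stepParticles<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    cudaFree(d_p);
    return 0;
}
```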

Just my 2c's

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



Vanne 
26/8/08 9:17:55 PM
Champion

http://developer.nvidia.com/object/physx_good_company.html

Wow, there's more than 5. :)

-----
DFi Lanparty Ultra D-Hardmod to SLi
AMD X2 4400+ Holy Toledo Batman
TeamXtreem Cronus 3-3-2-8 DDR500
2xAsus 7900GTX (GA SLi bridge)
Enermax Liberty 620W SLi
2xApple 20 Inch LCD's
Ageia PhysX

F117_Nighthawk 
26/8/08 9:18:07 PM
Guru

If many games supported or had a good benefit from physics, then ATI would have implemented it in their drivers. That's the problem now: while physics may be beneficial in the future, right now there aren't enough games to actually take advantage of it. It's only recently been implemented in the Nvidia drivers, and even then there aren't really any benefits to using it.

As said many times before in this topic, if physics were popular and more game developers embraced it, then we would see better development from both camps.

-----

layzi 
26/8/08 9:18:16 PM
Apprentice
Quote by Baner86
You must agree that Intel at the moment is a damn fine company, dominating the CPU market by a huge margin, right? If they spent wads of cash buying Havok to develop Havok FX, WHY THE FUCK WOULD THEY CANCEL IT IF THEY THOUGHT PHYSICS WAS GOING TO BE BIG? After all, they would be wading around in cash right now cos every man and his dog owns an Intel CPU, so they could afford a bit of research dedicated to hardware-accelerating physics.

They aren't, because they see there is currently no point in it. Evidence for this? The majority of games on the most-wanted lists for the coming years are running Havok.

Who are you to make such claims? Are you an Intel representative or something?

I recommend reading this: http://www.tomshardware.com/news/nvidia-drivers-physics,5758.html

March 2006: GPU physics begins its life as a marketing gimmick between ATI and Nvidia. Both companies announce GPU physics at GDC Spring in San Francisco using Havok FX, a subset of the Havok physics API that used the GPU to "animate" physics. It was never really true physics.

May 2006: At E3 2006 in Los Angeles, key game developers criticized Havok FX and decided to go with either Havok or Ageia's PhysX API, since both APIs are CPU-agnostic and work on almost all platforms.


From my perspective, this indicates that with their upcoming Larrabee chips they will use DirectX 11 or possibly the PhysX API to run their accelerated physics; that's my theory anyway. Why Larrabee? Enter GPGPU technology: the fact that it's a hybrid of GPU and CPU makes it better tailored for intensive computation tasks like physics, because of its raw parallel processing power and many-core architecture. Sound familiar? If you simply put a CPU up against a GPU, which one does the job quicker? Want more schoolin'? See the sketch below.
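
For anyone who wants that schooling made concrete, here is a rough sketch (names and sizes invented; this shows only the shape of the argument, not benchmark code): the same O(n) physics update done once as a serial CPU loop and once with one GPU thread per element. At large n the wide, many-core design wins, which is the whole GPGPU pitch behind Larrabee too.

```cuda
// Hypothetical micro-comparison: identical work, serial vs massively parallel.
#include <cuda_runtime.h>

__global__ void integrateGPU(float* y, float* vy, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {                       // one thread per element, all at once
        vy[i] -= 9.81f * dt;
        y[i]  += vy[i] * dt;
    }
}

void integrateCPU(float* y, float* vy, int n, float dt)
{
    for (int i = 0; i < n; ++i) {      // one core crawls the whole array
        vy[i] -= 9.81f * dt;
        y[i]  += vy[i] * dt;
    }
}

int main()
{
    const int n = 1 << 22;
    const size_t bytes = n * sizeof(float);

    float* h_y  = new float[n]();      // zero-initialised host buffers
    float* h_vy = new float[n]();
    integrateCPU(h_y, h_vy, n, 1.0f / 60.0f);    // serial baseline

    float *d_y, *d_vy;                 // same update, thousands of threads
    cudaMalloc(&d_y, bytes);
    cudaMalloc(&d_vy, bytes);
    cudaMemset(d_y, 0, bytes);
    cudaMemset(d_vy, 0, bytes);
    integrateGPU<<<(n + 255) / 256, 256>>>(d_y, d_vy, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    cudaFree(d_y); cudaFree(d_vy);
    delete[] h_y;  delete[] h_vy;
    return 0;
}
```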

As for the current situation, I don't totally disagree with some of you, but it is clear after all this discussion that gaming physics is not just a gimmick or a waste of time, as some people here are claiming.

Quote by Baner86
3DMark Vantage scores are made up of both physics tests and the traditional graphics tests.

If the CUDA physics actually helped in any way, shouldn't the world 3DMark Vantage record be owned by a system running Nvidia hardware?

http://en.hardspell.com/doc/showcont.asp?news_id=3553

It would be nice to be able to read those scores, but never mind, I found the real scores here. Is it just me, or is that not a full-blown Nvidia setup?

http://service.futuremark.com/resultComparison.action;jsessionid=F6E4CD25D58593315928B4A24BCECD25



Edited by layzi: 27/8/2008 06:34:38 AM

-----
FTW!
http://overclockingpin.com/final%20gtx280/790isli-triple-tek9-40.jpg

Cpt. Lock 
28/8/08 2:40:03 PM
Banned

Quote by nesquick
The Source engine does physics on a software level, and it does it damn well, so I don't see why all this Havok and PhysX stuff is going to be that great.



The Source engine also has quite a lot of backing from ATi, as it runs Havok physics.

ATI = Havok
Nvidia = AGEIA

Both have different ways of implementing physics, and both companies can use it.

http://ati.amd.com/technology/crossfire/physics/index.html

That's right, you can even use a card in CrossFire purely for physics with an ATi card.

ATi had physics before Nvidia even had the idea to buy AGEIA; go play Half-Life 2 and CSS with an ATi card.


Edited by Cpt. Lock: 28/8/2008 2:49:26 PM

-----

SceptreCore 
28/8/08 2:56:48 PM
Guru

Quote by layzi
Quote by Baner86
3DMark Vantage scores are made up of both physics tests and the traditional graphics tests.

If the CUDA physics actually helped in any way, shouldn't the world 3DMark Vantage record be owned by a system running Nvidia hardware?

http://en.hardspell.com/doc/showcont.asp?news_id=3553

It would be nice to be able to read those scores, but never mind, I found the real scores here. Is it just me, or is that not a full-blown Nvidia setup?

http://service.futuremark.com/resultComparison.action;jsessionid=F6E4CD25D58593315928B4A24BCECD25



I think I could make out the score of that rig; it got 23286 if my eyes do not deceive me... so nVidia won with 29676, and with a Q9650 @ 5.5GHz I might add, while the Radeon machine was using a QX9770 overclocked to 6.1GHz. So yeah, quite a difference in score there.

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



Baner86 
28/8/08 4:26:24 PM
Hero
Champion


Heh. Nice analysis there, SceptreCore and layzi.

I admit that all I did to get that link was google "3DMark Vantage record" to prove a point, but it didn't work out too well, did it?

The site gives no system information whatsoever except for the CPU used.

All we can actually conclude from the site I linked is that FOXCONN ran an event at Computex where they broke the 3DMark Vantage record; we have no idea of any other details except for the QX9770.

I can't believe I didn't try and clear that up before linking ><

However, I don't believe I have lost all credibility...just bear with me with the following WALL OF TEXT :D

Regardless, if you look at the pictures on the linked site REALLY closely, you can see the following (numbering the pictures from top to bottom):

In picture 4 they are using the "narrow" red chassis for their testbed, and if you look EVEN closer you'll see the video cards they are using are 9800GX2s in SLI. You can tell because the heatsink design is one that has never been used on any Ati card, and ALSO because of the arrangement of the DVI ports on the PCI bracket at the rear of the cards: the ports are stacked vertically rather than arranged horizontally along the bracket.

In picture 5 the balding bearded guy is using the wide, open red testbed "chassis", and similar to before, if you look real close you'll notice they are running twin HD3870X2s. How do I know they are 3870X2s? Simply that they are on RED PCBs; no HD4870X2, not even the reference samples sent out for previews, wore a red PCB. Also, the black heatspreader on the back of the cards in picture 5 is different to the one on the HD4870X2. Note too that Computex ended on June 5th, well before the GTX280 and HD4870 were even released.

So yer...I guess it kind of makes sense that they don't have the record because ATI never held the Vantage record...my bad.

Also, I never said GPUs weren't better for physics processing than CPUs.

GPUs wipe the floor with CPUs at that kind of task.

What I disagree with is pure speculation.

You constantly say "I believe this shows that...". It's fair enough for you to believe something might happen (just as most people believed the GTX2XX release would be a resounding success just a couple of months ago). What I'm saying is that there is clearly no push from the games industry at this time, or even in the near future (StarCraft 2 and Diablo 3 among other titles), for hardware-accelerated physics, let alone hardware-accelerated physics that uses a fair proportion of your graphics processing power.


Edited by Baner86: 28/8/2008 4:35:07 PM

-----
Q6600//GA-X38-DQ6//4x1GB ADATA DDR2-800//Hightech HD4870X2

E6600//P5B Deluxe WiFi-AP//2x1GB G.Skill DDR2-533//2x Palit 2900XT (Crossfire)

A64 4400+ x2//A8N-Sli Premium//2x1GB G.Skill DDR-400//2x XFX 8800GT (Sli)

SceptreCore 
28/8/08 4:43:34 PM
Guru

True enough, baner... that link you provided didn't sport the latest competitors, but you may still need to back up the claim you made that "If the CUDA physics actually helped in any way, shouldn't the world 3DMark Vantage record be owned by a system running Nvidia hardware?", because I believe in the link that layzi put up they were using the WHQL-approved 177.35 drivers, which aren't the latest for the GT200s anymore but do support CUDA technology.

So wouldn't this constitute the world-record 3DMark score using CUDA? :)

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



Baner86 
28/8/08 4:48:25 PM
Hero
Champion


Yeah, I made that statement based on that site that I linked which had nothing to do with anything :)

The 3dMark Vantage record IS held by an Nvidia system and there's no arguing that at all.

I can admit I was wrong...through the tears... :P

-----
Q6600//GA-X38-DQ6//4x1GB ADATA DDR2-800//Hightech HD4870X2

E6600//P5B Deluxe WiFi-AP//2x1GB G.Skill DDR2-533//2x Palit 2900XT (Crossfire)

A64 4400+ x2//A8N-Sli Premium//2x1GB G.Skill DDR-400//2x XFX 8800GT (Sli)

SceptreCore 
28/8/08 4:53:15 PM
Guru

Quote by Baner86
Yeah, I made that statement based on that site that I linked which had nothing to do with anything :)

The 3dMark Vantage record IS held by an Nvidia system and there's no arguing that at all.

I can admit I was wrong...through the tears... :P


It takes a man of strong moral character to make that kind of admission, and not to keep trying to argue his way out of a situation.

/shakes hand

But you have still made some fine points that can't be disputed!

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



Baner86 
28/8/08 5:01:28 PM
Hero
Champion


I got hell schooled fair and square.

The closest an Ati system gets to the one shown by layzi was around 24000. That's CrossFireX apparently; it doesn't specify that he was actually using two cards, but I would say he was.

That's also WITH a PhysX card thrown in.

If you look a bit deeper (which I'm certain layzi will do to try and own me again :)), you'd see that the Ati system with the AGEIA PhysX card in it actually gets a higher score in the physics test than the tri-SLI GTX280s do.

If anything, this shows there may actually be some value in having an AGEIA PhysX card in an SLI system, as it seems to handle the physics processing better than the video cards do running CUDA.

The CrossFireX setup also loses in the game-simulation raw FPS tests, which would also account for the crappy score, although admittedly this was a user-entered submission... not an outrageously overclocked system run by professionals at some event.

But I highly doubt that putting a heavily overclocked CPU in will bring that score from 24XXX to ANYWHERE NEAR 29XXX....

I just wanna say, after all MY walls of text and near-ranting: I believe it might be a foolish decision for Nvidia to jump in and say "We are running AGEIA PhysX". To do this means expecting everyone else, all developers, everything, to agree with you and ALSO adopt AGEIA when Havok is the majority physics implementation at the moment.

We all know people like to resist change, so I don't think Ati is getting left behind by not adopting a physics API right away; it just wants to go with the flow.


Edited by Baner86: 28/8/2008 5:05:43 PM


Edited by Baner86: 28/8/2008 5:06:08 PM

-----
Q6600//GA-X38-DQ6//4x1GB ADATA DDR2-800//Hightech HD4870X2

E6600//P5B Deluxe WiFi-AP//2x1GB G.Skill DDR2-533//2x Palit 2900XT (Crossfire)

A64 4400+ x2//A8N-Sli Premium//2x1GB G.Skill DDR-400//2x XFX 8800GT (Sli)

SceptreCore 
28/8/08 5:09:50 PM
Guru

I am absolutely pissing myself with laughter.

That entire rant you made before... it seemed so 'structured', so 'premeditated', and then BAM! Evidence brings it all crashing down, like an old building in the way of future development.

What were you saying before about PhysX cards providing no performance increase, in fact slowing down FPS, and being absolutely useless!?

But don't worry: as nVidia increases processor clusters, CUDA will become a more viable option for everyone. So perhaps now you agree that nVidia aren't so foolhardy in developing GPU-level physics calculations, and that maybe ATI needs its own?

Even if it's not needed just yet, what's the harm in producing and refining the technology before the demand arrives?

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



Baner86 
28/8/08 5:29:23 PM
Hero
Champion


Yes, but they're producing and refining the technology for the AGEIA PhysX API.

I would understand if they were showcasing hardware acceleration of both:

PhysX physics

and

Havok Physics

Being the only two real players in the game.

Wouldn't it pay to develop for both rather than one, if you wanted to get a head start and the industry hadn't decided which direction to go?

-----
Q6600//GA-X38-DQ6//4x1GB ADATA DDR2-800//Hightech HD4870X2

E6600//P5B Deluxe WiFi-AP//2x1GB G.Skill DDR2-533//2x Palit 2900XT (Crossfire)

A64 4400+ x2//A8N-Sli Premium//2x1GB G.Skill DDR-400//2x XFX 8800GT (Sli)

nesquick 
28/8/08 5:48:14 PM
Guru

I thought k|ngp|n had the world record for Vantage using 3 GTX280s?

-----
e8600@4.5ghz 1.39v|GIGABYTE X48DS4|BALLISTIX TRACERS 4GB+PATRIOT PC9200 1GB|HD4870 790/4400/HD4850 CROSSFIREX|640GB WD HD|CORSAIR HX520|ANTEC TX1050B CASE|TRUE|

SceptreCore 
28/8/08 6:06:18 PM
Guru

Quote by Baner86
Yes, but they're producing and refining the technology for the AGEIA PhysX API.

I would understand if they were showcasing hardware acceleration of both:

PhysX physics

and

Havok Physics

Being the only two real players in the game.

Wouldn't it pay to develop for both rather than one, if you wanted to get a head start and the industry hadn't decided which direction to go?

I agree that developing both would be the way to go, but nVidia get a hard-on when they make their own shit work with their other stuff. Just the way it is. What I'm saying is I'd like to see what ATI could do in this area of physics on GPU and/or GPGPU, but then I suppose their R&D budget isn't as large as the 10-billion-dollar green giant's.

Quote by nesquick
I thought k|ngp|n had the world record for Vantage using 3 GTX280s?


Yeah it's that futuremark link up there ^.^


Edited by SceptreCore: 28/8/2008 6:06:58 PM

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



nesquick 
28/8/08 6:17:00 PM
Guru

Yeah, sorry, it seems some people like to write walls of text which I have to gloss over so I don't age ten years by the time I've read to the bottom. :P

-----
e8600@4.5ghz 1.39v|GIGABYTE X48DS4|BALLISTIX TRACERS 4GB+PATRIOT PC9200 1GB|HD4870 790/4400/HD4850 CROSSFIREX|640GB WD HD|CORSAIR HX520|ANTEC TX1050B CASE|TRUE|

SceptreCore 
28/8/08 6:35:59 PM
Guru

Quote by nesquick
Yeah, sorry, it seems some people like to write walls of text which I have to gloss over so I don't age ten years by the time I've read to the bottom. :P

Yeah Baner can apologise for that as well. :P

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...


