Discuss: Larrabee: What we know so far
Dany Williams 
11/8/08 3:54:42 PM
Guru

Dany Williams thinks:

So this is what Intel had up its jumper. Well, is this the end of the GPU as we know it?

I wonder how NVIDIA and AMD/ATI will like this.

About the Atomic article Larrabee: What we know so far:

http://www.atomicmpc.com.au/article.asp?CIID=119252

Atomic wants its 32 core processor. Here's why, and how Intel plans to take on Nvidia and AMD on the graphics front

What do you think?

-----
AMD64 4000+ San Diego
Zalman CNPS9500
DFI Ultra-D
1GB RAM
7800GTX OC
74GB Raptor, 36GB Raptor, 2x 120s

mark84 
11/8/08 5:06:26 PM
Hero
Guru


Will be interesting to see how this pans out. It only needs to be close in performance to be a winner. The fact that it'll be totally programmable is a big feature. Auto-load balancing should be easy to do on it, compared to current GPU fixed pipeline architectures.
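
Purely as an illustration of that point (nothing here is from the article, and the names are made up), the sketch below shows dynamic load balancing with a shared atomic work counter, written in CUDA C since that's the GPGPU toolchain you can actually run today: threads keep claiming the next unprocessed item, so faster cores simply end up doing more of the work instead of being handed a fixed slice up front.

// Illustrative sketch only: dynamic load balancing via a shared atomic
// counter. process_items, next_item and the per-item "work" are placeholders.
__global__ void process_items(float *data, int num_items, int *next_item)
{
    for (;;) {
        int i = atomicAdd(next_item, 1);   // claim the next unprocessed item
        if (i >= num_items)
            return;                        // nothing left to do
        data[i] *= 2.0f;                   // stand-in for real per-item work
    }
}

// Host side (error checking omitted): launch enough threads to keep the
// chip busy rather than one thread per item, e.g.
//   int *d_next;
//   cudaMalloc(&d_next, sizeof(int));
//   cudaMemset(d_next, 0, sizeof(int));
//   process_items<<<64, 256>>>(d_data, num_items, d_next);

With a fixed-function pipeline you never get to write that loop at all; the hardware decides how the work is split, which is exactly why a fully programmable part is interesting.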

AnandTech has a great in-depth article here for those interested:
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3367

-----
Q9550 | HD4870 | 2x WD3000GLFS RAID0 | Swiftech Storm | 2407WFP |

AtomicMPC Firefox extension
http://tinyurl.com/2qabr7

The Truth About Graphics Power Requirements V2
http://tinyurl.com/cj3pw

SceptreCore 
11/8/08 6:05:52 PM
Guru

Well you can bet that AMD will make a similar design... and for their sake I hope they can beat it!

-----
Greatest Sayings Of Our Time:
Quote by sora3
...What drivers are you using? If you say the drivers on your CD, I will phone you and tell you that I will bone your f*cking dog...



battlefield_gir 
11/8/08 6:13:06 PM
Guru

I think Nvidia still has a lot of life left in it.

/waits for official stats and figures

-----
Minister for Education, Innovation, Science & Research.


R.I.P Josh Woods

Goth: Am I the only one who somehow swam through the wall of the vagina and got lost/stuck inside




"There is no probl

Comrade 
11/8/08 6:34:22 PM
Serf

Screw Intel. Nvidia's CUDA is already here and gaining support from a wide range of application developers. PhysX will be enabled with the next GeForce driver, so right now Nvidia's chip is like Superman: it can do anything. The G80 is already highly programmable, and CUDA talks straight to the hardware, no graphics involved. When Intel's Larrabee finally arrives, it will only be a discrete graphics processor, and it will have to earn its place in the GPU and GPGPU market. GOOD LUCK INTEL!
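
For anyone who hasn't seen what "no graphics involved" looks like in practice, here is a minimal, purely illustrative CUDA program (not taken from anywhere in this thread): a SAXPY kernel compiled with nvcc and launched like an ordinary C function call, with no Direct3D or OpenGL anywhere in sight.

#include <cuda_runtime.h>
#include <cstdio>

// y = a*x + y as a data-parallel kernel; one thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host data.
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device buffers and copies (error checking omitted for brevity).
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch: enough 256-thread blocks to cover n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);   // expect 5.000000

    cudaFree(dx);
    cudaFree(dy);
    delete[] hx;
    delete[] hy;
    return 0;
}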

-----

nesquick 
11/8/08 6:49:35 PM
Guru

CUDA and PhysX will be for shit if this blows every graphics card on the market out of the water.

-----
Q6700@3.8 24/7|GIGABYTE X48DS4|PATRIOT 2GB 1148MHZ 4-4-4-12|HD4870 790/4400/HD4850 CROSSFIREX|640GB WD HD|CORSAIR HX520|ANTEC TX1050B CASE|TRUE|

Kimmo 
11/8/08 6:59:29 PM
Hero
Immortal


This idea looks the bomb.

-----
http://www.youtube.com/profile_play_list?user=Mothdust666

bluedude 
11/8/08 7:18:56 PM
Guru

Do you have to program apps specifically for the Larrabee like you do with nVidia's CUDA? If not, then this is a great advantage for Intel and its users, so good on 'em.

-----
bonna-wana jeebi-wah?

Quote by Fat_Bodybuilder
MY TAG DOESN'T SAY THAT THERE IS ANYTHING WRONG WITH AMD CPUs IT JUST IS A QUOTE BY ANOTHER MEMBER CAN YOU GRASP THAT??! HIS IS DIRECTLY OFFENSIVE IN AN OFFENSIVE WAY

Shikimaru 
11/8/08 8:05:28 PM
Titan

/shiks sips the wine of innovation.

"It's good stuff!"

-----
Quote by battlefield_gir
i am ashamed.



fliptopia 
11/8/08 10:26:14 PM
Hero
Guru


Interesting ideas here, but I want to see some results. It sounds like it could be a potential failure or a resounding success. Or maybe both, depending on the application.

-----

Athiril 
11/8/08 11:50:54 PM
Titan

No nesquick, CUDA is win.

Look up RapiHD and Photoshop CS4; they'll be here in two months, and they'll be running on available hardware.

Larrabee won't be.

I want to do heavy editing now, not in two years' time.

-----
North Coast NSW Photo Community Forums
http://photodan.com.au/forums/

kunzie 
12/8/08 1:02:01 AM
Titan

http://backoffice.ajb.com.au/images/features/Slidescaling.jpeg

Absolutely useless marketing trash.

-----
http://www.youtube.com/watch?v=WF4-rN14zEc

layzi 
12/8/08 6:06:58 AM
Serf
It's a brave move by Intel, entering the realm of GPUs and competing with the giants, I must say. But blowing them out of the water? Pffft, I'd like to see that. Give Larrabee some time to evolve, then it might be possible.

-----
future gaming=realism=physics processing=nvidia.

Linux_Inside V2 
12/8/08 8:43:23 AM
Immortal

Quote by kunzie
http://backoffice.ajb.com.au/images/features/Slidescaling.jpeg

Absolutely useless marketing trash.



Lol, I love it. You can make graphs work in your favour when one half of it is unquantifiable.

-----

xguntherc 
13/8/08 11:08:58 AM
Overlord

We don't want the GPU as we know it to end; we need it here, for the prices of course.

-----
eVGA 750i FTW- -Q6700 GO @3.4Ghz w/Tuniq- -eVGA GTX 260- -2x2GB G-Skill- -Seagate Cuda 500GB 32mb 7200.(11)+ 500GB Ext- -ASUS 20x ODD- -Samsung 226BW 22" WS + Dell 19" 1905FP- -Bose Companion 3-

http://tinyurl.com/3DMark06


17747114553 
13/8/08 8:17:01 PM
Primarch
Intel has hot Asian chicks. There's no way Larrabee will be anything less than totally awesome.

-----

osama_bin_athlon 
20/8/08 3:40:55 PM
Hero
Immortal


Extreme(R) Graphics?

-----
....it's called a 'snuke'.
