Orpheus
Orpheus said: What would you need that for?
Check this shit out..
 
                                
                
             
Crono said: so..... for showing off (the design)?
It is pretty cool.
The Bulldozer chips are actually out now, and still cheaper than i7s. Newegg has one of the FX chips listed at $220, and it's an 8-core CPU.
Sadly, they're still very new, so there's not a lot of software out there yet that uses all the cores and features and what-not. But that'll change.
It's not that you NEED to overclock these CPUs to that degree... it's that they can do it, and that's neat. It shows the quality of engineering put into the thing: that it can stably handle those speeds.
No one is going to practically try to overclock their CPU to these speeds for any sort of normal use.
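The point about software not yet using all eight cores can be sketched in a few lines. A minimal Python example (the workload and function names are my own toy choices, not anything from this thread): the same CPU-bound work run serially on one core versus fanned out across every core the OS reports.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def busy_sum(n):
    # Toy CPU-bound workload: sum of squares below n.
    return sum(i * i for i in range(n))

def serial(chunks):
    # One core does everything, no matter how many cores exist.
    return [busy_sum(n) for n in chunks]

def parallel(chunks):
    # Spread the chunks across all available cores.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(busy_sum, chunks))

if __name__ == "__main__":
    work = [50_000] * 8  # eight chunks, one per core on an 8-core FX chip
    assert serial(work) == parallel(work)
    print(os.cpu_count(), "cores available")
```

Software written with only one or two threads in mind gets no benefit from the extra six cores, which is why a cheap 8-core chip didn't immediately beat an i7 in 2011-era applications.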
 
                                
                
Orpheus
tornados2111 said: Pretty much.
so..... for showing off (the design)?
 
                                
                
Orpheus
Crono said: What do you mean by CPU modeling GPU behavior?
Well, they are magazine editors, not necessarily technical professionals.
The rule of thumb is that speed grows exponentially. But there is a ceiling. Then again, we probably won't reach it for a while. I imagine CPUs will start modeling GPU behavior in some ways before long (beyond outright having GPU cores inside them).
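The "speed is exponential in growth" rule of thumb is essentially Moore's law. As a rough worked example (the 2-year doubling period is the commonly cited figure, not something stated in this thread):

```python
def growth(years, doubling_period=2.0):
    # Exponential rule of thumb: capability doubles every doubling_period years.
    return 2 ** (years / doubling_period)

print(growth(10))  # 32.0: ten years at a 2-year doubling is about a 32x gain
```

The "ceiling" mentioned above is why the curve can't continue forever: at some point physical limits stop the doubling, no matter how good the engineering is.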
 
                                
                
 
omegaslayer
tornados2111 said: Generally speaking, graphics cards can process floating-point numbers (decimal numbers like 1.67) more efficiently than CPUs do (integer numbers, e.g. 7, 78, 9, -7, are fast either way). Likewise, there are certain graphics operations that can be done more efficiently on GPUs than on CPUs (e.g. vector operations and matrix transformations, the heart of graphics rendering).
What do you mean by CPU modeling GPU behavior?
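A minimal illustration of the matrix operations mentioned above (pure Python, my own toy example, not how a real GPU is programmed): translating a 2D point with a 3x3 homogeneous-coordinate matrix, the kind of operation a GPU performs in bulk for every vertex in a scene.

```python
def mat_vec(m, v):
    # Multiply a 3x3 matrix by a 3-component column vector.
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def translation(dx, dy):
    # Homogeneous 2D translation matrix: shifts a point by (dx, dy).
    return [[1, 0, dx],
            [0, 1, dy],
            [0, 0, 1]]

# The point (2, 3) in homogeneous coordinates, moved by (5, -1).
point = [2, 3, 1]
moved = mat_vec(translation(5, -1), point)
print(moved)  # [7, 2, 1]
```

Each output component is a small dot product, and every vertex gets the same treatment independently, which is exactly the shape of work GPUs are built to do faster than general-purpose CPUs.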
 
                                
                
             
omegaslayer
"Crono" said: I don't believe we'll ever stop needing a discrete graphics card. I just know that's what Intel is pushing for.
You still will. You can't really substitute for the amount of work you can offload onto another processor.
"Crono" said: K
In any case, the GPUs prior to the GF8 or so were vector processors, which is different from "doing vector calculations really well": they actually operated on three or four values each cycle.
 
                                
                
omegaslayer said: I'm aware. How else are they going to keep pushing forward against AMD in that realm if they don't?
I don't believe we'll ever stop needing a discrete graphics card. I just know that's what Intel is pushing for.