192-bit or 256-bit

7 replies to this topic
Jewstinian
  • Jewstinian

    Entitled Asshat

  • Members
  • Joined: 29 Jan 2011
  • United-States

#1

Posted 23 November 2012 - 08:28 AM

So, a couple of days ago I purchased a GTX 660 Ti. I was very excited until I read somewhere that the 192-bit memory bus is going to limit the card's potential and won't stay futureproof for long. This worried me, since I had the option to go for a GTX 670, which has almost the same specs except for a larger memory bus: 256-bit. I don't know if I should return my 660 Ti when it arrives and purchase a 670, or just keep it. I hate being spoiled when it comes to these things, but I want to get the best future out of my gaming and not have to worry about purchasing a new video card often.


I appreciate the help.

OysterBarron
  • OysterBarron

    You need a Pearl necklace? Hit me up ;)

  • Zaibatsu
  • Joined: 17 Dec 2004
  • United-Kingdom
  • April Fools Winner 2015

#2

Posted 23 November 2012 - 09:24 AM

My new card is 356-bit and my old card was 256-bit, and to be honest I haven't noticed a difference, really. Try hwcompare.com and compare the two.

OysterBarron
  • OysterBarron

    You need a Pearl necklace? Hit me up ;)

  • Zaibatsu
  • Joined: 17 Dec 2004
  • United-Kingdom
  • April Fools Winner 2015

#3

Posted 23 November 2012 - 09:54 AM

http://www.hwcompare...eforce-gtx-670/

Oops, double post. Don't know how this happened; I only wanted to edit.

Shou
  • Shou

    y

  • The Yardies
  • Joined: 09 Jul 2012
  • Jamaica
  • Best Signature 2013
    Best Signature 2012

#4

Posted 23 November 2012 - 12:17 PM Edited by shoumic, 23 November 2012 - 12:33 PM.

Can you afford to wait, Justin? Because the new generation of cards should be announced soon. And the larger memory bus will help you when playing at higher resolutions and on multiple monitors, so it depends on your preferences. Do you play at resolutions over 1920x1080, or with multiple monitors? If yes, go for the GTX 670 or 680; if not, you should be fine. It's your choice at the end of the day.

I would spend the extra $80, because I'm already spending around $300; why not a little more?

yojc
  • yojc

    ~y

  • Andolini Mafia Family
  • Joined: 06 Dec 2008
  • Poland
  • Best Poster [Tech] 2014
    Most Knowledgeable [Tech] 2013
    Helpfulness Award

#5

Posted 23 November 2012 - 12:33 PM

It'll get bottlenecked only under heavy memory load. Otherwise, it'll perform close to the GTX 670.

Stinky12
  • Stinky12

    No title

  • Members
  • Joined: 14 Oct 2010

#6

Posted 23 November 2012 - 02:27 PM

My old card was 512-bit, but its other specs were lacking, so it wasn't the best card on the market.

Jewstinian
  • Jewstinian

    Entitled Asshat

  • Members
  • Joined: 29 Jan 2011
  • United-States

#7

Posted 23 November 2012 - 08:40 PM

Thanks for the help, again everyone.


The only reason I want to purchase a graphics card now is that I won't have the funds to later; I like to call this time of year "the spending season" because, for some reason, I always end up empty-pocketed by the end of the year. Any suggestions on how it's truly going to affect my performance in later years?

Shou
  • Shou

    y

  • The Yardies
  • Joined: 09 Jul 2012
  • Jamaica
  • Best Signature 2013
    Best Signature 2012

#8

Posted 24 November 2012 - 12:30 PM Edited by shoumic, 24 November 2012 - 12:33 PM.

If you want to be futureproof, always go for the better card. So the GTX 670 would be my answer.

Picked this up from a website.

QUOTE
So will memory bandwidth amount to any real differences in video game performance? Most PC gamers have a PCI-Express 2.0 compatible motherboard inside their computer system, although the most recent motherboards and all of NVIDIA's GeForce GTX 600-series feature PCI-Express 3.0 compliance. Additionally, nearly all high-performance video cards feature at least 256-bit memory bandwidth or better (such as 384-bit with some AMD Radeon HD 7000-series graphics cards). However, most testing with these high-end graphics cards has shown little indication that bottlenecks actually occur at the PCI-Express level, even while playing the most demanding DirectX 11 video games. It's more likely that a bandwidth bottleneck might occur at the video card's memory subsystem, where the GPU may be capable of sending more information than the video frame buffer can accept. We discovered some evidence of this in our recent testing on the ultra-overclocked ASUS GeForce GTX 660Ti DirectCU-II TOP, which maintained performance levels with a stock GTX 670 in all but the most demanding video games that featured large maps or virtual worlds.

Obviously the GPU plays the star role in creating a bottleneck, and while less powerful processors lack the raw number of transactions to saturate the memory subsystem this isn't so difficult with powerful Kepler-based GeForce GTX 600-series products. Because there are times when GeForce GTX 660 Ti's 192-bit memory bandwidth can become bottlenecked, what would happen if we split demand between two cards connected together into an SLI set? In theory, by combining two graphics cards together with SLI technology we're essentially doubling the memory bandwidth available to standard workloads. Given that 256-bit memory configurations suffer very few bandwidth bottlenecks while 192-bit can have occasional limitations, it seems plausible. Furthermore, the results could really help gamers decide on creating an SLI set of either GTX 660 Ti or GTX 670 graphics cards.


Since the GTX 660 Ti is already bottlenecked now, it will struggle more later, making it less futureproof.
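To put rough numbers on the bus-width difference: peak memory bandwidth is just the bus width (in bytes) multiplied by the effective memory clock. A quick sketch, assuming the reference 6008 MHz effective GDDR5 clock both cards shipped with (factory-overclocked boards will differ):

```python
# Peak memory bandwidth = (bus width / 8 bytes) * effective memory clock.
# 6008 MHz effective is the reference GDDR5 spec for both cards (assumption:
# stock clocks; vendor-overclocked models will vary).

def peak_bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

gtx_660_ti = peak_bandwidth_gbps(192, 6008)  # 192-bit bus
gtx_670 = peak_bandwidth_gbps(256, 6008)     # 256-bit bus

print(f"GTX 660 Ti: {gtx_660_ti:.1f} GB/s")   # ~144.2 GB/s
print(f"GTX 670:    {gtx_670:.1f} GB/s")      # ~192.3 GB/s
print(f"Difference: {(gtx_670 / gtx_660_ti - 1) * 100:.0f}%")
```

So at the same memory clock, the 256-bit card has about a third more headroom before the memory subsystem saturates, which is the bottleneck the quote above describes.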



