
How Quickly Do Video Cards Go Out of Date?



astrostu
12-13-2012, 01:29 PM
May seem like a silly question, but I thought I'd ask. I built my first PC back in April, and the video card works fine. But I have it hooked up to a 21" monitor and am looking to upgrade, since I do a lot of mapping work and am eyeing something in the 27", 2560x1440 class. My graphics card doesn't have ports that support most of the new 27-inchers.

So I went onto NewEgg and looked up my current card just to see roughly what it was going for now. It was $140 when I bought it; now, 8 months later, it's discontinued.

CPUs tend to have a roughly 1-2 year product cycle with Intel, so this got me wondering what the typical refresh rate is (no, not the kind on screens ...) for graphics cards. Is there a good answer?

zburns
12-13-2012, 02:12 PM
There are probably a number of reasons a company would discontinue a graphics card. One is that they have an improved version and want the earlier version off the market -- they want to sell the new version, not the old one. They could have several reasons for doing that: better performance (and therefore better advertising opportunities); essentially the same card, but cheaper to build because of chips that allow faster assembly at lower parts and labor cost; moving manufacturing to China as opposed to existing assembly in the US; or improved capability, such as 'the old card works on a 21-inch monitor but not on a 24-inch, 27-inch, etc.' -- many reasons.

Lots of reasons. They also have to put together a decent number of the same card at one time, because the setup time to make just a few cards is about the same as the setup time to make a hundred. The company's manufacturing cost is a driving force in determining whether a product (e.g., a video card) stays on the market; products like video cards have to be manufactured and sold in some decent quantity to make a net profit and therefore remain on the market -- otherwise the company discontinues the card for lack of profitability. There isn't just one reason a product is pulled from the market; there can be lots of different ones.
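
To put rough numbers on that setup-cost point, here's a tiny back-of-the-envelope sketch in Python. The setup cost, per-card cost, and selling price below are made-up figures, purely to show why a card that only moves in small volume stops being worth keeping on the market:

# Rough illustration of why low-volume cards get pulled: a fixed setup
# cost has to be spread over every unit in a production run.
# All of the dollar figures here are made up for this example.

def per_unit_cost(setup_cost, unit_cost, units):
    """Total cost per card when a fixed setup cost is amortized over a run."""
    return (setup_cost + unit_cost * units) / units

SETUP = 50_000   # hypothetical one-time cost to set up a production run ($)
UNIT = 60        # hypothetical parts + labor per card ($)
PRICE = 140      # hypothetical selling price per card ($)

for run_size in (100, 1_000, 10_000):
    cost = per_unit_cost(SETUP, UNIT, run_size)
    print(f"run of {run_size:>6}: ${cost:7.2f}/card, margin ${PRICE - cost:8.2f}/card")

With these invented numbers, a run of 100 cards loses money on every unit, a run of 1,000 barely breaks even, and only the 10,000-card run is clearly profitable -- which is the gist of why a card that no longer sells in volume gets discontinued.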

RickyTick
12-13-2012, 06:35 PM
Interesting question.

It's a constant race between AMD and Nvidia. They usually release a new architecture, tied to a new fabrication process, every 12 to 18 months. Within each architecture there will be several variations designed for different markets. For example, Nvidia targets three separate markets: mainstream, performance, and enthusiast. Obviously, these represent different levels of performance as well as different price ranges. So, like any modern technology, as soon as an update is possible and makes sense financially, these manufacturers will change their product line-up with new designs and crank up the marketing campaign. At that point, all older product instantly becomes "obsolete", and retailers start unloading it as fast as possible.

In a nutshell, it's really all marketing, and is driven by supply and demand.

astrostu
12-14-2012, 01:16 PM
Thanks for the info. Maybe in my case it was more that I bought it towards the end of its cycle (though it is 8 months later now ...). As a mostly Mac guy, I should've known better and checked the release date.

JeffAHayes19
01-23-2013, 12:28 AM
Without even looking, I'm certain the ATI HD 4670 video card I put in my build in the summer of 2009 isn't on the market any longer... BUT when I bought it, I made sure I got a card with a pretty fast clock and 2 GB of GDDR5 RAM, and it came with specs saying it supports resolutions up to 2560 x 1600 and up to two monitors. Since I'm getting into digital photography in a bigger way, and recently bought Lightroom and the guts to upgrade what's in my current case to a 3770 system with 32 GB of RAM, I figure I may get to find out if I can swing something like the Dell U3011 monitor -- I already bought the Spyder4Pro monitor calibrator.

Of course my new motherboard and CPU come with what looks like halfway decent integrated graphics and built-in HDMI and DisplayPort, but I'll still be using that 4670, with thoughts in the back of my mind that somewhere down the road I MAY replace it with something like a 7950. The 30" monitor for photo editing will come first, though (27" if I have to "settle").

From what I've seen, some of the bigger, better cards seem to have a longer shelf life; the lower-end and mid-range cards are the ones that seem to get replaced more often. Then again, I don't really keep abreast of it that much. When I finish my new build (hopefully this week -- I've had the parts for A WHILE, but have been otherwise occupied), I'll still be using my current HP w2408h monitor for some time, and will install that Spyder and see how well I can calibrate it, for one thing. Meantime, I'm waiting to see if all these somewhat "long-in-the-tooth" 30" monitors get newer versions. They all have USB 2.0 ports on them now, which definitely SHOWS their age! Only a COUPLE of the 27" 2560 x 1440s have USB 3.0 on 'em, but they don't cover as much of the Adobe RGB color space (like the new Dell U2713).

Yaknow, you could drive yourself crazy trying to keep up with all this stuff and figure it out! :)
Jeff