The new 5750 and 5770 launched today are lined up against the 4830 and 4850, respectively. They look to be about half the die size of the 4850/4870, which means AMD will be making a healthy profit on them if they get the same prices (or more, as they currently are) for the 5-series cards. I would expect that within six months we'll see the 5770 at $149 and the 5750 at $99, which would completely replace the 4xxx series. If the lineup moves to 5750/$99-$119, 5770/$149, 5850/$249, 5870/$349, which makes sense, that leaves a pretty glaring hole between $149 and $249. Makes you wonder if we'll see a 5830-type card packing DDR3 at the $200 price point. That would probably make a lot of sense.
It will also be very interesting to see what the 5600 series cards look like next year, but with the price points and performance levels already covered it is hard to believe there will be more than two 56xx cards: something with DDR2/3 and a 64-bit interface on a slower core, and a speedier part with DDR3 and a 128-bit interface. My guess is the core will be basically half of the 5750, which would follow the trend AMD is setting, as that part is half of the 5850. The weird thing is that AMD is still talking like they are going to launch four lines of cards (and this is the second launch wave) - does that mean an uber-low-end part, or are they referring to the inevitable X2 launches? Time will tell 🙂
With the X2 launches we might see the 5850 drop to $225 and the 5870 drop to $325, which would also help AMD close the gaps they have at the $200 and $300 price points. That would leave precious few places for Nvidia to make an impact unless they really drop the prices on their GTX 260+ line of cards, and even then the AMD cards hold the distinction of being DX11 parts. While that is probably just a checkbox at this point in time, one with limited if any relevance in the near term, if the cards from both chip makers perform equivalently it would be a safer bet to take the newest card with the most features, which again leaves Nvidia at a disadvantage.