My girlfriend plays on the Xbox. We had a little dinky television and she was sick of playing games on it. So she went out and bought a 42" LCD and loves it.
I know this is a completely newbish question, especially for someone like me who's played games all my life on PCs and consoles dating back to the Mac Plus and the Mattel Intellivision! But what is it about consoles that lets you play on a larger television and suffer no performance loss? Aren't her Xbox and my PC fundamentally the same? You have a processor, a graphics card, RAM. I know her console is dedicated solely to gaming while my PC is a jack-of-all-trades, but there must be more to it than that when it comes to performance. She's experienced no performance loss even though her new television screen is three or four times bigger than the old one. Meanwhile, when I moved to a larger monitor I easily lost 10 or 15 fps. And it's not as if her game is displayed at a lower resolution, either; it doesn't look worse than it did on the old TV, but much better.
So can anyone explain (in layman's terms?) how rendering differs between console-to-television and PC-to-monitor in such a way that I'm screwed until I get a better GPU, while she could literally play her game on the big screen at Times Square and still get her 30 fps?