Difference between monitors and TVs?

Post » Mon May 14, 2012 7:11 pm

A few weeks ago I got a larger monitor but didn't like the frame-rate drop in Skyrim and other games so I took it back and am back to my 17" screen.

My girlfriend plays on the Xbox. We had a little dinky television and she was sick of playing games on it. So she went out and bought a 42" LCD and loves it.

I know this is a completely newbish question, especially for someone like me who's played games all my life on PCs and consoles dating back to the Mac Plus and the Mattel Intellivision! But what is it about consoles that allows you to play on a larger television and suffer no performance loss? Aren't her Xbox and my PC fundamentally the same? You have a processor, a graphics card, RAM. I know her console is dedicated only to gaming while my PC is a jack-of-all-trades, but there must be more to it when it comes to performance. She's experienced no performance loss even though her new television screen is three or four times bigger than the last. Meanwhile, when I went to the larger monitor I easily lost 10 or 15 fps. And it's not as if her game is displayed at a lower resolution either; it doesn't look worse than on the old TV, but much better.

So can anyone explain (in layman's terms?) how rendering differs between console-to-television vs. PC-to-monitor in such a way that I'm screwed until I get a better GPU, while she could literally play her game on the large screen at Times Square and still get her 30 fps?
cutiecute
 
Posts: 3432
Joined: Wed Sep 27, 2006 9:51 am

Post » Mon May 14, 2012 3:56 pm

A larger monitor usually means a higher resolution, which will reduce your FPS, so it all depends on what graphics card you've got.
noa zarfati
 
Posts: 3410
Joined: Sun Apr 15, 2007 5:54 am

Post » Mon May 14, 2012 8:17 pm

In simple terms... Skyrim was developed on the Xbox and ported to PC and PS3. That's why. Skyrim is optimized for the Xbox, so you can play it on a large screen, but the PC and PS3 versions aren't optimized for it. It's that simple. You could try toying around with the settings and see if that gets any better, but I doubt it, unless you've got a bad graphics card.
Sophh
 
Posts: 3381
Joined: Tue Aug 08, 2006 11:58 pm

Post » Mon May 14, 2012 6:12 pm

Not only that, but even though high-definition televisions have bigger screens than monitors, they usually have lower resolutions.
Susan
 
Posts: 3536
Joined: Sun Jun 25, 2006 2:46 am

Post » Mon May 14, 2012 6:20 pm

First, a monitor is just a TV that does not have a tuner. A PC monitor is specific in that it takes a signal from a video card and rasterizes it, or displays it as pixels. There are many ways for this to happen, and they are not important for this thread.

Now, an HD TV is basically a 1920 by 1080 resolution monitor. 360 games are upscaled from 1280 by 720.

So the Xbox is rendering at a smaller resolution, but it is displayed at a higher resolution. On a PC, you need a high-horsepower video card to run a game like Skyrim at 1920 by 1080, no matter what kind of monitor you put it on.
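
To put rough numbers on that (plain pixel counting, nothing game-specific):

    # Why rendering at 720p is cheaper than 1080p, no matter how big
    # the screen is -- the console's per-frame workload never changes.
    console_res = (1280, 720)    # what the 360 actually renders
    display_res = (1920, 1080)   # what the TV shows after upscaling

    console_pixels = console_res[0] * console_res[1]   # 921,600
    display_pixels = display_res[0] * display_res[1]   # 2,073,600

    print(display_pixels / console_pixels)   # 2.25, i.e. 2.25x the pixels

So a PC at full HD pushes 2.25 times the pixels every frame, while the console's work stays the same whether the TV is 19" or 42".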
Soph
 
Posts: 3499
Joined: Fri Oct 13, 2006 8:24 am

Post » Mon May 14, 2012 2:12 pm

Yeah, there are multiple issues here.

Screen resolution is one. A larger resolution means you need more graphical horsepower to drive it. But a 20" monitor at 1920x1080 and a 42" TV at 1920x1080 are the same resolution; it's just displayed over a larger space. (I'd guess that your 17" screen has a lower resolution, which is why you gained performance by going back to it.)
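
If you want to put numbers on that guess, here's a very rough model; estimated_fps is a made-up helper, and it assumes the GPU is the bottleneck and that work scales with pixel count, which real games only loosely follow:

    # Hypothetical helper: estimate FPS after a resolution change,
    # assuming FPS scales inversely with pixels rendered (GPU-bound).
    def estimated_fps(base_fps, base_res, new_res):
        base_pixels = base_res[0] * base_res[1]
        new_pixels = new_res[0] * new_res[1]
        return base_fps * base_pixels / new_pixels

    # e.g. 40 FPS on a 17" screen (assuming 1280x1024), moving to 1920x1080:
    print(round(estimated_fps(40, (1280, 1024), (1920, 1080))))  # -> 25

A drop from 40 to around 25 is right in the 10-15 FPS range the original poster described.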


Console & TV vs PC & Monitor isn't really an issue, because you can hook up a console to a monitor and a PC to a TV, as long as they have the right connectors. (I currently have my PS3 hooked up to my PC's monitor, since it had a spare HDMI connector. And I don't have an HD TV.)
Kortniie Dumont
 
Posts: 3428
Joined: Wed Jan 10, 2007 7:50 pm

Post » Mon May 14, 2012 6:00 pm

It is not using 1080p. Skyrim, except maybe on the PS3 (which may not either), is not even running at 720p. Oblivion on the 360 ran at a lower resolution than 720p, whereas the PS3 version ran at 720p.

If she has a 1080p TV she is only using half (again, possibly less) of what the TV is capable of, and the better the screen, the more apparent that becomes. But since you sit 5-10 ft away, you cannot recognize as many flaws in the graphics as you would sitting at a monitor upscaling from 720p to 1080i. Also, the 360/PS3 are locked at 30 FPS, a noticeable difference for those who are used to something else, say 60; once you're used to the higher rate you recognize the difference automatically, and to some people 30 FPS looks like a slideshow.
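
The 30 vs. 60 FPS difference is easier to see as frame times; this is plain arithmetic, nothing console-specific:

    # Frame time is what you actually feel: the gap between frames.
    for fps in (30, 60):
        print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")
    # 30 FPS = 33.3 ms per frame
    # 60 FPS = 16.7 ms per frame

Each frame sits on screen twice as long at 30 FPS, which is why it reads like a slideshow to people used to 60.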
Wayne W
 
Posts: 3482
Joined: Sun Jun 17, 2007 5:49 am

Post » Mon May 14, 2012 10:05 pm

I know her console is dedicated only to gaming while my PC is a jack-of-all-trades, but there must be more to it when it comes to performance.

So I gather from what people are saying that my above quote is wrong. It really is just that simple.
GRAEME
 
Posts: 3363
Joined: Sat May 19, 2007 2:48 am

Post » Mon May 14, 2012 10:02 pm

I'm in the process of building mine, so I've just got a question about the specific card I'm looking at.

(GPU: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130738)

Hypothetically, I shouldn't lose any frames running this card, right?

I'll be running a 32" Samsung 720p widescreen over HDMI to the GPU.
Lakyn Ellery
 
Posts: 3447
Joined: Sat Jan 27, 2007 1:02 pm

Post » Mon May 14, 2012 12:30 pm

In simple terms... Skyrim was developed on the Xbox and ported to PC and PS3. That's why. Skyrim is optimized for the Xbox, so you can play it on a large screen, but the PC and PS3 versions aren't optimized for it. It's that simple. You could try toying around with the settings and see if that gets any better, but I doubt it, unless you've got a bad graphics card.

Close, but not totally accurate. The main problem with the PS3 has been that it has less system memory than the Xbox 360: the 360 has 512 MB shared, while the PS3 has 256 MB main memory and 256 MB for graphics. Because Skyrim was designed for the Xbox first, it sometimes used more than 256 MB, causing problems.

Now over to the PC: with an old, small monitor you were running a low resolution, while a new, large one will do 1920x1080, and that is more work for the game. My advice is either to reduce the graphics settings or set the game to run at a lower resolution.
Test it out and see what you prefer.
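
If you'd rather not go through the launcher, the render resolution also lives in SkyrimPrefs.ini under [Display]; the values below are just an example, pick whatever your card is comfortable with:

    [Display]
    iSize W=1600
    iSize H=900

Lowering those two numbers cuts the pixel count the card has to push without touching any other setting.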

Also, if you have the high-resolution texture pack, disable it; it significantly increases the system demands.

Now for the Xbox and resolution: it does not run at full HD (1920x1080), but rather at around 1280x720, and the image is stretched to fit. This works better on a TV than on a monitor, because the TV is designed to show multiple resolutions, while the monitor has much sharper pixels and you sit closer to it.

The Xbox is much slower than modern PCs. Yes, it's more efficient, since it doesn't run a full OS or have to adjust for different hardware, but most PCs easily overcome that advantage.
Have you considered upgrading your system?
gemma king
 
Posts: 3523
Joined: Fri Feb 09, 2007 12:11 pm

Post » Mon May 14, 2012 8:36 pm

So I gather from what people are saying that my above quote is wrong. It really is just that simple.

That's why high-end GPUs aren't cheap, unfortunately.
CxvIII
 
Posts: 3329
Joined: Wed Sep 06, 2006 10:35 pm

Post » Mon May 14, 2012 1:48 pm

It is really a matter of resolution; once the game signal leaves the box, it is handled by the TV or the monitor in much the same way. How it gets there, and what kind of cable or input it uses, is really irrelevant.

Your PC is playing at about twice the resolution of a console in pixel count (2.25x, going from 1280x720 to 1920x1080). Also, because it is playing at a higher resolution, the CPU has to handle more to support that resolution. It's not just graphics.
SexyPimpAss
 
Posts: 3416
Joined: Wed Nov 15, 2006 9:24 am

Post » Mon May 14, 2012 4:43 pm

That's why high-end GPUs aren't cheap, unfortunately.

Fortunately, unless you are playing above 1920x1080, a mid-level GPU is all you need these days, such as a GTX 560 or an HD 6950. Heck, I have seen Skyrim/New Vegas run just fine on lesser cards.
Caroline flitcroft
 
Posts: 3412
Joined: Sat Nov 25, 2006 7:05 am

Post » Mon May 14, 2012 5:54 pm

I'm in the process of building mine, so I've just got a question about the specific card I'm looking at.

(GPU: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130738)

Hypothetically, I shouldn't lose any frames running this card, right?

I'll be running a 32" Samsung 720p widescreen over HDMI to the GPU.
You will be able to run Skyrim on max with the high texture pack.

Now, your monitor is not full HD; you are running a laptop-class resolution. The card, however, should manage full HD with no problem.
Screen size does not matter, just resolution. That said, most TVs below 34" do not do full HD, and most PC monitors below 24" do not either; above that you usually get full HD, except on budget models.
Elina
 
Posts: 3411
Joined: Wed Jun 21, 2006 10:09 pm

Post » Mon May 14, 2012 10:45 pm

Close, but not totally accurate. The main problem with the PS3 has been that it has less system memory than the Xbox 360: the 360 has 512 MB shared, while the PS3 has 256 MB main memory and 256 MB for graphics. Because Skyrim was designed for the Xbox first, it sometimes used more than 256 MB, causing problems.

Now over to the PC: with an old, small monitor you were running a low resolution, while a new, large one will do 1920x1080, and that is more work for the game. My advice is either to reduce the graphics settings or set the game to run at a lower resolution.
Test it out and see what you prefer.


It has more to do with the PS3's CPU architecture than the RAM. Both systems have to use some of that RAM to run an OS; it's not small, and they both have one. The PS3, however, reserves RAM and dedicates half of it to running itself while the other half is for games, so both systems end up leaving 256 MB of RAM free for developers and their games. Graphics on these consoles depend mainly on the CPU, because the CPUs of the PS3 and 360 are the only decent parts of the systems by now, which is why you get CPU-dependent titles that favor the CPU over the GPU.
Laura Shipley
 
Posts: 3564
Joined: Thu Oct 26, 2006 4:47 am

Post » Mon May 14, 2012 1:22 pm

Well, if his CPU meets the minimum requirements for the game and he has enough memory, then going up in resolution comes down to the graphics card or chip. And if you buy a bigger monitor nowadays it will be a flat screen, and those run at their native resolutions; if you run them at non-native resolutions the display looks awful!
Alyesha Neufeld
 
Posts: 3421
Joined: Fri Jan 19, 2007 10:45 am

Post » Tue May 15, 2012 12:53 am

Fortunately, unless you are playing above 1920x1080, a mid-level GPU is all you need these days, such as a GTX 560 or an HD 6950. Heck, I have seen Skyrim/New Vegas run just fine on lesser cards.

True, an $800 laptop did run Oblivion really well, though I would not try Skyrim with the HD texture pack on it; default Skyrim should work well. But I had to give it back :o(
Micah Judaeah
 
Posts: 3443
Joined: Tue Oct 24, 2006 6:22 pm

Post » Mon May 14, 2012 6:53 pm

Fortunately, unless you are playing above 1920x1080, a mid-level GPU is all you need these days, such as a GTX 560 or an HD 6950. Heck, I have seen Skyrim/New Vegas run just fine on lesser cards.

Sorry, I disagree. I've always bought a high-end single GPU, and I'm gaming on a 1920-res Samsung. The better the card, the better the performance; depending on whether you want to run everything on full, that is.
Laura Cartwright
 
Posts: 3483
Joined: Mon Sep 25, 2006 6:12 pm

Post » Mon May 14, 2012 10:00 am

how rendering differs between console-to-television vs. PC-to-monitor in such a way that I'm screwed until I get a better GPU, while she could literally play her game on the large screen at Times Square and still get her 30 fps?
There is no difference between consoles and PCs at all.

The difference lies in the monitor/TV. All flat displays suffer from display lag (http://en.wikipedia.org/wiki/Display_lag); sometimes it's high enough for a person to notice, and sometimes not. So what this means is that you bought a crappy monitor with high input lag, nothing else. I recommend you do some research on a monitor before you buy it.

Well, it could also mean that your graphics card is too weak to run properly at the monitor's native resolution. If so, just run at a lower resolution, or buy a more powerful graphics card.
k a t e
 
Posts: 3378
Joined: Fri Jan 19, 2007 9:00 am

Post » Mon May 14, 2012 6:07 pm

There is no difference between consoles and PCs at all.

The difference lies in the monitor/TV. All flat displays suffer from display lag (http://en.wikipedia.org/wiki/Display_lag); sometimes it's high enough for a person to notice, and sometimes not. So what this means is that you bought a crappy monitor with high input lag, nothing else. I recommend you do some research on a monitor before you buy it.
I don't know about other HDTVs, but my Samsung has the option to put it into "Gaming mode" which removes all noticeable lag at the cost of some image quality.

Oh, and I play Skyrim at 1920x1080 on my 40" screen at a normal 57 FPS, with a texture pack and 16x AA. Except in Markarth... what is it about Markarth that drives down FPS?
Invasion's
 
Posts: 3546
Joined: Fri Aug 18, 2006 6:09 pm

Post » Mon May 14, 2012 10:02 pm

I'm in the process of building mine, so I've just got a question about the specific card I'm looking at.

(GPU: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130738)

Hypothetically, I shouldn't lose any frames running this card, right?

I'll be running a 32" Samsung 720p widescreen over HDMI to the GPU.

You can actually still lose fps running a very low resolution with a high-power GPU, but you'll still be getting mad fps. If you were running 1080p instead of 720p [1366x768], I'd say get a card with 2 GB of VRAM.
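
For a rough sense of why 1080p pushes VRAM (framebuffer_mb is a made-up helper, the buffer count is a guess, and textures dominate real VRAM use, so treat this as a floor, not a total):

    # Back-of-the-envelope framebuffer cost: 4 bytes per pixel (RGBA)
    # times a guessed number of render targets.
    def framebuffer_mb(width, height, buffers=3):
        return width * height * 4 * buffers / 1024**2

    for w, h in ((1366, 768), (1920, 1080)):
        print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB")
    # 1366x768: ~12 MB
    # 1920x1080: ~24 MB

The buffers themselves are small; it's the textures loaded on top (especially the HD pack) that eat the rest of a 1-2 GB card, and AA multiplies the render-target cost further.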
Jani Eayon
 
Posts: 3435
Joined: Sun Mar 25, 2007 12:19 pm

Post » Mon May 14, 2012 11:11 am

Sorry, I disagree. I've always bought a high-end single GPU, and I'm gaming on a 1920-res Samsung. The better the card, the better the performance; depending on whether you want to run everything on full, that is.

But even with that argument, you can max out games such as FO:NV/FO3, Oblivion, and Skyrim with a fairly weak GPU. If a game is an exclusive, yes, it requires a better GPU, like The Witcher 2 or BF3, both the best on the open market currently; ARMA is also a resource hog that needs RAM more than anything, because you render everything. But when it comes to the majority of multiplatform games, it takes very little: my 9600 could run Oblivion with everything on very high at 4x AA at a stable 40-50 FPS, same with FO3, whereas Crysis on high with 2x AA ran at 25-30 FPS.
Breanna Van Dijk
 
Posts: 3384
Joined: Mon Mar 12, 2007 2:18 pm

Post » Mon May 14, 2012 5:17 pm

You can actually still lose fps running a very low resolution with a high-power GPU, but you'll still be getting mad fps. If you were running 1080p instead of 720p [1366x768], I'd say get a card with 2 GB of VRAM.

Doesn't Skyrim run at 720p native res, though?

720p is not a low resolution. 360p is. :S
Jerry Cox
 
Posts: 3409
Joined: Wed Oct 10, 2007 1:21 pm

Post » Mon May 14, 2012 6:10 pm

Doesn't Skyrim run at 720p native res, though?

720p is not a low resolution. 360p is. :S
On console it does, but on PC it runs at whatever resolution you tell it to.
NAkeshIa BENNETT
 
Posts: 3519
Joined: Fri Jun 16, 2006 12:23 pm

Post » Mon May 14, 2012 11:26 pm

On console it does, but on PC it runs at whatever resolution you tell it to.

A higher res doesn't necessarily improve overall definition, though. Right?

I'm new to the techy stuff. (:
Craig Martin
 
Posts: 3395
Joined: Wed Jun 06, 2007 4:25 pm
