ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 10:45 am
by Strawberry
was just thinking about buying oblivion but would like to know what kind of performance i will get out of it
please help me make the choice

trumpetgeek240

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 3:31 am
by Claudia Cook
Either upgrade the machine, if you can (possible only with a desktop), or get the console version if you have a laptop with that onboard chip in it. The HD 3200 is just the guts out of the HD 2400, buried inside of the mainboard's chipset ASIC. As such, it is well below the minimum Radeon usable in this game.

By definition, a "Graphics Card" is on a separate circuit board, produced on a separate assembly line, delivered to the OEM for inclusion in some branded PC or other, where it is plugged into the video bus via an add-on PCI-e slot in the mainboard (desktop, anyway).

Laptops are very poor for games due to inadequate cooling, whatever graphics solution is being used, and there is no official support offered for them; I do not think that Bethesda needs to make any apologies for that policy. (Possible serious hardware changes are in the wings for 2011 / 2012; the end result of what may be available remains to be seen. If no producers support the new offerings, then it will be the status quo.)


Gorath

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 1:43 am
by ladyflames
was just thinking about buying oblivion but would like to know what kind of performance i will get out of it
please help me make the choice

trumpetgeek240


Didn't you already have a thread about your system and didn't you already have the Steam version of Oblivion?

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 12:40 am
by OJY
The Radeon 3200 is going to pull low settings comfortably. I've heard mentions of it pulling the low side of medium...but uncomfortably.

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 4:01 am
by Greg Swan
The Radeon 3200 is going to pull low settings comfortably. I've heard mentions of it pulling the low side of medium...but uncomfortably.

You need to make up your mind -- the HD 3200, HD 4200, and HD 4240 are all onboard video chips, the first closely related to the HD 2400 design of four years ago, the next two being based on the HD 3450. All are slower than the discrete video graphics cards they were cloned from.

What about it? Can the video be upgraded? Nope. There's an integrated chipset in there and that is not going to be removed. (snip) the Radeon 4250 will be the weak spot.

None of them are any better than the (real, practical) minimum that this game actually needed, and the 3200 is significantly poorer.

Even if the HD 3200 were as capable as an HD 2400, look at how it compares to the official minimum Radeon:

http://www.gpureview.com/show_cards.php?card1=561&card2=54

That's a bad comparison, given that Bethesda never tested the X700s and X800s because Omega drivers are needed with those. IMO, an X800 Pro would have been a better choice as a "practical" minimum, just as a 6800 GS made much more sense than a 6600 GT.

The HD 2400 is roughly 30% as capable as an X800 Pro:

http://www.gpureview.com/show_cards.php?card1=561&card2=301

The HD 3450 is about 40% as good as an X800 Pro.

http://www.gpureview.com/show_cards.php?card1=555&card2=301


Gorath

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 7:03 am
by Lilit Ager
You need to make up your mind -- the HD 3200, HD 4200, and HD 4240 are all onboard video chips, the first closely related to the HD 2400 design of four years ago, the next two being based on the HD 3450. All are slower than the discrete video graphics cards they were cloned from.


None of them are any better than the (real, practical) minimum that this game actually needed, and the 3200 is significantly poorer.

Even if the HD 3200 were as capable as an HD 2400, look at how it compares to the official minimum Radeon:

http://www.gpureview.com/show_cards.php?card1=561&card2=54

That's a bad comparison, given that Bethesda never tested the X700s and X800s because Omega drivers are needed with those. IMO, an X800 Pro would have been a better choice as a "practical" minimum, just as a 6800 GS made much more sense than a 6600 GT.

The HD 2400 is roughly 30% as capable as an X800 Pro:

http://www.gpureview.com/show_cards.php?card1=561&card2=301

The HD 3450 is about 40% as good as an X800 Pro.

http://www.gpureview.com/show_cards.php?card1=555&card2=301


Gorath

Make up my mind about what? For all those chipsets you are listing, I have posted over and over again that low settings are probably going to be the best they can manage. But I have come across enough users who have told me some medium is feasible too. If they can expect low settings, then that's what it means. If people want to push it further, to an fps I find unacceptable yet they find tolerable...well, I'm not going to stop them.

I don't just rely on raw numbers to tell me how the game is going to perform with particular chipsets. I take user accounts (if there are enough of them) very heavily into my analysis as well.

I don't really get what you're trying to say....:huh:

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 1:28 pm
by Michelle Chau
was just thinking about buying oblivion but would like to know what kind of performance i will get out of it
please help me make the choice

trumpetgeek240



my card can play high with pretty much no lag, but i like to put it at ultra high and it works out fine on both. thanks everyone

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 1:51 am
by Gemma Flanagan
Didn't you already have a thread about your system and didn't you already have the Steam version of Oblivion?


yea, and i mentioned that i wanted to buy it because i couldn't get the Steam version to work, so i bought the disc version and now it works :)

ATI radeon HD 3200 Graphics card

PostPosted: Sun Aug 16, 2009 9:15 am
by des lynam
my card can play high with pretty much no lag, but i like to put it at ultra high and it works out fine on both. thanks everyone


Do me a favour. Type tdt in the console and tell me what the FPS is when you're doing stuff inside and outside. NOT staring at a wall. I'd love to know.