PDA

View Full Version : nVidia 320M Impressions?


Archived Post
04-13-2010, 06:00 AM
Hi All,

Was wondering if anyone has any experience with the nVidia 320M processor in Champions. Does it run at least alright? It appears to be 'shared memory', so I immediately think it'd be pretty lousy, but I can't find much of anything about it anywhere.

Thanks for any insight!

Archived Post
04-13-2010, 06:32 AM
Do you already have a GeForce 320M, or are you considering getting one?

Nvidia's naming schemes are completely incomprehensible. They'll take a single GPU chip and give it a bunch of different names, so you really have no clue what you're getting. With desktop parts, Nvidia will at least give some specs most of the time, but for mobile parts, they won't say anything. They'll also try to give mobile parts names that are very similar to desktop parts that are totally different.

Basically, if you haven't already bought the laptop, I'd consider buying something else instead. If you get an AMD video card, it's pretty easy to figure out what you're getting--and the Mobility Radeon HD 5000 series also has vastly superior performance per watt to anything else on the market, which is tremendously important in a laptop. Gaming on a laptop is a bad idea in general, though.

My best guess is that the game will probably run, but you'll have to turn settings way down. It might be Nvidia's low end GT218 GPU chip, which is basically a low end "don't try to play games on this" card. It might be Nvidia's somewhat less low end GT216 GPU chip, which is basically their idea of the lowest end GPU chip that is passable for gaming. Or it might be a rebranded G98 or G96 chip, or a failed GPU chip that they had to disable a bunch of parts of just to get it to run, or something completely random that people figured was off the market two years ago. I'd say more likely than not, it's either a GT218 or GT216, which, in desktops, are branded as GeForce G 210 and GeForce GT 220, respectively, among a variety of other names.

If you already have the card, you could run GPU-Z, report the specs here, and maybe I can figure out what it is.

Archived Post
04-13-2010, 07:51 AM
Geez, how nice of nVidia to complicate it so much.

Haven't bought anything, was just looking at the updated MacBook Pro today, and the 13" has an nVidia 320M. Yes, I know laptops aren't great for gaming. But I need a Mac for work (I work at a 100% Mac shop, a twist!) and the only reason I'd be at all interested in upgrading would be if it could run Champions somewhat competently through Boot Camp.

All I know about the technical aspect is what I see here: http://www.apple.com/macbookpro/specs-13inch.html

If it's going to be crummy, I just won't bother. My existing several year old MacBook Pro is still more than fast enough processor-wise. Amazing how long computers last these days if you DON'T do gaming :)

Archived Post
04-13-2010, 08:12 AM
After seeing that, my new best guess is that it's a rebranded GeForce 9400M integrated graphics, which Apple already uses in some other products. If it were a decent discrete card, Nvidia would probably put a GT or GTS or GTX or GSO or GTX+ or some other such combination of letters in the name, rather than just "GeForce 320M". Shared memory also points toward integrated graphics, and the 9400M is Nvidia's most recent effort there, now that they've basically been forced out of the integrated graphics market as both Intel and AMD integrate their graphics into the processors.

So basically, Champions Online would run, and it would be a lot better off than if you were stuck with Intel's awful integrated graphics. You'd probably have to turn all settings all the way down, though. Also, considering that it's Apple, they're probably charging around double what you'd pay for the same hardware in a PC, meaning that there's no way that they'll give you something decent for $1200.

Also consider that the highest end graphics that they offer is the GeForce GT 330M, which uses a somewhat pathetic GT216 GPU. It's not so much that they can't offer something better as that they don't want to, as Apple computers aren't meant for gaming. That's probably Nvidia's new "Optimus" switchable graphics, which will use Intel's horrible integrated graphics most of the time and turn the Nvidia card off, but then let you switch to the Nvidia card when Intel's graphics flatly fail, such as if you want to watch a full screen video.

Archived Post
04-13-2010, 09:22 AM
Just to entertain the idea, if I were to look at a PC laptop that could run Champs fairly well (I'm not an FPS nut, but ~30FPS with decent level of effects would be nice) what brands should I consider? Can it be done for around $1K? I *don't* want a "desktop replacement" type where they shoehorn desktop class parts into a giant heavy cinder block.

Archived Post
04-13-2010, 10:12 AM
I found out what the GeForce 320M is. Apparently it's an integrated graphics version of Nvidia's GT216 chip that they made special for Apple. Of course, the reason why it's Apple-only is likely that no other OEM would want it.

http://www.appleinsider.com/articles/10/04/13/nvidia_320m_gpu_made_especially_for_apples_new_13_inch_macbook_pro.html

This means two things:
1) It's the best integrated graphics ever made.
2) It's bizarre and pointless.

Having to share system memory means that it's going to be severely starved for memory bandwidth. If you're lucky, it will deliver maybe 1/3 of the performance of a $70 desktop video card.
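To put rough numbers on that bandwidth starvation, here's a quick sketch. The clock and bus figures below are illustrative assumptions for the comparison, not measured specs of these exact parts:

```python
# Peak memory bandwidth is just transfers/second times bytes per transfer.
# Figures below are illustrative, not exact specs for any particular part.
def bandwidth_gb_s(effective_mt_s, bus_width_bits):
    """Peak bandwidth in GB/s from effective transfer rate and bus width."""
    return effective_mt_s * (bus_width_bits / 8) / 1000

# Integrated graphics sharing dual-channel DDR3-1066 system memory
# (128 bits total), and the GPU has to split it with the CPU:
shared = bandwidth_gb_s(1066, 128)      # roughly 17 GB/s, shared

# A cheap dedicated card with 128-bit GDDR5 at 3600 MT/s effective:
dedicated = bandwidth_gb_s(3600, 128)   # 57.6 GB/s, all for the GPU

print(f"shared system memory: {shared:.1f} GB/s")
print(f"dedicated GDDR5:      {dedicated:.1f} GB/s")
```

Even before the CPU takes its cut, the shared pool has only a fraction of the bandwidth a dedicated card gets to itself.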

It's also going to be a short-lived part. Nvidia can't legally make integrated graphics for Intel's newer processor architectures, so it can only be paired with older Core 2 processors that Intel started phasing out with the release of Nehalem back in 2008. I'm not sure if they can do so for AMD processors, either, but it would be kind of pointless to do so, as AMD can make good integrated graphics themselves (unlike Intel), and it's not going to be a terribly high volume part. Besides, around the end of the year, AMD will release their "Llano" processor, which will have on-die integrated graphics--and likely be the first integrated graphics ever that are actually passable for gaming.

I wonder if Apple ordered them from Nvidia before they realized that
1) GT216 would be delayed by about a year, and
2) GT216 wouldn't be very good even when it finally did launch.
GT216 gets absolutely destroyed by AMD's "Redwood" chip in performance per watt, and is only roughly competitive with AMD's older "RV730" GPU chip on that measure, in spite of GT216 being a full process node smaller.

-----

Does it need to be a laptop? You can get a very nice desktop gaming PC for $1000, but a laptop won't be nearly so great.

Here's a laptop for $800 that's not that nice, but not really that bad, either.

http://www.newegg.com/Product/Product.aspx?Item=N82E16834115733

My guess is that that one will give you better gaming performance than even Apple's top $2200 MacBook Pro. The video card in there has a TDP of 15-19 W, probably depending on the GPU clock speed (which isn't as important as you might think, because at higher clock speeds, it will be limited by memory bandwidth, anyway), so it's not going to run hot and fry things, either.

Of course, with a $1000 budget in a desktop gaming computer, you can easily get something that will have more than double the performance of that laptop.

Archived Post
04-13-2010, 12:37 PM
Wow, you really do your homework :)

That sheds a lot of light on why they stuck with the Core 2 Duo for the low end MBP too, instead of using something like a Core i3. Unfortunately, Apple hasn't used ATI in their laptops in a few generations now; I don't know why. I've always preferred ATI personally and use it in my Mac desktops.

I have a good desktop machine for Champs already. It's a few years old now, but it's a Mac Pro Quad Xeon something-or-other with an ATI 4970. AND I already have a MacBook Pro that's more than good enough for most needs EXCEPT gaming.

So as a nice-to-have I'm toying with the idea of a laptop that can play Champs for trips. I'd prefer not to have two laptops, but I'll explore the idea.

Archived Post
04-13-2010, 01:22 PM
There is no such thing as an ATI 4970. You might mean a 4870, which was the top video card that Apple offers with a Mac Pro. The laptop I linked above has a video card that will offer maybe 1/3 or 1/4 of the performance of a Radeon HD 4870, though it probably has a lower monitor resolution than your desktop, which boosts frame rates.

My understanding is that Apple was moving away from Nvidia as a reaction to "bumpgate", where Nvidia sold Apple (and a bunch of other OEMs) a ton of defective chips. It's the worst sort of defective chips, too, as they'd seem fine at first, but then die after a year or two of normal use. And then the end user would probably blame the OEM, not Nvidia.

Maybe Apple decided to go with Nvidia in their new laptops because of Nvidia's "Optimus" switchable graphics, to use Intel's low power but otherwise pathetic integrated graphics most of the time, while allowing the user to switch to Nvidia's graphics that actually work as needed. AMD doesn't offer anything like that, though that's largely because AMD wants you to buy an AMD processor, and doesn't want to offer better graphics to pair with an Intel processor. Rather, AMD wants getting better integrated graphics to be a reason to buy an AMD processor rather than an Intel one. Of course, it would help if AMD had better mobile processors to offer.

The real solution is to bring power gating to video cards. Power gating is something Intel introduced with their Nehalem architecture processors, to allow some cores to completely shut down while not being used (and hence use no power at all), while allowing other cores to continue working as normal. If a high end video card could shut down all but one memory channel, group of shaders, etc. to function like a low end video card while it's idle, and then re-enable everything on the fly when the performance is needed again, you could have the performance of a high end video card when you need it, and the power usage of a low end card when you don't. That's probably coming pretty soon, but isn't out yet.

Apple presumably doesn't care about DirectX 11, which AMD can offer and Nvidia can't. AMD also offers better performance per watt at load, but Apple seems to be looking more at idle wattage, so that they can advertise long battery life when you aren't actually doing anything on the computer.

Archived Post
04-13-2010, 01:38 PM
Yes, you're right, it's a 4870 =) Though it's aftermarket with a BIOS I created from Apple's own EFI image (taken from their own 4870) and the existing Sapphire BIOS. I like Macs and Mac OS, but I didn't want to buy a ridiculously priced upgrade when it's avoidable.

Any thoughts on the Radeon HD5850 Mobility part? I see some laptops with that at about $1K.

Archived Post
04-13-2010, 01:52 PM
The Mobility Radeon HD 58** chips use AMD's excellent "Juniper" GPU chip. The desktop equivalent is a Radeon HD 5770, which performs about the same as your Radeon HD 4870.

The Mobility Radeon HD 5830, 5850, and 5870 all use the same GPU chip. The difference between them is the clock speed, and probably the voltage. The 5830 is the slowest and 5870 the fastest, but the 5830 is also the lowest power, which is the point of clocking a GPU slower than it could have been.

Make sure you get one paired with GDDR5 memory and not DDR3, as the chip needs the extra memory bandwidth of GDDR5 memory to perform as well as it should. Depending on how it is clocked, it will probably give you about 2/3 of the performance of your Radeon HD 4870. Depending on your monitor resolution, it might actually give you about the same frame rates. If you're worried about overheating, then the Juniper-based cards give you the best performance per watt of anything on the market. The Mobility Radeon HD 5850 has a TDP of 30-39 W, depending on the clock speeds; for comparison, your Radeon HD 4870 has a TDP of 150 W.
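A back-of-envelope perf-per-watt comparison using the figures above. The 2/3 performance figure is the estimate from this thread, not a benchmark, so treat the result as a rough illustration:

```python
# Rough perf-per-watt comparison: Mobility HD 5850 at roughly 2/3 the
# performance of a desktop HD 4870 (the 2/3 is the poster's estimate,
# not a measured benchmark), over the quoted TDP range.
desktop_perf, desktop_tdp = 1.0, 150    # HD 4870 as the baseline, 150 W
mobile_perf = desktop_perf * 2 / 3      # ~2/3 of the desktop card

ratios = {}
for mobile_tdp in (30, 39):             # TDP range quoted above
    ratios[mobile_tdp] = (mobile_perf / mobile_tdp) / (desktop_perf / desktop_tdp)

for tdp, r in ratios.items():
    print(f"at {tdp} W TDP: {r:.1f}x the perf/watt of the desktop HD 4870")
```

Even at the top of its TDP range, the mobile part comes out around two and a half times the desktop card's performance per watt by this estimate.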

Archived Post
04-13-2010, 04:51 PM
Thanks for all your input, it's been helpful. And the tip about the GDDR RAM. It's been a LONG time since I did the PC building enthusiast thing, so I'm woefully out of date on all the trivia that goes into these parts :o

Archived Post
04-13-2010, 05:02 PM
It's GDDR5 that is the key. It sends four bits of data per memory bus bit per clock cycle. All other versions of DDR or GDDR send only two bits, not four. That means GDDR5 memory has double the memory bandwidth at a given clock speed. It's actually quite nice and means that a given amount of memory bandwidth only takes half as many memory channels as before, which keeps costs (and hence price tags) down.
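The arithmetic behind that can be sketched as follows, using a made-up 900 MHz memory clock and 128-bit bus purely for illustration:

```python
# GDDR5 moves 4 bits per pin per clock; DDR3 (and earlier GDDR) move 2.
# Clock and bus width here are hypothetical, just to show the ratio.
def peak_bandwidth_gb_s(clock_mhz, bits_per_pin_per_clock, bus_width_bits):
    """Peak bandwidth in GB/s: clock * bits per pin per clock * bus bytes."""
    return clock_mhz * bits_per_pin_per_clock * (bus_width_bits / 8) / 1000

gddr5 = peak_bandwidth_gb_s(900, 4, 128)  # 57.6 GB/s
ddr3  = peak_bandwidth_gb_s(900, 2, 128)  # 28.8 GB/s

# Same clock, same bus width: GDDR5 delivers exactly double.
assert gddr5 == 2 * ddr3
```

Put the other way around, hitting a given bandwidth target with GDDR5 needs only half the bus width, which is where the cost savings come from.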

Archived Post
04-13-2010, 05:59 PM
As a useful bit of info for comparison, I recently bought a laptop with a Mobility Radeon HD 4330. I can play the game at about low-med settings and get 25-40 FPS. The 4570 or higher should run the game pretty well...

Archived Post
04-13-2010, 07:12 PM
The 4330 and 4570 are both low end "don't try to play games on this" cards. You can play games on it, I suppose, but you'll have to turn settings way, way down, as you've seen. If you're going to play games on an ATI card, you generally want the second digit to be at least 6, as below that is low end "not for gaming" cards. Well, I guess the desktop 5570 is decent for gaming, but it's really meant for HTPCs and all-in-ones.

Archived Post
04-20-2010, 05:48 AM
I have the new MacBook Pro, and in Boot Camp on the 320M I can run it at one notch below the top of the recommended settings.

The 13-inch MacBook Pro that isn't the lowest end unit has 256 MB of onboard video memory, as opposed to sharing the system resources.

Very playable :D

Archived Post
04-20-2010, 07:13 AM
The new MacBook Pros that aren't the low end one have a GeForce GT 330M discrete video card, not the GeForce 320M integrated video card. It's about the same GPU chip, but having dedicated dual-channel DDR3 video memory on the card means that the GPU chip can get enough video memory bandwidth to perform properly. It's basically an underclocked version of a desktop GeForce GT 220 card with DDR3, which is a lower-midrange gaming card, but yeah, you can play games on it just fine if you're willing to turn down some settings. Giving it only 256 MB of video memory is a curious choice, as nearly all video cards, even at the very low end, come with at least 512 MB these days--and having only 256 MB will sabotage performance in some games.

Basically, it's Apple's way of telling you, if you want to play games, get a PC.

Archived Post
04-20-2010, 08:00 AM
As I said, the top end 13-inch MacBook Pro does have the 320M (or the lowish end MacBook Pro, depending on how you look at it), which does have the onboard graphics.

You are right, though, that the very bottom MacBook Pro has the shared memory.

As I said, it DOES run the game on medium, so if you are after a MacBook because you want one (rather than because people keep telling you not to get one), it will run the game in Boot Camp ;)

It's not a gaming laptop but it does run games.

My guess about not using 512 MB of onboard GPU RAM is that it keeps the heat down. Maybe... maybe not.

Archived Post
04-20-2010, 08:13 AM
Are you sure about that? Integrated graphics generally doesn't have dedicated memory. Are you misreading the amount of system memory set aside for use by integrated graphics as being separate dedicated memory with its own dedicated memory channels for the integrated graphics? The issue here is bandwidth, not capacity, and if the laptops did that, I'd think Apple would say so.

I doubt it's an issue of power consumption. Memory chips don't use much power, anyway. If Apple wanted lower power consumption, they'd have used Mobility Radeon HD 5000 series cards in the laptops, which absolutely destroy anything Nvidia has in performance per watt.

It's probably just Apple being cheap, as usual. Their entire business model is built around selling mid-range hardware with a high-end price tag, so that hardware costs only count for a minority of the price tag. This gives them huge gross profit margins on individual sales. Package high-end hardware with a high-end price tag and they wouldn't get the huge profit margins anymore.

Archived Post
04-20-2010, 08:38 AM

I'm not sure, no. I haven't checked either; I've only had it a day. I just loaded up Boot Camp and it worked on the settings I mentioned. I'll get my facts straight next time..

However, the original poster asked for nVidia 320m impressions and I can confirm that it does run the game.