
The AMD Trinity

Old 10-03-2012, 12:38 PM
  #1  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default The AMD Trinity

So I was reading an article recently touting the release of AMD's new Trinity CPU lineup. Cliffs: these chips are what they call APUs, the combination of a CPU and a low-end 3d graphics chipset on a single die.
The new A10 represents AMD's flagship APU. It's positioned against Intel's Core i3-3220.
Really? The flagship of the fleet is poised to compete against the third-cheapest processor in the entire Intel Core i lineup?

I mean, yeah. A quad-core processor running at 3.8 / 4.2 GHz is obviously going to outperform a dual-core processor running at 3.3 GHz. And the gamer-oriented GPU will no doubt appeal to that segment of the market that wants gaming performance nowhere near as good as a moderately-priced standalone video card, and is willing to pay nothing at all to get it.

Talk about picking your battles wisely.
Joe Perez is online now  
Old 10-03-2012, 12:44 PM
  #2  
Tour de Franzia
iTrader: (6)
 
hustler's Avatar
 
Join Date: Jun 2006
Location: Republic of Dallas
Posts: 29,085
Total Cats: 375
Default

In due time these processors will probably be used in the shitty $400 notebooks I continue buying because I spend all my money on tires and **** like that.
hustler is offline  
Old 10-03-2012, 01:19 PM
  #3  
Elite Member
iTrader: (1)
 
Full_Tilt_Boogie's Avatar
 
Join Date: May 2009
Location: Jacksonville, FL
Posts: 5,155
Total Cats: 406
Default

Apparently they're awesome sauce in netbooks and such.

I was disappointed with Bulldozer. It takes eight cores to keep up with an Ivy Bridge quad core.
My now-old Phenom II X4 is still the fastest quad-core from AMD...
Full_Tilt_Boogie is offline  
Old 10-03-2012, 02:49 PM
  #4  
Elite Member
iTrader: (2)
 
thenuge26's Avatar
 
Join Date: Aug 2012
Location: Indianapolis
Posts: 3,267
Total Cats: 239
Default

Top of the line APU going against the i3 makes sense. If you are building a serious machine, you aren't going to get a CPU with an integrated GPU anyway. So why make a $400 version of it?
thenuge26 is offline  
Old 10-03-2012, 03:19 PM
  #5  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default

Originally Posted by Full_Tilt_Boogie
Apparently they're awesome sauce in netbooks and such.
As with the Intel Core processors, the AMD processors come in different versions for laptop vs. desktop use. Because the laptop processors are designed to operate at much lower power and generate much less heat, they inherently contain up to 60% less awesome than their otherwise identically-named desktop counterparts.

If you compare a Clarkdale (desktop) i3 to an Arrandale (mobile) i3, you find a lot of similarities. They are both based on the Westmere / Nehalem architecture, they both contain 382 million transistors on an 81 mm² die, they both contain the Ironlake-architecture GPU core, etc. On paper, they appear to be the exact same processor.

Except that the Clarkdale i3-530 is rated at 73 watts, while the Arrandale i3-380UM is rated at 18 watts. This is mostly accomplished by cranking the clock waaaaaaay down, and the performance reflects this.
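Lower clocks usually come with lower core voltage too, and dynamic switching power scales roughly with voltage² × frequency, so the reduction compounds quickly. Here's a toy back-of-envelope sketch of that relationship: the clock speeds are the real ones, but the capacitance and voltage figures are made-up illustrative values, not Intel's specs.

```python
# Rough sketch of why lowering the clock (and with it, core voltage) slashes power.
# Dynamic power ~ C * V^2 * f. Capacitance and voltages below are invented for
# illustration; only the clock speeds match the real i3-530 / i3-380UM.

def dynamic_power(cap_farads, volts, freq_hz):
    """Approximate dynamic switching power, in watts."""
    return cap_farads * volts ** 2 * freq_hz

desktop = dynamic_power(cap_farads=1.2e-9, volts=1.20, freq_hz=2.93e9)  # i3-530 @ 2.93 GHz
mobile = dynamic_power(cap_farads=1.2e-9, volts=0.90, freq_hz=1.33e9)   # i3-380UM @ 1.33 GHz

print(f"desktop-ish: {desktop:.1f} W")  # ~5.1 W of dynamic power in this toy model
print(f"mobile-ish:  {mobile:.1f} W")   # ~1.3 W, roughly a 4x reduction
```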



Originally Posted by thenuge26
Top of the line APU going against the i3 makes sense. If you are building a serious machine, you aren't going to get a CPU with an integrated GPU anyway. So why make a $400 version of it?
Well, I agree and I disagree.

I grant you that nobody building a high-end machine is going to need or want an onboard GPU. No argument there.

On the other hand, of those people who are NOT serious gamers, how many of them are going to CARE whether the CPU has a good on-board GPU?

In other words, I'm not sure who the target market for this chip is.
  • Serious gamers are going to buy a high-end CPU and pair it with a high-end graphics card.
  • Serious computer users who are not gamers are going to buy a high-end CPU and use the crappy-*** onboard GPU that comes with it.
  • Everyone else is going to buy whichever CPU is cheapest (Intel Atom / AMD-E series / VIA nano), and not even think about the video chipset.

I just can't imagine that there are a huge number of people who are going to say "Well, I'd like something that has a reasonably powerful CPU, and I also want it to come with a GPU that's a bit better than nothing at all, but not quite good enough to do any serious gaming on."
Joe Perez is online now  
Old 10-03-2012, 03:39 PM
  #6  
Elite Member
iTrader: (2)
 
thenuge26's Avatar
 
Join Date: Aug 2012
Location: Indianapolis
Posts: 3,267
Total Cats: 239
Default

Originally Posted by Joe Perez

I just can't imagine that there are a huge number of people who are going to say "Well, I'd like something that has a reasonably powerful CPU, and I also want it to come with a GPU that's a bit better than nothing at all, but not quite good enough to do any serious gaming on."

That is true. The funny thing is, most of those people say they want a "reasonably powerful CPU" because they know the more gigahurtz the better, right? When in fact an i3 (or this chip) would be perfectly fine for playing farmville.

The point is, I don't see this as a chip they're targeting at the home-built market; it's meant to go in the laptops at Best Buy for epic mafia wars pwnage.
thenuge26 is offline  
Old 10-03-2012, 05:28 PM
  #7  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default

Originally Posted by thenuge26
The point is, I don't see this as a chip they're targeting at the home-built market; it's meant to go in the laptops at Best Buy for epic mafia wars pwnage.
Fair enough. I can see this being popular as a mid-range gaming laptop chip.

Then why bother producing a desktop version of it?
Joe Perez is online now  
Old 10-03-2012, 06:06 PM
  #8  
Senior Member
iTrader: (2)
 
messiahx's Avatar
 
Join Date: Jun 2007
Location: Shalimar, FL
Posts: 956
Total Cats: 7
Default

AMD is probably targeting that low to mid-low segment. They can't compete with the Core series, and frankly, the Athlon XP was probably the last series that could compete in every market segment. This chip will probably be pretty successful with the run-of-the-mill browse teh interwebs, check mah emails, what's a gigahurtz? crowd and college kids looking for a cheap notebook. There are still a lot of people and organizations that don't give a **** about PC performance. If you throw a low-power or EnergyStar sticker on there you're already looking pretty good to many people. All of the computers at my shop (USAF) were AMD based until recently. We don't need anything beyond the ability to browse poorly designed websites, read PDFs, and run Outlook.

I would imagine having an onboard GPU, memory controller, etc. as seems to be the usual case with mobile oriented platforms does wonders in saving fabrication costs for OEMs beyond the obvious power savings/battery life improvements.
messiahx is offline  
Old 10-03-2012, 06:27 PM
  #9  
Elite Member
iTrader: (2)
 
thenuge26's Avatar
 
Join Date: Aug 2012
Location: Indianapolis
Posts: 3,267
Total Cats: 239
Default

Originally Posted by Joe Perez

Then why bother producing a desktop version of it?
Because some idiots people go to Best Buy to buy a desktop also?
thenuge26 is offline  
Old 10-03-2012, 06:28 PM
  #10  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default

Originally Posted by messiahx
AMD is probably targeting that low to mid-low segment. This chip will probably be pretty successful with the run-of-the-mill browse teh interwebs, check mah emails, what's a gigahurtz? crowd and college kids looking for a cheap notebook.
I agree, as this has always been AMD's target audience.

But again, it raises the question as to why they would bother putting a (relatively) high-performance GPU onto the die. Doing this inherently raises the cost, lowers the yield, and raises the power dissipation of the chip. And for what? I honestly just can't see buyers in this market segment knowing or caring whether the CPU has a better on-board GPU than the equivalently-targeted CPU from Intel.



Originally Posted by messiahx
I would imagine having an onboard GPU, memory controller, etc. as seems to be the usual case with mobile oriented platforms does wonders in saving fabrication costs for OEMs beyond the obvious power savings/battery life improvements.
Pretty much all CPUs, desktop or laptop, are built this way nowadays. Intel put the memory controller onto the main die all the way back in 2008 with Nehalem, Clarkdale / Arrandale moved the GPU on-board, and Sandy Bridge did away with the northbridge entirely, pulling practically all of its functions onto the CPU and leaving just a single PCH chip on the board. It doesn't just save cost; having the memory controller located just a few millimeters away from the cache and the ALUs also improves performance.

The big difference is that the Intel chips feature very basic GPUs that satisfy the requirements of normal 2d apps without making any serious concession to 3d gamers. They don't waste silicon and watts on stuffing the CPU with shaders-o-plenty that will never get used in most applications.

And that, again, circles back around to why this doesn't make sense for AMD. Putting a higher-performance GPU onto the main die increases the cost of the CPU and decreases its thermal efficiency, without providing any obvious benefit that I can see insofar as attracting market share.
Joe Perez is online now  
Old 10-03-2012, 06:31 PM
  #11  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default

Originally Posted by thenuge26
Because some idiots people go to Best Buy to buy a desktop also?
See above.

The people who go to Best Buy to buy a desktop PC simply aren't going to care how many more polygons-per-second the Trinity's onboard GPU can shade as compared to the integrated HD Graphics GPU of a comparable Intel Core. They want to know three things:

How many gigahertz does this have?

How much does it cost?

Can I use it to (play farmville / send emails / look at cat pictures / etc)?
Joe Perez is online now  
Old 10-03-2012, 08:33 PM
  #12  
Elite Member
iTrader: (2)
 
thenuge26's Avatar
 
Join Date: Aug 2012
Location: Indianapolis
Posts: 3,267
Total Cats: 239
Default

I guess AMD is hoping that Best Buy will upsell them to the one that can play farmville better.

Or that the requirements for farmville and its clones will soon include a GPU like that. I don't know, doesn't really make sense to me either.
thenuge26 is offline  
Old 10-03-2012, 08:43 PM
  #13  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default

Originally Posted by thenuge26
Or that the requirements for farmville and its clones will soon include a GPU like that. I don't know, doesn't really make sense to me either.
Actually, I think you may have a point here.

As a matter of historical precedent, software has always grown in size and complexity to fill the capacity of the hardware, even when such growth serves no useful purpose other than cosmetic appeal. (See Windows Vista / 7 "Aero" modes, etc.)

Once a point is reached at which the presence of a moderately heavy GPU can be assumed in even the lowest-end machines, then even the simplest flash-style applications will require a heavy GPU. That'll be another -1 for laptop battery life.
Joe Perez is online now  
Old 10-03-2012, 08:50 PM
  #14  
Senior Member
iTrader: (2)
 
messiahx's Avatar
 
Join Date: Jun 2007
Location: Shalimar, FL
Posts: 956
Total Cats: 7
Default

Is this maybe a future-proofing type of move that provides video hardware for some benefit other than gaming? Maybe hardware decoding of 1080p formats or something along those lines? Home theater is one of the applications mentioned on the press page.

I'd like to think there is a reason a company would invest that kind of money into a platform, but then again, this being AMD/ATI, the merged provider of the mediocre, I can't be sure.
messiahx is offline  
Old 10-03-2012, 09:51 PM
  #15  
Boost Pope
Thread Starter
iTrader: (8)
 
Joe Perez's Avatar
 
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,019
Total Cats: 6,587
Default

Originally Posted by messiahx
Maybe hardware decoding of 1080p formats or something along those lines? Home theater is one of the applications mentioned on the press page.
I'm certainly no GPU guru, but I do tend to take things mentioned on press pages with a grain of salt. Most corporate marketing departments are simply tasked with throwing as many buzzwords as possible at the product which can even remotely be construed as having some relevance to it. In theory, the Miata could be described as being suitable for use as a military transport vehicle, although it would not be particularly good at this task as compared to pretty much every imaginable alternative, up to and including the Dacia Sandero.

From what I can gather, the GPU section of the Trinity processors is essentially a scaled-down port of the Radeon HD6900 series architecture. The vast majority of the A10's GPU footprint consists of 384 shader processors. This is essentially just a large array of tiny little subprocessors optimized for SIMD (Single Instruction, Multiple Data) execution, which is a fancy way of saying "Do the exact same operation to a million pieces of sequential data all in a row."
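A crude way to picture that, sketched in Python with NumPy (purely illustrative; a Radeon shader core obviously doesn't run Python):

```python
# Toy illustration of the SIMD idea: one instruction applied across a whole
# array of data at once, versus a scalar loop touching one element at a time.
# Purely conceptual; this is not how the Radeon hardware actually works.
import numpy as np

pixels = np.random.rand(1_000_000)            # a million brightness values

# Scalar style: one operation per piece of data.
dimmed_scalar = [p * 0.5 for p in pixels]

# SIMD style: the same multiply issued across the entire array "all in a row".
dimmed_simd = pixels * 0.5

assert np.allclose(dimmed_scalar, dimmed_simd)
```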

They are great for tasks which lend themselves to massively parallel execution. The most commonly-known of these, of course, is doing pixel-shading and texture processing in 3d games. A 1920 x 1080 display, for instance, consists of 2,073,600 pixels, and for each frame you can process as many of them in parallel as you have the computational resources to handle. A GPU with 1,024 shader cores can process that whole screen in 2,025 blocks, with each block consisting of 1,024 pixels all getting rendered and dumped out into memory at the same time, and then having another 1,024 pixels loaded up right behind them. Or put another way, it can finish rendering the scene 1,024 times faster than a single processor handling one pixel at a time (1,024 pixels per step instead of one), and thus, in the ideal case, provide a framerate 1,024 times as high.
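The arithmetic behind those numbers, as an idealized sketch that ignores memory bandwidth and scheduling overhead:

```python
# Idealized arithmetic from the paragraph above: how many 1,024-pixel blocks
# a 1080p frame breaks into, and the best-case speedup over processing one
# pixel at a time. Ignores memory bandwidth, scheduling, and everything else.
width, height = 1920, 1080
shader_cores = 1024

total_pixels = width * height                  # 2,073,600
blocks = -(-total_pixels // shader_cores)      # ceiling division -> 2,025 blocks
ideal_speedup = total_pixels / blocks          # ~1,024x in this best case

print(total_pixels, blocks, round(ideal_speedup))   # 2073600 2025 1024
```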

There are other tasks which lend themselves well to this sort of computational architecture, but they're mostly the kind of applications that you'd typically throw a supercomputer at. Things like brute force cryptography or modelling the folding of proteins in a cell. (In fact, many so-called supercomputers these days are in fact arrays built out of huge numbers of gamer-grade video cards loaded into commodity PCs with consumer-grade CPUs.)

By comparison, decoding and playing back a compressed video stream is a task not well-suited to this method of execution, and most computers available today already have sufficient resources in the main CPU to do it quite easily. Heck, a lot of high-end cellphones these days can play high-quality video, and their processors are absolute weaksauce by comparison to even an entry-level Atom.
Joe Perez is online now  
Old 10-03-2012, 09:54 PM
  #16  
Elite Member
iTrader: (5)
 
pusha's Avatar
 
Join Date: Nov 2009
Location: Miami, FL
Posts: 7,330
Total Cats: -29
Default

Attached Thumbnails The AMD Trinity-1349284403539.jpg  
pusha is offline  
Old 10-03-2012, 10:15 PM
  #17  
Elite Member
iTrader: (24)
 
Bryce's Avatar
 
Join Date: Jul 2007
Location: Cypress, TX
Posts: 3,759
Total Cats: 35
Default

Originally Posted by Joe Perez
There are other tasks which lend themselves well to this sort of computational architecture, but they're mostly the kind of applications that you'd typically throw a supercomputer at. Things like brute force cryptography or modelling the folding of proteins in a cell. (In fact, many so-called supercomputers these days are in fact arrays built out of huge numbers of gamer-grade video cards loaded into commodity PCs with consumer-grade CPUs.)
GPU array supercomputers are pretty awesome. That is all.
Bryce is offline  
Old 10-04-2012, 09:00 AM
  #18  
Elite Member
iTrader: (2)
 
thenuge26's Avatar
 
Join Date: Aug 2012
Location: Indianapolis
Posts: 3,267
Total Cats: 239
Default

Originally Posted by Joe Perez
Actually, I think you may have a point here.

As a matter of historical precedent, software has always grown in size and complexity to fill the capacity of the hardware, even when such growth serves no useful purpose other than cosmetic appeal. (See Windows Vista / 7 "Aero" modes, etc.)

Once a point is reached at which the presence of a moderately heavy GPU can be assumed in even the lowest-end machines, then even the simplest flash-style applications will require a heavy GPU. That'll be another -1 for laptop battery life.
And of course, giving lazy programmers like me a decent GPU to work with in commodity machines will guarantee the NEED for a decent GPU much sooner than otherwise.
thenuge26 is offline  