05-25-2013, 11:08 PM   #171
Willynowei
Quote:
Originally Posted by Drek
Given that I've written operating systems in assembly for x86 hardware, a few games in my free time, and actively follow the industry with multiple acquaintances working in it professionally I'd bet that I probably do.

They will use the ESRAM just like the embedded RAM on the 360, as an expanded cache to make up for slower main RAM.

Embedded RAM doesn't make up for a relative lack of transistors, and no sane hardware engineer would claim that an embedded cache plus slower main RAM could ever be "customized" into outperforming far faster GDDR5.

Hell, Sony's lead designer (Mark Cerny) specifically mentioned that they considered an embedded solution and found GDDR5 to be far superior. Sony's smaller OS footprint let them bet on 512MB GDDR5 chips, settling for 256MB chips and only 4GB of system RAM if they were wrong. MS committed to a massive OS footprint and therefore needed 8GB of RAM; they couldn't gamble.

Most developers were in favor of Sony's choice when it looked like they would not have 4 gigs. Them having 8 instead only magnifies that difference.

Also, video games are a single application. Any competently designed game has no issues with RAM latency because the multiple operations it needs are all actively in flight at a given instant. Latency matters when you need to suddenly issue a large RAM dump and refill due to completely changing applications (like going from web browsing to a spreadsheet). Many games this generation stream from the disc, and disc read/write operations have latency that is orders of magnitude greater than any RAM latency difference.
So you've programmed and have friends who work in the industry. Let's avoid making this a dick swinging contest over who's closer to the industry. I clearly know what I'm talking about, and everyone who's doing the best work in tech has NDAs way too far up their ass to freely talk about who they work for and what they work on, so let's keep that out of this discussion.

You're flat-out wrong about the way on-die memory affects the graphics operations of an APU. Your points above miss the mark; you're talking about unrelated things.

Of course the latency going to disc or HDD is much longer; that's irrelevant here, because the scenarios where on-die memory provides savings are ones where all the relevant data is already in RAM and the RAM is feeding the APU. We aren't talking about RAM dumps; we're talking about time the APU sits idle, precious cycles that could be doing work while it instead waits for data from the memory module.

At certain points, the processing unit has to request data from the RAM module. That round trip, measured in nanoseconds, can be shaved down. The nanoseconds add up, and reducing that idle time makes the processor more efficient. This is all on top of increasing the effective bandwidth.
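To put rough numbers on the "nanoseconds add up" point, here's a back-of-envelope sketch in Python. Every figure in it (the clock speed, the latencies, the bandwidths) is an illustrative assumption of mine, not a confirmed spec for either console:

```python
# Back-of-envelope: what a memory-latency reduction is worth in GPU cycles,
# and the idealized best-case combined bandwidth of two memory pools.
# All numbers below are placeholder assumptions, not official specs.

def stall_cycles(latency_ns: float, clock_hz: float) -> float:
    """Cycles a processor sits idle waiting on one memory round trip."""
    return latency_ns * clock_hz / 1e9

# Assumed ~800 MHz GPU clock; hypothetical latencies for main RAM vs. on-die RAM.
CLOCK_HZ = 800e6
main_ram_stall = stall_cycles(100, CLOCK_HZ)      # 100 ns -> 80 idle cycles
on_die_stall = stall_cycles(20, CLOCK_HZ)         # 20 ns  -> 16 idle cycles
saved_per_access = main_ram_stall - on_die_stall  # 64 cycles back per dependent access

def best_case_combined_bw(main_gbps: float, on_die_gbps: float) -> float:
    """Idealized upper bound: both pools streaming concurrently at full rate.
    Real workloads land well below this; it only illustrates why 'effective
    bandwidth' can exceed the main-RAM figure alone."""
    return main_gbps + on_die_gbps
```

The point of the sketch isn't the exact figures; it's that on a chip clocked in the hundreds of MHz, every 10 ns shaved off a dependent memory access is several cycles the APU gets back.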

Sony considered on-die memory because it has real benefits, just like Nvidia is pursuing similar solutions for their next GPU series: http://www.anandtech.com/show/6846/n...or-beyond-2014.

Also, I never said anything about the Xbox One outperforming the PS4. I said there are a lot of unknowns, that the gap could very well be smaller than the compute power per cycle would indicate, and that it's not as simple as you make it out to be.

You went as far as comparing the gap between these systems to the PS2-to-Xbox gap. That just shows how far you're reaching; you're jumping too far, too quickly. There are still unknown parts to these systems, and they could very well end up much closer in performance than the 33% difference in per-clock graphics processing would indicate. Even in the worst case, it's unimaginable for the PS4 to have image-quality gains over its peer on the level of the Xbox vs. the PS2.

If you're speculating that the gap is really large enough to compare to Xbox vs. PS2, well... I think we should just agree to disagree and move on, because we're too far apart to come anywhere near a consensus. I'd put the chance of all life on Earth ending tomorrow higher than the gap between the Xbox One and PS4 being that noticeable. In fact, I'd expect the gap to be almost unnoticeable in cross-platform games, especially in screenshots.

Last edited by Willynowei; 05-25-2013 at 11:48 PM..