The Orange Mane -  a Denver Broncos Fan Community  

Old 05-24-2013, 10:13 PM   #151
Atwater His Ass
Ring of Famer
 
Atwater His Ass's Avatar
 

Join Date: Jul 2006
Posts: 5,623

Adopt-a-Bronco:
None
Default

Consoles have never been more adept than PCs at gaming... lol wtf i don't even.

There are reasons some people want to buy these new gen consoles over a comparable PC, but gaming certainly isn't one of them.
Atwater His Ass is offline   Reply With Quote
Old 05-24-2013, 10:50 PM   #152
Doggcow
Rebel Laughs
 
Doggcow's Avatar
 
At Unbelievers

Join Date: Sep 2007
Posts: 8,913

Adopt-a-Bronco:
Von Miller
Default

Quote:
Originally Posted by extralife View Post
I said a modest off-the-shelf PC today would compete with, if not outstrip, a PS4/Xbone. A modest PC today could not make Crysis look like that, much less in 2007. When Crysis was released there was not a single consumer-grade PC in existence that could run it at full settings in 1080p at even 30 FPS. In most ways, Crysis is still the most technically impressive game in existence from a raw power perspective. Only Crysis 3 and Metro: Last Light can really compete. Of course, Crysis is also the least optimized video game ever developed.

As for the "myth" of console power, when the original PlayStation came out consumer computers were barely capable of running 3D video games at all. It was not until 1996 (the PS launched in late 1994) that PCs could reasonably compete. The PS2, and particularly the Xbox and GameCube, all outstripped mainstream PC 3D performance in 2000 and 2001. Prior to 3D, consoles were far more adept at gaming than comparable computers.
My PC runs Crysis 3 just fine.
Doggcow is offline   Reply With Quote
Old 05-24-2013, 11:27 PM   #153
extralife
Ring of Famer
 

Join Date: Mar 2006
Posts: 4,873
Default

I wasn't talking about Crysis 3, for one thing.
extralife is offline   Reply With Quote
Old 05-25-2013, 09:27 AM   #154
Drek
Ring of Famer
 
Drek's Avatar
 

Join Date: Mar 2004
Posts: 12,368
Default

Quote:
Originally Posted by extralife View Post
both systems use an 8-core 64-bit AMD Jaguar CPU with an AMD GPU of the same architecture on the die. they are so close to each other that if these were theoretically PC products launched by AMD or Nvidia or Intel or whatever, they would both be released under the same brand. this information is widely available.
Then you're reading it wrong.

The Xbox One:
8-core AMD CPU (likely Jaguar, unknown clock but likely 1.6GHz)
768-thread, 800MHz GPU with 1.2 TFLOPs performance, 12 compute units
32MB of ESRAM at 102GB/s
8GB of DDR3 at 68GB/s

Of that, 2 CPU cores, 10% of the GPU, and 3GB of the DDR3 are restricted entirely to the OS and not accessible to games.

The PS4:
8-core AMD CPU (likely Jaguar, unknown clock but likely 1.6GHz, though a recent rumor suggests they're pushing for 2.0GHz)
1152-thread, 800MHz GPU with 1.84 TFLOPs performance, 18 compute units
8GB of GDDR5 running at 176GB/s

Of that, the speculation is that 1 CPU core (in conjunction with a separate ARM CPU reserved only for the OS) and between 512MB and 1GB of RAM is OS-reserved, with the rest available for games.

So the GPU is a 50% upgrade. The PS4 offers between 2 and 2.5GB more RAM. That RAM also runs significantly faster than the XB1's, and with no need to manage data flow through the ESRAM.

The core silicon alone for these two devices is VERY different. Early rumors tied to these same numbers months ago pegged them at the Radeon 7700 and Radeon 7800 series respectively. That now seems a bit generous to the Xbox One, but not entirely inaccurate. The PS4 has the biggest hardware gap we've seen since the original Xbox was released nearly two years after the PlayStation 2. Their similarity as x86 CPUs with AMD-designed GPUs only makes this gap more important, because fine-tuning to the more powerful hardware is substantially easier for developers.
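As a rough cross-check, the leaked figures quoted above are internally consistent if you assume the usual GCN-style peak of 2 floating-point ops (one fused multiply-add) per shader per cycle. A quick sketch using only the rumored numbers from this post, nothing official:

```python
# Sanity check of the rumored specs above (leaked figures, nothing official).
# GCN-style peak throughput: shaders * clock * 2 ops/cycle (fused multiply-add).
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000.0

xb1 = peak_tflops(768, 0.8)    # Xbox One: 768 threads @ 800MHz
ps4 = peak_tflops(1152, 0.8)   # PS4: 1152 threads @ 800MHz

print(f"XB1 peak: {xb1:.2f} TFLOPs")                   # ~1.23, matching the 1.2 above
print(f"PS4 peak: {ps4:.2f} TFLOPs")                   # ~1.84, matching the 1.84 above
print(f"PS4 GPU advantage: {ps4 / xb1 - 1:.0%}")       # the "50% upgrade"
print(f"PS4 bandwidth advantage: {176 / 68 - 1:.0%}")  # 176GB/s GDDR5 vs 68GB/s DDR3
```

The shader counts and clocks fall out of the arithmetic exactly, which is why the "50%" figure kept showing up in the rumor mill.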
Drek is offline   Reply With Quote
Old 05-25-2013, 11:00 AM   #155
24champ
Livin' the dream!
 
24champ's Avatar
 
Keep Calm and Chive on

Join Date: Aug 2005
Location: Southern California
Posts: 19,585

Adopt-a-Bronco:
DomCasual
Default

Quote:
Originally Posted by Drek View Post
The PS4:
8-core AMD CPU (likely Jaguar, unknown clock but likely 1.6GHz, though a recent rumor suggests they're pushing for 2.0GHz)
1152-thread, 800MHz GPU with 1.84 TFLOPs performance, 18 compute units
8GB of GDDR5 running at 176GB/s

Of that, the speculation is that 1 CPU core (in conjunction with a separate ARM CPU reserved only for the OS) and between 512MB and 1GB of RAM is OS-reserved, with the rest available for games.

So the GPU is a 50% upgrade. The PS4 offers between 2 and 2.5GB more RAM. That RAM also runs significantly faster than the XB1's, and with no need to manage data flow through the ESRAM.

The core silicon alone for these two devices is VERY different. Early rumors tied to these same numbers months ago pegged them at the Radeon 7700 and Radeon 7800 series respectively. That now seems a bit generous to the Xbox One, but not entirely inaccurate. The PS4 has the biggest hardware gap we've seen since the original Xbox was released nearly two years after the PlayStation 2. Their similarity as x86 CPUs with AMD-designed GPUs only makes this gap more important, because fine-tuning to the more powerful hardware is substantially easier for developers.
Great Scott!

24champ is offline   Reply With Quote
Old 05-25-2013, 11:49 AM   #156
Drek
Ring of Famer
 
Drek's Avatar
 

Join Date: Mar 2004
Posts: 12,368
Default

Quote:
Originally Posted by 24champ View Post
Great Scott!

Honestly, other than the 8GB of GDDR5 it's not even that impressive. The PS3/360 were a 7-year generation (2 years longer than previous ones) and the processing power jump is the smallest we've ever seen. When the Xbox 360 and PS3 came out, they actually spent a brief bit of time as the most powerful real-world gaming devices on the market (as the more powerful silicon in gaming PCs is always handicapped to some degree by the OS and bottlenecks between the CPU, GPU, main memory pool, etc.). Same with the PS2 and original Xbox.

Granted, in all cases these were very narrow windows that quickly evaporated, but when the PS4 comes out it will be comparable to, but not significantly better than, a high-end PC with a single top-tier graphics card. SLI/CrossFire configurations will be significantly better. The XB1 will be on the high end of mid-tier. Both will only fall from there, as they're fixed hardware.

That's not a huge deal, but in my opinion at least it makes every bit of power you can squeeze into the box for launch that much more important. Microsoft intentionally pulled up short on what they're putting in the box to allow for a three-OS system (as opposed to one well-designed OS, because they want to foist Windows 8 and RT on everyone) and to pay for a Kinect in every box. It's not as bad as the Wii U, but it still doesn't go as far as it should (in my opinion).
Drek is offline   Reply With Quote
Old 05-25-2013, 12:01 PM   #157
Houshyamama
I Make The Weather
 
Houshyamama's Avatar
 

Join Date: Sep 2006
Location: San Diego
Posts: 4,476

Adopt-a-Bronco:
Brock Osweiler
Default

Quote:
Originally Posted by Drek View Post
Honestly, other than the 8GB of GDDR5 it's not even that impressive. The PS3/360 were a 7-year generation (2 years longer than previous ones) and the processing power jump is the smallest we've ever seen. When the Xbox 360 and PS3 came out, they actually spent a brief bit of time as the most powerful real-world gaming devices on the market (as the more powerful silicon in gaming PCs is always handicapped to some degree by the OS and bottlenecks between the CPU, GPU, main memory pool, etc.). Same with the PS2 and original Xbox.

Granted, in all cases these were very narrow windows that quickly evaporated, but when the PS4 comes out it will be comparable to, but not significantly better than, a high-end PC with a single top-tier graphics card. SLI/CrossFire configurations will be significantly better. The XB1 will be on the high end of mid-tier. Both will only fall from there, as they're fixed hardware.

That's not a huge deal, but in my opinion at least it makes every bit of power you can squeeze into the box for launch that much more important. Microsoft intentionally pulled up short on what they're putting in the box to allow for a three-OS system (as opposed to one well-designed OS, because they want to foist Windows 8 and RT on everyone) and to pay for a Kinect in every box. It's not as bad as the Wii U, but it still doesn't go as far as it should (in my opinion).
This is the reason I'm considering switching to Sony this go around. There's still a lot of info coming, but if not for the abomination Sony calls a controller, I'd be sold on PS4 already. I'll have to wait and see if the new one is better.
Houshyamama is offline   Reply With Quote
Old 05-25-2013, 01:33 PM   #158
Willynowei
Some dude
 
Willynowei's Avatar
 
Football is a wonderful thing.

Join Date: Oct 2004
Location: NY
Posts: 3,017

Adopt-a-Bronco:
Ryan Clady
Default

Quote:
Originally Posted by extralife View Post
I said a modest off-the-shelf PC today would compete with, if not outstrip, a PS4/Xbone. A modest PC today could not make Crysis look like that, much less in 2007. When Crysis was released there was not a single consumer-grade PC in existence that could run it at full settings in 1080p at even 30 FPS. In most ways, Crysis is still the most technically impressive game in existence from a raw power perspective. Only Crysis 3 and Metro: Last Light can really compete. Of course, Crysis is also the least optimized video game ever developed.

As for the "myth" of console power, when the original PlayStation came out consumer computers were barely capable of running 3D video games at all. It was not until 1996 (the PS launched in late 1994) that PCs could reasonably compete. The PS2, and particularly the Xbox and GameCube, all outstripped mainstream PC 3D performance in 2000 and 2001. Prior to 3D, consoles were far more adept at gaming than comparable computers.
You use the words "mainstream" and "modestly priced" interchangeably. Sorry, but that's just not the case. A mainstream personal computer can barely run The Sims.

As for a "modestly priced" gaming computer: what is the price? Is it assembled yourself, or purchased at retail?

Here are some reference points:

The 7-series graphics card comparably more powerful than the PS3's RSX was available for roughly ~$200 at the system's launch.

A very popular PC game that sold very well around the time the original Xbox was released was Battlefield 1942. The best visuals on the Xbox at the time belonged to the launch title Halo: Combat Evolved.

The visuals of the two games are highly comparable, with the PC's BF1942 adding large levels, better draw distance, and 64-player vehicle-based battles.

What is your definition of "consumer grade"? And Crysis does not have to be run at "full settings" to outstrip what was available on the consoles at the time. Not to mention, consoles don't run games at 1080p, contrary to what the makers might have you believe.

Last edited by Willynowei; 05-25-2013 at 01:54 PM..
Willynowei is offline   Reply With Quote
Old 05-25-2013, 01:49 PM   #159
Willynowei
Some dude
 
Willynowei's Avatar
 
Football is a wonderful thing.

Join Date: Oct 2004
Location: NY
Posts: 3,017

Adopt-a-Bronco:
Ryan Clady
Default

You know what, I'm going to exit the discussion; there's just too much misinformation here. It's no one's fault, it just happens, but I'm tired of trying to correct assumptions people have about things they don't fully understand.

There is a lot of spec comparison by people who are erroneously comparing paper numbers without knowing the real significance of each figure. There is plenty of debate in developer and engineering circles about the specifics of the internal hardware and how the two systems compare. Whether one is more powerful than the other, or similar, or different, whatever the measurement, no one is sure of the exact gap right now in real-world performance.

There are so many factors, and they are so complex. For example, there's so much to the Xbox's ESRAM that you could easily write a whole book on the complex tradeoffs it entails for the system, from an APU workload-distribution efficiency boost to a latency improvement over a traditional memory layout. More teraflops can be a rough estimator, but there are other factors that people just don't know about yet. Small differences in design and process can make big differences, differences that no one can pinpoint yet.

There's very little information as to the voltage and efficiency of each SoC, and no one knows how high each chip can clock while providing adequate factory yields (even if the design allows a 1GHz cap, one system could fail to reach that cap due to heat issues, etc.). We also don't know if one company has better manufacturing competencies, which only show up in binning, that can lead to higher yields, which can in turn lead to the option of clocking certain parts higher. It's all speculation. The only thing anyone can say is that it looks like the PS4 has more raw power right now, but very few people have a good idea of how large that difference is, or how it will translate into games. The same is true of the effects of their design changes; it's not as simple as the branding and the fact that they use the same "cores." There's too much oversimplification going on.
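The point about unknown final clocks is easy to quantify. A small sketch, where the shader counts are the leaked figures from earlier in the thread but the clock values are purely hypothetical, showing how much a late clock decision could move the paper gap:

```python
# How unknown final clocks could shift the paper gap between the two SoCs.
# Shader counts are the leaked figures; the clocks swept here are hypothetical.
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000.0  # 2 ops/cycle (fused multiply-add)

ps4 = peak_tflops(1152, 0.8)  # assume the PS4 stays at the rumored 800MHz
for xb1_clock_ghz in (0.8, 0.9, 1.0):
    gap = ps4 / peak_tflops(768, xb1_clock_ghz)
    print(f"XB1 @ {xb1_clock_ghz:.1f}GHz -> PS4 paper advantage {gap:.2f}x")
# 0.8GHz -> 1.50x, 0.9GHz -> 1.33x, 1.0GHz -> 1.20x
```

Even a modest upclock on one side visibly narrows the headline ratio, which is the sense in which the final numbers were still up in the air.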

There's no point in getting deeper and deeper into the discussion, when my general point is: we don't know. Let's wait and see. These companies have a lot of money invested in trying to make successful entertainment devices, and powerful internals are a big part of that for both companies, even if it's not their only concern.

Last edited by Willynowei; 05-25-2013 at 02:06 PM..
Willynowei is offline   Reply With Quote
Old 05-25-2013, 02:06 PM   #160
Drek
Ring of Famer
 
Drek's Avatar
 

Join Date: Mar 2004
Posts: 12,368
Default

Quote:
Originally Posted by Willynowei View Post
You know what, I'm going to exit the discussion; there's just too much misinformation here. It's no one's fault, it just happens, but I'm tired of trying to correct assumptions people have about things they don't fully understand.

There is a lot of spec comparison by people who are erroneously comparing paper numbers without knowing the real significance of each figure. There is plenty of debate in developer and engineering circles about the specifics of the internal hardware and how the two systems compare. Whether one is more powerful than the other, or similar, or different, whatever the measurement, no one is sure of the exact gap right now in real-world performance.

There are so many factors, and they are so complex. For example, there's so much to the Xbox's ESRAM that you could easily write a whole book on the complex tradeoffs it entails for the system, from an APU workload-distribution efficiency boost to a latency improvement in nanoseconds over a traditional memory layout. No one knows the voltage and heat of the memory and chip design; therefore, no one knows how high each chip can clock while providing adequate factory yields (even if the design allows a 1GHz cap, one system could fail to reach that cap due to heat issues, etc.).

There's no point in getting deeper and deeper into the discussion, when my general point is: we don't know. Let's wait and see. These companies have a lot of money invested in trying to make successful entertainment devices, and powerful internals are a big part of that for both companies, even if it's not their only concern.
The APUs are based on Jaguar and Radeon GCN silicon. There aren't a lot of unknowns for the hardware engineers about what this well-proven silicon can handle.

Are MS and Sony both designing toward an energy and heat budget? Sure, but enough leaks have come out that we have a pretty clear answer as to what each is capable of.

Also, the latency "benefit" from ESRAM is measured only in nanoseconds and is largely irrelevant for anything other than changing code midstream (like you do on a PC OS when you close a program and open another). It has no appreciable benefit to single-application coding.

It isn't erroneous at all to compare paper numbers of two x86-based APUs built by the same company from the same family of silicon. This isn't apples to oranges. This is apples to apples, where both fell from the same tree but different branches.
Drek is offline   Reply With Quote
Old 05-25-2013, 02:12 PM   #161
Willynowei
Some dude
 
Willynowei's Avatar
 
Football is a wonderful thing.

Join Date: Oct 2004
Location: NY
Posts: 3,017

Adopt-a-Bronco:
Ryan Clady
Default

Quote:
Originally Posted by Drek View Post
The APUs are based on Jaguar and Radeon GCN silicon. There aren't a lot of unknowns for the hardware engineers about what this well-proven silicon can handle.

Are MS and Sony both designing toward an energy and heat budget? Sure, but enough leaks have come out that we have a pretty clear answer as to what each is capable of.

Also, the latency "benefit" from ESRAM is measured only in nanoseconds and is largely irrelevant for anything other than changing code midstream (like you do on a PC OS when you close a program and open another). It has no appreciable benefit to single-application coding.

It isn't erroneous at all to compare paper numbers of two x86-based APUs built by the same company from the same family of silicon. This isn't apples to oranges. This is apples to apples, where both fell from the same tree but different branches.
No. Single-application coding? Do you understand how graphics processing works?

1.) We do not know what they are planning to do with the ESRAM.
2.) On-die RAM has in past instances resulted in efficiency gains of over 20%.
3.) As I've gone over and over and over again, the two companies have made enough customizations that you can't compare them so easily.

I have an old MacBook that runs on x86 architecture. It's just not that simple.

Last edited by Willynowei; 05-25-2013 at 02:16 PM..
Willynowei is offline   Reply With Quote
Old 05-25-2013, 02:48 PM   #162
Agamemnon
Guest
 

Posts: n/a
Default

All the spec comparisons aside, I have a distinct feeling the two systems are going to perform very similarly, just as the 360 and the PS3 do now. I also feel fairly confident that both will outperform any PC you can find on the market under $1200. They are also going to age better than said PC, because designers will be designing specifically for them, and therefore the software will be massively more optimized. That said, high-end PCs custom-built by tech-heads will easily remain the best gaming platforms graphics-wise, but the fact is that most people do not have the tech savvy to build their own high-end gaming rigs, and pre-built gaming rigs are much more expensive.
  Reply With Quote
Old 05-25-2013, 03:42 PM   #163
extralife
Ring of Famer
 

Join Date: Mar 2006
Posts: 4,873
Default

Quote:
Originally Posted by Agamemnon View Post
I also feel fairly confident that both will outperform any PC you can find on the market under $1200.
no chance, as long as by "on the market" you're allowing for a self-build.
extralife is offline   Reply With Quote
Old 05-25-2013, 04:20 PM   #164
El Guapo
aka mav_7. who?
 
El Guapo's Avatar
 

Join Date: Apr 2004
Location: VA (heart is in TX)
Posts: 3,935
Default

I wish people would get off their high horse when it comes to consoles not being as great as home-built rigs. Of course that's the case! I used to build my own gaming PCs, but the last one I built was four years ago and I don't plan on doing it again anytime soon.

I've always owned consoles and I always will. This next-gen will not be any different and I'll probably end up owning both for various reasons, but I'll purchase the Xbox first cause... 'merica.

El Guapo is offline   Reply With Quote
Old 05-25-2013, 04:42 PM   #165
2KBack
Rumblin' Bumblin'
 
2KBack's Avatar
 
Cake is delicious

Join Date: Aug 2004
Location: Wash DC
Posts: 7,758
Default

Quote:
Originally Posted by El Guapo View Post
I wish people would get off their high horse when it comes to consoles not being as great as home-built rigs. Of course that's the case! I used to build my own gaming PCs, but the last one I built was four years ago and I don't plan on doing it again anytime soon.

I've always owned consoles and I always will. This next-gen will not be any different and I'll probably end up owning both for various reasons, but I'll purchase the Xbox first cause... 'merica.

I agree.
2KBack is offline   Reply With Quote
Old 05-25-2013, 06:13 PM   #166
lolcopter
Guest
 

Posts: n/a
Default

only posting this here because this comment is video game relevant

i can't believe i didn't know you could just simply download and install updated rosters for madden 12? haha, i just loaded the latest version, dysert and ball on my broncos team already. sweet.
  Reply With Quote
Old 05-25-2013, 06:32 PM   #167
BroncoBeavis
Guest
 

Posts: n/a
Default

Quote:
Originally Posted by El Guapo View Post
I wish people would get off their high horse when it comes to consoles not being as great as home-built rigs. Of course that's the case! I used to build my own gaming PCs, but the last one I built was four years ago and I don't plan on doing it again anytime soon.

I've always owned consoles and I always will. This next-gen will not be any different and I'll probably end up owning both for various reasons, but I'll purchase the Xbox first cause... 'merica.

I don't really care; I'll own one of the next-gen consoles. The performance specs of either are really pretty disappointing, though. And remember, we're 5-6 months out on these as well. By the time these machines are on the street, a $150 card will outperform either one. That just wasn't true of the 360 or PS3 when they came out.
  Reply With Quote
Old 05-25-2013, 08:10 PM   #168
Drek
Ring of Famer
 
Drek's Avatar
 

Join Date: Mar 2004
Posts: 12,368
Default

Quote:
Originally Posted by Willynowei View Post
No. Single-application coding? Do you understand how graphics processing works?

1.) We do not know what they are planning to do with the ESRAM.
2.) On-die RAM has in past instances resulted in efficiency gains of over 20%.
3.) As I've gone over and over and over again, the two companies have made enough customizations that you can't compare them so easily.

I have an old MacBook that runs on x86 architecture. It's just not that simple.
Given that I've written operating systems in assembly for x86 hardware and a few games in my free time, and I actively follow the industry with multiple acquaintances working in it professionally, I'd bet that I probably do.

They will use the ESRAM just like the embedded RAM on the 360: as an expanded cache to make up for slower main RAM.

Embedded RAM doesn't make up for a relative lack of transistors, and no sane hardware engineer would claim that an embedded cache with slower main RAM could ever be "customized" into outperforming far faster GDDR5.

Hell, Sony's lead designer (Mark Cerny) specifically mentioned that they considered an embedded solution and found GDDR5 to be far superior. Sony's smaller OS footprint let them bet on 512MB GDDR5 chips, settling for 256MB chips and only 4GB of system RAM if they were wrong. MS committed to a massive OS footprint and therefore needed 8GB of RAM; they couldn't gamble.

Most developers were in favor of Sony's choice even when it looked like they would have only 4 gigs. Having 8 instead only magnifies that difference.

Also, video games are a single application. Any competently designed game has no issues with RAM latency, as the multiple operations needed are all actively occurring at a given instant. Latency matters when you need to suddenly issue a large RAM dump and refill due to completely changing applications (like going from web browsing to a spreadsheet). Many games this generation stream from the disc, and disc read/write operations have latency that is orders of magnitude greater than any RAM latency difference.
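The "orders of magnitude" claim is easy to put in perspective. A sketch with ballpark latency figures (illustrative textbook values, not measurements from either console):

```python
# Ballpark access latencies (illustrative values, not console measurements),
# showing why disc streaming dwarfs any RAM-vs-ESRAM latency difference.
latency_ns = {
    "on-die SRAM/ESRAM": 10,
    "external DRAM": 100,
    "hard disk seek": 10_000_000,      # ~10ms
    "optical disc seek": 100_000_000,  # ~100ms
}
dram = latency_ns["external DRAM"]
for name, ns in latency_ns.items():
    print(f"{name:18s} ~{ns:>11,} ns ({ns / dram:g}x DRAM)")
```

On these ballpark numbers a disc seek costs around a million times more than a DRAM access, so any nanosecond-scale difference between memory pools vanishes next to the streaming cost.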
Drek is offline   Reply With Quote
Old 05-25-2013, 09:40 PM   #169
Agamemnon
Guest
 

Posts: n/a
Default

Quote:
Originally Posted by extralife View Post
no chance, as long as by "on the market" you're allowing for a self-build.
No, that wouldn't count. Building a computer yourself drastically reduces its cost.
  Reply With Quote
Old 05-25-2013, 10:17 PM   #170
extralife
Ring of Famer
 

Join Date: Mar 2006
Posts: 4,873
Default

Quote:
Originally Posted by Agamemnon View Post
No that wouldn't count. Building a computer yourself drastically reduces its cost.
that's kind of the point. not many people interested in playing "AAA" or high end or whatever buzzword you want to use games on a computer are likely to buy a premade one.
extralife is offline   Reply With Quote
Old 05-25-2013, 11:08 PM   #171
Willynowei
Some dude
 
Willynowei's Avatar
 
Football is a wonderful thing.

Join Date: Oct 2004
Location: NY
Posts: 3,017

Adopt-a-Bronco:
Ryan Clady
Default

Quote:
Originally Posted by Drek View Post
Given that I've written operating systems in assembly for x86 hardware and a few games in my free time, and I actively follow the industry with multiple acquaintances working in it professionally, I'd bet that I probably do.

They will use the ESRAM just like the embedded RAM on the 360: as an expanded cache to make up for slower main RAM.

Embedded RAM doesn't make up for a relative lack of transistors, and no sane hardware engineer would claim that an embedded cache with slower main RAM could ever be "customized" into outperforming far faster GDDR5.

Hell, Sony's lead designer (Mark Cerny) specifically mentioned that they considered an embedded solution and found GDDR5 to be far superior. Sony's smaller OS footprint let them bet on 512MB GDDR5 chips, settling for 256MB chips and only 4GB of system RAM if they were wrong. MS committed to a massive OS footprint and therefore needed 8GB of RAM; they couldn't gamble.

Most developers were in favor of Sony's choice even when it looked like they would have only 4 gigs. Having 8 instead only magnifies that difference.

Also, video games are a single application. Any competently designed game has no issues with RAM latency, as the multiple operations needed are all actively occurring at a given instant. Latency matters when you need to suddenly issue a large RAM dump and refill due to completely changing applications (like going from web browsing to a spreadsheet). Many games this generation stream from the disc, and disc read/write operations have latency that is orders of magnitude greater than any RAM latency difference.
So you've programmed and have friends who work in the industry. Let's avoid making this a dick-swinging contest over who's closer to the industry. I clearly know what I'm talking about, and everyone who's getting the best work in tech has NDAs way too far up their ass to be able to freely talk about who they work for and what they work on, so let's keep that out of this discussion.

You're flat-out wrong about the way that on-die memory affects the graphics operations of an APU. Your quotes above completely miss the point; you're talking about completely unrelated things.

Of course the latency going to disc or HDD is much longer; that is irrelevant to the point, because the scenarios where on-die memory provides savings are ones where all the relevant data is already in RAM and the RAM is feeding the APU. We aren't talking about RAM dumps; we are talking about time the APU is idle, precious time that could be spent on cycles while it instead waits for packets from the memory module.

At certain points, the processing unit has to request information from the RAM module. The travel time, measured in nanoseconds, for the processing unit to acquire information from RAM can be shaved. The nanoseconds add up, and reducing that idle time makes the processor more efficient. This is all in addition to increasing the effective bandwidth.
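That "nanoseconds add up" argument can be made concrete: multiply the access latency by the clock rate to get the cycles a processor sits idle per uncached access. A sketch (the latency figures are illustrative; the 1.6GHz clock is the rumored CPU clock from earlier in the thread):

```python
# Cycles stalled per memory access = latency (ns) * clock (GHz = cycles per ns).
# Latency figures are illustrative; 1.6GHz is the rumored CPU clock.
clock_ghz = 1.6
for name, latency_ns in [("on-die ESRAM", 15), ("external DRAM", 100)]:
    stalled = latency_ns * clock_ghz
    print(f"{name}: ~{stalled:.0f} cycles idle per uncached access")
# on-die ESRAM: ~24 cycles, external DRAM: ~160 cycles
```

Whether those saved cycles matter in practice depends on how well the workload hides latency, which is exactly the point of contention in this exchange.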

Sony considered on-die memory because it has its benefits. Just like Nvidia is providing similar solution sets for their next GPU series: http://www.anandtech.com/show/6846/n...or-beyond-2014.

Also, I never said anything about the Xbox One outperforming the PS4. I said there are a lot of unknowns, that the gap could very well be smaller than the compute power per cycle would indicate, and that it's not as simple as you understand it.

You went as far as to compare the gap between these systems to the PS2-to-Xbox gap. That just shows how much you are reaching in your statements; you're jumping too far, too quickly. There are still unknown parts to these systems, and they could very well end up much closer together in performance than the 33% difference in per-clock graphics processing would indicate. Even in a worst-case scenario, it is unimaginable for the PS4 to have image quality gains over its peer on the level of the Xbox vs. the PS2.

If you are speculating that the gap is really large enough to compare to the Xbox vs. PS2 generation, well... I think we should just agree to disagree and move on, because we are too far apart to come anywhere near a consensus. I'd think the chance of all life on earth ending tomorrow is higher than the gap between the Xbox One and PS4 being that noticeable. In fact, I would expect the gap to be almost unnoticeable in cross-platform games, especially in screenshots.

Last edited by Willynowei; 05-25-2013 at 11:48 PM..
Willynowei is offline   Reply With Quote
Old 05-25-2013, 11:09 PM   #172
24champ
Livin' the dream!
 
24champ's Avatar
 
Keep Calm and Chive on

Join Date: Aug 2005
Location: Southern California
Posts: 19,585

Adopt-a-Bronco:
DomCasual
Default



Classic.
24champ is offline   Reply With Quote
Old 05-25-2013, 11:51 PM   #173
Houshyamama
I Make The Weather
 
Houshyamama's Avatar
 

Join Date: Sep 2006
Location: San Diego
Posts: 4,476

Adopt-a-Bronco:
Brock Osweiler
Default



I love this ****.
Houshyamama is offline   Reply With Quote
Old 05-26-2013, 10:17 AM   #174
24champ
Livin' the dream!
 
24champ's Avatar
 
Keep Calm and Chive on

Join Date: Aug 2005
Location: Southern California
Posts: 19,585

Adopt-a-Bronco:
DomCasual
Default

24champ is offline   Reply With Quote
Old 05-26-2013, 10:20 AM   #175
Requiem
~~~
 
Requiem's Avatar
 
~ ~ ~

Join Date: Feb 2006
Location: Earth Division
Posts: 22,960

Adopt-a-Bronco:
Princes of Tara
Default

Willy and Drek got their geek wangs out.
Requiem is offline   Reply With Quote