|
On July 25 2013 15:33 Blisse wrote:

On July 25 2013 00:42 Acrofales wrote:
From the first article: "That processing power enables things like instantaneous Kinect, where voice commands immediately activate tasks on the Xbox, from waking up the machine instantly to changing the channel on your TV. That kind of processing exists alongside other things going on at the same time. You can, for instance, watch TV and then receive a Skype call without turning off the TV show. To do this type of thing, Microsoft had to design the box in a way that it could access memory and caches of data much faster than in past game consoles."

First off: I agree that the whole concept is awesome, and if they manage to pull it off, it will be amazing. The only way to even have a hope of pulling it off is by designing the hardware in such a way that it can. However, this little bit is really misleading. There is absolutely no way voice recognition software can run instantaneously if it has to send the audio to a datacenter, which processes it and sends the result back to your XB1. (See the latency sketch after this post.)

I have no clue what they're on about with their Skype call stuff, because my PC has been able to multitask this kind of thing seamlessly for what seems like forever, which indicates that what they are selling as fancy new tech is basically the same architecture that has always existed in PCs. The fact that cellphones and tablets cannot do this is a design CHOICE, not a hardware limitation: it has to do with battery power and using the available hardware smartly. There's no hardware limitation, just as there isn't when your PC is playing a movie or a stream and you get a Skype call. Hell, even the sound is processed together, making a hell of a mess of things.

So that's just one paragraph. The rest of the text reads a lot like the way I write my project reports about things that I haven't actually built yet, but want to build. It is all awesome, until in reality it turns out that if you live in Hickville, Oklahoma (not to mention anywhere in Brazil, which is also a tier 1 country on their fancy map), the internet isn't fast enough to do any of that. The resolution of the Kinect is great, but realtime expression recognition software is both terrible and extremely computationally intensive. Rendering graphics in the cloud and syncing that up with locally computed graphics is going to be an absolute nightmare for engine developers (not to mention impossible for any multiplatform games). And a lot more of such things make it seem like this is all really awesome, but completely unrealistic, when you read between the lines of the technobabble.

On July 25 2013 08:11 Blisse wrote:
Well really, my point is, these are people who are actually working on it versus... your opinion/knowledge about it being unrealistic. Am I supposed to listen to you telling me it's not going to work, when clearly every single one of them has gone, "yeah, this cloud business is going to work at least somewhat well"? I get that you guys are dubious of it working, but I really don't get why I would listen to your opinion about it not being doable when you're not actual engineers working on that technology, while they are. I'm an engineer too, and it's not like I don't think it's seriously hard to do, but I don't get why the assumption is that they're lying to us.

On July 25 2013 11:36 Brett wrote:
Because no (gaming) company has ever promised the world and delivered far, far less before. A healthy amount of skepticism is warranted here. If they can deliver, wonderful. Until then, who gives a fuck? It's all unproven and the onus is on them to show results rather than promise them with the usual hyperbole.

E: To be clear, if it actually works, it's great for the system and its longevity. I hope it does. But I expect it won't.

Yeah, but the difference is that the tone in this thread wasn't indifferent to the results until the results came out; it was a healthy dose of "nope, this cloud shit ain't true, fuck you Microsoft for lying to me" instead of the more pleasant "I really don't see how this is possible, but let's see if it's true". My point isn't that you're not correct in being skeptical.
It's that the tone was never skepticism in the first place. I just went to a random page:

On June 11 2013 04:29 Womwomwom wrote:
They are lying that it will improve graphics directly. They aren't lying that things like AI can improve. Also: if you want to source technology, pick your sources better. Eurogamer's Digital Foundry actually does a lot of good work on technical analysis of hardware and console games. We're talking about esoteric bullshit. Am I going to listen to Digital Spy? No, I'm not, because they're literally a wank rag that deals with entertainment news.

From the article: "The cloud can tackle tasks in games like physics, artificial intelligence, and even some rendering. The tasks that require low latency, with split second interaction between one chip or one device and another, are those that the box — not the cloud — still needs to handle."
This is what most of us have been saying the entire time. They don't specify what kind of rendering, which is one of the main things I question. Physics... maybe on some level. Maybe they can use it to calculate predicted things in combination with the AI, but that will be wonky if you hit a lag spike for some reason and an object that should behave one way suddenly doesn't behave properly. AI, definitely, but AI can be handled in different ways too, and Sony has a pretty solid infrastructure to use the same method; there is nothing preventing them from implementing it later on if MS comes up with any neat uses.
The only example I have seen of the cloud for AI is Forza's driving AI profile thing, which can actually be done on the PS4 as well (or on any system, really) by simply having an AI file saved on the console that can be shared with friends. The fact that no other game (that I am aware of) has really mentioned how it benefits from the "processing power" of the cloud is why alarms are going off for me.
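A back-of-the-envelope sketch of the round-trip arithmetic behind the latency point above. Every figure here (audio size, uplink speed, network round trip, server processing time) is an assumption picked for illustration, not a measurement:

```
# Rough numbers for the cloud voice recognition debate.
# Every figure is an assumption chosen for illustration.

AUDIO_CHUNK_KB = 32        # ~1 second of compressed speech (assumed)
UPLINK_KBPS = 1000         # 1 Mbit/s residential uplink (assumed)
NETWORK_RTT_MS = 80        # round trip to a regional datacenter (assumed)
SERVER_PROCESS_MS = 150    # server-side recognition time (assumed)

upload_ms = AUDIO_CHUNK_KB * 8 / UPLINK_KBPS * 1000  # kilobits / (kbit/s) -> ms
total_ms = upload_ms + NETWORK_RTT_MS + SERVER_PROCESS_MS

print(f"upload {upload_ms:.0f} ms + network {NETWORK_RTT_MS} ms + "
      f"recognition {SERVER_PROCESS_MS} ms = ~{total_ms:.0f} ms")
# ~486 ms under these assumptions: fine for "Xbox, watch TV", but hardly
# "instantaneous", which is presumably why simple commands get matched locally.
```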
|
The tone in this thread wasn't really dubious; it was much more negative. The second I post something about the features, the response isn't constructive, cautious criticism so much as outright telling me the sources are lying and that no one cares.
The article says that they can do some rendering of distant trees if you divide the task properly, so there's something somewhat concrete on the graphics-improvement front. (There's a sketch of that task split after this post.)
From that article I linked, he said that Microsoft wanted to push the always-on Internet thing in order to build for the next 5-10 years (since that's the direction society is moving in), so that developers can sooner build with the knowledge that they have a working Internet connection at all times, similar to how forcing the Kinect on would also let developers build with the knowledge that they have a Kinect at all times.
I guess it still feels like all their work went into a future-proofing strategy more than anything, but they've presented actual use cases that they've considered. And yes, it's their job to make it happen, now or 5 years from now, but the problem I have is still this really hostile attitude in the thread.
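A minimal sketch of the task split behind the "distant trees" example: work that must meet a per-frame deadline stays on the box, while latency-tolerant work could be shipped off. The task names, deadlines, and round-trip figure are all hypothetical:

```
# Work that must meet a per-frame deadline stays on the console;
# latency-tolerant work is a candidate for the cloud.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    deadline_ms: float  # how stale a result can be before the player notices

CLOUD_ROUND_TRIP_MS = 100.0  # assumed typical round trip; varies wildly by region

def must_run_locally(task: Task) -> bool:
    # If the cloud can't return an answer inside the deadline,
    # the console has to compute it itself.
    return task.deadline_ms < CLOUD_ROUND_TRIP_MS

tasks = [
    Task("controller input -> camera response", 16),  # one frame at 60 fps
    Task("on-screen ragdoll physics", 33),            # one frame at 30 fps
    Task("distant scenery (the 'trees')", 2000),      # can lag by seconds
    Task("NPC long-term planning", 5000),
]

for t in tasks:
    where = "console" if must_run_locally(t) else "cloud (maybe)"
    print(f"{t.name:38s} -> {where}")
```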
|
On July 25 2013 15:11 WolfintheSheep wrote:

On July 23 2013 22:37 paralleluniverse wrote:
So the Steam Summer Sale has just come and gone. What did we learn? That unreleased AAA games like Rome 2 and Saints Row 4 didn't get discounted. What's this got to do with the Xbox One? Well, if you remember back here and here, people were expecting the Xbox One, with its previous DRM policy, to announce discounts on its AAA games more than 6 months in advance. They were mad and angry that Microsoft dared to announce a console with DRM, but developers didn't announce cheaper games more than 6 months in advance, despite the fact that this doesn't even happen with Steam's comparatively draconian level of DRM. Delusional thinking.

Uhhh....

Edit: Oh, just realized it's you again. I have no idea what motivates you to continuously return to this thread and blatantly lie.

I did not lie. These announced but unreleased games were not discounted in the Steam Summer Sale. Nor are they discounted on the Best Buy link you've given.
Yet somehow, people like you expect games to be discounted half a year in advance for the Xbox One, despite the fact that there is no precedent for such a crazy thing, as evidenced by the non-discounts during the recently ended Steam Summer Sale. We've been over this.
|
On July 25 2013 16:47 Blisse wrote:
The tone in this thread wasn't really dubious; it was much more negative. The second I post something about the features, the response isn't constructive, cautious criticism so much as outright telling me the sources are lying and that no one cares.

The article says that they can do some rendering of distant trees if you divide the task properly, so there's something somewhat concrete on the graphics-improvement front.

From that article I linked, he said that Microsoft wanted to push the always-on Internet thing in order to build for the next 5-10 years (since that's the direction society is moving in), so that developers can sooner build with the knowledge that they have a working Internet connection at all times, similar to how forcing the Kinect on would also let developers build with the knowledge that they have a Kinect at all times.

I guess it still feels like all their work went into a future-proofing strategy more than anything, but they've presented actual use cases that they've considered. And yes, it's their job to make it happen, now or 5 years from now, but the problem I have is still this really hostile attitude in the thread.

I'm not negative. I'm just not going to believe it until they actually demo some of that stuff. The Forza AI thing is really cool and I love that kind of use, but they're not using cloud computing for that so much as the social aspect of the cloud: they are crowdsourcing the AI. This is a brilliant idea, but it has nothing to do with computing.
I am very much sitting on the fence, waiting for MS to dazzle me. But their words are empty promises until they can actually show that they have gotten it working. I have lots of ideas that are awesome... but until I have an actual tech demo showing it works like I say it will, it is pointless technobabble.
And the further you claim you are ahead of the curve, the more likely it is that you cannot deliver on your promises. So yes, I am very skeptical about MS's ability to deliver on these promises.
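To make the crowdsourcing point concrete: a Drivatar-style profile can be nothing more than data that any console saves and swaps, with no cloud compute involved. The file name and schema below are invented for illustration:

```
# A crowdsourced driving-AI profile is just data, so any console could
# save one and share it with friends. Schema and values are made up.

import json

profile = {
    "player": "Acrofales",
    "braking_point_bias": 0.12,     # brakes later than the stock AI (assumed metric)
    "cornering_aggression": 0.70,
    "overtake_risk": 0.40,
}

# "Uploading to the cloud" here is nothing more than writing a small file...
with open("drivatar_profile.json", "w") as f:
    json.dump(profile, f)

# ...and "racing a friend's AI" is reading it back in.
with open("drivatar_profile.json") as f:
    friend_ai = json.load(f)
print(f"Racing against {friend_ai['player']}'s driving style")
```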
|
On July 25 2013 21:59 paralleluniverse wrote:

On July 25 2013 15:11 WolfintheSheep wrote:

On July 23 2013 22:37 paralleluniverse wrote:
So the Steam Summer Sale has just come and gone. What did we learn? That unreleased AAA games like Rome 2 and Saints Row 4 didn't get discounted. What's this got to do with the Xbox One? Well, if you remember back here and here, people were expecting the Xbox One, with its previous DRM policy, to announce discounts on its AAA games more than 6 months in advance. They were mad and angry that Microsoft dared to announce a console with DRM, but developers didn't announce cheaper games more than 6 months in advance, despite the fact that this doesn't even happen with Steam's comparatively draconian level of DRM. Delusional thinking.

Uhhh....

Edit: Oh, just realized it's you again. I have no idea what motivates you to continuously return to this thread and blatantly lie.

I did not lie. These announced but unreleased games were not discounted in the Steam Summer Sale. Nor are they discounted on the Best Buy link you've given. Yet somehow, people like you expect games to be discounted half a year in advance for the Xbox One, despite the fact that there is no precedent for such a crazy thing, as evidenced by the non-discounts during the recently ended Steam Summer Sale. We've been over this.

Are you really that dense? The PC preorder is $10 cheaper than the console versions. And since all versions of SRTT used Steamworks, it's safe to assume this one does too.

Let me repeat that: Xbox 360: $60. PS3: $60. Steam: $50.
|
Xbox One: £50, or about $77.
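The conversion behind that figure, assuming a GBP/USD rate of about 1.54, roughly where the pound sat in mid-2013:

```
# £50 -> USD at an assumed mid-2013 rate of about $1.54 per pound.
GBP_TO_USD = 1.54  # assumed rate
print(f"£50 is about ${50 * GBP_TO_USD:.0f}")  # -> about $77
```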
|
Currency rates mean nothing to publishers unless you try to buy all your games out of country and have them shipped to you, but I doubt that works out unless done in bulk.
|
Microsoft has announced extra controllers for the Xbox One will come at $59.99 each, according to new listings on the Microsoft Store website. A controller that comes with a play-and-charge kit will cost $15 more.
Microsoft has slowly rolled out information about the new controller and its rumbling impulse triggers, new d-pad, and redesigned analog sticks. As is the custom for most gaming consoles, only one controller will be included in the box with the $499 Xbox One (along with two AA batteries and a USB port for external power).
For comparison, the Xbox 360 wireless controllers were priced at $49.99 back in 2005, while the wired controllers cost $39.99. The 360 play-and-charge kits were $20 standalone packages, while a standalone Xbox One play-and-charge kit will cost $24.99.
Microsoft also announced the pricing of Xbox One headsets, which will not be included in the box. They will debut at $24.99 (some models of the Xbox 360 are packaged with a headset).
Source
|
Xbox One getting a speed boost from 800 MHz to 853 MHz

"This is the time when we've gone from the theory of how the hardware works - what we think the yield is going to look like, what is the thermal envelope, how do things come together - to actually having it in our hands," Whitten explained. "That's the time when you really start tweaking the knobs. Either your theory was dead on or you were too conservative or you were a little too aggressive. And an example of that is we've tweaked up the clock speed on our GPU, from 800 MHz to 853 MHz. Just an example of how you really start landing the program as you get closer to launch."
http://www.ign.com/articles/2013/08/02/microsoft-confirms-xbox-one-gpu-boost
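For what it's worth, here is what that clock bump works out to on paper, using the widely reported 768 shader cores (12 CUs x 64 lanes); peak FLOPS is cores x 2 ops (fused multiply-add) x clock:

```
# Theoretical effect of the 800 -> 853 MHz GPU clock bump,
# assuming the widely reported 768 shader cores.

SHADER_CORES = 768

for clock_mhz in (800, 853):
    tflops = SHADER_CORES * 2 * clock_mhz * 1e6 / 1e12
    print(f"{clock_mhz} MHz -> {tflops:.2f} TFLOPS")
# 800 MHz -> 1.23 TFLOPS, 853 MHz -> 1.31 TFLOPS: a ~6.6% bump, so more a
# yield/thermal-headroom story than a visible performance one.
```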
|
So after all this time they still aren't finished with actual development.
|
On August 03 2013 10:12 zoLo wrote:
Xbox One getting a speed boost from 800 MHz to 853 MHz
"This is the time when we've gone from the theory of how the hardware works - what we think the yield is going to look like, what is the thermal envelope, how do things come together - to actually having it in our hands," Whitten explained. "That's the time when you really start tweaking the knobs. Either your theory was dead on or you were too conservative or you were a little too aggressive. And an example of that is we've tweaked up the clock speed on our GPU, from 800 MHz to 853 MHz. Just an example of how you really start landing the program as you get closer to launch."
http://www.ign.com/articles/2013/08/02/microsoft-confirms-xbox-one-gpu-boost

Hope they don't run into issues similar to the RRoD because of this.
|
On August 03 2013 12:55 Jibba wrote:

On August 03 2013 10:12 zoLo wrote:
Xbox One getting a speed boost from 800 MHz to 853 MHz
"This is the time when we've gone from the theory of how the hardware works - what we think the yield is going to look like, what is the thermal envelope, how do things come together - to actually having it in our hands," Whitten explained. "That's the time when you really start tweaking the knobs. Either your theory was dead on or you were too conservative or you were a little too aggressive. And an example of that is we've tweaked up the clock speed on our GPU, from 800 MHz to 853 MHz. Just an example of how you really start landing the program as you get closer to launch."
http://www.ign.com/articles/2013/08/02/microsoft-confirms-xbox-one-gpu-boost

Hope they don't run into issues similar to the RRoD because of this.

That would be kinda funny. It will definitely reduce the lifespan of the console; the question is just how much.
|
On July 25 2013 23:13 Acrofales wrote:

On July 25 2013 16:47 Blisse wrote:
The tone in this thread wasn't really dubious; it was much more negative. The second I post something about the features, the response isn't constructive, cautious criticism so much as outright telling me the sources are lying and that no one cares.

The article says that they can do some rendering of distant trees if you divide the task properly, so there's something somewhat concrete on the graphics-improvement front.

From that article I linked, he said that Microsoft wanted to push the always-on Internet thing in order to build for the next 5-10 years (since that's the direction society is moving in), so that developers can sooner build with the knowledge that they have a working Internet connection at all times, similar to how forcing the Kinect on would also let developers build with the knowledge that they have a Kinect at all times.

I guess it still feels like all their work went into a future-proofing strategy more than anything, but they've presented actual use cases that they've considered. And yes, it's their job to make it happen, now or 5 years from now, but the problem I have is still this really hostile attitude in the thread.

I'm not negative. I'm just not going to believe it until they actually demo some of that stuff. The Forza AI thing is really cool and I love that kind of use, but they're not using cloud computing for that so much as the social aspect of the cloud: they are crowdsourcing the AI. This is a brilliant idea, but it has nothing to do with computing. I am very much sitting on the fence, waiting for MS to dazzle me. But their words are empty promises until they can actually show that they have gotten it working. I have lots of ideas that are awesome... but until I have an actual tech demo showing it works like I say it will, it is pointless technobabble. And the further you claim you are ahead of the curve, the more likely it is that you cannot deliver on your promises. So yes, I am very skeptical about MS's ability to deliver on these promises.
Sorry to beat a dead horse, but that is almost exactly negative. The point is, you've already convinced yourself that they're going to fail to dazzle you. You said you're sitting on the fence, and then you said you're skeptical about it. Negativity.

But mainly, it's that you've convinced yourself that Microsoft is lying about what they've said they can do. It's the very fact that we're even having this discussion because you already consider their words empty promises. Yes, that's what they are, but it should never have gotten to the point where this needed to be questioned or brought up, because people have this idea that Microsoft is trying to lie about what they've been doing. Everything should be criticized, and no aspect should be accepted without skepticism, but there shouldn't be the undertone that "it's already going to fail."
http://www.vg247.com/2013/08/01/xbox-one-cloud-processing-gives-forza-5-600-more-ai-capability-says-dev/
idk, normally I like reading comments, but I've learned it's kinda bullshit recently because everyone bitches in the comments. Too much negativity. It's especially bad with Microsoft shit, which is really annoying to read through. At least I'm happy this thread has returned to normal, and a lot of the comments on game sites are starting to normalize, hating both the X1 and PS4 somewhat equally.
I'm actually somewhat scared for the Wii U. It seems like they're having price drops to $200, but it's been a year now, and the only game that ever seemed interesting was ZombiU. :/
I'm still really worried that after 4-5 years, development on the PS4 is going to be miles ahead of the X1 in terms of graphics. Mainly I'm worried that the X1 guys seriously underestimated the PS4 but can't pull back now. Hopefully they're more confident than I am.
|
Well, Carmack says they seem very, very close, although he hasn't done real benchmarks yet (and has an NDA anyway): http://www.geek.com/games/john-carmack-its-weird-how-close-xbox-one-and-ps4-are-1564248/

It seems like, given they've got the exact same architecture, the flop difference should matter, no? It's not like a case where two different types of graphics cards are being compared and one's power can't be fully utilized.
|
Oh, I never saw that exact article from Carmack. I only saw one where he said it was "close" or something of the sort.

I feel like I should trust him; it's just that, intuitively, while the X1 has the better developer tools (?), that extra bandwidth and GPU power feels like it's going to pop up somewhere down the road, 4-5 years in. It didn't really show up for the PS3, and I'd be going against Carmack's knowledge, but his comments don't 100% relieve my anxiety, even though they should.
|
On August 03 2013 14:29 Jibba wrote:
Well, Carmack says they seem very, very close, although he hasn't done real benchmarks yet (and has an NDA anyway): http://www.geek.com/games/john-carmack-its-weird-how-close-xbox-one-and-ps4-are-1564248/
It seems like, given they've got the exact same architecture, the flop difference should matter, no? It's not like a case where two different types of graphics cards are being compared and one's power can't be fully utilized.

I think that gaming graphics is reaching a bit of a plateau. How much can be improved, really? Both will probably look about the same to the naked eye, but the PS4 might have fewer frame rate dips due to the bandwidth or something.

EDIT: I imagine the consoles are going to be figured out REALLY quickly in terms of what their capability is. Devs have been working on this same architecture for years now with the PC.
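For reference, the on-paper gap being debated here, using the commonly reported launch specs (reported figures, not benchmarks; real games won't map 1:1 onto them):

```
# Peak-FLOPS and memory-bandwidth comparison from commonly reported specs.
# Peak FLOPS = shader cores x 2 ops (fused multiply-add) x clock.

specs = {
    "Xbox One": {"shaders": 768,  "clock_ghz": 0.853, "mem_gb_s": 68.3},
    "PS4":      {"shaders": 1152, "clock_ghz": 0.800, "mem_gb_s": 176.0},
}

for name, s in specs.items():
    tflops = s["shaders"] * 2 * s["clock_ghz"] / 1000.0  # GFLOPS -> TFLOPS
    print(f"{name:8s} {tflops:.2f} TFLOPS  {s['mem_gb_s']:.0f} GB/s main RAM")
# PS4 ends up ~40% ahead in peak FLOPS and ~2.6x in main-memory bandwidth,
# though the X1's 32 MB of eSRAM narrows the bandwidth gap for data that fits in it.
```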
|
On August 03 2013 14:43 TheRabidDeer wrote:
I think that gaming graphics is reaching a bit of a plateau. How much can be improved, really? Both will probably look about the same to the naked eye, but the PS4 might have fewer frame rate dips due to the bandwidth or something.
EDIT: I imagine the consoles are going to be figured out REALLY quickly in terms of what their capability is. Devs have been working on this same architecture for years now with the PC.

Well, probably better AA and physics. Imagine how much Final Fantasy nerds will cream their pants if they get TressFX.
|
On August 03 2013 14:43 TheRabidDeer wrote:
I think that gaming graphics is reaching a bit of a plateau. How much can be improved, really? Both will probably look about the same to the naked eye, but the PS4 might have fewer frame rate dips due to the bandwidth or something.
EDIT: I imagine the consoles are going to be figured out REALLY quickly in terms of what their capability is. Devs have been working on this same architecture for years now with the PC.

You'd think so, but compare the first PS3 games to the newest PS3 games and there's a noticeable improvement in graphics. I'd assume it will be the same for the X1 and PS4: at the end of their life spans, or a couple of years down the road, we will get much more graphically intense games than at launch.
|
On August 03 2013 14:48 Jibba wrote:

On August 03 2013 14:43 TheRabidDeer wrote:
I think that gaming graphics is reaching a bit of a plateau. How much can be improved, really? Both will probably look about the same to the naked eye, but the PS4 might have fewer frame rate dips due to the bandwidth or something.
EDIT: I imagine the consoles are going to be figured out REALLY quickly in terms of what their capability is. Devs have been working on this same architecture for years now with the PC.

Well, probably better AA and physics. Imagine how much Final Fantasy nerds will cream their pants if they get TressFX.

Hah, I can only imagine... "the hair... it's... REAL"

On August 03 2013 14:49 Zooper31 wrote:

On August 03 2013 14:43 TheRabidDeer wrote:
I think that gaming graphics is reaching a bit of a plateau. How much can be improved, really? Both will probably look about the same to the naked eye, but the PS4 might have fewer frame rate dips due to the bandwidth or something.
EDIT: I imagine the consoles are going to be figured out REALLY quickly in terms of what their capability is. Devs have been working on this same architecture for years now with the PC.

You'd think so, but compare the first PS3 games to the newest PS3 games and there's a noticeable improvement in graphics. I'd assume it will be the same for the X1 and PS4: at the end of their life spans, or a couple of years down the road, we will get much more graphically intense games than at launch.

The PS3 was a very difficult console to make games for. It used a WILDLY different processor and architecture, and had no tools to develop with. Developers had nothing like the PS3 to work on beforehand. The PS4, though, is almost the same as a PC, which developers have worked with for decades.
|
On August 03 2013 14:58 TheRabidDeer wrote:

On August 03 2013 14:49 Zooper31 wrote:
You'd think so, but compare the first PS3 games to the newest PS3 games and there's a noticeable improvement in graphics. I'd assume it will be the same for the X1 and PS4: at the end of their life spans, or a couple of years down the road, we will get much more graphically intense games than at launch.

The PS3 was a very difficult console to make games for. It used a WILDLY different processor and architecture, and had no tools to develop with. Developers had nothing like the PS3 to work on beforehand. The PS4, though, is almost the same as a PC, which developers have worked with for decades.

Yup, dat PowerPC-plus-Cell-processor setup made shit complex. If anything, development for the XB1 will take a little bit longer due to its small dedicated memory (the eSRAM), which may be a quirk or not, and which, btw, is the smarter choice. There is a reason we don't use GDDR for system RAM: its latency blows chunks, which isn't an issue when it's on a GPU, but is more of an issue on a CPU. Which is why, even with the spec differences, they will likely run very similarly, due to DirectX optimizations and the PS4's hardware choice.

Anyway, the XB1 uses full-on Windows DirectX 11.2 now, so it's really just a Windows 8 gaming PC. The PS4 will use LibGCM, which is ugly because Sony doesn't try hard enough to improve it; that alone may sway developers one way or the other. Probably because Sony can't hire programmers and QC for shit; they have to reissue nearly every other patch for the PS3 because they brick some dude's PS3 here or there.
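The bandwidth arithmetic behind that GDDR5-vs-DDR3 trade-off, using the commonly reported 256-bit buses and transfer rates for both consoles:

```
# bandwidth = (bus width in bits / 8) x transfer rate
# Both consoles use the commonly reported 256-bit memory buses.

def bandwidth_gb_s(bus_bits: int, mega_transfers_s: int) -> float:
    return bus_bits / 8 * mega_transfers_s / 1000  # bytes/transfer x MT/s -> GB/s

print(f"PS4 GDDR5: {bandwidth_gb_s(256, 5500):.0f} GB/s")  # ~176 GB/s
print(f"X1 DDR3:   {bandwidth_gb_s(256, 2133):.0f} GB/s")  # ~68 GB/s
# GDDR5 buys that bandwidth with looser timings (higher latency), which GPUs
# hide by keeping many threads in flight and CPUs mostly can't, hence the
# DDR3-plus-eSRAM design on the X1.
```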
|