Xbox 360/PS3/(to a lesser extent) Wii owners represent
-
This post did not contain any content.
Apparently you are unaware of the shit storm that's been Nvidia lately.
-
This post did not contain any content.
How old is this meme? Older than that kid I think.
-
Apparently you are unaware of the shit storm that's been Nvidia lately.
Oh, NVIDIA have always been a shitstorm. From making defective PS3 GPUs (the subject of this meme) to the constant hell that is their Linux drivers to melting power connectors, I am astounded anyone trusts them to do anything.
-
This post did not contain any content.
It's more: making their cards competitive on price and performance lately for team red
-
It's more: making their cards competitive on price and performance lately for team red
AMD have been amazing lately. 9070 XT makes buying most other cards in that price range pointless, especially with NVIDIA's melting connectors being genuine hazards. ATI (who were dissolved in 2010 after being bought out by AMD) and NVIDIA in the mid to late 2000s however were dumpster fires in their own ways.
-
AMD have been amazing lately. 9070 XT makes buying most other cards in that price range pointless, especially with NVIDIA's melting connectors being genuine hazards. ATI (who were dissolved in 2010 after being bought out by AMD) and NVIDIA in the mid to late 2000s however were dumpster fires in their own ways.
Depends if you can actually find a 9070 XT at the price they advertised it at. Once that happens I'll be convinced; right now, though, it's very much felt like a bit of a bait and switch. Holding out hope though.
-
Depends if you can actually find a 9070 XT at the price they advertised it at. Once that happens I'll be convinced; right now, though, it's very much felt like a bit of a bait and switch. Holding out hope though.
Yeah, pricing is not the greatest at the moment, most likely because there's no reference card to keep other prices in check. Still (at least here in the UK), they are well below the stratospheric NVIDIA prices for a 5070 Ti and are easily available.
-
It's hard to say for certain whose final call it was to do this underfill (it's a tossup between ATI's design engineers and the packaging partner they chose to work with to get the TSMC chip into a final product), but at the end of the day it was ATI's responsibility to validate the chip and ensure its reliability before shipping it off to Microsoft.
I always heard that was TSMC's decision.
-
This post did not contain any content.
Wiis and PS3s weren't crapping out, and the 360 failures weren't due to ATI. This meme is dumb. Dumb in the bad meme kinda way.
-
This post did not contain any content.
Until they got bought by AMD, ATI was more reliable than nVidia cards which were prone to bursting into flames.
-
Wiis and PS3s weren't crapping out, and the 360 failures weren't due to ATI. This meme is dumb. Dumb in the bad meme kinda way.
Wii was mostly okay, but boards with a 90nm Hollywood GPU are somewhat more likely to fail than later 65nm Hollywood-A boards (so RVL-CPU-40 boards and later), especially if you leave WiiConnect24 on, as it keeps the Starlet ARM chip inside active even in fan-off standby. Most 90nm consoles will be okay due to low operating temperatures, but some (especially as thermal paste ages and dust builds up) are more likely to die from bumpgate-related problems.
PS3s did crap out with yellow lights of death, although not as spectacularly as 360 red rings (lower proportion due to beefier cooling and different design making the flaws less immediately obvious, but still a problem). NVIDIA on the RSX made the same mistakes as ATI on the Xenos - poor underfill and bump choice that could not withstand the thermal cycles, which should have been caught (NVIDIA and bumpgate is a whole wild story in and of itself though, considering it plagued their desktop and mobile chips). The Cell CPU on there is very reliable though, even though it drew more power and consequently output more heat - it was just the GPU that could not take the heat.
360s mostly red ringed due to faulty GPUs - see the previous comments about the PS3 RSX. ATI had a responsibility to choose the right materials, design, and packaging partner before shipping to Microsoft for final assembly, so some of the blame is theirs (they also, like NVIDIA, had trouble with their other products at this time, leading to high failure rates in devices like the early MacBook Pros). However, whether they are fully to blame is unclear, since we don't know who made the call on the final package design.
-
Wiis and PS3s weren't crapping out, and the 360 failures weren't due to ATI. This meme is dumb. Dumb in the bad meme kinda way.
It is a shitpost, to be fair
-
I don't know how much of it was ATI's fault or the fab's, but my understanding is that no one had experience handling that amount of heat.
Agreed, thermals were increasing faster than most manufacturers could handle. Only real exceptions in this time I can think of were IBM (because they had to, PowerPC G5 was such a power hog it pissed off Apple enough for them to switch architectures) and Intel (because they also had to, Pentium 4 was a disaster).
-
This post did not contain any content.
I've been refurbishing my PS3. It's running like a dream.
-
Oh, NVIDIA have always been a shitstorm. From making defective PS3 GPUs (the subject of this meme) to the constant hell that is their Linux drivers to melting power connectors, I am astounded anyone trusts them to do anything.
CEO Jensen Huang after reading your disparaging remarks about his company:
Edit: apparently I fail at uploading gifs to lemmy
-
CEO Jensen Huang after reading your disparaging remarks about his company:
Edit: apparently I fail at uploading gifs to lemmy
Those leather jackets won't buy themselves!
-
Wii was mostly okay, but boards with a 90nm Hollywood GPU are somewhat more likely to fail than later 65nm Hollywood-A boards (so RVL-CPU-40 boards and later), especially if you leave WiiConnect24 on, as it keeps the Starlet ARM chip inside active even in fan-off standby. Most 90nm consoles will be okay due to low operating temperatures, but some (especially as thermal paste ages and dust builds up) are more likely to die from bumpgate-related problems.
PS3s did crap out with yellow lights of death, although not as spectacularly as 360 red rings (lower proportion due to beefier cooling and different design making the flaws less immediately obvious, but still a problem). NVIDIA on the RSX made the same mistakes as ATI on the Xenos - poor underfill and bump choice that could not withstand the thermal cycles, which should have been caught (NVIDIA and bumpgate is a whole wild story in and of itself though, considering it plagued their desktop and mobile chips). The Cell CPU on there is very reliable though, even though it drew more power and consequently output more heat - it was just the GPU that could not take the heat.
360s mostly red ringed due to faulty GPUs - see the previous comments about the PS3 RSX. ATI had a responsibility to choose the right materials, design, and packaging partner before shipping to Microsoft for final assembly, so some of the blame is theirs (they also, like NVIDIA, had trouble with their other products at this time, leading to high failure rates in devices like the early MacBook Pros). However, whether they are fully to blame is unclear, since we don't know who made the call on the final package design.
I dunno, man. I never knew anyone who had a yellow light PS3, and the only ones I read about were from people who had kept them in enclosed cabinets. I also watched a very in-depth 2-hour documentary on the 360 RROD, and it wasn't due to ATI.
-
Wii was mostly okay, but boards with a 90nm Hollywood GPU are somewhat more likely to fail than later 65nm Hollywood-A boards (so RVL-CPU-40 boards and later), especially if you leave WiiConnect24 on, as it keeps the Starlet ARM chip inside active even in fan-off standby. Most 90nm consoles will be okay due to low operating temperatures, but some (especially as thermal paste ages and dust builds up) are more likely to die from bumpgate-related problems.
PS3s did crap out with yellow lights of death, although not as spectacularly as 360 red rings (lower proportion due to beefier cooling and different design making the flaws less immediately obvious, but still a problem). NVIDIA on the RSX made the same mistakes as ATI on the Xenos - poor underfill and bump choice that could not withstand the thermal cycles, which should have been caught (NVIDIA and bumpgate is a whole wild story in and of itself though, considering it plagued their desktop and mobile chips). The Cell CPU on there is very reliable though, even though it drew more power and consequently output more heat - it was just the GPU that could not take the heat.
360s mostly red ringed due to faulty GPUs - see the previous comments about the PS3 RSX. ATI had a responsibility to choose the right materials, design, and packaging partner before shipping to Microsoft for final assembly, so some of the blame is theirs (they also, like NVIDIA, had trouble with their other products at this time, leading to high failure rates in devices like the early MacBook Pros). However, whether they are fully to blame is unclear, since we don't know who made the call on the final package design.
Ok so, let me set this all straight.
The Wii's issue had nothing to do with the GPU itself but with a part of the GPU die that Nintendo had designed and kept secret.
Inside the GPU die is both the GPU (Hollywood) and an ARM core called Starlet. It runs the security software, and that's where things went wrong (rarely, but it happened), as it was always running code, even in standby. This had nothing to do with ATI.
And the PS3 was not what you said. The PS3's problem was that the IHS wasn't making good enough contact with the core, so the heat of the CPU didn't transfer well into the cooler. You can fix this, but it's very tricky and it's easy to permanently damage the PS3 in doing so (you have to cut the silicone under the IHS without touching the die or the PCB, remove it, and reattach the IHS with less glue). This could be attributed to the manufacturer, I suppose.
-
Ok so, let me set this all straight.
The Wii's issue had nothing to do with the GPU itself but with a part of the GPU die that Nintendo had designed and kept secret.
Inside the GPU die is both the GPU (Hollywood) and an ARM core called Starlet. It runs the security software, and that's where things went wrong (rarely, but it happened), as it was always running code, even in standby. This had nothing to do with ATI.
And the PS3 was not what you said. The PS3's problem was that the IHS wasn't making good enough contact with the core, so the heat of the CPU didn't transfer well into the cooler. You can fix this, but it's very tricky and it's easy to permanently damage the PS3 in doing so (you have to cut the silicone under the IHS without touching the die or the PCB, remove it, and reattach the IHS with less glue). This could be attributed to the manufacturer, I suppose.
Your description of the Starlet is more accurate, yes. However, its heat output contributed to some of the issues with the ATI-designed parts of the Hollywood, as it exacerbated the thermal problems the 90nm variants had, problems a better-designed chip would have been able to handle.
The PS3's IHS was not the problem. There was decent contact and heat transfer, maybe not as perfect as it could have been (there's thermal paste under the IHS instead of it being soldered into place, which is why a delid and relid is fairly essential on a working 90nm PS3 due to aging thermal paste), but definitely not a big enough problem for a properly designed chip to cook itself at the PS3's operating temperatures (a 75-80°C target on the RSX of an early unit at full load). The Cell B.E. next to the RSX uses more power (and consequently outputs more heat) and has a similar IHS setup, but IBM did not make the same design mistakes as NVIDIA, and so we see very few reports of the CPU cooking itself even in those early PS3s.
-
Your description of the Starlet is more accurate, yes. However, its heat output contributed to some of the issues with the ATI-designed parts of the Hollywood, as it exacerbated the thermal problems the 90nm variants had, problems a better-designed chip would have been able to handle.
The PS3's IHS was not the problem. There was decent contact and heat transfer, maybe not as perfect as it could have been (there's thermal paste under the IHS instead of it being soldered into place, which is why a delid and relid is fairly essential on a working 90nm PS3 due to aging thermal paste), but definitely not a big enough problem for a properly designed chip to cook itself at the PS3's operating temperatures (a 75-80°C target on the RSX of an early unit at full load). The Cell B.E. next to the RSX uses more power (and consequently outputs more heat) and has a similar IHS setup, but IBM did not make the same design mistakes as NVIDIA, and so we see very few reports of the CPU cooking itself even in those early PS3s.
Yeah no, I've had a PS3 that YLOD'd which I reflowed back to life. After it was working again I started digging, and the temps the core was reporting weren't even close to what I measured on the IHS with a thermocouple. Also, the thermal paste is on top of the IHS, not under it, and it wasn't soldered in place. Early PS3s did cook themselves. Less than 360s by a long shot, but they still did!
Also, side note, it's funny how some 360s' RROD was not due to the heat issue but could also be caused by power supply failure or the plug being faulty. That's how I got and fixed my 360