Quote:
Originally Posted by Cscottrun
Are you serious? If you think the casual gamer will actually notice the difference, you have another thing coming. And yes, you obviously need to take the source material into account (but you didn't). The majority of casual gamers are just buying into the hype.
As for my source: my family has owned and operated an electronics repair store for over 30 years, specializing in TVs. I know the customer base for TVs. I know that even with the common knowledge out there that Samsung, Panasonic, and Sony outperform the majority of brands, plenty of people still buy the shitty brands. I also know that a ton of people don't even have their gaming consoles and cable boxes properly hooked up.
You clearly took what I said way too literally. In truth, I meant the disparity, especially on smaller screens, is negligible.
And as far as sources go, here is one of the first results I found on Google:
http://www.bundle.tv/faq/why-dont-ma...roadcasts.html
That article is all over the place. It starts off by referring to non-1080p broadcasts as "lower resolutions", then later acknowledges that 1080i and 1080p are the same resolution. Which is it? It doesn't even note that when you receive a 1080i60 signal you're still effectively watching 1080p: your set will deinterlace it to 1080p unless you're watching on a very old HD CRT, and the original source was likely 1080p to begin with, converted to 1080i60 for broadcast. So when the article says "Televisions running in 1080i mode update half of their pixels at a time", that's technically correct, but it applies to less than 1% of all HDTV screens. That's why hardly anything is actually shot interlaced anymore: 99% of the HDTV sets in people's homes are progressive panels.
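If "deinterlacing" sounds mysterious, here's a toy sketch of the simplest case (the weave method; my own illustration, not anything from the article): two 540-line fields interleave back into one full 1080-line progressive frame, so no vertical resolution is lost.

```python
import numpy as np

# Hypothetical field data: a 1080i picture arrives as two 1920x540 fields,
# one carrying the even-numbered scan lines and one the odd-numbered ones.
top_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)     # lines 0, 2, 4, ...
bottom_field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # lines 1, 3, 5, ...

def weave_deinterlace(top, bottom):
    """Interleave two 540-line fields into one 1080-line progressive frame.

    This is the trivial 'weave' case, which works perfectly when both fields
    came from the same instant in time (e.g. 1080p24 film content carried in
    a 1080i60 broadcast). Real sets add motion-adaptive logic on top of this.
    """
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top     # even rows from the top field
    frame[1::2] = bottom  # odd rows from the bottom field
    return frame

frame = weave_deinterlace(top_field, bottom_field)
print(frame.shape)  # (1080, 1920) -- full progressive resolution
```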
With few exceptions, all LCDs and plasma PDPs are progressive panels; they can't natively display an interlaced signal. There is no difference in resolution between 1080i and 1080p. In fact, 1080i60 takes up more bandwidth than 1080p24, another error in the article, and it even has a section called "A matter of bandwidth". The article is correct that the cost of switching would be with the cable boxes. It's wholly incorrect that most shows are shot in 1080i: the vast majority of shows that aren't shot on film are shot in HD at 1080p24, 1080p30, or 1080p60. Just because something is broadcast at a given resolution/frame rate doesn't mean it was shot that way. LOST was shot at 1080p24 but broadcast at 720p60.
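The bandwidth point is easy to sanity-check yourself. A back-of-the-envelope comparison of raw pixel rates, counting only the 1920x1080 active picture and ignoring compression (which scales the absolute numbers but not the ordering):

```python
# Raw (uncompressed) pixel rates per second for common broadcast formats.
W, H = 1920, 1080

rate_1080i60 = W * (H // 2) * 60   # 60 fields/sec, 540 lines each
rate_1080p24 = W * H * 24          # 24 full frames/sec
rate_1080p60 = W * H * 60          # 60 full frames/sec

print(f"1080i60: {rate_1080i60:>12,} pixels/sec")   #  62,208,000
print(f"1080p24: {rate_1080p24:>12,} pixels/sec")   #  49,766,400
print(f"1080p60: {rate_1080p60:>12,} pixels/sec")   # 124,416,000
```

So 1080i60 genuinely carries more raw picture data per second than 1080p24, exactly the opposite of what the article's "A matter of bandwidth" framing implies.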
In the pixel density section they state "Even though the smaller screen runs at a lower resolution" - huh? How is the smaller TV running at a lower resolution? They never state the resolutions of the TVs they're comparing!
Your shop: you said it's a repair shop, and you've used that to judge who is buying what TVs? Logical fallacy. That data shows which brands get the most TV repairs, not which brands are in the most homes. Do you survey people when they come in and check off whether they're a gamer or not? Do your technicians on house calls note how every piece of equipment is hooked up instead of just fixing the television? Sounds pretty bizarre. If not, how do you ascertain whether these people are using their RCA sets for casual gaming or as a tertiary set in the laundry room?

Actual TV sales tell a much different story: the US market's top five brands are Samsung, LG, Vizio, Sony, and Panasonic. In Q1 2013, 27% of LCD sets sold were LARGER than 50" according to NPD figures. Repeat: 27% of sets sold were LARGER than 50". 40-50" has been the sweet spot for several years now, with consumers picking up 42" plasmas for $400 or less for quite some time.
Casual gamers haven't even bought next-gen systems yet, and you equate a casual gamer to someone with no visual acuity? There's another logical fallacy.
I'm a casual gamer, I play games once a month MAYBE. Casual gamer =/= AV idiot.
http://s3.carltonbale.com/resolution_chart.html
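If you don't feel like squinting at that chart, the math behind it is simple. A rough sketch of my own (assuming 20/20 vision resolves about 1 arcminute per pixel and a 16:9 panel; these are approximations, not Bale's exact curves):

```python
import math

# 20/20 vision resolves roughly 1 arcminute; beyond the distance where one
# pixel subtends less than that, extra resolution stops being visible.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, vertical_pixels):
    """Distance (feet) beyond which individual pixels blur together."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 16:9 panel height
    pixel_in = height_in / vertical_pixels           # one pixel's height
    return pixel_in / math.tan(ARCMIN) / 12

for size in (32, 42, 50, 60):
    d720 = max_useful_distance_ft(size, 720)
    d1080 = max_useful_distance_ft(size, 1080)
    print(f'{size}": 720p resolvable inside {d720:.1f} ft, 1080p inside {d1080:.1f} ft')
```

For a 42" set that works out to roughly 5.5 feet before full 1080p detail registers, which is the whole point of the chart: whether the difference is visible depends on screen size and seating distance, not on whether the viewer is a "casual" gamer.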
tl;dr yes I'm serious and your article is drivel