
New submitter pjlehtim writes "In a recent interview, Samsung's AV product manager, Chris Moseley, said, 'TVs are ultimately about picture quality. ... and there is no way that anyone, new or old, can come along this year or next year and beat us on picture quality.' Sound familiar? There must be a change in the perceived role of television in the entertainment ecosystem before the general public starts to care about the smart TVs manufacturers are trying to push. That change is likely to come from outside the traditional home entertainment industry. It's not about technology; it is about user experience, again."

..yeah, no thanks. All I want or need is something that displays a 1080p signal well, and isn't going to break down and need to be replaced in a couple years. You can keep your so-called "smart", your "3D", and all your other silly bells and whistles. I'll stick to something that is quality, and if I need some "smarts" beyond what TiVo can do for me, I'll add an HTPC.

I dunno. I have an HTPC and it's kinda a pain in the ass. If Apple can come out with a TV that has a built-in HD, a decent OS, and Siri, that could very well be the sweet spot for a lot of people, including me. "Siri, I'd like to watch the latest episode of Venture Brothers." Boom. Off ya go.

But why does that have to be *in* the TV? If all it does is display a video signal at as high a quality as possible, it can last for many years. If you stick a bunch of apps in it, inevitably the CPU or RAM becomes inadequate, it doesn't have the latest codec support, manufacturers stop supporting the software, whatever. If you keep the display separate from the "computer", you can replace the latter every year or two and it's no big deal. (Note the AppleTV is $100 and an iPad is $500+.)

I have gone through at least 3 computers over the lifespan of my 21" LCD monitor (which I still have and love). If I had to pay for a new display every time I upgraded to something that ran the latest games, apps, etc, I'd be really annoyed. Same thing for the 60" plasma I bought last year.

But why does that have to be *in* the TV? If all it does is display a video signal at as high a quality as possible, it can last for many years. If you stick a bunch of apps in it, inevitably the CPU or RAM becomes inadequate, it doesn't have the latest codec support, manufacturers stop supporting the software, whatever.

But why does that have to be *in* the TV? If all it does is display a video signal at as high a quality as possible, it can last for many years.

Let me tell you something.... many MANY people can't hook up a TV input to save their lives (or would be glad to avoid having to do that). For every one of you and me who could do this in their sleep, there are probably 3-5 people who either can't or would be very anxious if asked to do so. The vast proliferation of inputs (HDMI, Component, Composite, Coax) and ambiguous, fiddly setup conspire to make this uncomfortable even for me.

What does this mean in terms of market? It is possible (though not certain) that TV installation and setup could go mainstream and bypass the "knowledgeable enthusiast" and address the larger market of the technically incompetent/insecure

Like the iPhone (which at first seemed a bit dumbed down to me, coming from a Treo), if Apple can completely avoid the need for inputs (think plug in power and internet signal (likely wireless), and if you're really pushing it, buy and position auto-connecting bluetooth speakers), these folks could "safely" buy and use a TV without us.

I guess I largely agree with you (except that TV over Wifi isn't going to be any more friendly...) The real problem with TV today, well, aside from the content, isn't so much the "How do I hook it up" but the fact that if you want quality, you end up having to hook up a hell of a lot more than a power supply and a coax connection. And then there are the interfaces, which are awful. You'd have thought the Wii would have at

Let me tell you something.... many MANY people can't hook up a TV input to save their lives (or would be glad to avoid having to do that).

But that could be improved if you strip out all the legacy connectors (RCA, VGA, SCART in Europe). Have a bunch of easily-accessible, and completely interchangeable, HDMI inputs, and keep everything else behind a cover. Include a network hub and support HDMI+Ethernet on all the inputs, so even "smart"/IPTV boxes only need a single cable. Standardize a bit on auto-sensing, remote control sharing (which you generally get now if you stick to brand X components). Have a "soft" remote control (i.e. a cheap t

I think you are underestimating the amount of pointless complexity that's out there. Let me give you an example. I have the following kit at home:

- a 5 year old Loewe Xelos, which comes as a screen and separate signal box
- a Blu-ray player
- a Freesat HD box

To get these gadgets to work well together requires an *insane* amount of cabling and other peripherals:

- the telly and signal box both have power cables, and there are two further cables that carry sound, video and control signals between the signal box a

The fanboys have this pathological need to pretend there is a void that Apple needs to fill. TV is their current fixation.

Whatever. I don't think you have a wife, or you'd realize what is and isn't wife-friendly.

I have a simple HTPC setup with a Harmony remote that controls everything through a single 'activity' button press, and I still need to go "fix it" once a week. PC freezes, XBMC hangs, IR signals have no 'ack' protocol so they get missed...

Shit happens to most PC-based setups. Wives don't want to deal with the shit. Hence, most things outside of TV aren't wife-friendly.

2k+ content isn't coming for a LOOOONG time. 10 years at least. It took a long enough time to get to 1080i, and 1080p isn't even on the goddamned table.

Why? Bandwidth. There simply isn't enough OTA bandwidth to support it, not enough satellite bandwidth, not enough cable bandwidth, and certainly not enough IP bandwidth to support 2k+ content. Even FiOS isn't a magic bullet; you need backhaul to support that kind of downstream and I don't know if Verizon or other FTTP vendors are interested in s
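The bandwidth argument above can be made concrete with some back-of-the-envelope arithmetic. This is a rough sketch, not a measurement: the frame rate, bit depth, and the ~100:1 compression ratio are ballpark assumptions.

```python
# Rough bitrate arithmetic for the "not enough bandwidth" argument.
# All figures are ballpark assumptions, not measured values.

def uncompressed_bitrate_mbps(width, height, fps=60, bits_per_pixel=24):
    """Raw (uncompressed) video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

raw_1080p = uncompressed_bitrate_mbps(1920, 1080)   # ~2,986 Mb/s raw
raw_4k    = uncompressed_bitrate_mbps(3840, 2160)   # ~11,944 Mb/s raw

# Even at an optimistic ~100:1 compression ratio for an H.264-class
# codec, 4K still demands far more than a typical home connection:
compressed_4k = raw_4k / 100                        # ~119 Mb/s

print(f"raw 1080p: {raw_1080p:.0f} Mb/s")
print(f"raw 4K:    {raw_4k:.0f} Mb/s")
print(f"4K at ~100:1 compression: {compressed_4k:.0f} Mb/s")
```

Against the 6 Mb/s connection mentioned later in the thread, even heavily compressed 4K is an order of magnitude out of reach.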

I dunno. I have an HTPC and it's kinda a pain in the ass. If Apple can come out with a TV that has a built-in HD, a decent OS, and Siri, that could very well be the sweet spot for a lot of people, including me. "Siri, I'd like to watch the latest episode of Venture Brothers." Boom. Off ya go.

Alternately:

"Siri, I'd like to watch the latest episode of Venture Brothers."

"...Sorry, Dave. You have exceeded the number of authorized devices for this content."

The article is right about a couple things: TV UIs suck and remotes suck even more.

My mom can't operate a modern TV. I mean like not AT ALL. If it's anything more challenging than volume up or down, it's too much. She doesn't get it. There's a bunch of stuff we plug in and want to use now - DLNA clients, DVRs, home theater receivers, cable boxes, game machines - and it all works differently and needs some stupid or weird different control, both on-screen and in terms of the control device. The revolution will be the people who make some kind of master overlay and master remote (I love my Harmony but it doesn't go far enough) that handles everything.

Maybe that means a mic or a kinect that lets us talk or gesture. Maybe it means having a little display on a tablet. I don't know. I just know that what we have now is a huge mess.

Seriously. I have a modern high-end Panasonic plasma. The UI for simple tasks is tolerable; complex tasks are atrocious. I was using a recent high-end Sony TV with some XMB UI, and even simple tasks were slow and unintuitive. How the average user enjoys these is beyond me.

Your friend could buy a $10 universal remote and spend the 5 minutes it takes to set it up instead of needing 5 remotes...

And then have the programming go away as soon as the batteries die. I swear universal remotes are great, but why the hell haven't they added 75 cents worth of flash memory to the things to hold the codes permanently? The only ones that seem to do that are the more expensive ones like the Logitech Harmony varieties (which though they are coming down in price, are still a lot more than $10).

It's because it is so bloody hard to program them, assuming people are even aware that they exist. When instructions involve interpreting flashing lights, it may as well be Greek to them. And the product codes: could they be any more complicated?

Hey, if you are happy being either restricted by a lame antenna or tied to a greedy, power-hungry cable company, more power to you. I want my smart TV with à la carte programming and on-demand shows without restrictive schedules, and I will gladly pay a bit extra for it (as long as the advantages pay for themselves in a 2 year span).

Hey, if you are happy being either restricted by a lame antenna or tied to a greedy, power-hungry cable company, more power to you. I want my smart TV with à la carte programming and on-demand shows without restrictive schedules, and I will gladly pay a bit extra for it (as long as the advantages pay for themselves in a 2 year span).

Keep dreaming about à la carte. Content creators won't allow it.

The future is every fucking network having their own fucking store and their own fucking on demand service. The costs for this are so low that there's absolutely no reason to ride on the back of Apple, Google, Amazon, Netflix, or anyone else for your new content. All the "old" content (x-weeks after first air date) will of course be available on every fucking store.

Cable providers will simply bundle and resell subscriptions to these servic

Hey, if you are happy with a lame smart TV and all the DRM that will inevitably come with it, more power to you. I want my content exactly how I've always got it, and I'm willing to pay for it. EXCEPT nobody has got it right yet, and they never will, so I'll just take it for free.

I have a simple 4 year old flip phone on a non-subsidized plan. It works for me: calls, voice mail and a few texts every now and again. I also don't own a TV. I read news and watch it on YouTube, listen to webcasts, Hulu, etc. I just use my computer for such entertainment and am happy with my 23" 1080p screen. TV costs too much to be worth it, just to watch advertisements in the couple hours I might use it after work in the winter.

I suppose I am, since my focus is on socializing, playing cards or darts, or other social and interactive things. My observation is most people my age (mid-late 20s) have smartphones and will perpetually pull out their phone and ignore you when you're sitting down for dinner, in the middle of a conversation, or generally in situations where it's inappropriate or rude to be texting or facebooking.

Personally if I get a call while I'm sitting down eating dinner with friends I ignore it a

"..yeah, no thanks. All I want or need is something that displays a 1080p signal well,"

Why? If all you watch are the most recent movies on Blu-ray, then I can understand that. But ALL cable TV and ALL satellite TV is 720p, heavily compressed. I don't care what your settings on the receiver are; the signal is 720p and will stay that way for a very long time.

Everyone pines for 1080p but very few have seen 1080p content that is crisp and at a viewing distance where they can actually tell the difference.

If you know your source material, and you sit close enough to see it, awesome for you! I also chased the 1080p dragon for my theater and succeeded. You will not find ANYTHING that will be decent quality 1080p from a streaming service within the next 5 years. You just don't have the bandwidth.

I instead made my own: XBMC with a server in the basement that has five 1 TB drives in it. I rip the Blu-ray discs to the server and use XBMC to play them back. XBMC will do an AC3 passthrough as well as HD audio passthrough over TOSLINK to the receiver, which will recreate the audio perfectly. My theater with the VOD system in my media center cost $12,500 in total, excluding the walls, sound control and seating.

If you want really good 1080p you are going to not only pay for it, but do it all yourself.
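A quick sanity check on how far a rip-everything-to-a-server setup like the one above stretches. The average rip size here is an assumption, not a measurement; full Blu-ray rips commonly run 25-45 GB.

```python
# Storage arithmetic for a five-drive rip server (sizes are assumptions).

total_gb = 5 * 1000          # five 1 TB drives, no redundancy assumed
avg_rip_gb = 35              # assumed average size of one Blu-ray rip

movies = total_gb // avg_rip_gb
print(movies)                # ~142 full-quality rips before the array fills
```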

A good example of that is TiVo. I'm not sure why TiVos haven't become ubiquitous (inadequate marketing, wrong price points, bad business model?), but the different "user experience" of a TV that allows effortless time-shifting and commercial-skipping seriously altered my approach to television for the better, and I would never go back to the old way of sitting down to watch TV live. If the right company put together something with the same kind of game-cha

This is true, but I don't really need the TiVo to be actually built into the TV for that to be useful. In fact I might like my display from Samsung, my DVR from TiVo and my streaming video client from Apple. I don't want to pay for a bunch of stuff in my actual TV that I don't actually want.

Still, this might be the way it ends up going if they can use it to get people on a more aggressive upgrade cycle for their TVs.

I have mixed feelings about DVRs. I was all about them and 100% loved them when I first got one through the local cable company. No commercials and my shows when I want to watch them.

Then I realized that commercial breaks were perfect times for me to get up and do something, bathroom, laundry, snack, whatever, and they were gone. Soon I found myself pressured to watch everything that was recorded before it continued to pile up higher and higher.

I've seen several comments like this in this thread. Has no one on /. (of all places) seen Game of Thrones? It's only the first season, but possibly better than Lord of the Rings. Admittedly the vast majority of content sucks, but there are some very select shows that are good.

Watch some old TV shows sometime. Shows that had a story or otherwise actually had some entertainment value. HD would not add to the experience one iota.

Precisely the reason I still watch DVD instead of Blu-Ray. I have a Blu-Ray player, but I don't see the point in buying my movies over again, and around here, DVD releases are still cheaper than Blu-Ray. If it's worth owning, to me, then the content will be good enough that it won't matter if it's a lower resolution image. Besides which, they all get ripped to my hard drive and transcoded to a 1GB h.264 anyway, so there's literally no benefit to getting a Blu-Ray.

Frontline isn't perfect, either. They tend to pad material and repeat just like other shows. If you don't have an hour of content, just show what you have and cede the rest of your time for sports or something.

Sadly, I don't think media production is ready for it. I'd be happy if 4k displays remained exotic or stuck at desktop PC sizes if it meant that 1080p were widespread all the way down to tablet size displays.

I just read that a rapidly increasing number of movie producers are moving to shooting in 4K. I expect a lot of that will be 3D 4K. Interestingly, the higher the resolution, the better compression works.

IMHO it depends on what you're doing with it. I'm running dual 1680x1050 (effectively 3360x1050 total) monitors on my workstation as I type this. If you want fine detail, the resolution really counts. I'm shopping for a new TV, and I will be using it for PC output at least as much as for TeeVee. I've been staring at flat screens at the stores, and from eight feet I can see quite a difference between the 46" and 55", even watching standard TV and movies. One less-obvious item I've noticed is that some 1

16 bits per channel? Even huge-budget blockbusters are only mastered at 12 bits per channel. The only time I've ever worked in 16 bit is for math during compositing. It's always output to 10 or 12 bit. So, like, that would be serious overkill.
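For perspective on those numbers: each extra bit doubles the number of distinct levels per channel, so the jump from a 12-bit master to 16 bits is a 16x increase in levels. A trivial sketch of the arithmetic:

```python
# Distinct levels per channel at common bit depths (pure arithmetic).
levels = {bits: 2 ** bits for bits in (8, 10, 12, 16)}
for bits, n in sorted(levels.items()):
    print(f"{bits}-bit: {n:,} levels per channel")
# 8-bit: 256, 10-bit: 1,024, 12-bit: 4,096, 16-bit: 65,536
```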

Picture quality? Maybe if you're into seeing the pancake makeup and ridiculous quantity of hair gel necessary to make your sitcom/soap stars look the way they do. It's not really going to help animation at all; a little blur helps hide the sharp contrast of lines. Great for sports, though, so you can rest assured you're right when you call the ref an idiot for getting the call wrong, while you smugly watch the replays in high def.

More likely, the user experience will become more à la carte, as people leave the traditional broadcast, cable, and satellite networks for what they pick and choose over the internet (assuming ISPs don't kill the fledgling market with oppressive fees for bandwidth, as IF my piddly 6 Mb/s connection should be considered taxing on their infrastructure. Where's 100 Mb/s?!?) I'd rather see my shows when it suits me, without even bothering to record them on a DVR.

The TV itself could have the bits built in, but at the present rate of change I'd prefer an external box which I can upgrade as needed while the big investment, the display, is only bought every 5 or 10 years (or longer apart -- my only TV is really getting on in years, but still works.)

I believe you have in fact nailed it shut. The reason TVs need to be smart is that people are preferring more and more to get their entertainment via the internet, not a broadcast medium. Hopefully one day we'll get working multicast and can combine the two, but you'll still need a smart device to consume the content. Like you, I prefer not to have the hardware built in. Since it goes behind the television there's not even any need for it to be a module, a cable is just fine.

The newer, especially bigger and higher resolution, screens do a pretty large amount of signal processing to turn that old POS television show into visual gold. Among other things, most hi-res screens do sophisticated upsampling to turn a 480p show into something that looks closer to 1080p. Otherwise that I Love Lucy re-run would look like crap (although in actuality that's unfair - Lucille Ball insisted on filming her shows on 35 mm film, not videotape, which now means that her shows have much more v

That's the problem I have. 1080p looks like crap to me because I sprung for the 120Hz model thinking I would be annoyed by 60Hz; turns out the opposite was true. My next TV was 60Hz, because it doesn't make every movie look like a soap opera.

I actually started watching some movies with the Composite connectors so that I wouldn't be distracted by the higher refresh rate.

Considering that Samsung currently owns 95% of the OLED display market, it's not surprising that he'd say that. Picture quality (and thinness) is going to be the primary driver for OLED replacing LCD in the TV and monitor markets.

Of course the real question how the price vs. adoption curve plays out. There's a shot at seeing sub-$5k sets when their 8G OLED lines are up to full production this year. LG's faux-OLED (i.e. WOLED stack) is waiting in the wings too. It'll be interesting to see how it all plays out.

Yes they do. Apparently, that's the only way they can get the ignorant masses to know that there is a difference. They don't try to explain the concept of a backlight and an LCD, they just say, "It's an LED TV!"

Considering that Samsung currently owns 95% of the OLED display market, it's not surprising that he'd say that. Picture quality (and thinness) is going to be the primary driver for OLED replacing LCD in the TV and monitor markets.

I disagree... OLED having significantly better picture quality will make a difference for some people, yes. Early adopters. What will make the difference is when they start being cheaper than LCDs, for the majority of the market. Even then, it'll still take a while for OLEDs to completely supplant LCDs, because of attrition.

My computer monitor is an OLED display. It was very expensive, but it *does* have a very high quality picture. I will not, however, be replacing my TV until I need to take it out an

What you have is LED backlit, not OLED. OLED is an emissive technology and is pretty much only in cellphones right now (the majority being Samsung produced with Universal Display Corp PHOLED chemicals). Displays of 15" and larger are expected in production quantities later this year, more realistically in 2013.

Try this. Turn down your TV sound and try to work out what a programme is about. Now try the same with the sound audible and the picture blank (or just looking away). It's almost impossible to follow any programme without listening to the audio channel, but remove the video and little is lost (the exception is probably sports programmes, but for everything else it works).

Although the video component takes up the overwhelming amount of bandwidth - and cost both for production and TV set manufacture, it's the least important aspect of a programme.

The only thing that stops TV from being "radio with pictures" is the marketing of programmes, since this is ultimately where all the money is.

True for porn as well, only difference that it is still passable without the sound.

Try it. With just the sound of moaning/screaming/etc., your imagination will fill in the rest of the experience, in some instances much better than the same old penetrated plastic-boobs dolls. You can substitute whatever turns you on for the sound quite effectively. Now try to do the same without the sound and you will really notice its absence. You will tend to focus much more on the faces of the actors, trying to almost lip-read

It's quite memorable when people expect the status quo to remain relatively static and instead see it change suddenly. We all go back and point to the people who expected things to stay the same and talk about how short-sighted they were. However, just because that happens occasionally doesn't mean that an industry leader's expectation of no shakeup implies in any way that a shakeup is more likely. The shakeups just stick out in our minds when we reminisce.

That's the point: you burn with a fiery desire to get on top of the table, pull out your johnson and piss on the faces of all participants...

You sit in front of a TV, you click the remote, you watch the channel. That's what TV is. There is no more 'experience' in it. Nor does a person coming home from work and dinner want an 'experience' to come out of their TV. They just want to click and watch.

When Apple releases the SuperJumbo iPad -- err, iTV in the next year, with multiple ARM processors, 1TB of HDD storage, and the ability to remote control it with your existing iOS devices and run apps in PIP mode while watching movies, your run-of-the-mill flat panel displays will seem like the commodity display units they are.

Seriously though -- how can the likes of Samsung, Sharp or Sony compete? Apple can deliver its own streaming movie and TV con

However, everything Apple did so far to skyrocket as it did was for portable devices.

So what are we talking here? A TV running iOS with integrated DVR and using some sort of next gen cable card? What will be the big hook? It needs more than integration with other Apple devices, although we can probably expect teleconferencing with face time (hello boardrooms of America) and streaming to iPad/iphones.

Being subscription services, cable companies will jump at the chance of exclusivity contracts and give in to Apple's demands and needs like compatibility with a better cablecard type system.

Oh, and the integration will be two-way: no more separate stereo system; you can play your iPhone music through the TV.

Watch for an ad with the spaghetti monster as a tangle of wires while an Apple-logo-shaped Pac-Man chases after it, while Justin Long eats John Hodgman's meatballs, allowing the monster to be destroyed. Okay, that may be a little too far out there.

I, like many people, get "high" speed internet from Comcast. If Comcast updated their internet with modern tech, we could stream television on computers. Then people would drop their cable subscriptions. So why would Comcast increase broadband speeds when it will hurt their profits?

It is sad that you need to hope for Google to do their 1 Gb/s because the current ISP behemoths don't want to move.

Smart TVs are nice for things like streaming on a secondary TV, in a bedroom or basement, where you don't want a bunch of boxes and cables, but for my living room... I have a cable box, I have a game console, I have a networked DVD player. The TV is ONLY a display. This is one place where, with some technologies moving as fast as they are, convergence is a bad thing. If some new streaming service comes out, I can reasonably assume they'll have a PS3 app. Depending on how proprietary things are, they may not have an app for a smart TV even a few years old. Heck, I don't even need the main TV in the home theater to have speakers; I just want a big dumb good-quality monitor with a digital video input. Let my receiver handle all the AV stuff and one or two boxes handle TV and recorded media.

Convergence has its place; it's nice having a camera in my phone in my pocket all the time, but I don't want a cheap, prone-to-mechanical-failure Blu-ray player, or a cheap PC that no one will make software for in 9 months, stuck on the side of my nice high-end TV.

Old, established companies will claim that old, established standards are best - until they devise a new product. New companies, and old ones with new products, will claim that new standards are best. The consumers -- the ones actually USING the product -- don't get to say what it is they need or want. Rather, they are told what they should need and want.

The intelligent consumers (all three of them) should ignore what the companies are saying and should define metrics in terms of value. One option would use zer

The only thing I want from the TV industry is for them to license their content to internet sites.

Streaming music over "Internet Radio" is very successful because there are licensing agreements in place that allow royalties to be paid back to the content providers.

There is no Internet TV because the dinosaur-brained TV execs don't want to relinquish control of their product (even though it has already been broadcast nationwide).

Hulu and Netflix have pitiful TV content because they simply can't license the content. The TV studios are totally missing out on a huge advertising revenue source, because of their backwards thinking.

Message to TV execs: WAKE UP and smell the internet. You could be making money RIGHT NOW if you licensed your content to websites to stream to millions upon millions of handheld devices. (Don't sweat the format; other people will fix that for you.) Or if you don't, we'll just keep torrenting TV shows and you'll get nothing...

If TV is about picture quality, why does my wife watch Modern Family on the 15-inch screen on her laptop in our office and not on the 40-inch HD TV we have downstairs in the living room? Oh, right, because it's super easy for her to legally watch episodes whenever she wants via ABC's Web site in a browser, whereas doing so on our TV varies between "a pain in the ass" and "impossible."

The company that solves this problem will make millions, and it won't be a company that's convinced that all people want is

I understand the problem -- my wife can't operate our current tv, relies on our geek daughter to cue up what she wants to watch or choose the right input and navigate to the channel she's interested in. The TV ecosystem has gotten ridiculously complex. Some simplification or automation or integration is long overdue.

On the other hand, I'm pretty sure the answer is not to build all that stuff into a tv. TVs are a long term appliance, not something you buy every two years when an incremental improvement comes out. Remember TVs with VHS VCRs built in? The TV continues to work long after the VCR becomes dead weight. (Somewhat true also for TV/DVD combos, although I notice they're starting to use common laptop DVD drives now.)

I know, if, say, Amazon Instant Video goes away or Netflix changes or some new hot service becomes available, the manufacturer could add new features with a firmware upgrade, right?

Yeah, that worked really well for the cellular market. Why would manufacturers upgrade existing sets when they could use the new feature as leverage to replace the set?

I have a plain old 32" CRT TV with a digital converter box and a Blu-ray-capable player (mostly used for DVD). Resolution isn't anything to me; I don't give enough of a crap. If I really want to see something with spectacular sound and resolution, I'll play it on my workstation. And fuck 3D.

TVs are about picture quality but, more importantly, about content. The first TVs were B&W and the picture quality was dismal compared to modern standards. Yet people paid huge prices for them because TVs allowed them to watch stuff they had only imagined.

We have a TiVo, Wii and LG Blu-ray all plugged into a Yamaha AV amp, which is connected to a Metz TV. As a result I need:

- two remotes to watch TV (TiVo for channel and amp volume, and the TV remote to turn it on/change the AV channel)

- two or three remotes to watch a DVD (Blu-ray remote + amp remote + TV remote)

Trying to explain this to my mother-in-law is painful, to say the least.

It's 2012 and all these devices still can't talk to each other, unless they're all from the same manufacturer. They all have their own, incompatible remote control technologies.

Please, TV and home entertainment equipment manufacturers, thrash out a common control communications standard and go with it - e.g. XML/SOAP over Bluetooth or ZigBee, or even HDMI - so I can control ALL my AV gear from one remote interface. I don't really care if it's a Logitech-style remote or an Android app; just give us something that works across manufacturers so I can have one remote to control them all.

The computing power is readily available and cheap, the frameworks all exist to do it - just choose a standard and implement it.
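To illustrate how small such a standard could be, here is a sketch of one control message along the lines of the "XML/SOAP" idea above. The element names, the `avcommand` schema, and the `make_command` helper are all invented for illustration; no such standard exists.

```python
# Sketch of a hypothetical cross-vendor AV control message.
# The whole schema is made up for illustration purposes.
import xml.etree.ElementTree as ET

def make_command(device, action, value=None):
    """Build one hypothetical remote-control command as an XML payload."""
    root = ET.Element("avcommand", version="1.0")
    ET.SubElement(root, "target").text = device
    act = ET.SubElement(root, "action")
    act.text = action
    if value is not None:
        act.set("value", str(value))
    return ET.tostring(root, encoding="unicode")

# One "watch TV" button press on a universal remote could fan out as:
print(make_command("amp", "set_volume", 40))
print(make_command("tv", "select_input", "hdmi1"))
```

The transport (Bluetooth, ZigBee, HDMI) is irrelevant to the payload; that separation is exactly what would let any remote drive any manufacturer's gear.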

A 1080p 100Hz TV is good enough; I don't need or want craptastic 3D or a smart TV interface I'll never use. Just focus on the user experience. Make it easy for normal humans to use AV gear.

The big problem for TV manufacturers is convincing people who already have a flat screen TV that they need a new TV. People bought (and are buying) flat screen TVs to replace their CRTs because a flat screen TV has clear advantages (even to the lay person) over a CRT, in the same way DVD is better than VHS.

But for the people who already have an HDTV, the trick is getting them to buy a new one. Lots of that has focused on making the display itself better (faster refresh rates, better LCD panels, LED back-lighting, etc.) but it's gotten to the point where further advancements in display technology currently cost too much to put into mass market TVs (such as OLED TVs).

So with there being little room to advance in actual display technology (at least in terms of advances that normal consumers will care enough about to buy a new TV), the way TV companies are trying to get people to buy is 3D (which is a hard sell to most consumers given the lack of 3D content out there for their 3DTVs) and smart TVs. Smart TVs make sense for the manufacturers because convincing people that already have a flat screen TV to buy a new one because the new one gets YouTube is a lot easier than convincing people that already have a flat screen TV to buy one because the new one has better picture quality, blacker blacks, faster refresh rate etc.

Personally I would much rather see the research invested in making TVs less power hungry than in making them support all this "smart content" stuff (content that the big media companies would rather you got through network TV or cable/satellite anyway)

It's not usability; it's all about DRM. The content providers are desperate to keep people from copying or modifying content. If everything is in one box, then you have nowhere to connect a recording device. Your cable box will be implemented in software instead of a separate piece of hardware that has to be maintained. Providers can change their encryption any time they want by pushing out a new patch, keeping the "hackers" at bay. You want to record and watch later? There's an extra charge for that, and only on their terms.

I like the Samsungs - especially the ultra-thin 46" ones... with fast refresh and high-definition. Their biggest down-side is that they aren't competitively priced relative to other manufacturers - IMHO.

I am interested in a seamless way to use the TV to display what would be on my Laptop otherwise... I like the idea of watching internet video on a big screen... and I like the

The problem with Samsungs (and, I think, all of the other TVs) is that the software sucks. I've got a 2(?) year old 42" Samsung. Nice TV but awful UI, terrible USB support, nonexistent documentation.

This is, as has been mentioned, where Apple could conceivably break in. Allow a non technical person to hook up a DVR or find a movie, get a remote without three hundred tiny little buttons with colored squiggles. Change inputs without looking at the Janglish manual. Lock people into iTunes. Mak

I am interested in a seamless way to use the TV to display what would be on my Laptop otherwise

Pretty much any new TV can do that. Many PCs have HDMI or DVI-D video outputs that plug right into an HDTV's HDMI input. Those that don't can use a VGA cable, as most HDTVs have a VGA input (except, I'm told, in Europe where TVs tend to have a SCART input instead of a VGA input). And until you replace your TV, you can go to SewellDirect.com and buy a PC to TV adapter that converts a VGA signal to an S-Video or composite signal.

Also, that Samsung will NOT last 12 years like your last TV; you will be lucky to get 4 out of it. And this is a problem with MOST brands right now. Longevity is crap on these TVs. Even legendary Panasonic longevity has taken a huge hit, with the company not fixing defective sets.

I don't know about that; I've got a 32" Samsung LCD I bought back in 2006 that's still going strong. It cost me $1600 new, compared to the $1000 the 55" cost that ended up taking its place, so it wasn't cheap, but it's not nearly as bad as the Sanyos and Vizios and shit I see people replacing every other year.

Still, I get what you're saying. My grandmother's ancient console TV in the basement worked from the day they bought it in the 60's until they sold it in the early 2000's. I doubt a single appliance or device I've bought within the last 10 years will last even half that.

That is still correct. What has changed is the weighting given to any given instruction. (In the Olde Dayes of Yore, it was sufficient to run a million lightweight instructions, such as ADD or MOV, and see how long it took. After caches started being added, benchmark programs that were any good simulated typical function sizes and typical arc lengths. These days, things are so complex that full applications are generally run from a known start point to a known end point.)

I think most people looked at their huge collection of DVDs and their huge collection of VHS tapes and realized that no matter what format they purchased, it was effectively wasting money. Especially since you watch once and never watch again.