How Often Will You Have to Upgrade an Intel-Based TV?

One of the big topics of discussion here at the Connections Conference is how televisions are evolving beyond just displaying moving images. As widgets, web video and social features come to TVs, televisions become more like PCs. Does this mean that you’ll have to upgrade your television set every few years like you do your computer?

I spoke with Wilfred Martis, director of platform strategy and planning for Intel's Digital Home Group, to get his take. For the past few years, Intel has been making a concerted effort to get its chips into more consumer devices like televisions. Martis says research indicates that people are replacing their TV sets more often: every seven years now, down from the previous 10-year life cycle. How much lower will that number go? None of Intel's customers know for sure, he says, but they are taking three approaches:

1. Integrated, beefed-up chips. Chips are built directly into the television, with enough performance headroom to accommodate future innovation.

2. The modular approach. Chips are built into a module that the consumer can swap out and replace with something newer.

3. Separate boxes. TVs become thin, dumb panels connected, by wire or wirelessly, to a bundled box that has the chip inside it. As more processing power is required, the box is swapped out for a new one, but the panel remains.

Martis says he likes the third approach because the television itself becomes like a piece of art you hang on the wall. Of course, Intel will like anything that gets us to buy more devices…filled with more chips.
