Too Many Signals: Delivering Wireless HD Video


Like Everest, the goal of wirelessly delivering high-definition video without compression may not be necessary, but it’s there, so technologists have to attempt it. And admittedly, sending wireless HD video from a PC to a television is compelling. But wireless signals are easy to mess with. How would you feel if every time your phone rang or your microwave ran, the screen pixelated?

Anyhow, practicalities have never stopped a market, so here are the three biggest wireless technologies aiming to deliver high-def video to your TV.

Wi-Fi using 802.11n:

Wi-Fi is everywhere, from home networks to corporate ones, doing everything from connecting digital photo frames to enabling voice calls. Sure, everyone knows what Wi-Fi is, but like ice cream it comes in many different flavors. One of those flavors is 802.11n, which allows data transfers of up to 100 Mbps. That’s not nearly enough to stream high-def video content without compression, but that isn’t stopping Broadcom and Monsoon Multimedia from trying to push the technology to that very limit.
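To see why 100 Mbps falls short, the raw bitrate of uncompressed HD video can be computed directly. A back-of-the-envelope sketch (assuming 8 bits per color channel and 60 frames per second, which are illustrative, not mandated, figures):

```python
# Back-of-the-envelope: raw bitrate of uncompressed HD video.
# Assumes 24 bits per pixel (8-bit RGB) and 60 frames per second.

def uncompressed_bitrate_mbps(width, height, bits_per_pixel=24, fps=60):
    """Return the raw video bitrate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name}: {uncompressed_bitrate_mbps(w, h):,.0f} Mbps")
```

Even 720p works out to more than 1.3 Gbps, an order of magnitude beyond what 802.11n can deliver, and 1080p is roughly 3 Gbps.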


Ultra-wideband:

After a lengthy and brutal standards war in the IEEE, ultra-wideband became an international data transmission standard thanks to ISO. The technology offers high-speed, short-range data transfers. Some UWB companies, such as Alereon and Staccato Communications, are focusing more on wireless USB, which connects computer peripherals without wires, but TZero and WiQuest are scaling the high-def video heights. TZero Technologies has chips inside a Hitachi television set, while WiQuest has achieved data rates of 960 Mbps and has a video platform. I don’t see any customer announcements for those chips yet, though.


60GHz:

This standard certainly has the bandwidth in the relatively empty 60GHz spectrum to deliver high-def video, but so far it has some real problems going the distance or getting through solid objects. It’s also expensive, and so far the technology is mostly theory. But IBM has shown a test chip, and last week Vubiq launched a $12,500 development kit for OEMs interested in playing with 60GHz technology. Other startups include SiBeam, which plans to have chips out this year, and various university research efforts are focused on this as well. It’ll probably be a few years before this technology makes it into homes, however.


Jesse Kopelman

Stacey, you are misunderstanding the issue here. HD video content is already compressed. What comes via cable and off the air is compressed using MPEG-2. What comes via IPTV is compressed using some form of MPEG-4. What comes from a Blu-ray or HD DVD is compressed using MPEG-2, H.264, or VC-1. The only time the content absolutely has to be decompressed is right before it is displayed. The techniques for uncompressed wireless transmission assume that, just as with physical cables, the decompression is done at the source (STB or disc player) and the video is passed from the source to the display in uncompressed form. What I am suggesting is that you could just have the player pass the undecoded stream to the wireless transmitter and let the receiver do the decompression. Yes, this would require special hardware at the receiver end, but it would also negate the need for some exotic network protocol or frequency to support the bandwidth demands of uncompressed video. Given all the research that has already gone into very low-cost video decoder hardware for all sorts of mass consumer applications, I think that approach may actually be more cost-effective, at least in the short term. I do, however, agree that having a special network dedicated to HD media is likely to appeal to the same audio/videophile crowd who embrace the placeboid benefits of overpriced designer cabling and power conditioning solutions.
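The bandwidth gap Jesse describes is easy to quantify. A rough sketch, using typical published rates rather than exact figures for any one source (ATSC MPEG-2 broadcast tops out around 19.4 Mbps; Blu-ray video peaks near 40 Mbps):

```python
# Rough comparison: compressed delivery rates vs. uncompressed 1080p.
# Compressed figures are typical ceilings, not exact values for any
# particular stream.

UNCOMPRESSED_1080P_MBPS = 1920 * 1080 * 24 * 60 / 1e6  # ~2,986 Mbps

typical_compressed_mbps = {
    "MPEG-2 broadcast (ATSC)": 19.4,
    "Blu-ray (H.264/VC-1)": 40.0,
}

for source, rate in typical_compressed_mbps.items():
    ratio = UNCOMPRESSED_1080P_MBPS / rate
    print(f"{source}: {rate} Mbps, ~{ratio:.0f}x less than uncompressed")
```

Shipping the still-compressed stream over the air, as Jesse suggests, cuts the required wireless throughput by two orders of magnitude, at the cost of a decoder on the receiver end.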

Jack Campbell

Jesse, the point here seems to be that to gain mass market traction, whatever cable-replacement solution is offered must be just that: a cable replacement. HDMI dominates the home video interconnect space, so this is the cable that must be replaced. And, of course, that mandates uncompressed video.

I’m actively engaged in a client project now leading to some über-cool home TV gear that incorporates the Amimon WHDI uncompressed wireless system as a point-to-point HDMI cable replacement. It will sell well, and sell to anyone, without any requirement for internal support within any connected equipment. So it can be sold now, not in five to 10 years, when some as-yet-undetermined compressed standard has become sufficiently ubiquitous to be readily found in the majority of home video products.

Folks here touting “compressed” handily ignore the reality that, until support for that compression scheme is widely implemented inside home theater products, there will be no market uptake. It becomes a chicken-or-egg dilemma. An uncompressed solution sidesteps this marketing conundrum and makes profitable products possible immediately.

Any successful technology has to fit hard, inflexible market realities. Uncompressed wireless video for home theater is that practical solution for now.

Stacey Higginbotham

Len, I’m actually talking to them next week after hearing those same claims. I’ll keep y’all posted.

Jesse, I’m not an HD purist, but my videophile friends seem to see a difference between compressed and uncompressed HD, so I wonder if the market might split, with less particular consumers accepting Wi-Fi and technophiles going for something like 60GHz or even (gasp) wires.

Len Rhee

Stacey, if you are covering the wireless HD space, you should check out Amimon, a fabless chip startup with some compelling wireless HD interface solutions for delivering uncompressed HD video, which it calls WHDI. It supposedly works through walls at up to 100 feet in the 5GHz band and supports up to 1080p. I heard the basic technical approach from its CEO over dinner last year, and it seems a much-improved approach over the others. The company counts Motorola as an investor and won a CES Innovations award this past January. I’d expect to hear more about them in this area if their claims hold up.

Jesse Kopelman

Interference issues can be mitigated by using the right protocols. Something with explicit pairing, like Bluetooth, would be optimal. This would allow for active interference rejection, as the only signals each end of the virtual cable would care about are those from the other end. Really, if a good propagating band like 2.4 GHz were chosen, it wouldn’t be so much that Wi-Fi or a phone would interfere with the video connection as that the video connection would kill the phone and Wi-Fi. Realistically, though, 2.4 GHz just doesn’t have enough spectrum to get the job done. If you want something that will penetrate walls, you are pretty much consigned to using 5.2-5.4 GHz. If you don’t need to penetrate walls, 60 GHz is your best bet because of all the spectrum available. From the Shannon-Hartley theorem, we know that in a line-of-sight situation, increasing spectral bandwidth allows for a corresponding reduction in power without impacting data throughput. Since lower power usually translates to lower cost, 60 GHz would seem the obvious choice for typical A/V situations where we are trying to replace cables of 2m or less. It is only for those more specialized situations requiring longer cable runs that lower frequencies become the better choice. However, at a certain point we must consider whether it wouldn’t be more effective to just build the ability to decode compressed media streams into the slave end (the $50 AMD HD3450 graphics card has such functionality, so the chips are clearly inexpensive) and take advantage of the dramatically reduced bandwidth requirements to distribute the feeds over more general-purpose wireless networks (e.g., 802.11n).
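The bandwidth-for-power trade described above follows from the Shannon-Hartley capacity formula, C = B·log2(1 + SNR). A small sketch with illustrative numbers only (the 3 Gbps target is roughly uncompressed 1080p; the channel widths are hypothetical, though 60 GHz bands do offer several GHz of spectrum):

```python
import math

def required_snr(capacity_bps, bandwidth_hz):
    """Minimum linear SNR to hit a given capacity on an AWGN channel,
    inverted from Shannon-Hartley: C = B * log2(1 + SNR)."""
    return 2 ** (capacity_bps / bandwidth_hz) - 1

target_bps = 3e9  # ~3 Gbps, roughly uncompressed 1080p
for bw_ghz in (0.5, 2.0, 7.0):
    snr = required_snr(target_bps, bw_ghz * 1e9)
    print(f"{bw_ghz} GHz channel: need SNR >= {10 * math.log10(snr):.1f} dB")
```

Widening the channel from 0.5 GHz to 7 GHz drops the required SNR from about 18 dB to below 0 dB for the same throughput, which is exactly why the spectrum-rich 60 GHz band lets transmitters run at lower power.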

Shah Ullah

Thanks, Stacey. I wonder if the additional software layer makes battery consumption an issue, as I was previously told by someone more knowledgeable that one of the benefits of a hardwired protocol is that processing overhead is inherently reduced. A software solution, on the other hand, would have to process the requirements, create the communication protocol on the fly, and then actually send and receive data. With most televisions and home routers running off AC power, that still really shouldn’t be an issue. In other scenarios, for example playing an HD movie stored on your video camera or iPod on the 50-incher, a software solution could be useful by mimicking ultra-wideband, which is considered significantly less power-hungry, instead of Wi-Fi. If you wanted to access the files on that same video camera from your PC, maybe you would do so through a mimicked (or virtual) Wi-Fi connection. In other similar situations, a software solution could optimize which spectrum and other specs to operate under to make the most efficient use of the battery, provided it isn’t taking three steps back to go one step forward.

Stacey Higginbotham

Shah, it’s possible to use a software-defined radio to flip between certain frequency bands using software, but there are limits. Qualcomm’s Gobi tech is an example of using a software-defined radio to flip between HSPA and CDMA signals. Also, antennas need to be calibrated for specific frequencies, so one antenna may not be able to do the job. We’ve written about a company called SkyCross that may have a breakthrough in that area.

Shah Ullah

Hi Stacey, I think you hit this one right on the head. There definitely are a lot of short-distance wireless efforts out there and the ramifications of so many different connection signals are still uncertain.

There are some technologies coming together, though, as Bluetooth has announced plans to use ultra-wideband and, more recently, Wi-Fi as well.

This potentially creates an opportunity for a company to create software for chips that creates protocols on the fly to link up different wireless connections through a single chip… maybe a Vanu for short-distance wireless? Can anyone tell me if this is feasible?

Comments are closed.