AppleTV: Xbox without the “X?”

Phil Waligora, who works at Microsoft, is watching Steve Jobs’ keynote (I’m not, but am trying to check in here and there) and calls me out, wondering if I’ll say the just-announced AppleTV is innovative.

Oh, Phil, haven’t you gotten the memo? Everything Apple does is innovative. Even if Microsoft’s stuff is better (and three years earlier). Sorry to break the news to you.

But, seriously, I’m not throwing away my Xbox 360. It does all of that, and it lets me play games and look at the photos stored on the box upstairs.

Now, I am gonna take a look at iPhone and seriously consider upgrading to that. Gizmodo and Engadget are going nuts with coverage.

Is Engadget right? Is the Apple TV only 720p HD? That really, really, really sucks. If that’s true this thing is dead on arrival. Apple, the entire industry is ahead of you if that’s true.

The iPhone looks really cool, though.

319 thoughts on “AppleTV: Xbox without the “X?””

  1. I would buy an AppleTV if it had 1080i.
    I would buy an Airport Express if it had gigabit.

    The good news is, I guess, that nobody seems to be doing these things well. I have been thinking about buying an XBox 360, but I am scared of the fact that most Microsoft products are not compatible with things not made by Microsoft. Microsoft can rarely get their own stuff to be compatible.

    Linksys, D-Link, and Netgear all don’t seem to be able to make a reliable 802.11n router with gigabit. Apple’s router may be reliable, but it doesn’t have gigabit. I suppose I could buy a gigabit switch, but I don’t think I should need to.

  2. Jason B.: I’m going to tell you, as politely as possible, to bite me, because I shoot HD for money. I have my own gear. Your silly Mac insults don’t grant you any more credibility.

  3. Who the hell wants to take the time to download a movie, to then play on an iPod, to then sit in front of the computer and watch it, to then connect the computer to that expensive new Plasma screen …. only a Mac fanatic would do that, and then only to brag that he could do it.

    Well I can shit and fall back in it, I wouldn’t suggest bragging about that.

    No, bragging about useless gadgets is sort of like inflating one’s ego. I always sort of thought that men of small stature who insist on owning huge pickup trucks, you know the type, probably (just guessing here), are trying to somehow make up for the fact that they feel they are lacking in other arenas that may be of greater interest to the ladies.

    So I sort of think that this is a parallel situation of an electronic nature. Computer nerds and hippies in SF can wooo about their iTV and their iPhone, and their iPod.

    Good gawd, we put a man on the moon with less computer technology than is in a typical 1998 Chrysler, and the friggen “ithis” makes people nearly orgasm? All I can say is that you people who enjoy these products probably don’t get much work done at work, probably have a very boring life, and probably are simply making up for some other area that you feel you are lacking in.

    Grow up, there is a real world out there and Steve Jobs ain’t god. Ha ha ha!

  4. ZF: all excellent points. The biggest being that progressive images degrade less in the encoding process, which is why bit rate is often more important to perceived picture quality than actual resolution. And as you pointed out, the actual horizontal resolution of a broadcast 1080i60 image works out to a real-world max of around 1400 pixels.

    So far, the only place where I’ve seen 720p24 used is in video cameras marketed primarily to independent filmmakers a couple of years ago (Panasonic and JVC). The advantage touted is that the progressive 24fps image is more film-like than a 1080i60 image.

    The best way to display a 720p24 image would be to triple the refresh to 720p72, and while most computer monitors (and higher-end projectors) should be able to display this refresh rate, it would require an outboard video processor that would end up costing several times Apple’s $299 price. I don’t think any consumer television monitors perform this internally or even have the bandwidth to accept a 72Hz progressive signal. (A quick sketch of the refresh math is at the end of this comment.)

    It could also be doubled, but I know people who claim headaches from the flicker of a 48Hz image. Of course this is in the home theatre world, where such images are viewed in a completely light-controlled room, which would magnify such effects. And that’s also not the target market for this device.

    I’m more inclined to think of the device as DOA, but I tend to think that if I don’t want something, nobody else does, either. However, some of the vitriolic and woefully uninformed comments (not yours, obviously) here are making me think that there may be a market for it. And there’s always hope for version 2.
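
    To make the refresh math above concrete, here’s a rough sketch of how 24fps material lands on the refresh rates mentioned (the repeats_per_frame helper is purely illustrative, not any product’s actual logic):

    ```python
    # How many refresh ticks each 24fps film frame occupies at a given refresh rate.
    # Whole-number multiples (48Hz, 72Hz) hold every frame for the same duration;
    # 60Hz forces the uneven 2:3 cadence (3:2 pulldown), which is the judder people notice.
    def repeats_per_frame(film_fps, refresh_hz, frames=8):
        return [(i + 1) * refresh_hz // film_fps - i * refresh_hz // film_fps
                for i in range(frames)]

    for hz in (72, 48, 60):
        print(hz, repeats_per_frame(24, hz))
    # 72 [3, 3, 3, 3, 3, 3, 3, 3]  -- every frame held 3 refreshes, smooth
    # 48 [2, 2, 2, 2, 2, 2, 2, 2]  -- smooth, but some people see 48Hz flicker
    # 60 [2, 3, 2, 3, 2, 3, 2, 3]  -- alternating 2- and 3-refresh holds = judder
    ```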

  5. “Also not true. For film-based sources, 1080i60 can be deinterlaced to a true (not interpolated) 1080p24 image.”

    That sounds good, and is a fairly easy process (there’s a toy sketch of the field re-pairing at the end of this comment). Hopefully all the 1080p monitors out there will have such pulldown removal in hardware (as I’m sure do the HD DVD players). However, I’d be very surprised if the vertical resolution of the film image isn’t deliberately degraded during the telecine or encoding process so it doesn’t flicker on interlaced displays. Even so, I’d expect film converted to 1080i60 to at least hold more horizontal information than 720, and for this detail to be apparent, assuming that the consumer’s output device can actually display the additional horizontal resolution. The consumer DLP 1080 HD monitors that are going to sell like hotcakes this year use micromirror arrays that are 960×1080 native resolution and use “wobbulation” to draw 1920 horizontal pixels. I don’t expect this to hold the detail as well as that which would be produced by a real 1920×1080 array. It’d be interesting to compare 1280×720 and 1920×1080 images produced on these 960×1080 sensor-driven sets. In any case, point taken, with caveats.

    “And for video-based sources, 1080i60 can be deinterlaced to a true 1080p30 image. Both of these are very much superior to a 720p24 image.”

    I agree in theory, but it’s a little more complicated than that. 1080i wins in terms of horizontal resolution(*), but its vertical resolution is not necessarily any better even after the most perfect deinterlacing process, because the vertical resolution of each set of fields was deliberately lowered in camera to reduce interlace flicker. Progressive images also degrade less from MPEG-2 compression (a common broadcast, satellite, and cable format) than interlaced ones (also true for H.264?). And if we’re discussing video-based sources, I wonder if 720p24 is nearly as common as 720p60 for video acquisition. But yours is a fair comparison if the appleTV can only output 720p24, other devices on the market play 720p60, and 720p60 videos are available for playback on those devices.

    “And converting 720p60 video to 720p24 requires 3:2 pulldown, which introduces the same motion artifacts as are present in 1080i60, only larger.”

    True, but I’d guess not very relevant for most videos currently available online. The movies people buy in the iTMS and watch on their 720 playback devices and 720-capable monitors will all be 720p24, not 720p60, hence no motion artifacts (unless 720-capable monitors only run at a multiple of 60Hz and can’t do multiples of 24Hz). I’d agree that people who want to watch 720p60 sports videos and other material originating in 720p60 would be better served by a player that can output 720p60 than one that has to convert the video to 720p24. If the XBox 360 with media extender plays 720p60, that’s another mark in its favor for such videos.

    There are so many compromises built into the HD capture, transmission/distribution, and display pipeline that the resolution advantage of 1080i formats over 720p is exaggerated in practice (at the current time), and the appleTV unit would likely not be the weakest link in one’s HD-viewing chain. But the v1 appleTV, in offering only 720p, does seem less than future-proof.

    Enough hair-splitting on my part. The point I’m trying and failing to make is: Scoble thinks the appleTV is “dead on arrival” due to the lack of 1080 support. I think the lack of 1080 support is a true but not deal-breaking deficiency, that the device has other, bigger deficiencies, that it won’t meet my needs, but I also think there might be a market for it and it may work well for that market segment. My disagreement extends mostly to the idea that the product deserves the DOA moniker.

    -Z

    * (Few if any HD cameras commonly used for broadcast actually have 1920×1080 sensors, some commonly-used telecine devices are not truly 1920×1080, and most HD tape recording formats also use a lower horizontal resolution. Both of these caveats likely also apply to the 720 formats, though I’m not sure to what degree: there may be more broadcast cameras that shoot full-res 720 and formats that record it, or there may not.)
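
    As promised above, a toy sketch of the pulldown-removal (inverse telecine) idea; the frame letters and the inverse_telecine helper are only an illustration, not how any particular player implements it:

    ```python
    # 2:3 pulldown spreads 4 film frames (A-D) over 10 interlaced fields
    # (t = top field, b = bottom field). Pulldown removal drops the repeats
    # and re-pairs top/bottom fields from the same frame, so the original
    # progressive 24p frames come back exactly -- no interpolation needed.
    fields = ["At", "Ab", "Bt", "Bb", "Bt", "Cb", "Ct", "Db", "Dt", "Db"]

    def inverse_telecine(fields):
        frames, seen = [], {}
        for f in fields:
            frame, parity = f[0], f[1]
            seen.setdefault(frame, set()).add(parity)
            if seen[frame] == {"t", "b"} and frame not in frames:
                frames.append(frame)  # both fields of this frame recovered
        return frames

    print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D'] -- the 24p originals
    ```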

Comments are closed.