
Facebook Is Transcoding Video For iPad

Stoobalou sounds another death knell for Flash video. He says "Another heavy user of Adobe's video streaming software Flash is now pandering to the all-powerful iPad. Everybody's favourite waste of time, social notworking monster Facebook, is now streaming user videos to Apple's second coming of the portable computer with no sign of Flash in sight."
This discussion has been archived. No new comments can be posted.

  • by babycakes ( 564259 ) on Wednesday April 28, 2010 @11:51AM (#32015964)
    So you don't like Facebook. We get it. But would it have been so hard to write an unbiased summary? Some of us use Facebook and we a) actually don't mind it so much, and b) wouldn't really call it a "waste of time". Even if it does break sometimes :-)
  • by Thanshin ( 1188877 ) on Wednesday April 28, 2010 @12:04PM (#32016224)

    social notworking

    I propose that Slashdot refuse this tag, on the basis of ubiquity.

  • by eldavojohn ( 898314 ) * <> on Wednesday April 28, 2010 @12:12PM (#32016388) Journal

    Facebook may very well already be encoding its videos in H.264 (which is supported by Flash). In this case, all they need to do is to wrap the files into an MP4 container, with no transcoding necessary.

    YouTube already supports this, and I imagine, will begin to do it by default in the near future.

    Thanks for straightening me out. Well, I suppose that's what I get for reading the article:

    So rather than using HTML5, Facebook is actually detecting that the iPad's Safari browser is in the mix, and is transcoding the original video format to MP4 on the fly.

    I constantly forget about the container when dealing with video and audio file formats ... you would think I would have learned by now after using VLC so much to stream internet radio stations to both MP3 and Ogg formats for replay later with no internet connection. Could somebody explain to me what the container brings? I understand we gain compression and save space with the encoding of the material, but why are there so many containers that describe how that encoding is stored? What trade-offs do these containers bring, and why are they so goddamn proprietary when they seem to provide little real value for the actual data being stored? It's simply some metadata about the actual data, so why is it such a thorn in everyone's side? I don't develop in this realm, so please tolerate my ineptitude and help me out here. It often confuses me relentlessly, and I am dumbfounded at how these two things are mired in litigation.
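Since the codec-vs-container confusion keeps coming up in this comment, a minimal sketch may help: an MP4 container (the ISO base media file format) is nothing more than a sequence of length-prefixed, typed "boxes", and the compressed H.264 frames sit untouched inside one of them. Remuxing FLV to MP4 only rewrites this framing, never the frames, which is why it's cheap compared to transcoding. The box types `ftyp` and `mdat` below are real; the payload bytes are fake, and a real file would also carry a `moov` index box.

```python
import struct

def parse_boxes(data: bytes) -> list:
    """Walk the top-level boxes of an ISO BMFF (MP4) byte stream.

    Each box is [4-byte big-endian size][4-byte ASCII type][payload].
    The codec payload (e.g. H.264) lives untouched inside 'mdat';
    everything else is just framing metadata.
    """
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        boxes.append((box_type.decode("ascii"), size))
        offset += size
    return boxes

# A tiny synthetic stream: an 'ftyp' box (16 bytes) and an 'mdat' box
# (13 bytes) holding five fake "compressed video" bytes.
ftyp = struct.pack(">I4s", 16, b"ftyp") + b"isom" + b"\x00\x00\x02\x00"
mdat = struct.pack(">I4s", 13, b"mdat") + b"video"
print(parse_boxes(ftyp + mdat))  # -> [('ftyp', 16), ('mdat', 13)]
```

Different containers make different trade-offs in exactly this framing layer: seekability, streamability, multiple tracks, subtitles, DRM hooks. The litigation is almost entirely about the codecs inside, not the boxes around them.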

  • Not called HTML 5? (Score:3, Interesting)

    by kabloom ( 755503 ) on Wednesday April 28, 2010 @12:20PM (#32016512) Homepage

    Why is Facebook's technique not called HTML5? I guess they're not serving it up to everybody, but when they detect an iPad, are they purposely avoiding the video tag and using the object tag instead?
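Serving a plain MP4 to the iPad's browser would still count as HTML5 for that client. Facebook's actual detection logic isn't public, but a hypothetical server-side switch on the User-Agent header (the function name and URLs below are made up for illustration) might look like this:

```python
def pick_video_markup(user_agent: str, mp4_url: str, flash_url: str) -> str:
    """Return an HTML5 <video> element for iPad Safari, and a Flash
    <object> embed for everyone else. The simplest tell is the literal
    string 'iPad' in the User-Agent header."""
    if "iPad" in user_agent:
        return f'<video src="{mp4_url}" controls></video>'
    return (f'<object type="application/x-shockwave-flash" '
            f'data="{flash_url}"></object>')

# A real iPad 3.2-era User-Agent string for demonstration.
IPAD_UA = ("Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) "
           "AppleWebKit/531.21.10 Mobile/7B334b Safari/531.21.10")

print(pick_video_markup(IPAD_UA, "clip.mp4", "clip.swf"))
# -> <video src="clip.mp4" controls></video>
```

Whether you call that "HTML5" or just "UA sniffing plus a video tag" is mostly semantics; the interesting part is that the Flash path stays the default.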

  • by 99BottlesOfBeerInMyF ( 813746 ) on Wednesday April 28, 2010 @12:37PM (#32016858)

    Um..... please explain how Apple is responsible for the progression from floppies to hard drives, or from parallel ports to USB ports.

    Well, Apple did play a role in both of these technologies, although I think the previous poster overstates the case. Apple was probably the first major PC maker to stop including floppy drives by default on their machines. As such, they helped kill the floppy drive. Hard drives had long since been widely deployed by everyone at that point, though, so Apple had little to do with the switch to hard drives. I suppose you could make an argument about the Mac Classic being one of the first popular PCs with a hard drive, alongside their introduction of the GUI to the mainstream.

    As for USB, well, there's a lot more of a case for them. In 1998 USB existed, but the average user had never heard of it. Mice and keyboards connected via PS/2 or serial ports (or ADB on Macs). USB was included on a few computers, but pretty much only for use with early webcams, and not many of them. The industry described USB adoption as a catch-22: peripheral makers could always reach a much larger market by using the old connectors, and computer makers couldn't stop including those connectors because they were needed for mice and keyboards.

    In came Apple, who switched all external peripheral connectors to USB. It was the only option. Suddenly there was a guaranteed market for USB peripherals. This is why pretty much all the oldest USB peripherals you can find were in blue and clear plastic, to match the colors of the original iMac. Apple was the early adopter that was able to drive adoption of a standard that had stagnated and was being ignored.

    The second is a result of the USB Consortium. To give Apple credit for this seems disingenuous, (especially since Apple would have preferred to kill USB in favor of Firewire).

    Apple has never tried to kill USB. They have always pushed it as the best way to connect low-power peripherals like keyboards and mice. They deploy it in parallel with Firewire, which they think is the best way to connect hard drives, video cameras, etc. I happen to agree with them, too. Some companies, however, wanted a cheaper alternative to Firewire and did not mind losing some of the capabilities, so they reworked USB to double as an inferior clone of Firewire. Apple has been less than supportive of this, since they already have Firewire for that purpose and don't like to downgrade to inferior technologies until the rest of the industry has done so and they have no real choice.

  • by 99BottlesOfBeerInMyF ( 813746 ) on Wednesday April 28, 2010 @01:03PM (#32017368)

    At some point some hardware company had to make the plunge, and in these cases it happened to be Apple.

    Rather than it just "happened to be Apple", it was Apple for some very good reasons. Apple has a more loyal customer base for its PCs than other vendors because Apple has more differentiation, using a different OS. As such, they can make more radical and major changes without losing as many customers to rival companies. Apple also spends more on R&D than most rival PC makers, because part of their business plan is to be more "cutting edge", and because controlling more of the components of the systems they sell gives them the freedom to do so. In the long term this has developed a culture at Apple that pushes for these things. So being early adopters of the GUI, mice, hard drives, USB, Firewire, Ethernet, etc. is not so much happenstance as business plan.

    I've heard people say that accessories for Apple products tend to be a bit more expensive than no-name accessories, or that more

    There are three causes for this belief. First, historically the idea took root because Apple accessories used different interfaces (first ADB, then USB) from the standard, and devices produced in lower quantities for a small market segment tend to cost more. The perception has persisted even though it is largely no longer true. Second, some peripherals require OS-specific drivers, and some manufacturers like to segment their markets and sell the same hardware with different drivers at different prices, which brings us to the third cause: retailers target markets with prices they think will make them the most money. Apple users tend to be in the more affluent segment of society, so some retailers position their premium (or premium-branded) products as peripherals for Apple products.

  • by commodore64_love ( 1445365 ) on Wednesday April 28, 2010 @01:36PM (#32017952) Journal

    That, AND because Microsoft had mandated back in 1996 that Firewire/USB/PCI was the future and that the old legacy busses should be eliminated. In a strange twist, it wasn't Apple that was innovating. It was Bill Gates. (Of course, this being Gates, you could say he was actually "ordering" compliance...)

  • by dachshund ( 300733 ) on Wednesday April 28, 2010 @02:33PM (#32018850)

    The iPad isn't particularly innovative, IMO; it's just likely well designed, well manufactured, well marketed, and has an extremely famous brand associated with it.

    I'm no Apple fanboy and I don't own an iPad, but your analysis doesn't seem exactly fair. The iPad isn't purely a product of slick design and branding (though that sure hasn't hurt). Remember that when the iPhone interface came out, it revolutionized the mobile phone UI world. Since then, nearly all of the major manufacturers have completely reworked their UIs to mimic the touch-based interface; Microsoft even scrapped its existing mobile OS and completely replaced it. Palm is about to go out of business. The idea of a capacitive, multi-touch interface with software designed for it from the ground up may not have been strictly novel (i.e., the component pieces were all out there), but Apple's method of integrating them all really was a huge advance.

    Now it may seem reasonable to say that the iPad is just an iPhone scaled up to tablet size, so while the iPhone might count, the iPad is not a huge innovation. What this overlooks is that the iPad is just the second incarnation of the iPhone UI; i.e., it's mostly the same innovation, but one that hasn't fully run its course. Taking that very successful UI approach up to tablet size may be an obvious step, but it's a worthy step that no competitor has been able to take convincingly. The tablet market was very close to zero right before the iPad, and that's not all due to bad branding on the part of the existing tablet makers. Mostly it's because the previous generation of tablets were very different animals and nobody wanted them (outside of a handful of specific fields). I'm guessing that if the iPad takes off (and a slew of Android/MS competitors succeed in its footsteps), it's not going to be due to good design and branding alone.

  • by node 3 ( 115640 ) on Wednesday April 28, 2010 @07:10PM (#32023560)

    And this is what I mean about taking what is actually a weakness and spinning it into a strength.

    Because it's *not* a weakness. It *is* a strength.

    These "weaknesses" are deliberate, not simple limitations in design or components. Apple makes these choices *not* because they want to control you (really? this is one of the most idiotic lines of reasoning perpetrated on Slashdot in recent times). It's because they want to control the technology so that it's appealing to more people.

    The iPhone is brilliant because it doesn't force people to conform to technology. It does this by limiting features?

    Yes! Glad you finally understand.

    You make the same logical fallacy that you criticized the GP for. You don't care. You assume that nobody else cares.

    It's not an assumption if it's true. And before you get technical, I've made it clear that I don't mean there aren't *some* people who care, just that most people don't. And the popularity of iPods and iPhones backs me up.

    Empirically, it can be said that Apple's platform is far more restrictive for developers and users than Android (or WebOS or even Windows Mobile).

    And like I said, nobody fucking cares.

    All things being equal, closed/restrictive systems tend to attract fewer developers than open/permissive ones. Fewer developers means fewer applications, less innovation on the platform...

    All things aren't equal, and it's irrational to assume they are.

    You act like there are only two options.

    1. Let Apple decide what's best.


    2. Have a terribly complicated experience that only a techie could love.

    The GP and I would like an option 3:

    make the hardware and software as capable as possible and let the users/developers determine the boundaries of its capability.

    Option 3 is an illusion. It's just #2. And of course, I don't advocate "Let Apple decide what's best". QUIT WITH YOUR FUCKING STRAW MEN.

    p.s. "nerd rage"? you do know you're on Slashdot right? "News for Nerds" and all.

    Exactly why it's so prominent here. I'm just pointing out that the nerd rage that's so prevalent here is quite notably absent outside of nerd circles.
