Apple's Adoption Of HEVC Will Drive A Massive Increase In Encoding Costs Requiring Cloud Hardware Acceleration (streamingmedia.com) 203
An anonymous reader shares a report: For the last 10 years, H.264/AVC has been the dominant video codec used for streaming, but with Apple adopting H.265/HEVC in iOS 11 and Google heavily supporting VP9 in Android, a change is on the horizon. Next year the Alliance for Open Media will release its AV1 codec, which will improve video compression efficiency even further. The end result is that the codec market is about to get very fragmented: content owners will soon have to decide whether they need to support three codecs (H.264, H.265, and VP9) instead of just H.264, with AV1 expected on top of that in 2019. As a result of what's taking place in the codec market, and with consumers demanding better quality video, content owners, broadcasters and OTT providers are starting to see a massive increase in encoding costs. New codecs like H.265 and VP9 need 5x the server cost because of their complexity; currently, AV1 needs over 20x. The mix of SD, HD and UHD continues to move toward better quality: HDR, 10-bit color and higher frame rates. The server cost to encode 4K HDR instead of 1080p SDR is 5x. 360-degree and Facebook's 6DoF video are also growing in consumption, which again increases encoding costs by at least 4x. Add up all these variables and it's not hard to do the math: for some, encoding costs could increase by 500x over the next few years as new codecs, higher quality video, 360 video and general demand all grow.
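The "500x" figure appears to come from multiplying the article's individual factors together. A hedged back-of-the-envelope sketch (all factor values are the article's own claims, not benchmarks):

```python
# Compounding the article's claimed cost multipliers.
# All figures are the article's claims, not measurements.
codec_factor = {"h264": 1, "h265": 5, "vp9": 5, "av1": 20}

QUALITY_FACTOR = 5    # claimed cost of moving from 1080p SDR to 4K HDR
IMMERSIVE_FACTOR = 4  # claimed extra cost for 360 / 6DoF video

def relative_encode_cost(codec, uhd_hdr=False, immersive=False):
    """Relative server cost vs. a baseline 1080p SDR H.264 encode."""
    cost = codec_factor[codec]
    if uhd_hdr:
        cost *= QUALITY_FACTOR
    if immersive:
        cost *= IMMERSIVE_FACTOR
    return cost

# Worst case named in the article: AV1, 4K HDR, 360 video
print(relative_encode_cost("av1", uhd_hdr=True, immersive=True))  # → 400
```

Even fully compounded, the stated factors give 400x; the article's "500x" evidently rounds up by folding in general demand growth.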
Server side optimization. (Score:5, Informative)
Isn't that kind of the point? You optimize once and you save more on the other end since each playback device isn't wasting battery and bandwidth playing the less efficient version.
Re:Server side optimization. (Score:4, Interesting)
Not to mention VP9 and AV1 are royalty-free, so you can imagine hardware encoders being built into future devices and server GPU/CPUs for both of these.
Re:Server side optimization. (Score:5, Informative)
Indeed. You encode once when the video is created or uploaded. Then you save bandwidth and decompression costs every time the video is downloaded, which may be thousands or even millions of times. I would expect this to put less load on the server hardware, rather than more.
Re:Server side optimization. (Score:5, Insightful)
Indeed. You encode once when the video is created or uploaded. Then you save bandwidth and decompression costs every time the video is downloaded, which may be thousands or even millions of times. I would expect this to put less load on the server hardware, rather than more.
Exactly.
IMHO, this article is scaremongering, or at the very least, written by someone who hasn't thought (or costed) the whole chain through.
Re: (Score:2)
Re: (Score:2)
I'm surprised more people here haven't questioned the 500x stat. Seriously, 500x more processing-intensive? If that's even remotely accurate, which I'm positive it isn't, content will simply stay H.264 until things get cheaper.
That, and as another commenter said (paraphrasing) "What costs 500x today will cost 1.4x in a year."
As I said: "Scaremongering".
Re: (Score:2)
You are probably right that the total load will be less, but it might be a shift in where the load is. If uploads are suddenly slower because there is a long queue at the servers that encode the videos during peak times, uploaders may not like that. And they will surely not give a crap that google saves money some
Re: (Score:2)
All of those still need encoding.
And YouTube encodes them all to their preferred bitrates and resolutions. It doesn't matter what format you upload to YouTube, it always re-encodes it. YouTube transcoded their catalog to VP9 [googleblog.com] to add VP9 support a few years ago.
Re: (Score:2)
You optimize once and you save more on the other end since each playback device isn't wasting battery and bandwidth playing the less efficient version.
Pretty sure compressed content uses more battery than uncompressed content
Re: (Score:2)
Yes, certainly, an incoherence or inchoate of sentence can certainly be constructed of more plyons if the user ups the value of the drive level on their H.264.
You misspelled "pylons". Try and be a bit more careful with your grammar and spelling next time.
Re: (Score:2)
Why mess with h.265 (Score:2)
The patent situation on h.265 is a total mess. Why even bother with it?
Re: (Score:2)
Actually, my question is: why does an OS have to make that choice for people? Is it not possible to provide more than one video codec on mobile devices? I could perhaps see the point of Google choosing NOT to support a format in which you need pay royalties, but why would Apple NOT choose to support a free format in addition?
Re:Why mess with h.265 (Score:4, Insightful)
Re: (Score:3)
Actually, my question is: why does an OS have to make that choice for people? Is it not possible to provide more than one video codec on mobile devices? I could perhaps see the point of Google choosing NOT to support a format in which you need pay royalties, but why would Apple NOT choose to support a free format in addition?
People have bad experience on the iPhone because of poor battery life because of a poorly supported codec so people buy less iPhones. So Apple says only these codecs, providers comply because they have to, users get a good experience, everybody happy? Not sure if it passes a reality check, but I'm pretty sure that's the line of reasoning.
Re: (Score:2)
People have bad experience on the iPhone because of poor battery life because of a poorly supported codec so people buy less iPhones. So Apple says only these codecs, providers comply because they have to, users get a good experience, everybody happy?
Yeah, that's it. The iPhone's battery life is poor because of a lack of accelerated video playback... not because the phone is too thin to have a decent-sized battery...
Re:Why mess with h.265 (Score:4, Insightful)
Actually, my question is: why does an OS have to make that choice for people? Is it not possible to provide more than one video codec on mobile devices? I could perhaps see the point of Google choosing NOT to support a format in which you need pay royalties, but why would Apple NOT choose to support a free format in addition?
Because when you are designing an SoC, and want to design-in a video codec subsystem, you generally only have the real-estate/budget to design-in ONE.
I'm sure they support more formats for DECODE, but ENCODE is where the rubber meets the road, and Apple really DOESN'T "need" to support more than one ENCODING format on their PHONE.
And a quick trip to Google allays my fears. Multiple formats are still supported for encode and decode; but the hardware preference is moving toward HEVC/H.265: everything from the A8 forward on iOS/tvOS, and everything from 6th-gen Intel forward, supports HEVC encode/decode in hardware.
Re: (Score:2)
Actually, my question is: why does an OS have to make that choice for people?
The same old usual, boring way: The OS maintainer says "Hey customers! We're including the libraries & paying the licensing so you can use [this codec]."
Apple has a pluggable system for codec support in QuickTime - if you want VPx, Theora, Opus -- get the plugin, and the codec works. It's not unlike adding a codec in GStreamer. That said, you can only install the codec plugin on a Mac.
For more special-purpose hardware (iOS and Apple TV), you can compile codec support into your app - VLC for iOS & t
Re: (Score:2)
70-80% of all content uploaded to YouTube is in h.264
YouTube transcodes everything to VP9 at their preferred resolutions and bitrates. The upload format doesn't matter.
~0.4% is 4k or higher
VP9 outperforms H.264 [medium.com] at all resolutions.
skipping VP9 entirely
It wasn't skipped. The practical reality is that VP9 has been used for years.
Re: (Score:2)
Being "better" has never been a sufficient reason to support a technology.
The fact that most uploads to YouTube are in AVC shows that most of the cameras and editing software pushing video do not use VP9.
The guys making the cameras and editing software don't see much reason to use VP9, and given AV1 is coming soon, it makes little sense to spend the time designing in support for VP9. H.264 works just fine for their purposes.
VP9 is only "free" in terms of licensing. Every adopter will have to spend for Engin
Re: (Score:2)
Spending money to support VP9 for a few months makes little sense.
It won't be for a few months. It will be a gradual transition from VP9 to AV1. You don't think Netflix is using VP9 [medium.com] for the fun of it, do you?
Re: (Score:2)
And for those "months"in the transition, you can use AVC or AV1. VP9 isn't necessary.
Believe me, customers will never notice the difference.
Re: (Score:2, Informative)
So is every major hardware vendor that deals with video... what's your point?
( http://www.mpegla.com/main/programs/AVC/Pages/Licensors.aspx)
Re: (Score:2)
http://www.mpegla.com/main/programs/AVC/Pages/Licensors.aspx
That's the AVC (aka H.264) patent pool. The MPEG LA's HEVC patent pool [mpegla.com] is different. Microsoft and Google, for example, are not part of it.
Re: (Score:2)
Minor nitpick: HEVC doesn't have a single patent pool -- which is, of course, a big part of the problem.
The MPEG LA's license pool is one of them, but there are pools controlled by HEVC Advance, Technicolor, one from Velos Media...
So instead of one license body trying to shake down customers, there are four -- and the price to license HEVC is at least 4x that of AVC. There's a reason HEVC has been around for four years and hasn't seen significant adoption... they've priced themselves out of the market.
I per
Re: (Score:2)
Roughly 80% of YouTube videos are uploaded in h.264
The upload format is irrelevant. YouTube supports many upload formats. The uploaded file is re-encoded to new H.264 chunks and transcoded to VP9 chunks at YouTube's preferred bitrates and resolutions. VP9 is YouTube's leading distribution format. Watch any YouTube video in a browser that supports VP9, right click on the video, select Stats for nerds, and you'll see that almost always the video is VP9 with the occasional H.264 video.
Re: (Score:2)
And if you *dont* use a browser/device that has VP9, you'll never notice the difference. AVC is "good enough"
My point is more or less that if you haven't adopted VP9 by now, there's a good business case that you should save your money and wait for AV1.
Re: (Score:2)
AVC is "good enough"
Not really. Netflix streams 1080p H.264 at 7500 kbps. That's a lot of wasted bandwidth. VP9 and especially AV1 will do better.
Re: (Score:2)
Software decoding is a possible choice, but requires so much power that it is infeasible on anything but a laptop or desktop.
VP9 video plays back perfectly in software on my iPhone 7 using VLC for iOS [videolan.org].
Re: (Score:2)
The patent situation on h.265 is a total mess. Why even bother with it?
I encode video with h.265 every day. What's the problem?
Re: (Score:3)
How much do you pay in patent fees?
Re: (Score:2)
For content that is free to end users, none of the HEVC licensing bodies charge a royalty/content distribution fee.
In general, personal use of HEVC with either software or hardware encoders is free.
Unless you're distributing HEVC-encoded videos under a paid scheme, you're not going to pay any patent fees just for encoding videos.
Re: (Score:2)
none of the HEVC licensing bodies charge a royalty/content distribution fee.
You don't know that. You don't know that because the third HEVC patent pool, Velos Media [velosmedia.com], hasn't announced their licensing terms. You don't know that because some companies, like Technicolor [technicolor.com], are not in any patent pool and you must negotiate a separate HEVC license with them.
No point wasting time on HEVC's licensing mess. Just use VP9 now and use AV1 later. They really are royalty-free for all use cases.
Re:Why mess with h.265 (Score:5, Informative)
What a mess: now there is a third licensing pool for h.265...
http://blog.streamingmedia.com... [streamingmedia.com]
Re: (Score:2)
The three - soon to be four - most recent generations of iPhones have hardware support for h.265 already built-in [appleinsider.com]. Apple has been using the codec for FaceTime for three years now.
I suspect Google will support h.265 in addition to their own codecs. I mean, they talked a lot about removing h.264 support, but when push came to shove they quietly shelved that idea.
Re:Why mess with h.265 (Score:5, Informative)
I suspect Google will support h.265 in addition to their own codecs
No. They use VP9 on YouTube [googleblog.com] and have been for two years. They dropped support for 4K video in H.264 on YouTube [appleinsider.com] a while back. YouTube will start encoding video with AV1 around six months after the bitstream is finalized.
H.265 is futureless for web video. Major streaming services are members of the Alliance for Open Media [aomedia.org] (Google, Netflix, Hulu, Amazon) because they want to use AV1 on their service. They recognize correctly that H.265's licensing mess makes it a poor option.
Re: (Score:2)
Re: (Score:2)
Because it's what everyone else uses.
Unless you can get a smartphone with h.265 acceleration built in to play some other codec just as smoothly without worse battery life, then you be my guest?
Re: (Score:2)
3 Reasons to bother with HEVC:
1. Of the three next gen video codecs, it is the most mature. The number of hardware decoders and encoders for HEVC dwarfs that of VP9. HEVC beats VP9 in both size and quality in many applications, though they are close. AV1's bitstream format hasn't even been frozen yet.
2. The patent situation is a mess, but it can be navigated. Not by most end users, but in general, most personal use of HEVC with x265 or your hardware encoder of choice is royalty free.
3. HEVC has already been
Re: (Score:2)
HEVC definitely makes sense for Apple:
* Both ATSC 3.0 and DVB-UHDTV adopted HEVC as their codec; this means new TVs in North America, Europe, Australia, and much of Asia and Africa will soon have HEVC built-in. Satellite TV uses a variant of DVB, so it's probable they'll use HEVC for 4k as well. Digital cable uses ATSC in the US, so that will also likely be HEVC.
* Blu-ray 4k adopted HEVC.
* Most professional 4k cameras use HEVC, as do quite a few consumer cameras.
There is a hard requirement for HEVC: TV
Re: (Score:2)
As for internet and mobile phones, there's actually a huge
Re: (Score:2)
It's generally super-easy to implement or accelerate in hardware compared with Google and other open source or patent free codecs.
And yet there's lots of VP9 hardware accelerated devices out there. Implementing VP9 hardware acceleration is clearly not that hard.
custom codec chips (Score:2)
purpose built chips will make your "times X" arguments irrelevant, and they'll support any needed coding system
Re: (Score:2)
purpose built chips will make your "times X" arguments irrelevant, and they'll support any needed coding system
Precisely.
And if they can do it in a PHONE, they can sure as HELL do it in full-on GPU.
Apple (Score:3)
Apple means paying customers, so they carry huge weight in the market's codec decisions.
But there is one company much bigger than Apple when talking about video and that is of course Netflix. Whatever Netflix decides, companies will have to follow.
Re: (Score:3)
But there is one company much bigger than Apple when talking about video and that is of course Netflix. Whatever Netflix decides, companies will have to follow.
I suspect that decision has already been made [dailytech.com]... indications are Netflix is going with h.265.
Re: (Score:3)
If both Netflix and Apple are going with H.265 then we don't have a choice.
The fate of the alternatives is already decided; they will join other relics like HD-DVD, MiniDisc, etc.
Re: (Score:2)
indications are Netflix is going with h.265.
No. Netflix is going with VP9 [medium.com] and in future will go with AV1 [wikipedia.org]. Netflix is a member [aomedia.org] of the Alliance for Open Media [aomedia.org].
Re: (Score:2)
Hmm, they seem to be talking across each other. We know that Netflix is actually offering HEVC (h.265) streams right now, per my original link; and here is their manager of encoding technology [streamingmedia.com] talking about it back in 2014.
Re: (Score:2)
back in 2014
Yes, that was when there was still hope that the patent licensing mess would be resolved.
H.265 licensing has only become worse. There are three patent pools (MPEG LA [mpegla.com], HEVC Advance [hevcadvance.com], Velos Media [velosmedia.com]), one of which has not even announced terms, plus companies like Technicolor [technicolor.com] that are not in any patent pool, so you need a separate license from them. It's a complete joke. H.265 licensing is simply impractical.
It's cheaper and simpler to go with royalty-free formats like VP9 today and AV1 in future.
Re: (Score:2)
Then, last September, there was this article [extremetech.com] which says Netflix hasn't decided. They say HEVC saves them 20% of storage space versus the equivalent VP9 encode; but on the other hand VP9 saves them royalty payments.
Re: (Score:2)
Netflix hasn't decided
They have decided. AV1 will be their preferred codec. AV1 already outperforms H.265 [bitmovin.com] and companies like Bitmovin are adding support for it now. You can try an AV1 demo [bitmovin.com] with Firefox Nightly.
Re: (Score:2)
The cost looks to be $.005 (half a cent, US) per streaming user per month [wikipedia.org].
Re: (Score:2)
Not at all. A good example of this is Youtube actively adopting VP9 for what was at the time the single biggest source of data moving across the internet. Yet the hardware decoding adoption for consumers for VP9 stands precisely at 0%.
Netflix can adopt whatever they want, but unless they screw their entire legacy customer base they will need to maintain compatibility with the old, and unless they want to be in a position where someone else wants to take their streaming crown (and every man and his dog inclu
Re: (Score:3)
Yet the hardware decoding adoption for consumers for VP9 stands precisely at 0%
Intel has been shipping VP9 hardware decoding [techreport.com] for years. By default Microsoft Edge enables VP9 [windows.com] when hardware decoding is present (though you can override that to enable VP9 in software). VP9 is a standard video format on Android and many Android phones have VP9 hardware decoding.
Re: (Score:2)
Re: (Score:2)
running that in order for VP9 hardware decode.
No. VA-API on Linux has had accelerated VP9 encoding and decoding [phoronix.com] for a long time.
The need for hardware accelerated decode is overstated anyway. 1080p VP9 video from YouTube works fine in Firefox on an 11 year-old dual core desktop. My iPhone 7 plays VP9 video just fine in VLC for iOS.
Re: (Score:2)
Who cares about Intel and Microsoft and browsers? Only nerds watch Netflix on a freakin' computer. Normal people use set-top boxes and tablets.
nope, FPGAs (Score:2)
I think it's far more likely that this would drive Google to add an FPGA on a card to some of their boxes if they don't already have one on the motherboard. That would allow them to adapt to any new codec out there. Crisis averted!
Re: nope, FPGAs (Score:2)
Yes, FPGA and DSP makers will make a killing on that. Dedicated transcoding chip makers, not so much (they will have to pay a license for every codec they touch (unless they are Chinese)).
Re: (Score:2)
Google or Amazon could use cloud computing to transcode the video stream before it's sent to the mobile device. Problem solved.
Gonna need a source check on that. (Score:4, Informative)
H.265 encode and decode is baked into all hardware produced by the big three video card manufacturers.
Re: (Score:2)
The 00's are calling and want your servers back. (Score:2, Informative)
Most manufacturers now make barebones servers specifically designed to cram [tyan.com] in GPUs [supermicro.com]. Amazon AWS, Google Cloud and Microsoft Azure all offer virtual servers with multiple dedicated GPU's as well. Yes, your run of the mill server is still headless with an ASpeed IPMI but you can get absolutely crazy [servethehome.com] with GPU server platforms.
Re: (Score:2)
Re: (Score:2)
baked into all hardware
There's broad hardware support for VP9 [wikipedia.org] as well. The major CPU and GPU manufacturers are all members of the Alliance for Open Media [aomedia.org], so eventually they'll all have AV1 [wikipedia.org] support when it's finished.
The licensing mess around H.265 makes it a non-starter. There are three separate patent pools you need to buy a license from (MPEG LA [mpegla.com], HEVC Advance [hevcadvance.com], and Velos Media [velosmedia.com]).
No one can tell you what your final licensing cost will end up being. Velos Media hasn't even announced their licensing terms yet. Some companies, l [technicolor.com]
Re:Gonna need a source check on that. (Score:4, Informative)
Re:Gonna need a source check on that. (Score:4, Interesting)
As a tendency hardware H.264 encoding is inferior in terms of quality and bitrate compared to software encoding
While you're technically correct, that's a rather poor way of phrasing the actual situation: that the algorithms implemented in hardware are generally worse than the ones implemented in software. The way you've phrased things, it makes it sound like there's something inherently wrong with hardware that makes it produce worse results, but that isn't the case in the slightest. Rather, the problem is that hardware implementations will nearly always lag behind software implementations by anywhere from a few months to a few years, and that's why the "tendency" you're talking about holds true.
But for any given algorithm, it's worth pointing out that you'll get the same results regardless of where it's implemented, though you'll be able to do so far more efficiently in hardware.
Re: (Score:2)
Exactly.
Anything that is done in software, can be done in hardware.
Likewise anything that is done in hardware can be done in software... that's why we have these things that are all the rage now called VMs.
DEMAND better quality! (Score:3)
Yeah, remember that meeting we all had last Spring? We all got together with our pitchforks and torches, rammed down the door to the codec people's house, and said, "Enough with H.264, already! Give us something better, dammit." That was a helluva time.
Re: (Score:2)
I think the studios got confused.
Many people are demanding better quality movies instead of the crap that currently gets churned out by Hollywood. But apparently the studios think that means we want to see more of JJ Abrams' patented lens flare in higher resolution.
Encoding cost has benefits (Score:2)
The story summary talks all about the costs and nothing about the benefits. Less data to be served for high quality output could very well be worth the higher encoding cost.
An odd approach (Score:2)
Apparently the people that wrote this article think you'll be encoding media assets once per request, instead of once per millions of requests... otherwise what they are saying makes even less sense...
Good for Amazon (Score:2)
This is nothing but good news for AWS.
But (Score:2)
A nice shiny expensive new mac will get the job done just as well as your previous computer. Sure no ulterior motives there.
As Moore's law comes to an end, technology companies need new ways to invent sales. One is renting software. The other is making standards require more expensive versions of their products.
Not really a good way to gauge future costs (Score:5, Insightful)
Same thing with encryption. Old encryption standards typically aren't retired because they've been cracked. They're retired because a brute force attack against them used to take centuries or millennia, but computers have become fast enough that a brute force attack now takes only days or hours.
MPEG2 with its horrible compression ratio became the standard for DVDs because at the time MPEG4 took too much processing power to be economically added to every DVD player. The same is going to be true for these newer video codecs. Initially they'll be computationally expensive, but within a few years they'll be tolerable. And after a decade it'll be trivial and we'll be looking towards replacing them with a new codec which takes advantage of more powerful modern hardware.
Re: (Score:2)
Ironically, patents are what killed GIF and TIFF. The industry back then avoided patents rather than wanting to join them.
TIFF's patents are expired now, so if you have Windows 10 you can enable support in Add or Remove Features.
This move is collaboration to pick shitty tech to increase sales. Needing new Macs and shiny new phones helps Apple and Samsung greatly!
Re: (Score:2)
TIFFs are still out there all over the place, especially for scans and including within .pdfs.
That makes no sense (Score:2)
You encode the file once (which may well take 5 or 20 times the processor power) and upload to the server which will then save bandwidth and storage costs because of the smaller file size.
I just want it to work. (Score:3)
This Balkanization of codecs is a mess. Consumers, and developers, just want it to work. Let's see... I've got enough old DVDs and VHSs to watch for a decade... Maybe time to sit out this fight.
Holy crap... What The Bunny!?!?!?! (Score:2)
The author of this story seems to think that there's a correlation between business and encoding complexity.
Let's start with this. While H.264 and H.265 and AV1, etc... are all really cool, large scale content delivery systems tend to profit far more from better use of a codec's core components than from improvements to the core components themselves.
Let's consider things like improved motion search. Depending whether you
OF COURSE it's Apple's fault! (Score:2)
Nice clickbait-y headline, Dan. Despite the fact that the FIRST SENTENCE says "with Apple adopting H.265/HEVC in iOS 11 and [emphasis mine] Google heavily supporting VP9 in Android", the headline only mentions Apple. Gee, I wonder why that is?
Hey, remember how happy we all were when Android overtook Apple in market share, and now they're several times larger? So wouldn't a more ACCURATE headline put the bulk of the blame on Google? Who, by the way, is ALSO a strong driver of video codec change via YouTube?
A
Re: (Score:2)
I want better video quality, but in the sense that I'd like to see better 720p quality at lower bitrates.
Re: (Score:2)
Re: (Score:2)
I'm not going to install Firefox just to see that, sorry.
Re: (Score:2)
Re: (Score:2)
The price difference isn't ridiculous because it's still a relatively small number. You're right - it'll be on anything and everything. H.264 used to be crazy impossible to decode way back when. I remember Microsoft distributing a 1080p surfing clip in some form of MPEG-4 (may have been AVC1). It played so choppily I couldn't really view it except as practically a slideshow. I wish I could remember more, but I remember how I felt when my computer couldn't play it.
Re: (Score:2)
You can still buy brand new phones that don't have hardware acceleration for H.265. That's a 4 year old codec.
VP9 is even less well supported in hardware. It's 5 years old.
So, how long until a large chunk of people have devices with hardware acceleration for a codec that isn't even public yet?
So (Score:2)
Re: (Score:2)
Cost times Number of Uses (Score:5, Insightful)
At some point, it must be easier to upgrade everyone to fibre and just stream the content natively.
Unlikely. What both you and the OP have forgotten is that the increased cost of the servers needed to encode the video once will be offset by reduced storage and bandwidth requirements. The video will be streamed thousands, if not millions, of times, which not only requires a huge amount of bandwidth but also means the file will be stored in multiple locations on multiple disks. The savings in bandwidth and storage are thus magnified by the number of uses and will almost certainly offset the increased cost of one encoding.
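The encode-once, serve-many argument can be put in numbers. A toy break-even sketch (all figures here are hypothetical, chosen only for illustration):

```python
# Toy break-even model: one-time extra encode cost vs. per-view
# delivery savings. All numbers are hypothetical.

def break_even_views(extra_encode_cost, bandwidth_saved_gb_per_view,
                     cost_per_gb):
    """Views needed before delivery savings pay for the pricier encode."""
    savings_per_view = bandwidth_saved_gb_per_view * cost_per_gb
    return extra_encode_cost / savings_per_view

# Example: the fancier codec costs $50 more to encode a title, saves
# 0.5 GB of transfer per view, and CDN bandwidth costs $0.01/GB.
views = break_even_views(extra_encode_cost=50.0,
                         bandwidth_saved_gb_per_view=0.5,
                         cost_per_gb=0.01)
print(views)  # → 10000.0
```

For a popular title streamed millions of times, the one-time encode cost is noise next to the delivery savings, which is the point the parent is making.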
Re: (Score:2)
Re: (Score:2)
with better quality video being demanded by consumers
You're damn right that we demand better quality videos. The amount of crap that's being produced these days is astronomical, with only a few shows actually being of any real quality.
Of course the studios are just going to churn out even more derivative crap, because that's easier and quicker than producing quality videos.
Re: (Score:2)
Not remotely. What you forget is there are a large number of devices, especially low-powered devices, that do not have hardware HEVC decode and instead are still best served by MPEG4. While bandwidth changes will depend heavily upon the mix of devices, storage requirements will only increase since the provid
Re:Encoding costs (Score:5, Insightful)
Doesn't the compression just need to run once per show? Not every time you stream. Yeah, up to 500x more, but only once. Seems like there really isn't much of a problem.
Re: (Score:2)
but but.. we need to charge 500x more to stream this content each time you view it.
Re: Encoding costs (Score:3, Funny)
Meanwhile, in their latest bold move, Apple has abandoned multiple frames per second in favor of 1 frame per second for all content.
When asked about this disruptive move, an Apple representative stated: "Eliminating frames in favor of the best one provides a substantial benefit for all of those involved: there is less bandwidth involved for the streaming companies, and our customers get the predetermined best movie at the best resolution, with more time to enjoy each frame."
Re: Encoding costs (Score:2)
No, because you have to encode in different bitrates in order to have seamless playback on a multitude of devices and bandwidths. Today's streaming techniques don't encode one video file in its entirety but instead break the video down into multiple short videos that are stitched together by the player using a playlist. This allows the video to start almost instantly at a lower quality, then seamlessly play higher qualities a few seconds later, and drop down to lower qualities when you experience congestion in th
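The multi-bitrate "ladder" described above is why one upload turns into many encodes. A minimal sketch, with an illustrative ladder and segment length (not any real service's actual settings):

```python
# Illustrative ABR ladder: each (resolution, bitrate) pair is a separate
# full encode of the same source, cut into short segments so the player
# can switch quality mid-stream via a playlist.
LADDER = [
    ("426x240",   400),   # bitrate in kbps
    ("640x360",   800),
    ("1280x720", 2500),
    ("1920x1080", 5000),
]

SEGMENT_SECONDS = 6  # hypothetical segment duration

def playlist(duration_s):
    """Segment names a player would stitch together, per rendition."""
    n_segments = -(-duration_s // SEGMENT_SECONDS)  # ceiling division
    return {
        res: [f"{res}_{kbps}k_seg{i:04d}.ts" for i in range(n_segments)]
        for res, kbps in LADDER
    }

segs = playlist(60)          # a 60-second clip
print(len(segs["1920x1080"]))  # → 10
```

Four renditions means roughly four times the encode work for one source file, before you even add a second codec.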
Re: (Score:2)
No, because you have to encode in different bitrates in order to have seamless playback on a multitude of devices and bandwidths. Today's streaming techniques don't encode one video file in its entirety but instead break the video down into multiple short videos that are stitched together by the player using a playlist.
The short (or sometimes not-so-short) video that precedes every Apple TV viewing event, you know, the one with the spinner and the word "Buffering", seems to be encoded at a pretty decent resolution. Maybe it's built into the device for quick access every time?
Re: (Score:2)