New H.266 VVC Codec Up To 50% More Efficient Than Previous Standard (appleinsider.com) 115
The Fraunhofer Heinrich Hertz Institute on Tuesday announced the H.266 Versatile Video Coding codec, which will power more data-efficient video capture and transmission on future iPhones. AppleInsider reports: Apple adopted the predecessor to the new codec, H.265/HEVC, in iOS 11. The updated video codec, which was developed after years of research and standardization, will bring a number of tangible benefits to future iPhone users. In its announcement, the Fraunhofer HHI said that H.266 will reduce data requirements by around 50% thanks to improved compression. With the previous HEVC codec, it took about 10GB of data to transmit a 90-minute ultra-high definition (UHD) video. H.266 can do that with 5GB. The codec, as detailed in a 500-page specification, was designed from the ground up for use with 4K and 8K streaming. It'll allow users to store more high-definition video and reduce the amount of data on cellular networks.
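A quick back-of-the-envelope check of those figures, as a Python sketch. The ~15 Mbps HEVC bitrate is not from the announcement; it is inferred from the 10GB/90-minute numbers above:

```python
# Back-of-the-envelope check of the 10 GB -> 5 GB claim. The ~15 Mbps HEVC
# bitrate is an assumption inferred from the article's own numbers
# (10 GB over 90 minutes), not an official figure.

def stream_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Size in gigabytes of a constant-bitrate stream."""
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

hevc_mbps = 10e9 * 8 / (90 * 60 * 1e6)   # ~14.8 Mbps, implied by 10 GB / 90 min
print(f"HEVC:  {stream_size_gb(hevc_mbps, 90):.1f} GB")        # ~10 GB
print(f"H.266: {stream_size_gb(hevc_mbps * 0.5, 90):.1f} GB")  # ~5 GB at half the bitrate
```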
Cool (Score:5, Interesting)
Re: (Score:2)
Given that there are more viable royalty-free codecs out there now than when H.265 was coming out, I suspect they'd need to dial back on that to gain adoption.
But Wikipedia seems to think that's not happening.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Re: Cool (Score:1)
Re: Cool (Score:5, Insightful)
Encumbrance has nothing to do with the size of the royalty fee.
Re: (Score:1)
Let's hope that they only charge for using the encoders, not the decoders. That way the wealthy iPhone owners can bear the brunt of the costs.
Re: Cool (Score:4, Insightful)
Hey iPhones are a great deal. My 6 plus is still getting security updates. Show me an Android phone released in 2014 still getting OS updates.
Re: (Score:2)
Lol enjoy your ogg-vorbis files.
there's more to the world than iPhones dude (Score:1)
Re: (Score:2)
But seriously... no one is going to care that their $999 iphone has a $0.10 patent for VCC included in the cost.
Sure, but your iPhone videos won't work on anything that isn't paying the fee (eg. Linux).
Re: (Score:2)
And before you say it, things like streaming boxes that use Linux as their base OS would obviously have some proprietary decoding hardware/software and pay the fee on their devices.
Too late, AV1 has already won. (Score:5, Interesting)
Take a step back and spend a moment to look at the member organisations of AV1, the competing codec that is free, open-source, and has already been around for quite some time:
Netflix, Google (Youtube), Amazon (Prime), i.e.: "where nearly all the video streaming is coming from" and "the video ecosystem that is currently surpassing TV/Cable"
ffs even Giphycat is in there (one of the major meme factories of the net).
Mozilla (Firefox), Google (Chrome) and VLC are there: the most likely software that you're going to be watching the content on.
Intel, AMD, Nvidia, ARM, and countless others: the chip manufacturers that are going to make the devices you'll be watching this content on are there too, no matter if the device is in your pocket or connected to a giant wall, or anything in between.
The thing is already in production (just check the Stats for Nerds on most high-resolution YouTube videos).
Remind me again why one would think that H266 is at all relevant? If you think that's because the 2020-released codec performs better than the 2018 one, think again: do you really think all these organisations will be ready to backtrack and return to the patent minefield, or do you think they'll simply wait for the AV1 successor, still royalty-free and open-source?
Yes, the dying, deprecated TV/Broadcast/Cable ecosystem is still deeply entrenched in the MPEG / H.26x encoding standards.
So they are probably going to use it for DVB-T3 or whatever, and there'll be a bit of hardware produced for the last couple of TV broadcasters that haven't gone bankrupt by then, and the two elderly viewers who are still using them.
And Apple will probably go the "luxury filled to the brim with useless features" and "do my own thing, not the standard" route as usual: they'll probably support H266 on their products and probably use it for their video streaming services.
But that's about it. The rest of the industry, the big relevant part of it, has already moved on to better/cheaper alternatives.
Re: (Score:3)
Several of the same companies that are members of AOMedia and worked on AV1 also worked on H.266. Apple, Intel, and Microsoft are all governing members of AOMedia, and also worked on H.266. Ericsson, Huawei, Qualcomm, and Sony, on the other hand, worked on H.266 only.
Considering Qualcomm has a near monopoly on Android smartphone chipsets (at least in North America), phones that support H.266 in hardware but not AV1 will not help AV1's adoption.
Rest of the world (Score:3)
Considering Qualcomm has a near monopoly on Android smartphone chipsets (at least in North America), phones that support H.266 in hardware but not AV1 will not help AV1's adoption.
Exactly that: "at least in North America". (And it seems to me you are also one of the last bastions of cable TV networks, compared to the rest of the world.)
Qualcomm tends to produce mid- to high-range chips that are mostly featured in mid- to high-range products.
Meanwhile, have a look at what is in the bargain bin of your local electronics stores: chances are a lot of the devices (cheap no-name smartphones, cheap no-name tablets, TV set-top boxes, etc.) feature much cheaper alternatives like Mediatek
Re: (Score:3)
Remind me again why one would think that H266 is at all relevant?
I'll wait for actual performance figures before I answer that question. But I'll tell you the answer could be the same reason we declared that AV1 has "won" despite not one of the member organisations producing content with it: efficiency.
At present hardware codecs don't exist. AV1 is woefully inefficient in software and absolutely painful to use. The best codec means diddlysquat if I can't hear my movie over the sound of a laptop fan, or if Amazon need to dedicate their entire A
Time goes on (Score:2)
I'll wait for actual performance figures before I answer that question.
Well, H266 is a codec released in 2020; it's going to be better than AV1, released in 2018, and conversely it's going to be beaten by whatever comes in 2022-2024.
The question is: will the gained performance be worth the patent minefield?
despite not one of the member organisations producing content with it:
You haven't checked the "Stats for Nerds" on YouTube lately? HD streams of popular channels are currently delivered over AV1 (if your hardware/CPU/GPGPU supports it).
Netflix also begs to disagree [netflixtechblog.com] with your assessment.
At present hardware codecs don't exist.
Yep, try to tell that to Mediatek [mediatek.com], the manufacturer of
Re: (Score:2)
Yep, try to tell that to Mediatek [mediatek.com],
I don't need to tell it to anyone, they need to tell it to me. I'm a tech head with all current gen hardware, current gen phone, current gen processors, current gen GPUs, current gen streaming device for my TV, and currently not a single hardware assisted AV1 device in the house.
Effectively they do not exist. The fact that one manufacturer has released something to market doesn't change that.
Likewise for your Netflix example, it doesn't change the fact that it isn't a default, is nothing more than a small t
Re: Too late, AV1 has already won. (Score:2)
Because AV1 sucks ass. That's why.
It is inferior to H.265, and even more so to H.266.
Oh, and fun fact: All the file sharers use H.265 and H.264. Nobody uses AV1.
That difference shows very clearly that it is purely a licensing cost issue. Nobody in the free world uses AV1. Only people in the imaginary property world use it. If H.266 were free, they'd abandon AV1 before you could say "protection racket".
Torrents (Score:2)
Oh, and fun fact: All the file sharers use H.265 and H.264.
Do they? Actually? A quick survey (from that constantly asking friend ;-) ) shows that x264 is the most frequently used encoder, not x265.
The thing is:
- file sharers aren't desperate down to the last bit of data, thanks to the distributed nature of torrents and other such peer-2-peer distribution channels.
- they need to balance small size with ease of use.
Sure, encoding H265 with x265 produces smaller files, but it turns out not that much hardware has built-in h265 capabilities.
That has direct
Re: Cool (Score:2)
The price is not reasonable at all, as it has no relationship with the actual work done.
So let's say 8 billion smartphones will ultimately get this every two years. That is $0.8-1.6 billion, or $400-800 million a year.
*But the job was done only once!*
So you can easily see that it will be above the total all-included development cost very quickly.
And everything after that is money they get but did not work for. It quickly becomes usury.
Imagine you and me doing that.
Imagine proposing to your employer, that you
Re: (Score:2)
Imagine proposing to your employer that you work 10 years, but you somehow get paid your salary until the death of your entire family lineage, including the ability by third parties to become your family, should it die out.
This is why patents expire after 20 years, not the life of grandchildren bullcrap that copyrights have.
Re: (Score:2)
Re:Cool (Score:5, Interesting)
The prior codec HEVC is the key example here. Many of its patent holders had basically reasonable demands, similar to h.264, which was super successful. But some fraction of the patent holders held out, and in fact have not to this day said what they would charge. For example, perhaps they will charge a penny per minute for all content distributed. They will not say, and so this patent threat hangs over anyone using HEVC. Predictably, this has greatly hindered the adoption of HEVC.
This crazy dysfunction of the patent-fee driven HEVC motivated the creation of https://aomedia.org/ [aomedia.org] and the royalty free (to be litigated, but I'm guessing it remains free) AV1 codec. If you are in favor of free and open tech, AV1 is for you. Thus far its trajectory is good, but we'll need to see hardware support for decode and then encode in 2021 and 2022 to seal the deal.
HEVC is a great example of a corrupt process leading to a huge waste of human effort instead of actually solving the problem.
Re: (Score:2)
but we'll need to see hardware support for decode and then encode in 2021 and 2022 to seal the deal.
This is currently one of the biggest hindrances to AV1. It is painfully slow to software-encode, and decoding, while not too bad, doesn't benefit from hardware acceleration, which leads to high battery consumption on portable devices.
Re: (Score:2)
decoding [of AV1 video] while not too bad doesn't benefit from hardware acceleration
What in AV1 isn't amenable to GPU compute acceleration (CUDA, AMD Stream, OpenCL)?
Re: (Score:2)
Has a research institution ever just given away their work for free? Or more to the point, would you expect them to?
Re: (Score:3)
Has a research institution ever just given away their work for free?
Yes. It happens all the time, particularly with research institutions affiliated with universities. Research is often funded by grants, including some government grants that result in data or software freely available to all.
Re: (Score:2)
Let me move the goalpost since I forgot a keyword: has a PRIVATELY FUNDED research institution ever just given away their work for free? The Fraunhofer institute largely exists to further privately funded research, with only a small portion of their work being government funded or through university affiliation / grants.
Re: (Score:2)
Google gave away VP8 and VP9; they are a private company, last I checked. Just because Apple and Fraunhofer don't give things away for free, doesn't mean the whole world acts that way.
Re: (Score:2)
Google did not give away VP8 or VP9 unconditionally. They gave them away with the caveat that they hold patents to them which you can license for free providing you give them a reciprocal agreement in return. They did this knowing fully well that you couldn't create a codec such as VP9 without infringing on someone else's patent. It was a gamble and it seems not to have paid off given how a patent pool was formed around VP9 by other companies last year who have announced the intention to license the encoder
Re: Cool (Score:2)
Researchers. Not profiteers. Key difference in personality.
Researchers don't give a flying fuck about profit. Only about fame. And money only plays a role in achieving that purpose. E.g. via grants. Never on its own.
The problems start when the money comes from a profiteer, who is by definition leeching on them.
Which is why the best research is financed by a government. As it *is* given away for free. As the improvement of *all* our lives is the (much nobler) goal.
Like the Internet, for example. Or the stuff NASA d
Open Letter to Grantmakers On Patent Policy (Score:2)
Something I wrote almost 20 years ago: https://pdfernhout.net/open-le... [pdfernhout.net]
"An Open Letter to All Grantmakers and Donors On Copyright And Patent Policy In a Post-Scarcity Society; executive summary: Foundations, other grantmaking agencies handling public tax-exempt dollars, and charitable donors need to consider the implications for their grantmaking or donation policies if they use a now obsolete charitable model of subsidizing proprietary publishing and proprietary research. In order to improve the effective
Re: (Score:2)
They just want 1% of the money spent on electricity to encode the file, or just give them a 2GW nuclear power station, whichever is the lesser. Same for decoding.
In other news, Nvidia says it expects to have a video card powerful enough to decode the files in realtime by 2040.
Re: (Score:2)
Re: (Score:2)
Is it free, unencumbered , and finally... (Score:5, Insightful)
Here's the Git repo (for Linux). BSD license (Score:5, Informative)
Here's the repo:
https://vcgit.hhi.fraunhofer.d... [fraunhofer.de]
The software is BSD license (basically free of copyright encumbrance).
https://vcgit.hhi.fraunhofer.d... [fraunhofer.de]
It uses some techniques that are patented. The group is aware that patent licensing was an issue with h.265, so they've tried to improve it for h.266. For some uses of some profiles, there will be a patent royalty of 10 cents or so. I haven't seen details on that yet, but it looks like the basic idea is that the manufacturer of your phone will pay a dime.
Re: (Score:2)
License and copyright are totally different.
Re: (Score:3)
> License and copyright are totally different.
Did you mean to say "patent and copyright are totally different" (and therefore discussed in two separate paragraphs of my post)?
The BSD licenses are *copyright* licenses. Copy right is literally the right to make copies. Licenses such as GPL and BSD grant you a license to make copies, under copy right law.
Re: Here's the Git repo (for Linux). BSD license (Score:2)
It is not a right though.
It is a privilege.
And it is purely the publisher's. Not the creators', inventors', artists', or users'.
That is part of why it is so evil and harmful.
Re: (Score:2)
It IS the creator's, under law.
The creator can, rather than setting up their own printing presses, sell publishing rights to someone who has printing presses and relationships with book stores, if they choose to.
Re: (Score:2)
I think the underlying issue is that publishers have formed a cartel that has successfully demanded concessions from authors that the original architects of the Berne Convention would consider unconscionable.
Re: (Score:2)
That's a massive understatement. H.265 had no less than 3 competing patent pools: groups of companies claiming their patents cover H.265 and demanding royalties. MPEG-LA [mpegla.com], the official licensing body for MPEG/ISO video standards, licenses H.264 decoders (software playback) at $0.25 per unit [mpegla.com]. This with patents from several dozen major American and Asian companies and universities.
Other companies saw this and decided to get in on the acti
Re: Here's the Git repo (for Linux). BSD license (Score:2)
Textbook racketeering.
And people have been made to believe this is not a crime...
Hell, even on Slashdot, where the view was quite the opposite of that, 10-15 years ago.
Re: (Score:2)
> unlike the GPL, the BSD license does nothing to address patent issues, so it's entirely possible that use of a BSD work without additional licensing violates the patents of someone
That's *like* GPL. Me putting my code under GPL doesn't make your patents vanish. What GPLv3 does, and the reason most people won't use v3, is provide me a mechanism to nullify any patents held *by people who mirror my GPLv3 code*.
If someone doesn't mirror your code, GPL does not and cannot affect their patents in any w
Re: (Score:1)
...does it run on Linux?
Yes. The codec runs in a bash shell inside a tiny Linux VM, tunneled over SSH, using DNSSEC.
Hope you fuckers are finally happy now.
Re: (Score:3)
...does it run on Linux?
Yes. The codec runs in a bash shell inside a tiny Linux VM, tunneled over SSH, using DNSSEC.
Hope you fuckers are finally happy now.
No Blockchain?
Re: Is it free, unencumbered , and finally... (Score:2)
That'd be done via a very p.c. "AI" called TrumBLM.
Yummy equal perceptual quality (Score:2)
We'll see how that holds up.
Re: (Score:2)
If it is like everything else, it will only have the same perceptual visual quality for blind people. Every other compression technology that results in higher compression rates results in what is politely termed "Compressed All To Rat Shit" and is quite noticeable, unless, of course, you are blind.
Re:Yummy equal perceptual quality (Score:5, Informative)
MPEG codecs operate on a pretty simple principle.
- Treat video frames as rectangular blocks
- Assume the receiver has something to compare to (previous frames, early transmitted future frames, neighboring blocks, etc...)
- Find the biggest possible block that has the lowest possible delta (difference) to something the decoder should already have.
- Now compress the delta.
- To compress the delta, convert from the color domain into the frequency domain. This is like treating the RGB or YCbCr planes as
- We drop some frequencies... we drop a lot of them, actually. If you were to view a master tape from a studio and compare it to the best Blu-ray possible, you'd see a lot of this. In fact, this is where we get most of our bitrate allowance. We drop more and more frequencies... this doesn't mean we're actually always dropping them... in many cases, we're simply "quantizing" them, or considering similar frequencies to be the same... so to your eyes, you will generally not see the difference between similar frequencies in many cases... so we cheat there.
- There are some nifty bit-magic tricks related to frequency matrices that happen here... like zig-zags and such... but I'll skip that as it's not relevant.
- Then there's entropy coding... there are a few common ones like VLC (variable-length coding) and arithmetic coding, which are part of the main profiles, and there are often others in high profiles. The point is that this step tries to represent data in the least number of bits. VLC is basically Huffman coding, which creates a histogram of the most common values. Kind of like how the letter 'E' is very common in English but the letter 'X' isn't. So, just like Morse code, which represents the more common letters with the least number of dots or dashes, VLC will represent the most commonly occurring numbers ('0' for example) with the least number of bits. The code is very special since the bits are carefully ordered: you don't need to encode a length and a value... just the value, since the pattern of bits handles the length as well.
- Bonus: 4:2:2, 4:2:0, etc... mean things like luma (brightness) are transmitted in full resolution, but chroma (generally red and blue differences... read up on this elsewhere) is transmitted at half or quarter resolution. This is actually a form of frequency-based quality loss. We transmit the black and white (luma) in full quality and high resolution... but the red, green and blue will be transmitted at varying qualities based on how we respond to color. Like green gets more "oomph" than red or blue.
This loses a lot of quality, but not nearly as much as you'd think, and it is entirely independent of the compression; most digital master tapes (Digital Betacam) for decades employed 4:2:2 or 4:2:0 in the masters themselves.
Overall, there's really one place where quality is dropped... in all other places, the coding is entirely lossless... meaning that if you compare the raw input to the output, it is almost identical... and I say almost because there are cases where converting to and from floating point (which doesn't happen in H.265 or H.266) could be off by a little itty-bitty bit.
The place where quality is generally lost is when we're quantizing... or we're trying to decide which frequencies look alike.
So, you've made the assertion that in order to achieve higher compression, we have to lose quality. This is e
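A minimal sketch of the transform-and-quantize step described above, applied to a single 8x8 block. The random "residual" and the flat quantizer step are made up for illustration; real codecs use integer transform approximations and standard-specific quantization tables:

```python
# Minimal sketch of the transform + quantize step on one 8x8 block. The flat
# quantizer and random residual are illustrative assumptions; real codecs use
# integer transforms and per-standard quantization matrices.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
# Pretend this is the residual ("delta") block left after prediction.
residual = rng.normal(0.0, 10.0, (8, 8))

coeffs = dctn(residual, norm="ortho")   # spatial domain -> frequency domain
q = 20.0                                # coarser step -> more zeros, smaller file
quantized = np.round(coeffs / q)        # the only lossy step in this sketch

# Most high-frequency coefficients are now zero, which is what the zig-zag
# scan and the entropy coder exploit to spend almost no bits on them.
print(f"{np.count_nonzero(quantized)} of 64 coefficients survive quantization")

reconstructed = idctn(quantized * q, norm="ortho")
print(f"max reconstruction error: {np.abs(residual - reconstructed).max():.2f}")
```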
Re: (Score:2)
tl;dr GP commenter just has satellite TV.
Re: (Score:2)
Thanks for the clear overview of video compression, always wondered about this stuff. Really appreciated.
Re: (Score:2)
what is politely termed "Compressed All To Rat Shit"
But it's 8k rat shit. That's better, right?
/s
Re: (Score:2)
More and more of the codec is becoming predictive, generating output that LOOKS correct to you while not being strictly based on original data. The natural outcome of this is a codec-implemented equivalent to those AI image-enhancement and interpolation algorithms out there right now, but better-performing since the encoder can decide which parts of the original data provide most bang-for-buck to the predictive code.
An extreme example for music would be reducing a song to sheet music (or a MIDI file if you
Re: Yummy equal perceptual quality (Score:1)
Re: Yummy equal perceptual quality (Score:1)
Unnecessary (Score:5, Insightful)
Because you will totally be able to tell the difference between 1080 and 4K on a screen that's under 15", let alone a smartphone.
Re: Unnecessary (Score:3)
But this one goes to 11.
Re: (Score:1)
> But this one goes to 11.
Only on an iPhone.
Re: (Score:1)
Not magic: Encoding CPU and high resolution (Score:2)
It's not magic any more than a Corvette is "magically" faster than an Accord.
The Corvette costs more to build and buy, and the gas mileage around town isn't as good as the Accord.
H.266 videos cost more CPU and a little more RAM to encode.
Just as a Corvette is designed for higher speeds, h.266 is designed for higher resolution. For one example, h.264 essentially treats each frame as a set of small blocks (4x4 up to 16x16) and compresses each block. H.266 uses blocks up to 128x128. In embedded devices, the additional RAM wa
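A rough sketch of the per-block memory cost that comment alludes to. The 10-bit 4:2:0 sample format is an assumption, not a number from the spec; the block sizes are the maximums for each standard:

```python
# Rough per-block memory cost. Assumes 10-bit 4:2:0 samples (an assumption,
# not a figure from the spec); 4:2:0 chroma adds half again on top of luma.
def block_kib(size: int, bits: int = 10) -> float:
    samples = size * size * 1.5
    return samples * bits / 8 / 1024

for n in (16, 64, 128):   # AVC macroblock, HEVC max CTU, VVC max CTU
    print(f"{n:>3}x{n}: {block_kib(n):.2f} KiB per block")
```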
Re: (Score:2)
If you are doing for example a YouTube live stream, you might well record it on an iPhone, but that doesn't mean everyone will be watching it on an iPhone.
Re: (Score:1)
Because you will totally be able to tell the difference between 1080 and 4K on a screen that's under 15", let alone a smartphone.
Sure you can tell the difference.
The new 4K model filled with bullshit features you never asked for costs $1200.
The 1080p model is only $700. Unfortunately, it's far too cheap to be considered anything but obsolete now.
Re:Unnecessary (Score:4, Informative)
>"Because you will totally be able to tell the difference between 1080 and 4K on a screen that's under 15", let alone a smartphone."
****BINGO****
I bet 98% of randomly selected people can't tell the difference between quality 1080p and ANY form of 4K moving video on even a 70" screen at a normal 10-foot viewing distance. A phone-sized screen at normal holding distance is equivalent to a 45" screen at 10 feet. 8K? Give me a break.
I know quite a few people who can't even tell the difference between quality 480P and 1080P in the same setup- and that is actually shocking.
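The viewing-distance claim is easy to sanity-check with a pixels-per-degree calculation. The 16:9 geometry is assumed, and the ~60 ppd ceiling is the usual 20/20-acuity rule of thumb, not a hard limit:

```python
# Pixels-per-degree check of the 70" / 10 ft claim. The 16:9 aspect ratio and
# the ~60 ppd acuity ceiling (the 20/20, one-arcminute rule of thumb) are
# assumptions for illustration.
import math

def pixels_per_degree(diag_in: float, h_pixels: int, dist_in: float) -> float:
    width = diag_in * 16 / math.hypot(16, 9)            # width of a 16:9 screen
    angle = 2 * math.degrees(math.atan(width / (2 * dist_in)))
    return h_pixels / angle

for label, px in (("1080p", 1920), ("4K", 3840)):
    ppd = pixels_per_degree(70, px, 120)                # 70" screen at 10 feet
    print(f'{label} on a 70" screen at 10 ft: {ppd:.0f} ppd')
```

1080p already lands around 67 ppd in that setup, above the ~60 ppd a 20/20 eye resolves, which is consistent with the bet above.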
Re: (Score:3)
Re: (Score:2)
I personally sit people in front of an old black and white TV and ask them to tell the difference. It's just as relevant as asking people about shitty broadcast TV in the age of online high quality streaming and VR headsets.
It's like comparing the speed of a 100hp car to a 500hp car but telling people they aren't allowed to step on the accelerator and they need to keep the handbrake on. Don't get hung up on completely irrelevant past practices when discussing what benefit this could bring to your future.
As
Re: (Score:2)
The difference between 1080i and 720p is roughly the difference between 540p and 720p. Temporal resolution only matters if it wasn't 24 frame content converted with 3:2 pulldown for interlaced. Lots of prime time TV is shot at 24fps.
Re: (Score:2)
Do you watch video on a TV like some kind of weirdo? I for one can tell the difference between 4K and 8K content just fine on my 2x 3.5" screens.
Re: (Score:3)
I can see a clear difference between my wife's iPhone 11 (330 ppi) and my Pixel XL (530 ppi), especially on text.
Same with computer monitors. I can easily see a difference between 4k and 5k on a 28" display (155 ppi and 210 ppi respectively) at normal viewing distances.
I looked at some 8k TVs too. It's hard to say because the source material isn't up to scratch yet, I was mostly just noticing the artefacts in the compression which made it look worse than 4k demos.
Re: (Score:2)
I can see a clear difference between my wife's iPhone 11 (330 ppi) and my Pixel XL (530 ppi), especially on text.
There's quite a difference between looking at static text on a screen, where the pixels are unchanging and you can examine every detail of their rendering, and video that is generally in constant motion, where your eyes are focusing on larger shapes on the display.
Re: (Score:2)
Of course they can, but I bet they wouldn't be able to tell the difference between 4k and 1080p using the same bit depth and HDR tech. So far most of the benefit from 4k content is more bits and HDR.
Re: Unnecessary (Score:2)
Re: (Score:2)
>"Why on earth would you watch a 70" screen from 10 feet? "
8 to 12 feet is the "standard" layout for a living/family room. So 10 is about average (regardless of screen size). You could, of course, sit closer, but furniture arrangement for movement and aesthetics are opposed to it for most people.
Re: (Score:2)
Agree with you - hate to date myself here, but I feel like 10-bit 4k HDR video will probably last ten to twenty years; with the right bitrate and encoder, it's very close to cinema quality.
I suspect a lovely 4k 90" (yes, 90) TV would probably be clearly superior to a 1080p 90" TV - but honestly I have a 65" 1080p and it looks fantastic. I refuse to go 4k until I can buy an 84" or larger set, for under $1600 US and I would very much prefer to avoid both LED / LCD and OLED (!!) I want a burn proof display,
Re: (Score:3)
Because you will totally be able to tell the difference between 1080 and 4K on a screen that's under 15", let alone a smartphone.
I can totally tell the difference between a 1080p and 4k stream even from crappy youtube videos on my 10" monitor.
I suppose if you have really bad vision or are sitting 10 feet away that might not be the case.
Re:Unnecessary (Score:4, Informative)
Of course you can, because Youtube's compression is a massacre. See this. [imgur.com]
Re:Unnecessary (Score:4, Informative)
One trick with YouTube is to convert your 1080p video to 4k before uploading. When streamed, 4k gets a much higher bitrate, and even with a 1080p source it looks better than just uploading 1080p.
Some channels are uploading native 8k now. Arguably pointless, but it undeniably makes for very good-looking video when streamed.
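For anyone wanting to try the upscale trick, a sketch driving ffmpeg from Python. It assumes ffmpeg is on your PATH; the filenames and the crf/preset choices are placeholders, not recommendations:

```python
# Sketch of the upscale-before-upload trick. Assumes ffmpeg is installed and
# on PATH; input/output names and quality settings are hypothetical.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input_1080p.mp4",
    "-vf", "scale=3840:2160:flags=lanczos",  # upscale to 4K so YouTube's
    "-c:v", "libx264", "-crf", "18",         # 4K tier assigns a higher bitrate
    "-preset", "slow", "-c:a", "copy",
    "output_4k.mp4",
], check=True)
```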
Re: Unnecessary (Score:5, Informative)
Re: (Score:2)
I can totally tell the difference between a 1080p and 4k stream even from crappy youtube videos on my 10" monitor.
Most 10" monitors that support 4k actually only have a 1080 panel.
What you are seeing is a difference in compression parameters, not a difference in resolution itself.
Re: (Score:2)
Because you will totally be able to tell the difference between 1080 and 4K on a screen that's under 15", let alone a smartphone.
I can tell the difference between 4K and 8K just fine on my 3.5" wide screen, so much so that I refuse to download 4K content now because it's too blurry.
Oh, what, you thought the only use case for a video codec was to watch shit on Netflix, and thus we shouldn't develop technological advancements?
Slashdot: the site where "I can't wait for full holographic displays because VR isn't good enough" and "fuck it, I can't tell anything better than 1080p anyway" are both unironically considered equally insightful.
Re: (Score:2)
I guess you've never heard of nor seen a VR headset or ever worked with a panoramic view where video needs to be encoded outside your field of view.
So the answer is: Neither. You just lack knowledge and imagination and just like advanced technology is indistinguishable from magic, you've only heard of someone describing experiencing said technology and determined "magic isn't real".
Re: (Score:2)
You can. Whether you should be holding the phone that close to your face instead of using a larger screen from further away is a separate matter entirely. Also, phones can cast to those larger screens.
Re: (Score:2)
You probably won't be able to tell the difference on your smartphone, but the iPhone can already record 4k video. Being able to compress it more while recording will be a big deal. It'll also help if, say, you want to stream the 4k video to your 4k TV wirelessly.
To an extent, I don't understand why you'd want to record 4k video on your phone in the first place, but it's a thing, so it may as well be a thing that works as well as possible. (And of course, looking at my photos and videos from my phones 15 yea
Re: (Score:2)
It's like all the people who decided to buy a new 4k tv because they saw a commercial for one on their current HD one. The picture looked amazing!
Patent landmine (Score:2, Insightful)
Beware of "will reduce file sizes by X" claims, because they don't say anything about retaining visual quality.
4K videos compressed with h265 and h264 look similar, but you can certainly tell the difference when you watch them on an actual 4K screen. The only video content that really benefits at all from high in-flight compression is 2D animation and 3D computer-generated graphics (e.g. what a video game cutscene might produce), and even then you usually pay for it with noisy dark scenes and rippled fas
Re: (Score:3)
Where are we regarding chroma subsampling for h266?
I see that h266 has the checkboxes ticked for a lot of features that decrease bandwidth, such as higher bit depths and HDR as a separate backlight value.
So those 50% claims need to eat something, because there is no free lunch. For 265, one of the big features was increased resolution allowing for far more aggressive chroma subsampling, eating details and pixels.
At the same quality. Cost: Encoding takes more CPU (Score:5, Informative)
The quoted file size reduction is at the same visual quality.
You can of course decide to instead have no size reduction and higher quality than h.265, or slightly improved quality and slightly lower file sizes.
A cost is that encoding takes more CPU.
It also works best with larger resolutions. H.264 and H.265 were primarily designed for lower resolutions, though they could be used at higher resolution. H.266 can be used for lower resolution, just like a Corvette can drive slow, but it won't get the same gas mileage as a Civic.
Re: (Score:2)
Hopefully 10 bit colour and better HDR support should help with poor quality in dark scenes.
Get the percentages right! (Score:2)
Up To 50% More Efficient Than Previous Standard
will reduce data requirements by around 50%
Well which is it? 50% more efficient means it reduces data requirements by 33%. Reducing data requirements by 50% means it's 100% more efficient.
Of course all the previous improvements, from H.262 up to H.265, have also claimed close to 100% improvements, but the reality has fallen somewhat short in each case. It's hard to be precise because the perceived quality of decoded video is subjective.
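The difference between the two headline phrasings is easy to pin down, treating "efficiency" as quality delivered per bit:

```python
# "X% more efficient" vs "reduces data by Y%" are different claims.
def data_reduction(efficiency_gain: float) -> float:
    """Fraction of data saved at equal quality, given an efficiency gain."""
    return 1.0 - 1.0 / (1.0 + efficiency_gain)

print(f"50% more efficient  -> {data_reduction(0.5):.0%} less data")  # 33%
print(f"100% more efficient -> {data_reduction(1.0):.0%} less data")  # 50%
```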
How does it compare with AV1? (Score:4, Insightful)
Does this new codec give enough of an advantage over the AV1 codec to justify whatever expensive patent license fees this codec no doubt requires you to pay?
Re: How does it compare with AV1? (Score:3)
AV1 is in the same ballpark as HEVC (unless you care about encoding speed, where it's much slower). VVC is the next generation and blows both HEVC and AV1 out of the water with its bitrate savings.
Your question is a bit like asking how HEVC or AV1 compares with AVC or VP9.
Re: (Score:1)
It is reasonable to expect that VVC will be more efficient than AV1, but "blows both out of the water" is an exaggeration. Either way, it is doubtful that the improvement will be worth the licensing insanity of HEVC/VVC, so it hardly matters.
Re: (Score:2)
Apple have been all HEVC for a while - what do they know? Why aren't they being sued?
Sisvel has brought the same mess to AOM and AV1/VP9, so it really looks like there isn't much alternative with codecs these days other than having to deal with patents and licensing.
And of course MPEG also have EVC in the pipeline: baseline profile is royalty free and offers 30% bitrate savings over AVC. It's going to be interesting to see the effects of this codec on the market.
Re: (Score:2)
Even the old codec gives enough of an advantage, as evidenced by the widespread support for HEVC. If AV1 wants to be considered a codec rather than a technical curiosity, we're going to need hardware encoder and decoder support. Currently no patent fee is too high when you consider the power consumption and processing requirements to software-encode a library into AV1.
Patents are a problem (Score:2)
500 pages (Score:1)
... of which 50 pages is the technical description and 450 pages of patent claims </sarcasm>
Re: (Score:2)
Re: (Score:2)
Given that they're charging more for encoding than decoding, it makes sense that wealthy iPhone owners bear the brunt of the costs.
Re: (Score:2)