Apple's iPhone Cameras Accused of Being 'Too Smart' (newyorker.com) 162
The New Yorker argues that photos on newer iPhones are "coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning...."
"[T]he truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of 'computational photography,' a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal." In late 2020, Kimberly McCabe, an executive at a consulting firm in the Washington, D.C. area, upgraded from an iPhone 10 to an iPhone 12 Pro... But the 12 Pro has been a disappointment, she told me recently, adding, "I feel a little duped." Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl's upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), "what I see in real life is what I see on the camera and in the picture." The new iPhone promises "next level" photography with push-button ease. But the results look odd and uncanny. "Make it less smart — I'm serious," she said. Lately she's taken to carrying a Pixel, from Google's line of smartphones, for the sole purpose of taking pictures....
Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, "I've tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing." A dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw instead of a feature. The device "sees the things I'm trying to photograph as a problem to solve," he added. The image processing also eliminates digital noise, smoothing it into a soft blur, which might be the reason behind the smudginess that McCabe sees in photos of her daughter's gymnastics. The "fix" ends up creating a distortion more noticeable than whatever perceived mistake was in the original.
Earlier this month, Apple's iPhone team agreed to provide me information, on background, about the camera's latest upgrades. A staff member explained that, when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames with different levels of exposure. Then a "Deep Fusion" feature, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high-dynamic range, or H.D.R., a technique that previously required some software savvy.... The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently. On both the 12 Pro and 13 Pro, I've found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game. Andy Adams, a longtime photo blogger, told me, "H.D.R. is a technique that, like salt, should be applied very judiciously." Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not....
The average iPhone photo strains toward the appearance of professionalism and mimics artistry without ever getting there. We are all pro photographers now, at the tap of a finger, but that doesn't mean our photos are good.
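The merge the summary describes is easier to picture in code. Apple's actual Deep Fusion pipeline is proprietary, so what follows is only a minimal single-scale sketch of the general idea (closer to a crude Mertens-style exposure fusion): weight each aligned frame per pixel by local sharpness and how well-exposed it is, then blend.

```python
# Illustrative sketch only -- not Apple's algorithm. Assumes a list of
# already-aligned grayscale frames as float arrays in [0, 1].
import numpy as np
from scipy.ndimage import laplace

def fuse_frames(frames):
    weights = []
    for f in frames:
        sharpness = np.abs(laplace(f))                  # favor detailed pixels
        exposedness = np.exp(-((f - 0.5) ** 2) / 0.08)  # favor mid-tones
        weights.append(sharpness * exposedness + 1e-8)
    w = np.stack(weights)
    w /= w.sum(axis=0)                         # normalize weights per pixel
    return (w * np.stack(frames)).sum(axis=0)  # weighted composite
```

A real pipeline adds alignment, multi-scale blending, and semantic masks on top, but the "clearest parts of nine frames, pixel by pixel" idea is essentially this weighting step.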
"[T]he truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of 'computational photography,' a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal." In late 2020, Kimberly McCabe, an executive at a consulting firm in the Washington, D.C. area, upgraded from an iPhone 10 to an iPhone 12 Pro... But the 12 Pro has been a disappointment, she told me recently, adding, "I feel a little duped." Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl's upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), "what I see in real life is what I see on the camera and in the picture." The new iPhone promises "next level" photography with push-button ease. But the results look odd and uncanny. "Make it less smart — I'm serious," she said. Lately she's taken to carrying a Pixel, from Google's line of smartphones, for the sole purpose of taking pictures....
Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, "I've tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing." A dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw instead of a feature. The device "sees the things I'm trying to photograph as a problem to solve," he added. The image processing also eliminates digital noise, smoothing it into a soft blur, which might be the reason behind the smudginess that McCabe sees in photos of her daughter's gymnastics. The "fix" ends up creating a distortion more noticeable than whatever perceived mistake was in the original.
Earlier this month, Apple's iPhone team agreed to provide me information, on background, about the camera's latest upgrades. A staff member explained that, when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames with different levels of exposure. Then a "Deep Fusion" feature, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high-dynamic range, or H.D.R., a technique that previously required some software savvy.... The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently. On both the 12 Pro and 13 Pro, I've found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game. Andy Adams, a longtime photo blogger, told me, "H.D.R. is a technique that, like salt, should be applied very judiciously." Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not....
The average iPhone photo strains toward the appearance of professionalism and mimics artistry without ever getting there. We are all pro photographers now, at the tap of a finger, but that doesn't mean our photos are good.
What it does is help most people (Score:2)
Yes, the iPhone default camera can and will override artistic intent (especially in terms of color).
But that is why there are MANY iPhone camera apps that let you capture basically raw images, and let you figure out what adjustments you want to apply (if any).
Meanwhile, the deep fusion and other technologies that Apple applies by default make most people's photos much better. It will capture low-light photos that honestly are hard to get even with a modern DSLR, just from a typical handheld shaky-hand shot.
Re: (Score:2)
Indeed. A couple of months ago I looked at old photo albums. The striking thing (besides how young I look) is the picture quality: even "good quality" photos from back then tend to be worse in terms of color and accuracy than even a quick snap today.
Agree (Score:2)
I had noticed my photos were amazing, though sometimes kinda contrasty when zoomed in. I was getting sharp, well-lit photos in light and motion conditions where I never would have expected them. So the net gain is huge. But maybe bad for fine art.
Re:What it does is help most people (Score:5, Funny)
Unless it delivers a mild electrical shock to people who hold the phone in portrait mode, it won't actually help anyone become a better photographer.
Re: (Score:2)
Love the complaints about computational photography, especially those that bemoan all of these newfangled features as opposed to getting the "real" images found in original digital cameras.
Guess those people don't understand all of the Bayer demosaicing and interpolation, white balance color correction, edge and line detection, denoising, scene and exposure evaluation, and more that "traditional" cameras apply to each and every photo.
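For anyone who hasn't seen how much massaging a "straight" digital photo already gets, here is a toy version of the first steps the parent lists. Real ISPs are vastly more sophisticated; the gains and gamma below are made-up illustrative numbers, not any vendor's actual values.

```python
# Toy RAW "development": nearest-neighbor demosaic of an RGGB Bayer
# mosaic, white-balance gains, then a gamma/tone curve. Illustrative only.
import numpy as np

def naive_develop(raw, r_gain=2.0, b_gain=1.6, gamma=2.2):
    # raw: 2D float array in [0, 1], RGGB Bayer pattern.
    r = raw[0::2, 0::2]                           # red photosites
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2   # average the two greens
    b = raw[1::2, 1::2]                           # blue photosites
    rgb = np.stack([r * r_gain, g, b * b_gain], axis=-1)  # white balance
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)        # tone curve
```

Every camera, phone or DSLR, does some version of this before you ever see the image; the argument is only about how far past it the phones go.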
Re: (Score:2)
yes, morons don't understand things - more to come at 11.
Re: (Score:3)
Photographers have complained about cameras' auto mode making comically saturated pictures since the beginning of digital photography.
Want snapshots that conform to mass aesthetics? Use auto.
Want control? Use manual.
Re: What it does is help most people (Score:2)
I doubt that's the case. As TFS noted, one of the photographers opted for a Google Pixel instead, which is notorious for relying on software corrections to get the photos it is known for, as the Pixel line just uses commodity camera hardware. In other words, there's nothing special about the camera hardware in the Pixel line at all, yet they are very competitive in DxOMark scores with phones that have very expensive and very specialized camera hardware. iPhones rarely make the top 10 in those ratings these days...
Re: (Score:2)
But I guess that is the Apple way of doing things. The user is not important.
Re: (Score:2)
Yes, the iPhone default camera can and will override artistic intent (especially in terms of color).
But that is why there are MANY iPhone camera apps that let you capture basically raw images, and let you figure out what adjustments you want to apply (if any).
Meanwhile, the deep fusion and other technologies that Apple applies by default make most people's photos much better. It will capture low-light photos that honestly are hard to get even with a modern DSLR, just from a typical handheld shaky-hand shot. It will correct most colors to be OK even in low light. Basically, overall it's making way more photos better than worse, especially for people not really used to the full range of camera tech or even things like shutter speed.
This - people who are fussy about the quality of their photos: save it RAW, or get that DSLR.
If we're at the point of bitching that our iPhone doesn't take the photos we like, perhaps it's time to invest in a high end DSLR, and a subscription to Photoshop.
A smartphone camera is and will always be just a smartphone camera. There are some physics involved here: sensor size, lens focal lengths, basic photographic principles. Anything that can make them better can be transferred to the larger sensors and...
Re: (Score:3)
DSLRs are a dead-end format, every major manufacturer has moved on to mirrorless now.
Point is, not much beats a larger sensor area and good lenses that allow you to control depth of field and other aspects of the image.
The mirror is just an adjunct.
Even then a proficient photographer works with the camera limitations. This is why a person who doesn't have the eye can use a professional camera and get crappy and banal results, and a talented person can get intriguing images with an old point and shoot camera. There are people who use old plastic Diana cameras and get wall hangers.
Which is...
Them damn keyboards too! (Score:3)
Re: (Score:2)
Art of typewriting? Who are you? Hemingway?
Re: (Score:2)
Art of typewriting? Who are you? Hemingway?
More like Mavis Beacon [wikipedia.org].
Re: (Score:2)
"... Furthermore, I am of the opinion that Carthage should be destroyed" :-P
This is the problem with.... (Score:2)
This is one of the problems with Apple's locked-down environment. If Samsung did this, an Android user could just buy a Motorola or some other brand. With Apple, though, you're stuck with whatever design decisions they make unless you want to change your phone OS, which means re-buying all your apps.
Re: (Score:3)
The majority of phones do this with their cameras. [youtu.be] Apple just gets the attention because, well... they're Apple.
Re: (Score:2)
The Slashdot summary very specifically states the article's author found herself having much better luck with a Google Pixel, so no, all phone cameras are not the same.
Re:This is the problem with.... (Score:5, Funny)
She was probably holding it wrong. You get just the picture you expect by holding it like this, see? ... Hold on, I meant like this. ... Anyway, that's beside the point, the point is she was holding it wrong.
Re:This is the problem with.... (Score:4, Insightful)
With Apple, though, you're stuck with whatever design decisions they make unless you want to change your phone OS, which means re-buying all your apps.
Except you aren’t. There are plenty of apps like Halide that expose the controls for all the relevant settings. Want to take RAW photos and postprocess them yourself like an actual professional would? Have at it.
Even the default Camera app has a bajillion controls for you to manage most of what was being complained about in the summary. It sounds like these users don’t like the defaults in certain circumstances, but that’s to be expected. The defaults are intended to provide a good baseline for most users in most scenarios, but they can and do fall flat on their faces in some scenarios.
I mean, smudgy photos during gymnastics? That’s a use case where all of those “smarts” will fail you because by the time it captures that ninth frame for compositing, the subject of the shot could have completed half a backflip. Most consumer-focused cameras for the last few decades have had separate “sports” modes to deal with the specific needs of shooting sports scenes, and the iPhone is no different in that regard. Either toggle the relevant settings or switch to a mode appropriate for that use case (e.g. use Burst Mode, Slo-Mo Mode, etc. to capture those scenes in better clarity at the cost of computational benefits that might have improved other shots).
Your daughter’s foot may look slightly smudgy in that one picture where you used the wrong mode, which is entirely within your power to address, but your selfies, group shots, pretty sunsets, plates of food, Instagram pics, and other everyday pictures all look dramatically improved from where they were just a few years ago when using the default settings.
Re: (Score:2)
Apple products are advertised to work well out of the box, and I very strongly think this is a big part of why Apple has been so successful. You can lay out as many alternative apps or setting adjustments as you want, but most consumers have no interest in such things.
I mean, "Apple: It just works" certainly implies what you describe should be unnecessary, and frankly, as someone with minimal interest in photography who just wants to snap a pic occasionally that looks good, I would be annoyed at having to go through...
Re: (Score:2)
Hahahaha
Re:This is the problem with.... (Score:4, Interesting)
Rose-tinted glasses applied to the whole wide world. Hopefully this means people are actually starting to push back.
Re: This is the problem with.... (Score:2)
Since when did movies ever strive to make something look realistic as a universal objective? Movies have *always* played around with visual aesthetics to heighten the impact of what they are striving to convey.
Re: (Score:2)
Compositing in film has made it so people are used to the artifacts of digital effects, they'll color grade the entire movie blue-orange and they do it so widely across the industry that nothing looks real at all anymore.
Errr, no. Colour grading in film is a specific artistic decision to make the characters pop or a scene feel a certain way, and it predates computers themselves. Careful use of chemicals and film produced different colour grades intended to make skin tones "pop". Basic colour theory says you balance accentuated orange with blue. The sky is blue too, so it works quite well when you make characters pop.
The effect is often used also to demonstrate the unreal (e.g. Amelie, which doesn't have blue), or to make the audience...
Re: (Score:2)
These sorts of filters have become so broadly applied that people don't know what naturally lit scenes look like anymore.
Though there’s plenty to say on your subject, you’re confused if you think that we’re talking about those sorts of filters here.
We’re talking about image signal processing (ISP), such as de-noising algorithms that take a composite from multiple images to intelligently remove the inherent digital noise you get because you’re dealing with people who don’t know how to light a scene, or who are bumping up against the limits of physics with what a camera sensor with sensor pixels...
Re: (Score:2)
there's a remastered HD version of Never Gonna Give You Up that is a perfect example of exactly what we're talking about. The dancers in the background smear out from the interpolation.
Actually, I'd suggest that's the exact opposite of what we're talking about here. Frame interpolation, which is what you're talking about, involves the insertion of new frames into a video where previously none existed. Even when you start with sharp images, it can introduce blurring where none previously existed.
In contrast, a gymnast flipping through the air with a phone set to the wrong mode is almost sure to have blurry images to begin with. The frame rate of the sensor would have been too slow, resulting...
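Back-of-envelope numbers show why no amount of compositing saves a mid-backflip shot. All values below are assumptions for illustration, not specs of any particular phone.

```python
# How far does a fast-moving subject smear across the sensor in one exposure?
subject_speed = 5.0      # m/s, a fast-moving limb (assumed)
exposure = 1.0 / 30.0    # s, plausible indoor auto-exposure time (assumed)
distance = 5.0           # m, camera to subject (assumed)
focal_length = 0.0057    # m, typical phone main lens (assumed)
pixel_pitch = 1.9e-6     # m, typical phone sensor pixel (assumed)

motion = subject_speed * exposure         # ~0.17 m moved during the exposure
smear = motion * focal_length / distance  # projected onto the sensor
print(smear / pixel_pitch)                # ~100 pixels of blur
```

A sports mode mostly just forces the exposure time down, trading that blur for noise, which then becomes the denoiser's problem instead.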
Re: (Score:2)
Maybe they should use some computational power to determine what the subject of the photo is before applying adjustments that may not be appropriate. If there is already a sports mode in the settings, use some computational power to determine that the photo subject is sports and automatically enable that mode.
Started by Google in 2013 (Score:2)
The Nexus 5 was the first phone to use Google's HDR+, which shot up to nine frames and combined them computationally.
However, although even more sophisticated today, Google's Pixel phones don't do the over-oranging that Apple's phones do, or the over-greening that Samsung phones do.
It's a choice Apple made to process in this way, and IMHO many of the iPhone photos look way too "overcooked".
Re: (Score:2)
No, iPhone started it. They have always oversaturated colors deliberately to make the photos they take 'pop' more while sacrificing photo accuracy. Taking good-looking but fake photos has always been a selling point of the device.
Re: (Score:2)
Most people want good looking photos and not "photo accuracy". In fact, what is photo accuracy? Any photo you see will have been affected by variables not strictly derived from the light that comes through the lens.
Re: (Score:2)
99% of users do no editing, they post directly to social media. The iPhone camera consistently produces photos that are ready to post, with a few exceptions. The photos tend not to be very realistic though. That's not what people want.
Google's camera is unparalleled for accurate looking photos.
Re: (Score:2)
That's not what people want.
Exactly. What people (including me, despite all my DSLRs and lenses) want out of phone photos is minimal effort and something that looks good on phone screens under typical viewing conditions.
You can always shoot RAW on iPhones as well, I've done it and the results are definitely more comparable to dedicated cameras. But the convenience isn't much better than on a dedicated camera at that point so why settle for the smaller sensor...
Re: Started by Google in 2013 (Score:2)
Photo accuracy means it looks like what you actually see. For example, grass should look green, not neon green.
I still remember all of the shit people gave the Pixel 2 XL, saying that the colors looked washed out while saying that the photos it took looked really good when viewed on another device. Actually it was the first phone to use an OLED display with sRGB colors out of the box, which gives you that accuracy, but compared to the normal oversaturated colors people are used to seeing on other phones, it was...
Re: Started by Google in 2013 (Score:2)
Sadly even Adobe has taken this on board, and the auto settings in Lightroom now also crush contrast and oversaturate the colours. The worst with iPhone photography is when it's done with a photo with completely overblown fake short depth of field.
Re: (Score:2)
"In fact, what is photo accuracy?"
The photograph being as close as possible to what you see with your own eyes.
In the world of fluorescent mineral photography, it's very important.
Re: (Score:2)
Actually, they didn't. If anything iPhone photos tended to look undersaturated and flat compared to most "out of the box" photos taken by other cameras on other phones.
Apple tends towards "natural" colors, but other camera makers (especially Samsung) bump saturation and contrast, making those photos look "better" and more "punchy" in comparison. That's why last year Apple added Photographic Styles that you could default to Vivid or Warm or whatever if you wanted that look.
Re: Started by Google in 2013 (Score:2)
No. Not unless you are from some alternative reality. In this one, iPhones started off with garbage cameras; then they got decent ones, with off-color fiddling.
Re: (Score:2)
The Google Pixel cameras are also infamous for having a distinct look to them: flattened dynamic range and dark edges.
Re: Started by Google in 2013 (Score:2)
I'm not sure what you mean by dark edges, but actually the dynamic range on the Pixel phones is good. Very, very good, in fact. What you're probably thinking of is color saturation, which some people confuse with dynamic range. And yeah, Pixel phones like to err on the side of not oversaturating. That's actually a good thing. If you're used to oversaturated colors, a Pixel will look dull to you. Like if you're used to really salty food, you'll think normal food tastes bland.
Having a good dynamic range means the camera keeps detail in both the shadows and the highlights...
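Since "dynamic range" keeps getting mixed up with saturation in this thread: dynamic range is just the ratio between the brightest level a sensor can record before clipping and its noise floor, usually quoted in stops (powers of two). The electron counts below are made-up but plausible magnitudes, not measured specs.

```python
# Dynamic range in stops = log2(full-well capacity / read noise).
import math

full_well = 6000.0   # electrons at clipping (assumed small phone pixel)
read_noise = 3.0     # electrons of noise floor (assumed)
print(math.log2(full_well / read_noise))   # ~11 stops for this hypothetical
```

Saturation, by contrast, is a color-rendering choice made after capture; a sensor with huge dynamic range can still produce neon grass if the tone mapping says so.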
Every idiot with a camera thinks they are a pro (Score:2)
Every idiot with a camera thinks they are a professional photographer. I first saw this when DSLRs got cheap enough that the average Joe Schmoe could afford one (instead of them costing as much as a new/used car).
I ran into this at weddings. While I moved into videography, I had to work alongside *shudder* "Wedding Photographers" that didn't know jack about their cameras, lenses, or settings. I've seen more "Pros" shooting on auto mode than I can count on my fingers and toes.
In videography, with professional video cameras (think High-...
Re: (Score:2)
Thing is, with Android, you can just download an app that gives you complete control over the camera. If you want full control over the image, you can shoot RAW and adjust as you want.
With Apple, you're stuck with whatever computational BS they do with the camera, and that's it. Your options are Apple's proprietary image format or JPEG, and that's basically it. (Not entirely, they added a bunch of Instagram-like filters to the camera, so if you want to make the image even worse, you can do that.)
Re: Every idiot with a camera thinks they are a pr (Score:3)
Thing is, with iOS, you can do exactly the same thing. Case in point: https://apps.apple.com/us/app/manual-raw-camera/id917146276
Re: (Score:3)
I said "just download" not "go and purchase." Android has Open Camera [opencamera.org.uk] which is a GPLv3 licensed open source camera app that gives you basically full access to anything your device can do with its camera.
Nothing like that exists on the iPhone. Yes, you can get apps that fake things by applying filters after the fact to the compressed image Apple gives you from the camera API. But you can't get direct access to the camera - Apple won't let you.
Re: (Score:2)
The problem with making assumptions.... someone who knows what a search engine is will come along and correct you.
Re: (Score:2)
The thing about DSLRs now is that 90% of the time you can run them in full Auto and you'll get perfectly acceptable photos. Simply switching from full Auto to Aperture Priority or Shutter Priority modes will turn the vast majority of your acceptable photos into good photos, assuming of course you know how to properly control the aperture/shutter. Obviously Manual lets you take better photos, but for 99% of real-world situations, weddings included (assuming proper/standard lighting), it isn't necessary. Of...
Settings exist. Use 'em (Score:2)
It's easier to complain about a new product than to read the instruction manual, or take a photography class (or even watch a YouTube video) explaining how to take photographs.
Apple: It just works. (Score:2)
I think a lot of people just want a cell camera that works without dicking around with the settings. I know I find no joy in fine tuning electronics.
Re: (Score:3)
Do not attempt to adjust the picture. We are controlling transmission. If we wish to make it louder, we will bring up the volume. If we wish to make it softer, we will tune it to a whisper. We will control the horizontal. We will control the vertical. We can roll the image, make it flutter. We can change the focus to a soft blur, or sharpen it to crystal clarity. For the next hour, sit quietly and we will control all that you see and hear.
The Outer Limits (Score:2)
Isn't this obvious? (Score:2)
Re: (Score:2)
Someone using a dslr knowledgeably knows how to take better pictures than they can with their cell phone. Duh?
She might not actually know how to use "a dslr knowledgeably" - she might be a point-and-shoot dSLR owner. Even mid-range dSLRs from the past decade or so have excellent automatic modes that can interpret motion and capture moving objects very well.
Article focuses on the 0.01% failures (Score:2)
Just use RAW (Score:3)
If you want more traditional controls, just turn on the ProRAW feature and get RAW-like files you have complete control over, right?
Huge files just like traditional RAW files, without the Apple computational work done.
Re: (Score:2)
Glad I scrolled down, saved me the trouble of googling. As distasteful as Apple's defaults might be, I figured you could dodge them somehow.
Re: (Score:2)
All RAW files are much larger than lossy formats like JPEG. They store an incredible amount of sensor data that you can take full advantage of in Photoshop. These are the files that, quote, "real photographers" work with. My Nikon kicks out 80MB RAW files along with the usual JPEG file.
As I understand it, for the iPhones the computational adjustments are applied to the JPEG/HEIF file and not to the RAW file.
I'm not saying it's a perfect scenario for the average mom taking photos of her kids, but...
Re: (Score:2)
To your point: Getting Apple to (ever) do the right thing is and has been painful. 'Auto correcting' photos has been an issue since the 35mm days. (TLDR: there were some really interesting cloud formations around me, so I stopped to take photos, and they wouldn't fit in one shot. When they were developed, the auto-...
Re: (Score:2)
Larger than lossy.
RAW files are usually smaller than lossless because they don't de-Bayer.
They're bigger than JPEG because JPEG compresses the crap out of images.
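The size relationships in this subthread are easy to sanity-check with rough arithmetic, using an assumed 12 MP sensor (real files vary with bit depth, metadata, and compression ratio):

```python
pixels = 12_000_000
raw_mb = pixels * 14 / 8e6     # one 14-bit sample per photosite: ~21 MB
rgb_mb = pixels * 3 * 8 / 8e6  # demosaiced 8-bit RGB, uncompressed: ~36 MB
jpeg_mb = rgb_mb / 10          # ~10:1 lossy compression is common: ~3.6 MB
print(raw_mb, rgb_mb, jpeg_mb)
```

So RAW really does land below a naive lossless RGB dump (no de-Bayer, one sample per pixel) while towering over JPEG.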
Re: (Score:2)
ProRAW supposedly sits somewhere in the middle: HDR/fusion applied, no colour grading... not that Apple deems peons worthy to know exactly.
Re: (Score:2)
Or manual mode; on Samsung phones all of these smart features can be toggled.
Good vs Good (Score:2)
This whole discussion just boils down to everyone disagreeing about what a "good" photo is. One person's good photo is another person's fuckup.
"Why doesn't your camera capture the beauty of the wondrous phenomenon that I was seeing in that moment?"
"Why doesn't your camera see beyond the ephemeral conditions and capture the Platonic essence of the subject, and then insert it the proper aesthetics du jour?"
Re: (Score:2)
A "Good" photo is a photo that's as close to 'photo realistic' (that term exists for a reason) as possible.
A "Good" photoshop/aftereffects is what you're getting with most cameras nowadays.
Re: (Score:2)
Hence the quotes. Because "Good" is nonsense.
I should've added a sarcasm tag seeing the reactions. lol
Re: (Score:2)
So what is that? Your eyes can't even agree on what it is, what with their auto white balancing, contrast adjustments, and oh yeah, extensive making shit up.
Buy a real camera if you are a Pro (Score:2)
Most people buy iPhones to watch YouTube on the shitter or take cat photos for their social media.
Semantic analysis? (Score:3)
The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently.
First time I've encountered such use of "semantic" in a totally non-linguistic sense. Even the technological use of the word has some reference to language, say, speech recognition. But who knows, maybe faces and places have meanings, and a picture after all is worth a thousand words?
Re: (Score:2)
Apple's advanced Neural Engine can calculate exactly how many words each image is worth. In the process of merging the multiple images which make up one "shutter release", Apple weights each image according to its calculated effective word count, extrapolating as necessary and applying advanced splining techniques to guarantee each final photo is worth a minimum of 1,024 words.
Re: (Score:3)
Semantics is just the study of meaning. This assigns meaning to different parts of the image.
- https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
I don't think the Wikipedia article even deals with images (ideas, yes). A Firefox search for the words "image" and "picture" yielded zero hits. Again, most of the meanings of the term "semantics" deal with language, which by extension could include symbolic logic and programming languages (which, like natural languages, are said to have a grammar and syntax). By comparison, can images have a grammar? We can of course create a hieroglyphic-like language made of pictograms. But this would involve the use of images...
Re: (Score:2)
I'm sorry, what point are you trying to make?
That "semantic analysis" is an incorrect phrase to describe what they're doing here? It is a perfectly correct usage of the word. It's just being used in a way you haven't personally encountered before. Now you have.
Naturally a huge part of "the study of meaning" is going to revolve around what words mean.
Re: (Score:2)
Sorry, my point is that semantics should at least deal with units of meaning. A single picture might be that unit of meaning when placed side by side with other pictures, say, to create a photo essay. The picture itself gains meaning only in relation to the other pictures. The word "and" can be a unit of meaning, but the letters that make it up (A, N and D) are meaningless.
From the article you linked to, WRT linguistic semantics.
Semantics can address meaning at the levels of words, phrases, sentences, or larger units of discourse.
Correcting or clarifying what I posted earlier, semantics is different from grammar...
Re: (Score:2)
LINGUISTIC semantics does. And that is certainly the usual connotation of the word. But semantics can have a broader meaning. Why are you so insistent that it cannot? You are not arguing with me, you are arguing with dictionaries.
Cherry-picking excerpts discussing semantics in language contexts (e.g. the computer science quote is specifically about programming languages) doesn't somehow disprove the more general definition I have...
Re: (Score:2)
No, I'm not arguing with you. I'm just pointing out that the use of semantics in describing smart camera software is either extended or useless.
The best way to get the sense of what I'm trying to say is by simplifying our example. Instead of faces, think of a red star. The red star has a definite connotation, for instance as a symbol of communism and, for I-don't-know-what reason, of post-Soviet Russia. That would be the red star's "semantics". But our hypothetical smart camera doesn't need to know that it's the...
Re: (Score:2)
It. Does. Not. Matter. That. The. Wikipedia. Article. Doesn't. Specifically. Discuss. Images.
I included it for the definition in the first sentence. That's it. Not because the article is some exhaustive reference that covers every single possible current or future application, which is how you decided to treat it for no reason whatsoever.
The definitions I have provided are sufficient. You haven't challenged any of them; you just keep repeating your own opinions. I'm sorry, but I don't care about your personal...
Re: (Score:2)
Okay, that appears to clarify things. Still, I assume that semantics, when applied to imaging, would involve figuring out the real-world significance of an image or part of an image. To repeat my example in my reply to another user, I'd consider it "semantics" if the imaging software figured out that the red star is the red star of communism. I'd understand this extended use of semantics in relation to images. However, I doubt that the software would distinguish between the star of Marxism and the star of M...
Cameras have adjusted your photos for many decades (Score:2)
Cameras and film processing have almost always adjusted your photos to be "more pleasing". This was always present in the old days of commercial photo processing, back in the days when you were using film and running it through chemical baths.
Most people, most of the time, take pictures of other people. You were never going to be happy with an accurate picture of your family looking like pallid vampires with tuberculosis, so the film and processing warmed up your pictures with an emphasis on reddish shades...
Re: (Score:2)
And a hidden secret: color film dyes weren't that consistent from roll to roll. The difference between cheap photo printing and expensive photo printing was the expensive photo printing paid a person to adjust the color roll-by-roll, and sometimes even frame-by-frame.
iPhones for the massless (Score:2)
Should have unadjusted option - but not Apple (Score:2)
A more reasonable approach would be a raw or at least semi-raw format with no adjustments other than fitting into the available bit depth.
Some of us in the Bay Area remember the day of the orange skies, during the bad fires, when it was dark enough at noon for street lights to turn on and the skies were a hellish orange. iPhone pictures happily corrected...
Re: (Score:3)
Or, you know, download one of a hundred different apps and exercise all the manual control you want.
Just like you can with any modern camera. Just download...
No. Wait. You can't do that, can you? Have to use whatever interface and menu system that comes with the camera.... ;)
Why is my tech not MORE amazing?!? (Score:2)
Like, I get that tech can be more amazing still. But maybe one can sound less whiny about it.
I wouldn't be surprised if cameras soon will (Score:2)
helpfully blur out any "naughty bits" in the photos taken.
Annoyingly, no examples (Score:2)
The article doesn't have (or I didn't see) any examples and, while I understand what they're talking about, it would be nice to see some comparisons.
Too much computational photography... so Pixel? (Score:2)
I think it's hilarious that when she wants to get away from the computational photography, she chooses a Pixel -- Google's Pixel is the real vanguard of computational photography, doing enormous processing of every image. But doing it better, apparently.
we are standing on the precipice (Score:2)
This is where it begins, altering the truth or creating an alternate reality. What if we just said no? It's not me, it's the camera. CGI, ever heard of it.
Maybe I don't understand the situation (Score:2)
If you are aiming for realism, you don't use a phone to take your picture. The phone is giving people exactly what they want - a brighter, more vivid version of whatever they point it at. That's the correct behaviour.
In fact, I'd suggest that almost nobody wants their photo to be truly representative of the actual subject. You are always curating the moment, and presenting a careful fiction.
Somebody mentioned Ansel Adams above. His pictures were absolutely stunning. But what he shot never looked like that.
Tiny lens .... (Score:2)
....All the processing in the world cannot compensate for the minuscule lens size - having multiple lenses helps, but those big lenses on DSLRs are there for a reason...
Maybe for still photos? (Score:2)
My experience using my iPhone 13 Pro is, it's a pretty vast improvement for shooting casual video. It's able to adjust the focus pretty well to stay on the subject of the action when you tap on them in the frame to mark them, and the video is crystal clear (with pretty decent audio recorded too).
My results taking still photos are mixed, but it does a pretty good job for me of your typical indoor or "outdoor during the day" quick photo snaps. Overall, if you took a look at a collection of 100 random pictures...
Not all bad (Score:2)
Over Christmas, I took some photos, at dusk, of a color-LED-lit Christmas wreath, with an iPhone 12 Pro and with a Sony Alpha 6600 mirrorless interchangeable-lens camera.
The pictures captured by the Sony merged the red, yellow, and orange lights into an indistinguishable mess. I tried bracketing the exposure, and couldn't improve the results. The iPhone 12 Pro did not; red was red, orange was orange, and yellow was yellow.
Cameras still cannot record the same dynamic range as can your eyes, and different cameras...