Apple's iPhone Cameras Accused of Being 'Too Smart' (newyorker.com)

The New Yorker argues that photos on newer iPhones are "coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning...."

"[T]he truth is that iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of 'computational photography,' a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal." In late 2020, Kimberly McCabe, an executive at a consulting firm in the Washington, D.C. area, upgraded from an iPhone 10 to an iPhone 12 Pro... But the 12 Pro has been a disappointment, she told me recently, adding, "I feel a little duped." Every image seems to come out far too bright, with warm colors desaturated into grays and yellows. Some of the photos that McCabe takes of her daughter at gymnastics practice turn out strangely blurry. In one image that she showed me, the girl's upraised feet smear together like a messy watercolor. McCabe said that, when she uses her older digital single-lens-reflex camera (D.S.L.R.), "what I see in real life is what I see on the camera and in the picture." The new iPhone promises "next level" photography with push-button ease. But the results look odd and uncanny. "Make it less smart — I'm serious," she said. Lately she's taken to carrying a Pixel, from Google's line of smartphones, for the sole purpose of taking pictures....

Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, "I've tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing." A dusky purple gets edited, and in the process erased, because the hue is evaluated as undesirable, as a flaw instead of a feature. The device "sees the things I'm trying to photograph as a problem to solve," he added. The image processing also eliminates digital noise, smoothing it into a soft blur, which might be the reason behind the smudginess that McCabe sees in photos of her daughter's gymnastics. The "fix" ends up creating a distortion more noticeable than whatever perceived mistake was in the original.

Earlier this month, Apple's iPhone team agreed to provide me information, on background, about the camera's latest upgrades. A staff member explained that, when a user takes a photograph with the newest iPhones, the camera creates as many as nine frames with different levels of exposure. Then a "Deep Fusion" feature, which has existed in some form since 2019, merges the clearest parts of all those frames together, pixel by pixel, forming a single composite image. This process is an extreme version of high-dynamic range, or H.D.R., a technique that previously required some software savvy.... The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently. On both the 12 Pro and 13 Pro, I've found that the image processing makes clouds and contrails stand out with more clarity than the human eye can perceive, creating skies that resemble the supersaturated horizons of an anime film or a video game. Andy Adams, a longtime photo blogger, told me, "H.D.R. is a technique that, like salt, should be applied very judiciously." Now every photo we take on our iPhones has had the salt applied generously, whether it is needed or not....
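As an aside, the multi-frame merge described above is easy to sketch in toy form. Here is a minimal example of exposure-weighted fusion, assuming a stack of already-aligned grayscale frames; this is generic HDR-style merging, not Apple's actual Deep Fusion algorithm:

    # Toy exposure fusion: merge aligned frames pixel by pixel, weighting
    # each pixel by how well exposed it is. Illustrative only; NOT Apple's
    # proprietary Deep Fusion pipeline.
    import numpy as np

    def fuse_exposures(frames):
        """frames: list of aligned grayscale images, floats in [0, 1]."""
        stack = np.stack(frames)                      # (n, h, w)
        # Pixels near mid-gray get the highest weight; blown-out or
        # crushed pixels get the lowest.
        weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
        weights /= weights.sum(axis=0, keepdims=True) + 1e-12
        return (weights * stack).sum(axis=0)          # one composite image

    # Three synthetic "exposures" of the same scene.
    rng = np.random.default_rng(0)
    scene = rng.uniform(size=(4, 4))
    frames = [np.clip(scene * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
    print(fuse_exposures(frames))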

The average iPhone photo strains toward the appearance of professionalism and mimics artistry without ever getting there. We are all pro photographers now, at the tap of a finger, but that doesn't mean our photos are good.

  • Yes the iPhone default camera can and will override artistic content (especially in terms of color).

    But that is why there are MANY iPhone camera apps that let you capture basically raw images, and let you figure out what adjustments you want to apply (if any).

    Meanwhile the deep fusion and other technologies that Apple applies by default make most people's photos much better. It will capture low-light photos that honestly are hard to get even with a modern DSLR, just from a typical handheld shaky-hand shot.

    • Indeed. A couple of months ago I looked at some old photo albums. The striking thing (besides how young I look) is the picture quality: even "good quality" photos from back then tend to be worse in terms of color and accuracy than even a quick snap today.

      • I had noticed my photos were amazing, though sometimes a bit contrasty when zoomed in. I was getting sharp, well-lit photos in light and motion conditions where I never would have expected them. So the net gain is huge. But maybe bad for fine art.

    • by Linux Torvalds ( 647197 ) on Sunday March 20, 2022 @02:59PM (#62374919)

      Unless it delivers a mild electrical shock to people who hold the phone in portrait mode, it won't actually help anyone become a better photographer.

    • by ceoyoyo ( 59147 )

      Photographers have complained about cameras' auto mode making comically saturated pictures since the beginning of digital photography.

      Want snapshots that conform to mass aesthetics? Use auto.

      Want control? Use manual.

    • I doubt that's the case. As TFS noted, one of the photographers opted for a Google Pixel instead, which is notorious for relying on software corrections to get the photos it is known for, as the Pixel line just uses commodity camera hardware. In other words, there's nothing special about the camera hardware in the Pixel line at all, yet they are very competitive in DxOMark scores with phones that have very expensive and very specialized camera hardware. iPhones rarely make the top 10 in those ratings these days…

    • by Askmum ( 1038780 )
      The dumb thing is: you have to go out of your way to get the normal image. Instead of offering a setting in the standard photo app to choose between "make it like Apple wants it" and "make it like I want it", they just make it like Apple wants it.
      But I guess that is the Apple way of doing things. The user is not important.
      • That would be the right way - just add an "unprocessed" option in the default camera app and this whole topic is moot. (At least for 99% of people who don't want to dicker about things at the level of Bayer filters... of course there is no such thing as imaging without some sort of digital or chemical processing involved).
    • Yes the iPhone default camera can and will override artistic content (especially in terms of color).

      But that is why there are MANY iPhone camera apps that let you capture basically raw images, and let you figure out what adjustments you want to apply (if any).

      Meanwhile the deep fusion and other technologies that Apple applies by default make most people's photos much better. It will capture low-light photos that honestly are hard to get even with a modern DSLR, just from a typical handheld shaky-hand shot. It will correct most colors to be OK even in low light. Basically, overall it's making way more photos better than worse, especially for people not really used to the full range of camera tech or even things like shutter speed.

      This - People who are fussy about the quality of their photos - save it raw. Or get that DSLR.

      If we're at the point of bitching that our iPhone doesn't take the photos we like, perhaps it's time to invest in a high end DSLR, and a subscription to Photoshop.

      A smartphone camera is and will always be just a smartphone camera. There are some physics involved here: sensor size, lens focal lengths, basic photographic principles. Anything that can make them better can be transferred to the larger sensors and…

      • by imidan ( 559239 )
        These days, a DSLR doesn't even have to be that high-end to take good pictures. As always, the lenses are where they get you.
      • DSLRs are a dead-end format, every major manufacturer has moved on to mirrorless now.
        • DSLRs are a dead-end format, every major manufacturer has moved on to mirrorless now.

          Point is, not much beats a larger sensor area, good lenses that allow you to control depth of field and other aspects of the image.

          The mirror is just an adjunct.

          Even then a proficient photographer works with the camera limitations. This is why a person who doesn't have the eye can use a professional camera and get crappy and banal results, and a talented person can get intriguing images with an old point and shoot camera. There are people who use old plastic Diana cameras and get wall hangers.

          Which is…

  • by Huitzil ( 7782388 ) on Sunday March 20, 2022 @01:41PM (#62374715)
    Everyone is so focused on how smart iPhone cameras are that we forget where this all started: keyboards. Those damn tools disrupted the typewriter and blurred the line between the art of typewriting and the computationally intensive predictive text enabled by the modern PC keyboard.
  • This is one of the problems with Apple's locked down environment. If Samsung did this an Android user could just buy a Motorola or some other brand. With Apple though you're stuck with whatever design decisions they make unless you want to change your phone OS which means re-buying all your apps.

    • The majority of phones do this with their cameras. [youtu.be] Apple just gets the attention because, well... they're Apple.

    • by Anubis IV ( 1279820 ) on Sunday March 20, 2022 @02:23PM (#62374823)

      With Apple though you're stuck with whatever design decisions they make unless you want to change your phone OS which means re-buying all your apps.

      Except you aren’t. There are plenty of apps like Halide that expose the controls for all the relevant settings. Want to take RAW photos and postprocess them yourself like an actual professional would? Have at it.

      Even the default Camera app has a bajillion controls for you to manage most of what was being complained about in the summary. It sounds like these users don’t like the defaults in certain circumstances, but that’s to be expected. The defaults are intended to provide a good baseline for most users in most scenarios, but they can and do fall flat on their faces in some scenarios.

      I mean, smudgy photos during gymnastics? That’s a use case where all of those “smarts” will fail you because by the time it captures that ninth frame for compositing, the subject of the shot could have completed half a backflip. Most consumer-focused cameras for the last few decades have had separate “sports” modes to deal with the specific needs of shooting sports scenes, and the iPhone is no different in that regard. Either toggle the relevant settings or switch to a mode appropriate for that use case (e.g. use Burst Mode, Slo-Mo Mode, etc. to capture those scenes in better clarity at the cost of computational benefits that might have improved other shots).

      Your daughter’s foot may look slightly smudgy in that one picture where you used the wrong mode, which is entirely within your power to address, but your selfies, group shots, pretty sunsets, plates of food, Instagram pics, and other everyday pictures all look dramatically improved from where they were just a few years ago when using the default settings.
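A back-of-envelope sketch of the half-a-backflip point above; every number here is an assumption chosen for illustration, not an Apple spec:

    # Why a static-scene merge smears fast motion: estimate how far the
    # subject travels, in pixels, during a multi-frame capture window.
    subject_speed_mps = 5.0    # gymnast's feet mid-flip (assumed)
    capture_window_s = 0.3     # time to gather the burst of frames (assumed)
    distance_m = 4.0           # camera-to-subject distance (assumed)
    focal_px = 3000            # focal length in pixel units (assumed)

    move_m = subject_speed_mps * capture_window_s   # 1.5 m of travel
    move_px = focal_px * move_m / distance_m        # projected onto the sensor
    print(f"subject moves ~{move_m:.1f} m, ~{move_px:.0f} px across the burst")
    # Over a thousand pixels of displacement: the frames disagree wildly,
    # so any merge assuming a static scene produces a watercolor smear.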

      • by skam240 ( 789197 )

        Apple products are advertised to work well out of the box, and I very strongly think this is a big part of how Apple has been so successful. You can lay out as many alternative apps or setting adjustments as you want, but most consumers have no interest in such things.

        I mean "Apple: It just works" certainly implies what you describe should be unnecessary and frankly as someone with minimal interest in photography and who just wants to snap a pic occasionally that looks good I would be annoyed at having to go thr

      • by Kremmy ( 793693 ) on Sunday March 20, 2022 @03:05PM (#62374927)
        It's not a dramatic improvement, it's a remarkably subjective false improvement. These sorts of filters have become so broadly applied that people don't know what naturally lit scenes look like anymore. Compositing in film has made it so people are used to the artifacts of digital effects. They'll color grade the entire movie blue-orange, and they do it so widely across the industry that nothing looks real at all anymore. People are used to it now because they have been doing it for so long, but it was never an improvement.

        Rose-tinted glasses applied to the whole wide world; hopefully this means people are actually starting to push back.
        • Since when did movies ever strive to make something look realistic as a universal objective? Movies have *always* played around with visual aesthetics to heighten the impact of what they are striving to convey.

        • Compositing in film has made it so people are used to the artifacts of digital effects. They'll color grade the entire movie blue-orange, and they do it so widely across the industry that nothing looks real at all anymore.

          Errr no. Colour grading in film is a specific artistic decision to make the characters pop or a scene feel a certain way, and predates computers themselves. Careful use of chemicals and film promoted different colour grades intended to make skin tones "pop". Basic colour theory says you balance accentuated orange with blue. The sky is blue too so it works quite well when you make characters pop.

          The effect is often used also to demonstrate the unreal (e.g. Amelie, which doesn't have blue), or to make the audience…

        • These sorts of filters have become so broadly applied that people don't know what naturally lit scenes look like anymore.

          Though there’s plenty to say on your subject, you’re confused if you think that we’re talking about those sorts of filters here.

          We’re talking about image signal processing (ISP), such as de-noising algorithms that take a composite from multiple images to intelligently remove the inherent digital noise you get because you’re dealing with people who don’t know how to light a scene, or who are bumping up against the limits of physics with what a camera sensor with sensor pixels…
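The de-noising idea in the comment above is simple to demonstrate: averaging N independent noisy captures of a static scene cuts the noise by roughly the square root of N. A toy sketch:

    import numpy as np

    rng = np.random.default_rng(1)
    scene = np.full((64, 64), 0.5)                         # flat gray scene
    frames = scene + rng.normal(0, 0.1, size=(9, 64, 64))  # nine noisy frames

    print(f"noise, one frame:   {frames[0].std():.3f}")            # ~0.100
    print(f"noise, nine merged: {frames.mean(axis=0).std():.3f}")  # ~0.033
    # The catch: if the scene moved between frames, this same averaging
    # becomes exactly the smudge the thread is complaining about.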

          • by Kremmy ( 793693 )
            I'm not going to link it because it's Rick Astley, but there's a remastered HD version of Never Gonna Give You Up that is a perfect example of exactly what we're talking about. The dancers in the background smear out from the interpolation.
            • there's a remastered HD version of Never Gonna Give You Up that is a perfect example of exactly what we're talking about. The dancers in the background smear out from the interpolation.

              Actually, I'd suggest that's the exact opposite of what we're talking about here. Frame interpolation, which is what you're talking about, involves the insertion of new frames into a video where previously none existed. Even when you start with sharp images, it can introduce blurring where none previously existed.

              In contrast, a gymnast flipping through the air with a phone set to the wrong mode is almost sure to have blurry images to begin with. The frame rate of the sensor would have been too slow, resulting…

      • by Rhipf ( 525263 )

        Maybe they should use some computational power to determine what the subject of the photo is before applying adjustments that may not be appropriate. If there is already a sports mode in the settings use some computational power to determine that a photo subject is sports and automatically enable the sports mode.

  • The Nexus 5 was the first phone to use Google's HDR+, which shot up to nine frames and combined them computationally.
    However, although even more sophisticated today, Google's Pixel phones don't do the over-oranging that Apple's phones do, or the over-greening that Samsung phones do.
    It's a choice Apple made to process in this way, and IMHO many of the iPhone photos look way too "overcooked".
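At its crudest, an "over-oranging" or "over-greening" house style is a per-channel gain choice somewhere in the pipeline. A toy illustration with made-up gain values; real vendor pipelines are far more elaborate:

    import numpy as np

    def apply_cast(rgb, gains):
        return np.clip(rgb * gains, 0.0, 1.0)

    neutral_gray = np.array([0.5, 0.5, 0.5])
    print(apply_cast(neutral_gray, np.array([1.10, 1.00, 0.90])))  # warm/orange
    print(apply_cast(neutral_gray, np.array([0.95, 1.10, 0.95])))  # greenish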

    • No, the iPhone started it. Apple has always oversaturated colors deliberately to make the photos it takes "pop" more while sacrificing photo accuracy. Taking good-looking but fake photos has always been a selling point of the device.

      • by jeremyp ( 130771 )

        Most people want good looking photos and not "photo accuracy". In fact, what is photo accuracy? Any photo you see will have been affected by variables not strictly derived from the light that comes through the lens.

        • by AmiMoJo ( 196126 )

          99% of users do no editing; they post directly to social media. The iPhone camera consistently produces photos that are ready to post, with a few exceptions. The photos tend not to be very realistic, though. That's not what people want.

          Google's camera is unparalleled for accurate looking photos.

          • Incorrect. People post whatever shit pictures they get from the camera. I've seen pics that are so grainy they'll post them anyway. They'll comment on others' good-quality pics, but they've made their bed and refuse to change.
          • That's not what people want.

            Exactly. What people (including me, despite all my DSLRs and lenses) want out of phone photos is minimal effort and something that looks good on phone screens under typical viewing conditions.

            You can always shoot RAW on iPhones as well, I've done it and the results are definitely more comparable to dedicated cameras. But the convenience isn't much better than on a dedicated camera at that point so why settle for the smaller sensor...

        • Photo accuracy means it looks like what you actually see. For example, grass should look green, not neon green.

          I still remember all of the shit people gave the Pixel 2 XL, saying that the colors looked washed out while saying that the photos it took looked really good when viewed on another device. Actually it was the first phone to use an OLED display with sRGB colors out of the box, which gives you that accuracy, but compared to the normal oversaturated colors people are used to seeing on other phones, it was…

        • Sadly even Adobe has taken this on board, and the auto settings in Lightroom now also crush contrast and oversaturate the colours. The worst with iPhone photography is when it's done with a photo with completely overblown fake short depth of field.

        • by Khyber ( 864651 )

          "In fact, what is photo accuracy?"

          The photograph being as close as to what you see with your own eyes.

          In the world of fluorescent mineral photography, it's very important.

      • by shmlco ( 594907 )

        Actually, they didn't. If anything iPhone photos tended to look undersaturated and flat compared to most "out of the box" photos taken by other cameras on other phones.

        Apple tends towards "natural" colors but other camera makers (especially Samsung) bump saturation and contrast, making those photos look "better" and more "punchy" in comparison. That's why last year Apple added Photographic Styles that you could default to Vivid or Warm or whatever if you wanted that look.

    • by Misagon ( 1135 )

      The Google Pixel cameras are also infamous for having a distinct look to them: flattened dynamic range and dark edges.

      • I'm not sure what you mean by dark edges, but actually the dynamic range on the Pixel phones is good. Very, very good, in fact. What you're probably thinking of is color saturation, which some people confuse with dynamic range. And yeah, Pixel phones like to err on the side of not oversaturating. That's actually a good thing. If you're used to oversaturated colors, a Pixel will look dull to you. Like if you're used to really salty food, you'll think normal food tastes bland.

        Having a good dynamic range is when the…
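Saturation and dynamic range really are independent axes, as the comment above argues. A toy sketch in which each is adjusted without touching the other:

    import colorsys

    # Boost saturation: chroma doubles, lightness and hue stay put.
    h, l, s = colorsys.rgb_to_hls(0.60, 0.45, 0.40)  # a muted skin-ish tone
    boosted = colorsys.hls_to_rgb(h, l, min(1.0, s * 2.0))
    print("saturation x2:", [round(c, 2) for c in boosted])

    # Compress dynamic range: shadows and highlights move toward mid-gray,
    # while saturation is untouched.
    shadows, highlights = 0.05, 0.95
    compressed = [0.5 + (v - 0.5) * 0.6 for v in (shadows, highlights)]
    print("tonal extremes compressed:", compressed)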

  • Every idiot with a camera thinks they are a professional photographer. I first saw this when DSLRs got cheap enough that the average Joe Schmoe could afford one (instead of it costing as much as a new/used car).

    I ran into this at weddings. While I moved into videography, I had to work alongside *shudder* "Wedding Photographers" who didn't know jack about their cameras, lenses or settings. I've seen more "Pros" shooting on auto mode than I have fingers and toes.

    In videography, with professional video cameras (think High-…

    • by _xeno_ ( 155264 )

      Thing is, with Android, you can just download an app that gives you complete control over the camera. If you want full control over the image, you can shoot as raw and adjust as you want.

      With Apple, you're stuck with whatever computational BS they do with the camera, and that's it. Your options are Apple's proprietary image format or JPEG, and that's basically it. (Not entirely, they added a bunch of Instagram-like filters to the camera, so if you want to make the image even worse, you can do that.)

      • Thing is, with iOS, you can do exactly the same thing. Case in point: https://apps.apple.com/us/app/manual-raw-camera/id917146276

        • by _xeno_ ( 155264 )

          I said "just download" not "go and purchase." Android has Open Camera [opencamera.org.uk] which is a GPLv3 licensed open source camera app that gives you basically full access to anything your device can do with its camera.

          Nothing like that exists on the iPhone. Yes, you can get apps that fake things by applying filters after the fact to the compressed image Apple gives you from the camera API. But you can't get direct access to the camera - Apple won't let you.

      • by ceoyoyo ( 59147 )

        The problem with making assumptions.... someone who knows what a search engine is will come along and correct you.

    • The thing about DSLRs now is that 90% of the time you can run them in full Auto and you'll get perfectly acceptable photos. Simply switching from full Auto to Aperture Priority or Shutter Priority modes will turn the vast majority of your acceptable photos into good photos, assuming of course you know how to properly control the aperture/shutter. Obviously Manual lets you take better photos, but for 99% of real-world situations, weddings included (assuming proper/standard lighting), it isn't necessary. Of…

  • Just as the driver's seat can be adjusted to fit the driver's dimensions (including the pitch of the seatback) and the steering column can be telescopically adjusted (closer to or further from the driver, ostensibly for the comfort of their arm length), so can the settings in any camera (software app or physical device).

    It's easier to complain about a new product than to read the instruction manual, or take a photography class (or even watch a YouTube video) explaining how to take photographs…
    • I think a lot of people just want a cell camera that works without dicking around with the settings. I know I find no joy in fine tuning electronics.

    • by PPH ( 736903 )

      Do not attempt to adjust the picture. We are controlling transmission. If we wish to make it louder, we will bring up the volume. If we wish to make it softer, we will tune it to a whisper. We will control the horizontal. We will control the vertical. We can roll the image, make it flutter. We can change the focus to a soft blur, or sharpen it to crystal clarity. For the next hour, sit quietly and we will control all that you see and hear.

  • Someone using a DSLR knowledgeably knows how to take better pictures than they can with their cell phone. Duh?
    • Someone using a DSLR knowledgeably knows how to take better pictures than they can with their cell phone. Duh?

      She might not actually know how to use "a DSLR knowledgeably" - she might be a point-and-shoot DSLR owner. Even mid-range DSLRs from the past decade or so have excellent automatic modes that can interpret motion and capture moving objects very well.

  • The article's argument against the iPhone amounts to the exposure lightening of faces, occasional motion blur from too slow a shutter speed, portrait mode blur for artistic effect (selectable), and the occasional unintended computational artifact. Compare that to the large volume of horrible photos a DSLR produces in a typical dilettante's hand and it's no contest - smartphone wins 99.9% of the time.
    • If you own a DSLR that does no computational photography, you know the difference between motion blur and AI blur. With one, you see a uniform distortion across all similar things in the image, often "smooth" and directional. AI can fuck up the image way beyond that, and unpredictably. Ironically, you can often computationally correct motion blur, but only after helping the software along by telling it a few things.
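A sketch of that last point: plain motion blur is (roughly) a convolution with a streak-shaped kernel, so telling the software "a few things", here the kernel itself, lets classic deconvolution undo much of it. This uses Richardson-Lucy deconvolution from scikit-image; the kernel and test image are made up:

    import numpy as np
    from scipy.signal import convolve2d
    from skimage.restoration import richardson_lucy

    rng = np.random.default_rng(2)
    sharp = rng.uniform(size=(32, 32))
    psf = np.zeros((5, 5))
    psf[2, :] = 1 / 5                             # horizontal motion streak
    blurred = convolve2d(sharp, psf, mode="same")
    restored = richardson_lucy(blurred, psf, 30)  # 30 iterations
    print("mean error, blurred: ", np.abs(blurred - sharp).mean())
    print("mean error, restored:", np.abs(restored - sharp).mean())
    # Multi-frame AI artifacts have no single well-defined kernel like
    # this, which is why they are much harder to undo after the fact.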
  • by ThomasBHardy ( 827616 ) on Sunday March 20, 2022 @02:12PM (#62374793)

    If you want more traditional controls, just turn on the ProRAW feature and get RAW-like files you have complete control over, right?
    Huge files, just like traditional RAW files, without the Apple computational work done.

    • Glad I scrolled down, saved me the trouble of googling. As distasteful as Apple's defaults might be, I figured you could dodge them somehow.

    • by Bomarc ( 306716 )
      Reading the Apple support article for ProRAW [apple.com], the actual request (removal of the odd photo editing) is not addressed by this feature. And worse: "ProRAW files are 10 to 12 times larger than HEIF or JPEG files. If you store the files on your device, you might run out of space more quickly than you expect. And if you use iCloud Photos to store your photos, you might need to upgrade your iCloud storage plan to make more space available for these larger files."
      • All RAW files are much larger than lossless formats like JPEGs. They store an incredible amount of sensor data that you can take full advantage of in Photoshop. These are the files that, quote, "real photographers" work with. My Nikon kicks out 80MB RAW files along with the usual JPEG file.
        As I understand it, for the iPhones the computational adjustments are applied to the JPEG/HEIF file and not to the RAW file.

        I'm not saying it's a perfect scenario for the average mom taking photos of her kids, but…
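Rough arithmetic behind the "10 to 12 times larger" figure quoted above; the numbers are assumptions for illustration, and actual ProRAW sizes vary by scene and model:

    sensor_mp = 12            # iPhone 12 Pro main camera, megapixels
    bits_per_sample = 12      # assumed per-pixel bit depth for the raw data
    raw_mb = sensor_mp * 1e6 * bits_per_sample / 8 / 1e6  # ~18 MB raw payload
    heif_mb = 1.5             # typical compressed HEIF size (assumed)
    print(f"raw ~{raw_mb:.0f} MB vs HEIF ~{heif_mb:.1f} MB: "
          f"~{raw_mb / heif_mb:.0f}x larger")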

        • by Bomarc ( 306716 )
          For me there is a level of irony: with AT&T dropping support for 3G, I'm now required to get a "newer" phone. I know that it will not be "new", for this (and related) reasons.

          To your point: getting Apple to (ever) do the right thing is and has been painful. "Auto correcting" photos has been an issue since the 35mm days (TLDR: there were some really interesting cloud formations around me, so I stopped to take photos, and they wouldn't fit in one shot. When they were developed, the auto-…
        • by ceoyoyo ( 59147 )

          Larger than lossy.

          RAW files are usually smaller than lossless because they don't de-Bayer.

          They're bigger than JPEG because JPEG compresses the crap out of images.
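The parent's point in numbers, assuming a hypothetical 24-megapixel Bayer sensor with 14-bit samples:

    mp = 24
    raw_bits = mp * 1e6 * 14         # one 14-bit sample per photosite
    rgb_bits = mp * 1e6 * 3 * 16     # lossless 16-bit RGB after de-Bayering
    print(f"Bayer raw:    {raw_bits / 8 / 1e6:.0f} MB")   # ~42 MB
    print(f"lossless RGB: {rgb_bits / 8 / 1e6:.0f} MB")   # ~144 MB
    # JPEG then lossily compresses that RGB data down to just a few MB,
    # which is why it lands far below both.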

    • ProRAW supposedly sits somewhere in the middle: HDR/fusion applied, no colour grading ... not that Apple deems peons worthy to know exactly.

    • Or manual mode; on Samsung phones all of these smart features can be toggled.

    • Only an option if you pay for the latest-and-greatest. Apparently this is not available on my iPhone 10 :/
  • This whole discussion just boils down to everyone disagreeing about what a "good" photo is. One person's good photo is another person's fuckup.

    "Why doesn't your camera capture the beauty of the wondrous phenomenon that I was seeing in that moment?"

    "Why doesn't your camera see beyond the ephemeral conditions and capture the Platonic essence of the subject, and then insert it the proper aesthetics du jour?"

    • by splutty ( 43475 )

      A "Good" photo is a photo that's as close to 'photo realistic' (that term exists for a reason) as possible.

      A "Good" photoshop/aftereffects is what you're getting with most cameras nowadays.

      • That's silly. Photography has always involved manipulating the captured scene to fit the limitations of the medium, such as tonal range, color gamut, field of view, depth of focus, motion blur, etc. "Good" photography does so in a way that is aesthetically pleasing. Do you claim all black-and-white photography ala Ansel Adams is not good, because it's not realistic?
        • by splutty ( 43475 )

          Hence the quotes. Because "Good" is nonsense.

          I should've added a sarcasm tag seeing the reactions. lol

      • by ceoyoyo ( 59147 )

        So what is that? Your eyes can't even agree on what it is, what with their auto white balancing, contrast adjustments, and oh yeah, extensive making shit up.

  • Most people buy iphones to watch youtube on the shitter or take cat photos for their social media.

  • by Catvid-22 ( 9314307 ) on Sunday March 20, 2022 @02:53PM (#62374899)

    The iPhone camera also analyzes each image semantically, with the help of a graphics-processing unit, which picks out specific elements of a frame — faces, landscapes, skies — and exposes each one differently.

    First time I've encountered such use of "semantic" in a totally non-linguistic sense. Even the technological use of the word has some reference to language, say, speech recognition. But who knows, maybe faces and places have meanings, and a picture after all is worth a thousand words?

    • Apple's advanced Neural Engine can calculate exactly how many words each image is worth. In the process of merging the multiple images which make up one "shutter release", Apple weights each image according to its calculated effective word count, extrapolating as necessary and applying advanced splining techniques to guarantee each final photo is worth a minimum of 1014 words.

    • Semantics is just the study of meaning. This assigns meaning to different parts of the image.

      Semantics (from Ancient Greek: sēmantikós, "significant") is the study of reference, meaning, or truth. The term can be used to refer to subfields of several distinct disciplines, including philosophy, linguistics and computer science.

      - https://en.wikipedia.org/wiki/... [wikipedia.org]

      • I don't think the Wikipedia article even deals with images (ideas, yes). A Firefox search for the words "image" and "picture" yielded zero hits. Again, most of the meanings of the term "semantics" deal with language, which by extension could include symbolic logic and programming languages (which, like natural languages, are said to have a grammar and syntax). By comparison, can images have a grammar? We can of course create a hieroglyphic-like language made of pictograms. But this would involve the use of i…

        • I'm sorry, what point are you trying to make?

          That "semantic analysis" is an incorrect phrase to describe what they're doing here? It is a perfectly correct usage of the word. It's just being used in a way you haven't personally encountered before. Now you have.

          Naturally a huge part of "the study of meaning" is going to revolve around what words mean.

          • Sorry, my point is that semantics should at least deal with units of meaning. A single picture might be that unit of meaning when placed side by side with other pictures, say, to create a photo essay. The picture itself gains meaning only in relation to the other pictures. The word "and" can be a unit of meaning, but the letters that make it up (A, N and D) are meaningless.

            From the article you linked to, WRT linguistic semantics.

            Semantics can address meaning at the levels of words, phrases, sentences, or larger units of discourse.

            Correcting or clarifying what I posted earlier, semantics is different from grammar and syntax but uses those as its foundation.

            • semantics is different from grammar and syntax but uses those as its foundation

              LINGUISTIC semantics does. And that is certainly the usual connotation of the word. But semantics can have a broader meaning. Why are you so insistent that it cannot? You are not arguing with me, you are arguing with dictionaries.

              Cherry-picking excerpts discussing semantics in language contexts (eg the computer science quote is specifically about programming languages) doesn't somehow disprove the more general definition I have provided.

              • No, I'm not arguing with you. I'm just pointing out that the use of semantics in describing smart camera software is either extended or useless.

                The best way to get the sense of what I'm trying to say is by simplifying our example. Instead of faces, think of a red star. The red star has a definite connotation, for instance as a symbol of communism and, for I-don't-know-what-reason, post-Soviet Russia. That would be the red star's "semantics". But our hypothetical smart camera doesn't need to know that it's the…

                • It. Does. Not. Matter. That. The. Wikipedia. Article. Doesn't. Specifically. Discuss. Images.

                  I included it for the definition in the first sentence. That's it. Not because the article is some exhaustive reference that covers every single possible current or future application, which is how you decided to treat it for no reason whatsoever.

                  The definitions I have provided are sufficient. You haven't challenged any of them, you just keep repeating your own opinions. I'm sorry, but I don't care about your personal…

    • A goal of Computer Vision is to mimic the ability of a person to describe a photograph with words. Describing a photograph requires understanding the meaning of each part of the image and the interconnected meanings among parts, which is analogous to how the word semantics is used linguistically. Is semantics the right word to describe this kind of work? It is twice-removed from its original meaning, referring to pictures rather than words and machine intelligence rather than human. Yet, for better or worse…
      • Okay, that appears to clarify things. Still, I assume that semantics, when applied to imaging, would involve figuring out the real-world significance of an image or part of an image. To repeat my example in my reply to another user, I'd consider it "semantics" if the imaging software figured out that the red star is the red star of communism. I'd understand this extended use of semantics in relation to images. However, I doubt that the software would distinguish between the star of Marxism and the star of M…

  • Cameras and film processing have almost always adjusted your photos to be "more pleasing". This was always present in the old days of commercial photo processing, back in the days when you were using film and running it through chemical baths.

    Most people, most of the time, take pictures of other people. You were never going to be happy with an accurate picture of your family looking like pallid vampires with tuberculosis, so the film and processing warmed up your pictures with an emphasis on reddish shades…

    • And a hidden secret: color film dyes weren't that consistent from roll to roll. The difference between cheap photo printing and expensive photo printing was the expensive photo printing paid a person to adjust the color roll-by-roll, and sometimes even frame-by-frame.

  • But it makes you look cute so what's the problem?
  • Apple very strongly supports the "don't worry your pretty head about it" approach to functionality. They know better than you do what you really want.

    A more reasonable approach would be a raw or at least semi-raw format with no adjustments other than fitting into the available bit depth.

    Some of us in the Bay Area remember the day of the orange skies, during the bad fires, when it was dark enough at noon for street lights to turn on and the skies were a hellish orange. iPhone pictures happily corrected…
    • by shmlco ( 594907 )

      Or, you know, download one of a hundred different apps and exercise all the manual control you want.

      Just like you can with any modern camera. Just download...

      No. Wait. You can't do that, can you? Have to use whatever interface and menu system that comes with the camera.... ;)

  • This is complaining that the tech is amazing, it just isn't amazing enough. Why isn't it more amazing, why can't I point it a mountain or highway and get Ansel Adams [amazonaws.com], why can't I point it at a person and get Yousuf Karsh [digitalphotomentor.com]?

    Like, I get that tech can be more amazing still. But maybe one can sound less whiny about it.
  • helpfully blur out any "naughty bits" in the photos taken.

  • The article doesn't have (or I didn't see) any examples and, while I understand what they're talking about, it would be nice to see some comparisons.

  • I think it's hilarious that when she wants to get away from the computational photography, she chooses a Pixel -- Google's Pixel is the real vanguard of computational photography, doing enormous processing of every image. But doing it better, apparently.

  • This is where it begins, altering the truth or creating an alternate reality. What if we just said no? It's not me, it's the camera. CGI, ever heard of it.

  • If you are aiming for realism, you don't use a phone to take your picture. The phone is giving people exactly what they want: a brighter, more vivid version of whatever they point it at. That's the correct behaviour.

    In fact, I'd suggest that almost nobody wants their photo to be truly representative of the actual subject. You are always curating the moment, and presenting a careful fiction.

    Somebody mentioned Ansel Adams above. His pictures were absolutely stunning. But what he shot never looked like that.

  • ....All the processing in the world cannot compensate for the minuscule lens size. Having multiple lenses helps, but those big lenses on DSLRs are there for a reason ...

  • My experience using my iPhone 13 Pro is, it's a pretty vast improvement for shooting casual video. It's able to adjust the focus pretty well to stay on the subject of the action when you tap on them in the frame to mark them, and the video is crystal clear (with pretty decent audio recorded too).

    My results taking still photos are mixed, but it does a pretty good job for me of your typical indoor or "outdoor during the day" quick photo snaps. Overall, if you took a look at a collection of 100 random pictures…

  • Over Christmas, I took some photos, at dusk, of a color-LED-lit Christmas wreath, with an iPhone 12 Pro and with a Sony Alpha 6600 mirrorless interchangeable-lens camera.

    The pictures captured by the Sony merged the red, yellow, and orange lights into an indistinguishable mess. I tried bracketing the exposure, and couldn't improve the results. The iPhone 12 Pro did not; red was red, orange was orange, and yellow was yellow.

    Cameras still cannot record the same dynamic range as your eyes can, and different cameras…
