iPhone 14 Pro To Feature 48-Megapixel Camera, Periscope Lens Coming 2023 (macrumors.com)

Apple plans to add a 48-megapixel camera lens to the iPhone next year, followed by a periscope lens in 2023, according to analyst Ming-Chi Kuo. MacRumors reports: In a research note today with TF International Securities, obtained by MacRumors, Kuo said these iPhone camera upgrades over the next two years will help to boost Taiwanese manufacturer Largan Precision's market share, revenue, and profit. Kuo did not provide any further details, but he has previously claimed that the 48-megapixel camera will be limited to iPhone 14 Pro models and allow for 8K video recording, up from 4K currently. These high-resolution 8K videos would be suitable for viewing on Apple's AR/VR headset that is expected to launch next year, he said.

Kuo also previously claimed that iPhone 14 Pro models may support both 48-megapixel and 12-megapixel output, which would likely be achieved with a process known as pixel binning. Already in use on some Android smartphones, like Samsung's Galaxy S21 Ultra, pixel binning could allow iPhone 14 Pro models to shoot 48-megapixel photos in bright conditions and 12-megapixel photos in low-light conditions to preserve quality. Further ahead, Kuo reiterated his belief that at least one iPhone 15 model will gain a periscope lens in 2023, paving the way for significantly increased optical zoom. This lens would use folded camera optics, where incoming light is bent or "folded" before reaching the image sensor, allowing for increased optical zoom while maintaining a compact design appropriate for smartphones.
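
Pixel binning is straightforward to picture: the sensor groups neighboring photosites and reads them out as one larger effective pixel. A minimal sketch of the idea, assuming a plain 2x2 average over a stand-in 48-megapixel readout (real quad-Bayer sensors bin same-colored photosites, often in the analog domain):

    # Toy 2x2 binning: a 48 MP-sized array averaged down to 12 MP.
    # Illustrative only; real sensors bin per color channel, on-chip.
    import numpy as np

    def bin_2x2(frame):
        h, w = frame.shape
        return frame[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    rng = np.random.default_rng()
    full_res = rng.random((8000, 6000), dtype=np.float32)  # stand-in for a 48 MP readout
    binned = bin_2x2(full_res)                              # 4000 x 3000 = 12 MP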

Comments:
  • In Soviet Russia periscope features YOU!
  • Are all those pixels going to actually be taken advantage of*, or is this yet another numbers checklist game?

    * Used for a practical purpose, not "sexually harassed"

    • Finally, those detective show tricks will work! You know, where the detective zooms in on a reflection in a door knob and is able to make out the face of the killer.

    • Are all those pixels going to actually be taken advantage of*, or is this yet another numbers checklist game?

      * Used for a practical purpose, not "sexually harassed"

      Oh yes, it will be taken advantage of. Apple will take advantage of the significantly larger file size, coupled with the plodding pace of on-phone storage growth, in order to sell iCloud storage. Two or three Narcissistiselfies (TM) and it's time to pull out the credit card and pay for cloud storage, month after month after month until you die, most likely without having viewed more than a tiny fraction of the crap that's been preserved in stunning detail.

      • by Tablizer ( 95088 )

        > [Apple profits from big-file] Narcissistiselfies

        I was going to say very few want to see their face in that much detail because all the blemishes would show up, but then I realized most use Instagram-like filters to remove the blemishes. Thus, most resolution is wasted in the end because the filters intentionally remove detail.

        Still clever on Apple's part, from a wallet perspective. The IT industry is farked up in many ways. [reddit.com]

    • It's just another Apple gimmick, just like 3D Touch. Soon you'll see Apple commercials that basically make it sound like Apple has the highest megapixel count of any camera, and they'll trademark a name for it, something cheesy and misleading, yet also very Apple, like "retina lens".

      Of course, it won't improve the quality of the photos, and might even make them worse in a few ways, but iFans will happily shell out $1,600 for the privilege of having it, and they'll be thoroughly convinced that they now take better photos.

    • It would help with zooming, hopefully with no cost and perhaps even an improvement in dynamic range and light sensitivity. Bigger pixels mean improved light sensitivity, but smaller pixels improve image resolution.

    • by fermion ( 181285 )
      At a certain point, on a given sensor size, pixel count becomes a numbers game. At some point, around 2 micrometers per pixel, the image becomes just a jumble of pixels. We really saw this with low-end crop-sensor DSLRs that were being sold for outrageous prices based, in part, on high pixel counts. This is reminiscent of when PCs were sold with ultra-fast processors but ridiculously slow bus speeds.

      But in the current smartphone, sensor size and pixels are only part of the equation. The key is the software that generates the final image.

    • The practical purpose is to provide a digital zoom without having to drop image quality at "normal" resolutions like 2K/4K.
    • You really need those 14,000x9600 18MB images to post pictures of your dinner on Insta...
  • by PeeAitchPee ( 712652 ) on Monday December 20, 2021 @06:45PM (#62101109)

    Look at any iPhone photo at 100%. You'll see it's not sharp -- rather, it looks like a watercolor painting. iPhones use a metric shit-ton of processing to create pictures that Apple *thinks* most people want. And it works very well. But -- to be clear -- it's a highly processed and interpolated version of the original signal.

    Same thing with this upcoming camera. At that pixel density on a tiny little sensor, your noise necessarily goes through the roof, as there is less physical space per pixel to collect the photons. A lot of that noise will be taken out by -- you guessed it -- processing. For most people and use cases, this doesn't matter at all. But cell phones like this one are no threat to replace their larger dedicated camera counterparts in use cases that actually use all of those pixels, and require that they be accurate.
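
    For a rough sense of why smaller photosites are noisier, shot noise alone already works against you: photons collected scale with pixel area, and shot-noise SNR goes as the square root of the photon count. A quick back-of-the-envelope sketch (the photon flux number is made up):

        # Shot-noise scaling with pixel pitch; illustrative numbers only.
        import math

        photons_per_um2 = 1000               # assumed photons per square micrometer per exposure
        for pitch_um in (1.9, 0.95):         # roughly 12 MP-class vs 48 MP-class pitch, same sensor size
            photons = photons_per_um2 * pitch_um ** 2
            print(f"{pitch_um} um pitch: ~{photons:.0f} photons, SNR ~ {math.sqrt(photons):.0f}")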

    • by dfghjk ( 711126 ) on Monday December 20, 2021 @06:50PM (#62101119)

      "Look at any iPhone photo at 100%. You'll see it's not sharp -- rather, it looks like a watercolor painting. iPhones use a metric shit-ton of processing to create pictures that Apple *thinks* most people want. And it works very well. But -- to be clear -- it's a highly processed and interpolated version of the original signal."

      This is literally the case with every Bayer-pattern imaging sensor. Highly processed and interpolated BY DEFINITION. Also, anti-aliasing filters ensure that the image is not sharp at 100%; that is literally how the filter works.

      But of course, iPhones use a metric shit-ton of processing. All phone cameras do. Hell, all DSLRs do. Your comments are incredibly uninteresting.

      • No, iPhones take a lot more processing steps than most cameras. The result is neither color- nor detail-accurate, but it produces better pictures in terms of what people want.

        • Yes, more steps than most cameras, but not more steps than high end Android phones where the processing amount is similar.

          Current top end smartphones create a picture that is an "optimized version" of reality.

          You can see how the different manufacturers have a different "style" in the manipulations though.

          I think some company even allows removing people or other objects from the picture.

      • On most DSLRs you can turn it off ... and even when it's on, it's only light processing ...

        On camera phones you cannot turn it off, because they have tiny lenses and so have to heavily process the image to get a decent picture; the iPhone tends to over-process ...

      • by AmiMoJo ( 196126 )

        I was going to ask if there were any DSLRs that were as easy to use as a phone, with similar levels of processing/computational photography. Would be nice to have a point-and-shoot but with DSLR levels of quality.

        Then I read your last sentence and decided not to.

        • I have to admit, you and I do not see eye to eye on many issues on here -- quite the opposite. But you never go out of your way to attack or insult people for absolutely no reason. This guy deserves the -1, Douchebag mod if anyone ever did.
    • by ceoyoyo ( 59147 ) on Monday December 20, 2021 @08:11PM (#62101283)

      Unless you're shooting in the dark you should be operating in a region where the quantum noise is pretty insignificant, so the size of the sensor elements won't be particularly relevant. Yes I know DSLR pixel peepers like to imagine strong relationships between sensor element size and noise.

      More important is the diffraction limit. The iPhone 13 lens is f/1.5, which gives an Airy disk diameter of ~1.9 um. That's the same as the sensor element size in the 12 MP sensor. If you want to quadruple the sensor pixels then you need to increase the area of the sensor, decrease the focal length, or increase the lens aperture. Since the point of the periscope optical path is to get more optical zoom, increasing the lens aperture (by a lot) would be the only reasonable thing to do.
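
      For reference, the Airy disk figure quoted above follows from d ≈ 2.44 λ N (diameter to the first minimum); the exact wavelength is an assumption here:

          # Reproducing the ~1.9 um figure, assuming green light around 520 nm.
          wavelength_um = 0.52
          f_number = 1.5
          airy_diameter_um = 2.44 * wavelength_um * f_number
          print(f"Airy disk diameter at f/{f_number}: {airy_diameter_um:.2f} um")  # ~1.90 um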

      • by AmiMoJo ( 196126 )

        I don't know about Apple, but other manufacturers combine multiple images to overcome limitations of the lens size. Since most shots are taken hand-held, there is constant movement. The movement is a form of noise, which in digital sampling theory can actually be used to reduce aliasing and increase the effective resolution of the sensor.

        • by ceoyoyo ( 59147 )

          In microscopy it's called blind ptychography. You typically don't need extra resolution on the sensor though. The idea is to combine the multiple images into a single image with higher resolution, overcoming the transfer function limits of the entire optical system, including the sensor. Having a higher than necessary resolution sensor just slows the whole thing down because you have extra pixels to process.
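
          The simplest flavor of this multi-frame idea is plain shift-and-add onto a finer grid; the sketch below is not ptychography proper, and it assumes the sub-pixel shifts between frames are already known:

              # Naive multi-frame super-resolution by shift-and-add onto a 2x finer grid.
              # Assumes known sub-pixel shifts; unfilled output pixels are left at zero.
              import numpy as np

              def shift_and_add(frames, shifts, scale=2):
                  h, w = frames[0].shape
                  acc = np.zeros((h * scale, w * scale))
                  hits = np.zeros_like(acc)
                  for frame, (dy, dx) in zip(frames, shifts):
                      yi = np.clip(np.round((np.arange(h)[:, None] + dy) * scale).astype(int), 0, h * scale - 1)
                      xi = np.clip(np.round((np.arange(w)[None, :] + dx) * scale).astype(int), 0, w * scale - 1)
                      np.add.at(acc, (yi, xi), frame)
                      np.add.at(hits, (yi, xi), 1.0)
                  return acc / np.maximum(hits, 1)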

    • by Tablizer ( 95088 )

      > ton of processing to create pictures that Apple *thinks* most people want

      It might be tuned to smooth out faces but not nature. People will judge a camera as better if it makes their complexion look smoother/younger.

      My non-Apple phone-cam sharpens images to bring out detail, and it looks great for nature, but adds 20 years to my apparent age. It took me a bit to realize the effing cam was effing up my face. There is a switch to turn off the enhancement, but since I take more nature pics than selfies I leave it on.

    • by AmiMoJo ( 196126 )

      Apple is very much focused on producing "Instagram ready" images, because that's how most people use their cameras. No editing, not even any set-up for the photo. Just point, tap, post to social media. In that respect they do a decent job, at the expense of almost every photo being mediocre and generic looking.

      Phones with 50 MP cameras always bin the pixels anyway. That removes a lot of the noise, and has the added advantage that the higher-resolution underlying image can help with things like depth detection for

    • Look at any iPhone photo at 100%.

      Why? That's just a reflection of the fact you used the wrong lens.

      At that pixel density on a tiny little sensor, your noise necessarily goes thru the roof, as there is less physical space per pixel to collect the photons.

      Only if you ignore processing. The reality is that brute-forcing with large pixels eventually reaches a limit, and even at that limit there is still noise. Then you process the result, except you don't have sufficient data for your noise-reduction algorithm to work with, and the result absolutely demolishes all detail.

      The answer to that problem is oversampling, or subpixel sampling, which is precisely the point of such sensors. Take 48 MP of data, apply noise

    • Ok boomer
  • "plans to add a 48-megapixel camera lens to the iPhone next year" 48 MP sensor, not lens! the error is even in the first paragraph of the article
  • Get with it, Apple, we're over it. How can you sell iPads and laptops with USB-C and not put it on the phone? Still?

    Stop it.

  • Competitor phones were using periscope lenses as far back as 2019. So Apple is 4 years behind. I actually have an iPhone, but I'm getting tired of waiting for features that are old-hat in other phones. Time to look at the competition.
    • Yup, Apple is like 3 iterations behind the others now.
      A 48 MP camera? I had one of those 3-4 years ago. That reality distortion field they have is strong stuff.
  • Remember: 8K=4K; 4K=2K (1K remains 1K because the "industry body" that makes these lies up didn't have a chance to lie about it, otherwise it would have been 2K).

    Also: 48MP is less than 7 times the resolution of 1MP.
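
    The arithmetic behind that, roughly: the "K" counts horizontal pixels (linear), while megapixels count area, so each doubling of the K quadruples the pixel count but only doubles the linear resolution:

        res_8k = 7680 * 4320       # ~33.2 MP
        res_4k = 3840 * 2160       # ~8.3 MP
        print(res_8k / res_4k)     # 4.0: four times the pixels, twice the linear resolution
        print((48 / 1) ** 0.5)     # ~6.93: 48 MP is less than 7x the linear resolution of 1 MP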

  • If they want to design the phone to take pictures or video for use in Apple's AR/VR headset (or any VR headset) the phone is going to need 2 cameras so that it can record binocular video (i.e. record in 3D).
