
Apple Patents Directional Flash Tech For Cameras

tekgoblin writes "A patent application has surfaced that shows Apple's attempt at creating a new way for a camera flash to work. The idea is intriguing: a user can select a dimly lit area of the frame, and the camera will try to illuminate just that area with the flash. Apple is attempting to accomplish this much like the tap-to-focus feature on the iPhone 4, where you touch the screen to choose the focus point; here, instead, you would light up that area with the flash. This is accomplished by passing the camera flash through a 'redirector,' so the light can be aimed somewhere other than dead center when a photo is taken."


  • by The_mad_linguist ( 1019680 ) on Monday September 27, 2010 @04:18AM (#33709086)

    This doesn't make any sense. I'm *sure* I heard Jobs say that he was against this type of technology.

    • That's what this is designed to fix. They had a problem with Flash battery life, so they figured out how to make it directional so that it uses less. Adobe should be thankful.
  • Sounds impossible (Score:5, Informative)

    by Jackie_Chan_Fan ( 730745 ) on Monday September 27, 2010 @04:22AM (#33709094)

    How does this redirector work? The problem with on-camera flashes is that the light comes from the same point of view as the photo. This creates rather unflattering light.

    You can redirect a flash by aiming it, but it's still coming from the same point in space as the camera. That isn't ideal either.

    The best way is to get that flash off the camera... but if you can't, as would be the case with an iPhone... it is best to bounce it by redirecting the flash onto a wall to the left or right if you can, or onto the ceiling. Generally, up and to the right or left works well, as it forces the light to bounce off the wall, which in effect makes the wall a large light source.

    The problem with the flash being on the phone is that it is still a small light source. Small light sources cast hard shadows. This redirector won't change that unless it can bounce light off a surface such as a wall, which I don't see it doing, given its limited mobility stuck in the back of the iPhone. Generally, with higher-end camera flashes, you can rotate the head 360 degrees left to right and tilt it through a large up-and-down range, so you can point it right at the ceiling. You can't do that with an iPhone.

    We'll see.

    Sounds like a cute gimmick for camera novices, but not a new solution to anything other than, perhaps, the interface. Light is light.

    • Re:Sounds impossible (Score:4, Interesting)

      by Thanshin ( 1188877 ) on Monday September 27, 2010 @04:33AM (#33709150)

      That problem will be solved shortly. We just need a bit more computational power on the cameras and a separable flash.

      You point the camera at your target. The camera creates a 3D map of the room, calculates the ideal surface for reflection, and shows you that angle on the screen.

      You separate the flash (camera in one hand, flash in the other), point it the way the camera showed you, and take the picture.

      The camera will have a secondary flash to remove shadows.

      All the technology exists at this moment, but I don't think a camera-sized computer can do 3D maps at a reasonable speed.
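      The geometry step, at least, is cheap once a wall has been found. A minimal sketch of it, assuming the 3D map already yields a wall plane and positions in metres (the function and numbers below are illustrative, not anything from the patent): mirror the subject across the wall, and aim the flash where the line to that mirror image meets the wall.

```python
# Sketch: pick the spot on a wall to bounce a flash off so the reflected
# light lands on the subject. Mirror the subject across the wall plane and
# intersect the flash -> mirror-image line with the plane; at that point
# the angle of incidence equals the angle of reflection.
import numpy as np

def bounce_point(flash_pos, subject_pos, plane_point, plane_normal):
    """All arguments are 3-vectors in metres; returns the aim point or None."""
    n = plane_normal / np.linalg.norm(plane_normal)
    # Mirror the subject across the wall plane.
    dist = np.dot(subject_pos - plane_point, n)
    mirrored = subject_pos - 2.0 * dist * n
    # Intersect the line flash -> mirrored subject with the plane.
    direction = mirrored - flash_pos
    denom = np.dot(direction, n)
    if abs(denom) < 1e-9:
        return None  # line runs parallel to the wall
    t = np.dot(plane_point - flash_pos, n) / denom
    if not 0.0 < t < 1.0:
        return None  # intersection is not between flash and mirror image
    return flash_pos + t * direction

# Flash at the origin, subject 3 m straight ahead, wall 1.5 m to the right:
print(bounce_point(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 3.0]),
                   np.array([1.5, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])))
# -> [1.5 0.  1.5], i.e. aim at the wall halfway toward the subject's depth
```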

      • That would be a hell of a powerful camera. The tech exists... I've seen large-scale scanning solutions that are amazing, but... I doubt Apple will ever bundle that into a tiny phone...

        The problem is that the scan data is often extremely high-resolution point data... that has to be triangulated into surfaces, and then light would have to be calculated off that, etc. The power to do that in real time makes it unrealistic on a tiny phone at any reasonable cost for mass markets. I think it would be wiser to just learn

      • Re: (Score:2, Interesting)

        by niftydude ( 1745144 )
        You won't need a separable flash. I'm willing to bet any money that they'll be using arrays of movable micro-electro-mechanical (MEMS) mirrors and micro-lenses in the camera to aim the flash automatically for you.
        MEMS mirrors are tiny, cheap to make now that we know how to make them, and easily able to handle this kind of application.

        I'm also not sure if a camera-sized computer can do 3D maps at a reasonable speed yet - but a hardware chip which has instructions purely to implement that sort of al
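        The aiming arithmetic for such a mirror is cheap, at least. A minimal sketch under that assumption, treating the steering element as a single two-axis tilt mirror (a mirror tilted by theta deflects the reflected beam by 2*theta); the 60-degree field of view and the pixel numbers are made up for illustration.

```python
# Sketch: turn a tapped screen position into two-axis MEMS mirror tilts.
# A mirror tilted by theta deflects the reflected beam by 2*theta, so the
# mirror only needs to move half of the desired beam angle.

def mirror_tilts(tap_x, tap_y, width, height, hfov_deg=60.0):
    """Tap coordinates in pixels; returns (pan_deg, tilt_deg) for the mirror."""
    vfov_deg = hfov_deg * height / width           # assume square pixels
    beam_pan = (tap_x / width - 0.5) * hfov_deg    # angle off the optical axis
    beam_tilt = (0.5 - tap_y / height) * vfov_deg  # screen y grows downward
    return beam_pan / 2.0, beam_tilt / 2.0         # mirror moves half the angle

print(mirror_tilts(tap_x=1200, tap_y=300, width=1280, height=720))
# -> (13.125, 1.40625): small tilts steer the flash toward an upper-right tap
```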
        • I doubt that very much. The reason for the off-camera flash is one of geometry. The closer the flash is to the lens, the more likely it is that you get red eyes. Additionally, the closer the lens is to the flash, the more directly the light bounces and the less control you have over the shadows.

          So, unless Apple is planning to start a new trend of huge cameras, it's unlikely that this technology will really make much of a difference. As the angles and distances involved are just not enough to make much of a
          • by rsborg ( 111459 )

            The closer the flash is to the lens, the more likely it is that you get red eyes.

            If you read the parent thread, you'll realize they're talking about bouncing the flash, which negates your argument.

            Now, how the hell they'll get a flash bright enough for bouncing (even if it's targeted at the subject's dark areas) out of a smartphone casing... I have no idea. That truly would be future-tech.

      • by Andy Dodd ( 701 )

        The computing power exists and is proven to work. The computer is known as the "human brain".

        For someone who understands photographic light (reading Strobist's Lighting 101 is a good way to start), doing stuff like what you described above becomes second nature.

        Note: Regardless of computing method, non-white ceilings/walls greatly decrease the solution space of the problem. :(

        • I think we can all agree that there would be a certain utility in having a camera that can do the photographer's job, which is to say calculating good angles and lighting. Further, I think most of us would agree that this will become feasible eventually. This frees up the photographer to be an artist, although it also sharply decreases the amount of available work by eliminating the easy stuff. That relegates the professional photographer to the trickiest stuff, but at least that's also the most interesting. Als

    • Re: (Score:3, Funny)

      by dimeglio ( 456244 )

      Clearly you underestimate the power of Jobs at bending light and controlling photons. All you need is a singularity in the right place in order to direct the light where you want it to be.

      • There seem to be a lot of negative posts on this subject. Presumably this is due to the declining level of physics knowledge among the Slashdot audience. This is actually one of the simpler uses of the reality distortion field.
    • Re: (Score:3, Insightful)

      by chomsky68 ( 1719996 )
      It doesn't matter how it works, or even whether it works. The point of the whole exercise is that if anybody does figure out how to implement it, that brave soul is going to have Apple's lawyers breathing down his neck for patent infringement.
      • That is very true.

        And they shouldn't be allowed to do that. Bouncing and aiming light has been a part of artistic illumination since before cameras ever existed.

    • by bronney ( 638318 )

      Dude, this patent isn't for the iPhone 4... they could build a flash into the rim of the phone for bouncing in future phones. Conveniently solves the antenna problem too! lol?

    • Actually, that seems to me like the perfect way to aim the flash at just someone's eyes :p

    • by Andy Dodd ( 701 )

      This is only a marginal improvement (and may in many situations prove to be a non-improvement, making those who understand photographic lighting even sicker).

      The unfortunate fact is, we'll never see bounce flash coming from a cameraphone. There is simply no way to cram the energy/power requirements for bounce flash into a cell phone. The patent above is, if anything, going in the exact opposite direction from bounce flash.

    • The best way is to get that flash off the camera... but if you can't, as would be the case with an iPhone... it is best to bounce it by redirecting the flash onto a wall to the left or right if you can, or onto the ceiling. Generally, up and to the right or left works well, as it forces the light to bounce off the wall, which in effect makes the wall a large light source.

      Simple: you bump two iPhones in "flash buddy mode", then use the accelerometers to determine their relative positions, and have the first sign

  • by Andy Smith ( 55346 ) on Monday September 27, 2010 @04:23AM (#33709100)

    I'm a professional photographer, and I've been using flash zoom, feathering, etc. for years to achieve this effect. Guess I won't be allowed to do that anymore without asking Apple for permission first?

    http://www.meejahor.com/2008/06/06/feathering-two-lights-for-the-price-of-one/ [meejahor.com]
    http://www.meejahor.com/2008/09/29/feathering-its-like-off-camera-lighting-but-faster/ [meejahor.com]

    (Just kidding. I know it's a patent for a specific method, not the technique.)

    • Re: (Score:2, Insightful)

      by inpher ( 1788434 )
      The difference, from what I can tell, is that Apple is seeking a way to make all of this work automatically in a pocket-sized system (as far as I know, this is the first time that has been attempted). Don't worry, you can keep doing what you are doing; almost no amount of technology can compete with a good human who knows what he/she is doing.
    • Basically, Apple is patenting the ability of the camera to do that for you, but less well and with significantly less control, in a consumer device.
      • by Andy Dodd ( 701 )

        My guess is that Apple's algorithm is more likely to target-fixate on the foreground subject, creating the exact opposite effect.

    • by Kanasta ( 70274 )

      Don't worry, the technique patent will come after this one is granted...

      waiting 30s to press submit

      because I type too fast

  • by pinkushun ( 1467193 ) on Monday September 27, 2010 @04:28AM (#33709122) Journal

    We don't have to hold it in a certain way to make it work properly.

    • Re: (Score:3, Funny)

      by beelsebob ( 529313 )

      You do. If your finger hits this one spot with a circular piece of glass on it, then the camera doesn't pick up any light... you just totally lose all reception!

  • Call me crazy (Score:5, Insightful)

    by LBt1st ( 709520 ) on Monday September 27, 2010 @04:28AM (#33709126)

    The only time I bother taking a picture with my phone's camera, as opposed to a normal camera, is when something is happening spontaneously and I want to take a shot immediately.

    If I'm going to take the time to make adjustments and set up lighting, I'm not shooting with my cell phone.

    That said, if the camera can auto-select dark spots and light them without over-lighting other areas or otherwise screwing up the shot, I could certainly see that as a good thing.

    • Re: (Score:3, Insightful)

      by adolf ( 21054 )

      Sounds like it's more about getting better* spontaneous shots out of a small, portable device than about trying to replace a proper camera and lighting.

      I don't carry a dedicated camera, unless I'm planning on doing some photography. I've always got my phone, though. Any improvement is welcome.

      *"Better", as in, an improvement over what similar devices could do before, not "better" as in that which can be accomplished by less-convenient means.

    • "Auto" sounds likely, given Apple's design style. They seemingly added the HDR feature so that iPhone users wouldn't need* to properly set the metering their pictures, for instance.

      *Of course you get better results if you do set a metering "target" manually, but HDR has certainly reduced the number of photos that are ruined if you don't bother.

    • Depends on what you're trying to accomplish with the picture. Many people have slammed Polaroid instant pictures: they're generally washed out, the center flash makes faces look ghostlike, and the whites are unbalanced (especially after drinking). But there are galleries of Polaroid art out there. The photographers worked within the medium to create interesting images that could not be taken with an SLR without manipulation. Sure, the iPhone is nowhere near as versatile as a traditional SLR or even a dedicat

    • I agree with you, and I’ll take it a step further. The appropriate way to do this is with software, not with hardware.

      Design the phone to take two pictures only milliseconds apart, one with the full effect of the flash and one with pretty much no flash. Then you can snap those spontaneous shots with no prep work, and after the fact you can “direct” the flash anywhere you want by selectively feathering the photos together.
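      A rough sketch of that feathering step (the disc mask, radius, and library choices are illustrative only, not anyone's actual implementation): blur a disc around the tapped point and use it to mix the flash frame into the ambient frame.

```python
# Sketch: blend a flash frame into a no-flash frame inside a feathered disc
# around the point the user tapped, leaving the rest of the scene untouched.
import numpy as np
from scipy.ndimage import gaussian_filter

def directed_flash(ambient, flashed, tap_xy, radius=120, feather=40):
    """ambient/flashed: float32 HxWx3 images in [0, 1]; tap_xy: (x, y) in pixels."""
    h, w = ambient.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    # Hard disc around the tap, then blur it so the transition is smooth.
    mask = ((xx - tap_xy[0]) ** 2 + (yy - tap_xy[1]) ** 2 <= radius ** 2)
    mask = gaussian_filter(mask.astype(np.float32), sigma=feather)[..., None]
    return mask * flashed + (1.0 - mask) * ambient

# Two frames shot milliseconds apart; the user taps the dim corner at (200, 350):
# result = directed_flash(no_flash_frame, flash_frame, (200, 350))
```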

  • How do you redirect light with a solid-state system?

    Will this use the same tech that the new breed of laser projectors uses?

  • A Flash-spotlight? (Score:4, Interesting)

    by erroneus ( 253617 ) on Monday September 27, 2010 @05:15AM (#33709314) Homepage

    Well, I have to say, it is a novel idea as far as I can tell. I could probably do one better by combining the power of a projector lamp and a DLP mirror system to paint rather precise lighting for portrait photography. Light could be manipulated with very precise detail, coloring, and intensity over the whole scene, not just one point. (Now, someone go patent this idea...) Using this technology, you could photoshop an image before you take it.
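    As a toy sketch of that idea, assuming the projector and camera are already aligned pixel-for-pixel (alignment, color, and throw distance are all hand-waved away, and every name and constant below is made up): boost the projected light wherever a preview frame comes out darker than some target level.

```python
# Sketch: compute a per-pixel DLP drive level from a preview frame, so the
# next exposure is lit more evenly. Assumes projector and camera pixels
# already correspond one-to-one.
import numpy as np

def projector_mask(preview_luma, target=0.5, max_gain=4.0):
    """preview_luma: float32 HxW luminance in [0, 1]; returns drive levels in [0, 1]."""
    luma = np.clip(preview_luma, 1e-3, 1.0)       # avoid dividing by black pixels
    gain = np.clip(target / luma, 1.0, max_gain)  # extra light each pixel wants
    return (gain - 1.0) / (max_gain - 1.0)        # normalize to the DLP's 0..1 range

# drive = projector_mask(preview_luma); send `drive` to the DLP, then expose.
```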

    As someone pointed out, editing photos after the fact is neither as easy nor as good as controlling the light beforehand. The reason why, I will assert, is that there is an unlimited range of variables in light, while there is a far more limited range of variables in pixel data. The act of capturing an image on a CCD is already lossy compression of information. By setting up the image beforehand, you increase your ability to edit the final product in a more pleasing way.

    I would be interested to know how Apple intends to integrate this into an iProduct. The iPhone/iPad wouldn't be particularly good at this type of photography, I don't think. To accomplish this, a complex focusing system would have to be implemented, and while I have heard of liquid lenses (here on Slashdot) before, I can't help but believe that the throw distance of such projection technology would be rather short.

    Still, all in all, this is a neat idea. And it's not quite a software patent, so I'm okay with it.

  • Will never work... (Score:2, Insightful)

    by cbope ( 130292 )

    Yet another way to really fuck up your photos. If you know anything about professional photography, you immediately know this is a failed "solution". In many cases, when you light a scene for photography, it's the DIRECTION the light comes from that is important, together with the amount of light. That's why you rarely see camera-mounted flash used in the studio; strobes (flashes) are positioned away from the camera so as to light the scene in a certain way from one or more directions. With the proposed

    • by bluefoxlucid ( 723572 ) on Monday September 27, 2010 @05:43AM (#33709416) Homepage Journal
      My first thought on this was actually that it'd look like shit (spotlight effect), and the real solution is to take two (read: many) rapid pictures while diddling the flash. Use the dark photos and the light photos to blend a composite effect and digitally light up the area in such a way that you have a smooth transition into a brighter, better-contrasted area; or even leave it "dark" but compensate for the CCD's poor performance in the dark by correcting the dim colors and improving the contrast.
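      A crude sketch of that compositing idea (a plain per-pixel well-exposedness blend; the Gaussian weighting is illustrative and skips the pyramid blending a serious implementation would use):

```python
# Sketch: fuse a burst of frames shot at different flash powers by weighting
# each pixel by how well exposed it is (close to mid-grey), then normalizing.
import numpy as np

def fuse_burst(frames):
    """frames: list of float32 HxWx3 images in [0, 1], same size, varying flash."""
    stack = np.stack(frames)                                 # N x H x W x 3
    luma = stack.mean(axis=-1, keepdims=True)                # per-pixel brightness
    weights = np.exp(-((luma - 0.5) ** 2) / (2 * 0.2 ** 2))  # favor mid-tones
    weights /= weights.sum(axis=0) + 1e-6                    # normalize over frames
    return (weights * stack).sum(axis=0)

# composite = fuse_burst([no_flash_frame, half_flash_frame, full_flash_frame])
```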
    • by dangitman ( 862676 ) on Monday September 27, 2010 @06:37AM (#33709586)

      If you know anything about professional photography, you immediately know this is a failed "solution". In many cases, when you light a scene for photography, it's the DIRECTION the light comes from that is important, together with the amount of light. That's why you rarely see camera-mounted flash used in the studio,

      Well, if you knew anything about professional photography, you'd know that on-camera flash definitely has a useful place. That's why you often see a ring flash (the light actually surrounds the lens, so it comes from directly front-on) employed for fashion, macro, and scientific photography. Flash coming from the direction of the lens is actually very useful as a fill light, when used in moderation.

      With the proposed "invention", the direction light comes from will always be the same, close to the lens. It doesn't matter that it's only lighting a part of the scene.

      Actually, it would matter. One of the biggest problems with on-camera flash is that it lights the entire scene the same way, leading to highly over-exposed and under-exposed areas. If you can control where that light goes, then you will get a much better result than from an on-camera flash that just blasts the scene indiscriminately.

      After all, you don't always have access to off-camera lighting, particularly with a compact unit. Of course it's not going to be the same as a set of studio lights (which people don't carry around with their phones). But it's a step up from a non-controllable on-camera flash.

  • by El_Muerte_TDS ( 592157 ) on Monday September 27, 2010 @05:44AM (#33709420) Homepage

    I thought Apple hated Flash, citing poor performance and whatnot.

  • Isn't one solution just to take two photos a split second apart (one pre-flash and one with flash), then blend the two images together, using the region of the 'flash' photo that the user selected as the 'flash region'? If it's not doing that, then where do I file my patent application? :)

  • iFlash (Score:2, Funny)

    Apple does it and they get a patent. When I did it I got arrested.
  • Sounds like Microsoft sold another license for their 'wedge' lens technology.

  • The top-scored comments do not consider that the iPhone has enough power and resources to augment photos.

    You could take your photo and then wave the camera around in the air, or even walk over to the side or closer to your subject, and the phone could selectively light the scene while adding these frames onto the image. It can use the accelerometer together with computer-vision techniques to understand where the camera is moving. It can learn what the 3D shape of the subject is and compute 3D masking.

    What I th

  • The patent looks interesting, but it also looks like just a computer-controlled version of a technique that has been around for a long time. I have some old potato-masher-shaped, flash-bulb strobes. A few of them have a reflector dish that is just slightly off-center and a little bit movable. Some call that dents and age, but it looks a lot like Apple's patent. Move the reflector just slightly, and you can highlight a dark area and remove some light from the bright parts. Easy enough and,

  • Oh, for goodness' sake, when is timothy going to a) learn to link to the original source and not some third-rate blog, and b) learn to distinguish between a patent application and a granted patent?

    Link to original source: Apple Working on the Next Wave of Digital Camera Technologies [patentlyapple.com]

    Link to U.S. Patent Application 20100238344 [uspto.gov]

  • I'm pretty sure that zooming the flash along with the lens was already being done on P&S cameras back in the '90s. Is this a case of tacking "on a telephone" onto an existing patent?
  • Citroën DS lights work this way: in order to get a bit more visibility out of the puny late-1960s headlamps, the reflector behind the lamp would pivot to point in the direction you turned the steering wheel.

    DS 21M phares tournant [youtube.com]
