US20010026683A1 - Digital camera - Google Patents

Digital camera

Info

Publication number
US20010026683A1
Authority
US
United States
Prior art keywords
image
detector
optical system
mirror
digital camera
Prior art date
Legal status
Granted
Application number
US09/815,977
Other versions
US6453124B2
Inventor
Yasuhiro Morimoto
Takeru Butsusaki
Kazuhiko Yukawa
Hiroaki Kubo
Current Assignee
Minolta Co Ltd
Original Assignee
Minolta Co Ltd
Priority date
Filing date
Publication date
Priority claimed from JP2000086391A (external priority, JP3726630B2)
Priority claimed from JP2000093334A (external priority, JP3634232B2)
Application filed by Minolta Co Ltd filed Critical Minolta Co Ltd
Assigned to MINOLTA CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YUKAWA, KAZUHIKO; BUTSUSAKI, TAKERU; KUBO, HIROAKI; MORIMOTO, YASUHIRO
Publication of US20010026683A1
Assigned to MINOLTA CO., LTD. CORRECTIVE ASSIGNMENT, ADVISING THAT KAZUHIKO YUKUHIKO IS DECEASED AND ADDING KAZUMI YUKAWA AS HIS LEGAL REPRESENTATIVE FOR ASSIGNMENT PREVIOUSLY RECORDED UNDER REEL 011875, FRAME 0435. Assignors: YUKAWA, KAZUMI, LEGAL REPRESENTATIVE OF KAZUHIKO YUKAWA (DECEASED); BUTSUSAKI, TAKERU; KUBO, HIROAKI; MORIMOTO, YASUHIRO
Application granted
Publication of US6453124B2
Anticipated expiration
Current legal status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • FIG. 1 diagrammatically shows a configuration of a main mechanism of a digital camera according to the present invention
  • FIGS. 2A to 2 D diagrammatically show operating conditions of the main mechanism in image capture
  • FIG. 3 is a block diagram of a control system in the digital camera
  • FIG. 4 shows a rear face of the digital camera
  • FIG. 5 is a state transition diagram when a digital camera according to a first preferred embodiment is in capture mode
  • FIG. 6 is a diagram explaining the position of a focusing lens in image capture of three successive frames.
  • FIG. 7 is a state transition diagram when a digital camera according to a second preferred embodiment is in capture mode.
  • FIG. 1 diagrammatically shows a configuration of a main mechanism of a digital camera according to a first preferred embodiment of the present invention
  • FIGS. 2A to 2 D diagrammatically show operating conditions of the main mechanism in image capture.
  • A digital camera 1 has a camera body 2 that is a modified single lens reflex camera body for silver halide films.
  • the camera body 2 has on its front face a taking lens 3 which is equipped with a taking lens portion 4 , a diaphragm 5 , and the like.
  • A quick return mirror M 1 is pivotally supported by a pivot 6 in the upper rear portion of the camera body 2 so that it can be rotationally displaced. Further, a focal plane shutter 7 is located at the rear of the quick return mirror M 1 in the direction of the optical path, and an image sensor 8 at the rear of the focal plane shutter 7 .
  • While the focal plane shutter 7 is retained in the camera body 2 here, it may be omitted depending on the type of the image sensor 8 .
  • An optical low pass filter 18 is provided for excluding the influence of aliasing (folding) noise at the sampling of an analog image signal from the image sensor 8 .
  • the optical low pass filter 18 , the focal plane shutter 7 , and the image sensor 8 constitute an imaging unit 19 .
  • the imaging unit 19 is movable back and forth along the optical path by a movement mechanism 30 . Responsive to upward rotational movement of the quick return mirror M 1 in image capture, the imaging unit 19 moves forward in a direction of the optical axis to its image capture position, i.e., until a light receiving surface of the image sensor 8 is moved to a position of back focal length. After image capture, responsive to downward rotational movement of the quick return mirror M 1 , the imaging unit 19 moves backward in the direction of the optical axis to its retracted position where no mechanical interference with the quick return mirror M 1 occurs.
  • the movement mechanism 30 can adopt any mechanism of a known configuration: for example, it can be constituted by a mechanism for converting rotation of a motor-driven bolt into axial linear motion.
  • a finder equivalent part 9 corresponding to a finder of a silver-halide-film camera is formed above the quick return mirror M 1 in the camera body 2 .
  • The finder equivalent part 9 is provided with a pentagonal prism 11 with a focusing screen 10 thereunder.
  • a predetermined relay lens 12 is located at the rear of the prism 11 and an eyepiece 13 at the rear of the relay lens 12 , while a light measuring sensor 14 is located above the relay lens 12 .
  • the relay lens 12 is not shown.
  • a range from the taking lens 3 to the optical low pass filter 18 in the imaging unit 19 corresponds to an imaging optical system of the invention.
  • the quick return mirror M 1 , the prism 11 , the relay lens 12 , and the eyepiece 13 constitute an optical viewfinder.
  • the quick return mirror M 1 is in a stationary position as shown in FIGS. 1 and 2A, i.e., it is inclined 45 degrees to the optical axis, in which case an optical path L from the taking lens portion 4 is toward the focusing screen 10 .
  • the quick return mirror M 1 is, as shown in FIGS. 2B to 2 D, rotationally displaced about the pivot 6 to almost a horizontal position, to open the optical path L from the taking lens portion 4 .
  • A mirror M 2 is integrated with the quick return mirror M 1 .
  • This mirror M 2 and a fixed mirror M 3 thereunder direct an optical image which passes through a half mirror portion partially formed in the quick return mirror M 1 , toward a distance-measuring sensor 15 .
  • Upon receipt of light from the optical image, the distance-measuring sensor 15 detects a distance to the subject and generates a phase difference detection signal.
  • the phase difference detection signal is for use in autofocusing of the taking lens portion 4 .
  • the prism 11 has the function of inverting and scaling down an optical image formed on the focusing screen 10 and directing a resultant image toward the light measuring sensor 14 and the eyepiece 13 . Further, control values such as the aperture value and the shutter speed are determined according to light quantity data obtained by the light measuring sensor 14 , or by a camera control CPU 20 on the basis of image data from the image sensor 8 . Also, the amount of light exposure in the image sensor 8 is determined.
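  • The exposure calculation itself is not detailed here; the following is a minimal Python sketch of one common way such control values could be derived, using the APEX relation Ev = Bv + Sv = Av + Tv, where the brightness value Bv would come from the light measuring sensor 14 (or from image data). The simple program line that splits Ev evenly between aperture and shutter is an illustrative assumption, not the camera's actual algorithm.
```python
import math

def apex_exposure(bv: float, sv: float = 5.0):
    """Sketch of program-mode exposure selection using APEX values.

    bv: brightness value derived from the metering (assumed input)
    sv: speed value (ISO 100 corresponds to Sv = 5)
    Returns (f_number, shutter_time_in_seconds).
    """
    ev = bv + sv                          # APEX relation: Ev = Bv + Sv = Av + Tv
    av = ev / 2.0                         # illustrative program line: split Ev evenly
    tv = ev - av
    f_number = math.sqrt(2.0 ** av)       # Av = log2(N^2)
    shutter_time = 1.0 / (2.0 ** tv)      # Tv = log2(1/t)
    return f_number, shutter_time

# Example: a moderately bright scene metered at Bv = 7 with ISO 100
print(apex_exposure(7.0))   # approximately f/8 at 1/64 s
```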
  • the camera body 2 further comprises a focus motor 36 for driving a focusing lens included in the taking lens portion 4 in the direction of the optical axis.
  • On the rear face of the camera body 2 , a display unit 16 constituted by a liquid crystal display (LCD) is provided for displaying images obtained from the output of the image sensor 8 .
  • FIG. 3 is a block diagram of a control system in the digital camera 1 .
  • In FIG. 3, reference numeral 3 designates a taking lens, 4 a taking lens portion, 5 a diaphragm, M 1 a quick return mirror, 7 a focal plane shutter, 8 an image sensor, 11 a prism, 13 an eyepiece, and 16 a display unit, all of which are identical to those shown in FIGS. 1 and 2A to 2 D.
  • Reference numeral 20 designates a camera control CPU which controls each part of the camera body 2 . More specifically, the diaphragm 5 is controlled through a diaphragm driver 21 and the image sensor 8 is controlled through a timing generator (sensor driver) 22 . An actuator 17 for driving the quick return mirror M 1 and the movement mechanism 30 for driving the imaging unit 19 are controlled through a mirror/imaging-unit driving circuit 23 , and the focal plane shutter 7 is controlled through a shutter driver 25 . The focus motor 36 is controlled through a motor driver 26 .
  • the camera control CPU 20 is connected to a camera control switch 24 which includes the shutter button 24 a , a power switch, and the like.
  • The image sensor 8 is formed of a charge coupled device (CCD), which is an area sensor with primary-color filters R (red), G (green), and B (blue) arranged in a checkerboard pattern on a pixel-by-pixel basis.
  • the image sensor 8 performs photoelectric conversion of a subject's optical image formed by the taking lens portion 4 into an image signal with RGB color components (i.e., a signal consisting of a sequence of image signals from each pixel), and then outputs that image signal.
  • the timing generator 22 generates and outputs a drive control signal for the image sensor 8 in accordance with a reference clock given from the camera control CPU 20 .
  • The timing generator 22 , for example, generates clock signals such as a timing signal for starting or stopping integration (exposure) and a readout control signal (e.g., a horizontal synchronization signal, a vertical synchronization signal, a transfer signal) for readout of signals from each pixel. Those signals are outputted through a driver to the image sensor 8 .
  • Output from the image sensor 8 is subjected to signal processing in a correlated double sampling (CDS) circuit 81 , an automatic gain control (AGC) circuit 82 , and an A/D converter 83 .
  • the CDS circuit 81 reduces noise in an image signal and the AGC circuit 82 provides gain control to adjust the level of the image signal.
  • the A/D converter 83 converts a normalized analog signal from the AGC circuit 82 into a 10-bit digital signal.
  • Reference numeral 40 designates an image processor for performing image processing on the output from the A/D converter 83 to form an image file. This processor 40 is controlled by an image processing CPU.
  • image data from the image sensor 8 is fetched into the image processor 40 where a variety of processing operations are performed.
  • the signal fetched from the A/D converter 83 into the image processor 40 is written into image memory 61 in synchronization with readout from the image sensor 8 . For subsequent processing, this data in the image memory 61 is accessed and processed in each block.
  • a pixel interpolation block 41 is a block for performing pixel interpolation in a predetermined interpolation pattern.
  • The pixel G, which has a higher frequency component than the pixels R and B, is interpolated with a mean value obtained by a median filter using the two intermediate values of the four pixels surrounding that pixel, while the pixels R and B are subjected to average interpolation to obtain their respective outputs.
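  • As a concrete illustration of the interpolation just described, the following is a minimal Python/NumPy sketch: a missing G value is taken as the mean of the two intermediate values among the four surrounding pixels (median-filter style), and missing R and B values are simple averages of their neighbors. The Bayer neighbor layout and border handling assumed here are illustrative, not taken from the patent.
```python
import numpy as np

def interpolate_g(raw: np.ndarray, y: int, x: int) -> float:
    """Missing G at (y, x): mean of the two intermediate values among the four
    vertically and horizontally adjacent G pixels (median-filter style)."""
    neighbors = sorted([raw[y - 1, x], raw[y + 1, x], raw[y, x - 1], raw[y, x + 1]])
    return (neighbors[1] + neighbors[2]) / 2.0   # drop min and max, average the middle two

def interpolate_rb(raw: np.ndarray, y: int, x: int, offsets) -> float:
    """Missing R or B at (y, x): plain average of the same-color neighbors."""
    return float(np.mean([raw[y + dy, x + dx] for dy, dx in offsets]))

# Example on synthetic 10-bit data; at an interior R/B site the orthogonal
# neighbors carry G and the diagonal neighbors carry the opposite color.
raw = np.random.randint(0, 1024, (6, 6)).astype(float)
g_value = interpolate_g(raw, 3, 3)
b_value = interpolate_rb(raw, 3, 3, [(-1, -1), (-1, 1), (1, -1), (1, 1)])
```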
  • a color-balance control block 42 corrects the gains of individual outputs for R, G, and B after pixel interpolation in the pixel interpolation block 41 , whereby color correction is made to the pixels R, G, and B.
  • The camera control CPU 20 performs the calculations R/G and B/G on the mean values of the respective outputs for R, G, and B, and the resultant values are used as the correction gains for R and B.
  • A gamma correction block 43 performs nonlinear conversion of the respective outputs for R, G, and B after normalization of color balance, whereby a tone conversion appropriate for the display unit 16 is performed.
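  • A minimal sketch of how the color-balance gains and the gamma correction described above could be computed. The text takes the ratios R/G and B/G as the basis of the R and B gains; the sketch scales R and B toward G with the reciprocals of those ratios, which is the usual normalization and is an assumption here, as is the gamma value of 1/2.2.
```python
import numpy as np

def color_balance(r: np.ndarray, g: np.ndarray, b: np.ndarray):
    """Derive gains from the R/G and B/G ratios of the channel means and scale
    R and B toward G (the scaling direction is an assumed convention)."""
    r_ratio = r.mean() / g.mean()          # R/G of the mean values
    b_ratio = b.mean() / g.mean()          # B/G of the mean values
    return r / r_ratio, g, b / b_ratio     # multiplying by 1/ratio equalises the channel means

def gamma_correct(channel: np.ndarray, gamma: float = 1 / 2.2, max_val: float = 1023.0):
    """Nonlinear (gamma) conversion of a 10-bit channel for display."""
    return max_val * (channel / max_val) ** gamma

r, g, b = (np.random.randint(0, 1024, (4, 4)).astype(float) for _ in range(3))
r_bal, g_bal, b_bal = color_balance(r, g, b)
r_display = gamma_correct(r_bal)
```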
  • the gamma-corrected image data is stored in the image memory 61 .
  • a video encoder 44 reads out the above data stored in the image memory 61 and encodes it to an NTSC/PAL format, the result of which is displayed on the display unit 16 .
  • An image compression block 45 performs compression on a captured image from the image sensor 8 by fetching image data from the image memory 61 .
  • the compressed captured image is recorded on a memory card 62 through a memory card driver 46 .
  • the memory card 62 is removably loaded in a predetermined part of the camera body 2 .
  • FIG. 4 shows a rear face of the digital camera 1 .
  • On the rear face of the camera body 2 , there are provided the aforementioned display unit 16 and a 4-way switch 35 on the right side of the display unit 16 .
  • Using U, D, L, and R buttons allows for various operations corresponding to the display on the display unit 16 , e.g., a choice from selection items.
  • An LCD button 31 is used for turning on/off a display on the display unit 16 .
  • An OK button 32 and a cancel button 33 are used by an operator either to confirm or to cancel a selection of items at various settings.
  • A menu button 34 is used for switching among various setting screens, e.g., a menu selection screen as described later, on the display.
  • the digital camera 1 mainly comes in two operating modes, namely, a “capture” mode and a “playback” mode.
  • The capture mode is a mode for performing image capture processing: in the shooting standby state, the display unit 16 displays live view images in some instances as described later, and immediately after image capture it displays the captured image.
  • the playback mode is a mode of performing processing on already-recorded images, e.g., playing back and displaying on the display unit 16 a captured image recorded on the memory card 62 .
  • Switching between the capture and the playback modes is done as follows. By the actuation of the menu button 34 or the like, a mode selection screen is displayed on the display unit 16 , on which screen the switching between the capture and the playback modes is effected by the actuation of the 4-way switch 35 , the OK button 32 , and the cancel button 33 .
  • FIG. 5 is a state transition diagram of the digital camera 1 in the capture mode.
  • the state transitions in the capture mode will be set forth. Unless otherwise specified, the operation of each unit is controlled by the camera control CPU 20 .
  • At power-on, the digital camera 1 goes into a capture mode with the optical viewfinder, in which mode the digital camera 1 starts to operate with the quick return mirror M 1 in the down position as shown in FIG. 2A, the display unit 16 off, and accordingly a live view display described later in the off state (state S 1 ).
  • In the state S 1 , AF processing is not performed.
  • a subject image in the optical viewfinder is slightly out of focus; however, rough framing by the optical viewfinder is possible in this state.
  • a menu setting screen appears on the display with a user's press of the menu button 34 , on which screen a user makes menu settings (state S 2 ).
  • FIG. 4 shows the menu setting screen displayed on the screen of the display unit 16 .
  • the menu setting screen allows a user to selectively define an AF method to be applied at a half shutter press of the shutter button 24 a (hereinafter also referred to only as a “half shutter press”).
  • a selection of the AF method at the half shutter press is made from the contrast AF method and the phase difference AF method.
  • If the phase difference AF method is selected in the state S 2 , a transition to the state S 1 occurs; if the contrast AF method is selected in the state S 2 , a transition to the state S 6 occurs.
  • The transition from the state S 2 to S 1 turns off the display unit 16 (i.e., the live view display), while the transition from the state S 2 to S 6 holds the display unit 16 in the on state and turns on the live view display.
  • At a half press of the shutter button 24 a in the state S 1 , phase difference AF and exposure adjustments are performed with the quick return mirror M 1 in the down position and the live view display off (state S 3 ).
  • the operation in the state S 3 will be set forth in detail.
  • the optical path L of light incident from the taking lens portion 4 and the diaphragm 5 changes its direction upward because of the presence of the quick return mirror M 1 in the camera body 2 .
  • An image is then formed on the focusing screen 10 , inverted and scaled down by the pentagonal prism 11 , and received by the light measuring sensor 14 .
  • the light measuring sensor 14 measures the amount of light, from which the camera control CPU 20 obtains exposure control data.
  • the diaphragm 5 is controlled through the diaphragm driver 21 and the timing generator 22 for applying the drive control signal to the image sensor 8 is controlled, so that the image sensor 8 receives a proper amount of light exposure.
  • the imaging unit 19 is in its retracted position to avoid mechanical interference with the quick return mirror M 1 , and the image receiving plane of the image sensor 8 is located behind the position of back focal length.
  • The distance-measuring sensor 15 detects a distance to the subject, on the basis of which the focusing lens in the taking lens portion 4 is driven for autofocusing.
  • The optical image, the optical path L of which changed direction at the quick return mirror M 1 , is scaled down by the prism 11 and the relay lens 12 and then reaches the eyepiece 13 . From this, a user can visually recognize a focused subject image through the eyepiece 13 . Although not shown, a user can go back to the state S 1 for reframing by releasing the half-pressed shutter button 24 a during the state S 3 , which provides precise framing.
  • FIG. 6 is a diagram explaining the position of the focusing lens in image capture of three successive frames.
  • At a full press of the shutter button 24 a in the state S 3 , the image sensor 8 obtains partial image data for only the AF area, a central rectangular portion of the image, at different positions of the focusing lens: a position at the time of full shutter press (i.e., an in-focus position determined by the phase difference AF) and positions determined by shifting the above in-focus position both forward and backward by the amount of deviation d obtained from the depth of focus (hereinafter referred to as a “front focus position” and a “rear focus position”). That is, a total of three frames of partial images are obtained (state S 4 ).
  • the depth of focus is obtained from the position of the focusing lens and aperture value at the time of full shutter press.
  • the amount of deviation d is previously obtained for each depth of focus and summarized in a table stored in a ROM (not shown) in the camera control CPU 20 . From this table, the amount of deviation d corresponding to the depth of focus is obtained for use.
  • AF evaluation values for the three sets of partial image data are obtained and compared with each other. From this comparison, the partial image with the maximum AF evaluation value is taken as the image in sharpest focus, and the corresponding position of the focusing lens is selected.
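  • The three-position comparison just described can be summarized in the following Python sketch. The contrast metric (sum of absolute differences of adjacent pixels), the contents of the deviation table, and the capture callback are assumptions for illustration; the text only states that the deviation d is looked up from the depth of focus and that the position with the maximum AF evaluation value is chosen.
```python
import numpy as np

# Hypothetical table: depth of focus -> deviation d, both in focus-motor steps.
DEVIATION_TABLE = {2: 1, 4: 2, 8: 4, 16: 8}

def af_evaluation(partial_image: np.ndarray) -> float:
    """Assumed AF evaluation value: sum of absolute differences of adjacent pixels."""
    return float(np.abs(np.diff(partial_image.astype(float), axis=1)).sum())

def select_focus_position(capture_af_area, in_focus_pos: int, depth_of_focus: int) -> int:
    """Capture the AF area at the front, in-focus and rear positions and keep the sharpest.

    capture_af_area(pos) is assumed to drive the focusing lens to 'pos' and return
    the partial image of the central AF area.
    """
    d = DEVIATION_TABLE.get(depth_of_focus, max(1, depth_of_focus // 2))  # deviation d
    candidates = [in_focus_pos - d, in_focus_pos, in_focus_pos + d]
    scores = {pos: af_evaluation(capture_af_area(pos)) for pos in candidates}
    return max(scores, key=scores.get)   # lens position with the maximum evaluation value
```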
  • The diaphragm 5 is set to a predetermined aperture value with the focusing lens held at the position to which it was driven by the phase difference AF at the time of half shutter press, and upward rotational displacement of the quick return mirror M 1 about the pivot 6 starts as indicated by an open arrow in FIG. 2B.
  • the imaging unit 19 is moved forward by the movement mechanism 30 in the direction of the optical axis of the taking lens portion 4 .
  • the actuator 17 for driving the quick return mirror M 1 and the movement mechanism 30 for moving the imaging unit 19 are not shown.
  • the depth of focus at the time of full shutter press is obtained in the same manner as above described and the corresponding amount of deviation d of the focusing lens is obtained.
  • the focusing lens is moved to its front and rear focus positions at each of which a partial image is captured as above described. Then, the position of the focusing lens which provides the partial image in sharpest focus is selected as above described and the focusing lens is actually moved to the selected position.
  • an image capture operation is performed at the selected position of the focusing lens (state S 5 ). More specifically, as shown in FIG. 2D, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image passing through the taking lens portion 4 and the diaphragm 5 is directly formed on the image sensor 8 and photoelectrically converted therein. The photoelectrically converted signal is then outputted through the buffer.
  • the image data outputted from the image sensor 8 is subjected to predetermined signal processing in the CDS circuit 81 , the AGC circuit 82 , and the A/D converter 83 , fetched into the image processor 40 , and written into the image memory 61 in synchronization with readout of the image sensor 8 .
  • the quick return mirror M 1 is rotated back to its original position, whereby the optical path L again goes toward the focusing screen 10 and the digital camera 1 is placed in the shooting standby state.
  • the imaging unit 19 moves backward to its retracted position in the direction of the optical axis to avoid interference with the rotational movement of the quick return mirror M 1 .
  • selected image data is recorded on the memory card 62 . More specifically, the image data written into the image memory 61 is subjected to the aforementioned processing such as pixel interpolation, color balance control, and gamma correction in the image processor 40 and is stored again in the image memory 61 . Resultant image data is fetched from the image memory 61 and displayed as a captured image on the display unit 16 . Simultaneously, this image data is also subjected to image compression in the image compression block 45 and recorded on the memory card 62 through the memory card driver 46 .
  • At the completion of the image capture operation, the digital camera 1 returns to the state S 1 .
  • When the LCD button 31 is pressed in the state S 1 , the display unit 16 is turned on and the digital camera 1 goes into the capture mode with live view display. As shown in FIG. 2D, the quick return mirror M 1 is flipped up, the display unit 16 is turned on to start a live view display, and the contrast AF is performed (state S 6 ).
  • the quick return mirror M 1 in the up position allows light from the taking lens portion 4 to reach the image sensor 8 .
  • Image data outputted from the image sensor 8 for each predetermined period of time, e.g., at every 1/30th of a second, is written into the image memory 61 .
  • the image data is then read out by the image processor 40 for the aforementioned image processing and stored again in the image memory 61 .
  • the video encoder 44 reads out the above data stored in the image memory 61 and encodes it to the NTSC/PAL format, the result of which is displayed on the display unit 16 and produces a live view display.
  • the camera control CPU 20 performs the contrast AF even for the live view display.
  • the contrast AF method is an autofocus method for achieving focus by fetching image data from the image memory 61 to obtain an AF evaluation value (contrast) for the image and driving the focus motor 36 to move the focusing lens to the position with the maximum AF evaluation value.
  • a known technique such as a “hill-climbing” method or the like can be adopted as a control method for achieving maximum contrast.
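  • A minimal sketch of the hill-climbing search mentioned here, assuming a capture function that drives the focusing lens to a position and returns an image; the contrast metric, step size, and stopping rule are illustrative assumptions rather than the camera's actual control law.
```python
import numpy as np

def contrast(image: np.ndarray) -> float:
    """Assumed AF evaluation value: sum of absolute differences of adjacent pixels."""
    return float(np.abs(np.diff(image.astype(float), axis=1)).sum())

def hill_climb_af(capture, start_pos: int, step: int = 8,
                  min_step: int = 1, max_moves: int = 100) -> int:
    """Drive the focusing lens in the direction of increasing contrast; on each drop
    of the evaluation value, reverse direction and halve the step."""
    pos, best = start_pos, contrast(capture(start_pos))
    direction = 1
    for _ in range(max_moves):
        if step < min_step:
            break
        candidate = pos + direction * step
        value = contrast(capture(candidate))
        if value > best:
            pos, best = candidate, value   # still climbing: accept the move
        else:
            direction = -direction         # passed the peak: reverse and refine
            step //= 2
    return pos                             # position with the maximum evaluation value found
```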
  • an image capture operation is performed based on the contrast AF (state S 8 ).
  • the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image from the taking lens portion 4 is directly formed on the image sensor 8 and resultant image data is stored in the image memory 61 .
  • This image data is fetched into the image processor 40 for the aforementioned image processing and stored again in the image memory 61 .
  • the image data is also recorded on the memory card 62 .
  • the digital camera 1 returns again to the state S 6 and is enabled for next image capture.
  • a single lens reflex digital camera performs the contrast AF for live view display while withdrawing the quick return mirror M 1 from the imaging optical system comprised of the taking lens portion 4 , the diaphragm 5 , and the imaging unit 19 .
  • the contrast AF can thus be performed responsive to the live view display. Further, the contrast AF for live view display provides precise framing.
  • the phase difference AF is performed with the quick return mirror M 1 placed in the imaging optical system to allow visual recognition of the subject image in the optical viewfinder.
  • a user can intentionally switch between the live view display and the optical viewfinder, which permits autofocusing even for framing using the optical viewfinder.
  • the phase difference AF is performed at the half shutter press of the shutter button 24 a (state S 3 ). This shortens processing time for autofocusing at the time of half shutter press, thereby preventing the occurrence of a time lag due to autofocusing.
  • When the contrast AF is selected on the menu setting screen, the quick return mirror M 1 is withdrawn from the imaging optical system (state S 6 ), while when the phase difference AF is selected on the menu setting screen, the quick return mirror M 1 is placed inside the imaging optical system (state S 1 ). From this, the contrast AF can be performed when high focusing precision is required and the phase difference AF when high-speed autofocusing is required. That is, the most suitable autofocusing method can be selected at a user's discretion.
  • When the contrast AF is selected, the live view display is turned on. This provides precise framing based on live view images obtained with the contrast AF.
  • the quick return mirror M 1 is located inside the imaging optical system and the phase difference AF method is correspondingly selected. This permits framing even during the off state of the live view display.
  • the phase difference AF is performed at the time of half shutter press and the contrast AF at the time of full shutter press. That is, the contrast AF can be performed at the time of full shutter press that requires high-precision autofocusing.
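  • For reference, the capture-mode transitions of FIG. 5 as they are described in this embodiment can be written down schematically as follows (Python). The state labels S 1 to S 8 and the triggering events are those named in the text and the abstract; the dictionary representation is only an illustration, and the trigger from the live view state S 6 to the capture state S 8 is assumed to be a shutter press, since the text does not name it explicitly.
```python
# Capture-mode transitions of the first preferred embodiment (FIG. 5), as described
# in the text and the abstract; the S6 -> S8 trigger is an assumption.
CAPTURE_MODE_TRANSITIONS = {
    ("S1", "menu button pressed"): "S2",           # menu setting screen
    ("S2", "phase difference AF selected"): "S1",  # live view display turned off
    ("S2", "contrast AF selected"): "S6",          # live view display turned on
    ("S1", "LCD button pressed"): "S6",            # live view with contrast AF
    ("S1", "shutter half-pressed"): "S3",          # phase difference AF, mirror down
    ("S3", "shutter released"): "S1",              # reframing
    ("S3", "shutter fully pressed"): "S4",         # three-position contrast comparison
    ("S4", "focus position selected"): "S5",       # image capture and recording
    ("S5", "capture completed"): "S1",
    ("S6", "shutter pressed"): "S8",               # image capture based on the contrast AF
    ("S8", "capture completed"): "S6",
}

def next_state(state: str, event: str) -> str:
    """Return the following state, or stay in the current state for unlisted events."""
    return CAPTURE_MODE_TRANSITIONS.get((state, event), state)
```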
  • A digital camera 1 according to the second preferred embodiment is basically identical to that of the aforementioned first preferred embodiment. The following description will thus focus mainly on the parts that differ from the first preferred embodiment.
  • FIG. 7 is a state transition diagram of the digital camera 1 in the capture mode according to the second preferred embodiment.
  • the state transitions in the capture mode will be set forth in detail. Unless otherwise specified, the operation of each unit is controlled by the camera control CPU 20 (cf. FIG. 3).
  • At power-on, the digital camera 1 goes into the capture mode with the optical viewfinder, in which mode the digital camera 1 starts to operate with the quick return mirror M 1 in the down position as shown in FIG. 2A, the display unit 16 off, and accordingly the live view display described later in the off state (state S 11 ).
  • AF processing is not performed.
  • A subject image in the optical viewfinder is slightly out of focus; however, rough framing by the optical viewfinder is possible in this state.
  • When the menu button 34 is pressed in the state S 11 , a menu setting screen appears on the display, on which screen a user makes menu settings (state S 12 ). FIG. 4 shows the menu setting screen displayed on the screen of the display unit 16 .
  • the menu setting screen allows a user to selectively define an AF method to be applied at a half shutter press of the shutter button 24 a .
  • a selection of the AF method at the half shutter press is made from the contrast AF method and the phase difference AF method.
  • If the phase difference AF method is selected in the state S 12 , a transition to the state S 11 occurs; if the contrast AF method is selected in the state S 12 , a transition to the state S 16 occurs.
  • The transition from the state S 12 to S 11 turns off the display unit 16 (i.e., the live view display), while the transition from the state S 12 to S 16 holds the display unit 16 in the on state and turns on the live view display.
  • At a half press of the shutter button 24 a in the state S 11 , phase difference AF and exposure adjustments are performed with the quick return mirror M 1 in the down position and the live view display off (state S 13 ). The optical path L of light incident from the taking lens portion 4 and the diaphragm 5 changes its direction upward because of the presence of the quick return mirror M 1 in the camera body 2 .
  • An image is then formed on the focusing screen 10 , inverted and scaled down by the pentagonal prism 11 , and received by the light measuring sensor 14 .
  • the light measuring sensor 14 measures the amount of light, from which the camera control CPU 20 obtains exposure control data.
  • the diaphragm 5 is controlled through the diaphragm driver 21 and the timing generator 22 for applying the drive control signal to the image sensor 8 is controlled, so that the image sensor 8 receives a proper amount of light exposure.
  • the imaging unit 19 is in its retracted position to avoid mechanical interference with the quick return mirror M 1 , and the image receiving plane of the image sensor 8 is located behind the position of back focal length.
  • the optical image, the optical path L of which changed direction at the quick return mirror M 1 is scaled down by the prism 11 and the relay lens 12 and then reaches the eyepiece 13 . From this, a user can visually recognize a focused subject image through the eyepiece 13 . Although not shown, a user can go back to the state S 11 for reframing at the release of the half-pressed shutter button 24 a during the state S 13 , which provides precise framing.
  • When the shutter button 24 a is further pressed to its full-pressed position by a user in the state S 13 and the focal length of the taking lens 3 (which may be a zoom lens) is 50 mm or less on a 135-mm film basis, an image capture operation is performed in a state S 15 .
  • When the focal length is longer than 50 mm, on the other hand, image capture of three successive frames is performed in a predetermined AF area, and a position of the focusing lens is selected which provides the image in sharpest focus out of the three partial images obtained (state S 14 ). That is, when the taking lens 3 is a lens for use in telephotography, a lens drive is first conducted based on the phase difference detection signal and then another lens drive is conducted based on the AF evaluation value according to the contrast AF method, as sketched below.
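  • The focal-length-dependent branch at the full press in the state S 13 can be sketched as follows. The 50 mm threshold (135-film equivalent) and the phase-then-contrast sequence come from the text; the function names and capture callbacks are placeholders, and select_focus_position() is reused from the earlier bracketing sketch rather than being part of the patent.
```python
TELEPHOTO_THRESHOLD_MM = 50  # 135-film-equivalent focal length stated in the text

def on_full_press_in_s13(focal_length_mm: float, phase_af_position: int,
                         depth_of_focus: int, capture_af_area, capture_full_frame):
    """State S13 full press: capture directly for short focal lengths (state S15),
    otherwise refine the phase-difference result by the three-position contrast
    comparison (state S14) before capturing."""
    if focal_length_mm <= TELEPHOTO_THRESHOLD_MM:
        return capture_full_frame(phase_af_position)        # straight to state S15
    # Telephoto case: contrast comparison around the phase-difference in-focus position,
    # reusing select_focus_position() from the earlier bracketing sketch.
    refined_pos = select_focus_position(capture_af_area, phase_af_position, depth_of_focus)
    return capture_full_frame(refined_pos)                  # state S15 at the refined position
```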
  • The image sensor 8 obtains partial image data for only the AF area, a central rectangular portion of the image, at different positions of the focusing lens: a position at the time of full shutter press (i.e., an in-focus position determined by the phase difference AF) and positions determined by shifting the above in-focus position both forward and backward by the amount of deviation d obtained from the depth of focus (hereinafter referred to as a “front focus position” and a “rear focus position”). That is, a total of three frames of partial images are obtained.
  • the depth of focus is obtained from the position of the focusing lens and aperture value at the time of full shutter press.
  • the amount of deviation d is previously obtained for each depth of focus and summarized in a table stored in a ROM (not shown) in the camera control CPU 20 . From this table, the amount of deviation d corresponding to the depth of focus is obtained for use.
  • AF evaluation values for the three sets of partial image data are obtained and compared with each other. From this comparison, the partial image with the maximum AF evaluation value is taken as the image in sharpest focus, and the corresponding position of the focusing lens is selected.
  • The diaphragm 5 is set to a predetermined aperture value with the focusing lens held at the position to which it was driven by the phase difference AF at the time of half shutter press, and upward rotational displacement of the quick return mirror M 1 about the pivot 6 starts as indicated by the open arrow in FIG. 2B.
  • the imaging unit 19 is moved forward by the movement mechanism 30 in the direction of the optical axis of the taking lens portion 4 .
  • the actuator 17 for driving the quick return mirror M 1 and the movement mechanism 30 for moving the imaging unit 19 are not shown.
  • the depth of focus at the time of full shutter press is obtained in the same manner as above described and the corresponding amount of deviation d of the focusing lens is obtained.
  • the focusing lens is moved to its front and rear focus positions at each of which a partial image is captured as above described. Then, the position of the focusing lens which provides the partial image in sharpest focus is selected as above described and the focusing lens is actually moved to the selected position.
  • an image capture operation is performed at the selected position of the focusing lens (state S 15 ). More specifically, as shown in FIG. 2D, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image passing through the taking lens portion 4 and the diaphragm 5 is directly formed on the image sensor 8 and photoelectrically converted therein. The photoelectrically converted signal is then outputted through the buffer.
  • The image data outputted from the image sensor 8 is subjected to predetermined signal processing in the CDS circuit 81 , the AGC circuit 82 , and the A/D converter 83 , fetched into the image processor 40 , and written into the image memory 61 in synchronization with readout of the image sensor 8 .
  • After the image capture, the quick return mirror M 1 is rotated back to its original position, whereby the optical path L again goes toward the focusing screen 10 and the digital camera 1 is placed in the shooting standby state.
  • the imaging unit 19 moves backward to its retracted position in the direction of the optical axis to avoid interference with the rotational movement of the quick return mirror M 1 .
  • selected image data is recorded on the memory card 62 . More specifically, the image data written into the image memory 61 is subjected to the aforementioned processing such as pixel interpolation, color balance control, and gamma correction in the image processor 40 and is stored again in the image memory 61 . Resultant image data is fetched from the image memory 61 and displayed as a captured image on the display unit 16 . Simultaneously, this image data is also subjected to image compression in the image compression block 45 and recorded on the memory card 62 through the memory card driver 46 .
  • At the completion of the image capture operation, the digital camera 1 returns to the state S 11 .
  • When the LCD button 31 is pressed in the state S 11 , the display unit 16 is turned on and the digital camera 1 goes into the capture mode with live view display. As depicted in FIG. 2D, the quick return mirror M 1 is flipped up, the display unit 16 is turned on to start a live view display, and the contrast AF is performed (state S 16 ).
  • the quick return mirror M 1 in the up position allows light from the taking lens portion 4 to reach the image sensor 8 .
  • Image data outputted from the image sensor 8 for each predetermined period of time, e.g., at every 1/30th of a second, is written into the image memory 61 .
  • the image data is then read out by the image processor 40 for the aforementioned image processing and stored again in the image memory 61 .
  • the video encoder 44 reads out the above data stored in the image memory 61 and encodes it to the NTSC/PAL format, the result of which is displayed on the display unit 16 and produces a live view display.
  • the camera control CPU 20 performs the contrast AF even for the live view display.
  • the contrast AF method is an autofocus method for achieving focus by fetching image data from the image memory 61 to obtain an AF evaluation value (contrast) for the image and driving the focus motor 36 to move the focusing lens to the position with the maximum AF evaluation value.
  • a known technique such as a “hill-climbing” method or the like can be adopted as a control method for achieving maximum contrast.
  • an image capture operation is performed based on the contrast AF (state S 18 ).
  • the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image from the taking lens portion 4 is directly formed on the image sensor 8 and resultant image data is stored in the image memory 61 .
  • This image data is fetched into the image processor 40 for the aforementioned image processing and stored again in the image memory 61 .
  • the image data is also recorded on the memory card 62 .
  • At the completion of the image capture operation, the digital camera 1 returns again to the state S 16 and is enabled for the next image capture.
  • a single lens reflex digital camera is configured to first conduct a lens drive based on the phase difference detection signal according to the phase difference AF method and then conduct another lens drive based on the evaluation value according to the contrast AF method.
  • a rough in-focus condition can be achieved in a short time by the phase difference AF method and a more precise in-focus condition can be achieved by the contrast AF method.
  • the digital camera 1 of this preferred embodiment can thus achieve an in-focus condition with great accuracy and efficiency.
  • the digital camera 1 is also configured to conduct a lens drive based on the AF evaluation value after the completion of the exposure control. That is, the taking lens can be driven on the basis of the evaluation value which is obtained under the same conditions as actual image recording. This achieves a more precise in-focus condition.
  • the digital camera 1 conducts a lens drive based on the phase difference detection signal. This permits an efficient operation of the digital camera 1 and especially an efficient lens drive.
  • the digital camera 1 first conducts a lens drive based on the phase difference detection signal and then conducts another lens drive based on the AF evaluation value. This achieves an in-focus condition with efficiency.

Abstract

At power-on, a digital camera starts to operate with a display unit off (state S1). At a subsequent half press of a shutter button, phase difference AF is performed with a quick return mirror in the down position and the live view display off (state S3). At a subsequent full press of the shutter button, the digital camera enters an image capture operation. With the quick return mirror in the up position, a focusing lens is moved to its in-focus, front focus and rear focus positions, at each of which the contrast in an AF area, a predetermined partial portion of the image, is obtained; the contrasts are compared to select the position of the focusing lens with maximum contrast (state S4). Then, image data is obtained and recorded on a memory card (state S5). With the above processing, the digital camera can perform autofocusing, as circumstances demand, by a contrast AF method.

Description

  • This application is based on the applications Nos. 2000-86391 and 2000-93334 filed in Japan, the contents of which are hereby incorporated by reference. [0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The present invention relates to a digital camera for capturing a subject image to generate image data, and especially to an autofocus (hereinafter referred to as “AF”) technique used in a digital camera. [0003]
  • 2. Description of the Background Art [0004]
  • Most single lens reflex digital cameras are configured such that an image sensor such as a CCD is installed in the film loading part of a single lens reflex camera utilizing silver halide films (hereinafter referred to as a “silver-halide-film camera”). Such digital cameras achieve focus by a phase difference autofocus (AF) method, in which light passing through a taking lens is separated by a separator lens and the amount of lens travel is obtained on the basis of the distance between the resulting images. [0005]
  • The recent trend to increase the number of pixels with reduced pixel pitch in an image sensor requires improved focus accuracy. In the phase difference AF method, however, there are limitations to the attainable focus accuracy, and it is therefore becoming difficult to perform autofocusing with focus accuracy appropriate to the pixel pitch. [0006]
  • Besides the phase difference AF method, AF techniques include image-signal autofocus, which attains focus on the basis of the contrast of image data obtained in an image sensor: more specifically, a contrast AF method for achieving focus based on image contrast. The contrast AF method attains higher focus accuracy than the phase difference AF method. Thus, it is desirable that a digital camera adopt the contrast AF method in view of the increased number of pixels in the image sensor. However, in a single lens reflex digital camera of the configuration described above, the contrast AF method cannot be used. [0007]
  • Now, if a digital camera is configured with a CCD image sensor which includes a plurality of pixels more densely packed than before, a permissible circle of confusion becomes smaller than before and a higher degree of focus-position detection accuracy is required in autofocusing. [0008]
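  • To make the relation between pixel pitch and focusing tolerance concrete: with the permissible circle of confusion c commonly taken to be proportional to the pixel pitch, the image-side depth of focus is roughly 2Nc for f-number N, so denser pixels shrink the tolerance directly. A small numeric sketch, with an assumed proportionality factor:
```python
def depth_of_focus_um(pixel_pitch_um: float, f_number: float, k: float = 2.0) -> float:
    """Image-side depth of focus ~ 2*N*c, with the permissible circle of confusion c
    assumed to be k times the pixel pitch (k is an illustrative factor)."""
    circle_of_confusion = k * pixel_pitch_um
    return 2.0 * f_number * circle_of_confusion

# Halving the pixel pitch halves the focusing tolerance at the same aperture.
print(depth_of_focus_um(6.0, 2.8))   # about 67 micrometres
print(depth_of_focus_um(3.0, 2.8))   # about 34 micrometres
```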
  • Conventionally, a technique called the “contrast AF method” (or “hill-climbing AF method”) is adopted for autofocusing in image capture devices such as video cameras. More specifically, the contrast AF method is a technique in which, while a focusing lens included in a taking lens is driven, the contrast of the captured image at each drive step is obtained as an evaluation value, and the lens position with the maximum evaluation value is selected as the in-focus position. Hereinafter, the terminology “in-focus position” refers to a position at which a lens is positioned to provide an in-focus image. [0009]
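  • The basic "drive, evaluate, pick the maximum" idea can be sketched in Python as follows. The concrete evaluation value (a sum of absolute differences of neighboring pixels) and the capture callback are assumptions for illustration; the passage only says that the contrast of the captured image at each drive step is used.
```python
import numpy as np

def af_evaluation(image: np.ndarray) -> float:
    """Assumed evaluation value: sum of absolute differences of horizontally adjacent pixels."""
    return float(np.abs(np.diff(image.astype(float), axis=1)).sum())

def contrast_af_sweep(capture, lens_positions):
    """Drive the focusing lens through the given positions, evaluate the contrast of the
    captured image at each drive step, and return the position with the maximum value."""
    best_pos, best_val = None, float("-inf")
    for pos in lens_positions:
        value = af_evaluation(capture(pos))  # capture(pos) drives the lens and returns an image
        if value > best_val:
            best_pos, best_val = pos, value
    return best_pos
```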
  • In the field of video cameras and the like for motion image capture, since the number of pixels in the CCD image sensor to be used is approximately several hundred thousand, the permissible circle of confusion is large and improved accuracy is not required in autofocusing. Further, an excessively high focus speed in video recording causes frequent changes of the focused portion of an image in response to movements of the camera and the subject, and unnatural images are thereby produced because the human eye cannot follow such frequent changes. Thus, the features required of video cameras for autofocusing differ from those of still cameras. [0010]
  • Digital cameras for still image capture, on the other hand, are required to immediately achieve focus in order not to lose a shutter release opportunity. [0011]
  • Besides, the phase difference AF method is conventionally adopted for autofocusing in single lens reflex cameras for silver halide films. In autofocusing by the phase difference AF method, the extent to which an in-focus position of a lens is offset from a film plane can be recognized instantaneously by the distance (phase difference) between images at the time when a phase difference detection sensor with a CCD line sensor receives light from a subject image. From this, the phase difference AF method is advantageous in that only one lens drive brings the in-focus position into coincidence with the film plane. [0012]
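  • A rough sketch of why one lens drive suffices in the phase difference method: the separation between the two images on the CCD line sensor maps, approximately linearly for a given AF module, to the defocus at the film plane, which in turn maps to a focusing-lens travel. Both conversion factors below are hypothetical calibration values, not figures from the patent.
```python
def phase_difference_af_drive(image_separation_px: float,
                              in_focus_separation_px: float = 20.0,
                              px_to_defocus_mm: float = 0.05,
                              defocus_to_lens_steps: float = 40.0) -> int:
    """Convert the separation measured on the CCD line sensor into a single focusing-lens
    drive; all three constants stand in for module-specific calibration values."""
    defocus_mm = (image_separation_px - in_focus_separation_px) * px_to_defocus_mm
    return round(defocus_mm * defocus_to_lens_steps)   # signed number of focus-motor steps

# Example: a separation 6 px wider than the in-focus separation gives a drive of +12 steps.
print(phase_difference_af_drive(26.0))
```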
  • On the other hand, in autofocusing by the contrast AF method, especially when the contrast of a captured image is low, the change in the evaluation values before and after a taking lens drive may be so small that the direction in which the lens should be moved toward its in-focus position cannot be determined. The contrast AF method therefore has a problem of taking much time to achieve an in-focus condition. [0013]
  • Even in autofocusing by the phase difference AF method, it is necessary to improve the resolution of the CCD line sensor to achieve an in-focus condition with high accuracy. This increases the size and cost of the phase difference detection sensor for detecting the phase difference. Moreover, the phase difference AF method may have an error in the in-focus position because of an error in installation of the phase difference detection sensor. [0014]
  • SUMMARY OF THE INVENTION
  • The present invention is directed to a digital camera. [0015]
  • According to an aspect of the present invention, the digital camera comprises an image sensor for capturing a subject image; a mirror movable between a first position to enter an optical path from an imaging optical system to the image sensor, and a second position to withdraw from the optical path; a driver for driving the mirror; a first detector for detecting an in-focus condition of the imaging optical system on the basis of an image signal from the image sensor; a display for displaying an image signal obtained by the image sensor; and a focus controller for when the display provides a display, controlling the driver to move the mirror to the second position and driving the imaging optical system according to a result of detection by the first detector. [0016]
  • Thus, when the display provides a display, autofocusing based on the image signal from the image sensor can be performed. Such autofocusing for the live view display provides precise framing. [0017]
  • According to another aspect of the present invention, the digital camera comprises: an image sensor for capturing a subject image; a mirror movable between a first position to enter an optical path from an imaging optical system to the image sensor, and a second position to withdraw from the optical path; a driver for driving the mirror; a first detector for detecting an in-focus condition of the imaging optical system on the basis of an image signal from the image sensor; a second detector for detecting an in-focus condition of the imaging optical system by a phase difference method; a selector for selecting either the first detector or the second detector; and a driver controller for when the selector selects the first detector to detect an in-focus condition, controlling the driver to move the mirror to the second position and when the selector selects the second detector to detect an in-focus condition, placing the mirror at the first position. [0018]
  • The selection from the first and second detectors allows switching between the live view display and the optical viewfinder. Thereby autofocusing can be performed even for framing by the optical viewfinder. [0019]
  • According to still another aspect of the present invention, the digital camera comprises: an image sensor for capturing a subject image; a mirror movable between a first position to enter an optical path from an imaging optical system to the image sensor, and a second position to withdraw from the optical path; a driver for driving the mirror; a first detector for detecting an in-focus condition of the imaging optical system on the basis of an image signal from the image sensor; a second detector for detecting an in-focus condition of the imaging optical system by a phase difference method; an operating member for image capture being movable from a first operating position to a second operating position which is a further pressed position from the first operating position; and a focus controller for when the operating member for image capture is at the first operating position, placing the mirror at the first position and driving the imaging optical system according to a result of detection by the second detector and when the operating member for image capture is at the second operating position, controlling the driver to move the mirror to the second position and driving the imaging optical system according to a result of detection by the first detector. [0020]
  • By driving the imaging optical system according to the result of detection by the second detector, a rough in-focus condition is achieved. Subsequently by driving the imaging optical system according to the result of detection by the first detector with the operating member for image capture at the second operating position, a more precise in-focus condition can be achieved with efficiency. [0021]
  • According to still another aspect of the present invention, the digital camera comprises: an image sensor for capturing a subject image; a mirror movable between a first position to enter an optical path from an imaging optical system to the image sensor, and a second position to withdraw from the optical path; a driver for driving the mirror; a first detector for detecting an in-focus condition of the imaging optical system on the basis of an image signal from the image sensor; a second detector for detecting an in-focus condition of the imaging optical system by a phase difference method; and a focus controller for after driving the imaging optical system according to a result of detection by the second detector with the mirror located at the first position, then controlling the driver to move the mirror to the second position and driving the imaging optical system according to a result of detection by the first detector. [0022]
  • By driving the imaging optical system according to the result of detection by the second detector, a rough in-focus condition is achieved. Subsequently by moving the mirror to the second position and driving the imaging optical system according to the result of detection by the first detector, a more precise in-focus condition can be achieved with efficiency. [0023]
  • In this fashion, an object of the present invention is to enable autofocusing responsive to an image signal from an image sensor in a digital camera and to obtain a focus condition of an image formed in the image sensor with great accuracy and efficiency.[0024]
  • These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings. [0025]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 diagrammatically shows a configuration of a main mechanism of a digital camera according to the present invention; [0026]
• FIGS. 2A to 2D diagrammatically show operating conditions of the main mechanism in image capture; [0027]
  • FIG. 3 is a block diagram of a control system in the digital camera; [0028]
  • FIG. 4 shows a rear face of the digital camera; [0029]
  • FIG. 5 is a state transition diagram when a digital camera according to a first preferred embodiment is in capture mode; [0030]
  • FIG. 6 is an explanatory diagram for explaining the position of a focusing lens in image capture of three successive frames; and [0031]
  • FIG. 7 is a state transition diagram when a digital camera according to a second preferred embodiment is in capture mode.[0032]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Hereinafter, preferred embodiments according to the present invention will be set forth in detail with reference to the drawings. [0033]
  • 1. First Preferred Embodiment
  • First, a first preferred embodiment of the present invention will be described. [0034]
  • 1-1. Device Configuration
• FIG. 1 diagrammatically shows a configuration of a main mechanism of a digital camera according to a first preferred embodiment of the present invention, and FIGS. 2A to 2D diagrammatically show operating conditions of the main mechanism in image capture. [0035]
• A digital camera 1 has a camera body 2 which is a modified body of a single lens reflex camera for silver halide films. The camera body 2 has on its front face a taking lens 3 which is equipped with a taking lens portion 4, a diaphragm 5, and the like. [0036]
• At the rear of the taking lens portion 4 in the direction of the optical path, a quick return mirror M1 is located which is pivotally supported by a pivot 6 in the upper rear portion of the camera body 2 so that it can be rotationally displaced. Further, a focal plane shutter 7 is located at the rear of the quick return mirror M1 in the direction of the optical path, and an image sensor 8 at the rear of the focal plane shutter 7. [0037]
• Although the focal plane shutter 7 is provided in the camera body 2 in this embodiment, it may be omitted depending on the type of the image sensor 8. [0038]
• On the front face of the image sensor 8, an optical low pass filter 18 is provided for excluding the influence of aliasing noise at the sampling of an analog image signal from the image sensor 8. The optical low pass filter 18, the focal plane shutter 7, and the image sensor 8 constitute an imaging unit 19. [0039]
• The imaging unit 19 is movable back and forth along the optical path by a movement mechanism 30. Responsive to upward rotational movement of the quick return mirror M1 in image capture, the imaging unit 19 moves forward in the direction of the optical axis to its image capture position, i.e., until a light receiving surface of the image sensor 8 is moved to the position of back focal length. After image capture, responsive to downward rotational movement of the quick return mirror M1, the imaging unit 19 moves backward in the direction of the optical axis to its retracted position where no mechanical interference with the quick return mirror M1 occurs. [0040]
• The movement mechanism 30 can adopt any mechanism of a known configuration: for example, it can be constituted by a mechanism for converting the rotation of a motor-driven screw into axial linear motion. [0041]
• A finder equivalent part 9 corresponding to the finder of a silver-halide-film camera is formed above the quick return mirror M1 in the camera body 2. The finder equivalent part 9 is provided with a pentagonal prism 11 with a focusing screen 10 thereunder. Further, a predetermined relay lens 12 is located at the rear of the prism 11 and an eyepiece 13 at the rear of the relay lens 12, while a light measuring sensor 14 is located above the relay lens 12. In FIGS. 2A to 2D, the relay lens 12 is not shown. A range from the taking lens 3 to the optical low pass filter 18 in the imaging unit 19 corresponds to an imaging optical system of the invention. Also, the quick return mirror M1, the prism 11, the relay lens 12, and the eyepiece 13 constitute an optical viewfinder. [0042]
• Until a shutter button 24a shown in FIG. 3 is fully pressed, the quick return mirror M1 is in a stationary position as shown in FIGS. 1 and 2A, i.e., it is inclined 45 degrees to the optical axis, in which case an optical path L from the taking lens portion 4 is directed toward the focusing screen 10. At a full shutter press of the shutter button 24a (hereinafter also referred to simply as a “full shutter press”), the quick return mirror M1 is, as shown in FIGS. 2B to 2D, rotationally displaced about the pivot 6 to an almost horizontal position, to open the optical path L from the taking lens portion 4. [0043]
• A mirror M2 is integrated with the quick return mirror M1. This mirror M2 and a fixed mirror M3 thereunder direct an optical image, which passes through a half mirror portion partially formed in the quick return mirror M1, toward a distance-measuring sensor 15. Upon receipt of light from the optical image, the distance-measuring sensor 15 detects a distance to the subject and generates a phase difference detection signal. The phase difference detection signal is used for autofocusing of the taking lens portion 4. [0044]
• The prism 11 has the function of inverting and scaling down an optical image formed on the focusing screen 10 and directing the resultant image toward the light measuring sensor 14 and the eyepiece 13. Control values such as the aperture value and the shutter speed are determined from light quantity data obtained by the light measuring sensor 14, or by a camera control CPU 20 on the basis of image data from the image sensor 8. The amount of light exposure for the image sensor 8 is determined in the same manner. [0045]
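• As a purely illustrative sketch of such an exposure computation (not the control algorithm of this camera), an aperture/shutter pair can be derived from a metered brightness using the standard APEX relation EV = AV + TV; the function name, the candidate apertures, and the hand-holding heuristic below are assumptions:

```python
import math

def exposure_from_brightness(ev, candidate_apertures=(2.8, 4.0, 5.6, 8.0, 11.0, 16.0)):
    """Pick an aperture/shutter pair whose APEX sum matches the metered exposure value.

    ev: exposure value metered by the light measuring sensor (assumed ISO 100).
    Returns (f_number, shutter_time_in_seconds).
    """
    best = None
    for f_number in candidate_apertures:
        av = 2.0 * math.log2(f_number)     # aperture value
        tv = ev - av                       # required time value
        t = 2.0 ** (-tv)                   # shutter time in seconds
        score = abs(math.log2(t * 60.0))   # prefer times near 1/60 s (arbitrary heuristic)
        if best is None or score < best[0]:
            best = (score, f_number, t)
    return best[1], best[2]
```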
• The camera body 2 further comprises a focus motor 36 for driving a focusing lens included in the taking lens portion 4 in the direction of the optical axis. [0046]
• On the rear face of the camera body 2, there is provided a display unit 16 constituted by a liquid crystal display (LCD) for displaying images obtained from the output of the image sensor 8. [0047]
• FIG. 3 is a block diagram of a control system in the digital camera 1. [0048]
• In FIG. 3, reference numeral 3 designates a taking lens; 4 designates a taking lens portion; 5 designates a diaphragm; M1 designates a quick return mirror; 7 designates a focal plane shutter; 8 designates an image sensor; 11 designates a prism; 13 designates an eyepiece; and 16 designates a display unit, all of which are identical to those shown in FIGS. 1 and 2A to 2D. [0049]
• Reference numeral 20 designates a camera control CPU which controls each part of the camera body 2. More specifically, the diaphragm 5 is controlled through a diaphragm driver 21 and the image sensor 8 is controlled through a timing generator (sensor driver) 22. An actuator 17 for driving the quick return mirror M1 and the movement mechanism 30 for driving the imaging unit 19 are controlled through a mirror/imaging-unit driving circuit 23, and the focal plane shutter 7 is controlled through a shutter driver 25. The focus motor 36 is controlled through a motor driver 26. [0050]
• The camera control CPU 20 is connected to a camera control switch 24 which includes the shutter button 24a, a power switch, and the like. [0051]
• In this preferred embodiment, the image sensor 8 is formed of a charge coupled device (CCD), which is an area sensor with primary-color filters R (red), G (green), and B (blue) arranged in a checkerboard pattern on a pixel-by-pixel basis. The image sensor 8 performs photoelectric conversion of the subject's optical image formed by the taking lens portion 4 into an image signal with RGB color components (i.e., a signal consisting of a sequence of image signals from each pixel), and then outputs that image signal. [0052]
• The timing generator 22 generates and outputs a drive control signal for the image sensor 8 in accordance with a reference clock given from the camera control CPU 20. The timing generator 22, for example, generates clock signals such as a timing signal for starting or stopping integration (exposure) and readout control signals (e.g., a horizontal synchronization signal, a vertical synchronization signal, a transfer signal) for readout of signals from each pixel. Those signals are outputted through a driver to the image sensor 8. [0053]
• Output from the image sensor 8 is subjected to signal processing in a correlated double sampling (CDS) circuit 81, an automatic gain control (AGC) circuit 82, and an A/D converter 83. The CDS circuit 81 reduces noise in the image signal and the AGC circuit 82 provides gain control to adjust the level of the image signal. The A/D converter 83 converts the normalized analog signal from the AGC circuit 82 into a 10-bit digital signal. [0054]
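• As a simplified numerical model of this signal chain (purely illustrative; the actual CDS and AGC stages are analog circuits), the sketch below applies a gain to an idealized, CDS-cleaned signal and quantizes the result to 10 bits; the names are assumptions:

```python
import numpy as np

def analog_front_end(sensor_signal, gain, full_scale=1.0, bits=10):
    """Model the AGC gain stage followed by 10-bit A/D conversion.

    sensor_signal: array of pixel values after (idealized) CDS noise reduction,
    normalized so that full_scale corresponds to sensor saturation.
    """
    amplified = np.clip(sensor_signal * gain, 0.0, full_scale)
    codes = np.round(amplified / full_scale * (2**bits - 1))
    return codes.astype(np.uint16)            # 10-bit digital codes (0..1023)
```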
• Reference numeral 40 designates an image processor for performing image processing on the output from the A/D converter 83 to form an image file. This processor 40 is controlled by an image processing CPU. [0055]
• In image capture, image data from the image sensor 8 is fetched into the image processor 40 where a variety of processing operations are performed. [0056]
• The signal fetched from the A/D converter 83 into the image processor 40 is written into image memory 61 in synchronization with readout from the image sensor 8. For subsequent processing, this data in the image memory 61 is accessed and processed in each block. [0057]
• In the image processor 40, a pixel interpolation block 41 is a block for performing pixel interpolation in a predetermined interpolation pattern. In this preferred embodiment, after the pixels R, G, and B are masked in their respective filter patterns, the pixel G, which has a higher frequency component than the pixels R and B, is replaced with a mean value obtained by a median filter using the intermediate two values of the four pixels surrounding that pixel, while the pixels R and B are subjected to average interpolation to obtain the respective outputs. [0058]
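• The sketch below illustrates one way such an interpolation could look for a Bayer-type pattern (a simplification with hypothetical helper names, not the exact implementation of block 41): the missing G at an R or B site is the mean of the two middle values of its four G neighbors, and missing R or B values are plain averages of the same-color neighbors.

```python
import numpy as np

def interpolate_green_at(raw, y, x):
    """Missing G at an R or B site: mean of the two middle values of the
    four G neighbors (up/down/left/right), i.e. a median-style filter.
    Assumes (y, x) is not on the image border."""
    neighbors = sorted([raw[y - 1, x], raw[y + 1, x], raw[y, x - 1], raw[y, x + 1]])
    return (neighbors[1] + neighbors[2]) / 2.0

def interpolate_rb_at(raw, y, x, offsets):
    """Missing R or B: average of the available same-color neighbors, given
    as (dy, dx) offsets that depend on the position within the pattern."""
    vals = [raw[y + dy, x + dx] for dy, dx in offsets]
    return float(np.mean(vals))
```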
• A color-balance control block 42 corrects the gains of the individual outputs for R, G, and B after pixel interpolation in the pixel interpolation block 41, whereby color correction is made to the pixels R, G, and B. With respect to color balance, the camera control CPU 20 calculates R/G and B/G for the mean values of the respective outputs for R, G, and B, and the resultant values are taken as the corrected gains of R and B. [0059]
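• A minimal gray-world white-balance sketch in the same spirit (an assumption about the exact gain convention, for illustration only): the mean of each channel is compared with the G mean, and the R and B channels are scaled so that their means match it.

```python
import numpy as np

def gray_world_balance(rgb):
    """rgb: float array of shape (H, W, 3) after pixel interpolation, values in [0, 1].
    Scales the R and B channels so their means match the G mean."""
    r_mean, g_mean, b_mean = rgb.reshape(-1, 3).mean(axis=0)
    gains = np.array([g_mean / r_mean, 1.0, g_mean / b_mean])
    return np.clip(rgb * gains, 0.0, 1.0)
```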
• A gamma correction block 43 performs nonlinear conversion of the respective outputs for R, G, and B after normalization of the color balance, thereby providing a contrast characteristic suited to the display unit 16. The gamma-corrected image data is stored in the image memory 61. [0060]
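• As an illustrative example of such a nonlinear conversion (a simple power-law table; the exponent and the 10-bit input range are assumptions), a precomputed lookup table keeps the per-pixel cost low:

```python
import numpy as np

def build_gamma_lut(gamma=1.0 / 2.2, in_bits=10, out_bits=8):
    """Gamma lookup table mapping 10-bit linear codes to 8-bit display codes."""
    x = np.arange(2**in_bits) / (2**in_bits - 1)
    return np.round((x ** gamma) * (2**out_bits - 1)).astype(np.uint8)

lut = build_gamma_lut()
# Applying it to an image of 10-bit codes:
# corrected = lut[image_10bit]
```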
• A video encoder 44 reads out the above data stored in the image memory 61 and encodes it to an NTSC/PAL format, the result of which is displayed on the display unit 16. [0061]
• An image compression block 45 performs compression on a captured image from the image sensor 8 by fetching image data from the image memory 61. The compressed captured image is recorded on a memory card 62 through a memory card driver 46. [0062]
• The memory card 62 is removably loaded in a predetermined part of the camera body 2. [0063]
• FIG. 4 shows a rear face of the digital camera 1. [0064]
• On the rear face of the camera body 2, there are provided the aforementioned display unit 16 and a 4-way switch 35 on the right side of the display unit 16. Using its U, D, L, and R buttons allows for various operations corresponding to the display on the display unit 16, e.g., a choice from selection items. [0065]
• Under the 4-way switch 35 on the rear face of the camera body 2, there are further provided an LCD button 31, an OK button 32, a cancel button 33, and a menu button 34. The LCD button 31 is used for turning on/off a display on the display unit 16. At each push of the LCD button 31, the display unit 16 is switched from on to off, and vice versa. The OK button 32 and the cancel button 33 are used by an operator either to confirm or to cancel a selection of items at various settings. The menu button 34 is used for switching among various setting screens, e.g., a menu selection screen as described later, on the display. [0066]
  • 1-2. State Transitions and Operations
• Now, state transitions and operations of the digital camera 1 will be set forth. The digital camera 1 mainly comes in two operating modes, namely, a “capture” mode and a “playback” mode. The capture mode is a mode of performing processing for image capture: in the shooting standby state, the display unit 16 displays live view images in some instances as described later, and immediately after image capture, the display unit 16 displays the captured image. The playback mode is a mode of performing processing on already-recorded images, e.g., playing back and displaying on the display unit 16 a captured image recorded on the memory card 62. [0067]
• Switching between the capture and the playback modes is done as follows. By the actuation of the menu button 34 or the like, a mode selection screen is displayed on the display unit 16, on which screen the switching between the capture and the playback modes is effected by the actuation of the 4-way switch 35, the OK button 32, and the cancel button 33. [0068]
• FIG. 5 is a state transition diagram of the digital camera 1 in the capture mode. Hereinafter, the state transitions in the capture mode will be set forth. Unless otherwise specified, the operation of each unit is controlled by the camera control CPU 20. [0069]
• At power-on, the digital camera 1 goes into a capture mode with the optical viewfinder, in which mode the digital camera 1 starts to operate with the quick return mirror M1 in the down position as shown in FIG. 2A, the display unit 16 off, and accordingly a live view display described later in the off state (state S1). In this state S1, AF processing is not performed. Thus, a subject image in the optical viewfinder is slightly out of focus; however, rough framing by the optical viewfinder is possible in this state. [0070]
• In the state S1 and a state S6, which will be described later, a menu setting screen appears on the display with a user's press of the menu button 34, on which screen a user makes menu settings (state S2). [0071]
• FIG. 4 shows the menu setting screen displayed on the screen of the display unit 16. As shown in FIG. 4, the menu setting screen allows a user to selectively define an AF method to be applied at a half shutter press of the shutter button 24a (hereinafter also referred to simply as a “half shutter press”). By pressing either the button U or D of the 4-way switch 35, a selection of the AF method at the half shutter press is made from the contrast AF method and the phase difference AF method. [0072]
• If the phase difference AF method is selected in the state S2, a transition to the state S1 occurs, while if the contrast AF method is selected in the state S2, a transition to the state S6 occurs. At this time, the transition from the state S2 to S1 turns off the display unit 16 (i.e., the live view display), while the transition from the state S2 to S6 holds the display unit 16 in the on state and turns on the live view display. [0073]
• When the shutter button 24a is pressed halfway down in the state S1, phase difference AF and exposure adjustments are performed with the quick return mirror M1 in the down position and the live view display off (state S3). Hereinafter, the operation in the state S3 will be set forth in detail. [0074]
• At the half shutter press of the shutter button 24a, as shown in FIG. 2A, the optical path L of light incident from the taking lens portion 4 and the diaphragm 5 changes its direction upward because of the presence of the quick return mirror M1 in the camera body 2. An image is then formed on the focusing screen 10, inverted and scaled down by the pentagonal prism 11, and received by the light measuring sensor 14. The light measuring sensor 14 measures the amount of light, from which the camera control CPU 20 obtains exposure control data. According to the exposure control data, the diaphragm 5 is controlled through the diaphragm driver 21 and the timing generator 22 for applying the drive control signal to the image sensor 8 is controlled, so that the image sensor 8 receives a proper amount of light exposure. [0075]
• At this time, the imaging unit 19 is in its retracted position to avoid mechanical interference with the quick return mirror M1, and the image receiving plane of the image sensor 8 is located behind the position of back focal length. [0076]
• Part of the light incident from the taking lens portion 4 and the diaphragm 5 passes through a half mirror portion in the center of the quick return mirror M1 and travels via the mirror M2 and the fixed mirror M3 toward the distance-measuring sensor 15. Upon receipt of the light, the distance-measuring sensor 15 detects a distance to the subject, based on which the focusing lens in the taking lens portion 4 is driven for autofocusing. [0077]
• Simultaneously with the aforementioned light and distance measuring operations, the optical image, the optical path L of which changed direction at the quick return mirror M1, is scaled down by the prism 11 and the relay lens 12 and then reaches the eyepiece 13. Thus, a user can visually recognize a focused subject image through the eyepiece 13. Although not shown, a user can go back to the state S1 for reframing by releasing the half-pressed shutter button 24a during the state S3, which provides precise framing. [0078]
• If the shutter button 24a is further pressed to its full-pressed position by a user, image capture of three successive frames is performed in a predetermined AF area and a position of the focusing lens is selected which provides the image in sharpest focus out of the three partial images obtained (state S4). Hereinafter, the operation in the state S4 will be set forth in detail. [0079]
• FIG. 6 is an explanatory diagram for explaining the position of the focusing lens in image capture of three successive frames. For image capture of three successive frames, the image sensor 8 obtains partial image data about only the AF area, a central rectangular portion of the image, at different positions of the focusing lens: the position at the time of full shutter press (i.e., the in-focus position determined by the phase difference AF) and positions determined by shifting that in-focus position both forward and backward by the amount of deviation d obtained from the depth of focus (hereinafter referred to as a “front focus position” and a “rear focus position”). That is, a total of three frames of partial images are obtained. The depth of focus is obtained from the position of the focusing lens and the aperture value at the time of full shutter press. The amount of deviation d is previously obtained for each depth of focus and summarized in a table stored in a ROM (not shown) in the camera control CPU 20. From this table, the amount of deviation d corresponding to the depth of focus is obtained for use. [0080]
• Then, AF evaluation values (contrast) for the three sets of partial image data are obtained and compared with each other. From the comparison, the partial image with the maximum AF evaluation value is taken as the image in sharpest focus and the corresponding position of the focusing lens is selected. [0081]
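• A compact sketch of this three-position search follows (purely illustrative: the gradient-based contrast measure and the callable capture_af_area(), which stands in for the mirror-up capture of the AF area, are assumptions):

```python
def af_evaluation(image):
    """AF evaluation value: sum of absolute horizontal differences, one common contrast measure."""
    return sum(abs(int(a) - int(b)) for row in image for a, b in zip(row, row[1:]))

def pick_focus_position(capture_af_area, phase_af_position, deviation_d):
    """Capture the AF area at the phase-difference position and at +/- d,
    then return the lens position whose partial image shows the highest contrast."""
    candidates = [phase_af_position - deviation_d,
                  phase_af_position,
                  phase_af_position + deviation_d]
    scored = [(af_evaluation(capture_af_area(pos)), pos) for pos in candidates]
    return max(scored)[1]
```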
• Hereinafter, the internal operation when the shutter button 24a is fully pressed in the state S4 will be discussed. At a full shutter press of the shutter button 24a, the diaphragm 5 is set to a predetermined aperture value with the focusing lens held at the position to which it was driven by the phase difference AF at the time of half shutter press, and upward rotational displacement of the quick return mirror M1 about the pivot 6 starts as indicated by an open arrow in FIG. 2B. In response to this, the imaging unit 19 is moved forward by the movement mechanism 30 in the direction of the optical axis of the taking lens portion 4. In FIGS. 2A to 2D, the actuator 17 for driving the quick return mirror M1 and the movement mechanism 30 for moving the imaging unit 19 are not shown. [0082]
• When the quick return mirror M1 reaches under the focusing screen 10 as shown in FIG. 2C to complete its mirror-up operation, the forward movement of the imaging unit 19 comes to a stop and the image receiving plane of the image sensor 8 is placed in the position of back focal length. Then, as shown in FIG. 2D, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image passing through the taking lens portion 4 and the diaphragm 5 is directly formed on the image sensor 8 and photoelectrically converted therein. Under timing control of the camera control CPU 20, only a partial image signal in the AF area out of the photoelectrically converted signal is outputted through the buffer. [0083]
  • Then, the depth of focus at the time of full shutter press is obtained in the same manner as above described and the corresponding amount of deviation d of the focusing lens is obtained. According to those values, the focusing lens is moved to its front and rear focus positions at each of which a partial image is captured as above described. Then, the position of the focusing lens which provides the partial image in sharpest focus is selected as above described and the focusing lens is actually moved to the selected position. [0084]
• Next, an image capture operation is performed at the selected position of the focusing lens (state S5). More specifically, as shown in FIG. 2D, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image passing through the taking lens portion 4 and the diaphragm 5 is directly formed on the image sensor 8 and photoelectrically converted therein. The photoelectrically converted signal is then outputted through the buffer. [0085]
• The image data outputted from the image sensor 8 is subjected to predetermined signal processing in the CDS circuit 81, the AGC circuit 82, and the A/D converter 83, fetched into the image processor 40, and written into the image memory 61 in synchronization with readout of the image sensor 8. [0086]
• After the image capture, the quick return mirror M1 is rotated back to its original position, whereby the optical path L again goes toward the focusing screen 10 and the digital camera 1 is placed in the shooting standby state. In response to the rotational return movement of the quick return mirror M1, the imaging unit 19 moves backward to its retracted position in the direction of the optical axis to avoid interference with the rotational movement of the quick return mirror M1. [0087]
• Next, the captured image data is recorded on the memory card 62. More specifically, the image data written into the image memory 61 is subjected to the aforementioned processing such as pixel interpolation, color balance control, and gamma correction in the image processor 40 and is stored again in the image memory 61. The resultant image data is fetched from the image memory 61 and displayed as a captured image on the display unit 16. Simultaneously, this image data is also subjected to image compression in the image compression block 45 and recorded on the memory card 62 through the memory card driver 46. [0088]
• At the completion of the image capture operation, the digital camera 1 returns to the state S1. [0089]
• If the LCD button 31 is pressed in the state S1, the display unit 16 is turned on and the digital camera 1 goes into the capture mode with live view display. As shown in FIG. 2D, the quick return mirror M1 is flipped up, the display unit 16 is turned on to start a live view display, and the contrast AF is performed (state S6). [0090]
• In the state S6, the quick return mirror M1 in the up position allows light from the taking lens portion 4 to reach the image sensor 8. Thereby, image data outputted from the image sensor 8 for each predetermined period of time (e.g., every 1/30th of a second) is once stored in the image memory 61 through the image processor 40. The image data is then read out by the image processor 40 for the aforementioned image processing and stored again in the image memory 61. The video encoder 44 reads out the above data stored in the image memory 61 and encodes it to the NTSC/PAL format, the result of which is displayed on the display unit 16 and produces a live view display. [0091]
• The camera control CPU 20 performs the contrast AF even for the live view display. Here, the contrast AF method is an autofocus method for achieving focus by fetching image data from the image memory 61 to obtain an AF evaluation value (contrast) for the image and driving the focus motor 36 to move the focusing lens to the position with the maximum AF evaluation value. At this time, a known technique such as a “hill-climbing” method or the like can be adopted as a control method for achieving maximum contrast. [0092]
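• A minimal hill-climbing sketch of this idea is given below (the helpers af_evaluation() and move_lens() are hypothetical and the step handling is simplified): the lens keeps stepping in one direction while the evaluation value rises, reverses when it falls, and stops once the step becomes fine enough.

```python
def hill_climb_focus(af_evaluation, move_lens, step=8, min_step=1, max_moves=200):
    """Drive the focusing lens toward the position of maximum AF evaluation value.

    af_evaluation(): returns the contrast of the current image fetched from memory.
    move_lens(delta): moves the focusing lens by delta motor steps (hypothetical).
    """
    best = af_evaluation()
    direction = +1
    for _ in range(max_moves):
        if step < min_step:
            break                          # close enough to the contrast peak
        move_lens(direction * step)
        value = af_evaluation()
        if value > best:
            best = value                   # still climbing: keep the direction
        else:
            direction = -direction         # passed the peak: turn around
            step //= 2                     # and continue with a finer step
    return best
```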
• At a subsequent half shutter press of the shutter button 24a, under conditions almost identical to those in the state S6, exposure adjustments and more precise contrast AF are performed (state S7). The exposure adjustment at this time is accomplished by adjusting the diaphragm 5 through the diaphragm driver 21 and adjusting the amount of light exposure in the image sensor 8, i.e., the charge storage time in the image sensor 8 corresponding to the shutter speed. The contrast AF performs focusing with higher accuracy by dividing the step of moving the focusing lens into smaller segments than in the state S6. This provides more precise framing. [0093]
• At a subsequent full shutter press of the shutter button 24a, an image capture operation is performed based on the contrast AF (state S8). In the image capture operation, as above described, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image from the taking lens portion 4 is directly formed on the image sensor 8 and resultant image data is stored in the image memory 61. This image data is fetched into the image processor 40 for the aforementioned image processing and stored again in the image memory 61. The image data is also recorded on the memory card 62. [0094]
• At the completion of the image capture operation, the digital camera 1 returns again to the state S6 and is enabled for the next image capture. [0095]
• If the LCD button 31 is pressed in the state S6, the display unit 16 and accordingly the live view display are turned off, and the digital camera 1 goes into the capture mode with the optical viewfinder in the aforementioned state S1. [0096]
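• The capture-mode behavior of the first preferred embodiment just described can be condensed into a small state machine; the sketch below is an illustrative summary of FIG. 5 (the dictionary encoding and event names are assumptions, not part of the camera's firmware):

```python
# (state, event) -> next state, for the first preferred embodiment's capture mode
CAPTURE_MODE_TRANSITIONS = {
    ("S1", "menu"):         "S2",  # optical viewfinder -> menu settings
    ("S6", "menu"):         "S2",
    ("S2", "phase_af"):     "S1",  # phase difference AF selected, live view off
    ("S2", "contrast_af"):  "S6",  # contrast AF selected, live view on
    ("S1", "half_press"):   "S3",  # phase difference AF + exposure adjustment
    ("S3", "release_half"): "S1",  # release for reframing
    ("S3", "full_press"):   "S4",  # three-frame contrast check around the AF position
    ("S4", "done"):         "S5",  # image capture at the selected lens position
    ("S5", "done"):         "S1",
    ("S1", "lcd_button"):   "S6",  # live view on, contrast AF
    ("S6", "half_press"):   "S7",  # finer contrast AF + exposure adjustment
    ("S7", "full_press"):   "S8",  # image capture based on the contrast AF
    ("S8", "done"):         "S6",
    ("S6", "lcd_button"):   "S1",  # live view off, back to the optical viewfinder
}

def next_state(state, event):
    """Return the next capture-mode state; unknown events leave the state unchanged."""
    return CAPTURE_MODE_TRANSITIONS.get((state, event), state)
```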
• The above description has provided the state transitions and operations of the digital camera 1. [0097]
• According to this preferred embodiment as has been described, a single lens reflex digital camera performs the contrast AF for live view display while withdrawing the quick return mirror M1 from the imaging optical system comprised of the taking lens portion 4, the diaphragm 5, and the imaging unit 19. The contrast AF can thus be performed responsive to the live view display. Further, the contrast AF for live view display provides precise framing. [0098]
• With the live view display off, the phase difference AF is performed with the quick return mirror M1 placed in the imaging optical system to allow visual recognition of the subject image in the optical viewfinder. Thus, a user can intentionally switch between the live view display and the optical viewfinder, which permits autofocusing even for framing using the optical viewfinder. [0099]
• When the live view display is off, the display unit 16 is turned off. This reduces power consumption. [0100]
• In the capture mode with the optical viewfinder, the phase difference AF is performed at the half shutter press of the shutter button 24a (state S3). This shortens the processing time for autofocusing at the time of half shutter press, thereby preventing the occurrence of a time lag due to autofocusing. [0101]
• When the contrast AF is selected on the menu setting screen, the quick return mirror M1 is withdrawn from the imaging optical system (state S6), while when the phase difference AF is selected on the menu setting screen, the quick return mirror M1 is placed inside the imaging optical system (state S1). Thus, the contrast AF can be performed when high focusing precision is required and the phase difference AF when high-speed autofocusing is required. That is, a most suitable autofocusing method can be selected at a user's discretion. [0102]
  • When the contrast AF method is selected on the menu setting screen, the live view display is turned on. This provides precise framing based on live view images based on the contrast AF. [0103]
• When the display unit 16 is off, the quick return mirror M1 is located inside the imaging optical system and the phase difference AF method is correspondingly selected. This permits framing even while the live view display is off. [0104]
• After actual image recording, the display unit 16 is restored at least to the on or off state of the live view display that was in effect immediately before the actual image recording. This enhances the operability of the digital camera. [0105]
  • Further, the contrast AF performed at the time of actual image recording (full shutter press) achieves precise focus in the actual image recording. [0106]
  • In the capture mode with the optical viewfinder, the phase difference AF is performed at the time of half shutter press and the contrast AF at the time of full shutter press. That is, the contrast AF can be performed at the time of full shutter press that requires high-precision autofocusing. [0107]
  • 2. Second Preferred Embodiment
• Next, a second preferred embodiment of the present invention will be set forth. A digital camera 1 according to the second preferred embodiment is basically identical to that of the aforementioned first preferred embodiment. The following description thus focuses mainly on the parts that differ from the first preferred embodiment. [0108]
• FIG. 7 is a state transition diagram of the digital camera 1 in the capture mode according to the second preferred embodiment. Hereinafter, the state transitions in the capture mode will be set forth in detail. Unless otherwise specified, the operation of each unit is controlled by the camera control CPU 20 (cf. FIG. 3). [0109]
• At power-on, the digital camera 1 goes into the capture mode with the optical viewfinder, in which mode the digital camera 1 starts to operate with the quick return mirror M1 in the down position as shown in FIG. 2A, the display unit 16 off, and accordingly the live view display described later in the off state (state S11). In this state, AF processing is not performed. Thus, a subject image in the optical viewfinder is slightly out of focus; however, rough framing by the optical viewfinder is possible in this state. [0110]
• In the state S11 and a state S16, which will be described later, a menu setting screen appears on the display with a user's press of the menu button 34, on which screen a user makes menu settings (state S12). [0111]
• As previously described, FIG. 4 shows the menu setting screen displayed on the screen of the display unit 16. As shown in FIG. 4, the menu setting screen allows a user to selectively define an AF method to be applied at a half shutter press of the shutter button 24a. By pressing either the button U or D of the 4-way switch 35, a selection of the AF method at the half shutter press is made from the contrast AF method and the phase difference AF method. [0112]
• If the phase difference AF method is selected in the state S12, a transition to the state S11 occurs, while if the contrast AF method is selected in the state S12, a transition to the state S16 occurs. At this time, the transition from the state S12 to S11 turns off the display unit 16 (i.e., the live view display), while the transition from the state S12 to S16 holds the display unit 16 in the on state and turns on the live view display. [0113]
• When the shutter button 24a is pressed halfway down in the state S11, phase difference AF and exposure adjustments are performed with the quick return mirror M1 in the down position and the live view display off (state S13). Hereinafter, the operation in the state S13 will be set forth in detail. [0114]
• At the half shutter press of the shutter button 24a, as shown in FIG. 2A, the optical path L of light incident from the taking lens portion 4 and the diaphragm 5 changes its direction upward because of the presence of the quick return mirror M1 in the camera body 2. An image is then formed on the focusing screen 10, inverted and scaled down by the pentagonal prism 11, and received by the light measuring sensor 14. The light measuring sensor 14 measures the amount of light, from which the camera control CPU 20 obtains exposure control data. According to the exposure control data, the diaphragm 5 is controlled through the diaphragm driver 21 and the timing generator 22 for applying the drive control signal to the image sensor 8 is controlled, so that the image sensor 8 receives a proper amount of light exposure. [0115]
• At this time, the imaging unit 19 is in its retracted position to avoid mechanical interference with the quick return mirror M1, and the image receiving plane of the image sensor 8 is located behind the position of back focal length. [0116]
• Part of the light incident from the taking lens portion 4 and the diaphragm 5 passes through a half mirror portion in the center of the quick return mirror M1 and travels via the mirror M2 and the fixed mirror M3 toward the distance-measuring sensor 15. Upon receipt of the light, the distance-measuring sensor 15 detects a distance to the subject to generate a phase difference detection signal. According to this phase difference detection signal, the camera control CPU 20 drives the focusing lens in the taking lens portion 4 for autofocusing. [0117]
• Simultaneously with the aforementioned light and distance measuring operations, the optical image, the optical path L of which changed direction at the quick return mirror M1, is scaled down by the prism 11 and the relay lens 12 and then reaches the eyepiece 13. Thus, a user can visually recognize a focused subject image through the eyepiece 13. Although not shown, a user can go back to the state S11 for reframing by releasing the half-pressed shutter button 24a during the state S13, which provides precise framing. [0118]
• If the shutter button 24a is further pressed to its full-pressed position by a user and the focal length of the taking lens 3 (which may be a zoom lens) is 50 mm or less on a 135-mm film basis in the state S13, an image capture operation is performed in a state S15. If the focal length is longer than 50 mm, on the other hand, image capture of three successive frames is performed in a predetermined AF area and a position of the focusing lens is selected which provides the image in sharpest focus out of the three partial images obtained (state S14). That is, when the taking lens 3 is a lens for use in telephotography, a lens drive is first conducted based on the phase difference detection signal and then another lens drive is conducted based on the AF evaluation value according to the contrast AF method. [0119]
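• A tiny illustrative sketch of this focal-length-dependent branching follows (the helpers contrast_refine() and capture() are hypothetical stand-ins for the state S14 search and the state S15 exposure; the 50 mm threshold is the one stated above):

```python
def full_press_sequence(focal_length_mm, contrast_refine, capture, telephoto_threshold_mm=50):
    """Full shutter press after phase-difference AF has already run at the half press.

    contrast_refine(): three-frame contrast search around the phase-difference AF position.
    capture(): the actual image capture operation.
    """
    if focal_length_mm > telephoto_threshold_mm:   # telephoto: refine focus by contrast AF first
        contrast_refine()
    capture()
```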
• Hereinafter, the operation in the state S14 will be set forth in detail. [0120]
• As shown in FIG. 6, for image capture of three successive frames, the image sensor 8 obtains partial image data about only the AF area, a central rectangular portion of the image, at different positions of the focusing lens: the position at the time of full shutter press (i.e., the in-focus position determined by the phase difference AF) and positions determined by shifting that in-focus position both forward and backward by the amount of deviation d obtained from the depth of focus (hereinafter referred to as a “front focus position” and a “rear focus position”). That is, a total of three frames of partial images are obtained. The depth of focus is obtained from the position of the focusing lens and the aperture value at the time of full shutter press. The amount of deviation d is previously obtained for each depth of focus and summarized in a table stored in a ROM (not shown) in the camera control CPU 20. From this table, the amount of deviation d corresponding to the depth of focus is obtained for use. [0121]
• Then, AF evaluation values (contrast) for the three sets of partial image data are obtained and compared with each other. From the comparison, the partial image with the maximum AF evaluation value is taken as the image in sharpest focus and the corresponding position of the focusing lens is selected. [0122]
• Hereinafter, the internal operation when the shutter button 24a is fully pressed in the state S14 will be discussed. At a full shutter press of the shutter button 24a, the diaphragm 5 is set to a predetermined aperture value with the focusing lens held at the position to which it was driven by the phase difference AF at the time of half shutter press, and upward rotational displacement of the quick return mirror M1 about the pivot 6 starts as indicated by the open arrow in FIG. 2B. In response to this, the imaging unit 19 is moved forward by the movement mechanism 30 in the direction of the optical axis of the taking lens portion 4. In FIGS. 2A to 2D, the actuator 17 for driving the quick return mirror M1 and the movement mechanism 30 for moving the imaging unit 19 are not shown. [0123]
• When the quick return mirror M1 reaches under the focusing screen 10 as shown in FIG. 2C to complete its mirror-up operation, the forward movement of the imaging unit 19 comes to a stop and the image receiving plane of the image sensor 8 is placed in the position of back focal length. Then, as shown in FIG. 2D, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image passing through the taking lens portion 4 and the diaphragm 5 is directly formed on the image sensor 8 and photoelectrically converted therein. Under timing control of the camera control CPU 20, only a partial image signal in the AF area out of the photoelectrically converted signals is outputted through the buffer. [0124]
  • Then, the depth of focus at the time of full shutter press is obtained in the same manner as above described and the corresponding amount of deviation d of the focusing lens is obtained. According to those values, the focusing lens is moved to its front and rear focus positions at each of which a partial image is captured as above described. Then, the position of the focusing lens which provides the partial image in sharpest focus is selected as above described and the focusing lens is actually moved to the selected position. [0125]
• Next, an image capture operation is performed at the selected position of the focusing lens (state S15). More specifically, as shown in FIG. 2D, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image passing through the taking lens portion 4 and the diaphragm 5 is directly formed on the image sensor 8 and photoelectrically converted therein. The photoelectrically converted signal is then outputted through the buffer. [0126]
• The image data outputted from the image sensor 8 is subjected to predetermined signal processing in the CDS circuit 81, the AGC circuit 82, and the A/D converter 83, fetched into the image processor 40, and written into the image memory 61 in synchronization with readout of the image sensor 8. [0127]
• After the image capture, the quick return mirror M1 is rotated back to its original position, whereby the optical path L again goes toward the focusing screen 10 and the digital camera 1 is placed in the shooting standby state. In response to the rotational return movement of the quick return mirror M1, the imaging unit 19 moves backward to its retracted position in the direction of the optical axis to avoid interference with the rotational movement of the quick return mirror M1. [0128]
• Next, the captured image data is recorded on the memory card 62. More specifically, the image data written into the image memory 61 is subjected to the aforementioned processing such as pixel interpolation, color balance control, and gamma correction in the image processor 40 and is stored again in the image memory 61. The resultant image data is fetched from the image memory 61 and displayed as a captured image on the display unit 16. Simultaneously, this image data is also subjected to image compression in the image compression block 45 and recorded on the memory card 62 through the memory card driver 46. [0129]
• At the completion of the image capture operation, the digital camera 1 returns to the state S11. [0130]
• When the LCD button 31 is pressed in the state S11, the display unit 16 is turned on and the digital camera 1 goes into the capture mode with live view display. As depicted in FIG. 2D, the quick return mirror M1 is flipped up, the display unit 16 is turned on to start a live view display, and the contrast AF is performed (state S16). [0131]
• In the state S16, the quick return mirror M1 in the up position allows light from the taking lens portion 4 to reach the image sensor 8. Thereby, image data outputted from the image sensor 8 for each predetermined period of time (e.g., every 1/30th of a second) is once stored in the image memory 61 through the image processor 40. The image data is then read out by the image processor 40 for the aforementioned image processing and stored again in the image memory 61. The video encoder 44 reads out the above data stored in the image memory 61 and encodes it to the NTSC/PAL format, the result of which is displayed on the display unit 16 and produces a live view display. [0132]
• The camera control CPU 20 performs the contrast AF even for the live view display. Here, the contrast AF method is an autofocus method for achieving focus by fetching image data from the image memory 61 to obtain an AF evaluation value (contrast) for the image and driving the focus motor 36 to move the focusing lens to the position with the maximum AF evaluation value. At this time, a known technique such as a “hill-climbing” method or the like can be adopted as a control method for achieving maximum contrast. [0133]
• At a subsequent half shutter press of the shutter button 24a, under conditions almost identical to those in the state S16, exposure adjustments and more precise contrast AF are performed (state S17). The exposure adjustment at this time is accomplished by adjusting the diaphragm 5 through the diaphragm driver 21 and adjusting the amount of light exposure in the image sensor 8, i.e., the charge storage time in the image sensor 8 corresponding to the shutter speed. The contrast AF performs focusing with higher accuracy by dividing the step of moving the focusing lens into smaller segments than in the state S16. This provides precise framing. [0134]
• At a subsequent full shutter press of the shutter button 24a, an image capture operation is performed based on the contrast AF (state S18). In the image capture operation, as above described, the focal plane shutter 7 opens and closes at a predetermined speed, whereby the optical image from the taking lens portion 4 is directly formed on the image sensor 8 and resultant image data is stored in the image memory 61. This image data is fetched into the image processor 40 for the aforementioned image processing and stored again in the image memory 61. The image data is also recorded on the memory card 62. [0135]
• At the completion of the image capture operation, the digital camera 1 returns again to the state S16 and is enabled for the next image capture. [0136]
• If the LCD button 31 is pressed in the state S16, the display unit 16 and accordingly the live view display are turned off and the digital camera 1 goes into the capture mode with the optical viewfinder in the aforementioned state S11. [0137]
• The above description has provided the state transitions and operations of the digital camera 1. [0138]
• According to this preferred embodiment as has been described, a single lens reflex digital camera is configured to first conduct a lens drive based on the phase difference detection signal according to the phase difference AF method and then conduct another lens drive based on the evaluation value according to the contrast AF method. Thus, a rough in-focus condition can be achieved in a short time by the phase difference AF method and a more precise in-focus condition can be achieved by the contrast AF method. The digital camera 1 of this preferred embodiment can thus achieve an in-focus condition with great accuracy and efficiency. [0139]
• The digital camera 1 is also configured to conduct a lens drive based on the AF evaluation value after the completion of the exposure control. That is, the taking lens can be driven on the basis of the evaluation value which is obtained under the same conditions as actual image recording. This achieves a more precise in-focus condition. [0140]
• In parallel with the exposure control, the digital camera 1 conducts a lens drive based on the phase difference detection signal. This permits an efficient operation of the digital camera 1 and especially an efficient lens drive. [0141]
• Further, when the taking lens is a lens for use in telephotography, the digital camera 1 first conducts a lens drive based on the phase difference detection signal and then conducts another lens drive based on the AF evaluation value. This achieves an in-focus condition with efficiency. [0142]
  • 3. Modifications
• While several examples of the digital camera 1 have been given in the aforementioned preferred embodiments, it is to be understood that the present invention is not limited thereto. [0143]
  • For example, in the aforementioned preferred embodiments, when the shutter button is fully pressed in the capture mode with the optical viewfinder, image capture of three successive frames is performed to achieve the best in-focus condition for image capture. Instead, only a single frame of image may be captured as usual by a normal contrast AF method such as a “hill-climbing” method. [0144]
  • While the invention has been shown and described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is therefore understood that numerous modifications and variations can be devised without departing from the scope of the invention. [0145]

Claims (16)

What is claimed is:
1. A digital camera comprising:
an image sensor for capturing a subject image;
a mirror movable between a first position to enter an optical path from an imaging optical system to said image sensor, and a second position to withdraw from said optical path;
a driver for driving said mirror;
a first detector for detecting an in-focus condition of said imaging optical system on the basis of an image signal from said image sensor;
a display for displaying an image signal obtained by said image sensor; and
a focus controller for when said display provides a display, controlling said driver to move said mirror to said second position and driving said imaging optical system according to a result of detection by said first detector.
2. The digital camera according to
claim 1
, wherein said display displays a live view image.
3. The digital camera according to
claim 1
, further comprising:
a second detector for detecting an in-focus condition of said imaging optical system by a phase difference method;
an operating member for display to turn off said display; and
a display controller for turning off said display upon operation of said operating member for display, wherein when said display controller turns off said display, said focus controller places said mirror at said first position and drives said imaging optical system according to a result of detection by said second detector.
4. The digital camera according to
claim 3
, further comprising:
an operating member for image capture being movable between a first operating position and a second operating position which is a further pressed position from said first operating position, wherein
when said operating member for image capture is at said first operating position, said focus controller drives said imaging optical system according to a result of detection by said second detector.
5. The digital camera according to
claim 1
, further comprising:
an optical viewfinder for allowing visual recognition of a subject when said mirror is at said first position.
6. A digital camera comprising:
an image sensor for capturing a subject image;
a mirror movable between a first position to enter an optical path from an imaging optical system to said image sensor, and a second position to withdraw from said optical path;
a driver for driving said mirror;
a first detector for detecting an in-focus condition of said imaging optical system on the basis of an image signal from said image sensor;
a second detector for detecting an in-focus condition of said imaging optical system by a phase difference method;
a selector for selecting either said first detector or said second detector; and
a driver controller for when said selector selects said first detector to detect an in-focus condition, controlling said driver to move said mirror to said second position and when said selector selects said second detector to detect an in-focus condition, placing said mirror at said first position.
7. The digital camera according to
claim 6
, further comprising:
a display capable of displaying a live view image; and
a display controller for turning on said display when said selector selects said first detector to detect an in-focus condition.
8. The digital camera according to
claim 7
, wherein when said display is turned off, said driver controller places said mirror at said first position.
9. The digital camera according to
claim 8
, wherein after actual image recording, said display controller restores a condition of said display to a condition immediately before said actual image recording.
10. The digital camera according to
claim 6
, further comprising:
an optical viewfinder for allowing visual recognition of a subject when said mirror is at said first position.
11. A digital camera comprising:
an image sensor for capturing a subject image;
a mirror movable between a first position to enter an optical path from an imaging optical system to said image sensor, and a second position to withdraw from said optical path;
a driver for driving said mirror;
a first detector for detecting an in-focus condition of said imaging optical system on the basis of an image signal from said image sensor;
a second detector for detecting an in-focus condition of said imaging optical system by a phase difference method;
an operating member for image capture being movable from a first operating position to a second operating position which is a further pressed position from said first operating position; and
a focus controller for when said operating member for image capture is at said first operating position, placing said mirror at said first position and driving said imaging optical system according to a result of detection by said second detector and when said operating member for image capture is at said second operating position, controlling said driver to move said mirror to said second position and driving said imaging optical system according to a result of detection by said first detector.
12. The digital camera according to
claim 11
, further comprising:
an optical viewfinder for allowing visual recognition of a subject when said mirror is at said first position.
13. A digital camera comprising:
an image sensor for capturing a subject image;
a mirror movable between a first position to enter an optical path from an imaging optical system to said image sensor, and a second position to withdraw from said optical path;
a driver for driving said mirror;
a first detector for detecting an in-focus condition of said imaging optical system on the basis of an image signal from said image sensor;
a second detector for detecting an in-focus condition of said imaging optical system by a phase difference method; and
a focus controller for after driving said imaging optical system according to a result of detection by said second detector with said mirror located at said first position, then controlling said driver to move said mirror to said second position and driving said imaging optical system according to a result of detection by said first detector.
14. The digital camera according to claim 13, further comprising:
an exposure controller for performing a predetermined computation of exposure to obtain a proper exposure value for image capture and exercising exposure control over said image sensor on the basis of said proper exposure value, wherein
said focus controller drives said imaging optical system according to a result of detection by said first detector after completion of the exposure control by said exposure controller.
15. The digital camera according to claim 14, wherein
said focus controller controls a drive of said imaging optical system according to a result of detection by said second detector, in parallel with the exposure control by said exposure controller.
16. The digital camera according to claim 13, wherein,
when a focal length of said imaging optical system is not less than a predetermined value, said focus controller drives said imaging optical system according to a result of detection by said second detector and then drives said imaging optical system according to a result of detection by said first detector.
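
Claims 13 through 16 describe a two-step hybrid sequence: a coarse drive from the phase-difference result with the mirror in the path, run in parallel with the exposure computation (claim 15), followed by a fine drive from the image-sensor signal once exposure control has finished and the mirror has been withdrawn (claim 14); claim 16 gates the coarse step on a long focal length. A minimal sketch under the same hypothetical names as above; the threshold value is an assumption, not taken from the patent.

# Illustrative sketch only (claims 13-16); the threshold value and all
# object names are hypothetical.
import threading

PREDETERMINED_FOCAL_LENGTH_MM = 100  # stand-in for the "predetermined value" of claim 16

def hybrid_autofocus(camera):
    if camera.lens.focal_length_mm >= PREDETERMINED_FOCAL_LENGTH_MM:
        # Coarse step: phase-difference AF with the mirror in the optical
        # path, in parallel with the exposure computation (claims 13 and 15).
        camera.mirror_driver.move(MirrorPosition.IN_PATH)
        exposure = threading.Thread(target=camera.exposure_controller.compute_and_apply)
        exposure.start()
        camera.lens_driver.drive(camera.phase_diff_af.detect())
        exposure.join()  # claim 14: wait until exposure control is complete
    else:
        camera.exposure_controller.compute_and_apply()

    # Fine step: withdraw the mirror and drive the lens from the
    # image-sensor signal (first detector).
    camera.mirror_driver.move(MirrorPosition.WITHDRAWN)
    camera.lens_driver.drive(camera.contrast_af.detect())
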
US09/815,977 2000-03-27 2001-03-22 Digital camera Expired - Lifetime US6453124B2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2000086391A JP3726630B2 (en) 2000-03-27 2000-03-27 Digital still camera
JP2000-086391 2000-03-27
JP2000-093334 2000-03-30
JP2000093334A JP3634232B2 (en) 2000-03-30 2000-03-30 Digital still camera

Publications (2)

Publication Number Publication Date
US20010026683A1 (en) 2001-10-04
US6453124B2 (en) 2002-09-17

Family

ID=26588414

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/815,977 Expired - Lifetime US6453124B2 (en) 2000-03-27 2001-03-22 Digital camera

Country Status (1)

Country Link
US (1) US6453124B2 (en)

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020149689A1 (en) * 2001-04-12 2002-10-17 Masato Sannoh Image pick-up device
US20040155976A1 (en) * 2003-02-12 2004-08-12 Canon Kabushiki Kaisha Image taking apparatus and lens apparatus
EP1458181A1 (en) * 2003-03-14 2004-09-15 Ricoh Company Digital camera with distance-dependent focussing method
US20050185083A1 (en) * 2004-02-20 2005-08-25 Hiroto Okawara Lens controlling apparatus and image-taking apparatus
US20060008265A1 (en) * 2004-07-12 2006-01-12 Kenji Ito Optical apparatus
US20060127080A1 (en) * 2004-12-09 2006-06-15 Konica Minolta Photo Imaging, Inc. Digital single-reflex camera
US20060127072A1 (en) * 2004-12-15 2006-06-15 Pentax Corporation Drive mechanism for camera
US20060210257A1 (en) * 2005-03-21 2006-09-21 Samsung Electronics Co., Ltd. Mobile communication terminal having camera function and method for performing the same
US20060291847A1 (en) * 2005-06-27 2006-12-28 Hiroshi Terada Digital camera
US20070003269A1 (en) * 2005-07-01 2007-01-04 Matsushita Electric Industrial Co., Ltd. Single-lens reflex camera system
US20070064142A1 (en) * 2002-02-22 2007-03-22 Fujifilm Corporation Digital camera
US20070140676A1 (en) * 2005-12-16 2007-06-21 Pentax Corporation Camera having an autofocus system
US20070153114A1 (en) * 2005-12-06 2007-07-05 Matsushita Electric Industrial Co., Ltd. Digital camera
US20070188647A1 (en) * 2006-02-15 2007-08-16 Hitoshi Ikeda Image pickup apparatus with display apparatus, and display control method for display apparatus
US20070292123A1 (en) * 2006-06-20 2007-12-20 Matsushita Electric Industrial Co., Ltd. Digital camera
US20080118238A1 (en) * 2006-11-17 2008-05-22 Sony Corporation Imaging apparatus
US20080138055A1 (en) * 2006-12-08 2008-06-12 Sony Ericsson Mobile Communications Ab Method and Apparatus for Capturing Multiple Images at Different Image Foci
US20080151065A1 (en) * 2006-12-20 2008-06-26 Yoichiro Okumura Camera capable of displaying moving image and control method of the same
US20080198258A1 (en) * 2007-02-15 2008-08-21 Olympus Imaging Corp. Single lens reflex type electronic imaging apparatus
US20080316352A1 (en) * 2007-05-12 2008-12-25 Quanta Computer Inc. Focusing apparatus and method
US20080316351A1 (en) * 2007-06-19 2008-12-25 Samsung Techwin Co., Ltd. Photographing apparatus and photographing method
US7511757B2 (en) 2006-06-20 2009-03-31 Panasonic Corporation Digital camera
US20090185069A1 (en) * 2008-01-22 2009-07-23 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
US20100066895A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US20100066889A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US20100066890A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US20100091169A1 (en) * 2008-10-14 2010-04-15 Border John N Dithered focus evaluation
US20100171868A1 (en) * 2007-05-28 2010-07-08 Panasonic Corporation Camera system and camera body
DE102004017536B4 (en) * 2003-04-08 2011-08-18 Hoya Corp. Automatic sharpening system
US8139130B2 (en) 2005-07-28 2012-03-20 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
US8194296B2 (en) 2006-05-22 2012-06-05 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
US8274715B2 (en) 2005-07-28 2012-09-25 Omnivision Technologies, Inc. Processing color and panchromatic pixels
CN102833482A (en) * 2011-06-14 2012-12-19 三星电子株式会社 Apparatus and method of adjusting automatic focus
US8416339B2 (en) 2006-10-04 2013-04-09 Omni Vision Technologies, Inc. Providing multiple video signals from single sensor
US20130113983A1 (en) * 2007-08-29 2013-05-09 Panasonic Corporation Imaging device and camera
US20130215251A1 (en) * 2012-02-17 2013-08-22 Sony Corporation Imaging apparatus, imaging control program, and imaging method
US20140039257A1 (en) * 2012-08-02 2014-02-06 Olympus Corporation Endoscope apparatus and focus control method for endoscope apparatus
US20140104462A1 (en) * 2010-03-10 2014-04-17 Sony Corporation Imaging apparatus and method of controlling the imaging apparatus
US20140226065A1 (en) * 2013-02-14 2014-08-14 Warner Bros. Entertainment Inc. Video conversion technology
EP3326365A4 (en) * 2015-07-31 2019-03-27 Hsni, Llc Virtual three dimensional video creation and management system and method
US10284800B2 (en) * 2016-10-21 2019-05-07 Canon Kabushiki Kaisha Solid-state image pickup element, method of controlling a solid-state image pickup element, and image pickup apparatus

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4142205B2 (en) * 1999-05-19 2008-09-03 オリンパス株式会社 Electronic still camera
JP2002006208A (en) * 2000-06-23 2002-01-09 Asahi Optical Co Ltd Digital still camera with automatic focus detecting mechanism
JP2003295047A (en) * 2002-04-05 2003-10-15 Canon Inc Image pickup device and image pickup system
US7099575B2 (en) * 2002-07-08 2006-08-29 Fuji Photo Film Co., Ltd. Manual focus device and autofocus camera
JP3964315B2 (en) * 2002-12-03 2007-08-22 株式会社リコー Digital camera
JP2004198701A (en) * 2002-12-18 2004-07-15 Olympus Corp Focus detecting optical system and camera provided with the same
US20050046739A1 (en) * 2003-08-29 2005-03-03 Voss James S. System and method using light emitting diodes with an image capture device
JP4416462B2 (en) * 2003-09-19 2010-02-17 パナソニック株式会社 Autofocus method and autofocus camera
JP2005221602A (en) * 2004-02-04 2005-08-18 Konica Minolta Photo Imaging Inc Image pickup device
JP4042710B2 (en) * 2004-02-25 2008-02-06 カシオ計算機株式会社 Autofocus device and program thereof
JP2006047602A (en) * 2004-08-04 2006-02-16 Casio Comput Co Ltd Camera device
US20070156021A1 (en) * 2005-09-14 2007-07-05 Bradford Morse Remote imaging apparatus having an adaptive lens
US7751700B2 (en) * 2006-03-01 2010-07-06 Nikon Corporation Focus adjustment device, imaging device and focus adjustment method
US7792420B2 (en) * 2006-03-01 2010-09-07 Nikon Corporation Focus adjustment device, imaging device and focus adjustment method
US7657171B2 (en) * 2006-06-29 2010-02-02 Scenera Technologies, Llc Method and system for providing background blurring when capturing an image using an image capture device
JP2008053845A (en) * 2006-08-22 2008-03-06 Olympus Imaging Corp Lens interchangeable camera
JP5007612B2 (en) * 2007-06-29 2012-08-22 株式会社ニコン Focus adjustment device
US7844176B2 (en) * 2007-08-01 2010-11-30 Olympus Imaging Corp. Imaging device, and control method for imaging device
JP2009036986A (en) * 2007-08-01 2009-02-19 Olympus Imaging Corp Photographing device and control method for photographing device
JP5273965B2 (en) * 2007-08-07 2013-08-28 キヤノン株式会社 Imaging device
JP2009069170A (en) * 2007-08-22 2009-04-02 Olympus Imaging Corp Photographing device and control method of photographing device
JP5451259B2 (en) * 2009-08-28 2014-03-26 キヤノン株式会社 Imaging apparatus, control method thereof, and program
JP5653035B2 (en) 2009-12-22 2015-01-14 キヤノン株式会社 Imaging apparatus, focus detection method, and control method
JP2011166378A (en) * 2010-02-08 2011-08-25 Canon Inc Imaging device and control method of the same
JP2011164543A (en) * 2010-02-15 2011-08-25 Hoya Corp Ranging-point selecting system, auto-focus system, and camera
US9554029B2 (en) * 2012-09-11 2017-01-24 Sony Corporation Imaging apparatus and focus control method
CN104796616A (en) * 2015-04-27 2015-07-22 惠州Tcl移动通信有限公司 Focusing method and focusing system based on distance sensor of mobile terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4949117A (en) * 1988-12-23 1990-08-14 Eastman Kodak Company Camera
JPH0743605A (en) 1993-08-02 1995-02-14 Minolta Co Ltd Automatic focusing device
US6037972A (en) * 1994-10-21 2000-03-14 Canon Kabushiki Kaisha Camera
JPH09181954A (en) 1995-12-25 1997-07-11 Canon Inc Electronic still camera and method for focus-controlling the same
US5815748A (en) * 1996-02-15 1998-09-29 Minolta Co., Ltd. Camera

Cited By (121)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6750914B2 (en) * 2001-04-12 2004-06-15 Ricoh Company, Limited Image pick-up device
US20020149689A1 (en) * 2001-04-12 2002-10-17 Masato Sannoh Image pick-up device
US7646420B2 (en) * 2002-02-22 2010-01-12 Fujifilm Corporation Digital camera with a number of photographing systems
US20070064142A1 (en) * 2002-02-22 2007-03-22 Fujifilm Corporation Digital camera
US20040155976A1 (en) * 2003-02-12 2004-08-12 Canon Kabushiki Kaisha Image taking apparatus and lens apparatus
US7847853B2 (en) 2003-02-12 2010-12-07 Canon Kabushiki Kaisha Image taking apparatus and lens apparatus
US7414664B2 (en) * 2003-02-12 2008-08-19 Canon Kabushiki Kaisha Image taking apparatus and lens apparatus
US20080316353A1 (en) * 2003-02-12 2008-12-25 Canon Kabushiki Kaisha Image taking apparatus and lens apparatus
EP1458181A1 (en) * 2003-03-14 2004-09-15 Ricoh Company Digital camera with distance-dependent focussing method
US7365790B2 (en) 2003-03-14 2008-04-29 Ricoh Company, Ltd. Autofocus system for an image capturing apparatus
US20040240871A1 (en) * 2003-03-14 2004-12-02 Junichi Shinohara Image inputting apparatus
DE102004017536B4 (en) * 2003-04-08 2011-08-18 Hoya Corp. Automatic sharpening system
US7471330B2 (en) * 2004-02-20 2008-12-30 Canon Kabushiki Kaisha Lens controlling apparatus and image-taking apparatus with focus control based on first and second signals derived from different focus control methods
US20050185083A1 (en) * 2004-02-20 2005-08-25 Hiroto Okawara Lens controlling apparatus and image-taking apparatus
EP1617652A2 (en) * 2004-07-12 2006-01-18 Canon Kabushiki Kaisha Hybrid autofocus system for a camera
US20060008265A1 (en) * 2004-07-12 2006-01-12 Kenji Ito Optical apparatus
US7469098B2 (en) 2004-07-12 2008-12-23 Canon Kabushiki Kaisha Optical apparatus
EP1617652A3 (en) * 2004-07-12 2006-04-05 Canon Kabushiki Kaisha Hybrid autofocus system for a camera
EP2015567A1 (en) 2004-07-12 2009-01-14 Canon Kabushiki Kaisha Hybrid autofocus system for a camera
CN100464243C (en) * 2004-07-12 2009-02-25 佳能株式会社 Optical apparatus
US20060127080A1 (en) * 2004-12-09 2006-06-15 Konica Minolta Photo Imaging, Inc. Digital single-reflex camera
US7437064B2 (en) * 2004-12-15 2008-10-14 Hoya Corporation Drive mechanism for camera
US20060127072A1 (en) * 2004-12-15 2006-06-15 Pentax Corporation Drive mechanism for camera
US7657166B2 (en) * 2005-03-21 2010-02-02 Samsung Electronics Co., Ltd. Mobile communication terminal having camera function and method for performing the same
US20060210257A1 (en) * 2005-03-21 2006-09-21 Samsung Electronics Co., Ltd. Mobile communication terminal having camera function and method for performing the same
US7593634B2 (en) * 2005-06-27 2009-09-22 Olympus Imaging Corp. Digital camera
US20060291847A1 (en) * 2005-06-27 2006-12-28 Hiroshi Terada Digital camera
US7539408B2 (en) * 2005-07-01 2009-05-26 Panasonic Corporation Single-lens reflex camera system
US20070003269A1 (en) * 2005-07-01 2007-01-04 Matsushita Electric Industrial Co., Ltd. Single-lens reflex camera system
US8330839B2 (en) 2005-07-28 2012-12-11 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
US8711452B2 (en) 2005-07-28 2014-04-29 Omnivision Technologies, Inc. Processing color and panchromatic pixels
US8139130B2 (en) 2005-07-28 2012-03-20 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
US8274715B2 (en) 2005-07-28 2012-09-25 Omnivision Technologies, Inc. Processing color and panchromatic pixels
US7782394B2 (en) 2005-12-06 2010-08-24 Panasonic Corporation Digital camera
US7787045B2 (en) 2005-12-06 2010-08-31 Panasonic Corporation Digital camera having a control portion for light measurement
US8223242B2 (en) 2005-12-06 2012-07-17 Panasonic Corporation Digital camera which switches the displays of images with respect to a plurality of display portions
US20100066845A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US20080068490A1 (en) * 2005-12-06 2008-03-20 Matsushita Electric Industrial Co., Ltd. Digital camera
US8411196B2 (en) 2005-12-06 2013-04-02 Panasonic Corporation Digital camera with movable mirror for AF in live view mode and optical viewfinder mode
EP2282523A2 (en) * 2005-12-06 2011-02-09 Panasonic Corporation Digital Camera
US9071747B2 (en) 2005-12-06 2015-06-30 Panasonic Intellectual Property Management Co., Ltd. Digital camera
US8264596B2 (en) * 2005-12-06 2012-09-11 Panasonic Corporation Digital camera with live view mode
US20070153114A1 (en) * 2005-12-06 2007-07-05 Matsushita Electric Industrial Co., Ltd. Digital camera
US20090303374A1 (en) * 2005-12-06 2009-12-10 Panasonic Corporation Digital camera
US20090310012A1 (en) * 2005-12-06 2009-12-17 Matsushita Electric Industrial Co., Ltd. Digital camera
US8228416B2 (en) 2005-12-06 2012-07-24 Panasonic Corporation Digital camera
US8223263B2 (en) 2005-12-06 2012-07-17 Panasonic Corporation Digital camera
US7646421B2 (en) 2005-12-06 2010-01-12 Panasonic Corporation Digital camera
US20070153113A1 (en) * 2005-12-06 2007-07-05 Matsushita Electric Industrial Co., Ltd. Digital camera
RU2510866C2 (en) * 2005-12-06 2014-04-10 Панасоник Корпорэйшн Digital camera
CN103647894A (en) * 2005-12-06 2014-03-19 松下电器产业株式会社 Digital camera, camera frame, camera system and control method of the digital camera
US8970759B2 (en) 2005-12-06 2015-03-03 Panasonic Intellectual Property Management Co., Ltd. Digital camera
US20100066889A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US20100066890A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US8218071B2 (en) 2005-12-06 2012-07-10 Panasonic Corporation Digital camera
US7408586B2 (en) 2005-12-06 2008-08-05 Matsushita Electric Industrial Co., Ltd. Digital camera
US8111323B2 (en) 2005-12-06 2012-02-07 Panasonic Corporation Digital camera
EP2282520A2 (en) * 2005-12-06 2011-02-09 Panasonic Corporation Digital camera
US20100066895A1 (en) * 2005-12-06 2010-03-18 Panasonic Corporation Digital camera
US7796160B2 (en) 2005-12-06 2010-09-14 Panasonic Corporation Digital camera
US20100265379A1 (en) * 2005-12-06 2010-10-21 Panasonic Corporation Digital camera
US20100271530A1 (en) * 2005-12-06 2010-10-28 Panasonic Corporation Digital camera
US20100271531A1 (en) * 2005-12-06 2010-10-28 Panasonic Corporation Digital camera
US20100271532A1 (en) * 2005-12-06 2010-10-28 Panasonic Corporation Digital camera
US20100295955A1 (en) * 2005-12-06 2010-11-25 Panasonic Corporation Digital camera
US20100302411A1 (en) * 2005-12-06 2010-12-02 Matsushita Electric Industrial Co., Ltd. Digital camera
US20070153112A1 (en) * 2005-12-06 2007-07-05 Matsushita Electric Industrial Co., Ltd. Digital camera
US7620310B2 (en) * 2005-12-16 2009-11-17 Hoya Corporation Camera having an autofocus system
US20070140676A1 (en) * 2005-12-16 2007-06-21 Pentax Corporation Camera having an autofocus system
US8023031B2 (en) 2006-02-15 2011-09-20 Canon Kabushiki Kaisha Image pickup apparatus with display apparatus, and display control method for display apparatus
EP1821520A1 (en) 2006-02-15 2007-08-22 Canon Kabushiki Kaisha Image pickup apparatus with display apparatus, and display control method for display apparatus
US20070188647A1 (en) * 2006-02-15 2007-08-16 Hitoshi Ikeda Image pickup apparatus with display apparatus, and display control method for display apparatus
US8194296B2 (en) 2006-05-22 2012-06-05 Omnivision Technologies, Inc. Image sensor with improved light sensitivity
US8190019B2 (en) 2006-06-20 2012-05-29 Panasonic Corporation Digital camera
US7511757B2 (en) 2006-06-20 2009-03-31 Panasonic Corporation Digital camera
US7885538B2 (en) 2006-06-20 2011-02-08 Panasonic Corporation Digital camera
US20070292123A1 (en) * 2006-06-20 2007-12-20 Matsushita Electric Industrial Co., Ltd. Digital camera
US7668456B2 (en) 2006-06-20 2010-02-23 Panasonic Corporation Digital camera
US20110097068A1 (en) * 2006-06-20 2011-04-28 Panasonic Corporation Digital camera
US8416339B2 (en) 2006-10-04 2013-04-09 Omni Vision Technologies, Inc. Providing multiple video signals from single sensor
US8203642B2 (en) * 2006-11-17 2012-06-19 Sony Corporation Selection of an auto focusing method in an imaging apparatus
US20080118238A1 (en) * 2006-11-17 2008-05-22 Sony Corporation Imaging apparatus
US20080138055A1 (en) * 2006-12-08 2008-06-12 Sony Ericsson Mobile Communications Ab Method and Apparatus for Capturing Multiple Images at Different Image Foci
US7646972B2 (en) * 2006-12-08 2010-01-12 Sony Ericsson Mobile Communications Ab Method and apparatus for capturing multiple images at different image foci
US20080151065A1 (en) * 2006-12-20 2008-06-26 Yoichiro Okumura Camera capable of displaying moving image and control method of the same
CN101207718B (en) * 2006-12-20 2012-04-25 奥林巴斯映像株式会社 Camera capable of displaying moving image and control method of the same
US7940306B2 (en) 2006-12-20 2011-05-10 Olympus Imaging Corp. Camera capable of displaying moving image and control method of the same
US7944499B2 (en) * 2007-02-15 2011-05-17 Olympus Imaging Corporation Single lens reflex type electronic imaging apparatus
US20080198258A1 (en) * 2007-02-15 2008-08-21 Olympus Imaging Corp. Single lens reflex type electronic imaging apparatus
US20080316352A1 (en) * 2007-05-12 2008-12-25 Quanta Computer Inc. Focusing apparatus and method
US8223254B2 (en) 2007-05-28 2012-07-17 Panasonic Corporation Camera body
US20100171868A1 (en) * 2007-05-28 2010-07-08 Panasonic Corporation Camera system and camera body
US8553135B2 (en) 2007-05-28 2013-10-08 Panasonic Corporation Camera system and camera body
US8243187B2 (en) * 2007-06-19 2012-08-14 Samsung Electronics Co., Ltd. Photographing apparatus and photographing method
US20080316351A1 (en) * 2007-06-19 2008-12-25 Samsung Techwin Co., Ltd. Photographing apparatus and photographing method
US8704905B2 (en) * 2007-08-29 2014-04-22 Panasonic Corporation Camera body and camera system
US20130113983A1 (en) * 2007-08-29 2013-05-09 Panasonic Corporation Imaging device and camera
US8004597B2 (en) * 2007-12-05 2011-08-23 Quanta Computer Inc. Focusing control apparatus and method
US20090185069A1 (en) * 2008-01-22 2009-07-23 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
WO2010044831A1 (en) * 2008-10-14 2010-04-22 Eastman Kodak Company Dithered focus evaluation
US20100091169A1 (en) * 2008-10-14 2010-04-15 Border John N Dithered focus evaluation
US8164682B2 (en) 2008-10-14 2012-04-24 Omnivision Technologies, Inc. Dithered focus evaluation
US20140104462A1 (en) * 2010-03-10 2014-04-17 Sony Corporation Imaging apparatus and method of controlling the imaging apparatus
US9716839B2 (en) * 2010-03-10 2017-07-25 Sony Corporation Imaging apparatus and method of controlling the imaging apparatus
CN102833482A (en) * 2011-06-14 2012-12-19 三星电子株式会社 Apparatus and method of adjusting automatic focus
US9229211B2 (en) * 2012-02-17 2016-01-05 Sony Corporation Imaging apparatus, imaging control program, and imaging method
US20130215251A1 (en) * 2012-02-17 2013-08-22 Sony Corporation Imaging apparatus, imaging control program, and imaging method
US20140039257A1 (en) * 2012-08-02 2014-02-06 Olympus Corporation Endoscope apparatus and focus control method for endoscope apparatus
US10682040B2 (en) * 2012-08-02 2020-06-16 Olympus Corporation Endoscope apparatus and focus control method for endoscope apparatus
US9516999B2 (en) * 2012-08-02 2016-12-13 Olympus Corporation Endoscope apparatus and focus control method for endoscope apparatus
US20170071452A1 (en) * 2012-08-02 2017-03-16 Olympus Corporation Endoscope apparatus and focus control method for endoscope apparatus
US9723257B2 (en) 2013-02-14 2017-08-01 Warner Bros. Entertainment Inc. Video conversion technology
US20140226065A1 (en) * 2013-02-14 2014-08-14 Warner Bros. Entertainment Inc. Video conversion technology
US10277862B2 (en) 2013-02-14 2019-04-30 Warner Bros. Entertainment Inc. Video conversion technology
US9241128B2 (en) * 2013-02-14 2016-01-19 Warner Bros. Entertainment Inc. Video conversion technology
US10687017B2 (en) 2013-02-14 2020-06-16 Warner Bros. Entertainment Inc. Video conversion technology
EP3326365A4 (en) * 2015-07-31 2019-03-27 Hsni, Llc Virtual three dimensional video creation and management system and method
US10356338B2 (en) * 2015-07-31 2019-07-16 Hsni, Llc Virtual three dimensional video creation and management system and method
US11108972B2 (en) * 2015-07-31 2021-08-31 Hsni, Llc Virtual three dimensional video creation and management system and method
US10284800B2 (en) * 2016-10-21 2019-05-07 Canon Kabushiki Kaisha Solid-state image pickup element, method of controlling a solid-state image pickup element, and image pickup apparatus
US10609315B2 (en) 2016-10-21 2020-03-31 Canon Kabushiki Kaisha Solid-state image pickup element, apparatus, and method for focus detection

Also Published As

Publication number Publication date
US6453124B2 (en) 2002-09-17

Similar Documents

Publication Publication Date Title
US6453124B2 (en) Digital camera
JP3634232B2 (en) Digital still camera
US8345141B2 (en) Camera system, camera body, interchangeable lens unit, and imaging method
US7609294B2 (en) Image pick-up apparatus capable of taking moving images and still images and image picking-up method
JP4528235B2 (en) Digital camera
TWI400558B (en) Camera device
US7844176B2 (en) Imaging device, and control method for imaging device
JP3726630B2 (en) Digital still camera
JP2002094862A (en) Image pickup apparatus
JP4094458B2 (en) Image input device
JP2001177761A (en) Digital camera
US20050013605A1 (en) Digital camera
US20060055991A1 (en) Image capture apparatus and image capture method
KR100211531B1 (en) Electronic still camera
US7665912B2 (en) Image-taking apparatus and control method thereof
JP2001275033A (en) Digital still camera
CN111868597B (en) Image pickup apparatus, focusing assist method thereof, and recording medium
JP2007096455A (en) Photographing apparatus
JP2007225897A (en) Focusing position determination device and method
JP4013026B2 (en) Electronic camera and image display method during automatic focus adjustment
US6345154B1 (en) Interchangeable lens
JP2009036987A (en) Photographing device and control method for photographing device
JP2001136429A (en) Electronic camera
JP2007282063A (en) Digital single-lens reflex camera
JP5069076B2 (en) Imaging apparatus and continuous imaging method

Legal Events

Date Code Title Description
AS Assignment

Owner name: MINOLTA CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MORIMOTO, YASUHIRO;BUTSUSAKI, TAKERU;YUKAWA, KAZUHIKO;AND OTHERS;REEL/FRAME:011875/0435;SIGNING DATES FROM 20010521 TO 20010529

AS Assignment

Owner name: MINOLTA CO., LTD., JAPAN

Free format text: CORRECTIVE ASSIGNMENT, ADVISING THAT KAZUHIKO YUKUHIKO IS DECEASED AND ADDING KAZUMI YUKAWA AS HIS LEGAL REPRESENTATIVE FOR ASSIGNMENT PREVIOUSLY RECORDED UNDER REEL 011875, FRAME 0435;ASSIGNORS:MORIMOTO, YASUHIRO;BUTSUSAKI, TAKERU;YUKAWA, KAZUMI, LEGAL REPRESENTATIVE OF KAZUHIKO YUKAWA (DECEASED);AND OTHERS;REEL/FRAME:012247/0403;SIGNING DATES FROM 20010521 TO 20010529

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12