USRE46012E1 - Non-contact probe - Google Patents

Non-contact probe

Info

Publication number
USRE46012E1
USRE46012E1 (application US 14/562,093)
Authority
US
United States
Prior art keywords
image
imaging device
images
probe
optical pattern
Prior art date
Legal status
Active, expires
Application number
US14/562,093
Inventor
Nicholas John Weston
Yvonne Ruth Huddart
Current Assignee
Renishaw PLC
Original Assignee
Renishaw PLC
Priority date
Filing date
Publication date
Priority claimed from GBGB0716080.7A
Priority claimed from GBGB0716109.4A
Priority claimed from GBGB0716088.0A
Application filed by Renishaw PLC
Priority to US14/562,093
Application granted
Publication of USRE46012E1
Legal status: Active
Expiration: Adjusted

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines
    • G01B11/007 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines feeler heads therefor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518 Projection by scanning of the object
    • G01B11/2527 Projection by scanning of the object with phase change by in-plane movement of the pattern
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/0057
    • G06T7/0075
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component

Definitions

  • This invention relates to a method and apparatus for measuring an object without contacting the object.
  • Photogrammetry is a known technique for determining the location of certain points on an object from photographs taken at different perspectives, i.e. positions and/or orientations.
  • Photogrammetry comprises obtaining at least two images of an object taken from two different perspectives. For each image the two dimensional coordinates of a feature of the object on the image can be determined. It is then possible, from knowledge of the relative location and orientation of the camera(s) which took the images and the points at which the feature is formed on the images, to determine the three dimensional coordinates of the feature on the object via triangulation.
  • Such a technique is disclosed for example in U.S. Pat. No. 5,251,156 the entire content of which is incorporated into this specification by this reference.
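  • By way of illustration only (this is not the patent's own implementation), the triangulation step described above can be sketched as a linear least-squares computation; the projection matrices and pixel coordinates below are placeholders for values that would in practice come from the calibrated imaging device positions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature seen from two perspectives.

    P1, P2 : 3x4 camera projection matrices for the two perspectives.
    x1, x2 : (u, v) pixel coordinates of the same feature in each image.
    Returns the 3D coordinates of the feature in the common frame.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Solve A X = 0 for the homogeneous point X via SVD.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Placeholder projection matrices and matched pixel coordinates.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[100.0], [0.0], [0.0]])])
print(triangulate(P1, P2, (0.2, 0.1), (0.4, 0.1)))
```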
  • Non-contact optical measuring systems are also known for measuring the topography of a surface. These may typically consist of a projector which projects a structured light pattern onto a surface and a camera, set at an angle to the projector, which detects the structured light pattern on the surface. Height variation on the surface causes a distortion in the pattern. From this distortion the geometry of the surface can be calculated via triangulation and/or phase analysis techniques.
  • the invention provides a method and apparatus in which measurement of an object via photogrammetric techniques and triangulation and/or phase analysis techniques can be performed on images obtained by a common probe.
  • a non-contact measurement apparatus comprising: a probe for mounting on a coordinate positioning apparatus, comprising at least one imaging device for capturing an image of an object to be measured; an image analyser configured to analyse at least one first image of an object obtained by the probe from a first perspective and at least one second image of the object obtained by the probe from a second perspective so as to identify at least one target feature on the object to be measured, and further configured to obtain topographical data regarding a surface of the object via analysis of an image, obtained by the probe, of the object on which an optical pattern is projected.
  • both the position of target features on the object, and the topographical data of the surface of the object are determined by the image analyser using images obtained by the same probe. Accordingly, it is not necessary to have two separate imaging systems for obtaining both the position of target features of an object and the topographical form of the surface of the object.
  • a perspective can be a particular view point of the object.
  • a perspective can be defined by the position and/or orientation of the imaging device relative to the object.
  • the at least one first image and the at least one second image can be obtained by at least one suitable imaging device.
  • suitable imaging devices can comprise at least one image sensor.
  • suitable imaging devices can comprise an optical electromagnetic radiation (EMR) sensitive detector, such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor.
  • Suitable imaging devices can be optically configured to focus light at the image plane.
  • the image plane can be defined by the image sensor.
  • suitable imaging devices can comprise at least one optical component configured to focus optical EMR at the image plane.
  • the at least one optical component comprises a lens.
  • Suitable imaging devices can be based on the pinhole camera model which consists of a pinhole, which can also be referred to as the imaging device's perspective centre, through which optical EMR rays are assumed to pass before intersecting with the image plane.
  • imaging devices that do not comprise a pinhole but instead comprise a lens to focus optical EMR also have a perspective centre and this can be the point through which all optical EMR rays that intersect with the image plane are assumed to pass.
  • the perspective centre can be found relative to the image sensor using a calibration procedure, such as those described in J. Heikkila and O. Silven, “A four-step camera calibration procedure with implicit image correction”, Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition (CVPR '97), and J. G. Fryer, “Camera Calibration” in K. B. Atkinson (ed.), “Close range photogrammetry and machine vision”, Whittles Publishing (1996). Correction parameters, such as those for correcting lens aberrations, can be provided; these are well known and are for instance described in these two documents.
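  • As a hedged illustration of the pinhole model described above, the sketch below projects a point expressed in the camera frame through an assumed perspective centre; the intrinsic parameters are placeholders of the sort a calibration procedure such as those cited would supply, and lens distortion correction is omitted.

```python
import numpy as np

# Illustrative intrinsic parameters (focal lengths and principal point, in
# pixels); real values would come from a calibration procedure.
K = np.array([[1200.0,    0.0, 640.0],
              [   0.0, 1200.0, 480.0],
              [   0.0,    0.0,   1.0]])

def project(point_in_camera_frame):
    """Project a 3D point, expressed in the camera frame, through the
    perspective centre onto the image plane (lens distortion ignored)."""
    x, y, z = point_in_camera_frame
    u = K[0, 0] * x / z + K[0, 2]
    v = K[1, 1] * y / z + K[1, 2]
    return u, v

print(project((0.05, -0.02, 0.5)))   # a point 0.5 units in front of the camera
```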
  • the probe can comprise a plurality of imaging devices.
  • the images analysed by the image analyser are obtained using a common imaging device.
  • the probe can comprise a single imaging device only.
  • the optical pattern is projected over an area of the object.
  • the pattern extends over an area of the object so as to facilitate the measurement of a plurality of points of the object over the imaged area.
  • the pattern is a substantially repetitive pattern.
  • Particularly preferred optical patterns comprise substantially periodic optical patterns.
  • a periodic optical pattern can be a pattern which repeats after a certain finite distance. The minimum distance between repetitions can be the period of the pattern.
  • the optical pattern is periodic in at least one dimension.
  • the optical pattern can be periodic in at least two perpendicular dimensions.
  • Suitable optical patterns for use with the present invention include patterns of concentric circles, patterns of lines of varying colour, shades, and/or tones.
  • the colour, shades and/or tones could alternate between two or more different values.
  • the colour, shade and/or tones could vary between a plurality of discrete values.
  • the colour, shade and/or tones varies continuously across the optical pattern.
  • the optical pattern is a fringe pattern.
  • the optical pattern can be a set of sinusoidal fringes.
  • the optical pattern can be in the infrared to ultraviolet range.
  • the optical pattern is a visible optical pattern.
  • an optical pattern for use in methods such as that of the present invention is also commonly referred to as a structured light pattern.
  • the optical pattern could be projected onto the object via at least one projector.
  • Suitable projectors for the optical pattern include a digital light projector configured to project an image input from a processor device. Such a projector enables the pattern projected to be changed.
  • Suitable projectors could comprise a light source and one or more diffraction gratings arranged to produce the optical pattern.
  • the diffraction grating(s) could be moveable so as to enable the pattern projected by the projector to be changed.
  • the diffraction grating(s) can be mounted on a piezoelectric transducer.
  • the diffraction gratings could be fixed such that the pattern projected by the projector cannot be changed.
  • the projector could comprise a light source and a hologram. Further, the projector could comprise a light source and a patterned slide. Further still, the projector could comprise two mutually coherent light sources.
  • the coherent light sources could be moveable so as to enable the pattern projected by the projector to be changed.
  • the coherent light sources can be mounted on a piezoelectric transducer.
  • the coherent light sources could be fixed such that the pattern projected by the projector cannot be changed.
  • the at least one projector could be provided separately to the probe.
  • the probe comprises the at least one projector.
  • the probe comprises a single projector only.
  • a target feature can be a predetermined mark on the object.
  • the predetermined mark could be a part of the object, for example a predetermined pattern formed on the object's surface.
  • the mark could be attached to the object for the purpose of identifying a target feature.
  • the mark could be a coded “bull's eye”, wherein the “bull's-eye” has a unique central point which is invariant with perspective, surrounded by a set of concentric black and white rings which code a unique identifier.
  • Automatic feature recognition methods can be used to both locate the centre of the target and also decode the unique identifier. By means of such targets the images can be automatically analysed and the coordinates of the “bull's-eye” centre returned.
  • the image analyser could be configured to analyse further images of the object obtained from further known perspectives that are different to the perspectives of the other images. The more images that are analysed, the more accurate and reliable the position determination of the target feature on the object can be.
  • a target feature on the object to be measured can be identified by feature recognition techniques. For example, a Hough Transform can be used to identify a straight line feature on the object.
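  • For example, a straight-line target feature could be picked out with a standard Hough Transform; the sketch below uses OpenCV with arbitrary edge and accumulator thresholds and a hypothetical image file name.

```python
import cv2
import numpy as np

# "object_view.png" is a hypothetical file name for an image of the object.
image = cv2.imread("object_view.png", cv2.IMREAD_GRAYSCALE)
if image is None:
    raise FileNotFoundError("object_view.png not found")

edges = cv2.Canny(image, 50, 150)                   # arbitrary edge thresholds

# Each detected line is returned as (rho, theta): the distance of the line
# from the image origin and the angle of its normal.
lines = cv2.HoughLines(edges, 1, np.pi / 180, 200)  # arbitrary accumulator threshold
if lines is not None:
    for rho, theta in lines[:, 0]:
        print(f"candidate line: rho={rho:.1f}px, theta={np.degrees(theta):.1f}deg")
```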
  • At least one of the at least one first image and at least one second image can be an image of the object onto which an optical pattern is projected.
  • the optical pattern need not be the same as the imaged optical pattern used for obtaining topographical data.
  • the at least one first image and at least one second image are images of the object onto which an optical pattern is projected. This enables topographical data to be obtained from at least one of the at least one first and at least one second images.
  • the image analyser is configured to identify an irregularity in the optical pattern in each of the first and second images as the at least one target feature.
  • target features can be identified without the use of markers placed on the object. This has been found to enable highly accurate measurements of the object to be taken quickly. It has also been found that the method of the invention can require less processing resources to identify points on complex shaped objects than by other known image processing techniques.
  • an irregularity in the optical pattern can also be referred to as a discontinuity in the optical pattern.
  • An irregularity in the optical pattern can be a deformation of the optical pattern caused by a discontinuous feature on the object.
  • a deformation of the optical pattern can, for example, be caused at the boundary between two continuous sections of an object.
  • the boundary could be the edge of a cube at which two faces of the cube meet.
  • a discontinuous feature on the object can be where the gradient of the surface of the object changes significantly. The greater the gradient of the surface relative to the optical pattern projector, the greater the deformation of the optical pattern at that point on the surface.
  • an irregularity could be identified by identifying those points on the object at which the optical pattern is deformed by more than a predetermined threshold.
  • This predetermined threshold will depend on a number of factors, including the size and shape of the object to be measured.
  • the predetermined threshold can be determined and set prior to operation by a user based on the knowledge of the object to be measured.
  • An irregularity can be identified by identifying in an image those points on the object at which the rate of change of the optical pattern is greater than a predetermined threshold rate of change.
  • For instance, in embodiments in which the optical pattern is a periodic optical pattern, an irregularity can be identified by identifying in an image those points on the object at which the rate of change of the phase of the periodic optical pattern is greater than a predetermined threshold rate of change.
  • Where the optical pattern is a fringe pattern, an irregularity can be identified by identifying in an image those points on the object at which the rate of change of the phase of the fringe pattern is greater than a predetermined threshold rate of change.
  • the rate of change of the phase of an optical pattern as imaged when projected onto an object can be identified by creating a phase map from the image, and then looking for jumps in the phase between adjacent points in the phase map above a predetermined threshold.
  • a phase map is a map which contains the phase of a pattern projected onto the object's surface for a plurality of pixels in an image.
  • the phase map could be a wrapped phase map.
  • the phase map could be an unwrapped phase map.
  • Known techniques can be used to unwrap a wrapped phase map in order to obtain an unwrapped phase map.
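  • In the simplest one-dimensional case, unwrapping amounts to adding multiples of 2π wherever neighbouring samples jump by more than π, as in this minimal NumPy sketch (full two-dimensional unwrapping of real phase maps is more involved).

```python
import numpy as np

# One row of a wrapped phase map, values bound to (-pi, pi].
wrapped = np.array([2.9, -3.1, -2.6, -2.1, 3.0, 2.5, 2.0])

# np.unwrap adds the appropriate multiples of 2*pi wherever consecutive
# samples jump by more than pi, giving a continuous phase profile.
unwrapped = np.unwrap(wrapped)
print(unwrapped)
```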
  • a phase map can be created from a single image of the optical pattern on the object. For example, Fourier Transform techniques could be used to create the phase map.
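  • A sketch of one such single-image Fourier Transform approach is given below: the fringe carrier lobe is isolated in the spectrum of an image row and the angle of the filtered signal is taken as the wrapped phase. The carrier bin and window width are assumed known here; in practice they would follow from the projected fringe period.

```python
import numpy as np

def wrapped_phase_from_row(intensity_row, carrier_bin, half_width):
    """Wrapped phase of a fringe pattern estimated from a single image row.

    intensity_row : 1D array of pixel intensities across the fringes.
    carrier_bin   : FFT bin of the fringe carrier frequency (assumed known).
    half_width    : half-width of the band kept around the carrier.
    """
    spectrum = np.fft.fft(intensity_row)
    kept = np.zeros_like(spectrum)
    lo, hi = carrier_bin - half_width, carrier_bin + half_width + 1
    kept[lo:hi] = spectrum[lo:hi]          # keep only the +carrier lobe
    analytic = np.fft.ifft(kept)           # complex fringe signal
    # The result still contains the linear carrier ramp; in practice the
    # phase of a reference plane would be subtracted from it.
    return np.angle(analytic)

# Synthetic example: 16 fringe periods across a 512-pixel row.
x = np.arange(512)
row = 128 + 100 * np.cos(2 * np.pi * 16 * x / 512 + 0.5)
print(wrapped_phase_from_row(row, carrier_bin=16, half_width=8)[:5])
```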
  • a phase map is created from a set of images of the object from substantially the same perspective, in which the position of the optical pattern on the object is different for each image.
  • a phase map can be created using a phase stepping approach. This can provide a more accurate phase map.
  • Phase stepping algorithms are known and are, for example, described in Creath, K., “Comparison of phase measurement algorithms”, Proc. SPIE 680, 19-28 (1986).
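  • With four images whose fringe positions are nominally a quarter of a period apart, one standard four-step formula of the kind compared by Creath gives the wrapped phase per pixel; the sketch below assumes the four images are already registered to one another.

```python
import numpy as np

def wrapped_phase_four_step(i1, i2, i3, i4):
    """Wrapped phase map from four images of the fringe pattern, each shifted
    by a nominal quarter of a fringe period (90 degrees of phase).
    i1..i4 are 2D intensity arrays of identical shape."""
    # For I_k = A + B*cos(phi + (k-1)*pi/2): i4-i2 = 2B*sin(phi), i1-i3 = 2B*cos(phi).
    return np.arctan2(i4 - i2, i1 - i3)    # values bound to (-pi, pi]

# Synthetic check with a known phase of 1.0 rad at every pixel:
phi = np.full((2, 2), 1.0)
frames = [50 + 40 * np.cos(phi + k * np.pi / 2) for k in range(4)]
print(wrapped_phase_four_step(*frames))    # ~1.0 everywhere
```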
  • the method can comprise obtaining a set of first images of the optical pattern on the object from the first perspective.
  • the method can further comprise obtaining a set of second images of the optical pattern on the object from the second perspective.
  • a set of images can comprise a plurality of images of the object from a given perspective.
  • a set of images comprises at least two images, more preferably at least three images, especially preferably at least four images.
  • the position (e.g. phase) of the optical pattern on the object can be different for each image in a set.
  • the image analyser can be configured to process: a set of first images obtained from the first known perspective, the position of the optical pattern on the object being different for each image in the set; and a set of second images obtained from the second known perspective, the position of the optical pattern on the object being different for each image in the set in order to identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor.
  • topographical data can be data indicating the topography of at least a part of the object's surface.
  • the topographical data can be data indicating the height of the object's surface at at least one point on the object, and preferably at a plurality of points across the object.
  • the topographical data can be data indicating the gradient of the object's surface at at least one point on the object, and preferably at a plurality of points across the object.
  • the topographical data can be data indicating the height and/or gradient of the object's surface relative to the image sensor.
  • the topographical data can be obtained via analysing the optical pattern.
  • the topographical data can be obtained via analysing the deformation of the optical pattern. This can be done for example via triangulation techniques.
  • the topographical data can be obtained via analysing the optical pattern using phase analysis techniques.
  • the at least one image processed by the image analyser to obtain topographical data can be a separate image from the at least one first and at least one second images.
  • the topographical data regarding the surface on which the optical pattern is projected can be obtained from at least one of the at least one first and at least one second images.
  • at least one of the at least one first and at least one second images can be an image of the object on which an optical pattern is projected.
  • the image analyser could be configured to generate a phase map from at least one of the plurality of images.
  • the image analyser could be configured to generate a phase map from at least one of the at least one first image and at least one second image.
  • the phase map could be generated by Fourier Transforming one of the plurality of images.
  • the image analyser can be configured to process a set of images in which the position of an optical pattern on the object is different for each image in the set in order to determine the topographical data.
  • a phase map can be created from a set of images of the object from the same perspective, in which the position (e.g. phase) of the optical pattern at the object is different for each image.
  • the image analyser can be configured to process at least one of the first or second sets of images in order to determine the topographical data. Accordingly, the image analyser can be configured to process at least one of: a set of first images obtained from the first perspective, the position of the optical pattern on the object being different for each image in the set; and a set of second images obtained from the second perspective, the position of the optical pattern on the object being different for each image in the set, in order to determine the height variation data. Accordingly, the image analyser can be configured to calculate a phase map from at least one of the set of first images and the set of second images.
  • a wrapped phase map can be used to obtain topographical data.
  • a wrapped phase map can be unwrapped, and the topographical data can be obtained from the unwrapped phase map.
  • the image analyser can be configured to unwrap the wrapped phase map and to obtain the topographical data from the unwrapped phase map.
  • the topographical data could be in the form of height data. As will be understood, height data can detail the position of a plurality of points on the surface.
  • Obtaining topographical data can comprise determining the gradient of the surface.
  • Obtaining topographical data can comprise determining the gradient of the surface relative to the imaging device.
  • Determining the gradient of the surface relative to the imaging device can comprise calculating a phase shift map from the plurality of images.
  • Suitable algorithms for generating a phase shift map from the plurality of images include a Carré algorithm such as that described in Carré, P., “Installation et utilisation du comparateur photoélectrique et interférentiel du Bureau International des Poids et Mesures”, Metrologia 2, 13-23 (1966).
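  • For reference, the Carré algorithm recovers the phase from four frames separated by an equal but unknown phase step; the per-pixel computation is sketched below, with full quadrant recovery (which also uses the signs of the intensity differences) simplified for brevity.

```python
import numpy as np

def carre_phase(i1, i2, i3, i4):
    """Per-pixel wrapped phase via the Carre algorithm: four frames separated
    by an equal but unknown phase step. i1..i4 are same-shape arrays."""
    num = (i1 - i4) + (i2 - i3)
    den = (i2 + i3) - (i1 + i4)
    radicand = (3.0 * (i2 - i3) - (i1 - i4)) * num
    # Quadrant determination is simplified here; a full implementation also
    # uses the signs of (i2 - i3) and (i1 - i4) to place the phase correctly.
    return np.arctan2(np.sqrt(np.clip(radicand, 0.0, None)), den)
```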
  • Determining the gradient of the surface can further comprise obtaining a gradient map based on the phase shift map.
  • the gradient map can be obtained by converting the value of each of the points on a phase shift map to a gradient value.
  • the value of a point in a phase shift map can be converted to a gradient value using a predetermined mapping procedure.
  • a phase shift map can detail the phase shift for a plurality of points on the surface due to the change in position of projected fringes on the object's surface.
  • the phase shift can be bound in a range of 360 degrees.
  • a gradient map can detail the surface gradient relative to the image sensor of a plurality of points on the surface.
  • the method can further comprise integrating the gradient map to obtain height data.
  • height data can detail the position of a plurality of points on the surface relative to the image sensor.
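  • A minimal sketch of that final integration step is given below; it assumes the gradient map holds the surface slope along the image x direction and that the effective pixel spacing on the surface is known, both of which are placeholders rather than values from the patent.

```python
import numpy as np

def height_from_gradient(gradient_map, pixel_pitch):
    """Integrate a per-pixel surface gradient (d height / d x, relative to the
    image sensor) along each image row to obtain relative height data.

    gradient_map : 2D array of surface gradients.
    pixel_pitch  : assumed spacing between adjacent pixels on the surface,
                   in the same units as the desired height.
    """
    # A cumulative sum approximates the integral; heights are relative to the
    # first column of each row, which acts as an arbitrary reference.
    return np.cumsum(gradient_map * pixel_pitch, axis=1)
```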
  • the image analyser can be configured to calculate at least one of a first phase map from the set of first images and a second phase map from the set of second images.
  • Calculating a phase map from a set of images taken from substantially the same perspective, with the position of the optical pattern on the object different in each image, can provide a more accurate and reliable phase map.
  • the image analyser can be configured to determine the topographical data from at least one of the first phase map and the second phase map.
  • the phase maps can be wrapped phase maps.
  • the at least one of a first wrapped phase map and second wrapped phase map can be unwrapped, and the topographical data can be obtained from the unwrapped phase map.
  • the position of the optical pattern could be changed between obtaining each of the images in a set of images by changing the optical pattern emitted by the projector.
  • a projector can comprise a laser beam which is incident on a lens which diverges the beam on to a liquid crystal system to generate at least one fringe pattern on the surface to be measured.
  • a computer can be used to control the pitch and phase of the fringe pattern generated by the liquid crystal system. The computer and the liquid crystal system can perform a phase-shifting technique in order to change the phase of the optical pattern.
  • the position of the optical pattern could be changed by relatively moving the object and the projector.
  • the object and projector could be rotated relative to each other in order to displace the optical pattern on the surface.
  • the object and projector are laterally displaced relative to each other.
  • the object could be moved between obtaining each of the plurality of images.
  • the projector could be moved between obtaining each of the plurality of images.
  • the projector can be configured such that it can project one optical pattern only.
  • the projector could be one in which the pitch or phase of the optical pattern cannot be altered.
  • the object and projector can be moved relative to each other by any amount which provides a change in the position of the projected optical pattern relative to the object.
  • the object and projector are moved relative to each other such that the position of the pattern on the object is at least nominally moved by a non-integral multiple of the period of the pattern.
  • Where the optical pattern is a fringe pattern, the object and projector can be moved relative to each other such that the position of the pattern on the object is at least nominally moved by a non-integral multiple of the fringe period.
  • the object and projector can be moved relative to each other such that the position of the pattern on the object is at least nominally moved by ¼ of the fringe period.
  • the actual distance the projector and object are to be moved relative to each other to obtain such a shift in the pattern on the object can depend on a number of factors including the period of the periodic optical pattern projected and the distance between the object and the projector.
  • relatively moving the projector and object will cause a change in the position of the optical pattern on the object. However, it may appear from images of the optical pattern on the object taken before and after the relative movement that the optical pattern has not moved. This can be referred to as nominal movement. Whether or not the movement is nominal or actual will depend on a number of factors including the form of the optical pattern projected, and the shape and/or orientation of the surface of the object relative to the projector. For instance, the change in position of the optical pattern on a surface for a given movement will be different for differently shaped and oriented surfaces.
  • the projector could be moved such that the position of the optical pattern relative to a predetermined reference plane in the measurement space is changed.
  • the projector could be moved such that the position of the optical pattern relative to a predetermined reference plane in the measurement space is changed by a non-integral multiple of the period of the pattern.
  • the predetermined reference plane could be the reference plane of the image sensor. Again, the shape and/or orientation of the surface of the object can then be determined by effectively comparing the position of the optical pattern on the surface relative to what it would be like at the reference plane.
  • the object and imaging device will be moved relative to each other as a consequence of obtaining a shift in the position of the optical pattern on the object.
  • the amount of relative movement should be sufficiently small such that the perspective of the object obtained by the image sensor in each of the images is substantially the same.
  • the movement is sufficiently small that any change in the perspective between the plurality of images can be compensated for in the step of analysing the plurality of images.
  • the probe can be moved between images by rotating the probe about the imaging device's perspective centre. It has been found that rotating about the imaging device's perspective centre makes it easier to process the images to compensate for any relative movement between the object and imaging device (discussed in more detail below). In particular, it makes matching corresponding pixels across a number of images easier. For instance, matching corresponding pixels is possible using a coordinate transformation which is independent of the distance between the object and the imaging device. Accordingly, it is not necessary to know the distance between the object and imaging device in order to process the images to compensate for any relative movement between them.
  • the image analyser can be configured to: i) identify common image areas covered by each of the images in a set of images.
  • the image analyser can be configured to then ii) calculate the phase map for the set using the common image areas only. Identifying common image areas covered by each of the images in a set of images can comprise adjusting the image coordinates to compensate for relative movement between the object and the imaging device.
  • this application describes a non-contact measurement apparatus, comprising: a probe for mounting on a coordinate positioning apparatus, the probe comprising a projector for projecting an optical pattern onto the surface of an object to be measured, and an image sensor for imaging the optical pattern on the surface of the object; an image analyser configured to analyse at least one first image of an object on which an optical pattern is projected, the first image being obtained from a first known perspective, and at least one second image of the object on which the optical pattern is projected, the second image being obtained from a second known perspective, so as to: a) identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor; and b) determine topographical data regarding the surface on which the optical pattern is projected from at least one of the first and second images.
  • This application also describes in particular a non-contact method for measuring an object located within a measurement space comprising, in any suitable order, the steps of: i) an image sensor obtaining at least a first image of an object on which an optical pattern is projected, the at least first image being obtained from a first perspective; ii) the image sensor obtaining at least a second image of the object on which the optical pattern is projected, the second image being obtained from a second perspective; and iii) analysing the first and at least second images so as to: a) identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor; and b) obtain shape data of the surface on which the optical pattern is projected from at least one of the first and second imaged optical patterns.
  • an image analyser for use in a non-contact measurement apparatus as described above.
  • a non-contact method for measuring an object located within a measurement space using a probe comprising at least one imaging device, the method comprising: the probe obtaining a plurality of images of the object, comprising at least one first image of the object from a first perspective, at least one second image of the object from a second perspective, and at least one image of the object on which an optical pattern is projected; analysing the plurality of images to identify at least one target feature on the object to be measured and to obtain topographical data regarding a surface of the object via analysis of the optical pattern.
  • At least one of the at least one image of the object from a first perspective and at least one image of the object from a second perspective comprises the at least one image of the object on which an optical pattern is projected. Accordingly, the method can comprise obtaining topographical data from at least one of the at least one first image of the object from a first perspective and at least one second image of the object from a second perspective.
  • the method can comprise relatively moving the object and probe between the first and second perspectives. This can be particularly preferred when the probe comprises a single imaging device only.
  • the optical pattern can be projected by a projector that is separate to the probe.
  • the probe can comprise at least one projector for projecting an optical pattern.
  • a non-contact measurement apparatus comprising: a coordinate positioning apparatus having a repositionable head; and a non-contact measurement probe mounted on the head comprising: a projector for projecting an optical pattern onto the surface of an object to be measured; and an image sensor for imaging the optical pattern on the surface of the object.
  • the probe is mounted on a coordinate positioning apparatus. Doing so facilitates the acquisition of images of an object from multiple perspectives through the use of only a single probe device. Further as the probe is mounted on a coordinate positioning apparatus, it can be possible to accurately determine the position and orientation of the probe from the coordinate positioning machine's position reporting features.
  • the coordinate positioning machine could comprise a plurality of encoders for determining the position of relatively moveable parts of the coordinate positioning machine. In this case, the position and orientation of the image sensor could be determined from the output of the encoders.
  • coordinate positioning apparatus include coordinate measuring machines and other positioning apparatus, such as articulating arms and machine tools, on which the position of a tool or other device mounted on them can be determined.
  • the head is an articulating probe head. Accordingly, preferably the probe head can be rotated about at least one axis.
  • the coordinate positioning apparatus is a computer controlled positioning apparatus.
  • the coordinate positioning apparatus could comprise a coordinate measuring machine (CMM).
  • the coordinate positioning apparatus could comprise a machine tool.
  • the non-contact measurement apparatus could further comprise an image analyser configured to determine topographical data regarding the surface on which an optical pattern is projected by the projector from at least one image obtained by the image sensor.
  • the image analyser could be configured as described above.
  • This application also describes a non-contact measurement probe for mounting on a coordinate positioning apparatus, comprising: a projector for projecting an optical pattern onto the surface of an object to be measured; and an image sensor for imaging the optical pattern on the surface of the object.
  • This application further describes a non-contact measurement method comprising: a projector mounted on a head of a coordinate positioning machine projecting an optical pattern onto a surface of an object to be measured; an image sensor imaging the optical pattern on the surface; and an image analyser determining topographical data regarding the surface of the object based on the image and on position information from the coordinate positioning machine.
  • the optical pattern can extend in two dimensions.
  • the optical pattern projected can enable the determination of the topology of the surface of an object in two dimensions from a single image of the optical pattern on the object.
  • the optical pattern can be a substantially full-field optical pattern.
  • a substantially full-field optical pattern can be one in which the pattern extends over at least 50% of the field of view of the image sensor at a reference plane (described in more detail below), more preferably over at least 75%, especially preferably over at least 95%, for example substantially over the entire field of view of the image sensor at a reference plane.
  • the reference plane can be a plane that is a known distance away from the image sensor.
  • the reference plane can be a plane which contains the point at which the projector's and image sensor's optical axes intersect.
  • the reference plane can extend perpendicular to the image sensor's optical axis.
  • the optical pattern could be a set of concentric circles, or a set of parallel lines of alternating colour, shades, or tones.
  • the periodic optical pattern is a fringe pattern.
  • the periodic optical pattern can be a set of sinusoidal fringes.
  • the periodic optical pattern can be in the infrared to ultraviolet range.
  • the periodic optical pattern is a visible periodic optical pattern.
  • a non-contact measurement apparatus comprising: a probe for mounting on a coordinate positioning apparatus, the probe comprising a projector for projecting a structured light pattern onto the surface of an object to be measured, and an image sensor for imaging the structured light pattern on the surface of the object; an image analyser configured to analyse at least one first image of an object on which a structured light pattern is projected, the first image being obtained from a first known perspective, and at least one second image of the object on which the structured light pattern is projected, the second image being obtained from a second known perspective, so as to: a) identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor; and b) determine topographical data regarding the surface on which the structured light pattern is projected from at least one of the first and second images.
  • FIG. 1 shows a schematic perspective view of a coordinate measuring machine on which a probe for measuring an object via a non-contact method according to the present invention is mounted;
  • FIG. 2 illustrates various images of the object shown in FIG. 1 obtained by the probe from three different perspectives
  • FIG. 3 illustrates a plurality of wrapped phase maps for each of the three different perspectives
  • FIG. 4 shows a flow chart illustrating the high-level operation of the apparatus shown in FIG. 1 ;
  • FIG. 5 illustrates the method of capturing a perspective image set
  • FIG. 6 illustrates the method of obtaining fringe shifted images
  • FIG. 7 illustrates the method of analysing the images
  • FIG. 8 illustrates the method of calculating the wrapped phase maps
  • FIG. 9 illustrates a first method for obtaining a height map
  • FIG. 10 illustrates a second method for obtaining a height map
  • FIG. 11 is a schematic diagram of the components of the probe shown in FIG. 1 ;
  • FIG. 12 is a schematic diagram of the positional relationship of the imaging device and projector of the probe shown in FIG. 11 ;
  • FIG. 13 is a schematic diagram of the projector shown in FIG. 11 ;
  • FIG. 14 illustrates a set of fringe shifted images, the position of the fringe on the object being different in each image
  • FIG. 15 illustrates the effect of moving the image sensor relative to the object
  • FIG. 16 illustrates how the gradient of the object surface can be determined from the phase shift
  • FIG. 17 illustrates obtaining fringe shifted images by causing rotation about the image sensor's perspective centre
  • FIG. 18 illustrates the stand-off distance and depth of field of an imaging device.
  • the CMM 2 comprises a base 10 , supporting a frame 12 which in turn holds a quill 14 .
  • Motors (not shown) are provided to move the quill 14 along the three mutually orthogonal axes X, Y and Z.
  • the quill 14 holds an articulating head 16 .
  • the head 16 has a base portion 20 attached to the quill 14 , an intermediate portion 22 and a probe retaining portion 24 .
  • the base portion 20 comprises a first motor (not shown) for rotating the intermediate portion 22 about a first rotational axis 18 .
  • the intermediate portion 22 comprises a second motor (not shown) for rotating the probe retaining portion 24 about a second rotational axis that is substantially perpendicular to the first rotational axis.
  • bearings may also be provided between the moveable parts of the articulating head 16 .
  • measurement encoders may be provided for measuring the relative positions of the base 10 , frame 12 , quill 14 , and articulating head 16 so that the position of the measurement probe 4 relative to a workpiece located on the base 10 can be determined.
  • the probe 4 is removably mounted (e.g. using a kinematic mount) on the probe retaining portion 24 .
  • the probe 4 can be held by the probe retaining portion 24 by the use of corresponding magnets (not shown) provided on or in the probe 4 and probe retaining portion 24 .
  • the head 16 allows the probe 4 to be moved with two degrees of freedom relative to the quill 14 .
  • the combination of the two degrees of freedom provided by the head 16 and the three linear (X, Y, Z) axes of translation of the CMM 2 allows the probe 4 to be moved about five axes.
  • a controller 26 is also provided, comprising a CMM controller 27 for controlling the operation of the CMM 2, a probe controller 29 for controlling the operation of the probe 4, and an image analyser 31 for analysing the images obtained from the probe 4.
  • the controller 26 may be a dedicated electronic control system and/or may comprise a personal computer.
  • the CMM controller 27 is arranged to provide appropriate drive currents to the first and second motors so that, during use, each motor imparts the required torque.
  • the torque imparted by each motor may be used to cause movement about the associated rotational axis or to maintain a certain rotational position. It can thus be seen that a drive current needs to be applied continuously to each motor of the head 16 during use; i.e. each motor needs to be powered even if there is no movement required about the associated rotational axis.
  • FIG. 1 provides only a top level description of a CMM 2 .
  • a more complete description of such apparatus can be found elsewhere; for example, see EP402440 the entire contents of which are incorporated herein by this reference.
  • the probe 4 comprises a projector 40 for projecting, under the control of a processing unit 42, a fringe pattern onto the object 28, and an imaging device 44 for obtaining, under the control of the processing unit 42, an image of the object 28 onto which the fringe pattern is projected.
  • the imaging device 44 comprises suitable optics and sensors for capturing images of the object 28 .
  • the imaging device comprises an image sensor, in particular a CCD defining an image plane 62 .
  • the imaging device 44 also comprises a lens (not shown) to focus light at the image plane 62 .
  • the processing unit 42 is connected to the probe controller 29 and image analyser 31 in the controller unit 26 such that the processing unit 42 can communicate with them via a communication line 46 .
  • the communication line 46 could be a wired or wireless communication line.
  • the probe 4 also comprises a random access memory (RAM) device 48 for temporarily storing data, such as image data, used by the processing unit 42 .
  • the probe 4 need not necessarily contain the processing unit 42 and/or RAM 48 .
  • all processing and data storage can be done by a device connected to the probe 4 , for instance the controller 26 or an intermediate device connected between the probe 4 and controller 26 .
  • the projector's 40 image plane 60 and the imaging device's 44 image plane 62 are angled relative to each other such that the projector's 40 and imaging device's optical axes 61 , 63 intersect at a reference plane 64 .
  • the probe 4 is positioned such that the fringes projected onto the object's surface can be clearly imaged by the imaging device 44 .
  • the projector 40 comprises a laser diode 50 for producing a coherent source of light, a collimator 52 for collimating light emitted from the laser diode 50 , a grating 54 for producing a sinusoidal set of fringes, and a lens assembly 56 for focussing the fringes at the reference plane 64 .
  • the projector could comprise a light source and a mask to selectively block and transmit light emitted from the projector in a pattern.
  • the periodic optical pattern projected by the projector 40 is a set of sinusoidal fringes.
  • other forms of structured light could be projected, such as for example a set of parallel lines having different colours or tones (e.g. alternating black and white lines, or parallel red, blue and green lines), or even for example a set of concentric circles.
  • the operation begins at step 100 when the operator turns the CMM 2 on.
  • the system is initialised. This includes loading the probe 4 onto the articulating head 16, positioning the object 28 to be measured on the base 10, sending the CMM's encoders to a home or reference position such that the position of the articulating head 16 relative to the CMM 2 is known, and also calibrating the CMM 2 and probe 4 such that the position of a reference point of the probe 4 relative to the CMM 2 is known.
  • The capture perspective image set step 104 is performed a plurality of times so that a plurality of image sets are obtained, wherein each set corresponds to a different perspective or view point of the object 28.
  • three sets of images are obtained corresponding to three different perspectives. The process of obtaining a set of images is explained in more detail below with respect to FIG. 5 .
  • the images are analysed at step 106 by the image analyser 31 in the controller 26 .
  • the image analyser 31 calculates from the images a set of three dimensional (“3D”) coordinates relative to the CMM 2 which describe the shape of the object 28 .
  • the method of analysing the images will be described in more detail below with reference to FIG. 7 .
  • the 3D coordinates are then output at step 108 as a 3D point cloud.
  • the 3D point cloud could be stored on a memory device for later use.
  • the 3D point cloud data could be used to determine the shape and dimensions of the object and compare it to predetermined threshold data to assess whether the object 28 has been made within predetermined tolerances.
  • the 3D point cloud could be displayed on a graphical user interface which provides a user with virtual 3D model of the object 28 .
  • The operation ends at step 110 when the system is turned off.
  • a subsequent operation could be begun by repeating steps 104 to 108 .
  • the user might want to obtain multiple sets of measurement data for the same object 28 , or to obtain measurement data for a different object.
  • the process 104 of capturing an image set for a perspective begins at step 200 at which point the probe 4 is moved to a first perspective.
  • the user can move the probe 4 under the control of a joystick (not shown) which controls the motors of the CMM 2 so as to move the quill 14 .
  • the first (and subsequent) perspective could be predetermined and loaded into the CMM controller 27 such that during the measurement operation the probe 4 is automatically moved to the predetermined perspectives.
  • the user could physically drag the probe 4 to the perspectives, wherein the positioning apparatus monitors the position of the probe 4 via, for example, encoders mounted on the moving parts of the apparatus.
  • an initialising image is obtained at step 202 .
  • the initialising image is sent back to the image analyser 31 and, at step 204, the image is analysed for image quality properties. This can include, for example, determining the average intensity of light and contrast of the image and comparing them to predetermined threshold levels to determine whether the image quality is sufficient to perform the measurement processes. For example, if the image is too dark then the imaging device 44 or projector 40 properties could be changed so as to increase the brightness of the projected fringe pattern and/or adjust the exposure time or gain of the imaging device 44.
  • the initialising image will not be used in subsequent processes for obtaining measurement data about the object 28 and so certain aspects of the image, such as the resolution of the image, need not be as high as that for the measurement images as discussed below.
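  • A crude sketch of such a quality check is shown below; the brightness and contrast thresholds are arbitrary illustrative values, not values taken from the patent.

```python
import numpy as np

MIN_MEAN = 40.0       # illustrative brightness threshold (8-bit grey levels)
MIN_CONTRAST = 20.0   # illustrative contrast threshold

def initialising_image_ok(image):
    """Crude quality gate on an initialising image: average intensity and a
    simple percentile-based contrast measure against preset thresholds."""
    mean_intensity = float(np.mean(image))
    contrast = float(np.percentile(image, 95) - np.percentile(image, 5))
    return mean_intensity >= MIN_MEAN and contrast >= MIN_CONTRAST
```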
  • a light sensor such as a photodiode, separate to the imaging device could be provided in the probe to measure the amount of light at a perspective position, the output of the photodiode being used to set up the projector 40 and/or imaging device 44 .
  • the first measurement image is obtained at step 206 .
  • a measurement image is one which is used in the “analysis images” process 106 described in more detail below.
  • Obtaining the first measurement image involves the probe controller 29 sending a signal to the processing unit 42 of the probe 4 such that the processing unit 42 then operates the projector 40 to project a fringe pattern onto the object 28 and for the imaging device 44 to simultaneously capture an image of the object 28 with the fringe pattern on it.
  • the first measurement image is sent back to the image analyser 31 and at step 208 , the first measurement image is again analysed for image quality properties. If the image quality is sufficient for use in the “analysis images” process 106 described below, then control is passed to step 210 , otherwise control is passed back to step 204 .
  • fringe shifted images are obtained for the current perspective.
  • Fringe shifted images are a plurality of images of the object from substantially the same perspective but with the position of the fringes being slightly different in each image. The method of this step is described in more detail below with respect to FIG. 6.
  • the capture perspective image set process 104 is repeated a plurality of times for a plurality of different perspectives.
  • the capture perspective image set process is performed three times, for first, second and third perspectives.
  • the probe 4 is moved to each perspective either under the control of the user or controller as explained above.
  • the fringes projected on the object 28 are shifted by physically moving the probe 4 by a small distance in a direction such that the position of the fringes on the object 28 are different from the previous position.
  • As the probe 4 is moved, the projector 40 within it, and hence the projector's optical axis 61, will also be shifted relative to the object 28. This is what provides the change in position of the fringes on the object 28.
  • the probe 4 is moved in a direction that is parallel to the imaging device's 44 image plane and perpendicular to the length of the fringes.
  • the fringe shifting could be achieved by rotating the probe 4.
  • the probe 4 could be rotated about an axis extending perpendicular to the projector's image plane 60 .
  • the probe could be rotated about an axis extending perpendicular to the imaging device's 44 image plane.
  • the probe 4 can be rotated about the imaging device's 44 perspective centre. This is advantageous because this ensures that the perspective of the features captured by the imaging device 44 across the different images will be the same. It also enables any processing of the images to compensate for relative movement of the object and image sensor to be done without knowledge of the distance between the object and image sensor.
  • the probe 4 is located at a first position (referred to by reference numeral 4 ′) relative to an object 70 to be inspected.
  • the probe's projector 40 is at a first position (referred to by reference numeral 40 ′) which projects a fringe pattern illustrated by the dotted fringe markings 72 ′ on the object 70 .
  • An image 74 of the object with the fringe markings 72 ′ is captured by the imaging device 44 which is at a first position referred to by reference numeral 44 ′.
  • the probe 4 is then moved to a second position, referred to by reference numeral 4 ′′, by rotating the probe 4 relative to the object 70 about the imaging device's perspective centre.
  • an imaging device's perspective centre is the point through which all light rays that intersect with the image plane are assumed to pass.
  • the perspective centre is referred to by reference numeral 76 .
  • the projector, at its second position, is referred to by reference numeral 40″.
  • the new position of the fringe pattern on the object 70 is illustrated by the striped fringe markings 72 ′′ on the object 70 .
  • An image 74 of the object is captured by the imaging device at its second position 44 ′′.
  • the perspective the imaging device 44 has of the object 70 does not change between the positions. Accordingly, for example, features that are hidden due to occlusion in one image will also be hidden due to occlusion in the second.
  • Rays 78 illustrate the view the imaging device 44 has of the tall feature 80 on the object.
  • Because the imaging device 44 is rotated about its perspective centre, the rays 78 are identical for both positions, and so only the location of the feature on the imaging device 44 changes between the positions, not the form of the feature itself.
  • rotating about the perspective centre can be advantageous as the image sensor's perspective of the object does not change thereby ensuring that the same points on the object are visible for each position.
  • For any given point on the object, the distance between its image points before and after the relative rotation of camera and object is independent of the distance to the object. That is, for an unknown object, if the camera is rotated about its own perspective centre it is possible to predict, for each imaged point before the rotation, where it will be imaged after the rotation.
  • the position of an image point after the rotation depends on the position of the initial image point, the angle (and axis) of rotation, and the internal camera parameters—all known values. Accordingly, as is described in more detail below, rotating about the perspective centre allows the relative motion to be compensated for without knowing the distance to the object.
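  • This prediction can be written as a pure-rotation homography: with internal parameters K and rotation R, an image point x maps to x' ∝ K·R·K⁻¹·x, with no dependence on object distance. A sketch under that convention is given below.

```python
import numpy as np

def remap_after_rotation(u, v, K, R):
    """Predict where a pixel (u, v), imaged before a rotation of the camera
    about its perspective centre, will appear after that rotation.

    K : 3x3 matrix of internal camera parameters.
    R : 3x3 rotation taking directions in the first camera frame to the
        second camera frame.
    No object distance is needed anywhere in this computation.
    """
    H = K @ R @ np.linalg.inv(K)          # pure-rotation homography
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```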
  • the probe 4 is moved a distance corresponding to a fringe shift of ¼ period at the point where the imaging device's 44 optical axis 63 intersects the reference plane 64.
  • the actual distance the probe 4 is moved will depend on the period of the fringes projected and other factors such as the magnification of the projector 40 .
  • Once the probe 4 has been shifted, another measurement image is obtained at step 302.
  • the steps of shifting the probe 300 and obtaining a measurement image 302 are repeated two more times. Each time, the probe is shifted so that for each measurement image the position of the fringe pattern on the object is different from that in all previous images. Accordingly, at the end of the obtain fringe shifted images process 210, four images of the object have been obtained for a given perspective, with the position of the fringe pattern on the object being slightly different in each image.
  • Row A shows the view of the object 28 at each of the three perspectives with no fringes projected onto it.
  • Row B illustrates, for each of the first, second and third perspectives, the image 1000 that will be obtained by the imaging device 44 at step 206 of the process for capturing a perspective image set 104.
  • Schematically shown behind each of those images 1000 are the fringe shifted images 1002 , 1004 and 1006 which are obtained during execution of steps 300 and 302 for each of the first, second and third perspectives.
  • FIGS. 14(a) to 14(d) show an example of the images 1000 to 1006 obtained for the first perspective.
  • the relative position of the object and imaging device has moved slightly between obtaining each image in an image set for a perspective, and this needs to be taken into consideration and/or compensated for during processing of the images as described in more detail below (especially as described in connection with FIG. 8 ).
  • the image analyser 31 will have a set of images 1000 - 1006 for each of the first, second and third perspectives.
  • a wrapped phase map is a map which contains the phase of the fringes projected onto the object's surface for a plurality of pixels in one of the measurement images in a perspective image set, where the phase angle is bound within a range of 360 degrees.
  • a wrapped phase map is obtained using each of the four phase shifted images for that perspective in a particular order.
  • the four wrapped phase maps for a given perspective are obtained by using each of the four phase shifted images in different orders.
  • the method for obtaining a wrapped phase map will be explained in more detail below with reference to FIG. 8 .
  • columns X, Y and Z illustrate for each of the different perspectives four different wrapped phase maps 1010 , 1012 , 1014 and 1016 .
  • Each of those wrapped phase maps for a given perspective has been calculated using a unique order of the four different images 1000 to 1006 for that perspective.
  • Four different wrapped phase maps 1010 - 1016 for each perspective are calculated in order to be able to distinguish between those discontinuities caused by features on the object 28 and those discontinuities caused by the wrapping of the phase, as explained in more detail below.
  • a feature such as an edge or corner on the object 28 causes a discontinuity in the fringe pattern.
  • edge 30 on the object 28 causes a discontinuity in the fringe pattern along line 32 in the image of the object 28 with the fringe projected on it. Accordingly, it is possible to identify features of the object 28 by identifying discontinuities in the fringe pattern.
  • discontinuities in the fringe pattern are identified for each of the perspectives. This is achieved by identifying discontinuities in each of the wrapped phase maps.
  • a discontinuity in a wrapped phase map is identified by comparing the phase value of each pixel to the phase values of adjacent surrounding pixels. If the difference in the phase value between adjacent pixels is above a threshold level, then one of those pixels identifies a discontinuity point. As will be understood, it is not important which one of those pixels is selected as the discontinuity point so long as the selection criterion is consistent for the selection of all discontinuity points, e.g. always select the pixel to the left or to the top of the difference, depending on whether the differences between adjacent pixels are being calculated in the x or y direction along the image.
  • the positions of the discontinuities can be refined if required using image processing techniques, for example by looking at the gradient of the phase, or the gradient of the intensities in the measurement images in the surrounding region, in order to find the location of the discontinuity to sub-pixel accuracy, for example as described in J. R. Parker, “Algorithms for image processing and computer vision”, John Wiley and Sons, Inc (1997).
  • the preferred threshold level depends on a number of factors including the object shape, level of noise in the image and period of the fringe pattern.
  • the threshold level could be set by a user prior to the operation or could be calculated from an analysis of the image itself.
  • a discontinuity will be identified between adjacent pixels at point 34 due to the difference in the phase value caused by the distortion along line 32 of the fringe due to the edge 30 .
  • This discontinuity will also be identified in the other wrapped phase maps 1012 , 1014 and 1016 at the same point 34 .
  • there is also a jump in the wrapped phase map 1010 for the first perspective at point 36, where the phase values wrap around from 360 degrees to 0 degrees (illustrated by the dark pixels and light pixels respectively).
  • the phase value for adjacent pixels will jump significantly at point 36 due to the phase map being wrapped.
  • a discontinuity would have been identified at point 36 in the first wrapped phase map 1010 for the first perspective.
  • however, in the other wrapped phase maps 1012 to 1016 a discontinuity would not have been identified at that same point 36.
  • this is because the different wrapped phase maps have been calculated using different orders of the fringe shifted images 1000 to 1006, thereby ensuring that the phase wrapping in the wrapped phase maps occurs at different points. Accordingly, as the discontinuity identified at point 36 in the first wrapped phase map 1010 is not also identified in the other wrapped phase maps 1012 to 1016, that discontinuity can be discarded.
  • as the discontinuity at point 34 in the first wrapped phase map 1010 has been confirmed by discontinuities identified at the same point 34 in all the other wrapped phase maps 1012 to 1016, point 34 is identified as a real discontinuity, i.e. a discontinuity caused by a feature on the object 28, rather than as a result of phase wrapping.
  • corresponding discontinuity points between each of the perspectives are identified.
  • Corresponding discontinuity points are those points in the wrapped phase maps which identify a discontinuity caused by the same feature on the object 28 .
  • discontinuity point 38 on each of the first wrapped phase maps 1010 for each of the first, second and third perspectives all identify the same corner 39 on the object 28 .
  • Corresponding discontinuity points can be determined by known matching techniques and, for example, utilising epipolar geometry. Such known techniques are described, for example in A. Gruen, “Least squares matching: a fundamental measurement algorithm” in K. B. Atkinson (ed.), “Close range photogrammetry and machine vision”, Whittles Publishing (2001).
  • the correlated discontinuity points can then be used as target points, the 3D coordinates of which relative to the probe 4 can be determined at step 408 by known photogrammetry techniques, such as those described in, for example, M. A. R Cooper with S. Robson, “Theory of close-range photogrammetry” in K. B. Atkinson (ed.), “Close range photogrammetry and machine vision”, Whittles Publishing (2001).
  • after step 408 a number of discrete points on the object 28 will have been identified and their positions relative to the probe 4 measured.
  • a height map for a continuous section of the object 28 is calculated.
  • a height map provides information on the height of the surface above a known reference plane 64 relative to the probe 4.
  • a continuous section is an area of the object enclosed by discontinuous features, e.g. the face of a cube which is enclosed by four edges. Continuous sections can be identified by identifying those areas in the wrapped phase map which are enclosed by discontinuity points previously identified in steps 402 to 406 .
  • the height map provides measurement data on the shape of the surface between those discrete points. Methods for obtaining the height map for a continuous section are described below in more detail with respect to FIGS. 9 and 10. Step 410 could be performed a plurality of times for different continuous sections for one or more of the different perspectives.
  • the unwrapped phase map is correct only to some unknown multiple of 2π radians, and therefore the height above the reference plane 64 may be wrong by whatever height corresponds to this unknown phase difference. This is often called 2π ambiguity.
  • the measured 3D coordinates of the real discontinuities obtained in step 408 are used in order to resolve these ambiguities.
  • the 3D coordinates of the real discontinuity points obtained in step 408 and the height map data obtained in step 410 provide the position of the object relative to a predetermined reference point in the probe 4 . Accordingly, at step 412 , these coordinates are converted to 3D coordinates relative to the CMM 2 . This can be performed using routine trigonometry techniques as the relative position of the CMM 2 and the reference point in the probe 4 is known from calibration, and also because the position and orientation of the probe 4 relative to the CMM 2 at the point each image was obtained was recorded with each image.
  • Calculating a wrapped phase map comprises calculating the phase for each pixel for one of a set of fringe-shifted images. This can be done using various techniques, the selection of which can depend on various factors including the method by which the fringe-shifted images are obtained. Standard phase-shifting algorithms rely on the relative position between the object and imaging device 44 being the same across all of the fringe-shifted images. However, if either of the methods described above (e.g. moving the probe 4 laterally or rotating it about the imaging device's perspective centre) is used to obtain the fringe-shifted images, then the imaging device 44 will have moved a small distance relative to the object.
  • a given pixel in each image will be identifying the intensity of a different point on the object. Accordingly, if standard phase-shifting algorithms are to be used, it is necessary to identify, across all of the fringe shifted images, which pixels correspond to the same point on the object, and then to compensate for this.
  • One way of doing this when the imaging device 44 has moved laterally is to determine by how much and in what direction the imaging device 44 has traveled between each image, and then cropping the images so that each image contains image data common to all of them. For example, if the movement of the imaging device 44 between two images means that a point on an object has shifted five pixels in one dimension, then the first image can be cropped to remove five pixel widths' worth of data (an illustrative sketch of this common-region cropping is given at the end of this list).
  • FIG. 15 schematically illustrates corresponding rows of pixels for each of the first 1000 , second 1002 , third 1004 and fourth 1006 images.
  • point X on the object 28 is imaged by the 7th pixel from the left for the first image 1000, the 5th pixel from the left for the second image 1002, the 3rd pixel from the left for the third image 1004 and the 4th pixel from the left for the fourth image 1006.
  • An effective way of compensating for the relative movement of image sensor and object 28 is to crop the image data such that each image 1000 to 1006 contains data representing a common region, such as that highlighted by window 51 in FIG. 15.
  • Cropping the images is one example of a coordinate transformation, where the transformation is a linear function. This can be most accurate in situations where the distance to the object is known, or, for instance, where the stand-off distance is large compared to the depth of the measuring volume.
  • the stand-off distance is the distance from the imaging device's perspective centre 76 to the centre of the imaging device's measurement volume and the depth of field 65 or depth of measurement volume is the range over which images recorded by the device appear sharp.
  • the stand-off distance is the nominal distance from the probe 4 to the object to be measured. For instance, if the ratio of stand-off distance to depth of measuring volume is around 10:1 then there can be an error of up to 10% in the compensation for some pixels.
  • the most appropriate coordinate transformation to compensate for relative motion of the imaging device and the object can depend, in general on the distance to the object and the actual motion. However, it has been found that if the motion is rotation about the imaging device's 44 perspective centre then the coordinate transformation that best compensates for the motion is independent of the unknown distance to the object. This is due to the geometry of the system and the motion. Furthermore, this enables accurate compensation to be performed even if the stand-off distance is not large compared to the depth of the measuring volume, for instance in situations in which the ratio of stand-off distance to depth of measuring volume is less than 10:1, for example less than 5:1, for instance 1:1. Accordingly, this enables measurement of an object to be performed even when the probe is located close to the object.
  • the next step 502 involves using a phase-shifting algorithm to calculate the wrapped phase at each pixel.
  • a suitable phase-shifting algorithm not requiring a known phase shift, for instance the Carré algorithm, may be used to calculate the wrapped phase, phase shift and modulation amplitude (an illustrative sketch of such a calculation is given at the end of this list).
  • the process for calculating a wrapped phase map 400 is repeated three further times for each perspective image set, each time using the phase shifted images in a different order, so as to obtain four wrapped phase maps for each perspective. Accordingly, the process for calculating a wrapped phase map 400 is performed twelve times in total.
  • a first process for obtaining the height map 410 will now be described with reference to FIG. 9 .
  • the method involves at step 600 unwrapping the continuous section of one of the phase maps by adding integer multiples of 360 degrees to the wrapped phase of individual pixels as required to remove the discontinuities found due to the phase calculation algorithm.
  • the method then involves converting the unwrapped phase map to a height map for that continuous section at step 602 .
  • the phase for a pixel is dependent on the relative height of the surface of the object. Accordingly, it is possible, at step 602 to create a height map for the continuous section from that phase by directly mapping the phase value of each pixel to a height value using a predetermined mapping table and procedure.
  • h is the distance from the imaging device's 44 perspective centre to the object point imaged at x
  • Δh is the change in this distance after translation ΔX.
  • a is the known direction of the imaging device's optic axis
  • Xc is the position of the perspective centre, which is also known.
  • the change in h due to the motion of the imaging device 44 only is equal to ΔX·a. If this quantity is zero, so that the motion is perpendicular to the imaging device axis and parallel to the image plane, then any remaining change in h must be due to the object shape.
  • the change in h is actually recorded as a change in phase, Δφ, where, again, this will consist of a component caused by the shape of the object, and a component caused by any motion of the imaging device parallel to its axis.
  • the Carré algorithm is used to calculate, for each pixel in a given image in an image set, the phase, phase shift and modulation amplitude from the four phase-shifted images.
  • the Carré algorithm assumes that the four shifts in phase are equal. This will be the case, for instance, if the motion used is a translation and the surface is planar. If this is not the case then a good approximation can be obtained by choosing motion that is small enough that the surface gradient does not vary significantly over the scale of the motion.
  • phase data can be converted to height data.
  • phase shift data can be converted to gradient data and subsequently to height data using the method described below in connection with FIG. 10 .
  • the above described method provides optimum results when the object's reflectivity and surface gradient is substantially constant on the scale of the relative motion. Accordingly, it can be preferred that the motion between the images in an image set is small. Areas of the surface at too low or too high a gradient relative to the imaging device, or with a high degree of curvature, can be detected by inspecting the modulation amplitude returned by the Carré algorithm, and can subsequently be measured by changing the relative motion used to induce the phase shift and if necessary by viewing the object from a different perspective.
  • a Carré algorithm provides both phase and phase shift data for each pixel in an image.
  • the methods described above in connection with FIG. 9 use the phase data to obtain the height data. However, it is also possible to obtain the height information using the phase-shift data.
  • a second process for obtaining the height map 410 will now be described with reference to FIG. 10 .
  • This method begins at step 700 by, for a continuous section (which is identifiable from the discontinuities previously identified as explained above), calculating a phase shift map using a Carré algorithm on all of the images in a perspective image set.
  • the phase shift for a pixel is dependent on the gradient of the surface of the object and how far away the object is from the probe 4 .
  • at step 702 it is possible to create a gradient map for the continuous section from that phase shift by directly mapping the phase shift value of each pixel to a gradient value using a predetermined mapping table and procedure.
  • the gradient map is then integrated in order to get a height map for the continuous surface relative to the probe 4 .
  • the measured 3D coordinates of the real discontinuities obtained in step 408 are used in order to resolve the constant of integration to find the height above the reference plane 64 .
  • the projector may consist simply of a grating, light source, and focussing optics. There is no need for any moving parts within the projector or for a programmable projector—only one pattern is required to be projected. Furthermore, no information about the distance to the object is required, except that it (or a section of it) is within the measuring volume—there is no requirement to have a large stand-off distance compared to the measurement volume. Furthermore, the motion between the object and probe unit need not necessarily be in any particular direction, and may be produced by a rotation rather than a translation or a combination of the two.
  • the probe is mounted on a mounting structure equivalent to the quill of a CMM.
  • This invention is also suitable for use with planning the course of motion of a measurement device mounted on other machine types.
  • the probe 4 could be mounted on a machine tool.
  • the probe 4 may be mounted onto the distal end of an inspection robot, which may for example comprise a robotic arm having several articulating joints.
  • the probe 4 might be in a fixed position and the object could be moveable, for example via a positioning machine.
  • the description of the specific embodiment also involves obtaining and processing images to obtain topographical data via phase analysis of a periodic optical pattern.
  • techniques such as triangulation might be used instead of using phase-stepping algorithms.
  • the shift in the pattern could be obtained using techniques other than those described above. For instance, it could be obtained by changing the pattern projected by the projector, or by moving the object.
  • target points can be identified using other known methods.
  • target points can be identified by markers placed on the object or by projecting a marker onto the object.
  • the description describes using the same images for identifying target features as well as for obtaining topographical data. However, this need not necessarily be the case as, for instance, separate images could be obtained for use in the different processes. In this case, if target features are identified using markers stuck on or projected onto the object then it would not be necessary to project a pattern during the obtaining of images for use in identifying target features.
  • although the invention is described as a single probe containing a projector and imaging device, the projector and image sensor could be provided separately (e.g. so that they can be physically manipulated independently of each other). Furthermore, the probe could comprise a plurality of imaging devices.
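One of the items above describes compensating for lateral probe motion by cropping each fringe-shifted image to a common region (window 51 in FIG. 15). The following is a minimal sketch of that idea in Python with NumPy; the function name, the use of integer (row, column) pixel shifts and the assumption that those shifts are known from the probe motion are illustrative choices rather than details taken from this disclosure.

    import numpy as np

    def crop_to_common_region(images, shifts):
        """Crop each image so that every retained pixel views the same region of
        the object, given the (row, col) pixel shift of each image relative to
        the first image (e.g. derived from the known probe motion)."""
        rows, cols = images[0].shape
        drs = [int(s[0]) for s in shifts]
        dcs = [int(s[1]) for s in shifts]
        # The common window is bounded by the largest shifts in each direction.
        top = max(0, max(-d for d in drs))
        bottom = rows - max(0, max(drs))
        left = max(0, max(-d for d in dcs))
        right = cols - max(0, max(dcs))
        return [img[top + dr:bottom + dr, left + dc:right + dc]
                for img, dr, dc in zip(images, drs, dcs)]

    # In the FIG. 15 example, point X moves from the 7th pixel to the 5th, 3rd
    # and 4th pixels, i.e. column shifts of 0, -2, -4 and -3 relative to the
    # first image; the cropped images then all contain the same object region.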
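Similarly, the items above refer to a Carré-type phase-shifting calculation that does not require the phase step to be known. The sketch below, again purely illustrative and assuming an equal but unknown phase step between the four fringe-shifted images, shows one common formulation returning the wrapped phase, the per-image phase step and a quantity proportional to the fringe modulation; it is not presented as the specific implementation used in this disclosure.

    import numpy as np

    def carre(i1, i2, i3, i4):
        """Carre-type estimate of wrapped phase, phase step and modulation from
        four fringe-shifted images (2D intensity arrays from one perspective
        image set), assuming equal but unknown phase steps between them."""
        d14 = i1 - i4
        d23 = i2 - i3
        num = (d14 + d23) * (3.0 * d23 - d14)
        den = (i2 + i3) - (i1 + i4)
        sin_part = np.sign(d14 + d23) * np.sqrt(np.abs(num))
        phase = np.arctan2(sin_part, den)             # wrapped phase, -pi to pi
        with np.errstate(divide='ignore', invalid='ignore'):
            # per-image phase step; only meaningful where the fringes modulate
            step = 2.0 * np.arctan(np.sqrt(np.abs((3.0 * d23 - d14) / (d23 + d14))))
        modulation = np.sqrt(np.abs(num) + den ** 2)  # proportional to contrast
        return phase, step, modulation

The modulation output can be used, as noted in the items above, to detect pixels where the surface gradient or reflectivity makes the phase measurement unreliable.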

Abstract

A non-contact measurement apparatus and method. A probe is provided for mounting on a coordinate positioning apparatus, comprising at least one imaging device for capturing an image of an object to be measured. Also provided is an image analyzer configured to analyze at least one first image of an object obtained by the probe from a first perspective and at least one second image of the object obtained by the probe from a second perspective so as to identify at least one target feature on the object to be measured. The image analyzer is further configured to obtain topographical data regarding a surface of the object via analysis of an image, obtained by the probe, of the object on which an optical pattern is projected.

Description

This invention relates to a method and apparatus for measuring an object without contacting the object.
Photogrammetry is a known technique for determining the location of certain points on an object from photographs taken at different perspectives, i.e. positions and/or orientations. Typically photogrammetry comprises obtaining at least two images of an object taken from two different perspectives. For each image the two dimensional coordinates of a feature of the object on the image can be determined. It is then possible, from knowledge of the relative location and orientation of the camera(s) which took the images and of the points at which the feature is formed on the images, to determine the three dimensional coordinates of the feature on the object via triangulation. Such a technique is disclosed for example in U.S. Pat. No. 5,251,156, the entire content of which is incorporated into this specification by this reference.
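By way of a hedged illustration only (the camera parameterisation, the midpoint method and all names below are assumptions rather than anything taken from the cited patent), the three dimensional coordinates of one matched feature can be triangulated from two views with known positions and orientations roughly as follows:

    import numpy as np

    def pixel_ray(pixel, K, R, C):
        """Back-project a pixel into a world-space ray through the camera's
        perspective centre C, where a world point X images at K @ R @ (X - C)."""
        uv1 = np.array([pixel[0], pixel[1], 1.0])
        d = R.T @ np.linalg.inv(K) @ uv1
        return C, d / np.linalg.norm(d)

    def triangulate_midpoint(p1, cam1, p2, cam2):
        """Return the midpoint of the shortest segment joining the two
        back-projected rays: a simple triangulation of one feature seen from
        two different perspectives. cam1 and cam2 are (K, R, C) tuples."""
        c1, d1 = pixel_ray(p1, *cam1)
        c2, d2 = pixel_ray(p2, *cam2)
        a = np.stack([d1, -d2], axis=1)           # solve c1 + t1*d1 = c2 + t2*d2
        t, *_ = np.linalg.lstsq(a, c2 - c1, rcond=None)
        return 0.5 * ((c1 + t[0] * d1) + (c2 + t[1] * d2))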
Non-contact optical measuring systems are also known for measuring the topography of a surface. These may typically consist of a projector which projects a structured light pattern onto a surface and a camera, set at an angle to the projector, which detects the structured light pattern on the surface. Height variation on the surface causes a distortion in the pattern. From this distortion the geometry of the surface can be calculated via triangulation and/or phase analysis techniques.
Current known systems enable either photogrammetry or phase analysis to be performed in order to obtain measurement data regarding the object.
The invention provides a method and apparatus in which measurement of an object via photogrammetric techniques and triangulation and/or phase analysis techniques can be performed on images obtained by a common probe.
According to a first aspect of the invention there is provided, a non-contact measurement apparatus, comprising: a probe for mounting on a coordinate positioning apparatus, comprising at least one imaging device for capturing an image of an object to be measured; an image analyser configured to analyse at least one first image of an object obtained by the probe from a first perspective and at least one second image of the object obtained by the probe from a second perspective so as to identify at least one target feature on the object to be measured, and further configured to obtain topographical data regarding a surface of the object via analysis of an image, obtained by the probe, of the object on which an optical pattern is projected.
It is an advantage of the present invention that both the position of target features on the object, and the topographical data of the surface of the object are determined by the image analyser using images obtained by the same probe. Accordingly, it is not necessary to have two separate imaging systems for obtaining both the position of target features of an object and the topographical form of the surface of the object.
As will be understood, a perspective can be a particular view point of the object. A perspective can be defined by the position and/or orientation of the imaging device relative to the object.
The at least one first image and the at least one second image can be obtained by at least one suitable imaging device. Suitable imaging devices can comprise at least one image sensor. For example, suitable imaging devices can comprise an optical electromagnetic radiation (EMR) sensitive detector, such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. Suitable imaging devices can be optically configured to focus light at the image plane. As will be understood, the image plane can be defined by the image sensor. For example, suitable imaging devices can comprise at least one optical component configured to focus optical EMR at the image plane. Optionally, the at least one optical component comprises a lens.
Suitable imaging devices can be based on the pinhole camera model which consists of a pinhole, which can also be referred to as the imaging device's perspective centre, through which optical EMR rays are assumed to pass before intersecting with the image plane. As will be understood, imaging devices that do not comprise a pinhole but instead comprise a lens to focus optical EMR also have a perspective centre and this can be the point through which all optical EMR rays that intersect with the image plane are assumed to pass.
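For illustration, and with an assumed intrinsic matrix K, world-to-camera rotation R and perspective centre C (none of which are specified here), the pinhole model described above can be sketched as:

    import numpy as np

    def project(point_world, K, R, C):
        """Project a 3D point onto the image plane through the perspective
        centre C: every imaged ray is assumed to pass through C."""
        p_cam = R @ (np.asarray(point_world, dtype=float) - C)
        uvw = K @ p_cam
        return uvw[:2] / uvw[2]                   # pixel coordinates (u, v)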
As will be understood, the perspective centre can be found relative to the image sensor using a calibration procedure, such as those described in J. Heikkila and O. Silven, “A four-step camera calibration procedure with implicit image correction”, Proceedings of the 1997 Conference in Computer Vision and Pattern Recognition (CVPR '97) and J. G Fryer, “Camera Calibration” in K. B. Atkinson (ed.) “Close range photogrammetry and machine vision”, Whittles publishing (1996). Correction parameters such as those for correcting lens aberrations can be provided and are well known and are for instance described in these two documents.
The probe can comprise a plurality of imaging devices. Preferably, the images analysed by the image analyser are obtained using a common imaging device. Accordingly, in this case the probe can comprise a single imaging device only.
Preferably the optical pattern is projected over an area of the object. Preferably the pattern extends over an area of the object so as to facilitate the measurement of a plurality of points of the object over the imaged area. Preferably the pattern is a substantially repetitive pattern. Particularly preferred optical patterns comprise substantially periodic optical patterns. As will be understood, a periodic optical pattern can be a pattern which repeats after a certain finite distance. The minimum distance between repetitions can be the period of the pattern. Preferably the optical pattern is periodic in at least one dimension. Optionally, the optical pattern can be periodic in at least two perpendicular dimensions.
Suitable optical patterns for use with the present invention include patterns of concentric circles, patterns of lines of varying colour, shades, and/or tones. The colour, shades and/or tones could alternate between two or more different values. Optionally, the colour, shade and/or tones could vary between a plurality of discrete values. Preferably, the colour, shade and/or tones varies continuously across the optical pattern. Preferably, the optical pattern is a fringe pattern. For example, the optical pattern can be a set of sinusoidal fringes. The optical pattern can be in the infrared to ultraviolet range. Preferably, the optical pattern is a visible optical pattern. As will be understood, an optical pattern for use in methods such as that of the present invention is also commonly referred to as a structured light pattern.
The optical pattern could be projected onto the object via at least one projector. Suitable projectors for the optical pattern include a digital light projector configured to project an image input from a processor device. Such a projector enables the pattern projected to be changed. Suitable projectors could comprise a light source and one or more diffraction gratings arranged to produce the optical pattern. The diffraction grating(s) could be moveable so as to enable the pattern projected by the projector to be changed. For instance, the diffraction grating(s) can be mounted on a piezoelectric transducer. Optionally, the diffraction gratings could be fixed such that the pattern projected by the projector cannot be changed. Optionally the projector could comprise a light source and a hologram. Further, the projector could comprise a light source and a patterned slide. Further still, the projector could comprise two mutually coherent light sources. The coherent light sources could be moveable so as to enable the pattern projected by the projector to be changed. For instance, the coherent light sources can be mounted on a piezoelectric transducer. Optionally, the coherent light sources could be fixed such that the pattern projected by the projector cannot be changed.
The at least one projector could be provided separately to the probe. Preferably, the probe comprises the at least one projector. Preferably the probe comprises a single projector only.
A target feature can be a predetermined mark on the object. The predetermined mark could be a part of the object, for example a predetermined pattern formed on the object's surface. Optionally, the mark could be attached to the object for the purpose of identifying a target feature. For example, the mark could be a coded “bull's eye”, wherein the “bull's-eye” has a unique central point which is invariant with perspective, surrounded by a set of concentric black and white rings which code a unique identifier. Automatic feature recognition methods can be used to both locate the centre of the target and also decode the unique identifier. By means of such targets the images can be automatically analysed and the coordinates of the “bull's-eye” centre returned.
As will be understood, the image analyser could be configured to analyse further images of the object obtained from further known perspectives that are different to the perspectives of the other images. The more images that are analysed, the more accurate and reliable the position determination of the target feature on the object can be.
A target feature on the object to be measured can be identified by feature recognition techniques. For example, a Hough Transform can be used to identify a straight line feature on the object.
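As a purely illustrative sketch of such feature recognition (the binning choices and names below are arbitrary and not part of this disclosure), a minimal Hough transform for finding the strongest straight line in a binary edge mask could look like this:

    import numpy as np

    def strongest_line(edge_mask, n_theta=180, n_rho=200):
        """Vote every edge pixel into (rho, theta) bins, where
        rho = x*cos(theta) + y*sin(theta), and return the best line."""
        ys, xs = np.nonzero(edge_mask)
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        diag = np.hypot(*edge_mask.shape)
        rhos = np.linspace(-diag, diag, n_rho)
        acc = np.zeros((n_rho, n_theta), dtype=int)
        for t_idx, t in enumerate(thetas):
            r_idx = np.digitize(xs * np.cos(t) + ys * np.sin(t), rhos) - 1
            np.add.at(acc, (np.clip(r_idx, 0, n_rho - 1), t_idx), 1)
        best_rho, best_theta = np.unravel_index(np.argmax(acc), acc.shape)
        return rhos[best_rho], thetas[best_theta]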
At least one of the at least one first image and at least second image can be an image of the object onto which an optical pattern is projected. The optical pattern need not be the same as the imaged optical pattern used for obtaining topographical data. Preferably, the at least one first image and at least second image are images of the object onto which an optical pattern is projected. This enables topographical data to be obtained from at least one of the at least one first and at least one second image.
Preferably, the image analyser is configured to identify an irregularity in the optical pattern in each of the first and second images as the at least one target feature. This is advantageous as target features can be identified without the use of markers placed on the object. This has been found to enable highly accurate measurements of the object to be taken quickly. It has also been found that the method of the invention can require less processing resources to identify points on complex shaped objects than by other known image processing techniques.
As will be understood, an irregularity in the optical pattern can also be referred to as a discontinuity in the optical pattern.
An irregularity in the optical pattern can be a deformation of the optical pattern caused by a discontinuous feature on the object. Such a deformation of the optical pattern can, for example, be caused at the boundary between two continuous sections of an object. For instance, the boundary could be the edge of a cube at which two faces of the cube meet. Accordingly, a discontinuous feature on the object can be where the gradient of the surface of the object changes significantly. The greater the gradient of the surface relative to the optical pattern projector, the greater the deformation of the optical pattern at that point on the surface. Accordingly, an irregularity could be identified by identifying those points on the object at which the optical pattern is deformed by more than a predetermined threshold. This predetermined threshold will depend on a number of factors, including the size and shape of the object to be measured. Optionally, the predetermined threshold can be determined and set prior to operation by a user based on the knowledge of the object to be measured.
An irregularity can be identified by identifying in an image those points on the object at which the rate of change of the optical pattern is greater than a predetermined threshold rate of change. For instance, in embodiments in which the optical pattern is a periodic optical pattern, an irregularity can be identified by identifying in an image those points on the object at which the rate of change of the phase of the periodic optical pattern is greater than a predetermined threshold rate of change. In particular, in embodiments in which the optical pattern is a fringe pattern, an irregularity can be identified by identifying in an image those points on the object at which the rate of change of the phase of the fringe pattern is greater than a predetermined threshold rate of change.
The rate of change of the phase of an optical pattern as imaged when projected onto an object can be identified by creating a phase map from the image, and then looking for jumps in the phase between adjacent points in the phase map above a predetermined threshold. As will be understood, a phase map is a map which contains the phase of a pattern projected onto the object's surface for a plurality of pixels in an image. The phase map could be a wrapped phase map. The phase map could be an unwrapped phase map. Known techniques can be used to unwrap a wrapped phase map in order to obtain an unwrapped phase map.
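A minimal sketch of this thresholding is given below. It assumes a wrapped phase map stored as a NumPy array in radians, consistently takes the pixel to the left of (or above) each large jump as the discontinuity point, and, following the approach described elsewhere in this document, keeps only those discontinuities that appear in every one of several differently ordered wrapped phase maps. The threshold value and names are illustrative.

    import numpy as np

    def phase_discontinuities(wrapped_phase, threshold_deg=90.0):
        """Flag pixels where the wrapped phase differs from the next pixel to
        the right or below by more than a threshold."""
        t = np.deg2rad(threshold_deg)
        mask = np.zeros(wrapped_phase.shape, dtype=bool)
        dx = np.abs(np.diff(wrapped_phase, axis=1))   # differences along x
        dy = np.abs(np.diff(wrapped_phase, axis=0))   # differences along y
        mask[:, :-1] |= dx > t                        # keep the left-hand pixel
        mask[:-1, :] |= dy > t                        # keep the upper pixel
        return mask

    def real_discontinuities(masks):
        """Keep only discontinuities present in all differently ordered wrapped
        phase maps; jumps caused purely by phase wrapping occur at different
        places in each map and so are discarded."""
        out = masks[0].copy()
        for m in masks[1:]:
            out &= m
        return out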
A phase map can be created from a single image of the optical pattern on the object. For example, Fourier Transform techniques could be used to create the phase map.
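One such technique, sketched below only as an illustration, is Fourier-transform fringe analysis: the fringe image is transformed, one carrier lobe is isolated with a simple window and the wrapped phase is taken from the inverse transform. The carrier position and window width are hypothetical parameters that would depend on the projected fringe period; fringes are assumed to run vertically.

    import numpy as np

    def fourier_phase_map(image, carrier_px, half_width_px):
        """Wrapped phase map from a single fringe image via the Fourier method."""
        spectrum = np.fft.fftshift(np.fft.fft2(image))
        rows, cols = image.shape
        cx = cols // 2 + carrier_px                   # carrier column (shifted FFT)
        window = np.zeros_like(spectrum)
        window[:, cx - half_width_px:cx + half_width_px + 1] = 1.0
        analytic = np.fft.ifft2(np.fft.ifftshift(spectrum * window))
        return np.angle(analytic)                     # wrapped phase, -pi to pi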
Preferably a phase map is created from a set of images of the object from substantially the same perspective, in which the position of the optical pattern on the object is different for each image. Accordingly, a phase map can be created using a phase stepping approach. This can provide a more accurate phase map. Phase stepping algorithms are known and are, for example, described in Creath, K., “Comparison of phase measurement algorithms”, Proc. SPIE 680, 19-28 (1986). Accordingly, the method can comprise obtaining a set of first images of the optical pattern on the object from the first perspective. The method can further comprise obtaining a set of second images of the optical pattern on the object from the second perspective. A set of images can comprise a plurality of images of the object from a given perspective. Preferably, a set of images comprises at least two images, more preferably at least three images, especially preferably at least four images. The position (e.g. phase) of the optical pattern on the object can be different for each image in a set.
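For example, if the pattern position is changed by a quarter of the period between each of four images, the classical four-step formula can be used; the sketch below is a generic illustration of such a phase-stepping calculation, not an algorithm prescribed by this document.

    import numpy as np

    def four_step(i0, i90, i180, i270):
        """Four-step phase stepping with quarter-period shifts between images.
        Returns the wrapped phase and a quantity proportional to the fringe
        modulation (useful for masking unreliable pixels)."""
        phase = np.arctan2(i270 - i90, i0 - i180)
        modulation = 0.5 * np.hypot(i270 - i90, i0 - i180)
        return phase, modulation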
The image analyser can be configured to process: a set of first images obtained from the first known perspective, the position of the optical pattern on the object being different for each image in the set; and a set of second images obtained from the second known perspective, the position of the optical pattern on the object being different for each image in the set in order to identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor.
Further details of a method of identifying an irregularity in the optical pattern in each of the at least one first and second images as a target feature are disclosed in the co-pending PCT application filed on the same day as the present application with the title NON-CONTACT MEASUREMENT APPARATUS AND METHOD and having the applicant's reference number 741/WO/0 and claiming priority from UK Patent Application nos. 0716080.7, 0716088.0, 0716109.4. Subject matter that is disclosed in that application is incorporated in the specification of the present application by this reference.
As will be understood, topographical data can be data indicating the topography of at least a part of the object's surface. The topographical data can be data indicating the height of the object's surface at at least one point on the object, and preferably at a plurality of points across the object. The topographical data can be data indicating the gradient of the object's surface at at least one point on the object, and preferably at a plurality of points across the object. The topographical data can be data indicating the height and/or gradient of the object's surface relative to the image sensor.
The topographical data can be obtained via analysing the optical pattern. For instance, the topographical data can be obtained via analysing the deformation of the optical pattern. This can be done for example via triangulation techniques. Optionally, the topographical data can be obtained via analysing the optical pattern using phase analysis techniques.
The at least one image processed by the image analyser to obtain topographical data can be a separate image from the at least one first and at least one second images. Optionally, the topographical data regarding the surface on which the optical pattern is projected can be obtained from at least one of the at least one first and at least one second images. Accordingly, at least one of the at least one first and at least one second images can be an image of the object on which an optical pattern is projected.
The image analyser could be configured to generate a phase map from at least one of the plurality of images. The image analyser could be configured to generate a phase map from at least one of the at least one first image and at least one second image. The phase map could be generated by Fourier Transforming one of the plurality of images.
The image analyser can be configured to process a set of images in which the position of an optical pattern on the object is different for each image in the set in order to determine the topographical data. Optionally, as described above, a phase map can be created from a set of images of the object from the same perspective, in which the position (e.g. phase) of the optical pattern at the object is different for each image.
In particular, the image analyser can be configured to process at least one of the first or second sets of images in order to determine the topographical data. Accordingly, the image analyser can be configured to process at least one of: a set of first images obtained from the first perspective, the position of the optical pattern on the object being different for each image in the set; and a set of second images obtained from the second perspective, the position of the optical pattern on the object being different for each image in the set, in order to determine the height variation data. Accordingly, the image analyser can be configured to calculate a phase map from at least one of the set of first images and the set of second images.
A wrapped phase map can be used to obtain topographical data. For instance, a wrapped phase map can be unwrapped, and the topographical data can be obtained from the unwrapped phase map. Accordingly, the image analyser can be configured to unwrap the wrapped phase map and to obtain the topographical data from the unwrapped phase map. The topographical data could be in the form of height data. As will be understood, height data can detail the position of a plurality of points on the surface.
Obtaining topographical data can comprise determining the gradient of the surface. Obtaining topographical data can comprise determining the gradient of the surface relative to the imaging device.
Determining the gradient of the surface relative to the imaging device can comprise calculating a phase shift map from the plurality of images. Suitable algorithms for generating a phase shift map from the plurality of images include a Carré algorithm such as that described in Carré, P., “Installation et utilisation du comparateur photoélectrique et interférentiel du Bureau International des Poids et Mesures”, Metrologia 2, 13-23 (1966). Determining the gradient of the surface can further comprise obtaining a gradient map based on the phase shift map. The gradient map can be obtained by converting the value of each of the points on a phase shift map to a gradient value. The value of a point in a phase shift map can be converted to a gradient value using a predetermined mapping procedure. As will be understood, a phase shift map can detail the phase shift for a plurality of points on the surface due to the change in position of projected fringes on the object's surface. The phase shift can be bound in a range of 360 degrees. A gradient map can detail the surface gradient relative to the image sensor of a plurality of points on the surface.
The method can further comprise integrating the gradient map to obtain height data. As explained above, height data can detail the position of a plurality of points on the surface relative to the image sensor.
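A minimal sketch of such an integration is given below, assuming that the gradient map holds the surface slope along image rows, that the pixel spacing on the surface is known, and that the constant of integration is fixed from a single point of known height (for example a photogrammetrically measured target point). The names and the row-wise integration direction are illustrative assumptions.

    import numpy as np

    def integrate_gradient_rows(gradient_map, pixel_size, ref_col, ref_height):
        """Integrate a per-pixel slope map along each row to recover relative
        heights, then fix the constant of integration so that column 'ref_col'
        has height 'ref_height'."""
        heights = np.cumsum(gradient_map * pixel_size, axis=1)
        offset = ref_height - heights[:, ref_col:ref_col + 1]
        return heights + offset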
The image analyser can be configured to calculate at least one of a first phase map from the set of first images and a second phase map from the set of second images. Phase maps calculated from a set of images taken from substantially the same perspective, the position of the optical pattern on the object in each image being different, can provide a more accurate and reliable phase map.
The image analyser can be configured to determine the topographical data from at least one of the first phase map and the second phase map. As mentioned above, the phase maps can be wrapped phase maps. In this case, the at least one of a first wrapped phase map and second wrapped phase map can be unwrapped, and the topographical data can be obtained from the unwrapped phase map.
The position of the optical pattern could be changed between obtaining each of the images in a set of images by changing the optical pattern emitted by the projector. For instance, a projector can comprise a laser beam which is incident on a lens which diverges the beam on to a liquid crystal system to generate at least one fringe pattern on the surface to be measured. A computer can be used to control the pitch and phase of the fringe pattern generated by the liquid crystal system. The computer and the liquid crystal system can perform a phase-shifting technique in order to change the phase of the optical pattern.
Optionally, the position of the optical pattern could be changed by relatively moving the object and the projector. The object and projector could be rotated relative to each other in order to displace the optical pattern on the surface. Optionally, the object and projector are laterally displaced relative to each other. As will be understood, the object could be moved between obtaining each of the plurality of images. Optionally, the projector could be moved between obtaining each of the plurality of images.
This can be particularly preferred when the projector has a fixed optical pattern. Accordingly, the projector can be configured such that it can project one optical pattern only. For example, the projector could be one in which the pitch or phase of the optical pattern cannot be altered.
The object and projector can be moved relative to each other by any amount which provides a change in the position of the projected optical pattern relative to the object. When the optical pattern has a period, preferably the object and projector are moved relative to each other such that the position of the pattern on the object is at least nominally moved by a non-integral multiple of the period of the pattern. For instance, when the optical pattern is a fringe pattern, the object and projector can be moved relative to each other such that the position of the pattern on the object is at least nominally moved by a non-integral multiple of the fringe period. For example, the object and projector can be moved relative to each other such that the position of the pattern on the object is at least nominally moved by ¼ of the fringe period. As will be understood, the actual distance the projector and object are to be moved relative to each other to obtain such a shift in the pattern on the object can depend on a number of factors including the period of the periodic optical pattern projected and the distance between the object and the projector.
As will be understood, relatively moving the projector and object will cause a change in the position of the optical pattern on the object. However, it may appear from images of the optical pattern on the object taken before and after the relative movement that the optical pattern has not moved. This can be referred to as nominal movement. Whether the movement is nominal or actual will depend on a number of factors including the form of the optical pattern projected, and the shape and/or orientation of the surface of the object relative to the projector. For instance, the change in position of the optical pattern on a surface for a given movement will be different for differently shaped and oriented surfaces. It might be that, due to the shape and/or orientation of the surface, it would appear that the optical pattern has not changed position, when in fact it has moved and that movement would have been apparent on a differently shaped or positioned object. What is important is that it is known that the relative movement is such that it would cause a change in the position of the optical pattern on a reference surface of a known shape and orientation relative to the projector. Accordingly, it is effectively possible to determine the shape and orientation of the surface by determining how the position of the optical pattern as imaged differs from the known reference.
The projector could be moved such that the position of the optical pattern relative to a predetermined reference plane in the measurement space is changed. The projector could be moved such that the position of the optical pattern relative to a predetermined reference plane in the measurement space is changed by a non-integral multiple of the period of the pattern. The predetermined reference plane could be the reference plane of the image sensor. Again, the shape and/or orientation of the surface of the object can then be determined by effectively comparing the position of the optical pattern on the surface relative to what it would be like at the reference plane.
If the probe comprises the projector, then the object and imaging device will be moved relative to each other as a consequence of obtaining a shift in the position of the optical pattern on the object. In this case, preferably the amount of relative movement should be sufficiently small such that the perspective of the object obtained by the image sensor in each of the images is substantially the same. In particular, preferably the movement is sufficiently small that any change in the perspective between the plurality of images can be compensated for in the step of analysing the plurality of images.
In a preferred embodiment in which the probe comprises a projector and an imaging device, the probe can be moved between images by rotating the probe about the imaging device's perspective centre. It has been found that rotating about the imaging device's perspective centre makes it easier to process the images to compensate for any relative movement between the object and imaging device (discussed in more detail below). In particular it makes matching corresponding pixels across a number of images easier. For instance, matching corresponding pixels is possible using a coordinate transformation which is independent of the distance between the object and the imaging device. Accordingly, it is not necessary to know the distance between the object and imaging device in order to process the images to compensate for any relative movement between the object and imaging device.
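The distance independence can be illustrated with the standard result that, for a camera rotating about its perspective centre, pixel coordinates before and after the rotation are related by the homography K R K^-1, in which the object distance does not appear. The sketch below assumes a known intrinsic matrix K and rotation R and is offered only as an illustration of this kind of coordinate transformation.

    import numpy as np

    def rotation_homography(K, R):
        """Homography x' ~ K @ R @ inv(K) mapping pixel coordinates before a
        rotation about the perspective centre to coordinates after it."""
        return K @ R @ np.linalg.inv(K)

    def map_pixel(H, uv):
        """Apply the homography to one pixel (u, v)."""
        x = H @ np.array([uv[0], uv[1], 1.0])
        return x[:2] / x[2]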
Accordingly, the image analyser can be configured to: i) identify common image areas covered by each of the images in a set of images. The image analyser can be configured to then ii) calculate the phase map for the set using the common image areas only. Identifying common image areas covered by each of the images in a set of images can comprise adjusting the image coordinates to compensate for relative movement between the object and the imaging device.
Details of a method and apparatus in which the position of an optical pattern on the object is changed between obtaining each of a plurality of images of the object, and in which topographical data is obtained by analysing those images are disclosed in the co-pending PCT application filed on the same day as the present application with the title PHASE ANALYSIS MEASUREMENT APPARATUS AND METHOD and having the applicant's reference number 742/WO/0 and claiming priority from UK Patent Application nos. 0716080.7, 0716088.0, 0716109.4.
Accordingly, in particular, this application describes a non-contact measurement apparatus, comprising: a probe for mounting on a coordinate positioning apparatus, the probe comprising a projector for projecting an optical pattern onto the surface of an object to be measured, and an image sensor for imaging the optical pattern on the surface of the object; an image analyser configured to analyse at least one first image of an object on which an optical pattern is projected, the first image being obtained from a first known perspective, and at least one second image of the object on which the optical pattern is projected, the second image being obtained from a second known perspective, so as to: a) identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor; and b) determine topographical data regarding the surface on which the optical pattern is projected from at least one of the first and second images.
This application also describes in particular a non-contact method for measuring an object located within a measurement space comprising, in any suitable order, the steps of: i) an image sensor obtaining at least a first image of an object on which an optical pattern is projected, the at least first image being obtained from a first perspective; ii) the image sensor obtaining at least a second image of the object on which the optical pattern is projected, the second image being obtained from a second perspective; and iii) analysing the first and at least second images so as to: a) identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor; and b) obtain shape data of the surface on which the optical pattern is projected from at least one of the first and second imaged optical patterns.
According to a second aspect of the invention there is provided an image analyser for use in a non-contact measurement apparatus as described above.
According to a third aspect of the invention there is provided a non-contact method for measuring an object located within a measurement space using a probe comprising at least one imaging device, the method comprising: the probe obtaining a plurality of images of the object, comprising at least one first image of the object from a first perspective, at least one second image of the object from a second perspective, and at least one image of the object on which an optical pattern is projected; analysing the plurality of images to identify at least one target feature on the object to be measured and to obtain topographical data regarding a surface of the object via analysis of the optical pattern.
At least one of the at least one image of the object from a first perspective and at least one image of the object from a second perspective comprises the at least one image of the object on which an optical pattern is projected. Accordingly, the method can comprise obtaining topographical data from at least one of the at least one first image of the object from a first perspective and at least one second image of the object from a second perspective.
The method can comprise relatively moving the object and probe between the first and second perspectives. This can be particularly preferred when the probe comprises a single imaging device only.
The optical pattern can be projected by a projector that is separate to the probe. Optionally, the probe can comprise at least one projector for projecting an optical pattern.
According to a fourth aspect of the invention there is provided a non-contact measurement apparatus, comprising: a coordinate positioning apparatus having a repositionable head; and a non-contact measurement probe mounted on the head comprising: a projector for projecting an optical pattern onto the surface of an object to be measured; and an image sensor for imaging the optical pattern on the surface of the object.
It is an advantage of the invention that the probe is mounted on a coordinate positioning apparatus. Doing so facilitates the acquisition of images of an object from multiple perspectives through the use of only a single probe device. Further, as the probe is mounted on a coordinate positioning apparatus, it can be possible to accurately determine the position and orientation of the probe from the coordinate positioning machine's position reporting features. For example, the coordinate positioning machine could comprise a plurality of encoders for determining the position of relatively moveable parts of the coordinate positioning machine. In this case, the position and orientation of the image sensor could be determined from the output of the encoders. As will be understood, coordinate positioning apparatus include coordinate measuring machines and other positioning apparatus, such as articulating arms and machine tools, for which the position of a tool or other device mounted on them can be determined.
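As a purely illustrative sketch of that conversion (the rotation/translation representation and names are assumptions), points measured relative to the probe can be transformed into machine coordinates using the head orientation and position recorded with each image:

    import numpy as np

    def probe_to_machine(points_probe, R_head, t_head):
        """Transform Nx3 points measured relative to the probe reference point
        into machine coordinates, given the probe orientation R_head (3x3) and
        position t_head (3-vector) reported by the coordinate positioning
        machine when the images were recorded."""
        pts = np.asarray(points_probe, dtype=float)
        return pts @ R_head.T + np.asarray(t_head, dtype=float)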
Preferably the head is an articulating probe head. Accordingly, preferably the probe head can be rotated about at least one axis. Preferably the coordinate positioning apparatus is a computer controlled positioning apparatus. The coordinate positioning apparatus could comprise a coordinate measuring machine (CMM). The coordinate positioning apparatus could comprise a machine tool.
The non-contact measurement apparatus could further comprise an image analyser configured to determine topographical data regarding the surface on which an optical pattern is projected by the projector from at least one image obtained by the image sensor. The image analyser could be configured as described above.
This application also describes a non-contact measurement probe for mounting on a coordinate positioning apparatus, comprising: a projector for projecting an optical pattern onto the surface of an object to be measured; and an image sensor for imaging the optical pattern on the surface of the object.
This application further describes a non-contact measurement method comprising: a projector mounted on a head of a coordinate positioning machine projecting an optical pattern onto a surface of an object to be measured; an image sensor imaging the optical pattern on the surface; and an image analyser determining topographical data regarding the surface of the object based on the image and on position information from the coordinate positioning machine.
The optical pattern can extend in two dimensions. The optical pattern projected can enable the determination of the topology of the surface of an object in two dimensions from a single image of the optical pattern on the object. The optical pattern can be a substantially full-field optical pattern. A substantially full-field optical pattern can be one in which the pattern extends over at least 50% of the field of view of the image sensor at a reference plane (described in more detail below), more preferably over at least 75%, especially preferably over at least 95%, for example substantially over the entire field of view of the image sensor at a reference plane. The reference plane can be a plane that is a known distance away from the image sensor. Optionally, the reference plane can be a plane which contains the point at which the projector's and image sensor's optical axes intersect. The reference plane can extend perpendicular to the image sensor's optical axis.
The optical pattern could be a set of concentric circles, or a set of parallel lines of alternating colour, shades, or tones. Preferably, the periodic optical pattern is a fringe pattern. For example, the periodic optical pattern can be a set of sinusoidal fringes. The periodic optical pattern can be in the infrared to ultraviolet range. Preferably, the periodic optical pattern is a visible periodic optical pattern.
According to a further aspect of the invention there is provided computer program code comprising instructions which, when executed by a controller, cause the controller to control a probe comprising at least one imaging device and an image analyser in accordance with the above described methods.
According to a yet further aspect of the invention there is provided a computer readable medium, bearing computer program code as described above.
As will be understood, features described in connection with the first aspect of the invention are also applicable to the other aspects of the invention where appropriate.
Accordingly, this application describes a non-contact measurement apparatus, comprising: a probe for mounting on a coordinate positioning apparatus, the probe comprising a projector for projecting a structured light pattern onto the surface of an object to be measured, and an image sensor for imaging the structured light pattern on the surface of the object; an image analyser configured to analyse at least one first image of an object on which a structured light pattern is projected, the first image being obtained from a first known perspective, and at least one second image of the object on which the structured light pattern is projected, the second image being obtained from a second known perspective, so as to: a) identify at least one target feature on the object to be measured and to determine the position of the target feature on the object relative to the image sensor; and b) determine topographical data regarding the surface on which the structured light pattern is projected from at least one of the first and second images.
An embodiment of the invention will now be described, by way of example only, with reference to the following Figures, in which:
FIG. 1 shows a schematic perspective view of a coordinate measuring machine on which a probe for measuring an object via a non-contact method according to the present invention is mounted;
FIG. 2 illustrates various images of the object shown in FIG. 1 obtained by the probe from three different perspectives;
FIG. 3 illustrates a plurality of wrapped phase maps for each of the three different perspectives;
FIG. 4 shows a flow chart illustrating the high-level operation of the apparatus shown in FIG. 1;
FIG. 5 illustrates the method of capturing a perspective image set;
FIG. 6 illustrates the method of obtaining fringe shifted images;
FIG. 7 illustrates the method of analysing the images;
FIG. 8 illustrates the method of calculating the wrapped phase maps;
FIG. 9 illustrates a first method for obtaining a height map;
FIG. 10 illustrates a second method for obtaining a height map;
FIG. 11 is a schematic diagram of the components of the probe shown in FIG. 1;
FIG. 12 is a schematic diagram of the positional relationship of the imaging device and projector of the probe shown in FIG. 11;
FIG. 13 is a schematic diagram of the projector shown in FIG. 11;
FIG. 14 illustrates a set of fringe shifted images, the position of the fringe on the object being different in each image;
FIG. 15 illustrates the effect of moving the image sensor relative to the object;
FIG. 16 illustrates how the gradient of the object surface can be determined from the phase shift;
FIG. 17 illustrates obtaining fringe shifted images by causing rotation about the image sensor's perspective centre; and
FIG. 18 illustrates the stand-off distance and depth of field of an imaging device.
Referring to FIG. 1, a coordinate measuring machine (CMM) 2 on which a measurement probe 4 according to the present invention is mounted, is shown.
The CMM 2 comprises a base 10, supporting a frame 12 which in turn holds a quill 14. Motors (not shown) are provided to move the quill 14 along the three mutually orthogonal axes X, Y and Z. The quill 14 holds an articulating head 16. The head 16 has a base portion 20 attached to the quill 14, an intermediate portion 22 and a probe retaining portion 24. The base portion 20 comprises a first motor (not shown) for rotating the intermediate portion 22 about a first rotational axis 18. The intermediate portion 22 comprises a second motor (not shown) for rotating the probe retaining portion 24 about a second rotational axis that is substantially perpendicular to the first rotational axis. Although not shown, bearings may also be provided between the moveable parts of the articulating head 16. Further, although not shown, measurement encoders may be provided for measuring the relative positions of the base 10, frame 12, quill 14, and articulating head 16 so that the position of the measurement probe 4 relative to a workpiece located on the base 10 can be determined.
The probe 4 is removably mounted (e.g. using a kinematic mount) on the probe retaining portion 24. The probe 4 can be held by the probe retaining portion 24 by the use of corresponding magnets (not shown) provided on or in the probe 4 and probe retaining portion 24.
The head 16 allows the probe 4 to be moved with two degrees of freedom relative to the quill 14. The combination of the two degrees of freedom provided by the head 16 and the three linear (X, Y, Z) axes of translation of the CMM 2 allows the probe 4 to be moved about five axes.
A controller 26 is also provided, comprising a CMM controller 27 for controlling the operation of the CMM 2, a probe controller 29 for controlling the operation of the probe 4, and an image analyser 31 for analysing the images obtained from the probe 4. The controller 26 may be a dedicated electronic control system and/or may comprise a personal computer.
The CMM controller 27 is arranged to provide appropriate drive currents to the first and second motors so that, during use, each motor imparts the required torque. The torque imparted by each motor may be used to cause movement about the associated rotational axis or to maintain a certain rotational position. It can thus be seen that a drive current needs to be applied continuously to each motor of the head 16 during use; i.e. each motor needs to be powered even if there is no movement required about the associated rotational axis.
It should be noted that FIG. 1 provides only a top level description of a CMM 2. A more complete description of such apparatus can be found elsewhere; for example, see EP402440, the entire contents of which are incorporated herein by this reference.
Referring now to FIG. 11, the probe 4 comprises a projector 40 for projecting, under the control of a processing unit 42, a fringe pattern onto the object 28, and an imaging device 44 for obtaining, under the control of the processing unit 42, an image of the object 28 onto which the fringe pattern is projected. As will be understood, the imaging device 44 comprises suitable optics and sensors for capturing images of the object 28. In the embodiment described, the imaging device comprises an image sensor, in particular a CCD defining an image plane 62. The imaging device 44 also comprises a lens (not shown) to focus light at the image plane 62.
The processing unit 42 is connected to the probe controller 29 and image analyser 31 in the controller unit 26 such that the processing unit 42 can communicate with them via a communication line 46. As will be understood, the communication line 46 could be a wired or wireless communication line. The probe 4 also comprises a random access memory (RAM) device 48 for temporarily storing data, such as image data, used by the processing unit 42.
As will be understood, the probe 4 need not necessarily contain the processing unit 42 and/or RAM 48. For instance, all processing and data storage can be done by a device connected to the probe 4, for instance the controller 26 or an intermediate device connected between the probe 4 and controller 26.
As illustrated in FIG. 12, the projector's 40 image plane 60 and the imaging device's 44 image plane 62 are angled relative to each other such that the projector's 40 and imaging device's optical axes 61, 63 intersect at a reference plane 64. In use, the probe 4 is positioned such that the fringes projected onto the object's surface can be clearly imaged by the imaging device 44.
With reference to FIG. 13, the projector 40 comprises a laser diode 50 for producing a coherent source of light, a collimator 52 for collimating light emitted from the laser diode 50, a grating 54 for producing a sinusoidal set of fringes, and a lens assembly 56 for focussing the fringes at the reference plane 64. As will be understood, other types of projectors would be suitable for use with the present invention. For instance, the projector could comprise a light source and a mask to selectively block and transmit light emitted from the projector in a pattern.
In the described embodiment, the periodic optical pattern projected by the projector 40 is a set of sinusoidal fringes. However, as will be understood, other forms of structured light could be projected, such as for example a set of parallel lines having different colours or tones (e.g. alternating black and white lines, or parallel red, blue and green lines), or even for example a set of concentric circles.
Referring to FIGS. 2 to 10, the operation of the probe 4 will now be described.
Referring first to FIG. 4, the operation begins at step 100 when the operator turns the CMM 2 on. At step 102, the system is initialised. This includes loading the probe 4 onto the articulating head 16, positioning the object 28 to be measured on the base 10, sending the CMM's encoders to a home or reference position such that the position of the articulating head 16 relative to the CMM 2 is known, and also calibrating the CMM 2 and probe 4 such that the position of a reference point of the probe 4 relative to the CMM 2 is known.
Once initialised and appropriately calibrated, control passes to step 104 at which point a set of images of the object 28 is obtained by the probe 4. This step is performed a plurality of times so that a plurality of image sets are obtained, wherein each set corresponds to a different perspective or view point of the object 28. In the example described, three sets of images are obtained corresponding to three different perspectives. The process of obtaining a set of images is explained in more detail below with respect to FIG. 5.
Once all of the images have been obtained, the images are analysed at step 106 by the image analyser 31 in the controller 26. The image analyser 31 calculates from the images a set of three dimensional (“3D”) coordinates relative to the CMM 2 which describe the shape of the object 28. The method of analysing the images will be described in more detail below with reference to FIG. 7. The 3D coordinates are then output at step 108 as a 3D point cloud. As will be understood, the 3D point cloud could be stored on a memory device for later use. The 3D point cloud data could be used to determine the shape and dimensions of the object and compare it to predetermined threshold data to assess whether the object 28 has been made within predetermined tolerances. Optionally, the 3D point cloud could be displayed on a graphical user interface which provides a user with a virtual 3D model of the object 28.
The operation ends at step 110 when the system is turned off. Alternatively, a subsequent operation could be begun by repeating steps 104 to 108. For instance, the user might want to obtain multiple sets of measurement data for the same object 28, or to obtain measurement data for a different object.
Referring now to FIG. 5, the process 104 of capturing an image set for a perspective will now be described. The process begins at step 200 at which point the probe 4 is moved to a first perspective. In the described embodiment, the user can move the probe 4 under the control of a joystick (not shown) which controls the motors of the CMM 2 so as to move the quill 14. As will be understood, the first (and subsequent) perspective could be predetermined and loaded into the CMM controller 27 such that during the measurement operation the probe 4 is automatically moved to the predetermined perspectives. Further, on a different positioning apparatus, the user could physically drag the probe 4 to the perspectives, wherein the positioning apparatus monitors the position of the probe 4 via, for example, encoders mounted on the moving parts of the apparatus.
Once the probe 4 is positioned at the first perspective, an initialising image is obtained at step 202. This involves the probe controller 29 sending a signal to the processing unit 42 of the probe 4 such that it operates the imaging device 44 to capture an image of the object 28.
The initialising image is sent back to the image analyser 31 and at step 204, the image is analysed for image quality properties. This can include, for example, determining the average intensity of light and contrast of the image and comparing them to predetermined threshold levels to determine whether the image quality is sufficient to perform the measurement processes. For example, if the image is too dark then the imaging device 44 or projector 40 properties could be changed so as to increase the brightness of the projected fringe pattern and/or adjust the exposure time or gain of the imaging device 44. The initialising image will not be used in subsequent processes for obtaining measurement data about the object 28 and so certain aspects of the image, such as the resolution of the image, need not be as high as that for the measurement images as discussed below. Furthermore, in alternative embodiments, a light sensor, such as a photodiode, separate to the imaging device could be provided in the probe to measure the amount of light at a perspective position, the output of the photodiode being used to set up the projector 40 and/or imaging device 44.
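By way of illustration only, the quality check of step 204 could be implemented along the following lines, assuming the initialising image is available as a NumPy array. The threshold values and the percentile-based contrast estimate are illustrative assumptions, not the specific criteria used by the embodiment.

```python
import numpy as np

# Illustrative thresholds only; suitable values depend on the sensor and scene.
MIN_MEAN_INTENSITY = 40.0   # mean grey level out of 255
MIN_CONTRAST = 30.0         # grey-level spread between dark and bright fringes

def image_quality_ok(image: np.ndarray) -> bool:
    """Return True if an initialising image is bright enough and shows enough
    fringe contrast to proceed with obtaining the measurement images."""
    mean_intensity = image.mean()
    # Robust contrast estimate: spread between the 5th and 95th percentiles.
    contrast = np.percentile(image, 95) - np.percentile(image, 5)
    return mean_intensity >= MIN_MEAN_INTENSITY and contrast >= MIN_CONTRAST
```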
Once the projector 40 and imaging device 44 have been set up, the first measurement image is obtained at step 206. What is meant by a measurement image is one which is used in the “analyse images” process 106 described in more detail below. Obtaining the first measurement image involves the probe controller 29 sending a signal to the processing unit 42 of the probe 4 such that the processing unit 42 operates the projector 40 to project a fringe pattern onto the object 28 and operates the imaging device 44 to simultaneously capture an image of the object 28 with the fringe pattern on it.
The first measurement image is sent back to the image analyser 31 and at step 208, the first measurement image is again analysed for image quality properties. If the image quality is sufficient for use in the “analyse images” process 106 described below, then control is passed to step 210, otherwise control is passed back to step 204.
At step 210, fringe shifted images are obtained for the current perspective. Fringe shifted images are a plurality of images of the object from substantially the same perspective but with the position of the fringes being slightly different in each image. The method of this step is described in more detail below with respect to FIG. 6.
Once the fringe shifted images have been obtained, all of the images are then sent back to the image analyser 31 for analysis at step 212. As will be understood, data concerning the position and orientation that the probe 4 was at when each image was obtained will be provided to the image analyser 31 along with each image, such that 3D coordinates of the object 28 relative to the CMM 2 can be obtained as explained in more detail below. The process then ends at step 214.
As explained above, the capture perspective image set process 104 is repeated a plurality of times for a plurality of different perspectives. In this described example, the capture perspective image set process is performed three times, for first, second and third perspectives. The probe 4 is moved to each perspective either under the control of the user or controller as explained above.
With reference to FIG. 6, the process 210 for obtaining the fringe shifted images will now be described. The fringes projected on the object 28 are shifted by physically moving the probe 4 by a small distance in a direction such that the position of the fringes on the object 28 is different from the previous position. As the probe 4 is shifted, the projector 40 within it, and hence the projector's optical axis 61, will also be shifted relative to the object 28. This is what provides the change in position of the fringes on the object 28.
In one embodiment, the probe 4 is moved in a direction that is parallel to the imaging device's 44 image plane and perpendicular to the length of the fringes.
However, this need not necessarily be the case, so long as the position of the fringes on the object is moved. For example, the fringe shifting could be achieved by rotating the probe 4. For instance, the probe 4 could be rotated about an axis extending perpendicular to the projector's image plane 60. Optionally the probe could be rotated about an axis extending perpendicular to the imaging device's 44 image plane. In another preferred embodiment the probe 4 can be rotated about the imaging device's 44 perspective centre. This is advantageous because it ensures that the perspective of the features captured by the imaging device 44 across the different images will be the same. It also enables any processing of the images to compensate for relative movement of the object and image sensor to be done without knowledge of the distance between the object and image sensor.
For example, with reference to FIG. 17 the probe 4 is located at a first position (referred to by reference numeral 4′) relative to an object 70 to be inspected. At this instance the probe's projector 40 is at a first position (referred to by reference numeral 40′) which projects a fringe pattern illustrated by the dotted fringe markings 72′ on the object 70. An image 74 of the object with the fringe markings 72′ is captured by the imaging device 44 which is at a first position referred to by reference numeral 44′.
The probe 4 is then moved to a second position, referred to by reference numeral 4″, by rotating the probe 4 relative to the object 70 about the imaging device's perspective centre. As will be understood, an imaging device's perspective centre is the point through which all light rays that intersect with the image plane are assumed to pass. In the figure shown, the perspective centre is referred to by reference numeral 76.
As can be seen, at the second position the projector, referred to by reference numeral 40″, has moved such that the position of the fringe pattern on the object 70 has moved. The new position of the fringe pattern on the object 70 is illustrated by the striped fringe markings 72″ on the object 70. An image 74 of the object is captured by the imaging device at its second position 44″. As can be seen, although the position of the image of the object on the imaging device 44 has changed between the first 44′ and second 44″ positions of the imaging device, the perspective the imaging device 44 has of the object 70 does not change between the positions. Accordingly, for example, features that are hidden due to occlusion in one image will also be hidden due to occlusion in the second. This is illustrated by the rays 78 illustrating the view the imaging device 44 has of the tall feature 80 on the object. As can be seen, because the imaging device 44 is rotated about its perspective centre, the rays 78 are identical for both positions and so only the location of the feature on the imaging device 44 changes between the positions, not the form of the feature itself.
Accordingly, rotating about the perspective centre can be advantageous as the image sensor's perspective of the object does not change thereby ensuring that the same points on the object are visible for each position. Furthermore, for any point viewed, the distance between the image points of it before and after the relative rotation of camera and object is independent of the distance to the object. That is, for an unknown object, if the camera is rotated about its own perspective centre it is possible to predict, for each imaged point before the rotation, where it will be imaged after rotation. The position of an image point after the rotation depends on the position of the initial image point, the angle (and axis) of rotation, and the internal camera parameters—all known values. Accordingly, as is described in more detail below, rotating about the perspective centre allows the relative motion to be compensated for without knowing the distance to the object.
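By way of illustration only, this prediction can be written compactly using the standard pinhole-camera model: for a pure rotation about the perspective centre, image points map under the homography H = K R K⁻¹, where K is the matrix of internal camera parameters and R the rotation, with no dependence on the distance to the object. The following is a minimal sketch of that relationship; the matrices and the rotation convention are illustrative assumptions rather than calibration data from the embodiment.

```python
import numpy as np

def predict_image_point(x, y, K, R):
    """Predict where the point imaged at pixel (x, y) appears after the camera
    is rotated about its own perspective centre.

    K is the 3x3 matrix of internal camera parameters.  R is taken here as the
    3x3 rotation mapping ray directions expressed in the first camera frame to
    the same rays expressed in the second camera frame.  The mapping is the
    homography H = K R K^-1 and does not depend on the distance to the object."""
    H = K @ R @ np.linalg.inv(K)
    u, v, w = H @ np.array([x, y, 1.0])   # homogeneous image coordinates
    return u / w, v / w
```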
The probe 4 is moved a distance corresponding to a fringe shift of ¼ period at the point where the imaging device's 44 optical axis 63 intersects the reference plane 64. As will be understood, the actual distance the probe 4 is moved will depend on the period of the fringes projected and other factors such as the magnification of the projector 40.
Once the probe 4 has been shifted, another measurement image is obtained at step 302. The steps of shifting the probe 300 and obtaining a measurement image 302 are repeated two more times. Each time, the probe is shifted so that for each measurement image the position of the fringe pattern on the object is different from that in all previous images. Accordingly, at the end of the obtain fringe shifted images process 210 four images of the object have been obtained for a given perspective, with the position of the fringe pattern on the object for each image being slightly different.
Reference is now made to FIG. 2. Row A shows the view of the object 28 at each of the three perspectives with no fringes projected onto it. Row B illustrates, for each of the first, second and third perspectives the image 1000 that will be obtained by the imaging device 44 at step 206 of the process for capturing a perspective image set 104. Schematically shown behind each of those images 1000 are the fringe shifted images 1002, 1004 and 1006 which are obtained during execution of steps 300 and 302 for each of the first, second and third perspectives. FIGS. 14(a) to 14(d) show an example of the images 1000-1006 obtained for the first perspective. As shown, the relative position of the object and imaging device has moved slightly between obtaining each image in an image set for a perspective, and this needs to be taken into consideration and/or compensated for during processing of the images as described in more detail below (especially as described in connection with FIG. 8).
Accordingly, once the step 104 of capturing the first, second and third image sets has been completed, the image analyser 31 will have a set of images 1000-1006 for each of the first, second and third perspectives.
The process 106 for analysing the images will now be described with reference to FIG. 7. The process begins at step 400 at which point four wrapped phase maps are calculated for each of the first, second and third perspectives. As will be understood, a wrapped phase map is a map which contains the phase of the fringes projected onto the object's surface for a plurality of pixels in one of the measurement images in a perspective image set, where the phase angle is bound within a range of 360 degrees.
For a given perspective, a wrapped phase map is obtained using each of the four phase shifted images for that perspective in a particular order. The four wrapped phase maps for a given perspective are obtained by using each of the four phase shifted images in different orders. The method for obtaining a wrapped phase map will be explained in more detail below with reference to FIG. 8.
As will be understood, it need not be necessary to calculate four wrapped phase maps for each perspective. For instance, two or more wrapped phase maps could be calculated for each of the perspectives. As will be understood, the more wrapped phase maps that are calculated, the more reliable the determination of real discontinuities as explained in more detail below, but the more processing resources required.
Referring to FIG. 3, columns X, Y and Z illustrate for each of the different perspectives four different wrapped phase maps 1010, 1012, 1014 and 1016. Each of those wrapped phase maps for a given perspective has been calculated using a unique order of the four different images 1002-1006 for that perspective. Four different wrapped phase maps 1010-1016 for each perspective are calculated in order to be able to distinguish between those discontinuities caused by features on the object 28 and those discontinuities caused by the wrapping of the phase, as explained in more detail below.
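By way of illustration only, the four wrapped phase maps per perspective can be produced simply by running the same phase calculation on the same four images presented in different orders, as sketched below. The cyclic re-orderings and the `wrapped_phase` function (for example the Carré calculation sketched later in connection with FIG. 8) are illustrative assumptions rather than the only possible choices.

```python
def wrapped_phase_maps(images, wrapped_phase):
    """Return four wrapped phase maps for one perspective.

    `images` is the list of four fringe-shifted images for the perspective and
    `wrapped_phase` is a function taking four images in a given order and
    returning one wrapped phase map.  Using a different (here cyclic) ordering
    for each map makes the phase wrap at different points, which is what allows
    wrapping artefacts to be told apart from real surface discontinuities."""
    orderings = [(0, 1, 2, 3), (1, 2, 3, 0), (2, 3, 0, 1), (3, 0, 1, 2)]
    return [wrapped_phase(*(images[i] for i in order)) for order in orderings]
```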
As can be seen from the images in row B of FIG. 2, a feature, such as an edge or corner on the object 28 causes a discontinuity in the fringe pattern. For example, edge 30 on the object 28 causes a discontinuity in the fringe pattern along line 32 in the image of the object 28 with the fringe projected on it. Accordingly, it is possible to identify features of the object 28 by identifying discontinuities in the fringe pattern.
At step 402, discontinuities in the fringe pattern are identified for each of the perspectives. This is achieved by identifying discontinuities in each of the wrapped phase maps. A discontinuity in a wrapped phase map is identified by comparing the phase value of each pixel to the phase values of adjacent surrounding pixels. If the difference in the phase value between adjacent pixels is above a threshold level, then one of those pixels identifies a discontinuity point. As will be understood, it is not important which one of those pixels is selected as the discontinuity point so long as the selection criteria are consistent for the selection of all discontinuity points, e.g. always select the pixel to the left or to the top of the difference, depending on whether the differences between adjacent pixels are being calculated in the x or y direction along the image. As will be understood, the positions of the discontinuities, once found by the above described method, can be refined if required using image processing techniques, for example by looking at the gradient of the phase, or the gradient of the intensities in the measurement images in the surrounding region, in order to find the location of the discontinuity to sub-pixel accuracy, for example as described in J. R. Parker, “Algorithms for image processing and computer vision”, John Wiley and Sons, Inc (1997).
The preferred threshold level depends on a number of factors including the object shape, level of noise in the image and period of the fringe pattern. The threshold level could be set by a user prior to the operation or could be calculated from an analysis of the image itself.
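A minimal sketch of the adjacent-pixel comparison described above is given below, assuming the wrapped phase map is a NumPy array and the threshold is expressed in the same units as the phase values. The convention of attributing each jump to the left-hand or upper pixel of the pair is one consistent choice, as noted above.

```python
import numpy as np

def discontinuity_mask(phase_map, threshold):
    """Flag pixels where the wrapped phase differs from an adjacent pixel by
    more than `threshold` (same units as `phase_map`, e.g. degrees)."""
    mask = np.zeros(phase_map.shape, dtype=bool)
    dx = np.abs(np.diff(phase_map, axis=1))   # difference with the right-hand neighbour
    dy = np.abs(np.diff(phase_map, axis=0))   # difference with the neighbour below
    # Consistently attribute each jump to the left/upper pixel of the pair.
    mask[:, :-1] |= dx > threshold
    mask[:-1, :] |= dy > threshold
    return mask
```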
For example, referring to the first wrapped phase map 1010 (in FIG. 3) for the first perspective, a discontinuity will be identified between adjacent pixels at point 34 due to the difference in the phase value caused by the distortion along line 32 of the fringe due to the edge 30. This discontinuity will also be identified in the other wrapped phase maps 1012, 1014 and 1016 at the same point 34.
Other discontinuities will also be identified in the wrapped phase maps 1010-1016, such as for example all the way along line 32, which corresponds to the edge 30.
It is possible that the above process could result in false discontinuities being identified due to the phase map being wrapped. For example, adjacent pixels might have phase values of, for instance, close to 0 degrees and 360 degrees respectively. If so, then it would appear as if there has been a large phase jump between those pixels and this would be identified as a discontinuity. However, the phase jump has merely been caused as a result of the wrapping around of the phase, rather than due to a discontinuity in the surface of the object being measured. An example of this can be seen in the first wrapped phase map 1010 for the first perspective at point 36 where the phase values jump from 360 degrees to 0 degrees (illustrated by the dark pixels and light pixels respectively). The phase value for adjacent pixels will jump significantly at point 36 due to the phase map being wrapped.
Accordingly, once all discontinuities have been identified for each of the four wrapped phase maps for a given perspective, then falsely identified discontinuities are removed at step 404. This is achieved by comparing the discontinuities for each of the wrapped phase maps for a given perspective, and only keeping the discontinuities that appear in at least two of the four wrapped phase maps. As will be understood, a more stringent test could be applied by, for example, only keeping the discontinuities that appear in three or four of the wrapped phase maps. This can help overcome problems caused by noise on the images. This process 404 is performed for each of the first to third perspective image sets.
For example, as mentioned above a discontinuity would have been identified at point 36 in the first wrapped phase map 1010 for the first perspective. However, when looking at the other wrapped phase maps 1012 to 1016 for the first perspective, a discontinuity would not have been identified at that same point 36. This is because the different wrapped phase maps have been calculated using a different order of the fringe shifted images 1000 to 1006, thereby ensuring that the phase wrapping in the wrapped phase maps occurs at different points. Accordingly, as the discontinuity identified at point 36 in the first wrapped phase map 1010 is not also identified in the other wrapped maps 1012 to 1016, then that discontinuity can be discarded.
However, as the discontinuity at point 34 in the first wrapped phase map 1010 has been confirmed by discontinuities identified at the same point 34 in all the other wrapped phase maps 1012 to 1016, point 34 is identified as a real discontinuity, i.e. a discontinuity caused by a feature on the object 28, rather than as a result of phase wrapping.
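The comparison across the wrapped phase maps of one perspective can be expressed as a simple vote, as sketched below. Requiring agreement in at least two of the four maps follows the example above; the stricter three- or four-map test described earlier corresponds simply to a larger `min_votes` value.

```python
import numpy as np

def confirmed_discontinuities(masks, min_votes=2):
    """Keep only discontinuity points flagged in at least `min_votes` of the
    discontinuity masks computed from the wrapped phase maps of one perspective."""
    votes = np.sum(np.stack(masks, axis=0), axis=0)
    return votes >= min_votes
```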
At step 406, corresponding discontinuity points between each of the perspectives are identified. Corresponding discontinuity points are those points in the wrapped phase maps which identify a discontinuity caused by the same feature on the object 28. For example, discontinuity point 38 on each of the first wrapped phase maps 1010 for each of the first, second and third perspectives all identify the same corner 39 on the object 28. Corresponding discontinuity points can be determined by known matching techniques and, for example, utilising epipolar geometry. Such known techniques are described, for example in A. Gruen, “Least squares matching: a fundamental measurement algorithm” in K. B. Atkinson (ed.), “Close range photogrammetry and machine vision”, Whittles Publishing (2001). The correlated discontinuity points can then be used as target points, the 3D coordinates of which relative to the probe 4 can be determined at step 408 by known photogrammetry techniques, such as those described in, for example, M. A. R Cooper with S. Robson, “Theory of close-range photogrammetry” in K. B. Atkinson (ed.), “Close range photogrammetry and machine vision”, Whittles Publishing (2001).
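By way of illustration only, one common way of computing the 3D position of a matched target point from two known perspectives is linear least-squares (direct linear transformation) triangulation using the camera projection matrices, as sketched below. This is a generic sketch of the principle rather than the particular photogrammetric formulation referenced above.

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Triangulate one target feature matched between two perspectives.

    P1 and P2 are the 3x4 projection matrices of the imaging device at the two
    known positions/orientations; pt1 and pt2 are the matched (x, y) image
    coordinates of the feature.  Returns the 3D point in the frame in which the
    projection matrices are expressed."""
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # least-squares solution is the last right singular vector
    X = vt[-1]
    return X[:3] / X[3]           # homogeneous -> Euclidean coordinates
```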
Accordingly, after step 408 a number of discrete points on the object 28 will have been identified and their position relative to the probe 4 measured.
At step 410, a height map for a continuous section of the object 28 is calculated. A height map provides information on the height of the surface above a known reference plane 64 relative to the probe 4. A continuous section is an area of the object enclosed by discontinuous features, e.g. the face of a cube which is enclosed by four edges. Continuous sections can be identified by identifying those areas in the wrapped phase map which are enclosed by discontinuity points previously identified in steps 402 to 406. The height map provides measurement data on the shape of the surface between those discrete points. Methods for obtaining the height map for a continuous section are described below in more detail with respect to FIGS. 9 and 10. Step 410 could be performed a plurality of times for different continuous sections for one or more of the different perspectives.
As is usual in similar fringe analysis systems, the unwrapped phase map is correct only to some unknown multiple of 2π radians, and therefore the height above the reference plane 64 may be wrong by whatever height corresponds to this unknown phase difference. This is often called 2π ambiguity. The measured 3D coordinates of the real discontinuities obtained in step 408 are used in order to resolve these ambiguities.
At this stage, the 3D coordinates of the real discontinuity points obtained in step 408 and the height map data obtained in step 410 provide the position of the object relative to a predetermined reference point in the probe 4. Accordingly, at step 412, these coordinates are converted to 3D coordinates relative to the CMM 2. This can be performed using routine trigonometry techniques as the relative position of the CMM 2 and the reference point in the probe 4 is known from calibration, and also because the position and orientation of the probe 4 relative to the CMM 2 at the point each image was obtained was recorded with each image.
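For illustration, the conversion of step 412 amounts to applying, for each image, the rigid-body transform given by the recorded probe position and orientation. The following is a minimal sketch, assuming the pose of the probe reference frame is available as a rotation matrix and translation vector expressed in CMM coordinates.

```python
import numpy as np

def probe_to_cmm(points_probe, R_probe, t_probe):
    """Transform points measured relative to the probe reference point into CMM
    coordinates.  R_probe (3x3) and t_probe (3-vector) describe the orientation
    and position of the probe reference frame in CMM coordinates at the moment
    the corresponding image was taken."""
    points_probe = np.asarray(points_probe)   # shape (N, 3)
    return points_probe @ R_probe.T + t_probe
```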
The process for calculating a wrapped phase map 400 will now be described with reference to FIG. 8. Calculating a wrapped phase map comprises calculating the phase for each pixel for one of a set of fringe-shifted images. This can be done using various techniques, the selection of which can depend on various factors including the method by which the fringe-shifted images are obtained. Standard phase-shifting algorithms rely on the relative position between the object and imaging device 44 being the same across all of the fringe-shifted images. However, if either of the methods described above (e.g. either moving the probe 4 laterally or rotating it about the imaging device's perspective centre) is used to obtain the fringe-shifted images then the imaging device 44 will have moved a small distance relative to the object. Accordingly, for each successive image in a perspective image set, a given pixel in each image will be identifying the intensity of a different point on the object. Accordingly, if standard phase-shifting algorithms are to be used it is necessary to identify across all of the fringe shifted images which pixels correspond to the same point on the object, and to then compensate for this. One way of doing this when the imaging device 44 has moved laterally is to determine by how much and in what direction the imaging device 44 has traveled between each image, and by then cropping the images so that each image contains image data common to all of them. For example, if the movement of the imaging device 44 between two images means that a point on an object has shifted five pixels in one dimension, then the first image can be cropped to remove five pixel widths worth of data.
This can be seen more clearly with reference to FIG. 15 which schematically illustrates corresponding rows of pixels for each of the first 1000, second 1002, third 1004 and fourth 1006 images. As can be seen, due to relative movement of the imaging device 44 and the object 28 between the images, the same point on an object is imaged by different pixels in each image. For instance, point X on the object 28 is imaged by the 7th pixel from the left for the first image 1000, the 5th pixel from the left for the second image 1002, the 3rd pixel from the left for the third image 1004 and the 4th pixel from the left for the fourth image 1006. An effective way of compensating for the relative movement of image sensor and object 28 is to crop the image data such that each image 1000-1006 contains data representing a common region, such as that highlighted by window 51 in FIG. 15.
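A sketch of this cropping compensation is given below, assuming the integer pixel shift of each image relative to the first image is known from the recorded probe motion; sub-pixel shifts and more general coordinate transformations are not handled in this simple illustration.

```python
import numpy as np

def align_by_cropping(images, shifts):
    """Crop a set of fringe-shifted images so that, after cropping, the same
    pixel indices in every image view the same region of the object.

    `shifts[k]` is the integer (row, col) displacement of the object's image in
    image k relative to image 0 (so shifts[0] is (0, 0)).  Returns the cropped
    images, all of the same size."""
    shifts = np.asarray(shifts)
    rows, cols = images[0].shape
    # Largest window, in image-0 coordinates, that stays inside every image.
    r0 = max(0, -shifts[:, 0].min())
    r1 = rows - max(0, shifts[:, 0].max())
    c0 = max(0, -shifts[:, 1].min())
    c1 = cols - max(0, shifts[:, 1].max())
    return [img[r0 + dr:r1 + dr, c0 + dc:c1 + dc]
            for img, (dr, dc) in zip(images, shifts)]
```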
Cropping the images is one example of a coordinate transformation, where the transformation is a linear function. This can be most accurate in situations where the distance to the object is known, or, for instance, where the stand-off distance is large compared to the depth of the measuring volume. As will be understood, and with reference to FIG. 18, the stand-off distance is the distance from the imaging device's perspective centre 76 to the centre of the imaging device's measurement volume and the depth of field 65 or depth of measurement volume is the range over which images recorded by the device appear sharp. In other words, the stand-off distance is the nominal distance from the probe 4 to the object to be measured. For instance, if the ratio of stand-off distance to depth of measuring volume is around 10:1 then there can be an error of up to 10% in the compensation for some pixels. If either the stand-off distance is not large compared to the depth of the measuring volume, or if the relative motion is not a linear translation, then the most appropriate coordinate transformation to compensate for relative motion of the imaging device and the object can depend, in general, on the distance to the object and the actual motion. However, it has been found that if the motion is rotation about the imaging device's 44 perspective centre then the coordinate transformation that best compensates for the motion is independent of the unknown distance to the object. This is due to the geometry of the system and the motion. Furthermore, this enables accurate compensation to be performed even if the stand-off distance is not large compared to the depth of the measuring volume, for instance in situations in which the ratio of stand-off distance to depth of measuring volume is less than 10:1, for example less than 5:1, for instance 1:1. Accordingly, this enables measurement of an object to be performed even when the probe is located close to the object.
Once the pixel data has been compensated for the relative motion so that the same pixel in each adjusted image represents the same point on the object, the next step 502 involves using a phase-shifting algorithm to calculate the wrapped phase at each pixel. A suitable phase-shifting algorithm not requiring known phase shift, for instance the Carré algorithm, may be used to calculate the wrapped phase, phase shift and modulation amplitude.
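As will be understood, several formulations of the Carré calculation exist. The sketch below shows one common form of the wrapped-phase part, assuming four motion-compensated images with an equal (but unknown) phase step between them; sign and quadrant conventions vary between references, so this is an illustrative sketch rather than the exact calculation of the embodiment.

```python
import numpy as np

def carre_wrapped_phase(i1, i2, i3, i4):
    """Wrapped phase (radians, in (-pi, pi]) from four fringe-shifted images
    with an equal, unknown phase step, using one common form of the Carré
    algorithm.  The inputs are NumPy arrays of the same shape."""
    num = (i1 - i4) + (i2 - i3)
    den = (i2 + i3) - (i1 + i4)
    root = np.sqrt(np.abs((3.0 * (i2 - i3) - (i1 - i4)) * num))
    # arctan2 resolves the quadrant; the sign of `num` carries the sign of sin(phase).
    return np.arctan2(np.sign(num) * root, den)
```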
The process for calculating a wrapped phase map 400 is repeated three further times for each perspective image set, each time using the phase shifted images in a different order, so as to obtain four wrapped phase maps for each perspective. Accordingly, the process for calculating a wrapped phase map 400 is performed twelve times in total.
A first process for obtaining the height map 410 will now be described with reference to FIG. 9. The method involves at step 600 unwrapping the continuous section of one of the phase maps by adding integer multiples of 360 degrees to the wrapped phase of individual pixels as required to remove the discontinuities found due to the phase calculation algorithm. The method then involves converting the unwrapped phase map to a height map for that continuous section at step 602. The phase for a pixel is dependent on the relative height of the surface of the object. Accordingly, it is possible, at step 602 to create a height map for the continuous section from that phase by directly mapping the phase value of each pixel to a height value using a predetermined mapping table and procedure.
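A minimal sketch of steps 600 and 602 is given below, assuming a simple row-wise unwrap and a linear phase-to-height scale factor standing in for the predetermined mapping table. Unwrapping an arbitrary continuous 2D section, and resolving the remaining 2π ambiguity from the measured discontinuity points, are more involved in practice.

```python
import numpy as np

def height_map_from_phase(wrapped_phase, phase_to_height):
    """Unwrap the wrapped phase (radians) of a continuous section row by row and
    map it to height above the reference plane.  `phase_to_height` is a simple
    linear scale factor standing in for the predetermined mapping."""
    unwrapped = np.unwrap(wrapped_phase, axis=1)   # add multiples of 2*pi where needed
    return phase_to_height * unwrapped
```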
In contrast to the methods for calculating a wrapped-phase map described above in connection with FIG. 8, i.e. in which the image coordinates are compensated for, it has been found that there is another way to calculate the wrapped phase when the object and imaging device 44 are moved relative to each other which does not require image coordinate compensation. This method relies on the fact that a pixel of the imaging device's 44 CCD will be viewing a different point on the object for each different image. If the points viewed by a single pixel in multiple images are at different distances to the imaging device 44, then a different phase will be recorded at that pixel in each image. That is, the phase of the fringe pattern at that pixel will be shifted between each image. The actual phase shift will depend on the distance to the object and on the gradient of the object, as well as the known relative motion of the imaging device 44 and object and the fixed system parameters. The phase shift will therefore vary across the image.
As an example, with reference to FIG. 16, consider an object point Xp, imaged at x in the camera plane. If the imaging device 44 is translated by some vector δX with respect to the plane, then the point imaged by the imaging device 44 will change, as shown. For clarity, the projector 40 is omitted from the diagram, but it is to be understood that the imaging device 44 and projector 40 are fixed with respect to each other.
h is the distance from the imaging device's 44 perspective centre to the object point imaged at x, and δh is the change in this distance after translation δX. a is the known direction of the imaging device's optic axis, and Xc is the position of the perspective centre, also known. The change in h due to the motion of the imaging device 44 only is equal to δX·a. If this quantity is zero, so that the motion is perpendicular to the imaging device axis and parallel to the image plane, then any remaining change in h must be due to the object shape.
The change in h is actually recorded as a change in phase, δφ, where, again, this will consist of a component caused by the shape of the object, and a component caused by any motion of the imaging device parallel to its axis.
To measure the phase at a given pixel, we take multiple phase shifted images. The intensity recorded at a pixel in image k can be expressed as
Ik = A + B cos φk
where:
    • A=offset (i.e. the average intensity of the fringe pattern projected onto the object as recorded by that pixel, including any background light);
    • B=amplitude modulation of the light intensity recorded by that pixel; and
    • φk = the phase recorded at that pixel in image k, where φk = φk−1 + Δφk ≈ φk−1 + ∇φk−1 · δXk for k > 0, using a first order Taylor series expansion, which assumes that the translation δX is small.
The Carré algorithm is used to calculate, for each pixel in a given image in an image set, the phase, phase shift and modulation amplitude from the four phase-shifted images. The Carré algorithm assumes that the four shifts in phase are equal. This will be the case, for instance, if the motion used is a translation and the surface is planar. If this is not the case then a good approximation can be obtained by choosing a motion that is small enough that the surface gradient does not vary significantly over the scale of the motion.
The phase data can be converted to height data. Optionally the phase shift data can be converted to gradient data and subsequently to height data using the method described below in connection with FIG. 10.
The above described method provides optimum results when the object's reflectivity and surface gradient are substantially constant on the scale of the relative motion. Accordingly, it can be preferred that the motion between the images in an image set is small. Areas of the surface at too low or too high a gradient relative to the imaging device, or with a high degree of curvature, can be detected by inspecting the modulation amplitude returned by the Carré algorithm, and can subsequently be measured by changing the relative motion used to induce the phase shift and if necessary by viewing the object from a different perspective.
A Carré algorithm provides both phase and phase shift data for each pixel in an image. The methods described above in connection with FIG. 9 use the phase data to obtain the height data. However, it is also possible to obtain the height information using the phase-shift data. In particular, a second process for obtaining the height map 410 will now be described with reference to FIG. 10. This method begins at step 700 by, for a continuous section (which is identifiable from the discontinuities previously identified as explained above), calculating a phase shift map using a Carré algorithm on all of the images in a perspective image set. The phase shift for a pixel is dependent on the gradient of the surface of the object and how far away the object is from the probe 4. Accordingly, it is possible, at step 702 to create a gradient map for the continuous section from that phase shift by directly mapping the phase shift value of each pixel to a gradient value using a predetermined mapping table and procedure. At step 704, the gradient map is then integrated in order to get a height map for the continuous surface relative to the probe 4. The measured 3D coordinates of the real discontinuities obtained in step 408 are used in order to resolve the constant of integration to find the height above the reference plane 64.
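The integration of step 704 can be sketched as a cumulative sum along the direction of the recovered gradient, with the constant of integration fixed from a point whose height is already known, for example one of the measured discontinuities. The pixel pitch and the row-wise integration direction in the sketch below are illustrative assumptions only.

```python
import numpy as np

def height_from_gradient(gradient_map, pixel_pitch, known_first_column_height):
    """Integrate a per-pixel surface gradient map (dz/dx) along each row to a
    relative height map, then fix the constant of integration so that the first
    column takes the known height value(s)."""
    relative = np.cumsum(gradient_map * pixel_pitch, axis=1)
    return relative - relative[:, :1] + known_first_column_height
```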
It is an advantage of the invention that the projector may consist simply of a grating, light source, and focussing optics. There is no need for any moving parts within the projector or for a programmable projector—only one pattern is required to be projected. Furthermore, no information about the distance to the object is required, except that it (or a section of it) is within the measuring volume—there is no requirement to have a large stand-off distance compared to the measurement volume. Furthermore, the motion between the object and probe unit need not necessarily be in any particular direction, and may be produced by a rotation rather than a translation or a combination of the two.
As will be understood, the above provides a detailed description of just one particular embodiment of the invention and many features are merely optional or preferable rather than essential to the invention.
For instance, in the described embodiments the probe is mounted on a mounting structure equivalent to the quill of a CMM. This invention is also suitable for use with a measurement device mounted on other machine types. For example, the probe 4 could be mounted on a machine tool. Further, the probe 4 may be mounted onto the distal end of an inspection robot, which may for example comprise a robotic arm having several articulating joints. Furthermore, the probe 4 might be in a fixed position and the object could be moveable, for example via a positioning machine.
As will be understood, the description of the specific embodiment also involves obtaining and processing images to obtain topographical data via phase analysis of a periodic optical pattern. As will be understood, this need not necessarily be the case. For example, techniques such as triangulation might be used instead of using phase-stepping algorithms. Further still, if phase-stepping methods are to be used, the shift in the pattern could be obtained using techniques other than that described above. For instance, they could be obtained by changing the pattern projected by the projector, or by moving the object.
The description of the specific embodiment also involves obtaining and processing images to obtain photogrammetrical target points by identifying discontinuities in the pattern projected onto the object. As will be understood, this need not necessarily be the case. For example, target points can be identified using other known methods. For instance, target points can be identified by markers placed on the object or by projecting a marker onto the object.
Furthermore, the description describes using the same images for identifying target features as well as for obtaining topographical data. However, this need not necessarily be the case as, for instance, separate images could be obtained for use in the different processes. In this case, if target features are identified using markers stuck on or projected onto the object then it would not be necessary to project a pattern during the obtaining of images for use in identifying target features.
Further still, although the invention is described as a single probe containing a projector and imaging device, the projector and image sensor could be provided separately (e.g. so that they can be physically manipulated independently of each other). Furthermore, the probe could comprise a plurality of imaging devices.

Claims (36)

The invention claimed is:
1. A non-contact measurement apparatus, comprising:
a probe configured to be mounted on a coordinate positioning apparatus, comprising an imaging device for capturing an image of an object to be measured;
a processor configured to:
a) analyse at least one first image of an object obtained by the imaging device from a first perspective and at least one second image of the object obtained by the imaging device, which is the same imaging device used to obtain the at least one first image of the object, from a second perspective so as to identify in each of the at least one first image and the at least one second image of the object at least one common photogrammetric target feature on the object to be measured, determine the two-dimensional coordinates of the at least one common photogrammetric target feature on the object within each image, and then, based on knowledge of the relative location and orientation of the imaging device that took the images, determine the three dimensional coordinates of the at least one common photogrammetric target feature; and
b) obtain topographical data regarding a form of a surface of the object via analysis of the distortion of a structured light pattern projected on the object caused by height variation on the surface of the object as imaged in at least one image, obtained by the imaging device, which is the same imaging device used to obtain the at least one first image and the at least one second image of the object,
wherein the non-contact measurement apparatus being further configured to use both the data obtained from a) and b) to provide a 3D point cloud that describes the shape of the object.
2. A non-contact measurement apparatus as claimed in claim 1, in which the probe comprises at least one projector for projecting an optical pattern onto the surface of the object to be measured.
3. A non-contact measurement apparatus as claimed in claim 1, in which the processor is configured to obtain the topographical data regarding the surface of the object via analysis of at least one of the at least one first image and the at least one second image.
4. A non-contact measurement apparatus as claimed in claim 1, in which the processor is configured to process a set of images in which the position of an optical pattern on the object is different for each image in the set in order to determine the topographical data.
5. A non-contact measurement apparatus as claimed in claim 1, in which the processor is configured to identify an irregularity in an optical pattern projected on the object in each of the first and second images as the at least one common photogrammetric target feature.
6. A non-contact measurement apparatus as claimed in claim 5, in which the processor is configured to process:
a set of first images obtained by the imaging device from the first perspective, the position of an optical pattern projected onto the object being different for each image in the set; and
a set of second images obtained by the imaging device from the second perspective, the position of an optical pattern projected onto the object being different for each image in the set, in order to identify the at least one common photogrammetric target feature on the object to be measured and to determine the position of the common photogrammetric target feature on the object relative to an image sensor of the imaging device.
7. A non-contact measurement apparatus as claimed in claim 6, in which the processor is configured to process at least one of the first or second sets of images in order to determine the topographical data.
8. A non-contact measurement apparatus as claimed in claim 7, in which the processor is configured to calculate at least one of a first phase map from the set of first images and a second phase map from the set of second images.
9. A non-contact measurement apparatus as claimed in claim 8, in which the processor is configured to determine the topographical data from at least one of the at least one first phase map and second phase map.
10. A non-contact measurement apparatus as claimed in claim 2, in which the projector has a fixed optical pattern.
11. A device for use in a non-contact measurement apparatus that includes a probe that is configured to be mounted on a coordinate positioning apparatus, having an imaging device for capturing an image of an object to be measured, the device comprising:
a processor configured to:
a) analyse at least one first image of an object obtained by the imaging device from a first perspective and at least one second image of the object obtained by the imaging device, which is the same imaging device used to obtain the at least one first image of the object, from a second perspective so as to identify in each of the at least one first image and the at least one second image of the object at least one common photogrammetric target feature on the object to be measured, determine the two-dimensional coordinates of the at least one common photogrammetric target feature on the object within each image, and then, based on knowledge of the relative location and orientation of the imaging device that took the images, determine the three dimensional coordinates of the at least one common photogrammetric target feature; and
b) obtain topographical data regarding a form of a surface of the object via analysis of the distortion of a structured light pattern projected on the object caused by height variation on the surface of the object as imaged in at least one image, obtained by the imaging device, which is the same imaging device used to obtain the at least one first image and the at least one second image of the object,
wherein the device being further configured to use both the data obtained from a) and b) to provide a 3D point cloud that describes the shape of the object.
12. A non-contact method for measuring an object located within a measurement space using a probe comprising an imaging device, the method comprising:
a) analysing at least one first image of an object obtained by the imaging device from a first perspective and at least one second image of the object obtained by the imaging device, which is the same imaging device used to obtain the at least one first image of the object, from a second perspective so as to identify in each of the at least one first image and the at least one second image of the object at least one common photogrammetric target feature on the object to be measured, determining the two-dimensional coordinates of the at least one common photogrammetric target feature on the object within each image, and then, based on knowledge of the relative location and orientation of the imaging device that took the images, determining the three dimensional coordinates of the at least one common photogrammetric target feature; and
b) obtaining topographical data regarding a form of a surface of the object via analysis of the distortion of a structured light pattern projected on the object caused by height variation on the surface of the object as imaged in at least one image, obtained by the imaging device, which is the same imaging device used to obtain the at least one first image and the at least one second image of the object,
wherein the data obtained from both a) and b) are used to provide a 3D point cloud that describes the shape of the object.
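Claims 11 and 12 combine the sparse three-dimensional target coordinates from a) with the dense topographical data from b) into a single 3D point cloud. The sketch below shows one way this combination might look; pixel_to_xyz is a hypothetical calibration function (not defined in the patent) mapping an image pixel and its measured height to machine coordinates.

```python
import numpy as np

def combine_into_point_cloud(target_points_3d, height_map, pixel_to_xyz):
    """Merge sparse photogrammetric target points (step a) with dense
    topographical data (step b) into one Nx3 point cloud. pixel_to_xyz is an
    assumed calibration function mapping (row, col, height) to 3D coordinates."""
    rows, cols = np.indices(height_map.shape)
    dense = np.array([pixel_to_xyz(r, c, h)
                      for r, c, h in zip(rows.ravel(), cols.ravel(),
                                         height_map.ravel())])
    sparse = np.asarray(target_points_3d, dtype=float).reshape(-1, 3)
    return np.vstack([sparse, dense])
```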
13. A method as claimed in claim 12, in which at least one of the at least one first image of the object from the first perspective and at least one second image of the object from the second perspective comprises the at least one image of the object on which an optical pattern is projected.
14. A method as claimed in claim 12, in which the method comprises relatively moving the object and the imaging device between the first and second perspectives.
15. A method as claimed in claim 12, in which the probe comprises a projector for projecting an optical pattern.
16. A memory and a processor, the memory storing instructions which, when executed by the processor, cause the processor to control the probe comprising the imaging device in accordance with the method of claim 12.
17. A non-transitory computer readable medium storing instructions, which when executed, perform the method of claim 12.
18. A non-contact measurement apparatus as claimed in claim 1, in which the coordinate positioning apparatus is a coordinate measuring machine.
19. A non-contact measurement apparatus as claimed in claim 1, in which the coordinate positioning apparatus is a machine tool.
20. A non-contact measurement apparatus as claimed in claim 1, in which the probe is mounted on an articulated probe head comprising at least one rotational axis.
21. A non-contact measurement apparatus as claimed in claim 20, in which the articulated probe head comprises at least two rotational axes.
22. A non-contact measurement apparatus as claimed in claim 20, in which the coordinate positioning apparatus comprises a base for the object, a frame on which a quill is mounted, which can be moved along three mutually orthogonal axes and on which the articulated probe head is mounted.
23. A method as claimed in claim 12, in which the obtaining topographical data step comprises analysis of at least one of the at least one first image and the at least one second image.
24. A method as claimed in claim 12, in which the method comprises processing a set of images in which the position of an optical pattern on the object is different for each image in the set in order to determine the topographical data.
25. A method as claimed in claim 12, in which the method comprises identifying an irregularity in an optical pattern projected on the object in each of the first and second images as the at least one common photogrammetric target feature.
26. A method as claimed in claim 25, in which the method comprises:
processing a set of first images obtained by the imaging device from the first perspective, the position of an optical pattern projected onto the object being different for each image in the set; and
processing a set of second images obtained by the imaging device from the second perspective, the position of an optical pattern projected onto the object being different for each image in the set,
in order to identify the at least one common photogrammetric target feature on the object to be measured and to determine the position of the common photogrammetric target feature on the object relative to an image sensor of the imaging device.
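Claims 25 and 26 use an irregularity in the projected optical pattern, visible in both the first and second images, as the common photogrammetric target feature. A simplified, heuristic way to locate such irregularities in a phase map computed from a set of phase-shifted images is sketched below; the jump threshold is an assumed tuning parameter and this is not the detection method defined by the patent.

```python
import numpy as np

def find_pattern_irregularities(phase_map, jump_threshold=np.pi / 2):
    """Locate candidate target features as discontinuities in the projected
    pattern: pixels where the unwrapped phase changes abruptly between
    neighbours. Returns an array of (row, col) pixel coordinates."""
    unwrapped = np.unwrap(np.unwrap(phase_map, axis=0), axis=1)
    dy = np.abs(np.diff(unwrapped, axis=0))   # vertical phase jumps
    dx = np.abs(np.diff(unwrapped, axis=1))   # horizontal phase jumps
    mask = np.zeros(phase_map.shape, dtype=bool)
    mask[:-1, :] |= dy > jump_threshold
    mask[:, :-1] |= dx > jump_threshold
    return np.argwhere(mask)
```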
27. A method as claimed in claim 26, in which the method comprises processing at least one of the first or second sets of images in order to determine the topographical data.
28. A method as claimed in claim 27, in which the method comprises calculating at least one of a first phase map from the set of first images and a second phase map from the set of second images.
29. A method as claimed in claim 28, in which the method comprises determining the topographical data from at least one of the at least one first phase map and second phase map.
30. A method as claimed in claim 15, in which the projector has a fixed optical pattern.
31. A method as claimed in claim 12, in which the probe is mounted on a coordinate positioning apparatus.
32. A method as claimed in claim 31, in which the coordinate positioning apparatus is a coordinate measuring machine.
33. A method as claimed in claim 31, in which the coordinate positioning apparatus is a machine tool.
34. A method as claimed in claim 31, in which the coordinate positioning apparatus comprises a base for the object, a frame on which a quill is mounted, which can be moved along three mutually orthogonal axes and on which an articulated probe head is mounted.
35. A method as claimed in claim 12, in which the probe is mounted on an articulated probe head comprising at least one rotational axis.
36. A method as claimed in claim 35, in which the articulated probe head comprises at least two rotational axes.
US14/562,093 2007-08-17 2008-08-15 Non-contact probe Active 2030-01-01 USRE46012E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/562,093 USRE46012E1 (en) 2007-08-17 2008-08-15 Non-contact probe

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
GBGB0716080.7A GB0716080D0 (en) 2007-08-17 2007-08-17 Non-contact measurement apparatus and method
GB0716109.4 2007-08-17
GBGB0716109.4A GB0716109D0 (en) 2007-08-17 2007-08-17 Non-contact measurement apparatus and method
GBGB0716088.0A GB0716088D0 (en) 2007-08-17 2007-08-17 Non-contact measurement apparatus and method
GB0716080.7 2007-08-17
GB0716088.0 2007-08-17
US14/562,093 USRE46012E1 (en) 2007-08-17 2008-08-15 Non-contact probe
US12/733,021 US8605983B2 (en) 2007-08-17 2008-08-15 Non-contact probe
PCT/GB2008/002760 WO2009024758A1 (en) 2007-08-17 2008-08-15 Non-contact probe

Publications (1)

Publication Number Publication Date
USRE46012E1 true USRE46012E1 (en) 2016-05-24

Family

ID=39876207

Family Applications (4)

Application Number Title Priority Date Filing Date
US12/733,021 Ceased US8605983B2 (en) 2007-08-17 2008-08-15 Non-contact probe
US12/733,025 Active 2030-05-29 US8923603B2 (en) 2007-08-17 2008-08-15 Non-contact measurement apparatus and method
US14/562,093 Active 2030-01-01 USRE46012E1 (en) 2007-08-17 2008-08-15 Non-contact probe
US12/733,022 Active 2030-06-15 US8792707B2 (en) 2007-08-17 2008-08-15 Phase analysis measurement apparatus and method

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US12/733,021 Ceased US8605983B2 (en) 2007-08-17 2008-08-15 Non-contact probe
US12/733,025 Active 2030-05-29 US8923603B2 (en) 2007-08-17 2008-08-15 Non-contact measurement apparatus and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/733,022 Active 2030-06-15 US8792707B2 (en) 2007-08-17 2008-08-15 Phase analysis measurement apparatus and method

Country Status (5)

Country Link
US (4) US8605983B2 (en)
EP (4) EP2183546B1 (en)
JP (4) JP5689681B2 (en)
CN (3) CN101821578B (en)
WO (3) WO2009024756A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11454106B2 (en) 2017-06-15 2022-09-27 Drillscan France Sas Generating drilling paths using a drill model

Families Citing this family (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2142878B1 (en) * 2007-04-30 2018-09-26 Renishaw PLC Analogue probe with temperature control and method of operation
WO2008153127A1 (en) * 2007-06-15 2008-12-18 Kabushiki Kaisha Toshiba Instrument for examining/measuring object to be measured
CN101821578B (en) 2007-08-17 2014-03-12 瑞尼斯豪公司 Non-contact measurement apparatus and method
DE102007054906B4 (en) * 2007-11-15 2011-07-28 Sirona Dental Systems GmbH, 64625 Method for optical measurement of the three-dimensional geometry of objects
GB0809037D0 (en) 2008-05-19 2008-06-25 Renishaw Plc Video Probe
US9482755B2 (en) 2008-11-17 2016-11-01 Faro Technologies, Inc. Measurement system having air temperature compensation between a target and a laser tracker
US7991575B2 (en) 2009-01-08 2011-08-02 Trimble Navigation Limited Method and system for measuring angles based on 360 degree images
WO2010094949A1 (en) * 2009-02-17 2010-08-26 Absolute Robotics Limited Measurement of positional information for a robot arm
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009015920B4 (en) 2009-03-25 2014-11-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
JP4715944B2 (en) * 2009-04-03 2011-07-06 オムロン株式会社 Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and three-dimensional shape measuring program
GB0909635D0 (en) * 2009-06-04 2009-07-22 Renishaw Plc Vision measurement probe
GB0915904D0 (en) * 2009-09-11 2009-10-14 Renishaw Plc Non-contact object inspection
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
DE102009057101A1 (en) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Device for optically scanning and measuring an environment
DE102009056177B4 (en) * 2009-11-27 2014-07-10 Siemens Aktiengesellschaft Device and method for measuring and compensating for the effects of movement in phase-shifting profilometers and its application to mobile, freehand-guided profilometry
DE102009054591A1 (en) * 2009-12-14 2011-06-16 Robert Bosch Gmbh Measuring tool for detecting a contour of an object
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US20130222816A1 (en) * 2010-01-20 2013-08-29 Faro Technologies, Inc. Coordinate measuring machine having an illuminated probe end and method of operation
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
JP2013517508A (en) 2010-01-20 2013-05-16 ファロ テクノロジーズ インコーポレーテッド Multifunctional coordinate measuring machine
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9772394B2 (en) 2010-04-21 2017-09-26 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US9400170B2 (en) 2010-04-21 2016-07-26 Faro Technologies, Inc. Automatic measurement of dimensional data within an acceptance region by a laser tracker
US9377885B2 (en) 2010-04-21 2016-06-28 Faro Technologies, Inc. Method and apparatus for locking onto a retroreflector with a laser tracker
US8619265B2 (en) 2011-03-14 2013-12-31 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
JP5657276B2 (en) * 2010-05-14 2015-01-21 一般社団法人モアレ研究所 Shape measuring apparatus and shape measuring method
US10010268B2 (en) * 2010-09-15 2018-07-03 Olympus Corporation Endoscope apparatus
DE102010038177B3 (en) * 2010-10-14 2012-04-19 Deutsches Zentrum für Luft- und Raumfahrt e.V. Optical measuring method for measuring grey-scale images of phase object, involves comparing recorded images with reference image of two-dimensional stochastic dot patterns at condition of object, which influences images of dot patterns
WO2012057283A1 (en) * 2010-10-27 2012-05-03 株式会社ニコン Shape measuring device, shape measuring method, structure manufacturing method, and program
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
US8755043B2 (en) * 2010-11-19 2014-06-17 Koh Young Technology Inc. Method of inspecting a substrate
JP5172040B2 (en) * 2010-12-17 2013-03-27 パナソニック株式会社 Surface shape measuring method and surface shape measuring apparatus
US20120194651A1 (en) * 2011-01-31 2012-08-02 Nikon Corporation Shape measuring apparatus
GB2518543A (en) 2011-03-03 2015-03-25 Faro Tech Inc Target apparatus and method
EP2505959A1 (en) 2011-03-28 2012-10-03 Renishaw plc Coordinate positioning machine controller
US9686532B2 (en) 2011-04-15 2017-06-20 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
JP2014516409A (en) 2011-04-15 2014-07-10 ファロ テクノロジーズ インコーポレーテッド Improved position detector for laser trackers.
US9482529B2 (en) 2011-04-15 2016-11-01 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US9164173B2 (en) 2011-04-15 2015-10-20 Faro Technologies, Inc. Laser tracker that uses a fiber-optic coupler and an achromatic launch to align and collimate two wavelengths of light
GB201107053D0 (en) 2011-04-27 2011-06-08 Univ Sheffield Improvements in providing image data
US9280848B1 (en) * 2011-10-24 2016-03-08 Disney Enterprises Inc. Rendering images with volumetric shadows using rectified height maps for independence in processing camera rays
JP5798823B2 (en) * 2011-07-22 2015-10-21 株式会社ミツトヨ Abscissa calibration jig and abscissa calibration method for laser interference measuring apparatus
EP2737426B1 (en) * 2011-07-29 2019-04-03 Hexagon Metrology, Inc Coordinate measuring system data reduction
US9251562B1 (en) * 2011-08-04 2016-02-02 Amazon Technologies, Inc. Registration of low contrast images
US9891043B2 (en) * 2011-10-11 2018-02-13 Nikon Corporation Profile measuring apparatus, structure manufacturing system, method for measuring profile, method for manufacturing structure, and non-transitory computer readable medium
CN102494625B (en) * 2011-11-22 2013-10-02 吴江市博众精工科技有限公司 Laser mechanism rotating module
CN102494624B (en) * 2011-11-22 2013-10-02 吴江市博众精工科技有限公司 Laser adjustment module
JP2013117453A (en) * 2011-12-05 2013-06-13 Hitachi Ltd Distance measuring method and apparatus and shape measuring apparatus with the same packaged therein
GB201201140D0 (en) * 2012-01-24 2012-03-07 Phase Focus Ltd Method and apparatus for determining object characteristics
DE102012100609A1 (en) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US20130226330A1 (en) * 2012-02-24 2013-08-29 Alliance For Sustainable Energy, Llc Optical techniques for monitoring continuous manufacturing of proton exchange membrane fuel cell components
CN103376076A (en) * 2012-04-23 2013-10-30 鸿富锦精密工业(深圳)有限公司 Three-dimensional probe compensation and space error measuring system and method
GB201207800D0 (en) 2012-05-03 2012-06-13 Phase Focus Ltd Improvements in providing image data
JP6132126B2 (en) * 2012-05-14 2017-05-24 パルステック工業株式会社 Translucent object inspection apparatus and translucent object inspection method
US20150160005A1 (en) * 2012-06-12 2015-06-11 Shima Seiki Mfg., Ltd. Three-dimensional measurement apparatus, and three-dimensional measurement method
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
US10380469B2 (en) * 2012-07-18 2019-08-13 The Boeing Company Method for tracking a device in a landmark-based reference system
WO2014023345A1 (en) * 2012-08-07 2014-02-13 Carl Zeiss Industrielle Messtechnik Gmbh Improved device for examining an object and method
US8823930B2 (en) 2012-08-07 2014-09-02 Carl Zeiss Industrielle Messtechnik Gmbh Apparatus and method for inspecting an object
JP5740370B2 (en) * 2012-09-04 2015-06-24 株式会社東芝 Region specifying apparatus, method, and program
CN103659465B (en) * 2012-09-21 2016-03-09 财团法人工业技术研究院 For the compensating control method of multi-spindle machining
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
GB201222361D0 (en) * 2012-12-12 2013-01-23 Univ Birmingham Surface geometry imaging
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
EP2967348B1 (en) * 2013-03-15 2022-03-23 Synaptive Medical Inc. Intelligent positioning system
KR101405227B1 (en) * 2013-04-02 2014-06-10 현대자동차 주식회사 Speed measurement device of conveyor line
US9736433B2 (en) * 2013-05-17 2017-08-15 The Boeing Company Systems and methods for detection of clear air turbulence
US9651525B2 (en) * 2013-06-27 2017-05-16 TecScan Systems Inc. Method and apparatus for scanning an object
US20150070468A1 (en) * 2013-09-10 2015-03-12 Faro Technologies, Inc. Use of a three-dimensional imager's point cloud data to set the scale for photogrammetry
FI20135961A (en) * 2013-09-25 2015-03-26 Aalto Korkeakoulusäätiö Imaging arrangements and procedures as well as systems for mapping the topography of three-dimensional surface
CN103528514B (en) * 2013-10-12 2016-09-21 南京理工大学 The many visual fields of machine vision work in coordination with mechanism and equipped with this mechanism measurement with detection device
US9384540B2 (en) * 2013-12-03 2016-07-05 Sunedison Semiconductor Limited (Uen201334164H) Systems and methods for interferometric phase measurement
WO2015134277A1 (en) * 2014-03-05 2015-09-11 Faxitron Bioptics, Llc System and method for multi-axis imaging of specimens
EP2930462B1 (en) * 2014-04-08 2017-09-13 Hexagon Technology Center GmbH Method for generating information about a sensor chain of a coordinate measuring machine (CMM)
US9395174B2 (en) 2014-06-27 2016-07-19 Faro Technologies, Inc. Determining retroreflector orientation by optimizing spatial fit
US10330465B2 (en) 2014-08-08 2019-06-25 Applied Research Associates, Inc. Systems, methods, and apparatuses for measuring deformation of a surface
CN104316012A (en) * 2014-11-25 2015-01-28 宁夏共享模具有限公司 Industrial robot for measuring size of large part
JP6388722B2 (en) * 2014-12-04 2018-09-12 アプレ インストゥルメンツ, エルエルシーApre Instruments, Llc Interferometric non-contact optical probe and measurement
EP3848867A1 (en) 2015-03-18 2021-07-14 United Parcel Service Of America, Inc. Systems and methods for verifying the contents of a shipment
CA2981978C (en) 2015-04-16 2021-02-09 United Parcel Service Of America, Inc. Enhanced multi-layer cargo screening system, computer program product, and method of using the same
CN104792263B (en) 2015-04-20 2018-01-05 合肥京东方光电科技有限公司 The method and apparatus for determining the region to be detected of display master blank
US10598476B2 (en) * 2015-05-12 2020-03-24 Hexagon Metrology, Inc. Apparatus and method of controlling CMM using environmental information or CMM information
EP3322959A1 (en) * 2015-07-13 2018-05-23 Renishaw Plc. Method for measuring an artefact
CN105157569B (en) * 2015-08-31 2018-05-22 宁夏共享模具有限公司 A kind of evaporative pattern mold laser measuring machine
WO2017040977A1 (en) 2015-09-04 2017-03-09 Faxitron Bioptics, Llc Multi-axis specimen imaging device with embedded orientation markers
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
JP6685767B2 (en) * 2016-02-25 2020-04-22 株式会社ミツトヨ Surface texture measuring machine
KR102084535B1 (en) * 2016-03-30 2020-03-05 가부시키가이샤 히다치 하이테크놀로지즈 Defect inspection device, defect inspection method
EP3239927B1 (en) * 2016-04-25 2021-04-07 ALSTOM Transport Technologies Assembly completeness inspection method using active ranging
US10401145B2 (en) * 2016-06-13 2019-09-03 Carl Zeiss Industrielle Messtechnik Gmbh Method for calibrating an optical arrangement
JP7090068B2 (en) * 2016-07-28 2022-06-23 レニショウ パブリック リミテッド カンパニー Non-contact probe and method of operation
JP6295299B2 (en) * 2016-08-26 2018-03-14 株式会社ミツトヨ Coordinate correction method and three-dimensional measuring apparatus
CN107005703A (en) * 2016-09-09 2017-08-01 深圳市大疆创新科技有限公司 Method for encoding images and system
US11083426B2 (en) 2016-11-04 2021-08-10 Hologic, Inc. Specimen radiography system comprising cabinet and a specimen drawer positionable by a controller in the cabinet
JP2018110057A (en) * 2016-12-28 2018-07-12 パナソニックIpマネジメント株式会社 Lighting system, lighting method and program
JP6353573B1 (en) * 2017-03-02 2018-07-04 Ckd株式会社 3D measuring device
JP6308637B1 (en) * 2017-05-08 2018-04-11 国立大学法人福井大学 3D measurement method and apparatus using feature quantity
CN107462164A (en) * 2017-08-17 2017-12-12 苏州新捷毅自动化科技有限公司 A kind of method that light source measurement height is made of projecting apparatus
US10372155B2 (en) * 2017-08-20 2019-08-06 Pixart Imaging Inc. Joystick and related control method
US10969878B2 (en) 2017-08-20 2021-04-06 Pixart Imaging Inc. Joystick with light emitter and optical sensor within internal chamber
EP3682228A4 (en) 2017-09-11 2021-06-16 Faxitron Bioptics, LLC Imaging system with adaptive object magnification
CN107664478B (en) * 2017-10-26 2023-11-03 成都众鑫聚合科技有限公司 Vertical non-contact gyrosome high-precision measuring device and measuring method thereof
CN107990821B (en) * 2017-11-17 2019-12-17 深圳大学 Bridge deformation monitoring method, storage medium and bridge deformation monitoring receiver
CN108441889B (en) * 2018-03-19 2020-06-26 河南科技大学 Method and device for detecting attachment of steel claw of anode guide rod
JP6888580B2 (en) * 2018-04-05 2021-06-16 オムロン株式会社 Information processing equipment, information processing methods, and programs
CA3153295C (en) 2018-05-09 2023-06-13 Kawasaki Jukogyo Kabushiki Kaisha Sampling method and sampling system
GB2574064B (en) * 2018-05-25 2020-05-27 Imetrum Ltd Motion encoder
CN111258158B (en) 2018-11-30 2022-10-25 中强光电股份有限公司 Projector and brightness adjusting method
CN111258157B (en) * 2018-11-30 2023-01-10 中强光电股份有限公司 Projector and brightness adjusting method
US20220074738A1 (en) * 2019-04-11 2022-03-10 Hewlett-Packard Development Company, L.P. Three dimensional imaging
US11367201B2 (en) 2019-09-24 2022-06-21 The Boeing Company System and method for continual localization of scanner using non-destructive inspection data
US10929670B1 (en) 2019-10-21 2021-02-23 The Boeing Company Marker-to-model location pairing and registration for augmented reality applications
US20210291435A1 (en) * 2020-03-19 2021-09-23 Ricoh Company, Ltd. Measuring apparatus, movable apparatus, robot, electronic device, fabricating apparatus, and measuring method
EP3910287A1 (en) * 2020-05-14 2021-11-17 Fill Gesellschaft m.b.H. Method and device for measuring a physical object
US11398085B2 (en) * 2020-07-31 2022-07-26 Wisconsin Alumni Research Foundation Systems, methods, and media for directly recovering planar surfaces in a scene using structured light
CN112050749B (en) * 2020-08-18 2022-04-05 丰柯电子科技(上海)有限公司 Online detection device and detection method for curvature of automobile part for machining
CN111951505B (en) * 2020-08-25 2022-02-08 青岛大学 Fence vibration intrusion positioning and mode identification method based on distributed optical fiber system
CA3138634C (en) 2021-03-04 2023-09-19 TecScan Systems Inc. System and method for scanning an object using an array of ultrasonic transducers
CN115327541B (en) * 2022-10-12 2023-03-14 中国人民解放军国防科技大学 Array scanning holographic penetration imaging method and handheld holographic penetration imaging radar system

Citations (109)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2088095A (en) 1980-11-20 1982-06-03 Tokico Ltd Robot
JPS61114109A (en) 1984-11-09 1986-05-31 Ishida Scales Mfg Co Ltd Weight receiving method
US4767212A (en) 1984-09-19 1988-08-30 Ishida Scales Mfg. Co., Ltd. Volume determination process
DE3829925A1 (en) 1988-09-02 1990-03-15 Kaltenbach & Voigt Optical probe for 3D measurement of teeth in the buccal cavity
DE3938714A1 (en) 1989-11-23 1991-05-29 Bernd Dr Breuckmann Optical determination of object shapes, shape variations - using structured coloured light projected onto objects for high resolution, dynamic measurement
EP0445618A2 (en) 1990-03-09 1991-09-11 Firma Carl Zeiss Device and procedure to measure without contact the surface-contour of an object
WO1991015732A1 (en) 1990-04-05 1991-10-17 Intelligent Automation Systems, Inc. Real time three dimensional sensing system
JPH04204314A (en) 1990-11-30 1992-07-24 Mazda Motor Corp Surface defect inspection instrument
US5175601A (en) 1991-10-15 1992-12-29 Electro-Optical Information Systems High-speed 3-D surface measurement surface inspection and reverse-CAD system
US5237404A (en) 1990-06-28 1993-08-17 Mazda Motor Corporation Inspection apparatus with improved detection of surface defects over large and curved surfaces
US5251156A (en) 1990-08-25 1993-10-05 Carl-Zeiss-Stiftung, Heidenheim/Brenz Method and apparatus for non-contact measurement of object surfaces
US5262844A (en) 1990-07-03 1993-11-16 Bertin & Cie Apparatus for determining the three-dimensional shape of an object optically without contact
US5289264A (en) 1991-09-26 1994-02-22 Hans Steinbichler Method and apparatus for ascertaining the absolute coordinates of an object
JPH06138055A (en) 1992-10-28 1994-05-20 Sumitomo Metal Ind Ltd Method for inspecting surface defect
US5319445A (en) 1992-09-08 1994-06-07 Fitts John M Hidden change distribution grating and use in 3D moire measurement sensors and CMM applications
DE4301538A1 (en) 1992-03-17 1994-07-28 Peter Dr Ing Brueckner Method and arrangement for contactless three-dimensional measurement, in particular for measuring denture models
US5372502A (en) 1988-09-02 1994-12-13 Kaltenbach & Voight Gmbh & Co. Optical probe and method for the three-dimensional surveying of teeth
EP0402440B1 (en) 1988-12-19 1995-06-07 Renishaw plc Method of and apparatus for scanning the surface of a workpiece
JPH07260451A (en) 1994-03-18 1995-10-13 Shiseido Co Ltd Three dimensional shape measuring system
US5488477A (en) * 1993-11-15 1996-01-30 Zygo Corporation Methods and apparatus for profiling surfaces of transparent objects
WO1997005449A1 (en) 1995-07-26 1997-02-13 Crampton Stephen J Scanning apparatus and method
DE19634254A1 (en) 1995-09-04 1997-03-06 Volkswagen Ag Optical-numerical determination of entire surface of solid object e.g. for motor vehicle mfr.
US5646733A (en) 1996-01-29 1997-07-08 Medar, Inc. Scanning phase measuring method and system for an object at a vision station
WO1997036144A1 (en) 1996-03-22 1997-10-02 Loughborough University Innovations Limited Method and apparatus for measuring shape of objects
JPH11211443A (en) 1998-01-27 1999-08-06 Matsushita Electric Works Ltd Three-dimensional shape measuring device
JPH11211442A (en) 1998-01-27 1999-08-06 Matsushita Electric Works Ltd Method and device for detecting defect of object surface
US5953448A (en) 1996-03-01 1999-09-14 Textile/Clothing Technology Corporation Contour measurement of an object having a discontinuous surface using block point identification techniques
US6028672A (en) 1996-09-30 2000-02-22 Zheng J. Geng High speed three dimensional imaging method
JP2000097672A (en) 1998-09-18 2000-04-07 Sanyo Electric Co Ltd Control information generating method and assisting system in three-dimensional measuring system
WO2000021034A1 (en) 1998-10-06 2000-04-13 Easyscan Műszaki Fejlesztő Kereskedelmi És Szolgál Tató Kft. Method and apparatus for the scanning of spatial objects and to create a three dimensional computer model
DE19846145A1 (en) 1998-10-01 2000-04-20 Klaus Koerner Three-dimensional imaging device for shape measurement has transmitter array whose elements move in straight, parallel lines
US6055056A (en) 1996-05-06 2000-04-25 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Device for non-contact measurement of the surface of a three dimensional object
US6084712A (en) 1998-11-03 2000-07-04 Dynamic Measurement And Inspection,Llc Three dimensional imaging using a refractive optic design
US6100984A (en) 1999-06-11 2000-08-08 Chen; Fang Surface measurement system with a laser light generator
US6144453A (en) 1998-09-10 2000-11-07 Acuity Imaging, Llc System and method for three-dimensional inspection using patterned light projection
JP2001012925A (en) 1999-04-30 2001-01-19 Nec Corp Three-dimensional shape measurement method and device and record medium
JP2001108422A (en) 1999-10-12 2001-04-20 Wakayama Univ Method and apparatus for measuring shape
US6256099B1 (en) 1998-11-06 2001-07-03 Frederick Kaufman Methods and system for measuring three dimensional spatial coordinates and for external camera calibration necessary for that measurement
WO2001051887A1 (en) 2000-01-07 2001-07-19 Cyberoptics Corporation Phase profilometry system with telecentric projector
US6291817B1 (en) 1998-06-23 2001-09-18 Fuji Photo Optical Co., Ltd. Moire apparatus having projection optical system and observation optical system which have optical axes parallel to each other
JP2002054912A (en) 2000-08-08 2002-02-20 Ricoh Co Ltd Shape measuring system, imaging device, shape measuring method, and recording medium
JP2002090126A (en) 2000-09-14 2002-03-27 Wakayama Univ Real time shape deformation measuring method by color rectangular wave grid projection
US20020057832A1 (en) 1996-06-13 2002-05-16 Marc R.A.B. Proesmans Method and system for acquiring a three-dimensional shape description
JP2002162215A (en) 2000-11-27 2002-06-07 Matsushita Electric Works Ltd Three-dimensional shape measuring method and its system
US6438272B1 (en) 1997-12-31 2002-08-20 The Research Foundation Of State University Of Ny Method and apparatus for three dimensional surface contouring using a digital video projection system
US20020181764A1 (en) 1997-05-22 2002-12-05 Kabushiki Kaisha Topcon Measuring apparatus
US6532299B1 (en) 2000-04-28 2003-03-11 Orametrix, Inc. System and method for mapping a surface
US20030123707A1 (en) 2001-12-31 2003-07-03 Park Seujeung P. Imaging-based distance measurement and three-dimensional profiling system
US6600511B1 (en) 1997-01-08 2003-07-29 Pentax Corporation Camera for use in photogrammetric analytical measurement
US20030174880A1 (en) 2002-03-12 2003-09-18 Nec Corporation Three-dimensional shape measurement technique
US6674893B1 (en) 1999-10-19 2004-01-06 Fuji Xerox Co., Ltd. Three-dimensional shape measuring apparatus
US6728423B1 (en) 2000-04-28 2004-04-27 Orametrix, Inc. System and method for mapping a surface
US6738508B1 (en) 2000-04-28 2004-05-18 Orametrix, Inc. Method and system for registering data
US6744932B1 (en) 2000-04-28 2004-06-01 Orametrix, Inc. System and method for mapping a surface
US6744914B1 (en) 2000-04-28 2004-06-01 Orametrix, Inc. Method and system for generating a three-dimensional object
US6771809B1 (en) 2000-04-28 2004-08-03 Orametrix, Inc. Method and system for registering data
WO2004083778A1 (en) 2003-03-18 2004-09-30 Hermary Alexander Thomas Coded-light dual-view profile scanner
WO2004096502A1 (en) 2003-04-28 2004-11-11 Stephen James Crampton Cmm arm with exoskeleton
JP2004317495A (en) 2003-03-31 2004-11-11 Mitsutoyo Corp Method and instrument for measuring noncontactly three-dimensional shape
US20040246496A1 (en) 2003-03-31 2004-12-09 Mitutoyo Corporation Method and apparatus for non-contact three-dimensional surface measurement
US20050018209A1 (en) 2003-07-24 2005-01-27 Guylain Lemelin Optical 3D digitizer with enlarged non-ambiguity zone
WO2005059470A1 (en) 2003-12-17 2005-06-30 Universität Karlsruhe Method for the dynamic, three-dimensional detection and representation of a surface
WO2005073669A1 (en) 2004-01-29 2005-08-11 Siemens Corporate Research, Inc. Semi and fully-automatic camera calibration tools using laser-based measurement devices
US20050201611A1 (en) * 2004-03-09 2005-09-15 Lloyd Thomas Watkins Jr. Non-contact measurement method and apparatus
US20050271264A1 (en) 2004-04-21 2005-12-08 Topcon Corporation Three-dimensional image measuring apparatus and method
JP2005345383A (en) 2004-06-04 2005-12-15 Asahi Glass Co Ltd Inspection method, and inspection device for surface shape
US7001024B2 (en) 2001-02-14 2006-02-21 Ricoh Company, Ltd. Image input apparatus using projected light
US20060093206A1 (en) 2000-04-28 2006-05-04 Rudger Rubbert System and method for mapping a surface
US20060103854A1 (en) 2001-06-27 2006-05-18 Franke Ernest A Non-contact apparatus and method for measuring surface profile
JP2006179031A (en) 2001-02-14 2006-07-06 Ricoh Co Ltd Image input apparatus
US7103211B1 (en) 2001-09-04 2006-09-05 Geometrix, Inc. Method and apparatus for generating 3D face models from one camera
US20060210146A1 (en) 2005-01-07 2006-09-21 Jin Gu Creating 3D images of objects by illuminating with infrared patterns
US7133551B2 (en) 2002-02-07 2006-11-07 National Central University Semi-automatic reconstruction method of 3-D building models using building outline segments
US7136170B2 (en) 2002-04-24 2006-11-14 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and device for determining the spatial co-ordinates of an object
US20060268153A1 (en) * 2005-05-11 2006-11-30 Xenogen Corporation Surface contruction using combined photographic and structured light information
US7171328B1 (en) * 2004-08-30 2007-01-30 Sandia Corporation Method for measuring thermal properties using a long-wavelength infrared thermal image
JP2007024764A (en) 2005-07-20 2007-02-01 Daihatsu Motor Co Ltd Noncontact three-dimensional measuring method and system
US20070057946A1 (en) 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
JP2007093412A (en) 2005-09-29 2007-04-12 Fujinon Corp Three-dimensional shape measuring device
CN1952595A (en) 2005-10-20 2007-04-25 欧姆龙株式会社 Three-dimensional shape measuring apparatus, computer-readable recording medium, and three-dimensional shape measuring method
JP2007121294A (en) 2005-10-24 2007-05-17 General Electric Co <Ge> Method and device for inspecting object
GB2434541A (en) 2006-01-30 2007-08-01 Mailling Wright Products Ltd Preparing a clinical restraint
US20070183631A1 (en) 2006-02-06 2007-08-09 Beijing University Of Aeronautics And Astronautics Methods and apparatus for measuring the flapping deformation of insect wings
US7256899B1 (en) 2006-10-04 2007-08-14 Ivan Faul Wireless methods and systems for three-dimensional non-contact shape sensing
US20070206204A1 (en) 2005-12-01 2007-09-06 Peirong Jia Full-field three-dimensional measurement method
US20070211258A1 (en) 2006-03-07 2007-09-13 Korea Advanced Institute Of Science And Technology Three-dimensional shape measurement apparatus and method for eliminating2pi ambiguity of moire principle and omitting phase shifting means
WO2007121953A1 (en) 2006-04-21 2007-11-01 Quiss Gmbh Device for automatically applying or producing and monitoring a structure placed on a substrate with a determination of geometric dimensions and corresponding method therefor
WO2007125081A1 (en) 2006-04-27 2007-11-08 Metris N.V. Optical scanning probe
JP2007315882A (en) 2006-05-25 2007-12-06 Dainippon Printing Co Ltd Apparatus and method for positioning substrate, and device and method for manufacturing color filter
US7310514B2 (en) 2001-03-16 2007-12-18 Nec Corporation Transmission-origin mobile telephone capable of detecting the media types and data formats of a multimedia message receivable by destination mobile telephones in a multimedia communication system
CN101105393A (en) 2006-07-13 2008-01-16 周波 Vision measuring method for projecting multiple frequency grating object surface tri-dimensional profile
US20080075328A1 (en) * 2006-09-15 2008-03-27 Sciammarella Cesar A System and method for analyzing displacements and contouring of surfaces
JP2008076107A (en) 2006-09-19 2008-04-03 Denso Corp Apparatus and method for visual inspection, height measuring method, and circuit board manufacturing method
WO2008046663A2 (en) 2006-10-16 2008-04-24 Fraunhofer Gesellschaft Zur Förderung Der Angewandten Forschung E. V. Device and method for the contactless detection of a three-dimensional contour
US20080165341A1 (en) 2005-04-06 2008-07-10 Dimensional Photonics International, Inc. Multiple Channel Interferometric Surface Contour Measurement System
US20080239288A1 (en) 2007-04-02 2008-10-02 Korea Advanced Institute Of Science And Technology 3D shape measurement apparatus and method using stereo moire technique
WO2009024757A1 (en) 2007-08-17 2009-02-26 Renishaw Plc Phase analysis measurement apparatus and method
US7538891B1 (en) 2005-09-30 2009-05-26 California Institute Of Technology Surface characterization based on lateral shearing of diffracted wave fronts to measure in-plane and out-of-plane displacement gradient fields
WO2009094510A1 (en) 2008-01-25 2009-07-30 Cyberoptics Corporation Multi-source sensor for three-dimensional imaging using phased structured light
WO2009097066A1 (en) 2008-01-31 2009-08-06 Cyberoptics Corporation Improved method for three-dimensional imaging using multi-phase structured light
US20090238449A1 (en) 2005-11-09 2009-09-24 Geometric Informatics, Inc Method and Apparatus for Absolute-Coordinate Three-Dimensional Surface Imaging
US20090268214A1 (en) * 2006-05-26 2009-10-29 Miljenko Lucic Photogrammetric system and techniques for 3d acquisition
US20100008588A1 (en) 2008-07-08 2010-01-14 Chiaro Technologies LLC Multiple channel locating
US20100296104A1 (en) 2009-05-21 2010-11-25 General Electric Company Inspection system and method with multi-image phase shift analysis
US20100329538A1 (en) 2009-06-25 2010-12-30 Jason Remillard Cell Feature Extraction and Labeling Thereof
US7912673B2 (en) * 2005-03-11 2011-03-22 Creaform Inc. Auto-referenced system and apparatus for three-dimensional scanning
US20110317879A1 (en) * 2009-02-17 2011-12-29 Absolute Robotics Limited Measurement of Positional Information for a Robot Arm
US8111907B2 (en) * 2007-07-31 2012-02-07 United Technologies Corporation Method for repeatable optical determination of object geometry dimensions and deviations
US20120307260A1 (en) 2008-04-01 2012-12-06 Perceptron, Inc. Hybrid system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB716109A (en) 1950-06-22 1954-09-29 Crosweller & Co Ltd W Improvements in, or relating to, apparatus for winding paper, or other similar flexible strip material, into roll form
BE513714A (en) 1951-08-23 1900-01-01
GB716088A (en) 1952-10-29 1954-09-29 Miguel Angel Mascarello Improvements relating to musical instruments
JPS6049475A (en) * 1983-08-29 1985-03-18 Matsushita Electric Ind Co Ltd Object detecting method
JPH0623656B2 (en) * 1987-07-13 1994-03-30 新技術事業団 Multi-view device
JP2996069B2 (en) * 1993-08-19 1999-12-27 村田機械株式会社 3D measuring device
JP3156602B2 (en) * 1996-10-11 2001-04-16 日産自動車株式会社 Defect inspection method for inspected surface
JP3921432B2 (en) * 2002-09-13 2007-05-30 株式会社リコー Shape measuring apparatus and shape measuring method using moire optical system
US8050491B2 (en) * 2003-12-17 2011-11-01 United Technologies Corporation CAD modeling system and method
JP4429135B2 (en) * 2004-10-05 2010-03-10 Necエンジニアリング株式会社 Three-dimensional shape measurement system and measurement method
JP4741344B2 (en) * 2005-11-07 2011-08-03 ダイハツ工業株式会社 Shape recognition apparatus and distortion evaluation apparatus

Patent Citations (125)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2088095A (en) 1980-11-20 1982-06-03 Tokico Ltd Robot
US4767212A (en) 1984-09-19 1988-08-30 Ishida Scales Mfg. Co., Ltd. Volume determination process
JPS61114109A (en) 1984-11-09 1986-05-31 Ishida Scales Mfg Co Ltd Weight receiving method
DE3829925A1 (en) 1988-09-02 1990-03-15 Kaltenbach & Voigt Optical probe for 3D measurement of teeth in the buccal cavity
US5372502A (en) 1988-09-02 1994-12-13 Kaltenbach & Voight Gmbh & Co. Optical probe and method for the three-dimensional surveying of teeth
EP0402440B1 (en) 1988-12-19 1995-06-07 Renishaw plc Method of and apparatus for scanning the surface of a workpiece
DE3938714A1 (en) 1989-11-23 1991-05-29 Bernd Dr Breuckmann Optical determination of object shapes, shape variations - using structured coloured light projected onto objects for high resolution, dynamic measurement
US5135309A (en) 1990-03-09 1992-08-04 Carl-Zeiss-Stiftung Method and apparatus for non-contact measuring of object surfaces
EP0445618A2 (en) 1990-03-09 1991-09-11 Firma Carl Zeiss Device and procedure to measure without contact the surface-contour of an object
WO1991015732A1 (en) 1990-04-05 1991-10-17 Intelligent Automation Systems, Inc. Real time three dimensional sensing system
US5237404A (en) 1990-06-28 1993-08-17 Mazda Motor Corporation Inspection apparatus with improved detection of surface defects over large and curved surfaces
US5262844A (en) 1990-07-03 1993-11-16 Bertin & Cie Apparatus for determining the three-dimensional shape of an object optically without contact
US5251156A (en) 1990-08-25 1993-10-05 Carl-Zeiss-Stiftung, Heidenheim/Brenz Method and apparatus for non-contact measurement of object surfaces
JPH04204314A (en) 1990-11-30 1992-07-24 Mazda Motor Corp Surface defect inspection instrument
US5289264A (en) 1991-09-26 1994-02-22 Hans Steinbichler Method and apparatus for ascertaining the absolute coordinates of an object
US5175601A (en) 1991-10-15 1992-12-29 Electro-Optical Information Systems High-speed 3-D surface measurement surface inspection and reverse-CAD system
DE4301538A1 (en) 1992-03-17 1994-07-28 Peter Dr Ing Brueckner Method and arrangement for contactless three-dimensional measurement, in particular for measuring denture models
US5319445A (en) 1992-09-08 1994-06-07 Fitts John M Hidden change distribution grating and use in 3D moire measurement sensors and CMM applications
JPH06138055A (en) 1992-10-28 1994-05-20 Sumitomo Metal Ind Ltd Method for inspecting surface defect
US5488477A (en) * 1993-11-15 1996-01-30 Zygo Corporation Methods and apparatus for profiling surfaces of transparent objects
JPH07260451A (en) 1994-03-18 1995-10-13 Shiseido Co Ltd Three dimensional shape measuring system
WO1997005449A1 (en) 1995-07-26 1997-02-13 Crampton Stephen J Scanning apparatus and method
DE19634254A1 (en) 1995-09-04 1997-03-06 Volkswagen Ag Optical-numerical determination of entire surface of solid object e.g. for motor vehicle mfr.
US5646733A (en) 1996-01-29 1997-07-08 Medar, Inc. Scanning phase measuring method and system for an object at a vision station
US5953448A (en) 1996-03-01 1999-09-14 Textile/Clothing Technology Corporation Contour measurement of an object having a discontinuous surface using block point identification techniques
WO1997036144A1 (en) 1996-03-22 1997-10-02 Loughborough University Innovations Limited Method and apparatus for measuring shape of objects
US6055056A (en) 1996-05-06 2000-04-25 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Device for non-contact measurement of the surface of a three dimensional object
US20020057832A1 (en) 1996-06-13 2002-05-16 Marc R.A.B. Proesmans Method and system for acquiring a three-dimensional shape description
US6028672A (en) 1996-09-30 2000-02-22 Zheng J. Geng High speed three dimensional imaging method
US6600511B1 (en) 1997-01-08 2003-07-29 Pentax Corporation Camera for use in photogrammetric analytical measurement
US20020181764A1 (en) 1997-05-22 2002-12-05 Kabushiki Kaisha Topcon Measuring apparatus
US6438272B1 (en) 1997-12-31 2002-08-20 The Research Foundation Of State University Of Ny Method and apparatus for three dimensional surface contouring using a digital video projection system
JPH11211442A (en) 1998-01-27 1999-08-06 Matsushita Electric Works Ltd Method and device for detecting defect of object surface
JPH11211443A (en) 1998-01-27 1999-08-06 Matsushita Electric Works Ltd Three-dimensional shape measuring device
US6291817B1 (en) 1998-06-23 2001-09-18 Fuji Photo Optical Co., Ltd. Moire apparatus having projection optical system and observation optical system which have optical axes parallel to each other
US6144453A (en) 1998-09-10 2000-11-07 Acuity Imaging, Llc System and method for three-dimensional inspection using patterned light projection
JP2000097672A (en) 1998-09-18 2000-04-07 Sanyo Electric Co Ltd Control information generating method and assisting system in three-dimensional measuring system
DE19846145A1 (en) 1998-10-01 2000-04-20 Klaus Koerner Three-dimensional imaging device for shape measurement has transmitter array whose elements move in straight, parallel lines
WO2000021034A1 (en) 1998-10-06 2000-04-13 Easyscan Műszaki Fejlesztő Kereskedelmi És Szolgál Tató Kft. Method and apparatus for the scanning of spatial objects and to create a three dimensional computer model
US6084712A (en) 1998-11-03 2000-07-04 Dynamic Measurement And Inspection,Llc Three dimensional imaging using a refractive optic design
US6256099B1 (en) 1998-11-06 2001-07-03 Frederick Kaufman Methods and system for measuring three dimensional spatial coordinates and for external camera calibration necessary for that measurement
JP2001012925A (en) 1999-04-30 2001-01-19 Nec Corp Three-dimensional shape measurement method and device and record medium
US6421629B1 (en) 1999-04-30 2002-07-16 Nec Corporation Three-dimensional shape measurement method and apparatus and computer program product
US6100984A (en) 1999-06-11 2000-08-08 Chen; Fang Surface measurement system with a laser light generator
JP2001108422A (en) 1999-10-12 2001-04-20 Wakayama Univ Method and apparatus for measuring shape
US6674893B1 (en) 1999-10-19 2004-01-06 Fuji Xerox Co., Ltd. Three-dimensional shape measuring apparatus
JP2003527582A (en) 2000-01-07 2003-09-16 サイバーオプティクス コーポレーション Phase profile measurement system with telecentric projector
GB2375392B (en) 2000-01-07 2004-12-15 Cyberoptics Corp Phase profilometry system with telecentric projector
WO2001051887A1 (en) 2000-01-07 2001-07-19 Cyberoptics Corporation Phase profilometry system with telecentric projector
US6738508B1 (en) 2000-04-28 2004-05-18 Orametrix, Inc. Method and system for registering data
US20060093206A1 (en) 2000-04-28 2006-05-04 Rudger Rubbert System and method for mapping a surface
US7068836B1 (en) 2000-04-28 2006-06-27 Orametrix, Inc. System and method for mapping a surface
US6771809B1 (en) 2000-04-28 2004-08-03 Orametrix, Inc. Method and system for registering data
US6744914B1 (en) 2000-04-28 2004-06-01 Orametrix, Inc. Method and system for generating a three-dimensional object
US6532299B1 (en) 2000-04-28 2003-03-11 Orametrix, Inc. System and method for mapping a surface
US6728423B1 (en) 2000-04-28 2004-04-27 Orametrix, Inc. System and method for mapping a surface
US6744932B1 (en) 2000-04-28 2004-06-01 Orametrix, Inc. System and method for mapping a surface
JP2002054912A (en) 2000-08-08 2002-02-20 Ricoh Co Ltd Shape measuring system, imaging device, shape measuring method, and recording medium
JP2002090126A (en) 2000-09-14 2002-03-27 Wakayama Univ Real time shape deformation measuring method by color rectangular wave grid projection
JP2002162215A (en) 2000-11-27 2002-06-07 Matsushita Electric Works Ltd Three-dimensional shape measuring method and its system
US7001024B2 (en) 2001-02-14 2006-02-21 Ricoh Company, Ltd. Image input apparatus using projected light
JP2006179031A (en) 2001-02-14 2006-07-06 Ricoh Co Ltd Image input apparatus
US7310514B2 (en) 2001-03-16 2007-12-18 Nec Corporation Transmission-origin mobile telephone capable of detecting the media types and data formats of a multimedia message receivable by destination mobile telephones in a multimedia communication system
US20060103854A1 (en) 2001-06-27 2006-05-18 Franke Ernest A Non-contact apparatus and method for measuring surface profile
US7103211B1 (en) 2001-09-04 2006-09-05 Geometrix, Inc. Method and apparatus for generating 3D face models from one camera
US20030123707A1 (en) 2001-12-31 2003-07-03 Park Seujeung P. Imaging-based distance measurement and three-dimensional profiling system
US7133551B2 (en) 2002-02-07 2006-11-07 National Central University Semi-automatic reconstruction method of 3-D building models using building outline segments
US7315643B2 (en) 2002-03-12 2008-01-01 Nec Corporation Three-dimensional shape measurement technique
US20030174880A1 (en) 2002-03-12 2003-09-18 Nec Corporation Three-dimensional shape measurement technique
JP2003269928A (en) 2002-03-12 2003-09-25 Nec Corp Method and instrument for measuring three-dimensional shape, and program
US7136170B2 (en) 2002-04-24 2006-11-14 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and device for determining the spatial co-ordinates of an object
WO2004083778A1 (en) 2003-03-18 2004-09-30 Hermary Alexander Thomas Coded-light dual-view profile scanner
US20040246496A1 (en) 2003-03-31 2004-12-09 Mitutoyo Corporation Method and apparatus for non-contact three-dimensional surface measurement
JP2004317495A (en) 2003-03-31 2004-11-11 Mitsutoyo Corp Method and instrument for measuring noncontactly three-dimensional shape
WO2004096502A1 (en) 2003-04-28 2004-11-11 Stephen James Crampton Cmm arm with exoskeleton
US20070057946A1 (en) 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
US20050018209A1 (en) 2003-07-24 2005-01-27 Guylain Lemelin Optical 3D digitizer with enlarged non-ambiguity zone
WO2005059470A1 (en) 2003-12-17 2005-06-30 Universität Karlsruhe Method for the dynamic, three-dimensional detection and representation of a surface
WO2005073669A1 (en) 2004-01-29 2005-08-11 Siemens Corporate Research, Inc. Semi and fully-automatic camera calibration tools using laser-based measurement devices
US20050201611A1 (en) * 2004-03-09 2005-09-15 Lloyd Thomas Watkins Jr. Non-contact measurement method and apparatus
US20050271264A1 (en) 2004-04-21 2005-12-08 Topcon Corporation Three-dimensional image measuring apparatus and method
US7394536B2 (en) 2004-06-04 2008-07-01 Asahi Glass Company, Limited Method and apparatus for inspecting front surface shape
JP2005345383A (en) 2004-06-04 2005-12-15 Asahi Glass Co Ltd Inspection method, and inspection device for surface shape
US7171328B1 (en) * 2004-08-30 2007-01-30 Sandia Corporation Method for measuring thermal properties using a long-wavelength infrared thermal image
US20060210146A1 (en) 2005-01-07 2006-09-21 Jin Gu Creating 3D images of objects by illuminating with infrared patterns
US7430312B2 (en) 2005-01-07 2008-09-30 Gesturetek, Inc. Creating 3D images of objects by illuminating with infrared patterns
US7912673B2 (en) * 2005-03-11 2011-03-22 Creaform Inc. Auto-referenced system and apparatus for three-dimensional scanning
US20080165341A1 (en) 2005-04-06 2008-07-10 Dimensional Photonics International, Inc. Multiple Channel Interferometric Surface Contour Measurement System
US20060268153A1 (en) * 2005-05-11 2006-11-30 Xenogen Corporation Surface contruction using combined photographic and structured light information
JP2007024764A (en) 2005-07-20 2007-02-01 Daihatsu Motor Co Ltd Noncontact three-dimensional measuring method and system
JP2007093412A (en) 2005-09-29 2007-04-12 Fujinon Corp Three-dimensional shape measuring device
US7538891B1 (en) 2005-09-30 2009-05-26 California Institute Of Technology Surface characterization based on lateral shearing of diffracted wave fronts to measure in-plane and out-of-plane displacement gradient fields
CN1952595A (en) 2005-10-20 2007-04-25 欧姆龙株式会社 Three-dimensional shape measuring apparatus, computer-readable recording medium, and three-dimensional shape measuring method
US7684052B2 (en) 2005-10-20 2010-03-23 Omron Corporation Three-dimensional shape measuring apparatus, program, computer-readable recording medium, and three-dimensional shape measuring method
JP2007121294A (en) 2005-10-24 2007-05-17 General Electric Co <Ge> Method and device for inspecting object
US7898651B2 (en) 2005-10-24 2011-03-01 General Electric Company Methods and apparatus for inspecting an object
US7929751B2 (en) 2005-11-09 2011-04-19 Gi, Llc Method and apparatus for absolute-coordinate three-dimensional surface imaging
US20090238449A1 (en) 2005-11-09 2009-09-24 Geometric Informatics, Inc Method and Apparatus for Absolute-Coordinate Three-Dimensional Surface Imaging
US20070206204A1 (en) 2005-12-01 2007-09-06 Peirong Jia Full-field three-dimensional measurement method
US7545516B2 (en) 2005-12-01 2009-06-09 University Of Waterloo Full-field three-dimensional measurement method
GB2434541A (en) 2006-01-30 2007-08-01 Mailling Wright Products Ltd Preparing a clinical restraint
US20070183631A1 (en) 2006-02-06 2007-08-09 Beijing University Of Aeronautics And Astronautics Methods and apparatus for measuring the flapping deformation of insect wings
US20070211258A1 (en) 2006-03-07 2007-09-13 Korea Advanced Institute Of Science And Technology Three-dimensional shape measurement apparatus and method for eliminating2pi ambiguity of moire principle and omitting phase shifting means
WO2007121953A1 (en) 2006-04-21 2007-11-01 Quiss Gmbh Device for automatically applying or producing and monitoring a structure placed on a substrate with a determination of geometric dimensions and corresponding method therefor
WO2007125081A1 (en) 2006-04-27 2007-11-08 Metris N.V. Optical scanning probe
JP2007315882A (en) 2006-05-25 2007-12-06 Dainippon Printing Co Ltd Apparatus and method for positioning substrate, and device and method for manufacturing color filter
US20090268214A1 (en) * 2006-05-26 2009-10-29 Miljenko Lucic Photogrammetric system and techniques for 3d acquisition
CN101105393A (en) 2006-07-13 2008-01-16 周波 Vision measuring method for projecting multiple frequency grating object surface tri-dimensional profile
US20080075328A1 (en) * 2006-09-15 2008-03-27 Sciammarella Cesar A System and method for analyzing displacements and contouring of surfaces
JP2008076107A (en) 2006-09-19 2008-04-03 Denso Corp Apparatus and method for visual inspection, height measuring method, and circuit board manufacturing method
US7256899B1 (en) 2006-10-04 2007-08-14 Ivan Faul Wireless methods and systems for three-dimensional non-contact shape sensing
US20100046005A1 (en) 2006-10-16 2010-02-25 Fraunhofer-Gesellschaft Zur Forderung Der Angewand Ten Forschung E.V. Electrostatice chuck with anti-reflective coating, measuring method and use of said chuck
WO2008046663A2 (en) 2006-10-16 2008-04-24 Fraunhofer Gesellschaft Zur Förderung Der Angewandten Forschung E. V. Device and method for the contactless detection of a three-dimensional contour
US20080239288A1 (en) 2007-04-02 2008-10-02 Korea Advanced Institute Of Science And Technology 3D shape measurement apparatus and method using stereo moire technique
US8111907B2 (en) * 2007-07-31 2012-02-07 United Technologies Corporation Method for repeatable optical determination of object geometry dimensions and deviations
WO2009024756A1 (en) 2007-08-17 2009-02-26 Renishaw Plc Non-contact measurement apparatus and method
WO2009024757A1 (en) 2007-08-17 2009-02-26 Renishaw Plc Phase analysis measurement apparatus and method
WO2009024758A1 (en) 2007-08-17 2009-02-26 Renishaw Plc Non-contact probe
WO2009094510A1 (en) 2008-01-25 2009-07-30 Cyberoptics Corporation Multi-source sensor for three-dimensional imaging using phased structured light
WO2009097066A1 (en) 2008-01-31 2009-08-06 Cyberoptics Corporation Improved method for three-dimensional imaging using multi-phase structured light
US20120307260A1 (en) 2008-04-01 2012-12-06 Perceptron, Inc. Hybrid system
US20100008588A1 (en) 2008-07-08 2010-01-14 Chiaro Technologies LLC Multiple channel locating
US20110317879A1 (en) * 2009-02-17 2011-12-29 Absolute Robotics Limited Measurement of Positional Information for a Robot Arm
US20100296104A1 (en) 2009-05-21 2010-11-25 General Electric Company Inspection system and method with multi-image phase shift analysis
US20100329538A1 (en) 2009-06-25 2010-12-30 Jason Remillard Cell Feature Extraction and Labeling Thereof

Non-Patent Citations (110)

* Cited by examiner, † Cited by third party
Title
"3D Coordinate Measurement -Milling on digitized data; Casted Blanks," www.gom.com, obtained Aug. 7, 2007, GOM mbH.
"3D-Digitizing of a Ford Focus-Interior/Exterior-Product Analysis," www.gom.com, obtained Oct. 6, 2008, GOM mbH.
"Application Example: 3D-Coordinate Measurement Mobile 3D Coordinate Measurement for Shipbuilding", 6 pages, GOM Optical Measuring Techniques, downloaded Sep. 6, 2012 from http://www.gom.com/fileadmin/user-upload/industries/shipbuilding-EN.pdf.
"Application Notes-TRITOP", 1 page, GOM Optical Measuring Techniques, downloaded Sep. 6, 2012 from http://www.gom.com/industries/application-notes-tritop.html.
"Measuring Systems-ATOS," http://www.gom.com/EN/measuring.systems/atos/system/system.html, obtained Oct. 6, 2008, GOM mbH.
"Measuring systems-TRITOP," http://www.gom.com/EN/measuring.systems/tritop/system.html, obtained Aug. 7, 2007, GOM mbH.
"Measuring Systems-TRITOP," http://www.gom.com/EN/measuring.systems/tritop/system/system.html, obtained Aug. 7, 2007, GOM mbH.
"optoTOP-HE-The HighEnd 3D Digitising System," http://www.breuckmann.com/index.php?id=optotop-he&L=2, obtained Oct. 6, 2008, Breuckmann.
"Picture Perfect Measurements, Do I need to use special targets with the system?", 1 page, Geodetic Systems Inc., downloaded Sep. 6, 2012 from http://www.geodetic.com/do-i-need-to-use-special-targets-with-system.aspx.
"Picture Perfect Measurements, The Basics of Photogrammetry", 14 pages, Geodetic Systems Inc., downloaded Sep. 6, 2012 from http://www.geodetic.com/v-stars/what-is-photogrammetry.aspx.
Apr. 23, 2012 Chinese Office Action issued in Chinese Patent Application No. 200880111248.8 (with translation).
Apr. 26, 2012 Office Action issued in European Patent Application No. 08 788 328.6.
Aug. 12, 2014 Office Action issued in Japanese Patent Application No. 2012-528441 (with translation).
Aug. 16, 2013 Office Action issued in Japanese Patent Application No. 2010-521465 (with translation).
Aug. 21, 2012 Office Action issued in U.S. Appl. No. 12/733,025.
Aug. 23, 2012 Office Action issued in U.S. Appl. No. 12/733,022.
Aug. 5, 2015 Office Action issued in U.S. Appl. No. 13/392,710.
Aug. 9, 2013 Office Action issued in Japanese Patent Application No. 2010-521466 (with translation).
Aug. 9, 2013 Office Action issued in Japanese Patent Application No. 2010-521467 (with translation).
Brauer-Burchardt et al., "Phase unwrapping in fringe projection systems using epipolar geometry," LNCS 5259, pp. 422-432 (2008).
Carré, "Installation et utilisation du comparateur photoélectrique et interférentiel du Bureau International des Poids et Mesures," 1966, Metrologia, pp. 13-23, vol. 2, No. 1, France (with abstract).
Chen et al., "Overview of three-dimensional shape measurement using optical methods," Opt. Eng., 2000, pp. 10-22, Society of Photo-Optical Instrumentation Engineers.
Chen et al., "Range data acquisition using color structured lighting and stereo vision," Image and Vision Computing, 1997, pp. 445-456, vol. 15, Elsevier.
Chinese Office Action issued in Chinese Application No. 200880111248.8 on Mar. 9, 2011 (translation only).
Clarke, "Non-contact measurement provides six of the best," Quality Today, 1998, pp. s46, s48.
Coggrave, "Wholefield Optical Metrology: Surface Profile Measurement," 2002-2004, pp. 1-35, Phase Vision Ltd.
Cooper et al., "Theory of close range photogrammetry," Close Range Photogrammetry and Machine Vision, 2001, pp. 9-51, Whittles Publishing.
Creath, "Comparison of Phase-Measurement Algorithms," Surface Characterization and Testing, 1986, pp. 19-28, SPIE, vol. 680.
Cuypers, W. et al., "Optical measurement techniques for mobile and large-scale dimensional metrology", Optics and Lasers in Engineering, 47, 2009, pp. 292-300.
Dec. 2, 2013 Chinese Office Action issued in Chinese Patent Application No. 201080040329.0 (with translation).
Dec. 21, 2010 International Search Report issued in International Application No. PCT/GB2010/001675.
Dec. 21, 2010 Written Opinion of the International Searching Authority issued in International Application No. PCT/GB2010/001675.
English translation of Jun. 15, 2011 Office Action issued in Chinese Patent Application No. 200880111247.3.
English-language translation of JP-A-11-211442 published Aug. 6, 1999.
English-language translation of JP-A-2007-93412 published Apr. 12, 2007.
Feb. 16, 2013 Office Action issued in Chinese Application No. 200880111247.3 (with translation).
Feb. 16, 2013 Office Action issued in Chinese Application No. 200880112194.7 (with translation).
Feb. 25, 2014 Office Action issued in Japanese Patent Application No. 2010-521467 (with translation).
Feb. 27, 2011 Office Action issued in European Patent Application No. 08 788 328.6.
Feb. 28, 2014 Notice of Allowance issued in U.S. Appl. No. 12/733,025.
Feb. 6, 2013 Office Action issued in U.S. Appl. No. 12/733,025.
Fryer, "Camera Calibration," Close Range Photogrammetry and Machine Vision, 1996, pp. 156-179, Whittles Publishing.
Galanulis et al., "Optical Digitizing by ATOS for Press Parts and Tools," www.gom.com, Feb. 2004, GOM mbH.
Geometrical Product Specifications (GPS) - Geometrical Features, British Standard, BS EN ISO 1466-1:2000.
Gruen, "Least squares matching: a fundamental measurement algorithm," Close Range Photogrammetry and Machine Vision, 2001, pp. 217-255, Whittles Publishing.
Hailong, J. et al., "Shape reconstruction methods from gradient field," Laser Journal, 2007, pp. 41-43, vol. 28, No. 6 (with Abstract).
Heikkilä et al., "A Four-step Camera Calibration Procedure with Implicit Image Correction," 1997, Proceedings of the 1997 Conference in Computer Vision and Pattern Recognition.
Human translation of DE 196 34 254 A1.
Huntley et al., "Shape measurement by temporal phase unwrapping: comparison of unwrapping algorithms," Measurement Science and Technology 8, pp. 986-992 (1997).
Ishiyama et al., "Precise 3-D Measurement Using Uncalibrated Pattern Projection," IEEE International Conference on Image Processing, 2007, pp. 225-228.
Ishiyama et al., "Absolute phase measurements using geometric constraints between multiple cameras and projectors," Applied Optics 46 (17), pp. 3528-3538 (2007).
Jan. 28, 2014 Office Action issued in Japanese Patent Application No. 2012-528441 (with translation).
Jan. 29, 2015 Office Action issued in U.S. Appl. No. 13/392,710.
Jan. 30, 2012 Office Action issued in European Application No. 08 788 327.8.
Jan. 30, 2012 Office Action issued in European Application No. 08 788 329.4.
Jan. 31, 2014 Notice of Allowance issued in U.S. Appl. No. 12/733,022.
Jan. 6, 2012 Second Office Action issued in Chinese Patent Application No. 200880111248.8 (translation only).
Jul. 22, 2014 Office Action issued in Japanese Patent Application No. 2010-521465 (with translation).
Jul. 24, 2014 Office Action issued in European Application No. 08 788 329.4.
Jul. 25, 2013 Office Action issued in U.S. Appl. No. 12/733,022.
Jul. 6, 2015 Office Action issued in European Application No. 10 761 058.6.
Jun. 15, 2011 Office Action issued in Chinese Patent Application No. 200880111247.3 (translation only).
Jun. 15, 2011 Office Action issued in Chinese Patent Application No. 200880112194.7.
Kemper et al., "Quantitative determination of out-of-plane displacements by endoscopic electronic-speckle-pattern interferometry", Optics Communication 194, pp. 75-82, Jul. 1, 2001.
Kim et al., "An active trinocular vision system of sensing indoor navigation environment for mobile robots," Sensors and Actuators A, Sep. 2005, pp. 192-209, vol. 125, Elsevier.
Korner et al., "Absolute macroscopic 3-D measurements with the innovative depth-scanning fringe projection technique (DSFP)," Optik - International Journal for Light and Electron Optics, 2001, pp. 433-441, vol. 112, No. 9.
Kowarschik et al., "Adaptive optical three-dimensional measurement with structured light," Optical Engineering 39 (1), pp. 150-158 (2000).
Kühmstedt et al., "3D shape measurement with phase correlation based fringe projection," Optical Measurement Systems for Industrial Inspection V, 2007, pp. 1-9, vol. 6616, Proc. of SPIE.
Leymarie, "Theory of Close Range Photogrammetry," http://www.lems.brown.edu/vision/people/leymarie/Refs/Photogrammetry/Atkinson90/Ch2Theory.html, May 10, 2010 update, Ch. 2 of [Atkinson 90], obtained Mar. 31, 2010.
Mar. 12, 2013 Office Action issued in U.S. Appl. No. 12/733,022.
Mar. 3, 2015 Notice for Revocation of Reconsideration issued in Japanese Application No. 2010-521465.
Mar. 9, 2011 Chinese Office Action issued in Chinese Application No. 200880111248.8 (translation only).
Marapane, "Region-Based Stereo Analysis for Robotic Applications", IEEE, 1989, pp. 307-324.
May 28, 2014 Notice of Allowance issued in U.S. Appl. No. 12/733,025.
May 3, 2012 Chinese Office Action issued in Chinese Patent Application No. 200880112194.7 (with translation).
May 3, 2012 Office Action issued in Chinese Patent Application No. 200880111247.3 (with translation).
Nov. 16, 2012 Japanese Office Action issued in Application No. 2010-521466 (with translation).
Nov. 16, 2012 Office Action issued in Japanese Patent Application No. 2010-521467 (with translation).
Nov. 18, 2008 International Search Report issued in International Patent Application No. PCT/GB2008/002758.
Nov. 18, 2008 International Search Report issued in International Patent Application No. PCT/GB2008/002759.
Nov. 18, 2008 International Search Report issued in International Patent Application No. PCT/GB2008/002760.
Nov. 18, 2008 Written Opinion issued in International Patent Application No. PCT/GB2008/002758.
Nov. 18, 2008 Written Opinion issued in International Patent Application No. PCT/GB2008/002759.
Nov. 18, 2008 Written Opinion issued in International Patent Application No. PCT/GB2008/002760.
Oct. 20, 2014 Notice of Allowance issued in U.S. Appl. No. 12/733,025.
Parker, "Advanced Edge-Detection Techniques: The Canny and the Shen-Castan Methods," 1997, pp. 1-33, John Wiley & Sons, Inc.
Patil et al., "Guest editorial. Moving ahead with phase," Optics and Lasers in Engineering 45, pp. 253-257 (2007).
Pawlowski et al., "Shape and Position Determination Based on Combination of Photogrammetry with Phase Analysis of Fringe Patterns", CAIP 2001, LNCS 2124, pp. 391-399.
Reeves et al., "Dynamic shape measurement system for laser materials processing," Optical Engineering 42 (10), pp. 2923-2929 (2003).
Reich et al., "3-D shape measurement of complex objects by combining photogrammetry and fringe projection," Optical Engineering, 2000, pp. 224-231, vol. 39, No. 1, Society of Photo-Optical Instrumentation Engineers.
Sansoni et al., "Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors," Applied Optics, 1999, pp. 6565-6573, vol. 38, No. 31, Optical Society of America.
Sasso et al., "Superimposed fringe projection for three-dimensional shape acquisition by image analysis," Applied Optics 48 (13), pp. 2410-2420 (2009).
Scharstein et al., "High-Accuracy Stereo Depth Maps Using Structured Light," Proc. 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2003, pp. 195-202, vol. 1, Computer Society.
Schreiber et al., "Managing some calibration problems in fringe projection shape measurement systems," Measurement systems for Optical Methodology, 1997, pp. 443-450, Fringe.
Schreiber et al., "Theory and arrangements of self-calibrating whole-body three-dimensional measurement systems using fringe projection technique," Optical Engineering, 2000, pp. 159-169, vol. 39, No. 1, Society of Photo-Optical Instrumentation Engineers.
Sep. 11, 2013 Office Action issued in U.S. Appl. No. 12/733,025.
Sep. 21, 2012 Office Action issued in Japanese Patent Application No. 2010-521465 (with translation).
Sep. 29, 2009 Search Report issued in Great Britain Application No. GB0915904.7.
Stoilov et al., "Phase-stepping Interferometry: Five-frame Algorithm with an Arbitrary Step," Optics and Lasers in Engineering, 1997, pp. 61-69, vol. 28.
Takasaki, "MOIRE Topography", Applied Optics, Jun. 1970, vol. 9, No. 6, pp. 1467-1472.
Takeda et al., "Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry," Optical Society of America, Jan. 1982, vol. 72, No. 1, pp. 156-160.
Takeda et al., "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Applied Optics 22 (24), pp. 3977-3982 (1983).
Tsai, R. et al., "A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration," IEEE Transactions on Robotics and Automation, Jun. 1989, pp. 345-358, vol. 5, No. 3.
U.S. Appl. No. 12/733,022, filed Feb. 3, 2010 in the name of Weston et al.
U.S. Appl. No. 12/733,025, filed Feb. 3, 2010 in the name of Weston et al.
U.S. Appl. No. 13/392,710, filed Feb. 27, 2012 in the name of Weston et al.
Wallace et al., "High-speed photogrammetry system for measuring the kinematics of insect wings," Applied Optics, vol. 45, No. 17, Jun. 10, 2006, pp. 4165-4173.
Wolfson, et al., "Three-Dimensional Vision Technology Offers Real-Time Inspection Capability," Sensor Review, 1997, pp. 299-303, vol. 17, No. 4, MCB University Press.
Wong et al., "Fast acquisition of dense depth data by a new structured light scheme," Computer Vision and Image Understanding, Dec. 2004, pp. 398-422, vol. 98, Elsevier.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11454106B2 (en) 2017-06-15 2022-09-27 Drillscan France Sas Generating drilling paths using a drill model

Also Published As

Publication number Publication date
US20100142798A1 (en) 2010-06-10
EP2183545A1 (en) 2010-05-12
JP5485889B2 (en) 2014-05-07
EP2183545B1 (en) 2014-12-17
US20100158322A1 (en) 2010-06-24
US20100135534A1 (en) 2010-06-03
US8792707B2 (en) 2014-07-29
CN101828092B (en) 2014-03-12
EP2183546B1 (en) 2015-10-21
JP5943547B2 (en) 2016-07-05
CN101821579B (en) 2013-01-23
EP2183544B1 (en) 2015-07-15
JP2015057612A (en) 2015-03-26
EP2977719A1 (en) 2016-01-27
EP2183544A1 (en) 2010-05-12
EP2183546A1 (en) 2010-05-12
CN101821578B (en) 2014-03-12
WO2009024758A1 (en) 2009-02-26
WO2009024756A1 (en) 2009-02-26
CN101821578A (en) 2010-09-01
US8605983B2 (en) 2013-12-10
JP2010537183A (en) 2010-12-02
CN101821579A (en) 2010-09-01
WO2009024757A1 (en) 2009-02-26
CN101828092A (en) 2010-09-08
JP2010537182A (en) 2010-12-02
JP5689681B2 (en) 2015-03-25
JP2010537181A (en) 2010-12-02
US8923603B2 (en) 2014-12-30

Similar Documents

Publication Publication Date Title
USRE46012E1 (en) Non-contact probe
US9329030B2 (en) Non-contact object inspection
US8243286B2 (en) Device and method for the contactless detection of a three-dimensional contour
US10812694B2 (en) Real-time inspection guidance of triangulation scanner
Eiríksson et al. Precision and accuracy parameters in structured light 3-D scanning
US20150015701A1 (en) Triangulation scanner having motorized elements
EP0866308A2 (en) Optical profile sensor
JP2011504586A (en) Method for optically measuring the three-dimensional shape of an object
CN105960569A (en) Methods of inspecting a 3d object using 2d image processing
Berssenbrügge et al. Characterization of the 3D resolution of topometric sensors based on fringe and speckle pattern projection by a 3D transfer function
KR20040071531A (en) Three-dimensional image measuring apparatus and method thereof
Schreiber et al. Optical 3D coordinate-measuring system using structured light

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8