US9739605B2 - Shape measurement apparatus and method - Google Patents


Info

Publication number: US9739605B2
Authority
US
United States
Prior art keywords: height, weight, respect, grating, pattern
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Application number: US15/002,784
Other versions: US20160153772A1
Inventors
Joong-Ki Jeong
Min-Young Kim
Seung-Jun Lee
Current Assignee
Koh Young Technology Inc
Original Assignee
Koh Young Technology Inc
Priority claimed from KR1020090044423A (KR101088497B1)
Priority claimed from KR1020100007025A (KR101158324B1)
Application filed by Koh Young Technology Inc
Priority to US15/002,784
Publication of US20160153772A1
Application granted
Publication of US9739605B2
Status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2531 Using several gratings, projected with variable angle of incidence on the object, and one detection device
    • G01B11/2513 With several lines being projected in more than one direction, e.g. grids, patterns
    • G01B11/254 Projection of a pattern, viewing through a pattern, e.g. moiré
    • G06K9/036
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/956 Inspecting patterns on the surface of objects
    • G01N2021/95638 Inspecting patterns on the surface of objects for PCB's

Definitions

  • Exemplary embodiments of the present invention relate to a shape measurement apparatus and a shape measurement method. More particularly, exemplary embodiments of the present invention relate to a shape measurement apparatus and a shape measurement method capable of enhancing accuracy of measurement.
  • Techniques for measuring a three-dimensional shape may be used for inspection of a printed circuit board, and efforts to enhance the accuracy of such measurement are ongoing.
  • Exemplary embodiments of the present invention provide a shape measurement apparatus capable of measuring a two-dimensional shape together with a three-dimensional shape, and enhancing accuracy of measurement.
  • Exemplary embodiments of the present invention also provide a shape measurement method capable of measuring a two-dimensional shape together with a three-dimensional shape, and enhancing accuracy of measurement.
  • Exemplary embodiments of the present invention also provide a method of measuring a three dimensional shape capable of accurately measuring a three dimensional shape of a measurement target in total areas.
  • An exemplary embodiment of the present invention discloses a shape measurement apparatus.
  • the shape measurement apparatus includes a work stage supporting a target substrate; a pattern-projecting section including a light source, a grating part transmitting and blocking light generated by the light source to generate a grating image, and a projecting lens part forming the grating image on a measurement target of the target substrate; an image-capturing section capturing the grating image reflected by the measurement target of the target substrate; and a control section controlling the work stage, the pattern-projecting section and the image-capturing section, calculating a reliability index of the grating image and phases of the grating image corresponding to the measurement target, and inspecting the measurement target by using the reliability index and the phases.
  • the shape measurement apparatus may inspect a surface of a pad through the reliability index when the pad is the measurement target.
  • the pad may be configured to be electrically connected to an external device.
  • the reliability index may be at least one of an intensity, a visibility and a signal to noise ratio.
  • the control section may determine that the pad is bad when the reliability index is out of a setup value.
  • the shape measurement apparatus may further include a subsidiary light source for inspecting the measurement target of the target substrate. The control section may determine that the pad is bad when light generated by the subsidiary light source is reflected by the pad and captured by the image-capturing section to form a two-dimensional image, and the pad is determined as bad in the two-dimensional image, even though the reliability index shows that the pad is good.
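The pad-inspection decision described above can be sketched as follows; the function and parameter names are illustrative, not taken from the patent, and the setup range stands in for the patent's "setup value":

```python
# Illustrative sketch of the pad-inspection logic: a pad is judged good only
# if its reliability index falls within a setup range AND the subsidiary-light
# two-dimensional image shows no defect (the 2D check can override a passing
# reliability index, as described above).

def inspect_pad(reliability_index, setup_range, is_bad_in_2d_image):
    """Return 'good' or 'bad' for a pad under inspection.

    reliability_index: e.g. intensity, visibility, or SNR of the grating image.
    setup_range: (low, high) acceptance bounds chosen during setup.
    is_bad_in_2d_image: result of inspecting the subsidiary-light 2D image.
    """
    low, high = setup_range
    if not (low <= reliability_index <= high):
        return "bad"   # reliability index out of the setup value
    if is_bad_in_2d_image:
        return "bad"   # pad determined as bad in the 2D image (e.g. scratch)
    return "good"

print(inspect_pad(0.8, (0.5, 1.0), False))  # good
print(inspect_pad(0.8, (0.5, 1.0), True))   # bad
```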
  • the shape measurement method includes acquiring a grating image reflected by a measurement target while shifting the grating image a specific number of times, acquiring a reliability index including at least one of an intensity, a visibility and a signal-to-noise ratio of the grating image by using the grating image, and determining that a pad to be electrically connected to an external device is good when the reliability index is within a setup value, and bad when the reliability index is out of the setup value, in case that the pad is the measurement target.
  • Still another exemplary embodiment of the present invention discloses a method of measuring a three dimensional shape.
  • the method includes illuminating grating pattern lights in a plurality of directions onto a measurement target while changing each of the grating pattern lights N times and detecting the grating pattern lights reflected by the measurement target, to acquire N pattern images of the measurement target with respect to each direction; extracting a phase {P_i(x,y)} and a brightness {A_i(x,y)} with respect to each direction corresponding to each position {i(x,y)} in an X-Y coordinate system from the pattern images; extracting a height weight {W_i(x,y)} with respect to each direction by using a weight function employing the brightness as a parameter; and calculating a weight height {W_i(x,y)·H_i(x,y)} with respect to each direction by using a height based on the phase with respect to each direction and the height weight, and summing the weight heights, to thereby produce a height at each position.
  • the brightness may correspond to an average brightness that is obtained by averaging the detected grating pattern lights.
  • the weight function may further employ at least one of a visibility and an SNR (signal-to-noise ratio) with respect to each direction extracted from the pattern images with respect to each direction as parameters.
  • the weight function may further employ a measurement scope (Λ) corresponding to a grating pitch of each grating pattern light extracted from the pattern images with respect to each direction as a parameter.
  • the measurement scope may have at least two values according to the grating pattern lights.
  • the weight function may decrease the height weight, as the average brightness increases or decreases from a predetermined value.
  • the predetermined value may be a mid value of the average brightness.
  • the weight function may increase the height weight, as the visibility or the SNR increases.
  • the weight function may decrease the height weight, as the measurement scope increases.
  • Extracting the height weight with respect to each direction may include dividing the pattern images into a shadow area, a saturation area and a non-saturation area.
  • the shadow area corresponds to an area, in which the average brightness is below a minimum brightness and the visibility or the SNR is below a minimum reference value
  • the saturation area corresponds to an area, in which the average brightness is more than a maximum brightness and the visibility or the SNR is below the minimum reference value
  • the non-saturation area corresponds to a remaining area except the shadow area and the saturation area.
  • the weight function may be regarded as ‘0’ to obtain the height weight in the shadow area and the saturation area.
  • the weight function corresponding to the non-saturation area may decrease the height weight, as the average brightness increases or decreases from a mid value of the average brightness, may increase the height weight as the visibility or the SNR increases, and may decrease the height weight as the measurement scope increases.
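The weight-function behavior described above can be sketched as follows. The thresholds, the linear brightness term, and the 1/scope factor are illustrative assumptions chosen to reproduce the qualitative behavior, not the patent's actual functional form:

```python
# Sketch of a height-weight function with the behavior described above:
# zero weight in shadow and saturation areas; otherwise the weight decreases
# as the average brightness moves away from its mid value, increases with
# visibility (or SNR), and decreases as the measurement scope increases.
# All constants below are illustrative assumptions.

def height_weight(avg_brightness, visibility, scope,
                  min_bright=10, max_bright=245, min_vis=0.05, mid=128):
    # Shadow area: brightness below minimum AND visibility/SNR below reference
    if avg_brightness < min_bright and visibility < min_vis:
        return 0.0
    # Saturation area: brightness above maximum AND visibility/SNR below reference
    if avg_brightness > max_bright and visibility < min_vis:
        return 0.0
    # Non-saturation area: assumed linear falloff from the mid brightness,
    # proportional to visibility, inversely proportional to the scope.
    brightness_term = max(1.0 - abs(avg_brightness - mid) / mid, 0.0)
    return brightness_term * visibility / scope
```

With these assumptions, a pixel at the mid brightness gets the largest weight for a given visibility and scope, and a pixel that is both dark and low-visibility contributes nothing.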
  • Still another exemplary embodiment of the present invention discloses a method of measuring a three dimensional shape.
  • the method includes illuminating grating pattern lights in a plurality of directions onto a measurement target while changing each of the grating pattern lights N times and detecting the grating pattern lights reflected by the measurement target, to acquire N pattern images of the measurement target with respect to each direction; extracting a phase {P_i(x,y)} and a visibility {V_i(x,y)} with respect to each direction corresponding to each position {i(x,y)} in an X-Y coordinate system from the pattern images; extracting a height weight {W_i(x,y)} with respect to each direction by using a weight function employing the visibility as a parameter; and calculating a weight height {W_i(x,y)·H_i(x,y)} with respect to each direction by multiplying a height based on the phase by the height weight, and summing the weight heights, to produce a height at each position.
  • a two-dimensional shape image may be obtained by using the measured three-dimensional data, so that additional data for the two-dimensional shape image may not be required.
  • the average brightness, the visibility or SNR, and the measurement scope are extracted from the pattern images photographed in each direction, and the height weight is determined according to the extracted result, to thereby more accurately measure a height at each position of the measurement target over the total area, including a shadow area and a saturation area.
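The per-direction combination described above can be sketched as a weighted average at one pixel. Normalizing by the sum of weights is an assumption on top of the text, which says only that the weight heights are summed:

```python
# Sketch of combining per-direction heights H_i with height weights W_i at
# one pixel: a weighted average, so that shadowed or saturated directions
# (weight near zero) barely affect the result. Normalization by the total
# weight is an assumption, not stated in the text.

def combined_height(heights, weights):
    """heights, weights: per-direction values H_i(x,y), W_i(x,y) at one pixel."""
    total_w = sum(weights)
    if total_w == 0:  # e.g. the pixel is in shadow/saturation for every direction
        return 0.0
    return sum(w * h for w, h in zip(weights, heights)) / total_w

# A direction with low weight contributes little to the final height:
print(combined_height([10.0, 30.0], [0.75, 0.25]))  # 15.0
```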
  • FIG. 1 is a schematic side view illustrating a shape measurement apparatus according to an exemplary embodiment of the present invention.
  • FIG. 2 is a schematic top view illustrating a shape measurement apparatus according to another exemplary embodiment of the present invention.
  • FIG. 3 is a top view illustrating a target substrate in FIG. 1 .
  • FIG. 4 is a diagram showing the shape measurement apparatus measuring a three-dimensional image.
  • FIG. 5 is a set of graphs showing a principle for measuring a two-dimensional image.
  • FIG. 6 is a schematic view illustrating a three dimensional shape measurement apparatus used in a method of measuring a three dimensional shape according to an exemplary embodiment of the present invention.
  • FIG. 7 is a plan view illustrating a grating pattern image by a grating pattern light illuminated onto a measurement target in FIG. 6 .
  • FIG. 8 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a right side.
  • FIG. 9 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a left side.
  • FIG. 10 is a graph showing a relation between average brightness and weight of the pattern images measured in the camera.
  • FIG. 11 is a graph showing a relation between visibility or SNR and weight of the pattern images measured in the camera.
  • FIG. 12 is a graph showing a relation between measurement scope and weight of the pattern images measured in the camera.
  • first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present invention.
  • spatially relative terms such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
  • Example embodiments of the invention are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized example embodiments (and intermediate structures) of the present invention. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of the present invention should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle will, typically, have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region.
  • a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place.
  • the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of the present invention.
  • FIG. 1 is a schematic side view illustrating a shape measurement apparatus according to an exemplary embodiment of the present invention.
  • a shape measurement apparatus 1100 includes a work stage 1130 , a pattern-projecting section 1110 , an image-capturing section 1150 and a control section 1140 . Additionally, the shape measurement apparatus 1100 may further include a first subsidiary light source 1160 and a second subsidiary light source 1170 .
  • the work stage 1130 supports a target substrate 1120 on which a measurement target A is disposed. Furthermore, the work stage 1130 transports the measurement target A along at least one of an x-axis direction and a y-axis direction.
  • the first subsidiary light source 1160 and the second subsidiary light source 1170 may radiate light toward the measurement target A of the target substrate 1120 to set up the total measuring region of the target substrate 1120 by using, for example, an identification mark of the target substrate 1120 .
  • the pattern-projecting section 1110 projects a grating image toward the measurement target A.
  • the shape measurement apparatus 1100 may include a plurality of the pattern-projecting sections 1110 disposed such that the plurality of pattern-projecting sections 1110 project grating images toward the target substrate 1120 with a specific angle with respect to a normal line of the target substrate 1120 . Furthermore, the plurality of pattern-projecting sections 1110 may be disposed symmetric with respect to the normal line.
  • Each of the pattern-projecting sections 1110 includes a light source 1111 , a grating part 1112 and a projecting lens part 1113 . For example, two pattern-projecting sections 1110 may be disposed symmetrically with respect to the measurement target A.
  • the light source 1111 radiates light toward the measurement target A.
  • the grating part 1112 makes a grating image by using the light generated by the light source 1111 .
  • the grating part 1112 includes a light-blocking region (not shown) and a light-transmitting region (not shown).
  • the light-blocking region blocks a portion of the light generated by the light source 1111 , and the light-transmitting region transmits other portion of the light.
  • the grating part 1112 may be formed in various types.
  • the grating part 1112 may be formed by a glass plate on which a grating with a light-blocking region and a light-transmitting region is patterned.
  • a liquid crystal display panel may be used as the grating part 1112 .
  • the shape measurement apparatus 1100 further includes an actuator (not shown) for minutely transporting the grating part 1112 .
  • when a liquid crystal display panel is employed as the grating part 1112 , a grating pattern may be displayed by the liquid crystal display panel, so that the shape measurement apparatus 1100 does not need the actuator.
  • the projecting lens part 1113 makes a grating image of the grating part 1112 on the measurement target A of the target substrate 1120 .
  • the projecting lens part 1113 may include, for example, a plurality of lenses to focus the grating image of the grating part 1112 so that the grating image is displayed on the measurement target A on the target substrate 1120 .
  • the image-capturing section 1150 receives the grating image reflected by the measurement target A of the target substrate 1120 .
  • the image-capturing section 1150 includes, for example, a camera 1151 and a capturing lens part 1152 .
  • the grating image reflected by the measurement target A passes through the capturing lens part 1152 to be captured by the camera 1151 .
  • the control section 1140 controls the work stage 1130 , the pattern-projecting section 1110 and the image-capturing section 1150 , calculates a reliability index of the grating image captured by the image-capturing section 1150 and phases of the measurement target A, and processes the grating image captured by the image-capturing section 1150 to measure a two-dimensional shape and a three-dimensional shape.
  • the process for measuring the two-dimensional shape and the three-dimensional shape, which is performed by the control section 1140 , will be explained later in detail.
  • the control section 1140 inspects the measurement target by using the phase and the reliability index.
  • the phase may be used for measuring the three-dimensional shape of the measurement target A
  • the reliability index may be used for determining good or bad regarding the measurement target.
  • at least one of a signal intensity, a visibility and an SNR (Signal to Noise Ratio) may be used for the reliability index.
  • the signal intensity may be explained referring to Expression 14 and Expression 15, the visibility may be explained referring to Expression 16 or Expression 17, and the SNR means a ratio of or difference between a periodic function generated during the N-bucket algorithm process of filtering images captured by the image-capturing section 1150 and a real signal.
  • for example, the SNR is (visibility × D in Expression 1)/(temporal noise of D).
  • when the reliability index is out of the setup value, the control section 1140 determines the measurement target A as a bad one.
  • control section 1140 determines that the measurement target is bad, when the difference between the visibility ⁇ of a specific region of the shape image obtained through Expression 16 or Expression 17 and the visibility ⁇ of peripheral region is out of the range of the setup value.
  • one of the first subsidiary light source 1160 and the second subsidiary light source 1170 may be used for measuring a two-dimensional shape.
  • one of the first subsidiary light source 1160 and the second subsidiary light source 1170 radiates light toward the measurement target A of the target substrate 1120 , and the reflected light is captured by the camera 1151 of the image-capturing section 1150 to generate a two-dimensional shape image.
  • control section 1140 may determine that the measurement target A is bad when the luminance difference between the specific region of the two-dimensional shape image and the peripheral region of the two-dimensional shape image is out of another setup value. Furthermore, the control section 1140 may determine that the measurement target A is bad when the luminance of a specific region of the measurement target A is out of another setup value.
  • the control section 1140 determines that the measurement target A is bad when luminance difference or intensity difference between the specific region and the peripheral region of the two-dimensional image obtained through the first subsidiary light source 1160 or the second subsidiary light source 1170 is out of another setup value.
  • the control section 1140 inspects the two-dimensional shape and the three-dimensional shape of a region of interest (ROI) in fields of view (FOV) in sequence.
  • FIG. 2 is a schematic top view illustrating a shape measurement apparatus according to another exemplary embodiment of the present invention.
  • the shape measurement apparatus according to the present embodiment is substantially the same as the shape measurement apparatus 1100 in FIG. 1 except for the pattern-projecting section. Therefore, the same reference numerals will be used for the same elements and any further explanation will be omitted.
  • the shape measurement apparatus includes a plurality of pattern-projecting sections 1110 , each of which has a grating part 1112 .
  • the plurality of pattern-projecting sections 1110 is arranged at apexes of a polygon.
  • four pattern-projecting sections 1110 are arranged at apexes of a square.
  • the plurality of pattern-projecting sections 1110 may be arranged at apexes of hexagon, octagon, etc.
  • When the grating image is captured at only one side, an exact three-dimensional shape may not be obtained, since the measurement target A is a protrusion so that the grating image may not arrive at the other side. Therefore, the grating image may be captured at both sides opposite to each other in order to obtain the exact three-dimensional shape.
  • control section 1140 may turn on two pattern-projecting sections 1110 disposed opposite to each other.
  • the control section 1140 may turn on more than two pattern-projecting sections 1110 .
  • FIG. 3 is a top view illustrating a target substrate in FIG. 1 .
  • the target substrate 1120 such as a printed circuit board (PCB) includes, for example, a pad region 1121 (or fan out region) and a device-mounting region 1122 .
  • the pad region 1121 is a region in which a pad for electrical connecting is formed, and the device-mounting region 1122 is a region on which a device is mounted.
  • a device is mounted on the device-mounting region 1122 through solder paste.
  • if the solder paste is not properly formed, the device may be badly electrically connected with other devices, inducing a malfunction. Therefore, in order to check that the shape or the amount of the solder paste is properly controlled, the shape and the height of the solder paste are measured to obtain a three-dimensional shape of the solder paste.
  • the pad region 1121 should be checked to prevent an electrical short with another pad region.
  • two-dimensional shape obtained through Expression 14 or Expression 15 may be used for checking electrical short between the pad regions.
  • the pad region 1121 should have a flat surface. When the pad region 1121 is scratched, it may induce a bad connection with a device. Therefore, the surface inspection of the pad region 1121 is very important.
  • the reliability index of the pad region 1121 is inspected; when the reliability index is out of the setup value, the pad region 1121 is determined to be bad.
  • when a luminance difference between a specific region and a peripheral region in a two-dimensional image obtained by using one of the first subsidiary light source 1160 and the second subsidiary light source 1170 in FIG. 1 is out of another setup value, the pad is determined to be bad since the pad has a scratch.
  • the pad region 1121 is a flat metal surface, so that the amount of light reflected by the pad region 1121 and captured by the camera 1151 of the image-capturing section 1150 in FIG. 1 may be saturated. Therefore, a shifted phase value may be measured.
  • the reliability index may be measured. Therefore, the pad region 1121 may be inspected by using the reliability index even when the amount of the light reflected by the pad region 1121 is saturated.
  • the reliability index of each pattern-projecting section 1110 may be used as a weight value for the height measured by the each pattern-projecting section 1110 .
  • the shape measurement method according to the present embodiment is substantially the same as that described for the shape measurement apparatus. That is, according to the shape measurement method of the present invention, grating images reflected by a measurement target are obtained while shifting a grating several times. Then, the reliability index of the grating images is obtained. When the reliability index is within the setup value, the measurement target is determined to be good, and when the reliability index is out of the setup value, the measurement target is determined to be bad.
  • two-dimensional shape image of the measurement target may be obtained, and even when the reliability index of the pad is within the setup value, the pad may be determined to be bad when a luminance difference between a specific region and a peripheral region of the two-dimensional shape image is out of a specific value.
  • FIG. 4 is a diagram showing the shape measurement apparatus measuring a three-dimensional image.
  • the grating image is radiated onto the target substrate 1120 in FIG. 1 .
  • the intensity I of images reflected by the target substrate 1120 and captured by the image-capturing section 1150 is expressed as the following Expression 1, corresponding to the Moire equation:
  • I = D[1 + γ cos(2πh/Λ)] (Expression 1)
  • where I is the intensity captured by the image-capturing section 1150 , D is the signal intensity (a function of DC light intensity (or light source intensity) and reflectivity), γ is the visibility (a function of reflectivity and the period of the grating), and Λ is the Moire equivalence period (a function of magnification, the period of the grating and the radiation angle θ).
  • intensity I is a function of height h, so that height h may be obtained by using intensity I.
  • When the phase of the grating is shifted and the reflected image is captured by the image-capturing section 1150 in FIG. 1 , Expression 1 may be expressed as Expression 2.
  • I_k = D[1 + γ cos(2πh/Λ + δ_k)] (Expression 2)
  • where δ_k is the phase shift, and 2πh/Λ corresponds to the phase φ of the measurement target.
  • the height h may be obtained as follows.
  • zero radian is applied as δ_1 to obtain I_1 , and then Expression 2 is expressed as the following Expression 3.
  • 2π/3 radian is applied as δ_2 to obtain I_2 , and then Expression 2 is expressed as the following Expression 4.
  • 4π/3 radian is applied as δ_3 to obtain I_3 , and then Expression 2 is expressed as the following Expression 5.
  • the height h may be obtained as the following Expression 7.
  • the height h may be obtained as follows.
  • zero radian is applied as δ_1 to obtain I_1 , and then Expression 2 is expressed as the following Expression 8.
  • π/2 radian is applied as δ_2 to obtain I_2 , and then Expression 2 is expressed as the following Expression 9.
  • π radian is applied as δ_3 to obtain I_3 , and then Expression 2 is expressed as the following Expression 10.
  • 3π/2 radian is applied as δ_4 to obtain I_4 , and then Expression 2 is expressed as the following Expression 11.
  • the height h may be obtained as the following Expression 13.
  • three-dimensional shape of the measurement target may be obtained by using Expression 7 or Expression 13.
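The bucket derivations above admit the standard phase-shifting closed forms. The sketch below assumes the usual results for shifts of {0, 2π/3, 4π/3} (3-bucket) and {0, π/2, π, 3π/2} (4-bucket), consistent with Expression 2; the exact forms of Expressions 7 and 13 are not reproduced in this text:

```python
import math

# Standard closed-form phase recovery for the 3- and 4-bucket algorithms
# applied to I_k = D[1 + gamma*cos(2*pi*h/Lambda + delta_k)] (Expression 2).

def phase_3bucket(i1, i2, i3):
    # Shifts 0, 2*pi/3, 4*pi/3: tan(phi) = sqrt(3)*(I3-I2) / (2*I1-I2-I3)
    return math.atan2(math.sqrt(3) * (i3 - i2), 2 * i1 - i2 - i3)

def phase_4bucket(i1, i2, i3, i4):
    # Shifts 0, pi/2, pi, 3*pi/2: tan(phi) = (I4-I2) / (I1-I3)
    return math.atan2(i4 - i2, i1 - i3)

def height_from_phase(phase, moire_period):
    # From Expression 2: phase = 2*pi*h/Lambda  ->  h = Lambda*phase/(2*pi)
    return moire_period * phase / (2 * math.pi)

# Round trip: simulate the four bucket intensities and recover the phase.
D, gamma, true_phase = 100.0, 0.5, 0.7
i = [D * (1 + gamma * math.cos(true_phase + d))
     for d in (0, math.pi / 2, math.pi, 3 * math.pi / 2)]
print(round(phase_4bucket(*i), 6))  # ~0.7
```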
  • FIG. 5 is a set of graphs showing a principle for measuring a two-dimensional image.
  • the arithmetic mean value I_ave of I_1 , I_2 , I_3 and I_4 may be obtained as the following Expression 14.
  • effect of grating may be offset when averaged, so that two-dimensional shape image may be obtained.
  • the visibility γ in Expression 2 may be expressed as the following Expression 16 by using Expressions 3, 4, 5 and 15 in case of the 3-bucket algorithm.
  • the visibility γ in Expression 2 may be expressed as the following Expression 17 by using Expressions 8, 9, 10, 11 and 14 in case of the 4-bucket algorithm.
  • a two-dimensional shape image may be obtained by using the measured three-dimensional data, so that additional data for the two-dimensional shape image may not be required.
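For the 4-bucket case, the averaging and visibility computations above can be sketched as follows. Averaging the four bucket images cancels the grating term (Expression 14), and the visibility is assumed to take the standard quadrature form, which this text attributes to Expression 17 without reproducing it:

```python
import math

# Sketch of the 2D image and visibility for the 4-bucket algorithm applied to
# I_k = D[1 + gamma*cos(phi + delta_k)], delta_k in {0, pi/2, pi, 3*pi/2}.
# The cosine terms cancel in the average, leaving I_ave = D (the 2D image),
# and the quadrature amplitude gives gamma (assumed standard closed form).

def average_and_visibility(i1, i2, i3, i4):
    i_ave = (i1 + i2 + i3 + i4) / 4.0         # Expression 14: 2D image value, = D
    amplitude = math.hypot(i1 - i3, i4 - i2)  # = 2*D*gamma
    return i_ave, amplitude / (2.0 * i_ave)   # (D, gamma)

# With D=100 and gamma=0.5, the original parameters are recovered:
D, g, p = 100.0, 0.5, 1.1
i = [D * (1 + g * math.cos(p + d)) for d in (0, math.pi/2, math.pi, 3*math.pi/2)]
i_ave, vis = average_and_visibility(*i)
print(round(i_ave, 6), round(vis, 6))  # 100.0 0.5
```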
  • FIG. 6 is a schematic view illustrating a three dimensional shape measurement apparatus used in a method of measuring a three dimensional shape according to an exemplary embodiment of the present invention.
  • a three dimensional shape measurement apparatus used in a method of measuring a three dimensional shape may include a measurement stage section 100 , an image photographing section 200 , first and second illumination sections 300 and 400 , an image acquiring section 500 , a module control section 600 and a central control section 700 .
  • the measurement stage section 100 may include a stage 110 supporting a measurement target 10 and a stage transfer unit 120 transferring the stage 110 .
  • a measurement location may be changed in the measurement target 10 .
  • the image photographing section 200 is disposed over the stage 110 to receive light reflected by the measurement target 10 and measure an image of the measurement target 10 . That is, the image photographing section 200 receives the light that exits the first and second illumination sections 300 and 400 and is reflected by the measurement target 10 , and photographs a plan image of the measurement target 10 .
  • the image photographing section 200 may include a camera 210 , an imaging lens 220 , a filter 230 and a lamp 240 .
  • the camera 210 receives the light reflected by the measurement target 10 and photographs the plan image of the measurement target 10 .
  • the camera 210 may include, for example, one of a CCD camera and a CMOS camera.
  • the imaging lens 220 is disposed under the camera 210 to image the light reflected by the measurement target 10 on the camera 210 .
  • the filter 230 is disposed under the imaging lens 220 to filter the light reflected by the measurement target 10 and provide the filtered light to the imaging lens 220 .
  • the filter 230 may include, for example, one of a frequency filter, a color filter and a light intensity control filter.
  • the lamp 240 may be disposed under the filter 230 in a circular shape to provide the light to the measurement target 10 , so as to photograph a particular image such as a two-dimensional shape of the measurement target 10 .
  • the first illumination section 300 may be disposed, for example, at a right side of the image photographing section 200 to be inclined with respect to the stage 110 supporting the measurement target 10 .
  • the first illumination section 300 may include a first light source unit 310 , a first grating unit 320 , a first grating transfer unit 330 and a first condensing lens 340 .
  • the first light source unit 310 may include a light source and at least one lens to generate light, and the first grating unit 320 is disposed under the first light source unit 310 to change the light generated by the first light source unit 310 into a first grating pattern light having a grating pattern.
  • the first grating transfer unit 330 is connected to the first grating unit 320 to transfer the first grating unit 320 , and may include, for example, one of a piezoelectric transfer unit and a fine linear transfer unit.
  • the first condensing lens 340 is disposed under the first grating unit 320 to condense the first grating pattern light exiting the first grating unit 320 on the measurement target 10 .
  • the second illumination section 400 may be disposed at a left side of the image photographing section 200 to be inclined with respect to the stage 110 supporting the measurement target 10 .
  • the second illumination section 400 may include a second light source unit 410 , a second grating unit 420 , a second grating transfer unit 430 and a second condensing lens 440 .
  • the second illumination section 400 is substantially the same as the first illumination section 300 described above, and thus any further description will be omitted.
  • the image photographing section 200 may sequentially receive the N first grating pattern lights reflected by the measurement target 10 and photograph N first pattern images.
  • the image photographing section 200 may sequentially receive the N second grating pattern lights reflected by the measurement target 10 and photograph N second pattern images.
  • the ‘N’ is a natural number, and for example may be four.
  • the first and second illumination sections 300 and 400 are described as an illumination apparatus generating the first and second grating pattern lights.
  • the number of illumination sections may be three or more.
  • the grating pattern light may be illuminated onto the measurement target 10 in various directions, and various pattern images may be photographed.
  • when three illumination sections are disposed in an equilateral triangle form with the image photographing section 200 at the center, three grating pattern lights may be illuminated onto the measurement target 10 in different directions.
  • when four illumination sections are disposed in a square form with the image photographing section 200 at the center, four grating pattern lights may be illuminated onto the measurement target 10 in different directions.
  • the image acquiring section 500 is electrically connected to the camera 210 of the image photographing section 200 to acquire the pattern images from the camera 210 and store the acquired pattern images.
  • the image acquiring section 500 may include an image system that receives the N first pattern images and the N second pattern images photographed in the camera 210 and stores the images.
  • the module control section 600 is electrically connected to the measurement stage section 100 , the image photographing section 200 , the first illumination section 300 and the second illumination section 400 , to control the measurement stage section 100 , the image photographing section 200 , the first illumination section 300 and the second illumination section 400 .
  • the module control section 600 may include, for example, an illumination controller, a grating controller and a stage controller.
  • the illumination controller controls the first and second light source units 310 and 410 to generate light
  • the grating controller controls the first and second grating transfer units 330 and 430 to move the first and second grating units 320 and 420 .
  • the stage controller controls the stage transfer unit 120 to move the stage 110 in an up-and-down motion and a left-and-right motion.
  • the central control section 700 is electrically connected to the image acquiring section 500 and the module control section 600 to control the image acquiring section 500 and the module control section 600 .
  • the central control section 700 receives the N first pattern images and the N second pattern images from the image system of the image acquiring section 500 to process the images, so that three dimensional shape of the measurement target may be measured.
  • the central control section 700 may control an illumination controller, a grating controller and a stage controller of the module control section 600 .
  • the central control section may include an image processing board, a control board and an interface board.
  • FIG. 7 is a plan view illustrating a grating pattern image by a grating pattern light illuminated onto a measurement target in FIG. 6 .
  • the grating pattern image includes a plurality of grating patterns, and in the present embodiment, an interval between the grating patterns, i.e., a grating pitch, is defined as a measurement scope λ.
  • the measurement scope λ may be the same irrespective of the sorts of the grating pattern lights, but alternatively, may be different from each other according to the sorts of the grating pattern lights.
  • the measurement scope λ may have at least two values according to the sorts of the grating pattern lights.
  • the grating pattern image by the first grating pattern light generated from the first illumination section 300 may have grating patterns of a first measurement scope
  • the grating pattern image by the second grating pattern light generated from the second illumination section 400 may have grating patterns of a second measurement scope different from the first measurement scope.
  • FIG. 8 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a right side.
  • FIG. 9 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a left side. In the images of FIGS. 8 and 9 , only relative brightness (luminance) is shown, and the grating pattern is omitted.
  • an image photographed in the camera 210 may include a shadow area that is relatively dark and a saturation area that is relatively bright.
  • in FIG. 8 , the saturation area is formed at a right portion of the measurement target 10 , and the shadow area is formed at a left portion of the measurement target 10 .
  • in FIG. 9 , the saturation area is formed at a left portion of the measurement target 10 , and the shadow area is formed at a right portion of the measurement target 10 .
  • the grating pattern lights generated in a plurality of directions are sequentially illuminated onto the measurement target 10 disposed on the stage 110 , and the grating pattern lights reflected by the measurement target 10 are sequentially detected in the camera 210 to acquire a plurality of pattern images.
  • each of the grating pattern lights is moved aside and illuminated onto the measurement target 10 by N times, for example, three times or four times, to acquire N pattern images of the measurement target 10 for each of the directions.
  • N first pattern images and N second pattern images may be acquired.
  • N brightness degrees {Ii1, Ii2, . . . , IiN} at each position {i(x,y)} in an X-Y coordinate system, and the measurement scope λ as shown in FIG. 7 , are extracted from the N pattern images with respect to each direction.
  • phase {Pi(x,y)}, brightness {Ai(x,y)} and visibility {Vi(x,y)} with respect to each direction are calculated from the N brightness degrees {Ii1, Ii2, . . . , IiN}.
  • the phase {Pi(x,y)}, the brightness {Ai(x,y)} and the visibility {Vi(x,y)} with respect to each direction may be calculated by using an N-bucket algorithm.
  • the brightness {Ai(x,y)} may be an average brightness that is obtained by averaging the detected grating pattern lights.
  • the brightness {Ai(x,y)} will be called “average brightness {Ai(x,y)}”.
  • phase {Pi(x,y)}, average brightness {Ai(x,y)} and visibility {Vi(x,y)} may be calculated as shown in the following equations through a three-bucket algorithm.
  • Bi(x,y) indicates an amplitude of an image signal (brightness signal) in the three pattern images with respect to each direction.
  • Ii1 corresponds to “a+b cos(φ)”
  • Ii2 corresponds to “a+b cos(φ+2π/3)”
  • Ii3 corresponds to “a+b cos(φ+4π/3)”.
  • phase {Pi(x,y)}, average brightness {Ai(x,y)} and visibility {Vi(x,y)} may be calculated as shown in the following equations through a four-bucket algorithm.
  • Bi(x,y) indicates an amplitude of an image signal (brightness signal) in the four pattern images with respect to each direction.
  • Ii1 corresponds to “a+b cos(φ)”
  • Ii2 corresponds to “a+b cos(φ+π/2)”
  • Ii3 corresponds to “a+b cos(φ+π)”
  • Ii4 corresponds to “a+b cos(φ+3π/2)”.
  • a signal-to-noise ratio (SNR) may be calculated and used in place of the visibility {Vi(x,y)} or together with the visibility {Vi(x,y)}.
  • the SNR indicates a ratio of an image signal S to a noise signal N (S/N) in the N pattern images with respect to each direction.
  • height {Hi(x,y)} with respect to each direction is calculated from the phase {Pi(x,y)} with respect to each direction by the following equation.
  • ki(x,y) is a phase-to-height conversion scale that indicates a conversion ratio between a phase and a height.
  • Hi(x,y) = ki(x,y)·Pi(x,y)
  • Height weight {Wi(x,y)} with respect to each direction is calculated by using at least one of the average brightness {Ai(x,y)}, the visibility {Vi(x,y)} and the measurement scope λ.
  • Wi(x,y) = f(Ai, Vi, λ)
  • the height {Hi(x,y)} with respect to each direction is multiplied by the height weight {Wi(x,y)} with respect to each direction to calculate weight height {Wi(x,y)·Hi(x,y)} with respect to each direction.
  • the weight heights in the total directions are summed and divided by the sum of the height weights {ΣWi(x,y)} to calculate height {ΣWi(x,y)·Hi(x,y)/ΣWi(x,y)} at each position.
  • the three dimensional shape of the measurement target 10 may be accurately measured by combining the heights according to positions calculated as described above.
  • FIG. 10 is a graph showing a relation between average brightness and weight of the pattern images measured in the camera.
  • as the average brightness {Ai(x,y)} increases or decreases from a predetermined value, the weight function {f(Ai,Vi,λ)} may act on the height weight {Wi(x,y)} to decrease it.
  • when the average brightness {Ai(x,y)} has the predetermined value, the height weight {Wi(x,y)} has relatively the greatest value, and as the average brightness {Ai(x,y)} becomes distant from the predetermined value, the height weight {Wi(x,y)} may decrease.
  • the predetermined value may be set when determining a three dimensional measurement condition by using a specimen, or may be arbitrarily set by a user. However, the predetermined value may preferably be an average value, i.e., a mid value of the average brightness {Ai(x,y)}.
  • FIG. 11 is a graph showing a relation between visibility or SNR and weight of the pattern images measured in the camera.
  • as the visibility {Vi(x,y)} or the SNR increases, the weight function {f(Ai,Vi,λ)} may act on the height weight to increase it.
  • as the visibility or the SNR increases, the height weight {Wi(x,y)} may also slowly increase.
  • FIG. 12 is a graph showing a relation between measurement scope and weight of the pattern images measured in the camera.
  • as the measurement scope λ increases, the weight function {f(Ai,Vi,λ)} may act on the height weight {Wi(x,y)} to decrease it.
  • as the measurement scope λ increases, the height weight {Wi(x,y)} may slowly decrease.
  • the N pattern images with respect to each direction are divided into a shadow area, a saturation area and a non-saturation area, and a different height weight {Wi(x,y)} may be given according to each area.
  • in the shadow area, the average brightness {Ai(x,y)} is below a minimum brightness A1, and the visibility {Vi(x,y)} or the SNR is below a minimum reference value Vmin.
  • in the saturation area, the average brightness {Ai(x,y)} is more than a maximum brightness A2, and the visibility or the SNR is below the minimum reference value Vmin.
  • the non-saturation area corresponds to a remaining area except the shadow area and the saturation area.
  • in the shadow area and the saturation area, the weight function {f(Ai,Vi,λ)} is regarded as ‘0’ to obtain the height weight {Wi(x,y)}.
  • that is, the height weight {Wi(x,y)} in these areas is determined as ‘0’.
  • in the non-saturation area, the weight function {f(Ai,Vi,λ)} may decrease the height weight {Wi(x,y)} when the average brightness {Ai(x,y)} increases or decreases from the mid value, increase the height weight {Wi(x,y)} when the visibility {Vi(x,y)} or the SNR increases, and decrease the height weight {Wi(x,y)} when the measurement scope λ increases.
  • alternatively, the weight function {f(Ai,Vi,λ)} may be regarded as the same for every direction to obtain the height weight {Wi(x,y)}.
  • for example, all of the first, second, third and fourth height weights W1, W2, W3 and W4 may be determined as ‘¼’.
  • the average brightness {Ai(x,y)}, the visibility {Vi(x,y)} or SNR, and the measurement scope λ are extracted from the N pattern images photographed in each direction, and the height weight {Wi(x,y)} is determined according to the extraction result, thereby accurately measuring the height at each position of the measurement target 10 in all the areas.
  • the N pattern images with respect to each direction are divided into the shadow area, the saturation area and the non-saturation area, and a different height weight {Wi(x,y)} is given according to each area, to thereby prevent a reduction of reliability for the height in the shadow area and the saturation area.
  • the height weight {Wi(x,y)} is given as a relatively low value, for example, ‘0’, in the shadow area and the saturation area
  • the height weight {Wi(x,y)} is given as a relatively great value in the non-saturation area, thereby compensating for the adverse effect incurred from the shadow area and the saturation area to more accurately measure a three dimensional shape of the measurement target.
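Expressions 2 through 17 are referenced above but not reproduced in this text. Assuming the standard phase-measuring-profilometry model they imply, I = a + b·cos(φ + δ) with φ = 2πh/λ, the 3-bucket and 4-bucket recoveries of phase, height, average brightness and visibility can be sketched as follows. This is an illustrative reconstruction, not the patent's literal expressions, and all function names are hypothetical:

```python
import numpy as np

# Assumed model: I_k = a + b*cos(phi + delta_k), with phi = 2*pi*h/lambda.

def phase_3bucket(I1, I2, I3):
    # Shifts delta = 0, 2*pi/3, 4*pi/3 (Expressions 3-5):
    # sqrt(3)*(I3-I2) = 3*b*sin(phi), 2*I1-I2-I3 = 3*b*cos(phi)
    return np.arctan2(np.sqrt(3.0) * (I3 - I2), 2.0 * I1 - I2 - I3)

def phase_4bucket(I1, I2, I3, I4):
    # Shifts delta = 0, pi/2, pi, 3*pi/2 (Expressions 8-11):
    # I4-I2 = 2*b*sin(phi), I1-I3 = 2*b*cos(phi)
    return np.arctan2(I4 - I2, I1 - I3)

def height_from_phase(phi, lam):
    # From "2*pi*h/lambda corresponds to a phase phi": h = lam*phi/(2*pi)
    return lam * phi / (2.0 * np.pi)

def average_and_visibility_4bucket(I1, I2, I3, I4):
    # Averaging cancels the cosine term (Expression 14), leaving the
    # two-dimensional image a; visibility is gamma = b/a (Expression 17 analogue).
    a = (I1 + I2 + I3 + I4) / 4.0
    b = 0.5 * np.sqrt((I4 - I2) ** 2 + (I1 - I3) ** 2)
    return a, b / a
```

The functions work elementwise, so the arguments may be scalars or whole camera images as NumPy arrays.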

Abstract

A shape measurement apparatus includes a work stage supporting a target substrate, a pattern-projecting section including a light source, a grating part partially transmitting and blocking light generated by the light source to generate a grating image and a projecting lens part forming the grating image on a measurement target of the target substrate, an image-capturing section capturing the grating image reflected by the measurement target of the target substrate, and a control section controlling the work stage, the pattern-projecting section and the image-capturing section, calculating a reliability index of the grating image and phases of the grating image corresponding to the measurement target, and inspecting the measurement target by using the reliability index and the phases. Thus, the accuracy of measurement may be enhanced.

Description

CROSS REFERENCE TO RELATED APPLICATION
This application is a continuation of U.S. patent application Ser. No. 12/784,707, filed on May 21, 2010 (currently pending), the disclosure of which is herein incorporated by reference in its entirety. The U.S. patent application Ser. No. 12/784,707 claims priority to and the benefit of Korean Patent Application Nos. 10-2009-0044423 filed on May 21, 2009, and 10-2010-0007025 filed on Jan. 26, 2010, the entire contents of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
Field of the Invention
Exemplary embodiments of the present invention relate to a shape measurement apparatus and a shape measurement method. More particularly, exemplary embodiments of the present invention relate to a shape measurement apparatus and a shape measurement method capable of enhancing accuracy of measurement.
Discussion of the Background
Electronic devices have been developed to have relatively lighter weight and smaller size. Therefore, the possibility of defects in these electronic devices increases, and apparatuses for inspecting such defects are under development and improvement.
Recently, techniques for inspecting a three-dimensional shape have become important in various technical fields. Previously, a coordinate measuring machine (CMM), which detects a three-dimensional shape by a contact method, was used. However, non-contact methods for inspecting a three-dimensional shape by using optical theories have been under development.
Meadows and Takasaki developed the shadow Moire method, a representative non-contact method for inspecting a three-dimensional shape, in 1970. However, the shadow Moire method has a problem in that the grating for measurement should be larger than the measurement target in size. In order to solve this problem, Yoshino developed the projection Moire technique. Additionally, Kujawinska applied the phase-shifting method, which is used for the analysis of optical coherence, to the Moire technique for inspecting a three-dimensional shape, so that the measurement resolution was enhanced and the limitation of the Moire pattern was removed.
These techniques for measuring a three-dimensional shape may be used for the inspection of printed circuit boards, and attempts to enhance their accuracy continue.
SUMMARY OF THE INVENTION
Exemplary embodiments of the present invention provide a shape measurement apparatus capable of measuring a two-dimensional shape together with a three-dimensional shape, and enhancing accuracy of measurement.
Exemplary embodiments of the present invention also provide a shape measurement method capable of measuring a two-dimensional shape together with a three-dimensional shape, and enhancing accuracy of measurement.
Exemplary embodiments of the present invention also provide a method of measuring a three dimensional shape capable of accurately measuring a three dimensional shape of a measurement target in total areas.
Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
An exemplary embodiment of the present invention discloses a shape measurement apparatus. The shape measurement apparatus includes a work stage supporting a target substrate, a pattern-projecting section including a light source, a grating part transmitting and blocking light generated by the light source to generate a grating image and a projecting lens part making the grating image on a measurement target of the target substrate, an image-capturing section capturing the grating image reflected by the measurement target of the target substrate, and a control section controlling the work stage, the pattern-projecting section and the image-capturing section, calculating a reliability index of the grating image and phases of the grating image, which is corresponding to the measurement target, and inspecting the measurement target by using the reliability index and the phases.
The shape measurement apparatus may inspect a surface of a pad through the reliability index when the pad is the measurement target. The pad may be one to be electrically connected to an external device. The reliability index may be at least one of an intensity, a visibility and a signal-to-noise ratio. The control section may determine that the pad is bad when the reliability index is out of a setup value. The shape measurement apparatus may further include a subsidiary light source for inspecting the measurement target of the target substrate. The control section may determine that the pad is bad when light generated by the subsidiary light source is reflected by the pad and captured by the image-capturing section to form a two-dimensional image, and the pad is determined as bad in the two-dimensional image, even though the reliability index shows that the pad is good.
Another exemplary embodiment of the present invention discloses a shape measurement method. The shape measurement method includes acquiring a grating image reflected by a measurement target while shifting the grating image a specific number of times, acquiring a reliability index including at least one of an intensity, a visibility and a signal-to-noise ratio of the grating image by using the grating image, and, in case that a pad to be electrically connected to an external device is the measurement target, determining that the pad is good when the reliability index is within a setup value and bad when the reliability index is out of the setup value.
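As a sketch only, the good/bad decision in this embodiment reduces to threshold checks on the reliability indices. The thresholds below stand in for the "setup value" and are illustrative placeholders, not values from the disclosure:

```python
def inspect_pad(intensity, visibility, snr,
                intensity_range=(30.0, 220.0), min_visibility=0.1, min_snr=3.0):
    # A pad is "good" only when every reliability index lies within its
    # setup value; otherwise it is reported "bad".
    ok = (intensity_range[0] <= intensity <= intensity_range[1]
          and visibility >= min_visibility
          and snr >= min_snr)
    return "good" if ok else "bad"
```

In practice the setup values would be calibrated per board and per pad type; the subsidiary-light two-dimensional check described above would then override a "good" verdict where warranted.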
Still another exemplary embodiment of the present invention discloses a method of measuring a three dimensional shape. The method includes illuminating grating pattern lights in a plurality of directions onto a measurement target while changing each of the grating pattern lights by N times and detecting the grating pattern lights reflected by the measurement target, to acquire N pattern images of the measurement target with respect to each direction, extracting a phase {Pi(x,y)} and a brightness {Ai(x,y)} with respect to each direction corresponding to each position {i(x,y)} in an X-Y coordinate system from the pattern images, extracting a height weight {Wi(x,y)} with respect to each direction by using a weight function employing the brightness as a parameter, and calculating a weight height {Wi(x,y)·Hi(x,y)} with respect to each direction by using a height based on the phase with respect to each direction and the height weight, and summing weight heights, to produce a height {ΣWi(x,y)·Hi(x,y)/ΣWi(x,y)} at each position.
The brightness may correspond to an average brightness that is obtained by averaging the detected grating pattern lights.
The weight function may further employ at least one of a visibility and an SNR (signal-to-noise ratio) with respect to each direction extracted from the pattern images with respect to each direction as parameters.
The weight function may further employ a measurement scope (λ) corresponding to a grating pitch of each grating pattern light extracted from the pattern images with respect to each direction as a parameter. The measurement scope may have at least two values according to the grating pattern lights.
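The extraction step described above, recovering phase {Pi(x,y)}, brightness {Ai(x,y)} and visibility {Vi(x,y)} from the N pattern images of one direction, can be sketched for equally spaced phase shifts 2πk/N. This is a generic N-bucket form; the function name and array layout are assumptions:

```python
import numpy as np

def extract_phase_brightness(images):
    # images: array of shape (N, rows, cols) -- the N pattern images of one
    # direction, taken at equally spaced phase shifts 2*pi*k/N, with N >= 3.
    I = np.asarray(images, dtype=float)
    N = I.shape[0]
    k = np.arange(N).reshape(-1, 1, 1)
    C = (I * np.cos(2.0 * np.pi * k / N)).sum(axis=0)   # (N/2)*b*cos(P)
    S = (I * np.sin(2.0 * np.pi * k / N)).sum(axis=0)   # -(N/2)*b*sin(P)
    P = np.arctan2(-S, C)          # phase P_i(x,y)
    A = I.mean(axis=0)             # average brightness A_i(x,y)
    B = 2.0 * np.hypot(C, S) / N   # modulation amplitude B_i(x,y)
    return P, A, B / A             # visibility V_i(x,y) = B/A
```

For N = 3 and N = 4 this reduces to the three-bucket and four-bucket forms given earlier in the description.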
The weight function may decrease the height weight, as the average brightness increases or decreases from a predetermined value. The predetermined value may be a mid value of the average brightness.
The weight function may increase the height weight, as the visibility or the SNR increases.
The weight function may decrease the height weight, as the measurement scope increases.
Extracting the height weight with respect to each direction may include dividing the pattern images into a shadow area, a saturation area and a non-saturation area. The shadow area corresponds to an area, in which the average brightness is below a minimum brightness and the visibility or the SNR is below a minimum reference value, the saturation area corresponds to an area, in which the average brightness is more than a maximum brightness and the visibility or the SNR is below the minimum reference value, and the non-saturation area corresponds to a remaining area except the shadow area and the saturation area. The weight function may be regarded as ‘0’ to obtain the height weight in the shadow area and the saturation area. The weight function corresponding to the non-saturation area may decrease the height weight, as the average brightness increases or decreases from a mid value of the average brightness, may increase the height weight as the visibility or the SNR increases, and may decrease the height weight as the measurement scope increases.
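An illustrative realization of this area-dependent weight function is sketched below. The disclosure fixes only the qualitative behavior (zero weight in the shadow and saturation areas, weight decreasing with distance from the mid brightness, increasing with visibility, decreasing with measurement scope), so the thresholds and the particular monotonic form here are assumptions:

```python
import numpy as np

def height_weight(A, V, lam, A1=20.0, A2=235.0, Vmin=0.05, Amid=127.5):
    # A: average brightness A_i(x,y); V: visibility (or SNR) V_i(x,y);
    # lam: measurement scope. Threshold values are illustrative placeholders.
    A = np.asarray(A, dtype=float)
    V = np.asarray(V, dtype=float)
    shadow = (A < A1) & (V < Vmin)        # too dark and unreliable
    saturated = (A > A2) & (V < Vmin)     # too bright and unreliable
    # Non-saturation area: weight falls as A moves away from the mid value,
    # rises with V, and falls as the measurement scope lam grows.
    w = np.clip(V * (1.0 - np.abs(A - Amid) / Amid) / lam, 0.0, None)
    return np.where(shadow | saturated, 0.0, w)   # weight '0' in bad areas
```

Any function with the same monotonic behavior would satisfy the description; this one was chosen only for brevity.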
Sum of the height weights may be equal to 1 {ΣWi(x,y)=1}.
Still another exemplary embodiment of the present invention discloses a method of measuring a three dimensional shape. The method includes illuminating grating pattern lights in a plurality of directions onto a measurement target while changing each of the grating pattern lights by N times and detecting the grating pattern lights reflected by the measurement target, to acquire N pattern images of the measurement target with respect to each direction, extracting a phase {Pi(x,y)} and a visibility {Vi(x,y)} with respect to each direction corresponding to each position {i(x,y)} in an X-Y coordinate system from the pattern images, extracting a height weight {Wi(x,y)} with respect to each direction by using a weight function employing the visibility as a parameter, and calculating a weight height {Wi(x,y)·Hi(x,y)} with respect to each direction by multiplying a height based on the phase by the height weight, and summing weight heights, to produce a height {ΣWi(x,y)·Hi(x,y)/ΣWi(x,y)} at each position.
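Both method embodiments end with the same combination step, a weighted average of the per-direction heights. A minimal sketch, in which the array shapes and the small epsilon guard against an all-zero weight sum are assumptions:

```python
import numpy as np

def combine_heights(H, W, eps=1e-12):
    # H, W: arrays of shape (M, rows, cols) holding per-direction heights
    # H_i(x,y) = k_i(x,y) * P_i(x,y) and height weights W_i(x,y).
    H = np.asarray(H, dtype=float)
    W = np.asarray(W, dtype=float)
    # Height at each position: sum_i W_i * H_i / sum_i W_i
    return (W * H).sum(axis=0) / (W.sum(axis=0) + eps)
```

If the weights are pre-normalized so that ΣWi(x,y) = 1, the division is a no-op and the result is simply ΣWi·Hi.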
According to the present invention, a two-dimensional shape image may be obtained by using the measured three-dimensional data, so that additional data for the two-dimensional shape image may not be required.
Furthermore, when the two-dimensional shape image and the three-dimensional shape image, both of which are measured, are used together, defects of a PCB may be effectively inspected.
Furthermore, when the luminance of additional two-dimensional images is used, the accuracy of inspection may be enhanced.
In addition, the average brightness, the visibility or SNR, and the measurement scope are extracted from the pattern images photographed in each direction, and the height weight is determined according to the extracted result, to thereby more accurately measure a height at each position of the measurement target in total areas including a shadow area and a saturation area.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention, and together with the description serve to explain the principles of the invention.
FIG. 1 is a schematic side view illustrating a shape measurement apparatus according to an exemplary embodiment of the present invention.
FIG. 2 is a schematic top view illustrating a shape measurement apparatus according to another exemplary embodiment of the present invention.
FIG. 3 is a top view illustrating a target substrate in FIG. 1.
FIG. 4 is a diagram showing the shape measurement apparatus measuring a three-dimensional image.
FIG. 5 is a set of graphs showing a principle for measuring a two-dimensional image.
FIG. 6 is a schematic view illustrating a three dimensional shape measurement apparatus used in a method of measuring a three dimensional shape according to an exemplary embodiment of the present invention.
FIG. 7 is a plan view illustrating a grating pattern image by a grating pattern light illuminated onto a measurement target in FIG. 6.
FIG. 8 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a right side.
FIG. 9 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a left side.
FIG. 10 is a graph showing a relation between average brightness and weight of the pattern images measured in the camera.
FIG. 11 is a graph showing a relation between visibility or SNR and weight of the pattern images measured in the camera.
FIG. 12 is a graph showing a relation between measurement scope and weight of the pattern images measured in the camera.
DETAILED DESCRIPTION OF THE ILLUSTRATED EMBODIMENTS
The present invention is described more fully hereinafter with reference to the accompanying drawings, in which example embodiments of the present invention are shown. The present invention may, however, be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein. Rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present invention to those skilled in the art. In the drawings, the sizes and relative sizes of layers and regions may be exaggerated for clarity.
It will be understood that when an element or layer is referred to as being “on,” “connected to” or “coupled to” another element or layer, it can be directly on, connected or coupled to the other element or layer or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly connected to” or “directly coupled to” another element or layer, there are no intervening elements or layers present. Like numerals refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present invention.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Example embodiments of the invention are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized example embodiments (and intermediate structures) of the present invention. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments of the present invention should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle will, typically, have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region. Likewise, a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of the present invention.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a schematic side view illustrating a shape measurement apparatus according to an exemplary embodiment of the present invention.
Referring to FIG. 1, a shape measurement apparatus 1100 according to an exemplary embodiment of the present invention includes a work stage 1130, a pattern-projecting section 1110, an image-capturing section 1150 and a control section 1140. Additionally, the shape measurement apparatus 1100 may further include a first subsidiary light source 1160 and a second subsidiary light source 1170.
The work stage 1130 supports a target substrate 1120 on which a measurement target A is disposed. Furthermore, the work stage 1130 transports the measurement target A along at least one of an x-axis direction and a y-axis direction. When the work stage 1130 is controlled by the control section 1140 to transport the target substrate 1120 to a proper position, the first subsidiary light source 1160 and the second subsidiary light source 1170 may radiate light toward the measurement target A of the target substrate 1120 to set up the total measuring region of the target substrate 1120 by using, for example, an identification mark of the target substrate 1120.
The pattern-projecting section 1110 projects a grating image toward the measurement target A. The shape measurement apparatus 1100 may include a plurality of the pattern-projecting sections 1110 disposed such that the plurality of pattern-projecting sections 1110 project grating images toward the target substrate 1120 at a specific angle with respect to a normal line of the target substrate 1120. Furthermore, the plurality of pattern-projecting sections 1110 may be disposed symmetrically with respect to the normal line. Each of the pattern-projecting sections 1110 includes a light source 1111, a grating part 1112 and a projecting lens part 1113. For example, two pattern-projecting sections 1110 may be disposed symmetrically with respect to the measurement target A.
The light source 1111 radiates light toward the measurement target A.
The grating part 1112 makes a grating image by using the light generated by the light source 1111. The grating part 1112 includes a light-blocking region (not shown) and a light-transmitting region (not shown). The light-blocking region blocks a portion of the light generated by the light source 1111, and the light-transmitting region transmits the other portion of the light. The grating part 1112 may be formed in various types. For example, the grating part 1112 may be formed by a glass plate on which a grating with a light-blocking region and a light-transmitting region is patterned. Alternatively, a liquid crystal display panel may be used as the grating part 1112.
When the glass plate on which the grating with the light-blocking region and the light-transmitting region is patterned is employed as the grating part 1112, the shape measurement apparatus 1100 further includes an actuator (not shown) for minutely transporting the grating part 1112. When a liquid crystal display panel is employed as the grating part 1112, the grating pattern may be displayed and shifted by the liquid crystal display panel itself, so that the shape measurement apparatus 1100 does not need the actuator.
The projecting lens part 1113 forms the grating image of the grating part 1112 on the measurement target A of the target substrate 1120. The projecting lens part 1113 may include, for example, a plurality of lenses to focus the grating image of the grating part 1112 so that the grating image is displayed on the measurement target A on the target substrate 1120.
The image-capturing section 1150 receives the grating image reflected by the measurement target A of the target substrate 1120. The image-capturing section 1150 includes, for example, a camera 1151 and a capturing lens part 1152. The grating image reflected by the measurement target A passes through the capturing lens part 1152 to be captured by the camera 1151.
The control section 1140 controls the work stage 1130, the pattern-projecting section 1110 and the image-capturing section 1150, calculates a reliability index and phases of the measurement target A from the grating image captured by the image-capturing section 1150, and processes the captured grating image to measure a two-dimensional shape and a three-dimensional shape. The process for measuring the two-dimensional shape and the three-dimensional shape, which is performed by the control section 1140, will be explained later in detail.
The control section 1140 inspects the measurement target by using the phase and the reliability index. In detail, the phase may be used for measuring the three-dimensional shape of the measurement target A, and the reliability index may be used for determining whether the measurement target is good or bad. For example, at least one of a signal intensity, a visibility and an SNR (signal-to-noise ratio) may be used as the reliability index. The signal intensity may be explained referring to Expression 14 and Expression 15, the visibility may be explained referring to Expression 16 or Expression 17, and the SNR means a ratio of, or a difference between, the periodic function generated during the N-bucket algorithm process of filtering the images captured by the image-capturing section 1150 and the real signal. In more detail, the SNR is (visibility × D in Expression 1)/(temporal noise of D).
When the reliability index is out of a setup value, the control section 1140 determines the measurement target A as a bad one.
For example, the control section 1140 determines that the measurement target is bad when the difference between the visibility γ of a specific region of the shape image obtained through Expression 16 or Expression 17 and the visibility γ of a peripheral region is out of the range of the setup value.
Furthermore, one of the first subsidiary light source 1160 and the second subsidiary light source 1170 may be used for measuring a two-dimensional shape. In more detail, one of the first subsidiary light source 1160 and the second subsidiary light source 1170 radiates light toward the measurement target A of the target substrate 1120, and the reflected light is captured by the camera 1151 of the image-capturing section 1150 to generate a two-dimensional shape image.
Even when the difference of the reliability index is within the setup value, the control section 1140 may determine that the measurement target A is bad when the luminance difference between a specific region of the two-dimensional shape image and a peripheral region of the two-dimensional shape image is out of another setup value. Furthermore, the control section 1140 may determine that the measurement target A is bad when the luminance of a specific region of the measurement target A is out of another setup value.
For example, even when the difference between the visibility γ of a specific region obtained through Expression 16 or Expression 17 and the visibility γ of a peripheral region is within the setup value, the control section 1140 determines that the measurement target A is bad when a luminance difference or an intensity difference between the specific region and the peripheral region of the two-dimensional image obtained through the first subsidiary light source 1160 or the second subsidiary light source 1170 is out of another setup value.
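The two-stage decision described above (a visibility-difference check followed by a luminance-difference check) can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name and the default threshold ("setup") values are hypothetical.

```python
def inspect_region(vis_region, vis_peripheral, lum_region, lum_peripheral,
                   vis_setup=0.2, lum_setup=30.0):
    """Return True if the inspected region is judged bad.

    vis_region, vis_peripheral : visibility (reliability index) of the
                                 specific region and its peripheral region
    lum_region, lum_peripheral : luminance of the same regions in the
                                 two-dimensional image
    vis_setup, lum_setup       : hypothetical "setup values" (thresholds)
    """
    # Reliability-index check: visibility difference out of the setup value.
    if abs(vis_region - vis_peripheral) > vis_setup:
        return True
    # Even when the visibility difference is within the setup value, a
    # luminance difference out of another setup value marks the target bad.
    if abs(lum_region - lum_peripheral) > lum_setup:
        return True
    return False
```

In practice both checks would run per pixel or per region of interest; the thresholds would be tuned per board type.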
The control section 1140 inspects the two-dimensional shape and the three-dimensional shape of a region of interest (ROI) in fields of view (FOV) in sequence.
FIG. 2 is a schematic top view illustrating a shape measurement apparatus according to another exemplary embodiment of the present invention. The shape measurement apparatus according to the present embodiment is substantially the same as the shape measurement apparatus 1100 in FIG. 1 except for the pattern-projecting sections. Therefore, the same reference numerals will be used for the same elements and any further explanation will be omitted.
Referring to FIG. 2, the shape measurement apparatus according to the present embodiment includes a plurality of pattern-projecting sections 1110, each of which has a grating part 1112. The plurality of pattern-projecting sections 1110 is arranged at apexes of a polygon. In FIG. 2, four pattern-projecting sections 1110 are arranged at apexes of a square. However, the plurality of pattern-projecting sections 1110 may be arranged at apexes of a hexagon, an octagon, etc.
When the grating image is projected from only one side, an exact three-dimensional shape may not be obtained, since the measurement target A is a protrusion and the grating image may not arrive at the other side. Therefore, the grating images may be projected from both sides opposite to each other in order to obtain the exact three-dimensional shape.
For example, when the measurement target A has a rectangular shape, the control section 1140 may turn on two pattern-projecting sections 1110 disposed opposite to each other. When the shape of the measurement target A, as grasped by the control section 1140, is complex, the control section 1140 may turn on more than two pattern-projecting sections 1110.
FIG. 3 is a top view illustrating a target substrate in FIG. 1.
Referring to FIG. 3, the target substrate 1120 such as a printed circuit board (PCB) includes, for example, a pad region 1121 (or fan out region) and a device-mounting region 1122.
The pad region 1121 is a region in which a pad for electrical connecting is formed, and the device-mounting region 1122 is a region on which a device is mounted.
A device is mounted on the device-mounting region 1122 through solder paste. When the shape or the amount of the solder paste is not properly controlled, the device may be electrically connected with other devices to induce a malfunction. Therefore, in order to check that the shape or the amount of the solder paste is properly controlled, the shape and the height of the solder paste are measured to obtain the three-dimensional shape of the solder paste.
Furthermore, the pad region 1121 should be checked to prevent an electrical short with other pad regions. In this case, the two-dimensional shape obtained through Expression 14 or Expression 15 may be used for checking an electrical short between the pad regions.
Additionally, the pad region 1121 should have a flat surface. When the pad region 1121 is scratched, the pad region may induce a bad connection with a device. Therefore, the surface inspection of the pad region 1121 is very important.
For the surface inspection, the reliability index of the pad region 1121 is inspected. When the reliability index of a specific region is out of the setup value, the pad region 1121 is determined to be bad. Even when the reliability index of the specific region is within the setup value, when a luminance difference between the specific region and a peripheral region in a two-dimensional image obtained by using one of the first subsidiary light source 1160 and the second subsidiary light source 1170 in FIG. 1 is out of another setup value, the pad is determined to be bad since the pad has a scratch.
The pad region 1121 is a flat metal surface, so that the amount of light reflected by the pad region 1121 and captured by the camera 1151 of the image-capturing section 1150 in FIG. 1 may be saturated. Therefore, a shifted phase value may be measured. However, the reliability index may still be measured. Therefore, the pad region 1121 may be inspected by using the reliability index even when the amount of the light reflected by the pad region 1121 is saturated. Furthermore, the reliability index of each pattern-projecting section 1110 may be used as a weight value for the height measured by each pattern-projecting section 1110.
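Using the per-direction reliability index as a weight value, as just described, amounts to a weighted average of the heights measured from the individual pattern-projecting sections. The following is a minimal sketch under that assumption; the function name and the zero-weight error handling are illustrative, not from the patent.

```python
def combine_heights(heights, weights):
    """Combine per-projection heights using reliability indices as weights.

    heights : height values measured via each pattern-projecting section
    weights : reliability index (e.g. visibility) of each measurement
    """
    total = sum(weights)
    if total == 0.0:
        # No direction produced a reliable measurement; the caller must
        # decide how to handle this pixel (here we simply raise).
        raise ValueError("all weights are zero")
    return sum(h * w for h, w in zip(heights, weights)) / total
```

A saturated or shadowed direction then contributes little to the combined height, while a well-measured direction dominates.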
Hereinbefore, the shape measurement apparatuses according to the present embodiments are explained. The shape measurement method according to the present embodiment is substantially the same as the operation of the shape measurement apparatus. That is, according to the shape measurement method of the present invention, grating images reflected by a measurement target are obtained while shifting a grating several times. Then, the reliability index of the grating images is obtained. When the reliability index is within the setup value, the measurement target is determined to be good, and when the reliability index is out of the setup value, the measurement target is determined to be bad. Furthermore, a two-dimensional shape image of the measurement target may be obtained, and even when the reliability index of the pad is within the setup value, the pad may be determined to be bad when a luminance difference between a specific region and a peripheral region of the two-dimensional shape image is out of a specific value.
FIG. 4 is a diagram showing the shape measurement apparatus measuring a three-dimensional image.
The grating image is radiated onto the target substrate 1120 in FIG. 1. Then, the intensity I of the images reflected by the target substrate 1120 and captured by the image-capturing section 1150 is expressed as the following Expression 1, which corresponds to the Moiré equation.
I = D[1 + γcos(2πh/Λ)]   Expression 1
wherein I is the intensity captured by the image-capturing section 1150, D is the signal intensity (a function of the DC light intensity (or light source intensity) and the reflectivity), γ is the visibility (a function of the reflectivity and the period of the grating), and Λ is the Moiré equivalence period (a function of the magnification, the period of the grating and the radiation angle θ).
In Expression 1, the intensity I is a function of the height h, so that the height h may be obtained by using the intensity I.
When the phase of the grating is shifted and the reflected image is captured by the image-capturing section 1150 in FIG. 1, Expression 1 may be expressed as Expression 2.
Ik = D[1 + γcos(2πh/Λ + δk)]   Expression 2
where δk is phase shift, and 2πh/Λ corresponds to a phase Φ corresponding to the measurement target.
In order to obtain height h by using Expression 2, at least three phase shifts are required.
For example, when three phase shifts are applied (3-bucket algorithm), the height h may be obtained as follows. In Expression 2, zero radian is applied as δ1 to obtain I1, and then Expression 2 is expressed as following Expression 3.
I1 = D[1 + γcos(2πh/Λ)]   Expression 3
In Expression 2, 2π/3 radian is applied as δ2 to obtain I2, and then Expression 2 is expressed as following Expression 4.
I2 = D[1 + γcos(2πh/Λ + 2π/3)] = D[1 − γ((1/2)cos(2πh/Λ) + (√3/2)sin(2πh/Λ))]   Expression 4
In Expression 2, 4π/3 radian is applied as δ3 to obtain I3, and then Expression 2 is expressed as following Expression 5.
I3 = D[1 + γcos(2πh/Λ + 4π/3)] = D[1 − γ((1/2)cos(2πh/Λ) − (√3/2)sin(2πh/Λ))]   Expression 5
By using Expression 3, Expression 4 and Expression 5, following Expression 6 is obtained.
√3(I3 − I2)/(2I1 − I3 − I2) = tan(2πh/Λ)   Expression 6
By using Expression 6, the height h may be obtained as following Expression 7.
h = (Λ/2π)tan⁻¹[√3(I3 − I2)/(2I1 − I3 − I2)]   Expression 7
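The 3-bucket computation of Expressions 3 to 7 reduces to a few lines per pixel. The following is a minimal sketch (the function name is illustrative); a two-argument arctangent is used in place of the bare tan⁻¹ so that the quadrant of the phase is preserved.

```python
import math

def height_3bucket(I1, I2, I3, moire_period):
    """Height from three phase-shifted intensities (Expressions 6 and 7).

    I1, I2, I3   : intensities captured at phase shifts 0, 2π/3 and 4π/3
    moire_period : Moiré equivalence period Λ of Expression 1
    """
    # Expression 6: √3(I3 − I2)/(2I1 − I3 − I2) = tan(2πh/Λ)
    phase = math.atan2(math.sqrt(3.0) * (I3 - I2), 2.0 * I1 - I3 - I2)
    # Expression 7: h = (Λ/2π)·tan⁻¹[...]
    return moire_period * phase / (2.0 * math.pi)
```

Note that the recovered phase is only defined modulo 2π, so the height is unambiguous only within one Moiré equivalence period Λ.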
For example, when four phase shifts are applied (4-bucket algorithm), the height h may be obtained as follows. In Expression 2, zero radian is applied as δ1 to obtain I1, and then Expression 2 is expressed as following Expression 8.
I1 = D[1 + γcos(2πh/Λ)]   Expression 8
In Expression 2, π/2 radian is applied as δ2 to obtain I2, and then Expression 2 is expressed as following Expression 9.
I2 = D[1 + γcos(2πh/Λ + π/2)] = D[1 − γsin(2πh/Λ)]   Expression 9
In Expression 2, π radian is applied as δ3 to obtain I3, and then Expression 2 is expressed as following Expression 10.
I3 = D[1 + γcos(2πh/Λ + π)] = D[1 − γcos(2πh/Λ)]   Expression 10
In Expression 2, 3π/2 radian is applied as δ4 to obtain I4, and then Expression 2 is expressed as following Expression 11.
I4 = D[1 + γcos(2πh/Λ + 3π/2)] = D[1 + γsin(2πh/Λ)]   Expression 11
By using Expression 8, Expression 9, Expression 10 and Expression 11, following Expression 12 is obtained.
(I4 − I2)/(I1 − I3) = tan(2πh/Λ)   Expression 12
By using Expression 12, the height h may be obtained as following Expression 13.
h = (Λ/2π)tan⁻¹[(I4 − I2)/(I1 − I3)]   Expression 13
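The 4-bucket case of Expressions 8 to 13 can be sketched the same way; the function name is illustrative, and again a two-argument arctangent preserves the quadrant of the phase.

```python
import math

def height_4bucket(I1, I2, I3, I4, moire_period):
    """Height from four phase-shifted intensities (Expressions 12 and 13).

    I1, I2, I3, I4 : intensities captured at phase shifts 0, π/2, π and 3π/2
    moire_period   : Moiré equivalence period Λ of Expression 1
    """
    # Expression 12: (I4 − I2)/(I1 − I3) = tan(2πh/Λ)
    phase = math.atan2(I4 - I2, I1 - I3)
    # Expression 13: h = (Λ/2π)·tan⁻¹[(I4 − I2)/(I1 − I3)]
    return moire_period * phase / (2.0 * math.pi)
```

Compared with the 3-bucket form, the extra sample makes the numerator and denominator simple differences, which also makes the result less sensitive to a constant offset in D.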
When the grating image is radiated onto the measurement target and the reflected image is captured while shifting the grating, the three-dimensional shape of the measurement target may be obtained by using Expression 7 or Expression 13.
FIG. 5 is a set of graphs showing a principle for measuring a two-dimensional image.
The arithmetic mean value Iave of I1, I2, I3 and I4 may be obtained as the following Expression 14.
Iave = (I1 + I2 + I3 + I4)/4 = D   Expression 14
As shown in Expression 14, the effect of the grating is offset when the intensities are averaged, so that a two-dimensional shape image may be obtained.
In the case of the 3-bucket algorithm, the arithmetic mean value Iave of I1, I2 and I3 in Expressions 3, 4 and 5, respectively, may be expressed as the following Expression 15.
Iave = (I1 + I2 + I3)/3 = D   Expression 15
On the other hand, in the case of the 3-bucket algorithm, the visibility γ in Expression 2 may be expressed as the following Expression 16 by using Expressions 3, 4, 5 and 15.
γ = √[(2I1 − I2 − I3)² + 3(I2 − I3)²]/(I1 + I2 + I3)   Expression 16
In the case of the 4-bucket algorithm, the visibility γ in Expression 2 may be expressed as the following Expression 17 by using Expressions 8, 9, 10, 11 and 14.
γ = 2√[(I1 − I3)² + (I2 − I4)²]/(I1 + I2 + I3 + I4)   Expression 17
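Expressions 14 to 17 can be evaluated together per pixel: the average intensity cancels the grating and yields the two-dimensional image value D, while the visibility γ is recovered from the same N intensities. A minimal sketch (function names illustrative):

```python
import math

def average_and_visibility_3bucket(I1, I2, I3):
    """Average intensity and visibility, 3-bucket (Expressions 15 and 16)."""
    total = I1 + I2 + I3
    i_ave = total / 3.0                                             # Expression 15
    vis = math.sqrt((2.0 * I1 - I2 - I3) ** 2
                    + 3.0 * (I2 - I3) ** 2) / total                 # Expression 16
    return i_ave, vis

def average_and_visibility_4bucket(I1, I2, I3, I4):
    """Average intensity and visibility, 4-bucket (Expressions 14 and 17)."""
    total = I1 + I2 + I3 + I4
    i_ave = total / 4.0                                             # Expression 14
    vis = 2.0 * math.sqrt((I1 - I3) ** 2 + (I2 - I4) ** 2) / total  # Expression 17
    return i_ave, vis
```

Because both quantities come from the same phase-shifted captures used for the height, no additional image acquisition is needed for the two-dimensional inspection.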
According to the present invention, a two-dimensional shape image may be obtained by using the measured three-dimensional data, so that additional data for the two-dimensional shape image are not required.
Furthermore, when the measured two-dimensional shape image and three-dimensional shape image are used together, defects of a PCB may be effectively inspected.
FIG. 6 is a schematic view illustrating a three-dimensional shape measurement apparatus used in a method of measuring a three-dimensional shape according to an exemplary embodiment of the present invention.
Referring to FIG. 6, a three-dimensional shape measurement apparatus used in a method of measuring a three-dimensional shape according to an exemplary embodiment of the present invention may include a measurement stage section 100, an image photographing section 200, first and second illumination sections 300 and 400, an image acquiring section 500, a module control section 600 and a central control section 700.
The measurement stage section 100 may include a stage 110 supporting a measurement target 10 and a stage transfer unit 120 transferring the stage 110. In an exemplary embodiment, as the measurement target 10 moves with respect to the image photographing section 200 and the first and second illumination sections 300 and 400 by means of the stage 110, a measurement location in the measurement target 10 may be changed.
The image photographing section 200 is disposed over the stage 110 to receive light reflected by the measurement target 10 and measure an image of the measurement target 10. That is, the image photographing section 200 receives the light that exits the first and second illumination sections 300 and 400 and is reflected by the measurement target 10, and photographs a plan image of the measurement target 10.
The image photographing section 200 may include a camera 210, an imaging lens 220, a filter 230 and a lamp 240. The camera 210 receives the light reflected by the measurement target 10 and photographs the plan image of the measurement target 10. The camera 210 may include, for example, one of a CCD camera and a CMOS camera. The imaging lens 220 is disposed under the camera 210 to image the light reflected by the measurement target 10 on the camera 210. The filter 230 is disposed under the imaging lens 220 to filter the light reflected by the measurement target 10 and provide the filtered light to the imaging lens 220. The filter 230 may include, for example, one of a frequency filter, a color filter and a light intensity control filter. The lamp 240 may be disposed under the filter 230 in a circular shape to provide the light to the measurement target 10, so as to photograph a particular image such as a two-dimensional shape of the measurement target 10.
The first illumination section 300 may be disposed, for example, at a right side of the image photographing section 200 to be inclined with respect to the stage 110 supporting the measurement target 10. The first illumination section 300 may include a first light source unit 310, a first grating unit 320, a first grating transfer unit 330 and a first condensing lens 340. The first light source unit 310 may include a light source and at least one lens to generate light, and the first grating unit 320 is disposed under the first light source unit 310 to change the light generated by the first light source unit 310 into a first grating pattern light having a grating pattern. The first grating transfer unit 330 is connected to the first grating unit 320 to transfer the first grating unit 320, and may include, for example, one of a piezoelectric transfer unit and a fine linear transfer unit. The first condensing lens 340 is disposed under the first grating unit 320 to condense the first grating pattern light exiting the first grating unit 320 on the measurement target 10.
For example, the second illumination section 400 may be disposed at a left side of the image photographing section 200 to be inclined with respect to the stage 110 supporting the measurement target 10. The second illumination section 400 may include a second light source unit 410, a second grating unit 420, a second grating transfer unit 430 and a second condensing lens 440. The second illumination section 400 is substantially the same as the first illumination section 300 described above, and thus any further description will be omitted.
When the first grating transfer unit 330 sequentially moves the first grating unit 320 N times and N first grating pattern lights are illuminated onto the measurement target 10 from the first illumination section 300, the image photographing section 200 may sequentially receive the N first grating pattern lights reflected by the measurement target 10 and photograph N first pattern images. In addition, when the second grating transfer unit 430 sequentially moves the second grating unit 420 N times and N second grating pattern lights are illuminated onto the measurement target 10 from the second illumination section 400, the image photographing section 200 may sequentially receive the N second grating pattern lights reflected by the measurement target 10 and photograph N second pattern images. The 'N' is a natural number, and for example may be four.
In an exemplary embodiment, the first and second illumination sections 300 and 400 are described as an illumination apparatus generating the first and second grating pattern lights. Alternatively, the number of the illumination sections may be three or more. In other words, the grating pattern light may be illuminated onto the measurement target 10 in various directions, and various pattern images may be photographed. For example, when three illumination sections are disposed in an equilateral triangle form with the image photographing section 200 being the center of the equilateral triangle form, three grating pattern lights may be illuminated onto the measurement target 10 in different directions. For example, when four illumination sections are disposed in a square form with the image photographing section 200 being the center of the square form, four grating pattern lights may be illuminated onto the measurement target 10 in different directions.
The image acquiring section 500 is electrically connected to the camera 210 of the image photographing section 200 to acquire the pattern images from the camera 210 and store the acquired pattern images. For example, the image acquiring section 500 may include an image system that receives the N first pattern images and the N second pattern images photographed in the camera 210 and stores the images.
The module control section 600 is electrically connected to the measurement stage section 100, the image photographing section 200, the first illumination section 300 and the second illumination section 400, to control the measurement stage section 100, the image photographing section 200, the first illumination section 300 and the second illumination section 400. The module control section 600 may include, for example, an illumination controller, a grating controller and a stage controller. The illumination controller controls the first and second light source units 310 and 410 to generate light, and the grating controller controls the first and second grating transfer units 330 and 430 to move the first and second grating units 320 and 420. The stage controller controls the stage transfer unit 120 to move the stage 110 in an up-and-down motion and a left-and-right motion.
The central control section 700 is electrically connected to the image acquiring section 500 and the module control section 600 to control the image acquiring section 500 and the module control section 600. Particularly, the central control section 700 receives the N first pattern images and the N second pattern images from the image system of the image acquiring section 500 to process the images, so that three dimensional shape of the measurement target may be measured. In addition, the central control section 700 may control an illumination controller, a grating controller and a stage controller of the module control section 600. Thus, the central control section may include an image processing board, a control board and an interface board.
Hereinafter, a method of measuring the measurement target 10 formed on a printed circuit board by using the above-described three-dimensional shape measurement apparatus will be described in detail. Solder will be employed as an example of the measurement target 10.
FIG. 7 is a plan view illustrating a grating pattern image by a grating pattern light illuminated onto a measurement target in FIG. 6.
Referring to FIGS. 6 and 7, when the grating pattern light from one of the plurality of the illumination sections is illuminated onto the measurement target 10, a grating pattern image is formed on the measurement target 10. The grating pattern image includes a plurality of grating patterns, and in the present embodiment, an interval between the grating patterns, i.e., a grating pitch is defined as a measurement scope λ.
The measurement scope λ may be the same irrespective of sorts of the grating pattern lights, but alternatively, may be different from each other according to sorts of the grating pattern lights. The measurement scope λ may have at least two values according to sorts of the grating pattern lights. For example, the grating pattern image by the first grating pattern light generated from the first illumination section 300 may have grating patterns of a first measurement scope, and the grating pattern image by the second grating pattern light generated from the second illumination section 400 may have grating patterns of a second measurement scope different from the first measurement scope.
FIG. 8 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a right side. FIG. 9 is a plan view illustrating an image measured in the camera when the grating pattern light is illuminated onto the measurement target from a left side. In the images of FIGS. 8 and 9, a relative amount with respect to brightness (luminance) is just shown, and the grating pattern is omitted.
Referring to FIGS. 6, 8 and 9, when the grating pattern light from one of the plurality of the illumination sections is illuminated onto the measurement target 10, an image photographed in the camera 210 may include a shadow area that is relatively dark and a saturation area that is relatively bright.
For example, as shown in FIG. 8, when the grating pattern light is illuminated onto the measurement target 10 from the right side, typically, the saturation area is formed at a right portion of the measurement target 10, and the shadow area is formed at a left portion of the measurement target 10. In contrast, as shown in FIG. 9, when the grating pattern light is illuminated onto the measurement target 10 from the left side, typically, the saturation area is formed at a left portion of the measurement target 10, and the shadow area is formed at a right portion of the measurement target 10.
Hereinafter, referring again to FIGS. 6 to 8, a method of measuring a three dimensional shape according to the present embodiment will be described based on the above described explanation.
Firstly, the grating pattern lights generated in a plurality of directions are sequentially illuminated onto the measurement target 10 disposed on the stage 110, and the grating pattern lights reflected by the measurement target 10 are sequentially detected in the camera 210 to acquire a plurality of pattern images.
Particularly, each of the grating pattern lights is moved aside and illuminated onto the measurement target 10 by N times, for example, three times or four times, to acquire N pattern images of the measurement target 10 for each of the directions. For example, as shown in FIG. 6, when the first and second grating pattern lights generated from the first and second illumination sections 300 and 400 are illuminated onto the measurement target 10, N first pattern images and N second pattern images may be acquired.
Then, N brightness degrees {I_i^1, I_i^2, . . . , I_i^N} at each position {i(x,y)} in an X-Y coordinate system, together with the measurement scope λ shown in FIG. 7, are extracted from the N pattern images with respect to each direction. Thereafter, the phase {P_i(x,y)}, the brightness {A_i(x,y)} and the visibility {V_i(x,y)} with respect to each direction are calculated from the N brightness degrees {I_i^1, I_i^2, . . . , I_i^N}. The phase {P_i(x,y)}, the brightness {A_i(x,y)} and the visibility {V_i(x,y)} with respect to each direction may be calculated by using an N-bucket algorithm. In addition, the brightness {A_i(x,y)} may be an average brightness obtained by averaging the detected grating pattern lights. Thus, the brightness {A_i(x,y)} will hereinafter be called the “average brightness {A_i(x,y)}”.
For example, when N is 3, three brightness degrees {I_i^1, I_i^2, I_i^3} are extracted from the three pattern images with respect to each direction, and the phase {P_i(x,y)}, the average brightness {A_i(x,y)} and the visibility {V_i(x,y)} may be calculated as shown in the following equations through a three-bucket algorithm. In the following equations, B_i(x,y) indicates the amplitude of the image signal (brightness signal) in the three pattern images with respect to each direction: I_i^1 corresponds to “a+b cos(φ)”, I_i^2 corresponds to “a+b cos(φ+2π/3)”, and I_i^3 corresponds to “a+b cos(φ+4π/3)”.
P_i(x,y) = \tan^{-1}\left(\frac{\sqrt{3}\,(I_i^3 - I_i^2)}{2I_i^1 - I_i^2 - I_i^3}\right), \qquad
A_i(x,y) = \frac{I_i^1 + I_i^2 + I_i^3}{3}, \qquad
V_i(x,y) = \frac{B_i}{A_i} = \frac{\sqrt{(2I_i^1 - I_i^2 - I_i^3)^2 + 3(I_i^2 - I_i^3)^2}}{I_i^1 + I_i^2 + I_i^3}
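As an illustration only (not part of the disclosure), the three-bucket relations above can be sketched in Python, assuming the three pattern images are available as per-pixel NumPy arrays; the function name is illustrative:

```python
import numpy as np

def three_bucket(i1, i2, i3):
    """Recover phase, average brightness, and visibility from three
    pattern images phase-shifted by 0, 2*pi/3 and 4*pi/3."""
    i1, i2, i3 = (np.asarray(v, dtype=float) for v in (i1, i2, i3))
    # Phase P_i(x, y) via the three-bucket arctangent formula.
    phase = np.arctan2(np.sqrt(3.0) * (i3 - i2), 2.0 * i1 - i2 - i3)
    # Average brightness A_i(x, y).
    avg = (i1 + i2 + i3) / 3.0
    # Visibility V_i(x, y) = B_i / A_i (modulation amplitude over average).
    vis = np.sqrt((2.0 * i1 - i2 - i3) ** 2 + 3.0 * (i2 - i3) ** 2) / (i1 + i2 + i3)
    return phase, avg, vis
```

For an ideal signal I^k = a + b·cos(φ + 2πk/3), the sketch returns φ, a and b/a at every position.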
In contrast, for example, when N is 4, four brightness degrees {I_i^1, I_i^2, I_i^3, I_i^4} are extracted from the four pattern images with respect to each direction, and the phase {P_i(x,y)}, the average brightness {A_i(x,y)} and the visibility {V_i(x,y)} may be calculated as shown in the following equations through a four-bucket algorithm. In the following equations, B_i(x,y) indicates the amplitude of the image signal (brightness signal) in the four pattern images with respect to each direction: I_i^1 corresponds to “a+b cos(φ)”, I_i^2 corresponds to “a+b cos(φ+π/2)”, I_i^3 corresponds to “a+b cos(φ+π)”, and I_i^4 corresponds to “a+b cos(φ+3π/2)”.
P_i(x,y) = \tan^{-1}\left(\frac{I_i^4 - I_i^2}{I_i^1 - I_i^3}\right), \qquad
A_i(x,y) = \frac{I_i^1 + I_i^2 + I_i^3 + I_i^4}{4}, \qquad
V_i(x,y) = \frac{B_i}{A_i} = \frac{2\sqrt{(I_i^1 - I_i^3)^2 + (I_i^2 - I_i^4)^2}}{I_i^1 + I_i^2 + I_i^3 + I_i^4}
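Likewise, the four-bucket relations can be sketched as follows (illustrative only; per-pixel NumPy arrays are assumed):

```python
import numpy as np

def four_bucket(i1, i2, i3, i4):
    """Recover phase, average brightness, and visibility from four
    pattern images phase-shifted by 0, pi/2, pi and 3*pi/2."""
    i1, i2, i3, i4 = (np.asarray(v, dtype=float) for v in (i1, i2, i3, i4))
    phase = np.arctan2(i4 - i2, i1 - i3)            # P_i(x, y)
    avg = (i1 + i2 + i3 + i4) / 4.0                 # A_i(x, y)
    # Visibility V_i(x, y) = B_i / A_i.
    vis = 2.0 * np.sqrt((i1 - i3) ** 2 + (i2 - i4) ** 2) / (i1 + i2 + i3 + i4)
    return phase, avg, vis
```

As with the three-bucket case, an ideal signal I^k = a + b·cos(φ + kπ/2) yields φ, a and b/a.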
In an exemplary embodiment, a signal-to-noise ratio (SNR) may be calculated and used in place of the visibility {V_i(x,y)}, or together with it. The SNR indicates the ratio of the image signal S to the noise signal N (S/N) in the N pattern images with respect to each direction.
Thereafter, the height {H_i(x,y)} with respect to each direction is calculated from the phase {P_i(x,y)} with respect to each direction by the following equation, in which k_i(x,y) is a phase-to-height conversion scale that indicates the conversion ratio between phase and height.
H_i(x,y) = k_i(x,y) \cdot P_i(x,y)
The height weight {W_i(x,y)} with respect to each direction is calculated by using at least one of the average brightness {A_i(x,y)}, the visibility {V_i(x,y)} and the measurement scope λ. The height weight {W_i(x,y)} with respect to each direction may be obtained as follows by a weight function {f(A_i,V_i,λ)} having as parameters, for example, the average brightness {A_i(x,y)}, the visibility {V_i(x,y)} and the measurement scope λ. The sum of the height weights over all of the directions may be 1 {ΣW_i(x,y)=1}.
W_i(x,y) = f(A_i, V_i, \lambda)
Then, the height {H_i(x,y)} with respect to each direction is multiplied by the height weight {W_i(x,y)} with respect to each direction to calculate a weight height {W_i(x,y)·H_i(x,y)} with respect to each direction. Thereafter, the weight heights over all of the directions are summed, and the sum is divided by the sum of the height weights {ΣW_i(x,y)} to calculate the height {ΣW_i(x,y)·H_i(x,y)/ΣW_i(x,y)} at each position.
Then, the three dimensional shape of the measurement target 10 may be accurately measured by combining the heights at the positions calculated as described above.
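The per-direction height calculation and weighted combination described above can be sketched as follows (illustrative only; the per-direction phases, phase-to-height scales k_i and height weights are assumed to be already available as per-pixel NumPy arrays or scalars):

```python
import numpy as np

def combine_heights(phases, k_scales, weights):
    """Compute the combined height sum_i(W_i * H_i) / sum_i(W_i),
    where H_i = k_i * P_i is the height for direction i."""
    # Per-direction heights H_i(x, y) = k_i(x, y) * P_i(x, y).
    heights = [np.asarray(k, dtype=float) * np.asarray(p, dtype=float)
               for k, p in zip(k_scales, phases)]
    # Weighted sum of heights over all directions.
    weighted = sum(np.asarray(w, dtype=float) * h
                   for w, h in zip(weights, heights))
    # Normalize by the sum of the height weights.
    return weighted / sum(np.asarray(w, dtype=float) for w in weights)
```

When the weights already sum to 1 at every position, the division simply leaves the weighted sum unchanged.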
Hereinafter, the relations between the height weight {W_i(x,y)} with respect to each direction and the parameters of the weight function {f(A_i,V_i,λ)}, i.e., the average brightness {A_i(x,y)}, the visibility {V_i(x,y)} or the SNR, and the measurement scope λ, will be described in detail.
FIG. 10 is a graph showing a relation between average brightness and weight of the pattern images measured in the camera.
Referring to FIG. 10, firstly, when the average brightness {A_i(x,y)} increases or decreases away from a predetermined value that is set in advance, the weight function {f(A_i,V_i,λ)} may act to decrease the height weight {W_i(x,y)}. In other words, the height weight {W_i(x,y)} has its relatively greatest value when the average brightness {A_i(x,y)} equals the predetermined value, and the height weight {W_i(x,y)} may decrease as the average brightness {A_i(x,y)} becomes distant from the predetermined value. The predetermined value may be set while establishing the three dimensional measurement conditions by using a reference specimen, or may be set arbitrarily by a user. The predetermined value may preferably be an average value, i.e., a mid value of the average brightness {A_i(x,y)}.
FIG. 11 is a graph showing a relation between visibility or SNR and weight of the pattern images measured in the camera.
Referring to FIG. 11, when the visibility {V_i(x,y)} or the SNR increases, the weight function {f(A_i,V_i,λ)} may act to increase the height weight. In other words, as the visibility {V_i(x,y)} or the SNR gradually increases, the height weight {W_i(x,y)} may also gradually increase.
FIG. 12 is a graph showing a relation between measurement scope and weight of the pattern images measured in the camera.
Referring to FIG. 12, when the measurement scope λ increases, the weight function {f(A_i,V_i,λ)} may act to decrease the height weight {W_i(x,y)}. In other words, as the measurement scope λ gradually increases, the height weight {W_i(x,y)} may gradually decrease.
Referring again to FIGS. 7, 10 and 11, the N pattern images with respect to each direction are divided into a shadow area, a saturation area and a non-saturation area, and a different height weight {W_i(x,y)} may be given to each area. In the shadow area, the average brightness {A_i(x,y)} is below a minimum brightness A1, and the visibility {V_i(x,y)} or the SNR is below a minimum reference value Vmin. In the saturation area, the average brightness {A_i(x,y)} is above a maximum brightness A2, and the visibility or the SNR is below the minimum reference value Vmin. The non-saturation area is the remaining area, excluding the shadow area and the saturation area.
Firstly, in the shadow area and the saturation area, the weight function {f(Ai,Vi,λ)} is regarded as ‘0’ to obtain the height weight {Wi(x,y)}. In other words, in the shadow area and the saturation area, the height weight {Wi(x,y)} is determined as ‘0’.
Then, in the non-saturation area, as shown in FIGS. 10 to 12, the weight function {f(Ai,Vi,λ)} may decrease the height weight {Wi(x,y)} when the average brightness {Ai(x,y)} increases or decreases from the mid value, increase the height weight {Wi(x,y)} when the visibility {Vi(x,y)} or the SNR increases, and decrease the height weight {Wi(x,y)} when the measurement scope λ increases.
Alternatively, in the non-saturation area, the weight function {f(A_i,V_i,λ)} may be regarded as assigning the same height weight {W_i(x,y)} to every direction. For example, when the height weights with respect to four directions in the non-saturation area are referred to as first, second, third and fourth height weights W1, W2, W3 and W4, all of the first, second, third and fourth height weights W1, W2, W3 and W4 may be determined as ‘¼’.
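A per-pixel weight assignment along these lines might be sketched as follows. The zeroing of the shadow and saturation areas follows the description above; the particular non-saturation-area formula (visibility divided by the distance of the average brightness from the mid value) is an illustrative assumption, not the disclosed weight function:

```python
import numpy as np

def height_weight(avg, vis, a_min, a_max, v_min, a_mid):
    """Assign zero weight in the shadow area (below minimum brightness,
    low visibility) and the saturation area (above maximum brightness,
    low visibility); elsewhere use an example weight that grows with
    visibility and shrinks as the average brightness departs from the
    mid value."""
    avg = np.asarray(avg, dtype=float)
    vis = np.asarray(vis, dtype=float)
    shadow = (avg < a_min) & (vis < v_min)
    saturated = (avg > a_max) & (vis < v_min)
    # Example non-saturation-area weight (illustrative form).
    weight = vis / (1.0 + np.abs(avg - a_mid))
    return np.where(shadow | saturated, 0.0, weight)
```

Whatever form is chosen, the per-direction weights can afterwards be normalized so that they sum to 1 at each position, as noted earlier.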
According to the present embodiment, the average brightness {A_i(x,y)}, the visibility {V_i(x,y)} or the SNR, and the measurement scope λ are extracted from the N pattern images photographed in each direction, and the height weight {W_i(x,y)} is determined according to the extraction result, thereby accurately measuring the height at each position of the measurement target 10 over all of the areas.
In particular, the N pattern images with respect to each direction are divided into the shadow area, the saturation area and the non-saturation area, and a different height weight {W_i(x,y)} is given to each area, to thereby prevent a reduction in the reliability of the height in the shadow area and the saturation area. In other words, the height weight {W_i(x,y)} is given a relatively low value, for example ‘0’, in the shadow area and the saturation area, and a relatively great value in the non-saturation area, thereby compensating for the adverse effects incurred by the shadow area and the saturation area and more accurately measuring the three dimensional shape of the measurement target.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (17)

What is claimed is:
1. A method of measuring a three dimensional shape comprising:
illuminating grating pattern lights in a plurality of directions onto a measurement target by N times and acquiring N pattern images reflected from the measurement target with respect to each direction, wherein a plurality of pattern-projecting sections disposed in the plurality of directions illuminates the grating pattern lights in the plurality of directions;
extracting a phase with respect to each direction, corresponding to each position from the N pattern images;
extracting N brightness degrees from the N pattern images with respect to each direction, corresponding to each position;
averaging the N brightness degrees with respect to each direction to obtain an average brightness with respect to each direction, corresponding to each position;
extracting a height weight with respect to each direction by using a weight function employing the average brightness as a parameter;
calculating a weight height with respect to each direction by using a height based on the phase and the height weight; and
calculating a height at each position by using weight heights with respect to the plurality of directions associated with the plurality of pattern-projecting sections.
2. The method of claim 1, wherein the grating pattern lights illuminated by N times include pattern lights each having different patterns.
3. The method of claim 1, further comprising measuring a three dimensional shape by combining the height at each position.
4. The method of claim 1, wherein sum of the height weights with respect to the plurality of directions is equal to 1.
5. The method of claim 1, wherein the weight function further employs at least one of a visibility and a signal-to-noise ratio (SNR) with respect to each direction extracted from the pattern images with respect to each direction as parameters.
6. The method of claim 5, wherein the weight function further employs a measurement scope (λ) corresponding to a grating pitch of each grating pattern light extracted from the pattern images with respect to each direction as a parameter.
7. The method of claim 6, wherein the measurement scope has at least two values according to the grating pattern lights.
8. The method of claim 6, wherein the weight function decreases the height weight, as the measurement scope increases.
9. The method of claim 5, wherein
the weight function decreases the height weight, as the average brightness increases or decreases from a predetermined value.
10. The method of claim 9, wherein the predetermined value is a mid value of the average brightness.
11. The method of claim 5, wherein the weight function increases the height weight, as the visibility or the SNR increases.
12. The method of claim 5,
wherein extracting the height weight with respect to each direction comprising dividing the pattern images into a shadow area, a saturation area and a non-saturation area, and
wherein the shadow area corresponds to an area, in which the average brightness is below a minimum brightness and the visibility or the SNR is below a minimum reference value,
the saturation area corresponds to an area, in which the average brightness is more than a maximum brightness and the visibility or the SNR is below the minimum reference value, and
the non-saturation area corresponds to a remaining area except the shadow area and the saturation area.
13. The method of claim 12, wherein the weight function is calculated by regarding the height weight as ‘0’ in the shadow area and the saturation area.
14. The method of claim 13, wherein the weight function corresponding to the non-saturation area decreases the height weight, as the average brightness increases or decreases from a mid value of the average brightness, and increases the height weight as the visibility or the SNR increases.
15. A method of measuring a three dimensional shape comprising:
illuminating grating pattern lights in a plurality of directions onto a measurement target by N times and acquiring N pattern images of the measurement target with respect to each direction, wherein a plurality of pattern-projecting sections disposed in the plurality of directions illuminates the grating pattern lights in the plurality of directions;
extracting a phase and a visibility with respect to each direction, corresponding to each position from the N pattern images;
extracting a height weight with respect to each direction by using a weight function employing the visibility as a parameter;
calculating a weight height with respect to each direction by using a height based on the phase and the height weight; and
calculating a height at each position by using weight heights with respect to the plurality of directions associated with the plurality of pattern-projecting sections,
wherein the visibility is an amplitude of brightness signal divided by an average brightness.
16. The method of claim 15, further comprising measuring a three dimensional shape by combining the height at each position.
17. The method of claim 15, wherein sum of the height weights with respect to the plurality of directions is equal to 1.
US15/002,784 2009-05-21 2016-01-21 Shape measurement apparatus and method Active US9739605B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/002,784 US9739605B2 (en) 2009-05-21 2016-01-21 Shape measurement apparatus and method

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR1020090044423A KR101088497B1 (en) 2009-05-21 2009-05-21 Method for measuring three dimensional shape
KR10-2009-0044423 2009-05-21
KR10-2010-0007025 2010-01-26
KR1020100007025A KR101158324B1 (en) 2010-01-26 2010-01-26 Image Sensing System
US12/784,707 US9275292B2 (en) 2009-05-21 2010-05-21 Shape measurement apparatus and method
US15/002,784 US9739605B2 (en) 2009-05-21 2016-01-21 Shape measurement apparatus and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/784,707 Continuation US9275292B2 (en) 2009-05-21 2010-05-21 Shape measurement apparatus and method

Publications (2)

Publication Number Publication Date
US20160153772A1 US20160153772A1 (en) 2016-06-02
US9739605B2 true US9739605B2 (en) 2017-08-22

Family

ID=43102692

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/784,707 Active 2033-06-28 US9275292B2 (en) 2009-05-21 2010-05-21 Shape measurement apparatus and method
US13/860,228 Active 2032-02-26 US9791266B2 (en) 2009-05-21 2013-04-10 Shape measurement apparatus and method
US15/002,784 Active US9739605B2 (en) 2009-05-21 2016-01-21 Shape measurement apparatus and method

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US12/784,707 Active 2033-06-28 US9275292B2 (en) 2009-05-21 2010-05-21 Shape measurement apparatus and method
US13/860,228 Active 2032-02-26 US9791266B2 (en) 2009-05-21 2013-04-10 Shape measurement apparatus and method

Country Status (5)

Country Link
US (3) US9275292B2 (en)
JP (1) JP5202575B2 (en)
CN (2) CN104034280B (en)
DE (2) DE102010064593A1 (en)
TW (1) TWI440821B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160260217A1 (en) * 2015-03-04 2016-09-08 Canon Kabushiki Kaisha Measurement apparatus and measurement method

Families Citing this family (51)

Publication number Priority date Publication date Assignee Title
KR101547218B1 (en) 2010-11-19 2015-08-25 주식회사 고영테크놀러지 Method for inspecting substrate
KR101174676B1 (en) 2010-11-19 2012-08-17 주식회사 고영테크놀러지 Method and apparatus of profiling a surface
CN102305601B (en) * 2011-05-18 2012-10-10 天津大学 High-precision non-contact measurement method and device for three-dimensional profile of optical freeform curved surface
CN102914543A (en) * 2011-08-03 2013-02-06 浙江中茂科技有限公司 Article detection device of three-dimensional stereo image
JP5709009B2 (en) * 2011-11-17 2015-04-30 Ckd株式会社 3D measuring device
JP5918984B2 (en) * 2011-12-06 2016-05-18 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
JP5900037B2 (en) * 2012-03-08 2016-04-06 オムロン株式会社 Image processing apparatus and control method thereof
CN102628811B (en) * 2012-03-30 2015-04-22 中国科学院长春光学精密机械与物理研究所 Verifying device of grating groove defect
TWI546518B (en) * 2012-04-20 2016-08-21 德律科技股份有限公司 Three dimensional measurement system and three dimensional measurement method
CN105823438B (en) * 2012-05-22 2018-10-19 株式会社高永科技 The height measurement method of 3 d shape measuring apparatus
US8937657B2 (en) * 2012-07-15 2015-01-20 Erik Klass Portable three-dimensional metrology with data displayed on the measured surface
JP6009288B2 (en) * 2012-09-11 2016-10-19 株式会社キーエンス Measurement microscope apparatus, measurement microscope apparatus operation program, and computer-readable recording medium
US10165197B2 (en) * 2012-11-12 2018-12-25 Astral Images Corporation System and method for processing an image carried by an optical substrate and computer readable medium made using same
JP2014106094A (en) * 2012-11-27 2014-06-09 Keyence Corp Shape measurement device
WO2014083386A2 (en) * 2012-11-29 2014-06-05 Csir A method of calibrating a camera and a system therefor
KR101590831B1 (en) * 2013-04-02 2016-02-03 주식회사 고영테크놀러지 Method of inspecting foreign substance on a board
CN103322944B (en) * 2013-06-14 2016-06-29 上海大学 Coaxial-illuminating mirror-image mole measuring device and method
JP5843241B2 (en) * 2013-11-26 2016-01-13 レーザーテック株式会社 Inspection apparatus and inspection method
TWI489101B (en) * 2013-12-02 2015-06-21 Ind Tech Res Inst Apparatus and method for combining 3d and 2d measurement
US10417472B2 (en) 2014-01-10 2019-09-17 Koh Young Technology Inc. Device and method for measuring three-dimensional shape
JP6287360B2 (en) * 2014-03-06 2018-03-07 オムロン株式会社 Inspection device
JP6486010B2 (en) * 2014-03-11 2019-03-20 国立大学法人 和歌山大学 Shape measuring apparatus and shape measuring method
EP3156763B1 (en) * 2014-06-13 2019-02-06 Nikon Corporation Shape measurement device
JP6303867B2 (en) * 2014-06-27 2018-04-04 オムロン株式会社 Substrate inspection apparatus and control method thereof
JP6451142B2 (en) * 2014-08-20 2019-01-16 オムロン株式会社 Quality control device and control method for quality control device
CN104634277B (en) * 2015-02-12 2018-05-15 上海图漾信息科技有限公司 Capture apparatus and method, three-dimension measuring system, depth computing method and equipment
JP6126640B2 (en) * 2015-05-11 2017-05-10 Ckd株式会社 Three-dimensional measuring apparatus and three-dimensional measuring method
JP6109255B2 (en) * 2015-07-14 2017-04-05 Ckd株式会社 3D measuring device
JP6027204B1 (en) * 2015-10-05 2016-11-16 Ckd株式会社 3D measuring device
CN105547190B (en) * 2015-12-14 2018-08-14 深圳先进技术研究院 3 D measuring method and device based on double angle unifrequency fringe projections
CN105588518B (en) * 2015-12-14 2018-09-11 深圳先进技术研究院 Three-dimensional appearance acquisition methods based on double angle multi-frequency fringe projections and device
KR102079181B1 (en) * 2016-03-04 2020-02-19 주식회사 고영테크놀러지 Pattern lighting appartus and method thereof
CN106705855B (en) * 2017-03-10 2018-12-14 东南大学 A kind of high dynamic performance method for three-dimensional measurement based on adaptive optical grating projection
DE102017004428B4 (en) * 2017-05-08 2018-11-29 Universität Stuttgart Method and device for robust, deep-scanning focusing strip triangulation with multiple wavelets
CN107121079B (en) * 2017-06-14 2019-11-22 华中科技大学 A kind of curved surface elevation information measuring device and method based on monocular vision
EP3489619A1 (en) 2017-11-28 2019-05-29 Koh Young Technology Inc. Apparatus for inspecting substrate and method thereof
KR102138622B1 (en) * 2017-11-28 2020-07-28 주식회사 고영테크놀러지 Apparatus for inspecting substrate and method thereof
WO2019116708A1 (en) * 2017-12-12 2019-06-20 ソニー株式会社 Image processing device, image processing method and program, and image processing system
JP6979885B2 (en) * 2018-01-17 2021-12-15 株式会社ミツトヨ 3D shape auto trace method and measuring machine
JP6954142B2 (en) * 2018-01-17 2021-10-27 オムロン株式会社 Image inspection equipment and lighting equipment
WO2019164381A1 (en) * 2018-02-26 2019-08-29 주식회사 고영테크놀러지 Method for inspecting mounting state of component, printed circuit board inspection apparatus, and computer readable recording medium
JP2020071060A (en) * 2018-10-29 2020-05-07 株式会社ミツトヨ Shape measuring device
JP2020095519A (en) * 2018-12-13 2020-06-18 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Shape estimation device, shape estimation method, program, and recording medium
WO2020159513A1 (en) * 2019-01-31 2020-08-06 Halliburton Energy Services, Inc. Downhole depth extraction using structured illumination
JP7247031B2 (en) * 2019-06-27 2023-03-28 株式会社キーエンス inspection equipment
CN114096800A (en) * 2019-06-28 2022-02-25 株式会社高迎科技 Apparatus and method for determining three-dimensional image of object
JP7393737B2 (en) 2020-02-27 2023-12-07 オムロン株式会社 Image inspection device and image inspection method
CN114076576B (en) * 2020-08-21 2023-11-21 深圳市万普拉斯科技有限公司 Light emitter, camera module, electronic equipment and image three-dimensional information acquisition method
CN112414945B (en) * 2021-01-25 2021-04-06 深圳宜美智科技股份有限公司 Drilling hole defect detection method and drilling hole defect detection equipment based on image detection
US20220252893A1 (en) * 2021-02-09 2022-08-11 Himax Technologies Limited Light projection apparatus
CN113358063B (en) * 2021-06-04 2022-03-18 华中科技大学 Surface structured light three-dimensional measurement method and system based on phase weighted fusion

Citations (13)

Publication number Priority date Publication date Assignee Title
US5135308A (en) * 1990-03-09 1992-08-04 Carl-Zeiss-Stiftung Method and apparatus for non-contact measuring of object surfaces
US5166811A (en) * 1990-01-19 1992-11-24 Sharp Kabushiki Kaisha Image data processing apparatus
US6650443B1 (en) * 1999-07-30 2003-11-18 Canon Kabushiki Kaisha Apparatus and method for reading image, and computer-readable storage medium storing image processing program
US6823080B2 (en) * 1996-07-01 2004-11-23 Canon Kabushiki Kaisha Three-dimensional information processing apparatus and method
US20050046865A1 (en) * 2003-08-28 2005-03-03 Brock Neal J. Pixelated phase-mask interferometer
US20060077398A1 (en) * 2004-10-13 2006-04-13 Michel Cantin System and method for height profile measurement of reflecting objects
US20060109482A1 (en) * 2003-06-11 2006-05-25 Yan Duval 3D and 2D measurement system and method with increased sensitivity and dynamic range
US20060158664A1 (en) * 2003-02-06 2006-07-20 Koh Young Technology Inc Three-dimensional image measuring apparatus
US20070146683A1 (en) * 2005-12-20 2007-06-28 Kabushiki Kaisha Topcon Distance measuring method and distance measuring device
US20070177159A1 (en) * 2006-01-26 2007-08-02 Kim Min Y Method for measuring three-dimension shape
US20070223805A1 (en) * 2006-03-23 2007-09-27 Moon Young Jeon Apparatus for measuring three dimensional shape
US20070296979A1 (en) * 2006-06-23 2007-12-27 Konica Minolta Sensing, Inc. Three-dimensional shape measuring apparatus
US20100268069A1 (en) * 2009-04-16 2010-10-21 Rongguang Liang Dental surface imaging using polarized fringe projection

Family Cites Families (81)

Publication number Priority date Publication date Assignee Title
JPS60173403A (en) * 1984-02-10 1985-09-06 Nippon Telegr & Teleph Corp <Ntt> Correspondence processing method of stereophonic image
US5085502A (en) * 1987-04-30 1992-02-04 Eastman Kodak Company Method and apparatus for digital morie profilometry calibrated for accurate conversion of phase information into distance measurements in a plurality of directions
CA1285661C (en) * 1988-08-19 1991-07-02 Randy Tsang Automatic visual measurement of surface mount device placement
CN2086427U (en) * 1990-06-20 1991-10-09 林家义 No-contact three-dimensional coordinate fast determining and recording device
CN1020503C (en) * 1990-09-13 1993-05-05 清华大学 Optical method for surveying three dimensional figure without reference plane
JP3194499B2 (en) * 1992-10-22 2001-07-30 株式会社日立国際電気 Appearance shape inspection method and device
JPH07332956A (en) * 1994-06-06 1995-12-22 Yamatake Honeywell Co Ltd Apparatus for measuring surface shape
JPH08152310A (en) * 1994-11-30 1996-06-11 Kubota Corp Method for generating coordinate operation reference data in three-dimensional shape input unit
JP3525964B2 (en) * 1995-07-05 2004-05-10 株式会社エフ・エフ・シー 3D shape measurement method for objects
DE19634254B4 (en) * 1995-09-04 2009-06-10 Volkswagen Ag Optical-numerical method for determining the entire surface of a three-dimensional object
DE19637682B4 (en) * 1996-09-05 2004-04-29 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for determining the spatial coordinates of objects and / or their temporal change and device for applying this method
DE19747061B4 (en) * 1997-10-24 2005-02-10 Mähner, Bernward Method and device for the planar, three-dimensional, optical measurement of objects
US6316736B1 (en) * 1998-06-08 2001-11-13 Visteon Global Technologies, Inc. Anti-bridging solder ball collection zones
DE19852149C2 (en) * 1998-11-04 2000-12-07 Fraunhofer Ges Forschung Device for determining the spatial coordinates of objects
US6177905B1 (en) * 1998-12-08 2001-01-23 Avaya Technology Corp. Location-triggered reminder for mobile user devices
CA2373284A1 (en) * 1999-05-14 2000-11-23 3D Metrics, Incorporated Color structured light 3d-imaging system
JP2001311613A (en) * 2000-02-24 2001-11-09 Satoru Toyooka Phase unlapping method in image analysis of speckle interference
CA2301822A1 (en) * 2000-03-24 2001-09-24 9071 9410 Quebec Inc. Simultaneous projection of several patterns with simultaneous acquisition for inspection of objects in three-dimensions
US6680675B1 (en) * 2000-06-21 2004-01-20 Fujitsu Limited Interactive to-do list item notification system including GPS interface
US6957076B2 (en) * 2000-11-22 2005-10-18 Denso Corporation Location specific reminders for wireless mobiles
US20020067308A1 (en) * 2000-12-06 2002-06-06 Xerox Corporation Location/time-based reminder for personal electronic devices
JP3925611B2 (en) * 2001-03-22 2007-06-06 セイコーエプソン株式会社 Information providing system, information providing apparatus, program, information storage medium, and user interface setting method
US7139722B2 (en) * 2001-06-27 2006-11-21 Bellsouth Intellectual Property Corporation Location and time sensitive wireless calendaring
US7302686B2 (en) * 2001-07-04 2007-11-27 Sony Corporation Task management system
US6604059B2 (en) * 2001-07-10 2003-08-05 Koninklijke Philips Electronics N.V. Predictive calendar
JP3722052B2 (en) * 2001-11-27 2005-11-30 松下電工株式会社 Method and apparatus for evaluating pad portion for mounting semiconductor chip
JP3878033B2 (en) * 2002-02-28 2007-02-07 シーケーディ株式会社 3D measuring device
CN100338434C (en) * 2003-02-06 2007-09-19 株式会社高永科技 Thrre-dimensional image measuring apparatus
JP4077754B2 (en) * 2003-04-04 2008-04-23 オリンパス株式会社 3D shape measuring device
JP2004309402A (en) * 2003-04-10 2004-11-04 Mitsutoyo Corp Interference measuring instrument and phase shift fringe analyzer
US20050015772A1 (en) * 2003-07-16 2005-01-20 Saare John E. Method and system for device specific application optimization via a portal server
JP4480488B2 (en) * 2003-08-28 2010-06-16 富士通株式会社 Measuring device, computer numerical control device, and program
US20050108074A1 (en) * 2003-11-14 2005-05-19 Bloechl Peter E. Method and system for prioritization of task items
US20050114140A1 (en) * 2003-11-26 2005-05-26 Brackett Charles C. Method and apparatus for contextual voice cues
US7447630B2 (en) * 2003-11-26 2008-11-04 Microsoft Corporation Method and apparatus for multi-sensory speech enhancement
KR100462292B1 (en) * 2004-02-26 2004-12-17 엔에이치엔(주) A method for providing search results list based on importance information and a system thereof
US7084758B1 (en) * 2004-03-19 2006-08-01 Advanced Micro Devices, Inc. Location-based reminders
JP2006023178A (en) * 2004-07-07 2006-01-26 Olympus Corp 3-dimensional measuring method and device
US7853574B2 (en) * 2004-08-26 2010-12-14 International Business Machines Corporation Method of generating a context-inferenced search query and of sorting a result of the query
US20060061488A1 (en) * 2004-09-17 2006-03-23 Dunton Randy R Location based task reminder
US7603381B2 (en) * 2004-09-30 2009-10-13 Microsoft Corporation Contextual action publishing
US7885844B1 (en) * 2004-11-16 2011-02-08 Amazon Technologies, Inc. Automatically generating task recommendations for human task performers
CN1609859A (en) * 2004-11-26 2005-04-27 孙斌 Search result clustering method
US8069422B2 (en) * 2005-01-10 2011-11-29 Samsung Electronics, Co., Ltd. Contextual task recommendation system and method for determining user's context and suggesting tasks
US7306341B2 (en) * 2005-02-28 2007-12-11 Hewlett-Packard Development Company, L.P. Multi-projector geometric calibration
US7925525B2 (en) * 2005-03-25 2011-04-12 Microsoft Corporation Smart reminders
JP4611782B2 (en) * 2005-03-28 2011-01-12 シチズンホールディングス株式会社 Three-dimensional shape measuring method and measuring apparatus
US8024195B2 (en) * 2005-06-27 2011-09-20 Sensory, Inc. Systems and methods of performing speech recognition using historical information
US20070027732A1 (en) * 2005-07-28 2007-02-01 Accu-Spatial, Llc Context-sensitive, location-dependent information delivery at a construction site
US7898651B2 (en) * 2005-10-24 2011-03-01 General Electric Company Methods and apparatus for inspecting an object
US20070102478A1 (en) * 2005-11-10 2007-05-10 Speedline Technologies, Inc. Optimal imaging system and method for a stencil printer
KR100612932B1 (en) * 2005-12-14 2006-08-14 주식회사 고영테크놀러지 3d image measuring apparatus and method thereof
KR100672818B1 (en) * 2006-01-26 2007-01-22 주식회사 고영테크놀러지 Method for measuring three-dimension shape
JP5123522B2 (en) * 2006-12-25 2013-01-23 パナソニック株式会社 3D measurement method and 3D shape measurement apparatus using the same
KR100903346B1 (en) 2006-12-28 2009-06-22 (주) 인텍플러스 Method for optical visual examination of three-dimensional shape
JP5252820B2 (en) * 2007-03-27 2013-07-31 パナソニック株式会社 3D measurement method and 3D shape measurement apparatus using the same
WO2009094510A1 (en) * 2008-01-25 2009-07-30 Cyberoptics Corporation Multi-source sensor for three-dimensional imaging using phased structured light

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5166811A (en) * 1990-01-19 1992-11-24 Sharp Kabushiki Kaisha Image data processing apparatus
US5135308A (en) * 1990-03-09 1992-08-04 Carl-Zeiss-Stiftung Method and apparatus for non-contact measuring of object surfaces
US6823080B2 (en) * 1996-07-01 2004-11-23 Canon Kabushiki Kaisha Three-dimensional information processing apparatus and method
US6650443B1 (en) * 1999-07-30 2003-11-18 Canon Kabushiki Kaisha Apparatus and method for reading image, and computer-readable storage medium storing image processing program
US20060158664A1 (en) * 2003-02-06 2006-07-20 Koh Young Technology Inc Three-dimensional image measuring apparatus
US20060109482A1 (en) * 2003-06-11 2006-05-25 Yan Duval 3D and 2D measurement system and method with increased sensitivity and dynamic range
US20050046865A1 (en) * 2003-08-28 2005-03-03 Brock Neal J. Pixelated phase-mask interferometer
US20060077398A1 (en) * 2004-10-13 2006-04-13 Michel Cantin System and method for height profile measurement of reflecting objects
US20070146683A1 (en) * 2005-12-20 2007-06-28 Kabushiki Kaisha Topcon Distance measuring method and distance measuring device
US20070177159A1 (en) * 2006-01-26 2007-08-02 Kim Min Y Method for measuring three-dimension shape
US20070223805A1 (en) * 2006-03-23 2007-09-27 Moon Young Jeon Apparatus for measuring three dimensional shape
US20070296979A1 (en) * 2006-06-23 2007-12-27 Konica Minolta Sensing, Inc. Three-dimensional shape measuring apparatus
US20100268069A1 (en) * 2009-04-16 2010-10-21 Rongguang Liang Dental surface imaging using polarized fringe projection

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160260217A1 (en) * 2015-03-04 2016-09-08 Canon Kabushiki Kaisha Measurement apparatus and measurement method
US10121246B2 (en) * 2015-03-04 2018-11-06 Canon Kabushiki Kaisha Measurement apparatus that obtains information of a shape of a surface using a corrected image and measurement method

Also Published As

Publication number Publication date
JP5202575B2 (en) 2013-06-05
US20160153772A1 (en) 2016-06-02
US9791266B2 (en) 2017-10-17
CN101893428A (en) 2010-11-24
DE102010029091A1 (en) 2011-01-13
DE102010064593A1 (en) 2015-07-30
US20100295941A1 (en) 2010-11-25
JP2010271316A (en) 2010-12-02
DE102010029091B4 (en) 2015-08-20
CN104034280B (en) 2017-09-08
US20130222578A1 (en) 2013-08-29
CN101893428B (en) 2014-07-16
TW201105924A (en) 2011-02-16
US9275292B2 (en) 2016-03-01
TWI440821B (en) 2014-06-11
CN104034280A (en) 2014-09-10

Similar Documents

Publication Publication Date Title
US9739605B2 (en) Shape measurement apparatus and method
US9091725B2 (en) Board inspection apparatus and method
US8724883B2 (en) Method for inspecting measurement object
KR101190122B1 (en) Apparatus and method for measuring three dimension shape using multi-wavelength
US10126252B2 (en) Enhanced illumination control for three-dimensional imaging
US9124810B2 (en) Method of checking an inspection apparatus and method of establishing a measurement variable of the inspection apparatus
KR100747050B1 (en) 3-dimensional measuring device
US8730464B2 (en) Method of inspecting a substrate
US20120120414A1 (en) Method of inspecting board
US20110255771A1 (en) Method of discriminating between an object region and a ground region and method of measuring three dimensional shape by using the same
KR101158324B1 (en) Image Sensing System
KR101684244B1 (en) Board inspection method
US8755043B2 (en) Method of inspecting a substrate
KR101262242B1 (en) Image Sensing Method and System
JP2020091203A (en) Projection device and three-dimensional measuring device
KR20130045301A (en) Image sensing method and system

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.)

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4