
Abstract:

Depth sensing imaging pixels include pairs of left and right pixels
forming an asymmetrical angular response to incident light. A single
microlens is positioned above each pair of left and right pixels. Each
microlens spans across each of the pairs of pixels in a horizontal
direction. Each microlens has a length that is substantially twice the
length of either the left or right pixel in the horizontal direction; and
each microlens has a width that is substantially the same as a width of
either the left or right pixel in a vertical direction. The horizontal
and vertical directions are horizontal and vertical directions of a
planar image array. A light pipe in each pixel is used to improve light
concentration and reduce cross talk.

Claims:

1. Depth sensing imaging pixels comprising: left and right pixels forming
an asymmetrical angular response to incident light, and a single
microlens positioned above the left and right pixels, wherein the single
microlens spans across the left and right pixels in a horizontal
direction.

2. The imaging pixels of claim 1 wherein the single microlens has a
length that is substantially twice the length of either the left or right
pixel in the horizontal direction, and the single microlens has a width
that is substantially the same as a width of either the left or right
pixel in a vertical direction; wherein the horizontal and vertical
directions are horizontal and vertical directions of a planar image
array.

3. The imaging pixels of claim 2 wherein the single microlens has a
radius of curvature in the horizontal direction that is the same as or
different from its radius of curvature in the vertical direction.

4. The imaging pixels of claim 1 including: a color filter disposed
between the single microlens and the left and right pixels.

5. The imaging pixels of claim 4 wherein the color filter spans across
the left and right pixels in the horizontal direction.

6. The imaging pixels of claim 5 wherein the color filter has a length
that is substantially twice the length of either the left or right pixel
in the horizontal direction, and the color filter has a width that is
substantially the same as a width of either the left or right pixel in
the vertical direction, wherein the horizontal and vertical directions
are horizontal and vertical directions of a planar image array.

7. The imaging pixels of claim 5 wherein the color filter is configured
to provide one of the following in a CFA pattern: (a) left and right
Bayer images for two separated images, and (b) a single Bayer image for a
single image, in which the left and right pixels are summed together.

8. The imaging pixels of claim 1 including a left light pipe (LP)
disposed between the left pixel and the single microlens for directing
the incident light toward the left pixel, and a right light pipe (LP)
disposed between the right pixel and the single microlens for directing
the incident light toward the right pixel.

9. The imaging pixels of claim 8 wherein the left light pipe is
configured to receive the incident light at a relatively high signal
response when the incident light forms a positive angle with respect to
a vertical plane passing between the left and right pixels, and the right
light pipe is configured to receive the incident light at a relatively
high signal response when the incident light forms a negative angle with
respect to the vertical plane passing between the left and right pixels.

10. The imaging pixels of claim 8 wherein the left and right light pipes
include material of a refractive index greater than a refractive index of
material external to the left and right light pipes.

11. An imaging array comprising: multiple pixel pairs of left and right
pixels, each pixel pair forming an asymmetrical angular response to
incident light, and multiple color filters disposed on top of the
multiple pixel pairs, wherein the multiple color filters provide at least
one Bayer color pattern and each color filter spans across one pixel
pair.

12. The imaging array of claim 11 wherein the multiple pixel pairs are
configured to provide one of the following: (a) two separate images, one
image derived from left pixels and the other image derived from right
pixels, and (b) a single image upon summing respective left and right
pixels in each of the multiple pixel pairs.

13. The imaging array of claim 11 including: multiple microlenses
covering the multiple pixel pairs, wherein each microlens covers
2×1 pixels, defining two pixels in a row and one pixel in a column.

14. The imaging array of claim 13 wherein each microlens in a row is
shifted by one pixel relative to each microlens in a neighboring row.

15. The imaging array of claim 14 wherein two separate patterns are
formed, in which left and right image arrays are formed from four
neighboring rows.

16. The imaging array of claim 11 including multiple light pipes disposed
between the multiple pixel pairs and the multiple color filters, wherein
each light pipe is configured to direct the incident light toward one of
either a left pixel or a right pixel.

17. An imaging array for a camera comprising: multiple pixel pairs
forming rows and columns of a focal planar array (FPA), each pixel pair
including left and right pixels forming an asymmetrical angular response
to incident light, and multiple light pipes (LP) disposed within the
multiple pixel pairs, wherein each LP is configured to direct the
incident light toward one of either left or right pixels of respective
pixel pairs.

18. The imaging array of claim 17 including: multiple color filters
disposed above the multiple LPs, wherein the multiple color filters are
configured to provide at least one Bayer color pattern, and each color
filter spans across each pixel pair.

20. The imaging array of claim 19 wherein each microlens in a row of the
FPA is zigzag shifted by one pixel relative to each microlens in a
neighboring row, a left imaging array in four adjacent rows forms a Bayer
pattern, and a right imaging array in four adjacent rows forms another
Bayer pattern.

[0002] The present invention relates, in general, to imaging systems. More
specifically, the present invention relates to imaging systems with depth
sensing capabilities and stereo perception, while using only a single
sensor with a single lens.

BACKGROUND OF THE INVENTION

[0003] Modern electronic devices such as cellular telephones, cameras, and
computers often use digital image sensors. Imagers (i.e., image sensors)
may be formed from a two-dimensional array of image sensing pixels. Each
pixel receives incident photons (light) and converts the photons into
electrical signals.

[0004] Some applications, such as three-dimensional (3D) imaging, may
require electronic devices to have depth sensing capabilities. For
example, to properly generate a 3D image for a given scene, an electronic
device may need to identify the distances between the electronic device
and objects in the scene. To identify distances, conventional electronic
devices use complex arrangements. Some arrangements require the use of
multiple cameras with multiple image sensors and lenses that capture
images from various viewpoints. These arrangements increase the cost and
complexity of obtaining good stereo imaging performance. Other
arrangements require the addition of lenticular arrays that focus
incident light on sub-regions of a two-dimensional pixel array. Due to
the addition of components, such as complex lens arrays, these
arrangements lead to reduced spatial resolution, increased cost and
complexity.

[0005] The present invention, as will be explained, describes an improved
imager that obtains stereo performance using a single sensor with a
single lens. Such an imager reduces complexity and cost, and improves
stereo imaging performance.

BRIEF DESCRIPTION OF THE FIGURES

[0006] The invention may be best understood from the following detailed
description when read in connection with the accompanying figures:

[0007] FIG. 1 is a schematic diagram of an electronic device with a camera
sensor that may include depth sensing pixels, in accordance with an
embodiment of the present invention.

[0008] FIG. 2A is a cross-sectional view of a pair of depth sensing pixels
covered by one microlens that has an asymmetric angular response, in
accordance with an embodiment of the present invention.

[0009] FIGS. 2B and 2C are cross-sectional views of a depth sensing pixel
that may be asymmetrically sensitive to incident light at negative and
positive angles of incidence, in accordance with an embodiment of the
present invention.

[0010] FIG. 2D shows a cross-sectional view and a top view of a pair of
depth sensing pixels covered by one microlens, in accordance with an
embodiment of the present invention.

[0011] FIG. 3 is a diagram of illustrative output signals of a depth
sensing pixel for incident light striking the depth sensing pixel at
varying angles of incidence, in accordance with an embodiment of the
present invention.

[0012] FIG. 4 is a diagram of illustrative output signals of depth sensing
pixels in a depth sensing pixel pair for incident light striking the
depth sensing pixel pair at varying angles of incidence, in accordance
with an embodiment of the present invention.

[0013] FIG. 5A is a diagram of a depth sensing imager having a lens and an
object located at a focal distance away from the lens, showing how the
lens focuses light from the object onto the depth sensing imager, in
accordance with an embodiment of the present invention.

[0014] FIG. 5B is a diagram of a depth sensing imager having a lens and an
object located at more than a focal distance away from the lens, showing
how the lens focuses light from the object onto the depth sensing imager,
in accordance with an embodiment of the present invention.

[0015] FIG. 5C is a diagram of a depth sensing imager having a lens and an
object located less than a focal distance away from the imaging lens,
showing how the lens focuses light from the object onto the depth sensing
imager, in accordance with an embodiment of the present invention.

[0016] FIG. 6 is a diagram of illustrative depth output signals of a depth
sensing pixel pair for an object at varying distances from the depth
sensing pixel, in accordance with an embodiment of the present invention.

[0017] FIG. 7 is a perspective view of one microlens covering two depth
sensing pixels, in accordance with an embodiment of the present
invention.

[0018] FIG. 8 is a diagram showing a top view of two sets of two depth
sensing pixels of FIG. 7 arranged in a Bayer pattern, in accordance with
an embodiment of the present invention.

[0019] FIG. 9 is a diagram of a cross-sectional view of two sets of two
depth sensing pixels, showing light entering one light pipe (LP) in each
set, in accordance with an embodiment of the present invention.

[0020] FIG. 10 is a diagram of a side view of the two sets of two depth
sensing pixels shown in FIG. 9.

[0021] FIG. 11 is a plot of the relative signal response versus the incident
angle of light entering left and right pixels in each set of pixels shown
in FIG. 9, in accordance with an embodiment of the present invention.

[0022] FIG. 12 is a top view of sets of left and right pixels arranged in
a Bayer pattern, in accordance with an embodiment of the present
invention.

[0023] FIGS. 13A and 13B are top views of sets of left and right pixels
arranged differently so that each forms a Bayer pattern, in accordance
with an embodiment of the present invention.

[0025] Still and video image data from camera sensor 14 may be provided to
image processing and data formatting circuitry 16 via path 26. Image
processing and data formatting circuitry 16 may be used to perform image
processing functions such as data formatting, adjusting white balance and
exposure, implementing video image stabilization, face detection, etc.
Image processing and data formatting circuitry 16 may also be used to
compress raw camera image files, if desired (e.g., to Joint Photographic
Experts Group, or JPEG format). In a typical arrangement, which is
sometimes referred to as a system-on-chip, or SOC arrangement, camera
sensor 14 and image processing and data formatting circuitry 16 are
implemented on a common integrated circuit. The use of a single
integrated circuit to implement camera sensor 14 and image processing and
data formatting circuitry 16 may help to minimize costs.

[0027] It may be desirable to form image sensors with depth sensing
capabilities (e.g., for use in 3D imaging applications, such as machine
vision applications and other three dimensional imaging applications). To
provide depth sensing capabilities, camera sensor 14 may include pixels
such as pixels 100A and 100B, shown in FIG. 2A.

[0028] FIG. 2A shows an illustrative cross-section of pixels 100A and
100B. Pixels 100A and 100B may contain microlens 102, color filter 104, a
stack of dielectric layers 106, substrate layer 108, photosensitive
areas, such as photosensitive areas 110A and 110B formed in substrate
layer 108, and pixel separating areas 112 formed in substrate layer 108.

[0029] Microlens 102 may direct incident light towards a substrate area
between pixel separators 112. Color filter 104 may filter the incident
light by only allowing predetermined wavelengths to pass through color
filter 104 (e.g., color filter 104 may only be transparent to wavelengths
corresponding to a green color). Photo-sensitive areas 110A and 110B may
serve to absorb incident light focused by microlens 102 and produce image
signals that correspond to the amount of incident light absorbed.

[0030] A pair of pixels 100A and 100B may be covered by one microlens 102.
Thus, the pair of pixels may be provided with an asymmetric angular
response (e.g., pixels 100A and 100B may produce different image signals
based on the angle at which incident light reaches pixels 100A and 100B).
The angle at which incident light reaches pixels 100A and 100B may be
referred to herein as an incident angle, or angle of incidence.

[0031] In the example of FIG. 2B, incident light 113 may originate from
the left of a normal axis 116 and may reach a pair of pixels 100A and
100B with an angle 114 relative to normal axis 116. Angle 114 may be a
negative angle of incident light. Incident light 113 that reaches
microlens 102 at a negative angle, such as angle 114, may be focused
towards photosensitive area 110A, and pixel 100A may produce relatively
high image signals.

[0032] In the example of FIG. 2C, incident light 113 may originate from
the right of normal axis 116 and reach the pair of pixels 100A and 100B
with an angle 118 relative to normal axis 116. Angle 118 may be a
positive angle of incident light.

[0033] Incident light that reaches microlens 102 at a positive angle, such
as angle 118, may be focused towards photosensitive area 110B. In this
case, pixel 100B may produce an image signal output that is relatively
high.

[0034] Due to the special formation of the microlens, pixels 100A and 100B
may have an asymmetric angular response (e.g., pixels 100A and 100B may
produce different signal outputs for incident light with a given
intensity, based on an angle of incidence). The diagram of FIG. 3 shows
an example of the image output signals of pixel 100A in response to
varying angles of incident light. As shown, pixel 100A may produce
larger image signals for negative angles of incident light and smaller
image signals for positive angles of incident light. In other words,
pixel 100A produces larger image signals as the incident angle becomes
more negative.

[0035] FIG. 2D illustrates an adjacent pair of pixels (100A and 100B)
covered by the same microlens, in which pixel 100A is formed on the right
side of the pair, and pixel 100B is formed on the left side of the pair.
An adjacent pair of pixels, such as pixels 100A and 100B, may be referred
to herein as pixel pair 200. The two pixels of pixel pair 200 may also be
referred to herein as pixel type 1 and pixel type 2.

[0036] Incident light 113 that reaches the pair of pixels 100A and 100B
may have an angle of incidence that is approximately equal for both
pixels. In the arrangement of FIG. 2D, incident light 113 may be focused
by microlens 102A onto photosensitive area 110A in pixel 100A and
photosensitive area 110B in pixel 100B. In response to receiving incident
light 113, pixel 100A may produce an output image signal that is high,
and pixel 100B may likewise produce an output image signal that is high,
by design of the microlens.

[0037] The respective output image signals for pixel pair 200 (e.g.,
pixels 100A and 100B) are shown in FIG. 4. As shown, line 160 may reflect
the output image signal for pixel 100A and line 162 may reflect the
output image signal for pixel 100B. For negative angles of incidence, the
output image signal for pixel 100A may increase (because incident light
is focused onto photosensitive area 110A of pixel 100A) and the output
image signal for pixel 100B may decrease (because incident light is
focused away from photosensitive area 110B of pixel 100B). For positive
angles of incidence, the output image signal for pixel 100A may be
relatively small and the output image signal for pixel 100B may be
relatively large (e.g., the output signal from pixel 100A may decrease
and the output signal from pixel 100B may increase).

[0038] Line 164 of FIG. 4 may reflect the sum of the output signals for
pixel pair 200. As shown, line 164 may remain relatively constant
regardless of the angle of incidence (e.g., for any given angle of
incidence, the total amount of light that is absorbed by the combination
of pixels 100A and 100B may be constant).
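
The angular behavior described in FIGS. 3 and 4 can be illustrated with a
short numerical sketch. The following Python snippet is a minimal,
hypothetical model, not part of the patent disclosure: the bell-shaped
response curves, the ±10 degree peak positions, and the 15 degree width
are illustrative assumptions, used only to show how two mirrored
asymmetric responses (like lines 160 and 162) can sum to a roughly
constant total (like line 164).

```python
# A toy model of the angular responses in FIG. 4 (all numbers are assumed).
import numpy as np

angles = np.linspace(-30.0, 30.0, 121)  # incidence angle in degrees

def pixel_response(angle_deg, center_deg, width_deg=15.0):
    """Hypothetical bell-shaped pixel response peaked at center_deg."""
    return np.exp(-((angle_deg - center_deg) / width_deg) ** 2)

signal_100a = pixel_response(angles, center_deg=-10.0)  # like line 160
signal_100b = pixel_response(angles, center_deg=+10.0)  # like line 162
signal_sum = signal_100a + signal_100b                  # like line 164

# Near normal incidence the sum varies only slightly, even though the
# individual pixel signals change rapidly with angle.
print(signal_sum[50:71].std() / signal_sum[50:71].mean())
```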

[0039] Pixel pairs 200 may be used to form imagers with depth sensing
capabilities. FIGS. 5A, 5B and 5C show illustrative image sensors 14 with
depth sensing capabilities. As shown, image sensor 14 may contain an
array of pixels 201 formed from pixel pairs 200 (e.g., pixel pairs 200A,
200B, 200C, etc.). Image sensor 14 may have an associated camera lens 202
that focuses light originating from a scene of interest (e.g., a scene
that includes an object 204) onto the array of pixels. Camera lens 202
may be located at a distance DF from image sensor 14. Distance
DF may correspond to the focal length of camera lens 202.

[0040] In the arrangement of FIG. 5A, object 204 may be located at
distance D0 from camera lens 202. Distance D0 may correspond to
a focused object plane of camera lens 202 (e.g., a plane located at a
distance D0 from camera lens 202). The focused object plane and a
plane corresponding to image sensor 14 may sometimes be referred to as
conjugate planes. In this case, light from object 204 may be focused onto
pixel pair 200A, at an angle θ0 and an angle -θ0.
The image output signals of pixels 100A and 100B of pixel pair 200A may
be equal (e.g., most of the light is absorbed by pixel 100A for the
positive angle and most of the light is absorbed by pixel 100B for the
negative angle).

[0041] In the arrangement of FIG. 5B, object 204 may be located at a
distance D1 from camera lens 202. Distance D1 may be larger
than the distance of the focused object plane (e.g., the focused object
plane corresponding to distance D0) of camera lens 202. In this
case, some of the light from object 204 may be focused onto pixel pair
200B at a negative angle -θ1 (e.g., the light focused by the
bottom half pupil of camera lens 202) and some of the light from object
204 may be focused onto pixel pair 200C at a positive angle θ1
(e.g., the light focused by the top half pupil of camera lens 202).

[0042] In the arrangement of FIG. 5C, object 204 may be located at a
distance D2 from camera lens 202. Distance D2 may be smaller
than the distance of the focused object plane (e.g., the focused object
plane corresponding to distance D0) of camera lens 202. In this
case, some of the light from object 204 may be focused by the top half
pupil of camera lens 202 onto pixel pair 200B at a positive angle
θ2 and some of the light from object 204 may be focused by the
bottom half pupil of camera lens 202 onto pixel pair 200C at a negative
angle -θ2.

[0043] The arrangements of FIGS. 5A, 5B and 5C may effectively partition
the light focused by camera lens 202 into two halves split by a center
plane at a midpoint between the top of the lens pupil and the bottom of
the lens pupil (e.g., split into a top half and a bottom half). Each
pixel in the paired pixel array 201 may receive different amounts of
light from the top or bottom half of the lens pupil. For example, for an
object at distance D1, pixel 100A of pixel pair 200B may receive
more light than pixel 100B of pixel pair 200B. For an object at distance
D2, pixel 100A of pixel pair 200B may receive less light than pixel
100B of pixel pair 200B. The partitioning of the light focused by camera
lens 202 may be referred to herein as lens partitioning, or lens pupil
division.

[0044] The output image signals of each pixel pair 200 of image sensor 14
may depend on the distance from camera lens 202 to object 204. The angle
at which incident light reaches pixel pairs 200 of image sensor 14
depends on the distance between lens 202 and objects in a given scene
(e.g., the distance between objects such as object 204 and device 10).
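
Paragraphs [0040] through [0044] amount to a simple geometric relationship
between object distance and the angles reaching each pixel pair. The
following Python snippet is a minimal sketch under a thin-lens model; the
50 mm focal length and the 2 m focused object plane are illustrative
assumptions, not values from the patent.

```python
# Thin-lens sketch of FIGS. 5A-5C: the sign of the defocus indicates whether
# an object sits nearer than or beyond the focused object plane.

def image_distance(d_obj, f):
    """Thin-lens equation: 1/f = 1/d_obj + 1/d_img, solved for d_img."""
    return 1.0 / (1.0 / f - 1.0 / d_obj)

f = 0.050                          # assumed 50 mm camera lens 202
d0 = 2.0                           # assumed focused object plane (distance D0)
d_sensor = image_distance(d0, f)   # sensor 14 sits at the conjugate plane

for d_obj in (1.0, 2.0, 4.0):      # nearer than, at, and beyond distance D0
    defocus_mm = (d_sensor - image_distance(d_obj, f)) * 1e3
    # defocus > 0: object beyond the focused plane (FIG. 5B geometry)
    # defocus < 0: object nearer than the focused plane (FIG. 5C geometry)
    print(f"object at {d_obj} m -> defocus {defocus_mm:+.2f} mm")
```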

[0045] An image depth signal may be calculated from the difference between
the two output image signals of each pixel pair 200. The diagram of FIG.
6 shows an image depth signal that may be calculated for pixel pair 200B
by subtracting the image signal output of pixel 100B from the image
signal output of pixel 100A (e.g., by subtracting line 162 from line 160
of FIG. 4). As shown in FIG. 6, for an object at a distance that is less
than distance D0, the image depth signal may be negative. For an
object at a distance that is greater than the focused object distance
D0, the image depth signal may be positive.
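
The depth signal of FIG. 6 is simply a per-pair subtraction, which can be
written compactly. The Python snippet below is a minimal sketch: the array
names and the random stand-in data are assumptions, and a real pipeline
would first de-interleave the sensor readout into the two per-pixel-type
images.

```python
# Depth signal sketch: subtract pixel 100B outputs from pixel 100A outputs
# (i.e., line 162 from line 160 in FIG. 4), one value per pixel pair 200.
import numpy as np

rng = np.random.default_rng(0)
signal_100a = rng.random((4, 6))   # stand-in outputs of pixels 100A
signal_100b = rng.random((4, 6))   # stand-in outputs of pixels 100B

depth_signal = signal_100a - signal_100b
# Per FIG. 6: values below zero suggest an object nearer than distance D0,
# values above zero suggest an object beyond D0, and values near zero
# suggest an object close to the focused object plane.
```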

[0046] For distances greater than D4 or less than D3, the image
depth signal may remain constant. Pixels 100A and 100B may be unable to
resolve incident angles with magnitudes larger than the magnitudes of
angles provided by objects at distances greater than D4, or at
distances less than D3. In other words, a depth sensing imager may
be unable to accurately measure depth information for objects at
distances greater than D4, or at distances less than D3. The
depth sensing imager may be unable to distinguish whether an object is at
a distance D4 or a distance D5 (as an example). If desired, the
depth sensing imager may assume that all objects that result in an image
depth signal equivalent to distance D3 or D4 are at a distance
of D3 or D4, respectively.

[0047] To provide an imager 14 with depth sensing capabilities, two
dimensional pixel arrays 201 may be formed from various combinations of
depth sensing pixel pairs 200 and regular pixels (e.g., pixels without
asymmetric angular responses). For a more comprehensive description of
two dimensional pixel arrays 201, with depth sensing capabilities and
with regular pixels (e.g., pixels without asymmetric angular responses),
reference is made to Application Ser. No. 13/188,389, filed on Jul. 21,
2011, titled Imagers with Depth Sensing Capabilities, having common
inventors. That application is incorporated herein by reference in its
entirety.

[0048] It should be understood that the depth sensing pixels may be formed
with any desirable types of color filters. Depth sensing pixels may be
formed with red color filters, blue color filters, green color filters,
or color filters that pass other desirable wavelengths of light, such as
infrared and ultraviolet light wavelengths. If desired, depth sensing
pixels may be formed with color filters that pass multiple wavelengths of
light. For example, to increase the amount of light absorbed by a depth
sensing pixel, the depth sensing pixel may be formed with a color filter
that passes many wavelengths of light. As another example, the depth
sensing pixel may be formed without a color filter (sometimes referred to
as a clear pixel).

[0049] Referring now to FIG. 7, there is shown a perspective view of an
embodiment of the present invention. The pixel pair 302 is similar to
pixel pair 200 shown in FIG. 2D. The pixel pair includes left and right
pixels, sometimes referred to as pixel type-one and pixel type-two. As
shown in FIG. 7, a single microlens 300 (the same as microlens 102 in
FIG. 2D) is positioned above the left and right pixels so that the single
microlens spans across both pixels in the horizontal direction.

[0051] Referring now to FIG. 9, there is shown an asymmetric pixel
configuration that includes microlens 300 and pixel pair 302, similar to
the pixel configuration of FIG. 7. It will be appreciated that FIG. 9
shows four pixels, namely, pixels 316A and 316B forming one pair of
pixels on the left side of the figure and pixels 316A and 316B forming
another pair of pixels on the right side of the figure. As shown, each
microlens 300 covers two pixels in the horizontal direction. A
planarization layer 310 is disposed under each microlens 300. Below
planarization layer 310, there is shown a color filter 312 which spans
across two pixels 316A and 316B. Thus, color filter 312 is similar in
length to microlens 300 and covers a pixel pair (or a set of pixels).

[0052] Disposed between each color filter 312 and each pixel pair 316A and
316B are two light pipes (LPs). Each LP improves the concentration of
light that impinges upon its respective pixel. The LP not only improves
light concentration, but also reduces cross-talk and ensures good three
dimensional performance, even with very small pixel pitches, such as 1.4
microns or less.

[0053] As shown on the left side of FIG. 9, light enters pixel
photosensitive area 316B by way of LP 314B. Similarly, on the right side
of FIG. 9, light enters LP 314A and pixel photosensitive area 316A. It
will be appreciated that LP 314B, on the left side of the figure,
receives most of the light, because the light passing through microlens
300 is angled at a negative angle with respect to a vertical line through
microlens 300. In a similar way, the light on the right side of the
figure enters LP 314A, because the light passing through microlens 300
is angled at a positive angle with respect to a vertical line through
microlens 300.

[0054] FIG. 10 shows the same pixels as in FIG. 9, except that a side view
of the pixel pair is shown. As shown, microlens 300 only spans one pixel
in the vertical direction, or the column direction of a pixel array.
Accordingly, microlens 300 is effective in reducing cross-talk in the
vertical direction of the pixel array. Also shown in the figure is a
side view of LP 314 and pixel photosensitive area 316. In addition, light
is shown concentrated in LP 314 and passing into pixel photosensitive
area 316.

[0055] FIG. 11 shows the relative signal response versus the incident
angle of light entering a pixel pair. As shown, the right pixel (pixel
316B on the left side of FIG. 9) responds strongly when the light enters
at a negative angle with respect to a vertical line passing through
microlens 300. On the other hand, when the left pixel (pixel 316A on
the right side of FIG. 9) receives light at a positive angle with respect
to a normal passing through microlens 300, that pixel also responds
strongly. At normal incidence, however, the responses of the left and
right pixels are relatively low. It will be appreciated that if the two
pixels forming each pixel pair are summed in the horizontal direction, a
normal image may be formed. On the other hand, since the left and right
pixels form asymmetric pixel angular responses, the present invention
obtains depth sensing capabilities.

[0056] It will now be understood that an asymmetric angular response
stereo sensor is provided by the present invention. By having a 2×1
CFA pattern, as shown in FIG. 8, the present invention may process the
color normally for two separate images and obtain two separate Bayer
patterns, as shown in FIG. 12. Accordingly, the two pixel pairs shown on
the left side of FIG. 12 may be separated into two images (the left image
has two pixels and the right image has two pixels).

[0057] For example, the first pixel pair provides a green color; when the
pair is separated into left and right images, the present invention
provides a single green pixel for the left image and a single green pixel
for the right image. Similarly, when the second pixel pair, providing red
colors, is separated into left and right images, the present invention
forms a left image with a red color and a right image with a red color.
Thus, a 2×1 CFA pattern enables the present invention to form a
normal Bayer color process for two separate images (left and right Bayer
images), as shown in FIG. 12.
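
Splitting a 2×1 CFA readout into left and right Bayer images, or summing
each pair into a single image, is a simple de-interleaving step. The
Python snippet below is a minimal sketch; the assumption that the left
pixel of each pair occupies the even columns of the raw array is
illustrative, not specified by the patent.

```python
# Sketch of claim 7: two separated Bayer images, or one summed image.
import numpy as np

raw = np.arange(4 * 8, dtype=np.float64).reshape(4, 8)  # stand-in readout

left_image = raw[:, 0::2]      # left pixel of each 2x1 pair (assumed layout)
right_image = raw[:, 1::2]     # right pixel of each 2x1 pair
single_image = left_image + right_image   # summed pair -> one normal image
```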

[0058] Referring next to FIGS. 13A and 13B, there are shown two different
CFA/microlens arrangements, namely arrangement 1 in FIG. 13A and
arrangement 2 in FIG. 13B. It will be appreciated that each arrangement
includes microlenses that cover 2×1 pixels, as shown in FIG. 7. The
microlenses, however, are shown zigzag-shifted relative to each other by
one pixel in neighboring rows. These arrangements result in no resolution
loss in the horizontal direction and would be valuable for HD video
formats.

[0059] In arrangement 1, shown in FIG. 13A, the CFA pattern of the first
and second rows is GRGRGR . . . , and the CFA pattern of the third and
fourth rows is BGBGBG . . . . The 2×1 microlenses for the first and third
rows start from the first column, whereas the microlenses for the second
and fourth rows start one column earlier, or later. Therefore, the left
image pixel array is formed by pixels L1, L2, L3, L4, L5, L6, L7 and L8.
Similarly, the right image pixel array is formed by pixels R1, R2, R3,
R4, R5, R6, R7 and R8. The first Bayer pattern for the left image is
formed by Gr=L1 in the first row, R=L2 in the second row, B=L1 in the
third row, and Gb=L2 in the fourth row. The first Bayer pattern for the
right image is formed by Gr=R1 in the second row, R=R2 in the first row,
B=R1 in the fourth row, and Gb=R2 in the third row.

[0060] In arrangement 2, shown in FIG. 13B, the first and third rows are
an all-green CFA, the second row is an all-red CFA, and the fourth row is
an all-blue CFA. The 2×1 microlenses for the first and third rows
start from the first column, whereas the microlenses for the second and
fourth rows start one column earlier, or later. Therefore, the left image
pixel array is formed by pixels L1, L2, L3, L4, L5, L6, L7 and L8.
Similarly, the right image pixel array is formed by pixels R1, R2, R3,
R4, R5, R6, R7 and R8. The first Bayer pattern for the left image is
formed by Gr=L1 in the first row, R=L2 in the second row, Gb=L1 in the
third row, and B=L2 in the fourth row. The first Bayer pattern for the
right image is formed by Gr=R1 in the first row, R=R2 in the second row,
Gb=R1 in the third row and B=R2 in the fourth row.
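
Under either arrangement, recovering the left and right pixel arrays is a
matter of de-interleaving each row according to its microlens offset. The
Python snippet below is a minimal sketch of that bookkeeping; the exact
shift convention (odd-numbered rows offset by one column) and the
wrap-around handling are assumptions made for illustration.

```python
# Sketch of extracting left/right pixel arrays from zigzag-shifted rows
# (FIGS. 13A and 13B). Edge pixels wrap around here; a real pipeline
# would crop or pad the shifted rows instead.
import numpy as np

raw = np.arange(4 * 8, dtype=np.float64).reshape(4, 8)  # four rows (stand-in)

left_rows, right_rows = [], []
for r in range(raw.shape[0]):
    offset = r % 2                    # assumed: second and fourth rows shifted
    row = np.roll(raw[r], -offset)    # align the row to its microlens grid
    left_rows.append(row[0::2])       # left pixel under each 2x1 microlens
    right_rows.append(row[1::2])      # right pixel under each 2x1 microlens

left_image = np.stack(left_rows)      # e.g., pixels L1 through L8
right_image = np.stack(right_rows)    # e.g., pixels R1 through R8
```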

[0061] Referring again to FIGS. 9 and 10, it will be understood that
each microlens covers two pixels in the horizontal direction, but only
covers one pixel in the vertical direction. Furthermore, the radii of
curvature of each microlens in the two directions are different due to
processing limitations. The microlens material has an optical index (n)
in the range of 1.5 to 1.6. Furthermore, the LP may be filled with
material having a higher optical index (n greater than 1.6) than its
surrounding oxide material, which may have an optical index of 1.4 or
1.5. In this manner, the light is maintained within the LP.
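
The index values given above imply confinement by total internal
reflection at the light pipe sidewall. As a check, the critical angle
follows from Snell's law; the Python snippet below uses 1.60 for the core
and 1.45 for the surrounding oxide, which are representative picks from
the stated ranges rather than values fixed by the patent.

```python
# Critical angle at the light pipe / oxide interface (representative values).
import math

n_core = 1.60   # light pipe fill (text: n greater than 1.6; 1.60 used here)
n_clad = 1.45   # surrounding oxide (text: about 1.4 to 1.5)

critical_angle_deg = math.degrees(math.asin(n_clad / n_core))
print(f"critical angle ~ {critical_angle_deg:.1f} degrees from the normal")
# Rays striking the sidewall at angles beyond this are totally internally
# reflected, keeping the light inside the LP and reducing cross-talk.
```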

[0062] Although the invention is illustrated and described herein with
reference to specific embodiments, the invention is not intended to be
limited to the details shown. Rather, various modifications may be made
in the details within the scope and range of equivalents of the claims
and without departing from the invention.