Image device for generating depth images and related electronic device
20190387165 · 2019-12-19
Inventors
CPC classification
H04N13/161
ELECTRICITY
H04N2201/3254
ELECTRICITY
H04N13/254
ELECTRICITY
H04N5/2628
ELECTRICITY
H04N13/239
ELECTRICITY
H04N2201/3277
ELECTRICITY
H04N2013/0081
ELECTRICITY
International classification
H04N13/254
ELECTRICITY
H04N13/239
ELECTRICITY
Abstract
An image device for generating depth images includes at least two image capturers and a rotating device. When the rotating device rotates the at least two image capturers, multiple images captured by the at least two image capturers are utilized to generate a depth image, wherein a view angle corresponding to the depth image is not less than a view angle of each image capturer of the at least two image capturers.
Claims
1. An image device for generating depth images, comprising: at least two image capturers; and a rotating device, wherein when the rotating device rotates the at least two image capturers, multiple images captured by the at least two image capturers are utilized to generate a depth image, wherein a view angle corresponding to the depth image is not less than a view angle of each image capturer of the at least two image capturers.
2. The image device of claim 1, further comprising: a supporting unit coupled to the rotating device, wherein the at least two image capturers are installed on the supporting unit.
3. The image device of claim 1, wherein a rotation axis of the rotating device passes through an optical center of the each image capturer.
4. The image device of claim 1, wherein a processor stitches a plurality of first images captured by an image capturer of the at least two image capturers to generate a color image according to a feature point matching method or a fixed angle method, and stitches a plurality of second images captured by another image capturer of the at least two image capturers to generate the depth image according to the feature point matching method or the fixed angle method, wherein the another image capturer is different from the image capturer.
5. The image device of claim 4, wherein the processor is installed in the image device or outside the image device.
6. The image device of claim 4, wherein the color image and the depth image are integrated into a color and depth image, and the color and depth image is stored as a file.
7. The image device of claim 6, wherein when the color and depth image is stored as the file, the color image is compressed according to a Joint Photographic Experts Group (JPEG) format, the depth image is compressed according to a lossless format, and a compressed depth image is stored in a header of the file.
8. An image device for generating depth images, comprising: at least one image capturer; at least one light source emitting emission light; and a rotating device; wherein when the rotating device rotates the at least one image capturer and the at least one light source, multiple images captured by the at least one image capturer and the emission light are used for generating a depth image, wherein a view angle corresponding to the depth image is not less than a view angle of each image capturer of the at least one image capturer.
9. The image device of claim 8, wherein when the rotating device rotates the at least one image capturer and the at least one light source, a plurality of images comprising the emission light captured by an image capturer of the at least one image capturer, or flight time corresponding to the emission light passing from the at least one light source to each object and passing from the each object to the image capturer after the emission light is reflected by the each object, is used for generating the depth image.
10. The image device of claim 9, wherein when the plurality of images are used for generating the depth image, the emission light is structured light.
11. The image device of claim 8, further comprising: a supporting unit coupled to the rotating device, wherein the at least one image capturer and the at least one light source are installed on the supporting unit.
12. The image device of claim 8, wherein a rotating axis of the rotating device passes through an optical center of the each image capturer of the at least one image capturer.
13. An electronic device for viewing a panoramic color and depth image, comprising: a display; and a processor reading a 360 degree panoramic color image and a 360 degree panoramic depth image, wherein the 360 degree panoramic color image corresponds to the 360 degree panoramic depth image; wherein when the electronic device is moved or rotated, the processor further converts a part of the 360 degree panoramic color image corresponding to a view angle corresponding to motion or rotation of the electronic device into a planar color image and a part of the 360 degree panoramic depth image corresponding to the view angle into a planar depth image according to the view angle, combines the planar color image with the planar depth image to generate a planar color and depth image corresponding to the view angle, and makes the display display the planar color image according to the planar color and depth image.
14. The electronic device of claim 13, further comprising: an inertial sensor determining the view angle corresponding to the motion or the rotation of the electronic device according to the motion or the rotation of the electronic device.
15. The electronic device of claim 13, wherein each depth value of the 360 degree panoramic depth image corresponds to a virtual optical center which the electronic device acts as, and each depth value of the planar depth image corresponds to a virtual optical plane where the electronic device is located.
16. The electronic device of claim 13, wherein when the processor combines the planar color image with the planar depth image to generate the planar color and depth image, the processor converts the planar depth image into depth information corresponding to a virtual optical plane to make the planar color and depth image have the depth information.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0017] Please refer to
[0018] As shown in
[0019] After the processor 110 generates the 360 degree panoramic depth image, the processor 110 can combine the first 360 degree panoramic color image (or the second 360 degree panoramic color image) with the 360 degree panoramic depth image to generate a 360 degree panoramic color and depth image, and the processor 110 compresses the first 360 degree panoramic color image (or the second 360 degree panoramic color image) according to a standard compression format (e.g. a Joint Photographic Experts Group (JPEG) format) and compresses the 360 degree panoramic depth image according to a lossy format (or a lossless format) to integrate the 360 degree panoramic color and depth image into a file, wherein a compressed 360 degree panoramic depth image is stored in a header of the file. That is, the compressed 360 degree panoramic depth image is usually stored in a user defined header of the file. In addition, the standard compression format is not limited to JPEG, that is, the first 360 degree panoramic color image (or the second 360 degree panoramic color image) can also be compressed according to PNG, MPEG1, MPEG2, MPEG4, H.264, H.265, and so on. In addition, in another embodiment of the present invention, when the 360 degree panoramic color and depth image is integrated into the file, the 360 degree panoramic depth image is not compressed.
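A minimal sketch of storing a compressed depth image inside a user defined header segment of a JPEG file, as paragraph [0019] describes. The marker value (APP9), the `DEPTH0` tag, and the use of zlib as the lossless format are illustrative assumptions, not part of the disclosure:

```python
import struct
import zlib

APP9 = 0xFFE9      # hypothetical user-defined JPEG application marker
MAGIC = b"DEPTH0"  # hypothetical tag identifying the depth payload

def pack_depth_segment(depth_bytes: bytes) -> bytes:
    """Losslessly compress depth data and wrap it in an APP9 segment."""
    payload = MAGIC + zlib.compress(depth_bytes)  # zlib stands in for "a lossless format"
    # The JPEG segment length field counts itself (2 bytes) plus the payload.
    return struct.pack(">HH", APP9, len(payload) + 2) + payload

def insert_after_soi(jpeg: bytes, segment: bytes) -> bytes:
    """Place the segment right after the SOI marker, i.e. in the file header."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG stream"
    return jpeg[:2] + segment + jpeg[2:]

def extract_depth(jpeg: bytes) -> bytes:
    """Recover the depth data from the APP9 segment."""
    i = jpeg.find(struct.pack(">H", APP9))
    (length,) = struct.unpack(">H", jpeg[i + 2:i + 4])
    payload = jpeg[i + 4:i + 2 + length]
    assert payload.startswith(MAGIC)
    return zlib.decompress(payload[len(MAGIC):])
```

A decoder that does not recognize the APP9 segment simply skips it, so the color image remains viewable by ordinary JPEG readers.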
[0020] Please refer to
[0021] In addition, in another embodiment of the present invention, the processor 110 can optionally adjust intensity of the light source 302 according to luminance corresponding to each second image of the plurality of second images and a target value, wherein the target value is set according to a reflection coefficient of human skin of a user with respect to the structured light emitted by the light source 302. For example, the processor 110 can generate a luminance distribution map corresponding to the each second image according to the each second image, and optionally adjust the intensity of the light source 302 according to a percentage of the each second image occupied by an area, wherein a maximum luminance value of at least one luminance value of the area within the luminance distribution map is greater than the target value. In addition, in another embodiment of the present invention, the processor 110 can optionally adjust the intensity of the light source 302 according to average luminance of the each second image and the target value. In addition, in another embodiment of the present invention, the processor 110 can generate a luminance histogram corresponding to a plurality of pixels of the each second image according to the each second image, and optionally adjust the intensity of the light source 302 according to a median of the luminance histogram and the target value, or according to a predetermined quantile of the luminance histogram and the target value.
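The luminance-versus-target adjustment of paragraph [0021] can be sketched as follows; the step size, the multiplicative update, and the default median quantile are illustrative assumptions:

```python
import numpy as np

def adjust_intensity(intensity, image_gray, target, quantile=0.5, step=0.1):
    """Nudge the light-source intensity so that the chosen luminance quantile
    (the median by default) of the captured image approaches the target value.

    intensity:  current light-source intensity (arbitrary units)
    image_gray: 2-D array of pixel luminance values
    target:     desired luminance level (e.g. derived from skin reflectance)
    """
    level = float(np.quantile(image_gray, quantile))
    if level > target:
        intensity *= (1.0 - step)  # scene too bright: dim the source
    elif level < target:
        intensity *= (1.0 + step)  # scene too dark: brighten the source
    return intensity
```

Using the mean instead of a quantile, as the paragraph also permits, only changes the `level` computation.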
[0022] In addition, in another embodiment of the present invention, after the light source 302 is turned on, the processor 110 can optionally dynamically adjust the intensity of the light source 302 according to a distance between at least one predetermined object within the each second image and the image capturer 304 (or the image capturer 102) and a first lookup table, wherein the first lookup table stores relationships between distances corresponding to an object and the intensity of the light source 302. In addition, in another embodiment of the present invention, the processor 110 continuously detects the luminance of the environment in which the image device 300 is located while the light source 302 is turned off. When the luminance of the environment is higher, the processor 110 increases the intensity of the light source 302 (when the light source 302 is turned on) according to a second lookup table, wherein the second lookup table stores relationships between the intensity of the light source 302 (when the light source 302 is turned on) and the luminance of the environment.
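A sketch of the first lookup table of paragraph [0022], with piecewise-linear interpolation between entries; the table values (distance in centimeters mapped to intensity in percent) are purely illustrative assumptions:

```python
import bisect

# Hypothetical first lookup table: (object distance in cm, source intensity in %).
DIST_LUT = [(30, 20.0), (60, 40.0), (120, 70.0), (240, 100.0)]

def intensity_from_distance(distance_cm):
    """Interpolate the light-source intensity for a given object distance,
    clamping to the table's end values outside its range."""
    xs = [d for d, _ in DIST_LUT]
    ys = [i for _, i in DIST_LUT]
    if distance_cm <= xs[0]:
        return ys[0]
    if distance_cm >= xs[-1]:
        return ys[-1]
    k = bisect.bisect_right(xs, distance_cm)
    t = (distance_cm - xs[k - 1]) / (xs[k] - xs[k - 1])
    return ys[k - 1] + t * (ys[k] - ys[k - 1])
```

The second lookup table (ambient luminance to intensity) would follow the same interpolation pattern with luminance as the key.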
[0023] In addition, in the above-mentioned embodiments, when the processor 110 utilizes a first pulse width modulation signal in a continuous mode to adjust the intensity of the emitted light, the processor 110 can adjust the intensity of the emitted light by changing a duty cycle of the first pulse width modulation signal; in the above-mentioned embodiments, when the processor 110 utilizes a second pulse width modulation signal in a burst mode to adjust the intensity of the emitted light, the processor 110 can adjust the intensity of the emitted light by changing an enabling time of the second pulse width modulation signal; in the above-mentioned embodiments, when the processor 110 utilizes the first pulse width modulation signal and the second pulse width modulation signal to adjust the intensity of the emitted light, the processor 110 can adjust the intensity of the emitted light by simultaneously changing the enabling time of the second pulse width modulation signal and the duty cycle of the first pulse width modulation signal.
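The two adjustment knobs of paragraph [0023] act multiplicatively on the average emitted power: the burst-mode enabling time gates the continuous-mode PWM. A sketch, where the burst period value is an illustrative assumption:

```python
from dataclasses import dataclass

@dataclass
class PwmConfig:
    duty_cycle: float     # continuous-mode PWM duty cycle, 0..1
    enable_time: float    # burst-mode enabling time per period, in ms
    period: float = 10.0  # burst period in ms (illustrative value)

def mean_emitted_power(cfg: PwmConfig, peak_power: float) -> float:
    """Average optical power: the burst window gates the continuous PWM,
    so both knobs scale the emitted light multiplicatively."""
    return peak_power * cfg.duty_cycle * (cfg.enable_time / cfg.period)
```

Changing only `duty_cycle` models the continuous mode, changing only `enable_time` models the burst mode, and changing both models the combined adjustment the paragraph describes.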
[0024] In addition, subsequent operational principles of the image device 300 can be referred to those of the image device 100, so further description thereof is omitted for simplicity.
[0025] In addition, in another embodiment of the present invention, the light source 302 is applied to time of flight (TOF) ranging, wherein when the light source 302 is applied to the time of flight, the emission light is diffused light, that is, the emission light is uniform light. Therefore, the processor 110 can generate the 360 degree panoramic depth image according to a difference between a receiving time at which the image capturer 304 receives reflected light and a generating time corresponding to the emission light, wherein the reflected light is generated by at least one object reflecting the emission light, and meanwhile the image capturer 304 is a time of flight sensor. In addition, when the light source 302 is applied to the time of flight, operational principles of the processor 110 determining whether to adjust the intensity of the emission light can be referred to the above-mentioned corresponding descriptions, so further description thereof is omitted for simplicity.
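The time-of-flight conversion in paragraph [0025] follows from the round trip of the light: the object distance is half the path traveled during the measured time difference. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(emit_time_s: float, receive_time_s: float) -> float:
    """Time-of-flight depth: the light travels to the object and back,
    so the one-way distance is c * (t_receive - t_emit) / 2."""
    return C * (receive_time_s - emit_time_s) / 2.0
```

For example, a 10 ns round trip corresponds to roughly 1.5 m of depth, which illustrates the sub-nanosecond timing precision a TOF sensor needs.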
[0026] In addition, because the above-mentioned functions of the processor 110 are fully disclosed, one of ordinary skill in the art should easily utilize a field programmable gate array (FPGA) with the above-mentioned functions of the processor 110, or an application-specific integrated circuit (ASIC) with the above-mentioned functions of the processor 110, or a software module with the above-mentioned functions of the processor 110, or an analog integrated circuit with the above-mentioned functions of the processor 110 to realize the processor 110 according to corresponding descriptions of the above-mentioned functions of processor 110. Therefore, a corresponding structure of the processor 110 is omitted for simplicity.
[0027] In addition, in another embodiment of the present invention, the image capturer 102 and the image capturer 304 can act as a stereo camera (or a depth camera), and the emission light generated by the light source 302 is used for assisting the stereo camera. The processor 110 can utilize a time division multiplexing method to control the light source 302. When the processor 110 controls the light source 302 to generate the emission light, the image capturer 102 and the image capturer 304 act as the stereo camera, and when the processor 110 turns off the light source 302, images captured by the image capturer 102 and the image capturer 304 are used for generating a color image.
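The time division multiplexing of paragraph [0027] can be sketched as a per-frame schedule; the strict frame-by-frame alternation is an illustrative assumption, since the disclosure only requires that light-on frames feed depth generation and light-off frames feed color generation:

```python
def schedule_frames(n_frames: int):
    """Alternate the light-source state per frame: frames with the source on
    feed the stereo depth pipeline, frames with it off feed color capture."""
    plan = []
    for f in range(n_frames):
        source_on = (f % 2 == 0)
        plan.append(("depth" if source_on else "color", source_on))
    return plan
```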
[0028] In addition, please refer to
[0029] In addition, please refer to
[0030] Please refer to
[0031] When the electronic device 400 is moved or rotated (e.g. as shown in
[0032] In addition, because the planar color and depth image has the depth information of the planar depth image, when the electronic device 400 is applied to virtual reality (VR), augmented reality (AR), substitutional reality (SR), and mixed reality (MR), the processor 404 can convert the planar color and depth image into a corresponding left eye color image and a corresponding right eye color image according to the depth information of the planar depth image and the depth-image-based rendering, wherein as shown in
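A minimal sketch of the depth-image-based rendering mentioned in paragraph [0032]: each pixel is shifted horizontally by a disparity inversely related to its depth to synthesize one eye's view. The baseline value, the forward-warping scheme, and the absence of hole filling are simplifying assumptions; a practical renderer would also inpaint disoccluded pixels:

```python
import numpy as np

def dibr_shift(color, depth, eye, baseline_px=8.0):
    """Warp a color image into one eye's view.

    color:       2-D array of pixel values (grayscale for simplicity)
    depth:       2-D array of per-pixel depth values, same shape
    eye:         +1 for the left-eye view, -1 for the right-eye view
    baseline_px: disparity scale in pixels (illustrative value)
    """
    h, w = depth.shape
    out = np.zeros_like(color)  # unfilled pixels stay 0 (disocclusion holes)
    disparity = (baseline_px / np.maximum(depth, 1e-6)).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + eye * disparity[y, x]
            if 0 <= nx < w:
                out[y, nx] = color[y, x]
    return out
```

Nearer pixels (smaller depth) receive larger shifts, which is what produces the stereoscopic effect between the left-eye and right-eye color images.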
[0033] In addition, because the above-mentioned functions of the processor 404 are fully disclosed, one of ordinary skill in the art should easily utilize a field programmable gate array with the above-mentioned functions of the processor 404, or an application-specific integrated circuit with the above-mentioned functions of the processor 404, or a software module with the above-mentioned functions of the processor 404, or an analog integrated circuit with the above-mentioned functions of the processor 404 to realize the processor 404 according to corresponding descriptions of the above-mentioned functions of processor 404. Therefore, a corresponding structure of the processor 404 is omitted for simplicity.
[0034] To sum up, because the image device utilizes the rotating device to rotate the image capturers and utilizes the processor to generate a 360 degree panoramic depth image according to multiple images captured by the image capturers, compared to the prior art, the image device does not suffer the prior art problem that optical centers cannot overlap. In addition, because a 360 degree panoramic color and depth image generated by the present invention has depth information of a 360 degree panoramic depth image, compared to the prior art, when the electronic device displays the 360 degree panoramic color and depth image, the electronic device can display a corresponding planar color and depth image according to motion or rotation of the electronic device.
[0035] Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.