ELECTRONIC DEVICE AND METHOD
20220057195 · 2022-02-24
CPC classification: G01J1/20 · G01B11/16 · G01B11/245 (Physics)
Abstract
An electronic device comprising a deformable object which is at least partially filled with a light-absorbing material, one or more light sources configured to illuminate the inside of the deformable object, an imaging unit configured to capture respective images of the light sources, and circuitry configured to reconstruct the shape of the deformable object based on the captured images of the light sources.
Claims
1. An electronic device comprising a deformable object which is at least partially filled with a light-absorbing material, one or more light sources configured to illuminate the inside of the deformable object, an imaging unit configured to capture respective images of the light sources, and circuitry configured to reconstruct the shape of the deformable object based on the captured images of the light sources.
2. The electronic device of claim 1, wherein the circuitry is configured to determine from the images of the light sources respective positions of the light sources and to determine the shape of the deformable object based on the positions of the light sources.
3. The electronic device of claim 1, wherein the circuitry is configured to determine an intensity of an image of a light source captured by an image sensor, and to determine a radial position of the light source based on the intensity.
4. The electronic device of claim 3, wherein the circuitry is configured to determine a position shift related to a light source based on the intensity of the image of the light source.
5. The electronic device of claim 4, wherein the circuitry is configured to determine an extinction value based on the intensity of the image of the light source, and to determine the position shift based on the extinction value.
6. The electronic device of claim 5, wherein the circuitry is configured to determine the extinction value according to E.sub.λ=log.sub.10(I.sub.0/I.sub.1), where I.sub.1 is the intensity of the image of the light source and I.sub.0 is a reference intensity.
7. The electronic device of claim 4, wherein the light-absorbing material comprises a light-absorbing dye distributed within the deformable object, and wherein the circuitry is configured to determine a radial position shift of the light source based on the intensity of the image of the light source according to Δr=E.sub.λ/(∈.sub.λ·c), where E.sub.λ is an extinction value determined from the intensity, ∈.sub.λ is the molar attenuation coefficient of the light-absorbing dye at wavelength λ, and c is the concentration of the light-absorbing dye.
8. The electronic device of claim 4, wherein the circuitry is configured to determine the radial position of the light source based on the position shift and a reference position.
9. The electronic device of claim 1, wherein the circuitry is configured to determine a position of an image of a light source on an image sensor and to determine a direction of the light source based on this position.
10. The electronic device of claim 1, wherein the light sources are embedded in the deformable object.
11. The electronic device of claim 1, wherein the circuitry is configured to reconstruct the shape of the object by reconstructing a 3D position for each of the light sources.
12. The electronic device of claim 2, wherein the circuitry is further configured to reconstruct a shape of the object based on an interpolation through positions of the light sources.
13. The electronic device of claim 1, wherein the imaging unit is located inside the object.
14. The electronic device of claim 1, wherein the imaging unit comprises a plurality of camera sensors and lens systems to capture a 360° angle of view.
15. The electronic device of claim 1, wherein the light sources are placed at the inner side of an outer part of the object.
16. The electronic device of claim 1, wherein the light sources are placed inside the object to illuminate a reflective layer of the deformable object.
17. The electronic device of claim 1, wherein the light sources are LEDs.
18. The electronic device of claim 1, wherein an interior part of the deformable object comprises a non-scattering, flexible, elastic and/or non-compressible polymer in combination with a homogeneously dissolved light-absorbing dye.
19. The electronic device of claim 1, wherein an outer part of the object comprises a flexible, stretchable and/or plastic material.
20. A method comprising illuminating, with one or more light sources, the inside of a deformable object which is at least partially filled with a light-absorbing material, capturing respective images of the light sources, and reconstructing the shape of the deformable object based on the captured images of the light sources.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Embodiments are explained by way of example with respect to the accompanying drawings, in which:
[0023] Before a detailed description of the embodiments is given with reference to the drawings, some general explanations are made.
[0024] The embodiments described below in more detail disclose an electronic device comprising a deformable object which is at least partially filled with a light-absorbing material, one or more light sources configured to illuminate the inside of the deformable object, an imaging unit configured to capture images of the light sources, and circuitry configured to reconstruct the shape of the deformable object based on the captured images of the light sources.
[0025] The electronic device may for example be used as a flexible user interface device whose outer shape can be changed. Recognition of the changed outer shape is of particular interest in gaming and entertainment, soft robotics and many other industry fields.
[0026] According to the embodiments, it is possible to realize a deformable user interface device which detects its deformation in real time and subsequently reconstructs its shape based on the captured images. Other methods for depth estimation, such as time-of-flight, may not be suitable due to the short distances involved.
[0027] Circuitry may include a processor, a memory (RAM, ROM or the like), a DNN unit, a storage, input means (mouse, keyboard, camera, etc.), output means (display (e.g. liquid crystal, (organic) light emitting diode, etc.), loudspeakers, etc.), a (wireless) interface, etc., as is generally known for electronic devices (computers, smartphones, etc.).
[0028] The circuitry may be configured to determine from the images respective positions of the light sources and to determine the shape of the deformable object based on the positions of the light sources.
[0029] The circuitry may be configured to determine an intensity of an image of a light source captured by an image sensor, and to determine a radial position of the light source based on the intensity.
[0030] The circuitry may be configured to determine a position shift related to a light source based on the intensity of the image of the light source.
[0031] The circuitry may be configured to determine an extinction value based on the intensity of the image of the light source, and to determine the position shift based on the extinction value.
[0032] The circuitry may be configured to determine the extinction value based on an intensity of the image of the light source, and a reference intensity.
[0033] The light-absorbing material may comprise a light-absorbing dye distributed within the deformable object, and the circuitry may be configured to determine a radial position shift of the light source based on the intensity of the image of the light source, on a molar attenuation coefficient of the light-absorbing dye, and on a concentration of the light-absorbing dye distributed within the deformable object.
[0034] The circuitry may be configured to determine the radial position of the light source based on the position shift and a reference position.
[0035] The circuitry may be configured to determine a position of an image of a light source on an image sensor and to determine a direction of the light source based on this position.
[0036] The light sources may be embedded in the deformable object.
[0037] The circuitry may be configured to reconstruct the shape of the object by reconstructing a 3D position for each of the light sources.
[0038] The circuitry may be configured to reconstruct a shape of the object based on an interpolation through positions of the light sources.
[0039] The imaging unit may be located inside the object.
[0040] The imaging unit may comprise a plurality of camera sensors and lens systems to capture a 360° angle of view.
[0041] The light sources may be placed at the inner side of an outer part of the object.
[0042] The light sources may be placed inside the object to illuminate a reflective layer of the deformable object.
[0043] The light sources may be LEDs.
[0044] According to an embodiment, an interior part of the deformable object comprises a non-scattering, flexible, elastic and/or non-compressible polymer in combination with a homogeneously dissolved light-absorbing dye.
[0045] According to an embodiment, the outer part of the object comprises a flexible, stretchable and/or plastic material.
[0046] The embodiments also relate to a method comprising illuminating, with one or more light sources, the inside of a deformable object which is at least partially filled with a light-absorbing material, capturing respective images of the light sources, and reconstructing the shape of the deformable object based on the captured images of the light sources.
[0047] The embodiments also relate to a computer-implemented method which comprises reconstructing the shape of the deformable object based on captured images of light sources.
[0048] Embodiments are now described by reference to the drawings.
DETAILED DESCRIPTION OF EMBODIMENTS
[0050] The non-scattering, flexible or elastic/viscoelastic non-compressible polymer of the interior part 101 may be transparent (in which case the light absorption is done only by the dye). The viscoelastic silicone polymer may be made of polydimethylsiloxane crosslinked with boric acid in various ratios and additives.
[0051] Light which is emitted by a light source attached to the inner side of the outer part 102, or light that is emitted by a light source in the imaging and processing unit 103, is captured by the imaging system in the imaging and processing unit 103. The light-absorbing dye, which is homogeneously dissolved in the interior part 101 of the deformable object 100, gradually absorbs the light and thus decreases its intensity. Based on the decreased intensity, the travelled distance of the light (depth information) is determined, and from this a shape of the deformable object is determined. Thereby, a real-time detection of the deformation and a shape recognition of the deformable object 100 can be realized.
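As a minimal illustrative sketch of this depth-from-intensity principle (the function name and numeric dye parameters below are assumptions for illustration, not part of the disclosure), the travelled distance can be recovered from the intensity drop under the Lambert-Beer law described further below:

```python
import math

# Illustrative sketch: recovering the distance travelled through the
# homogeneously dissolved light-absorbing dye from the drop in intensity,
# assuming the Lambert-Beer law log10(i_0 / i_1) = eps * c * d.
def travelled_distance(i_0, i_1, eps, c):
    """Distance d from emitted intensity i_0, measured intensity i_1,
    molar attenuation coefficient eps and dye concentration c."""
    extinction = math.log10(i_0 / i_1)
    return extinction / (eps * c)

# With eps * c = 2.0 per metre, a beam attenuated from 1.0 to 0.5 has
# travelled log10(2) / 2.0, i.e. roughly 0.15 m.
d = travelled_distance(1.0, 0.5, eps=2.0, c=1.0)
```

The inversion is well defined because the dye is homogeneously dissolved, so the extinction grows linearly with the travelled distance.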
[0052] The deformable object 100 with optical shape recognition may be used as a user interface device, for example as a controller for gaming applications or may be used for robotics applications. The deformable object 100 may have any default shape, like a sphere, a cube, a polyhedron or the like.
[0053] The light-absorbing dye may belong to one of the following dye classes: zinc chlorins and their free-base chlorin form (absorption maxima from ˜600 to ˜700 nm), zinc or free-base chlorin-imides (absorption maxima from ˜650 to ˜720 nm), zinc or free-base bacteriochlorins (absorption maxima from ˜700 to ˜800 nm), and zinc or free-base bacteriochlorin-imides (absorption maxima from ˜780 to ˜850 nm). Further NIR-active dyes include phthalocyanines, cyanine dyes and squaraine dyes.
[0054] The flexible, plastic outer part 102 may be made of an elastic polymer like silicone rubber or natural rubber.
[0056] With this described setup of camera sensors and lens systems, a full 360° angle of view of the space surrounding the imaging and processing unit 103 can be captured. The lens systems 202a-202d are respectively protected by transparent protection layers 203a-203d, which may for example be made of glass. Data collected by the camera sensors 201a-201d is sent to a processing unit 205 via an interface (for example a Camera Serial Interface) for further processing. The processing unit 205 may comprise a wireless interface, such as Bluetooth or wireless LAN, to send data to an outside processor. The distance d.sub.1 indicates the distance between the LED 204a and the transparent protection layer 203a (correspondingly, a distance between each transparent protection layer and the corresponding LED which illuminates it can be measured). The distance d.sub.1 is the mean travelled distance of all light beams emitted by the LED 204a and captured by the camera sensor 201a. The LED 204a emits light in several directions (for example within a 45° emission cone). This light is focused on the camera sensor 201a by the lens system 202a, where its intensity is measured (see below). It may be assumed that the distance d.sub.1 is the distance from the LED 204a through the centre of the lens system 202a to the camera sensor 201a. This may for example be ensured by a calibration process (see below).
[0057] The LED 204a emits light with a predetermined emission intensity I.sub.0 (for example predetermined by calibration). The camera sensor 201a captures an image of the LED 204a by detecting the light that is emitted by the LED 204a, and a measured intensity I.sub.1 on the camera sensor 201a can be determined. The light emitted by the LED 204a has travelled the distance d.sub.1 between the LED and the transparent protection layer 203a through the interior part 101, which consists of an elastic non-compressible polymer in combination with a homogeneously dissolved light-absorbing dye. The light emitted by the LED 204a is gradually absorbed by the homogeneously dissolved light-absorbing dye, whereby the emission intensity I.sub.0 decreases along the distance d.sub.1 to the measured intensity I.sub.1 on the camera sensor. The amount of light absorption, and therefore the loss of intensity, depends on the molar attenuation coefficient of the dye and the distance d.sub.1. Based on the measured intensity I.sub.1 and the Lambert-Beer law, a position shift of the light source may be determined (see below).
[0058] The same principle can be applied to all LEDs. By determining the coordinates of the plurality of LEDs 204a-204o, the shape of the object 100 may be reconstructed, for example by providing a set of coordinates (for example in 3D global coordinates) of the plurality of LEDs 204a-204o which are homogeneously spaced around the inner surface of the outer part 102 of the deformable object 100.
[0059] The measured light intensity translates to a voltage. Instead of the light intensity, the irradiance could also be measured, i.e. the light power per unit area (for example in W/cm²). The term "irradiance" of a light beam may refer to the amount of energy arriving on a surface with a given orientation, whereas the intensity of a light beam may be defined as the amount of energy passing through an area perpendicular to the beam.
[0060] For the detection of a deformation of the object, a shape reconstruction may be applied in real-time by constantly sending out light by the LEDs and measuring the distance of the LEDs based on the principle explained above.
[0064] The LED 304a emits light with a predetermined emission intensity I.sub.0 (for example predetermined by calibration). The camera sensor 301a captures an image of the LED 304a by detecting the light that is emitted by the LED 304a and reflected by the reflective layer 306, and a measured intensity I.sub.1 on the camera sensor 301a can be determined. The light emitted by the LED 304a has travelled the distance 2d.sub.2 between the LED, the reflective layer 306 and the transparent protection layer 303a through the interior part 101, which consists of an elastic non-compressible polymer in combination with a homogeneously dissolved light-absorbing dye. The light emitted by the LED 304a is gradually absorbed by the homogeneously dissolved light-absorbing dye, whereby the emission intensity I.sub.0 decreases along the distance 2d.sub.2 to the measured intensity I.sub.1 on the camera sensor 301a. The amount of light absorption, and therefore the loss of intensity, depends on the molar attenuation coefficient of the dye and the distance 2d.sub.2. Based on the measured intensity I.sub.1 and the Lambert-Beer law, the distance d.sub.2 may be determined, and from it the coordinates of the reflection point of the LED 304a at the reflective layer 306. The distance between the transparent protection layer 303a and the camera sensor 301a may be small and negligible compared to the distance 2d.sub.2, and the loss of intensity by absorption along this distance may likewise be negligible, because the material along this path (i.e. the material of the lens system 302a, for example glass, and the air between the elements 301a, 302a and 303a) has a very low light absorption.
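Since the light in this reflective-layer variant crosses the dye twice, the corresponding inversion simply carries a factor of two. A short sketch (hypothetical names and values, for illustration only):

```python
import math

# Illustrative sketch: in the reflective-layer embodiment the light travels
# from the LED to the reflective layer and back (total path 2 * d2), so the
# Lambert-Beer relation log10(i_0 / i_1) = eps * c * (2 * d2) is solved
# for the one-way distance d2.
def distance_to_reflective_layer(i_0, i_1, eps, c):
    return math.log10(i_0 / i_1) / (2.0 * eps * c)

# Same attenuation as in the single-pass case yields half the distance,
# reflecting the doubled optical path.
d2 = distance_to_reflective_layer(1.0, 0.5, eps=2.0, c=1.0)
```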
[0065] The same principle can be applied to all LEDs. By determining the coordinates of the plurality of reflection points of the LEDs 304a-p at the reflective layer 306, a shape of the deformable object 100 may be reconstructed, for example by providing a set of coordinates (for example in 3D global coordinates) of the plurality of reflection points of the LEDs 304a-p at the reflective layer 306 (which is attached to the inner surface of the outer part 102 of the deformable object 100).
[0066] For the detection of a deformation of the object, a shape reconstruction may be applied in real time by constantly sending out light from the LEDs and measuring the distances of the LEDs based on the principle explained above.
[0070]-[0072] [Descriptions of the imaging and processing unit 103 and of a hemispherical imaging system with micro-lenses, given with reference to the drawings, are not reproduced here.]
[0073] In another embodiment, full camera sensors (for example CMOS sensors) may be placed under the micro-lenses 401b, 401d, 401e, 401g, 401h and 401j instead of the photodiodes 403a-403f.
[0076] A cluster of pixels that are illuminated by light emerging from the same LED (for example one of the LEDs 204a-204o described above) constitutes the image of that LED in the captured camera image 601.
[0077] For the sake of clarity, only one pixel cluster is shown in the captured camera image.
[0078] By integrating the pixel intensities I(p) of all pixels p belonging to the pixel cluster, a cluster intensity I.sub.1 related to the cluster is obtained:
I.sub.1=Σ.sub.pI(p)
[0079] This cluster intensity I.sub.1 reflects the amount of light that is collected on the camera sensor and that originates from the LED.
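This integration amounts to a plain sum over the cluster pixels. A minimal sketch (the image array and cluster coordinates are made-up example data, not from the disclosure):

```python
import numpy as np

# Illustrative sketch: the cluster intensity I1 is the sum of the pixel
# intensities I(p) over all pixels p = (x, y) belonging to the cluster.
def cluster_intensity(image, cluster_pixels):
    return float(sum(image[y, x] for (x, y) in cluster_pixels))

image = np.zeros((8, 8))
image[3, 4] = 10.0   # pixel (x=4, y=3)
image[3, 5] = 6.0    # pixel (x=5, y=3)
image[4, 4] = 4.0    # pixel (x=4, y=4)
i_1 = cluster_intensity(image, [(4, 3), (5, 3), (4, 4)])  # 20.0
```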
[0080] A 2D position (b.sub.x, b.sub.y) of the pixel cluster on the captured camera image 601 is obtained by determining, for example, the centre of mass (with regard to the intensity) of the positions (x.sub.p, y.sub.p) of all pixels p in the cluster:
(b.sub.x,b.sub.y)=(1/I.sub.1)Σ.sub.p(x.sub.p,y.sub.p)I(p)
[0081] As an alternative to the centre of mass (with regard to the intensity), the centroid (geometric centre) of the cluster may be determined to obtain the position (b.sub.x, b.sub.y) of the pixel cluster.
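Both options can be sketched together; the function name and example pixels below are illustrative assumptions:

```python
import numpy as np

# Illustrative sketch: intensity-weighted centre of mass of the cluster
# pixels, or the plain geometric centroid when weighted=False.
def cluster_position(pixels, intensities, weighted=True):
    pts = np.asarray(pixels, dtype=float)
    if weighted:
        w = np.asarray(intensities, dtype=float)
        return tuple((pts * w[:, None]).sum(axis=0) / w.sum())
    return tuple(pts.mean(axis=0))

# Two pixels at x = 0 and x = 2: the centroid lies at x = 1, while with
# intensities 3 and 1 the centre of mass shifts toward the brighter pixel.
b_weighted = cluster_position([(0, 0), (2, 0)], [3, 1])          # (0.5, 0.0)
b_centroid = cluster_position([(0, 0), (2, 0)], [3, 1], False)   # (1.0, 0.0)
```

The centre of mass is less sensitive to dim stray pixels at the cluster edge, which is why it is the default here.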
[0082] At 701, a pixel cluster is detected in a captured camera image. At 702, a 2D position (b.sub.x, b.sub.y) of the pixel cluster on the image sensor is determined. At 703, a direction (θ, ϕ) to the LED is determined based on the 2D position (b.sub.x, b.sub.y) of the pixel cluster on the image sensor. At 704, the pixel cluster is associated with a specific LED, and a reference intensity I.sub.0 and a reference distance r.sub.0 predefined for this LED are obtained. At 705, the pixel intensities I(p) of all pixels p of the pixel cluster are integrated to determine a cluster intensity I.sub.1. At 706, an extinction value E.sub.λ of the LED light on its way from the LED through the light-absorbing dye to the image sensor is determined based on the cluster intensity I.sub.1 and the emission intensity I.sub.0. At 707, a radial position shift Δr is determined based on the extinction value E.sub.λ. At 708, a radial distance r of the LED is determined based on the reference distance r.sub.0 and the radial position shift Δr. In this way, a 3D position (r, θ, ϕ) of the LED is obtained.
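The steps 702-708 for a single LED can be sketched as follows. Two assumptions not stated explicitly in the text are made for illustration: a thin-lens model with image distance a for the direction, and the sign convention r = r.sub.0 + Δr with I.sub.0 taken as the intensity observed at the reference distance r.sub.0:

```python
import math

# Illustrative sketch of steps 702-708 for one LED (assumed conventions:
# thin-lens direction model, r = r_0 + delta_r, i_0 observed at r_0).
def led_3d_position(b_x, b_y, i_1, i_0, r_0, a, eps, c):
    theta = math.atan(math.hypot(b_x, b_y) / a)   # 702/703: polar angle
    phi = math.atan2(b_y, b_x)                    # 702/703: azimuth
    e_lambda = math.log10(i_0 / i_1)              # 706: extinction value
    delta_r = e_lambda / (eps * c)                # 707: radial shift
    r = r_0 + delta_r                             # 708: radial distance
    return r, theta, phi

# Undeformed case: the measured intensity equals the reference intensity,
# so the LED is found at the reference distance.
pos = led_3d_position(0.0, 0.0, i_1=1.0, i_0=1.0, r_0=0.1, a=0.01,
                      eps=2.0, c=1.0)             # (0.1, 0.0, 0.0)
```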
[0083] For associating a pixel cluster with a specific LED and with the reference intensity I.sub.0 and reference distance r.sub.0 predefined for this LED, a tracking of pixels and pixel clusters may be applied, such as described in the scientific paper "Detect to Track and Track to Detect" by Feichtenhofer, Pinz and Zisserman, Proceedings of the IEEE International Conference on Computer Vision, 2017, or in the scientific paper "Multi-object Detection and Tracking (MODT) Machine Learning Model for Real-Time Video Surveillance Systems" by Elhoseny, Circuits, Systems, and Signal Processing 39.2 (2020): 611-630.
[0084] In another embodiment, frequency modulation can be used to track the LEDs.
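One way such frequency modulation could work (an illustrative sketch, not a method stated in the text: each LED blinks at a distinct frequency, and the dominant frequency of a cluster's intensity over consecutive frames identifies the LED; frame rate and blink frequency are made-up values):

```python
import numpy as np

# Illustrative sketch: identify an LED by the dominant frequency of the
# cluster intensity over time, using an FFT of the frame-by-frame samples.
def dominant_frequency(intensity_over_time, frame_rate):
    samples = np.asarray(intensity_over_time, dtype=float)
    samples = samples - samples.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / frame_rate)
    return float(freqs[np.argmax(spectrum)])

# An LED blinking at 5 Hz, sampled at 60 frames per second for 2 seconds:
t = np.arange(120) / 60.0
trace = 1.0 + 0.5 * np.sin(2.0 * np.pi * 5.0 * t)
f = dominant_frequency(trace, 60.0)               # 5.0 Hz
```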
[0086] Aspects of this process are described in more detail below.
[0088] The light emitted by LED 204a travels through the light-absorbing dye and is then focused onto the camera sensor 201a by the lens system 202a, at a distance b.sub.x from the optical axis 802 of the lens system 202a. The lens system 202a is defined by a lens plane 801 (main plane) and an optical axis 802 which is orthogonal to the main plane 801. The intersection of the main plane 801 and the optical axis 802 is here denoted as the centre of the lens system and is treated as the origin of a polar coordinate system. With single thin lenses, the main plane can be approximated by the lens centre. The image distance a describes the distance, along the optical axis 802, between the main plane 801 and the image produced by the optical lens 202a on the sensor 201a.
[0090] The radial distance r of the LED is the distance between the light-emitting LED 204a and the origin (the centre of the lens system 202a).
[0091] On its way from the LED 204a through the light-absorbing dye to the camera sensor, the LED light is gradually absorbed by the homogeneously dissolved light-absorbing dye located in the deformable object 100, so that the intensity of the light decreases to a measured intensity I.sub.1, which is captured at the camera sensor 201a as described above.
[0092] If the deformable object is deformed, the position of the LED changes.
[0094] When the deformable object 100 is deformed by external forces, the LED moves from the reference position 204a to a new position 204a′.
[0095] The radial position shift Δr can be determined based on the Lambert-Beer law (see below) as
Δr=E.sub.λ/(∈.sub.λ·c)
[0096] where E.sub.λ=log.sub.10(I.sub.0/I.sub.1) is an extinction value at wavelength λ that is related to the position shift Δr.
[0097] For the sake of illustration, it is assumed here that the LED 204a emits light at a specific wavelength λ and that ∈.sub.λ is the molar attenuation coefficient at this predefined wavelength λ. The skilled person can, however, adapt the teaching presented here to cases where the LED 204a emits light at multiple wavelengths by considering the known wavelength-dependency of the molar attenuation coefficient.
[0098] The radial position of the LED 204a after the position shift Δr can thus be obtained from the predefined reference distance r.sub.0 as:
r=r.sub.0+Δr
[0099] An incident angle θ, which is indicative of the polar direction of the LED 204a with respect to the lens system 202a, may be determined from the position b.sub.x of the pixel cluster on the sensor 201a as
θ=arctan(b.sub.x/a)
[0100] where a is the predefined image distance of the lens system 202a, which is known as a configuration parameter of the lens system 202a.
[0101] With the same process as described above, applied to the second sensor coordinate b.sub.y, an azimuth angle ϕ of the LED 204a may be determined.
[0102] The tuple (r, θ, ϕ) is the 3D spherical coordinate of the light-emitting LED 204a that is associated with the pixel cluster on the camera sensor. This 3D spherical coordinate (r, θ, ϕ) can be transformed into a global coordinate system (e.g. a Cartesian coordinate system) by coordinate transformations which are well known to the skilled person. The transformation from a camera coordinate system to the global coordinate system can be performed for each camera coordinate system separately in order to perform sensor fusion of the data obtained from each of the camera sensors. That is, the process described above may be applied to each of the plurality of LEDs and to the images obtained from all camera sensors, so that all reconstructed 3D coordinates are transformed into the same global coordinate system.
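The first stage of this transformation, from spherical to Cartesian coordinates, can be sketched as follows (the subsequent per-camera rigid transform, with a rotation and offset from calibration, is omitted here; the values are illustrative):

```python
import math

# Illustrative sketch: converting the spherical coordinate (r, theta, phi)
# of an LED in one camera's coordinate system into Cartesian coordinates,
# with the optical axis taken as the z-axis.
def spherical_to_cartesian(r, theta, phi):
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z

# theta = 0 means the LED lies on the optical axis: (0, 0, r).
p = spherical_to_cartesian(2.0, 0.0, 0.0)         # (0.0, 0.0, 2.0)
```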
[0103] Based on the 3D coordinates of the LEDs, the shape of the object 100 is reconstructed. The shape of the object 100 may for example be defined by collecting the received coordinates of the plurality of LEDs (or of the plurality of reflection points of the LEDs at the reflective layer 306) in a set of 3D coordinates that describes the shape of the object. Optionally, interpolation techniques such as surface spline interpolation or the like may be applied to the set of 3D coordinates in order to define the shape of the object.
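A simple stand-in for such an interpolation through the reconstructed LED positions can be sketched as follows; inverse-distance weighting over directions is used here as an illustrative assumption in place of the surface-spline interpolation mentioned above, and all names and data are made up:

```python
import numpy as np

# Illustrative sketch: estimate the surface radius in an arbitrary query
# direction as an inverse-distance-weighted average of the radii of the
# reconstructed LED positions (each given as a direction and a radius).
def interpolate_radius(query_dir, led_dirs, led_radii, power=2.0):
    q = np.asarray(query_dir, dtype=float)
    q = q / np.linalg.norm(q)
    dirs = np.asarray(led_dirs, dtype=float)
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    d = np.linalg.norm(dirs - q, axis=1)
    hit = d < 1e-12                     # query direction hits an LED exactly
    if np.any(hit):
        return float(np.asarray(led_radii, dtype=float)[hit][0])
    w = 1.0 / d ** power
    return float(np.dot(w, led_radii) / w.sum())

# Two LEDs along +x and +y with radii 1.0 and 3.0: halfway between them
# the weights are equal, so the estimated radius is the mean, 2.0.
r_mid = interpolate_radius([1.0, 1.0, 0.0], [[1, 0, 0], [0, 1, 0]], [1.0, 3.0])
```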
[0106] According to the Lambert-Beer law, the extinction E.sub.λ (absorbance of the light-absorbing dye for light of the wavelength λ) is given by
E.sub.λ=log.sub.10(I.sub.0/I.sub.1)=∈.sub.λ·c·d
where I.sub.0 is the intensity of the incident (radiated) light, expressed for example in W·m.sup.−2, I.sub.1 is the intensity of the transmitted light, expressed for example in W·m.sup.−2, c is the concentration of the light-absorbing dye, ∈.sub.λ is the molar attenuation coefficient (or absorptivity) of the light-absorbing dye at a wavelength λ (for example a NIR wavelength), and d (also called the optical path) is the layer thickness of the material, expressed for example in metres. If the concentration is given in moles per unit volume, ∈.sub.λ is given as the decadic molar extinction coefficient, for example in the unit m.sup.2·mol.sup.−1.
[0107] Graph 901 schematically shows a function which maps the layer thickness (the distance d that a light beam travels through the light-absorbing dye) to the extinction E.sub.λ that occurs along this travelled distance d in the light-absorbing dye.
Implementation
[0109] It should be recognized that the embodiments describe methods with an exemplary ordering of method steps. The specific ordering of method steps is, however, given for illustrative purposes only and should not be construed as binding.
[0110] It should also be noted that the division of the electronic device into units is only made for illustration purposes, and that the present disclosure is not limited to any specific division of functions in specific units.
[0111] All units and entities described in this specification and claimed in the appended claims can, if not stated otherwise, be implemented as integrated circuit logic, for example, on a chip, and functionality provided by such units and entities can, if not stated otherwise, be implemented by software.
[0112] In so far as the embodiments of the disclosure described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a transmission, storage or other medium by which such a computer program is provided are envisaged as aspects of the present disclosure.
[0113] Note that the present technology can also be configured as described below:
[0114] (1) An electronic device comprising a deformable object (100) which is at least partially filled with a light-absorbing material, one or more light sources (204; 304; 403) configured to illuminate the inside of the deformable object (100), an imaging unit (103) configured to capture respective images of the light sources (204; 304; 403), and circuitry configured to reconstruct the shape of the deformable object (100) based on the captured images of the light sources (204; 304; 403).
[0115] (2) The electronic device of (1), wherein the circuitry is configured to determine from the images of the light sources (204; 304; 403) respective positions (r, θ, ϕ) of the light sources (204; 304; 403) and to determine the shape of the deformable object (100) based on the positions (r, θ, ϕ) of the light sources (204; 304; 403).
[0116] (3) The electronic device of (1) or (2), wherein the circuitry is configured to determine an intensity (I.sub.1) of an image of a light source (204; 304; 403) captured by an image sensor (601), and to determine a radial position (r) of the light source (204; 304; 403) based on the intensity (I.sub.1).
[0117] (4) The electronic device of (3), wherein the circuitry is configured to determine a position shift (Δr) related to a light source (204; 304; 403) based on the intensity (I.sub.1) of the image of the light source (204; 304; 403).
[0118] (5) The electronic device of (4), wherein the circuitry is configured to determine an extinction value (E.sub.λ) based on the intensity (I.sub.1) of the image of the light source (204; 304; 403), and to determine the position shift (Δr) based on the extinction value (E.sub.λ).
[0119] (6) The electronic device of (5), wherein the circuitry is configured to determine the extinction value (E.sub.λ) according to
E.sub.λ=log.sub.10(I.sub.0/I.sub.1)
[0120] where I.sub.1 is the intensity of the image of the light source (204; 304; 403), and where I.sub.0 is a reference intensity.
[0121] (7) The electronic device of any one of (4) to (6), wherein the light-absorbing material comprises a light-absorbing dye distributed within the deformable object (100), and wherein the circuitry is configured to determine a radial position shift (Δr) of the light source (204; 304; 403) based on the intensity (I.sub.1) of the image of the light source (204; 304; 403) according to
Δr=E.sub.λ/(∈.sub.λ·c)
[0122] where ∈.sub.λ is the molar attenuation coefficient of the light-absorbing dye at wavelength λ, and c is the concentration of the light-absorbing dye distributed within the deformable object (100).
[0123] (8) The electronic device of any one of (4) to (7), wherein the circuitry is configured to determine the radial position (r) of the light source (204; 304; 403) based on the position shift (Δr) and a reference position (r.sub.0).
[0124] (9) The electronic device of any one of (1) to (8), wherein the circuitry is configured to determine a position (b.sub.x, b.sub.y) of an image of a light source (204; 304; 403) on an image sensor (601) and to determine a direction (θ, ϕ) of the light source (204; 304; 403) based on this position (b.sub.x, b.sub.y).
[0125] (10) The electronic device of any one of (1) to (9), wherein the light sources (204; 304; 403) are embedded in the deformable object (100).
[0126] (11) The electronic device of any one of (1) to (10), wherein the circuitry is configured to reconstruct the shape of the object (100) by reconstructing a 3D position (r, θ, ϕ) for each of the light sources (204; 304; 403).
[0127] (12) The electronic device of any one of (2) to (11), wherein the circuitry is further configured to reconstruct a shape of the object (100) based on an interpolation through positions (r, θ, ϕ) of the light sources (204; 304; 403).
[0128] (13) The electronic device of any one of (1) to (12), wherein the imaging unit (103) is located inside the object (100).
[0129] (14) The electronic device of any one of (1) to (13), wherein the imaging unit (103) comprises a plurality of camera sensors (201) and lens systems (202) to capture a 360° angle of view.
[0130] (15) The electronic device of any one of (1) to (14), wherein the light sources (204) are placed at the inner side of an outer part (102) of the object (100).
[0131] (16) The electronic device of any one of (1) to (15), wherein the light sources (304; 403) are placed inside the object (100) to illuminate a reflective layer (306) of the deformable object (100).
[0132] (17) The electronic device of any one of (1) to (16), wherein the light sources (204; 304; 403) are LEDs.
[0133] (18) The electronic device of any one of (1) to (17), wherein an interior part (101) of the deformable object (100) comprises a non-scattering, flexible, elastic and/or non-compressible polymer in combination with a homogeneously dissolved light-absorbing dye.
[0134] (19) The electronic device of any one of (1) to (18), wherein an outer part (102) of the object (100) comprises a flexible, stretchable and/or plastic material.
[0135] (20) A method comprising illuminating, with one or more light sources (204; 304; 403), the inside of a deformable object (100) which is at least partially filled with a light-absorbing material, capturing respective images of the light sources (204; 304; 403), and reconstructing the shape of the deformable object (100) based on the captured images of the light sources (204; 304; 403).