Apparatus and method for creating three-dimensional personalized figure
09846804 · 2017-12-19
Assignee
Inventors
- Seong-Jae Lim (Daejeon, KR)
- Bon-Woo Hwang (Daejeon, KR)
- Kap-Kee Kim (Daejeon, KR)
- Seung-Uk Yoon (Daejeon, KR)
- Hye-Ryeong Jun (Daejeon, KR)
- Jin-Sung Choi (Daejeon, KR)
- Bon-Ki Koo (Daejeon, KR)
CPC classification
Y02P90/02
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G06T19/20
PHYSICS
B33Y80/00
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/49023
PHYSICS
G06V20/653
PHYSICS
International classification
B33Y80/00
PERFORMING OPERATIONS; TRANSPORTING
G05B19/4093
PHYSICS
Abstract
Disclosed herein is an apparatus and method for automatically creating a 3D personalized figure suitable for 3D printing by detecting a face area and features for respective regions from face data acquired by heterogeneous sensors and by optimizing global/local transformation. The 3D personalized figure creation apparatus acquires face data of a user corresponding to a reconstruction target; extracts feature points for respective regions from the face data, and reconstructs unique 3D models of the user's face, based on the extracted feature points; creates 3D figure models based on the unique 3D models and previously stored facial expression models and body/adornment models; and verifies whether each 3D figure model has a structure and a shape corresponding to actual 3D printing, corrects and edits the 3D figure model based on results of verification, and outputs a 3D figure model corresponding to 3D printing.
Claims
1. A method for creating a three-dimensional (3D) personalized figure, the method being performed by an apparatus for creating a 3D personalized figure, comprising: acquiring face data captured by a plurality of heterogeneous sensors of a user corresponding to a reconstruction target; extracting feature points for respective regions from the face data, and reconstructing unique 3D models of the user's face, based on the extracted feature points; creating 3D figure models based on the unique 3D models and previously stored facial expression models and body/adornment models by: generating facial expressions that match an actually input facial expression of the user, based on the unique 3D models and the previously stored facial expression models; and selecting a facial expression that stochastically matches a facial expression of the actual user from among the created facial expressions, and creating the 3D figure models based on the selected facial expression; verifying whether each 3D figure model has a structure and a shape corresponding to actual 3D printing; correcting and editing the 3D figure model based on results of the verification; and outputting a 3D figure model corresponding to 3D printing, wherein reconstructing the unique 3D models of the user's face comprises: generating a front image based on the face data; detecting a face area from the front image; detecting feature points for respective regions from the face area; detecting 3D corresponding points of the face data corresponding to the feature points of a 3D standard model, based on the feature points for respective regions, and then matching and transforming appearance information of the 3D standard model; generating a face texture map using the transformed appearance information of the 3D standard model and the front image; and creating the unique 3D models of the user's face using the face texture map.
2. The method of claim 1, wherein reconstructing the unique 3D models of the user's face comprises creating the unique 3D models of the user's face based on a procedure of causing a color of a remaining area of the face, which is not captured, to match a color of the face area, which is captured, using the face texture map.
3. The method of claim 1, wherein creating the 3D figure models comprises: creating adorned unique 3D models by combining the previously stored body/adornment models with the unique 3D models; and selecting a body/adornment model that stochastically matches that of the actual user from among the adorned unique 3D models.
4. The method of claim 1, wherein the face data corresponds to a 3D unrefined mesh model.
5. An apparatus for creating a 3D personalized figure, comprising: an information acquisition unit for acquiring face data of a user corresponding to a reconstruction target from a plurality of heterogeneous sensors; a face reconstruction unit for extracting feature points for respective regions from the face data, and reconstructing unique 3D models of the user's face, based on the extracted feature points; a model creation unit for creating 3D figure models based on the unique 3D models and previously stored facial expression models and body/adornment models; a facial expression generation unit for generating facial expressions that match an actually input facial expression of the user, based on the unique 3D models and the previously stored facial expression models; a selection unit for selecting a facial expression that stochastically matches that of the actual user from among the created facial expressions, and creating the 3D figure models based on the results of selection; and a model verification unit for verifying whether each 3D figure model has a structure and a shape corresponding to actual 3D printing, correcting and editing the 3D figure model based on results of verification, and outputting a 3D figure model corresponding to 3D printing, wherein the face reconstruction unit comprises: a generation unit for generating a front image based on the face data; a face area detection unit for detecting a face area from the front image; a feature point detection unit for detecting feature points for respective regions from the face area; a matching and transformation unit for detecting 3D corresponding points of the face data corresponding to the feature points of a 3D standard model, based on the feature points for respective regions, and then matching and transforming appearance information of the 3D standard model; a texture map generation unit for generating a face texture map using the transformed appearance information of the 3D standard model and the front image; and a unique 3D model creation unit for creating the unique 3D models of the user's face using the face texture map.
6. The apparatus of claim 5, wherein the face data corresponds to a 3D unrefined mesh model.
7. The apparatus of claim 5, wherein the unique 3D model creation unit creates the unique 3D models of the user's face based on a procedure of causing a color of a remaining area of the face, which is not captured, to match a color of the face area, which is captured, using the face texture map.
8. The apparatus of claim 5, wherein the model creation unit comprises: a body/adornment model creation unit for creating adorned unique 3D models by combining the previously stored body/adornment models with the unique 3D models; wherein the selection unit selects a body/adornment model that stochastically matches that of the actual user from among the adorned unique 3D models.
9. The apparatus of claim 8, wherein the model creation unit is configured such that a facial expression model storage unit including the facial expression models and a body/adornment model storage unit including the body/adornment models are operated in conjunction with each other.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The above and other objects, features and advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
DESCRIPTION OF THE PREFERRED EMBODIMENTS
(10) The present invention will be described in detail below with reference to the accompanying drawings. Repeated descriptions and descriptions of known functions and configurations which have been deemed to make the gist of the present invention unnecessarily obscure will be omitted below. The embodiments of the present invention are intended to fully describe the present invention to a person having ordinary knowledge in the art to which the present invention pertains. Accordingly, the shapes, sizes, etc. of components in the drawings may be exaggerated to make the description clearer.
(11) Hereinafter, an apparatus and method for creating a 3D personalized figure according to preferred embodiments of the present invention will be described in detail with reference to the attached drawings.
(13) Referring to
(14) The information acquisition unit 100 acquires the face data (i.e., a 3D unrefined mesh model) of a user corresponding to a reconstruction target using heterogeneous sensors.
(15) The face reconstruction unit 200 extracts feature points for respective regions based on statistical feature information, and reconstructs unique 3D models of the user's face based on the extracted feature points.
(16) The model creation unit 300 creates various 3D figure models stochastically matching an actual user, based on the unique 3D models reconstructed by the face reconstruction unit 200, facial expression models stored in the facial expression model storage unit 31, and body/adornment models stored in the body/adornment model storage unit 32.
(17) The model verification unit 400 verifies whether each 3D figure model created by the model creation unit 300 has a structure and shape suitable for actual 3D printing, corrects and edits the 3D figure model based on the results of verification, and outputs a 3D figure model suitable for 3D printing.
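The four units described above form a sequential pipeline: acquisition, reconstruction, model creation, and verification. A minimal sketch of that flow follows; all function names and the stub stages are illustrative assumptions, not identifiers from the patent.

```python
# A minimal sketch of the four-stage pipeline: each stage stands in for one
# unit (100-400) described above and is passed in as a plain function.

def create_personalized_figure(sensor_inputs,
                               acquire, reconstruct, create_models, verify):
    """Chain the four units: acquisition -> reconstruction -> creation -> verification."""
    face_data = acquire(sensor_inputs)            # information acquisition unit 100
    unique_models = reconstruct(face_data)        # face reconstruction unit 200
    figure_models = create_models(unique_models)  # model creation unit 300
    return verify(figure_models)                  # model verification unit 400

# Usage with stub stages standing in for the real units:
result = create_personalized_figure(
    ["depth_scan", "rgb_image"],
    acquire=lambda s: {"mesh": "unrefined", "sources": s},
    reconstruct=lambda d: [{"face": d["mesh"]}],
    create_models=lambda u: [{"figure": m} for m in u],
    verify=lambda f: f,
)
```

The stages are injected as parameters only to keep the sketch self-contained; in the apparatus each stage is a dedicated hardware/software unit.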
(18) Below, the face reconstruction unit 200 will be described in detail with reference to
(20) Referring to
(21) The generation unit 210 generates a front image based on face data (i.e., a 3D unrefined mesh model).
(22) The face area detection unit 220 detects a face area from the front image.
(23) The feature point detection unit 230 automatically detects feature points for respective regions from the face area detected by the face area detection unit 220.
(24) The matching and transformation unit 240 detects 3D corresponding points of a 3D unrefined mesh model, corresponding to the feature points of a 3D standard model, based on feature points detected by the feature point detection unit 230, and matches and transforms the appearance information of the 3D standard model using a global and local transformation optimization technique.
(25) The texture map generation unit 250 generates a face texture map using both the transformed appearance information of the 3D standard model and the front image.
(26) The unique 3D model creation unit 260 creates unique 3D models of the user's face using a procedure of causing the color of the remaining area of the face, which is not captured, to match the color of the face area, which is captured, using the face texture map.
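The global and local transformation optimization applied by the matching and transformation unit 240 can be sketched as a two-stage fit: a global similarity (Procrustes/Kabsch) alignment of the standard model's feature points onto the detected 3D corresponding points, followed by a local per-point correction. This is an illustrative assumption; the patent does not disclose its exact optimizer.

```python
import numpy as np

def global_align(src, dst):
    """Global stage: similarity (scale/rotation/translation) fit of src onto dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - mu_s, dst - mu_d
    U, sig, Vt = np.linalg.svd(S.T @ D)               # cross-covariance SVD (Kabsch)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T           # optimal rotation
    c = (sig * [1.0, 1.0, d]).sum() / (S ** 2).sum()  # optimal scale
    return c * S @ R.T + mu_d

def global_local_align(src, dst, local_weight=1.0):
    """Global fit, then move each point along its residual toward its correspondence."""
    aligned = global_align(src, dst)
    return aligned + local_weight * (dst - aligned)
```

In practice the local stage would deform the full mesh smoothly around the feature points; the per-point residual step here only illustrates the global-then-local ordering.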
(27) Below, the model creation unit 300 will be described in detail with reference to
(29) Referring to
(30) The facial expression generation unit 310 transforms the appearance of each unique 3D model created by the face reconstruction unit 200, based on statistical feature-based 3D facial expression models stored in the facial expression model storage unit 31, and generates various facial expressions matching the actually input facial expression of the user.
(31) The body/adornment model creation unit 320 creates adorned unique 3D models by combining the unique 3D models created by the face reconstruction unit 200 with body/adornment models stored in the body/adornment model storage unit 32.
(32) The selection unit 330 selects a facial expression stochastically matching that of the actual user from among the facial expressions generated by the facial expression generation unit 310 and a body/adornment model stochastically matching that of the actual user from among the adorned unique 3D models created by the body/adornment model creation unit 320, and then creates a final 3D figure model based on the selected facial expression and body/adornment model. Generally, figures may be created in various shapes by adding bodies, hairstyles, beard/mustache, glasses, hats, accessories, etc. conforming to various themes.
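The "stochastic matching" performed by the selection unit 330 can be sketched as likelihood scoring: each candidate expression's feature vector is compared with the observed expression and the most probable candidate wins. The Gaussian scoring model and the feature vectors are illustrative assumptions; the patent does not specify the probability model.

```python
import math

def select_best(candidates, observed, sigma=1.0):
    """Return the candidate whose feature vector maximizes a Gaussian likelihood."""
    def likelihood(features):
        sq = sum((f - o) ** 2 for f, o in zip(features, observed))
        return math.exp(-sq / (2 * sigma ** 2))
    return max(candidates, key=lambda c: likelihood(c["features"]))

# Usage with two hypothetical expression candidates:
candidates = [
    {"name": "neutral", "features": [0.0, 0.1]},
    {"name": "smile",   "features": [0.9, 0.8]},
]
best = select_best(candidates, observed=[0.85, 0.75])
# best["name"] is "smile": its features lie closest to the observation
```

The same argmax-over-likelihood scheme would apply to selecting the body/adornment model.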
(33) Below, the model verification unit 400 will be described in detail with reference to
(35) Referring to
(36) The editing unit 410 corrects and edits the 3D figure model created by the model creation unit 300 in consideration of printing suitability (internal/external thicknesses, joint states, etc.), supportability, and safety which are required by various 3D printers.
(37) The verification unit 420 verifies the 3D printing suitability/supportability/safety of the 3D figure model corrected and edited by the editing unit 410, and outputs a final 3D figure model suitable for 3D printing from the results of verification.
(38) The verification unit 420 may verify the 3D printing suitability/supportability/safety of the 3D figure model by comparing printing values corresponding to the 3D figure model with previously stored reference values. Here, the printing values correspond to the concept of typical numerical values required when a 3D figure model is printed using a 3D printer. Further, the reference values correspond to the concept of numerical values for a structure and a shape corresponding to actual 3D printing.
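The comparison of printing values against stored reference values described above can be sketched as a threshold check. The specific metrics (wall thickness, joint clearance, overhang angle) and their limits are illustrative assumptions, not values from the patent.

```python
# Assumed minimum reference values for the target 3D printer.
REFERENCE = {
    "wall_thickness_mm": 1.0,
    "joint_clearance_mm": 0.3,
}
MAX_OVERHANG_DEG = 45.0  # supportability limit (assumed)

def verify_model(printing_values):
    """Return (passed, failures) for a dict of measured printing values."""
    failures = [k for k, ref in REFERENCE.items()
                if printing_values.get(k, 0.0) < ref]
    if printing_values.get("overhang_deg", 0.0) > MAX_OVERHANG_DEG:
        failures.append("overhang_deg")
    return (not failures, failures)

# Usage: a model with too-thin walls and an unsupported overhang fails.
ok, why = verify_model({"wall_thickness_mm": 0.8,
                        "joint_clearance_mm": 0.4,
                        "overhang_deg": 50.0})
# ok is False; why lists "wall_thickness_mm" and "overhang_deg"
```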
(39) In this way, the 3D personalized figure creation apparatus according to the embodiment of the present invention may reconstruct the appearance of a 3D face by matching and transforming a 3D standard model, into which statistical 3D facial feature vectors are incorporated, using unrefined mesh data of the face (face data of the user) acquired from various heterogeneous sensors, and may incorporate various 3D body/adornment models into the reconstructed 3D face appearance and transform the reconstructed 3D face appearance, thus automatically creating various 3D personalized figure models.
(40) Below, a method for creating a 3D personalized figure will be described in detail with reference to
(42) Referring to
(43) The 3D personalized figure creation apparatus extracts feature points for respective regions from the 3D unrefined mesh model acquired at step S100, based on statistical feature information, and reconstructs unique 3D models of the user's face based on the extracted feature points at step S200.
(44) The 3D personalized figure creation apparatus creates various 3D figure models stochastically matching the actual user, based on the unique 3D models reconstructed at step S200, facial expression models stored in the facial expression model storage unit 31, and body/adornment models stored in the body/adornment model storage unit 32 at step S300.
(45) The 3D personalized figure creation apparatus verifies whether each created 3D figure model has a structure and a shape suitable for actual 3D printing, corrects and edits the 3D figure model based on the results of verification, and then outputs a 3D figure model suitable for 3D printing at step S400.
(46) In greater detail, the 3D personalized figure creation apparatus may verify 3D printing suitability/supportability/safety by comparing printing values corresponding to the 3D figure model with previously stored reference values. Here, the printing values correspond to the concept of typical numerical values required when a 3D figure model is printed using a 3D printer, and the reference values correspond to the concept of numerical values corresponding to a structure and a shape for actual 3D printing.
(47) Below, a procedure of reconstructing a unique 3D model and a procedure of creating a 3D figure model will be described in detail with reference to
(49) Referring to
(50) The 3D personalized figure creation apparatus detects a face area from the front image at step S220.
(51) The 3D personalized figure creation apparatus automatically detects feature points for respective regions from the face area, detected at step S220, at step S230.
(52) The 3D personalized figure creation apparatus detects 3D corresponding points of the 3D unrefined mesh model corresponding to the feature points of a 3D standard model, based on the feature points detected at step S230, and thus performs matching and transformation on the appearance information of the 3D standard model using a global and local transformation optimization technique at step S240.
(53) The 3D personalized figure creation apparatus generates a face texture map using the appearance information of the 3D standard model transformed at step S240 and the front image at step S250.
(54) The 3D personalized figure creation apparatus creates unique 3D models of the face based on a procedure of causing the color of the remaining area of the face, which is not captured, to match the color of the face area, which is captured, using the face texture map, at step S260.
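The color-matching procedure at step S260 can be sketched as a mean-color transfer over the texture map: the uncaptured region is shifted so its average color matches the captured face area. Mean-shift transfer is an illustrative assumption; the patent does not name the exact algorithm.

```python
import numpy as np

def match_region_color(texture, captured_mask):
    """Shift uncaptured pixels so their mean color equals the captured mean."""
    out = texture.astype(float).copy()
    cap_mean = out[captured_mask].mean(axis=0)     # mean color of captured area
    uncap = ~captured_mask
    out[uncap] += cap_mean - out[uncap].mean(axis=0)
    return np.clip(out, 0, 255)

# Usage on a tiny 2x2 texture: row 0 is captured, row 1 is not.
tex = np.zeros((2, 2, 3))
tex[0] = [200.0, 150.0, 120.0]   # captured rows: skin-like color
tex[1] = [90.0, 90.0, 90.0]      # uncaptured rows: flat gray
mask = np.array([[True, True], [False, False]])
blended = match_region_color(tex, mask)
# blended[1] now matches the captured color [200, 150, 120]
```

A production implementation would blend smoothly at the region boundary rather than applying a uniform shift.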
(55) The 3D personalized figure creation apparatus generates various facial expressions of the unique 3D models created at step S260, and creates adorned unique 3D models by combining the body/adornment models with the unique 3D models created at step S260 at step S310.
(56) The 3D personalized figure creation apparatus selects a facial expression stochastically matching that of the actual user from among the facial expressions generated at step S310 and a body/adornment model stochastically matching that of the actual user from among the adorned unique 3D models, and creates a final 3D figure model based on the selected facial expression and body/adornment model at step S320.
(57) Below, a procedure of outputting a 3D figure model suitable for 3D printing will be described in detail with reference to
(59) Referring to
(60) The 3D personalized figure creation apparatus verifies the 3D printing suitability/supportability/safety of the 3D figure model, corrected and edited at step S410, at step S420.
(61) When verification has failed at step S420, the 3D personalized figure creation apparatus corrects and edits the 3D figure model again, whereas when the verification has succeeded at step S420, the 3D personalized figure creation apparatus outputs a final 3D figure model suitable for 3D printing at step S430.
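The correct-then-verify loop at steps S410 to S430 can be sketched as iterating until verification succeeds. The concrete correction step (thickening walls) and the attempt limit are illustrative assumptions.

```python
def finalize_figure(model, correct, verify, max_attempts=10):
    """Repeat correction until the model verifies, then return it."""
    for _ in range(max_attempts):
        model = correct(model)    # step S410: correct and edit
        if verify(model):         # step S420: verify printing suitability
            return model          # step S430: output final model
    raise RuntimeError("model could not be made printable")

# Usage with stubs: thicken walls until they reach a 1.0 mm minimum.
final = finalize_figure(
    {"wall_mm": 0.4},
    correct=lambda m: {"wall_mm": m["wall_mm"] + 0.2},
    verify=lambda m: m["wall_mm"] >= 1.0,
)
# final["wall_mm"] reaches the 1.0 mm minimum after three corrections
```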
(62) In this way, the present invention reconstructs a refined 3D personalized face appearance suitable for 3D printing through the transformation of a 3D standard model, by automatically detecting feature information based on statistical 3D facial features from unrefined face mesh data acquired from various heterogeneous sensors; it then generates a texture for the 3D face model from the color information of the face area in an input image, thereby reproducing the realistic color of the face. Further, the present invention may automatically create various 3D personalized figure models by incorporating facial expression/body/adornment models, such as various 3D facial expressions, bodies, hairstyles, beards/mustaches, glasses, hats, and accessories, into the reconstructed unique 3D personalized face model and by transforming it.
(63) Furthermore, the present invention may print the results of correction/editing of 3D personalized figure models by testing the 3D personalized figure models for suitability (appearance thickness, joint states, etc.), supportability, and safety for 3D printing, and may then utilize the 3D personalized figure models for various purposes such as the creation of 3D personalized figures and the setup of personal busts.
(64) In accordance with the present invention, the present invention may create 3D personalized figures suitable for 3D printing by detecting a face area and features for respective regions from face data acquired by heterogeneous sensors and by optimizing global/local transformation.
(65) Further, the present invention may accommodate various heterogeneous sensor inputs and automatically create various 3D personalized figures suitable for 3D printing.
(67) The apparatus for creating a 3D personalized figure may be implemented as a computer 500 illustrated in
(68) The apparatus for creating a 3D personalized figure may be implemented in a computer system including a computer-readable storage medium. As illustrated in
(69) At least one unit of the apparatus for creating a 3D personalized figure may be configured to be stored in the memory 523 and to be executed by at least one processor 521. Functionality related to the data or information communication of the apparatus for creating a 3D personalized figure may be performed via the network interface 529.
(70) The at least one processor 521 may perform the above-described operations, and the storage 528 may store the above-described constants, variables and data, etc.
(71) The methods according to embodiments of the present invention may be implemented in the form of program instructions that can be executed by various computer means. The computer-readable storage medium may include program instructions, data files, and data structures, solely or in combination. Program instructions recorded on the storage medium may have been specially designed and configured for the present invention, or may be known to or available to those who have ordinary knowledge in the field of computer software. Examples of the computer-readable storage medium include all types of hardware devices specially configured to record and execute program instructions: magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media, such as compact disk read-only memory (CD-ROM) and digital versatile disks (DVDs); magneto-optical media, such as floptical disks; and read-only memory (ROM), random access memory (RAM), and flash memory. Examples of the program instructions include machine code, such as code created by a compiler, and high-level language code executable by a computer using an interpreter. The hardware devices may be configured to operate as one or more software modules in order to perform the operation of the present invention, and vice versa.
(76) Although the present invention has been described in conjunction with the limited embodiments and drawings, the present invention is not limited thereto, and those skilled in the art will appreciate that various modifications, additions and substitutions are possible from this description. For example, even when described technology is practiced in a sequence different from that of a described method, and/or components, such as systems, structures, devices, units, and/or circuits, are coupled to or combined with each other in a form different from that of a described method and/or one or more thereof are replaced with one or more other components or equivalents, appropriate results may be achieved.
(77) Therefore, other implementations, other embodiments and equivalents to the claims fall within the scope of the attached claims.