SYSTEM AND METHOD FOR PERSONALIZED AVATAR GENERATION, ESPECIALLY FOR COMPUTER GAMES
20170312634 · 2017-11-02
CPC classification
A63F13/63 (HUMAN NECESSITIES)
A63F2300/5553 (HUMAN NECESSITIES)
Abstract
A system and method for generating a 3D personalized avatar, including a computerized server, a computerized client device, a bidirectional communications channel between the server and the client device, a memory in the client device storing 3D scan data of at least part of a user's body, and a memory in the server storing the 3D scan data received from the client device. A plurality of 3D model data sets are stored in the server memory. A gaming system selector provides information about a gaming system selected for personalized avatar generation. A personalized 3D avatar generation engine is responsive to the selected gaming system for merging the user 3D scan data with a 3D model data set. An avatar package generator generates a personalized avatar package containing the merged data. An avatar package installer in the client device receives the package and makes the personalized 3D avatar accessible to the selected gaming system.
Claims
1. A system for generating a 3D personalized avatar for use in particular in gaming applications, comprising: a computerized server, a computerized client device, a bidirectional communications channel between said server and said client device, a memory in said client device, storing 3D scan data of at least part of a user's body, a memory in said server for storing said 3D scan data after transmission from said client device through said bidirectional communications channel, a plurality of 3D model data sets associated with a plurality of gaming systems, stored in said server memory, a gaming system selector for providing to the server information about a gaming system selected for personalized avatar generation, a personalized 3D avatar generation engine provided in said server and responsive to the selected gaming system for merging said user 3D scan data with a 3D model data set associated with the selected gaming system, an avatar package generator provided in said server for generating a personalized avatar package containing said merged data, and an avatar package installer provided in said client device for receiving said package from said server through said communications channel and for making the personalized 3D avatar accessible to the selected gaming system.
2. A system according to claim 1, wherein said user 3D scan data are unoriented scan data, and said server further comprises a 3D scan data analyzer configured for receiving from said client said unoriented 3D scan data, for generating and storing a plurality of 2D renderings of said unoriented 3D scan data from a corresponding plurality of different viewpoints, for performing image analysis on each 2D rendering for identifying and locating characteristic body and/or head areas, for selecting a best 2D view including the best found characteristic areas, and for processing said unoriented 3D scan data so that they refer to head or body axes.
3. A system according to claim 2, wherein said unoriented 3D scan data comprise head data and said 3D scan data analyzer is configured for identifying characteristic areas corresponding to eyes and mouth in said renderings.
4. A system according to claim 3, wherein said 3D scan data analyzer is configured for performing a fine reorientation of the unoriented 3D scan data from the positions of said characteristic areas in the best 2D view.
5. A system according to claim 1, wherein said server comprises a universal scan file generator for generating scan files containing data capable of being merged with a plurality of different formats corresponding to said 3D model data sets.
6. A system according to claim 5, wherein said universal scan file generator is capable of generating a first scan file of higher definition adapted for use by said avatar generation engine and a second scan file of lower definition adapted for display in a client device.
7. A system according to claim 1, wherein said server and said client device are configured for interactive avatar parameter adjustment by transmitting low-definition scan data from said server to said client device, for computing at the client side changes in the 3D scan aspect in response to parameter changes also made at the client side, and for displaying the changed 3D scan aspect as parameters are changed.
8. A system according to claim 7, wherein said client device is configured to transmit the final avatar parameters to said server, said parameters being used by said avatar generation engine for processing said user 3D scan data before merging.
9. A system according to claim 1, wherein said avatar generation engine is configured for determining whether 3D points are located within the scan area, or within the model area, or else in a transition area between the scan and the model, and selecting which model data are to be replaced with scan data or combined with scan data based on such determination, for generating a merged 3D structure.
10. A system according to claim 9, wherein the coordinates of a pair of boundaries are associated with the stored 3D models, a first boundary extending between the scan area and the transition area, and a second boundary extending between the transition area and the model area.
11. A system according to claim 10, wherein the coordinates of the merged 3D structure in the transition area are determined by interpolation between scan coordinates and model coordinates.
12. A system according to claim 11, wherein said interpolation uses interpolation coefficients that vary gradually from the first boundary to the second boundary to ensure a smooth shape transition between the 3D scan in the scan area and the 3D model in the model area.
13. A system according to claim 12, wherein said avatar generation engine is further configured to gradually mix the textures of the 3D scan and the textures of the 3D model in the transition area.
14. A computer-implemented method for generating a 3D personalized avatar for use in particular in gaming applications, comprising: generating and transmitting to a server 3D scan data of at least part of a user's body and storing said scan data in a server memory, providing a plurality of 3D model data sets associated with a plurality of gaming systems in said server memory, selecting a particular gaming system among said plurality of gaming systems, generating a personalized 3D avatar by merging said user 3D scan data with a 3D model data set associated with said selected gaming system, generating an avatar package containing said merged data in said server, transmitting said avatar package to a client device connectable to a gaming system of the selected type, and installing said avatar package in said client device for making the personalized 3D avatar accessible to said gaming system.
15. A method according to claim 14, wherein said user 3D scan data are unoriented scan data and the method further includes: generating from said unoriented 3D scan data a plurality of 2D renderings of said unoriented 3D scan data from a corresponding plurality of different viewpoints, performing image analysis on each 2D rendering for identifying and locating characteristic body and/or head areas, selecting a best 2D view including the best found characteristic areas, and processing said unoriented 3D scan data so that they refer to head or body axes.
16. A method according to claim 15, wherein said unoriented 3D scan data comprise head data and said characteristic areas correspond to eyes and mouth in said renderings.
17. A method according to claim 16, comprising the further step of performing a fine reorientation of the unoriented 3D scan data from the positions of said characteristic areas in the best 2D view.
18. A method according to claim 14, comprising the step of generating from said 3D scan data a universal scan file containing data capable of being merged with a plurality of different formats corresponding to said 3D model data sets.
19. A method according to claim 14, comprising the generation of a first scan file of higher definition adapted for use in avatar generation and a second scan file of lower definition adapted for display in a client device.
20. A method according to claim 14, comprising a further step of adjusting scan parameters by: transmitting a low-definition scan file from said server to said client device, performing parameter changes at said client device, computing at said client device changes in the 3D scan aspect in response to said parameter changes, and displaying on a client device display the correspondingly changing 3D scan aspect.
21. A method according to claim 20, comprising a further step of transmitting from said client device to said server the final avatar parameters, said parameters being inputted to the avatar generation step.
22. A method according to claim 14, wherein the avatar generation step comprises determining whether 3D points are located within the scan area, or within the model area, or else in a transition area between the scan and the model, and selecting which model data are to be replaced with scan data or combined with scan data based on such determination.
23. A method according to claim 22, wherein the coordinates of a pair of boundaries are associated with the stored 3D models, a first boundary extending between the scan area and the transition area, and a second boundary extending between the transition area and the model area.
24. A method according to claim 23, wherein the coordinates of the merged 3D structure in the transition area are determined by interpolation between scan coordinates and model coordinates.
25. A method according to claim 24, wherein said interpolation uses interpolation coefficients that vary gradually from the first boundary to the second boundary to ensure a smooth shape transition between the 3D scan in the scan area and the 3D model in the model area.
26. A method according to claim 25, wherein said avatar generation step further comprises gradually mixing the textures of the 3D scan and the textures of the 3D model in the transition area.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0059] Other aims, features and advantages of the present invention will appear more clearly from the following description of a preferred embodiment thereof, given by way of illustration only and made with reference to the appended drawings.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
1) Hardware Architecture
[0073] Referring to the drawings, the system comprises a client device 10 connected to a server 20 through a bidirectional communications channel.
[0074] Server 20 comprises a conventional computing architecture with a processor, memories and I/O circuits, together functionally defining a graphical user interface (GUI) generation unit 210 for providing a user interface to client device 10 and for collecting instructions therefrom, and an avatar generation engine 220 cooperating with a memory 230 for performing the various server-side methods described in the following.
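By way of illustration only, the following minimal Python sketch mirrors this functional split; all class and field names (ServerMemory, GuiGenerationUnit, AvatarGenerationEngine) are assumptions, not taken from the patent.

```python
# Illustrative sketch of the functional split of server 20; the patent
# does not specify an implementation, so every name here is an assumption.
from dataclasses import dataclass, field


@dataclass
class ServerMemory:
    """Memory 230: stores user scans and the per-game 3D model data sets."""
    scans: dict = field(default_factory=dict)    # user_id -> 3D scan data
    models: dict = field(default_factory=dict)   # game_id -> 3D model data set


class GuiGenerationUnit:
    """Unit 210: serves the user interface to client device 10 and
    collects instructions (e.g. the selected gaming system) from it."""

    def collect_instructions(self, request: dict) -> dict:
        return {"user_id": request["user_id"], "game_id": request["game_id"]}


class AvatarGenerationEngine:
    """Engine 220: merges a stored scan with the selected game's model."""

    def __init__(self, memory: ServerMemory) -> None:
        self.memory = memory

    def generate(self, user_id: str, game_id: str) -> dict:
        scan = self.memory.scans[user_id]
        model = self.memory.models[game_id]
        # Placeholder: the actual merging is method 180, described below.
        return {"scan": scan, "model": model}
```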
2) Overview
[0075] Now referring to the drawings, the overall process involves the following methods, each described in detail below: 3D scan analysis (method 130), universal scan file generation (method 140), scan parameter adjustment (method 150), avatar preview model generation (method 160), avatar package generation (method 170), the merging process (method 180) and avatar installation (method 190).
[0084] Both methods 160 and 170 rely on a merging process (method 180) for combining an avatar model corresponding to a selected game or game family with the universal 3D scan data generated at step 140, taking into account the adjustment parameters collected by the server at step 150.
[0085] The above methods will now be described in detail.
3) 3D Scan Analysis
[0086] Now referring to the drawings, method 130 analyzes the unoriented 3D scan data received from the client: a plurality of 2D renderings of the scan are generated from a corresponding plurality of different viewpoints, image analysis is performed on each rendering to identify and locate characteristic head areas such as the eyes and mouth, the best 2D view is selected, and the scan data are then processed so that they refer to head or body axes, a fine reorientation being derived from the positions of the characteristic areas in the best view.
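As a rough illustration of this best-view search, here is a hedged Python sketch; render_2d and detect_landmarks are hypothetical stand-ins for the rendering and image-analysis stages, which the patent does not tie to any particular tool.

```python
# Hedged sketch of the best-view search: rotate the scan through several
# yaw angles, score each 2D rendering by how well the eyes and mouth are
# detected, then reorient the raw scan data to the winning head axis.
# render_2d and detect_landmarks are hypothetical stubs.
import numpy as np


def render_2d(vertices: np.ndarray, yaw: float) -> np.ndarray:
    """Stub renderer: a real implementation would rasterize the scan."""
    return np.zeros((256, 256))


def detect_landmarks(image: np.ndarray) -> dict:
    """Stub detector: a real implementation would return confidence
    scores for the located characteristic areas (eyes, mouth)."""
    return {"eyes": 0.0, "mouth": 0.0}


def orient_scan(vertices: np.ndarray, n_views: int = 16) -> np.ndarray:
    best_score, best_yaw = -1.0, 0.0
    for yaw in np.linspace(0.0, 2.0 * np.pi, n_views, endpoint=False):
        score = sum(detect_landmarks(render_2d(vertices, yaw)).values())
        if score > best_score:
            best_score, best_yaw = score, yaw
    # Rotate the scan so that its coordinates refer to the head axes.
    c, s = np.cos(-best_yaw), np.sin(-best_yaw)
    rot = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    return vertices @ rot.T
```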
4) Method 140: Universal Scan File Generation
[0101] This method generates universal high-definition and low-definition data structures representative of the head with the proper orientation as determined at step 130. It includes the following steps:
[0102] step 141: a low-definition 2D scan thumbnail of the reoriented best image containing the face is generated, for use as explained below;
[0103] step 142: a normal map of the 3D scan in its original definition, as viewed from the head axis, is generated; this is done by parsing the scan polygons and writing the coordinates (x, y, z) of the normal vector interpolated to the position of each pixel in a texture;
[0104] step 143: the 3D scan data are decimated, e.g. with the commercially available VCG library, in order to simplify the subsequent processing while retaining a number of polygons large enough to preserve the details of the head;
[0105] step 144: a high-definition (HD) version of the 3D scan as obtained at step 143 is stored together with its normal map in an appropriate file format denoted UFF, such format being preferably universal in that it does not depend on a third-party library for its handling and is extensible; in addition, the format is preferably adapted for direct handling by a usual library such as WebGL on the client side; details of the format will be provided in the following;
[0106] step 145: the 3D scan data are further decimated to a polygon density compatible with computer or tablet browser display;
[0107] step 146: the texture scale in said 3D data is adapted for compatibility with browser display;
[0108] step 147: a low-definition (LD) version of the 3D scan as obtained in steps 145 and 146 is stored, without normal map, in the universal scan format UFF as further explained below, for use in avatar adjustment method 150 as described below.
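A minimal sketch of this two-tier generation follows, assuming a generic decimate() stand-in (the patent cites the VCG library but gives no API) and JSON as a placeholder for the UFF container.

```python
# Sketch of steps 141-147: produce an HD file (with normal map) for the
# avatar generation engine and an LD file (no normal map) for browser
# display. decimate() and the JSON container are placeholders.
import json
import numpy as np


def decimate(vertices: np.ndarray, faces: np.ndarray, target: int):
    """Placeholder for real mesh decimation (e.g. via the VCG library);
    a real decimator would re-mesh rather than truncate."""
    return vertices, faces[: min(len(faces), target)]


def build_scan_files(vertices, faces, normal_map, thumbnail):
    hd_v, hd_f = decimate(vertices, faces, target=100_000)   # step 143
    ld_v, ld_f = decimate(hd_v, hd_f, target=5_000)          # step 145
    hd_file = json.dumps({"vertices": hd_v.tolist(), "faces": hd_f.tolist(),
                          "normal_map": normal_map,          # step 144
                          "thumbnail": thumbnail})           # step 141
    ld_file = json.dumps({"vertices": ld_v.tolist(),         # step 147
                          "faces": ld_f.tolist()})           # no normal map
    return hd_file, ld_file
```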
5) Method 150: Scan Parameters Adjustment
[0109] Now referring to the drawings, method 150 allows interactive adjustment of avatar parameters at the client side: the low-definition scan file generated at step 147 is transmitted from the server to the client device, parameter changes are made by the user at the client side, the resulting changes in the 3D scan aspect are computed at the client side, and the changing 3D scan aspect is displayed as the parameters are changed (steps 151-155).
[0115] The information and data collected at steps 151-155 are transmitted from the client to the server and passed to the avatar generation engine 220, which then generates a new avatar configuration according to methods 160-180 as described in the following.
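The round trip can be pictured with this short sketch; the parameter names (scale, offset) are purely illustrative, as the patent does not enumerate the adjustable parameters.

```python
# Sketch of the adjustment loop (steps 151-155): the client recomputes and
# redisplays the low-definition scan locally on every parameter change,
# and only the final parameter values are sent back to the server.
# Parameter names are illustrative assumptions.
import numpy as np


def apply_parameters(ld_vertices: np.ndarray, params: dict) -> np.ndarray:
    """Client-side recomputation of the displayed 3D scan aspect."""
    v = ld_vertices * params.get("scale", 1.0)
    return v + np.asarray(params.get("offset", [0.0, 0.0, 0.0]))


# On confirmation, only `params` travels back; engine 220 then applies the
# same parameters to the high-definition scan before merging (claim 8).
final_params = {"scale": 1.05, "offset": [0.0, -0.01, 0.0]}
```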
6) Method 160: Avatar Preview Model Generation
[0116] As illustrated in the drawings, method 160 generates a preview model of the personalized avatar for display at the client side, relying on the merging process 180 invoked at step 164.
7) Method 170: Avatar Package Generation
[0124] Now referring to the drawings, method 170 generates the final avatar package in the native format of the selected game: the plugin associated with the selected game or game family is selected (step 171), the corresponding object model in the game native format is retrieved (step 172), the merging process 180 is applied (step 174), and the resulting avatar package is generated (step 175).
8) Method 180: Merging Process
[0130] The merging process 180 mentioned in steps 164 and 174 will now be described with reference to the drawings.
[0131] An object model such as mentioned in step 172, corresponding to a particular game or game family and in the native format thereof, has the following general structure:
[0132] it can be made of any number of 3D geometrical objects in mesh form;
[0133] each mesh can have any number of surfaces, and each surface can contain any number of polygons and display-related information (material type, texture, transparency, etc.);
[0134] each polygon is defined by at least 3 vertex identifiers;
[0135] a mesh contains minimum basic information for each vertex, i.e. vertex position, normal to the surface at the vertex, and texture data;
[0136] a mesh optionally contains additional information associated with each vertex, which will be interpolated by the merging process; such additional information includes for instance additional texture coordinates, tangent coordinates, bi-normal coordinates, bone weights for use by a skinning process, etc.
[0137] It should be noted that various related information that does not fit into the model format but needs to be included in the final package is kept and stored separately in the server storage in association with the model; such data include for instance material properties and certain geometrical data for use by the game rendering engine. These data are included in the package generated at step 175.
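The structure of [0131]-[0137] can be summarized by the following illustrative data classes; the field names are assumptions, since each game's native format differs.

```python
# Illustrative data classes mirroring the object model of [0131]-[0137];
# field names are assumptions, as actual native game formats differ.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class Vertex:
    position: Tuple[float, float, float]          # [0135] basic information
    normal: Tuple[float, float, float]
    uv: Tuple[float, float]
    extra_uvs: Optional[list] = None              # [0136] optional data to be
    tangent: Optional[Tuple[float, ...]] = None   # interpolated when merging
    binormal: Optional[Tuple[float, ...]] = None
    bone_weights: Optional[dict] = None           # for skinning


@dataclass
class Surface:
    polygons: List[List[int]]                     # [0134] >= 3 vertex ids each
    material: str = ""                            # [0133] display-related info
    texture: str = ""
    transparency: float = 0.0


@dataclass
class Mesh:
    vertices: List[Vertex]
    surfaces: List[Surface]


@dataclass
class ObjectModel:
    meshes: List[Mesh]                            # [0132] any number of meshes
    side_data: dict = field(default_factory=dict) # [0137] stored separately
```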
[0138] The merging process is implemented by the plugin selected at step 171, which is configured to generate new files in the game native format, taking into account the changes brought to the 3D data geometry.
[0139] For this purpose, a 3D mathematical model is pre-established for each 3D model, making it possible to determine whether a point having given 3D coordinates is located:
[0140] either within the scan area,
[0141] or within the model area,
[0142] or else in a transition area between the scan and the model,
and, in the latter case, to compute an interpolation coefficient between the scan and the model.
[0143] In one practical example, as illustrated in the drawings, the 3D scan covers the user's head and the 3D model provides the remainder of the body, the transition area corresponding to the neck.
[0144] In such case, the 3D mathematical model is capable of determining a first boundary, in the present species a first plane P1, in the top region of the neck and a second boundary, in the present species a second plane P2 preferably parallel to plane P1, in the bottom region of the neck.
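Under the plane-based embodiment above, the classification and interpolation coefficient can be sketched as follows; the representation of P1 and P2 by a shared unit normal n and offsets d1 > d2 is an assumption, since the patent does not fix one.

```python
# Sketch of the plane-based 3D mathematical model: classify a point against
# parallel planes P1 (offset d1) and P2 (offset d2 < d1) sharing a unit
# normal n, and derive the interpolation coefficient used in the merge.
import numpy as np


def classify(point, n, d1: float, d2: float):
    """Return (area, t): t is 0.0 at plane P1 and grows to 1.0 at plane P2,
    so the merged shape stays pure scan above P1 and pure model below P2."""
    s = float(np.dot(point, n))       # signed position along the normal
    if s >= d1:
        return "scan", 0.0            # above P1: scan area
    if s <= d2:
        return "model", 1.0           # below P2: model area
    return "transition", (d1 - s) / (d1 - d2)
```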
[0145] Once these planes have been defined, the merging process performs the following steps:
[0146] step 181: the 3D model is prepared for the merging:
[0147] all the geometry of the model located within the scan area (i.e. above plane P1) is removed;
[0148] the geometries of the scan data and the model data comprising all points located in the transition area between planes P1 and P2 are converted into a closed shape, so as to avoid display artifacts (holes in the display) generated by the fact that the scan cross-section in planes P1 and P2 is not identical to the model cross-section in these planes; this is done by closing the tubular geometries of the scan and model data in said transition area (corresponding to the neck) along said planes P1 and P2;
[0149] step 182: the scan data are injected into the 3D model by:
[0150] deleting all the scan geometry located in the model area (i.e. below plane P2);
[0151] decimating the remaining scan geometry so as to adapt the scan geometry (definition) to the technical requirements of the target game application; typically, this is done by using definition information associated with the model data and stored in the file in the UFF format;
[0152] enriching the scan vertex information in the scan area with the above-mentioned additional information missing from the scan data themselves but present in the model; this is performed by interpolating the values of the additional information based on distance from the scan surface;
[0153] adding the scan geometry to the model, with the vertex coordinates of the 3D scan unchanged;
[0154] generating a transition geometry in the transition area by interpolating the vertex coordinates of the 3D scan with those of the 3D model, the interpolation coefficients being small in the vicinity of plane P1 and progressively larger toward plane P2, so that the 3D scan data in the transition area progressively become adjusted to the 3D model data at plane P2, thus avoiding discontinuities;
[0155] step 183: the scan textures are added to the 3D model by:
[0156] rearranging each scan texture so that only the zones used by the scan after merging (i.e. head and neck areas in the present example) are used;
[0157] in the transition area, mixing the scan textures of the scan polygons with the model textures of the model polygons according to interpolation coefficients that vary gradually from the first boundary to the second boundary, so as to ensure a smooth visual transition between the scan textures used above plane P1 and the model textures used below plane P2, thus avoiding undesirable discontinuities;
[0158] recomputing the scan texture coordinates so that they correspond to the rearranged vertices in the transition area;
[0159] step 184: if the plugin associated with the game application supports facial animation (which is determined by a flag or equivalent contained in the plugin), then the following is performed:
[0160] the 3D scan polygons injected into the model are divided using the lip boundary segments determined at step 1334;
[0161] a geometry of the inter-lip space of the mouth is generated and injected into the model;
[0162] a set of 2D parameterizations of the geometry is computed in relation to certain interest zones of the face (eyes, mouth, . . . ); in particular, the influence of the head bone movements impacting facial animations is computed for each vertex, taking into consideration the distances between these vertices and the 2D parameterization of a head bone system stored in the model file in the UFF format.
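In the transition area, steps 182-183 reduce to a per-vertex blend governed by the interpolation coefficient. The sketch below assumes scan and model vertices (and their texture colors) have already been put into correspondence, which the patent leaves to the game-specific plugin.

```python
# Hedged sketch of the transition blend of steps 182-183: positions and
# textures both interpolate from pure scan at plane P1 (t = 0) to pure
# model at plane P2 (t = 1). Vertex correspondence is assumed given.
import numpy as np


def blend_transition(scan_v, model_v, scan_rgb, model_rgb, n, d1, d2):
    merged_v, merged_rgb = [], []
    for sv, mv, sc, mc in zip(scan_v, model_v, scan_rgb, model_rgb):
        s = float(np.dot(sv, n))
        # Coefficient clipped to [0, 1]: 0 above P1, 1 below P2.
        t = float(np.clip((d1 - s) / (d1 - d2), 0.0, 1.0))
        merged_v.append((1.0 - t) * np.asarray(sv) + t * np.asarray(mv))
        merged_rgb.append((1.0 - t) * np.asarray(sc) + t * np.asarray(mc))
    return np.array(merged_v), np.array(merged_rgb)
```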
[0167] The above model is only a possible embodiment, and the skilled person will be able to design other suitable 3D mathematical models (typically not based on separation planes) that ensure a smooth geometrical and visual transition between the scan area and the model area.
9) Method 190: Avatar Installation
[0168] Now referring to the drawings, method 190 covers the installation of the generated avatar package in the client device, making the personalized 3D avatar accessible to the selected game application.
[0169] It should be noted here that certain game applications allow direct avatar loading, while other game applications require a specific program for adding new avatars to the game. The game or game family information stored in server 20 includes a flag or the like giving such indication.
[0170] Method 190 comprises the following steps:
[0171] step 191: if the selected game application supports direct loading, the user connects with his client equipment to his user account in server 20, selects a game or game family, and then selects an existing avatar for this game/game family by browsing a menu or through avatar thumbnails; once an avatar is selected, the corresponding package stored in memory 230 of server 20 is downloaded to the client device, where the client operating system allows loading the package into the appropriate folder of the game application package;
[0172] step 192: if the selected game application does not support direct loading, the package downloading and installation in the game application are performed by a dedicated client program which selects and loads the plugin capable of performing the procedure required for entering the game data structure and installing the avatar into that data structure.
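The two installation paths can be sketched as follows; the folder layout, flag names and plugin loader are assumptions, since they depend on the target game application.

```python
# Sketch of method 190's two paths (steps 191-192). All paths, flags and
# the plugin loader are hypothetical; they depend on the game application.
import shutil
from pathlib import Path


def load_plugin(plugin_id: str):
    """Hypothetical loader for a game-specific plugin able to enter the
    game data structure and install the avatar there (step 192)."""
    raise NotImplementedError(plugin_id)


def install_avatar(package: Path, game_info: dict) -> None:
    if game_info.get("supports_direct_loading"):       # step 191
        target = Path(game_info["avatar_folder"])      # game's avatar folder
        shutil.copy(package, target / package.name)
    else:                                              # step 192
        load_plugin(game_info["plugin_id"]).install(package)
```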
[0173] The skilled person will be able to bring many changes and variants to the present invention as described above. In particular:
[0174] although the present invention has been described in its application to game programs executed on the client equipment 10, it can be extended to programs executed on dedicated game consoles; in such case, the avatar package will be transferred from the client equipment to the game console by appropriate means such as a Wi-Fi connection or a removable storage, and an avatar loading program will be executed in the game console;
[0175] although the present invention has been described in its application to face avatars, full-body avatars or avatars for other body parts can also be generated with the present invention; in this case, the transition zones between scan areas and model areas shall be determined as a function of the types of areas.