SYSTEM AND METHOD FOR DEVELOPING APPLICATIONS
20170357489 · 2017-12-14
Inventors
CPC classification
International classification
Abstract
A system and method for developing an application. The system comprises a user interface, an object module for creating an object comprising an attribute and a function returning a value for the attribute, an activity module for defining an object's behavior, a simulation module comprising a rule governing the object's environment, and a build module, for compiling the application. The method comprises the steps of defining a plurality of objects and a relationship between the objects, extrapolating a logical clause based on the relationship, and building an application consistent with the logical clause.
Claims
1-10. (canceled)
11. A system for developing an application, comprising: a user interface; an object module, capable of creating an object comprising an attribute and a function returning a value for the attribute using the user interface; an activity module, capable of defining the object's behavior, using the user interface; a simulation module, comprising a rule governing the object's environment; and a build module, capable of compiling the application.
12. The system of claim 11, wherein the object, object behavior, and rule are preset, user-configurable, or user created.
13. The system of claim 11, wherein the object is a visual object, an audio object, an action, a conditional, an expression, an attribute, a user action, or a state.
14. The system of claim 11, wherein the object is a game element, an instance of the game element, or a user interaction.
15. The system of claim 11, further comprising: a relationship module, capable of creating functions returning a value for a relationship between objects, between object behaviors, or between an object and an object behavior.
16. The system of claim 15, further comprising: an artificial intelligence module; a plurality of scenarios, each comprising a unique value for at least one variable; wherein the artificial intelligence module is capable of extrapolating a logical clause from the relationship; and wherein the artificial intelligence module is capable of creating a scenario interpolated from the plurality of scenarios.
17. The system of claim 16, wherein the artificial intelligence module is capable of extrapolating logical clauses from a truth table comprising permutations of the relationships.
18. A method of developing an application, comprising the steps of: creating a relationship between objects; and compiling the objects and the relationship.
19. The method of claim 18, wherein at least one of the objects is a user interaction.
20. The method of claim 18, wherein the step of compiling further comprises creating a first scenario comprising objects and relationships.
21. The method of claim 20, further comprising the step of automatically extrapolating a second scenario by altering a variable in the first scenario.
22. A method of automating application development, comprising the steps of: a user defining a scenario, comprising a plurality of objects and a relationship between the objects; a computer extrapolating a first logical clause based on the relationship; and the computer building an application consistent with the first logical clause.
23. The method of claim 22, wherein the first logical clause is extrapolated from a truth table comprising the relationship.
24. The method of claim 22, wherein the user defines a plurality of scenarios and a new scenario is interpolated.
25. The method of claim 22, wherein the user selects a parameter to modify and wherein a new scenario with a modified parameter is generated.
26. The method of claim 22, wherein the application is executed with a simulated user and wherein an outcome is recorded.
27. The method of claim 26, wherein a second logical clause is extrapolated from the outcome and wherein the method is iterated with the building of the application consistent with the first and second logical clauses.
28. The method of claim 25, wherein the user defines a numerical range for the parameter and the new scenario is generated with the parameter modified to a value within the numerical range.
29. The method of claim 28, wherein the user assigns numerical weights to minimum and maximum values of the numerical range and weights of intermediate values are extrapolated.
30. The method of claim 28, wherein the user selects a plurality of parameters to modify and wherein weights of the values of the parameters are averaged for an average weight and the new scenario is generated with the average weight.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The foregoing summary, as well as the following detailed description of presently preferred embodiments of the invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there are shown in the drawings embodiments which are presently preferred. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown.
[0012] In the drawings:
[0030] To facilitate an understanding of the invention, identical reference numerals have been used, when appropriate, to designate the same or similar elements that are common to the figures. Further, unless stated otherwise, the features shown in the figures are not drawn to scale, but are shown for illustrative purposes only.
DETAILED DESCRIPTION OF THE DRAWINGS
[0031] Certain terminology is used in the following description for convenience only and is not limiting. The article “a” is intended to include one or more items, and where only one item is intended the term “one” or similar language is used. Additionally, to assist in the description of the present invention, words such as top, bottom, side, upper, lower, front, rear, inner, outer, right and left are used to describe the accompanying figures. The terminology includes the words above specifically mentioned, derivatives thereof, and words of similar import.
[0032] Referring to
[0033] The software system has many aspects that are divided into various blocks for easier description and understanding. Blocks 101-114 each provide objects that may be preset, user-configurable, or entirely user created. These objects can be visual or audio objects, actions, conditionals, expressions, attributes, user actions, and states. For example, an object can be a game character, a character's movement, a character's ability, a player gesture such as a touchscreen swipe, or a state specifying a combination of aspects including but not limited to the aforementioned. Not all blocks are necessary, some blocks may be combined, and other blocks are possible in other embodiments.
[0034] Block 101 shows an interface modifier. The interface modifier is responsible for interpreting and managing object interface reference data.
[0035] Block 102 is a behavior configure. The behavior configure is responsible for activating an object's visual, audio, and response characteristics. Block 103 is a graphic modifier. The graphic modifier is responsible for interpreting and managing user data relative to graphics, “look and feel,” and audio. Block 104 is an event modifier. The event modifier is responsible for interpreting and managing user data relative to events, cyclic procedures, and actions. Block 105 is an input modifier. The input modifier is responsible for interpreting and managing user interface input data. Block 106 is a dynamics modifier. The dynamics modifier is responsible for interpreting and managing user dynamics data.
[0036] The interface modifier is a subcomponent of the attribute module 107. The attribute module 107 manages types, identifiers, and auxiliary components associated with an object. Types include but are not limited to alphanumeric elements and container-hybrid elements. Identifiers associate data with parameter access, allowing for value read/write or modification. Auxiliary components allow an object to integrate elements defined in another object.
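The attribute module's three mechanisms described above (types, identifiers for parameter read/write, and auxiliary components that pull in another object's elements) might be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not part of the disclosed system.

```python
class Attribute:
    """Hypothetical attribute record: a value reachable by identifier."""
    def __init__(self, identifier, value):
        self.identifier = identifier  # associates data with parameter access
        self.value = value            # readable and writable via the module


class AttributeModule:
    """Sketch of the attribute module (107): manages identifiers and
    auxiliary components associated with an object."""
    def __init__(self):
        self._attrs = {}

    def define(self, identifier, value):
        self._attrs[identifier] = Attribute(identifier, value)

    def read(self, identifier):
        # identifier-based value read
        return self._attrs[identifier].value

    def write(self, identifier, value):
        # identifier-based value modification
        self._attrs[identifier].value = value

    def integrate(self, other):
        """Auxiliary component: integrate elements defined in another object,
        without overwriting elements already defined locally."""
        for ident, attr in other._attrs.items():
            self._attrs.setdefault(ident, attr)
```

A character object could, for example, define a "speed" attribute and integrate a "color" attribute defined on a separate object.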
[0037] Block 108 is a semantic link. The semantic link is responsible for managing semantic associations, linking data elements or data sources to specific reference identifiers. The semantic link and the behavior configure are subcomponents of the state module 113. The state module 113 manages a behavioral pattern through object trait activation. Subcomponents manage elements controlling an object's life-cycle within an environment. An object may modify parameters or choose to transition into other patterns. Pattern elements form a pseudo-template stored and tagged for use during initialization processes.
[0038] The state module 113 lets users configure objects that define an application entity's overall look and feel, and action-response mechanisms.
[0039] Users configure presentation layers and behavior patterns that can be linked to specific state objects. The objects function as attribute containers. They help users organize an entity's characteristics into manageable groups and assign each an identifier. The identifiers facilitate state-to-state interaction as well as the trigger mechanisms associated with each state.
[0040] Within each state object, references to a subset of the system attributes available for activation exist for configuration. These include but are not limited to a number of attributes that relate to an entity's overall appearance, interaction, and behavior within the game environment. A trigger configuration subsection is also available to reference action elements that help control entity behavior.
[0041] Users can also link custom parameters to specific states for initialization and update. A state object can enter a number of phases during an application's execution. A custom parameter can be assigned specific values during each phase of the state's life-cycle.
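The phase-specific parameter binding described above might be sketched as follows. The class, method, and phase names ("init", "active") are illustrative assumptions, not part of the disclosed system.

```python
class StateObject:
    """Sketch of a state object whose life-cycle passes through phases;
    a custom parameter can be assigned a specific value during each phase."""
    def __init__(self, name):
        self.name = name
        self.phase = None
        self._phase_params = {}  # phase -> {parameter: value}

    def bind(self, phase, parameter, value):
        """Link a custom parameter value to a specific phase."""
        self._phase_params.setdefault(phase, {})[parameter] = value

    def enter(self, phase):
        """Transition into a phase; the values bound to that phase
        become the active parameter set."""
        self.phase = phase
        return dict(self._phase_params.get(phase, {}))


# Illustrative use: stamina takes different values per life-cycle phase.
hero = StateObject("swimming")
hero.bind("init", "stamina", 100)
hero.bind("active", "stamina", 60)
```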
[0042] A variety of other configurable options are available to help drive entity presentation and behavior, including but not limited to active audio and visual components, active physics behavior and interaction, active device interface behavior, active parameter initialization, trigger actions, and trigger events.
[0043] Through the state configuration module, a user can create and control every aspect of an entity's presentation layer, i.e., the way it looks in the game, as well as its behavior at any given time during a game application's run loop.
[0044] The graphic modifier is a subcomponent of the profile module 109. The profile module 109 manages audio and visual object traits. Configurable characteristics include but are not limited to appearance and acoustics parameters, file-referencing, and conversion mechanisms.
[0045] The event modifier is a subcomponent of the activity module 110. The activity module 110 manages actions, expressions, and conditional object traits. Expressions define evaluable statements containing numbers, symbols, and operators. Conditionals represent an aggregate value defined as an evaluation of an expression element, conditional element, or hybrid element. Actions initiate processes to execute a command series. Commands interact with various parameters and other elements to implement object behavior.
[0046] Block 111 is a device link. The device link 111 is responsible for linking and accessing device attributes. The device link 111 and the input modifier are subcomponents of the interaction module 114. The interaction module 114 manages object device-interaction behavior. The module 114 processes action-response links, allowing for device component analysis including gesture and accelerometer analysis, through specialized input mechanisms.
[0047] The dynamics modifier 106 is a subcomponent of the simulation module 112. The simulation module 112 manages object behavior within a simulated environment. Configurable attributes include but are not limited to physics-relative parameters, collision-response mechanisms, and constraints.
[0048] The attribute module 107, state module 113, profile module 109, activity module 110, interaction module 114, and simulation module 112 communicate with data sources 115. The data sources 115 include but are not limited to various databases, both internal and external.
[0049] Referring to
[0050] The data sources 115 communicate with a document module 122, a build module 123, and an artificial intelligence (“AI”) module 124.
[0051] The document module 122 manages project data relative to specification, tracking, scheduling, and reporting. The document module 122 contains a data catalog 116 and a report generator 117. The data catalog 116 manages data descriptors and other information types to facilitate the documentation process. The report generator 117 manages reporting tools and metadata. The reporting tools process the metadata to produce and present analytical information relative to a project.
[0052] The build module 123 contains an application controller 118 and a platform configure 119. The build module 123 manages project application code to be deployed to a device. Data is tagged, queried, and used to construct structures required to build project application functionality. Structures are aggregated to a file and copied to a target device or machine for launch during a deployment phase. The application controller 118 manages platform project integration. Platforms vary in operation and functionality. The process identifies and implements integration hooks to facilitate application/platform communication. The platform configure 119 manages platform environment configuration. The target environment is determined through device queries and other system specific parameters. The configuration is derived from an environment analysis and helps to produce cross-platform executable files.
[0053] The AI module 124 contains a semantic interpreter 120 and a utility generator 121 as subcomponents. The AI module 124 manages configurations to control utility agent behavior. The configurations associated with the agent can be adjusted to influence the analysis task. The agent analyzes data to generate code to build a unique execution path, building a unique user experience. The process allows for multiple code versions to derive from a single source. The semantic interpreter 120 manages data relative to heuristic semantic interpretation. The data associates various components to semantic identifiers. The identifiers are used during agent processing to access elements assigned a specific meaning. Associations linked to the semantics are implemented in the agent's functionality. The utility generator 121 manages performance measure data used to govern agent processes. The data also helps identify important elements ranked according to desirability. The elements are tracked, mapped, and queried to facilitate path execution selection.
[0054] In game development, the AI module 124 automates the process of level design. In a normal situation, a developer would build the game objects needed for an application. Then, for every level planned, the developer would have to arrange the participating objects in their proper locations within the level and configure their behavior to deliver the desired effect. The AI module 124 automates this process, significantly cutting development time.
[0055] Referring to
[0056] The AI module 124 examines user-defined scenarios to build a knowledge base that will be used to create composite data.
[0057] Referring to
[0058] Next, functions are defined to register with the AI module 124 the names of the operators that will work with parameter data. Lastly, semantic links are defined to establish the context meaning of data, i.e., interpretation of the data.
[0059] Referring to
[0060] Referring to
[0061] Referring to
[0062] The query functions examine parameters and relationships to produce a result that the AI can interpret. The format of the result depends on how the user defines the function, the parameter type, or, in the case of a relationship, the parameter type of the relationship.
[0063] In
[0064] “iamGuy,” “iamGold,” and “iamWater” are configured as parameters and treated like properties, while “canSwim” and “canGrab” are configured as relationships. Relationships define a relation between two objects. In this example, the relationship “canSwim” is the function act_canSwim(swimmer, swim_in).
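The distinction above between parameters (single-object properties) and relationships (functions over two objects) might be sketched as follows, using the names from the example. The dictionary representation and the act_canGrab body are illustrative assumptions; only the names "iamGuy," "iamGold," "iamWater," "canSwim," "canGrab," and act_canSwim(swimmer, swim_in) come from the text.

```python
# Parameters behave like properties of a single object.
properties = {
    "guy":   {"iamGuy": True},
    "gold":  {"iamGold": True},
    "water": {"iamWater": True},
}

# A relationship relates two objects; "canSwim" is the function
# act_canSwim(swimmer, swim_in).
def act_canSwim(swimmer, swim_in):
    return (properties[swimmer].get("iamGuy", False)
            and properties[swim_in].get("iamWater", False))

# "canGrab" is likewise a relationship between a grabber and a grabbed object.
def act_canGrab(grabber, grabbed):
    return (properties[grabber].get("iamGuy", False)
            and properties[grabbed].get("iamGold", False))
```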
[0065] Referring to
[0066] Referring to
[0067] Clauses govern the behavior of the AI during level construction. When the AI is executing a task that requires decision-making, it queries the clauses to determine the correct course of action. The clauses represent the logic that the AI follows to deliver what it determines the user wants. The result of the query can vary: it can be simple or complex data, e.g., a list, tree, graph, or custom structure. The AI resolves the result of the query to a value that allows it to make a decision.
[0068] Referring to
[0069] Referring to
[0070] Referring to
[0071] The AI will determine that “scenario H” is more difficult than “scenario E” and if its goal is to build an easy level, then a comparison between “scenario E” and “scenario H” will yield a result that will lead to “scenario E” being chosen for inclusion.
[0072] The AI will add “scenario E” to the level. Internally, the clauses created in the AI's knowledge base will be akin to “Swimming with speed at 10 m/s is Easy,” “scenario E is Easy,” “Swimming with speed at 2 m/s is Hard,” and “scenario H is Hard.”
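The clause query and scenario selection described above might be sketched as follows, using the clauses named in the text. The dictionary knowledge base and the choose_scenario function are illustrative assumptions; a real knowledge base could be a list, tree, graph, or custom structure, as noted earlier.

```python
# Minimal knowledge base echoing the clauses in the text:
# "Swimming with speed at 10 m/s is Easy" -> "scenario E is Easy"
# "Swimming with speed at 2 m/s is Hard"  -> "scenario H is Hard"
clauses = {
    "scenario E": "Easy",
    "scenario H": "Hard",
}

def choose_scenario(goal, candidates):
    """Query the clauses: return the first candidate scenario whose
    difficulty matches the AI's goal for the level, or None."""
    for scenario in candidates:
        if clauses.get(scenario) == goal:
            return scenario
    return None
```

With the goal of building an easy level, a comparison between the two scenarios leads to "scenario E" being chosen for inclusion.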
[0073] In practice, multiple scenarios are chosen to be included in a level. The AI only adds scenarios it believes will help deliver the experience it thinks the user wants. If the user wants a hard game or an easy game, all scenarios as a whole must deliver that experience. At times, this may not be a straightforward add operation; some adjustments may need to be made to a scenario in order for it to fit properly in the context of a level and be in balance with the rest of the game.
[0074] Referring to
[0075] Specifically, we assume a parameter named “swimSpeed” with a possible value of either 2 m/s or 10 m/s has been defined for “scenario E” and “scenario H.”
[0076] Referring to
[0077] Given an average weight of 0.5, the process will assign “swimSpeed” 6 m/s. All numerical values for the participating parameters would be adjusted in this manner and all other types of values would be adjusted according to type, to produce the new “scenario N” for inclusion.
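The weight-based adjustment above amounts to a linear interpolation over the parameter's numerical range: a weight of 0 selects the minimum, a weight of 1 selects the maximum, and the average weight of 0.5 applied to the 2–10 m/s range yields 6 m/s. A minimal sketch (the function name is an assumption):

```python
def interpolate_parameter(lo, hi, weight):
    """Linear interpolation within a numerical range; the weight
    (between 0 and 1) selects a point between minimum and maximum."""
    return lo + weight * (hi - lo)

# "swimSpeed" ranges from 2 m/s to 10 m/s; the average weight of 0.5
# yields the 6 m/s assigned to the new "scenario N".
new_speed = interpolate_parameter(2, 10, 0.5)
```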
[0078] Scenarios define the specifics of a composite, whether it be a boulder rolling at a high speed or a character with the option to go out for a swim. The AI understands the mechanics of these scenarios and can build their behavior into the overall gameplay of a level.
[0079] Referring to
[0080] The process is expected to yield different results as the testing continues because the queried clauses are always updated regardless of the outcome of simulated actions. This way every iteration of the process will have new knowledge to improve the AI player's chances of success.
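The iterated testing loop described above, in which the queried clauses are always updated regardless of the outcome of simulated actions, might be sketched as follows. The function names, the skill-counter player model, and the threshold are all illustrative assumptions, not the disclosed mechanism.

```python
def run_trials(attempt, update, knowledge, iterations):
    """Iterated test loop: a simulated player attempts the level, and the
    knowledge base is updated regardless of outcome, so every iteration
    starts with new knowledge that can improve the player's chances."""
    outcomes = []
    for _ in range(iterations):
        outcome = attempt(knowledge)
        knowledge = update(knowledge, outcome)  # always update, win or lose
        outcomes.append(outcome)
    return outcomes, knowledge


# Illustrative player model: the attempt succeeds once accumulated
# knowledge passes a threshold, and every trial adds knowledge.
attempt = lambda k: k["skill"] >= 3
update = lambda k, outcome: {"skill": k["skill"] + 1}
```

Under this toy model the early attempts fail and the later ones succeed, mirroring how each iteration of the process gives the AI player a better chance of success.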
[0081] Referring to
[0082] Referring to
[0083] The scenarios in a composite have a physical world component, complete with objects and behavior, where the conditions apply. For example, referring to
[0084] The AI would repeat the process above to build composites based on user-preferences. Each composite would create an environment and a list of conditionals that a potential player must satisfy in order to “win.”
[0085] Referring to
[0086] The meta data 126 is the core data queried, updated, and processed through controller member pipeline segments.
[0087] The template object 127 serves as a manager container for specific data types. Data is initialized through the object and managed by the object.
[0088] Shared data 128 is treated as a system resource and is managed through mechanisms designed to reduce memory consumption.
[0089] The custom buffer 129 serves as a container with specialized components allowing a custom process to be executed through an interface.
[0090] The system can interact with any mobile device 130. The mobile device 130 can be any machine or device consisting of at least one central processing unit and some form of memory.
[0091] A plugin module 131 is provided to enable pipeline-process extension or modification. It processes data from the controller 125, meta data 126, template object 127, shared data 128, custom buffer 129 and the mobile device 130.
[0092] A graphic subsystem 132 is provided for managing processes relative to graphics functionality, including image processing, texturing, and vertex data rendering and manipulation. The graphic subsystem communicates with a glyph loader 136 that can manage font image-processing mechanisms associated with a graphics system. A glyph rendering sub-procedure handles font processing. The procedure uses specialized algorithms to organize, cache, and access glyph information. The graphics subsystem 132 communicates with the semantic link 108 and a custom renderer 139. The custom renderer 139 manages the rendering procedures associated with a graphics sub-system. Graphics capabilities depend on software, hardware, and other system components. The process mediates the access between the variable resources through a specialized interface. The semantic link 108, custom renderer 139 and glyph loader 136 all communicate with a source cache 137. The source cache manages source data relative to a sub-system. Source data queued for processing in the pipeline require component activation. The process facilitates the activation mechanism through a specialized caching procedure, reducing the execution time.
[0093] An audio subsystem 133 also communicates with the plugin module 131. The audio subsystem manages processes relative to audio functionality, including audio sampling, playback, and modification. The audio subsystem 133 communicates with the source cache 137. A rank map 140 provides a ranking system that is used to facilitate audio playback. System dependent resources are finite. A ranking procedure allows priority to be assigned to specific sources to guarantee processing during low-resource availability. A source synchronization 143 manages resource access for source data audio playback. Source data queued for processing require access to system dependent resources. A sync process links a resource to a source to initiate the playback mechanism.
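The rank map's priority mechanism, which guarantees processing for specific sources when system-dependent resources are finite, might be sketched as follows. The function name, the convention that a lower rank means higher priority, and the example source names are assumptions for illustration.

```python
import heapq

def assign_channels(sources, channel_count):
    """Sketch of the rank map (140): with only `channel_count` playback
    channels available, select the sources with the highest priority.
    `sources` maps a source name to its rank (lower rank = higher
    priority), so the smallest ranks win the finite channels."""
    ranked = heapq.nsmallest(channel_count, sources.items(),
                             key=lambda item: item[1])
    return [name for name, _ in ranked]
```

For example, with two channels available, dialogue and sound effects ranked above music and ambience would be the sources guaranteed processing.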
[0094] A dynamics system 134 communicates with the plugin module 131. The dynamics system 134 manages processes relative to dynamics functionality, including physics, input and response, collision and response, and constraints. The dynamics system 134 uses data from an input map 138. The input map manages input-mapping and input-linking mechanisms associated with a dynamics sub-system. Hardware input interfaces will vary between devices and machines. The process handles linking between hardware-supported inputs and customizable handlers to facilitate the event-response process.
[0095] An event link 141 manages and tracks observable events for dynamics relative subcomponents. Event types vary between platforms. The process allows for custom events to be integrated into a dynamics system. An object map 142 manages object mappings relative to an event-response system. Objects require an associated handler to facilitate dynamics operation in a simulated environment. The mapping is queried during an event and the linked handler processes the event data. A handler 144 initiates customizable procedures through a specified interface. Handlers 144 serve as delegates to extend process functionality relative to an event-response system.
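The object map's query-and-dispatch behavior, in which the mapping is queried during an event and the linked handler processes the event data, might be sketched as follows. The class and method names are illustrative assumptions.

```python
class ObjectMap:
    """Sketch of the object map (142): each object requires an associated
    handler; the mapping is queried during an event and the linked
    handler (144) processes the event data."""
    def __init__(self):
        self._handlers = {}

    def link(self, object_id, handler):
        """Associate a handler (a delegate callable) with an object."""
        self._handlers[object_id] = handler

    def dispatch(self, object_id, event_data):
        """Query the mapping during an event and run the linked handler."""
        handler = self._handlers.get(object_id)
        if handler is None:
            raise KeyError(f"no handler linked for {object_id!r}")
        return handler(event_data)
```

A collision event on a "boulder" object, say, would be routed through the map to whatever handler was linked for that object.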
[0096] A signature subsystem 145 manages input and event tracking, and gesture recognition procedures associated with a dynamics sub-system. Specialized components help define input and event patterns. The process queries the patterns through a sub-system, processing data as matches are discovered.
[0097] Other systems 135 can communicate with the plugin module 131. The other systems 135 can manage processes relative to customizable pipeline functionality, including processes, modules, templates, controllers, and data types.
[0098] The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention will be, therefore, indicated by claims rather than by the foregoing description. All changes, which come within the meaning and range of equivalency of the claims, are to be embraced within their scope.