Apparatus for physical software coding training book running
11749134 · 2023-09-05
Assignee
Inventors
CPC classification
G06F3/03
PHYSICS
International classification
G06F3/03
PHYSICS
G09B19/00
PHYSICS
Abstract
Disclosed is an apparatus for running a physical software coding training book. The apparatus includes: a toy control unit being connected to a Micro Control Unit (MCU) via serial communication and controlling motion of a toy through the MCU; a training content processing unit being connected to the toy control unit via HyperText Transfer Protocol (HTTP) and providing training content written in HyperText Markup Language (HTML), the training content including motion control commands for the toy; and a physical software processing unit capable of directly writing block coding-based physical software by embedding a block code editor into the training content and performing control of the toy.
Claims
1. An apparatus for running a physical software coding training book, comprising: a toy control unit connected to a Micro Control Unit (MCU) via serial communication and configured to control motion of a toy through the MCU; a training content processing unit connected to the toy control unit via HyperText Transfer Protocol (HTTP) and configured to provide training content written in HyperText Markup Language (HTML), the training content including motion control commands for the toy; and a physical software processing unit configured to directly write block coding-based physical software by embedding a block code editor into the training content, perform control of the toy by coding a program that executes a series of motions by using code blocks, provide a toy manipulator capable of performing real-time control of the toy, convert a motion process of the toy manipulated by the toy manipulator into at least one block code, and randomly shuffle an order of the at least one block code to visually provide a motion process of the toy, thereby allowing a user to rearrange the shuffled at least one block code, wherein, after randomly shuffling the order of the at least one block code, the physical software processing unit visually provides the motion process of the toy, thereby allowing the user to rearrange the shuffled at least one block code, and wherein the shuffling of the order of the at least one block code is performed according to the following equation
2. The apparatus of claim 1, wherein, when a user input for motion control of the toy is received, the training content processing unit is further configured to generate a toy motion text command based on HTML comments rather than HTML tags and provide the generated toy motion text command to the toy control unit.
3. The apparatus of claim 2, wherein, when the toy motion text command is generated, the training content processing unit is further configured to pop up a toy motion view on the training content, which allows the user to check successful control of the toy by providing the user with toy states before and after motion obtained through the MCU and expressed in a form of text.
4. The apparatus of claim 3, wherein, when a virtual toy is connected while the toy motion text command is being generated, the training content processing unit is further configured to visualize, on the toy motion view, a change process of the virtual toy based on toy states according to at least one motion interpolated adaptively based on the toy states before and after motion and an amount of change in the toy states before and after motion.
5. The apparatus of claim 1, wherein the physical software processing unit is further configured to provide the training content with a motion manipulator that identifies the type of the toy, shows, on the block code editor, at least one toy motion written through coding by the user, receives the user's input, and provides a corresponding motion code to the block code editor.
6. The apparatus of claim 1, wherein a motion process of the series of motion of the toy is seen visually by selecting a menu button in the block code editor.
7. The apparatus of claim 1, wherein actual code is contained in each of the coding blocks and the actual code is viewed by selecting a menu button in the block code editor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(7) Since the description of the present disclosure is merely an embodiment for illustrating structural or functional concepts, the technical scope of the present disclosure should not be interpreted as limited to the embodiments described in this document. In other words, the embodiments may be modified in various ways and implemented in various other forms; therefore, it should be understood that the technical scope of the present disclosure includes various equivalents realizing the technical principles of the present disclosure. Also, since a specific embodiment is not meant to support all of the purposes or effects intended by the present disclosure, or to include only those purposes or effects, the technical scope of the present disclosure should not be regarded as being limited to the descriptions of the embodiment.
(8) Meanwhile, the meanings of the terms used in this document should be understood as follows.
(9) The terms such as “first” and “second” are introduced to distinguish one element from the others, and thus the technical scope of the present disclosure should not be limited by those terms. For example, a first element may be called a second element, and similarly, the second element may be called the first element.
(10) If a constituting element is said to be “connected” to another constituting element, the former may be connected to the latter directly, but it should be understood that another constituting element may be present between the two elements. On the other hand, if a constituting element is said to be “directly connected” to another constituting element, it should be understood that no other constituting element is present between the two elements. Meanwhile, other expressions describing a relationship between constituting elements, namely “between” and “right between” or “adjacent to” and “directly adjacent to,” should be interpreted in the same manner.
(11) A singular expression should be understood to include a plural expression unless otherwise explicitly stated. The term “include” or “have” is used to indicate the existence of an embodied feature, number, step, operation, constituting element, component, or a combination thereof, and should not be understood to preclude the existence or possibility of adding one or more other features, numbers, steps, operations, constituting elements, components, or combinations thereof.
(12) Identification symbols (for example, a, b, and c) for individual steps are used for the convenience of description. The identification symbols are not intended to describe an operation order of the steps. Therefore, unless otherwise explicitly indicated in the context of description, the steps may be executed differently from a specified order. In other words, the respective steps may be performed in the same order as specified in the description, actually performed simultaneously, or performed in a reverse order.
(13) The present disclosure may be implemented in the form of program code in a computer-readable recording medium, where a computer-readable recording medium includes all kinds of recording devices which store data that may be read by a computer system. Examples of a computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage devices. Also, a computer-readable recording medium may be distributed over computer systems connected to each other through a network so that computer-readable code may be stored and executed in a distributed manner.
(14) Unless defined otherwise, all of the terms used in this document provide the same meaning as understood generally by those skilled in the art to which the present disclosure belongs. Those terms defined in ordinary dictionaries should be interpreted to have the same meaning as conveyed by a related technology in the context. And unless otherwise defined explicitly in the present disclosure, those terms should not be interpreted to have ideal or excessively formal meaning.
(16) Referring to
(17) The apparatus 110 for running a physical software coding training book may correspond to a computing device such as a computer or a smartphone being connected to the MCU 120 and capable of running a physical software coding training book. In what follows, more detailed descriptions related to the apparatus 110 for running a physical software coding training book will be given with reference to
(18) The MCU 120 may correspond to a computing device that is embedded or connected to a toy 130 to control various motions of the toy 130. For example, the MCU 120 may be implemented as a main board capable of transmitting a programmed command to the toy 130 to make the toy 130 perform a desired motion.
(19) The toy 130 may correspond to a coding tool capable of performing a specific motion through the MCU 120. For example, the toy 130 may be implemented as a robot or an RC car.
(21) Referring to
(22) The toy control unit 210 is connected to the MCU 120 via serial communication and controls toy motion through the MCU 120. Here, the toy control unit 210 may control the virtual toy 240 preconfigured by the MCU 120 even if the toy 130 is not physically connected to the MCU 120.
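The serial command path described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the class names, the newline-terminated ASCII frame format, and the fallback behavior are all assumptions. A real deployment would pass a serial handle (e.g. from the pyserial library) in place of the virtual toy.

```python
# Hypothetical sketch of the toy control unit's command path.
# All names and the frame format are illustrative assumptions.

class VirtualToy:
    """Stands in for the physical toy when no MCU is connected."""
    def __init__(self):
        self.log = []  # frames "executed" by the simulated toy

    def execute(self, frame: bytes):
        self.log.append(frame)

class ToyControlUnit:
    def __init__(self, serial_link=None):
        # serial_link would be e.g. a pyserial Serial object;
        # None means motion is simulated on a virtual toy instead.
        self.serial_link = serial_link
        self.virtual_toy = VirtualToy()

    def send_motion(self, command: str) -> bytes:
        # Encode the text command as a newline-terminated ASCII frame,
        # a common convention for plain-text serial protocols to MCUs.
        frame = command.encode("ascii") + b"\n"
        if self.serial_link is not None:
            self.serial_link.write(frame)
        else:
            self.virtual_toy.execute(frame)
        return frame

unit = ToyControlUnit()                    # no MCU attached
frame = unit.send_motion("FORWARD 10")     # routed to the virtual toy
```

With a connected MCU, the same `send_motion` call would write the frame to the serial link; the virtual-toy branch mirrors the patent's behavior of controlling the preconfigured virtual toy 240 when the toy 130 is not physically connected.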
(23) The training content processing unit 220 is connected to the toy control unit 210 via HyperText Transfer Protocol (HTTP) and provides training content for software coding training that teaches how to program the toy 130 to perform a desired motion. Here, the training content may correspond to a coding training book written in HyperText Markup Language (HTML) describing motion control of the toy 130 and may be displayed on a webpage.
(24) If a user input is received for the motion control of the toy, the motion control being included in the coding training content written in HTML, the training content processing unit 220 may generate a toy motion text command based on special tags rather than HTML tags and provide the generated command to the toy control unit 210. Here, the special tag may correspond to an HTML comment. For example, an HTML comment is written between the “<!--” and “-->” delimiters and is conventionally used to annotate code or temporarily disable it without affecting how the page is rendered. The training content processing unit 220 may code the toy motion according to a user input by inserting HTML comments rather than the block code for the motion process of the toy.
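The comment-based command scheme can be illustrated with a short sketch. This is a hypothetical example: the `toy:` prefix and the `extract_toy_commands` helper are invented for illustration; the patent only specifies that commands are carried in HTML comments rather than HTML tags.

```python
import re

# Motion commands hidden in HTML comments: the browser ignores them,
# while the training-book runner can parse them out of the page.
# The "toy:" prefix is an assumed convention, not from the patent.
COMMAND_RE = re.compile(r"<!--\s*toy:(.*?)-->", re.DOTALL)

def extract_toy_commands(html: str) -> list:
    """Return the motion text commands embedded as HTML comments."""
    return [m.strip() for m in COMMAND_RE.findall(html)]

page = """
<p>Press the button to move the robot.</p>
<!-- toy: MOVE FORWARD 10 -->
<!-- toy: TURN LEFT 90 -->
"""
commands = extract_toy_commands(page)
```

Because the commands live in comments, the training content renders as ordinary HTML while the runner forwards each extracted command to the toy control unit.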
(25) If a motion text command using special tags is generated according to a user input, the training content processing unit 220 may pop up a toy motion view on the training content, which allows a user to check successful control of the toy 130 by providing the user with toy states before and after motion obtained through the MCU 120 and expressed in the form of text. If the virtual toy 240 is connected while the motion text command is being generated, the training content processing unit 220 may visualize, on the toy motion view, a change process of the virtual toy based on toy states according to at least one motion interpolated adaptively based on the toy states before and after motion and the amount of change in the toy states before and after motion. Here, interpolation of toy states before and after motion may be calculated by Eq. 1 below.
X_p = x_1 + t(x_2 − x_1)
Y_p = y_1 + t(y_2 − y_1) [Eq. 1]
(26) In Eq. 1, X_p and Y_p are the coordinates of the interpolated position; x_1 and y_1 are the coordinates of the position before motion; x_2 and y_2 are the coordinates of the position after motion; and t corresponds to the movement time.
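Eq. 1 is standard linear interpolation between the before and after positions. A minimal sketch, assuming t is normalized to the range 0..1 (the function name and the five-frame usage are illustrative, not from the patent):

```python
def interpolate_state(p1, p2, t):
    """Linear interpolation of toy position per Eq. 1.

    p1: (x1, y1) position before motion
    p2: (x2, y2) position after motion
    t:  movement time, normalized so 0.0 is the start and 1.0 the end
    """
    x1, y1 = p1
    x2, y2 = p2
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Intermediate frames for animating the virtual toy's change process,
# here sampled at five evenly spaced times:
frames = [interpolate_state((0, 0), (10, 20), k / 4) for k in range(5)]
```

Sampling more frames between the before and after states yields a smoother visualized change process on the toy motion view.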
(27) The physical software processing unit 230 directly writes block coding-based physical software by embedding a block code editor into the training content and performs control of the toy 130. Here, an example of a block coding screen is shown in
(29) Each individual function is defined as a block on the block coding screen 600 of
(30) Referring again to
(31) After randomly shuffling the order of the at least one block code, the physical software processing unit 230 may visually provide a motion process of the toy 130, thereby allowing a user to correctly rearrange the shuffled at least one block code. Through this operation, the user may be trained on desired motions of the toy 130 and may practice coding. Here, the shuffling of the order of the code blocks may be performed according to Eq. 2 below.
(32) (p′_1, p′_2, . . . , p′_k) = Sh(p_1, p_2, . . . , p_k) [Eq. 2]
(33) In Eq. 2, (p_1, p_2, . . . , p_k) corresponds to the ordered sequence of the k function blocks, and Sh(⋅) corresponds to the shuffle permutation of partitions of the totally ordered set.
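A minimal sketch of the shuffling step, modeling Sh(⋅) as a uniformly random permutation of the ordered block sequence; the function and variable names are illustrative, and the patent's partition-based formulation may constrain the permutation further:

```python
import random

def shuffle_blocks(blocks, rng=None):
    """Return a shuffled copy of the ordered block codes.

    Sketch of Sh(.) as a uniform random permutation; the original
    list is left untouched so it can serve as the answer key.
    """
    rng = rng or random.Random()
    shuffled = list(blocks)
    rng.shuffle(shuffled)
    return shuffled

original = ["forward()", "turn_left()", "beep()", "stop()"]
mixed = shuffle_blocks(original, random.Random(42))
# The learner's exercise: rearrange `mixed` back into `original`.
```

Keeping the original sequence intact lets the editor compare the learner's rearrangement against it when grading the exercise.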
(34) The physical software processing unit 230 may provide the training content with a motion manipulator that identifies the type of the toy 130, shows at least one toy motion that may be written through coding on the block code editor by a user, receives the user's input, and provides the corresponding motion code to the block code editor.
(36) Referring to
(38) Referring to
(39) If a user input is received for the motion control of the toy 130, the motion control being included in the training content provided, the training content processing unit 220 may generate toy motion text commands and provide the generated commands to the toy control unit 210, S420. In one embodiment, the toy motion text command may include the HTML comments.
(40) The toy control unit 210 may check a physical connection of the toy 130 through the MCU 120, S430, and if the physical connection is confirmed, may control motion of the toy 130 through the connected MCU 120 according to the toy motion text command provided by the training content processing unit 220.
(41) In case the toy 130 is not physically connected, the toy control unit 210 may simulate the toy motion by controlling a virtual toy 240 which substitutes for the toy 130, S440. The virtual toy 240 may be implemented in various forms of coding tools including a robot or an RC car preconfigured by the MCU 120.
(42) The training content processing unit 220 may provide a toy motion view on the training content in the form of a pop-up window, the toy motion view showing the state of the toy 130 before and after motion according to a toy motion text command, S450.
(43) In the case of a simulation through control of the virtual toy 240, the training content processing unit 220 may provide a change process of the virtual toy 240 on the training content by interpolating toy motion views before and after motion, S460.
(44) The user may check successful control of the toy by examining the toy states before and after motion visualized in the form of a toy motion view.
(46) As shown in
(47) The physical software processing unit 230 sequentially stores the block code generated according to a manipulation sequence, S520, shuffles the block code order, and provides the shuffled block codes on the block coding screen 600, S530.
(48) The physical software processing unit 230 allows the user to code a program by rearranging block codes that are in a mixed order on the block coding screen 600, S540.
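The rearrangement exercise in steps S530 and S540 implies a correctness check against the originally stored manipulation sequence. The patent does not specify a grading scheme; the helper below is a hypothetical sketch that reports whether the learner's order matches and, if not, the first mismatched position.

```python
def check_rearrangement(user_order, correct_order):
    """Grade the learner's rearranged block sequence.

    Returns (is_correct, first_wrong_index); first_wrong_index is -1
    when the orders match exactly. Illustrative helper only; the
    patent does not define how rearrangements are evaluated.
    """
    for i, (u, c) in enumerate(zip(user_order, correct_order)):
        if u != c:
            return False, i
    if len(user_order) != len(correct_order):
        # All compared blocks matched, but one sequence is shorter.
        return False, min(len(user_order), len(correct_order))
    return True, -1
```

Reporting the first mismatched position would let the training content highlight exactly which block the learner should reconsider.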
(49) Although the present disclosure has been described with reference to preferred embodiments given above, it should be understood by those skilled in the art that various modifications and variations of the present disclosure may be made without departing from the technical principles and scope specified by the appended claims below.
(50) The present disclosure may provide the following effects. However, since a specific embodiment is not meant to support all of the effects or to include only those effects, the technical scope of the present disclosure should not be regarded as being limited to the descriptions of the embodiment.
(51) One embodiment of the present disclosure may provide an apparatus for running a physical software coding training book which allows coding to be performed in an easy and simple manner without directly writing a code (command) but only through a proper arrangement of block codes provided by training content in which a block code editor is embedded.
(52) One embodiment of the present disclosure may provide an apparatus for running a physical software coding training book which visually provides motion states of a toy or a virtual toy according to coded physical software and thus allows a user to check successful control of the toy.