REST API Validation
20230047837 · 2023-02-16
Inventors
- Kavya CHILUPURI (Hyderabad, IN)
- David M. CLIFTON (Ellicott City, MD, US)
- Dharmalingam GANESAN (Crofton, MD, US)
CPC classification
G06F16/9566
PHYSICS
International classification
G06F16/955
PHYSICS
G06F11/36
PHYSICS
Abstract
Embodiments validate representational state transfer (“REST”) application program interfaces (“API”). Embodiments receive a REST API specification that provides information for a plurality of REST APIs and parse the REST API specification to extract, for each REST API, a corresponding Uniform Resource Locator (“URL”) and corresponding parameter names, response codes, and payloads. Embodiments convert the parsed REST API specification into a converted text file, the converting including parameter constraints and parameter default values. Embodiments then generate all possible combinations of test data for each REST API from the converted text file and perform one or more test operations on each of the combinations of test data.
Claims
1. A method of validating representational state transfer (REST) application program interfaces (API), the method comprising: receiving a REST API specification that provides information for a plurality of REST APIs; parsing the REST API specification to extract, for each REST API, a corresponding Uniform Resource Locator (URL) and corresponding parameter names, response codes, and payloads; converting the parsed REST API specification into a converted text file, the converting including parameter constraints and parameter default values; generating all possible combinations of test data for each REST API from the converted text file; and performing one or more test operations on each of the combinations of test data.
2. The method of claim 1, further comprising: in response to the test operations, generating generated default values and repeating the converting, generating and performing using the generated default values instead of the parameter default values.
3. The method of claim 1, the test operations comprising, for each REST API, at least one of: checking incorrect paths in the URL; checking for a rejection of an overly large payload; or validating rejections of invalid media type headers.
4. The method of claim 1, the test operations comprising, for each REST API, at least one of: validating an expected compression format; checking for missing mandatory parameters; or regression testing.
5. The method of claim 1, the test operations comprising, for each REST API, at least one of: checking whether GET and PUT operations are idempotent; or validating a presence of Hypermedia as the Engine of Application State (HATEOAS) links for POST and PUT operations.
6. The method of claim 1, wherein the all possible combinations of test data comprises test data for operations comprising GET, POST, PUT and DELETE.
7. The method of claim 1, wherein the REST API specification comprises a Swagger format and comprises a JavaScript Object Notation (JSON) format or a YAML format.
8. The method of claim 1, wherein the generating all possible combinations comprises generating a spreadsheet of all possible combinations.
9. A representational state transfer (REST) application program interfaces (API) validation system comprising: a first parser that is configured to receive a REST API specification that provides information for a plurality of REST APIs and parses the REST API specification to extract, for each REST API, a corresponding Uniform Resource Locator (URL) and corresponding parameter names, response codes, and payloads; a converter that is configured to convert the parsed REST API specification into a converted text file, the converting including parameter constraints and parameter default values; a combination generator that is configured to generate all possible combinations of test data for each REST API from the converted text file; and an analyzer configured to perform one or more test operations on each of the combinations of test data.
10. The system of claim 9, further comprising: in response to the test operations, a second parser configured to generate generated default values, the system repeating the converting, generating and performing using the generated default values instead of the parameter default values.
11. The system of claim 9, the test operations comprising, for each REST API, at least one of: checking incorrect paths in the URL; checking for a rejection of an overly large payload; or validating rejections of invalid media type headers.
12. The system of claim 9, the test operations comprising, for each REST API, at least one of: validating an expected compression format; checking for missing mandatory parameters; or regression testing.
13. The system of claim 9, the test operations comprising, for each REST API, at least one of: checking whether GET and PUT operations are idempotent; or validating a presence of Hypermedia as the Engine of Application State (HATEOAS) links for POST and PUT operations.
14. The system of claim 9, wherein the all possible combinations of test data comprises test data for operations comprising GET, POST, PUT and DELETE.
15. The system of claim 9, wherein the REST API specification comprises a Swagger format and comprises a JavaScript Object Notation (JSON) format or a YAML format.
16. The system of claim 9, wherein the generating all possible combinations comprises generating a spreadsheet of all possible combinations.
17. A computer-readable medium storing instructions which, when executed by at least one of a plurality of processors, cause the processors to validate representational state transfer (REST) application program interfaces (API), the validating comprising: receiving a REST API specification that provides information for a plurality of REST APIs; parsing the REST API specification to extract, for each REST API, a corresponding Uniform Resource Locator (URL) and corresponding parameter names, response codes, and payloads; converting the parsed REST API specification into a converted text file, the converting including parameter constraints and parameter default values; generating all possible combinations of test data for each REST API from the converted text file; and performing one or more test operations on each of the combinations of test data.
18. The computer-readable medium of claim 17, the validating further comprising: in response to the test operations, generating generated default values and repeating the converting, generating and performing using the generated default values instead of the parameter default values.
19. The computer-readable medium of claim 17, the test operations comprising, for each REST API, at least one of: checking incorrect paths in the URL; checking for a rejection of an overly large payload; or validating rejections of invalid media type headers.
20. The computer-readable medium of claim 17, the test operations comprising, for each REST API, at least one of: validating an expected compression format; checking for missing mandatory parameters; or regression testing.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0017] One embodiment is a representational state transfer (“REST”) application program interfaces (“API”) validator that validates REST APIs in connection with a REST API specification. Given one or more REST APIs, embodiments certify that each REST API satisfies the required REST API standard/specification. Embodiments automatically generate test inputs from the REST API specification and conduct both security and functional testing for each of the REST APIs.
[0018] Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be apparent to one of ordinary skill in the art that the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments. Wherever possible, like reference numbers will be used for like elements.
[0020] System 150 includes a REST API validation tool 100 that includes a REST API validation engine 105, a user interface 106, and a combination generation tool 107. Engine 105 performs the REST API validation, and includes a converter to generate a text file, one or more parsers for performing two different parsings, and an analyzer for performing analysis. User interface 106 provides a user interface (“UI”) to allow a user to input default values 103 and parameter constraints 104, and to output a validation report 110.
[0021] Validation tool 100 further receives as input one or more REST API specifications 102. REST is an architectural style, not a strict standard. Therefore, it allows for flexibility among programmers of varying levels of experience. Implementations include API standards that are used to enforce good practice (e.g., well documented APIs, backward versions with upgrade windows, appropriate use of HTTP verbs, etc.) as well as enforce consistency. For multiple APIs, the experience across the APIs should be similar and therefore components may be reused.
[0022] API specifications may include definitions and a set of rules for a format describing REST APIs. The format is both machine-readable and human-readable, and is used to share documentation among product managers, testers and developers. It may also be used with various tools to automate API-related processes. API specifications may define a standard, language-agnostic interface to REST APIs which allows both humans and computers to discover and understand the capabilities of the service without access to source code or documentation, and without inspection of network traffic.
[0023] In embodiments, API specification 102 may include a specification for machine-readable interface files for describing, producing, consuming, and visualizing RESTful web services. A variety of tools may generate code, documentation and test cases given an interface file. Development of the OpenAPI Specification (“OAS”) is overseen by the Open API Initiative, an open source collaborative project of the Linux Foundation. According to an embodiment of the present invention, APIs have a specification, which may include an Open API Specification or Swagger contract. In embodiments, API specification 102 is implemented using the Swagger (JSON/YAML) format or the API Blueprint format.
[0025] System 10 includes a bus 12 or other communication mechanism for communicating information, and a processor 22 coupled to bus 12 for processing information. Processor 22 may be any type of general or specific purpose processor. System 10 further includes a memory 14 for storing information and instructions to be executed by processor 22. Memory 14 can be comprised of any combination of random access memory (“RAM”), read only memory (“ROM”), static storage such as a magnetic or optical disk, or any other type of computer readable media. System 10 further includes a communication device 20, such as a network interface card, to provide access to a network. Therefore, a user may interface with system 10 directly, remotely through a network, or by any other method.
[0026] Computer readable media may be any available media that can be accessed by processor 22 and includes both volatile and nonvolatile media, removable and non-removable media, and communication media. Communication media may include computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media.
[0027] Processor 22 is further coupled via bus 12 to a display 24, such as a Liquid Crystal Display (“LCD”) and includes a microphone for receiving user utterances. A keyboard 26 and a cursor control device 28, such as a computer mouse, are further coupled to bus 12 to enable a user to interface with system 10.
[0028] In one embodiment, memory 14 stores software modules that provide functionality when executed by processor 22. The modules include an operating system 15 that provides operating system functionality for system 10. The modules further include a REST API validation module 16 that validates REST APIs against a REST API specification, and all other functionality disclosed herein. System 10 can be part of a larger system. Therefore, system 10 can include one or more additional functional modules 18 to include the additional functionality. A file storage device or database 17 is coupled to bus 12 to provide centralized storage for modules 16 and 18. In one embodiment, database 17 is a relational database management system (“RDBMS”) that can use Structured Query Language (“SQL”) to manage the stored data.
[0029] In one embodiment, particularly when there are a large number of distributed files at a single device, database 17 is implemented as an in-memory database (“IMDB”). An IMDB is a database management system that primarily relies on main memory for computer data storage, in contrast with database management systems that employ a disk storage mechanism. Main memory databases are faster than disk-optimized databases because disk access is slower than memory access, the internal optimization algorithms are simpler, and fewer CPU instructions are executed. Accessing data in memory eliminates seek time when querying the data, which provides faster and more predictable performance than disk.
[0030] In one embodiment, database 17, when implemented as an IMDB, is implemented based on a distributed data grid. A distributed data grid is a system in which a collection of computer servers work together in one or more clusters to manage information and related operations, such as computations, within a distributed or clustered environment. A distributed data grid can be used to manage application objects and data that are shared across the servers. A distributed data grid provides low response time, high throughput, predictable scalability, continuous availability, and information reliability. In particular examples, distributed data grids, such as, e.g., the “Oracle Coherence” data grid from Oracle Corp., store information in-memory to achieve higher performance, and employ redundancy in keeping copies of that information synchronized across multiple servers, thus ensuring resiliency of the system and continued availability of the data in the event of failure of a server.
[0031] In one embodiment, system 10 is a computing/data processing system including an application or collection of distributed applications for enterprise organizations, and may also implement logistics, manufacturing, and inventory management functionality. The applications and computing system 10 may be configured to operate with or be implemented as a cloud-based system, a software-as-a-service (“SaaS”) architecture, or other type of computing solution.
[0032] As disclosed, one embodiment is a tool that tests/validates REST APIs. Embodiments retrieve the information directly from the API documentation (i.e., REST API specifications) and automatically generate test data with varied combinations of data produced by combination generation tool 107. Embodiments also incorporate automatic and more generic assertions to validate the execution of the APIs and generate reports. Embodiments further check for HTTP headers, unsupported verbs, incorrect URLs, etc.
[0033] Embodiments retrieve all of the paths mentioned in the REST API specification, along with the parameter details and response codes, to generate a more easily understandable spreadsheet or “Excel” file (or any other type of human-readable file). Based on the parameter type and the nature of the parameter, the test data is randomized. This test data is fed into combination generation tool 107 to generate test data combinations that help identify corner cases (i.e., cases outside of normally expected results).
[0034] The APIs are then tested based on the operation type and varied test data combinations to generate HTML reports (or other similarly formatted reports) based on generic assertions. The test data retrieved from a GET operation is stored and used for the upcoming test activities and the results are compared with the previous executions to check the stability of the system.
[0036] At 301, REST API specification 102 is parsed using a parser to extract the resource URL, the parameter names along with the parameter information, the response codes, and the JSON payload for POST and PUT operations. This parsed information is generated as an Excel file, any other spreadsheet, or another user-readable format. The resource URL is the resource path, which is the URL prefix for all API paths relative to the host root. An example REST endpoint is as follows:
[0037] https://wfivm04329.us.oracle.com/operarest/vab/hsk/v1/hotels/{hotelId}/housekeepingOverview
[0038] https: Scheme
[0039] wfivm04329.us.oracle.com: API Host Server
[0040] operarest/vab: location on the host server where all the APIs are hosted (host root)
[0041] hsk/v1/hotels/{hotelId}/housekeepingOverview: API specific path relative to the host root
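The URL decomposition above can be sketched in Python. This is an illustrative sketch only: the function name and the assumption that the host root is the first two path segments are conventions of the example, not part of the claimed implementation.

```python
from urllib.parse import urlparse

def decompose_endpoint(url, host_root_depth=2):
    """Split a REST endpoint into scheme, API host server, host root, and
    API-specific path. Assumes the first `host_root_depth` path segments
    form the host root (e.g. "operarest/vab" in the example above) -- that
    split is an assumption of this sketch, not encoded in the URL itself."""
    parsed = urlparse(url)
    segments = [s for s in parsed.path.split("/") if s]
    return {
        "scheme": parsed.scheme,
        "host": parsed.netloc,
        "host_root": "/".join(segments[:host_root_depth]),
        "api_path": "/".join(segments[host_root_depth:]),
    }

parts = decompose_endpoint(
    "https://wfivm04329.us.oracle.com/operarest/vab/hsk/v1/hotels/{hotelId}/housekeepingOverview"
)
```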
[0043] For the parsing at 301, embodiments parse the YAML file (or JSON if in that format) to extract the resource URL, the parameter names along with the parameter information, the response codes and the JSON payload for POST and PUT operations. The parsed information is then provided in a spreadsheet or other readable file.
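The parsing at 301 can be sketched for a JSON-format specification as follows. The fragment and field layout follow the general Swagger 2.0 style, but the fragment itself and the helper name are invented for illustration; they are not the tool's actual parser.

```python
import json

# Minimal Swagger-style fragment, invented for illustration.
SPEC_JSON = """
{
  "paths": {
    "/hsk/v1/hotels/{hotelId}/housekeepingOverview": {
      "get": {
        "parameters": [
          {"name": "hotelId", "in": "path", "required": true, "type": "string"},
          {"name": "limit", "in": "query", "required": false, "type": "integer"}
        ],
        "responses": {"200": {"description": "OK"}, "400": {"description": "Bad Request"}}
      }
    }
  }
}
"""

def extract_api_info(spec_text):
    """Extract, per (path, operation): parameter details and response codes."""
    spec = json.loads(spec_text)
    rows = []
    for path, ops in spec.get("paths", {}).items():
        for verb, op in ops.items():
            rows.append({
                "path": path,
                "operation": verb.upper(),
                "parameters": [(p["name"], p.get("type"), p.get("required", False))
                               for p in op.get("parameters", [])],
                "response_codes": sorted(op.get("responses", {})),
            })
    return rows

rows = extract_api_info(SPEC_JSON)
```

Each row corresponds to one line of the spreadsheet described above (resource URL, parameter details, response codes).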
[0044] File 500 further includes a “ParameterRequire” 505 that specifies whether the parameter is a mandatory parameter or an optional parameter to the API call. This is decided by the business and is pre-defined in the API specification. File 500 further includes the parameter type 506, the parameter value 507 and the parameter schema 508. Columns 507 and 508 are blank when the API specification does not include any values for those fields. For example, a “schema” would only be available for POST and PUT operations.
[0045] File 500 further includes “Response Code” 509 which are expected response codes retrieved from the API specification, and “scope” 510. Scope column 510 can be updated by the user before starting the execution if the specific API call is in the scope or not. Based on this information, embodiments would skip the API calls which are not in scope for testing.
[0046] Referring again to
[0047] An example of a parameter defaults.txt file 103 is as follows, which provides default values for the authorization key (which provides authorization to use the web service) and values for identities for hotels, hotel rooms, and activities:
TABLE-US-00001
authorization (string): Bearer eyJhbGciOiJSUzl1NilslnR5cCl6lkpXVClslng1dCl6lmpQWIZtLXhERHFfRjgyRTRjZGpNWmMxczNzTSlslmtpZCl6lm1zLW9hdXRoa2V5ln0.eyJzdWliOiJudmFiX3N3YXRoaSlslmlzcyl6lnd3dy5vcmFjbGUuY29tliwib3JhY2xlLm9hdXRoLnN2Y19wX24iOiJPQXV0aFNIcnZpY2VQcm9maWxlliwiaWF0IjoxNjlxNzgwOTEwLCJvcmFjbGUub2F1dGgucHJuLmlkX3R5cGUiOiJMREFQX1VJRClslmV4cCl6MTYyMTc4NDUxMCwib3JhY2xlLm9hdXRoLnRrX2NvbnRleHQiOiJ1c2VyX2Fzc2VydGlvbilslmF1ZCl6WyJodHRwczovLypvcmFjbGUqLmNvbSlslmh0dHBzOi8vKi5pbnQiLCJodHRwczovLypvY3Mub2MtdGVzdC5jb20vll0slnBybil6lm52YWJfc3dhdGhpliwianRpljoiZTQ4YzEwN2EtZjk0ZC00ZDJhLWFmMjQtYzZiYmlwYml5Yjllliwib3JhY2xlLm9hdXRoLmNsaWVudF9vcmlnaW5faWQiOiJBWEU20TliNTRkNGNIYWM0MjBiYTkwZDk0YzM1NmFhY2NkMilslnVzZXIudGVuYW50Lm5hbWUiOiJEZWZhdWx0RG9tYWIuliwib3JhY2xlLm9hdXRoLmlkX2RfaWQiOilxMjM0NTY3OC0xMjM0LTEyMzQtMTIzNC0xMjM0NTY3ODkwMTIifQ.SZcQSCby4MpPje76wUxE1qObZyYA_5NfK1uJJjm7WTtoPtvUNq21BqmhjP52yqTBDNzs9RrHg9G6SwpKA4MIMozMg6G1-xjFxdYVUcPf5VOufDrbC7w_r4Z74G8J90ic80MOBuTajkhKlzTUTJpcmYsrnfpZpZoGhl9X46xY0qj5UI5JMgWedQSS57UnMIA5auFGDiz8a0Ute8qVzCDU4Wqg0WbAMIImdbT1c3QwMIPs3dDN_DNoTaLThDAbnLN2lyRjv0ESC81tXvesBRXTUjutlCl1IGMY6o8dzMkMLxZKRQRPEGN049mH6_kYt6qlwTzHtk2sEzxP_fUHxvH1Wg
x-hotelid: SK003
activityId: 401803
hotelId: SK003
[0048] An example of a parameter constraints file 104 is as follows, which provides dependencies between parameters:
TABLE-US-00002
completed = true => startDate = "test234"
test = true => endDate = "test234"
[0049] The constraints can be defined by the business and describe the dependency of one parameter on another. For example, when a parameter such as “Completed” holds the value “true”, then the parameter “EndDate” of a process cannot be after the current date. This condition can be treated as a constraint while defining various combinations of test data. Further, there may be no constraints for the parameters. Therefore, even if some constraints are defined but the parameters in the constraints file are not valid for an API call, the constraints section is empty in the text file that gets generated.
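Parsing the "a = x => b = y" constraint lines above can be sketched as follows. The function name and the pair representation are illustrative assumptions; the syntax mirrors the example constraints file.

```python
def parse_constraints(lines):
    """Parse constraint lines of the form 'a = v1 => b = v2' into
    ((param, value), (param, value)) pairs, mirroring the example
    constraints file above. Lines without '=>' are skipped."""
    constraints = []
    for line in lines:
        line = line.strip()
        if "=>" not in line:
            continue
        left, right = line.split("=>", 1)
        lp, lv = (s.strip().strip('"') for s in left.split("=", 1))
        rp, rv = (s.strip().strip('"') for s in right.split("=", 1))
        constraints.append(((lp, lv), (rp, rv)))
    return constraints

parsed = parse_constraints([
    'completed = true => startDate = "test234"',
    'test = true => endDate = "test234"',
])
```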
[0050] An example of the text file generated at 320, which includes the parameters with default values and constraints, is as follows. The file shows the name of the REST API and a listing of all parameters with their possible values. The default values are used for the parameters, and the constraints sent by the user are not applicable to this operation, so the constraints section in the generated text file below is empty.
TABLE-US-00003
[System]
Name: hskv1hotels{hotelId}housekeepingOverview-get
[Parameter]
hotelId (enum): SK003
limit (integer): 75357, 33468, 45085, 57352, 31287
offset (integer): 41098, 59602, 17829, 51777, 23872
housekeepingRoomStatus (enum): hotel1, hotel2
frontOfficeRoomStatus (enum): hotel1, hotel2
turndownStatus (enum): hotel1, hotel2
roomParityStatus (enum): hotel1, hotel2
serviceRequestStatus (enum): hotel1, hotel2
roomAssignmentStatus (enum): hotel1, hotel2
fetchInstructions (enum): hotel1, hotel2
fromRoomNumber (enum): B7VIHTI, LK0WL4U, LOQU7AW, JZ0KC81, 3YFQ17U
toRoomNumber (enum): FGU6YQF, K24X0V0, ZTYIPLM, LG9IP9T, 3DQ0DJ4
floor (enum): hotel1, hotel2
roomType (enum): hotel1, hotel2
roomClass (enum): hotel1, hotel2
building (enum): hotel1, hotel2
roomFeatures (enum): hotel1, hotel2
daySection (enum): hotel1, hotel2
eveningSection (enum): hotel1, hotel2
reservationstatus (enum): hotel1, hotel2
guestServiceRequest (enum): DWFU33W, GS7PX8O, W6LJNG8, 47CA9Q4, 3PHWHC8
roomIdText (enum): TCNG6IC, 6OP4QCM, TXLVCPT, VF181VD, XS8TBD0
authorization (enum): Bearer eyJhbGciOiJSUzl1NilslnR5cCl6lkpXVClslng1dCl6lmpQWIZtLXhERHFfRjgyRTRjZGpNWmMxczNzTSlslmtpZCl6lm1zLW9hdXRoa2V5ln0.eyJzdWliOiJudmFiX3N3YXRoaSlslmlzcyl6lnd3dy5vcmFjbGUuY29tliwib3JhY2xlLm9hdXRoLnN2Y19wX24iOiJPQXV0aFNIcnZpY2VQcm9maWxlliwiaWF0IjoxNjlxMjl4NzQ2LCJvcmFjbGUub2F1dGgucHJuLmlkX3R5cGUiOiJMREFQX1VJRClslmV4cCl6MTYyMTIzMjM0Niwib3JhY2xlLm9hdXRoLnRrX2NvbnRleHQiOiJ1c2VyX2Fzc2VydGlvbilslmF1ZCl6WyJodHRwczovLypvcmFjbGUqLmNvbSlslmh0dHBzOi8vKi5pbnQiLCJodHRwczovLypvY3Mub2MtdGVzdC5jb20vll0slnBybil6lm52YWJfc3dhdGhpliwianRpljoiODM2YjQ4N2UtNThiYi00MGQwLWIzMTQtYmFINzdlODU0MGY4liwib3JhY2xlLm9hdXRoLmNsaWVudF9vcmlnaW5faWQiOiJBWEU2OTliNTRkNGNIYWM0MjBiYTkwZDk0YzM1NmFhY2NkMilslnVzZXIudGVuYW50Lm5hbWUiOiJEZWZhdWx0RG9tYWIuliwib3JhY2xlLm9hdXRoLmlkX2RfaWQiOilxMjM0NTY3OC0xMjM0LTEyMzQtMTIzNC0xMjM0NTY3ODkwMTIifQ.Wpoy_EyfKX2dTpes2EoiHTnKExta1sgblK7zrR2vccLGldUoNPq2gTZyepKUxfh1vx2NhaVDTFeyt6ll1A1cP_0BRK17enopt2oTQU-TAfyl1rHN_vNT-7AEZtvpsvRqzPqhdnRqgDmXKEJKGm-eKlpg-6HXdmS5k9WVRz57_RQAGXO1kKK5rlSe7ujinXjWAf7QpnKteq1d1wOrKqVYvTjFiKw5zt4wXD3NR9WkldPOEC9d-JbHNpBKVWjTKKz6hmz5WBEGX7C_tLM5lv7KDPOMN4WAdhrPDwv6zee9H1CqxcNObiGggTc-wjz9QpwP3VFPtxxjwPkhAhBoxpQS7Q
x_app_key (enum): SK6S2NS, 0YMPKP0, XLHR9X0, X2TDKWF, CJUUB5U
[Constraint]
[0051] If the user has not provided any default values, the converter searches for the existence of a system generated defaults file 310 that is generated at 305; if that also does not exist, then it uses randomly generated values that are generated at 311 based on the parameter type (e.g., String, Boolean or Integer). The system generated defaults are generated by a parser at 305 as a result of the testing and are then used as input for the next round of testing as an iterative process. At 305, if the GET operation response is successful, then the parser retrieves the data out of the GET operation response with the key value information and stores it in a text file. This data can be used during the next execution if the user does not pass any pre-defined data values 103.
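The fallback order described above (user defaults, then system generated defaults, then random values by type) can be sketched as follows. The function name and the value counts are illustrative assumptions.

```python
import random

def resolve_values(param_name, param_type, user_defaults=None, generated_defaults=None, n=5):
    """Resolve test values for one parameter: user-supplied defaults first,
    then system-generated defaults from a previous run, then random values
    based on the parameter type. A sketch of the fallback order only."""
    for source in (user_defaults or {}, generated_defaults or {}):
        if param_name in source:
            return [source[param_name]]
    if param_type == "integer":
        return [random.randint(0, 99999) for _ in range(n)]
    if param_type == "boolean":
        return [True, False]
    # Fall back to random short strings for string-typed parameters.
    alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
    return ["".join(random.choice(alphabet) for _ in range(7)) for _ in range(n)]

vals = resolve_values("hotelId", "string", user_defaults={"hotelId": "SK003"})
rand_ints = resolve_values("limit", "integer")
```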
[0052] At 303, combination generation tool 107 generates a spreadsheet or other file that includes all combinations of test data for each operation (i.e., GET/POST/PUT/DELETE). Any available combination generation tool can be used. In one embodiment, the “Automated Combinatorial Testing for Software” tool provided by the National Institute of Standards and Technology (“NIST”) is used at 303.
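The combination step at 303 can be sketched as an exhaustive Cartesian product. Note this is a stand-in for illustration only: a combinatorial tool such as NIST ACTS would instead emit a much smaller covering array rather than enumerating every combination.

```python
from itertools import product

def all_combinations(param_values):
    """Cartesian product of per-parameter value lists, one dict per test
    case. Exhaustive enumeration shown for illustration; a combinatorial
    testing tool would generate a reduced covering array instead."""
    names = sorted(param_values)
    return [dict(zip(names, combo))
            for combo in product(*(param_values[n] for n in names))]

rows = all_combinations({
    "completed": ["true", "false"],
    "limit": [75357, 33468],
    "hotelId": ["SK003"],
})
```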
[0055] At 304, the analyzer runs a series of tests using the spreadsheet with all data combinations (e.g., spreadsheet 700) output by combination generation tool 107. Further inputs include user expected response codes at 330 from column 702 of spreadsheet 700 and response codes from the REST API specification at 331, which provide a set of expected responses for each request. The test operations use the test data on the host that the user sends as an input. The execution is then validated against the expected status codes passed by the user at 330, or else against the response codes retrieved from the API specification file at 331. Further, common assertions that check for error messages, such as stack overflow and database-related information, are also tested.
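The validation order at 330/331 (user-supplied expected codes first, specification codes as fallback) can be sketched as follows; the function name and result shape are illustrative assumptions.

```python
def validate_response(actual_code, user_expected=None, spec_codes=None):
    """Validate one execution against user-supplied expected status codes,
    falling back to the response codes from the API specification when the
    user supplied none."""
    expected = user_expected if user_expected else (spec_codes or [])
    return {
        "actual": actual_code,
        "expected": expected,
        "passed": actual_code in expected,
    }

r1 = validate_response(200, user_expected=[200, 201])            # user codes win
r2 = validate_response(500, spec_codes=[200, 400, 404])          # spec fallback
```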
[0057] The HTML report 110 with test execution results is then generated.
[0058] More specifically, the analysis at 304 further includes, in some embodiments, checking mandatory and optional parameters when requested by the user: the analyzer checks whether an error is returned if any mandatory parameter is missing. It also removes the optional parameters from the request to check whether the request is still successful. In one embodiment, the user can send the input through a command line argument (e.g., testmandatory=false) while running the tool.
[0060] When the user directs the tool to perform regression testing, the user sends the command line argument (regression=true). Once the first execution is completed, the response codes are stored in the Excel file. From the next execution onwards, in addition to storing the response codes, embodiments also compare them with the response codes of the earlier execution to check for differences in system behavior, and alert the user if there is any difference in the response codes. The same is documented in the HTML reports that are generated at the end of the execution.
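The regression comparison can be sketched as a diff of per-API response codes between two runs; the function name and the API-key format are illustrative assumptions.

```python
def diff_response_codes(previous, current):
    """Compare response codes stored from an earlier run against the current
    run and report the API calls whose behavior changed."""
    changed = {}
    for api, code in current.items():
        if api in previous and previous[api] != code:
            changed[api] = {"previous": previous[api], "current": code}
    return changed

diffs = diff_response_codes(
    previous={"GET /hotels": 200, "POST /hotels": 201},
    current={"GET /hotels": 200, "POST /hotels": 500},
)
```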
[0061] The analysis at 304 further includes, in some embodiments, extracting test data from a GET operation at 305, which is in the form of a JSON response and is then parsed to generate test data that is used for the next round of execution as the generated defaults 310. When the user sends a GET request to the API server, the response received from the server contains information related to the parameters used in the testing. Embodiments parse this response and generate a file which has the parameter name and the values that are present in the API server. These values could be used as test data (i.e., generated defaults 310) for the next round of execution in case the user does not send default values for the parameters. An example of a generated default text file is as follows:
TABLE-US-00004
type: Activity
id: 401803
hotelId: SK003
activityClass: Appointment
activityType: BROCHURE
activityTypeDescription: Brochure Requests
activityPurpose: Purpose is 386
activityOwner: ALL
startDateTime: 2020-05-26 08:45:00.0
endDateTime: 2020-05-26 09:45:00.0
completed: False
createDateTime: 2021-04-08 05:46:51.0
creatorId: VAFIRSTADMIN@VAFIRST
lastModifyDateTime: 2021-04-08 05:46:51.0
lastModifierId: VAFIRSTADMIN@VAFIRST
id: 934358
givenName: Company51400
cityName: Alma
postalCode: 80420
state: CO
code: US
phoneNumber: 9876543210
profileType: Company
primary: True
fullOverlay: False
linkedBlocks:
linkedAttachments:
linkedActivities:
indicatorName: Profiles
count: 1
href: https://www.test.com/act/v1/hotels/SK003/activities/401803?fetchInstructions=ActivityInfo
rel: self
templated: False
method: GET
operations: getActivity
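Producing such a generated-defaults listing from a GET response can be sketched as a flattening of the JSON body into name/value pairs. The helper name and the sample response fragment are illustrative assumptions.

```python
import json

def flatten_response(obj, out=None):
    """Flatten a (possibly nested) JSON GET response into (name, value)
    pairs usable as generated default test data for the next run."""
    if out is None:
        out = []
    if isinstance(obj, dict):
        for key, value in obj.items():
            if isinstance(value, (dict, list)):
                flatten_response(value, out)   # recurse into nested objects
            else:
                out.append((key, value))
    elif isinstance(obj, list):
        for item in obj:
            flatten_response(item, out)
    return out

# Invented fragment of a GET response, for illustration only.
response = json.loads('{"type": "Activity", "id": 401803, "hotelId": "SK003", '
                      '"profile": {"givenName": "Company51400", "state": "CO"}}')
pairs = flatten_response(response)
```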
[0062] The analysis at 304 further includes, in some embodiments, checking the idempotent property of GET and PUT operations. Assuming the environment is controlled, embodiments check whether GET and PUT operations are idempotent. The analyzer sends two consecutive GET or PUT requests and checks whether the same response is received in both cases. The input is two consecutive GET or PUT operations. The output is an HTML report with test comparison results.
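The idempotency check can be sketched as follows. A stub callable stands in for a real HTTP client here; the function name and result shape are illustrative assumptions.

```python
def check_idempotent(send_request, request):
    """Send the same request twice and compare the two responses; GET and
    PUT should be idempotent. `send_request` is any callable returning
    (status_code, body) -- a stub stands in for a real HTTP client."""
    first = send_request(request)
    second = send_request(request)
    return {"idempotent": first == second, "first": first, "second": second}

# Stub server: always returns the same representation for a GET.
result = check_idempotent(
    lambda req: (200, {"hotelId": "SK003"}),
    {"method": "GET", "path": "/hotels/SK003"},
)
```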
[0063] Further, some API specifications do not contain information related to the schema of a POST or a PUT operation. When the tool parses the information from the API specification, the schema entry in the spreadsheet would be empty (e.g., as shown in portions of
[0064] The analysis at 304 further includes, in some embodiments, validating the presence of Hypermedia as the Engine of Application State (“HATEOAS”) links for POST and PUT operations. HATEOAS is a constraint of the REST application architecture that distinguishes it from other network application architectures. With HATEOAS, a client interacts with a network application whose application servers provide information dynamically through hypermedia. A REST client needs little to no prior knowledge about how to interact with an application or server beyond a generic understanding of hypermedia. For a POST operation, a resource is created along with a HATEOAS link in the response. Embodiments validate whether the required HATEOAS link is returned in the response based on the API specification documentation. The input is an API POST request sent to the server together with the server's response. The output is a check for the expected HATEOAS links.
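Collecting HATEOAS-style links from a response body so their presence can be asserted might look like the following sketch; the helper name and the sample response are illustrative assumptions.

```python
def find_hateoas_links(response_body):
    """Collect HATEOAS-style link objects (entries carrying an 'href') from
    a POST/PUT response body, so their presence can be asserted."""
    links = []
    def walk(node):
        if isinstance(node, dict):
            if "href" in node:
                links.append({k: node[k] for k in ("href", "rel", "method") if k in node})
            for value in node.values():
                walk(value)
        elif isinstance(node, list):
            for item in node:
                walk(item)
    walk(response_body)
    return links

# Invented POST response fragment, for illustration only.
post_response = {
    "id": 401803,
    "links": [{"href": "https://www.test.com/act/v1/hotels/SK003/activities/401803",
               "rel": "self", "method": "GET"}],
}
links = find_hateoas_links(post_response)
```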
[0065] The analysis at 304 further includes, in some embodiments, checking for the presence of HTTP security response headers. Using as input the response from an API execution, embodiments validate the presence of security response headers from the API specification such as Strict-Transport-Security, X-Frame-Options, X-XSS-Protection and X-Content-Type-Options. Embodiments can also accommodate custom headers provided by the user. The input is a response from an API execution. The output is a check of the security headers in the responses.
[0066] Further, Cache-Control is an HTTP header used to specify browser caching policies in both client requests and server responses. Policies include how a resource is cached, where it is cached, and its maximum age before expiring (i.e., time to live). In embodiments, Cache-Control is set to no-cache and no-store, which prevents sensitive information from being cached. Therefore, embodiments further validate the presence of the Cache-Control header in the response received from the server, along with the other security headers such as Strict-Transport-Security, X-Frame-Options, X-XSS-Protection and X-Content-Type-Options.
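The security-header validation, including the Cache-Control no-cache/no-store check, can be sketched as follows; the function name and result shape are illustrative assumptions.

```python
REQUIRED_SECURITY_HEADERS = [
    "Strict-Transport-Security",
    "X-Frame-Options",
    "X-XSS-Protection",
    "X-Content-Type-Options",
    "Cache-Control",
]

def check_security_headers(response_headers, custom_headers=()):
    """Report which required (and user-supplied custom) security response
    headers are missing, and whether Cache-Control disables caching.
    Header names are matched case-insensitively, as HTTP requires."""
    headers = {k.lower(): v for k, v in response_headers.items()}
    required = list(REQUIRED_SECURITY_HEADERS) + list(custom_headers)
    missing = [h for h in required if h.lower() not in headers]
    cache = headers.get("cache-control", "")
    return {
        "missing": missing,
        "cache_disabled": "no-cache" in cache and "no-store" in cache,
    }

report = check_security_headers({
    "Strict-Transport-Security": "max-age=31536000",
    "X-Frame-Options": "DENY",
    "X-Content-Type-Options": "nosniff",
    "Cache-Control": "no-cache, no-store",
})
```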
[0068] The analysis at 304 further includes, in some embodiments, comparing the results of two test executions to check the stability of the system. Embodiments compare the results of two consecutive executions and generate an HTML report with the differences in the results. The input is an API execution. The output is an HTML report with a comparison of results.
[0069] The analysis at 304 further includes, in some embodiments, checking for incorrect paths in the URL corresponding to the API. Embodiments add an additional '/' at the end of each URL and check whether the server rejects the incorrect paths in the URL. The input is a URL with an additional '/' at the end. The output is an HTML report with execution results.
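The incorrect-path probe can be sketched in two lines: build the malformed URL, then judge the server's answer. The text does not name the expected rejection code, so treating any 4xx client-error status as a rejection (404 Not Found being typical) is an assumption of this sketch.

```python
def with_extra_slash(url: str) -> str:
    """Append an additional '/' to a URL to probe the server's
    path validation, per the incorrect-path check above."""
    return url + "/"

def incorrect_path_rejected(status_code: int) -> bool:
    """Assumption: the probe passes when the server answers with any
    4xx client-error code; the text does not fix a specific code."""
    return 400 <= status_code < 500

assert with_extra_slash("https://host/api/v1/items") == "https://host/api/v1/items/"
```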
[0070] The analysis at 304 further includes, in some embodiments, checking for a large payload sent by the user. Embodiments check if the server (i.e., the server where the API implementation is hosted) rejects large payloads sent by the user and returns a valid status code. Per the REST standards, when an overly large payload is sent to the API server, the request should be rejected by the server and the response code should be "413 Request Entity Too Large". The size limit can vary from product to product. The user has the flexibility to send large payloads beyond the permissible limits to check the server behavior. This helps in avoiding denial of service attacks on the system under test, in which a hacker sends large payloads to tie up the server. The input is a large payload sent by the user. The output is an HTML report with execution results.
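The large-payload check above pairs an oversized request body with an expectation on the response code. A minimal sketch; the 10 MiB default is an arbitrary assumption, since the text notes that the permissible limit varies from product to product.

```python
def large_payload(size_bytes: int = 10 * 1024 * 1024) -> bytes:
    """Build an oversized request body. The default of 10 MiB is an
    assumption of this sketch; the limit is product-specific."""
    return b"A" * size_bytes

def large_payload_rejected(status_code: int) -> bool:
    """Per the REST convention cited in the text, an overly large
    payload should come back as 413 Request Entity Too Large."""
    return status_code == 413
```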
[0071] The analysis at 304 further includes, in some embodiments, validating the media type header sent by the user. Embodiments send different payload types and determine if the correct response is returned. Embodiments check if the server rejects an invalid media type header sent by the user and returns a valid status code. The input is an invalid media type header as part of the “Media-Type” sent by the user. The output is an HTML report with execution results.
[0072] The analysis at 304 further includes, in some embodiments, validating whether the server satisfies the expected compression format. Embodiments check if the server rejects invalid compression formats sent by the user and returns a valid status code. The input is compression type headers as part of the "Accept-Encoding" header sent by the user. The output is an HTML report with execution results.
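The media-type and compression checks in the two paragraphs above are both negative-header probes: send a header the server should not accept and verify the rejection code. The text asks only for "a valid status code"; the specific codes below (415 Unsupported Media Type for an invalid media type, 406 Not Acceptable for an unsupported Accept-Encoding) are assumptions drawn from HTTP semantics, and the probe values are hypothetical.

```python
# Hypothetical probes: (headers to send, expected rejection code).
NEGATIVE_HEADER_CASES = [
    ({"Content-Type": "application/unknown"}, 415),
    ({"Accept-Encoding": "no-such-encoding"}, 406),
]

def header_test_passed(sent_headers: dict, status_code: int) -> bool:
    """True when the server answered one of the invalid-header probes
    with the expected rejection code for that probe."""
    for headers, expected in NEGATIVE_HEADER_CASES:
        if headers == sent_headers:
            return status_code == expected
    raise ValueError(f"unknown probe: {sent_headers!r}")
```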
[0073] The analysis at 304 further includes, in some embodiments, validating, based on the user input, whether the API operation is in scope for the current testing activity. Embodiments ignore the API operations that are out of scope and consider only the ones that are in scope for the testing activity. The input is the Excel file format of the API specification with the scope information updated by the user. The output is an HTML report with execution results of only the API operations that are in scope.
[0074] Embodiments skip the generation of test data combinations when the parameter count in the API specification is only one. That is, the combinatorial approach is ignored when there is only one parameter in the API specification file for a specific operation. The input is an Excel file format of the API specification with only one parameter for an API operation. The output is an HTML report with execution results.
[0075] Embodiments generate HTML reports for individual APIs. The HTML report 110 includes the PASS/FAIL status of the API call and in case of failure, the error code and the payload that was used in generating the failure scenario. This helps the user in regenerating the same scenario manually.
[0076] Embodiments further perform a verb check test. For example, assume the API specification for an API call supports both GET and POST operations only, so that other verbs such as PUT and DELETE would be considered as invalid. Embodiments attempt to send invalid operations to an API call and check whether the REST API server rejects the requests with a response code of 405.
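The verb check above can be expressed as a table of expectations: for each HTTP verb, either no constraint (the verb is allowed by the specification) or an expected 405 Method Not Allowed. A minimal sketch with a hypothetical endpoint allowing only GET and POST, matching the example in the text:

```python
ALL_VERBS = {"GET", "POST", "PUT", "DELETE", "PATCH"}

def verb_check_expectations(allowed: set) -> dict:
    """Map each HTTP verb to the status the verb check expects:
    None for verbs the specification allows (any normal response),
    405 for every other verb, per the check described above."""
    return {v: (None if v in allowed else 405) for v in ALL_VERBS}

# Endpoint from the text's example: only GET and POST are valid.
expectations = verb_check_expectations({"GET", "POST"})
assert expectations["PUT"] == 405
assert expectations["DELETE"] == 405
assert expectations["GET"] is None
```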
[0077] As disclosed, embodiments solve the problem of validating REST APIs by automatically generating test inputs from the REST specification. Embodiments automate REST API quality assurance tasks.
[0078] Specifically, embodiments automatically check whether 1) the REST GET verb is idempotent; 2) HATEOAS is satisfied; 3) mandatory parameters are enforced by the implementation; and 4) the HTTP return code is consistent with the REST standard.
[0079] Further, embodiments support combinatorial testing of REST APIs. In order to apply combinatorial testing in practice, embodiments automatically extract the different parameters from the given API specification. This avoids the need to manually specify parameters for combinatorial testing, because embodiments extract them automatically from the specification. Embodiments further include a built-in list of default values based on data types. Embodiments further parse the response JSON payload from the server and make use of the output values as inputs for future testing.
[0080] Embodiments provide an option to generate test data from the GET response. Further, embodiments generate random test data if the user does not provide any default data. Embodiments automatically validate the response codes pulled from the API specification. Embodiments test for the HTTP headers in the response by default. Embodiments check for mandatory and optional parameters if the user requests this check.
[0081] Embodiments validate for incorrect URL checks by adding an extra slash (/) at the end of the URL. Embodiments check for the presence of HATEOAS links in the response received for POST operations based on the API specification. Embodiments check for the idempotent feature of GET and PUT operations. Embodiments take into consideration the dependency between the parameters while generating the test data combinations.
[0082] Embodiments validate the response time (i.e., the time between the instant an API is executed and the instant the API returns the result of its computation) of an API call. Embodiments check whether the response time falls within the maximum acceptable limit (in one embodiment, 6500 ms).
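The response-time validation above can be sketched by timing a callable that stands in for the API invocation and comparing against the 6500 ms limit given in the text. The helper name is hypothetical.

```python
import time

MAX_RESPONSE_MS = 6500  # the acceptable limit given in the text

def within_time_limit(call, *args, **kwargs):
    """Time a callable standing in for an API invocation and report
    the result together with whether it finished inside the limit."""
    start = time.monotonic()
    result = call(*args, **kwargs)
    elapsed_ms = (time.monotonic() - start) * 1000.0
    return result, elapsed_ms <= MAX_RESPONSE_MS

# A trivially fast stand-in call comfortably passes the check.
result, ok = within_time_limit(lambda: "200 OK")
assert ok
```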
[0083] Embodiments generate an authentication token which allows users to verify their identity. The user retains access to the API server as long as the token remains valid. The token is generated based on the credentials that the user provides during the invocation of the tool. Below is a sample file containing the information required to generate an authentication token. Embodiments also make sure that the token is regenerated before it expires (3600 ms) to keep the execution in progress.
TABLE-US-00005
url: 'https://wfivm02186.us.oracle.com:14101/ms_oauth/oauth2/endpoints/oauthservice/tokens'
headers: {'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8', 'Authorization': 'Basic *********************'}
payload: {'username': *********, 'password': *********, 'grant_type': 'password'}
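The regenerate-before-expiry behavior described above can be sketched as a small token cache. This is an illustration, not the patent's implementation: `fetch` stands in for the POST to the token endpoint in the sample file, and the safety margin is an assumption (the text specifies only the expiry).

```python
import time

class TokenManager:
    """Keep an authentication token fresh by regenerating it shortly
    before it expires, so execution stays in progress. `fetch` stands
    in for the credential-based POST to the token endpoint."""

    def __init__(self, fetch, lifetime_ms=3600, margin_ms=500):
        self._fetch = fetch
        self._lifetime = lifetime_ms / 1000.0   # expiry from the text
        self._margin = margin_ms / 1000.0       # assumed safety margin
        self._token = None
        self._issued = 0.0

    def token(self):
        """Return a valid token, regenerating it if it is about to
        expire (or has never been fetched)."""
        now = time.monotonic()
        if (self._token is None
                or now - self._issued > self._lifetime - self._margin):
            self._token = self._fetch()
            self._issued = now
        return self._token

# A counting stub shows the cached token is reused while still fresh.
calls = []
mgr = TokenManager(lambda: calls.append(1) or f"tok-{len(calls)}")
assert mgr.token() == "tok-1"
assert mgr.token() == "tok-1"  # still fresh, not refetched
assert len(calls) == 1
```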
[0084] The features, structures, or characteristics of the disclosure described throughout this specification may be combined in any suitable manner in one or more embodiments. For example, the usage of “one embodiment,” “some embodiments,” “certain embodiment,” “certain embodiments,” or other similar language, throughout this specification refers to the fact that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “one embodiment,” “some embodiments,” “a certain embodiment,” “certain embodiments,” or other similar language, throughout this specification do not necessarily all refer to the same group of embodiments, and the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0085] One having ordinary skill in the art will readily understand that the embodiments as discussed above may be practiced with steps in a different order, and/or with elements in configurations that are different than those which are disclosed. Therefore, although this disclosure considers the outlined embodiments, it would be apparent to those of skill in the art that certain modifications, variations, and alternative constructions would be apparent, while remaining within the spirit and scope of this disclosure. In order to determine the metes and bounds of the disclosure, therefore, reference should be made to the appended claims.