CN107402878B - Test method and device - Google Patents



Publication number
CN107402878B
CN107402878B (application number CN201610339412.7A)
Authority
CN
China
Prior art keywords
response
expected
test
engine
answer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610339412.7A
Other languages
Chinese (zh)
Other versions
CN107402878A (en)
Inventor
王昌
范亚平
孙胜方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201610339412.7A
Publication of CN107402878A
Application granted
Publication of CN107402878B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The application discloses a testing method and a testing device. In one embodiment, the method comprises: reading, from response logic information of a to-be-tested question classification of a response engine, at least one expected answer of the classification and the response rule associated with each expected answer; extracting the parameters involved in each response rule of the classification to form a parameter set influencing the answers of the classification; and, for each expected answer, assigning values to the parameter set according to the response rule associated with that answer, then taking the assigned parameter set as the input parameters and the expected answer as the expected result to generate a test case for testing the response engine. The embodiment realizes automatic generation of test cases for the response engine.

Description

Test method and device
Technical Field
The application relates to the field of computer technology, in particular to the field of intelligent robots, and specifically to a testing method and a testing apparatus.
Background
An automatic answering robot system needs to provide different answers in different situations: when the conditions influencing an answer differ, so does the result. To verify answer accuracy in practice, it must be tested whether the robot application, under a specific scenario and set of conditions, returns the correct answer according to the configured response logic.
In the prior art, specific question-and-answer test data are usually designed manually according to the configured response logic, input into the question-and-answer interface, processed by the response logic, and the actual answers are then compared directly with the expected answers. However, manually organizing test cases is time-consuming: the various factors influencing the answers, such as interfaces and question keywords, must all be analyzed, and whenever the response logic changes it must be re-analyzed and the corresponding test cases updated.
Disclosure of Invention
It is an object of the present application to provide an improved testing method and apparatus to solve the technical problems mentioned in the background section above.
In a first aspect, the present application provides a testing method, the method comprising: reading, from response logic information of a to-be-tested question classification of a response engine, at least one expected answer of the classification and the response rule associated with each expected answer; extracting the parameters involved in each response rule of the classification to form a parameter set influencing the answers of the classification; and, for each expected answer, assigning values to the parameter set according to the response rule associated with that answer, then taking the assigned parameter set as the input parameters and the expected answer as the expected result to generate a test case for testing the response engine.
In some embodiments, the response logic information is stored in a binary tree structure, wherein each non-leaf node of the binary tree is used for recording a value condition of a single parameter, and each leaf node of the binary tree is used for recording an expected answer.
In some embodiments, reading at least one expected answer of the question classification to be tested and the response rule associated with each expected answer from the response logic information of the question classification to be tested of a response engine includes: traversing the binary tree and determining the information recorded by each leaf node as an expected answer; and, for each leaf node, ascending layer by layer from that leaf node through its parent nodes to the root node, reading the value condition of the single parameter recorded in each parent node, and aggregating the read value conditions to form the response rule associated with the expected answer recorded by that leaf node.
In some embodiments, the method further comprises: inputting the input parameters in the test case to the response engine for processing; and comparing the actual result output after the processing of the response engine with the expected result associated with the input parameter in the test case to generate a test result.
In some embodiments, the method further comprises: inputting the input parameters in the test cases into a test engine for processing, wherein the test engine is generated by modifying an input interface in the response engine in advance based on the format of the input parameters in the test cases; and comparing the actual result output after the processing of the test engine with the expected result associated with the input parameter in the test case to generate a test result.
In a second aspect, the present application provides a test apparatus, the apparatus comprising: a reading unit for reading, from response logic information of a to-be-tested question classification of a response engine, at least one expected answer of the classification and the response rule associated with each expected answer; an extraction unit for extracting the parameters involved in each response rule of the classification to form a parameter set influencing the answers of the classification; and a generating unit for assigning, for each expected answer, values to the parameter set according to the response rule associated with that answer, and generating a test case for testing the response engine by taking the assigned parameter set as the input parameters and the expected answer as the expected result.
In some embodiments, the response logic information is stored in a binary tree structure, wherein each non-leaf node of the binary tree is used for recording a value condition of a single parameter, and each leaf node of the binary tree is used for recording an expected answer.
In some embodiments, the reading unit is further configured to: traverse the binary tree and determine the information recorded by each leaf node as an expected answer; and, for each leaf node, ascend layer by layer from that leaf node through its parent nodes to the root node, read the value condition of the single parameter recorded in each parent node, and aggregate the read value conditions to form the response rule associated with the expected answer recorded by that leaf node.
In some embodiments, the apparatus further comprises: the first input unit is used for inputting the input parameters in the test cases into the response engine for processing; and the first comparison unit is used for comparing the actual result output after the processing of the response engine with the expected result associated with the input parameter in the test case to generate a test result.
In some embodiments, the apparatus further comprises: the second input unit is used for inputting the input parameters in the test cases into a test engine for processing, wherein the test engine is generated by modifying an input interface in the response engine in advance based on the format of the input parameters in the test cases; and the second comparison unit is used for comparing the actual result output after the processing of the test engine with the expected result associated with the input parameter in the test case to generate a test result.
According to the test method and apparatus of the application, the parameters influencing the answer logic are extracted and combined with the expected answers in a fixed-parameter manner to generate test cases, so that test cases can be generated automatically and the generated test cases completely and accurately reflect the answer logic of the answer engine.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a testing method according to the present application;
FIG. 3 is a diagram illustrating a data storage structure in the response logic information in the embodiment corresponding to FIG. 2;
FIG. 4 is a flow diagram of yet another embodiment of a testing method according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of a test apparatus according to the present application;
fig. 6 is a schematic structural diagram of a computer system suitable for implementing the terminal device or the server according to the embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the testing method or testing apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages such as test requests. The terminal devices 101, 102, 103 may have installed thereon various communication client applications, such as a web browser application, a software testing application, and the like.
The terminal devices 101, 102, 103 may be various electronic devices, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III, mpeg compression standard Audio Layer 3), MP4 players (Moving Picture Experts Group Audio Layer IV, mpeg compression standard Audio Layer 4), laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background server providing support for applications displayed on the terminal devices 101, 102, 103. The background server may process data such as the received test request, and feed back a processing result (e.g., a test result) to the terminal device.
It should be noted that the testing method provided in the embodiment of the present application is generally executed by the server 105, and accordingly, the testing apparatus is generally disposed in the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a testing method according to the present application is shown. The testing method comprises the following steps:
step 201, reading at least one expected answer of the to-be-tested question classification and response rules respectively associated with each response rule from response logic information of the to-be-tested question classification of the response engine.
In this embodiment, the electronic device may read response logic information corresponding to the response engine. The answer logic information is information which is stored in advance and is used for describing answer logic of an answer engine. In particular, the answer logic information may include individual answer rules and individual expected answers involved in the answer logic. The prospective rules and associated prospective answers may be stored in association in some manner. When the electronic device reads the response logic information, the electronic device may use a reading mode corresponding to the associated storage mode to read the information, that is, each expected answer and the response rule respectively associated with each response rule may be obtained. The answer rule may include parameters to which the expected answer is constrained and value conditions corresponding to the parameters. Optionally, the operation in step 201 may be triggered by the electronic device after responding to a request received from the outside to generate a test case of the response engine.
Fig. 3 is a schematic diagram of a data storage structure in the response logic information. The question classification to be tested has 3 expected answers in total: answer 1, answer 2 and answer 3. The response rule associated with answer 1 is that parameter 1 equals 1; with answer 2, that parameter 1 equals 2; and with answer 3, that parameter 2 is null.
In some optional implementations of this embodiment, the response logic information is stored in a binary tree structure. Each non-leaf node of the binary tree records the value condition of a single parameter, and each leaf node records an expected answer. Optionally, a leaf node may record only an identifier of the expected answer, with the answer's full descriptive content stored elsewhere in association with that identifier. Storing only identifiers lets the binary tree occupy less storage space, lets subsequent processing operate on identifiers alone for better efficiency, and keeps the generated test cases smaller.
In some optional implementations of the foregoing, step 201 may be carried out as follows: first, traverse the nodes of the binary tree storing the response logic information and determine the information recorded by each leaf node as an expected answer; then, for each leaf node, ascend layer by layer from that leaf node through its parent nodes to the root node, read the value condition of the single parameter recorded in each parent node, and aggregate the read value conditions to obtain the response rule associated with the expected answer recorded by that leaf node.
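The traversal just described can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation: the `Node` structure, the `yes`/`no` branch naming, and the convention that a parent's condition joins the rule only when the path follows its "yes" branch are all assumptions chosen to match the Fig. 3 example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """Node of the response-logic binary tree (illustrative).

    A non-leaf node records the value condition of a single parameter;
    a leaf node records the identifier of an expected answer.
    """
    condition: Optional[tuple] = None   # (parameter, value) on a non-leaf node
    answer_id: Optional[str] = None     # set only on a leaf node
    parent: Optional["Node"] = None
    yes: Optional["Node"] = None        # branch taken when the condition holds
    no: Optional["Node"] = None         # branch taken when it does not

def attach(parent, yes=None, no=None):
    """Wire up children and their back-pointers to the parent."""
    parent.yes, parent.no = yes, no
    for child in (yes, no):
        if child is not None:
            child.parent = parent
    return parent

def read_rules(root):
    """Traverse the tree; for each leaf, climb parent by parent to the
    root and aggregate the conditions on the path into a response rule.
    A condition is included only when the path follows its 'yes' branch
    (an assumption, made so the result matches the Fig. 3 example)."""
    rules = {}
    stack = [root]
    while stack:
        node = stack.pop()
        if node.answer_id is not None:           # leaf: an expected answer
            rule, child = {}, node
            while child.parent is not None:      # climb to the root
                parent = child.parent
                if parent.yes is child:          # condition held on this path
                    name, value = parent.condition
                    rule[name] = value
                child = parent
            rules[node.answer_id] = rule
        else:
            stack.extend(c for c in (node.yes, node.no) if c is not None)
    return rules

# Tree for the Fig. 3 example: answer 1 if parameter 1 equals 1,
# answer 2 if parameter 1 equals 2, answer 3 if parameter 2 is null.
root = attach(Node(condition=("parameter1", 1)),
              yes=Node(answer_id="1"),
              no=attach(Node(condition=("parameter1", 2)),
                        yes=Node(answer_id="2"),
                        no=attach(Node(condition=("parameter2", None)),
                                  yes=Node(answer_id="3"))))

# Maps each expected-answer identifier to its aggregated response rule.
print(read_rules(root))
```

Climbing from the leaf rather than recording the path downward mirrors the patent's "access the parent node layer by layer up to the root node" description.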
Step 202, extracting parameters related to each response rule of the to-be-tested question classification to form a parameter set influencing answers of the to-be-tested question classification.
In this embodiment, the parameters involved in the response rules obtained in step 201 influence the final answer of the to-be-tested question classification, so the electronic device (for example, the server shown in fig. 1) may extract these parameters; the extracted parameters form a parameter set influencing the answers of the classification.
Taking the response logic information schematically depicted in fig. 3 as an example, the parameters related to each response rule include parameter 1 and parameter 2. That is, parameters 1 and 2 constitute a set of parameters that affect the answers to the classification of the question to be tested.
Step 203, for each expected answer, assigning values to the parameter set according to the response rule associated with that answer, then taking the assigned parameter set as the input parameters and the expected answer as the expected result to generate a test case for testing the answer engine.
In this embodiment, for each expected answer obtained in step 201, the electronic device may first assign values to the parameters in the parameter set obtained in step 202 according to the response rule associated with that expected answer. A given rule may constrain only some parameters in the set, leaving others unconstrained. During assignment, the constrained parameters are assigned according to their value conditions, while the unconstrained ones may be assigned randomly or according to a preset rule. Note that a parameter's value condition may also be that the parameter is null.
The electronic device may then take the assigned parameter set as the input parameters, take the expected answer as the expected result, and store the two in association to generate a test case. The generated test case can be used to test the response engine.
Continuing with the example of the response logic information depicted schematically in fig. 3, parameters 1 and 2 form the parameter set influencing the answers of the to-be-tested question classification. For expected answer 1, the associated response rule is that parameter 1 equals 1, so parameter 1 is assigned 1; parameter 2, which the rule does not constrain, may be assigned an arbitrary value such as 2. The assigned parameter set (parameter 1 = 1, parameter 2 = 2) then serves as the input parameters, and the identifier of answer 1 (e.g., 1) serves as the expected result, to construct a test case. Similarly, for answer 2 the generated test case may have input parameters parameter 1 = 2, parameter 2 = 1 and expected result 2; for answer 3, input parameters parameter 1 = 3, parameter 2 = null and expected result 3.
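Steps 202 and 203 can be sketched as follows for the Fig. 3 example. The function names, the dictionary representation of rules, and the default value given to unconstrained parameters are illustrative choices; the patent only says such parameters may be assigned randomly or by a preset rule.

```python
def extract_parameter_set(rules):
    """Step 202: collect every parameter appearing in any response rule
    into the set of parameters that influence the answer."""
    params = set()
    for rule in rules.values():
        params.update(rule)
    return params

def generate_test_cases(rules, default=1):
    """Step 203: for each expected answer, assign the constrained
    parameters from its rule, give the unconstrained ones an arbitrary
    default, and pair the assigned set with the expected answer."""
    param_set = extract_parameter_set(rules)
    cases = []
    for answer_id, rule in rules.items():
        inputs = {p: rule.get(p, default) for p in param_set}
        cases.append({"input": inputs, "expected": answer_id})
    return cases

# Response rules read from the Fig. 3 example, keyed by answer identifier.
rules = {"1": {"parameter1": 1},
         "2": {"parameter1": 2},
         "3": {"parameter2": None}}

for case in generate_test_cases(rules):
    print(case)
```

Note that a null value condition (answer 3's "parameter 2 is null") is represented here as `None` and survives into the generated input parameters, matching the patent's remark that a value condition may be that a parameter is null.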
In some optional implementations of this embodiment, after step 203, the testing method further includes: inputting input parameters in the test case into a response engine for processing; and comparing the actual result output after the processing of the response engine with the expected result associated with the input parameter in the test case to generate a test result. In this implementation manner, for the test case generated in step 203, the electronic device may first input an input parameter in the test case to the response engine to trigger the response engine to execute corresponding processing; then, the electronic device can obtain the actual result output after the response engine processes; then, the electronic device may compare the actual result with an expected result associated with the input parameter in the test case, so as to generate a test result according to the comparison result. Generally, when the two are identical, the test is passed, otherwise the test is not passed. It should be noted that, when the expected result in the test case is the identifier of the expected answer, the identifiers of the two may be compared when the comparison is performed.
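This test-execution step can be pictured with a toy stand-in for the response engine. The real engine's interface is not specified in the text; `toy_engine`, the result fields, and the dict-based input are all illustrative assumptions.

```python
def run_test_cases(engine, cases):
    """Feed each test case's input parameters to the engine and compare
    the actual answer identifier with the expected result."""
    results = []
    for case in cases:
        actual = engine(case["input"])
        results.append({"case": case,
                        "actual": actual,
                        "passed": actual == case["expected"]})
    return results

def toy_engine(params):
    """Stand-in response engine implementing the Fig. 3 logic."""
    if params.get("parameter1") == 1:
        return "1"
    if params.get("parameter1") == 2:
        return "2"
    if params.get("parameter2") is None:
        return "3"
    return "no-answer"

cases = [{"input": {"parameter1": 1, "parameter2": 1}, "expected": "1"},
         {"input": {"parameter1": 2, "parameter2": 1}, "expected": "2"},
         {"input": {"parameter1": 3, "parameter2": None}, "expected": "3"}]

results = run_test_cases(toy_engine, cases)
print(sum(r["passed"] for r in results), "of", len(results), "cases passed")
```

As the text notes, when the expected result is stored as an identifier, the comparison is between identifiers, which is exactly the equality check on `actual` above.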
According to the method provided by the embodiment of the application, the parameters influencing the answer logic are extracted, and the test case is generated by combining the expected answer in a fixed parameter mode, so that the test case can be automatically generated, and the generated test case can completely and accurately reflect the answer logic of the answer engine.
With further reference to fig. 4, a flow 400 of yet another embodiment of a testing method is shown. The process 400 of the test method includes the following steps:
step 401, reading at least one expected answer of the question classification to be tested and response rules respectively associated with each response rule from response logic information of the question classification to be tested of the response engine.
In this embodiment, the specific processing of step 401 may refer to step 201 in the corresponding embodiment of fig. 2, which is not described herein again.
Step 402, extracting parameters related to each response rule of the to-be-tested question classification to form a parameter set influencing answers of the to-be-tested question classification.
In this embodiment, the specific processing of step 402 may refer to step 202 in the embodiment corresponding to fig. 2, which is not described herein again.
Step 403, assigning values to the parameter set according to the response rule associated with each expected answer, and generating a test case for testing the answer engine by taking the assigned parameter set as the input parameters and the expected answer as the expected result.
In this embodiment, the specific processing in step 403 may refer to step 203 in the embodiment corresponding to fig. 2, which is not described herein again.
Step 404, input parameters in the test case are input to the test engine for processing.
In this embodiment, based on the parameter format of the input parameters in the generated test cases, the input interface of the answer engine may be modified in advance to produce a test engine capable of receiving a test case's input parameters directly. The electronic device can then input the input parameters of the test case generated in step 403 into the test engine, so that the test engine can process them smoothly. Optionally, the operation of step 404 may be triggered by the electronic device in response to a test request issued by a user.
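One way to picture the interface modification is below. All class and method names are illustrative, and the trivial question parser is a stand-in; the patent does not specify how the input interface is modified, only that the resulting test engine accepts the test case's input parameters directly while the response logic is reused unchanged.

```python
class ResponseEngine:
    """Minimal stand-in for a response engine whose production
    interface takes a free-text question."""

    def answer(self, question: str) -> str:
        params = self._parse(question)      # production input interface
        return self._respond(params)

    def _parse(self, question: str) -> dict:
        # Trivial stand-in for keyword/parameter extraction.
        return {"parameter1": 1 if "one" in question else 2,
                "parameter2": 1}

    def _respond(self, params: dict) -> str:
        # The response logic of the Fig. 3 example.
        if params.get("parameter1") == 1:
            return "1"
        if params.get("parameter1") == 2:
            return "2"
        return "3" if params.get("parameter2") is None else "no-answer"

class ModifiedInputEngine(ResponseEngine):
    """The 'test engine': the input interface is replaced so that a
    test case's parameter dict is accepted directly, while the response
    logic under test is inherited unchanged."""

    def answer(self, input_params: dict) -> str:
        return self._respond(input_params)  # bypass question parsing

engine = ModifiedInputEngine()
print(engine.answer({"parameter1": 2, "parameter2": 1}))
```

Only the entry point changes; because `_respond` is shared, a test case that passes against the test engine exercises the same response logic as the production path.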
Step 405, comparing the actual result output after the processing by the test engine with the expected result associated with the input parameter in the test case to generate a test result.
In this embodiment, the test engine may output an actual result after processing according to the input parameter, and the electronic device may obtain the actual result and compare the actual result with an expected result associated with the input parameter in the test case, so as to determine the test result according to the comparison result.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the flow 400 of the testing method in this embodiment tests a test engine formed by modifying the interface of the response engine, which solves the problem that some response engines cannot accept input in parameter form and improves the generality of the testing scheme.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of a testing apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which can be applied to various electronic devices.
As shown in fig. 5, the testing apparatus 500 of the present embodiment includes: a reading unit 501, an extraction unit 502 and a generation unit 503. The reading unit 501 is configured to read, from response logic information of a to-be-tested question classification of a response engine, at least one expected answer of the classification and the response rule associated with each expected answer; the extracting unit 502 is configured to extract the parameters involved in each response rule of the classification to form a parameter set influencing the answers of the classification; the generating unit 503 is configured to assign, for each expected answer, values to the parameter set according to the response rule associated with that answer, take the assigned parameter set as the input parameters, and generate a test case for testing the answer engine with the expected answer as the expected result.
In this embodiment, the specific processing of the reading unit 501, the extracting unit 502 and the generating unit 503 of the testing apparatus 500 may refer to step 201, step 202 and step 203 of the corresponding embodiment in fig. 2, which is not described herein again.
In some optional implementations of this embodiment, the response logic information is stored in a structure of a binary tree, where each non-leaf node of the binary tree is used to record a value condition of a single parameter, and each leaf node of the binary tree is used to record an expected answer. The specific processing of this implementation may refer to a corresponding implementation in the corresponding embodiment of fig. 2, which is not described herein again.
In some optional implementations of the present embodiment, the reading unit 501 is further configured to: traversing the binary tree, and respectively determining the information recorded by each leaf node in the binary tree as expected answers; and aiming at each leaf node in the binary tree, accessing the father node from the leaf node to the root node layer by layer, reading the value conditions of the single parameters recorded in each father node, and aggregating the read value conditions of the single parameters to form a response rule associated with the expected answer recorded by the leaf node. The specific processing of this implementation may refer to a corresponding implementation in the corresponding embodiment of fig. 2, which is not described herein again.
In some optional implementations of the present embodiment, the testing apparatus 500 further includes: a first input unit (not shown) for inputting the input parameters in the test case to the response engine for processing; and the first comparison unit (not shown) is used for comparing the actual result output after the processing of the response engine with the expected result associated with the input parameter in the test case to generate a test result. The specific processing of this implementation may refer to a corresponding implementation in the corresponding embodiment of fig. 2, which is not described herein again.
In some optional implementations of the present embodiment, the testing apparatus 500 further includes: a second input unit (not shown) for inputting the input parameters in the test case to the test engine for processing, wherein the test engine is generated by modifying the input interface in the response engine in advance based on the format of the input parameters in the test case; and a second comparing unit (not shown) for comparing the actual result output after being processed by the test engine with the expected result associated with the input parameter in the test case to generate a test result. The specific processing of this implementation may refer to the embodiment corresponding to fig. 4, and is not described here again.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a terminal device or server of an embodiment of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. The driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 610 as necessary, so that a computer program read out therefrom is mounted in the storage section 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor including a reading unit, an extracting unit, and a generating unit. The names of these units do not, in some cases, limit the units themselves; for example, the reading unit may also be described as a "unit that reads at least one expected answer of the to-be-tested question classification, and the response rules respectively associated with the expected answers, from the response logic information of the to-be-tested question classification of the response engine".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus of the above-described embodiments, or may be a non-volatile computer storage medium that exists separately and is not incorporated into the terminal. The non-volatile computer storage medium stores one or more programs that, when executed by a device, cause the device to: read at least one expected answer of a to-be-tested question classification, and the response rules respectively associated with the expected answers, from response logic information of the to-be-tested question classification of a response engine; extract the parameters involved in each response rule of the to-be-tested question classification to form a parameter set influencing the answers of the to-be-tested question classification; and, for each expected answer, assign values to the parameter set according to the response rule associated with that expected answer, and use the assigned parameter set as the input parameters and the expected answer as the expected result to generate a test case for testing the response engine.
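The steps above can be sketched in code. The following is a minimal, hypothetical illustration, assuming (per the embodiments) that the response logic is stored as a binary tree in which each non-leaf node records a value condition on a single parameter and each leaf node records an expected answer; all class and function names, and the "!" negation placeholder, are illustrative and not taken from the patent.

```python
class Node:
    """One node of the response-logic binary tree."""

    def __init__(self, condition=None, answer=None):
        self.condition = condition  # (parameter_name, value) at non-leaf nodes
        self.answer = answer        # expected answer at leaf nodes
        self.parent = None
        self.left = None            # branch taken when the condition holds
        self.right = None           # branch taken when it does not

    def attach(self, left=None, right=None):
        self.left, self.right = left, right
        for child in (left, right):
            if child is not None:
                child.parent = self
        return self

def leaves(node):
    """Traverse the tree, yielding every leaf (expected-answer) node."""
    if node.answer is not None:
        yield node
        return
    for child in (node.left, node.right):
        if child is not None:
            yield from leaves(child)

def generate_test_cases(root):
    """For each leaf, access parent nodes layer by layer up to the root,
    aggregating the single-parameter value conditions into the response
    rule; assigning those values to the parameter set gives the input
    parameters, and the leaf's answer becomes the expected result."""
    cases = []
    for leaf in leaves(root):
        params, node = {}, leaf
        while node.parent is not None:
            name, value = node.parent.condition
            # "!" marks "any value other than `value`" (a simplification)
            params[name] = value if node.parent.left is node else "!" + value
            node = node.parent
        cases.append({"input": params, "expected": leaf.answer})
    return cases
```

For example, a two-level tree over conditions ("member", "yes") and ("in_stock", "yes") yields one test case per leaf; feeding each case's input parameters to the engine and comparing the actual output with the case's expected result then corresponds to the comparison step described in the claims.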
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. A method of testing, the method comprising:
reading at least one expected answer of a to-be-tested question classification, and response rules respectively associated with the expected answers, from response logic information of the to-be-tested question classification of a response engine;
extracting parameters related to each response rule of the to-be-tested question classification to form a parameter set influencing answers of the to-be-tested question classification;
and for each expected answer, assigning values to the parameter set according to the response rule associated with the expected answer, and using the assigned parameter set as an input parameter and the expected answer as an expected result to generate a test case for testing the response engine, wherein the response engine is a test engine modified in advance so as to be able to receive the input parameter in the test case.
2. The method according to claim 1, wherein the response logic information is stored in a structure of a binary tree, wherein each non-leaf node of the binary tree is used for recording a value condition of a single parameter, and each leaf node of the binary tree is used for recording an expected answer.
3. The method of claim 2, wherein reading at least one expected answer of the to-be-tested question classification and the response rules respectively associated with the expected answers from the response logic information of the to-be-tested question classification of the response engine comprises:
traversing the binary tree, and respectively determining information recorded by each leaf node in the binary tree as expected answers;
and for each leaf node in the binary tree, accessing parent nodes layer by layer from the leaf node up to the root node, reading the value condition of the single parameter recorded in each parent node, and aggregating the read value conditions of the single parameters to form the response rule associated with the expected answer recorded by the leaf node.
4. The method according to one of claims 1 to 3, characterized in that the method further comprises:
inputting the input parameters in the test case to the response engine for processing;
and comparing the actual result output after the processing of the response engine with the expected result associated with the input parameter in the test case to generate a test result.
5. The method according to one of claims 1 to 3, characterized in that the method further comprises:
inputting the input parameters in the test cases into a test engine for processing, wherein the test engine is generated by modifying an input interface in the response engine in advance based on the format of the input parameters in the test cases;
and comparing the actual result output after the processing of the test engine with the expected result associated with the input parameter in the test case to generate a test result.
6. A test apparatus, the apparatus comprising:
the reading unit is used for reading at least one expected answer of the to-be-tested question classification, and response rules respectively associated with the expected answers, from response logic information of the to-be-tested question classification of a response engine;
the extraction unit is used for extracting parameters related to each response rule of the to-be-tested question classification to form a parameter set influencing answers of the to-be-tested question classification;
and the generating unit is used for, for each expected answer, assigning values to the parameter set according to the response rule associated with the expected answer, and using the assigned parameter set as an input parameter and the expected answer as an expected result to generate a test case for testing the response engine, wherein the response engine is a test engine modified in advance so as to be able to receive the input parameters in the test case.
7. The apparatus according to claim 6, wherein the response logic information is stored in a structure of a binary tree, wherein each non-leaf node of the binary tree is used for recording a value condition of a single parameter, and each leaf node of the binary tree is used for recording an expected answer.
8. The apparatus of claim 7, wherein the reading unit is further configured to:
traversing the binary tree, and respectively determining information recorded by each leaf node in the binary tree as expected answers;
and for each leaf node in the binary tree, accessing parent nodes layer by layer from the leaf node up to the root node, reading the value condition of the single parameter recorded in each parent node, and aggregating the read value conditions of the single parameters to form the response rule associated with the expected answer recorded by the leaf node.
9. The apparatus according to any one of claims 6-8, wherein the apparatus further comprises:
the first input unit is used for inputting the input parameters in the test cases into the response engine for processing;
and the first comparison unit is used for comparing the actual result output after the processing of the response engine with the expected result associated with the input parameter in the test case to generate a test result.
10. The apparatus according to any one of claims 6-8, wherein the apparatus further comprises:
the second input unit is used for inputting the input parameters in the test cases into a test engine for processing, wherein the test engine is generated by modifying an input interface in the response engine in advance based on the format of the input parameters in the test cases;
and the second comparison unit is used for comparing the actual result output after the processing of the test engine with the expected result associated with the input parameter in the test case to generate a test result.
11. A test apparatus, comprising:
a memory; and a processor coupled to the memory, the processor configured to perform the test method of any of claims 1-5 based on instructions stored in the memory.
12. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement a testing method according to any one of claims 1 to 5.
CN201610339412.7A 2016-05-19 2016-05-19 Test method and device Active CN107402878B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610339412.7A CN107402878B (en) 2016-05-19 2016-05-19 Test method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610339412.7A CN107402878B (en) 2016-05-19 2016-05-19 Test method and device

Publications (2)

Publication Number Publication Date
CN107402878A CN107402878A (en) 2017-11-28
CN107402878B true CN107402878B (en) 2020-09-01

Family

ID=60389150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610339412.7A Active CN107402878B (en) 2016-05-19 2016-05-19 Test method and device

Country Status (1)

Country Link
CN (1) CN107402878B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110109828A (en) * 2019-04-15 2019-08-09 深圳壹账通智能科技有限公司 Question and answer interface test method, device, computer equipment and storage medium
CN111988195B (en) * 2019-05-24 2023-04-07 北京京东尚科信息技术有限公司 Response scheme determination method, device, equipment and medium for packet test
CN112308571B (en) * 2019-07-15 2024-09-24 北京汇钧科技有限公司 Intelligent customer service response method and device, storage medium and electronic equipment
CN112286787A (en) * 2020-06-29 2021-01-29 北京京东尚科信息技术有限公司 Automatic test method and system for intelligent response system and electronic equipment
CN112182044A (en) * 2020-11-10 2021-01-05 平安普惠企业管理有限公司 Rule engine testing method and device and computer equipment
CN113760744A (en) * 2021-04-29 2021-12-07 腾讯科技(深圳)有限公司 Detection method and device for conversation robot, electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102368246A (en) * 2011-09-15 2012-03-07 张德长 Automatic-answer robot system
CN103176896A (en) * 2011-12-23 2013-06-26 阿里巴巴集团控股有限公司 Generating method and generating device of test cases
CN103186457A (en) * 2011-12-29 2013-07-03 阿里巴巴集团控股有限公司 Method and device for automatically generating test case
US8560948B2 (en) * 2005-12-23 2013-10-15 Michael Hu User support system integrating FAQ and helpdesk features and FAQ maintenance capabilities
CN104731895A (en) * 2015-03-18 2015-06-24 北京京东尚科信息技术有限公司 Auto-answer method and device
CN104809062A (en) * 2015-04-22 2015-07-29 北京京东尚科信息技术有限公司 Test method and system of artificial intelligence answering system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on the Automatic Construction of Knowledge Bases for Chinese Question Answering Systems; Li Zhenxian; China Masters' Theses Full-text Database, Information Science and Technology; 2015-12-15 (No. 12); pp. I138-945 *

Also Published As

Publication number Publication date
CN107402878A (en) 2017-11-28

Similar Documents

Publication Publication Date Title
CN107402878B (en) Test method and device
CN108830235B (en) Method and apparatus for generating information
CN108805091B (en) Method and apparatus for generating a model
CN109976995B (en) Method and apparatus for testing
CN110019263B (en) Information storage method and device
CN110737726B (en) Method and device for determining test data of interface to be tested
CN110764760B (en) Method, apparatus, computer system, and medium for drawing program flow chart
CN110084317B (en) Method and device for recognizing images
CN111813685B (en) Automatic test method and device
CN109255035B (en) Method and device for constructing knowledge graph
CN107305528B (en) Application testing method and device
CN117278434A (en) Flow playback method and device and electronic equipment
CN107368407B (en) Information processing method and device
US10880604B2 (en) Filter and prevent sharing of videos
CN113495498B (en) Simulation method, simulator, device and medium for hardware device
CN112579428B (en) Interface testing method, device, electronic equipment and storage medium
CN104363237A (en) Method and system for processing internet media resource metadata
CN109617708B (en) Compression method, device and system for embedded point log
CN106294700A (en) The storage of a kind of daily record and read method and device
US20220191345A1 (en) System and method for determining compression rates for images comprising text
CN112308074B (en) Method and device for generating thumbnail
CN111460273B (en) Information pushing method and device
CN111414566B (en) Method and device for pushing information
CN111259194B (en) Method and apparatus for determining duplicate video
CN108984426B (en) Method and apparatus for processing data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant