CN116402354A - Evaluation parameter determining method and device, medium and electronic equipment - Google Patents
- Publication number: CN116402354A
- Application number: CN202310671199.XA
- Authority: CN (China)
- Prior art keywords: feature, data, model, original, feature data
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06Q10/0635: Risk analysis of enterprise or organisation activities (G06Q10/06 Resources, workflows, human or project management; enterprise or organisation planning; G06Q10/063 Operations research, analysis or management)
- G06N20/00: Machine learning (G06N Computing arrangements based on specific computational models)
- G06Q50/10: Services (G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors)
- Y02P90/30: Computing systems specially adapted for manufacturing (Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation)
Abstract
The disclosure provides an evaluation parameter determining method and device, a medium, and an electronic device, and relates to the field of computer technology. The evaluation parameter determining method includes: acquiring first original feature data of an object to be evaluated; training a feature construction model with the first original feature data until the feature construction model converges; obtaining model parameters of the converged feature construction model to obtain target feature data of the object to be evaluated; and determining an evaluation parameter of the object to be evaluated according to the target feature data and an evaluation parameter determination model, the evaluation parameter characterizing the credibility of the object to be evaluated. The disclosure thus provides a scheme that constructs feature data automatically, which improves the accuracy and reliability of the determined evaluation parameters.
Description
Technical Field
The disclosure relates to the field of computer technology, and in particular, to an evaluation parameter determining method, an evaluation parameter determining device, a computer readable storage medium and an electronic device.
Background
In many service industries, a target object is subjected to credibility evaluation so that the service provider can determine a more reliable service strategy from the evaluation result.
In the related art, credibility evaluation of a target object can be implemented with a machine learning model: sample original feature data of a sample target object is acquired and processed to obtain sample target feature data, and the sample target feature data is then used to train an evaluation parameter determination model. When a risk assessment is needed, the pre-trained evaluation parameter determination model determines the evaluation parameters of the target object, yielding credibility information about the target object.
However, in such machine-learning-based schemes the processing of the sample original feature data is usually performed manually. Defects in the sample original feature data then lead to low accuracy of the resulting sample target feature data and reduce the reliability of the trained evaluation parameter determination model, which in turn affects the accuracy and reliability of the obtained evaluation parameters.
Disclosure of Invention
The disclosure provides an evaluation parameter determining method, an evaluation parameter determining device, a medium and electronic equipment, so that the accuracy of the determined evaluation parameters of an object to be evaluated is improved.
According to a first aspect of the present disclosure, there is provided an evaluation parameter determining method including:
Acquiring first original characteristic data of an object to be evaluated;
training a feature construction model by using the first original feature data until the feature construction model converges;
obtaining model parameters of the feature construction model to obtain target feature data of the object to be evaluated, wherein the target feature data comprises the first original feature data and relation feature data among the first original feature data;
and determining an evaluation parameter of the object to be evaluated according to the target characteristic data and an evaluation parameter determination model, wherein the evaluation parameter is used for representing the credibility of the object to be evaluated.
Optionally, the training the feature building model by using the first original feature data until the feature building model converges includes:
and training a feature construction model by using the first original feature data and the second original feature data of the sample object until the feature construction model converges.
Optionally, the training the feature construction model using the first raw feature data and the second raw feature data of the sample object until the feature construction model converges includes:
combining the first original feature data and the second original feature data to obtain original feature data to be processed, wherein the original feature data to be processed is an original feature matrix, and elements of the original feature matrix are original feature data of any object in any feature dimension;
replacing, in a target matrix column of the original feature matrix, the feature data of a second target matrix row with the feature data of a first target matrix row, to obtain noise feature data to be processed;
and training a feature construction model according to the noise feature data to be processed until the feature construction model converges.
Optionally, the training a feature building model according to the noise feature data to be processed until the feature building model converges includes:
inputting the noise characteristic data to be processed into a characteristic construction model to be trained to obtain prediction characteristic data of the noise characteristic data to be processed;
determining a loss function value according to the original characteristic data to be processed, the predicted characteristic data of the noise characteristic data to be processed and the loss function;
and according to the loss function value and a preset model convergence condition, adjusting the model parameters of the feature construction model, and repeating the process until the feature construction model converges.
Optionally, after merging the first raw feature data and the second raw feature data to obtain raw feature data to be processed, the method further includes:
and preprocessing the original characteristic data to be processed to obtain preprocessed original characteristic data to be processed.
Optionally, the obtaining the model parameters of the feature construction model to obtain the target feature data of the object to be evaluated includes:
extracting model parameters of the feature construction model, wherein the model parameters form a target feature matrix, and each element of the target feature matrix is the target feature data of one object in one feature dimension;
and acquiring the target feature data of the object to be evaluated from the matrix row associated with the object to be evaluated in the model parameter matrix.
Optionally, the determining the evaluation parameter of the object to be evaluated according to the target feature data and the evaluation parameter determining model includes:
acquiring target characteristic data of a matrix row associated with the sample object in the model parameter matrix to obtain sample target characteristic data of the sample object;
training the evaluation parameter determination model by using the sample target feature data until the evaluation parameter determination model converges;
and inputting the target characteristic data of the object to be evaluated into the evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated.
According to a second aspect of the present disclosure, there is provided an evaluation parameter determination apparatus including:
The first acquisition module is configured to acquire first original characteristic data of an object to be evaluated;
a training module configured to train a feature build model using the first raw feature data until the feature build model converges;
the second acquisition module is configured to acquire model parameters of the feature construction model to obtain target feature data of the object to be evaluated, wherein the target feature data comprises the first original feature data and relation feature data among the first original feature data;
and the determining module is configured to determine an evaluation parameter of the object to be evaluated according to the target characteristic data and an evaluation parameter determining model, wherein the evaluation parameter is used for representing the credibility of the object to be evaluated.
Optionally, the training module is configured to:
and training a feature construction model by using the first original feature data and the second original feature data of the sample object until the feature construction model converges.
Optionally, the training module is configured to:
combining the first original feature data and the second original feature data to obtain original feature data to be processed, wherein the original feature data to be processed is an original feature matrix, and elements of the original feature matrix are original feature data of any object in any feature dimension;
replacing, in a target matrix column of the original feature matrix, the feature data of a second target matrix row with the feature data of a first target matrix row, to obtain noise feature data to be processed;
and training a feature construction model according to the noise feature data to be processed until the feature construction model converges.
Optionally, the training module is configured to:
inputting the noise characteristic data to be processed into a characteristic construction model to be trained to obtain prediction characteristic data of the noise characteristic data to be processed;
determining a loss function value according to the original characteristic data to be processed, the predicted characteristic data of the noise characteristic data to be processed and the loss function;
and according to the loss function value and a preset model convergence condition, adjusting the model parameters of the feature construction model, and repeating the process until the feature construction model converges.
Optionally, the apparatus further includes a preprocessing module configured to:
and preprocessing the original characteristic data to be processed to obtain preprocessed original characteristic data to be processed.
Optionally, the second obtaining module is configured to:
extracting model parameters of the feature construction model, wherein the model parameters form a target feature matrix, and each element of the target feature matrix is the target feature data of one object in one feature dimension;
and acquiring the target feature data of the object to be evaluated from the matrix row associated with the object to be evaluated in the model parameter matrix.
Optionally, the determining module is configured to:
acquiring target characteristic data of a matrix row associated with the sample object in the model parameter matrix to obtain sample target characteristic data of the sample object;
training the evaluation parameter determination model by using the sample target feature data until the evaluation parameter determination model converges;
and inputting the target characteristic data of the object to be evaluated into the evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of the first aspect via execution of the executable instructions.
The technical scheme of the present disclosure has the following beneficial effects:
On the one hand, mining the feature information in the first original feature data with the feature construction model yields target feature data that contains both the first original feature data and the relationship feature data among the first original feature data, that is, feature data characterizing the object to be evaluated more accurately; on the other hand, determining the evaluation parameters of the object to be evaluated from target feature data carrying more feature information improves the accuracy and reliability of the determined evaluation parameters.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely some embodiments of the present disclosure and that other drawings may be derived from these drawings without undue effort.
Fig. 1 shows a schematic configuration diagram of an evaluation parameter determination system in the present exemplary embodiment.
Fig. 2 shows a flowchart of an evaluation parameter determination method in the present exemplary embodiment.
Fig. 3 shows a schematic diagram of a feature matrix in the present exemplary embodiment.
Fig. 4 shows an exemplary flow of another evaluation parameter determination method in the present exemplary embodiment.
Fig. 5 shows a schematic diagram of an evaluation parameter determination apparatus in the present exemplary embodiment.
Fig. 6 shows a schematic structural diagram of an electronic device in the present exemplary embodiment.
Detailed Description
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the exemplary embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only and not necessarily all steps are included. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
In the related art, schemes have emerged that determine an evaluation parameter of a target object based on a machine learning model, that is, the credit of the target object is evaluated by a pre-trained evaluation parameter determination model. When training the evaluation parameter determination model, the acquired sample raw feature data of the sample target object is usually processed manually, for example summarized through analysis, to construct the sample target feature data used for training.
However, on the one hand, manual feature processing depends on the data sensitivity and analysis capability of the data processing personnel and is constrained by objective factors. For anonymous feature data, the data cannot be analyzed further to obtain the other feature information it carries; for business-related sample raw feature data, the data processing personnel may not know the business logic and therefore cannot determine which business scenario caused a change in the data, so the analysis during feature processing may fall short and the resulting sample target feature data cannot accurately characterize the sample target object. On the other hand, defects in the sample raw feature data degrade the accuracy of the obtained sample target feature data and thus reduce the reliability of the trained evaluation parameter determination model; for example, the same sample target object may yield identical sample raw feature data at different acquisition times but different risk assessment results (labels), and with different labels attached to identical sample feature data it is difficult to train a highly reliable evaluation parameter determination model.
In view of the foregoing, exemplary embodiments of the present disclosure provide an evaluation parameter determination method for a reliability evaluation service. Application scenarios of the evaluation parameter determination method include, but are not limited to: in commodity sales business of a merchant, first original characteristic data of an object to be evaluated (such as a seller or a logistics party) is obtained; training a feature construction model by using the first original feature data until the feature construction model converges; obtaining model parameters of a feature construction model to obtain target feature data of an object to be evaluated; and determining the evaluation parameters of the object to be evaluated according to the target feature data and the evaluation parameter determination model to obtain the credibility of the object to be evaluated, wherein the target feature data comprises first original feature data and relation feature data among the first original feature data.
To implement the above-described evaluation parameter determination method, exemplary embodiments of the present disclosure provide an evaluation parameter determination system. Fig. 1 shows a schematic architecture diagram of the evaluation parameter determination system. As shown in fig. 1, the evaluation parameter determination system 100 may include a server 110 and a user terminal 120. The server 110 may be a background server deployed by an evaluation parameter determination service provider (such as a shop operator), and the user terminal 120 may be a terminal device of an evaluation parameter determination service demander (such as a credit agency or a bank), and more specifically, the terminal device may be a smart phone, a personal computer, a tablet computer, or the like. The server 110 and the user terminal 120 may establish a connection through a network to implement the evaluation parameter determination.
It should be appreciated that the server 110 may be one computer or a cluster formed by a plurality of computers, and the specific architecture of the server 110 is not limited in this disclosure.
The evaluation parameter determination method will be described below from the perspective of the server. FIG. 2 illustrates an exemplary flow of an evaluation parameter determination method performed by a server, which may include:
step S201, obtaining first original characteristic data of an object to be evaluated;
step S202, training a feature construction model by using first original feature data until the feature construction model converges;
step S203, obtaining model parameters of a feature construction model to obtain target feature data of an object to be evaluated;
the target feature data comprises first original feature data and relation feature data among the first original feature data.
Step S204, determining an evaluation parameter of the object to be evaluated according to the target feature data and the evaluation parameter determination model.
Wherein the evaluation parameters are used for characterizing the credibility of the object to be evaluated.
In summary, according to the evaluation parameter determining method provided by the embodiments of the present disclosure, on the one hand, mining the feature information in the first original feature data with the feature construction model yields target feature data that contains both the first original feature data and the relationship feature data among the first original feature data, so feature data that characterizes the object to be evaluated more accurately can be obtained; on the other hand, determining the evaluation parameters of the object to be evaluated from target feature data carrying more feature information improves the accuracy and reliability of the determined evaluation parameters.
Each step in fig. 2 is described in detail below:
in step S201, the server may acquire first raw feature data of the object to be evaluated;
in the embodiment of the disclosure, the object may be a target object, such as a person or a shop, for which evaluation parameter determination needs to be performed, and the first original feature data of the object to be evaluated may include attribute feature data of the object to be evaluated, and behavior feature data associated with a business behavior of the object to be evaluated.
It should be noted that, in the embodiments of the present disclosure, the attribute feature data and/or the behavior feature data of the object to be evaluated may each include data of different feature dimensions. For example, if the object to be evaluated is a shop, the attribute feature data of the shop may include a shop registration code, a shop identifier (ID), location information, main products, brand information, and the like, and the business behavior feature data of the shop may include sales, net income, tax payment, performance information, and the like. There may be one or more objects to be evaluated. In order to protect the privacy of the object to be evaluated, the original feature data of some feature dimensions acquired by the server is typically anonymous feature data, such as net income or a shop registration code.
In an alternative embodiment, the process of obtaining, by the server, the first raw characteristic data of the object to be evaluated may include: responding to an evaluation parameter determination request sent by a user terminal, analyzing an object identifier to be evaluated carried in the evaluation parameter determination request, and acquiring first original characteristic data corresponding to the object identifier to be evaluated according to the object identifier to be evaluated in a preset database to obtain the first original characteristic data of the object to be evaluated. The preset database may be a preset database in a server, or the preset database may be a preset database in a database server; it will be appreciated that the preset database is used to store the first raw characteristic data for each object.
In step S202, the server may train the feature build model using the first raw feature data until the feature build model converges;
In the embodiments of the disclosure, the feature construction model is used for mining the feature information of the first original feature data and the relationship information among the first original feature data; the feature construction model may be an autoencoder (AE) based machine learning model. The network structure of the feature construction model may be determined based on actual needs, which is not limited by the embodiments of the present disclosure; for example, the feature construction model may be a neural network containing several fully connected layers, or a neural network containing an attention mechanism.
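For concreteness, a minimal sketch of such an autoencoder-style feature construction model is given below in PyTorch; the framework choice, the layer widths, and the class name FeatureConstructionModel are illustrative assumptions rather than anything prescribed by the disclosure.

```python
# Minimal sketch of an autoencoder-style feature construction model.
# The PyTorch framework, the layer widths, and the class name are assumptions
# for illustration; the disclosure only requires an autoencoder-based model.
import torch
from torch import nn

class FeatureConstructionModel(nn.Module):
    def __init__(self, input_dim, hidden_dim=64, latent_dim=16):
        super().__init__()
        # Encoder compresses a (preprocessed) raw feature vector.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, latent_dim),
        )
        # Decoder reconstructs the raw feature vector from the latent code.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, input_dim),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))
```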
In an alternative embodiment, because the original data obtained by the server usually has defects and is difficult for the feature construction network to interpret directly, the first original feature data may be preprocessed before the feature construction model is trained to convergence, and the feature construction model is then trained on the preprocessed original data.
It should be noted that, in the embodiments of the present disclosure, the first original feature data of different dimensions generally fall into classification data (for example, a shop registration code) and statistical data (for example, a tax payment amount). The classification data may be one-hot encoded, and the statistical data may be normalized, thereby preprocessing the corresponding parts of the first original feature data.
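A minimal preprocessing sketch along these lines is shown below, assuming pandas and scikit-learn are available and that the classification and statistical columns are known in advance; the function name and column arguments are illustrative.

```python
# Sketch of the preprocessing step: one-hot encode classification data,
# normalize statistical data. The column lists passed in are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler

def preprocess(raw, categorical_cols, numerical_cols):
    # One-hot encoding for classification data such as a shop registration code.
    encoded = pd.get_dummies(raw[categorical_cols].astype(str), dtype=float)
    # Normalization for statistical data such as sales or tax payment amounts.
    scaled = pd.DataFrame(
        StandardScaler().fit_transform(raw[numerical_cols]),
        columns=numerical_cols,
        index=raw.index,
    )
    return pd.concat([encoded, scaled], axis=1)
```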
In an alternative embodiment, the process of the server training the feature build model using the first raw feature data until the feature build model converges may include: inputting the first original characteristic data into a characteristic construction model to be trained to obtain predicted characteristic data of the first original characteristic data; determining a loss function value according to the first original characteristic data, the predicted characteristic data of the first original characteristic data and the loss function; and adjusting model parameters of the feature construction model according to the loss function value and the preset model convergence condition, and repeating the process until the feature construction model converges.
It should be noted that the loss function may be any loss function used in the machine learning field to evaluate the convergence condition of the machine learning model, which is not limited by the embodiment of the present disclosure. By way of example, the loss function may be a mean square error loss function (Mean Square Error, MSE); the preset model convergence condition may be that the loss function value is less than or equal to a preset threshold, which may be determined based on actual needs, which is not limited in the embodiments of the present disclosure.
In an alternative embodiment, when the number of objects to be evaluated is small, the feature construction model may be trained together with second original feature data of sample objects; in this case, training the feature construction model with the first original feature data until convergence includes training it with both the first original feature data and the second original feature data of the sample objects until convergence. Combining the two provides a sufficient amount of feature data, so a feature construction model with stronger feature characterization capability can be trained, which improves the accuracy with which the constructed target feature data represents the real situation of the object to be evaluated.
In an alternative embodiment, the feature construction model may be a machine learning model based on a denoising autoencoder (DAE). In that case, training the feature construction model with the first original feature data and the second original feature data of the sample object until convergence may include: combining the first original feature data and the second original feature data into original feature data to be processed, where the data to be processed is an original feature matrix whose elements are the original feature data of one object in one feature dimension; then, in a target matrix column of the original feature matrix, replacing the feature data of a second target matrix row with the feature data of a first target matrix row to obtain noise feature data to be processed; and finally, training the feature construction model with the noise feature data to be processed until it converges. On the one hand, swapping feature data within the same feature dimension adds noise to the data to be processed, so a feature construction model with stronger generalization capability is trained, which further improves the accuracy of the constructed target feature data; on the other hand, a DAE-based model does not depend on the type of the feature data during construction, so anonymous features can be represented more accurately, again improving the accuracy of the constructed target feature data.
Here, a matrix row of the original feature matrix represents the original feature data of one object across the feature dimensions, and a matrix column represents the original feature data of every object in one feature dimension. An object may be either an object to be evaluated or a sample object, so the elements of the original feature matrix include the first original feature data of the objects to be evaluated and the second original feature data of the sample objects in each feature dimension. The first and second target matrix rows may be any matrix rows of the original feature matrix, and there may be more than one first target matrix row; for example, the first target matrix rows may amount to 10% of the total number of rows of the original feature matrix. The target matrix column may be any matrix column of the original feature matrix.
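A sketch of this row-swap noise construction is given below, assuming the combined, preprocessed raw features sit in a NumPy matrix; the 10% row fraction follows the example above, and the function name is illustrative.

```python
# Sketch of adding swap noise: in each (target) column, the values of randomly
# chosen "second target" rows are overwritten by values copied from randomly
# chosen "first target" rows of the same column.
import numpy as np

def add_swap_noise(features, row_fraction=0.10, seed=0):
    rng = np.random.default_rng(seed)
    noisy = features.copy()
    n_rows, n_cols = features.shape
    n_swap = max(1, int(row_fraction * n_rows))   # e.g. 10% of the rows
    for col in range(n_cols):                     # each column is one feature dimension
        second = rng.choice(n_rows, size=n_swap, replace=False)  # rows to overwrite
        first = rng.choice(n_rows, size=n_swap, replace=False)   # rows to copy from
        noisy[second, col] = features[first, col]
    return noisy
```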
It will be appreciated that, in an alternative embodiment, after combining the first original feature data and the second original feature data into the original feature data to be processed, the server may further preprocess the original feature data to be processed to obtain preprocessed data. Preprocessing reduces the influence of defects in the data to be processed on the training of the feature construction model and thereby improves training efficiency.
It should be noted that, in the embodiments of the present disclosure, the preprocessing of the original feature data to be processed may follow the preprocessing of the first original feature data described above, and is not repeated here.
In an alternative embodiment, training the feature construction model with the noise feature data to be processed until convergence may include: inputting the noise feature data to be processed into the feature construction model to be trained to obtain predicted feature data of the noise feature data to be processed; determining a loss function value from the original feature data to be processed, the predicted feature data, and the loss function; and adjusting the model parameters of the feature construction model according to the loss function value and the preset model convergence condition, repeating the process until the feature construction model converges. Training the feature construction model on feature data that contains noise improves its ability to deconstruct the features and the generalization capability of the trained model.
In another alternative embodiment, convergence may instead be declared after the feature construction model to be trained has been iteratively trained on the noise feature data to be processed for a preset number of iterations.
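A sketch of this training loop is shown below, reusing the FeatureConstructionModel and add_swap_noise sketches from earlier; the MSE loss matches the example given above, while the optimizer, learning rate, threshold, and epoch cap are assumptions.

```python
# Sketch of training the feature construction model as a denoising autoencoder:
# the noisy matrix goes in, the reconstruction is compared with the clean matrix
# via MSE, and training stops on a loss threshold or after a fixed epoch budget.
import torch
from torch import nn

def train_feature_model(model, clean_matrix, noisy_matrix,
                        lr=1e-3, loss_threshold=1e-3, max_epochs=500):
    clean = torch.as_tensor(clean_matrix, dtype=torch.float32)
    noisy = torch.as_tensor(noisy_matrix, dtype=torch.float32)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.MSELoss()
    for _ in range(max_epochs):                    # fixed-iteration fallback
        optimizer.zero_grad()
        reconstructed = model(noisy)
        loss = criterion(reconstructed, clean)     # compare against the clean features
        loss.backward()
        optimizer.step()
        if loss.item() <= loss_threshold:          # preset convergence condition
            break
    return model
```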
In step S203, the server may acquire model parameters of the feature construction model, to obtain target feature data of the object to be evaluated.
In the embodiments of the disclosure, the model parameters of the feature construction model in the converged state constitute the target feature data, and the target feature data includes the first original feature data and the relationship feature data among the first original feature data. It can be understood that, because the target feature data includes the relationship feature data, even when the first original feature data has few feature dimensions, the relationship feature data between different feature dimensions can be combined in, so the features of the object to be evaluated are represented more accurately.
In an alternative embodiment, the feature building model may be trained based on the first original feature data of the object to be evaluated, and the server acquires the model parameters of the feature building model, and the process of obtaining the target feature data of the object to be evaluated may include: the server extracts model parameters of the feature construction model, and determines the model parameters of the feature construction model as target feature data of the object to be evaluated.
It should be noted that, in the embodiments of the present disclosure, the feature construction model may be trained on the feature data to be processed composed of the first original feature data of the objects to be evaluated and the second original feature data of the sample objects. After such training, the extracted target feature matrix has the same number of rows as the original feature matrix: each row of the original feature matrix is the original feature data associated with one object, and each row of the target feature matrix is the target feature data associated with the same object. It can be appreciated that, because the target feature data includes the relationship feature data among the first original feature data, the number of feature dimensions grows, so the target feature matrix has more columns than the original feature matrix, the added matrix columns corresponding to the relationship feature data. A matrix row of the target feature matrix thus represents the target feature data of one object across the feature dimensions, and a matrix column represents the target feature data of every object in one feature dimension.
For example, as shown in fig. 3, the original feature matrix 301 is a matrix of 5 rows and 6 columns: the 1st to 3rd matrix rows hold the second original feature data of three sample objects in 6 feature dimensions, and the 4th to 5th matrix rows hold the first original feature data of two objects to be evaluated in 6 feature dimensions. After the original feature matrix is used to train the feature construction model to convergence, the obtained target feature matrix 302 is a matrix of 5 rows and 9 columns, in which the 1st to 3rd matrix rows hold the target feature data of the three sample objects in 9 feature dimensions and the 4th to 5th matrix rows hold the target feature data of the two objects to be evaluated in 9 feature dimensions. Each box represents one element of the matrix.
In an alternative embodiment, obtaining the model parameters of the feature construction model to obtain the target feature data of the object to be evaluated may include: extracting the model parameters of the feature construction model, where the model parameters form a target feature matrix whose elements are the target feature data of one object in one feature dimension; and then acquiring the target feature data of the object to be evaluated from the matrix row associated with the object to be evaluated in the model parameter matrix. In this way the target feature data associated with the object to be evaluated can be separated quickly from the model parameters of the feature construction model, so the evaluation result of the object to be evaluated can be determined more efficiently from its target feature data.
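One plausible reading, used in the sketch below, is that the per-object representations produced by the converged model, aligned row for row with the original feature matrix and concatenated with the preprocessed original features, play the role of the model parameter matrix; this interpretation, and the convention that the evaluated objects occupy known row indices, are assumptions made for illustration only.

```python
# Sketch of obtaining the target feature matrix and slicing out the rows
# associated with the objects to be evaluated. Treating the concatenation of
# the preprocessed original features and the per-object encoder outputs as the
# "model parameter matrix" is an interpretive assumption.
import numpy as np
import torch

def extract_target_features(model, preprocessed_matrix, eval_row_indices):
    with torch.no_grad():
        latent = model.encoder(
            torch.as_tensor(preprocessed_matrix, dtype=torch.float32)
        )
    # One row per object (sample objects and objects to be evaluated alike);
    # the extra columns hold the learned relationship feature data.
    target_matrix = np.hstack([preprocessed_matrix, latent.numpy()])
    return target_matrix, target_matrix[eval_row_indices]
```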
In step S204, the server may determine an evaluation parameter of the object to be evaluated according to the target feature data and the evaluation parameter determination model.
In the embodiment of the disclosure, the evaluation parameter is used for representing the credibility of the object to be evaluated, and in general, the higher the evaluation parameter is, the higher the credibility is.
In an alternative embodiment, the server determines a model according to the target feature data and the evaluation parameters, and the process of determining the evaluation parameters of the object to be evaluated may include: inputting the target characteristic data of the object to be evaluated into a pre-trained evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated. The evaluation parameters of the object to be evaluated can be determined according to the pre-trained evaluation parameter determination model, and the efficiency of determining the evaluation parameters of the object to be evaluated can be improved.
In an alternative embodiment, in a scenario where the feature construction model is trained with the first original feature data of the object to be evaluated and the second original feature data of the sample objects, determining the evaluation parameters of the object to be evaluated from the target feature data and the evaluation parameter determination model may include: acquiring the target feature data of the matrix rows associated with the sample objects in the model parameter matrix of the feature construction model to obtain sample target feature data of the sample objects; training the evaluation parameter determination model with the sample target feature data until it converges; and inputting the target feature data of the object to be evaluated into the evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated. Because the sample target feature data is produced by a feature construction model that has also deconstructed the first original feature data of the object to be evaluated, the resulting evaluation parameter determination model interprets the target feature data of the object to be evaluated better, which improves the accuracy and reliability of the determined evaluation parameters.
It should be noted that, in the embodiment of the present disclosure, the sample object is an object with known evaluation parameters, and the evaluation parameters of the sample object are used as labels, so that the evaluation parameter determination model may be trained. The model structure of the evaluation parameter determination model may be determined based on actual needs, which is not limited by the embodiments of the present disclosure.
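A sketch of this downstream step is given below: the sample rows of the target feature matrix, labelled with the samples' known evaluation parameters, train the evaluation parameter determination model, and the rows of the objects to be evaluated are then scored. The use of scikit-learn's GradientBoostingRegressor is purely a stand-in; the disclosure leaves the model structure open, and a classifier could equally be substituted.

```python
# Sketch of training the evaluation parameter determination model on the sample
# rows and scoring the objects to be evaluated. GradientBoostingRegressor is a
# stand-in; the disclosure does not fix the model structure.
from sklearn.ensemble import GradientBoostingRegressor

def determine_evaluation_parameters(sample_target_features, sample_labels,
                                    eval_target_features):
    model = GradientBoostingRegressor()
    model.fit(sample_target_features, sample_labels)  # labels: known evaluation parameters
    return model.predict(eval_target_features)        # credibility scores for evaluated objects
```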
By way of example, as shown in fig. 4, fig. 4 illustrates an exemplary flow of another evaluation parameter determination method performed by a server, comprising:
step S401, obtaining first original characteristic data of an object to be evaluated and second original characteristic data of a sample object;
step S402, combining the first original characteristic data and the second original characteristic data to obtain original characteristic data to be processed;
the original characteristic data to be processed is an original characteristic matrix;
step S403, preprocessing the original characteristic data to be processed to obtain preprocessed original characteristic data to be processed;
step S404, replacing, in a target matrix column of the original feature matrix, the feature data of a second target matrix row with the feature data of a first target matrix row, to obtain noise feature data to be processed;
step S405, training a feature construction model according to noise feature data to be processed until the feature construction model converges;
Step S406, extracting model parameters of a feature construction model, and acquiring target feature data of a matrix row associated with an object to be evaluated in a model parameter matrix to obtain target feature data of the object to be evaluated;
step S407, obtaining the target feature data of the matrix rows associated with the sample objects in the model parameter matrix to obtain sample target feature data of the sample objects, and training the evaluation parameter determination model with the sample target feature data until the evaluation parameter determination model converges;
step S408, inputting the target feature data of the object to be evaluated into an evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated.
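Tying the steps of fig. 4 together, the following compact orchestration sketch chains the helper functions sketched earlier (preprocess, add_swap_noise, FeatureConstructionModel, train_feature_model, extract_target_features, determine_evaluation_parameters); every name and the row layout, with sample rows first and evaluated rows last, are assumptions carried over from those sketches.

```python
# End-to-end sketch chaining steps S401-S408 with the helpers sketched above;
# all function names and the row layout (sample rows first, evaluated rows last)
# are assumptions carried over from the earlier sketches.
import numpy as np
import pandas as pd

def evaluate(eval_raw, sample_raw, sample_labels, categorical_cols, numerical_cols):
    combined = pd.concat([sample_raw, eval_raw], ignore_index=True)            # S401-S402
    clean = preprocess(combined, categorical_cols, numerical_cols).to_numpy()  # S403
    noisy = add_swap_noise(clean)                                              # S404
    model = FeatureConstructionModel(input_dim=clean.shape[1])
    train_feature_model(model, clean, noisy)                                   # S405
    eval_rows = np.arange(len(sample_raw), len(combined))
    target, eval_target = extract_target_features(model, clean, eval_rows)     # S406
    sample_target = target[: len(sample_raw)]                                  # S407
    return determine_evaluation_parameters(sample_target, sample_labels,
                                           eval_target)                        # S407-S408
```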
Fig. 5 shows a schematic diagram of an evaluation parameter determining apparatus provided in an embodiment of the present disclosure, as shown in fig. 5, an evaluation parameter determining apparatus 500 includes:
a first obtaining module 501 configured to obtain first raw feature data of an object to be evaluated;
a training module 502 configured to train the feature build model using the first raw feature data until the feature build model converges;
a second obtaining module 503, configured to obtain model parameters of the feature construction model, to obtain target feature data of the object to be evaluated, where the target feature data includes first original feature data and relationship feature data between the first original feature data;
A determining module 504 configured to determine an evaluation parameter of the object to be evaluated according to the target feature data and the evaluation parameter determining model, the evaluation parameter being used for characterizing the credibility of the object to be evaluated.
Optionally, the training module 502 is configured to:
and training the feature construction model by using the first original feature data and the second original feature data of the sample object until the feature construction model converges.
Optionally, the training module 502 is configured to:
combining the first original feature data and the second original feature data to obtain original feature data to be processed, wherein the original feature data to be processed is an original feature matrix, and elements of the original feature matrix are the original feature data of any object in any feature dimension;
replacing, in a target matrix column of the original feature matrix, the feature data of a second target matrix row with the feature data of a first target matrix row, to obtain noise feature data to be processed;
training a feature construction model according to the noise feature data to be processed until the feature construction model converges.
Optionally, the training module 502 is configured to:
inputting the noise characteristic data to be processed into a characteristic construction model to be trained to obtain prediction characteristic data of the noise characteristic data to be processed;
Determining a loss function value according to the original characteristic data to be processed, the predicted characteristic data of the noise characteristic data to be processed and the loss function;
and adjusting model parameters of the feature construction model according to the loss function value and the preset model convergence condition, and repeating the process until the feature construction model converges.
Optionally, the apparatus further comprises a preprocessing module 505 configured to:
preprocessing the original characteristic data to be processed to obtain preprocessed original characteristic data to be processed.
Optionally, the second obtaining module 503 is configured to:
extracting model parameters of the feature construction model, wherein the model parameters form a target feature matrix, and each element of the target feature matrix is the target feature data of one object in one feature dimension;
and acquiring the target feature data of the object to be evaluated from the matrix row associated with the object to be evaluated in the model parameter matrix.
Optionally, the determining module 504 is configured to:
acquiring target characteristic data of a matrix row associated with a sample object in a model parameter matrix to obtain sample target characteristic data of the sample object;
training the evaluation parameter determination model by using the sample target characteristic data until the evaluation parameter determination model converges;
And inputting the target characteristic data of the object to be evaluated into an evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated.
Exemplary embodiments of the present disclosure also provide a computer readable storage medium, which may be implemented in the form of a program product comprising program code for causing an electronic device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the above section of the "exemplary method" when the program product is run on the electronic device. In one embodiment, the program product may be implemented as a portable compact disc read only memory (CD-ROM) and includes program code and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
The exemplary embodiments of the present disclosure also provide an electronic device, which may be a server. The electronic device is described below with reference to fig. 6. It should be understood that the electronic device 600 shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 6, the electronic device 600 takes the form of a general-purpose computing device. Components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, and a bus 630 that connects the different system components (including the storage unit 620 and the processing unit 610).
The storage unit stores program code that can be executed by the processing unit 610, such that the processing unit 610 performs the steps according to the various exemplary embodiments of the present disclosure described in the "Exemplary Method" section of this specification. For example, the processing unit 610 may perform the method steps shown in fig. 2 or fig. 4.
The storage unit 620 may include volatile storage units such as a Random Access Memory (RAM) 621 and/or a cache memory 622, and may further include a Read Only Memory (ROM) 623.
The storage unit 620 may also include a program/utility 624 having a set (at least one) of program modules 625, such program modules 625 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.) via an input/output (I/O) interface 640. The electronic device 600 may also communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet through a network adapter 650. As shown, the network adapter 650 communicates with other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow, in general, the principles of the disclosure and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (10)
1. An evaluation parameter determining method, comprising:
acquiring first original feature data of an object to be evaluated;
training a feature construction model by using the first original feature data until the feature construction model converges;
obtaining model parameters of the feature construction model to obtain target feature data of the object to be evaluated, wherein the target feature data comprises the first original feature data and relation feature data among the first original feature data;
and determining an evaluation parameter of the object to be evaluated according to the target feature data and an evaluation parameter determination model, wherein the evaluation parameter is used for representing the credibility of the object to be evaluated.
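For readability, a heavily simplified Python sketch of the data flow in claim 1 follows. PCA stands in for the trained feature construction model purely to show the flow of data; the claimed method instead trains a model on noise-corrupted rows and reads target features out of its converged parameters (see the sketches that follow claims 3 to 7). All variable names, sizes, and the logistic-regression scorer are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical, simplified stand-in for the flow in claim 1 (assumptions throughout).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Step 1: first original feature data of the objects to be evaluated, plus
# labelled sample objects that the evaluation model is trained on (claim 7).
first_raw = rng.normal(size=(5, 8))            # 5 objects to be evaluated
sample_raw = rng.normal(size=(100, 8))         # 100 sample objects
sample_labels = rng.integers(0, 2, size=100)   # known credibility labels

# Step 2 (stand-in): fit a representation model on the merged raw features.
merged = np.vstack([sample_raw, first_raw])
rep = PCA(n_components=4).fit(merged)

# Step 3 (stand-in): per-object representations play the role of the target
# feature data (raw information plus relations among the raw features).
targets = rep.transform(merged)
sample_targets, eval_targets = targets[:100], targets[100:]

# Step 4: the evaluation parameter determination model maps target features to
# a credibility score for each object to be evaluated.
scorer = LogisticRegression(max_iter=1000).fit(sample_targets, sample_labels)
print(scorer.predict_proba(eval_targets)[:, 1])  # evaluation parameters
```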
2. The method of claim 1, wherein the training a feature construction model by using the first original feature data until the feature construction model converges comprises:
and training the feature construction model by using the first original feature data and second original feature data of a sample object until the feature construction model converges.
3. The method of claim 2, wherein the training the feature construction model by using the first original feature data and the second original feature data of the sample object until the feature construction model converges comprises:
combining the first original feature data and the second original feature data to obtain original feature data to be processed, wherein the original feature data to be processed is an original feature matrix, and elements of the original feature matrix are original feature data of any object in any feature dimension;
replacing, among the target matrix rows of the original feature matrix, the feature data of a second target matrix row with the feature data of a first target matrix row, to obtain noise feature data to be processed;
and training a feature construction model according to the noise feature data to be processed until the feature construction model converges.
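The corruption step in claim 3 amounts to building one merged feature matrix and overwriting some of its rows with other rows' feature data. A minimal NumPy sketch under that reading follows; which rows count as target matrix rows and how many are corrupted are assumptions.

```python
# Hypothetical sketch of claim 3: build the original feature matrix and a
# noise-corrupted copy by overwriting some rows with other rows' feature data.
import numpy as np

rng = np.random.default_rng(0)

first_raw = rng.normal(size=(5, 8))     # objects to be evaluated
second_raw = rng.normal(size=(100, 8))  # sample objects

# "Combining the first and second original feature data": one matrix whose
# element (i, j) is the original feature of object i in feature dimension j.
original_matrix = np.vstack([first_raw, second_raw])

# Pick target matrix rows and overwrite each chosen row with another row's
# feature data, yielding the noise feature data to be processed.
noisy_matrix = original_matrix.copy()
n_rows = original_matrix.shape[0]
n_swaps = max(1, n_rows // 10)                                   # assumption: ~10% of rows
replaced_rows = rng.choice(n_rows, size=n_swaps, replace=False)  # rows whose data is replaced
source_rows = rng.choice(n_rows, size=n_swaps, replace=False)    # rows providing the data
noisy_matrix[replaced_rows] = original_matrix[source_rows]

# original_matrix and noisy_matrix correspond to the "original feature data to
# be processed" and the "noise feature data to be processed", respectively.
```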
4. The method according to claim 3, wherein the training a feature construction model according to the noise feature data to be processed until the feature construction model converges comprises:
inputting the noise feature data to be processed into a feature construction model to be trained, to obtain predicted feature data of the noise feature data to be processed;
determining a loss function value according to the original feature data to be processed, the predicted feature data of the noise feature data to be processed, and a loss function;
and according to the loss function value and a preset model convergence condition, adjusting the model parameters of the feature construction model, and repeating the process until the feature construction model converges.
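A sketch of the training loop in claim 4 using PyTorch. The architecture (a small denoising network that also keeps one learnable parameter row per object, so that the per-object parameter rows read out in claim 6 exist), the MSE loss, and the convergence test are all assumptions rather than the claimed design.

```python
# Hypothetical training loop for claim 4: feed the noisy matrix in, predict
# feature data, compare with the clean originals, and update the model
# parameters until a preset convergence condition is met.
import torch
from torch import nn

class FeatureConstructionModel(nn.Module):
    def __init__(self, n_objects: int, n_features: int, dim: int = 16):
        super().__init__()
        # One learnable row per object; claim 6 reads these rows back out.
        self.object_rows = nn.Parameter(0.01 * torch.randn(n_objects, dim))
        self.encoder = nn.Linear(n_features, dim)
        self.decoder = nn.Linear(dim, n_features)

    def forward(self, noisy_rows: torch.Tensor, object_ids: torch.Tensor) -> torch.Tensor:
        code = torch.tanh(self.encoder(noisy_rows) + self.object_rows[object_ids])
        return self.decoder(code)  # predicted feature data

def train_until_converged(model, noisy, clean, lr=1e-2, tol=1e-6, max_steps=2000):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    ids = torch.arange(clean.shape[0])
    previous = float("inf")
    for _ in range(max_steps):
        optimizer.zero_grad()
        predicted = model(noisy, ids)      # predicted feature data of the noisy input
        loss = loss_fn(predicted, clean)   # loss against the original feature data
        loss.backward()
        optimizer.step()                   # adjust the model parameters
        if abs(previous - loss.item()) < tol:
            break                          # preset convergence condition (assumption)
        previous = loss.item()
    return model

# Example: 105 objects with 8 raw feature dimensions each.
clean = torch.randn(105, 8)
noisy = clean.clone()
noisy[:10] = clean[10:20]                  # row-swap corruption as in claim 3
model = train_until_converged(FeatureConstructionModel(105, 8), noisy, clean)
```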
5. The method according to claim 3, wherein after the combining the first original feature data and the second original feature data to obtain original feature data to be processed, the method further comprises:
preprocessing the original feature data to be processed to obtain preprocessed original feature data to be processed.
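Claim 5 leaves the preprocessing open. A common choice for tabular features is column-wise imputation followed by standardization; the NumPy sketch below is only that assumption, not the claimed preprocessing.

```python
# Hypothetical preprocessing for claim 5: column-wise imputation and
# standardization of the merged original feature matrix (an assumption).
import numpy as np

def preprocess(original_matrix: np.ndarray) -> np.ndarray:
    x = original_matrix.astype(float).copy()
    # Fill missing entries with the column mean of that feature dimension.
    col_means = np.nanmean(x, axis=0)
    nan_rows, nan_cols = np.where(np.isnan(x))
    x[nan_rows, nan_cols] = col_means[nan_cols]
    # Standardize each feature dimension to zero mean and unit variance.
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)

preprocessed = preprocess(np.array([[1.0, np.nan], [3.0, 4.0]]))
```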
6. The method according to claim 2, wherein the obtaining model parameters of the feature construction model to obtain target feature data of the object to be evaluated comprises:
extracting model parameters of the feature construction model, wherein the model parameters are a target feature matrix, and elements of the target feature matrix are target feature data of any object in any feature dimension;
and acquiring target feature data of the matrix row associated with the object to be evaluated in the model parameter matrix, to obtain the target feature data of the object to be evaluated.
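Under the same assumed architecture as the earlier training sketch (one learnable parameter row per object, with row order matching the order in which the raw feature matrices were merged), the extraction in claim 6 reduces to reading rows out of that parameter tensor. Both assumptions are illustrative.

```python
# Hypothetical sketch of claim 6: treat the converged model's per-object
# parameter rows as the target feature matrix and read out the rows associated
# with the objects to be evaluated.
import torch
from torch import nn

n_sample_objects, n_eval_objects, dim = 100, 5, 16

# Stand-in for the converged feature construction model's parameters.
object_rows = nn.Parameter(torch.randn(n_sample_objects + n_eval_objects, dim))

# The parameter tensor is the target feature matrix: one row per object,
# one column per learned feature dimension.
target_feature_matrix = object_rows.detach().numpy()

# Rows associated with the objects to be evaluated (here: the last rows,
# matching the merge order assumed above).
eval_targets = target_feature_matrix[n_sample_objects:]
print(eval_targets.shape)  # (5, 16)
```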
7. The method according to claim 6, wherein the determining an evaluation parameter of the object to be evaluated according to the target feature data and an evaluation parameter determination model comprises:
acquiring target feature data of the matrix row associated with the sample object in the model parameter matrix, to obtain sample target feature data of the sample object;
training the evaluation parameter determination model by using the sample target feature data until the evaluation parameter determination model converges;
and inputting the target characteristic data of the object to be evaluated into the evaluation parameter determination model to obtain the evaluation parameters of the object to be evaluated.
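A sketch of claim 7 under the assumption that the evaluation parameter determination model is a logistic-regression classifier fitted on the sample objects' target features and their known credibility labels; the predicted probability then serves as the evaluation parameter. The model type and the 0/1 labels are assumptions, since the claim does not fix them.

```python
# Hypothetical sketch of claim 7: train the evaluation parameter determination
# model on sample target feature data, then score the objects to be evaluated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

sample_targets = rng.normal(size=(100, 16))   # target features of sample objects
sample_labels = rng.integers(0, 2, size=100)  # known credibility labels
eval_targets = rng.normal(size=(5, 16))       # target features of objects to evaluate

model = LogisticRegression(max_iter=1000).fit(sample_targets, sample_labels)
evaluation_parameters = model.predict_proba(eval_targets)[:, 1]
print(evaluation_parameters)  # one credibility score per object to be evaluated
```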
8. An evaluation parameter determining apparatus, comprising:
a first acquisition module, configured to acquire first original feature data of an object to be evaluated;
a training module, configured to train a feature construction model by using the first original feature data until the feature construction model converges;
a second acquisition module, configured to acquire model parameters of the feature construction model to obtain target feature data of the object to be evaluated, wherein the target feature data comprises the first original feature data and relation feature data among the first original feature data;
and a determining module, configured to determine an evaluation parameter of the object to be evaluated according to the target feature data and an evaluation parameter determination model, wherein the evaluation parameter is used for representing the credibility of the object to be evaluated.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 7 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310671199.XA CN116402354A (en) | 2023-06-07 | 2023-06-07 | Evaluation parameter determining method and device, medium and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116402354A true CN116402354A (en) | 2023-07-07 |
Family
ID=87018392
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310671199.XA Pending CN116402354A (en) | 2023-06-07 | 2023-06-07 | Evaluation parameter determining method and device, medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116402354A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220215296A1 (en) * | 2020-01-03 | 2022-07-07 | Tencent Technology (Shenzhen) Company Limited | Feature effectiveness assessment method and apparatus, electronic device, and storage medium |
CN111652327A (en) * | 2020-07-16 | 2020-09-11 | 北京思图场景数据科技服务有限公司 | A model iteration method, system and computer equipment |
US20230084333A1 (en) * | 2021-08-31 | 2023-03-16 | Naver Corporation | Adversarial generation method for training a neural model |
CN115587535A (en) * | 2022-09-29 | 2023-01-10 | 深圳前海微众银行股份有限公司 | Model building optimization method, device, storage medium and program product |
Non-Patent Citations (1)
Title |
---|
ZHANG Yong et al.: "Research on the Application of Neural Networks in Enterprise Credit Evaluation", Computer Era (计算机时代), No. 04 *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118052351A (en) * | 2023-11-30 | 2024-05-17 | 中交广州航道局有限公司 | Cutter suction dredger loading two-in-one ship data processing system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11676215B1 (en) | Self-service claim automation using artificial intelligence | |
KR20210058229A (en) | Method and server for providing loan service based on inventory and sales analysys | |
US12299687B2 (en) | Abnormal behavior detection method and apparatus, electronic device, and computer-readable storage medium | |
CN113837596B (en) | Fault determination method and device, electronic equipment and storage medium | |
CN110674360B (en) | Tracing method and system for data | |
CN112818162A (en) | Image retrieval method, image retrieval device, storage medium and electronic equipment | |
CN118505230A (en) | Training method and device for detection model, computer equipment and storage medium | |
CN113052509B (en) | Model evaluation method, model evaluation device, electronic apparatus, and storage medium | |
CN115495711A (en) | Natural disaster insurance prediction processing method and system and electronic equipment | |
CN113901817A (en) | Document classification method and device, computer equipment and storage medium | |
CN117522403A (en) | GCN abnormal customer early warning method and device based on subgraph fusion | |
CN116402354A (en) | Evaluation parameter determining method and device, medium and electronic equipment | |
CN110070383B (en) | Abnormal user identification method and device based on big data analysis | |
CN114708081A (en) | Credit risk prediction method and device, electronic equipment and readable storage medium | |
CN110827261B (en) | Image quality detection method and device, storage medium and electronic equipment | |
CN116542673B (en) | Fraud identification method and system applied to machine learning | |
US20210366048A1 (en) | Methods and systems for reacting to loss reporting data | |
CN111367776A (en) | Recording method, device, equipment and storage medium of resource transfer service | |
CN116611941A (en) | Fraud identification method, device, equipment and storage medium based on artificial intelligence | |
Han et al. | Using source code and process metrics for defect prediction-A case study of three algorithms and dimensionality reduction. | |
CN116629639B (en) | Evaluation information determining method and device, medium and electronic equipment | |
CN117952717B (en) | A method and system for processing air ticket orders based on big data | |
CN113328978B (en) | Malicious user identification method and device, computer storage medium and electronic equipment | |
CN118690325A (en) | Data predictive analysis method and system based on big data | |
CN115760384A (en) | Abnormal behavior identification method, identification device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20230707 |