CN112016698B - Factorization machine model construction method, factorization machine model construction equipment and readable storage medium - Google Patents
- Publication number: CN112016698B (application CN202010893538.5A)
- Authority: CN (China)
- Prior art keywords: party, parameter, model, secret sharing, shared
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06N20/00 — Machine learning (G: Physics; G06: Computing or calculating; counting; G06N: Computing arrangements based on specific computational models)
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The application discloses a factorization machine model construction method, factorization machine model construction equipment and a readable storage medium. The factorization machine model construction method comprises the following steps: acquiring initialization model parameters and first sparse data corresponding to a preset initialization model; carrying out secret sharing with a second device based on the initialization model parameters to obtain first party secret sharing initial model parameters, enabling the second device to determine second party secret sharing initial model parameters; carrying out federal interaction with the second device based on a first non-zero part in the first sparse data and the first party secret sharing initial model parameters, in combination with a second non-zero part in second sparse data acquired by the second device and the second party secret sharing initial model parameters, to calculate a secret sharing model error; and updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factorization machine model. The application solves the technical problem of low computational efficiency in federal learning based on a sparse matrix.
Description
Technical Field
The present application relates to the field of artificial intelligence in financial technology (Fintech), and in particular to a factorization machine model construction method and equipment, and a readable storage medium.
Background
With the continuous development of financial technology, especially internet finance, more and more technologies (such as distributed computing, blockchain, artificial intelligence, etc.) are applied in the finance field, but the finance industry also places higher requirements on these technologies, such as the distribution of the backlog of tasks in the finance industry.
With the continuous development of computer software and artificial intelligence, federated learning is applied in an increasingly wide range of fields. At present, the training data of federal learning is usually a dense matrix, and the dense matrix is encrypted by a homomorphic encryption method so that federal learning can be performed without revealing data privacy. However, when the training data is sparse matrix data, such as user portrait data, the sparse matrix is far larger than a dense matrix storing the same amount of information; the amount of calculation when performing federal learning by the homomorphic encryption method is then very large and the computational complexity very high, so the computational efficiency of federal learning based on a sparse matrix is extremely low.
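To make the scale gap concrete, the following illustrative Python sketch (not part of the patent; all sizes and names are made up for demonstration) counts how many values a homomorphic scheme would have to process for a dense representation versus only the non-zero entries:

```python
# Illustrative sketch: compare how many values must be processed when a
# sparse matrix is handled densely versus by its non-zero entries only.

def dense_cell_count(n_rows, n_cols):
    # A dense representation touches every cell, zeros included.
    return n_rows * n_cols

def sparse_cell_count(nonzero_entries):
    # A sparse representation touches only the stored non-zero entries.
    return len(nonzero_entries)

# A toy 1000-user x 10000-feature portrait matrix with 0.1% density.
n_rows, n_cols = 1000, 10000
nonzeros = [(r, c, 1.0) for r in range(n_rows) for c in range(0, n_cols, 1000)]

dense = dense_cell_count(n_rows, n_cols)   # 10,000,000 cells to encrypt
sparse = sparse_cell_count(nonzeros)       # 10,000 non-zero cells
print(dense // sparse)                     # -> 1000
```

Since homomorphic encryption cost scales with every cell it must encrypt, restricting computation to the non-zero entries is where the efficiency claim of the application comes from.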
Disclosure of Invention
The application mainly aims to provide a factorization machine model construction method, factorization machine model construction equipment and a readable storage medium, and aims to solve the technical problem that in the prior art, calculation efficiency is low when federal learning is performed based on a sparse matrix.
In order to achieve the above object, the present application provides a factorization machine model construction method applied to factorization machine model construction equipment, the factorization machine model construction method comprising:
Acquiring initialization model parameters and first sparse data corresponding to a preset initialization model, and carrying out secret sharing with second equipment based on the initialization model parameters to acquire first party secret sharing initial model parameters so as to enable the second equipment to determine second party secret sharing initial model parameters;
Performing federal interaction with the second device based on a first non-zero portion of the first sparse data and the first party secret sharing initial model parameters to combine a second non-zero portion of second sparse data acquired by the second device with the second party secret sharing initial model parameters to calculate a secret sharing model error;
and updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factorization machine model.
The application also provides a personalized recommendation method, which is applied to personalized recommendation equipment and comprises the following steps:
acquiring sparse data of a user to be recommended by a first party, and carrying out secret sharing with second equipment to acquire secret sharing model parameters;
based on a first non-zero part in the sparse data of the first party to-be-recommended user and the secret sharing model parameters, performing longitudinal federal prediction interaction with the second equipment to score the to-be-recommended object corresponding to the sparse data of the first party to-be-recommended user, and obtaining a first secret sharing scoring result;
Performing aggregation interaction with the second device based on the first secret sharing scoring result to combine the second secret sharing scoring result determined by the second device to calculate a target scoring result;
and generating a target recommendation list corresponding to the to-be-recommended object based on the target scoring result.
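As an illustrative sketch of the final two steps above (names and values are hypothetical, and this simplifies the protocol: real shares live in a fixed ring with fixed-point encodings), the aggregation interaction amounts to summing the two parties' secret shared scores per item and ranking the recovered target scores:

```python
MOD = 2**31  # illustrative ring size for the additive shares

def aggregate_and_rank(share_a, share_b, top_n=2):
    # Recover each item's target score by adding the two score shares,
    # then rank items to form the recommendation list.
    scores = {item: (share_a[item] + share_b[item]) % MOD
              for item in share_a}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# First-party and second-party secret sharing scoring results (toy values).
share_a = {"item1": 70, "item2": 10, "item3": 55}
share_b = {"item1": 5, "item2": 80, "item3": 10}
print(aggregate_and_rank(share_a, share_b))  # -> ['item2', 'item1']
```

Neither party's share alone reveals the score ordering; only the aggregated sums do.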
The present application also provides a factoring machine model building apparatus which is a virtual apparatus and which is applied to a factoring machine model building device, the factoring machine model building apparatus comprising:
The secret sharing module is used for acquiring initialization model parameters and first sparse data corresponding to a preset initialization model, carrying out secret sharing with second equipment based on the initialization model parameters, and acquiring first party secret sharing initial model parameters so that the second equipment can determine second party secret sharing initial model parameters;
An error calculation module, configured to perform federal interaction with the second device based on a first non-zero portion in the first sparse data and the first party secret sharing initial model parameter, so as to combine a second non-zero portion in second sparse data acquired by the second device with the second party secret sharing initial model parameter, and calculate a secret sharing model error;
and the generation module is used for updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factorization machine model.
The application also provides a personalized recommendation device, which is a virtual device and is applied to personalized recommendation equipment, and the personalized recommendation device comprises:
The secret sharing module is used for acquiring sparse data of the user to be recommended by the first party, and carrying out secret sharing with the second equipment to acquire secret sharing model parameters;
The scoring module is used for performing longitudinal federal prediction interaction with the second equipment based on a first non-zero part in the sparse data of the first party to-be-recommended user and the secret sharing model parameters so as to score the to-be-recommended articles corresponding to the sparse data of the first party to-be-recommended user and obtain a first secret sharing scoring result;
the aggregation module is used for conducting aggregation interaction with the second equipment based on the first secret sharing scoring result so as to combine the second secret sharing scoring result determined by the second equipment to calculate a target scoring result;
And the generating module is used for generating a target recommendation list corresponding to the to-be-recommended article based on the target scoring result.
The application also provides factorization machine model construction equipment, which is an entity device and comprises: a memory, a processor, and a program of the factorization machine model construction method stored in the memory and capable of running on the processor, wherein the program of the factorization machine model construction method, when executed by the processor, can realize the steps of the factorization machine model construction method.
The application also provides personalized recommendation equipment, which is entity equipment and comprises: the system comprises a memory, a processor and a program of the personalized recommendation method stored in the memory and capable of running on the processor, wherein the program of the personalized recommendation method can realize the steps of the personalized recommendation method when being executed by the processor.
The present application also provides a readable storage medium having stored thereon a program for implementing a factoring machine model construction method, which when executed by a processor implements the steps of the factoring machine model construction method as described above.
The present application also provides a readable storage medium having stored thereon a program for implementing a personalized recommendation method, which when executed by a processor implements the steps of the personalized recommendation method as described above.
Compared with the prior art, which adopts the technical means of federal learning based on homomorphic encryption, the application performs secret sharing with the second device after acquiring the initialization model parameters and the first sparse data corresponding to the preset initialization model, so that the first device obtains the first party secret sharing initial model parameters and the second device obtains the second party secret sharing initial model parameters. Federal interaction is then performed with the second device based on the first non-zero part in the first sparse data and the first party secret sharing initial model parameters, in combination with the second non-zero part in the second sparse data acquired by the second device and the second party secret sharing initial model parameters, to calculate the secret sharing model error. During federal interaction, only the first non-zero part in the first sparse data and the second non-zero part in the second sparse data are used for calculation, so the calculations over the zero parts of the first sparse data and the second sparse data are avoided, and the calculation amount and calculation complexity in the federal interaction process are greatly reduced. The preset initialization model is then updated based on the secret sharing model error to obtain a longitudinal federal factorization machine model. This overcomes the technical defect of low calculation efficiency when federal learning based on a sparse matrix is performed by a homomorphic encryption method in the prior art, and thus improves the calculation efficiency of federal learning based on a sparse matrix.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a flow chart of a first embodiment of a factorization machine model building method of the present application;
FIG. 2 is a flowchart of a second embodiment of a factorization machine model building method according to the present application;
FIG. 3 is a flowchart of a third embodiment of a personalized recommendation method according to the present application;
FIG. 4 is a schematic diagram of a device structure of a hardware operating environment related to a factorization machine model construction method according to an embodiment of the present application;
fig. 5 is a schematic device structure diagram of a hardware operating environment related to a personalized recommendation method according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In a first embodiment of the factoring machine model building method of the present application, referring to fig. 1, the factoring machine model building method is applied to a first device, and the factoring machine model building method includes:
step S10, initializing model parameters and first sparse data corresponding to a preset initializing model are obtained, secret sharing is carried out with second equipment based on the initializing model parameters, and first party secret sharing initial model parameters are obtained so that the second equipment can determine second party secret sharing initial model parameters;
In this embodiment, it should be noted that the first device and the second device are both participants of longitudinal federal learning. The first device has first sparse data with a sample label, and the first sparse data may be represented by a first sparse matrix and a sample label; for example, the first sparse data is (X_A, Y), where X_A is the first sparse matrix and Y is the sample label. In contrast, the second device has second sparse data without a sample label, and the second sparse data may be represented by a second sparse matrix, for example X_B.
Additionally, in this embodiment, the factorization machine model is a machine learning model constructed based on longitudinal federal learning, and its model parameters are jointly held by the first device and the second device. After initialization, the model parameters corresponding to the factorization machine model may be represented by matrices: they consist of first-party initialization model parameters belonging to the first device and second-party initialization model parameters belonging to the second device. The first-party initialization model parameters include a first-party first-type parameter vector, a first-party second-type parameter matrix and the transpose matrix corresponding to the first-party second-type parameter matrix; the second-party initialization model parameters include a second-party first-type parameter vector, a second-party second-type parameter matrix and the transpose matrix corresponding to the second-party second-type parameter matrix. For example, the first-party first-type parameter vector is w_A, the first-party second-type parameter matrix is V_A with transpose V_A^T, the second-party first-type parameter vector is w_B, and the second-party second-type parameter matrix is V_B with transpose V_B^T.
Additionally, it should be noted that secret sharing a piece of data is the process of splitting the data into two pieces of sub-data, with the two pieces of sub-data held by the two parties of the secret sharing. For example, assuming the two parties of the secret sharing are A and B, then after secret sharing data X, A holds a first share [[X]]_A of X, B holds a second share [[X]]_B of X, and X = [[X]]_A + [[X]]_B.
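A minimal Python sketch of this additive splitting (the modulus and function names are illustrative; real implementations work over a fixed ring with fixed-point encodings of reals):

```python
import random

MOD = 2**64  # shares live in a fixed ring so each share alone reveals nothing

def share(x):
    """Split integer x into two additive shares: x = ([[x]]_A + [[x]]_B) mod MOD."""
    x_a = random.randrange(MOD)
    x_b = (x - x_a) % MOD
    return x_a, x_b

def reconstruct(x_a, x_b):
    # Only the sum of both shares recovers the secret.
    return (x_a + x_b) % MOD

x = 42
x_a, x_b = share(x)          # A holds x_a, B holds x_b
assert reconstruct(x_a, x_b) == x
```

Because x_a is uniformly random, party A's share is statistically independent of x, which is what lets the later protocol steps operate on shares without leaking the underlying data.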
Additionally, the model expression of the factorization machine model is as follows:

z(x) = ⟨w, x⟩ + Σ_{i<j} ⟨V_i, V_j⟩ x_i x_j
where x is the data matrix corresponding to the model input data, the model input data comprising the first sparse data (X_A, Y) and the second sparse data X_B; Y is the sample label, X_A is a first sparse matrix with d_A feature dimensions, and X_B is a second sparse matrix with d_B feature dimensions. The first type of model parameter is w, a d-dimensional vector, with w = [w_A, w_B]; that is, w is composed of the first-party first-type model parameter vector w_A (a d_A-dimensional vector) and the second-party first-type model parameter vector w_B (a d_B-dimensional vector). The second type of model parameter is V, a d × d_X matrix, with V = [V_A, V_B]; that is, V is composed of the first-party second-type parameter matrix V_A (a d_A × d_X matrix) and the second-party second-type parameter matrix V_B (a d_B × d_X matrix). ⟨w, x⟩ is the inner product of w and x, V_i is the column vector of the i-th column of V, V_j is the column vector of the j-th column of V, x_i is the column vector of the i-th column of x, and x_j is the column vector of the j-th column of x.
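The model expression above can be checked with a small plain-Python sketch (all values are illustrative, not from the patent). The second function is the standard O(k·d) rewrite of the pairwise term, included to show both forms agree:

```python
# z(x) = <w, x> + sum_{i<j} <V_i, V_j> x_i x_j, with V_i the i-th column of V.

def fm_naive(w, V, x):
    # Direct evaluation of the model expression, pair by pair.
    d = len(x)
    linear = sum(w[i] * x[i] for i in range(d))
    pairwise = sum(
        sum(V[f][i] * V[f][j] for f in range(len(V))) * x[i] * x[j]
        for i in range(d) for j in range(i + 1, d)
    )
    return linear + pairwise

def fm_fast(w, V, x):
    # Standard rewrite: sum_{i<j} <V_i,V_j> x_i x_j
    #   = 0.5 * sum_f [ (sum_i V[f][i] x_i)^2 - sum_i (V[f][i] x_i)^2 ]
    d, k = len(x), len(V)
    linear = sum(w[i] * x[i] for i in range(d))
    pairwise = 0.5 * sum(
        sum(V[f][i] * x[i] for i in range(d)) ** 2
        - sum((V[f][i] * x[i]) ** 2 for i in range(d))
        for f in range(k)
    )
    return linear + pairwise

w = [0.1, 0.2, 0.3]
V = [[1.0, 0.5, 0.2],   # row f = latent factor f, column i = feature i
     [0.3, 0.4, 0.6]]
x = [1.0, 0.0, 2.0]     # sparse input: only features 0 and 2 are active
assert abs(fm_naive(w, V, x) - fm_fast(w, V, x)) < 1e-9
```

Note that in both forms every term containing a zero x_i vanishes, which is why the protocol below only ever touches the non-zero portions of the sparse matrices.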
The initialization model parameters and the first sparse data corresponding to the preset initialization model are acquired, and secret sharing is performed with the second device based on the initialization model parameters to obtain the first party secret sharing initial model parameters, so that the second device can determine the second party secret sharing initial model parameters. Specifically, the factorization machine model is initialized to obtain the preset initialization model, and the initialization model parameters corresponding to the preset initialization model are acquired; meanwhile, the second device acquires the second-party initialization model parameters and splits them into a second-party initialization model parameter first share and a second-party initialization model parameter second share. The first device then obtains the first sparse data and splits the initialization model parameters into an initialization model parameter first share and an initialization model parameter second share, where the first share comprises a first share of the first-party first-type model parameter vector, a first share of the first-party second-type model parameter matrix and a first share of the first-party transpose matrix, and the second share comprises the corresponding second shares of these three parameters. The initialization model parameter second share is sent to the second device, and the second device takes the initialization model parameter second share and the second-party initialization model parameter first share as the second party secret sharing initial model parameters. The first device then receives the second-party initialization model parameter second share sent by the second device, and takes the initialization model parameter first share and the second-party initialization model parameter second share as the first party secret sharing initial model parameters, where the second-party initialization model parameter first share comprises a first share of the second-party first-type model parameter vector, a first share of the second-party second-type model parameter matrix and a first share of the second-party transpose matrix, and the second-party initialization model parameter second share comprises the corresponding second shares. For example, assuming the initialization model parameters are G_A and the second-party initialization model parameters are G_B, after secret sharing the first device holds the shares [[G_A]]_A and [[G_B]]_A, the second device holds the shares [[G_A]]_B and [[G_B]]_B, and G_A = [[G_A]]_A + [[G_A]]_B, G_B = [[G_B]]_A + [[G_B]]_B.
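The share exchange in the example above can be sketched as follows (a toy with scalar parameters and an illustrative modulus; the real parameters are vectors and matrices, but the bookkeeping is identical per entry):

```python
import random

MOD = 2**32

def split(value):
    # Additive split of a value into two ring shares.
    a = random.randrange(MOD)
    return a, (value - a) % MOD

# Hypothetical scalar stand-ins for the initialization model parameters
# G_A (held by the first device) and G_B (held by the second device).
G_A, G_B = 17, 99

# Each device splits its own parameters ...
gA_share1, gA_share2 = split(G_A)   # first device splits G_A
gB_share1, gB_share2 = split(G_B)   # second device splits G_B

# ... and sends one share across, so each party holds one share of everything:
first_device = (gA_share1, gB_share2)   # [[G_A]]_A and [[G_B]]_A
second_device = (gA_share2, gB_share1)  # [[G_A]]_B and [[G_B]]_B

assert (first_device[0] + second_device[0]) % MOD == G_A
assert (first_device[1] + second_device[1]) % MOD == G_B
```

After the exchange, neither device holds any parameter in the clear, yet every parameter is reconstructible by the pair jointly.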
Step S20, based on a first non-zero part in the first sparse data and the first party secret sharing initial model parameter, federally interacting with the second device to combine a second non-zero part in the second sparse data acquired by the second device and the second party secret sharing initial model parameter, and calculating a secret sharing model error;
In this embodiment, the first non-zero portion is each column vector in the first sparse matrix, and the second non-zero portion is each column vector in the second sparse matrix.
Federal interaction is performed with the second device based on the first non-zero portion of the first sparse data and the first party secret sharing initial model parameters, in combination with the second non-zero portion of the second sparse data obtained by the second device and the second party secret sharing initial model parameters, to calculate the secret sharing model error. Specifically, federal interaction is performed with the second device based on the first non-zero portion and the first party secret sharing initial model parameters, in combination with the second non-zero portion determined by the second device and the second party secret sharing initial model parameters, to calculate each secret sharing error parameter item; the secret sharing model error is then calculated based on each secret sharing error parameter item, a secret sharing sample label and a preset secret sharing model error calculation formula. The secret sharing sample label is obtained by secret sharing the sample label with the second device: the first device holds the first-party secret sharing sample label, which is the first share of the sample label, and the second device holds the second-party secret sharing sample label, which is the second share of the sample label.
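The efficiency gain of working with the non-zero portions only can be sketched as follows (an illustrative column-sparse layout and hypothetical names, not the patent's data format):

```python
def accumulate_nonzero(sparse_cols, param_vec):
    """sparse_cols: {col_index: [(row_index, value), ...]} listing only the
    non-zero columns. Returns the accumulated value * param_vec[col_index]
    over the stored non-zero entries."""
    total = 0.0
    for j, entries in sparse_cols.items():      # only non-zero columns
        for _, value in entries:                # only non-zero cells
            total += value * param_vec[j]
    return total

# 2 non-zero columns out of a nominally huge feature space.
sparse_cols = {3: [(0, 2.0)], 7: [(1, 5.0)]}
params = {3: 0.5, 7: 0.1}
print(accumulate_nonzero(sparse_cols, params))  # -> 1.5
```

The loop never visits a zero entry, so its cost is proportional to the number of non-zero cells rather than to the full matrix size.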
Wherein the step of calculating a secret sharing model error based on the first non-zero portion of the first sparse data and the first party secret sharing initial model parameter, performing federal interaction with the second device to combine the second non-zero portion of the second sparse data acquired by the second device and the second party secret sharing initial model parameter, comprises:
Step S21, based on a preset secret sharing multiplication triplet, the first non-zero part and the first party secret sharing initial model parameter, federally interacting with the second equipment to combine the second non-zero part and the second party secret sharing initial model parameter, and calculating a sparse matrix safe inner product and a secret sharing intermediate parameter;
In this embodiment, it should be noted that the secret sharing error parameter items are parameter items for calculating the secret sharing error, and each secret sharing error parameter item includes a sparse matrix secure inner product and a secret sharing intermediate parameter.
Federal interaction is performed with the second device based on the preset secret sharing multiplication triplet, the first non-zero portion and the first party secret sharing initial model parameters, in combination with the second-party secret sharing multiplication triplet corresponding to the preset secret sharing multiplication triplet, the second non-zero portion and the second party secret sharing initial model parameters, to calculate the sparse matrix secure inner product and the secret sharing intermediate parameter.
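The "secret sharing multiplication triplet" is commonly realized as a Beaver triple (a, b, c) with c = a·b, pre-distributed as shares to both parties. The following sketch is an assumption about that construction (not taken from the patent text) showing how two parties obtain shares of a product x·y without revealing x or y:

```python
import random

MOD = 2**31

def split(v):
    s = random.randrange(MOD)
    return s, (v - s) % MOD

# Beaver triple: random a, b with c = a*b, pre-shared between the parties.
a, b = random.randrange(MOD), random.randrange(MOD)
c = (a * b) % MOD
a0, a1 = split(a); b0, b1 = split(b); c0, c1 = split(c)

# Inputs x, y held as shares (party 0 holds *0, party 1 holds *1).
x, y = 6, 7
x0, x1 = split(x); y0, y1 = split(y)

# The parties jointly open the masked values e = x - a and f = y - b
# (each broadcasts its local difference; only the masked sums are public).
e = (x0 - a0 + x1 - a1) % MOD
f = (y0 - b0 + y1 - b1) % MOD

# Local share computation; only party 0 adds the public e*f term.
z0 = (f * a0 + e * b0 + c0 + e * f) % MOD
z1 = (f * a1 + e * b1 + c1) % MOD

assert (z0 + z1) % MOD == (x * y) % MOD  # shares of the product x*y
```

The correctness follows from c + e·b + f·a + e·f = x·y after expanding e = x−a and f = y−b; since a and b are uniformly random, the opened values e and f leak nothing about x and y.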
Wherein the first party secret sharing initial model parameters include a first type of sharing model parameters and a second type of sharing model parameters, the second party secret sharing initial model parameters include a second-party first type of sharing model parameters and a second-party second type of sharing model parameters, and the sparse matrix secure inner product includes a first type sparse matrix secure inner product and a second type sparse matrix secure inner product,
The step of calculating the sparse matrix secure inner product and the secret sharing intermediate parameters based on a preset secret sharing multiplication triplet, the first non-zero portion and the first party secret sharing initial model parameters, federally interacting with the second device to combine the second non-zero portion and the second party secret sharing initial model parameters, comprises:
Step S211, performing federal interaction with the second device based on the first type sharing model parameters and the first non-zero portion, so as to combine the second-party first type sharing model parameters and the second non-zero portion, and calculating the first type sparse matrix secure inner product;
In this embodiment, it should be noted that the first type of sharing model parameters are the shares of the first-type model parameters held by the first device, and include a fifth sharing parameter and a sixth sharing parameter, where the fifth sharing parameter is the first share of the first-party first-type model parameter vector and the sixth sharing parameter is the second share of the second-party first-type model parameter vector. The second-party first type of sharing model parameters are the shares of the first-type model parameters held by the second device, and include a seventh sharing parameter and an eighth sharing parameter, where the seventh sharing parameter is the second share of the first-party first-type model parameter vector and the eighth sharing parameter is the first share of the second-party first-type model parameter vector. The first type sparse matrix secure inner product comprises a third non-zero feature item cross inner product and a fourth non-zero feature item cross inner product. The third non-zero feature item cross inner product is the cross feature item inner product of the first-party first-type parameter vector secret-shared in the first device and the first non-zero portion; that is, it is the accumulated value of the products of each column vector in the first non-zero portion and the first-party first-type parameter vector secret-shared in the first device, and it is representable by a vector. The fourth non-zero feature item cross inner product is the cross feature item inner product of the second-party first-type parameter vector secret-shared in the first device and the second non-zero portion; that is, it is the accumulated value of the products of the second-party first-type model parameter vector secret-shared in the first device and each column vector in the second non-zero portion, and it is likewise representable by a vector.
Based on the first type sharing model parameters and the first non-zero portion, federal interaction is performed with the second device, in combination with the second-party first type sharing model parameters and the second non-zero portion, to calculate the first type sparse matrix secure inner product. Specifically, the second device generates a third public key and a corresponding third private key, homomorphically encrypts the seventh sharing parameter based on the third public key to obtain an encrypted seventh sharing parameter, and sends the third public key and the encrypted seventh sharing parameter to the first device. The first device receives them, homomorphically encrypts the fifth sharing parameter based on the third public key to obtain an encrypted fifth sharing parameter, and calculates the sum of the encrypted fifth sharing parameter and the encrypted seventh sharing parameter to obtain an encrypted first-party first-type model parameter vector. The first device then calculates the product of the encrypted first-party first-type model parameter vector and each first non-zero column vector in the first non-zero portion of the first sparse matrix to obtain a first vector product corresponding to each first non-zero column vector, where each row of the first sparse matrix corresponds to one sample dimension and each column corresponds to one feature dimension, and accumulates the first vector products to obtain a third encrypted inner product. The first device further constructs a uniformly distributed third non-zero feature item cross inner product consistent with the feature dimension of the third encrypted inner product, homomorphically encrypts it based on the third public key to obtain an encrypted third non-zero feature item cross inner product, calculates the difference between the third encrypted inner product and the encrypted third non-zero feature item cross inner product to obtain an encrypted second-party third non-zero feature item cross inner product, and sends it to the second device. The second device decrypts the encrypted second-party third non-zero feature item cross inner product based on the third private key to obtain the second-party third non-zero feature item cross inner product, which is the cross feature item inner product of the first-party first-type parameter vector secret-shared in the second device and the first non-zero portion, that is, the accumulated value of the products of each column vector in the first non-zero portion and the first-party first-type parameter vector secret-shared in the second device.
Symmetrically, the first device generates a fourth public key and a corresponding fourth private key, homomorphically encrypts the sixth sharing parameter based on the fourth public key to obtain an encrypted sixth sharing parameter, and sends the fourth public key and the encrypted sixth sharing parameter to the second device. The second device receives them, homomorphically encrypts the eighth sharing parameter based on the fourth public key to obtain an encrypted eighth sharing parameter, and calculates the sum of the encrypted sixth sharing parameter and the encrypted eighth sharing parameter to obtain an encrypted second-party first-type parameter vector. The second device then calculates the product of the encrypted second-party first-type parameter vector and each second non-zero column vector in the second non-zero portion of the second sparse matrix to obtain a second vector product corresponding to each second non-zero column vector, where each row of the second sparse matrix corresponds to one sample dimension and each column corresponds to one feature dimension, and accumulates the second vector products to obtain a fourth encrypted inner product. The second device further constructs a second-party fourth non-zero feature item cross inner product consistent with the feature dimension of the fourth encrypted inner product, where the second-party fourth non-zero feature item cross inner product is the cross feature item inner product of the second-party first-type parameter vector secret-shared in the second device and the second non-zero portion, that is, the accumulated value of the products of the second-party first-type parameter vector secret-shared in the second device and each column vector in the second non-zero portion. The second device homomorphically encrypts it based on the fourth public key to obtain an encrypted second-party fourth non-zero feature item cross inner product, calculates the difference between the fourth encrypted inner product and the encrypted second-party fourth non-zero feature item cross inner product to obtain an encrypted fourth non-zero feature item cross inner product, and sends it to the first device. The first device receives the encrypted fourth non-zero feature item cross inner product and decrypts it to obtain the fourth non-zero feature item cross inner product, wherein the third non-zero feature item cross inner product and the fourth non-zero feature
item cross inner product can be calculated at the same time, alternatively, the calculations may be performed at different times.
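The encrypt-accumulate-mask-decrypt exchange above turns an inner product over the non-zero columns into two additive secret shares. The following sketch reproduces that share algebra with the homomorphic encryption replaced by plaintext arithmetic (in the real protocol, every quantity built from the shared parameters would be a ciphertext); all variable names and values are illustrative, not taken from the patent.

```python
import random

MOD = 2**32  # shares live in a finite ring

def share(v):
    """Additively secret-share a value: v = s1 + s2 (mod MOD)."""
    s1 = random.randrange(MOD)
    return s1, (v - s1) % MOD

# first-party first-type parameter vector w_A, secret-shared between the devices
# (corresponding roughly to the fifth and seventh shared parameters above)
w_A = [3, 0, 5, 1]
w_share_1, w_share_2 = zip(*(share(w) for w in w_A))

# first sparse matrix X_A: rows = samples, columns = features
X_A = [[1, 0, 2, 0],
       [0, 0, 4, 1]]
nonzero_cols = [j for j in range(4) if any(row[j] for row in X_A)]  # first non-zero part

# first device: reconstruct w_A "under encryption" and accumulate column products
w_enc = [(a + b) % MOD for a, b in zip(w_share_1, w_share_2)]
acc = [0] * len(X_A)
for j in nonzero_cols:
    for i, row in enumerate(X_A):
        acc[i] = (acc[i] + w_enc[j] * row[j]) % MOD

# first device keeps a random vector Q1 as its own share and sends acc - Q1 on;
# the second device "decrypts" that difference to obtain its share Q3
Q1 = [random.randrange(MOD) for _ in acc]
Q3 = [(a - q) % MOD for a, q in zip(acc, Q1)]

# the two shares recombine to the plaintext per-sample inner product X_A . w_A
recovered = [(q1 + q3) % MOD for q1, q3 in zip(Q1, Q3)]
expected = [sum(w_A[j] * row[j] for j in range(4)) for row in X_A]
assert recovered == expected
```

Neither device alone learns the plaintext result: the first device holds only the random mask Q1 and the second device only the masked difference Q3.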
Additionally, the expression of the third non-zero feature item cross inner product and the fourth non-zero feature item cross inner product is as follows:
Q1 = [[Σ(j=1..dA) wA*xA_j]]A
Q2 = [[Σ(j=1..dB) wB*xB_j]]A
Wherein Q1 is the third non-zero feature item cross inner product, Q2 is the fourth non-zero feature item cross inner product, wA is the first-party first-type model parameter vector, xA_j is the j-th first non-zero column vector, X_A is the first sparse matrix, dA represents that the first sparse matrix has dA feature dimensions, wB is the second-party first-type model parameter vector, xB_j is the j-th second non-zero column vector, X_B is the second sparse matrix, dB represents that the second sparse matrix has dB feature dimensions, and [[ ]]A represents that the data in brackets is secret-shared and belongs to the first device. Further, the expression of the second-party third non-zero feature item cross inner product and the second-party fourth non-zero feature item cross inner product is as follows:
Q3 = [[Σ(j=1..dA) wA*xA_j]]B
Q4 = [[Σ(j=1..dB) wB*xB_j]]B
Wherein Q3 is the second-party third non-zero feature item cross inner product, Q4 is the second-party fourth non-zero feature item cross inner product, and [[ ]]B represents that the data in brackets is secret-shared and belongs to the second device.
Additionally, the association between the third non-zero feature item cross inner product and the second-party third non-zero feature item cross inner product, and the association between the fourth non-zero feature item cross inner product and the second-party fourth non-zero feature item cross inner product, are as follows:
Q1 + Q3 = Σ(j=1..dA) wA*xA_j
Q2 + Q4 = Σ(j=1..dB) wB*xB_j
Step S212, based on the second-type sharing model parameters and the first non-zero part, performing federal interaction with the second device to combine the second-party second-type sharing model parameters and the second non-zero part, and calculating the second-type sparse matrix secure inner product;
In this embodiment, it should be noted that the second-type shared model parameter is the second-type shared secret held by the first device, and includes a first shared parameter and a third shared parameter, where the first shared parameter is the first device's share of the second-party second-type parameter matrix and the third shared parameter is the first device's share of the first-party second-type parameter matrix. The second-party second-type shared model parameter is the second-type shared secret held by the second device, and includes a second shared parameter and a fourth shared parameter, where the second shared parameter is the second device's share of the second-party second-type parameter matrix and the fourth shared parameter is the second device's share of the first-party second-type parameter matrix. The second-type sparse matrix secure inner product includes a first non-zero feature item cross inner product and a second non-zero feature item cross inner product. The first non-zero feature item cross inner product is the cross feature item inner product, secret-shared in the first device, of the second-party second-type parameter matrix and the second non-zero part, that is, the first device's share of the accumulated products of each column vector in the second-party second-type parameter matrix and each column vector in the second non-zero part. Similarly, the second non-zero feature item cross inner product is the cross feature item inner product, secret-shared in the first device, of the first-party second-type parameter matrix and the first non-zero part, that is, the first device's share of the accumulated products of each column vector in the first-party second-type parameter matrix and each column vector in the first non-zero part.
Based on the second-type shared model parameter and the first non-zero part, federal interaction is performed with the second device to combine the second-party second-type shared model parameter and the second non-zero part and calculate the second-type sparse matrix secure inner product. Specifically, based on the first shared parameter, the first device performs federal interaction with the second device to combine the second shared parameter and the second non-zero part, calculates the first non-zero feature item cross inner product, and assists the second device in calculating a second-party first non-zero feature item cross inner product corresponding to the first non-zero feature item cross inner product, where the second-party first non-zero feature item cross inner product is the second device's share of the accumulated products of each column vector in the second-party second-type parameter matrix and each column vector in the second non-zero part. Similarly, based on the third shared parameter and the first non-zero part, the first device performs federal interaction with the second device to combine the fourth shared parameter, calculates the second non-zero feature item cross inner product, and assists the second device in calculating a second-party second non-zero feature item cross inner product, where the second-party second non-zero feature item cross inner product is the second device's share of the accumulated products of each column vector in the first-party second-type parameter matrix and each column vector in the first non-zero part. The first non-zero feature item cross inner product and the second non-zero feature item cross inner product may be calculated at the same time or at different times.
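Only the non-zero columns of each sparse matrix ever enter these cross inner products, which is where the claimed efficiency gain comes from. A plain (non-federated) sketch, with illustrative values, showing that all-zero columns can be skipped without changing the accumulated column products:

```python
# X: rows = samples, columns = features; V: one latent vector per feature
# (a stand-in for a second-type parameter matrix; values are illustrative)
X = [[1, 0, 2, 0],
     [0, 0, 4, 1]]
V = [[1, 1], [5, 5], [2, 0], [0, 3]]

def latent_sums(X, V, cols):
    """Per-sample accumulated products of feature columns and latent vectors."""
    k = len(V[0])
    out = []
    for row in X:
        s = [0] * k
        for j in cols:
            for f in range(k):
                s[f] += row[j] * V[j][f]
        out.append(s)
    return out

dense = latent_sums(X, V, range(len(V)))                      # all columns
nonzero_cols = [j for j in range(len(V)) if any(row[j] for row in X)]
sparse = latent_sums(X, V, nonzero_cols)                      # non-zero part only
assert dense == sparse  # zero columns contribute nothing
```

In the federated protocol the same column loop runs over ciphertexts or shares, so skipping zero columns saves the expensive homomorphic operations, not just plain multiplications.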
Additionally, the expression of the first non-zero feature item cross inner product and the second non-zero feature item cross inner product is as follows:
R1 = [[Σ(j=1..dB) VB*xB_j]]A
R2 = [[Σ(j=1..dA) VA*xA_j]]A
Wherein R1 is the first non-zero feature item cross inner product, R2 is the second non-zero feature item cross inner product, VA is the first-party second-type parameter matrix, xA_j is the j-th first non-zero column vector, X_A is the first sparse matrix, VB is the second-party second-type parameter matrix, xB_j is the j-th second non-zero column vector, X_B is the second sparse matrix, and [[ ]]A represents that the data in brackets is secret-shared and belongs to the first device. Further, the expression of the second-party first non-zero feature item cross inner product and the second-party second non-zero feature item cross inner product is as follows:
R3 = [[Σ(j=1..dB) VB*xB_j]]B
R4 = [[Σ(j=1..dA) VA*xA_j]]B
Wherein R3 is the second-party first non-zero feature item cross inner product, R4 is the second-party second non-zero feature item cross inner product, and [[ ]]B represents that the data in brackets is secret-shared and belongs to the second device.
Additionally, the association between the first non-zero feature item cross inner product and the second-party first non-zero feature item cross inner product, and the association between the second non-zero feature item cross inner product and the second-party second non-zero feature item cross inner product, are as follows:
R1 + R3 = Σ(j=1..dB) VB*xB_j
R2 + R4 = Σ(j=1..dA) VA*xA_j
Step S213, performing federal interaction with the second device based on the second-type sharing model parameter, the first non-zero part and the preset secret sharing multiplication triplet, so as to combine the second-party second-type sharing model parameter and the second non-zero part, and calculating the secret sharing intermediate parameter.
In this embodiment, it should be noted that the preset secret sharing multiplication triplet is the secret sharing multiplication triplet held by the first device, and the second device holds a second-party secret sharing multiplication triplet corresponding to the preset secret sharing multiplication triplet, where the sum of the preset secret sharing multiplication triplet and the second-party secret sharing multiplication triplet is the multiplication triplet. The multiplication triplet is an array formed by three parameters having a product relationship; for example, assuming that the multiplication triplet is (a, b, c), there is the product relationship c = a*b.
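One common way to provision such triplets is for a trusted dealer to draw the triplet and split each component into additive shares; the patent does not fix the generation method, so the dealer below is an assumed setup and all names are illustrative:

```python
import random

MOD = 2**32  # shares live in a finite ring

def deal_triplet():
    """Draw a multiplication triplet (a, b, c) with c = a*b and split it
    into a first-device share and a second-device share."""
    a, b = random.randrange(MOD), random.randrange(MOD)
    c = (a * b) % MOD

    def split(v):
        s = random.randrange(MOD)
        return s, (v - s) % MOD

    (aA, aB), (bA, bB), (cA, cB) = split(a), split(b), split(c)
    return (aA, bA, cA), (aB, bB, cB)

tripA, tripB = deal_triplet()   # preset / second-party secret sharing triplets
a = (tripA[0] + tripB[0]) % MOD
b = (tripA[1] + tripB[1]) % MOD
c = (tripA[2] + tripB[2]) % MOD
assert c == (a * b) % MOD       # the recombined triplet keeps c = a*b
```

Each device sees only uniformly random shares, so neither learns (a, b, c) itself.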
Additionally, it should be noted that the second-type shared model parameter further includes the first shared parameter, a first shared transpose parameter corresponding to the first shared parameter, the third shared parameter, and a second shared transpose parameter corresponding to the third shared parameter, where the first shared transpose parameter is the share, held by the first device, of the transpose matrix of the second-party second-type parameter matrix, and the second shared transpose parameter is the share, held by the first device, of the transpose matrix of the first-party second-type parameter matrix. The second-party second-type shared model parameter further includes the second shared parameter, a third shared transpose parameter corresponding to the second shared parameter, the fourth shared parameter, and a fourth shared transpose parameter corresponding to the fourth shared parameter, where the third shared transpose parameter is the share, held by the second device, of the transpose matrix of the second-party second-type parameter matrix, and the fourth shared transpose parameter is the share, held by the second device, of the transpose matrix of the first-party second-type parameter matrix.
Additionally, the secret sharing intermediate parameter includes a first-party first secret sharing intermediate parameter and a first-party second secret sharing intermediate parameter, where the first-party first secret sharing intermediate parameter is the first device's share of the first intermediate parameter and the first-party second secret sharing intermediate parameter is the first device's share of the second intermediate parameter. The first intermediate parameter is the secret-shared non-zero feature item cross inner product formed jointly by the second-party second-type parameter matrix, its transpose matrix, the second non-zero portion of the second sparse matrix, and the non-zero portion of the transpose matrix corresponding to the second sparse matrix; that is, each value in the first intermediate parameter is a secret-shared product formed jointly by a column vector of the second-party second-type parameter matrix, a column vector of its transpose matrix, a column vector of the second non-zero portion of the second sparse matrix, and a column vector of the non-zero portion of the transpose matrix corresponding to the second sparse matrix. Similarly, each value in the second intermediate parameter is a secret-shared product formed jointly by a column vector of the first-party second-type parameter matrix, a column vector of its transpose matrix, a column vector of the first non-zero portion of the first sparse matrix, and a column vector of the non-zero portion of the transpose matrix corresponding to the first sparse matrix.
Federal interaction is performed with the second device based on the second-type shared model parameter, the first non-zero portion, and the preset secret sharing multiplication triplet, so as to combine the second-party second-type shared model parameter and the second non-zero portion and calculate the secret sharing intermediate parameter. Specifically, based on the preset secret sharing multiplication triplet, the first shared parameter, and the first shared transpose parameter, the first device performs federal interaction with the second device, so as to combine the second-party secret sharing multiplication triplet, the second shared parameter, and the third shared transpose parameter, calculates a first-party first transpose matrix inner product, and assists the second device in calculating a second-party first transpose matrix inner product. Based on the first-party first transpose matrix inner product, and combining the second-party first transpose matrix inner product and the second non-zero portion, the first-party first secret sharing intermediate parameter is calculated and the second device is assisted in calculating a second-party first secret sharing intermediate parameter. Similarly, based on the preset secret sharing multiplication triplet, the third shared parameter, and the second shared transpose parameter, the first device performs federal interaction with the second device, so as to combine the second-party secret sharing multiplication triplet, the fourth shared parameter, and the fourth shared transpose parameter, calculates a first-party second transpose matrix inner product, and assists the second device in calculating a second-party second transpose matrix inner product; based on the first-party second transpose matrix inner product, and combining the second-party second transpose matrix inner product and the first non-zero portion, the first-party second secret sharing intermediate parameter is calculated and the second device is assisted in calculating a second-party second secret sharing intermediate parameter. The expressions of the first-party first secret sharing intermediate parameter and the first-party second secret sharing intermediate parameter are as follows:
T1 = [[Σ(j=1..dB) Σ(j'=1..dB) (vB_j · vB_j') * (xB_j * xB_j')]]A
T2 = [[Σ(j=1..dA) Σ(j'=1..dA) (vA_j · vA_j') * (xA_j * xA_j')]]A
Wherein T1 is the first-party first secret sharing intermediate parameter, T2 is the first-party second secret sharing intermediate parameter, VB is the second-party second-type parameter matrix, vB_j is a column vector of the second-party second-type model parameter matrix, X_B is the second sparse matrix, xB_j is a column vector of the second non-zero portion of the second sparse matrix, VA is the first-party second-type parameter matrix, vA_j is a column vector of the first-party second-type model parameter matrix, X_A is the first sparse matrix, xA_j is a column vector of the first non-zero portion of the first sparse matrix, and xB_j * xB_j' (respectively xA_j * xA_j') denotes the element-wise product of the two column vectors.
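These intermediate parameters feed the pairwise feature-interaction term of the factorization machine. In the clear, that term admits the standard FM identity (not quoted from the patent), which reduces the pair sum to sums of column products and again touches only non-zero columns; the sketch below checks the identity against a brute-force pair sum with illustrative values:

```python
# Standard FM identity for one sample x with latent vectors v_j:
#   sum_{j<j'} (v_j . v_j') x_j x_j'
#     = 0.5 * ( (sum_j v_j x_j)^2 - sum_j (v_j x_j)^2 ), summed over factors

x = [1.0, 0.0, 2.0, 0.0]                              # one sample's feature row
V = [[1.0, 1.0], [5.0, 5.0], [2.0, 0.0], [0.0, 3.0]]  # latent vectors, k = 2
k = 2

# brute force over all feature pairs
brute = sum(
    sum(V[j][f] * V[jp][f] for f in range(k)) * x[j] * x[jp]
    for j in range(4) for jp in range(j + 1, 4)
)

# linear-time form via the identity, touching only non-zero columns
nz = [j for j in range(4) if x[j] != 0.0]
pair = 0.0
for f in range(k):
    s = sum(V[j][f] * x[j] for j in nz)
    sq = sum((V[j][f] * x[j]) ** 2 for j in nz)
    pair += 0.5 * (s * s - sq)

assert abs(brute - pair) < 1e-9
```

In the protocol the same quantities are accumulated as secret shares rather than plaintexts, but the column-wise structure is identical.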
Wherein the second-type shared model parameter includes a second-type secret sharing parameter matrix and a secret sharing transpose parameter matrix corresponding to the second-type secret sharing parameter matrix, and the second-party second-type shared model parameter includes a second-party second-type secret sharing parameter matrix and a second-party secret sharing transpose parameter matrix corresponding to the second-party second-type secret sharing parameter matrix,
the step of calculating the secret sharing intermediate parameter based on the second-type shared model parameter, the first non-zero portion, and the preset secret sharing multiplication triplet, by performing federal interaction with the second device to combine the second-party second-type shared model parameter and the second non-zero portion, includes:
Step A10, based on the preset secret sharing multiplication triplet, calculating a secret sharing product between the second-type secret sharing parameter matrix and the secret sharing transpose parameter matrix through federal interaction with the second device to obtain the secret sharing matrix inner product, so that the second device can calculate the secret sharing product between the second-party second-type secret sharing parameter matrix and the second-party secret sharing transpose parameter matrix to obtain the second-party secret sharing matrix inner product;
In this embodiment, it should be noted that the secret sharing matrix inner product includes a first-party first transpose matrix inner product and a first-party second transpose matrix inner product, and the second-party secret sharing matrix inner product includes a second-party first transpose matrix inner product and a second-party second transpose matrix inner product. The second-type secret sharing parameter matrix includes a first shared parameter matrix and a third shared parameter matrix, where the first shared parameter matrix is the matrix representation of the first shared parameter and the third shared parameter matrix is the matrix representation of the third shared parameter. The secret sharing transpose parameter matrix includes a first shared transpose parameter matrix and a second shared transpose parameter matrix, where the first shared transpose parameter matrix is the matrix representation of the first shared transpose parameter and the second shared transpose parameter matrix is the matrix representation of the second shared transpose parameter. The second-party second-type secret sharing parameter matrix includes a second shared parameter matrix and a fourth shared parameter matrix, where the second shared parameter matrix is the matrix representation of the second shared parameter and the fourth shared parameter matrix is the matrix representation of the fourth shared parameter. The second-party secret sharing transpose parameter matrix includes a third shared transpose parameter matrix and a fourth shared transpose parameter matrix, where the third shared transpose parameter matrix is the matrix representation of the third shared transpose parameter and the fourth shared transpose parameter matrix is the matrix representation of the fourth shared transpose parameter.
Based on the preset secret sharing multiplication triplet, the secret sharing product between the second-type secret sharing parameter matrix and the secret sharing transpose parameter matrix is calculated through federal interaction with the second device to obtain the secret sharing matrix inner product, so that the second device can calculate the secret sharing product between the second-party second-type secret sharing parameter matrix and the second-party secret sharing transpose parameter matrix to obtain the second-party secret sharing matrix inner product. Specifically, the first device blinds the first shared parameter matrix and the first shared transpose parameter matrix based on the preset secret sharing multiplication triplet, obtaining a first-party first shared blinding parameter matrix corresponding to the first shared parameter matrix and a first-party second shared blinding parameter matrix corresponding to the first shared transpose parameter matrix, and sends the first-party first shared blinding parameter matrix and the first-party second shared blinding parameter matrix to the second device. Similarly, the second device blinds the second shared parameter matrix and the third shared transpose parameter matrix based on the second-party secret sharing multiplication triplet, obtaining a second-party first shared blinding parameter matrix and a second-party second shared blinding parameter matrix, and sends them to the first device. Each device then calculates the sum of the first-party first shared blinding parameter matrix and the second-party first shared blinding parameter matrix to obtain a first blind parameter matrix, and the sum of the first-party second shared blinding parameter matrix and the second-party second shared blinding parameter matrix to obtain a second blind parameter matrix. Further, the first device calculates the first-party first transpose matrix inner product jointly corresponding to the preset secret sharing multiplication triplet, the first blind parameter matrix, and the second blind parameter matrix based on a preset first-party transpose matrix inner product calculation formula, and similarly the second device calculates the second-party first transpose matrix inner product jointly corresponding to the second-party secret sharing multiplication triplet, the first blind parameter matrix, and the second blind parameter matrix based on a preset second-party transpose matrix inner product calculation formula.
Further, the first device blinds the third shared parameter matrix and the second shared transpose parameter matrix based on the preset secret sharing multiplication triplet, obtaining a first-party third shared blinding parameter matrix corresponding to the third shared parameter matrix and a first-party fourth shared blinding parameter matrix corresponding to the second shared transpose parameter matrix, and sends them to the second device. Similarly, the second device blinds the fourth shared parameter matrix and the fourth shared transpose parameter matrix based on the second-party secret sharing multiplication triplet, obtaining a second-party third shared blinding parameter matrix corresponding to the fourth shared parameter matrix and a second-party fourth shared blinding parameter matrix corresponding to the fourth shared transpose parameter matrix, and sends them to the first device. Each device then calculates the sum of the first-party third shared blinding parameter matrix and the second-party third shared blinding parameter matrix to obtain a third blind parameter matrix, and the sum of the first-party fourth shared blinding parameter matrix and the second-party fourth shared blinding parameter matrix to obtain a fourth blind parameter matrix. Further, the first device calculates the first-party second transpose matrix inner product jointly corresponding to the preset secret sharing multiplication triplet, the third blind parameter matrix, and the fourth blind parameter matrix based on the preset first-party transpose matrix inner product calculation formula, and similarly the second device calculates the second-party second transpose matrix inner product jointly corresponding to the second-party secret sharing multiplication triplet, the third blind parameter matrix, and the fourth blind parameter matrix based on the preset second-party transpose matrix inner product calculation formula, where the preset first-party transpose matrix inner product calculation formula and the preset second-party transpose matrix inner product calculation formula are as follows:
[[x*y]]A=f*[[a]]A+e*[[b]]A+[[c]]A
[[x*y]]B=e*f+f*[[a]]B+e*[[b]]B+[[c]]B
Wherein [[x*y]]A is the first-party first transpose matrix inner product, [[x*y]]B is the second-party first transpose matrix inner product, e is the first blind parameter matrix, f is the second blind parameter matrix, and the multiplication triplet is (a, b, c), where c = a*b, the preset secret sharing multiplication triplet is ([[a]]A, [[b]]A, [[c]]A), and the second-party secret sharing multiplication triplet is ([[a]]B, [[b]]B, [[c]]B). For example, in one embodiment, the first-party first transpose matrix inner product and the second-party first transpose matrix inner product are calculated as follows:
First, assume that the first device holds the secret sharing multiplication triplet ([[a]]A, [[b]]A, [[c]]A) and the second device holds the second-party secret sharing multiplication triplet ([[a]]B, [[b]]B, [[c]]B), where [[a]]A+[[a]]B=a, [[b]]A+[[b]]B=b, [[c]]A+[[c]]B=c, and c=a*b. The first shared parameter matrix is [[x]]A and the first shared transpose parameter matrix is [[y]]A; in the second device, the second shared parameter matrix is [[x]]B and the third shared transpose parameter matrix is [[y]]B, where [[x]]A+[[x]]B=x and [[y]]A+[[y]]B=y. The first-party first transpose matrix inner product to be calculated by the first device is [[x*y]]A and the second-party first transpose matrix inner product to be calculated by the second device is [[x*y]]B. The calculation process is as follows:
First, the first device calculates [[e]]A=[[x]]A-[[a]]A and [[f]]A=[[y]]A-[[b]]A, and the second device calculates [[e]]B=[[x]]B-[[a]]B and [[f]]B=[[y]]B-[[b]]B. The first device sends [[e]]A and [[f]]A to the second device, and the second device sends [[e]]B and [[f]]B to the first device, so that the first device and the second device each obtain e=x-a and f=y-b. The first device then calculates [[x*y]]A=f*[[a]]A+e*[[b]]A+[[c]]A and the second device calculates [[x*y]]B=e*f+f*[[a]]B+e*[[b]]B+[[c]]B, so that [[x*y]]A+[[x*y]]B=e*f+f*a+e*b+c; substituting e=x-a and f=y-b into this expression yields [[x*y]]A+[[x*y]]B=x*y.
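The worked example above can be run end to end. The sketch below carries it out for scalar shares in a 32-bit ring (the ring size and concrete values are assumptions for illustration), reproducing the two formulas [[x*y]]A = f*[[a]]A + e*[[b]]A + [[c]]A and [[x*y]]B = e*f + f*[[a]]B + e*[[b]]B + [[c]]B:

```python
import random

MOD = 2**32  # shares live in a finite ring

def split(v):
    """Additively secret-share v: v = s1 + s2 (mod MOD)."""
    s = random.randrange(MOD)
    return s, (v - s) % MOD

# the secrets to be multiplied, held as additive shares
x, y = 123, 456
(xA, xB), (yA, yB) = split(x), split(y)

# a multiplication triplet (a, b, c) with c = a*b, also held as shares
a, b = random.randrange(MOD), random.randrange(MOD)
c = (a * b) % MOD
(aA, aB), (bA, bB), (cA, cB) = split(a), split(b), split(c)

# each device blinds its shares and exchanges the blinded values,
# so both devices learn e = x - a and f = y - b (but not x or y)
eA, fA = (xA - aA) % MOD, (yA - bA) % MOD
eB, fB = (xB - aB) % MOD, (yB - bB) % MOD
e, f = (eA + eB) % MOD, (fA + fB) % MOD

# each device evaluates its local formula to get a share of x*y
xyA = (f * aA + e * bA + cA) % MOD
xyB = (e * f + f * aB + e * bB + cB) % MOD
assert (xyA + xyB) % MOD == (x * y) % MOD
```

For the matrix case in the steps above, the same formulas apply entry-wise, with the blinded matrices e and f exchanged in place of scalars.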
Step A20, based on the secret sharing matrix inner product and the first non-zero portion, performing federal interaction with the second device to combine the second-party secret sharing matrix inner product and the second non-zero portion, and calculating the secret sharing intermediate parameter.
In this embodiment, it should be noted that, the secret sharing intermediate parameter includes a first party first secret sharing intermediate parameter matrix and a first party second secret sharing intermediate parameter matrix, where the first party first secret sharing intermediate parameter matrix is a matrix representation of the first party first secret sharing intermediate parameter, the first party second secret sharing intermediate parameter matrix is a matrix representation of the first party second secret sharing intermediate parameter, and the second device obtains the second party secret sharing intermediate parameter through federal interaction, where the second party secret sharing intermediate parameter includes a second party first secret sharing intermediate parameter matrix and a second party second secret sharing intermediate parameter matrix, where the second party first secret sharing intermediate parameter matrix is a matrix representation of the second party first secret sharing intermediate parameter, and the second party second secret sharing intermediate parameter matrix is a matrix representation of the second party second secret sharing intermediate parameter.
Based on the secret sharing matrix inner product and the first non-zero part, federal interaction is performed with the second device to combine the second-party secret sharing matrix inner product and the second non-zero part and calculate the secret sharing intermediate parameter. Specifically, the first device generates a fifth public key and a fifth private key corresponding to the fifth public key, homomorphically encrypts the first-party first transpose matrix inner product based on the fifth public key to obtain an encrypted first-party first transpose matrix inner product, and sends the fifth public key and the encrypted first-party first transpose matrix inner product to the second device. The second device homomorphically encrypts the second-party first transpose matrix inner product based on the fifth public key to obtain an encrypted second-party first transpose matrix inner product, and calculates the sum of the encrypted first-party first transpose matrix inner product and the encrypted second-party first transpose matrix inner product to obtain an encrypted first transpose matrix inner product, where the encrypted first transpose matrix inner product is a vector. The second device then generates the transpose matrix corresponding to the second sparse matrix to obtain a second sparse transpose matrix, calculates the first intermediate parameter product among each value in the encrypted first transpose matrix inner product, each non-zero column vector of the second sparse matrix, and each non-zero column vector of the second sparse transpose matrix, which correspond one to one, and accumulates the first intermediate parameter products to obtain a first encrypted intermediate parameter inner product. The second device further constructs, in a preset first vector space, a vector consistent with the feature dimension of the second sparse matrix as the second-party first secret sharing intermediate parameter, homomorphically encrypts it based on the fifth public key to obtain an encrypted second-party first secret sharing intermediate parameter, calculates the difference between the first encrypted intermediate parameter inner product and the encrypted second-party first secret sharing intermediate parameter to obtain an encrypted first-party first secret sharing intermediate parameter, and sends it to the first device. The first device decrypts the encrypted first-party first secret sharing intermediate parameter based on the fifth private key to obtain the first-party first secret sharing intermediate parameter.
Symmetrically, the second device generates a sixth public key and a sixth private key corresponding to the sixth public key, homomorphically encrypts the second-party second transpose matrix inner product based on the sixth public key to obtain an encrypted second-party second transpose matrix inner product, and sends the encrypted second-party second transpose matrix inner product and the sixth public key to the first device. The first device homomorphically encrypts the first-party second transpose matrix inner product based on the sixth public key to obtain an encrypted first-party second transpose matrix inner product, and calculates the sum of the encrypted second-party second transpose matrix inner product and the encrypted first-party second transpose matrix inner product to obtain an encrypted second transpose matrix inner product, where the encrypted second transpose matrix inner product is a vector. The first device then generates the transpose matrix corresponding to the first sparse matrix to obtain a first sparse transpose matrix, calculates the second intermediate parameter product among each value in the encrypted second transpose matrix inner product, each non-zero column vector of the first sparse matrix, and each non-zero column vector of the first sparse transpose matrix, which correspond one to one, and accumulates the second intermediate parameter products to obtain a second encrypted intermediate parameter inner product. The first device further constructs, in a preset second vector space, a vector consistent with the feature dimension of the first sparse matrix as the first-party second secret sharing intermediate parameter, homomorphically encrypts it based on the sixth public key to obtain an encrypted first-party second secret sharing intermediate parameter, calculates the difference between the second encrypted intermediate parameter inner product and the encrypted first-party second secret sharing intermediate parameter to obtain an encrypted second-party second secret sharing intermediate parameter, and sends it to the second device. The second device decrypts the encrypted second-party second secret sharing intermediate parameter based on the sixth private key to obtain the second-party second secret sharing intermediate parameter.
Step S22, calculating the secret sharing model error based on the sparse matrix safe inner product, the secret sharing intermediate parameter and a preset secret sharing model error calculation formula.
In this embodiment, it should be noted that the sparse matrix security inner product includes the first type sparse matrix security inner product and the second type sparse matrix security inner product, where the first type sparse matrix security inner product includes a third non-zero feature item cross inner product and a fourth non-zero feature item cross inner product, the second type sparse matrix security inner product includes a first non-zero feature item cross inner product and a second non-zero feature item cross inner product, and the secret sharing intermediate parameter includes a first secret sharing intermediate parameter and a second secret sharing intermediate parameter.
Calculating the secret sharing model error based on the sparse matrix safe inner product, the secret sharing intermediate parameter and a preset secret sharing model error calculation formula specifically proceeds as follows: based on a model output calculation formula, the first device calculates a first party first model output jointly corresponding to the fourth non-zero feature item cross inner product, the first non-zero feature item cross inner product and the first party first secret sharing intermediate parameter, and a first party second model output jointly corresponding to the third non-zero feature item cross inner product, the second non-zero feature item cross inner product and the first party second secret sharing intermediate parameter, and then substitutes the first party first model output, the first party second model output and the secret-shared sample label held by its own party into the preset secret sharing model error calculation formula, wherein the calculation expression of the first party first model output is as follows:
Wherein [[f(X_B)]]_A is the first party first model output, and the remaining symbols denote, respectively, the fourth non-zero feature item cross inner product, the first non-zero feature item cross inner product and the first party first secret sharing intermediate parameter; the calculation expression of the first party second model output is as follows:
Wherein [[f(X_A)]]_A is the first party second model output, and the remaining symbols denote, respectively, the third non-zero feature item cross inner product, the second non-zero feature item cross inner product and the first party second secret sharing intermediate parameter; the calculation expression of the secret sharing model error is as follows:
Wherein Y is the sample label and [[f(X_A, X_B) - Y]]_A is the secret sharing model error.
Similarly, the second device calculates a second party first model output and a second party second model output, and substitutes the second party first model output, the second party second model output and the secret-shared sample label held by its own party into the preset secret sharing model error calculation formula to calculate the second party secret sharing model error, wherein the calculation expression of the second party first model output is as follows:
Wherein [[f(X_B)]]_B is the second party first model output, and the remaining symbols denote, respectively, the second party fourth non-zero feature item cross inner product, the second party first non-zero feature item cross inner product and the second party first secret sharing intermediate parameter; the calculation expression of the second party second model output is as follows:
Wherein [[f(X_A)]]_B is the second party second model output, and the remaining symbols denote, respectively, the second party third non-zero feature item cross inner product, the second party second non-zero feature item cross inner product and the second party second secret sharing intermediate parameter; the calculation expression of the second party secret sharing model error is as follows:
Wherein Y is the sample label and [[f(X_A, X_B) - Y]]_B is the second party secret sharing model error.
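Assuming, as the decomposition above suggests, that the joint model output is the sum of the two partial outputs, the error formula reduces to purely local share arithmetic: each device combines only the shares it already holds, and the two resulting error shares sum to the true error. A toy scalar sketch (all values and names hypothetical):

```python
# Toy stand-ins: the true partial model outputs and the label.
f_xa, f_xb, y = 2.5, 1.5, 3.0                  # f(X_A), f(X_B), sample label Y

# Additive shares as produced by the preceding protocol (values arbitrary).
fa_A, fa_B = 1.0, 1.5                          # [[f(X_A)]]_A + [[f(X_A)]]_B = f(X_A)
fb_A, fb_B = 0.5, 1.0                          # [[f(X_B)]]_A + [[f(X_B)]]_B = f(X_B)
y_A,  y_B  = 1.2, 1.8                          # [[Y]]_A + [[Y]]_B = Y

# Each party computes its error share from its own shares alone.
err_A = fa_A + fb_A - y_A                      # [[f(X_A, X_B) - Y]]_A
err_B = fa_B + fb_B - y_B                      # [[f(X_A, X_B) - Y]]_B

assert abs((err_A + err_B) - (f_xa + f_xb - y)) < 1e-9
```

No communication is needed for this step: linearity of the error in the shared quantities is what keeps it local.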
And step S30, updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factorization machine model.
In this embodiment, the longitudinal federal factorization machine model includes a first target model parameter belonging to a first device and a second target model parameter belonging to a second device.
Updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factorization machine model specifically proceeds as follows: the first party secret sharing initial model parameter is updated based on the secret sharing model error to obtain a first party secret sharing initial update parameter, while the second device updates the second party secret sharing initial model parameter based on the second party secret sharing model error to obtain a second party secret sharing initial update parameter. It is then judged whether the first party secret sharing initial update parameter meets a preset iteration update end condition; if so, the first device takes the first party secret sharing initial update parameter as the secret sharing update parameter, and the second device takes the second party secret sharing initial update parameter as the second party secret sharing update parameter. The first device further performs decryption interaction with the second device based on the secret sharing update parameter, combining the second party secret sharing update parameter, to determine the first target model parameter and to assist the second device in determining the second target model parameter, where the second device provides the second party secret sharing update parameter during the decryption interaction, and the preset iteration update end condition includes reaching a maximum iteration loss threshold and the like.
The step of updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factorization machine model comprises the following steps:
Step S31, updating the secret sharing initial model parameters based on the secret sharing model errors to obtain secret sharing update parameters;
In this embodiment, it should be noted that, based on the secret sharing model error, the secret sharing initial model parameter is updated to obtain a secret sharing update parameter, specifically, gradient information of the secret sharing model error about the secret sharing initial model parameter is calculated respectively, and further, based on the gradient information, the secret sharing initial model parameter is iteratively updated until the iteratively updated secret sharing initial model parameter reaches a preset iteration update end condition, so as to obtain the secret sharing update parameter, where in an embodiment, the gradient information includes a first type gradient, a second type gradient, a third type gradient and a fourth type gradient, and a calculation expression of the first type gradient is as follows:
Wherein D_1 is the first type gradient, α is a hyperparameter whose size can be set as needed and which is used to control the value range of the gradient, w_A is the first party first type parameter vector, and [[w_A]]_A is the secret-shared first party first type parameter vector held by the first device; correspondingly, the calculation expression of the second type gradient is as follows:
Wherein D_2 is the second type gradient, α is a hyperparameter whose size can be set as needed and which is used to control the value range of the gradient, V_A is the first party second type parameter matrix, and [[V_A]]_A is the secret-shared first party second type parameter matrix held by the first device; further, the calculation expression of the third type gradient is as follows:
Wherein D_3 is the third type gradient, α is a hyperparameter whose size can be set as needed and which is used to control the value range of the gradient, w_B is the second party first type parameter vector, and [[w_B]]_A is the secret-shared second party first type parameter vector held by the first device; additionally, the calculation expression of the fourth type gradient is as follows:
Wherein D_4 is the fourth type gradient, α is a hyperparameter whose size can be set as needed and which is used to control the value range of the gradient, V_B is the second party second type parameter matrix, and [[V_B]]_A is the secret-shared second party second type parameter matrix held by the first device.
Additionally, it should be noted that the second device may calculate gradient information of the second party secret sharing model error about the second party secret sharing initial model parameter, and iteratively update the second party secret sharing initial model parameter based on the gradient information until a preset iteration training end condition is reached, so as to obtain a second party secret sharing update parameter.
Further, based on gradient information, the computing expression for updating the secret sharing initial model parameters is as follows:
Wherein δ_1, δ_2, δ_3 and δ_4 are learning rates that are set in advance, and the four updated quantities are, respectively, the update model parameter corresponding to the secret-shared first party first type parameter vector held by the first device, the update model parameter corresponding to the secret-shared first party second type parameter matrix held by the first device, the update model parameter corresponding to the secret-shared second party first type parameter vector held by the first device, and the update model parameter corresponding to the secret-shared second party second type parameter matrix held by the first device.
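Because the update rule is linear in the shares, each device can apply the learning-rate step to its own parameter share using its own gradient share, and the reconstructed parameter still matches the centralized gradient step. A minimal scalar sketch (all values hypothetical):

```python
# Each party updates its share of a parameter with its share of the gradient;
# reconstruction then matches the centralized SGD update w - delta * D.
delta = 0.1                                    # learning rate (delta_1 in the text)

w = 5.0                                        # true parameter, e.g. w_A
w_share_A, w_share_B = 3.0, 2.0                # [[w_A]]_A + [[w_A]]_B = w_A
g = 4.0                                        # true gradient D_1
g_share_A, g_share_B = 1.5, 2.5                # additive shares of D_1

new_A = w_share_A - delta * g_share_A          # first device's local update
new_B = w_share_B - delta * g_share_B          # second device's local update

assert abs((new_A + new_B) - (w - delta * g)) < 1e-9
```

This is why the iterative update loop needs no extra decryption rounds: only the final parameters are reconstructed.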
And step S32, carrying out decryption interaction with the second device based on the secret sharing update parameters to obtain the first target model parameters so as to obtain the second target model parameters by the second device.
In this embodiment, decryption interaction is performed with the second device based on the secret sharing update parameter to obtain the first target model parameter, so that the second device obtains the second target model parameter, specifically, decryption interaction is performed with the second device based on the secret sharing update parameter to combine the second secret sharing update parameter in the second device, and the first target model parameter is calculated, wherein when decryption interaction is performed, the second device combines the secret sharing update parameter in the first device based on the second secret sharing update parameter, and calculates the second target model parameter.
Wherein the secret sharing update parameters include a first sharing first party model update parameter and a first sharing second party model update parameter,
The step of obtaining the first target model parameter by decrypting interaction with the second device based on the secret sharing update parameter, so that the second device obtains the second target model parameter comprises the following steps:
Step S321, transmitting the first shared second party model update parameter to the second device, so that the second device calculates the second target model parameter based on the determined second shared second party model update parameter and the first shared second party model update parameter;
In this embodiment, it should be noted that the first shared first party model update parameter is a first party model update parameter held by a first device that is shared by a secret, the first shared second party model update parameter is a second party model update parameter held by a first device that is shared by a secret, and the second shared second party model update parameter is the second party model update parameter held by a second device that is shared by a secret.
The first shared second party model updating parameters are sent to the second equipment for the second equipment to calculate the second target model parameters based on the determined second shared second party model updating parameters and the first shared second party model updating parameters, and specifically, the first shared second party model updating parameters are sent to the second equipment for the second equipment to calculate the sum of the second shared second party model updating parameters and the first shared second party model updating parameters, so that the second target model parameters are obtained.
Step S322, receiving a second shared first party model update parameter sent by the second device, and calculating the first target model parameter based on the second shared first party model update parameter and the first shared first party model update parameter.
In this embodiment, the second shared first party model update parameter is a first party model update parameter held by a second device that shares a secret.
Receiving a second shared first party model updating parameter sent by the second equipment, calculating the first target model parameter based on the second shared first party model updating parameter and the first shared first party model updating parameter, specifically, receiving the second shared first party model updating parameter sent by the second equipment, and calculating the sum of the second shared first party model updating parameter and the first shared first party model updating parameter to obtain the first target model parameter.
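Steps S321 and S322 amount to exchanging the cross-held shares and summing: each device ends up with the full parameter it owns. A toy sketch with scalar stand-ins for the model update parameters (values hypothetical):

```python
# True parameters each party should end up with.
w_A_true, w_B_true = 7.0, 9.0

# Shares after training: the first device holds [[w_A]]_A and [[w_B]]_A,
# the second device holds [[w_A]]_B and [[w_B]]_B.
wA_A, wA_B = 4.0, 3.0
wB_A, wB_B = 6.0, 3.0

# S321: first device sends [[w_B]]_A; second device reconstructs w_B.
second_target = wB_B + wB_A
# S322: second device sends [[w_A]]_B; first device reconstructs w_A.
first_target = wA_A + wA_B

assert first_target == w_A_true and second_target == w_B_true
```

Note that each device only ever receives shares of its own parameters, so the peer's target model parameters are never revealed.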
Compared with the existing longitudinal federal learning method, when the longitudinal federal factorization machine model is constructed based on longitudinal federal learning, the present application combines a secret sharing mechanism with a homomorphic encryption method to calculate the cross inner products of non-zero feature items, which avoids computation over the zero parts of the sparse matrix and reduces the computational complexity of the factorization machine model based on the sparse matrix, thereby improving the computational efficiency of constructing the longitudinal federal factorization machine model. Moreover, since the feature richness of the training samples is higher when modeling based on longitudinal federal learning, the model performance of the longitudinal federal factorization machine model is better, and its personalized recommendation effect when serving as a recommendation model is better.
Compared with the prior art, which performs federal learning by a method based on homomorphic encryption, in the embodiment of the present application, after the initialization model parameters and the first sparse data corresponding to the preset initialization model are acquired, secret sharing is performed with the second device, whereby the first device obtains the first party secret sharing initial model parameters and the second device obtains the second party secret sharing initial model parameters; federal interaction is then performed with the second device based on the first non-zero part in the first sparse data and the first party secret sharing initial model parameters, so that only the non-zero parts of the first sparse data and the second sparse data participate in the calculation and interaction, which greatly reduces the amount of calculation and the calculation complexity in the federal interaction process, thereby overcoming the technical defect in the prior art of low calculation efficiency when performing federal learning based on a sparse matrix by a homomorphic encryption method.
Further, referring to fig. 2, in another embodiment of the present application, the second type of sparse matrix safe inner product comprises a first non-zero feature term cross inner product and a second non-zero feature term cross inner product,
The step of calculating a second type sparse matrix security inner product based on the second type shared model parameters and the first non-zero portion, performing federal interaction with the second device to combine the second party second type shared model parameters and the second non-zero portion, includes:
Step B10, based on the second type sharing model parameters, performing federal interaction with the second device to calculate a cross inner product between the second type sharing model parameters and the second non-zero part, and obtaining the first non-zero feature item cross inner product;
In this embodiment, it should be noted that the second type sharing model parameter includes a first sharing parameter, where the first sharing parameter is the secret-shared second party second type parameter matrix held by the first device, that is, the first sharing parameter is the second share of the second party second type parameter matrix, and the second party second type sharing model parameter includes a second sharing parameter, where the second sharing parameter is the secret-shared second party second type parameter matrix held by the second device, that is, the second sharing parameter is the first share of the second party second type parameter matrix. For example, if the second party second type sharing model parameter is V_B and V_B = [[V_B]]_A + [[V_B]]_B, then [[V_B]]_A is the first sharing parameter and [[V_B]]_B is the second sharing parameter.
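The sharing construction V_B = [[V_B]]_A + [[V_B]]_B can be sketched as elementwise additive sharing over a modulus; a minimal illustration (the modulus and matrix values are arbitrary toy choices):

```python
import random

MOD = 2**16

def share_matrix(m):
    """Additively secret-share a matrix elementwise: m = s1 + s2 (mod MOD)."""
    s1 = [[random.randrange(MOD) for _ in row] for row in m]
    s2 = [[(v - r) % MOD for v, r in zip(row, rrow)] for row, rrow in zip(m, s1)]
    return s1, s2

V_B = [[3, 1], [4, 1], [5, 9]]                 # toy second party second type parameter matrix
VB_B, VB_A = share_matrix(V_B)                 # [[V_B]]_B kept by the owner, [[V_B]]_A sent to the peer
recon = [[(a + b) % MOD for a, b in zip(ra, rb)] for ra, rb in zip(VB_A, VB_B)]
assert recon == V_B
```

Each share is uniformly random on its own, so handing [[V_B]]_A to the first device reveals nothing about V_B.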
Performing federal interaction with the second device based on the second type sharing model parameter to calculate the cross inner product between the second type sharing model parameter and the second non-zero portion and obtain the first non-zero feature item cross inner product specifically proceeds as follows: the first device generates a homomorphic-encryption first public-private key pair, where the first public-private key pair includes a first public key and a first private key, homomorphically encrypts the first sharing parameter based on the first public key to obtain an encrypted first sharing parameter, and sends the encrypted first sharing parameter and the first public key to the second device. The second device homomorphically encrypts the second sharing parameter based on the first public key to obtain an encrypted second sharing parameter, and calculates the sum of the encrypted first sharing parameter and the encrypted second sharing parameter to obtain the encrypted second party second type parameter matrix of the current iteration. The second device then calculates the cross inner product of the encrypted second party second type parameter matrix and the second non-zero portion to obtain a second encrypted inner product, constructs, based on a first target feature dimension of the encrypted second party second type parameter matrix, a uniformly distributed vector in the first target feature dimension as the second party first non-zero feature item cross inner product, homomorphically encrypts the second party first non-zero feature item cross inner product based on the first public key to obtain an encrypted second party first non-zero feature item cross inner product, calculates the difference between the second encrypted inner product and the encrypted second party first non-zero feature item cross inner product to obtain an encrypted first non-zero feature item cross inner product, and sends the encrypted first non-zero feature item cross inner product to the first device. The first device decrypts the encrypted first non-zero feature item cross inner product based on the first private key to obtain the first non-zero feature item cross inner product.
Wherein the second type of sharing model parameters comprise first sharing parameters, the second party second type of sharing model parameters comprise second sharing parameters,
The step of performing federal interaction with the second device based on the second type of shared model parameters to calculate a cross inner product between the second type of shared model parameters and the second non-zero portion, the step of obtaining the first non-zero feature term cross inner product comprising:
step B11, generating a first public key, and encrypting the first shared parameter based on the first public key to obtain an encrypted first shared parameter;
in this embodiment, it should be noted that the encryption method includes homomorphic encryption.
Step B12, the first public key and the encrypted first shared parameter are sent to a second device, so that the second device determines a second party first non-zero feature item cross inner product and an encrypted first non-zero feature item cross inner product based on the first public key, the encrypted first shared parameter, a second shared parameter and the second non-zero part;
In this embodiment, the first public key and the encrypted first sharing parameter are sent to the second device, so that the second device determines a second party first non-zero feature item cross inner product and an encrypted first non-zero feature item cross inner product based on the first public key, the encrypted first sharing parameter, the second sharing parameter and the second non-zero part. Specifically, the encrypted first sharing parameter and the first public key are sent to the second device; the second device homomorphically encrypts the second sharing parameter based on the first public key to obtain an encrypted second sharing parameter, and calculates the sum of the encrypted first sharing parameter and the encrypted second sharing parameter to obtain the encrypted second party second type parameter matrix of the current iteration. The second device then calculates the products between each column vector in the encrypted second party second type parameter matrix and each column vector in the second non-zero part and accumulates them to obtain a second encrypted inner product, constructs a uniformly distributed vector in the first target feature dimension of the encrypted second party second type parameter matrix as the second party first non-zero feature item cross inner product, homomorphically encrypts the second party first non-zero feature item cross inner product based on the first public key, calculates the difference between the second encrypted inner product and the encrypted second party first non-zero feature item cross inner product to obtain the encrypted first non-zero feature item cross inner product, and sends the encrypted first non-zero feature item cross inner product to the first device.
And B13, receiving the encrypted first non-zero characteristic item cross inner product sent by the second equipment, and decrypting the encrypted first non-zero characteristic item cross inner product based on a first private key corresponding to the first public key to obtain the first non-zero characteristic item cross inner product.
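A minimal end-to-end sketch of steps B11 to B13 for a single matrix entry and a single non-zero feature value, using a toy Paillier-style additively homomorphic scheme (deliberately tiny, insecure parameters; all names hypothetical):

```python
import math
import random

# --- toy additively homomorphic (Paillier-style) scheme; insecure toy primes ---
def keygen(p=101, q=103):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)                       # valid because the generator is n + 1
    return n, (n, lam, mu)                     # public key n, private key (n, lam, mu)

def enc(n, m):
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:                 # r must be invertible mod n
        r = random.randrange(1, n)
    return pow(n + 1, m % n, n2) * pow(r, n, n2) % n2

def dec(sk, c):
    n, lam, mu = sk
    u = pow(c, lam, n * n)
    return (u - 1) // n * mu % n

# --- steps B11-B13, scalar stand-ins for one matrix entry and one feature ---
pk, sk = keygen()
vB_A, vB_B = 3, 4                              # shares of one V_B entry (true value 7)
x = 5                                          # one non-zero feature value on the second device

c_A = enc(pk, vB_A)                            # B11: first device encrypts its share
c_sum = c_A * enc(pk, vB_B) % (pk * pk)        # B12: ciphertext product = Enc(vB_A + vB_B)
c_prod = pow(c_sum, x, pk * pk)                # homomorphic scalar mult: Enc(V_B entry * x)
r = random.randrange(pk)                       # second party's share of the cross inner product
c_masked = c_prod * enc(pk, -r) % (pk * pk)    # Enc(V_B * x - r), sent back to the first device

share_first = dec(sk, c_masked)                # B13: first device decrypts its share
assert (share_first + r) % pk == (vB_A + vB_B) * x % pk
```

The random mask r plays the role of the uniformly distributed second party first non-zero feature item cross inner product: the decrypted value and r are additive shares of the true product.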
And step B20, performing federal interaction with the second device based on the first non-zero part to calculate a cross inner product between the first non-zero part and the second-party second-type shared model parameter, and obtaining the second non-zero characteristic item cross inner product.
In this embodiment, it should be noted that the third sharing parameter is the secret-shared first party second type parameter matrix held by the first device, that is, the third sharing parameter is the first share of the first party second type parameter matrix, and the fourth sharing parameter is the secret-shared first party second type parameter matrix held by the second device, that is, the fourth sharing parameter is the second share of the first party second type parameter matrix. For example, if the first party second type sharing parameter is V_A and V_A = [[V_A]]_A + [[V_A]]_B, then [[V_A]]_A is the third sharing parameter and [[V_A]]_B is the fourth sharing parameter.
Performing federal interaction with the second device based on the first non-zero portion to calculate the cross inner product between the first non-zero portion and the second party second type shared model parameter and obtain the second non-zero feature item cross inner product specifically proceeds as follows: the second device generates a second public-private key pair, where the second public-private key pair includes a second public key and a second private key, encrypts the fourth sharing parameter based on the second public key to obtain an encrypted fourth sharing parameter, and sends the second public key and the encrypted fourth sharing parameter to the first device. The first device encrypts the third sharing parameter based on the second public key to obtain an encrypted third sharing parameter, calculates the encrypted first party second type parameter matrix based on the encrypted third sharing parameter and the encrypted fourth sharing parameter, and calculates the cross inner product of the encrypted first party second type parameter matrix and the first non-zero portion to obtain a first encrypted inner product. The first device then constructs, based on a second target feature dimension of the encrypted first party second type parameter matrix, a uniformly distributed vector in the second target feature dimension as the second non-zero feature item cross inner product, encrypts the second non-zero feature item cross inner product based on the second public key to obtain an encrypted second non-zero feature item cross inner product, calculates the difference between the first encrypted inner product and the encrypted second non-zero feature item cross inner product to obtain an encrypted second party second non-zero feature item cross inner product, and sends the encrypted second party second non-zero feature item cross inner product to the second device. The second device decrypts the encrypted second party second non-zero feature item cross inner product based on the second private key to obtain the second party second non-zero feature item cross inner product.
Wherein the second type of shared model parameters comprise third shared parameters, the second party second type of shared model parameters comprise fourth shared parameters,
The step of performing federal interaction with the second device based on the first non-zero portion to calculate a cross inner product between the first non-zero portion and the second-party second-type shared model parameter, the step of obtaining the second non-zero feature term cross inner product comprising:
Step B21, receiving a second public key sent by the second device and a sent encrypted fourth shared parameter, wherein the encrypted fourth shared parameter is the fourth shared parameter encrypted by the second device based on the second public key;
in this embodiment, the method for encrypting the fourth shared parameter by the second device includes homomorphic encryption.
Step B22, calculating a second non-zero feature item cross inner product and an encrypted second non-zero feature item cross inner product of the second party based on the second public key, the encrypted fourth sharing parameter, the first non-zero part and the third sharing parameter;
In this embodiment, the second non-zero feature item cross inner product and the encrypted second party second non-zero feature item cross inner product are calculated based on the second public key, the encrypted fourth sharing parameter, the first non-zero part and the third sharing parameter. Specifically, the third sharing parameter is homomorphically encrypted based on the second public key to obtain an encrypted third sharing parameter, the encrypted first party second type parameter matrix is calculated based on the encrypted third sharing parameter and the encrypted fourth sharing parameter, and the cross inner product of the encrypted first party second type parameter matrix and the first non-zero part is calculated to obtain a first encrypted inner product. A uniformly distributed vector in the second target feature dimension of the encrypted first party second type parameter matrix is then constructed as the second non-zero feature item cross inner product, the second non-zero feature item cross inner product is homomorphically encrypted based on the second public key to obtain an encrypted second non-zero feature item cross inner product, and the difference between the first encrypted inner product and the encrypted second non-zero feature item cross inner product is calculated to obtain the encrypted second party second non-zero feature item cross inner product.
Wherein the step of calculating a second non-zero feature item cross inner product and encrypting a second non-zero feature item cross inner product of a second party based on the second public key, the encrypted fourth shared parameter, the first non-zero portion, and the third shared parameter comprises:
Step B221, encrypting the third sharing parameter based on the second public key to obtain an encrypted third sharing parameter, and calculating an encryption model parameter which corresponds to the encrypted third sharing parameter and the encrypted fourth sharing parameter together;
in this embodiment, it should be noted that the encryption model parameter is a sum of the encryption third shared parameter and the encryption fourth shared parameter, and the encryption model parameter is the encryption first-party second-type parameter matrix.
Step B222, calculating the cross inner product between each column vector in the encryption model parameter and each column vector in the first non-zero part to obtain the first encryption inner product;
In this embodiment, the cross inner product between each column vector in the encryption model parameter and each column vector in the first non-zero portion is calculated to obtain the first encrypted inner product, specifically, the product between each column vector in the encryption model parameter and each column vector in the first non-zero portion is calculated to obtain each second vector product, and then each second vector product is accumulated to obtain the first encrypted inner product.
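Step B222's accumulation over non-zero column vectors is where the sparsity saving comes from: zero columns are skipped entirely, yet the result equals the dense computation. A plain-Python sketch (toy data, names hypothetical):

```python
# Cross inner product computed only over the non-zero columns of a sparse
# block, matching the accumulation of "second vector products" in step B222.
def sparse_cross_inner_product(columns, nonzero_index, param_columns):
    """Sum of elementwise column products, visiting non-zero columns only."""
    total = 0
    for j in nonzero_index:                    # skip zero columns entirely
        total += sum(a * b for a, b in zip(param_columns[j], columns[j]))
    return total

# Toy data: 5 columns of dimension 2, only columns 1 and 3 are non-zero.
X = [[0, 0], [2, 1], [0, 0], [1, 3], [0, 0]]   # stand-in first non-zero part (with its zeros)
V = [[1, 1], [4, 2], [1, 1], [2, 5], [1, 1]]   # stand-in (encrypted) parameter columns
nonzero = [1, 3]

dense = sum(a * b for col_v, col_x in zip(V, X) for a, b in zip(col_v, col_x))
assert sparse_cross_inner_product(X, nonzero, V) == dense == 27
```

With homomorphic ciphertexts, each skipped column also saves an expensive ciphertext operation, which is the efficiency gain claimed in this embodiment.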
Step B223, constructing a second non-zero feature item cross inner product based on the feature dimension of the encryption model parameter, and calculating the second non-zero feature item cross inner product of the encryption second party, which corresponds to the first encryption inner product together;
In this embodiment, the second non-zero feature item cross inner product is constructed based on the feature dimension of the encryption model parameter, and the encrypted second party second non-zero feature item cross inner product jointly corresponding to it and the first encrypted inner product is calculated. Specifically, a uniformly distributed vector consistent with the feature dimension of the encryption model parameter is constructed as the second non-zero feature item cross inner product, the second non-zero feature item cross inner product is homomorphically encrypted based on the second public key to obtain the encrypted second non-zero feature item cross inner product, and the difference between the first encryption inner product and the encrypted second non-zero feature item cross inner product is calculated to obtain the encrypted second party second non-zero feature item cross inner product.
And B23, transmitting the second non-zero characteristic item cross inner product of the encrypted second party to the second device, so that the second device decrypts the second non-zero characteristic item cross inner product of the encrypted second party based on a second private key corresponding to the second public key to obtain the second non-zero characteristic item cross inner product of the second party.
In this embodiment, it should be noted that the second public key and the second private key are homomorphic encrypted public-private key pairs.
The present embodiment provides a method for calculating the secret-shared second type sparse matrix security inner product based on a combination of secret sharing and homomorphic encryption; that is, federal interaction is performed with the second device based on the second type sharing model parameter to calculate the cross inner product between the second type sharing model parameter and the second non-zero portion and obtain the first non-zero feature item cross inner product, and federal interaction is performed with the second device based on the first non-zero portion to calculate the cross inner product between the first non-zero portion and the second party second type sharing model parameter and obtain the second non-zero feature item cross inner product. In the whole process, only the first non-zero part of the first sparse data and the second non-zero part of the second sparse data participate in the calculation, which avoids computing over the zero parts of the first sparse data and the second sparse data; since the proportion of zero parts in sparse data is usually far greater than that of non-zero parts, the amount of calculation and the calculation complexity in the federal interaction process are greatly reduced. Compared with the technical means of performing federal learning by a homomorphic encryption method in the prior art, this overcomes the technical defect of low calculation efficiency when performing federal learning based on a sparse matrix by a homomorphic encryption method, and thus further improves the calculation efficiency of federal learning based on a sparse matrix.
Further, referring to fig. 3, in another embodiment of the present application, the personalized recommendation method is applied to a first device, and the personalized recommendation method includes:
step C10, acquiring sparse data of a user to be recommended by a first party, and carrying out secret sharing with second equipment to acquire secret sharing model parameters;
in this embodiment, it should be noted that the second device includes second party to-be-recommended user sparse data. The second party to-be-recommended user sparse data is a second sparse matrix corresponding to a plurality of user data, and the first party to-be-recommended user sparse data is a first sparse matrix corresponding to a plurality of user data. Each vector of the first sparse matrix and each vector of the second sparse matrix is a coding vector corresponding to the to-be-recommended user data of one user, and the coding values in the first sparse matrix and in the second sparse matrix are mostly 0. For example, if the first sparse matrix represents the click results of different users on different articles, the code value 1 indicates that the user clicked the article and the code value 0 indicates that the user did not click it; since a user usually clicks only a small part of the articles, the code values in the first sparse matrix are mostly 0. Further, it should be noted that the first device and the second device are both participants of vertical federal learning. Before secret sharing, the first device and the second device train a preset personalized recommendation model based on secret sharing and vertical federal learning, where the preset personalized recommendation model is a trained factoring machine regression model for predicting the score of an article corresponding to a user, and the model expression of the preset personalized recommendation model is as follows:
wherein x is the model input data, w and V are the model parameters, and f(x) is the model output, namely the vector formed by the scores output by the preset personalized recommendation model, or a single score output by the preset personalized recommendation model.
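A minimal sketch of the standard second-order factorization machine score, f(x) = w0 + Σ_i w_i·x_i + Σ_{i<j} ⟨V_i, V_j⟩·x_i·x_j, which is consistent with the parameters w, V and output f(x) described above (the plain-Python layout and the names are our assumptions, not the patent's notation):

```python
# Sketch of a second-order factorization machine score: a linear term plus
# pairwise interactions weighted by dot products of the factor vectors in V.

def fm_score(x, w0, w, V):
    """x: feature vector; w0: bias; w: linear weights; V: factor vectors."""
    linear = w0 + sum(wi * xi for wi, xi in zip(w, x))
    pairwise = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dot = sum(a * b for a, b in zip(V[i], V[j]))
            pairwise += dot * x[i] * x[j]
    return linear + pairwise

score = fm_score([1, 0, 1], 0.0, [1, 2, 3], [[1, 0], [0, 1], [2, 0]])
```

With the toy inputs above, the linear part contributes 1 + 3 = 4 and the only active pair (features 0 and 2) contributes ⟨[1,0],[2,0]⟩ = 2, giving a score of 6.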
First party to-be-recommended user sparse data is acquired, and secret sharing is carried out with the second device to obtain secret sharing model parameters. Specifically, the first device acquires the first party model parameters of the preset personalized recommendation model and the first party to-be-recommended user sparse data, while the second device acquires the second party model parameters of the preset personalized recommendation model and the second party to-be-recommended user sparse data. Because the preset personalized recommendation model is constructed based on longitudinal federal learning, the part of its model parameters held by the first device are the first party model parameters, and the part held by the second device are the second party model parameters. The first party to-be-recommended user sparse data is the associated data of a user collected by the first device, and the second party to-be-recommended user sparse data is the associated data of the user collected by the second device, where the associated data of the user includes user interest preference data, historical scoring data of the user on articles, and the like. In a longitudinal federal scenario, the first party to-be-recommended user sparse data and the second party to-be-recommended user sparse data correspond to the same group of users to be recommended, and both can be represented by vectors or matrices. For example, assume the first party to-be-recommended user sparse data is the vector (1, 0, 1, 0), where the code 1 indicates that the user clicked the corresponding article and the code 0 indicates that the user did not click it; the vector (1, 0, 1, 0) then indicates that the user clicked article A and article C but did not click article B or article D. Further, secret sharing is performed with the second device based on the first party model parameters, with the second device providing the second party model parameters in the secret sharing; the first device thereby obtains the secret sharing model parameters, and the second device obtains the second party secret sharing model parameters. The secret sharing model parameters comprise a first shared first party model parameter and a first shared second party model parameter, and the second party secret sharing model parameters comprise a second shared first party model parameter and a second shared second party model parameter, where the first shared first party model parameter is a first share of the first party model parameters, the second shared first party model parameter is a second share of the first party model parameters, the first shared second party model parameter is a first share of the second party model parameters, and the second shared second party model parameter is a second share of the second party model parameters.
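The first share/second share pattern above follows the usual two-out-of-two additive secret sharing, which can be sketched as follows (the modulus Q and the plain-Python layout are our illustrative choices, not mandated by the patent):

```python
import random

# Two-out-of-two additive secret sharing: a parameter is split into two
# random-looking shares, one per device; either share alone reveals nothing
# about the value, and their sum modulo a public modulus reconstructs it.

Q = 2**61 - 1  # public modulus (illustrative choice)

def share(value):
    """Split `value` into a first share and a second share."""
    first = random.randrange(Q)
    second = (value - first) % Q
    return first, second

def reconstruct(first, second):
    return (first + second) % Q
```

Because sharing is additive, sums of shares are shares of sums, which is what lets the two devices perform linear steps on shares locally without any interaction.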
Step C20, performing longitudinal federal prediction interaction with the second equipment based on a first non-zero part in the sparse data of the first party to-be-recommended user and the secret sharing model parameters so as to score the to-be-recommended object corresponding to the sparse data of the first party to-be-recommended user, and obtaining a first secret sharing scoring result;
in this embodiment, it should be noted that the first non-zero portion is the portion of the first sparse matrix that remains after the zero vectors are removed.
Longitudinal federal prediction interaction is performed with the second device based on the first non-zero part of the first party to-be-recommended user sparse data and the secret sharing model parameters, so as to score the to-be-recommended article corresponding to the first party to-be-recommended user sparse data and obtain a first secret sharing scoring result. Specifically, based on the first non-zero part and the secret sharing model parameters, longitudinal federal prediction interaction is performed with the second device to combine the second non-zero part of the second party to-be-recommended user sparse data and the second party secret sharing model parameters, and the first secret sharing scoring result is calculated through a preset secret sharing multiplication and a preset homomorphic encryption algorithm.
Wherein the secret sharing model parameters comprise first party first sharing type model parameters and first party second sharing type model parameters, and the second device includes the second party to-be-recommended user sparse data and second party secret sharing model parameters, where the second party secret sharing model parameters comprise second party first sharing type model parameters and second party second sharing type model parameters, and the first secret sharing scoring result comprises a first type sharing scoring result and a second type sharing scoring result,
The step of performing longitudinal federal prediction interaction with the second device based on the first non-zero part of the first party to-be-recommended user sparse data and the secret sharing model parameter to score the to-be-recommended object corresponding to the first party to-be-recommended user sparse data, and obtaining a first secret sharing scoring result comprises the following steps:
Step C21, based on the first non-zero part and the first sharing type model parameter of the first party, performing longitudinal federal prediction interaction with the second equipment to combine the first sharing type model parameter of the second party, and calculating the first type sharing scoring result;
in this embodiment, it should be noted that the first party first sharing type model parameter is the secret-shared first party model parameter held by the first device, the second party first sharing type model parameter is the secret-shared first party model parameter held by the second device, and the first type sharing scoring result at least includes a first shared first party score, where each first shared first party score corresponds to one user. The first secret sharing score calculation formula corresponding to the first shared first party score is as follows:
Wherein [[f(X_A)]]_A is the first shared first party score, X_A is the first party to-be-recommended user sparse data, and w_A and V_A are both first party model parameters; the first secure inner product is calculated in the same way as the third non-zero feature item cross inner product in the model training process; the second secure inner product is calculated in the same way as the second non-zero feature item cross inner product in the model training process; the third secure inner product is calculated in the same way as the first party second secret sharing intermediate parameter in the model training process; [[ ]] is the symbol representing secret sharing, and [[ ]]_A indicates that the secret-shared data belongs to the first device.
Longitudinal federal prediction interaction is performed with the second device based on the first non-zero part and the first party first sharing type model parameter, so as to combine the second party first sharing type model parameter and calculate the first type sharing scoring result. Specifically, based on the first non-zero part and the first party first sharing type model parameter, longitudinal federal prediction interaction is performed with the second device to combine the second party first sharing type model parameter and calculate the first secure inner product, the second secure inner product and the third secure inner product respectively; the first secure inner product, the second secure inner product and the third secure inner product are then substituted into the first secret sharing score calculation formula to obtain the first type sharing scoring result.
Additionally, it should be noted that the second device may calculate a second party first type of shared score result based on a second party first secret shared score calculation formula, where the second party first type of shared score result includes at least a second shared first party score, and one of the second shared first party scores corresponds to one of the users, and the second party first secret shared score calculation formula is as follows:
Wherein [[f(X_A)]]_B is the second shared first party score, X_A is the first party to-be-recommended user sparse data, and w_A and V_A are both first party model parameters; the second party first secure inner product is calculated in the same way as the second party third non-zero feature item cross inner product in the model training process; the second party second secure inner product is calculated in the same way as the second party second non-zero feature item cross inner product in the model training process; the second party third secure inner product is calculated in the same way as the second party second secret sharing intermediate parameter in the model training process; [[ ]] is the symbol representing secret sharing, and [[ ]]_B indicates that the secret-shared data belongs to the second device.
And step C22, performing longitudinal federal prediction interaction with the second equipment based on the first party second sharing type model parameters so as to combine the second party second sharing type model parameters and a second non-zero part of the second party to-be-recommended user sparse data, and calculating the second type sharing scoring result.
In this embodiment, it should be noted that the first party second sharing type model parameter is the secret-shared second party model parameter held by the first device, the second party second sharing type model parameter is the secret-shared second party model parameter held by the second device, and the second type sharing scoring result at least includes a first shared second party score, where each first shared second party score corresponds to one user. The second secret sharing score calculation formula corresponding to the first shared second party score is as follows:
Wherein [[f(X_B)]]_A is the first shared second party score, X_B is the second party to-be-recommended user sparse data, and w_B and V_B are both second party model parameters; the fourth secure inner product is calculated in the same way as the fourth non-zero feature item cross inner product; the fifth secure inner product is calculated in the same way as the first non-zero feature item cross inner product; the sixth secure inner product is calculated in the same way as the first party first secret sharing intermediate parameter; [[ ]] is the symbol representing secret sharing, and [[ ]]_A indicates that the secret-shared data belongs to the first device.
Longitudinal federal prediction interaction is performed with the second device based on the first party second sharing type model parameter, so as to combine the second party second sharing type model parameter and the second non-zero part of the second party to-be-recommended user sparse data, and the second type sharing scoring result is calculated. Specifically, based on the first party second sharing type model parameter, longitudinal federal prediction interaction is performed with the second device to combine the second party second sharing type model parameter and the second non-zero part, and the fourth secure inner product, the fifth secure inner product and the sixth secure inner product are calculated respectively; these three secure inner products are then substituted into the second secret sharing score calculation formula to obtain the second type sharing scoring result.
Additionally, it should be noted that the second device may calculate a second-party second-type sharing score result based on a second-party second secret sharing score calculation formula, where the second-party second-type sharing score result includes at least a second-party sharing score, and one of the second-party sharing score corresponds to one of the users, and the second-party second secret sharing score calculation formula is as follows:
Wherein [[f(X_B)]]_B is the second shared second party score, X_B is the second party to-be-recommended user sparse data, and w_B and V_B are both second party model parameters; the second party fourth secure inner product is calculated in the same way as the second party fourth non-zero feature item cross inner product in the model training process; the second party fifth secure inner product is calculated in the same way as the second party first non-zero feature item cross inner product in the model training process; the second party sixth secure inner product is calculated in the same way as the second party first secret sharing intermediate parameter in the model training process; [[ ]] is the symbol representing secret sharing, and [[ ]]_B indicates that the secret-shared data belongs to the second device.
Step C30, performing aggregation interaction with the second equipment based on the first secret sharing scoring result to combine the second secret sharing scoring result determined by the second equipment to calculate a target scoring result;
In this embodiment, aggregation interaction is performed with the second device based on the first secret sharing scoring result, so as to combine the second secret sharing scoring result determined by the second device and calculate a target scoring result. Specifically, aggregation interaction is performed with the second device to aggregate the first secret sharing scoring result and the second secret sharing scoring result, thereby obtaining the target scoring result.
Wherein the first secret sharing scoring result at least comprises a first sharing first party score and a first sharing second party score, the second secret sharing scoring result at least comprises a second sharing first party score and a second sharing second party score, the target scoring result at least comprises a target score,
The step of performing aggregation interaction with the second device based on the first secret sharing scoring result, so as to combine the second secret sharing scoring result determined by the second device and calculate a target scoring result, includes:
step C31, receiving the second shared first party score and the second shared second party score sent by the second device;
step C32, calculating a first party score based on the first shared first party score and the second shared first party score;
in this embodiment, a first party score is calculated based on the first shared first party score and the second shared first party score; specifically, the sum of the first shared first party score and the second shared first party score is calculated to obtain the first party score.
Step C33, calculating a second party score based on the first shared second party score and the second shared second party score;
In this embodiment, a second party score is calculated based on the first shared second party score and the second shared second party score, specifically, a sum of the first shared second party score and the second shared second party score is calculated, and a second party score is obtained.
Step C34, aggregating the first party scores and the second party scores to obtain the target scores;
in this embodiment, the first party score and the second party score are aggregated to obtain the target score, specifically, the first party score and the second party score are aggregated based on a preset aggregation rule, where the preset aggregation rule includes summation, weighted averaging, and the like.
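Steps C31 to C34 can be sketched as follows, using plain summation of shares and a weighted aggregation rule (summation and weighted averaging are the options the embodiment names); the share values and function names are made up for illustration:

```python
# Steps C32/C33: each party score is reconstructed as the sum of its two
# secret shares; step C34: the party scores are aggregated into the target
# score under a preset aggregation rule.

def reconstruct_score(first_share, second_share):
    return first_share + second_share

def aggregate(first_party_score, second_party_score, weights=(1.0, 1.0)):
    wa, wb = weights
    return wa * first_party_score + wb * second_party_score

first_party_score = reconstruct_score(0.5, 0.25)      # [[f(X_A)]]_A + [[f(X_A)]]_B
second_party_score = reconstruct_score(0.125, 0.125)  # [[f(X_B)]]_A + [[f(X_B)]]_B
target_score = aggregate(first_party_score, second_party_score)
```

Each reconstruction needs only the two shares of the same party score, so neither device ever sees the other party's raw score shares beyond what it receives in step C31.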
And step C40, generating a target recommendation list corresponding to the to-be-recommended article based on the target scoring result.
In this embodiment, a target recommendation list corresponding to the to-be-recommended article is generated based on the target scoring result. Specifically, the target users are ranked according to the size of each target score in the target scoring result, where each target score is the score of a different target user on the same to-be-recommended article, and a recommended user list for the to-be-recommended article is generated; the recommended user list is then taken as the target recommendation list.
In another implementation manner, a target recommendation list corresponding to the user to be recommended is generated based on the target scoring result. Specifically, each target score in the target scoring result is the score of the same target user on a different to-be-recommended article; the to-be-recommended articles are therefore ranked according to the size of each target score, a recommended article list for the user to be recommended is generated, and the recommended article list is taken as the target recommendation list.
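In both variants, list generation reduces to sorting candidates by target score; a minimal sketch (the identifiers and the function name are illustrative):

```python
# Step C40 sketch: rank candidates (users or articles) by target score,
# best first, and emit the ordered identifiers as the target recommendation
# list.

def build_recommendation_list(scores):
    """scores: mapping id -> target score; returns ids by descending score."""
    return [key for key, _ in sorted(scores.items(),
                                     key=lambda kv: kv[1], reverse=True)]

ranking = build_recommendation_list(
    {"article_A": 0.9, "article_B": 0.1, "article_C": 0.5})
```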
The embodiment provides a method for click rate prediction based on secret sharing and longitudinal federal learning. First party to-be-recommended user sparse data is acquired, and secret sharing is carried out with the second device to obtain secret sharing model parameters. Longitudinal federal prediction interaction is then performed with the second device based on the first non-zero part of the first party to-be-recommended user sparse data and the secret sharing model parameters, so as to score the to-be-recommended article corresponding to that data and obtain a first secret sharing scoring result. Aggregation interaction is further performed with the second device based on the first secret sharing scoring result, combining the second secret sharing scoring result determined by the second device, to calculate a target scoring result, and a target recommendation list corresponding to the to-be-recommended article is generated based on the target scoring result. When the first device and the second device interact, the data transmitted or received are secret sharing data, so the data need not be encrypted with public and private keys generated by a third party, and all data transmission takes place between the two parties participating in longitudinal federal learning. This protects the privacy of the data while reducing complex encryption and decryption computation: secret sharing and the reconstruction corresponding to secret sharing involve only simple mathematical operations, which reduces the computational complexity. Moreover, when the user data is sparse, generation of the target recommendation list can be completed based only on the non-zero part of the user data, so computation over the zero part of the user data is eliminated and the computational complexity is further reduced. Therefore, the computational efficiency of personalized recommendation performed by the factoring machine regression model is improved.
Referring to fig. 4, fig. 4 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present application.
As shown in fig. 4, the factoring machine model building device may include: a processor 1001, such as a CPU, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable connected communication between the processor 1001 and the memory 1005. The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
Optionally, the factoring machine model building device may further include a rectangular user interface, a network interface, a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and the like. The rectangular user interface may include a Display screen (Display) and an input sub-module such as a Keyboard (Keyboard), and the optional rectangular user interface may also include a standard wired interface and a wireless interface. The network interface may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
Those skilled in the art will appreciate that the factoring machine model building device structure shown in fig. 4 does not constitute a limitation of the factoring machine model building device, and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 4, an operating system, a network communication module, and a factoring machine model building program may be included in a memory 1005, which is a computer storage medium. The operating system is a program that manages and controls the hardware and software resources of the factoring machine model building device, supporting the execution of the factoring machine model building program and other software and/or programs. The network communication module is used to enable communication between components within the memory 1005 and other hardware and software in the factoring machine model building system.
In the factoring machine model construction device shown in fig. 4, a processor 1001 is configured to execute a factoring machine model construction program stored in a memory 1005, to implement the steps of the factoring machine model construction method described in any one of the above.
The specific implementation manner of the factoring machine model construction device is basically the same as that of each embodiment of the factoring machine model construction method, and is not repeated here.
Referring to fig. 5, fig. 5 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present application.
As shown in fig. 5, the personalized recommendation device may include: a processor 1001, such as a CPU, memory 1005, and a communication bus 1002. Wherein a communication bus 1002 is used to enable connected communication between the processor 1001 and a memory 1005. The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
Optionally, the personalized recommendation device may further include a rectangular user interface, a network interface, a camera, an RF (Radio Frequency) circuit, a sensor, an audio circuit, a WiFi module, and the like. The rectangular user interface may include a Display screen (Display), an input sub-module such as a Keyboard (Keyboard), and the optional rectangular user interface may also include a standard wired interface, a wireless interface. The network interface may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface).
It will be appreciated by those skilled in the art that the personalized recommendation device structure shown in FIG. 5 does not constitute a limitation of the personalized recommendation device, and may include more or fewer components than shown, or may combine certain components, or may have a different arrangement of components.
As shown in fig. 5, an operating system, a network communication module, and a personalized recommendation program may be included in a memory 1005, which is a type of computer storage medium. The operating system is a program that manages and controls the hardware and software resources of the personalized recommendation device, supporting the execution of personalized recommendation programs and other software and/or programs. The network communication module is used to implement communication between the components within the memory 1005 and with other hardware and software in the personalized recommendation system.
In the personalized recommendation device shown in fig. 5, a processor 1001 is configured to execute a personalized recommendation program stored in a memory 1005, and implement the steps of the personalized recommendation method described in any one of the above.
The specific implementation manner of the personalized recommendation device of the present application is basically the same as that of each embodiment of the personalized recommendation method, and will not be described herein.
The embodiment of the application also provides a factoring machine model building device, which is applied to factoring machine model building equipment, and comprises:
The secret sharing module is used for acquiring initialization model parameters and first sparse data corresponding to a preset initialization model, carrying out secret sharing with second equipment based on the initialization model parameters, and acquiring first party secret sharing initial model parameters so that the second equipment can determine second party secret sharing initial model parameters;
An error calculation module, configured to perform federal interaction with the second device based on a first non-zero portion in the first sparse data and the first party secret sharing initial model parameter, so as to combine a second non-zero portion in second sparse data acquired by the second device with the second party secret sharing initial model parameter, and calculate a secret sharing model error;
and the generation module is used for updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federal factor decomposition machine model.
Optionally, the error calculation module includes:
A first computing sub-module for federally interacting with the second device based on a preset secret sharing multiplication triplet, the first non-zero portion and the first party secret sharing initial model parameters to combine the second non-zero portion and the second party secret sharing initial model parameters to compute a sparse matrix secure inner product and secret sharing intermediate parameters;
And the second calculation sub-module is used for calculating the secret sharing model error based on the sparse matrix safe inner product, the secret sharing intermediate parameter and a preset secret sharing model error calculation formula.
Optionally, the first computing sub-module includes:
A first computing unit configured to perform federal interaction with the second device based on the first type shared model parameters and the first non-zero portion to combine the second party first type shared model parameters and the second non-zero portion to compute the first type sparse matrix security inner product;
A second computing unit for federally interacting with the second device based on the second-type shared model parameters and the first non-zero portion to combine the second-side second-type shared model parameters and the second non-zero portion to compute the second-type sparse matrix security inner product;
a third computing unit for federally interacting with the second device to combine the second-party second-type shared model parameter and the second non-zero portion, based on the second-type shared model parameter, the first non-zero portion, and the preset secret sharing multiplication triplet, to compute the secret sharing intermediate parameter.
Optionally, the second computing unit includes:
A first interaction computation subunit, configured to perform federal interaction with the second device based on the second type of shared model parameter, so as to compute a cross inner product between the second type of shared model parameter and the second non-zero portion, and obtain the first non-zero feature item cross inner product;
And the second interaction calculating subunit is used for performing federal interaction with the second device based on the first non-zero part so as to calculate the cross inner product between the first non-zero part and the second-party second-type sharing model parameter and obtain the second non-zero characteristic item cross inner product.
Optionally, the first interaction computation subunit is further configured to:
generating a first public key, encrypting the first shared parameter based on the first public key, and obtaining an encrypted first shared parameter;
Transmitting the first public key and the encrypted first shared parameter to a second device for the second device to determine a second party first non-zero feature item cross inner product and an encrypted first non-zero feature item cross inner product based on the first public key, the encrypted first shared parameter, a second shared parameter, and the second non-zero portion;
And receiving the encrypted first non-zero characteristic item cross inner product sent by the second equipment, and decrypting the encrypted first non-zero characteristic item cross inner product based on a first private key corresponding to the first public key to obtain the first non-zero characteristic item cross inner product.
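The encrypt-transmit-compute-decrypt flow above relies on an additively homomorphic public/private key pair. The patent does not name a concrete scheme; purely as an illustration, a toy Paillier-style cryptosystem (with parameters far too small for real use, and all names ours) behaves as required: ciphertexts can be multiplied to add the underlying plaintexts, and exponentiated to scale them, so the receiving device can compute on encrypted shares it cannot read.

```python
import random
from math import gcd

# Toy Paillier-style additively homomorphic encryption (illustration only;
# the primes used below are absurdly small and the scheme is unpadded).

def keygen(p, q):
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)
    g = n + 1
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)  # modular inverse (3.8+)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    u = pow(c, lam, n * n)
    return ((u - 1) // n) * mu % n

def add_cipher(pub, c1, c2):
    """Enc(a) * Enc(b) mod n^2 decrypts to a + b."""
    return (c1 * c2) % (pub[0] * pub[0])

def scale_cipher(pub, c, k):
    """Enc(a)^k mod n^2 decrypts to a * k."""
    return pow(c, k, pub[0] * pub[0])
```

The scale operation is what lets a device multiply an encrypted shared parameter by its own plaintext non-zero entries when forming the encrypted cross inner products, before returning the result for decryption with the matching private key.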
Optionally, the second interaction computation subunit is further configured to:
receive a second public key and an encrypted fourth shared parameter sent by the second device, wherein the encrypted fourth shared parameter is the fourth shared parameter encrypted by the second device based on the second public key;
calculate a second non-zero feature item cross inner product and an encrypted second party second non-zero feature item cross inner product based on the second public key, the encrypted fourth shared parameter, the first non-zero portion, and the third shared parameter;
and send the encrypted second party second non-zero feature item cross inner product to the second device, so that the second device decrypts it based on a second private key corresponding to the second public key to obtain the second party second non-zero feature item cross inner product.
Optionally, the second interaction computation subunit is further configured to:
encrypt the third shared parameter based on the second public key to obtain an encrypted third shared parameter, and calculate an encrypted model parameter jointly corresponding to the encrypted third shared parameter and the encrypted fourth shared parameter;
calculate the cross inner product between each row vector in the encrypted model parameter and each row vector in the first non-zero portion to obtain a first encrypted inner product;
and construct the second non-zero feature item cross inner product based on the feature dimension of the encrypted model parameter, and calculate the encrypted second party second non-zero feature item cross inner product jointly corresponding to the first encrypted inner product.
Optionally, the third computing unit further includes:
a third interaction computation subunit, configured to calculate, based on the preset secret sharing multiplication triplet and through federated interaction with the second device, a secret sharing product between the second type secret sharing parameter matrix and the secret sharing transposed parameter matrix to obtain the secret sharing matrix inner product, so that the second device calculates a secret sharing product between the second party second type secret sharing parameter matrix and the second party secret sharing transposed parameter matrix to obtain the second party secret sharing matrix inner product;
and a fourth interaction computation subunit, configured to perform federated interaction with the second device based on the secret sharing matrix inner product and the first non-zero portion, so as to combine the second party secret sharing matrix inner product and the second non-zero portion to calculate the secret sharing intermediate parameter.
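The "preset secret sharing multiplication triplet" used here is commonly realized as a Beaver multiplication triple, which lets two parties multiply secret-shared values with a single round of interaction. A minimal single-process sketch (both parties are simulated locally; the field modulus and variable names are illustrative, not taken from the patent):

```python
import random

P = 2**31 - 1  # arithmetic is done in a prime field (illustrative modulus)

def share(x):
    """Split x into two uniformly random additive shares modulo P."""
    s0 = random.randrange(P)
    return s0, (x - s0) % P

def beaver_mul(x_sh, y_sh, triple):
    """Multiply secret-shared x and y using a preset triple (a, b, c = a*b).

    d = x - a and e = y - b are safe to open jointly because a and b
    are uniformly random one-time masks.
    """
    (a0, a1), (b0, b1), (c0, c1) = triple
    d = (x_sh[0] + x_sh[1] - a0 - a1) % P        # opened by both parties
    e = (y_sh[0] + y_sh[1] - b0 - b1) % P        # opened by both parties
    z0 = (c0 + d * b0 + e * a0 + d * e) % P      # party 0's share of x*y
    z1 = (c1 + d * b1 + e * a1) % P              # party 1's share of x*y
    return z0, z1

# Offline/dealer phase: generate a random triple with c = a * b, shared out.
a, b = random.randrange(P), random.randrange(P)
triple = (share(a), share(b), share((a * b) % P))

x_sh, y_sh = share(41), share(27)
z0, z1 = beaver_mul(x_sh, y_sh, triple)
assert (z0 + z1) % P == 41 * 27   # reconstruction yields the product
```

The identity behind the shares is (a + d)(b + e) = c + db + ea + de; applied entry-wise it yields the secret sharing product between the shared parameter matrices described above.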
Optionally, the generating module includes:
an updating submodule, configured to update the secret sharing initial model parameters based on the secret sharing model error to obtain secret sharing update parameters;
and a decryption submodule, configured to perform decryption interaction with the second device based on the secret sharing update parameters to obtain the first target model parameters, so that the second device obtains the second target model parameters.
Optionally, the decryption submodule includes:
a first interaction calculating unit, configured to send the first shared second party model update parameter to the second device, so that the second device calculates the second target model parameter based on the determined second shared second party model update parameter and the first shared second party model update parameter;
and a second interaction calculating unit, configured to receive a second shared first party model update parameter sent by the second device, and calculate the first target model parameter based on the second shared first party model update parameter and the first shared first party model update parameter.
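The decryption interaction of this submodule can be read as a plain exchange of additive shares: each side forwards its share of the other party's updated parameters, so each party reconstructs only its own target model parameters. A hypothetical sketch under that reading (all values are made up):

```python
import random

random.seed(7)  # deterministic for the illustration

def share_vec(vec):
    """Additively share a vector of floats between two parties."""
    mask = [random.uniform(-1.0, 1.0) for _ in vec]
    return [v - m for v, m in zip(vec, mask)], mask

# After the update step, each party holds one additive share of BOTH
# parties' updated model parameters.
w_first_true = [0.5, -1.2, 3.0]   # first party's target parameters
w_second_true = [2.0, 0.7]        # second party's target parameters

first_w1, second_w1 = share_vec(w_first_true)    # shares of first party params
first_w2, second_w2 = share_vec(w_second_true)   # shares of second party params

# Decryption interaction: each side sends its share of the OTHER party's
# parameters, so each party reconstructs only its own target parameters.
w_first = [a + b for a, b in zip(first_w1, second_w1)]
w_second = [a + b for a, b in zip(first_w2, second_w2)]
assert all(abs(a - b) < 1e-12 for a, b in zip(w_first, w_first_true))
assert all(abs(a - b) < 1e-12 for a, b in zip(w_second, w_second_true))
```

Neither party ever holds both shares of the other party's parameters at once, which is why the second device ends up with only the second target model parameters.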
The specific implementation of the factorization machine model construction device is substantially the same as the embodiments of the factorization machine model construction method described above, and is not repeated here.
The embodiment of the application further provides a personalized recommendation apparatus, applied to a personalized recommendation device, the apparatus including:
a secret sharing module, configured to acquire first party to-be-recommended user sparse data, and perform secret sharing with the second device to obtain secret sharing model parameters;
a scoring module, configured to perform longitudinal federated prediction interaction with the second device based on a first non-zero portion of the first party to-be-recommended user sparse data and the secret sharing model parameters, so as to score the to-be-recommended items corresponding to the first party to-be-recommended user sparse data and obtain a first secret sharing scoring result;
an aggregation module, configured to perform aggregation interaction with the second device based on the first secret sharing scoring result, so as to combine a second secret sharing scoring result determined by the second device to calculate a target scoring result;
and a generating module, configured to generate a target recommendation list corresponding to the to-be-recommended items based on the target scoring result.
Optionally, the aggregation module includes:
a receiving unit, configured to receive the second shared first party score and the second shared second party score sent by the second device;
a first calculation unit, configured to calculate a first party score based on the first shared first party score and the second shared first party score;
a second calculation unit, configured to calculate a second party score based on the first shared second party score and the second shared second party score;
and an aggregation unit, configured to aggregate the first party score and the second party score to obtain the target score.
Optionally, the scoring module includes:
a first joint calculation unit, configured to perform longitudinal federated prediction interaction with the second device based on the first non-zero portion and the first party first sharing type model parameter, so as to combine the second party first sharing type model parameter to calculate the first type sharing scoring result;
and a second joint calculation unit, configured to perform longitudinal federated prediction interaction with the second device based on the first party second sharing type model parameter, so as to combine the second party second sharing type model parameter and a second non-zero portion of the second party to-be-recommended user sparse data to calculate the second type sharing scoring result.
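The scoring step evaluates a factorization machine prediction, and the standard identity for the pairwise term lets it touch only the non-zero entries of a sparse sample, which is the efficiency the application targets. A plaintext sketch (a real deployment would run this arithmetic on secret shares; all values below are made up):

```python
def fm_score(w0, w, V, nonzero):
    """Factorization machine prediction over one sparse sample.

    nonzero: list of (feature_index, value) pairs — only the non-zero
    entries are visited. V[i] is the k-dimensional latent vector of
    feature i, so the pairwise term costs O(k * nnz), not O(k * n^2).
    """
    linear = w0 + sum(w[i] * x for i, x in nonzero)
    k = len(V[0])
    pairwise = 0.0
    for f in range(k):
        # 0.5 * ((sum_i v_if x_i)^2 - sum_i (v_if x_i)^2) per factor f
        s = sum(V[i][f] * x for i, x in nonzero)
        s2 = sum((V[i][f] * x) ** 2 for i, x in nonzero)
        pairwise += 0.5 * (s * s - s2)
    return linear + pairwise

w0, w = 0.1, [0.2, -0.4, 0.0, 0.3]
V = [[0.1, 0.2], [0.3, -0.1], [0.0, 0.5], [0.2, 0.2]]
score = fm_score(w0, w, V, [(0, 1.0), (3, 2.0)])

# For a two-feature sample the cross term equals <v_0, v_3> * x_0 * x_3.
expected = 0.1 + (0.2 * 1.0 + 0.3 * 2.0) + (0.1 * 0.2 + 0.2 * 0.2) * 1.0 * 2.0
assert abs(score - expected) < 1e-12
```

In the longitudinal setting, each party contributes only its own non-zero feature portion and its shares of w and V, and the per-party partial sums are what the two joint calculation units exchange.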
The specific implementation of the personalized recommendation apparatus is substantially the same as the embodiments of the personalized recommendation method described above, and is not repeated here.
An embodiment of the present application provides a readable storage medium storing one or more programs, which are executable by one or more processors to implement the steps of the factorization machine model construction method described in any one of the above.
The specific implementation of this readable storage medium is substantially the same as the embodiments of the factorization machine model construction method described above, and is not repeated here.
An embodiment of the present application provides a readable storage medium storing one or more programs, which are executable by one or more processors to implement the steps of the personalized recommendation method described in any one of the above.
The specific implementation of this readable storage medium is substantially the same as the embodiments of the personalized recommendation method described above, and is not repeated here.
The foregoing description covers only the preferred embodiments of the present application and is not intended to limit its scope; any equivalent structure or equivalent process derived from the present specification, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of patent protection of the present application.
Claims (15)
1. A factorization machine model construction method, applied to a first device, the method comprising:
acquiring initialization model parameters and first sparse data corresponding to a preset initialization model, and performing secret sharing with a second device based on the initialization model parameters to obtain first party secret sharing initial model parameters, so that the second device determines second party secret sharing initial model parameters;
performing federated interaction with the second device based on a first non-zero portion of the first sparse data and the first party secret sharing initial model parameters, so as to combine a second non-zero portion of second sparse data acquired by the second device with the second party secret sharing initial model parameters to calculate a secret sharing model error;
and updating the preset initialization model based on the secret sharing model error to obtain a longitudinal federated factorization machine model, wherein the updating step comprises:
updating the secret sharing initial model parameters based on the secret sharing model error to obtain secret sharing update parameters;
and performing decryption interaction with the second device based on the secret sharing update parameters to obtain first target model parameters, so that the second device obtains second target model parameters, wherein the longitudinal federated factorization machine model comprises the first target model parameters belonging to the first device and the second target model parameters belonging to the second device.
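The secret sharing step of claim 1 can be pictured as additive share splitting: the first device splits each initialization parameter into two uniformly random shares, keeping one and sending the other to the second device (and vice versa). A toy sketch, with an illustrative modulus and made-up integer parameters (real systems fixed-point encode floating-point weights before sharing):

```python
import random

PRIME = 2**61 - 1  # shares live in a finite field (modulus is illustrative)

def split(secret):
    """Split an integer secret into two uniformly random additive shares."""
    s0 = random.randrange(PRIME)
    return s0, (secret - s0) % PRIME

# First device's locally initialized model parameters (toy integers).
init_params = [5, -13, 20]

kept, sent = zip(*(split(p) for p in init_params))
# `kept` stays on the first device; `sent` goes to the second device.
# Neither list alone reveals init_params; summing paired shares does:
recovered = [(k + s) % PRIME for k, s in zip(kept, sent)]
assert recovered == [p % PRIME for p in init_params]
```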
2. The factorization machine model construction method of claim 1, wherein the step of performing federated interaction with the second device based on a first non-zero portion of the first sparse data and the first party secret sharing initial model parameters, so as to combine a second non-zero portion of second sparse data acquired by the second device with the second party secret sharing initial model parameters to calculate a secret sharing model error comprises:
performing federated interaction with the second device based on a preset secret sharing multiplication triplet, the first non-zero portion, and the first party secret sharing initial model parameters, so as to combine the second non-zero portion and the second party secret sharing initial model parameters to calculate a sparse matrix secure inner product and secret sharing intermediate parameters;
and calculating the secret sharing model error based on the sparse matrix secure inner product, the secret sharing intermediate parameters, and a preset secret sharing model error calculation formula.
3. The factorization machine model construction method of claim 2, wherein the first party secret sharing initial model parameters include a first type shared model parameter and a second type shared model parameter, the second party secret sharing initial model parameters include a second party first type shared model parameter and a second party second type shared model parameter, and the sparse matrix secure inner product includes a first type sparse matrix secure inner product and a second type sparse matrix secure inner product,
the step of performing federated interaction with the second device based on a preset secret sharing multiplication triplet, the first non-zero portion, and the first party secret sharing initial model parameters, so as to combine the second non-zero portion and the second party secret sharing initial model parameters to calculate a sparse matrix secure inner product and secret sharing intermediate parameters comprises:
performing federated interaction with the second device based on the first type shared model parameter and the first non-zero portion, so as to combine the second party first type shared model parameter and the second non-zero portion to calculate the first type sparse matrix secure inner product;
performing federated interaction with the second device based on the second type shared model parameter and the first non-zero portion, so as to combine the second party second type shared model parameter and the second non-zero portion to calculate the second type sparse matrix secure inner product;
and performing federated interaction with the second device based on the second type shared model parameter, the first non-zero portion, and the preset secret sharing multiplication triplet, so as to combine the second party second type shared model parameter and the second non-zero portion to calculate the secret sharing intermediate parameters.
4. The factorization machine model construction method of claim 3, wherein the second type sparse matrix secure inner product includes a first non-zero feature item cross inner product and a second non-zero feature item cross inner product,
the step of performing federated interaction with the second device based on the second type shared model parameter and the first non-zero portion, so as to combine the second party second type shared model parameter and the second non-zero portion to calculate the second type sparse matrix secure inner product comprises:
performing federated interaction with the second device based on the second type shared model parameter, so as to calculate a cross inner product between the second type shared model parameter and the second non-zero portion and obtain the first non-zero feature item cross inner product;
and performing federated interaction with the second device based on the first non-zero portion, so as to calculate a cross inner product between the first non-zero portion and the second party second type shared model parameter and obtain the second non-zero feature item cross inner product.
5. The factorization machine model construction method of claim 4, wherein the second type shared model parameter includes a first shared parameter and the second party second type shared model parameter includes a second shared parameter,
the step of performing federated interaction with the second device based on the second type shared model parameter, so as to calculate a cross inner product between the second type shared model parameter and the second non-zero portion and obtain the first non-zero feature item cross inner product comprises:
generating a first public key, and encrypting the first shared parameter based on the first public key to obtain an encrypted first shared parameter;
sending the first public key and the encrypted first shared parameter to the second device, so that the second device determines a second party first non-zero feature item cross inner product and an encrypted first non-zero feature item cross inner product based on the first public key, the encrypted first shared parameter, the second shared parameter, and the second non-zero portion;
and receiving the encrypted first non-zero feature item cross inner product sent by the second device, and decrypting it based on a first private key corresponding to the first public key to obtain the first non-zero feature item cross inner product.
6. The factorization machine model construction method of claim 4, wherein the second type shared model parameter includes a third shared parameter and the second party second type shared model parameter includes a fourth shared parameter,
the step of performing federated interaction with the second device based on the first non-zero portion, so as to calculate a cross inner product between the first non-zero portion and the second party second type shared model parameter and obtain the second non-zero feature item cross inner product comprises:
receiving a second public key and an encrypted fourth shared parameter sent by the second device, wherein the encrypted fourth shared parameter is the fourth shared parameter encrypted by the second device based on the second public key;
calculating the second non-zero feature item cross inner product and an encrypted second party second non-zero feature item cross inner product based on the second public key, the encrypted fourth shared parameter, the first non-zero portion, and the third shared parameter;
and sending the encrypted second party second non-zero feature item cross inner product to the second device, so that the second device decrypts it based on a second private key corresponding to the second public key to obtain the second party second non-zero feature item cross inner product.
7. The factorization machine model construction method of claim 6, wherein the step of calculating the second non-zero feature item cross inner product and an encrypted second party second non-zero feature item cross inner product based on the second public key, the encrypted fourth shared parameter, the first non-zero portion, and the third shared parameter comprises:
encrypting the third shared parameter based on the second public key to obtain an encrypted third shared parameter, and calculating an encrypted model parameter jointly corresponding to the encrypted third shared parameter and the encrypted fourth shared parameter;
calculating the cross inner product between each row vector in the encrypted model parameter and each row vector in the first non-zero portion to obtain a first encrypted inner product;
and constructing the second non-zero feature item cross inner product based on the feature dimension of the encrypted model parameter, and calculating the encrypted second party second non-zero feature item cross inner product jointly corresponding to the first encrypted inner product.
8. The factorization machine model construction method of claim 3, wherein the second type shared model parameter includes a second type secret sharing parameter matrix and a secret sharing transposed parameter matrix corresponding to the second type secret sharing parameter matrix, and the second party second type shared model parameter includes a second party second type secret sharing parameter matrix and a second party secret sharing transposed parameter matrix corresponding to the second party second type secret sharing parameter matrix,
the step of performing federated interaction with the second device based on the second type shared model parameter, the first non-zero portion, and the preset secret sharing multiplication triplet, so as to combine the second party second type shared model parameter and the second non-zero portion to calculate the secret sharing intermediate parameters comprises:
calculating, based on the preset secret sharing multiplication triplet and through federated interaction with the second device, a secret sharing product between the second type secret sharing parameter matrix and the secret sharing transposed parameter matrix to obtain a secret sharing matrix inner product, so that the second device calculates a secret sharing product between the second party second type secret sharing parameter matrix and the second party secret sharing transposed parameter matrix to obtain a second party secret sharing matrix inner product;
and performing federated interaction with the second device based on the secret sharing matrix inner product and the first non-zero portion, so as to combine the second party secret sharing matrix inner product and the second non-zero portion to calculate the secret sharing intermediate parameters.
9. The factorization machine model construction method of claim 1, wherein the secret sharing update parameters include a first shared first party model update parameter and a first shared second party model update parameter,
the step of performing decryption interaction with the second device based on the secret sharing update parameters to obtain first target model parameters, so that the second device obtains second target model parameters comprises:
sending the first shared second party model update parameter to the second device, so that the second device calculates the second target model parameters based on the determined second shared second party model update parameter and the first shared second party model update parameter;
and receiving a second shared first party model update parameter sent by the second device, and calculating the first target model parameters based on the second shared first party model update parameter and the first shared first party model update parameter.
10. A personalized recommendation method, applied to the first device of any one of claims 1 to 9, the personalized recommendation method comprising:
acquiring first party to-be-recommended user sparse data, and performing secret sharing with a second device to obtain secret sharing model parameters;
wherein the step of acquiring first party to-be-recommended user sparse data and performing secret sharing with the second device to obtain secret sharing model parameters comprises:
acquiring first party model parameters of a preset personalized recommendation model and the first party to-be-recommended user sparse data, while the second device acquires second party model parameters of the preset personalized recommendation model and second party to-be-recommended user sparse data;
performing secret sharing with the second device on the second party model parameters provided by the second device, whereby the first device obtains the secret sharing model parameters and the second device obtains second party secret sharing model parameters;
performing longitudinal federated prediction interaction with the second device based on a first non-zero portion of the first party to-be-recommended user sparse data and the secret sharing model parameters, so as to score the to-be-recommended items corresponding to the first party to-be-recommended user sparse data and obtain a first secret sharing scoring result;
wherein the step of performing longitudinal federated prediction interaction with the second device based on a first non-zero portion of the first party to-be-recommended user sparse data and the secret sharing model parameters, so as to score the to-be-recommended items corresponding to the first party to-be-recommended user sparse data and obtain a first secret sharing scoring result comprises:
performing longitudinal federated prediction interaction with the second device based on the first non-zero portion and a first party first sharing type model parameter, so as to combine a second party first sharing type model parameter to calculate a first type sharing scoring result;
performing longitudinal federated prediction interaction with the second device based on a first party second sharing type model parameter, so as to combine a second party second sharing type model parameter and a second non-zero portion of the second party to-be-recommended user sparse data to calculate a second type sharing scoring result, wherein the secret sharing model parameters include the first party first sharing type model parameter and the first party second sharing type model parameter, the second device holds the second party to-be-recommended user sparse data and the second party secret sharing model parameters, the second party secret sharing model parameters include the second party first sharing type model parameter and the second party second sharing type model parameter, and the first secret sharing scoring result includes the first type sharing scoring result and the second type sharing scoring result;
performing aggregation interaction with the second device based on the first secret sharing scoring result, so as to combine a second secret sharing scoring result determined by the second device to calculate a target scoring result;
and generating a target recommendation list corresponding to the to-be-recommended items based on the target scoring result.
11. The personalized recommendation method of claim 10, wherein the first secret sharing scoring result includes at least a first shared first party score and a first shared second party score, the second secret sharing scoring result includes at least a second shared first party score and a second shared second party score, and the target scoring result includes at least a target score,
the step of performing aggregation interaction with the second device based on the first secret sharing scoring result, so as to combine the second secret sharing scoring result determined by the second device to calculate a target scoring result comprises:
receiving the second shared first party score and the second shared second party score sent by the second device;
calculating a first party score based on the first shared first party score and the second shared first party score;
calculating a second party score based on the first shared second party score and the second shared second party score;
and aggregating the first party score and the second party score to obtain the target score.
12. A factorization machine model construction device, comprising: a memory, a processor, and a program stored on the memory for implementing the factorization machine model construction method,
wherein the memory is configured to store the program for implementing the factorization machine model construction method;
and the processor is configured to execute the program to implement the steps of the factorization machine model construction method of any one of claims 1 to 9.
13. A readable storage medium, having stored thereon a program implementing a factorization machine model construction method, the program being executed by a processor to implement the steps of the factorization machine model construction method of any one of claims 1 to 9.
14. A personalized recommendation device, comprising: a memory, a processor, and a program stored on the memory for implementing the personalized recommendation method,
wherein the memory is configured to store the program for implementing the personalized recommendation method;
and the processor is configured to execute the program to implement the steps of the personalized recommendation method of any one of claims 10 to 11.
15. A readable storage medium, having stored thereon a program implementing a personalized recommendation method, the program being executed by a processor to implement the steps of the personalized recommendation method of any one of claims 10 to 11.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010893538.5A CN112016698B (en) | 2020-08-28 | 2020-08-28 | Factorization machine model construction method, factorization machine model construction equipment and readable storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN112016698A CN112016698A (en) | 2020-12-01 |
| CN112016698B true CN112016698B (en) | 2024-10-15 |
Family ID: 73503139
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010893538.5A Active CN112016698B (en) | 2020-08-28 | 2020-08-28 | Factorization machine model construction method, factorization machine model construction equipment and readable storage medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN112016698B (en) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113434878B (en) * | 2021-06-25 | 2023-07-07 | 平安科技(深圳)有限公司 | Modeling and application method, device, equipment and storage medium based on federal learning |
| CN113516253B (en) * | 2021-07-02 | 2022-04-05 | 深圳市洞见智慧科技有限公司 | Data encryption optimization method and device in federated learning |
| KR20230136950A (en) * | 2022-03-21 | 2023-10-04 | 삼성전자주식회사 | Functional encryption system and method of performing functional encryption |
| CN115865302B (en) * | 2022-09-27 | 2025-05-16 | 中国电子科技集团公司第三十研究所 | A privacy-preserving multi-matrix multiplication method |
| CN116777001A (en) * | 2023-06-02 | 2023-09-19 | 支付宝(杭州)信息技术有限公司 | Multi-party joint model processing method and device |
| CN117973488B (en) * | 2024-03-29 | 2024-06-07 | 蓝象智联(杭州)科技有限公司 | Large language model training and reasoning method and system with privacy protection |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105677701A (en) * | 2015-12-24 | 2016-06-15 | 苏州大学 | Social recommendation method based on oblivious transfer |
| CN111259446A (en) * | 2020-01-16 | 2020-06-09 | 深圳前海微众银行股份有限公司 | Parameter processing method, device and storage medium based on federated transfer learning |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7124302B2 (en) * | 1995-02-13 | 2006-10-17 | Intertrust Technologies Corp. | Systems and methods for secure transaction management and electronic rights protection |
| US10033702B2 (en) * | 2015-08-05 | 2018-07-24 | Intralinks, Inc. | Systems and methods of secure data exchange |
| US10607027B1 (en) * | 2018-12-05 | 2020-03-31 | Cyberark Software Ltd. | Secretless secure data distribution and recovery process |
| CN110288094B (en) * | 2019-06-10 | 2020-12-18 | 深圳前海微众银行股份有限公司 | Model parameter training method and device based on federated learning |
| CN110955907B (en) * | 2019-12-13 | 2022-03-25 | 支付宝(杭州)信息技术有限公司 | A model training method based on federated learning |
| CN111046433B (en) * | 2019-12-13 | 2021-03-05 | 支付宝(杭州)信息技术有限公司 | Model training method based on federal learning |
| CN111241567B (en) * | 2020-01-16 | 2023-09-01 | 深圳前海微众银行股份有限公司 | Data sharing method, system and storage medium in vertical federated learning |
| CN111291417B (en) * | 2020-05-09 | 2020-08-28 | 支付宝(杭州)信息技术有限公司 | Method and device for protecting data privacy of multi-party combined training object recommendation model |
Also Published As
| Publication number | Publication date |
|---|---|
| CN112016698A (en) | 2020-12-01 |
Similar Documents
| Publication | Title |
|---|---|
| CN112016698B (en) | Factorization machine model construction method, factorization machine model construction equipment and readable storage medium |
| CN111985573B (en) | Method, device and readable storage medium for building factor decomposition machine classification model |
| CN112000987B (en) | Method, device and readable storage medium for building factor decomposition machine classification model |
| CN111340247B (en) | Longitudinal federated learning system optimization method, device and readable storage medium |
| CN110753926B (en) | Method, system and computer readable storage medium for data encryption |
| CN112926073B (en) | Federated learning modeling optimization method, device, medium and computer program product |
| CN112818374B (en) | Combined training method, equipment, storage medium and program product of model |
| CN110704860A (en) | Longitudinal federated learning method, device and system for improving safety and storage medium |
| Li et al. | A privacy-preserving high-order neuro-fuzzy c-means algorithm with cloud computing |
| CN114696990B (en) | Multi-party computing method, system and related equipment based on fully homomorphic encryption |
| WO2021092977A1 (en) | Vertical federated learning optimization method, apparatus, device and storage medium |
| CN112000988B (en) | Factor decomposition machine regression model construction method, device and readable storage medium |
| CN113051586B (en) | Federated modeling system and method, federated model prediction method, medium, and device |
| CN110807528A (en) | Feature correlation calculation method, device and computer-readable storage medium |
| CN114691167A (en) | Method and device for updating machine learning model |
| CN113935050A (en) | Feature extraction method and device based on federated learning, electronic device and medium |
| CN111368314B (en) | Modeling and prediction method, device, equipment and storage medium based on cross characteristics |
| CN114448598B (en) | Ciphertext compression method, ciphertext decompression device, ciphertext compression equipment and storage medium |
| CN115664632A (en) | Prediction model training method, system, equipment and medium based on homomorphic encryption |
| CN112633356B (en) | Recommendation model training method, recommendation device, recommendation equipment and storage medium |
| CN114648666A (en) | Classification model training and data classification method and device and electronic equipment |
| CN114638274A (en) | Feature selection method, device, readable storage medium and computer program product |
| CN112949866B (en) | Training method and device of Poisson regression model, electronic equipment and storage medium |
| KR20150115762A (en) | Privacy protection against curious recommenders |
| CN113407860A (en) | Privacy protection-based multi-social platform user recommendation method and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |