CN109740743A - Hierarchical neural network query recommendation method and device - Google Patents

Info

Publication number
CN109740743A
CN109740743A (application CN201910255010.2A)
Authority
CN
China
Prior art keywords
neural network
layer neural
session
user
state vector
Prior art date
Legal status
Pending
Application number
CN201910255010.2A
Other languages
Chinese (zh)
Inventor
蔡飞
陈洪辉
刘俊先
罗爱民
舒振
陈涛
罗雪山
邓正
潘鹏亨
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date
Filing date
Publication date
Application filed by National University of Defense Technology
Publication of CN109740743A

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a hierarchical neural network query recommendation method and device. The method includes: establishing two neural networks, a session-layer neural network for modeling a user's short-term query records and a user-layer neural network for modeling the user's long-term query records, where the state vector of the session-layer network at the current time step is used as the input of the user-layer network at the current time step, and the state vector of the user-layer network at the current time step is used as the input of the session-layer network at the next time step; and outputting query recommendations for a query session according to the session-layer network and the user-layer network. The solution provided by the invention improves the efficiency and accuracy of query recommendation.

Description

A hierarchical neural network query recommendation method and device

Technical Field

The present invention relates to the technical field of computer networks, and in particular to a hierarchical neural network query recommendation method and device.

Background

Various query recommendation methods exist in the prior art. In query recommendation, when a user has entered only part of a query during information retrieval, the system predicts the user's query intent and recommends a set of candidate queries for the user to choose from. Query recommendation helps users revise the queries they enter so that they can locate the content they need more accurately, improving user satisfaction. Most traditional query recommendation methods are based on feature mining, using features such as co-occurrence and semantic similarity between queries. However, such features are usually defined and engineered by hand, so some hidden features may never be mined or discovered.

Prior-art query recommendation methods based on RNNs (Recurrent Neural Networks) can model time-series data well and thus predict the next likely query. However, these methods attend only to the current short-term query session and do not consider the user's query history, so the efficiency and accuracy of their recommendations leave room for improvement.

Summary of the Invention

In view of this, the purpose of the present invention is to provide a hierarchical neural network query recommendation method and device that improve query recommendation efficiency and accuracy.

According to one aspect of the present invention, a hierarchical neural network query recommendation method is provided, comprising:

establishing two neural networks, a session-layer neural network for modeling a user's short-term query records and a user-layer neural network for modeling the user's long-term query records, wherein the state vector of the session-layer neural network at the current time step is used as the input of the user-layer neural network at the current time step, and the state vector of the user-layer neural network at the current time step is used as the input of the session-layer neural network at the next time step; and

outputting query recommendations for a query session according to the session-layer neural network and the user-layer neural network.

Preferably, the method further comprises:

combining the values of the state vectors in the session-layer neural network with different weights into a new session-layer state vector;

wherein outputting query recommendations for the query session according to the session-layer neural network and the user-layer neural network comprises:

outputting query recommendations for the query session according to the session-layer neural network, after its state vector has been updated, and the user-layer neural network.

Preferably, using the state vector of the session-layer neural network at the current time step as the input of the user-layer neural network at the current time step, and using the state vector of the user-layer neural network at the current time step as the input of the session-layer neural network at the next time step, comprises:

using the state vector of the session-layer neural network at time t as the input of the user-layer neural network at time t, using the state vector of the user-layer neural network at time t as the input of the session-layer neural network at time t+1, and using the state vector of the session-layer neural network at time t+1 as the input of the user-layer neural network at time t+1, where t is greater than or equal to 0.

Preferably, using the state vector of the user-layer neural network at time t as the input of the session-layer neural network at time t+1, and using the state vector of the session-layer neural network at time t+1 as the input of the user-layer neural network at time t+1, comprises:

using the state vector of the user-layer neural network at time t to initialize the first hidden-layer state vector of the session-layer neural network at time t+1, and using the last hidden-layer state vector of the session-layer neural network at time t+1 as the input of the user-layer neural network at time t+1.

Preferably, using the state vector of the user-layer neural network at time t to initialize the first hidden-layer state vector of the session-layer neural network at time t+1 comprises:

using the state vector of the last hidden layer of the user-layer neural network at time t to initialize the first hidden-layer state vector of the session-layer neural network at time t+1.

Preferably, combining the values of the state vectors in the session-layer neural network with different weights into a new session-layer state vector comprises:

combining the values of the state vectors of all hidden layers in the session-layer neural network with different weights into a new session-layer state vector.

According to another aspect of the present invention, a hierarchical neural network query recommendation device is provided, comprising:

a hierarchical neural network building module for establishing two neural networks, a session-layer neural network and a user-layer neural network, wherein the state vector of the session-layer neural network at the current time step is used as the input of the user-layer neural network at the current time step, and the state vector of the user-layer neural network at the current time step is used as the input of the session-layer neural network at the next time step; and

a recommendation output module for outputting query recommendations for a query session according to the session-layer neural network and the user-layer neural network established by the hierarchical neural network building module.

Preferably, the device further comprises:

an attention mechanism processing module for combining the values of the state vectors in the session-layer neural network established by the hierarchical neural network building module with different weights into a new session-layer state vector;

wherein the recommendation output module outputs query recommendations for the query session according to the session-layer neural network, after its state vector has been updated, and the user-layer neural network.

Preferably, the hierarchical neural network building module uses the state vector of the session-layer neural network at time t as the input of the user-layer neural network at time t, uses the state vector of the user-layer neural network at time t as the input of the session-layer neural network at time t+1, and uses the state vector of the session-layer neural network at time t+1 as the input of the user-layer neural network at time t+1, where t is greater than or equal to 0.

Preferably, the attention mechanism processing module combines the values of the state vectors of all hidden layers in the session-layer neural network with different weights into a new session-layer state vector.

It can be seen that the technical solution of the embodiments of the present invention establishes two neural networks: a session-layer neural network for modeling a user's short-term query records and a user-layer neural network for modeling the user's long-term query records. The state vector of the session-layer network at the current time step is used as the input of the user-layer network at the current time step, and the state vector of the user-layer network at the current time step is used as the input of the session-layer network at the next time step. When a query session is received, query recommendations can be output for it according to the session-layer and user-layer networks. The session-layer network models the user's short-term query records, the user-layer network models the user's long-term query records, and the inputs and outputs of the two networks are linked to each other. In this way, not only the current short-term query records but also the user's long-term query history are taken into account, and the state vectors of the two networks are tightly coupled and considered jointly, improving the efficiency and accuracy of query recommendation.

Further, embodiments of the present invention may also introduce an attention mechanism, combining the values of the state vectors in the session-layer neural network with different weights into a new session-layer state vector, so that the attention mechanism better captures the user's preference information.

Brief Description of the Drawings

The above and other objects, features and advantages of the present disclosure will become more apparent from the following more detailed description of exemplary embodiments of the present disclosure taken in conjunction with the accompanying drawings, in which the same reference numerals generally denote the same components.

Fig. 1 is a schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention;

Fig. 2 is another schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention;

Fig. 3 is a schematic diagram of an NQS model according to an embodiment of the present invention;

Fig. 4 is a schematic diagram of an HNQS model according to an embodiment of the present invention;

Fig. 5 is a schematic diagram of an AHNQS model according to an embodiment of the present invention;

Fig. 6 shows the performance of each model on the Recall metric for different query session lengths according to an embodiment of the present invention;

Fig. 7 shows the performance of each model on the MRR metric for different query session lengths according to an embodiment of the present invention;

Fig. 8 is a schematic block diagram of a hierarchical neural network query recommendation device according to an embodiment of the present invention;

Fig. 9 is a schematic block diagram of hierarchical neural network query recommendation equipment according to an embodiment of the present invention.

Detailed Description

To make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to specific embodiments and the accompanying drawings.

Although preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.

The present invention provides a hierarchical neural network query recommendation method, and specifically an attention-based hierarchical neural network query recommendation method, which improves query recommendation efficiency and accuracy.

The technical solutions of the embodiments of the present invention are described in detail below with reference to the accompanying drawings.

Fig. 1 is a schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention. The method can be applied in a hierarchical neural network query recommendation device.

Referring to Fig. 1, the method includes:

In step 101, two neural networks are established: a session-layer neural network for modeling the user's short-term query records and a user-layer neural network for modeling the user's long-term query records, where the state vector of the session-layer network at the current time step is used as the input of the user-layer network at the current time step, and the state vector of the user-layer network at the current time step is used as the input of the session-layer network at the next time step.

In this step, the state vector of the session-layer neural network at time t may be used as the input of the user-layer neural network at time t, the state vector of the user-layer neural network at time t as the input of the session-layer neural network at time t+1, and the state vector of the session-layer neural network at time t+1 as the input of the user-layer neural network at time t+1, where t is greater than or equal to 0.

Specifically, the state vector of the user-layer neural network at time t may be used to initialize the first hidden-layer state vector of the session-layer neural network at time t+1, and the last hidden-layer state vector of the session-layer neural network at time t+1 may be used as the input of the user-layer neural network at time t+1.

Here, the state vector of the last hidden layer of the user-layer neural network at time t may be used to initialize the first hidden-layer state vector of the session-layer neural network at time t+1.

In step 102, query recommendations are output for the query session according to the session-layer neural network and the user-layer neural network.

It can be seen that the technical solution of the embodiments of the present invention establishes two neural networks: a session-layer neural network for modeling a user's short-term query records and a user-layer neural network for modeling the user's long-term query records. The state vector of the session-layer network at the current time step is used as the input of the user-layer network at the current time step, and the state vector of the user-layer network at the current time step is used as the input of the session-layer network at the next time step. When a query session is received, query recommendations can be output for it according to the session-layer and user-layer networks. The session-layer network models the user's short-term query records, the user-layer network models the user's long-term query records, and the inputs and outputs of the two networks are linked to each other. Thus, the solution of the present invention considers not only the current short-term query records but also the user's long-term query history, and it tightly couples the state vectors of the two networks so that they are considered jointly, improving the efficiency and accuracy of query recommendation.

Fig. 2 is another schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention. Fig. 2 describes the solution of the present invention in more detail than Fig. 1. The method can be applied in a hierarchical neural network query recommendation device.

The present invention proposes a hierarchical neural network query recommendation method based on an attention mechanism. The method involves two neural networks: one called the session-layer neural network (session layer for short) and the other called the user-layer neural network (user layer for short). The session-layer network addresses the user's short-term query records and can be used to model them, while the user-layer network addresses the user's long-term query records and can be used to model those. A typical session-layer network includes an input layer, hidden layers, an output layer, and so on; the input layer is the encoded representation of each query in the query session, and the hidden-layer state vectors are computed from the input layer.

The state vector of the session-layer network (which characterizes the short-term query session) can be used as the input of the user-layer network; likewise, the state vector of the user-layer network (which characterizes the user) can be used as the input of the session-layer network. Specifically, the state vector of the session-layer network at the current time step serves as the input of the user-layer network at the current time step, and the state vector of the user-layer network at the current time step serves as the input of the session-layer network at the next time step. For example, the state vector of the session-layer network at time t can be used as the input of the user-layer network at time t, the state vector of the user-layer network at time t as the input of the session-layer network at time t+1, and the state vector of the session-layer network at time t+1 as the input of the user-layer network at time t+1. In particular, the state vector of the user-layer network at time t can be used to initialize the first hidden-layer state vector of the session-layer network at time t+1, and the last hidden-layer state vector of the session-layer network at time t+1 can be used as the input of the user-layer network at time t+1, where t is greater than or equal to 0. Here, a state vector is the value of the state variables at a given moment, called the state of the system at that moment. The value of the state variables at time t=0 is called the initial (or starting) state of the system, i.e. the initial (or starting) state vector; for example, the initial state of the system in the present invention may refer to the vector value of the first hidden layer of the session-layer network (i.e. the state vector at time t=0).

In addition, the present invention applies an attention mechanism to better capture the user's preference information. With the rise of deep learning, attention-based neural networks have become a hot topic in recent neural network research. The attention mechanism is a strategy first proposed in the field of visual images. Its idea is to increase the weight of useful information, so that the task-processing system focuses on finding the useful information in the input data that is most relevant to the current output, thereby improving the quality of the output. The attention mechanism is also a resource-allocation model: it mimics how the human brain works by concentrating more resources on important content. Research underlying the present invention found that within a query session, different queries express different user intents and carry different weights in expressing the user's interests; for example, a query clicked by the user may better reflect the user's query intent. The present invention uses the attention mechanism to assign different weights to different queries in a session, so that their hidden vectors can be combined with different weights into a new session state vector.
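The weighted combination described above can be sketched as a standard soft-attention pooling over the session's hidden-state vectors. The dot-product scoring form and the function name below are assumptions for illustration; the patent only states that hidden vectors are combined with different weights:

```python
import numpy as np

def attention_pool(hidden_states, query_vec):
    """Combine hidden-state vectors with softmax attention weights.

    hidden_states: (n, d) array, one row per hidden state in the session.
    query_vec: (d,) vector used to score each hidden state (e.g. a
               user-layer state); dot-product scoring is assumed here.
    Returns the attention-weighted session state vector of shape (d,).
    """
    scores = hidden_states @ query_vec            # (n,) relevance scores
    weights = np.exp(scores - scores.max())       # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden_states                # weighted combination
```

Hidden states that score higher against the query vector dominate the pooled state, which is how the mechanism emphasizes the queries that best express the user's intent.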

In experiments conducted on AOL (America Online Log, a search log dataset), the results show that the method of the present invention outperforms existing RNN-based query recommendation methods, improving the MRR (Mean Reciprocal Rank, which scores each query by the reciprocal of the rank of the correct answer in the system's result list, averaged over all queries) and Recall metrics by 21.86% and 22.99% respectively; the improvement is especially pronounced for short query sessions.
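For reference, the two evaluation metrics mentioned can be computed as follows. This is a generic sketch, not the patent's own evaluation code:

```python
def mrr(ranked_lists, targets):
    """Mean Reciprocal Rank: average of 1/rank of the target in each
    ranked list, counting 0 when the target is absent."""
    total = 0.0
    for ranked, target in zip(ranked_lists, targets):
        if target in ranked:
            total += 1.0 / (ranked.index(target) + 1)
    return total / len(targets)

def recall_at_k(ranked_lists, targets, k):
    """Fraction of cases where the target appears in the top-k results."""
    hits = sum(target in ranked[:k]
               for ranked, target in zip(ranked_lists, targets))
    return hits / len(targets)
```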

Referring to Fig. 2, the method includes:

In step 201, two neural networks, a session-layer neural network and a user-layer neural network, are established, where the state vector of the session-layer network at the current time step is used as the input of the user-layer network at the current time step, and the state vector of the user-layer network at the current time step is used as the input of the session-layer network at the next time step.

Prior-art query recommendation methods based on RNNs (Recurrent Neural Networks) are mainly session-layer methods. A typical session-layer network includes an input layer, hidden layers, an output layer, and so on; the input layer is the encoded representation of each query in the query session; the hidden-layer state vectors are computed from the input layer, as shown in Equation (1); the output layer is computed from the hidden layers, as shown in Equation (5). A session refers to a sequence of queries issued within a period of time, e.g. query 1, query 2, ..., query n, while the session layer refers to a neural network. In general, if two consecutive queries are issued within a set interval, e.g. 30 minutes, of each other, they can be considered to belong to the same query session.
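The 30-minute segmentation rule can be sketched as follows. The function name and the timestamp representation are illustrative assumptions, not from the patent:

```python
def split_into_sessions(query_log, gap_seconds=30 * 60):
    """Split a user's chronological query log into sessions.

    query_log: list of (timestamp_seconds, query_string) pairs, sorted
    by time. Two consecutive queries issued within `gap_seconds` of each
    other belong to the same session.
    """
    sessions = []
    current = []
    last_ts = None
    for ts, query in query_log:
        if last_ts is not None and ts - last_ts > gap_seconds:
            sessions.append(current)   # gap exceeded: close the session
            current = []
        current.append(query)
        last_ts = ts
    if current:
        sessions.append(current)
    return sessions
```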

Fig. 3 is a schematic diagram of an NQS (Neural Query Suggestion) model according to an embodiment of the present invention. The session-layer neural network comprises, from bottom to top, an input layer, a hidden layer and an output layer. Suppose a query session contains n queries, denoted q_1, q_2, ..., q_n; to produce the state vectors fed into the RNN, each query is represented with 1-of-N encoding. The hidden-layer state vector can be computed by Equation (1), a standard GRU update (reconstructed here from the variable definitions that follow):

$$h_n = (1 - z_n)\,h_{n-1} + z_n\,\hat{h}_n \qquad (1)$$

where:

$$z_n = \sigma(W_z q_n + U_z h_{n-1}), \quad \hat{h}_n = \tanh\big(W q_n + U(r_n \odot h_{n-1})\big), \quad r_n = \sigma(W_r q_n + U_r h_{n-1})$$

Here $h_n$ denotes the hidden-layer state vector, $h_{n-1}$ the state vector of the previous hidden layer, $z_n$ the update weight, $\hat{h}_n$ the activated (candidate) state vector, $r_n$ the forget weight, and $q_n$ the n-th input query; $W_z, W_r, W$ denote weights applied to $q_n$ and $U_z, U_r, U$ weights applied to $h_{n-1}$; $\sigma(\cdot)$ denotes the Sigmoid activation function. The Sigmoid function, also known as the S-shaped growth curve, is often used as a threshold function in neural networks because it and its inverse are monotonically increasing, mapping variables into the interval (0, 1). $\tanh(\cdot)$ is the hyperbolic tangent function.
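The hidden-state update described above is the standard GRU recurrence; a minimal NumPy sketch follows, with randomly initialized weights and illustrative dimensions (vocabulary size 5, hidden size 3):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(q_n, h_prev, params):
    """One GRU update: h_n = (1 - z_n) * h_prev + z_n * h_hat."""
    Wz, Uz, Wr, Ur, W, U = (params[k] for k in ("Wz", "Uz", "Wr", "Ur", "W", "U"))
    z = sigmoid(Wz @ q_n + Uz @ h_prev)          # update gate
    r = sigmoid(Wr @ q_n + Ur @ h_prev)          # reset (forget) gate
    h_hat = np.tanh(W @ q_n + U @ (r * h_prev))  # candidate state
    return (1.0 - z) * h_prev + z * h_hat

rng = np.random.default_rng(0)
params = {k: (rng.standard_normal((3, 5)) if k in ("Wz", "Wr", "W")
              else rng.standard_normal((3, 3)))
          for k in ("Wz", "Uz", "Wr", "Ur", "W", "U")}
q = np.eye(5)[2]   # 1-of-N encoding of the 3rd query in the vocabulary
h = np.zeros(3)    # initial hidden state
h = gru_step(q, h, params)
```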

RNN_session and RNN_user denote the GRU (Gated Recurrent Unit) functions of the session-level and user-level networks, respectively, and the state vector of the session-level neural network is denoted S_t. The output of this neural network is the predicted score of the next query recommendation item:

ŷ = g(h_{n,t} W_y)    (5)

Here ŷ denotes the value of the output layer of the session-level neural network, i.e. the predicted score of the next query recommendation item, g( ) denotes the tanh activation function, h_{n,t} denotes the state vector of the n-th hidden layer of the session-level neural network at time t, and W_y denotes the output-layer weight matrix.

A pairwise-comparison loss function can be selected, because it pushes recommendation items with higher scores, which are more likely to be the accurate query, toward the front of the list, thereby improving user satisfaction. In recommender systems, the cross-entropy and TOP1 loss functions are most commonly used, and TOP1 performs better; therefore, the following loss function Loss can be adopted:

Loss = (1/N_S) · Σ_{j=1..N_S} [ σ(r̂_j − r̂_i) + σ(r̂_j²) ]    (6)

Here N_S denotes the number of negative samples, and r̂_j and r̂_i denote the score of a negative sample and the score of the correct item, respectively.
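A minimal sketch of the TOP1 loss of formula (6), assuming `r_pos` is the score of the correct next query and `r_neg` holds the negative-sample scores (names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def top1_loss(r_pos, r_neg):
    """TOP1 pairwise ranking loss: mean over negative samples of
    sigmoid(r_neg - r_pos) plus the regularizer sigmoid(r_neg ** 2)."""
    r_neg = np.asarray(r_neg, dtype=float)
    return np.mean(sigmoid(r_neg - r_pos) + sigmoid(r_neg ** 2))

# a well-ranked case (correct item scored far above the negatives)
loss_good = top1_loss(5.0, [-2.0, -1.5, -3.0])
# a badly-ranked case (negatives scored above the correct item)
loss_bad = top1_loss(-1.0, [2.0, 1.5, 3.0])
print(loss_good < loss_bad)  # True
```

The second term pushes the negative-sample scores themselves toward zero, which is why even the well-ranked case does not reach a zero loss.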

Compared with the RNN-based query recommendation methods in the prior art, the present invention proposes a user-session RNN query recommendation method.

In the present invention, the state vector of the last hidden layer of each session-level neural network at the current moment is used as the input of the user-level neural network at the current moment; the state vector of the last hidden layer of the user-level neural network at the current moment can also be used to initialize the state vector of the first hidden layer of the session-level neural network at the next moment. For example, the state vector of the last hidden layer of the user-level neural network at time t is used to initialize the first hidden-layer state vector of the session-level neural network at time t+1.

The state vector of the hidden layer of the user-level neural network is computed as shown in formula (7); the state vector of the first hidden layer of the session-level neural network is computed as shown in formula (8).

The session-RNN-based query recommendation methods in the prior art only model the user's short-term query records and do not model the user's long-term search records. Therefore, the present invention proposes a hierarchical NQS method, namely HNQS (Hierarchical Neural Query Suggestion). Referring to FIG. 4, it is a schematic diagram of an HNQS model according to an embodiment of the present invention. In FIG. 4, the lower part is the session-level neural network (session-level RNN) and the upper part is the user-level neural network (user-level RNN). FIG. 4 includes two neural networks, a session-level neural network and a user-level neural network; the lower part includes the session-level neural networks at times t, t+1 and t+2, and the upper part includes the user-level neural networks at times t and t+1. The state vector of the session-level neural network at time t is used as the input of the user-level neural network at time t; the state vector of the user-level neural network at time t is then used as the input of the session-level neural network at time t+1; and the state vector of the session-level neural network at time t+1 is used as the input of the user-level neural network at time t+1. Specifically, the state vector of the user-level neural network at time t is used to initialize the first hidden-layer state vector of the session-level neural network at time t+1, and the last hidden-layer state vector of the session-level neural network at time t+1 is used as the input of the user-level neural network at time t+1. Similarly, the state vector of the session-level neural network at time t+1 is used as the input of the user-level neural network at time t+1, and the state vector of the user-level neural network at time t+1 is then used as the input of the session-level neural network at time t+2, and so on.

For the session-level neural network, the query words in a query session are encoded and then fed into the session-level neural network as its inputs. For the user-level neural network, the last hidden-layer vectors of the session-level neural networks at times t, t+1 and t+2, i.e. the session-level state vectors, are used in turn as inputs, and the output is the last hidden-layer vector of the user-level neural network, i.e. the user-level state vector. As to the connection between the two levels, the session-level state vector at time t is used as the input of the user-level neural network at time t, and the user-level state vector at time t is then used as the input of the session-level neural network at time t+1.

Referring to FIG. 4, for the query session at time t+1, the modeling process of the two neural networks can be as follows. The state vector U_t of the user-level neural network at time t (i.e. the last hidden-layer vector of the user-level neural network at time t, h_{t,u} = U_t) and the encoding of the first query word q_{1,t+1} of the query session at time t+1 are taken as inputs to compute the first hidden-layer vector of the session-level neural network; the subsequent hidden-layer vectors are then computed in turn by combining the encodings of the remaining query words of the session, until the last query word of the session has been processed. The resulting last hidden-layer vector of the session-level neural network can be called the session-level state vector S_{t+1}, and this state vector S_{t+1} is then input into the user-level neural network at time t+1. Combined with the state vector U_t of the user-level neural network at time t (i.e. the last hidden-layer vector of the user-level neural network at time t, h_{t,u} = U_t), this produces the user-level state vector U_{t+1} at time t+1 (i.e. the last hidden-layer vector of the user-level neural network at time t+1, h_{t+1,u} = U_{t+1}). In this way, the input of one query session and its processing by the two neural networks is completed; subsequently received query sessions are processed in the same way.
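The session/user interaction described above can be sketched as follows. To keep the example short, the GRU cells are replaced by a simple tanh recurrence; the class, the dimensions and the random seeds are illustrative assumptions, not the patented implementation:

```python
import numpy as np

class SimpleRNNCell:
    """Stand-in for a GRU cell: h' = tanh(W x + U h)."""
    def __init__(self, d_in, d_h, seed):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((d_h, d_in)) * 0.1
        self.U = rng.standard_normal((d_h, d_h)) * 0.1

    def __call__(self, x, h):
        return np.tanh(self.W @ x + self.U @ h)

d_q, d_h = 4, 3                      # query-encoding size, hidden size (toy values)
rnn_session = SimpleRNNCell(d_q, d_h, seed=0)
rnn_user = SimpleRNNCell(d_h, d_h, seed=1)
W_init = np.random.default_rng(2).standard_normal((d_h, d_h)) * 0.1
b_init = np.zeros(d_h)

def process_session(queries, u):
    """Run one session at time t+1: initialize from U_t, return (S_{t+1}, U_{t+1})."""
    h = np.tanh(W_init @ u + b_init)   # first hidden state, initialized from U_t
    for q in queries:                  # fold in each encoded query of the session
        h = rnn_session(q, h)
    s = h                              # last hidden state = session state S_{t+1}
    u_next = rnn_user(s, u)            # user-level update from S_{t+1} and U_t
    return s, u_next

u = np.zeros(d_h)                      # user-level state before any session
sess1 = [np.eye(d_q)[0], np.eye(d_q)[2]]   # two 1-of-N encoded queries
sess2 = [np.eye(d_q)[1]]
s1, u = process_session(sess1, u)
s2, u = process_session(sess2, u)
print(u.shape)  # (3,)
```

Each call consumes one whole session and threads the user-level state forward, matching the alternation between the two networks shown in FIG. 4.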

The input of the user-level neural network is the state vector of the session-level neural network, and the user-level neural network is updated as:

h_{n,u} = RNN_user(S_n, h_{n−1,u})    (7)

Here h_{n,u} denotes the state vector of the n-th hidden layer of the user-level neural network, S_n denotes the state vector of the n-th session-level neural network, u denotes user u, and RNN_user( ) denotes the user-level neural network.

In the present invention, the state vector of the user-level neural network is input into the session-level neural network, which can be used to introduce the user's long-term search preference information:

h_{0,t+1} = tanh(W · U_t + b)    (8)

Here h_{0,t+1} denotes the state vector of the first hidden layer of the session-level neural network at time t+1, U_t denotes the state vector of the user-level neural network, W denotes the weight matrix, and b denotes the bias vector.

In step 202, the state vectors of all hidden layers in the session-level neural network are combined with different weights into a new state vector of the session-level neural network.

The present invention further establishes an attention-based HNQS model, namely AHNQS (Attention-based Hierarchical Neural Query Suggestion), which combines the state vectors of all hidden layers in the session-level neural network with different weights into a new session-level state vector, which is then used as the input of the user-level neural network.

In the solution of the present invention, it is considered that within one query session, different queries express different user intents and carry different weights when expressing the user's interests; for example, a query that was clicked by the user may better reflect the user's query intent. Through the attention mechanism, the present invention can assign different weights to different queries in the session-level neural network, so that the state vectors of all hidden layers in the session-level neural network can be combined with different weights into a new session-level state vector. Referring to FIG. 5, it is a schematic diagram of an AHNQS model according to an embodiment of the present invention. In FIG. 5, the lower part is the session-level neural network (session-level RNN) and the upper part is the user-level neural network (user-level RNN), where the session level introduces user attention.

As shown in FIG. 5:

In the user-level neural network, the user state information is updated as follows:

h_{t,u} = RNN_user(S_t, h_{t−1,u})    (9)

where:

S_t = Σ_{j=1..n} α_{j,t} · h_{j,t}    (10)
α_{j,t} = exp(e_{j,t}) / Σ_{k=1..n} exp(e_{k,t}),  with e_{j,t} = h^T_{t−1,u} · h_{j,t}    (11)

Here S_t denotes the state vector of the session-level neural network at time t, computed by formula (10); h_{j,t} denotes the state vector of the j-th hidden layer of the session-level neural network at time t; α_{j,t} denotes the weight of the j-th hidden layer of the session-level neural network at time t; and e_{j,t} is an intermediate parameter. In this way, these parameters can be continuously updated through the loss function. h_{t,u} denotes the t-th hidden-layer state vector of user u in the user-level neural network, u denotes user u, h^T_{t−1,u} denotes the transposed hidden-layer state vector of user u at time t−1, and exp( ) denotes the exponential function. Through the weights α_{j,t}, the present invention combines the state vectors of all hidden layers of the session-level neural network into a new session-level state vector.
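The attention combination described above can be sketched as follows, under the assumption that the score of each session hidden state is its dot product with the previous user state; the function name and the toy vectors are illustrative:

```python
import numpy as np

def attention_session_state(hidden_states, h_user_prev):
    """Combine all session hidden states h_{j,t} into S_t using softmax
    weights alpha_{j,t} over dot-product scores with the previous user state."""
    H = np.asarray(hidden_states, dtype=float)  # shape (n, d_h)
    e = H @ h_user_prev                         # intermediate scores e_{j,t}
    alpha = np.exp(e - e.max())                 # numerically stable softmax
    alpha = alpha / alpha.sum()
    return alpha, alpha @ H                     # weights and session state S_t

h_user_prev = np.array([1.0, 0.0])
hidden_states = [[2.0, 0.0], [0.0, 2.0], [-2.0, 0.0]]
alpha, s_t = attention_session_state(hidden_states, h_user_prev)
print(alpha.argmax())  # 0: the hidden state most aligned with the user state
```

The hidden state that best matches the user's previous state receives the largest weight, so queries that better reflect the user's intent dominate the combined session state.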

In step 203, according to the session-level neural network with the updated session-level state vector and the user-level neural network, query recommendation content is output for the query session.

After the state vectors of all hidden layers in the session-level neural network are assigned different weights and combined into a new session-level state vector, this vector is again used as the input of the user-level neural network, so that the state vectors of the session-level neural network and the user-level neural network change accordingly.

In this step, query recommendation content is output for the query session according to the session-level neural network with the updated state vector and the user-level neural network, so that more accurate query recommendation content can be obtained.

The solution of the present invention examines the HNQS and AHNQS models through experiments to determine whether they can improve the effect of query recommendation, i.e. whether the hierarchical mechanism and the attention mechanism are helpful. In addition, the recommendation effect may change as the length of the query session changes.

In combination with the foregoing, the query recommendation models described in the present invention include: (1) the ADJ (original co-occurrence-based ranking) model; (2) the NQS model; (3) the HNQS model; (4) the AHNQS model.

The data set and the specific parameter settings of the present invention are shown in Table 1 and Table 2.

Table 1

Model    Batch size    Dropout rate    Learning rate    Momentum
NQS      50            0.5             0.01             0.0
HNQS     50            0.1             0.1              0.0
AHNQS    50            0.1             0.1              0.0

Table 2

The present invention uses the MRR and Recall metrics to measure the performance of all models. From the experimental results, the following conclusions are drawn:

1) Overall performance

The performance of all models is shown in Table 3, where a statistical significance test was performed on the best model results, as shown in Table 3 below:

Model    Recall@10    MRR@10
ADJ      .7072        .6922
NQS      .6444        .6148
HNQS     .8138▲       .7874▲
AHNQS    .8618▲       .8514▲

Table 3

From the above experimental results, it can be seen that AHNQS achieves the highest values, i.e. the best results, followed by HNQS and ADJ, with NQS performing worst. On the MRR and Recall metrics, AHNQS is 21.86% and 22.99% higher than ADJ, respectively, while HNQS is only 5.9% and 8.13% higher, which further shows that the attention mechanism of the present invention is helpful for improving the effect of query recommendation.

To further study the influence of session length on the method of the present invention, the length of the query session is divided into short, medium and long, and the different models are evaluated at the different query session lengths; the results are shown in FIG. 6 and FIG. 7. FIG. 6 shows the performance of each model for different query session lengths under the Recall metric according to an embodiment of the present invention; FIG. 7 shows the performance of each model for different query session lengths under the MRR metric according to an embodiment of the present invention. In FIG. 6 and FIG. 7, for each session length (short, medium, long), the four bars correspond to ADJ, NQS, HNQS and AHNQS, respectively.

It can be seen that at every session length, AHNQS achieves the highest values, i.e. the best performance, followed by HNQS; moreover, on the MRR metric the improvement of AHNQS over HNQS is larger than on the Recall metric, which further verifies that the attention mechanism proposed by the method of the present invention clearly helps the accuracy of query recommendation.

The attention-based hierarchical neural network query recommendation method of the present invention has been described in detail above; the corresponding query recommendation apparatus and device of the present invention are described below.

FIG. 8 is a schematic block diagram of a hierarchical neural network query recommendation apparatus according to an embodiment of the present invention.

Referring to FIG. 8, a hierarchical neural network query recommendation apparatus 80 includes: a hierarchical neural network establishment module 81, a recommendation output module 82, and an attention mechanism processing module 83.

The hierarchical neural network establishment module 81 is configured to establish two neural networks, a session-level neural network for modeling the user's short-term query records and a user-level neural network for modeling the user's long-term query records, wherein the state vector of the session-level neural network at the current moment is used as the input of the user-level neural network at the current moment, and the state vector of the user-level neural network at the current moment is used as the input of the session-level neural network at the next moment.

The recommendation output module 82 is configured to output query recommendation content for a query session according to the session-level neural network and the user-level neural network established by the hierarchical neural network establishment module 81.

The attention mechanism processing module 83 is configured to combine the values of the state vectors in the session-level neural network established by the hierarchical neural network establishment module 81 with different weights into a new session-level state vector; the recommendation output module 82 outputs query recommendation content for the query session according to the session-level neural network with the updated state vector and the user-level neural network.

The hierarchical neural network establishment module 81 may use the state vector of the session-level neural network at time t as the input of the user-level neural network at time t, use the state vector of the user-level neural network at time t as the input of the session-level neural network at time t+1, and use the state vector of the session-level neural network at time t+1 as the input of the user-level neural network at time t+1, where t is greater than or equal to 0.

The state vector of the user-level neural network at time t may be used to initialize the first hidden-layer state vector of the session-level neural network at time t+1, and the last hidden-layer state vector of the session-level neural network at time t+1 may be used as the input of the user-level neural network at time t+1.

The state vector of the last hidden layer of the user-level neural network at time t may be used to initialize the first hidden-layer state vector of the session-level neural network at time t+1.

The attention mechanism processing module 83 may combine the values of the state vectors of all hidden layers in the session-level neural network with different weights into a new session-level state vector.

FIG. 9 is a schematic block diagram of a hierarchical neural network query recommendation device according to an embodiment of the present invention.

Referring to FIG. 9, a hierarchical neural network query recommendation device 90 includes: a processor 91 and a memory 92.

The processor 91 establishes two neural networks, a session-level neural network for modeling the user's short-term query records and a user-level neural network for modeling the user's long-term query records, wherein the state vector of the session-level neural network at the current moment is used as the input of the user-level neural network at the current moment, and the state vector of the user-level neural network at the current moment is used as the input of the session-level neural network at the next moment; and outputs query recommendation content for a query session according to the session-level neural network and the user-level neural network.

The memory 92 stores the query recommendation content output by the processor 91.

An embodiment of the present invention further provides a non-transitory machine-readable storage medium on which executable code is stored; when the executable code is executed by a processor of an electronic device, the processor is caused to execute the following method:

establishing two neural networks, a session-level neural network for modeling the user's short-term query records and a user-level neural network for modeling the user's long-term query records, wherein the state vector of the session-level neural network at the current moment is used as the input of the user-level neural network at the current moment, and the state vector of the user-level neural network at the current moment is used as the input of the session-level neural network at the next moment;

outputting query recommendation content for a query session according to the session-level neural network and the user-level neural network.

In summary, the present invention provides an attention-based hierarchical neural network query recommendation method, in which the hierarchical mechanism includes a session-level neural network and a user-level neural network: the session-level neural network can be used to model the user's short-term query records, and the user-level neural network can be used to model the user's long-term queries. The attention mechanism proposed by the present invention can better capture the user's preference information. Experiments were conducted on the AOL data set, and the results show that the method of the present invention outperforms the existing neural-network-based query recommendation methods, improving the MRR and Recall metrics by 21.86% and 22.99%, respectively; the improvement is especially obvious for short query sessions.

The technical solution according to the present invention has been described in detail above with reference to the accompanying drawings.

Furthermore, the method according to the present invention may also be implemented as a computer program or computer program product comprising computer program code instructions for executing the steps defined in the above method of the present invention.

Alternatively, the present invention may also be implemented as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) on which executable code (or a computer program, or computer instruction code) is stored; when the executable code (or computer program, or computer instruction code) is executed by a processor of an electronic device (or computing device, server, etc.), the processor is caused to perform the steps of the above method according to the present invention.

Those skilled in the art will also appreciate that the various exemplary logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or a combination of both.

Those of ordinary skill in the art should understand that the above are only specific embodiments of the present invention and are not intended to limit the present invention; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. A hierarchical neural network query recommendation method, characterized by comprising:
establishing two neural networks, a session-level neural network for modeling a user's short-term query records and a user-level neural network for modeling the user's long-term query records, wherein a state vector of the session-level neural network at a current moment is used as an input of the user-level neural network at the current moment, and a state vector of the user-level neural network at the current moment is used as an input of the session-level neural network at a next moment; and
outputting query recommendation content for a query session according to the session-level neural network and the user-level neural network.
2. The method according to claim 1, characterized in that the method further comprises:
combining values of the state vectors in the session-level neural network with different weights into a new state vector of the session-level neural network;
wherein the outputting query recommendation content for a query session according to the session-level neural network and the user-level neural network comprises:
outputting query recommendation content for the query session according to the session-level neural network with the updated state vector and the user-level neural network.
3. The method according to claim 1, characterized in that using the state vector of the session-level neural network at the current moment as the input of the user-level neural network at the current moment, and using the state vector of the user-level neural network at the current moment as the input of the session-level neural network at the next moment, comprises:
using the state vector of the session-level neural network at time t as the input of the user-level neural network at time t, using the state vector of the user-level neural network at time t as the input of the session-level neural network at time t+1, and using the state vector of the session-level neural network at time t+1 as the input of the user-level neural network at time t+1, where t is greater than or equal to 0.
4. The method according to claim 3, characterized in that using the state vector of the user-level neural network at time t as the input of the session-level neural network at time t+1, and using the state vector of the session-level neural network at time t+1 as the input of the user-level neural network at time t+1, comprises:
using the state vector of the user-level neural network at time t to initialize the first hidden-layer state vector of the session-level neural network at time t+1; and using the last hidden-layer state vector of the session-level neural network at time t+1 as the input of the user-level neural network at time t+1.
5. The method according to claim 4, characterized in that using the state vector of the user-level neural network at time t to initialize the first hidden-layer state vector of the session-level neural network at time t+1 comprises:
using the state vector of the last hidden layer of the user-level neural network at time t to initialize the first hidden-layer state vector of the session-level neural network at time t+1.
6. The method according to claim 2, characterized in that combining the values of the state vectors in the session-level neural network with different weights into a new state vector of the session-level neural network comprises:
combining the values of the state vectors of all hidden layers in the session-level neural network with different weights into a new state vector of the session-level neural network.
7. A hierarchical neural network query recommendation apparatus, characterized by comprising:
a hierarchical neural network establishment module, configured to establish two neural networks, a session-level neural network for modeling a user's short-term query records and a user-level neural network for modeling the user's long-term query records, wherein a state vector of the session-level neural network at a current moment is used as an input of the user-level neural network at the current moment, and a state vector of the user-level neural network at the current moment is used as an input of the session-level neural network at a next moment; and
a recommendation output module, configured to output query recommendation content for a query session according to the session-level neural network and the user-level neural network established by the hierarchical neural network establishment module.
8. The apparatus according to claim 7, characterized in that the apparatus further comprises:
an attention mechanism processing module, configured to combine values of the state vectors in the session-level neural network established by the hierarchical neural network establishment module with different weights into a new state vector of the session-level neural network;
wherein the recommendation output module outputs query recommendation content for the query session according to the session-level neural network with the updated state vector and the user-level neural network.
9. The apparatus according to claim 7, characterized in that:
the hierarchical neural network establishment module uses the state vector of the session-level neural network at time t as the input of the user-level neural network at time t, uses the state vector of the user-level neural network at time t as the input of the session-level neural network at time t+1, and uses the state vector of the session-level neural network at time t+1 as the input of the user-level neural network at time t+1, where t is greater than or equal to 0.
, take the state vector of the session layer neural network at time t+1 as the input of the user layer neural network at time t+1, where t is greater than or equal to 0. 10.根据权利要求8所述的装置,其特征在于:10. The device of claim 8, wherein: 所述注意力机制处理模块将所述会话层神经网络中所有隐藏层的状态向量的值以不同的权重组合成新的会话层神经网络的状态向量。The attention mechanism processing module combines the values of state vectors of all hidden layers in the session layer neural network with different weights to synthesize the state vectors of a new session layer neural network.
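The state-passing recurrence of claims 3-5 (user-layer state at time t initializes the session-layer network at time t+1, whose final hidden state then feeds the user layer) and the weighted combination of session hidden states in claims 2 and 6 can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the GRU cells, the hidden dimensionality D, the random parameters, and the softmax attention scoring vector w_attn are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # hidden/state dimensionality (illustrative choice)

def gru_cell(x, h, W, U, b):
    """One GRU step: new state from input x and previous state h."""
    z = 1 / (1 + np.exp(-(W[0] @ x + U[0] @ h + b[0])))  # update gate
    r = 1 / (1 + np.exp(-(W[1] @ x + U[1] @ h + b[1])))  # reset gate
    n = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])        # candidate state
    return (1 - z) * h + z * n

def make_params():
    # Random small weights; a real system would train these.
    return (rng.normal(0, 0.1, (3, D, D)),
            rng.normal(0, 0.1, (3, D, D)),
            np.zeros((3, D)))

sess_params = make_params()  # session-layer network (short-term records)
user_params = make_params()  # user-layer network (long-term records)

def run_session(queries, h0):
    """Run the session-layer network over one session's query embeddings,
    starting from h0 (supplied by the user layer); return all hidden states."""
    h, states = h0, []
    for q in queries:
        h = gru_cell(q, h, *sess_params)
        states.append(h)
    return np.stack(states)

def attention(states, w):
    """Combine all session-layer hidden states with softmax weights into a
    single new session-layer state vector (claims 2 and 6)."""
    scores = states @ w
    alphas = np.exp(scores - scores.max())
    alphas /= alphas.sum()
    return alphas @ states

# Two consecutive sessions of one user, three queries each (random embeddings).
sessions = [rng.normal(size=(3, D)) for _ in range(2)]
w_attn = rng.normal(size=D)

user_h = np.zeros(D)  # user-layer state before the first session
for t, queries in enumerate(sessions):
    # Claims 4-5: the user-layer state at time t initializes the first
    # hidden state of the session-layer network at time t+1.
    states = run_session(queries, h0=user_h)
    # Claim 3: the session-layer final state feeds the user-layer network.
    user_h = gru_cell(states[-1], user_h, *user_params)
    # Claims 2/6: attention over all session-layer hidden states.
    session_repr = attention(states, w_attn)
```

A ranking layer (not shown) would score candidate queries against `session_repr` to produce the recommendation output of claims 1 and 7.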
CN201910255010.2A 2019-03-21 2019-04-01 Hierarchical neural network query recommendation method and device Pending CN109740743A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019102150519 2019-03-21
CN201910215051 2019-03-21

Publications (1)

Publication Number Publication Date
CN109740743A true CN109740743A (en) 2019-05-10

Family

ID=66371402

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910255010.2A Pending CN109740743A (en) 2019-03-21 2019-04-01 Hierarchical neural network query recommendation method and device

Country Status (1)

Country Link
CN (1) CN109740743A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320786A (en) * 2018-02-06 2018-07-24 South China University of Technology Chinese dish recommendation method based on a deep neural network
CN108959429A (en) * 2018-06-11 2018-12-07 Soochow University Movie recommendation method and system incorporating end-to-end training of visual features
CN109145213A (en) * 2018-08-22 2019-01-04 Tsinghua University Method and device for query recommendation based on historical information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANYU CHEN ET AL.: "Attention-based Hierarchical Neural Query Suggestion", arXiv *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110188283A (en) * 2019-06-05 2019-08-30 National University of Defense Technology Information recommendation method and system based on joint neural network collaborative filtering
CN110188283B (en) * 2019-06-05 2021-11-23 National University of Defense Technology Information recommendation method and system based on joint neural network collaborative filtering
CN112000873A (en) * 2020-06-19 2020-11-27 Nanjing University of Science and Technology Session-based recommendation system, method, device and storage medium
CN112000873B (en) * 2020-06-19 2022-10-18 Nanjing University of Science and Technology Session-based recommendation system, method, device and storage medium

Similar Documents

Publication Publication Date Title
US12400121B2 (en) Regularized neural network architecture search
US12314856B2 (en) Population based training of neural networks
CN111931062B (en) Training method and related device of information recommendation model
US11544536B2 (en) Hybrid neural architecture search
US11087201B2 (en) Neural architecture search using a performance prediction neural network
CN109902708B (en) A recommendation model training method and related device
US12306867B2 (en) Production method of multimedia work, apparatus, and computer-readable storage medium
US20210019599A1 (en) Adaptive neural architecture search
CN112257841B (en) Data processing method, device, equipment and storage medium in graph neural network
CN118043802B (en) Recommendation model training method and device
CN105210064A (en) Classifying resources using a deep network
CN109241412A (en) A kind of recommended method, system and electronic equipment based on network representation study
WO2023029350A1 (en) Click behavior prediction-based information pushing method and apparatus
CN115879508A (en) Data processing method and related device
CN117217284A (en) A data processing method and its device
CN112364258B (en) Map-based recommendation methods, systems, storage media and electronic devices
CN115630297A (en) A kind of model training method and related equipment
US20220383195A1 (en) Machine learning algorithm search
CN115048560A (en) Data processing method and related device
CN110516164B (en) Information recommendation method, device, equipment and storage medium
CN109740743A (en) Hierarchical neural network query recommendation method and device
CN117009621A (en) Information search methods, devices, electronic equipment, storage media and program products
CN116595252A (en) A data processing method and related device
CN116955360A (en) Event recommendation method and related device based on user interaction sequence
CN114637926A (en) Content recommendation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190510