一种基于多元数据融合的引文网络知识表示方法
陈文杰
2019-09-24 | |
Source Publication | 情报理论与实践
Volume | 10
Issue | 3
Pages | 173
Abstract | [Purpose/significance] Effectively fusing multi-dimensional data in citation networks, such as citation relations and text attributes, strengthens the semantic associations between document nodes and thereby provides solid support for tasks such as data mining and knowledge discovery. [Method/process] A knowledge representation method for citation networks is proposed: a neural network model first learns the k-order proximity structure of the citation network; the doc2vec model then learns text attributes such as titles and abstracts; finally, a cross-learning mechanism based on vector sharing is given for fusing the multi-dimensional data. [Result/conclusion] In tests on a CNKI citation dataset for the stem cell field, the method achieves good link prediction performance, which demonstrates its effectiveness and soundness.
Other Abstract | [Purpose/significance] Effectively integrating multi-dimensional data such as citation relations and text attributes in citation networks enhances the semantic association between document nodes, thus providing powerful support for data mining and knowledge discovery. [Method/process] A knowledge representation method for citation networks is proposed. First, a neural network model is used to learn the k-order neighbor structure of the citation network; then the doc2vec model is used to learn text attributes such as titles and abstracts; finally, a cross-learning mechanism based on vector sharing is presented for multi-dimensional data fusion. [Result/conclusion] In tests on a CNKI citation dataset for the stem cell field, the method achieves good performance on link prediction, which demonstrates its effectiveness and scientific soundness.
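The method described above has two embedding components: a neural model for the k-order neighbor structure of the citation graph, and doc2vec for titles and abstracts. The paper's exact architecture is not reproduced in this record, so the sketch below is only an illustrative approximation: a DeepWalk-style random-walk skip-gram (Word2Vec, matching the record's keyword) stands in for the structural component, and gensim's Doc2Vec handles the text component. Function names, parameters, and the random-walk setup are assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' model); assumes gensim >= 4.0 and networkx.
# (1) Structural vectors: uniform random walks over the citation graph + skip-gram,
#     a DeepWalk-style stand-in for learning k-order neighbor structure.
# (2) Text vectors: Doc2Vec over title/abstract token lists.
import random

import networkx as nx
from gensim.models import Word2Vec
from gensim.models.doc2vec import Doc2Vec, TaggedDocument


def random_walks(graph: nx.DiGraph, walk_length: int = 10, walks_per_node: int = 10):
    """Uniform random walks over the undirected view of the citation graph."""
    g = graph.to_undirected()
    walks = []
    for _ in range(walks_per_node):
        for start in g.nodes():
            walk = [start]
            while len(walk) < walk_length:
                neighbors = list(g.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(random.choice(neighbors))
            walks.append([str(node) for node in walk])
    return walks


def train_structure_vectors(graph: nx.DiGraph, dim: int = 128):
    """Approximate the k-order neighbor structure with skip-gram over random walks."""
    model = Word2Vec(random_walks(graph), vector_size=dim, window=5,
                     min_count=1, sg=1, epochs=5)
    return {node: model.wv[str(node)] for node in graph.nodes()}


def train_text_vectors(texts: dict, dim: int = 128):
    """Learn title/abstract vectors with Doc2Vec; `texts` maps node id -> token list."""
    corpus = [TaggedDocument(words=tokens, tags=[str(node)])
              for node, tokens in texts.items()]
    model = Doc2Vec(corpus, vector_size=dim, window=5, min_count=1, epochs=40)
    return {node: model.dv[str(node)] for node in texts}
```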
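The vector-sharing cross-learning fusion and the link-prediction test on the CNKI stem-cell dataset are likewise only sketched here under stated assumptions: fusion is approximated by concatenating the two vector views, and link prediction is scored with cosine similarity over held-out citation edges versus sampled non-edges, summarized as AUC. The helper names below are hypothetical; this is not the paper's evaluation code.

```python
# Hedged sketch of fusion and link-prediction scoring; the paper's vector-sharing
# cross-learning step is approximated here by plain concatenation of the two views.
import random

import numpy as np
from sklearn.metrics import roc_auc_score


def fuse(structure_vecs: dict, text_vecs: dict) -> dict:
    """Placeholder fusion: concatenate structural and text vectors per document node."""
    return {n: np.concatenate([structure_vecs[n], text_vecs[n]]) for n in structure_vecs}


def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))


def link_prediction_auc(vectors: dict, held_out_edges: list, all_nodes: list) -> float:
    """Score held-out citation edges against an equal number of sampled non-edges."""
    edge_set = set(held_out_edges)
    positives = [cosine(vectors[u], vectors[v]) for u, v in held_out_edges]
    negatives = []
    nodes = list(all_nodes)
    while len(negatives) < len(positives):
        u, v = random.sample(nodes, 2)
        if (u, v) not in edge_set and (v, u) not in edge_set:
            negatives.append(cosine(vectors[u], vectors[v]))
    labels = [1] * len(positives) + [0] * len(negatives)
    return roc_auc_score(labels, positives + negatives)
```

With vectors from the first sketch, `link_prediction_auc(fuse(structure_vecs, text_vecs), test_edges, list(citation_graph.nodes()))` would yield a rough AUC; any figure it produces reflects this approximation, not the results reported in the paper.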
Keyword | Citation network; Multi-dimensional data fusion; Knowledge representation; Word2vec; Doc2vec
Indexed By | CSSCI |
Language | Chinese
Document Type | Journal article
Identifier | http://ir.las.ac.cn/handle/12502/10737 |
Collection | 中国科学院成都文献情报中心_信息技术部 |
Corresponding Author | 陈文杰 |
Affiliation | 中国科学院成都文献情报中心 |
First Author Affiliation | 中国科学院文献情报中心
Corresponding Author Affiliation | 中国科学院文献情报中心
Recommended Citation GB/T 7714 | 陈文杰,许海云. 一种基于多元数据融合的引文网络知识表示方法[J]. 情报理论与实践,2019,10(3):173. |
APA | 陈文杰, & 许海云. (2019). 一种基于多元数据融合的引文网络知识表示方法. 情报理论与实践, 10(3), 173.
MLA | 陈文杰, et al. "一种基于多元数据融合的引文网络知识表示方法." 情报理论与实践 10.3 (2019): 173.
Files in This Item:
File Name/Size | DocType | Version | Access | License | ||
一种基于多元数据融合的引文网络知识表示方(275KB) | Journal article | Author's accepted manuscript | Open access | CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.