word2vec

This tool provides an efficient implementation of the continuous bag-of-words (CBOW) and skip-gram architectures for computing vector representations of words. The word2vec tool takes a text corpus as input and produces word vectors as output: it first constructs a vocabulary from the training text data and then learns a vector representation for each word. The resulting word vectors can be used as features in many natural language processing and machine learning applications and for further research.
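As a rough illustration of the input/output flow described above, the sketch below trains word vectors on a toy corpus. It uses the gensim library's reimplementation of word2vec rather than the original command-line C tool; gensim and its 4.x API are assumptions here, not part of the original distribution.

    # A minimal sketch of word2vec training, assuming gensim 4.x.
    from gensim.models import Word2Vec

    # Toy corpus: a list of tokenized sentences. In practice the
    # input would be a large text corpus streamed from disk.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["word", "vectors", "capture", "semantic", "similarity"],
    ]

    # sg=1 selects the skip-gram architecture; sg=0 selects CBOW.
    model = Word2Vec(
        sentences,
        vector_size=100,  # dimensionality of the word vectors
        window=5,         # context window size
        min_count=1,      # keep all words in this toy vocabulary
        sg=1,             # use skip-gram
        workers=4,
    )

    # Look up a learned vector and query nearest neighbours.
    vec = model.wv["king"]
    print(model.wv.most_similar("king", topn=3))

On a corpus this small the neighbours are not meaningful; the example only shows the vocabulary-construction and training steps the description refers to.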


References in zbMATH (referenced in 147 articles)

Showing results 1 to 20 of 147, sorted by year (citations).

  1. Abheesht Sharma, Gunjan Chhablani, Harshit Pandey, Rajaswa Patil: DRIFT: A Toolkit for Diachronic Analysis of Scientific Literature (2021) arXiv
  2. Ataeva, O. M.; Serebryakov, V. A.; Tuchkova, N. P.: Using applied ontology to saturate semantic relations (2021)
  3. Benjamin Paaßen, Jessica McBroom, Bryn Jeffries, Irena Koprinska, Kalina Yacef: ast2vec: Utilizing Recursive Neural Encodings of Python Programs (2021) arXiv
  4. Burashnikova, Aleksandra; Maximov, Yury; Clausel, Marianne; Laclau, Charlotte; Iutzeler, Franck; Amini, Massih-Reza: Learning over no-preferred and preferred sequence of items for robust recommendation (2021)
  5. Chazal, Frédéric; Levrard, Clément; Royer, Martin: Clustering of measures via mean measure quantization (2021)
  6. Hu, Yifan; Hu, Changwei; Tran, Thanh; Kasturi, Tejaswi; Joseph, Elizabeth; Gillingham, Matt: What’s in a name? -- Gender classification of names with character based machine learning models (2021)
  7. Justin Shenk, Wolf Byttner, Saranraj Nambusubramaniyan, Alexander Zoeller: Traja: A Python toolbox for animal trajectory analysis (2021) not zbMATH
  8. Li, Jianxin; Ji, Cheng; Peng, Hao; He, Yu; Song, Yangqiu; Zhang, Xinmiao; Peng, Fanzhang: RWNE: a scalable random-walk based network embedding framework with personalized higher-order proximity preserved (2021)
  9. Loukachevitch, N. V.; Tikhomirov, M. M.; Parkhomenko, E. A.: Using embedding-based similarities to improve lexical resources (2021)
  10. Ma, Guixiang; Ahmed, Nesreen K.; Willke, Theodore L.; Yu, Philip S.: Deep graph similarity learning: a survey (2021)
  11. Mercurio, Paula; Liu, Di: Identifying transition states of chemical kinetic systems using network embedding techniques (2021)
  12. Moreo, Alejandro; Esuli, Andrea; Sebastiani, Fabrizio: Word-class embeddings for multiclass text classification (2021)
  13. Mortier, Thomas; Wydmuch, Marek; Dembczyński, Krzysztof; Hüllermeier, Eyke; Waegeman, Willem: Efficient set-valued prediction in multi-class classification (2021)
  14. Shimada, Takuya; Bao, Han; Sato, Issei; Sugiyama, Masashi: Classification from pairwise similarities/dissimilarities and unlabeled data via empirical risk minimization (2021)
  15. Volpi, Riccardo; Malagò, Luigi: Natural alpha embeddings (2021)
  16. Vorobyov, M.; Zhukov, K.; Grigorieva, M.; Korobkov, S.: Parallel version of the framework for clustering error messages (2021)
  17. Zhu, Yuanyuan; Hu, Bin; Chen, Lei; Dai, Qi: iMPTCE-Hnetwork: a multilabel classifier for identifying metabolic pathway types of chemicals and enzymes with a heterogeneous network (2021)
  18. Agrawal, Devanshu; Papamarkou, Theodore; Hinkle, Jacob: Wide neural networks with bottlenecks are deep Gaussian processes (2020)
  19. Aryal, Sunil; Ting, Kai Ming; Washio, Takashi; Haffari, Gholamreza: A comparative study of data-dependent approaches without learning in measuring similarities of data objects (2020)
  20. Atkinson, Katie; Bench-Capon, Trevor; Bollegala, Danushka: Explanation in AI and law: past, present and future (2020)