word2vec

This tool provides an efficient implementation of the continuous bag-of-words (CBOW) and skip-gram architectures for computing vector representations of words. These representations can subsequently be used in many natural language processing applications and for further research. The word2vec tool takes a text corpus as input and produces word vectors as output: it first constructs a vocabulary from the training text data and then learns vector representations of the words. The resulting word vector file can be used as features in many natural language processing and machine learning applications.
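As a minimal sketch of this corpus-in, vectors-out workflow, the following uses the gensim library's reimplementation of word2vec rather than the original C tool (gensim, its parameter names, and the file names "corpus.txt" and "vectors.txt" are assumptions here, not part of the original tool's interface; it assumes gensim >= 4.0):

    # Sketch: training word vectors with gensim's word2vec reimplementation.
    # "corpus.txt" is a hypothetical plain-text file, one tokenized sentence per line.
    from gensim.models import Word2Vec
    from gensim.models.word2vec import LineSentence

    # Build the vocabulary from the corpus, then learn the vectors.
    # sg=1 selects the skip-gram architecture; sg=0 would select CBOW.
    sentences = LineSentence("corpus.txt")
    model = Word2Vec(sentences, vector_size=200, window=5, min_count=5, sg=1)

    # The learned vectors can then serve as features downstream, e.g.:
    vector = model.wv["king"]                # a 200-dimensional numpy array
    similar = model.wv.most_similar("king")  # nearest neighbours by cosine similarity

    # Persist the word-vector file for use in other applications.
    model.wv.save_word2vec_format("vectors.txt", binary=False)

The original C tool exposes the same choices (architecture, vector dimensionality, context window, minimum word count) as command-line flags and likewise writes its result to a word-vector file.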

References in zbMATH (referenced in 199 articles)

Showing results 1 to 20 of 199, sorted by year (citations).


  1. Althar, Raghavendra Rao; Alahmadi, Abdulrahman; Samanta, Debabrata; Khan, Mohammad Zubair; Alahmadi, Ahmed H.: Mathematical foundations based statistical modeling of software source code for software system evolution (2022)
  2. Blier-Wong, Christopher; Cossette, Hélène; Lamontagne, Luc; Marceau, Etienne: Geographic ratemaking with spatial embeddings (2022)
  3. Chrupała, Grzegorz: Visually grounded models of spoken language: a survey of datasets, architectures and evaluation techniques (2022)
  4. Fagni, Tiziano; Cresci, Stefano: Fine-grained prediction of political leaning on social media with unsupervised deep learning (2022)
  5. Gfrerer, Helmut; Outrata, Jiří V.; Valdman, Jan: On the solution of contact problems with Tresca friction by the semismooth* Newton method (2022)
  6. Hettiarachchi, Hansi; Adedoyin-Olowe, Mariam; Bhogal, Jagdev; Gaber, Mohamed Medhat: Embed2detect: temporally clustered embedded words for event detection in social media (2022)
  7. Koshiyama, Adriano; Blumberg, Stefano B.; Firoozye, Nick; Treleaven, Philip; Flennerhag, Sebastian: QuantNet: transferring learning across trading strategies (2022)
  8. Koto, Fajri; Baldwin, Timothy; Lau, Jey Han: FFCI: a framework for interpretable automatic evaluation of summarization (2022)
  9. Liang, Bo; Wang, Lin; Wang, Xiaofan: OLMNE+FT: multiplex network embedding based on overlapping links (2022)
  10. Liu, Ruibo; Jia, Chenyan; Wei, Jason; Xu, Guangxuan; Vosoughi, Soroush: Quantifying and alleviating political bias in language models (2022)
  11. Li, Xuan; Lu, Lin; Chen, Lei: Identification of protein functions in mouse with a label space partition method (2022)
  12. Loureiro, Daniel; Mário Jorge, Alípio; Camacho-Collados, Jose: LMMS reloaded: transformer-based sense embeddings for disambiguation and beyond (2022)
  13. Lu, Jianfeng; Steinerberger, Stefan: Neural collapse under cross-entropy loss (2022)
  14. Ramgoolam, Sanjaye; Sadrzadeh, Mehrnoosh; Sword, Lewis: Gaussianity and typicality in matrix distributional semantics (2022)
  15. Ribeiro, Eugénio; Ribeiro, Ricardo; Martins de Matos, David: Automatic recognition of the general-purpose communicative functions defined by the ISO 24617-2 standard for dialog act annotation (2022)
  16. Škrlj, Blaž; Džeroski, Sašo; Lavrač, Nada; Petković, Matej: ReliefE: feature ranking in high-dimensional spaces via manifold embeddings (2022)
  17. Tuck, Jonathan; Boyd, Stephen: Eigen-stratified models (2022)
  18. Vrublevskyi, V.; Marchenko, O.: Development and analysis of a sentence semantics representation model (2022)
  19. Sharma, Abheesht; Chhablani, Gunjan; Pandey, Harshit; Patil, Rajaswa: DRIFT: A Toolkit for Diachronic Analysis of Scientific Literature (2021) arXiv
  20. Ataeva, O. M.; Serebryakov, V. A.; Tuchkova, N. P.: Using applied ontology to saturate semantic relations (2021)
