• Miniball

  • Referenced in 44 articles [sw05179]
  • whose rich structure has recently received considerable attention. One consequence is that Welzl’s algorithm...
• DeMat

  • Referenced in 41 articles [sw24853]
  • full functionality for experimentation. Special attention has been given to coding conventions in Hungarian prefix...
• GNMT

  • Referenced in 27 articles [sw26579]
  • encoder and 8 decoder layers using attention and residual connections. To improve parallelism and therefore decrease training time, our attention mechanism connects the bottom layer of the decoder...
• LSTM

  • Referenced in 37 articles [sw03373]
  • recognition, speech recognition, keyword spotting, music composition, attentive vision, protein analysis, stock market prediction...
• iLoc-Hum

  • Referenced in 36 articles [sw22433]
  • some special biological functions worthy of our attention. Based on the accumulation-label scale...
• weightedHypervolume

  • Referenced in 32 articles [sw31878]
  • approximations has received more and more attention in recent years. So far, the hypervolume indicator...
• AFRA

  • Referenced in 30 articles [sw02090]
  • attacks in argumentation is receiving increasing attention as a useful conceptual modelling tool...
• GADMM

  • Referenced in 29 articles [sw12640]
  • method of multipliers (ADMM) has received intensive attention from a broad spectrum of areas...
• XMark

  • Referenced in 28 articles [sw18908]
  • Benchmark Project. The database community increasingly devotes attention to practical management of large volumes...
• SMPSO

  • Referenced in 27 articles [sw23795]
  • Particle Swarm Optimization (PSO) has received increasing attention in the optimization research community since...
• DELSOL

  • Referenced in 25 articles [sw01077]
  • solve systems of such equations. Special attention is given to the program’s organisation...
• gbs

  • Referenced in 25 articles [sw06078]
  • skewed statistical distribution that has received great attention in recent decades. A generalized version...
• PINNsNTK

  • Referenced in 25 articles [sw42058]
  • neural networks (PINNs) have lately received great attention thanks to their flexibility in tackling...
• Grad-CAM

  • Referenced in 23 articles [sw35098]
  • show that even non-attention-based models can localize inputs. We devise...
• Concepts

  • Referenced in 22 articles [sw00151]
  • element method using the library with special attention to the handling of inconsistent meshes...
• QMT

  • Referenced in 22 articles [sw07137]
  • management is search: Even if we restrict attention to the tiny fragment of mathematics that...
• TRAVOS

  • Referenced in 20 articles [sw11985]
  • this latter case, we pay particular attention to handling the possibility that reputation information...
• Be-CoDiS

  • Referenced in 20 articles [sw15918]
  • primate disease that currently requires particular attention from the international health authorities...
• FDRSeg

  • Referenced in 19 articles [sw16730]
  • segments, have recently received great attention. In this paper, we propose a multiscale segmentation method...