• LeZi-update

  • Referenced in 18 articles [sw01541]
  • information-theoretic framework. Shannon’s entropy measure is identified as a basis for comparing user...
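
    The Shannon entropy invoked here as a comparison basis is the standard quantity; a minimal plug-in sketch (in Python, unrelated to LeZi-update's own implementation) over an observed symbol sequence:

        import math
        from collections import Counter

        def shannon_entropy(symbols):
            """Plug-in Shannon entropy, in bits, of a sequence of discrete symbols."""
            n = len(symbols)
            counts = Counter(symbols)
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        # e.g. a user's movement history encoded as cell identifiers
        print(shannon_entropy("aababcaab"))
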
  • RTransferEntropy

  • Referenced in 6 articles [sw29816]
  • Series with Shannon and Rényi Transfer Entropy. Measuring information flow between time series with Shannon...
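
    For reference, the Shannon transfer entropy from a driving series Y to a driven series X that such estimators target is, in Schreiber's formulation with history lengths k and l,

        T_{Y \to X} = \sum p\left(x_{t+1}, x_t^{(k)}, y_t^{(l)}\right)
            \log_2 \frac{p\left(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\right)}{p\left(x_{t+1} \mid x_t^{(k)}\right)}

    i.e. the information the source's past adds about the target's next value beyond what the target's own past already provides; the Rényi variant named in the title generalises this by using Rényi rather than Shannon entropies.
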
  • JIDT

  • Referenced in 9 articles [sw23550]
  • JIDT includes implementations: principally for the measures transfer entropy, mutual information, and their conditional variants...
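
    JIDT itself is a Java toolkit; purely as a language-neutral illustration of the simplest measure named here, a plug-in estimate of mutual information between two paired discrete series (not JIDT's API) can be written as:

        import math
        from collections import Counter

        def mutual_information(xs, ys):
            """Plug-in mutual information I(X;Y), in bits, from paired discrete samples."""
            n = len(xs)
            pxy = Counter(zip(xs, ys))
            px, py = Counter(xs), Counter(ys)
            return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                       for (x, y), c in pxy.items())

        print(mutual_information([0, 0, 1, 1], [0, 0, 1, 0]))
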
  • RNAz

  • Referenced in 3 articles [sw17118]
  • data and the usage of an entropy measure to represent sequence similarities. RNAz 2.0 shows...
  • SITS

  • Referenced in 2 articles [sw18739]
  • flow is also quantified through the entropy measure...
  • kappalab

  • Referenced in 53 articles [sw06086]
  • capacity (or non-additive measure, fuzzy measure) and integral manipulation on a finite setting ... method based on linear programming, a maximum entropy-like method based on variance minimization...
  • ITE

  • Referenced in 6 articles [sw12811]
  • many different variants of entropy, mutual information, divergence, association measures, cross quantities, and kernels...
  • acss

  • Referenced in 7 articles [sw10997]
  • traditional (but problematic) measures of complexity are also provided: entropy and change complexity...
  • tseriesEntropy

  • Referenced in 1 article [sw26028]
  • Tests for Time Series. Implements an Entropy measure of dependence based on the Bhattacharya-Hellinger...
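
    The dependence measure alluded to is built on the squared Hellinger (Bhattacharya) distance between the joint density and the product of the marginals; up to the normalisation chosen by the package it reads

        S_\rho = \frac{1}{2} \iint \left( \sqrt{f_{XY}(x,y)} - \sqrt{f_X(x)\, f_Y(y)} \right)^{2} \, dx \, dy,

    which equals 0 exactly when X and Y are independent and is bounded above by 1.
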
  • Inform

  • Referenced in 1 article [sw35834]
  • data. This includes classical information-theoretic measures (e.g. entropy, mutual information) and measures of information ... dynamics (e.g. active information storage, transfer entropy), but also several less common, yet powerful information ... effective information, information flow and integration measures. However, what makes Inform unique is that...
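
    Of the information-dynamics measures named in this snippet, active information storage has the compact definition (following Lizier et al.; transfer entropy is analogous but adds a source's history to the conditioning)

        A_X(k) = I\left(X_t^{(k)}; X_{t+1}\right)
               = \sum p\left(x_t^{(k)}, x_{t+1}\right) \log_2 \frac{p\left(x_{t+1} \mid x_t^{(k)}\right)}{p\left(x_{t+1}\right)},

    i.e. the information in a process's own length-k past that is in use in predicting its next value.
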
  • CDNA

  • Referenced in 9 articles [sw37103]
  • Significantly lower entropy estimates for natural DNA sequences. Its purpose is to measure the “predictability/compressibility ... Expectation Maximization). Our focus in Significantly lower entropy estimates for natural DNA sequences...
  • infotheory

  • Referenced in 2 articles [sw29391]
  • theory. It implements widely used measures such as entropy and mutual information, as well...
  • TSEntropies

  • Referenced in 2 articles [sw34014]
  • Pincus in “Approximate entropy as a measure of system complexity”, Proceedings of the National Academy ... America, 88, 2297-2301 (March 1991). Sample entropy was proposed by J. S. Richman...
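
    A direct, unoptimised transcription of Pincus's ApEn(m, r), shown only to make the quantity concrete (the package itself presumably ships faster implementations):

        import numpy as np

        def apen(x, m=2, r=0.2):
            """Approximate entropy ApEn(m, r): r is an absolute tolerance, often 0.2 * std(x)."""
            x = np.asarray(x, dtype=float)
            n = len(x)

            def phi(m):
                # all length-m templates, pairwise Chebyshev distances, self-matches included
                t = np.array([x[i:i + m] for i in range(n - m + 1)])
                d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
                c = np.mean(d <= r, axis=1)          # C_i^m(r)
                return np.mean(np.log(c))

            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(0)
        print(apen(rng.normal(size=200)))
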
  • IDTxl

  • Referenced in 5 articles [sw25603]
  • estimate the following measures: 1) For network inference: multivariate transfer entropy (TE)/Granger causality...
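
    A usage sketch reconstructed from memory of IDTxl's documented Python interface; treat every name below (Data, dim_order, MultivariateTE, analyse_network, cmi_estimator, max_lag_sources, min_lag_sources) as an assumption that may differ between versions:

        import numpy as np
        from idtxl.data import Data
        from idtxl.multivariate_te import MultivariateTE

        # toy data: 5 processes, 1000 samples, 10 replications
        data = Data(np.random.rand(5, 1000, 10), dim_order='psr')

        settings = {'cmi_estimator': 'JidtGaussianCMI',   # JIDT-backed Gaussian estimator
                    'max_lag_sources': 5,
                    'min_lag_sources': 1}

        results = MultivariateTE().analyse_network(settings=settings, data=data)
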
  • FSMRDE

  • Referenced in 9 articles [sw22250]
  • relative decision entropy-based feature selection approach. Rough set theory has been proven ... algorithm (called FSMRDE) in rough sets. To measure the significance of features in FSMRDE ... propose a new model of relative decision entropy, which is an extension of Shannon...
  • infotheo

  • Referenced in 3 articles [sw36715]
  • package implements various measures of information theory based on several entropy estimators...
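
    infotheo is an R package; purely to illustrate why several entropy estimators are needed, here is the naive plug-in estimate alongside the Miller-Madow bias correction, one of the standard alternatives (a Python sketch, not the package's interface):

        import math
        from collections import Counter

        def entropy_estimates(symbols):
            """Plug-in (maximum-likelihood) entropy and its Miller-Madow correction, in nats."""
            n = len(symbols)
            counts = Counter(symbols)
            h_ml = -sum((c / n) * math.log(c / n) for c in counts.values())
            # Miller-Madow adds (m - 1) / (2n), m = number of observed symbols
            h_mm = h_ml + (len(counts) - 1) / (2 * n)
            return h_ml, h_mm

        print(entropy_estimates("aabbbcddddd"))
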
  • MedianOfNinthers

  • Referenced in 5 articles [sw32520]
  • numbers, typical low-entropy artificial datasets, and real-world data. Measurements are open-sourced alongside...
  • entropart

  • Referenced in 1 article [sw24140]
  • package entropart: Entropy Partitioning to Measure Diversity. Measurement and partitioning of diversity, based on Tsallis...
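
    The Tsallis (HCDT) entropy of order q on which this diversity partitioning is based is, for species probabilities p_s,

        {}^{q}H = \frac{1 - \sum_s p_s^{\,q}}{q - 1},

    recovering species richness minus one at q = 0, Shannon entropy as q -> 1, and Simpson's index 1 - \sum_s p_s^2 at q = 2.
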
  • UnitaryPremodularCategoryData

  • Referenced in 1 article [sw41254]
  • three dimensions. The topological entanglement entropy is used to measure long-range quantum correlations ... obtain closed form expressions for topological entropy of (2+1)- and (3+1)-dimensional loop...
  • ImbTreeEntropy

  • Referenced in 2 articles [sw40352]
  • entropy functions, such as Rényi, Tsallis, Sharma–Mittal, Sharma–Taneja and Kapur, to measure ... existing algorithms that usually employ Shannon entropy and the concept of information gain. Additionally, ImbTreeEntropy...
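
    ImbTreeEntropy is an R package; the sketch below (Python, with hypothetical helper names) only illustrates the underlying idea of swapping the entropy functional inside the usual information-gain split criterion:

        import math
        from collections import Counter

        def shannon(probs):
            return -sum(p * math.log2(p) for p in probs if p > 0)

        def renyi(probs, alpha=2.0):
            # Renyi entropy of order alpha; tends to Shannon entropy as alpha -> 1
            return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

        def information_gain(parent, children, entropy=shannon):
            """Impurity decrease of a split, with a pluggable entropy function."""
            def h(labels):
                n = len(labels)
                return entropy([c / n for c in Counter(labels).values()])
            n = len(parent)
            return h(parent) - sum(len(part) / n * h(part) for part in children)

        parent = ['a'] * 6 + ['b'] * 4
        children = [['a'] * 5 + ['b'], ['a'] + ['b'] * 3]
        print(information_gain(parent, children, entropy=shannon))
        print(information_gain(parent, children, entropy=lambda p: renyi(p, alpha=2.0)))
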