• Title/Summary/Keyword: relative entropy


A RELATIVE RÉNYI OPERATOR ENTROPY

  • MIRAN JEONG; SEJONG KIM
    • Journal of applied mathematics & informatics / v.41 no.1 / pp.123-132 / 2023
  • We define an operator version of the relative Rényi entropy as a generalization of the relative von Neumann entropy, and establish its fundamental properties and bounds for its trace value. Moreover, we examine the behavior of the relative Rényi entropy under tensor products, and show its sub-additivity for density matrices.
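
A standard scalar form of the relative Rényi entropy is given below only as background; the paper's operator-valued definition is not reproduced in the abstract. The second display is its α → 1 limit, the relative von Neumann entropy.

```latex
% Standard scalar (Petz-type) relative Renyi entropy; shown for orientation,
% not the paper's operator-valued definition.
\[
  D_\alpha(\rho \,\|\, \sigma)
  = \frac{1}{\alpha - 1}\,
    \log \operatorname{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right),
  \qquad \alpha \in (0,1)\cup(1,\infty),
\]
\[
  \lim_{\alpha \to 1} D_\alpha(\rho \,\|\, \sigma)
  = \operatorname{Tr}\!\bigl[\rho(\log\rho - \log\sigma)\bigr]
  = S(\rho \,\|\, \sigma).
\]
```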

A View on Extension of Utility-Based on Links with Information Measures

  • Hoseinzadeh, A.R.; Borzadaran, G.R. Mohtashami; Yari, G.H.
    • Communications for Statistical Applications and Methods / v.16 no.5 / pp.813-820 / 2009
  • In this paper, we review the utility-based generalizations of the Shannon entropy and the Kullback-Leibler information measure, namely the U-entropy and the U-relative entropy introduced by Friedman et al. (2007). Then, we derive some relations between the U-relative entropy and other information measures based on a parametric family of utility functions.
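
For reference, the classical quantities being generalized are the Shannon entropy and the Kullback-Leibler relative entropy; the utility-based U-versions of Friedman et al. (2007) are not restated in the abstract, so only the classical baselines are shown.

```latex
% Classical baselines generalized by the U-entropy and U-relative entropy.
\[
  H(p) = -\sum_i p_i \log p_i,
  \qquad
  D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}.
\]
```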

RELATIVE SEQUENCE ENTROPY PAIRS FOR A MEASURE AND RELATIVE TOPOLOGICAL KRONECKER FACTOR

  • AHN YOUNG-HO; LEE JUNGSEOB; PARK KYEWON KOH
    • Journal of the Korean Mathematical Society / v.42 no.4 / pp.857-869 / 2005
  • Let $(X, B, \mu, T)$ be a dynamical system and $(Y, A, \nu, S)$ a factor. We investigate the relative sequence entropy of a partition of $X$ via the maximal compact extension of $(Y, A, \nu, S)$. We define relative sequence entropy pairs and, using them, find the relative topological $\mu$-Kronecker factor over $(Y, \nu)$, which is the maximal topological factor having relative discrete spectrum over $(Y, \nu)$. We also describe the topological Kronecker factor, which is the maximal factor having discrete spectrum for any invariant measure.
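
A commonly used formulation of sequence entropy (due to Kushnirenko) is given below only as background, since the paper's precise relative definition is not restated in the abstract; the relative version replaces the entropy of the refined partition by its conditional entropy given the factor σ-algebra.

```latex
% Sequence entropy along S = {s_1 < s_2 < ...}; the relative version uses the
% conditional entropy H_mu( . | A), with A the factor sigma-algebra.
\[
  h^{S}_{\mu}(T, P)
  = \limsup_{n \to \infty} \frac{1}{n}\,
    H_{\mu}\!\Bigl(\bigvee_{i=1}^{n} T^{-s_i} P\Bigr).
\]
```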

Characterization of Surface Roughness Using the Concept of Entropy in Machining (엔트로피 개념을 이용한 절삭가공에서 표면거칠기의 특성화)

  • 최기홍; 최기상
    • Transactions of the Korean Society of Mechanical Engineers / v.18 no.12 / pp.3118-3126 / 1994
  • This paper describes the use of the concept of (relative) entropy for effective characterization of the amplitude and frequency distributions of the surface profile formed in machining operations. For this purpose, a theoretical model of surface texture formation in turning is developed first. Then, the concept of (relative) entropy is reviewed and its effectiveness is examined based on simulation and experimental results. The results also suggest that, under random tool vibration, the effects of the geometrical factors on surface texture formation can be successfully decomposed and therefore identified by applying the concept of (relative) entropy.
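
A minimal sketch of the general idea (not the paper's exact procedure): the relative entropy between the binned amplitude distributions of a reference profile and a measured profile. The synthetic profiles, bin edges, and parameter values below are illustrative assumptions only.

```python
# Relative entropy between binned amplitude distributions of two surface
# profiles; synthetic data stands in for the model-based and measured profiles.
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p || q) in nats."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 2000)                        # scan position (assumed units)
ideal = 0.5 * np.abs(np.sin(8 * np.pi * x))             # ideal feed-mark profile
measured = ideal + 0.05 * rng.standard_normal(x.size)   # profile with random tool vibration

bins = np.linspace(-0.2, 0.8, 41)
p_hist, _ = np.histogram(ideal, bins=bins)
q_hist, _ = np.histogram(measured, bins=bins)

print("D(ideal || measured) =", relative_entropy(p_hist, q_hist))
```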

THE RELATIVE ENTROPY UNDER THE R-CGMY PROCESSES

  • Kwon, YongHoon; Lee, Younhee
    • Journal of the Chungcheong Mathematical Society / v.28 no.1 / pp.109-117 / 2015
  • We consider the relative entropy for two R-CGMY processes, which are CGMY processes with Y equal to 1, in order to choose an equivalent martingale measure (EMM) when the underlying asset of a derivative follows an R-CGMY process in the financial market. Since the R-CGMY process leads to an incomplete market, a proper technique is needed to select an EMM from among the many available. In this paper, we derive the closed-form expression of the relative entropy for R-CGMY processes.
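
The closed-form expression itself is not restated in the abstract; the standard ingredients it refers to are the relative entropy of an equivalent martingale measure Q with respect to the physical measure P, and the CGMY Lévy density (the R-CGMY case being Y = 1).

```latex
% Relative entropy of Q with respect to P (Q absolutely continuous w.r.t. P):
\[
  H(Q \,\|\, P) = \mathbb{E}_{Q}\!\left[\log \frac{dQ}{dP}\right].
\]
% CGMY Levy density; the R-CGMY process corresponds to Y = 1:
\[
  \nu(x)
  = C\,\frac{e^{-G|x|}}{|x|^{1+Y}}\,\mathbf{1}_{\{x<0\}}
  + C\,\frac{e^{-Mx}}{x^{1+Y}}\,\mathbf{1}_{\{x>0\}},
  \qquad C>0,\; G, M \ge 0,\; Y<2.
\]
```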

Deriving a New Divergence Measure from Extended Cross-Entropy Error Function

  • Oh, Sang-Hoon; Wakuya, Hiroshi; Park, Sun-Gyu; Noh, Hwang-Woo; Yoo, Jae-Soo; Min, Byung-Won; Oh, Yong-Sun
    • International Journal of Contents / v.11 no.2 / pp.57-62 / 2015
  • Relative entropy is a divergence measure between two probability density functions of a random variable. Assuming that the random variable has only a two-letter alphabet, the relative entropy reduces to the cross-entropy error function, which can accelerate the training convergence of multi-layer perceptron neural networks. The n-th order extension of the cross-entropy (nCE) error function exhibits improved performance in terms of learning convergence and generalization capability. In this paper, we derive a new divergence measure between two probability density functions from the nCE error function, and compare it with the relative entropy through three-dimensional plots.
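
The two-alphabet reduction mentioned in the abstract can be made explicit. With target distribution (t, 1 − t) and network output (y, 1 − y), the relative entropy splits into a term depending only on the target plus the cross-entropy error function; the n-th order extension (nCE) and the new divergence measure derived from it are not reproduced here.

```latex
% Binary relative entropy and its reduction to the cross-entropy error.
\[
  D\bigl((t, 1-t) \,\|\, (y, 1-y)\bigr)
  = t \log\frac{t}{y} + (1-t)\log\frac{1-t}{1-y},
\]
\[
  E = -\,t\log y - (1-t)\log(1-y)
  \quad\text{(the part that depends on the output } y\text{)}.
\]
```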

Mutual Information Analysis with Similarity Measure

  • Wang, Hong-Mei; Lee, Sang-Hyuk
    • International Journal of Fuzzy Logic and Intelligent Systems / v.10 no.3 / pp.218-223 / 2010
  • Relative mutual information is discussed and analyzed through fuzzy entropy and a similarity measure. The fuzzy relative mutual information measure (FRIM) plays an important role as a measure of the information shared between two fuzzy pattern vectors. FRIM is analyzed and explained through the similarity measure between two fuzzy sets. Furthermore, a comparison between the two measures is also carried out.
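
The fuzzy definitions used in the paper are not restated in the abstract; as the crisp baseline that FRIM generalizes, the classical mutual information between two random variables is shown below.

```latex
% Classical (crisp) mutual information, shown only as the baseline notion.
\[
  I(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
         = H(X) + H(Y) - H(X,Y).
\]
```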

Evaluation of Raingauge Networks in the Soyanggang Dam River Basin (소양강댐 유역의 강우관측망 적정성 평가)

  • Kim, Jae-Bok; Bae, Young-Dae; Park, Bong-Jin; Kim, Jae-Han
    • Proceedings of the Korea Water Resources Association Conference / 2007.05a / pp.178-182 / 2007
  • In this study, we evaluated the current raingauge network of the Soyanggang Dam basin by applying spatial-correlation analysis and entropy theory in order to recommend an optimized raingauge network. In the analysis, the correlation distance between raingauge stations is estimated and evaluated via the spatial-correlation method and the entropy method, and from these correlation distances the influence radius for each dataset and each method is assessed. The combined correlation and entropy analysis estimated a correlation distance of 25.546 km and an influence radius of 7.206 km, yielding a decrease in the network density (area per gauge) from 224.53 km² to 122.47 km², which satisfies the recommended minimum density of 250 km² per gauge in mountainous regions (WMO, 1994), and an increase in basin coverage from 59.3% to 86.8%. In the elevation analysis, the relative evaluation ratio increased from 0.59 (current) to 0.92 (optimized), a clear improvement.
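
A minimal sketch of the entropy side of such an analysis (not the authors' exact procedure): the transinformation (mutual information) between two stations' rainfall records, estimated from a two-dimensional histogram. The synthetic series, bin count, and station names below are illustrative assumptions.

```python
# Pairwise transinformation between two rainfall series via a 2-D histogram;
# synthetic gamma-distributed rainfall stands in for observed records.
import numpy as np

def transinformation(x, y, bins=10):
    """Mutual information (nats) between two series, estimated by binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(1)
station_a = rng.gamma(shape=0.8, scale=5.0, size=365)         # synthetic daily rainfall (mm)
station_b = 0.7 * station_a + rng.gamma(0.8, 2.0, size=365)   # correlated nearby station

print("T(A; B) =", transinformation(station_a, station_b), "nats")
```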


ALGEBRAIC ENTROPIES OF NATURAL NUMBERS WITH ONE OR TWO PRIME FACTORS

  • JEONG, SEUNGPIL; KIM, KYONG HOON; KIM, GWANGIL
    • The Pure and Applied Mathematics / v.23 no.3 / pp.205-221 / 2016
  • We formulate the additive entropy of a natural number in terms of the additive partition function, and show that its multiplicative entropy is directly related to the multiplicative partition function. We give a practical formula for the multiplicative entropy of natural numbers with two prime factors. We use this formula to analyze the comparative density of additive and multiplicative entropy, prove that this density converges to zero as the number tends to infinity, and empirically observe this asymptotic behavior.
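
A minimal sketch of the two counting functions the abstract refers to (the entropy definitions built on them are not reproduced here): the additive partition function p(n) and the number of multiplicative partitions of n, i.e., unordered factorizations into factors greater than 1. Function names and the sample values of n are illustrative.

```python
# Additive partition function p(n) and the multiplicative partition count of n.
from functools import lru_cache

def additive_partitions(n):
    """p(n): number of ways to write n as a sum of positive integers."""
    table = [1] + [0] * n
    for part in range(1, n + 1):
        for total in range(part, n + 1):
            table[total] += table[total - part]
    return table[n]

@lru_cache(maxsize=None)
def multiplicative_partitions(n, max_factor=None):
    """Number of unordered factorizations of n into factors >= 2."""
    if max_factor is None:
        max_factor = n
    if n == 1:
        return 1
    return sum(
        multiplicative_partitions(n // d, d)
        for d in range(2, max_factor + 1)
        if n % d == 0
    )

for n in (12, 30, 64):
    print(n, additive_partitions(n), multiplicative_partitions(n))
```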