Selected Papers from 5th International Electronic Conference on Entropy and Its Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (30 April 2020) | Viewed by 31398

Special Issue Editor


Prof. Dr. Geert Verdoolaege
Guest Editor
Research Unit Nuclear Fusion, Department of Applied Physics, Ghent University, Sint-Pietersnieuwstraat 41, B-9000 Ghent, Belgium
Interests: probability theory; Bayesian inference; machine learning; information geometry; differential geometry; nuclear fusion; plasma physics; plasma turbulence; continuum mechanics; statistical mechanics

Special Issue Information

Dear Colleagues,

The 5th International Electronic Conference on Entropy and Its Applications (ECEA-5) is organized by the journal Entropy from 18 to 30 November 2019 and hosted on the MDPI Sciforum platform. A broad range of topics is discussed, concerning theory and applications related to the interdisciplinary concept of entropy.
All conference contributors, to both oral and poster sessions, are encouraged to submit a full paper related to their contribution to this Special Issue of Entropy, with a 20% discount on the APC. The Special Issue will maintain the topical subdivision of the conference into the following six fields:

  • Thermodynamics and Statistical Physics
  • Information Theory, Probability, Statistics and Artificial Intelligence
  • Quantum Information and Quantum Computing
  • Complex Systems
  • Biological Systems
  • Astrophysics, Cosmology and Black Holes

For more information about the topics, please visit the conference website: https://sciforum.net/conference/ecea-5.
We are looking forward to receiving your manuscript.

Prof. Dr. Geert Verdoolaege
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

16 pages, 3213 KiB  
Article
Skyrmions and Spin Waves in Magneto–Ferroelectric Superlattices
by Ildus F. Sharafullin and Hung T. Diep
Entropy 2020, 22(8), 862; https://doi.org/10.3390/e22080862 - 4 Aug 2020
Cited by 3 | Viewed by 3612
Abstract
We present in this paper the effects of the Dzyaloshinskii–Moriya (DM) magneto-electric coupling between the ferroelectric and magnetic interface atomic layers in a superlattice formed by alternating magnetic and ferroelectric films. We consider two cases, in which the magnetic and ferroelectric films have either the simple cubic lattice or the triangular lattice. In both cases, the magnetic films have Heisenberg spins interacting with each other via an exchange J and a DM interaction with the ferroelectric interface; electrical polarizations of ±1 are assumed for the ferroelectric films. We determine the ground-state (GS) spin configuration in the magnetic film and study the phase transition in each case. In the simple cubic lattice case, the zero-field GS is periodically non-collinear (helical structure), and in an applied field H perpendicular to the layers it shows the existence of skyrmions at the interface. Using the Green's function method, we study the spin waves (SW) excited in a monolayer and in a bilayer sandwiched between ferroelectric films in zero field, and show that the DM interaction strongly affects the long-wavelength SW mode. We also calculate the magnetization at low temperatures. We next use Monte Carlo simulations to calculate various physical quantities at finite temperatures, such as the critical temperature, the layer magnetization and the layer polarization, as functions of the magneto-electric DM coupling and the applied magnetic field, and we study the transition to the disordered phase. In the case of the triangular lattice, we show the formation of skyrmions even in zero field, and of a skyrmion crystal in an applied field, when the interface coupling between the ferroelectric and ferromagnetic films is sufficiently strong; the skyrmion crystal is stable over a large region of the external magnetic field. The phase transition is also studied.
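The finite-temperature Monte Carlo procedure mentioned in the abstract can be illustrated with a minimal Metropolis sketch for a classical Heisenberg ferromagnet on a small square lattice. This is a generic toy model, not the authors' code: the lattice size, temperature and exchange J = 1 are illustrative, and the DM and magneto-electric couplings of the paper are omitted.

```python
import numpy as np

def random_spins(L, rng):
    """L x L lattice of random unit Heisenberg spins (3 components each)."""
    s = rng.normal(size=(L, L, 3))
    return s / np.linalg.norm(s, axis=2, keepdims=True)

def energy(spins, J=1.0):
    """Nearest-neighbor exchange energy E = -J * sum_<ij> S_i . S_j (periodic)."""
    e = 0.0
    for axis in (0, 1):
        e -= J * np.sum(spins * np.roll(spins, 1, axis=axis))
    return e

def metropolis_sweep(spins, T, rng, J=1.0):
    """One sweep of single-spin Metropolis updates at temperature T."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        new = rng.normal(size=3)
        new /= np.linalg.norm(new)
        # local field from the four nearest neighbors
        h = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
             + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = -J * np.dot(new - spins[i, j], h)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] = new
    return spins

rng = np.random.default_rng(0)
spins = random_spins(8, rng)
e0 = energy(spins)
for _ in range(200):
    metropolis_sweep(spins, T=0.05, rng=rng)
m = np.linalg.norm(spins.mean(axis=(0, 1)))  # magnetization per spin
```

At low temperature the energy relaxes well below that of the random initial configuration and a net magnetization develops; the paper's layer magnetizations and polarizations come from the same kind of sampling, with the full superlattice Hamiltonian.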
Figures

Figure 1. (a) Magneto-ferroelectric superlattice composed of alternating ferroelectric and ferromagnetic films, each with n atomic layers; (b) interfacial coupling between a polarization P and 5 spins in a Dzyaloshinskii–Moriya (DM) interaction; (c) positions of the spins in the xy plane and of the non-magnetic oxygen ion defining the DM vector (see text).
Figure 2. Ground-state (GS) spin configurations for (a) J^mf = −0.45 and (b) J^mf = −1.2, with H = 0; (c) schematic zoom of the angles between nearest neighbors (NN). See text for comments.
Figure 3. (a) 3D view of the GS configuration of the interface for moderate frustration J^2m = J^2f = −0.2; (b) 3D view of the second magnetic layer; (c) zoom of a Bloch-type skyrmion on the interface layer (red: up spins; light blue: down spins; other colors: intermediate orientations); (d) z-components of the spins across the skyrmion shown in (c). Other parameters: J^m = J^f = 1, J^mf = −1.25, H = 0.25.
Figure 4. Spin-wave energy E(k) versus k (k ≡ k_x = k_z) for (a) θ = 0.3 rad and (b) θ = 1 rad, for a monolayer at T = 0. See text for comments.
Figure 5. (a) Energy and (b) order parameter of the magnetic films versus temperature T for (J^2m = J^2f = −0.4) (red), (J^2m = −0.4, J^2f = 0) (black) and (J^2m = 0, J^2f = −0.4) (blue); in (a) the black curve coincides with the red one. Other parameters: J^mf = −1.25, H = 0.25.
Figure 6. Triangular lattice: ground-state configuration. For clarity, the angle between two nearest neighbors is taken equal to 120 degrees, which makes the configuration very similar to that of the antiferromagnetic triangular lattice; the next-nearest-neighbor (NNN) spins are parallel. See text for the formula of the general angle.
Figure 7. 3D view of the GS configuration of the interface for H = 0, J^m = J^f = 1, with (a) J^mf = −0.2 and (b) J^mf = −0.5.
Figure 8. 3D view of the GS configuration of the interface for H = 0, J^m = J^f = 1, with (a) J^mf = −0.75 and (b) J^mf = −0.85.
Figure 9. Number of skyrmions n on the interface magnetic layer versus the interface magneto-electric coupling J^mf in zero field (H = 0, J^m = J^f = 1); the skyrmion phase is indicated by S.
Figure 10. 3D views of the GS configuration at the interface for J^m = J^f = 1 with (a) H = 0.025, J^mf = −1.0 and (b) H = 0.5, J^mf = −1.75. Due to the representation scale the skyrmions may not appear circular, but they are.
Figure 11. (a) Energy and (b) order parameter of the magnetic film on the triangular lattice versus temperature T for J^m = J^f = 1 with J^mf = −1.0 (red, coinciding with the green curve), J^mf = −1.75 (green) and J^mf = −4.25 (blue). See text for comments.
34 pages, 1431 KiB  
Article
Information Theory in Computational Biology: Where We Stand Today
by Pritam Chanda, Eduardo Costa, Jie Hu, Shravan Sukumar, John Van Hemert and Rasna Walia
Entropy 2020, 22(6), 627; https://doi.org/10.3390/e22060627 - 6 Jun 2020
Cited by 34 | Viewed by 11051
Abstract
“A Mathematical Theory of Communication” was published in 1948 by Claude Shannon to address problems in the field of data compression and communication over (noisy) channels. Since then, the concepts and ideas developed in Shannon’s work have formed the basis of information theory, a cornerstone of statistical learning and inference that plays a key role in disciplines such as physics and thermodynamics, probability and statistics, the computational sciences and the biological sciences. In this article, we review the basic information-theoretic concepts and describe their key applications in several major areas of research in computational biology: gene expression and transcriptomics, alignment-free sequence comparison, sequencing and error correction, genome-wide disease-gene association mapping, metabolic networks and metabolomics, and protein sequence, structure and interaction analysis.
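The central quantity behind most of the applications this review surveys is the mutual information. As a toy illustration (not code from the paper), it can be computed directly from a discrete joint distribution:

```python
import numpy as np

def mutual_information(joint):
    """MI in bits: I(X;Y) = sum_xy p(x,y) * log2[ p(x,y) / (p(x) p(y)) ]."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = joint > 0                          # 0 * log 0 = 0 by convention
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px * py)[nz])))

# Independent variables: the joint factorizes, so MI = 0.
indep = np.outer([0.5, 0.5], [0.25, 0.75])
# Perfectly coupled fair binary variables: MI = 1 bit.
coupled = np.array([[0.5, 0.0], [0.0, 0.5]])
```

In the applications reviewed (e.g., gene-regulatory network inference), the joint table would be estimated from expression data rather than specified by hand.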
Figures

Figure 1. Schematic of the gene-regulatory network (GRN) reconstruction problem, where undirected edges are inferred using information-theoretic methods.
Figure 2. Schematic overview of alignment-free sequence comparison.
Figure 3. Information-theoretic, communication-model-inspired representation of error correction in genomic sequencing, based on the work of Chen et al. [65].
Figure 4. Overview of the genome-disease association problem and example information-theoretic metrics for identifying GxG and GxE interactions; a binary phenotype and environmental variable are shown for simplicity.
Figure 5. Simplified illustration of how mutual information (MI) between two columns of multiple sequence alignments can capture links between two proteins; high MI is associated with interactions in their structures.
Figure 6. An example of metabolic networks. Nodes represent metabolites that are transformed into each other through chemical reactions catalyzed by metabolic enzymes (red); metabolic pathways are linked series of reactions that collectively perform a biological function. Network reconstruction is typically done by reverse engineering of metabolome data, where the information-theoretic methods ARACNE and CLR have been applied; reaction fluxes (depicted by the width of the reaction arrows) can be predicted using FBA. Also shown are examples of downstream information-theoretic analysis: metabolic diversity characterization, structure-dynamics relationship analysis, single-cell-level simulation with FBA and nutrient information sensing.
Figure 7. Biology presents many optimization problems. Shown is a hypothetical search space that could represent anything from an organism's traits to experiment designs to visualization parameters; the vertical axis would then be fitness/MEP, experiment value and network edge crossovers, respectively.
Figure 8. Independent component analysis (ICA) can compute latent factors in data like principal component analysis (PCA) but, as shown in this simulated dataset, is not limited to orthogonal mixtures; the ICA mixing vectors align to the data better than the PCA rotation vectors.
23 pages, 2083 KiB  
Article
Renormalization Analysis of Topic Models
by Sergei Koltcov and Vera Ignatenko
Entropy 2020, 22(5), 556; https://doi.org/10.3390/e22050556 - 16 May 2020
Cited by 6 | Viewed by 3171
Abstract
In practice, building a machine learning model of big data requires tuning model parameters, a process that typically involves an extremely time-consuming and computationally expensive grid search. The theory of statistical physics, however, provides techniques for optimizing this process. This paper shows that a function of the output of topic modeling demonstrates self-similar behavior under variation of the number of clusters, which allows the use of a renormalization technique. Combining the renormalization procedure with the Renyi entropy approach allows a quick search for the optimal number of topics. The renormalization procedure is developed here for probabilistic latent semantic analysis (pLSA), for the latent Dirichlet allocation model with a variational expectation-maximization algorithm (VLDA), and for the latent Dirichlet allocation model with a granulated Gibbs sampling procedure (GLDA). Experiments were conducted on two test datasets with a known number of topics, in two different languages, and on one unlabeled test dataset with an unknown number of topics. The paper shows that the renormalization procedure finds an approximation of the optimal number of topics at least 30 times faster than grid search, without significant loss of quality.
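The Renyi entropy at the core of the approach has the standard form S_q = ln(Σ_i p_i^q) / (1 − q). A minimal sketch of the quantity itself (illustrative only; the authors evaluate it on functions of the topic-model output as the number of topics varies, which is not reproduced here):

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy S_q = ln(sum_i p_i^q) / (1 - q), in nats."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalize to a probability vector
    if np.isclose(q, 1.0):               # q -> 1 recovers Shannon entropy
        nz = p > 0
        return float(-np.sum(p[nz] * np.log(p[nz])))
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

uniform = np.ones(16) / 16               # S_q = ln 16 for every order q
peaked = np.array([0.97, 0.01, 0.01, 0.01])  # concentrated: low entropy
```

For the uniform distribution the Renyi entropy equals ln n at every order q, while concentrated distributions score lower; it is this sensitivity to how sharply the topic-word probabilities concentrate that the selection procedure exploits.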
Figures

Figure 1. Partition function in bi-logarithmic coordinates (VLDA). Black: Lenta dataset; red: 20 Newsgroups dataset.
Figure 2. Renyi entropy curves (VLDA). Black: successive TM; other colors: renormalization with randomly selected topics for merging; Lenta dataset.
Figure 3. Renyi entropy curves (VLDA). Black: successive TM; other colors: renormalization with randomly selected topics for merging; 20 Newsgroups dataset.
Figure 4. Renyi entropy curves (VLDA) for both datasets. Black: successive TM; red: renormalization with the minimum local entropy principle of merging. Solid: 20 Newsgroups dataset; dashed: Lenta dataset.
Figure 5. Renyi entropy curves (VLDA). Black: successive TM; red: renormalization with the minimum KL divergence principle of merging. Solid: 20 Newsgroups dataset; dashed: Lenta dataset.
Figure 6. Partition function in bi-logarithmic coordinates (GLDA). Black: Lenta dataset; red: 20 Newsgroups dataset.
Figure 7. Renyi entropy curves (GLDA). Black: successive TM; other colors: renormalization with randomly chosen topics for merging; Lenta dataset.
Figure 8. Renyi entropy curves (GLDA). Black: successive TM; other colors: renormalization with randomly chosen topics for merging; 20 Newsgroups dataset.
Figure 9. Renyi entropy curves (GLDA). Black: successive TM; red: renormalization with the minimum local entropy principle of merging. Solid: 20 Newsgroups dataset; dashed: Lenta dataset.
Figure 10. Renyi entropy curves (GLDA). Black: successive TM; red: renormalization with the minimum KL divergence principle of merging. Solid: 20 Newsgroups dataset; dashed: Lenta dataset.
Figure 11. Partition function in bi-logarithmic coordinates (pLSA). Black: Lenta dataset; red: 20 Newsgroups dataset.
Figure 12. Renyi entropy curves (pLSA). Black: successive TM; other colors: renormalization with random merging of topics; Lenta dataset.
Figure 13. Renyi entropy curves (pLSA). Black: successive TM; other colors: renormalization with random merging of topics; 20 Newsgroups dataset.
Figure 14. Renyi entropy curves (pLSA). Black: successive TM; red: renormalization with the minimum local entropy principle of merging. Solid: 20 Newsgroups dataset; dashed: Lenta dataset.
Figure 15. Renyi entropy curves (pLSA). Black: successive TM; red: renormalization with the minimum KL divergence principle of merging. Solid: 20 Newsgroups dataset; dashed: Lenta dataset.
Figure 16. Renyi entropy curves (successive TM). Blue: VLDA; orange: pLSA; red: GLDA; green: LDA with Gibbs sampling.
Figure 17. Renyi entropy curves (renormalization with the minimum local entropy principle of merging). Blue: VLDA; orange: pLSA; red: GLDA; green: LDA with Gibbs sampling.
17 pages, 4256 KiB  
Article
A Novel Counterfeit Feature Extraction Technique for Exposing Face-Swap Images Based on Deep Learning and Error Level Analysis
by Weiguo Zhang, Chenggang Zhao and Yuxing Li
Entropy 2020, 22(2), 249; https://doi.org/10.3390/e22020249 - 21 Feb 2020
Cited by 40 | Viewed by 6813
Abstract
The quality and efficiency of generating face-swap images have been markedly strengthened by deep learning. For instance, the face-swap manipulations produced by DeepFake are so realistic that it is difficult to distinguish authenticity through automatic or manual detection. To improve the efficiency of distinguishing face-swap images generated by DeepFake from real facial images, a novel counterfeit feature extraction technique was developed based on deep learning and error level analysis (ELA); the technique is related to entropy and information theory, for example through the cross-entropy loss function in the final softmax layer. The DeepFake algorithm can only generate faces of limited resolution, which results in two different image compression ratios between the fake face area (the foreground) and the original area (the background), leaving distinctive counterfeit traces. Through the ELA method, we can detect whether different image compression ratios are present. A convolutional neural network (CNN), one of the representative technologies of deep learning, can then extract the counterfeit feature and detect whether an image is fake. Experiments show that the training efficiency of the CNN model can be significantly improved by the ELA method. In addition, the proposed technique accurately extracts the counterfeit feature and therefore outperforms direct detection methods in simplicity and efficiency: without loss of accuracy, the amount of computation is significantly reduced (the required floating-point computing power is reduced by more than 90%).
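The link to information theory mentioned in the abstract runs through the cross-entropy loss in the final softmax layer of the classifier. A minimal, framework-free sketch of that loss for a two-class (real vs. fake) decision (generic, not the paper's network):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, label):
    """H(p, q) = -log q(label) for a one-hot target: the quantity a
    real-vs-fake classifier minimizes during training."""
    return float(-np.log(softmax(logits)[label]))

# Confidently correct prediction -> small loss; confidently wrong -> large loss.
confident = cross_entropy(np.array([4.0, -4.0]), 0)
wrong = cross_entropy(np.array([4.0, -4.0]), 1)
```

Minimizing this loss over labeled ELA images is what drives the CNN to pick up the compression-ratio discrepancy between the swapped face and its background.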
Figures

Figure 1. Outline of the proposed method.
Figure 2. Overview of the DeepFake principle.
Figure 3. Flow chart of generating negative examples.
Figure 4. Samples of the region of interest (ROI) area and processing results.
Figure 5. Samples of the original and tampered images and the error level analysis results.
Figure 6. The trained CNN network architecture.
Figure 7. Samples from the MUCT database.
Figure 8. Samples of dataset preprocessing and error level analysis: (a–g) samples of the MUCT dataset; (1–4) the corresponding processing and ELA results for the original images.
Figure 9. Experimental results: (a) accuracy and loss-function curves (horizontal axis: number of training cycles; vertical axis: loss value and accuracy, respectively); (b) confusion matrix of the verification data.
12 pages, 3955 KiB  
Article
Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques
by Andrea Murari, Riccardo Rossi, Michele Lungaroni, Pasquale Gaudio and Michela Gelfusa
Entropy 2020, 22(2), 141; https://doi.org/10.3390/e22020141 - 24 Jan 2020
Cited by 5 | Viewed by 2598
Abstract
The increasingly sophisticated investigation of complex systems requires more robust estimates of the correlations between the measured quantities. The traditional Pearson correlation coefficient is easy to calculate but is sensitive only to linear correlations. The total influence between quantities is therefore often expressed in terms of the mutual information, which also takes nonlinear effects into account but is not normalized. To compare data from different experiments, the information quality ratio is therefore, in many cases, easier to interpret. On the other hand, both the mutual information and the information quality ratio are always positive and therefore cannot provide information about the sign of the influence between quantities. Moreover, they require an accurate determination of the probability distribution functions of the variables involved. As the quality and amount of available data are not always sufficient to grant an accurate estimation of the probability distribution functions, we investigated whether neural computational tools can help and complement the aforementioned indicators. Specific encoders and autoencoders have been developed for the task of determining the total correlation between quantities related by a functional dependence, including information about the sign of their mutual influence. Both their accuracy and computational efficiency have been addressed in detail, with extensive numerical tests using synthetic data, and a careful analysis of the robustness against noise has been performed. The neural computational tools typically outperform the traditional indicators in practically every respect.
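The contrast between the Pearson coefficient and the information quality ratio (IQR, mutual information divided by the joint entropy) can be seen in a quick numerical sketch. This is an illustration of the two traditional indicators only, not of the paper's autoencoder method; the histogram binning and the test function y = x² are arbitrary choices.

```python
import numpy as np

def pearson(x, y):
    """Standard linear correlation coefficient."""
    return float(np.corrcoef(x, y)[0, 1])

def iqr(x, y, bins=8):
    """Information quality ratio I(X;Y) / H(X,Y), estimated from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz]))
    h_xy = -np.sum(pxy[nz] * np.log(pxy[nz]))
    return float(mi / h_xy)

x = np.linspace(-1.0, 1.0, 201)
y = x ** 2        # symmetric, purely nonlinear dependence
r = pearson(x, y)  # ~0: Pearson misses the dependence entirely
q = iqr(x, y)      # clearly positive: the IQR detects it
```

This is exactly the limitation the abstract points to: the IQR registers the functional dependence that Pearson misses, but it is always positive, so it cannot tell that, say, a cubic dependence is decreasing, which is what motivates the sign-aware autoencoder approach.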
Figures:
Figure 1. General topology of autoencoders.
Figure 2. Architecture of the autoencoders used in the present work, with the matrices that are multiplied to give the matrix W. The case shown is particularized for a latent space of dimension 2.
Figure 3. Comparison of correlation coefficients for the two examples described in the text. Case 1: no noise. Case 2: Gaussian noise with a standard deviation of 20% of the actual signal standard deviation.
Figure 4. Trend of the errors in the reconstruction of the input data versus the dimensionality of the intermediate layer of the autoencoder.
Figure 5. (a) The Pearson correlation coefficient (PCC) for a set of 10 variables correlated as specified in the text. (b) The correlation coefficients obtained with the proposed autoencoder method.
Figure 6. Trend of the off-diagonal term of the matrix Λ and of the PCC versus the percentage of additive Gaussian noise. The noise intensity is calculated as the standard deviation of the noise divided by the standard deviation of the variable amplitude.
Figure 7. Top: two linearly dependent variables (left) and the corresponding local correlation coefficient ρ_int (right). Middle: two variables with a quadratic dependence (left) and ρ_int (right). Bottom: two variables with a negative cubic dependence (left) and ρ_int (right). The integral values of the correlation coefficient and of the monotonicity are reported in the insets.
Figure 8. Comparison of ρ_int and the IQR for the negative cubic dependence (third case of Figure 7). The x-axis reports the number of bins; N is the number of generated points used to calculate the indicators.
Figure 9. Effects of the noise amplitude on ρ_int for various choices of the number of bins: (a) the investigated dependence is y = x²; (b) the investigated dependence is y = x³. The independent variable x varies in the range [−10, 10] and the number of points is 10⁵. The noise intensity is calculated as the standard deviation of the noise divided by the standard deviation of the variable amplitude.
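The noise convention used in the figures above (noise intensity as the ratio of the noise standard deviation to the signal standard deviation) can be illustrated with a short numerical sketch. This is a hypothetical demonstration, not data from the paper: it shows how the PCC of a perfectly linear pair degrades as additive Gaussian noise of increasing relative intensity is applied.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = 2.0 * x                      # perfectly linearly correlated pair: PCC = 1

pcc_at = {}
for intensity in (0.0, 0.2, 0.5, 1.0):
    # Noise intensity = sigma_noise / sigma_signal, as in the figure captions.
    noise = intensity * np.std(y) * rng.normal(size=y.size)
    pcc_at[intensity] = np.corrcoef(x, y + noise)[0, 1]
    print(f"noise intensity {intensity:3.1f} -> PCC = {pcc_at[intensity]:+.3f}")
```

For independent additive noise the expected value is PCC = 1/sqrt(1 + intensity²), so at unit noise intensity the PCC has already dropped to about 0.71.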
17 pages, 9175 KiB  
Article
A Novel Improved Feature Extraction Technique for Ship-Radiated Noise Based on IITD and MDE
by Zhaoxi Li, Yaan Li, Kai Zhang and Jianli Guo
Entropy 2019, 21(12), 1215; https://doi.org/10.3390/e21121215 - 12 Dec 2019
Cited by 25 | Viewed by 3100
Abstract
Ship-radiated noise signals carry many nonlinear, non-Gaussian, and nonstationary characteristics, which reflect important aspects of ship performance. This paper proposes a novel feature extraction technique for ship-radiated noise based on improved intrinsic time-scale decomposition (IITD) and multiscale dispersion entropy (MDE), named IITD-MDE. First, IITD is applied to decompose the ship-radiated noise signal into a series of intrinsic scale components (ISCs). Then, the ISCs carrying the main information are selected through correlation analysis, and their MDE values are calculated as feature vectors. Finally, the feature vectors are fed into a support vector machine (SVM) for ship classification. The experimental results indicate that the proposed technique reaches a recognition rate of 86%. Compared with other feature extraction methods, the proposed method thus provides an effective new solution for classifying different types of ships.
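The MDE step of the pipeline described above can be sketched in a few lines. This is a minimal numpy illustration of the standard multiscale dispersion entropy recipe (coarse-graining by window averaging, normal-CDF mapping to c classes, Shannon entropy of the dispersion patterns), not the authors' code: the parameter choices (m = 2, c = 6, five scales) and the white-noise test signal are illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt

def coarse_grain(x, tau):
    """Multiscale coarse-graining: average non-overlapping windows of length tau."""
    n = len(x) // tau
    return x[: n * tau].reshape(n, tau).mean(axis=1)

def dispersion_entropy(x, m=2, c=6, d=1):
    """Dispersion entropy: map samples to c classes via the normal CDF, form
    embedding vectors of length m with delay d, and take the Shannon entropy
    (in nats) of the dispersion-pattern distribution."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    y = np.array([0.5 * (1.0 + erf((v - mu) / (sigma * sqrt(2.0)))) for v in x])
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)      # classes 1..c
    n = len(z) - (m - 1) * d
    patterns = np.stack([z[i * d : i * d + n] for i in range(m)], axis=1)
    codes = patterns @ (c ** np.arange(m))                    # unique code per pattern
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def mde(x, m=2, c=6, max_scale=5):
    """Multiscale dispersion entropy: DE of the coarse-grained series at each scale."""
    return [dispersion_entropy(coarse_grain(x, tau), m=m, c=c) for tau in range(1, max_scale + 1)]

rng = np.random.default_rng(0)
white = rng.normal(size=20_000)
print(mde(white))   # white noise: DE stays near its maximum ln(c^m) across scales
```

The resulting per-scale entropy vector is what would serve as the feature vector for a classifier such as an SVM in the scheme of the paper.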
Figures:
Figure 1. Comparison of the interpolation methods: (a) linear interpolation, (b) cubic spline interpolation, and (c) Akima interpolation.
Figure 2. An intrinsic scale component (ISC) satisfying the defining conditions.
Figure 3. The coarse-graining process of MDE.
Figure 4. The time- and frequency-domain waveforms of x(t).
Figure 5. The decomposition results.
Figure 6. The time waveforms of two simulated signals: (a) Gaussian white noise, (b) 1/f noise.
Figure 7. The multiscale entropy values of Gaussian white noise and 1/f noise: (a) MSE, (b) MPE, and (c) MDE.
Figure 8. Flowchart of the feature extraction of ship-radiated noise based on IITD-MDE.
Figure 9. Five types of ship signals.
Figure 10. Spectrum analysis.
Figure 11. Time domain of the decomposition results by IITD.
Figure 12. Spectra of the decomposition results by IITD.
Figure 13. Correlation coefficients of the ISCs.
Figure 14. The distribution of the four methods.
Figure 15. Error bar graphs of the methods: (a) IITD-MDE and (b) ITD-MDE.
Figure 16. Classification results.