![Information entropy of a Bernoulli trial](http://upload.wikimedia.org/wikipedia/commons/thumb/2/22/Binary_entropy_plot.svg/300px-Binary_entropy_plot.svg.png)

Information entropy of a Bernoulli trial X. If X can assume the values 0 and 1, the entropy of X is defined as H(X) = -Pr(X=0) log2 Pr(X=0) - Pr(X=1) log2 Pr(X=1). It has value 0 if Pr(X=0)=1 or Pr(X=1)=1, and it reaches its maximum value of 1 when Pr(X=0)=Pr(X=1)=1/2.

The image was created in the following steps. First, a DVI version was produced from this LaTeX/PSTricks source:

```latex
%Plot of information entropy of Bernoulli variable
%
%latex binary_entropy_plot; dvips binary_entropy_plot
%open .ps file in gimp, choose strong antialias in both text and graphics,
%resolution 500, color mode, crop, scale to 45%, save as .png
\documentclass[12pt]{article}
\usepackage{pst-plot}
\begin{document}
\psset{unit=4cm}
\begin{pspicture}(0,0)(1.01,1)
\psgrid[gridlabels=0pt,gridcolor=lightgray,subgriddiv=10,subgridcolor=lightgray](0,0)(0,0)(1,1)
\newrgbcolor{myblue}{0 0 0.7}
\psaxes[arrows=->,arrowsize=2pt 4,Dx=0.5,Dy=0.5](0,0)(0,0)(1.1,1.1)
\psplot[plotstyle=curve,plotpoints=100,linewidth=1.8pt,linecolor=myblue]{0.0001}{0.9999}{-1 x x log 2 log div mul 1 x sub 1 x sub log 2 log div mul add mul}
\rput(0.5,-0.22){$\Pr(X=1)$}
\rput{90}(-0.28,0.5){$H(X)$}
\end{pspicture}
\end{document}
```

Compiling this with latex yields the DVI, which was converted to PostScript with dvips and then to SVG with ps2svg.sh; the result needed some post-processing in Inkscape.
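The entropy definition in the caption can be checked numerically, and the `\psplot` body above is PostScript reverse-Polish notation for the same function (PostScript's `log` is base 10, hence the `2 log div` to convert to log2). A minimal sketch in Python; the function names are mine, not part of the file:

```python
import math

# Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), as in the caption.
def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0  # limit p*log2(p) -> 0 as p -> 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Direct translation of the RPN psplot expression:
#   -1 x x log 2 log div mul 1 x sub 1 x sub log 2 log div mul add mul
# i.e. -1 * ( x*(log10 x / log10 2) + (1-x)*(log10(1-x) / log10 2) ),
# which equals H(x) because log10(a)/log10(2) = log2(a).
def psplot_formula(x):
    log2_x = math.log10(x) / math.log10(2)
    log2_1mx = math.log10(1 - x) / math.log10(2)
    return -1 * (x * log2_x + (1 - x) * log2_1mx)

print(binary_entropy(0.5))   # maximum: 1.0
print(binary_entropy(1.0))   # certain outcome: 0.0
print(abs(binary_entropy(0.3) - psplot_formula(0.3)) < 1e-9)  # True
```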
Original file (SVG file, nominally 300 × 300 pixels, file size: 1 KB)
File history
| | Date/Time | Thumbnail | Dimensions | User | Comment |
|---|---|---|---|---|---|
| current | 16:07, 9 February 2015 | ![]() | 300 × 300 (1 KB) | Krishnavedala | simplified drawing |
| | 15:19, 22 April 2007 | ![]() | 169 × 163 (40 KB) | Alejo2083 | {{Information \|Description=Information entropy of a Bernoulli trial ''X''. If ''X'' can assume values 0 and 1, entropy of ''X'' is defined as ''H''(''X'') = -Pr(''X''=0) log<sub>2</sub> Pr(''X''=0) - Pr(''X''=1) log<sub>2</sub> Pr(''X''=1). It has … |
File usage
The following pages on the English Wikipedia use this file (pages on other projects are not listed):
Global file usage
The following other wikis use this file:
- Usage on ar.wikipedia.org
- Usage on ca.wikipedia.org
- Usage on cs.wikipedia.org
- Usage on da.wikipedia.org
- Usage on el.wikipedia.org
- Usage on es.wikipedia.org
- Usage on eu.wikipedia.org
- Usage on fa.wikipedia.org
- Usage on fr.wikipedia.org
- Usage on gl.wikipedia.org
- Usage on id.wikipedia.org
- Usage on id.wiktionary.org
- Usage on it.wikipedia.org
- Usage on it.wikiversity.org
- Usage on ko.wikipedia.org
- Usage on pl.wikipedia.org
- Usage on pnb.wikipedia.org
- Usage on sq.wikipedia.org
- Usage on su.wikipedia.org
- Usage on tl.wikipedia.org
- Usage on uk.wikipedia.org
- Usage on ur.wikipedia.org
- Usage on vi.wikipedia.org
- Usage on zh-yue.wikipedia.org
- Usage on zh.wikipedia.org
- Usage on zh.wiktionary.org