Timeline of probability and statistics

From Wikipedia, the free encyclopedia

The following is a timeline of probability and statistics.

YouTube Encyclopedic

  • Probability explained | Independent and dependent events | Probability and Statistics | Khan Academy
  • The story of probability and its development
  • Probability Part 1: Rules and Patterns: Crash Course Statistics #13
  • Probability & Statistics | Lec1.1 - Content of the Course
  • What Is Statistics: Crash Course Statistics #1

Transcription

What I want to do in this video is give you at least a basic overview of probability. Probability, a word that you've probably heard a lot of, and you are probably a little bit familiar with it. But hopefully, this will give you a little deeper understanding. Let's say that I have a fair coin over here. And so when I talk about a fair coin, I mean that it has an equal chance of landing on one side or another. So you can maybe view it as the sides are equal, their weight is the same on either side. If I flip it in the air, it's not more likely to land on one side or the other. It's equally likely. And so you have one side of this coin. So this would be the heads I guess. Try to draw George Washington. I'll assume it's a quarter of some kind. And the other side, of course, is the tails. So that is heads. The other side right over there is tails. And so if I were to ask you, what is the probability-- I'm going to flip a coin. And I want to know what is the probability of getting heads. And I could write that like this-- the probability of getting heads. And you probably, just based on that question, have a sense of what probability is asking. It's asking for some type of way of getting your hands around an event that's fundamentally random. We don't know whether it's heads or tails, but we can start to describe the chances of it being heads or tails. And we'll talk about different ways of describing that.

So one way to think about it, and this is the way that probability tends to be introduced in textbooks, is you say, well, look, how many different, equally likely possibilities are there? So how many equally likely possibilities. So number of equally-- let me write equally-- of equally likely possibilities. And of the number of equally likely possibilities, I care about the number that contain my event right here. So the number of possibilities that meet my constraint, that meet my conditions. So in the case of the probability of figuring out heads, what is the number of equally likely possibilities? Well, there's only two possibilities. We're assuming that the coin can't land on its corner and just stand straight up. We're assuming that it lands flat. So there's two possibilities here, two equally likely possibilities. You could either get heads, or you could get tails. And what's the number of possibilities that meet my conditions? Well, there's only one, the condition of heads. So it'll be 1/2. So one way to think about it is the probability of getting heads is equal to 1/2. If I wanted to write that as a percentage, we know that 1/2 is the same thing as 50%.
Now, another way to think about or conceptualize probability that will give you this exact same answer is to say, well, if I were to run the experiment of flipping a coin-- so this flip, you view this as an experiment. I know this isn't the kind of experiment that you're used to. You know, you normally think an experiment is doing something in chemistry or physics or all the rest. But an experiment is every time you do, you run this random event. So one way to think about probability is if I were to do this experiment, an experiment many, many, many times-- if I were to do it 1,000 times or a million times or a billion times or a trillion times-- and the more the better-- what percentage of those would give me what I care about? What percentage of those would give me heads?

And so another way to think about this 50% probability of getting heads is if I were to run this experiment tons of times, if I were to run this forever, an infinite number of times, what percentage of those would be heads? You would get this 50%. And you can run that simulation. You can flip a coin. And it's actually a fun thing to do. I encourage you to do it. If you take 100 or 200 quarters or pennies, stick them in a big box, shake the box so you're kind of simultaneously flipping all of the coins, and then count how many of those are going to be heads. And you're going to see that the larger the number that you are doing, the more likely you're going to get something really close to 50%. And there's always some chance-- even if you flipped a coin a million times, there's some super-duper small chance that you would get all tails. But the more you do, the more likely that things are going to trend towards 50% of them are going to be heads.
Now, let's just apply these same ideas. And while we're starting with probability, at least kind of the basic, this is probably an easier thing to conceptualize. But a lot of times, this is actually a helpful one, too, this idea that if you run the experiment many, many, many, many times, what percentage of those trials are going to give you what you're asking for. In this case, it was heads. Now, let's do another very typical example when you first learn probability. And this is the idea of rolling a die. So here's my die right over here. And of course, you have, you know, the different sides of the die. So that's the 1. That's the 2. And that's the 3. And what I want to do-- and we know, of course, that there are-- and I'm assuming this is a fair die. And so there are six equally likely possibilities. When you roll this, you could get a 1, a 2, a 3, a 4, a 5, or a 6. And they're all equally likely.

So if I were to ask you, what is the probability given that I'm rolling a fair die-- so the experiment is rolling this fair die, what is the probability of getting a 1? Well, what are the number of equally likely possibilities? Well, I have six equally likely possibilities. And how many of those meet my conditions? Well, only one of them meets my condition, that right there. So there is a 1/6 probability of rolling a 1. What is the probability of rolling a 1 or a 6? Well, once again, there are six equally likely possibilities for what I can get. There are now two possibilities that meet my conditions. I could roll a 1 or I could roll a 6. So now there are two possibilities that meet my constraints, my conditions. There is a 1/3 probability of rolling a 1 or a 6.

Now, what is the probability-- and this might seem a little silly to even ask this question, but I'll ask it just to make it clear. What is the probability of rolling a 2 and a 3? And I'm just talking about one roll of the die. Well, in any roll of the die, I can only get a 2 or a 3. I'm not talking about taking two rolls of this die. So in this situation, there's six possibilities, but none of these possibilities are 2 and a 3. None of these are 2 and a 3. 2 and a 3 cannot exist. On one trial, you cannot get a 2 and a 3 in the same experiment. Getting a 2 and a 3 are mutually exclusive events. They cannot happen at the same time. So the probability of this is actually 0. There's no way to roll this normal die and all of a sudden, you get a 2 and a 3, in fact. And I don't want to confuse you with that, because it's kind of abstract and impossible. So let's cross this out right over here.

Now, what is the probability of getting an even number? So once again, you have six equally likely possibilities when I roll that die. And which of these possibilities meet my conditions, the condition of being even? Well, 2 is even, 4 is even, and 6 is even. So 3 of the possibilities meet my conditions, meet my constraints. So this is 1/2. If I roll a die, I have a 1/2 chance of getting an even number.

Before 1600

  • 8th century – Al-Khalil, an Arab mathematician studying cryptology, wrote the Book of Cryptographic Messages. The work has been lost, but based on the reports of later authors, it contained the first use of permutations and combinations to list all possible Arabic words with and without vowels.[1]
  • 9th century – Al-Kindi was the first to use frequency analysis to decipher encrypted messages and developed the first code-breaking algorithm. He wrote a book entitled Manuscript on Deciphering Cryptographic Messages, containing detailed discussions on statistics and cryptanalysis.[2][3][4] Al-Kindi also made the earliest known use of statistical inference.[1]
  • 13th century – An important contribution of Ibn Adlan concerns the sample size needed for frequency analysis.[1]
  • 13th century – The first known calculation of the probability of throwing three dice is published in the Latin poem De vetula.
  • 1560s (published 1663) – Cardano's Liber de ludo aleae attempts to calculate probabilities of dice throws. He demonstrates the efficacy of defining odds as the ratio of favourable to unfavourable outcomes (which implies that the probability of an event is given by the ratio of favourable outcomes to the total number of possible outcomes[5]); a short worked form of this relationship is sketched just after this list.
  • 1577 – Bartolomé de Medina defends probabilism, the view that in ethics one may follow a probable opinion even if the opposite is more probable.

17th century

  • 1654 – Blaise Pascal and Pierre de Fermat create the mathematical theory of probability.
  • 1657 – Christiaan Huygens's De ratiociniis in ludo aleae is the first book on mathematical probability.
  • 1662 – John Graunt's Natural and Political Observations Made upon the Bills of Mortality makes inferences from statistical data on deaths in London.
  • 1666 – A review of the third edition (1665) of John Graunt's Observations on the Bills of Mortality appears in Le Journal des Sçavans xxxi, 2 August 1666 (359–370(=364)). The review summarises 'plusieurs reflexions curieuses' ('several curious reflections'), the second of which is Graunt's data on life expectancy. This review is used by Nicolaus Bernoulli in his De Usu Artis Conjectandi in Jure (1709).
  • 1669 – Christiaan Huygens and his brother Lodewijk discuss Graunt's mortality table (Graunt 1662, p. 62) between August and December of that year in letters #1755.
  • 1693 – Edmond Halley prepares the first mortality tables statistically relating death rate to age.

18th century

19th century

20th century

See also

References

  1. ^ a b c Broemeling, Lyle D. (1 November 2011). "An Account of Early Statistical Inference in Arab Cryptology". The American Statistician. 65 (4): 255–257. doi:10.1198/tas.2011.10191.
  2. ^ Singh, Simon (2000). The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography (1st Anchor Books ed.). New York: Anchor Books. ISBN 0-385-49532-3.
  3. ^ Singh, Simon (2000). The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography (1st Anchor Books ed.). New York: Anchor Books. ISBN 978-0-385-49532-5.
  4. ^ Al-Kadi, Ibrahim A. (April 1992). "The origins of cryptology: The Arab contributions". Cryptologia. 16 (2): 97–126.
  5. ^ Gorrochum, P. (2012). "Some laws and problems in classical probability and how Cardano anticipated them". Chance.
  6. ^ Wright, Sewall (1921). "Correlation and causation". Journal of Agricultural Research. 20 (7): 557–585.

Further reading
