Exercises on Probability Knowledge

In this lesson, we do some exercises on probability. There are 6 main tasks, each covering several knowledge points from the lecture notes. Some ask you to solve problems discussed in the lecture, while others introduce new questions or extend previous ones.

If you are skilled in probability theory, you might treat this as a quick and easy test. If you are not yet familiar with probability theory, these exercises can assist you in learning or reviewing the basics. They are not mathematical puzzles; rather, they are basic exercises designed to help you understand the concepts. Again, the more you solve, the more you practice, the more you master.

Task 1: Probabilities of events

Task 1.1 A bag contains 5 red balls, 3 blue balls, and 2 green balls. If you randomly draw one ball from the bag:

  1. What is the probability of drawing a red ball?
  2. What is the probability of drawing a blue or green ball?

Task 1.2 A fair six-sided die is rolled once.

  1. What is the probability of getting an even number?
  2. What is the conditional probability of getting a number larger than 3, given that the number is even?
  3. What is the probability that you get a number less than 2 for the first time on the fifth roll of the die?

Task 2: Discrete Random Variable and Distribution

Task 2.1: In R, write a function that returns the values of the p.m.f. of an arbitrary Binomial distribution.

Task 2.2: Apply your function to print out the p.m.f. of the Binomial distribution \(\text{Bin}(10,0.7)\).
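One way to check your own function from Task 2.1 is against base R's built-in `dbinom`, which evaluates the Binomial p.m.f. directly. A minimal sketch (try writing your own function first):

```r
# Binomial p.m.f. of Bin(10, 0.7) via base R's dbinom, as a reference check
n <- 10
p <- 0.7
k <- 0:n
pmf <- dbinom(k, size = n, prob = p)  # Pr(X = k) for each k = 0, ..., n
print(round(pmf, 4))
# sanity check: a p.m.f. must sum to 1 over all possible values
stopifnot(abs(sum(pmf) - 1) < 1e-12)
```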

Task 3: Characteristic Values

Task 3.1: Explain why the expected value of a Bernoulli distributed random variable, \(X \sim \text{Ber}(p)\), is \(p\).

Task 3.2: Explain why the expected value of a Binomial distributed random variable, \(X \sim \text{Bin}(N, p)\), is \(Np\).

Task 3.3: Explain why the variance of a Bernoulli distributed random variable is \(p(1-p)\).

Task 3.4: Explain why the variance of a Binomial distributed random variable is \(Np(1-p)\).

Task 3.5 ( HS ): In Section 3.4.5, we discussed the covariance between two random variables and derived formulas for the characteristic values of a weighted sum (linear combination) of random variables. In fact, these formulas can be written in matrix form, and the matrix form helps us easily generalize them to a multivariate setting (more than two random variables). For example, the mean of a linear combination of two random variables is \[ \text{E}(a_1X_1+a_2X_2) = a_1\text{E}(X_1) + a_2\text{E}(X_2) \] It can be represented as \[ \text{E}(\textbf{a}^{\top}\textbf{X}) = \textbf{a}^{\top}\text{E}(\textbf{X}) \] where \(\textbf{a} = (a_1, a_2)^{\top}\) and \(\textbf{X} = (X_1, X_2)^{\top}\). In a multivariate setting, we usually call \(\textbf{X}\) a random vector; here it is a 2-dimensional random vector, but in general it can have arbitrary dimension.
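As a quick numerical illustration of the expectation identity above, the following sketch checks \(\text{E}(\textbf{a}^{\top}\textbf{X}) = \textbf{a}^{\top}\text{E}(\textbf{X})\) by simulation; the particular weights and distributions chosen here are arbitrary examples, not part of the exercise:

```r
# Simulate a 2-dimensional random vector X = (X1, X2) and verify that the
# sample mean of a1*X1 + a2*X2 is close to a1*E(X1) + a2*E(X2).
set.seed(1)
a  <- c(2, -1)                             # weights a = (a1, a2)^T
X1 <- rbinom(1e5, size = 10, prob = 0.7)   # E(X1) = 10 * 0.7 = 7
X2 <- rnorm(1e5, mean = 3, sd = 1)         # E(X2) = 3
lhs <- mean(a[1] * X1 + a[2] * X2)         # sample mean of a^T X
rhs <- a[1] * 7 + a[2] * 3                 # a^T E(X) = 2*7 - 1*3 = 11
print(c(simulated = lhs, theoretical = rhs))
```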

Now, it is your turn. Can you represent the following result of variance in a matrix form? \[ \text{Var}(a_1X_1+a_2X_2) = a_1^2\text{Var}(X_1) + 2a_1a_2\text{Cov}(X_1,X_2) + a_2^2\text{Var}(X_2) \]

Task 4: Joint distribution

Task 4.1: In Section 3.4.2, I only showed you how to calculate the value in the first cell, i.e. \(\Pr( X=1, Y=1 )\). Please calculate the values of the remaining 3 cells, i.e. \(\Pr( X=1, Y=0 )\), \(\Pr( X=0, Y=1 )\), and \(\Pr( X=0, Y=0 )\).

Task 4.2: With the same background problem, calculate the probability that you eventually get an orange.

Task 4.3: Apply Bayes' formula to calculate the posterior probability that we chose the red box given that we got an apple, i.e. \(\Pr( X=1 | Y=1 )\).

Task 4.4: Discuss with your group mates and propose a new example that explains the difference between prior and posterior probabilities.

Task 5: Continuous Random Variable and Distribution

Task 5.1: \(X \sim \mathcal{N}(1.2, 1)\), use R to calculate \(\Pr(-1.5 < X < 2.2)\).
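As a hint for Task 5.1, interval probabilities of a continuous random variable can be computed in R as a difference of c.d.f. values; for the Normal distribution the c.d.f. is `pnorm`. A minimal sketch (note that `pnorm` takes the standard deviation `sd`, not the variance):

```r
# Pr(-1.5 < X < 2.2) for X ~ N(1.2, 1): difference of two c.d.f. values
prob <- pnorm(2.2, mean = 1.2, sd = 1) - pnorm(-1.5, mean = 1.2, sd = 1)
print(prob)  # approximately 0.838
```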

Task 5.2: Suppose \(X \sim \mathcal{N}(\mu, \sigma^2)\), then what is the distribution of \(\frac{X - \mu}{\sigma}\)?

Task 5.3: Explain why approximately 95% of the probability is covered within two SDs around the mean in a Normal distribution.
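You can check this rule of thumb numerically with the standard Normal c.d.f. `pnorm` (a sketch; by Task 5.2, standardizing reduces the general case to the standard Normal):

```r
# Probability mass within +/- 2 SDs of the mean for a standard Normal
within_2sd <- pnorm(2) - pnorm(-2)          # approximately 0.9545
# the exact 95% interval corresponds to +/- 1.96 SDs
within_196 <- pnorm(1.96) - pnorm(-1.96)    # approximately 0.95
print(c(two_sd = within_2sd, exact_95 = within_196))
```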

Task 6: Likelihood Analysis

Task 6.1: With the “Box-Fruits” background problem, let’s make a small adjustment: we flip a fair coin to decide which box to choose.

Suppose you got an apple—then which box do you think you are more likely to have chosen?

Task 6.2 Now, we change back to the original setting in which we throw a die to decide the box, but this time we get the red box when the number is less than 6. Suppose you got an apple—then which box do you think you are more likely to have chosen?

© 2024 Xijia Liu. All rights reserved. Contact: xijia.liu AT umu.se