MA2510-2425A Solution
Each question from the MA2510 Probability & Statistics exam is restated below, then analyzed with a layered, hierarchical breakdown. Each layer summarizes the underlying theory, formulas, and reasoning, mapping the question content directly to the logical/statistical concepts required to solve it.
Q1. Decks of Cards & Probability (Combinatorics, Conditional Probability, Discrete Distributions)
Restatement
- Two decks: a black deck (52 cards) and a red deck (32 cards).
- (i) Toss a coin to pick a deck, then draw 2 cards simultaneously. Find the probability that the two cards have the same value.
- (ii) Given that the two cards have the same value, what is the probability that the black deck was chosen?
- (iii) Merge and shuffle both decks (84 cards), then pick 2 cards at random. What is the probability they have the same value?
- (iv) Pick 5 cards with replacement from the 84-card deck. Let $$X_B$$ be the number of black cards and $$X_R$$ the number of red cards. Find their distributions.
- (v) Let $$ X = X_B - X_R $$. Compute $$ \mathbb{E}[X], \operatorname{Var}(X) $$.
- (vi) Pick $$n=7000$$ cards with replacement; let $$X_{JQK}$$ count the J/Q/K cards drawn. Estimate $$\mathbb{P}(X_{JQK} \geq 2100)$$ using the CLT.
Concepts & Hierarchical Analysis
1. Combinatorics and Basic Probability
- Sample Space: All possible orderings/drawings (with/without replacement).
- Event Definition: "Same value" for a pair → count pairs by value, deck size.
2. Hypergeometric & Binomial Distributions
- With Replacement: Binomial distribution for counting black/red/JQK cards.
- Without Replacement: Hypergeometric for picking specific values from a deck.
3. Conditional Probability, Bayes’ Theorem
- Posterior probability: Reverse probabilities when an event (matched value) is given.
4. Indicator Random Variables, Expectation & Variance
- Counting Matches: Sum indicators for event "value match". Use linearity of expectation for sums.
- Variance: picks with replacement are independent, so per-pick variances add; note that $$X_B$$ and $$X_R$$ are not independent, so for the difference use $$\operatorname{Var}(X_B - X_R) = \operatorname{Var}(X_B) + \operatorname{Var}(X_R) - 2\operatorname{Cov}(X_B, X_R)$$, or decompose $$X$$ into per-pick contributions.
5. Central Limit Theorem (CLT)
- Normal Approximation: For sums/counts with large $$ n $$, use normal CDF for probability estimation.
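The CLT step in (vi) can be checked numerically. The sketch below is a minimal normal-approximation calculation, assuming the merged 84-card deck contains 24 J/Q/K cards (3 values × 4 suits × 2 decks), so each draw with replacement is a J/Q/K with probability $$p = 24/84$$; that composition is an assumption, not something stated above.

```python
from math import sqrt, erf

# ASSUMPTION: 24 of the 84 merged cards are J/Q/K, so p = 24/84 per draw.
n, p, threshold = 7000, 24 / 84, 2100

mu = n * p                      # mean of the Binomial(n, p) count
sigma = sqrt(n * p * (1 - p))   # standard deviation

def std_normal_cdf(z):
    """Phi(z), the standard normal CDF, via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

z = (threshold - mu) / sigma    # plain CLT estimate, no continuity correction
print(f"mu = {mu:.1f}, sigma = {sigma:.2f}, z = {z:.3f}")
print(f"P(X_JQK >= {threshold}) ~ 1 - Phi(z) = {1 - std_normal_cdf(z):.4f}")
```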
Layered Tree Breakdown
Q1. Cards & Probabilities
├ Combinatorics & Counting Principles
│ └─ Number of ways to pick pairs, suits, values
├ Probability Calculations
│ └─ Probability = #favorable/#total (cases for 'same value')
├ Discrete Random Variables & Distributions
│ ├─ Binomial (with replacement, count of red/black/JQK cards)
│ └─ Hypergeometric (if picking without replacement)
├ Conditional Probability & Bayes’ Rule
│ └─ Reverse conditional given outcome (compute deck probabilities after event)
├ Expectation & Variance of Sums
│ ├─ E[X_B], E[X_R], Var[X_B], Var[X_R], E[X_B - X_R], Var[X_B - X_R]
│ └─ Properties of expectation/variance for independent trials
├ Central Limit Theorem
│ └─ Normal approximation for sum of indicator variables; CDF Φ(z)
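To make the counting and the Bayes step above concrete, here is a small Monte Carlo sketch of parts (i)–(ii). It assumes the black deck is 13 values × 4 suits and the red deck is 8 values × 4 suits (a piquet-style 32-card deck); if the exam intends a different composition, only the value ranges below change.

```python
import random

random.seed(0)

# ASSUMED compositions: black deck = 13 values x 4 suits (52 cards),
# red deck = 8 values x 4 suits (32 cards, piquet-style).
black_deck = [(v, s) for v in range(13) for s in range(4)]
red_deck = [(v, s) for v in range(8) for s in range(4)]

trials = 200_000
same_value = 0
black_given_same = 0

for _ in range(trials):
    deck_is_black = random.random() < 0.5        # fair coin chooses the deck
    deck = black_deck if deck_is_black else red_deck
    (v1, _), (v2, _) = random.sample(deck, 2)    # two cards without replacement
    if v1 == v2:
        same_value += 1
        black_given_same += deck_is_black

print("P(same value)              ~", same_value / trials)
print("P(black deck | same value) ~", black_given_same / same_value)

# Exact counterparts under the same assumptions:
p_black, p_red = 3 / 51, 3 / 31        # second card matches the first card's value
p_same = 0.5 * p_black + 0.5 * p_red
print("exact P(same value)        =", p_same)
print("exact Bayes posterior      =", 0.5 * p_black / p_same)
```

The exact lines at the bottom mirror the tree: condition on the deck, average over the coin toss, then invert with Bayes' rule.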
Q2. Joint Continuous Random Variables: Uniform Distribution on a Diamond (Geometry, Expectation, Covariance)
Restatement
- Random variables $$X,Y$$ with joint density $$f_{(X,Y)}(x,y) = c \cdot \mathbb{1}_{|x|+|y|<1}$$.
- (i) Determine $$c$$
- (ii) Find marginal pdf of $$X$$.
- (iii) Compute $$\mathbb{E}[X]$$
- (iv) Compute $$\operatorname{Var}(X)$$
- (v) Compute $$\operatorname{Cov}(X,Y)$$, are $$X,Y$$ independent?
- (vi) Compute $$\mathbb{E}[X^2 Y^2]$$
Concepts & Hierarchical Analysis
1. Joint & Marginal Probability Densities
- Density Function: Uniform over region (diamond/lozenge).
- Integration: Find c so total probability is 1.
2. Marginals & Independence
- Marginalization: Integrate joint pdf over y (or x).
- Independence: If joint pdf factors, variables are independent.
3. Expectation, Variance, Covariance
- Expectation: $$\mathbb{E}[X] = \int x f_X(x) dx$$
- Variance, Covariance: compute mixed moments such as $$\mathbb{E}[XY]$$ and $$\mathbb{E}[X^2]$$ by integrating against the joint/marginal pdf.
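The normalization and marginal steps can be sanity-checked numerically: the diamond $$|x|+|y|<1$$ has area 2, so one expects $$c = 1/2$$ and a triangular marginal $$f_X(x) = 1-|x|$$ on $$(-1,1)$$. The grid sketch below is illustrative only (the exact answers come from the integrals in the tree that follows), and the grid resolution is arbitrary.

```python
import numpy as np

# Grid check of the uniform density on the diamond |x| + |y| < 1.
n = 2001                               # arbitrary grid resolution
xs = np.linspace(-1, 1, n)
ys = np.linspace(-1, 1, n)
X, Y = np.meshgrid(xs, ys)             # X varies along axis 1, Y along axis 0
inside = (np.abs(X) + np.abs(Y) < 1).astype(float)

dx = xs[1] - xs[0]
area = inside.sum() * dx * dx
c = 1.0 / area                          # total probability must equal 1
print("area ~", area, "  c ~", c)       # expect area ~ 2, c ~ 0.5

# Marginal f_X(x): integrate the joint density over y for each fixed x.
f_X = c * inside.sum(axis=0) * dx
print("f_X(0) ~", f_X[n // 2])          # expect 1 - |0| = 1

# First two moments of X from the marginal.
EX = np.sum(xs * f_X) * dx
EX2 = np.sum(xs**2 * f_X) * dx
print("E[X] ~", EX, "  Var(X) ~", EX2 - EX**2)   # expect 0 and 1/6
```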
Layered Tree Breakdown
Q2. Joint Continuous RVs (Diamond Region)
├ Joint PDF Normalization
│ └─ Integrate over region |x|+|y|<1; set ∫∫=1, solve for c
├ Marginal Distributions
│ └─ Integrate out y: f_X(x) = ∫ f_(X,Y)(x,y) dy
├ Expectation & Variance
│ ├─ E[X] = ∫ x f_X(x) dx
│ └─ Var[X] = ∫ (x - E[X])² f_X(x) dx
├ Covariance
│ ├─ Cov(X,Y) = E[XY] - E[X]E[Y]
│ └─ Test for independence: f_(X,Y) = f_X(x)f_Y(y)?
├ Higher Moments
│ └─ E[X²Y²] = ∫∫ x²y² f_(X,Y)(x,y) dx dy
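A companion Monte Carlo sketch for the covariance and higher-moment branches: it samples uniformly on the diamond by rejection from the enclosing square and estimates $$\operatorname{Cov}(X,Y)$$ and $$\mathbb{E}[X^2Y^2]$$. It also illustrates why zero covariance does not imply independence here; the threshold $$a = 0.75$$ is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)

# Rejection sampling: draw uniformly on [-1,1]^2, keep only points inside the diamond.
pts = rng.uniform(-1, 1, size=(2_000_000, 2))
mask = np.abs(pts[:, 0]) + np.abs(pts[:, 1]) < 1
x, y = pts[mask, 0], pts[mask, 1]

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print("Cov(X, Y)  ~", cov_xy)                 # symmetry suggests a value near 0
print("E[X^2 Y^2] ~", np.mean(x**2 * y**2))

# Zero covariance is not independence: the corner event below is impossible jointly,
# yet the product of its marginal probabilities is strictly positive.
a = 0.75
joint = np.mean((np.abs(x) > a) & (np.abs(y) > a))
product = np.mean(np.abs(x) > a) * np.mean(np.abs(y) > a)
print("P(|X|>a, |Y|>a) ~", joint, "  vs  P(|X|>a)P(|Y|>a) ~", product)
```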
Q3. Random Subsets & Binomial-type Problems (Set Theory, Discrete Probability)
Restatement
- Pick a subset $$A$$ uniformly at random from the set $$I_n = \{1,\dots,n\}$$. Let $$X = |A|$$, $$Y=|I_n \setminus A|$$.
- (i) Compute $$\mathbb{P}(X=0)$$, $$\mathbb{P}(X=1)$$
- (ii) For $$n=4$$: draw cdf of $$X$$
- (iii) For general $$n$$: compute $$\mathbb{E}[X], \operatorname{Var}(X)$$
- (iv) Are $$\{X=0\}$$, $$\{Y=0\}$$ independent?
- (v) Does $$Y$$ have same distribution as $$X$$?
- (vi) Compute $$\operatorname{Cov}(X,Y)$$
Concepts & Hierarchical Analysis
1. Counting & Uniform Probability on Power Set
- Sample Space: All $$2^n$$ subsets; each equally likely.
- Binomial structure: each element is included independently with probability 1/2, so the subset size $$X$$ is $$\mathrm{Binomial}(n, 1/2)$$.
2. Discrete Distributions and Generating Functions
- CDF Construction: For small $$n$$, tabulate probabilities.
- Binomial Moments: $$\mathbb{E}[X]=np$$, $$\operatorname{Var}(X)=np(1-p)$$ (here $$p=1/2$$).
3. Independence & Symmetry
- Events and Dependence: test whether $$\mathbb{P}(X=0,\, Y=0) = \mathbb{P}(X=0)\,\mathbb{P}(Y=0)$$.
- Symmetry: $$Y = n - X$$ counts the complement, and the complement of a uniformly random subset is again uniform, so $$Y$$ is expected to have the same distribution as $$X$$.
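The binomial structure and the symmetry claim can both be verified by brute force for small $$n$$, since there are only $$2^n$$ subsets to enumerate. A minimal sketch (the choice $$n=4$$ matches part (ii) but is otherwise arbitrary):

```python
from itertools import product
from math import comb

n = 4  # small enough to enumerate every subset exactly

# Each subset of I_n is a 0/1 inclusion vector; all 2^n vectors are equally likely.
sizes = [sum(bits) for bits in product((0, 1), repeat=n)]  # values of X = |A|
total = len(sizes)

# P(X = k) should equal C(n, k) / 2^n, i.e. the Binomial(n, 1/2) pmf.
for k in range(n + 1):
    print(f"P(X={k}) = {sizes.count(k) / total:.4f}   binomial pmf: {comb(n, k) / 2**n:.4f}")

EX = sum(sizes) / total
VarX = sum(s * s for s in sizes) / total - EX**2
print("E[X]  =", EX, " (n/2 =", n / 2, ")")
print("Var X =", VarX, " (n/4 =", n / 4, ")")
```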
Layered Tree Breakdown
Q3. Random Subsets, Binomial-Type
├ Uniform Distribution on Power Set
│ ├─ Total outcomes: 2^n
│ └─ Probability of given |A|: binomial coefficients
├ Binomial Distributions (p=1/2)
│ ├─ X: size of random subset → Binomial(n,1/2)
│ └─ Compute probabilities, expectation, variance
├ CDF Table/Distribution Graph
├ Independence and Symmetry
│ └─ Use definition and compare P(X=0), P(Y=0), P(X=0 ∩ Y=0)
├ Covariance/Joint Properties
│ └─ Use covariance formula, properties of complementary events
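For the covariance branch, note that $$Y = n - X$$ by construction, so $$\operatorname{Cov}(X, Y) = \operatorname{Cov}(X, n - X) = -\operatorname{Var}(X) = -n/4$$. A quick enumeration check in the same spirit as above (again with the illustrative $$n = 4$$):

```python
from itertools import product

n = 4
sizes = [sum(bits) for bits in product((0, 1), repeat=n)]  # X over all 2^n subsets
pairs = [(x, n - x) for x in sizes]                        # Y = |I_n \ A| = n - X

EX = sum(x for x, _ in pairs) / len(pairs)
EY = sum(y for _, y in pairs) / len(pairs)
EXY = sum(x * y for x, y in pairs) / len(pairs)
print("Cov(X, Y) =", EXY - EX * EY, " (expect -n/4 =", -n / 4, ")")
```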
Q4. Parameter Estimation for a Family of PDFs (Moments, MLE, Bias)
Restatement
- For random variable $$X$$ with density $$f_X(x) = \frac{3}{(x-\theta+1)^4} \mathbb{1}_{[\theta, \infty)}(x)$$:
- (i) Compute $$\mathbb{E}[X]$$.
- (ii) Compute $$\operatorname{Var}(X)$$.
- (iii) For an iid sample, find an estimator of $$\theta$$ by the method of moments.
- (iv) Compute the bias of that estimator.
- (v) Find the MLE of $$\theta$$ and state its bias.
Concepts & Hierarchical Analysis
1. Integration of Nonstandard PDFs
- Support: Recognize variable must be $$\geq \theta$$.
- Integrals: Use substitution for expectation/integral evaluation.
2. Method of Moments
- Match sample moment to theoretical mean; solve for parameter.
3. Maximum Likelihood Estimation
- Likelihood Function: write the likelihood of the sample and maximize it with respect to $$\theta$$; the support constraint $$\theta \leq \min_i x_i$$ determines where the maximum sits.
- Estimator Bias: compare $$\mathbb{E}[\hat{\theta}]$$ with $$\theta$$; the bias is $$\mathbb{E}[\hat{\theta}] - \theta$$.
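A simulation sketch of the two estimation routes. Under the stated density the cdf is $$F(x) = 1 - (x-\theta+1)^{-3}$$ on $$[\theta,\infty)$$, which gives an inverse-transform sampler; the method-of-moments estimator $$\bar{X} - \tfrac{1}{2}$$ follows from $$\mathbb{E}[X] = \theta + \tfrac{1}{2}$$, and the MLE is $$\min_i X_i$$ because the likelihood increases in $$\theta$$ up to the support constraint. The true $$\theta = 2$$, sample size, and seed below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

theta = 2.0            # true parameter used only to generate the simulation
n, reps = 50, 20_000   # sample size per replication, number of replications

# Inverse-transform sampling: F(x) = 1 - (x - theta + 1)^(-3), so
# X = theta - 1 + (1 - U)^(-1/3) with U ~ Uniform[0, 1).
u = rng.uniform(size=(reps, n))
samples = theta - 1.0 + (1.0 - u) ** (-1.0 / 3.0)

mom = samples.mean(axis=1) - 0.5   # method of moments: E[X] = theta + 1/2
mle = samples.min(axis=1)          # MLE: the sample minimum

print("MoM estimator: mean ~", mom.mean(), " bias ~", mom.mean() - theta)
print("MLE estimator: mean ~", mle.mean(), " bias ~", mle.mean() - theta)
# Expected pattern: MoM is unbiased, while the MLE overshoots theta by about 1/(3n - 1).
```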
Layered Tree Breakdown
Q4. Estimation for Nonstandard PDF
├ Continuous RV Integration
│ ├─ Compute mean, variance by integrating with given pdf/support
│ └─ Substitution method for integrals (change to a known integral)
├ Method of Moments
│ └─ Set sample mean = theoretical mean, solve for parameter
├ Maximum Likelihood Estimation (MLE)
│ ├─ Construct and maximize likelihood
│ └─ Check estimator bias (unbiased/biased)
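For the final branch, one way to carry out the bias check explicitly: under the stated density, $$\mathbb{P}(X > t) = (t-\theta+1)^{-3}$$ for $$t \geq \theta$$, and the likelihood is increasing in $$\theta$$ on $$\theta \leq \min_i x_i$$, so the MLE is $$\hat{\theta} = \min_i X_i$$ with $$\mathbb{P}(\hat{\theta} > t) = (t-\theta+1)^{-3n}$$. Therefore
$$\mathbb{E}[\hat{\theta}] - \theta = \int_0^\infty \mathbb{P}(\hat{\theta} - \theta > s)\, ds = \int_0^\infty (1+s)^{-3n}\, ds = \frac{1}{3n-1} > 0,$$
so the MLE is biased upward, while the method-of-moments estimator $$\bar{X} - \tfrac{1}{2}$$ is unbiased since $$\mathbb{E}[X] = \theta + \tfrac{1}{2}$$.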
This step-by-step “mindmap” organizes the solution path for each exam question, mapping it to the foundational probability/statistics concepts and the layered logical steps used to solve it. Each root node is a problem; each branch is a theory, method, or formula that underpins the solution.