Introduction to Probability, Statistics, and Random Processes, 1st Edition, Hossein Pishro-Nik - Solutions
Consider a communication system. At any given time, the communication channel is in good condition with probability 0.8, and is in bad condition with probability 0.2. An error occurs in a transmission with probability 0.1 if the channel is in good condition, and with probability 0.3 if the channel is in bad condition.
A box contains two coins: a regular coin and one fake two-headed coin (P(H) = 1). I choose a coin at random and toss it twice. Define the following events. A = First coin toss results in an H. B = Second coin toss results in an H. C = Coin 1 (regular) has been selected. Find P(A|C), P(B|C), P(A ∩
Let X be a discrete random variable with the PMF P_X(x) = 1/2 for x = 0, 1/3 for x = 1, 1/6 for x = 2, and 0 otherwise. a. Find R_X, the range of the random variable X. b. Find P(X ≥ 1.5). c. Find P(0 < X < 2). d. Find P(X = 0 | X < 2).
In a factory there are 100 units of a certain product, 5 of which are defective. We pick three units from the 100 units at random. What is the probability that exactly one of them is defective?
50 students live in a dormitory. The parking lot has the capacity for 30 cars. If each student has a car with probability 1/2 (independently from other students), what is the probability that there won't be enough parking spaces for all the cars?
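As a quick numerical check (a Python sketch, not part of the original problem set or the textbook's solution), the exact binomial tail for the parking problem can be computed directly:

```python
from math import comb

# X ~ Binomial(50, 1/2) is the number of students with cars; parking
# runs out when X > 30, so we sum the exact binomial tail P(X >= 31).
n = 50
prob = sum(comb(n, k) for k in range(31, n + 1)) / 2 ** n
print(round(prob, 4))
```

The tail comes out near 0.06, i.e., a shortage of spaces is fairly unlikely.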
Let X be a discrete random variable with the PMF P_X(x) = 0.2 for x = −2, 0.3 for x = −1, 0.2 for x = 0, 0.2 for x = 1, 0.1 for x = 2, and 0 otherwise. Find and plot the CDF of X.
I toss a coin twice. Let X be the number of observed heads. Find the CDF of X.
Let X ∼ Geometric(p). Find Var(X).
Let X be a mixed random variable with a given generalized PDF f_X(x). a. Find P(X = 1) and P(X = −2). b. Find P(X ≥ 1). c. Find P(X = 1 | X ≥ 1). d. Find EX and Var(X).
Consider two random variables X and Y with joint PMF given in Table 5.4 (joint PMF of X and Y, with X ∈ {1, 2, 4} and Y ∈ {1, 2}). a. Find P(X ≤ 2, Y > 1). b. Find the marginal PMFs of X and Y. c. Find P(Y = 2 | X = 1). d. Are X and Y independent?
N people sit around a round table, where N > 5. Each person tosses a coin. Anyone whose outcome is different from his/her two neighbors will receive a present. Let X be the number of people who receive presents. Find EX and Var(X).
Prove the union bound using Markov's inequality.
50 students live in a dormitory. The parking lot has the capacity for 30 cars. Each student has a car with probability 1/2, independently from other students. Use the CLT (with continuity correction) to find the probability that there won't be enough parking spaces for all the cars.
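The continuity-corrected CLT approximation this problem asks for can be sketched numerically in Python (an illustration, not the textbook's worked solution):

```python
from math import erf, sqrt

# X ~ Binomial(50, 1/2): EX = 25, Var(X) = 12.5.
# Continuity correction: P(X > 30) = P(X >= 31) ~ P(Z > (30.5 - 25)/sqrt(12.5)).
mu, var = 25.0, 12.5
z = (30.5 - mu) / sqrt(var)
approx = 0.5 * (1 - erf(z / sqrt(2)))   # 1 - Phi(z), standard normal tail
print(round(approx, 4))
```

The approximation lands close to the exact binomial tail, which is the point of the continuity correction.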
I have a bag that contains 3 balls. Each ball is either red or blue, but I have no information in addition to this. Thus, the number of blue balls, call it θ, might be 0, 1, 2, or 3. I am allowed to choose 4 balls at random from the bag with replacement. We define the random variables X1, X2, X3,
Suppose that the random variable X is transmitted over a communication channel. Assume that the received signal is given by Y = 2X + W, where W ∼ N(0, σ²) is independent of X. Suppose that X = 1 with probability p, and X = −1 with probability 1 − p. The goal is to decide between X = −1 and X = 1.
Suppose that the number of customers visiting a fast food restaurant in a given time interval I is N ∼ Poisson(μ). Assume that each customer purchases a drink with probability p, independently from other customers and independently from the value of N. Let X be the number of customers who purchase a drink.
Let N(t) be a Poisson process with rate λ. Let 0 < s < t. Show that given N(t) = n, N(s) is a binomial random variable with parameters n and p = s/t.
Let Xn be a discrete-time Markov chain. Remember that, by definition, p_ii^(n) = P(Xn = i | X0 = i). Show that state i is recurrent if and only if ∑_{n=1}^{∞} p_ii^(n) = ∞.
Write R programs to generate Geometric(p) and Negative Binomial(i,p) random variables.
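The problem asks for R programs; the same idea can be sketched in Python (an illustration of the standard inverse-transform construction, not the book's code):

```python
import math
import random

# Geometric(p) via the inverse-transform formula X = ceil(ln(U)/ln(1-p))
# with U ~ Uniform(0, 1), and NegativeBinomial(i, p) as a sum of i
# independent Geometric(p) draws.
def rgeom(p):
    u = 1.0 - random.random()          # u in (0, 1]
    return max(1, math.ceil(math.log(u) / math.log(1.0 - p)))

def rnbinom(i, p):
    return sum(rgeom(p) for _ in range(i))

random.seed(0)
samples = [rgeom(0.3) for _ in range(10000)]
print(sum(samples) / len(samples))     # should be near E[X] = 1/0.3
```

A direct R translation would replace `random.random()` with `runif(1)` and the loop with `replicate`.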
Use the algorithm for generating discrete random variables to obtain a Poisson random variable with parameter λ = 2.
Explain how to generate a random variable with the density f(x) = 2.5 x √x for 0 < x < 1, if your random number generator produces a Standard Uniform random variable U.
Use the inverse transformation method to generate a random variable having distribution function F(x) = (x² + x)/2, 0 ≤ x ≤ 1.
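For this CDF the inverse has a closed form, so the method is one line of algebra; here is a Python sketch (an illustration, not the textbook's solution):

```python
import math
import random

# Setting F(x) = (x^2 + x)/2 = u and solving x^2 + x - 2u = 0 for the
# root lying in [0, 1] gives x = (-1 + sqrt(1 + 8u)) / 2.
def inv_cdf(u):
    return (-1.0 + math.sqrt(1.0 + 8.0 * u)) / 2.0

random.seed(1)
xs = [inv_cdf(random.random()) for _ in range(5)]
print(xs)
```

Note that `inv_cdf(0) = 0` and `inv_cdf(1) = 1`, matching the support of F.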
Let X have a standard Cauchy distribution, with CDF F_X(x) = (1/π) arctan(x) + 1/2. Assuming you have U ∼ Uniform(0, 1), explain how to generate X. Then, use this result to produce 1000 samples of X and compute the sample mean. Repeat the experiment 100 times. What do you observe and why?
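A Python sketch of the experiment (an illustration; the problem itself leaves the method to the reader):

```python
import math
import random

# Inverting F_X(x) = arctan(x)/pi + 1/2 gives X = tan(pi * (U - 1/2)).
# The sample mean of Cauchy draws does not settle down as the LLN would
# suggest, because the Cauchy distribution has no mean.
def rcauchy():
    return math.tan(math.pi * (random.random() - 0.5))

random.seed(2)
means = [sum(rcauchy() for _ in range(1000)) / 1000 for _ in range(100)]
print(min(means), max(means))
```

The 100 sample means are themselves spread out wildly; in fact the average of n i.i.d. standard Cauchy variables is again standard Cauchy.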
When we use the Inverse Transformation Method, we need a simple form of the CDF F(x) that allows direct computation of X = F⁻¹(U). When F(x) doesn't have a simple form but the PDF f(x) is available, random variables with density f(x) can be generated by the rejection method. Suppose you have a
Use the rejection method to generate a random variable having density function Beta(2, 4). Assume g(x) = 1 for 0 < x < 1.
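A Python sketch of this rejection sampler (an illustration under the stated uniform envelope, not the book's code):

```python
import random

# Beta(2, 4) has density f(x) = 20 x (1 - x)^3 on (0, 1), maximized at
# x = 1/4 with value 135/64, so c = 135/64 works with g(x) = 1.
C = 135 / 64

def f(x):
    return 20 * x * (1 - x) ** 3

def rbeta24():
    while True:
        x, u = random.random(), random.random()
        if u <= f(x) / C:               # accept with probability f(x)/(c g(x))
            return x

random.seed(3)
xs = [rbeta24() for _ in range(5000)]
print(sum(xs) / len(xs))                # should be near E[X] = 2/6
```

On average c ≈ 2.11 candidates are needed per accepted sample.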
Use the rejection method to generate a random variable having the Gamma(5/2, 1) density function. Assume g(x) is the pdf of the Gamma(α = /2, λ = 1) distribution.
Use the rejection method to generate a standard normal random variable. Assume g(x) is the pdf of the exponential distribution with λ = 1.
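A Python sketch of this classic construction (an illustration, not the textbook's solution): the exponential envelope generates |Z|, and a random sign completes the normal draw.

```python
import math
import random

# With f the half-normal density and g(x) = e^{-x}, the ratio f/(cg) is
# maximized at x = 1 (c = sqrt(2e/pi)), and the acceptance probability
# simplifies to exp(-(x - 1)^2 / 2).
def rnorm():
    while True:
        x = -math.log(1.0 - random.random())        # Exp(1) draw
        if random.random() <= math.exp(-(x - 1.0) ** 2 / 2.0):
            return x if random.random() < 0.5 else -x

random.seed(4)
zs = [rnorm() for _ in range(5000)]
print(sum(zs) / len(zs))                            # should be near 0
```

The sample mean should sit near 0 and the sample second moment near 1.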
Use the rejection method to generate a Gamma(2, 1) random variable conditional on its value being greater than 5, that is, with density f(x) = x e^{−x} / (6 e^{−5}), for x > 5. Assume g(x) is the density function of an exponential distribution.
Let X and Y be jointly normal and X ∼ N(0, 1), Y ∼ N(1, 4), and ρ(X,Y ) = 1/2. Find a 95% credible interval for X, given Y = 2 is observed.
Assume our data Y given X is distributed Y | X = x ∼ Binomial(n, p = x) and we chose the prior to be X ∼ Beta(α, β). Then the PMF for our data is given by the Binomial PMF, and the PDF of the prior is given by the Beta PDF. a. Show that the posterior distribution is Beta(α + y, β + n − y). b. Write out the PDF for the
Find the average error probability in Problem 13. Problem 13: Suppose that the random variable X is transmitted over a communication channel. Assume that the received signal is given by Y = 2X + W, where W ∼ N(0, σ²) is independent of X. Suppose that X = 1 with probability p, and X = −1 with
When the choice of a prior distribution is subjective, it is often advantageous to choose a prior distribution that will result in a posterior distribution of the same distributional family. When the prior and posterior distributions share the same distributional family, they are called conjugate distributions.
A monitoring system is in charge of detecting malfunctioning machinery in a facility. There are two hypotheses to choose from: H0: There is no malfunction. H1: There is a malfunction. The system notifies a maintenance team if it accepts H1. Suppose that, after processing the data, we obtain P(H1|y)
Let X and Y be jointly normal and X ∼ N(2, 1), Y ∼ N(1, 5), and ρ(X,Y ) = 1/4. Find a 90% credible interval for X, given Y = 1 is observed.
Assume our data Y = (y1, y2, …, yn)ᵀ given X is independent and identically distributed, Y | X = x i.i.d. ∼ Exponential(λ = x), and we chose the prior to be X ∼ Gamma(α, β). a. Find the likelihood function, L(Y; X) = f_{Y1,Y2,…,Yn|X}(y1, y2, …, yn | x). b. Using the likelihood function of
Assume our data Y given X is distributed Y | X = x ∼ Geometric(p = x) and we chose the prior to be X ∼ Beta(α, β). Refer to Problem 18 for the PDF and moments of the Beta distribution. a. Show that the posterior distribution is Beta(α + 1, β + y − 1). b. Write out the PDF for the posterior
Let {Xn, n ∈ Z} be a discrete-time random process, defined as Xn = 2 cos(… + Φ), where Φ ∼ Uniform(0, 2π). a. Find the mean function μX(n). b. Find the correlation function RX(m, n). c. Is Xn a WSS process?
You have 1000 dollars to put in an account with interest rate R, compounded annually. That is, if Xn is the value of the account at year n, then Xn = 1000(1 + R)ⁿ. The value of R is a random variable that is determined when you put the money in the bank, but it does not change after that. In particular, assume
Let {X(t), t ∈ R} be a continuous-time random process, defined as X(t) = A cos(2t + Φ), where A ∼ Uniform(0, 1) and Φ ∼ Uniform(0, 2π) are two independent random variables. a. Find the mean function μX(t). b. Find the correlation function RX(t1, t2). c. Is X(t) a WSS process?
Let {X(n), n ∈ Z} be a WSS discrete-time random process with μX(n) = 1 and RX(m, n) = e^{−(m−n)²}. Define the random process Z(n) as Z(n) = X(n) + X(n − 1), for all n. a. Find the mean function of Z(n), μZ(n). b. Find the autocorrelation function of Z(n), RZ(m, n). c. Is Z(n) a WSS random process?
Let {X(t), t ∈ [0, ∞)} be defined as X(t) = A + Bt, for all t ∈ [0, ∞), where A and B are independent normal N(1, 1) random variables. a. Find all possible sample functions for this random process. b. Define the random variable Y = X(1). Find the PDF of Y. c. Let also Z = X(2). Find E[YZ].
Let g : R ↦ R be a periodic function with period T, i.e., g(t + T) = g(t), for all t ∈ R. Define the random process {X(t), t ∈ R} as X(t) = g(t + U), where U ∼ Uniform(0, T). Show that X(t) is a WSS random process.
Consider the random process {Xn,n = 0, 1, 2,⋯}, in which Xi's are i.i.d. standard normal random variables.1. Write down fXn(x) for n = 0, 1, 2,⋯.2. Write down fXmXn(x1,x2) for m ≠ n.
Find the mean functions for the random processes given in Examples 10.1 and 10.2. Example 10.1: You have 1000 dollars to put in an account with interest rate R, compounded annually. That is, if Xn is the value of the account at year n, then Xn = 1000(1 + R)ⁿ. The value of R is a random variable that is determined when
Let {X(t), t ∈ R} and {Y(t), t ∈ R} be two independent random processes. Let Z(t) be defined as Z(t) = X(t)Y(t). Prove the following statements: a. μZ(t) = μX(t)μY(t), for all t ∈ R. b. RZ(t1, t2) = RX(t1, t2)RY(t1, t2), for all t1, t2 ∈ R. c. If X(t) and Y(t) are WSS, then they are jointly WSS. d. If X(t) and
Find the correlation functions and covariance functions for the random processes given in Examples 10.1 and 10.2. Example 10.1: You have 1000 dollars to put in an account with interest rate R, compounded annually. That is, if Xn is the value of the account at year n, then Xn = 1000(1 + R)ⁿ. The value of R is a random
Let A, B, and C be independent normal N(1, 1) random variables. Let {X(t), t ∈ [0, ∞)} be defined as X(t) = A + Bt, for all t ∈ [0, ∞). Also, let {Y(t), t ∈ [0, ∞)} be defined as Y(t) = … Find RXY(t1, t2) and CXY(t1, t2), for t1, t2 ∈ [0, ∞).
Let X(t) be a Gaussian process such that for all t > s ≥ 0 we have X(t) − X(s) ∼ N(0, t − s). Show that X(t) is mean-square continuous at any time t ≥ 0.
Let X(t) be a WSS Gaussian random process with μX(t) = 1 and RX(τ) = 1 + 4 sinc(τ). a. Find P(1 < X(1) < 2). b. Find P(1 < X(1) < 2, X(2) < 3).
Let X(t) be a zero-mean WSS Gaussian process with RX(τ) = e^{−τ²}, for all τ ∈ R. 1. Find P(X(1) < 1). 2. Find P(X(1) + X(2) < 1).
Consider a WSS random process X(t) with RX(τ) = e^{−a|τ|}, where a is a positive real number. Find the PSD of X(t).
Let X(t) be a Gaussian random process with μX(t) = 0 and RX(t1, t2) = min(t1, t2).Find P(X(4) < 3|X(1) = 1).
Let {X(t), t ∈ R} be a continuous-time random process, defined as X(t) = …, where A0, A1, ⋯, An are i.i.d. N(0, 1) random variables and n is a fixed positive integer. a. Find the mean function μX(t). b. Find the correlation function RX(t1, t2). c. Is X(t) a WSS process? d. Find P(X(1) … e. Is X(t) a Gaussian
Let X(t) be a zero-mean WSS process with RX(τ) = e^{−|τ|}. X(t) is input to an LTI system with |H(f)| = √(1 + 4π²f²) for |f| < 2, and 0 otherwise. Let Y(t) be the output. a. Find μY(t) = E[Y(t)]. b. Find RY(τ). c. Find E[Y(t)²].
In some applications, we need to work with complex-valued random processes. More specifically, a complex random process X(t) can be written as X(t) = Xr(t) + jXi(t), where Xr(t) and Xi(t) are two real-valued random processes and j = √−1. We define the mean function and the autocorrelation
Let {X(t), t ∈ R} be a continuous-time random process. The time average mean of X(t) is defined as (assuming that the limit exists in the mean-square sense) ⟨X(t)⟩ = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt. Consider the random process {X(t), t ∈ R} defined as X(t) = …, where U ∼ Uniform(0, 2π). Find ⟨X(t)⟩.
Let X(t) be a zero-mean Gaussian random process with RX(τ) = 8 sinc(4τ). Suppose that X(t) is input to an LTI system with transfer function H(f) = 2 for |f| < …, and 0 otherwise. If Y(t) is the output, find P(Y(2) …
Let X(t) be a white Gaussian noise process that is input to an LTI system with transfer function |H(f)| = 2 for |f| < 1, and 0 otherwise. If Y(t) is the output, find P(Y(1) < 0).
Let X(t) be a WSS process. We say that X(t) is mean ergodic if ⟨X(t)⟩ (defined above) is equal to μX. Let A0, A1, A−1, A2, A−2, ⋯ be a sequence of i.i.d. random variables with mean EAi = μ. Define X(t) = ∑_{n=−∞}^{∞} An g(t − n), where g(t) = 1 for 0 < t < 1 and 0 otherwise. Show that X(t) is mean ergodic.
Let {X(t), t ∈ R} be a WSS random process. Show that for any α > 0, we have P(|X(t + τ) − X(t)| > α) ≤ (2RX(0) − 2RX(τ)) / α².
Let {X(t), t ∈ R} be a WSS random process. Suppose that RX(τ) = RX(0) for some τ > 0. Show that, for any t, we have X(t + τ) = X(t), with probability one.
Let X(t) be a real-valued WSS random process with autocorrelation function RX(τ). Show that the Power Spectral Density (PSD) of X(t) is given by SX(f) = ∫_{−∞}^{∞} RX(τ) cos(2πfτ) dτ.
Let X(t) be a WSS process with autocorrelation function RX(τ) = 1 / (1 + π²τ²). Assume that X(t) is input to a low-pass filter with frequency response H(f) = … Let Y(t) be the output. a. Find SX(f). b. Find SXY(f). c. Find SY(f). d. Find E[Y(t)²].
Let X(t) and Y(t) be real-valued jointly WSS random processes. Show that SYX(f) = S*XY(f), where ∗ denotes the complex conjugate.
Let X(t) be a WSS process with autocorrelation function RX(τ) = 1 + δ(τ). Assume that X(t) is input to an LTI system with impulse response h(t) = … Let Y(t) be the output. a. Find SX(f). b. Find SXY(f). c. Find RXY(τ). d. Find SY(f). e. Find RY(τ). f. Find E[Y(t)²].
Let X(t) be a zero-mean WSS Gaussian random process with RX(τ) = e^{−πτ²}. Suppose that X(t) is input to an LTI system with transfer function |H(f)| = … Let Y(t) be the output. a. Find μY. b. Find RY(τ) and Var(Y(t)). c. Find E[Y(3) | Y(1) = −1]. d. Find Var(Y(3) | Y(1) = −1). e. Find P(Y(3) …
Let X(t) be a white Gaussian noise with SX(f) = N0/2. Assume that X(t) is input to a bandpass filter with frequency response H(f) = 2 for 1 < |f| < …, and 0 otherwise. Let Y(t) be the output. a. Find SY(f). b. Find RY(τ). c. Find E[Y(t)²].
The number of customers arriving at a grocery store can be modeled by a Poisson process with intensity λ = 10 customers per hour. 1. Find the probability that there are 2 customers between 10:00 and 10:20. 2. Find the probability that there are 3 customers between 10:00 and 10:20 and 7 customers
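Part 1 reduces to a single Poisson PMF evaluation; a Python sketch (an illustration, not the textbook's solution):

```python
import math

# With rate 10 customers/hour, the count in a 20-minute window is
# Poisson with mean mu = 10 * (20/60) = 10/3; counts on disjoint
# intervals are independent, so part 2 is a product of two PMF values.
def pois_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

print(round(pois_pmf(2, 10 / 3), 4))   # P(2 customers in 20 minutes)
```

The independent-increments property is what lets the two-interval probability in part 2 factor into a product.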
The number of orders arriving at a service facility can be modeled by a Poisson process with intensity λ = 10 orders per hour.a. Find the probability that there are no orders between 10:30 and 11.b. Find the probability that there are 3 orders between 10:30 and 11 and 7 orders between 11:30 and 12.
Let N(t) be a Poisson process with intensity λ = 2, and let X1, X2, ⋯ be the corresponding interarrival times.a. Find the probability that the first arrival occurs after t = 0.5, i.e., P(X1 > 0.5).b. Given that we have had no arrivals before t = 1, find P(X1 > 3).c. Given that the third arrival
In this problem, our goal is to complete the proof of the equivalence of the first and the second definitions of the Poisson process. More specifically, suppose that the counting process {N(t), t ∈ [0, ∞)} satisfies all the following conditions: 1. N(0) = 0. 2. N(t) has independent and stationary increments.
Let X ∼ Poisson(μ1) and Y ∼ Poisson(μ2) be two independent random variables. Define Z = X + Y. Show that X | Z = n ∼ Binomial(n, μ1 / (μ1 + μ2)).
Let {N(t), t ∈ [0,∞)} be a Poisson process with rate λ. Find the probability that there are two arrivals in (0, 2] or three arrivals in (4, 7].
Consider the Markov chain shown in Figure 11.7. a. Find P(X4 = 3 | X3 = 2). b. Find P(X3 = 1 | X2 = 1). c. If we know P(X0 = 1) = 1/3, find P(X0 = 1, X1 = 2). d. If we know P(X0 = 1) = 1/3, find P(X0 = 1, X1 = 2, X2 = 3). Figure 11.7 - A state transition diagram.
Let N1(t) and N2(t) be two independent Poisson processes with rates λ1 and λ2, respectively. Let N(t) = N1(t) + N2(t) be the merged process. Show that, given an arrival in the merged process, it belongs to N1(t) with probability λ1/(λ1 + λ2) and belongs to N2(t) with probability λ2/(λ1 + λ2).
Consider a system that can be in one of two possible states, S = {0, 1}. In particular, suppose that the transition matrix is given by P = … Suppose that the system is in state 0 at time n = 0, i.e., X0 = 0. a. Draw the state transition diagram. b. Find the probability that the system is in state 1 at time
Let {N(t), t ∈ [0, ∞)} be a Poisson process with rate λ. Let T1, T2, ⋯ be the arrival times for this process. Show that f_{T1, T2, …, Tn}(t1, t2, …, tn) = λⁿ e^{−λ tn}, for 0 < t1 < t2 < ⋯ < tn. One way to show the above result is to show that for sufficiently small Δi, we have …
Consider the Markov chain shown in Figure 11.9. It is assumed that when there is an arrow from state i to state j, then pij > 0. Find the equivalence classes for this Markov chain. Figure 11.9 - A state transition diagram.
Show that in a finite Markov chain, there is at least one recurrent class.
Consider the Markov chain in Example 11.6. a. Is Class 1 = {state 1, state 2} aperiodic? b. Is Class 2 = {state 3, state 4} aperiodic? c. Is Class 4 = {state 6, state 7, state 8} aperiodic? Example 11.6: Consider the Markov chain shown in Figure 11.9. It is assumed that when there is an arrow from state
Let {N(t), t ∈ [0, ∞)} be a Poisson process with rate λ. Show the following: given that N(t) = n, the n arrival times have the same joint CDF as the order statistics of n independent Uniform(0, t) random variables. To show this, you can show that f_{T1, T2, …, Tn | N(t) = n}(t1, t2, …, tn) = n!/tⁿ, for 0 < t1 < t2 < ⋯ < tn ≤ t.
For the Markov chain given in Figure 11.12, answer the following questions: How many classes are there? For each class, mention if it is recurrent or transient. Figure 11.12 - A state transition diagram.
Let {N(t), t ∈ [0, ∞)} be a Poisson process with rate λ. Let T1, T2, ⋯ be the arrival times for this process. Find E[T1 + T2 + ⋯ + T10 | N(4) = 10].
Consider the Markov chain in Figure 11.12. Let's define bi as the absorption probability in state 3 if we start from state i. Use the above procedure to obtain bi for i = 0, 1, 2, 3. Figure 11.12 - A state transition diagram.
Consider a Markov chain with two possible states, S = {0, 1}. In particular, suppose that the transition matrix is given by P = [1 − a, a; b, 1 − b], where a and b are two real numbers in the interval [0, 1] such that 0 < a + b < 2. a. Using induction (or any other method), show that … b. Show that … c. Show that …
Consider the Markov chain shown in Figure 11.13. Let tk be the expected number of steps until the chain hits state 1 for the first time, given that X0 = k. Clearly, t1 = 0. Also, let r1 be the mean return time to state 1. 1. Find t2 and t3. 2. Find r1. Figure 11.13 - A state transition diagram.
Two teams A and B play a soccer match. The number of goals scored by Team A is modeled by a Poisson process N1(t) with rate λ1 = 0.02 goals per minute, and the number of goals scored by Team B is modeled by a Poisson process N2(t) with rate λ2 = 0.03 goals per minute. The two processes are assumed to be independent.
In Problem 10, find the probability that Team B scores the first goal. That is, find the probability that at least one goal is scored in the game and the first goal is scored by Team B.Problem 10Two teams A and B play a soccer match. The number of goals scored by Team A is modeled by a Poisson
Consider the Markov chain of Example 11.12: a Markov chain with two possible states, S = {0, 1}, and transition matrix P = [1 − a, a; b, 1 − b], where a and b are two real numbers in the interval [0, 1] such that 0 < a + b < 2. Find r0 and r1 for this Markov chain. Example 11.12: Consider a Markov chain with two possible states, S = {0, 1}. In
Let {N(t), t ∈ [0, ∞)} be a Poisson process with rate λ. Let p : [0, ∞) ↦ [0, 1] be a function. Here we divide N(t) into two processes N1(t) and N2(t) in the following way. For each arrival, a coin with P(H) = p(t) is tossed. If the coin lands heads up, the arrival is sent to the first process
Let α0, α1, ⋯ be a sequence of nonnegative numbers such that ∑_{j=0}^{∞} αj = 1. Consider a Markov chain X0, X1, X2, ⋯ with the state space S = {0, 1, 2, ⋯} such that pij = αj for all i, j ∈ S. Show that X1, X2, ⋯ is a sequence of i.i.d. random variables.
Consider the Markov chain with three states, S = {1, 2, 3}, whose state transition diagram is shown in Figure 11.31. Suppose P(X1 = 1) = 1/2 and P(X1 = 2) = 1/4. a. Find the state transition matrix for this chain. b. Find P(X1 = 3, X2 = 2, X3 = 1). c. Find P(X1 = 3, X3 = 1). Figure 11.31 - A state transition diagram.
Consider the Markov chain of Example 11.12: a Markov chain with two possible states, S = {0, 1}, and transition matrix P = [1 − a, a; b, 1 − b], where a and b are two real numbers in the interval [0, 1] such that 0 < a + b < 2. Example 11.12: Consider a Markov chain with two possible states, S = {0, 1}. In particular, suppose that the
Consider the Markov chain shown in Figure 11.14. a. Is this chain irreducible? b. Is this chain aperiodic? c. Find the stationary distribution for this chain. d. Is the stationary distribution a limiting distribution for the chain? Figure 11.14 - A state transition diagram.
Consider the Markov chain in Figure 11.32. There are two recurrent classes, R1 = {1, 2} and R2 = {5, 6, 7}. Assuming X0 = 4, find the probability that the chain gets absorbed into R1. Figure 11.32 - A state transition diagram.
Consider the Markov chain of Problem 16. Again assume X0 = 4. We would like to find the expected time (number of steps) until the chain gets absorbed in R1 or R2. More specifically, let T be the absorption time, i.e., the first time the chain visits a state in R1 or R2. We would like to find E[T | X0 = 4].
Consider a continuous-time Markov chain with two states S = {0, 1}. Assume the holding time parameters are given by λ0 = λ1 = λ > 0. That is, the time that the chain spends in each state before going to the other state has an Exponential(λ) distribution. a. Draw the state diagram of the embedded chain.
Consider the Markov chain shown in Figure 11.15. Assume that 0 < p < 1/2. Does this chain have a limiting distribution?
Consider the Markov chain shown in Figure 11.33. Assume X0 = 2, and let N be the first time that the chain returns to state 2, i.e., N = min{n ≥ 1 : Xn = 2}. Find E[N | X0 = 2].
Consider the continuous-time Markov chain of Example 11.17: a chain with two states S = {0, 1} and λ0 = λ1 = λ > 0. In that example, we found that the transition matrix for any t ≥ 0 is given by P(t) = … Find the stationary distribution π for this chain. Example 11.17: Consider a continuous-time Markov chain