Question:
Student | GPA | Seminar | Dropped |
1 | 3.78 | 1 | No |
2 | 3.22 | 0 | No |
3 | 2.69 | 0 | Yes |
4 | 3.12 | 0 | No |
5 | 2.96 | 0 | Yes |
6 | 2.47 | 0 | No |
7 | 2.37 | 0 | No |
8 | 2.76 | 0 | Yes |
9 | 3.1 | 0 | No |
10 | 2.86 | 0 | No |
11 | 2.62 | 0 | No |
12 | 3.07 | 0 | No |
13 | 3.3 | 0 | No |
14 | 2.43 | 0 | Yes |
15 | 2.98 | 0 | Yes |
16 | 2.63 | 0 | No |
17 | 2.68 | 0 | Yes |
18 | 2.62 | 0 | Yes |
19 | 2.82 | 1 | No |
20 | 2.38 | 0 | No |
21 | 3.33 | 0 | No |
22 | 2.33 | 0 | Yes |
23 | 2.93 | 0 | Yes |
24 | 3.06 | 0 | No |
25 | 3.98 | 1 | No |
26 | 2.88 | 0 | No |
27 | 3.12 | 0 | No |
28 | 3.12 | 0 | No |
29 | 3.38 | 0 | No |
30 | 3.29 | 1 | No |
31 | 3.49 | 1 | No |
32 | 2.89 | 0 | No |
33 | 2.77 | 0 | No |
34 | 2.61 | 0 | Yes |
35 | 3.1 | 0 | No |
36 | 3.06 | 1 | No |
37 | 2.61 | 0 | Yes |
38 | 2.91 | 1 | No |
39 | 2.49 | 0 | Yes |
40 | 3.45 | 0 | No |
41 | 2.93 | 0 | Yes |
42 | 2.22 | 0 | Yes |
43 | 2.51 | 0 | Yes |
44 | 3.71 | 1 | No |
45 | 2.11 | 0 | Yes |
46 | 2.87 | 0 | No |
47 | 2.64 | 1 | No |
48 | 3.04 | 0 | No |
49 | 2.54 | 0 | Yes |
50 | 2.57 | 0 | No |
51 | 3.85 | 0 | No |
52 | 2.53 | 0 | No |
53 | 3.2 | 1 | No |
54 | 3.12 | 0 | No |
55 | 2.9 | 0 | No |
56 | 3.27 | 0 | No |
57 | 2.73 | 0 | Yes |
58 | 2.7 | 0 | Yes |
59 | 2.29 | 0 | No |
60 | 2.34 | 0 | No |
61 | 3.4 | 0 | No |
62 | 2.62 | 0 | No |
63 | 2.62 | 1 | No |
64 | 2.64 | 0 | No |
65 | 2.14 | 0 | No |
66 | 3.06 | 0 | No |
67 | 2.83 | 0 | Yes |
68 | 3.35 | 1 | No |
69 | 3.33 | 0 | No |
70 | 2.16 | 0 | No |
71 | 2.4 | 0 | No |
72 | 2.51 | 0 | No |
73 | 2.86 | 1 | No |
74 | 3.12 | 0 | No |
75 | 3.38 | 1 | No |
76 | 2.82 | 1 | No |
77 | 3.04 | 0 | No |
78 | 2.78 | 0 | No |
79 | 2.74 | 0 | No |
80 | 2.02 | 0 | No |
81 | 2.56 | 0 | No |
82 | 3.48 | 0 | No |
83 | 2.64 | 0 | Yes |
84 | 2.63 | 0 | No |
85 | 2.11 | 0 | No |
86 | 3.03 | 0 | No |
87 | 2.33 | 0 | No |
88 | 2.01 | 0 | No |
89 | 2.94 | 0 | Yes |
90 | 3.81 | 1 | No |
91 | 2.68 | 0 | No |
92 | 2.58 | 0 | No |
93 | 2.7 | 1 | No |
94 | 2.75 | 0 | Yes |
95 | 2.53 | 0 | No |
96 | 3.08 | 0 | Yes |
97 | 2.78 | 0 | No |
98 | 3.23 | 0 | No |
99 | 3.17 | 0 | No |
100 | 2.4 | 0 | No |
101 | 3.27 | 1 | No |
102 | 2.57 | 0 | Yes |
103 | 3.64 | 1 | No |
104 | 3.03 | 0 | No |
105 | 2.77 | 0 | Yes |
106 | 2.85 | 0 | Yes |
107 | 2.91 | 0 | Yes |
108 | 2.93 | 0 | No |
109 | 3.17 | 0 | No |
110 | 3.45 | 0 | No |
111 | 2.64 | 1 | No |
112 | 3.02 | 0 | No |
113 | 2.9 | 1 | Yes |
114 | 2.31 | 0 | No |
115 | 2.7 | 0 | Yes |
116 | 2.24 | 0 | No |
117 | 3.08 | 0 | No |
118 | 3.12 | 0 | No |
119 | 2.59 | 1 | No |
120 | 2.86 | 0 | No |
121 | 2.7 | 0 | No |
122 | 3.78 | 1 | No |
123 | 3.4 | 0 | No |
124 | 3.46 | 0 | No |
125 | 2.77 | 0 | No |
126 | 2.11 | 0 | No |
127 | 2.86 | 0 | No |
128 | 2.33 | 0 | No |
129 | 3.64 | 1 | No |
130 | 3.07 | 1 | No |
131 | 3.23 | 0 | No |
132 | 2.63 | 0 | No |
133 | 2.41 | 0 | No |
134 | 3.5 | 0 | No |
135 | 3.42 | 0 | No |
136 | 2.69 | 0 | Yes |
137 | 2.99 | 0 | No |
138 | 3.23 | 0 | No |
139 | 2.63 | 0 | No |
140 | 2.79 | 0 | No |
141 | 2.17 | 0 | No |
142 | 2.68 | 0 | Yes |
143 | 3.37 | 0 | No |
144 | 3.31 | 0 | No |
145 | 2.93 | 0 | Yes |
146 | 2.93 | 1 | No |
147 | 2.17 | 0 | No |
148 | 2.26 | 0 | No |
149 | 2.79 | 0 | No |
150 | 2.25 | 0 | No |
151 | 2.47 | 0 | No |
152 | 2.88 | 0 | No |
153 | 2.79 | 0 | Yes |
154 | 3.19 | 0 | No |
155 | 2.66 | 0 | No |
156 | 2.19 | 0 | No |
157 | 2.13 | 0 | No |
158 | 3.15 | 1 | Yes |
159 | 2.41 | 0 | No |
160 | 2.84 | 0 | No |
161 | 2.85 | 0 | No |
162 | 3.03 | 0 | Yes |
163 | 3.1 | 0 | No |
164 | 2.57 | 0 | No |
165 | 3.01 | 0 | No |
166 | 3.01 | 1 | No |
167 | 2.81 | 0 | Yes |
168 | 2.82 | 0 | Yes |
169 | 3.02 | 1 | No |
170 | 2.46 | 0 | No |
171 | 2.63 | 1 | No |
172 | 2.96 | 1 | Yes |
173 | 2.57 | 0 | No |
174 | 2.24 | 0 | No |
175 | 2.62 | 0 | No |
176 | 3.59 | 1 | No |
177 | 3.53 | 1 | No |
178 | 2.14 | 0 | No |
179 | 2.9 | 0 | No |
180 | 3.78 | 1 | No |
181 | 2.92 | 0 | Yes |
182 | 2.17 | 0 | No |
183 | 2.41 | 0 | No |
184 | 2.68 | 1 | No |
185 | 2.19 | 0 | No |
186 | 2.46 | 0 | No |
187 | 3.1 | 0 | No |
188 | 2.82 | 1 | Yes |
189 | 2.29 | 0 | No |
190 | 2.72 | 1 | No |
191 | 2.1 | 0 | No |
192 | 2.54 | 1 | No |
193 | 2.7 | 0 | No |
194 | 3.48 | 1 | No |
195 | 2.77 | 0 | Yes |
196 | 2.67 | 1 | No |
197 | 2.21 | 0 | No |
198 | 2.61 | 0 | No |
199 | 3.42 | 0 | No |
200 | 3.47 | 0 | No |
201 | 2.49 | 0 | No |
202 | 2.62 | 0 | Yes |
203 | 2.27 | 0 | No |
204 | 2.64 | 0 | No |
205 | 2.6 | 1 | Yes |
206 | 2.31 | 0 | No |
207 | 3 | 0 | No |
208 | 2.48 | 0 | No |
209 | 3.65 | 1 | No |
210 | 2.01 | 0 | No |
211 | 3.67 | 1 | No |
212 | 2.32 | 0 | No |
213 | 2.94 | 1 | Yes |
214 | 3.42 | 0 | No |
215 | 3.87 | 1 | No |
216 | 2.71 | 0 | No |
217 | 3.9 | 1 | No |
218 | 2.79 | 0 | No |
219 | 2.19 | 0 | No |
220 | 3.01 | 1 | No |
221 | 3.29 | 0 | No |
222 | 2.41 | 0 | No |
223 | 3.46 | 0 | No |
224 | 2.58 | 1 | No |
225 | 2.48 | 0 | No |
226 | 2.64 | 1 | No |
227 | 2.59 | 0 | No |
228 | 2.16 | 0 | No |
229 | 3.48 | 0 | No |
230 | 2.94 | 0 | No |
231 | 2.92 | 0 | Yes |
232 | 3.41 | 0 | No |
233 | 3.64 | 1 | No |
234 | 2.73 | 0 | No |
235 | 3.06 | 1 | No |
236 | 3.03 | 0 | No |
237 | 2.28 | 0 | No |
238 | 2.72 | 1 | No |
239 | 2.85 | 0 | No |
240 | 3.42 | 0 | No |
241 | 2.16 | 0 | No |
242 | 2.88 | 0 | No |
243 | 2.1 | 0 | No |
244 | 2.94 | 0 | No |
245 | 2.66 | 1 | Yes |
246 | 2.44 | 0 | No |
247 | 2.18 | 0 | No |
248 | 2.69 | 1 | Yes |
249 | 2.93 | 0 | Yes |
250 | 2.82 | 0 | No |
251 | 2.97 | 1 | No |
252 | 2.43 | 0 | No |
253 | 3.69 | 1 | No |
254 | 3.21 | 0 | No |
255 | 3.05 | 0 | Yes |
256 | 2.66 | 0 | Yes |
257 | 3.25 | 0 | No |
258 | 2.82 | 0 | No |
259 | 3.34 | 1 | No |
260 | 3.75 | 1 | No |
261 | 3.14 | 1 | No |
262 | 2.46 | 0 | No |
263 | 3.39 | 0 | No |
264 | 3.14 | 0 | No |
265 | 2.52 | 0 | No |
266 | 3.37 | 0 | No |
267 | 3.67 | 1 | No |
268 | 3.55 | 1 | No |
269 | 3.36 | 0 | No |
270 | 2.1 | 0 | No |
271 | 2.47 | 0 | No |
272 | 2.86 | 0 | Yes |
273 | 3.25 | 0 | No |
274 | 2.89 | 1 | No |
275 | 2.78 | 1 | Yes |
276 | 3.21 | 1 | No |
277 | 2.83 | 1 | No |
278 | 3.24 | 0 | No |
279 | 2.26 | 0 | No |
280 | 2.89 | 0 | Yes |
281 | 2.88 | 1 | Yes |
282 | 2.6 | 0 | No |
283 | 2.24 | 0 | No |
284 | 2.74 | 0 | No |
285 | 3.14 | 1 | No |
286 | 2.71 | 1 | Yes |
287 | 2.62 | 1 | No |
288 | 3.41 | 0 | No |
289 | 2.66 | 0 | Yes |
290 | 2.75 | 0 | No |
291 | 2.74 | 0 | No |
292 | 3.91 | 1 | No |
293 | 3.23 | 0 | Yes |
294 | 2.53 | 1 | Yes |
295 | 3.05 | 0 | No |
296 | 2.72 | 0 | No |
297 | 2.38 | 0 | No |
298 | 2.46 | 0 | No |
299 | 2.59 | 0 | No |
300 | 2.66 | 0 | No |
Over the past few years, the percentage of students who leave Dana College at the end of their first year has increased. Last year, Dana started voluntary one-credit, hour-long seminars with faculty to help first-year students establish an on-campus connection. If Dana can show that the seminars have a positive effect on retention, college administrators will be convinced to continue funding this initiative. Dana's administration also suspects that first-year students with lower high school GPAs have a higher probability of leaving Dana at the end of the first year. Data on the 500 first-year students from last year have been collected. Each observation consists of a first-year student's high school GPA, whether they enrolled in a seminar, and whether they dropped out and did not return to Dana. Apply logistic regression to classify observations as dropped out or not dropped out, using GPA and Seminar as input variables and Dropped as the target (or response) variable.
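Logistic regression models the log odds of the event (here, Dropped = Yes) as a linear function of the inputs, so the fitted model will take the form

Log odds of dropping out = ln(p / (1 − p)) = b0 + b1(GPA) + b2(Seminar),

where p is the estimated probability that a student drops out and b0, b1, b2 are the coefficients reported by the software.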
For part a., in the Data tab of the Rattle GUI - R window, click inside the box next to Filename: and navigate to the location of the file DanaTrain.csv. Select the file DanaTrain.csv, click Open, then click the Execute button. Uncheck the box next to Partition. For the Student variable, select the Ident button. For the GPA and Seminar variables, select the Input button. For the Dropped variable, select the Target button. Next, click the Execute button. In the Model tab, in the Type: row, select the button next to Linear, and then select the button next to Logistic. Click the Execute button.
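If you prefer a script to the GUI, here is a minimal sketch of the equivalent fit in base R; Rattle builds its Logistic model on R's glm(), and the only assumption here is that DanaTrain.csv sits in the working directory:

```r
# Read the training data; Student is an identifier, so it is simply
# never included in the model formula below.
train <- read.csv("DanaTrain.csv")

# Code the target so that "Yes" (dropped out) is the modeled event;
# glm() treats the second factor level as the "success".
train$Dropped <- factor(train$Dropped, levels = c("No", "Yes"))

# Logistic regression of Dropped on GPA and Seminar.
fit <- glm(Dropped ~ GPA + Seminar,
           data = train,
           family = binomial(link = "logit"))

summary(fit)  # coefficients, standard errors, and p-values
```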
To evaluate the performance of a logistic regression model on a validation set, click the Evaluate tab. In the Model: row, select the box next to Linear, and in the Data: row select CSV File. Click inside the box next to CSV File and navigate to the location of the file DanaValidation.csv. Select the file DanaValidation.csv, and click Open.
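Continuing the script sketched above, scoring the validation file looks like this (a sketch; it assumes DanaValidation.csv has the same columns as the training file):

```r
# Read and recode the validation data the same way as the training data.
valid <- read.csv("DanaValidation.csv")
valid$Dropped <- factor(valid$Dropped, levels = c("No", "Yes"))

# Predicted probability of dropping out for each validation student.
p_hat <- predict(fit, newdata = valid, type = "response")

# Classify with a 0.5 cutoff and tabulate the confusion matrix.
pred_class <- ifelse(p_hat >= 0.5, "Yes", "No")
table(Actual = valid$Dropped, Predicted = pred_class)
```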
To generate the ROC chart in the Plots pane of the RStudio interface, in the Evaluate tab, select ROC in the Type: row, and click Execute.
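Outside the GUI, the same curve can be drawn from the validation predictions above. This sketch uses the pROC package, which is an assumption on my part (any ROC package would do, and it must be installed first with install.packages("pROC")):

```r
library(pROC)

# ROC curve for the validation-set predictions; "No" observations
# are the controls and "Yes" observations are the cases.
roc_obj <- roc(response = valid$Dropped, predictor = p_hat,
               levels = c("No", "Yes"))
plot(roc_obj)  # draws the ROC curve in the Plots pane
auc(roc_obj)   # area under the curve
```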
For part b., to generate the boxplot of GPA by Seminar in the Plots pane of the RStudio interface, in the Data tab, for the Seminar variable, select the Target button. For the Dropped variable, select the Input button. Then in the Explore tab, select Distributions in the Type: row, select Seminar in the Group By: box, select the first box next to the GPA variable, and click Execute.
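The GUI steps above reduce to one call in base R graphics; a minimal sketch using the training data frame from earlier:

```r
# Boxplot of high school GPA, grouped by seminar enrollment (0/1).
boxplot(GPA ~ Seminar, data = train,
        names = c("No seminar (0)", "Seminar (1)"),
        xlab = "Seminar enrollment", ylab = "High school GPA")
```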
(a) Evaluate the candidate logistic regression models based on their predictive performance on the validation set. Recommend a final model and express it as a mathematical equation relating the target variable to the input variables. Iteratively remove the least significant independent variable, one at a time, until all independent variables remaining in the model are significant at the 0.10 level of significance. If required, round your answers to four decimal places. If a variable is not used in the model, enter "0" in the blank before the corresponding variable. For negative numbers, use a minus sign even if there is a + sign before the blank. (Example: -300)
Log odds of dropping out = ______ + ______ GPA + ______ Seminar
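The blanks come directly from the fitted coefficients; continuing the script above, a sketch of reading them off at the required precision:

```r
# Coefficient estimates rounded to four decimal places, as the
# answer blanks require.
round(coef(fit), 4)

# If Seminar is not significant at the 0.10 level, refit with GPA
# alone and enter 0 in the Seminar blank:
# fit <- glm(Dropped ~ GPA, data = train, family = binomial)
```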