{
  "title": "Bagging Mastery: 100 MCQs",
  "description": "A comprehensive set of multiple-choice questions designed to test and deepen your understanding of Bagging (Bootstrap Aggregating), starting with easy-level concepts (1–30).",
  "questions": [
    {
      "id": 1,
      "questionText": "What does Bagging stand for in ensemble learning?",
      "options": [
        "Bootstrap Aggregating",
        "Bayesian Aggregation",
        "Binary Aggregation",
        "Batch Averaging"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging stands for Bootstrap Aggregating. It improves model stability and accuracy by training multiple models on random subsets of the dataset and aggregating their predictions."
    },
    {
      "id": 2,
      "questionText": "What is the main purpose of Bagging?",
      "options": [
        "Increase complexity",
        "Normalize data",
        "Reduce variance",
        "Reduce bias"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging reduces variance by averaging predictions from multiple models trained on different bootstrap samples, helping improve model stability."
    },
    {
      "id": 3,
      "questionText": "Which type of models is Bagging most effective with?",
      "options": [
        "Clustering models",
        "Linear models only",
        "High variance models",
        "High bias models"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging is especially effective with high variance models (like decision trees) because averaging multiple models reduces variance and prevents overfitting."
    },
    {
      "id": 4,
      "questionText": "How are the datasets generated in Bagging?",
      "options": [
        "By splitting features into groups",
        "By normalizing the original dataset",
        "By removing outliers",
        "By randomly sampling with replacement"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging uses bootstrap sampling, which randomly selects samples with replacement to create multiple training datasets for each model in the ensemble."
    },
    {
      "id": 5,
      "questionText": "In Bagging, how is the final prediction typically made?",
      "options": [
        "By using the last trained model only",
        "By averaging or majority voting",
        "By multiplying predictions",
        "By choosing the first model’s output"
      ],
      "correctAnswerIndex": 1,
      "explanation": "The final prediction in Bagging is usually made by averaging the outputs for regression tasks or majority voting for classification tasks."
    },
    {
      "id": 6,
      "questionText": "Which of the following is NOT a benefit of Bagging?",
      "options": [
        "Reduces overfitting",
        "Improves prediction stability",
        "Reduces bias significantly",
        "Reduces variance"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging primarily reduces variance. It may slightly reduce bias, but it does not significantly reduce bias. Other ensemble methods like boosting are better for bias reduction."
    },
    {
      "id": 7,
      "questionText": "Which algorithm is commonly used with Bagging?",
      "options": [
        "Naive Bayes",
        "Linear Regression",
        "Decision Trees",
        "K-Means"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Decision Trees are commonly used with Bagging because they have high variance, and Bagging reduces this variance effectively."
    },
    {
      "id": 8,
      "questionText": "What is the main difference between Bagging and a single model?",
      "options": [
        "Bagging uses multiple models to reduce variance",
        "Bagging uses only one model",
        "Bagging removes all data randomness",
        "Bagging increases overfitting intentionally"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging trains multiple models on different random subsets and aggregates their predictions, which reduces variance compared to a single model."
    },
    {
      "id": 9,
      "questionText": "Bootstrap samples in Bagging are:",
      "options": [
        "Selected based on feature importance",
        "Always smaller than 10% of dataset",
        "Randomly drawn with replacement",
        "Drawn without replacement"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bootstrap sampling involves selecting data points randomly with replacement, allowing some points to appear multiple times in a sample."
    },
    {
      "id": 10,
      "questionText": "Bagging is mainly used for which type of problem?",
      "options": [
        "Only clustering",
        "Only anomaly detection",
        "Only dimensionality reduction",
        "Classification and regression"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging is an ensemble method applicable to both classification and regression tasks."
    },
    {
      "id": 11,
      "questionText": "In Bagging, increasing the number of models generally:",
      "options": [
        "Increases bias",
        "Makes individual models more complex",
        "Reduces variance and improves stability",
        "Reduces dataset size"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Adding more models to Bagging averages predictions over more models, reducing variance and improving prediction stability."
    },
    {
      "id": 12,
      "questionText": "Which ensemble method uses boosting instead of averaging?",
      "options": [
        "Random Forest",
        "Bagging",
        "Boosting",
        "K-Means"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Boosting is an ensemble method that sequentially trains models, focusing on errors of previous models, rather than averaging independent models like Bagging."
    },
    {
      "id": 13,
      "questionText": "Why does Bagging reduce overfitting in high variance models?",
      "options": [
        "Because it removes data noise",
        "Because it increases bias",
        "Because it averages multiple models’ predictions",
        "Because it uses fewer features"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging reduces overfitting by training multiple models on different samples and averaging their predictions, which stabilizes the output."
    },
    {
      "id": 14,
      "questionText": "Random Forest is a type of:",
      "options": [
        "Bagging with feature randomness",
        "Boosting with weighting",
        "Single decision tree",
        "Dimensionality reduction method"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Random Forest is an extension of Bagging where trees are trained on bootstrap samples and each split considers a random subset of features to reduce correlation among trees."
    },
    {
      "id": 15,
      "questionText": "Which of these is a key requirement for Bagging to be effective?",
      "options": [
        "High variance of base models",
        "High bias of base models",
        "Small dataset size",
        "Single feature only"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging is most effective when base models have high variance; averaging their outputs reduces variance and stabilizes predictions."
    },
    {
      "id": 16,
      "questionText": "Bagging works best with:",
      "options": [
        "Stable learners like linear regression",
        "Clustering models",
        "Unstable learners like decision trees",
        "Dimensionality reduction models"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging reduces variance. Unstable learners with high variance benefit the most, while stable learners like linear regression do not gain much."
    },
    {
      "id": 17,
      "questionText": "How is the randomness introduced in Bagging?",
      "options": [
        "Through normalization",
        "Through adding noise to labels",
        "Through bootstrap sampling of data",
        "Through reducing feature space"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Randomness in Bagging comes from creating multiple bootstrap samples from the original dataset."
    },
    {
      "id": 18,
      "questionText": "In classification with Bagging, the final class is decided by:",
      "options": [
        "Multiplying probabilities",
        "Weighted averaging",
        "Majority voting",
        "Selecting first model output"
      ],
      "correctAnswerIndex": 2,
      "explanation": "For classification, Bagging predicts the class that receives the most votes among all models."
    },
    {
      "id": 19,
      "questionText": "Which of the following statements is TRUE about Bagging?",
      "options": [
        "It decreases dataset size",
        "It reduces variance without greatly affecting bias",
        "It increases variance",
        "It is only used for regression"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging reduces variance by averaging predictions, while bias remains mostly unchanged."
    },
    {
      "id": 20,
      "questionText": "Bagging can be used with which base learners?",
      "options": [
        "Only decision trees",
        "Only clustering models",
        "Any model that benefits from variance reduction",
        "Only linear models"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Any high-variance model can benefit from Bagging, not just decision trees."
    },
    {
      "id": 21,
      "questionText": "Bootstrap samples are the same size as the original dataset. True or False?",
      "options": [
        "False",
        "Depends on the algorithm",
        "True",
        "Depends on features"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Typically, each bootstrap sample has the same number of instances as the original dataset but is sampled with replacement."
    },
    {
      "id": 22,
      "questionText": "Which scenario is ideal for using Bagging?",
      "options": [
        "Small datasets with no noise",
        "Low variance models",
        "High variance models prone to overfitting",
        "Single-feature linear regression"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging helps reduce overfitting in high variance models by averaging predictions from multiple models."
    },
    {
      "id": 23,
      "questionText": "Bagging helps in prediction stability by:",
      "options": [
        "Reducing dataset size",
        "Changing the loss function",
        "Increasing model depth",
        "Reducing fluctuations due to individual models"
      ],
      "correctAnswerIndex": 3,
      "explanation": "By averaging multiple models, Bagging reduces the impact of fluctuations from any single model, improving stability."
    },
    {
      "id": 24,
      "questionText": "Which of these is an ensemble learning technique like Bagging?",
      "options": [
        "Boosting",
        "PCA",
        "Feature Scaling",
        "K-Means"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Boosting is another ensemble learning technique that differs from Bagging by sequentially training models."
    },
    {
      "id": 25,
      "questionText": "Does Bagging always improve model performance?",
      "options": [
        "It only works with linear models",
        "It decreases performance for high variance models",
        "It improves performance if the base model is high variance",
        "It always improves performance"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging improves performance primarily for models with high variance; stable models may not gain significant improvement."
    },
    {
      "id": 26,
      "questionText": "In Bagging, can the same instance appear multiple times in a bootstrap sample?",
      "options": [
        "Yes, due to sampling with replacement",
        "No, each instance appears only once",
        "Only if dataset is small",
        "Depends on features"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bootstrap sampling is done with replacement, so some instances may appear multiple times in the same sample."
    },
    {
      "id": 27,
      "questionText": "Bagging reduces overfitting by:",
      "options": [
        "Adding noise to data",
        "Increasing learning rate",
        "Reducing feature dimensionality",
        "Averaging multiple models trained on different data"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Averaging predictions from multiple models trained on bootstrap samples reduces overfitting and variance."
    },
    {
      "id": 28,
      "questionText": "Which statement is TRUE about Random Forest compared to Bagging?",
      "options": [
        "Random Forest uses only one tree",
        "Random Forest adds feature randomness to Bagging",
        "Random Forest does not use bootstrap sampling",
        "Random Forest uses boosting"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Random Forest is Bagging with additional feature randomness at each split to decorrelate trees."
    },
    {
      "id": 29,
      "questionText": "Which error does Bagging aim to reduce the most?",
      "options": [
        "Feature selection error",
        "Variance",
        "Irreducible error",
        "Bias"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging primarily reduces variance in high-variance models, leading to more stable predictions."
    },
    {
      "id": 30,
      "questionText": "Which type of datasets benefit most from Bagging?",
      "options": [
        "Datasets meant for clustering",
        "Small, perfectly clean datasets",
        "Large datasets with noisy labels",
        "Datasets with single features only"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging is especially useful for large datasets with noisy labels or high variance models to stabilize predictions."
    },
    {
      "id": 31,
      "questionText": "What is the role of the number of estimators (trees) in Bagging?",
      "options": [
        "Increasing it increases bias",
        "It controls feature selection",
        "Increasing it generally reduces variance",
        "It reduces dataset size"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Increasing the number of base models (trees) in Bagging helps in averaging more predictions, which reduces variance and stabilizes the model."
    },
    {
      "id": 32,
      "questionText": "When performing regression with Bagging, which aggregation method is used?",
      "options": [
        "Majority voting",
        "Averaging predictions",
        "Multiplying predictions",
        "Weighted voting"
      ],
      "correctAnswerIndex": 1,
      "explanation": "For regression, Bagging combines predictions by averaging outputs from all models."
    },
    {
      "id": 33,
      "questionText": "Which hyperparameter of base models impacts Bagging performance the most?",
      "options": [
        "Learning rate",
        "Model depth (for decision trees)",
        "Kernel type",
        "Number of classes"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Decision tree depth influences individual model variance. Deep trees are high variance and benefit most from Bagging."
    },
    {
      "id": 34,
      "questionText": "If a Bagging ensemble is underfitting, which approach can help?",
      "options": [
        "Decrease features",
        "Reduce sample size",
        "Reduce number of trees",
        "Increase base model complexity"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Underfitting occurs when models are too simple. Increasing the complexity of base models allows each to capture more patterns, improving ensemble performance."
    },
    {
      "id": 35,
      "questionText": "Bagging can help reduce overfitting caused by:",
      "options": [
        "Irreducible error",
        "Small dataset size",
        "High bias in base learners",
        "High variance in base learners"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging reduces overfitting that arises from high variance models by averaging multiple models trained on bootstrap samples."
    },
    {
      "id": 36,
      "questionText": "How does Bagging impact the training time?",
      "options": [
        "Has no effect",
        "Increases training time linearly with number of models",
        "Decreases training time",
        "Reduces only for regression"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Training multiple models increases computational cost, as each model is trained separately on a bootstrap sample."
    },
    {
      "id": 37,
      "questionText": "Which metric would best evaluate Bagging for classification?",
      "options": [
        "Silhouette Score",
        "Mean Squared Error",
        "Accuracy, F1-score, or AUC",
        "R-squared"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Accuracy, F1-score, and AUC are standard metrics for classification, suitable for evaluating Bagging ensembles."
    },
    {
      "id": 38,
      "questionText": "Bagging helps in scenarios where the model is:",
      "options": [
        "High variance but low bias",
        "Low bias and low variance",
        "Unsupervised",
        "High bias but low variance"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging is most beneficial for high variance models; it averages predictions to reduce variance while bias remains low."
    },
    {
      "id": 39,
      "questionText": "If bootstrap samples are too small, what is likely to happen?",
      "options": [
        "Bias decreases",
        "Variance reduction decreases",
        "Model becomes unsupervised",
        "Training time increases"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Smaller bootstrap samples provide less diversity and reduce the effectiveness of variance reduction in Bagging."
    },
    {
      "id": 40,
      "questionText": "Bagging can be combined with which technique for better performance?",
      "options": [
        "Normalization only",
        "PCA without ensemble",
        "Single linear regression",
        "Random feature selection (Random Forest)"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Combining Bagging with random feature selection, as in Random Forests, further decorrelates trees and improves performance."
    },
    {
      "id": 41,
      "questionText": "Which of the following is true about Bagging and bias?",
      "options": [
        "Bias is irrelevant",
        "Bias may remain mostly unchanged",
        "Bias increases significantly",
        "Bias is always reduced"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging primarily reduces variance. Bias generally remains the same because base learners are not modified."
    },
    {
      "id": 42,
      "questionText": "In Bagging, how are outliers in the training data handled?",
      "options": [
        "They are removed automatically",
        "They cause model to ignore majority classes",
        "They have no effect",
        "They are partially mitigated by averaging predictions"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Outliers may affect individual models, but averaging predictions reduces their impact on final output."
    },
    {
      "id": 43,
      "questionText": "Bagging with deep trees is preferred over shallow trees because:",
      "options": [
        "Shallow trees overfit more",
        "Shallow trees have high variance",
        "Deep trees reduce bias automatically",
        "Deep trees have higher variance which Bagging reduces"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging reduces variance. Deep trees tend to overfit (high variance), so Bagging stabilizes them."
    },
    {
      "id": 44,
      "questionText": "Which is an advantage of Bagging over a single model?",
      "options": [
        "Faster training",
        "Improved prediction stability",
        "Automatic feature selection",
        "Reduced number of features"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging improves stability and reduces variance by averaging predictions from multiple models."
    },
    {
      "id": 45,
      "questionText": "Bagging can help in which real-world scenario?",
      "options": [
        "Single linear regression on clean data",
        "Unsupervised clustering",
        "Classifying perfectly separable data",
        "Predicting stock prices with high-variance trees"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging is useful in high-variance prediction problems, such as stock price prediction with complex decision trees."
    },
    {
      "id": 46,
      "questionText": "Why might Bagging not improve a linear regression model?",
      "options": [
        "Linear regression is unstable",
        "Bagging cannot be used for regression",
        "It always decreases performance",
        "Linear regression is a low variance model"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Linear regression is a stable, low-variance model. Bagging does not significantly improve performance in such cases."
    },
    {
      "id": 47,
      "questionText": "In Bagging, increasing correlation among base models:",
      "options": [
        "Improves variance reduction",
        "Decreases bias automatically",
        "Does not matter",
        "Reduces ensemble effectiveness"
      ],
      "correctAnswerIndex": 3,
      "explanation": "High correlation among base models reduces the benefit of averaging, making Bagging less effective."
    },
    {
      "id": 48,
      "questionText": "When using Bagging, what should you do to reduce correlation among trees?",
      "options": [
        "Use random subsets of features (Random Forest approach)",
        "Use fewer trees",
        "Increase bootstrap sample size",
        "Use shallow trees only"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Randomly selecting features at each split reduces correlation among trees, enhancing Bagging effectiveness."
    },
    {
      "id": 49,
      "questionText": "Which is true about Bagging in small datasets?",
      "options": [
        "It always works perfectly",
        "It may not improve performance much",
        "It increases model complexity",
        "It reduces bias significantly"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging relies on diverse bootstrap samples. In small datasets, diversity is limited, reducing its effectiveness."
    },
    {
      "id": 50,
      "questionText": "What is the effect of Bagging on model interpretability?",
      "options": [
        "Interpretability increases",
        "Interpretability decreases compared to single model",
        "It simplifies decision trees",
        "No effect"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Ensembling multiple models makes it harder to interpret predictions compared to a single model."
    },
    {
      "id": 51,
      "questionText": "Which combination is commonly used in practice?",
      "options": [
        "Bagging with linear regression on clean data",
        "Bagging with decision trees (Random Forest)",
        "Bagging with K-Means",
        "Bagging with PCA"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging with decision trees, as in Random Forests, is the most common and effective practical implementation."
    },
    {
      "id": 52,
      "questionText": "What is the effect of increasing the number of trees beyond a certain point?",
      "options": [
        "Training time decreases",
        "Overfitting increases",
        "Bias increases",
        "Variance reduction saturates"
      ],
      "correctAnswerIndex": 3,
      "explanation": "After a certain number of trees, adding more provides little additional variance reduction, but training cost increases."
    },
    {
      "id": 53,
      "questionText": "Bagging is more suitable than boosting when:",
      "options": [
        "High variance base learners need stabilization",
        "High bias learners need improvement",
        "The dataset is very small",
        "Features are unimportant"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging reduces variance, while boosting is more focused on reducing bias and sequential learning."
    },
    {
      "id": 54,
      "questionText": "What type of error does Bagging primarily address?",
      "options": [
        "Feature error",
        "Bias",
        "Irreducible error",
        "Variance"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging reduces variance errors by averaging predictions from multiple models."
    },
    {
      "id": 55,
      "questionText": "How can Bagging handle noisy labels?",
      "options": [
        "It removes noisy labels automatically",
        "Noise increases ensemble variance",
        "Averaging reduces the effect of noisy instances",
        "Noise has no effect"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Averaging predictions from multiple models reduces the influence of noise in individual training samples."
    },
    {
      "id": 56,
      "questionText": "In Random Forest, what differentiates it from plain Bagging?",
      "options": [
        "Sequential learning",
        "Random feature selection at each split",
        "Boosting weights",
        "No bootstrap sampling"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Random Forest introduces feature randomness at each split in addition to Bagging to decorrelate trees."
    },
    {
      "id": 57,
      "questionText": "Bagging ensemble predictions are robust because:",
      "options": [
        "Only the first model matters",
        "It reduces bias completely",
        "All models use the same data",
        "Individual model errors cancel out"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Averaging predictions from diverse models helps cancel out individual errors, leading to more robust outputs."
    },
    {
      "id": 58,
      "questionText": "Which is NOT a hyperparameter of Bagging?",
      "options": [
        "Base model type",
        "Learning rate",
        "Bootstrap sample size",
        "Number of estimators"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Learning rate is not a hyperparameter for Bagging; it is used in boosting algorithms."
    },
    {
      "id": 59,
      "questionText": "How does Bagging affect overfitting on noisy datasets?",
      "options": [
        "Does not affect overfitting",
        "Increases overfitting",
        "Reduces overfitting",
        "Eliminates bias completely"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Averaging predictions reduces variance, which helps in reducing overfitting on noisy datasets."
    },
    {
      "id": 60,
      "questionText": "Bagging works best when base models are:",
      "options": [
        "Stable and low variance",
        "Unstable and high variance",
        "Linear regression only",
        "Perfectly accurate"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging reduces variance, so it works best with unstable, high-variance models like decision trees."
    },
    {
      "id": 61,
      "questionText": "Increasing diversity among base learners in Bagging:",
      "options": [
        "Reduces stability",
        "Increases bias",
        "Improves ensemble performance",
        "Has no effect"
      ],
      "correctAnswerIndex": 2,
      "explanation": "More diverse models provide uncorrelated errors, which improves averaging and ensemble performance."
    },
    {
      "id": 62,
      "questionText": "Bagging is considered a parallel ensemble method because:",
      "options": [
        "Feature selection is sequential",
        "Bootstrap samples are dependent",
        "All models are trained independently",
        "Models are trained sequentially"
      ],
      "correctAnswerIndex": 2,
      "explanation": "In Bagging, models are trained independently on different bootstrap samples, allowing parallel computation."
    },
    {
      "id": 63,
      "questionText": "Bagging performance is limited by:",
      "options": [
        "Dataset size",
        "Correlation among base models",
        "Feature normalization",
        "Bias of base models only"
      ],
      "correctAnswerIndex": 1,
      "explanation": "If base models are highly correlated, averaging them does not reduce variance effectively, limiting Bagging performance."
    },
    {
      "id": 64,
      "questionText": "When would increasing bootstrap sample size improve Bagging?",
      "options": [
        "When bias is too low",
        "When individual models are undertrained",
        "When model is overfitting",
        "When using boosting"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Larger bootstrap samples provide better training for each base model, improving overall ensemble performance."
    },
    {
      "id": 65,
      "questionText": "Which scenario reduces Bagging effectiveness?",
      "options": [
        "Large datasets",
        "High variance models",
        "Deep decision trees",
        "Highly correlated base models"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Highly correlated base models reduce the benefit of averaging predictions, making Bagging less effective."
    },
    {
      "id": 66,
      "questionText": "Bagging can be implemented for regression using:",
      "options": [
        "PCA only",
        "Only linear regression",
        "Decision trees or other regressors",
        "Clustering algorithms"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging can be applied with any high variance regressor, commonly decision trees."
    },
    {
      "id": 67,
      "questionText": "How does Bagging affect model variance?",
      "options": [
        "Leaves variance unchanged",
        "Increases variance",
        "Reduces variance",
        "Reduces bias only"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Averaging predictions from multiple models reduces variance compared to individual base models."
    },
    {
      "id": 68,
      "questionText": "Which is true about Bagging and Random Forest?",
      "options": [
        "Random Forest increases bias",
        "Random Forest is sequential boosting",
        "Random Forest is Bagging with feature randomness",
        "Random Forest has no bootstrap"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Random Forest builds on Bagging and adds random feature selection to reduce tree correlation."
    },
    {
      "id": 69,
      "questionText": "What type of learners are less likely to benefit from Bagging?",
      "options": [
        "Stable, low-variance learners",
        "Deep learners",
        "High-variance trees",
        "Noisy models"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Stable, low-variance models already produce consistent predictions; Bagging adds little benefit."
    },
    {
      "id": 70,
      "questionText": "Which factor does NOT influence Bagging effectiveness?",
      "options": [
        "Correlation among models",
        "Feature scaling",
        "Diversity of models",
        "Number of base models"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging effectiveness is influenced by model diversity, correlation, and number of models, but feature scaling does not play a direct role."
    },
    {
      "id": 71,
      "questionText": "You have a high-dimensional dataset with correlated features. How would Bagging performance be affected?",
      "options": [
        "Performance is unaffected",
        "Bias will reduce significantly",
        "Performance may degrade due to correlation among base models",
        "Performance will improve automatically"
      ],
      "correctAnswerIndex": 2,
      "explanation": "High correlation among base models reduces the benefit of averaging, which can degrade Bagging performance. Random feature selection can help mitigate this."
    },
    {
      "id": 72,
      "questionText": "In a dataset with severe class imbalance, how can Bagging be adapted?",
      "options": [
        "Use balanced bootstrap samples or weighted voting",
        "Reduce number of trees",
        "Apply PCA before Bagging",
        "Ignore imbalance as Bagging handles it automatically"
      ],
      "correctAnswerIndex": 0,
      "explanation": "For imbalanced datasets, Bagging can use balanced bootstrap samples or weight the voting process to handle minority classes more effectively."
    },
    {
      "id": 73,
      "questionText": "If Bagging is applied to already overfitted deep trees, what is the likely outcome?",
      "options": [
        "Variance decreases, but predictions may still overfit slightly",
        "Overfitting increases",
        "Bias decreases significantly",
        "Model becomes linear"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging reduces variance of overfitted models, stabilizing predictions, but extreme overfitting may still persist to some extent."
    },
    {
      "id": 74,
      "questionText": "Which is a real-world scenario where Bagging might fail?",
      "options": [
        "High variance decision trees",
        "Noisy datasets",
        "Small datasets with low variance models",
        "Large datasets"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging relies on diversity from bootstrap samples. Small datasets with low variance models do not benefit much, limiting Bagging effectiveness."
    },
    {
      "id": 75,
      "questionText": "How does Bagging compare to boosting in terms of error reduction?",
      "options": [
        "Both reduce variance only",
        "Bagging reduces variance, boosting reduces bias",
        "Bagging reduces bias, boosting reduces variance",
        "Both reduce bias only"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging is designed to reduce variance by averaging predictions, while boosting sequentially reduces bias by focusing on errors."
    },
    {
      "id": 76,
      "questionText": "In a scenario where computation is limited, what trade-off exists for Bagging?",
      "options": [
        "Bias increases automatically",
        "Fewer base models reduce computation but may increase variance",
        "More base models reduce computation",
        "Bootstrap sampling becomes unnecessary"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Reducing the number of models saves computation but decreases variance reduction, which may affect performance."
    },
    {
      "id": 77,
      "questionText": "Bagging is applied to a time-series prediction problem. What caution should be taken?",
      "options": [
        "Bootstrap samples should respect temporal order",
        "Features should be normalized first",
        "Time-series data does not need Bagging",
        "Standard bootstrap is sufficient"
      ],
      "correctAnswerIndex": 0,
      "explanation": "In time-series data, random bootstrap may break temporal relationships. Resampling should maintain temporal order."
    },
    {
      "id": 78,
      "questionText": "When using Bagging with regression trees, which is true about overfitting?",
      "options": [
        "Bagging increases overfitting",
        "Overfitting is only reduced if dataset is small",
        "Bagging has no effect on overfitting",
        "Bagging reduces overfitting due to variance averaging"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging averages predictions from multiple high-variance trees, reducing overfitting by stabilizing the output."
    },
    {
      "id": 79,
      "questionText": "A Bagging model shows poor performance on unseen data. Which is the likely reason?",
      "options": [
        "Base models are biased or low variance",
        "Random feature selection is used",
        "Bootstrap sampling is perfect",
        "Number of trees is too high"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging is effective for high variance models. If base models are biased or too simple, Bagging cannot improve generalization much."
    },
    {
      "id": 80,
      "questionText": "Which scenario demonstrates Bagging’s strength?",
      "options": [
        "PCA datasets",
        "Clustering datasets",
        "Small, linear datasets",
        "High variance, non-linear datasets"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging excels with high variance, complex datasets, like non-linear relationships captured by decision trees."
    },
    {
      "id": 81,
      "questionText": "In a real-time prediction system, what is a potential drawback of Bagging?",
      "options": [
        "Prediction latency due to multiple models",
        "Bias increases significantly",
        "Randomness is removed",
        "Bootstrap sampling fails"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging requires predictions from multiple models, which can increase latency in real-time applications."
    },
    {
      "id": 82,
      "questionText": "How can Bagging be optimized for large-scale datasets?",
      "options": [
        "Use a single base model",
        "Avoid bootstrap sampling",
        "Reduce the number of features",
        "Parallelize model training across processors"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Bagging can be parallelized because each model is trained independently, making it scalable for large datasets."
    },
    {
      "id": 83,
      "questionText": "If base models are highly correlated, which approach can improve Bagging?",
      "options": [
        "Reduce tree depth",
        "Use single model",
        "Random feature selection (like Random Forest)",
        "Increase sample size only"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Introducing feature randomness decreases correlation among models, improving Bagging’s effectiveness."
    },
    {
      "id": 84,
      "questionText": "Bagging is applied to image classification with deep trees. Which is a valid advantage?",
      "options": [
        "Reduces variance while capturing complex patterns",
        "Decreases number of features",
        "Removes need for normalization",
        "Reduces dataset size"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Bagging stabilizes predictions from complex trees while still allowing each tree to capture intricate patterns."
    },
    {
      "id": 85,
      "questionText": "Which of the following scenarios benefits least from Bagging?",
      "options": [
        "High variance decision trees",
        "Noisy data with high variance trees",
        "Low variance models like linear regression",
        "Classification tasks with deep trees"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Stable, low-variance models do not gain significant improvement from Bagging, as variance is already low."
    },
    {
      "id": 86,
      "questionText": "How does Bagging handle overfitting in ensemble models?",
      "options": [
        "Ignores overfitting completely",
        "Increases it by adding more models",
        "Reduces it by averaging multiple high variance models",
        "Reduces bias instead of variance"
      ],
      "correctAnswerIndex": 2,
      "explanation": "By averaging predictions from multiple overfitted models, Bagging reduces variance and helps mitigate overfitting."
    },
    {
      "id": 87,
      "questionText": "What is the main difference between Random Forest and standard Bagging?",
      "options": [
        "Random Forest uses boosting instead",
        "Random Forest has no bootstrap samples",
        "Random Forest adds feature randomness at splits",
        "Random Forest reduces bias instead of variance"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Random Forest builds upon Bagging by introducing random feature selection at each split to reduce correlation among trees."
    },
    {
      "id": 88,
      "questionText": "When Bagging is used with regression trees on large noisy datasets, what is the effect?",
      "options": [
        "Training time decreases",
        "Variance is reduced, predictions are more stable",
        "Models always overfit",
        "Bias is eliminated completely"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Bagging averages predictions from multiple trees, reducing variance and stabilizing outputs even in noisy datasets."
    },
    {
      "id": 89,
      "questionText": "In practice, what is a reason to limit the number of trees in Bagging?",
      "options": [
        "Computational cost and diminishing returns on variance reduction",
        "Randomness is lost",
        "Bias increases automatically",
        "Training becomes sequential"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Beyond a certain point, adding more trees does not significantly reduce variance but increases computation."
    },
    {
      "id": 90,
      "questionText": "In which scenario is Bagging most likely to fail?",
      "options": [
        "High-variance decision trees",
        "Large-scale datasets with parallel computation",
        "Low-variance, biased base learners",
        "Noisy datasets"
      ],
      "correctAnswerIndex": 2,
      "explanation": "Bagging reduces variance; it cannot fix high-bias, low-variance models, which limits its effectiveness."
    },
    {
      "id": 91,
      "questionText": "You want to reduce prediction variance for a stock market model using trees. What method should you consider?",
      "options": [
        "Clustering",
        "PCA only",
        "Single linear regression",
        "Bagging ensemble of decision trees"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Stock market predictions are high-variance. Bagging multiple decision trees stabilizes predictions and reduces variance."
    },
    {
      "id": 92,
      "questionText": "For highly correlated features, which Bagging modification helps performance?",
      "options": [
        "Remove bootstrap",
        "Random feature selection at splits",
        "Use shallow trees",
        "Increase number of estimators only"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Random feature selection reduces correlation among trees, improving the effectiveness of Bagging."
    },
    {
      "id": 93,
      "questionText": "Which is a computational challenge with Bagging?",
      "options": [
        "Bias increases automatically",
        "Training multiple models increases time and memory",
        "Overfitting is unavoidable",
        "Bootstrap sampling fails on large datasets"
      ],
      "correctAnswerIndex": 1,
      "explanation": "Training many models independently can be computationally intensive, especially for large datasets."
    },
    {
      "id": 94,
      "questionText": "In a classification problem with Bagging, why might majority voting fail?",
      "options": [
        "If features are normalized",
        "If base models are biased or misclassify the same instances",
        "If dataset is large",
        "If number of trees is too high"
      ],
      "correctAnswerIndex": 1,
      "explanation": "If all base models are biased in the same way, majority voting will not correct the errors."
    },
    {
      "id": 95,
      "questionText": "Bagging is considered robust because:",
      "options": [
        "Outliers have reduced impact due to averaging",
        "Bootstrap samples are ignored",
        "Correlation is increased",
        "Bias is eliminated"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Averaging predictions reduces the effect of outliers, making Bagging more robust to noisy data."
    },
    {
      "id": 96,
      "questionText": "Which scenario illustrates Bagging’s limitation?",
      "options": [
        "Using stable low-variance models where averaging provides minimal gain",
        "Using high variance models",
        "Using noisy datasets",
        "Using parallel computation"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Stable low-variance models do not benefit from Bagging as variance is already low."
    },
    {
      "id": 97,
      "questionText": "In Bagging, if base models perform differently on subsets of data, what is the effect?",
      "options": [
        "Prediction variance decreases and ensemble is more stable",
        "Training time reduces",
        "Ensemble fails completely",
        "Bias increases dramatically"
      ],
      "correctAnswerIndex": 0,
      "explanation": "Diverse base models provide uncorrelated errors; averaging reduces variance and stabilizes predictions."
    },
    {
      "id": 98,
      "questionText": "How can Bagging handle noisy labels in training data?",
      "options": [
        "Models ignore noisy samples",
        "Noise is amplified automatically",
        "Bias is eliminated completely",
        "Averaging predictions reduces the impact of noise"
      ],
      "correctAnswerIndex": 3,
      "explanation": "Averaging predictions from multiple models mitigates the effect of noisy labels in the final output."
    },
    {
      "id": 99,
      "questionText": "Which factor can limit Bagging effectiveness in real-world applications?",
      "options": [
        "Bootstrap sampling",
        "High correlation among base learners",
        "High variance of base learners",
        "Parallel training"
      ],
      "correctAnswerIndex": 1,
      "explanation": "High correlation among base models reduces variance reduction, limiting Bagging performance."
    },
    {
      "id": 100,
      "questionText": "Which is a key consideration when applying Bagging to real-world regression problems?",
      "options": [
        "Bagging always guarantees perfect predictions",
        "Only number of features matters",
        "Base model complexity, number of estimators, and correlation among models",
        "Bootstrap size is irrelevant"
      ],
      "correctAnswerIndex": 2,
      "explanation": "For effective Bagging, you must consider base model complexity, ensemble size, and model correlation to ensure variance reduction and generalization."
    }
  ]
}