1. d (Dimensionality reduction aids compression by reducing the number of features, and removes noise by keeping only the most informative components.)
2. c & d (Both face recognition and gene expression analysis use dimensionality reduction. If only one answer is allowed, the question itself is likely flawed, since c and d are both correct.)
3. a (Among the listed options, image classification best matches labeling unlabeled data, although strictly speaking it is a supervised learning task.)
4. b (Microarray analysis produces very high-dimensional data, typically thousands of genes measured on few samples, which calls for techniques like PCA.)
5. b (Feature selection retains a subset of the original features, whereas feature reduction constructs new features by combining the originals.)
6. b (The wrapper model evaluates candidate feature subsets by training and scoring a classification algorithm on each; see the sketch below.)
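A minimal sketch of the wrapper idea using scikit-learn's SequentialFeatureSelector; the Iris data, the k-nearest-neighbors estimator, and the parameter choices are illustrative assumptions, not part of the question:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
# Wrapper model: each candidate feature subset is scored by actually
# training and cross-validating the classifier on it.
sfs = SequentialFeatureSelector(
    KNeighborsClassifier(n_neighbors=3),
    n_features_to_select=2,
    direction="forward",
    cv=5,
)
sfs.fit(X, y)
print(sfs.get_support())  # boolean mask of the selected features
```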
7. b (PCA is a dimensionality reduction technique: it projects the data onto the directions of greatest variance.)
8. c (MRMR, minimum Redundancy Maximum Relevance, maximizes each feature's relevance to the target while minimizing redundancy among the selected features; see the sketch below.)
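Because MRMR is a greedy criterion, it can be sketched in a few lines. The toy implementation below uses synthetic data and the difference form of the criterion (both assumptions on my part, not a library API) to make the relevance/redundancy trade-off explicit:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

X, y = make_classification(n_samples=200, n_features=8, random_state=0)
relevance = mutual_info_classif(X, y, random_state=0)  # MI(feature; target)

selected = [int(np.argmax(relevance))]   # start with the most relevant feature
while len(selected) < 3:                 # pick 3 features in total
    best_j, best_score = None, -np.inf
    for j in range(X.shape[1]):
        if j in selected:
            continue
        # Redundancy: mean MI between candidate j and already-selected features.
        redundancy = np.mean([
            mutual_info_regression(X[:, [k]], X[:, j], random_state=0)[0]
            for k in selected
        ])
        score = relevance[j] - redundancy  # MRMR difference criterion
        if score > best_score:
            best_j, best_score = j, score
    selected.append(best_j)

print("Selected feature indices:", selected)
```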
9. c (Scikit-learn provides PCA via sklearn.decomposition.PCA.)
10. a (Standardizing gives every feature zero mean and unit variance, so no feature dominates the principal components merely because of its scale; the sketch below combines this with sklearn.decomposition.PCA.)
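A short sketch tying questions 9 and 10 together; the Iris data and the choice of two components are illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)
# Standardize first so each feature has mean 0 and variance 1;
# otherwise features with large units dominate the components.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(X_std)
X_2d = pca.transform(X_std)
print(pca.explained_variance_ratio_)  # variance captured by each component
```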
11. b (Information gain is a filter-model criterion: it scores each feature's relevance to the target without involving any classifier; see the sketch below.)
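For contrast with the wrapper sketch under question 6, here is the filter model in code: features are scored by mutual information, an information-gain-style measure, with no classifier in the loop. The dataset and the value of k are assumptions for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_iris(return_X_y=True)
# Filter model: score each feature against the target, independently
# of any downstream classifier, and keep the top k.
selector = SelectKBest(score_func=mutual_info_classif, k=2).fit(X, y)
print(selector.scores_)          # per-feature relevance scores
X_reduced = selector.transform(X)
```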
12. b (Lambda functions are anonymous functions defined with the lambda keyword.)
13. a (Correct syntax: lambda a, b: a + b.)
14. a (A lambda can be invoked immediately after definition: (lambda a, b: a*b)(5, 3) evaluates to 15; see the combined examples below.)
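The three lambda answers in one runnable snippet:

```python
# Question 13: correct definition syntax (named here only for illustration;
# lambdas are typically passed inline rather than assigned).
add = lambda a, b: a + b
print(add(2, 3))                   # 5

# Question 14: a lambda invoked immediately after definition.
print((lambda a, b: a * b)(5, 3))  # 15

# Typical idiomatic use: an anonymous key function.
print(sorted(["banana", "fig", "apple"], key=lambda s: len(s)))
```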
15. N/A (Question 15 missing in provided text.)
16. b (The validation set is used for tuning hyperparameters; the test set is reserved for the final performance estimate.)
17. b (False: the test set must be kept separate from the training data; a split recipe follows.)
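A common way to obtain all three sets with scikit-learn is two successive splits; the 60/20/20 proportions are an assumed convention, not mandated by the question:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
# First hold out a test set that the model never sees during development...
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# ...then split the remainder into training and validation sets.
X_train, X_val, y_train, y_val = train_test_split(
    X_rest, y_rest, test_size=0.25, random_state=0
)  # 0.25 of the remaining 80% = 20% of the data for validation
```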
18. d (Classification uses predefined classes.)
19. a (True: clustering discovers previously unknown groups in unlabeled data; see the sketch below.)
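A minimal clustering sketch, assuming k-means on the Iris features with the labels deliberately ignored:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X, _ = load_iris(return_X_y=True)   # labels ignored: clustering is unsupervised
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(km.labels_[:10])              # discovered group assignments, no predefined classes
```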
20. a (A smaller k, e.g. 3, captures local patterns better but is more sensitive to noise; the best value is context-dependent and is usually chosen by validation, as in the sketch below.)
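One way to make the choice of k less of a guess is cross-validation; the dataset and the candidate values are assumptions for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
# Compare candidate values of k by mean cross-validated accuracy.
for k in (3, 7, 15):
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    print(f"k={k}: mean CV accuracy = {acc:.3f}")
```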