Deep Learning - UNIT I & II: MCQs

1. What is the primary goal of Artificial Intelligence?
A. Replace humans
B. Solve arithmetic problems faster
C. Enable machines to mimic human intelligence
D. Store large data efficiently
Answer: C
2. Which algorithm is based on probabilistic modeling?
A. Decision Tree
B. Naive Bayes
C. Perceptron
D. Random Forest
Answer: B
3. The perceptron is mainly used for solving which type of problems?
A. Non-linear classification
B. Regression
C. Linearly separable classification
D. Clustering
Answer: C
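
For illustration, a minimal perceptron sketch in plain NumPy on a made-up linearly separable problem (the logical AND function); this is just the classic update rule, not a production implementation:

```python
import numpy as np

# Toy, linearly separable data: the logical AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

# Classic perceptron learning rule: update only on misclassified points.
for _ in range(20):
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print([int(w @ xi + b > 0) for xi in X])  # expected: [0, 0, 0, 1]
```
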
4. Which of the following is a kernel-based algorithm?
A. Random Forest
B. Support Vector Machine
C. Naive Bayes
D. K-means
Answer: B
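
As a sketch of what "kernel-based" buys you, the following assumes scikit-learn is available and fits an RBF-kernel SVM to XOR-style toy points, which no linear boundary can separate:

```python
from sklearn.svm import SVC

# XOR-like data: not linearly separable, but separable with an RBF kernel.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

clf = SVC(kernel="rbf", gamma=2.0, C=10.0).fit(X, y)
print(clf.predict(X))  # expected: [0 1 1 0]
```
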

5. What is the major problem with decision trees?
A. High computation cost
B. Cannot handle categorical data
C. Overfitting
D. Low accuracy
Answer: C

6. Random Forest is an example of:
A. Boosting
B. Supervised Learning
C. Bagging
D. Clustering
Answer: C
7. Gradient Boosting works by:
A. Averaging multiple trees
B. Combining weak learners sequentially
C. Training a single strong learner
D. Using k-nearest neighbors
Answer: B
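
A rough sketch of that sequential idea for squared-error boosting: each weak learner (a depth-1 tree here, assuming scikit-learn) is fit to the residuals of the ensemble built so far. The data is made up:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel()

learning_rate = 0.1
prediction = np.zeros_like(y)   # start from a zero model
ensemble = []

# Each weak learner is fit to the residual errors of the current ensemble,
# i.e. the weak learners are combined sequentially.
for _ in range(100):
    residual = y - prediction
    stump = DecisionTreeRegressor(max_depth=1).fit(X, residual)
    prediction += learning_rate * stump.predict(X)
    ensemble.append(stump)

print("training MSE:", np.mean((y - prediction) ** 2))
```
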
8. Which of the following is *not* a branch of machine learning?
A. Supervised Learning
B. Unsupervised Learning
C. Reinforcement Learning
D. Active Learning
Answer: D
9. In supervised learning, the training data must include:
A. Only features
B. Only labels
C. Features and labels
D. No labels
Answer: C
10. Which metric is best when classes are imbalanced?
A. Accuracy
B. Precision
C. Recall
D. F1-score
Answer: D
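
A quick worked example of why accuracy can mislead on imbalanced data while F1 does not (the confusion-matrix counts below are hypothetical):

```python
# Hypothetical counts for an imbalanced problem: 95 negatives, 5 positives,
# and a model that finds only 2 of the positives.
tp, fp, fn, tn = 2, 1, 3, 94

accuracy  = (tp + tn) / (tp + tn + fp + fn)                 # 0.96, looks great
precision = tp / (tp + fp)                                  # 2/3 ≈ 0.67
recall    = tp / (tp + fn)                                  # 2/5 = 0.40
f1        = 2 * precision * recall / (precision + recall)   # 0.50, tells the real story

print(accuracy, precision, recall, f1)
```
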
11. Overfitting occurs when:
A. Model performs well on test data
B. Model performs well on training data but poorly on test data
C. Model has high bias
D. Model uses a small dataset
Answer: B
12. Bias-variance tradeoff is used to:
A. Optimize training time
B. Improve model interpretability
C. Balance underfitting and overfitting
D. Select activation function
Answer: C
13. CNNs are mainly used for:
A. Speech recognition
B. Text summarization
C. Image processing
D. Regression tasks
Answer: C
14. Which is the correct order in a feedforward neural network?
A. Input → Output → Hidden
B. Hidden → Input → Output
C. Input → Hidden → Output
D. Output → Input → Hidden
Answer: C
15. Which activation function is most commonly used in hidden layers?
A. Sigmoid
B. Softmax
C. ReLU
D. Tanh
Answer: C
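
A minimal NumPy sketch tying questions 14 and 15 together: activations flow input → hidden → output, with ReLU on the hidden layer. The weights are random placeholders, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))        # input layer: 4 features

W1 = rng.normal(size=(4, 8))       # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 3))       # hidden -> output weights
b2 = np.zeros(3)

def relu(z):
    return np.maximum(z, 0)        # common hidden-layer activation

hidden = relu(x @ W1 + b1)         # input -> hidden
logits = hidden @ W2 + b2          # hidden -> output
print(logits.shape)                # (1, 3)
```
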
16. Which of the following prevents overfitting in deep neural networks?
A. Increasing the learning rate
B. Using a small dataset
C. Dropout regularization
D. Skipping training
Answer: C
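
A sketch of (inverted) dropout as it is typically applied to a hidden activation during training; at inference time it is a no-op. The rate and array shapes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate=0.5, training=True):
    # Inverted dropout: randomly zero units and rescale the survivors so the
    # expected activation is unchanged; do nothing at inference time.
    if not training or rate == 0.0:
        return activations
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

h = np.ones((2, 5))
print(dropout(h))                   # about half the entries zeroed, the rest scaled to 2.0
print(dropout(h, training=False))   # unchanged at test time
```
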
17. Backpropagation is used for:
A. Model evaluation
B. Weight initialization
C. Updating weights
D. Model testing
Answer: C
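
A one-weight sketch of what "updating weights" means: backpropagation supplies the gradient of the loss via the chain rule, and gradient descent applies the update. Data and learning rate are made up:

```python
import numpy as np

# Toy data: y = 3x, which the single weight should learn to approximate.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 3.0 * x

w, lr = 0.0, 0.01
for _ in range(200):
    y_hat = w * x                        # forward pass
    loss = np.mean((y_hat - y) ** 2)     # MSE loss
    grad = np.mean(2 * (y_hat - y) * x)  # backprop: dLoss/dw via the chain rule
    w -= lr * grad                       # gradient-descent weight update

print(round(w, 3))  # close to 3.0
```
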
18. Which optimizer adapts the learning rate during training?
A. SGD
B. Momentum
C. Adam
D. Batch Gradient Descent
Answer: C
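
A sketch of a single Adam update showing where the adaptation happens: the step for each parameter is scaled by running estimates of the gradient's first and second moments (hyperparameters are the usual defaults; the toy loss is illustrative):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Running (exponential) averages of the gradient and its square.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction, then a per-parameter adaptive step.
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 4):
    grad = 2 * w                  # gradient of the toy loss ||w||^2
    w, m, v = adam_step(w, grad, m, v, t)
print(w)
```
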
19. Batch Normalization is used to:
A. Normalize dataset
B. Normalize layer inputs
C. Reduce dataset size
D. Normalize labels
Answer: B
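
A sketch of the batch-norm forward pass at training time: each feature of the layer's input is standardized with the mini-batch mean and variance, then rescaled by learnable gamma and beta (shown at their initial values):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=(32, 4))
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0).round(3), out.std(axis=0).round(3))  # ~0 and ~1 per feature
```
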
20. Transfer learning is effective when:
A. No pretrained models are available
B. Data is abundant
C. Similar tasks exist
D. Models need to be trained from scratch
Answer: C
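
A minimal transfer-learning sketch, assuming PyTorch and torchvision (version 0.13 or newer for the `weights=` argument) are available: load an ImageNet-pretrained backbone, freeze it, and train only a new head for the related task. The 10-class head is an arbitrary example:

```python
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ImageNet (a similar, data-rich task).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the new task with, say, 10 classes.
model.fc = nn.Linear(model.fc.in_features, 10)
```
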
