🤖 Top AI Interview Questions: Part-4
Continue your AI interview prep with advanced machine learning & AI concepts.
- Decision Tree: Flowchart-like structure where internal nodes are feature tests, branches are outcomes, and leaves are final decisions.
- Random Forest: Ensemble of decision trees trained on different subsets of data/features. Improves accuracy and reduces overfitting by averaging tree outputs.
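The two ideas above can be tried in a few lines. A minimal sketch using scikit-learn's built-in iris dataset (the library and dataset choice are illustrative, not from the article):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# 100 trees, each fit on a bootstrap sample with a random feature subset;
# the forest averages (votes over) the individual tree predictions
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

A single `DecisionTreeClassifier` on the same split would typically score slightly lower and vary more from run to run, which is exactly the variance reduction the ensemble buys.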
SVM (Support Vector Machine) is a supervised learning model that finds the hyperplane separating classes with the maximum margin.
It handles both linear and non-linear data by using kernel functions (e.g., RBF, polynomial).
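To see the kernel trick in action, compare a linear and an RBF-kernel SVM on data that no straight line can separate. A sketch assuming scikit-learn and its `make_moons` toy dataset (both are illustrative choices, not from the article):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Two interleaving half-moons: not linearly separable
X, y = make_moons(n_samples=200, noise=0.15, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf", gamma=2.0).fit(X, y)

# The RBF kernel captures the curved boundary a flat hyperplane cannot
print(linear.score(X, y), rbf.score(X, y))
```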
Ensemble learning combines predictions from multiple models to improve overall performance.
Benefits:
- Increases robustness
- Reduces overfitting
- Bagging: Trains models independently on random samples (e.g., Random Forest). Reduces variance.
- Boosting: Trains models sequentially; each corrects the previous (e.g., XGBoost). Reduces bias.
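The bagging/boosting contrast can be sketched side by side with scikit-learn (a hypothetical setup; the article names Random Forest and XGBoost, but `BaggingClassifier` and `AdaBoostClassifier` are used here as stand-ins for the two families):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Bagging: 50 independent trees on bootstrap samples (variance reduction)
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
# Boosting: 50 sequential weak learners, each reweighting the errors
# of the previous ones (bias reduction)
boost = AdaBoostClassifier(n_estimators=50, random_state=0)
bag.fit(X, y)
boost.fit(X, y)
```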
Cross-validation evaluates model performance by splitting the dataset into k folds: train on (k−1) folds, test on the remaining fold, and repeat k times so every fold serves once as the test set, giving a more reliable estimate than a single split.
- ROC Curve: Plots True Positive Rate vs False Positive Rate.
- AUC: Area under ROC curve. Closer to 1 indicates better performance.
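Both quantities come straight from a classifier's predicted probabilities. A sketch assuming scikit-learn on synthetic data (illustrative, not from the article):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
probs = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

fpr, tpr, _ = roc_curve(y_te, probs)  # points tracing the ROC curve
auc = roc_auc_score(y_te, probs)      # area under it: 0.5 = random, 1.0 = perfect
```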
An autoencoder is an unsupervised neural network used for dimensionality reduction.
It learns to encode the input into a lower-dimensional space and decode it back, training on the reconstruction error.
Applications: denoising, anomaly detection.
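The encode-then-decode idea fits in a few lines of NumPy. A minimal sketch (a linear autoencoder trained by plain gradient descent on random data — all choices here are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))          # toy data: 8-dimensional inputs

# Encoder compresses 8 -> 3, decoder reconstructs 3 -> 8
W_enc = rng.normal(scale=0.1, size=(8, 3))
W_dec = rng.normal(scale=0.1, size=(3, 8))

lr = 0.1
for _ in range(1000):
    code = X @ W_enc                   # low-dimensional encoding
    recon = code @ W_dec               # reconstruction
    err = recon - X
    # Gradient descent on the mean squared reconstruction error
    W_dec -= lr * code.T @ err / len(X)
    W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)

mse = float(np.mean((X - (X @ W_enc) @ W_dec) ** 2))
```

A real autoencoder adds non-linear activations and deeper layers, but the objective — minimize reconstruction error through a bottleneck — is the same.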
GANs (Generative Adversarial Networks) pit two networks against each other:
- Generator: Creates fake data
- Discriminator: Distinguishes real from fake
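The adversarial loop can be sketched in NumPy with the smallest possible pair: a linear generator and a logistic discriminator fighting over 1-D data (every detail here — the target distribution, learning rate, loss gradients — is an illustrative hand-derived sketch, not a production GAN):

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
real = lambda n: rng.normal(4.0, 0.5, n)   # "real" data: N(4, 0.5)

a, b = 1.0, 0.0   # generator g(z) = a*z + b, starts producing N(0, 1)
w, c = 0.0, 0.0   # discriminator D(x) = sigmoid(w*x + c)
lr, batch = 0.05, 64

for _ in range(2000):
    # --- discriminator step: push D(real) -> 1 and D(fake) -> 0 ---
    x_real, z = real(batch), rng.normal(size=batch)
    x_fake = a * z + b
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w += lr * (np.mean((1 - d_real) * x_real) - np.mean(d_fake * x_fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))
    # --- generator step: push D(fake) -> 1 (non-saturating loss) ---
    z = rng.normal(size=batch)
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)
```

After training, the generator's offset `b` has been pulled toward the real mean of 4 — the generator learned to imitate the data purely from the discriminator's feedback.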
- LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) are RNN variants that use gating to mitigate the vanishing gradient problem.
- Used in sequence modeling (text, time series).
- GRU: faster, fewer parameters; LSTM: more expressive.
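The "fewer parameters" point is visible in the equations: a GRU has two gates where an LSTM has three plus a separate cell state. A minimal NumPy sketch of one GRU step (weight shapes and the random toy sequence are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: two gates (update z, reset r) vs. LSTM's three."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)              # update gate: how much old state to keep
    r = sigmoid(x @ Wr + h @ Ur)              # reset gate: how much history to use
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1 - z) * h_tilde + z * h          # interpolate old and new

rng = np.random.default_rng(0)
d_in, d_h = 4, 6
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_in, d_h), (d_h, d_h)] * 3]

# Run a length-10 toy sequence through the cell
h = np.zeros(d_h)
for _ in range(10):
    h = gru_cell(rng.normal(size=d_in), h, params)
```

Because the update gate interpolates between the old and candidate states, gradients can flow through the `z * h` path largely unattenuated — that is the fix for vanishing gradients.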
Natural Language Processing (NLP) enables machines to understand, interpret, and generate human language.
Applications:
- Chatbots
- Sentiment analysis
- Language translation
- Text summarization
- Speech recognition
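One of these applications, sentiment analysis, can be demonstrated end to end with a bag-of-words pipeline. A toy sketch assuming scikit-learn, with a tiny made-up labeled corpus (1 = positive, 0 = negative):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical mini-corpus, purely for illustration
texts = ["great movie, loved it", "terrible plot, boring",
         "wonderful acting", "awful and dull",
         "loved the soundtrack", "boring and awful pacing"]
labels = [1, 0, 1, 0, 1, 0]

# CountVectorizer turns text into word-count vectors;
# logistic regression learns which words signal each class
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["loved it, wonderful"]))
```

Modern systems replace the count vectors with learned embeddings and transformers, but the task framing — text in, label out — is the same.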