MCA-20-44(ii): Machine Learning
Type: Elective
Contact Hours: 4 hours/week
Examination Duration: 3 Hours
Mode: Lecture
External Maximum Marks: 75
External Pass Marks: 30 (i.e. 40%)
Internal Maximum Marks: 25
Total Maximum Marks: 100
Total Pass Marks: 40 (i.e. 40%)
Instructions to paper setter for End semester exam:
The total number of questions shall be nine. Question number one will be compulsory and will consist of short/objective-type questions from the complete syllabus. In addition to the compulsory first question, the question paper shall contain four units, each consisting of two questions. The student will attempt one question from each unit in addition to the compulsory question. All questions will carry equal marks.
Course Objectives: The objective of this course is to enable students to perform experiments in Machine Learning using real-world data.
Course Outcomes (COs): At the end of this course, the student will be able to:
MCA-20-44(ii).1 understand the basics of machine learning and supervised learning;
MCA-20-44(ii).2 analyse and implement the concepts of Naïve Bayes and regression;
MCA-20-44(ii).3 understand unsupervised learning using clustering algorithms;
MCA-20-44(ii).4 perform dimensionality reduction and understand the basics of reinforcement learning.
Unit – I
Machine Learning: Introduction to Machine Learning, Overview of Machine Learning, Key Terminology and Tasks of ML, Applications of ML;
Supervised Learning: Classification, Decision Tree Representation – Appropriate Problems for Decision Tree Learning, Decision Tree Algorithm, Hypothesis Space Search in Decision Tree Learning.
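A minimal sketch of the decision tree material above, assuming a Python lab environment with scikit-learn; the Iris dataset, split criterion and tree depth are illustrative choices, not prescribed by the syllabus:
```python
# Illustrative sketch for Unit I: a decision tree classifier trained with
# scikit-learn on the Iris dataset (library, dataset and parameters are
# example choices only).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Fit a decision tree using information gain (entropy) as the split criterion.
clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=42)
clf.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```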
Unit – II
Naïve Bayes – Bayes' Theorem, Classifying with Bayes Decision Theory, Conditional Probability, Bayesian Belief Networks;
Regression: Linear Regression – Predicting Numerical Values, Finding the Best-Fit Line with Linear Regression, Regression Trees – Using CART for Regression.
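A brief sketch of the Naïve Bayes and linear regression topics, assuming scikit-learn; the small NumPy arrays used as data are hypothetical:
```python
# Illustrative sketch for Unit II: Gaussian Naive Bayes classification and
# ordinary least-squares linear regression (toy data, example library).
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LinearRegression

# Naive Bayes: applies Bayes' theorem with the "naive" assumption that
# features are conditionally independent given the class.
X_cls = np.array([[1.0, 2.1], [1.2, 1.9], [3.8, 4.0], [4.1, 3.9]])
y_cls = np.array([0, 0, 1, 1])
nb = GaussianNB().fit(X_cls, y_cls)
print("Predicted class:", nb.predict([[3.9, 4.2]]))

# Linear regression: finds the best-fit line y = w*x + b by least squares.
X_reg = np.array([[1.0], [2.0], [3.0], [4.0]])
y_reg = np.array([2.1, 3.9, 6.2, 8.1])
lr = LinearRegression().fit(X_reg, y_reg)
print("Slope:", lr.coef_[0], "Intercept:", lr.intercept_)
```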
Unit – III
Logistic Regression – Classification with Logistic Regression and the Sigmoid Function;
Clustering: Learning from Unclassified Data – Introduction to Clustering, K-Means Clustering, Expectation-Maximization (EM) Algorithm, Hierarchical Clustering, Supervised Learning after Clustering.
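A short sketch of logistic regression and K-Means clustering, again assuming scikit-learn; the toy points are hypothetical:
```python
# Illustrative sketch for Unit III: sigmoid-based classification with
# logistic regression, and K-Means clustering of unclassified data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Logistic regression: models P(y = 1 | x) = sigmoid(w.x + b).
X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])
log_reg = LogisticRegression().fit(X, y)
print("P(class=1 | x=2.5):", log_reg.predict_proba([[2.5]])[0, 1])

# K-Means: partitions unlabelled points into k clusters by alternating
# assignment and centroid-update steps (an EM-style iteration).
points = np.array([[1, 1], [1.5, 2], [8, 8], [8.5, 9], [1, 0.5], [9, 8]])
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print("Cluster labels:", kmeans.labels_)
print("Cluster centres:", kmeans.cluster_centers_)
```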
Unit – IV
Dimensionality Reduction – Dimensionality Reduction Techniques, Principal Component Analysis, Anomaly Detection, Recommender Systems;
Support Vector Machines (SVM), Reinforcement Learning.
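A compact sketch combining PCA-based dimensionality reduction with an SVM classifier, assuming scikit-learn and the Iris dataset as an illustrative setup:
```python
# Illustrative sketch for Unit IV: reduce dimensionality with PCA, then
# train an SVM on the reduced representation (example setup only).
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# PCA: project the 4-dimensional data onto its 2 principal components.
X_2d = PCA(n_components=2).fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(
    X_2d, y, test_size=0.3, random_state=0)

# SVM with an RBF kernel trained on the PCA-reduced features.
svm = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
print("Test accuracy after PCA:", accuracy_score(y_test, svm.predict(X_test)))
```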
Text Books:
⦁ Tom M. Mitchell, Machine Learning, McGraw-Hill Education (India) Private Limited.
⦁ Ethem Alpaydin, Introduction to Machine Learning (Adaptive Computation and Machine Learning), The MIT Press.
Reference Books:
⦁ Stephen Marsland, Machine Learning: An Algorithmic Perspective, CRC Press.
⦁ Peter Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data, Cambridge University Press.
⦁ Peter Harrington, Machine Learning in Action, Manning Publications.
⦁ Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press.