1. List Comprehensions
2. List Functions
3. Functions
4. Lambda, return statements
5. FAQ

To do:

marks = int(input())
# Write your code below
if marks > 90:
    print("Congratulations ! You have scored A+")
elif marks >= 80:
    print("Congratulations ! You have scored A-")
else:
    print("Sorry. You have not passed this course")

**********

a = int(input())
b = int(input())
c = int(input())
# Write your code below
if a > 0:
    if b % 2 == 0:
        if c % 3 == 0:
            print("All conditions are satisfied")
        else:
            print("Conditions are not satisfied")

**********

n = int(input())
# Write your code below
for i in range(1, 11):
    print('{} x {} = {}'.format(n, i, n * i))

**********

i = 0
while (i < ...):  # rest of this while-loop exercise is missing from the notes

a = int(input())
b = int(input())
c = int(input())
if a > 0:
    if b % 2 == 0:
        if c % 3 == 0:
            print("All conditions are satisfied")
        else:
            print("Conditions are not satisfied")
    else:
        print("Conditions are not satisfied")
else:
    print("Conditions are not satisfied")

**********

Bagging - samples are drawn from the original dataset with replacement to train each individual weak learner.
Boosting - subsequent samples give more weight to the observations that had relatively higher errors under the previous weak learners.
(Short code sketches of bagging, gradient boosting, and AdaBoost follow at the end of these notes.)

Are these statements true for gradient boosting?
1. Gradient-boosting trees work on residuals instead of changing the weights of observations.
2. By fitting models to the residuals, the overall learner gradually improves in areas where the residuals are initially high.

In AdaBoost, the alpha associated with each weak learner serves to:
1. Adjust the learning rate of the algorithm
2. Determine the overall accuracy of the final model
3. Weigh the contribution of each weak learner to the final model
4. Measure the importance of each feature in the dataset

What is the correct sequence for gradient boosting?
a. Calculate the residual for each observation
b. Update all predictions using the previous probabilities and the new output values
c. Initialise the model with an initial prediction for all observations
d. Repeat the steps by creating a new tree until the maximum number of estimators is reached
e. Build a tree and calculate the output value for each leaf node

AdaBoost takes a weighted vote / weighted average of the weak learners for its final prediction.

Which of the following is true for boosting?
1. The weight of an observation (its chance of being selected while building the next weak learner) increases if the observation is incorrectly classified.
2. The weight of an observation decreases if the observation is correctly classified.
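**********

A minimal sketch of the bagging idea above: each weak learner is trained on a bootstrap sample drawn from the original data with replacement, and the final prediction is a vote over the learners. The weak learner (a scikit-learn decision tree), the learner count, and the helper names are assumptions for illustration, and X, y are assumed to be NumPy arrays.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_learners=10, seed=0):
    rng = np.random.default_rng(seed)
    learners = []
    n = len(X)
    for _ in range(n_learners):
        # Bootstrap sample: n rows drawn from the original dataset WITH replacement
        idx = rng.choice(n, size=n, replace=True)
        learners.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))
    return learners

def bagging_predict(learners, X):
    # Majority vote over the weak learners (assumes 0/1 labels)
    votes = np.stack([tree.predict(X) for tree in learners])
    return (votes.mean(axis=0) >= 0.5).astype(int)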
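**********

One way the gradient-boosting sequence from the questions above fits together, sketched for squared-error regression (the classification version updates probabilities instead, but the loop has the same shape). The weak learner, tree depth, and learning rate are assumptions for illustration.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_estimators=100, learning_rate=0.1):
    # c. Initialise the model with one initial prediction for every observation
    base = y.mean()
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_estimators):                   # d. repeat until max estimators reached
        residuals = y - pred                        # a. residual for each observation
        tree = DecisionTreeRegressor(max_depth=3)   # e. fit a tree to the residuals;
        tree.fit(X, residuals)                      #    each leaf stores an output value
        pred = pred + learning_rate * tree.predict(X)   # b. update all predictions
        trees.append(tree)
    return base, trees

def gradient_boost_predict(base, trees, X, learning_rate=0.1):
    pred = np.full(len(X), base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred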
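**********

A minimal AdaBoost sketch to go with the alpha and observation-weight questions: alpha weighs each weak learner's contribution to the final weighted vote, misclassified observations get more weight before the next learner is built, and correctly classified ones get less. Decision stumps as weak learners and labels coded as -1/+1 are assumptions for illustration.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_learners=50):
    # Assumes labels y are coded as -1 / +1
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start with equal observation weights
    learners, alphas = [], []
    for _ in range(n_learners):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()             # weighted error of this weak learner
        if err == 0 or err >= 0.5:           # stop if the stump is perfect or no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)    # alpha = this learner's say in the final vote
        # Increase weights of misclassified observations, decrease weights of correct ones
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(learners, alphas, X):
    # Final prediction: sign of the weighted vote of all weak learners
    scores = sum(a * h.predict(X) for a, h in zip(alphas, learners))
    return np.sign(scores)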