Decision tree family in scikit-learn

1. What is the name of the parameter that controls the maximum depth of a tree?
   - max_depth
   - max_leaf_nodes
   - n_estimators
   - min_samples_leaf

2. How does a random forest work?
   - Random forest combines different individual models into an ensemble and produces an aggregate model.
   - Random forest uses a neural network on top of decision trees to boost performance.
   - Random forest is just a simple decision tree with a very big depth.
   - None of the variants.

3. How does GBT (Gradient Boosted Trees) work?
   - It creates a series of trees, where each tree is trained so that it attempts to correct the mistakes of the previous tree in the series.
   - None of the variants.
   - It creates a series of trees and simply averages their predictions.
   - It uses neural networks on top of trees to boost predictions.

4. What parameter of GBT controls the emphasis on fixing errors from the previous iteration?
   - learning_rate
   - max_depth
   - error_fixing
   - n_workers

5. What function from sklearn.tree is used to visualize a decision tree?
   - visualize_tree
   - show_trained
   - plot_tree
   - show_tree

6. (single choice) What is the name of the class used for regression with a random forest?
   - RFR
   - forestRegressor
   - RandomForest
   - RandomForestRegressor

7. What is the name of the module from which you can import random forest and GBT?
   - sklearn.trees
   - sklearn.models
   - sklearn.ensemble
   - sklearn.forests

8. Which algorithm achieved the smallest error on the test data according to the summary table at the end of the assignment?
   - SVM with RBF kernel
   - l2 (Ridge) polynomial regression
   - Gradient boosted trees
   - Polynomial regression with degree 2
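For review, here is a minimal sketch (on assumed synthetic toy data, not the assignment's dataset) tying together questions 1, 6, and 7: the regression forest class is RandomForestRegressor, it is imported from sklearn.ensemble, and max_depth caps the depth of every tree in the ensemble.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor  # ensembles live in sklearn.ensemble

# Synthetic toy data (an assumption for illustration only)
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# max_depth limits the maximum depth of each individual tree;
# the forest aggregates (averages) the predictions of all of them
forest = RandomForestRegressor(n_estimators=50, max_depth=4, random_state=0)
forest.fit(X, y)

print(len(forest.estimators_))  # 50 fitted trees, averaged at predict time
```

The ensemble behavior from question 2 is visible in `forest.estimators_`: each element is an independently trained decision tree, and the forest's prediction is the average of their outputs.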
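Questions 3 and 4 can likewise be illustrated with a short sketch (again on assumed synthetic data): GradientBoostingRegressor builds trees in sequence, each fitting the residual errors of the series so far, and learning_rate scales how aggressively each new tree corrects those errors.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

# Synthetic toy data (an assumption for illustration only)
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# learning_rate controls the emphasis on fixing the previous trees' errors
gbt = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1,
                                max_depth=2, random_state=0)
gbt.fit(X, y)

# staged_predict yields the prediction after each tree in the series,
# so training error shrinks as later trees correct earlier mistakes
stage_errors = [mean_squared_error(y, pred) for pred in gbt.staged_predict(X)]
print(stage_errors[0] > stage_errors[-1])  # True
```

Note the contrast with the random forest above: here the trees are not averaged independently but added one after another, which is why the error falls stage by stage.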
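For question 5, a small sketch of the visualization function (using the bundled iris dataset as an assumed stand-in for the assignment's data): plot_tree from sklearn.tree draws a fitted tree onto a matplotlib axis.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, plot_tree

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

fig, ax = plt.subplots(figsize=(8, 5))
# plot_tree returns one matplotlib annotation per drawn node
annotations = plot_tree(tree, ax=ax, filled=True)
fig.savefig("tree.png")
print(len(annotations) > 0)  # True
```

A shallow max_depth keeps the rendered tree small enough to read, which is usually the point of visualizing it.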