Experiment: From the study above, we know that the core of ID3 is to select, at each node of the decision tree, the splitting feature according to the information-gain criterion, and to build the tree recursively. The specific procedure is: starting from the root node, compute the information gain of every candidate feature at that node and choose the feature with the largest information gain as the node's feature; create one child node for each distinct value of that feature; then apply the same procedure recursively to each child node; stop when the information gain of every remaining feature is very small or no features are left to choose from. The result is the finished decision tree.
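Before turning to the data, the procedure above can be summarized in code. The following is a minimal Python sketch, not the experiment's reference implementation: the function names, the stopping threshold eps, and the toy dataset at the end are illustrative assumptions.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy H(D) = -sum_k p_k * log2(p_k) over the class labels.
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    # g(D, A) = H(D) - H(D|A): entropy reduction from splitting on `feature`.
    total = len(labels)
    conditional = 0.0
    for value in {row[feature] for row in rows}:
        subset = [lab for row, lab in zip(rows, labels) if row[feature] == value]
        conditional += len(subset) / total * entropy(subset)
    return entropy(labels) - conditional

def id3(rows, labels, features, eps=1e-6):
    # Returns a nested dict {feature: {value: subtree_or_label}} or a class label.
    if len(set(labels)) == 1:                        # pure node: leaf
        return labels[0]
    if not features:                                 # no features left: majority class
        return Counter(labels).most_common(1)[0][0]
    gains = {f: information_gain(rows, labels, f) for f in features}
    best = max(gains, key=gains.get)                 # feature with the largest gain
    if gains[best] < eps:                            # every gain is very small: stop
        return Counter(labels).most_common(1)[0][0]
    tree = {best: {}}
    remaining = [f for f in features if f != best]
    for value in {row[best] for row in rows}:        # one child per feature value
        idx = [i for i, row in enumerate(rows) if row[best] == value]
        tree[best][value] = id3([rows[i] for i in idx],
                                [labels[i] for i in idx],
                                remaining)
    return tree

# Toy data (hypothetical, for illustration only; the real training set follows below).
rows = [{"outlook": "sunny", "windy": "no"},
        {"outlook": "sunny", "windy": "yes"},
        {"outlook": "rainy", "windy": "no"},
        {"outlook": "rainy", "windy": "yes"}]
labels = ["yes", "yes", "yes", "no"]
print(id3(rows, labels, ["outlook", "windy"]))
# e.g. {'outlook': {'sunny': 'yes', 'rainy': {'windy': {'no': 'yes', 'yes': 'no'}}}}
# (key order may vary)

This mirrors the three stopping conditions in the text: a pure node, no remaining features, and an information gain below a small threshold.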
The training set is as follows: