Book Introduction

Data Mining Based on Uncertainty Modeling (English Edition): PDF | EPUB | TXT | Kindle e-book download

Data Mining Based on Uncertainty Modeling (English Edition)
  • Authors: 秦曾昌 (Zengchang Qin), 汤永川 (Yongchuan Tang)
  • Publisher: Zhejiang University Press, Hangzhou
  • ISBN: 9787308121064
  • Year of publication: 2013
  • Listed page count: 291 pages
  • File size: 40 MB
  • File page count: 304 pages
  • Subject headings: Data acquisition - Research - English


Download Notes

Data Mining Based on Uncertainty Modeling (English Edition), PDF e-book download.

The downloaded file is a RAR archive; extract it with an archiving tool to obtain the PDF (a minimal extraction sketch follows these notes).

All resources on this site are packaged as BitTorrent seeds, so a dedicated BitTorrent client is required. The recommended client is Free Download Manager (FDM), which is free, ad-free, and cross-platform; BitComet, qBittorrent, and uTorrent also work. Thunder (迅雷) is currently not recommended because this resource is not yet widely seeded; once it becomes popular, Thunder can be used as well.

(The file page count should be greater than the listed page count, except for multi-volume e-books.)

Note: all compressed archives on this site require an extraction code.
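
As an illustration only, the snippet below shows one way to unpack such an archive with Python. The third-party rarfile package, the archive file name, and the extraction code are all assumptions made for the example, not details provided by this page.

    # Minimal sketch, assuming the "rarfile" package (pip install rarfile)
    # and an unrar backend are installed; the file name and extraction code
    # are placeholders, not values taken from this page.
    import rarfile

    ARCHIVE = "data_mining_uncertainty_modeling.rar"  # hypothetical archive name
    EXTRACTION_CODE = "replace-with-real-code"        # hypothetical extraction code

    with rarfile.RarFile(ARCHIVE) as rf:
        print(rf.namelist())                          # inspect the archive members
        rf.extractall(path="extracted", pwd=EXTRACTION_CODE)  # the PDF lands in ./extracted/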

Table of Contents

1 Introduction

1.1 Types of Uncertainty

1.2 Uncertainty Modeling and Data Mining

1.3 Related Works

References

2 Induction and Learning

2.1 Introduction

2.2 Machine Learning

2.2.1 Searching in Hypothesis Space

2.2.2 Supervised Learning

2.2.3 Unsupervised Learning

2.2.4 Instance-Based Learning

2.3 Data Mining and Algorithms

2.3.1 Why Do We Need Data Mining?

2.3.2 How Do We Do Data Mining?

2.3.3 Artificial Neural Networks

2.3.4 Support Vector Machines

2.4 Measurement of Classifiers

2.4.1 ROC Analysis for Classification

2.4.2 Area Under the ROC Curve

2.5 Summary

References

3 Label Semantics Theory

3.1 Uncertainty Modeling with Labels

3.1.1 Fuzzy Logic

3.1.2 Computing with Words

3.1.3 Mass Assignment Theory

3.2 Label Semantics

3.2.1 Epistemic View of Label Semantics

3.2.2 Random Set Framework

3.2.3 Appropriateness Degrees

3.2.4 Assumptions for Data Analysis

3.2.5 Linguistic Translation

3.3 Fuzzy Discretization

3.3.1 Percentile-Based Discretization

3.3.2 Entropy-Based Discretization

3.4 Reasoning with Fuzzy Labels

3.4.1 Conditional Distribution Given Mass Assignments

3.4.2 Logical Expressions of Fuzzy Labels

3.4.3 Linguistic Interpretation of Appropriate Labels

3.4.4 Evidence Theory and Mass Assignment

3.5 Label Relations

3.6 Summary

References

4 Linguistic Decision Trees for Classification

4.1 Introduction

4.2 Tree Induction

4.2.1 Entropy

4.2.2 Soft Decision Trees

4.3 Linguistic Decision Trees for Classification

4.3.1 Branch Probability

4.3.2 Classification by LDT

4.3.3 Linguistic ID3 Algorithm

4.4 Experimental Studies

4.4.1 Influence of the Threshold

4.4.2 Overlapping Between Fuzzy Labels

4.5 Comparison Studies

4.6 Merging of Branches

4.6.1 Forward Merging Algorithm

4.6.2 Dual-Branch LDTs

4.6.3 Experimental Studies for Forward Merging

4.6.4 ROC Analysis for Forward Merging

4.7 Linguistic Reasoning

4.7.1 Linguistic Interpretation of an LDT

4.7.2 Linguistic Constraints

4.7.3 Classification of Fuzzy Data

4.8 Summary

References

5 Linguistic Decision Trees for Prediction

5.1 Prediction Trees

5.2 Linguistic Prediction Trees

5.2.1 Branch Evaluation

5.2.2 Defuzzification

5.2.3 Linguistic ID3 Algorithm for Prediction

5.2.4 Forward Branch Merging for Prediction

5.3 Experimental Studies

5.3.1 3D Surface Regression

5.3.2 Abalone and Boston Housing Problem

5.3.3 Prediction of Sunspots

5.3.4 Flood Forecasting

5.4 Query Evaluation

5.4.1 Single Queries

5.4.2 Compound Queries

5.5 ROC Analysis for Prediction

5.5.1 Predictors and Probabilistic Classifiers

5.5.2 AUC Value for Prediction

5.6 Summary

References

6 Bayesian Methods Based on Label Semantics

6.1 Introduction

6.2 Naive Bayes

6.2.1 Bayes Theorem

6.2.2 Fuzzy Naive Bayes

6.3 Fuzzy Semi-Naive Bayes

6.4 Online Fuzzy Bayesian Prediction

6.4.1 Bayesian Methods

6.4.2 Online Learning

6.5 Bayesian Estimation Trees

6.5.1 Bayesian Estimation Given an LDT

6.5.2 Bayesian Estimation from a Set of Trees

6.6 Experimental Studies

6.7 Summary

References

7 Unsupervised Learning with Label Semantics

7.1 Introduction

7.2 Non-Parametric Density Estimation

7.3 Clustering

7.3.1 Logical Distance

7.3.2 Clustering of Mixed Objects

7.4 Experimental Studies

7.4.1 Logical Distance Example

7.4.2 Images and Labels Clustering

7.5 Summary

References

8 Linguistic FOIL and Multiple Attribute Hierarchy for Decision Making

8.1 Introduction

8.2 Rule Induction

8.3 Multi-Dimensional Label Semantics

8.4 Linguistic FOIL

8.4.1 Information Heuristics for LFOIL

8.4.2 Linguistic Rule Generation

8.4.3 Class Probabilities Given a Rule Base

8.5 Experimental Studies

8.6 Multiple Attribute Decision Making

8.6.1 Linguistic Attribute Hierarchies

8.6.2 Information Propagation Using LDT

8.7 Summary

References

9 A Prototype Theory Interpretation of Label Semantics

9.1 Introduction

9.2 Prototype Semantics for Vague Concepts

9.2.1 Uncertainty Measures about the Similarity Neighborhoods Determined by Vague Concepts

9.2.2 Relating Prototype Theory and Label Semantics

9.2.3 Gaussian-Type Density Function

9.3 Vague Information Coarsening in Theory of Prototypes

9.4 Linguistic Inference Systems

9.5 Summary

References

10 Prototype Theory for Learning

10.1 Introduction

10.1.1 General Rule Induction Process

10.1.2 A Clustering Based Rule Coarsening

10.2 Linguistic Modeling of Time Series Predictions

10.2.1 Mackey-Glass Time Series Prediction

10.2.2 Prediction of Sunspots

10.3 Summary

References

11 Prototype-Based Rule Systems

11.1 Introduction

11.2 Prototype-Based IF-THEN Rules

11.3 Rule Induction Based on Data Clustering and Least-Square Regression

11.4 Rule Learning Using a Conjugate Gradient Algorithm

11.5 Applications in Prediction Problems

11.5.1 Surface Prediction

11.5.2 Mackey-Glass Time Series Prediction

11.5.3 Prediction of Sunspots

11.6 Summary

References

12 Information Cells and Information Cell Mixture Models

12.1 Introduction

12.2 Information Cell for Cognitive Representation of Vague Concept Semantics

12.3 Information Cell Mixture Model (ICMM) for Semantic Representation of Complex Concept

12.4 Learning Information Cell Mixture Model from Data Set

12.4.1 Objective Function Based on Positive Density Function

12.4.2 Updating Probability Distribution of Information Cells

12.4.3 Updating Density Functions of Information Cells

12.4.4 Information Cell Updating Algorithm

12.4.5 Learning Component Number of ICMM

12.5 Experimental Study

12.6 Summary

References
