Book Introduction
Building Natural Language Generation Systems: PDF | Epub | txt | Kindle e-book download
- Authors: Reiter (UK) and Dale (Australia)
- Publisher: Peking University Press, Beijing
- ISBN: 9787301171547
- Publication year: 2010
- Listed page count: 248 pages
- File size: 16 MB
- File page count: 308 pages
- Subject: Natural language processing - Research - English
PDF Download
Download Notes
Building Natural Language Generation Systems: PDF e-book download
The download is a RAR archive; use decompression software to extract the PDF. We recommend the download manager Free Download Manager (FDM for short; free, ad-free, cross-platform). All resources on this site are packaged as BitTorrent seeds, so a dedicated BitTorrent client is required, such as BitComet, qBittorrent, or uTorrent. Xunlei (Thunder) is currently not recommended because this is not a popular resource; once the resource becomes popular, it can also be downloaded with Xunlei.
(The file page count should be greater than the listed page count, except for multi-volume e-books.)
Note: all archives on this site require an extraction password. Click here to download an archive extraction tool.
Table of Contents
1 Introduction
1.1 The Research Perspective
1.1.1 Differences between NL Generation and NL Understanding
1.1.2 Sharing Knowledge between Generation and Understanding
1.2 The Applications Perspective
1.2.1 Computer as Authoring Aid
1.2.2 Computer as Author
1.2.3 Uses of NLG Technology
1.3 Some Example NLG Systems
1.3.1 WEATHERREPORTER
1.3.2 FOG
1.3.3 IDAS
1.3.4 MODELEXPLAINER
1.3.5 PEBA
1.3.6 STOP
1.4 A Short History of NLG
1.5 The Structure of This Book
1.6 Further Reading
2 Natural Language Generation in Practice
2.1 Introduction
2.2 When Are NLG Techniques Appropriate?
2.2.1 Text versus Graphics
2.2.2 Natural Language Generation versus Mail Merge
2.2.3 Natural Language Generation versus Human Authoring
2.3 Using a Corpus to Determine User Requirements
2.3.1 Assembling an Initial Corpus of Output Texts
2.3.2 Analysing the Information Content of Corpus Texts
2.3.3 Creating a Target Text Corpus
2.4 Evaluating NLG Systems
2.5 Fielding NLG Systems
2.6 Further Reading
3 The Architecture of a Natural Language Generation System
3.1 Introduction
3.2 The Inputs and Outputs of Natural Language Generation
3.2.1 Language as Goal-Driven Communication
3.2.2 The Inputs to Natural Language Generation
3.2.3 The Output of Natural Language Generation
3.3 An Informal Characterisation of the Architecture
3.3.1 An Overview of the Architecture
3.3.2 Content Determination
3.3.3 Document Structuring
3.3.4 Lexicalisation
3.3.5 Referring Expression Generation
3.3.6 Aggregation
3.3.7 Linguistic Realisation
3.3.8 Structure Realisation
3.4 The Architecture and Its Representations
3.4.1 Broad Structure and Terminology
3.4.2 Messages
3.4.3 The Document Planner
3.4.4 Document Plans
3.4.5 Microplanning
3.4.6 Text Specifications
3.4.7 Phrase Specifications
3.4.8 Surface Realisation
3.5 Other Architectures
3.5.1 Different Representations and Modularisations
3.5.2 Different Architectures: Integrated Systems
3.6 Further Reading
4 Document Planning
4.1 Introduction
4.1.1 What Document Planning Is About
4.1.2 The Inputs and Outputs of Document Planning
4.1.3 A WEATHERREPORTER Example
4.2 Representing Information in the Domain
4.2.1 What's in a Domain Model?
4.2.2 Domain Modelling for WEATHERREPORTER
4.2.3 Implementing Domain Models
4.2.4 Defining Messages
4.2.5 Determining the Degree of Abstraction in Messages
4.2.6 A Methodology for Domain Modelling and Message Definition
4.3 Content Determination
4.3.1 Aspects of Content Determination
4.3.2 Deriving Content Determination Rules
4.3.3 Implementing Content Determination
4.4 Document Structuring
4.4.1 Discourse Relations
4.4.2 Implementation: Schemas
4.4.3 Implementation: Bottom-up Techniques
4.4.4 A Comparison of Approaches
4.4.5 Knowledge Acquisition
4.5 Document Planner Architecture
4.6 Further Reading
5 Microplanning
5.1 Introduction
5.1.1 Why Do We Need Microplanning?
5.1.2 What's Involved in Microplanning?
5.1.3 The Inputs and Outputs of Microplanning
5.1.4 The Architecture of a Microplanner
5.2 Lexicalisation
5.2.1 Simple Lexicalisation
5.2.2 Simple Lexical Choice
5.2.3 Contextual and Pragmatic Influences on Lexical Choice
5.2.4 Expressing Discourse Relations
5.2.5 Fine-Grained Lexicalisation
5.3 Aggregation
5.3.1 Mechanisms for Sentence Formation
5.3.2 Choosing between Possible Aggregations
5.3.3 Order of Presentation
5.3.4 Paragraph Formation
5.4 Generating Referring Expressions
5.4.1 The Nature of the Problem
5.4.2 Forms of Referring Expressions and Their Uses
5.4.3 Requirements for Referring Expression Generation
5.4.4 Generating Pronouns
5.4.5 Generating Subsequent References
5.5 Limitations and Other Approaches
5.6 Further Reading
6 Surface Realisation
6.1 Introduction
6.2 Realising Text Specifications
6.3 Varieties of Phrase Specifications
6.3.1 Skeletal Propositions
6.3.2 Meaning Specifications
6.3.3 Lexicalised Case Frames
6.3.4 Abstract Syntactic Structures
6.3.5 Canned Text
6.3.6 Orthographic Strings
6.3.7 Summary
6.4 KPML
6.4.1 An Overview
6.4.2 The Input to KPML
6.4.3 Using Systemic Grammar for Linguistic Realisation
6.4.4 Summary
6.5 SURGE
6.5.1 An Overview
6.5.2 The Input to SURGE
6.5.3 Functional Unification Grammar
6.5.4 Linguistic Realisation via Unification
6.6 REALPRO
6.6.1 An Overview
6.6.2 The Input to REALPRO
6.6.3 Meaning-Text Theory
6.6.4 How REALPRO Works
6.6.5 Summary
6.7 Choosing a Realiser
6.8 Bidirectional Grammars
6.9 Further Reading
7 Beyond Text Generation
7.1 Introduction
7.2 Typography
7.2.1 The Uses of Typography
7.2.2 Typography in NLG Systems
7.2.3 Implementing Typographic Awareness
7.3 Integrating Text and Graphics
7.3.1 The Automatic Generation of Graphical Objects
7.3.2 Choosing a Medium
7.3.3 Commonalities between Text and Graphics
7.3.4 Implementing Text and Graphics Integration
7.4 Hypertext
7.4.1 Hypertext and Its Uses
7.4.2 Implementing Hypertext-based NLG Systems
7.5 Speech Output
7.5.1 The Benefits of Speech Output
7.5.2 Text-to-Speech Systems
7.5.3 Implementing Concept-to-Speech
7.6 Further Reading
Appendix: NLG Systems Mentioned in This Book
References
Index
List of Figures
1.1 The Macquarie weather summary for February 1995
1.2 Data input for the WEATHERREPORTER system
1.3 FOG input: a weather system over Canada
1.4 Some example forecasts from FOG
1.5 Some example texts from IDAS
1.6 An example MODELEXPLAINER input: an object-oriented class model
1.7 An example description produced by MODELEXPLAINER from the model in Figure 1.6
1.8 A WWW page generated by PEBA
1.9 A letter generated by the prototype STOP system
2.1 An example human-authored text in the WEATHERREPORTER domain
2.2 An example daily weather record
2.3 An example target text in the WEATHERREPORTER domain
3.1 Modules and tasks
3.2 The weather summary for February 1995
3.3 The weather summary for August 1995
3.4 The weather summary for January 1996
3.5 The structure of the weather summary in Figure 3.2
3.6 Using typography to indicate discourse structure
3.7 An NLG system architecture
3.8 A definition of some message types in WEATHERREPORTER
3.9 A MonthlyTemperatureMsg message
3.10 A MonthlyRainfallMsg message
3.11 A simple document plan representation
3.12 A simple text specification representation
3.13 Some phrase specification representations
3.14 An abstract syntactic representation for The month had some rainy days
3.15 A lexicalised case frame for The month had some rainy days
3.16 Combining canned text and abstract syntactic structures
3.17 Variations in terminology
3.18 A proto-phrase specification representation of the MonthlyRainfallMsg message
4.1 A knowledge base fragment in PEBA
4.2 A daily weather record
4.3 The document plan corresponding to the text in Figure 4.5
4.4 A document plan
4.5 The weather summary for July 1996
4.6 Example MonthlyTemperatureMsg and RainEventMsg messages from the document plan shown in Figure 4.3
4.7 The correspondence of temperature values to categories
4.8 The definition of TemperatureSpellMsg
4.9 A TemperatureSpellMsg message
4.10 A message defined as a string
4.11 A message specified as a month and a textual attribute
4.12 A corpus-based procedure for domain modelling and message definition
4.13 A corpus-based procedure for identifying content determination rules
4.14 The RST definition of Elaboration
4.15 PEBA's Compare-And-Contrast schema
4.16 A simple set of schemas for WEATHERREPORTER
4.17 A bottom-up discourse structuring algorithm
4.18 Bottom-up construction of a document plan
4.19 Different combinations of document structuring and content determination
5.1 A simple weather summary
5.2 The top-level document plan for the text in Figure 5.1
5.3 The first constituent of the document plan corresponding to the text in Figure 5.1
5.4 The second constituent of the document plan corresponding to the text in Figure 5.1
5.5 A schematic representation of the document plan in Figures 5.2-5.4
5.6 The top-level text specification corresponding to the text in Figure 5.1
5.7 The phrase specification for Sentence1 in Figure 5.6 (The month was slightly warmer than average, with the average number of rain days)
5.8 The phrase specification for Sentence2 in Figure 5.6 (Heavy rain fell on the 27th and 28th)
5.9 A blackboard architecture for a microplanner
5.10 A simple microplanner architecture
5.11 Simplified microplanning
5.12 A simple template for RainEventMsgs
5.13 A simple message
5.14 The proto-phrase specification produced by applying the template to a message
5.15 A simple template for RainSpellMessages
5.16 An algorithm for lexicalising spells
5.17 A template proto-phrase specification for on the ith
5.18 A template proto-phrase specification for on the ith and jth
5.19 A decision tree for realising the Contrast relation
5.20 The weather summary for July 1996 without aggregation
5.21 A proto-phrase specification for Heavy rain fell on the 28th
5.22 The result of a simple conjunction (Heavy rain fell on the 27th and 28th)
5.23 The result of a shared-participant conjunction (Heavy rain fell on the 27th and 28th)
5.24 The result of a shared-structure conjunction (Heavy rain fell on the 27th and 28th)
5.25 Some of Scott and de Souza's heuristics for aggregation
5.26 Some example rules from the AECMA Simplified English guide
5.27 A conservative pronoun-generation algorithm
5.28 Distinguishing descriptions
5.29 An algorithm for producing distinguishing descriptions
5.30 Constructing a referring expression
6.1 A simple PEBA text specification
6.2 A surface form with mark-up annotations for the PEBA text
6.3 The PEBA text as displayed by the presentation system
6.4 The logical structure specification in LaTeX form
6.5 The logical structure specification in Word RTF form
6.6 A skeletal proposition as an AVM
6.7 A meaning specification
6.8 A lexicalised case frame
6.9 An abstract syntactic structure
6.10 A canned text structure
6.11 An orthographic string structure
6.12 An input to KPML, from which KPML produces March had some rainy days
6.13 An AVM representation of the structure in Figure 6.12
6.14 A more complex SPL expression (The month was cool and dry with the average number of rain days)
6.15 An SPL which exploits a middle model (March had some rainy days)
6.16 The mood system for the English clause
6.17 A chooser for deciding which determiner should be used
6.18 Some realisation statement operators
6.19 Realisation statements in a network
6.20 An input to SURGE, from which SURGE produces March had some rainy days
6.21 An AVM representation of the structure in Figure 6.20
6.22 A more complex input to SURGE (The month was cool and dry with the average number of rain days)
6.23 An FD for a simple sentence (John sells the car)
6.24 A simple FUF grammar
6.25 The result of initial unification of the input FD with the grammar
6.26 The FD in Figure 6.23 after unification with the grammar in Figure 6.24
6.27 The DSyntS for John helps Mary
6.28 Variations of the DSyntS in Figure 6.27
6.29 A DSyntS for March had some rainy days
6.30 An AVM representation of the structure in Figure 6.29
6.31 A more complex DSyntS input to REALPRO (The month was cool and dry with the average number of rain days)
6.32 Common relations in MTT's DSyntS
6.33 Some simple REALPRO grammar rules
6.34 Stages of realisation in REALPRO
7.1 An example output from PEBA
7.2 Using typeface variation for emphasis
7.3 Using a labelled list structure to aid information access
7.4 Information expressed typographically as a decision tree
7.5 Information expressed typographically via a table
7.6 Information expressed compactly
7.7 Declarative mark-up annotations
7.8 Physical mark-up annotations
7.9 A tabular presentation of aggregated information
7.10 Multimodal output from the WIP system
7.11 Graphics-only output from the WIP system
7.12 Text-only output from the WIP system
7.13 The document plan for the output in Figure 7.10
7.14 The architecture of a text-to-speech system
7.15 SABLE mark-ups for controlling speech synthesis