Book Description
Theory of Computation is designed to serve as a textbook for undergraduate students of Computer Science.
Table of Contents
Preface; Foreword; Features of the Book
1. PRELIMINARIES 1
1.1 Introduction 1; 1.2 Basic Concepts 1; 1.2.1 Symbol 1; 1.2.2 Alphabet 1; 1.2.3 String (or Word) 2; 1.3 Sets 2; 1.3.1 Operations 3; 1.3.2 Cardinality 7; 1.3.3 Countable and Uncountable Sets 7; 1.4 Relations 8; 1.4.1 Properties 10; 1.4.2 Closure Properties 10; 1.5 Graph 11; 1.5.1 Directed Graph (or Digraph) 12; 1.5.2 Tree 13; 1.6 Language 13; 1.6.1 Formal Languages 14; 1.7 Mathematical Induction 14
2. FINITE STATE MACHINES 20
2.1 Introduction 20; 2.1.1 Concept of Basic Machine 21; 2.2 Finite State Machine 22; 2.2.1 Examples 22; 2.2.2 Transition Diagram (or Transition Graph) 24; 2.2.3 Transition Matrix 24; 2.3 Finite Automata 29; 2.3.1 Transition Graph 30; 2.3.2 Functions 31; 2.3.3 Acceptance of a String 32; 2.3.4 Acceptance of a Language 32; 2.3.5 Some Examples of FA as Acceptor 33; 2.3.6 FA as Finite Control 35; 2.4 Deterministic Finite Automata 36; 2.5 Non-deterministic Finite Automata 36; 2.6 Equivalence of NFA and DFA 37; 2.6.1 NFA to DFA Conversion (Method I) 38; 2.6.2 DFA Minimization 41; 2.6.3 NFA to DFA Conversion (Method II) 43; 2.7 NFA with ε-Transitions 47; 2.7.1 Significance of NFA with ε-Transitions 49; 2.7.2 State Transition Table for NFA with ε-Transitions 50; 2.7.3 ε-Closure of a State 50; 2.8 Equivalence of NFA and NFA with ε-Transitions 50; 2.9 Equivalence of DFA and NFA with ε-Transitions 53; 2.9.1 Indirect Conversion Method 53; 2.9.2 Direct Conversion Method 55; 2.10 Finite Automata with Output 57; 2.10.1 Moore Machine 57; 2.10.2 Mealy Machine 59; 2.10.3 Finite State Transducer 63; 2.11 Equivalence of Moore and Mealy Machines 63; 2.11.1 Moore to Mealy Conversion 64; 2.11.2 Mealy to Moore Conversion 66; 2.11.3 Additional Examples on Moore and Mealy Machines 68; 2.12 FSM Equivalence 75; 2.12.1 Moore's Algorithm 75; 2.13 DFA Minimization (Another Approach) 77; 2.14 Properties and Limitations of FSM 79; 2.15 Additional FSM Examples 80; 2.16 Two-way Finite Automaton 83
3. REGULAR EXPRESSIONS 94
3.1 Introduction 94; 3.2 Regular Expression Formalism 95; 3.3 Examples of Regular Expressions 96; 3.4 Equivalence of Regular Expressions and Finite Automata 102; 3.4.1 Kleene's Theorem 102; 3.4.2 Regular Expression to FA Conversion 103; 3.4.3 DFA to Regular Expression Conversion 109; 3.5 Regular Sets and their Closure Properties 120; 3.5.1 Formal Definition for Regular Sets 120; 3.5.2 Closure Properties of Regular Sets 120; 3.6 Pumping Lemma for Regular Languages 121; 3.6.1 Applications of Pumping Lemma 123; 3.7 Decision Algorithms for Regular Sets 125; 3.8 Applications of Regular Expressions and Finite Automata 126; 3.8.1 Lexical Analyser 127; 3.8.2 Text Editors 128; 3.8.3 'grep' Command 128; 3.9 Additional Examples 128; 3.10 Myhill-Nerode Theorem 133
4. TURING MACHINES 139
4.1 Introduction 139; 4.2 Elements of a Turing Machine 140; 4.3 Turing Machine Formalism 141; 4.4 Instantaneous Description 143; 4.5 Transition Graph for Turing Machine 145; 4.6 Solved Problems 146; 4.7 Complexity of a Turing Machine 199; 4.8 Composite and Iterative Turing Machines 200; 4.9 Universal Turing Machine 203; 4.10 Multi-tape Turing Machine 205; 4.11 Multi-stack Turing Machine 206; 4.12 Multi-track Turing Machine 206; 4.13 Solvable, Semi-solvable, and Unsolvable Problems 207; 4.14 Halting Problem 208; 4.15 Recursively Enumerable and Recursive Languages 210; 4.16 Functions 210; 4.16.1 Total Recursive Functions 211; 4.16.2 Partial Recursive Functions 211; 4.17 Church's Turing Hypothesis 211; 4.18 Post's Correspondence Problem 212; 4.19 Additional Turing Machine Examples 213; 4.20 Linear Bounded Automata 226
5. GRAMMARS 233
5.1 Introduction 233; 5.2 Constituents of Grammar 234; 5.3 Formal Definition of a Grammar 234; 5.4 Grammar Notations 235; 5.5 Derivation Process 236; 5.5.1 Leftmost Derivation 236; 5.5.2 Rightmost Derivation 237; 5.5.3 Derivation Examples 238; 5.6 Derivation Tree 239; 5.7 Context-free Languages 240; 5.7.1 Examples of CFLs 240; 5.8 Ambiguous Context-free Grammar 252; 5.8.1 Removal of Ambiguity 253; 5.9 Simplification of CFG 257; 5.9.1 Removal of Useless Symbols 257; 5.9.2 Removal of Unit Productions 258; 5.9.3 Elimination of ε-Productions 260; 5.10 Normal Forms 265; 5.10.1 Chomsky Normal Form 265; 5.10.2 Greibach Normal Form 267; 5.11 Chomsky Hierarchy 269; 5.11.1 Unrestricted Grammar (Type-0 Grammar) 270; 5.11.2 Context-sensitive Grammar (Type-1 Grammar) 270; 5.11.3 Context-free Grammar (Type-2 Grammar) 271; 5.11.4 Regular Grammar (Type-3 Grammar) 271; 5.12 Equivalence of Right-linear and Left-linear Grammars 273; 5.12.1 Conversion of Right-linear Grammar to Equivalent Left-linear Grammar 273; 5.12.2 Conversion of Left-linear Grammar to Equivalent Right-linear Grammar 275; 5.13 Equivalence of Regular Grammars and Finite Automata 277; 5.13.1 Right-linear Grammar and FA 277; 5.13.2 Left-linear Grammar and FA 280; 5.14 Pumping Lemma for Context-free Languages 283; 5.14.1 Application of Pumping Lemma 286; 5.14.2 Ogden's Lemma 288; 5.15 Kuroda Normal Form 288; 5.16 Dyck Language 289; 5.17 Derivation Graph 289; 5.18 Applications of CFG 291; 5.18.1 Parser (or Syntax Analyser) 291; 5.19 Backus-Naur Form 292
6. PUSHDOWN STACK-MEMORY MACHINE 300
6.1 Introduction 300; 6.2 Elements of a PDM 301; 6.2.1 Pictorial Representation of PDM Elements 301; 6.3 Pushdown Automata 302; 6.4 Finite Automata vs PDA 304; 6.4.1 Examples of PDA that Accept Regular Languages 304; 6.4.2 Relative Computational Powers of PDA and FA 307; 6.5 PDA Accepting CFLs 307; 6.5.1 Instantaneous Description of PDA 311; 6.5.2 Acceptance of CFL by Empty Stack 312; 6.5.3 Acceptance of CFL by Final State 312; 6.5.4 State Transition Diagram for a PDA 312; 6.6 DPDA vs NPDA 317; 6.6.1 Relative Powers of DPDA/NPDA and NFA/DFA 320; 6.7 Equivalence of CFG and PDA 321; 6.7.1 NPDA Construction using Chomsky Normal Form 326; 6.8 Closure Properties of CFLs 329; 6.9 Additional PDA Examples 332
7. PARSING TECHNIQUES 339
7.1 Introduction 339; 7.2 Introduction to Parsing 339; 7.3 Top-down Parsing 340; 7.3.1 Why Leftmost Derivation? 341; 7.3.2 Working of a Top-down Parser 341; 7.3.3 Some Potential Problems in Top-down Parsing and their Solutions 342; 7.3.4 Recursive Descent Parsing 346; 7.4 Bottom-up Parsing 350; 7.4.1 Why Rightmost Reduction? 350; 7.4.2 Working of a Bottom-up Parser 350; 7.4.3 Operator Precedence Parsing 351; 7.5 Automatic Construction of Bottom-up Parsers 352; 7.5.1 LR(0) Grammar 352; 7.5.2 SLR Parser 357; 7.5.3 LR(1) Grammar 363; 7.5.4 Canonical-LR Parser 365; 7.5.5 LALR Parser 370
8. POST MACHINE 376
8.1 Introduction 376; 8.2 Elements of Post Machine 376; 8.3 Pushdown Stack-memory Machine vs Post Machine 377; 8.4 Pictorial Representation of Post Machine Elements 378; 8.5 Finite State Machine vs Post Machine 379; 8.6 Post Machine that Accepts Context-free Languages 381; 8.7 Non-deterministic Post Machine 390; 8.8 Post Machine that Accepts Non-CFLs 394
9. UNDECIDABILITY 405
9.1 Introduction 405; 9.2 Recursive and Recursively Enumerable Languages 405; 9.2.1 Some Important Results with Recursive and RE Languages 406; 9.3 Gödel Numbering (or Gödel Encoding) 408; 9.3.1 Encoding of Turing Machines 408; 9.4 Non-recursively Enumerable Languages 409; 9.4.1 Diagonalization Language 410; 9.4.2 Ld not Recursively Enumerable 410; 9.5 Universal Language 411; 9.6 Reducibility and Undecidable Problems 411; 9.7 Rice's Theorem 412; 9.8 Post's Correspondence Problem 413; 9.9 Undecidable Problems for Context-free Grammars 413; 9.10 Greibach's Theorem 414; 9.11 Hilbert's Problem 415; 9.12 Ackermann's Function 416
10. COMPLEXITY AND CLASSIFICATION OF PROBLEMS 421
10.1 Introduction 421; 10.2 Complexity of a Problem 421; 10.2.1 Mathematical Notations for Time Complexity Measure 422; 10.2.2 Time and Space Complexity of a Turing Machine 424; 10.3 Classification of Problems 424; 10.3.1 Non-deterministic Algorithm 424; 10.3.2 Satisfiability 425; 10.3.3 P-type and NP-type Problems 426
11. PRODUCTION SYSTEMS 431
11.1 Introduction 431; 11.2 Post-Markov-Thue Production System 432; 11.2.1 Formal Definition 433; 11.2.2 Examples 433; 11.3 Post Canonical System 436; 11.4 Post Normal Form 437; 11.5 Post-Markov-Thue System and Turing Machine 437; 11.6 Post-Markov-Thue System and Finite State Machine 438; 11.7 Markov Algorithm 440; 11.7.1 Formal Definition 440; 11.7.2 Examples 440; 11.8 Labelled Markov Algorithm 447; 11.8.1 Formal Definition 448; 11.8.2 Some Examples 448
Appendix A: Implementations 453; Appendix B: Model Question Papers 512; Glossary 516; Bibliography; Index
Trade Policy (Buyer's Notes)
- About the products:
- ● Authenticity guarantee: This website is operated by China International Book Trading Corporation, and all books are guaranteed to be 100% genuine.
- ● Eco-friendly paper: Most imported books are printed on eco-friendly lightweight paper, which is slightly yellowish in colour and comparatively light in weight.
- ● Deckle-edge editions: The page edges are deliberately left rough and uneven; such editions are usually hardcover and have greater collectible value.
About returns and exchanges:
- Because pre-ordered products are procured on demand, once a purchase order has been formally placed, the buyer may not cancel all or part of the order without cause.
- Given the special nature of imported books, if any of the following occurs, please refuse delivery and have the courier return the parcel:
- ● Damaged outer packaging / wrong item shipped / missing items / damaged book exterior / incomplete accessories (e.g. CDs)
Then contact us on a working day by phone at 400-008-1110.
- If any of the following is found after you have signed for the parcel, contact customer service within 5 working days of receipt to arrange a return or exchange:
- ● Missing pages / misordered pages / misprints / loose binding
About dispatch times:
- Under normal circumstances:
- ● [In stock] Orders are dispatched by courier from the Beijing warehouse within 48 hours of ordering.
- ● [Pre-order] [Pre-sale] Items are shipped from overseas after the order is placed and typically arrive in about 5-8 weeks. The store ships by ZTO Express (中通) by default; SF Express (顺丰) is available with shipping paid on delivery.
- ● For customers who need an invoice, dispatch may be delayed by a further 1-2 working days (for urgent invoice requests, call 010-68433105/3213).
- ● If any other special circumstances affect dispatch times, we will post a notice on the website as soon as possible; please watch for announcements.
About delivery times:
- After imported books clear customs and enter our warehouse, delivery is handled by third-party couriers, so we can only guarantee dispatch within the stated time and cannot guarantee an exact delivery date.
- ● Major cities: usually 2-4 days
- ● Remote areas: usually 4-7 days
About phone support hours:
- Enquiry lines 010-68433105/3213 are answered Monday to Friday, 8:30 a.m. to 5:00 p.m. Calls cannot be taken on weekends or statutory public holidays; we appreciate your understanding.
- Outside these hours you can also reach us by email at customer@readgo.cn; emails are handled with priority on working days.
About couriers:
- ● Paid orders: Delivery is handled mainly by ZTO Express (中通) and ZJS Express (宅急送); call 010-68433105/3213 to check order status.
No recommendations are available for this book yet.