Semester Calendar
Date | Lecture Title | Assignment(s) | Lecture & Assignment Goals |
---|---|---|---|
Notes | |||
Mon May 6 | Final Exam | The final exam is scheduled for May 6, 3:15-5:15PM in MC313 (McNeil Hall, behind the library and the Volk gym). The final exam is cumulative; here is a list of learning goals covered. A reminder: you can complete either the final project or the final exam, but not both. For those of you interested, here is a tarball with a |
|
Wed May 1 | Quiz 3 |
CZAR. An overview video has been posted. I'll accept late submissions of CZAR without penalty, so long as I get them before grades are due, which this semester is 12 May. |
|
Here is a list of learning goals covered by the third quiz. |
|||
Wed May 1 |
Classes End |
||
Fri Apr 26 | PC Codes |
LG Assignment:
CZAR. An overview video has been posted. I'll accept late submissions of CZAR without penalty, so long as I get them before grades are due, which this semester is 12 May. |
Cite two pieces of information stored as control information in a stack frame. How is the location of local variables in a function determined? How are these addresses accessed in modern CPUs? What function-specific information is commonly stored in the symbol table? Why is the symbol table the ideal structure? What is a frame or stack frame? What language construct implies the use of stack frames? What is a local variable? Where in memory is a local variable typically stored? What is the difference between callee-saved and caller-saved registers? Which might not exist for a function and why? An activation record is another name for what stack-associated term? Distinguish between the stack-top register and a frame pointer. How is each used in stack and procedure management? |
Lecture slides for program counter manipulations, |
|||
Wed Apr 24 | Expression Evaluation |
LG Assignment:
No LGA, work on ZOBOS! |
Cite two pieces of information stored as control information in a stack frame. How is the location of local variables in a function determined? How are these addresses accessed in modern CPUs? What function-specific information is commonly stored in the symbol table? Why is the symbol table the ideal structure? What is a frame or stack frame? What language construct implies the use of stack frames? What is a local variable? Where in memory is a local variable typically stored? What is the difference between callee-saved and caller-saved registers? Which might not exist for a function and why? An activation record is another name for what stack-associated term? Distinguish between the stack-top register and a frame pointer. How is each used in stack and procedure management? |
Lecture slides for program counter manipulations, |
|||
Mon Apr 22 | Register Counting and Allocation | LG Assignment: |
What is an immediate operand of a machine instruction? How do these affect register counting and allocation? How can machine code for expression tree evaluation be efficiently constructed? Know the Sethi-Ullman numbering algorithm for expression trees. What is register spilling? What is register targeting in the context of register allocation in expression trees? Why would it be advantageous to reorder (a+b)+(c+d) into ((a+b)+c)+d? Why can this NOT be done by a compiler? In on-the-fly register allocation, what are allocatable registers, reserved registers, and work registers? What is a pseudo register or virtual register? |
Slides for register management and expression tree code generation and pseudo code: Sethi-Ullman counting (including function calls), treeCG() expression tree code generation or treeVR() using virtual registers. |
|||
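The Sethi-Ullman numbering named in the goals above can be sketched in a few lines. This is my own minimal illustration (the `Node` class and names are invented, not the course's posted pseudo code), covering the binary-operator case without function calls or spills:

```python
# Sethi-Ullman numbering: minimum registers needed to evaluate a binary
# expression tree without spilling. (Hypothetical Node class, illustration only.)

class Node:
    def __init__(self, op, left=None, right=None):
        self.op, self.left, self.right = op, left, right

def su_number(n):
    """Return the Sethi-Ullman number (register need) of the subtree at n."""
    if n.left is None and n.right is None:
        return 1                      # a leaf needs one register to load
    l, r = su_number(n.left), su_number(n.right)
    # Equal needs: one child's result must be held while the other evaluates,
    # so the total grows by one; otherwise the larger side dominates.
    return l + 1 if l == r else max(l, r)

# (a + b) * (c + d): each addition needs 2 registers; holding one sum while
# computing the other pushes the total to 3.
tree = Node('*', Node('+', Node('a'), Node('b')),
                 Node('+', Node('c'), Node('d')))
print(su_number(tree))   # 3
```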
Fri Apr 19 | ZOBOS |
ZOBOS, a walk-through video is available. |
|
Wed Apr 17 | Stack Management |
LG Assignment:
ZOBOS, a walk-through video is available. |
Cite two pieces of information stored as control information in a stack frame. How is the location of local variables in a function determined? How are these addresses accessed in modern CPUs? Know the difference between procedure-level and block-level frame allocation. What function-specific information is commonly stored in the symbol table? Why is the symbol table the ideal structure? What is a dynamic link of a stack frame? Why is it dynamic? What is a frame or stack frame? What language construct implies the use of stack frames? What is a local variable? Where in memory is a local variable typically stored? What is the difference between callee-saved and caller-saved registers? Which might not exist for a function and why? An activation record is another name for what stack-associated term? Distinguish between the stack-top register and a frame pointer. How is each used in stack and procedure management? Distinguish the use of data areas: global data, heap allocated data, stack allocated data. When are each used? What is an object pointer, how is it used in the addressing of variables in modern CPUs? |
Slides for basic stack management. |
|||
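To make the frame vocabulary above concrete, here is a toy model (entirely my own; a Python list standing in for memory) of procedure-level frame allocation with a dynamic link and fp-relative local variables:

```python
# A toy model of stack frames: each call pushes an activation record holding
# a dynamic link (the caller's frame index) and slots for locals, which are
# addressed by fixed offsets from the frame base.

stack = []          # the runtime stack; list indices play the role of addresses
fp = -1             # frame pointer: index of the current frame's base

def call(nlocals):
    """Push a frame: saved fp (the dynamic link) plus space for locals."""
    global fp
    stack.append(fp)                 # control information: dynamic link
    fp = len(stack) - 1
    stack.extend([None] * nlocals)   # local-variable area

def local(offset, value=None):
    """Access a local by its fixed offset from the frame pointer."""
    if value is not None:
        stack[fp + 1 + offset] = value
    return stack[fp + 1 + offset]

def ret():
    """Pop the frame and restore the caller's fp via the dynamic link."""
    global fp
    old = fp
    fp = stack[old]                  # follow the dynamic link
    del stack[old:]

call(2); local(0, 'x'); local(1, 'y')
call(1); local(0, 'inner')
ret()
print(local(0))   # 'x' -- the outer frame's locals are visible again
```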
Mon Apr 15 | Lecture continued |
LG Assignment:
No LGA. |
Describe a common simple data structure for maintaining a compiler's namespace. What information is associated with a symbol? Name 3 distinct items. What is the namespace of a compiler? What does the namespace store? What special properties of names suggest a relatively simple namespace data structure can be used? What are active scopes? What is a shadowed identifier? What is the current scope? What is the downside to implementing a full-blown symbol table for each scope vs one global table? What operations (API) are required of a symbol table implementation? Cite or describe a simple data structure to manage symbols and their attributes. In the author's SymTable structure, what steps are required to open or close a scope? In the author's SymTable structure, what steps are required to determine if an identifier is shadowed? What are V, L, and H in the author's SymTable structure? What is the scope display of the author's SymTable structure? What is it used for? What is static scoping? |
Symbols and Names slides. |
|||
Fri Apr 12 – Sun Apr 14 |
E-Days - No Lectures |
||
Fri Apr 12 |
Last Day to Withdraw from Courses (No Refund) |
||
Wed Apr 10 | Symbol Tables | LG Assignment: |
Be able to compare and contrast two approaches: one symbol table for all scopes and a separate symbol table for each scope. Describe a common simple data structure for maintaining a compiler's namespace. What information is associated with a symbol? Name 3 distinct items. What is the namespace of a compiler? What does the namespace store? What special properties of names suggest a relatively simple namespace data structure can be used? What are active scopes? What is a shadowed identifier? What is the current scope? What is the downside to implementing a full-blown symbol table for each scope vs one global table? What operations (API) are required of a symbol table implementation? Cite or describe a simple data structure to manage symbols and their attributes. In the author's SymTable structure, what steps are required to open or close a scope? In the author's SymTable structure, what steps are required to determine if an identifier is shadowed? What are V, L, and H in the author's SymTable structure? What is the scope display of the author's SymTable structure? What is it used for? What is static scoping? |
Symbols and Names slides. |
|||
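As one concrete data structure for these goals, here is a generic single-table design with a scope display. Note this is a sketch of the common approach, not the author's SymTable with its V, L, and H components:

```python
# One global table for all scopes: each identifier maps to a stack of
# (scope_level, attrs) bindings; shadowing is just a deeper entry on that
# stack, and the display records what each scope declared so it can be undone.

class SymbolTable:
    def __init__(self):
        self.table = {}      # name -> list of (level, attrs), innermost last
        self.display = [[]]  # per-scope list of names declared there
        self.level = 0

    def open_scope(self):
        self.level += 1
        self.display.append([])

    def close_scope(self):
        for name in self.display.pop():      # undo this scope's declarations
            self.table[name].pop()
            if not self.table[name]:
                del self.table[name]
        self.level -= 1

    def declare(self, name, attrs):
        self.table.setdefault(name, []).append((self.level, attrs))
        self.display[-1].append(name)

    def lookup(self, name):
        return self.table[name][-1][1]       # innermost (current) binding

    def is_shadowed(self, name):
        return len(self.table.get(name, [])) > 1

st = SymbolTable()
st.declare('x', 'int')
st.open_scope()
st.declare('x', 'float')                     # shadows the outer x
print(st.lookup('x'), st.is_shadowed('x'))   # float True
st.close_scope()
print(st.lookup('x'))                        # int
```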
Mon Apr 8 | ASTs and Visitor Patterns for Conventional Languages | LG Assignment: |
The author suggests an AST node have access to what collection of other "nearby" nodes? What is the common AST node structure associated with a sequence or block of statements? What is the difference between a concrete syntax tree and an abstract syntax tree (AST)? What four critical functions does the author recommend for AST processing? What do they do? What are semantic actions? What are semantic values? What is syntax directed translation (SDT)? How are semantic actions different than syntactic actions? Learn how to express logic for parse tree simplification during construction (AKA: syntax directed translation). How are synthesized attributes different than inherited attributes in SDT? How can a λ rule be used to implement forced semantic actions? What is Rule Cloning and what is its downside in parser implementation? Know the (limited) capabilities of semantic actions in a simply implemented LR parser. Understand the delayed execution technique of implementing SDT and semantic actions in an LR parse. |
The trove of lecture slides on SDT implementations in LR parsers. |
|||
Fri Apr 5 | SDT and Semantic Actions |
LG Assignment:
lga-code-prep (basically a laundry list of "code goals") |
What is the difference between a concrete syntax tree and an abstract syntax tree (AST)? What are semantic actions? What are semantic values? What is syntax directed translation (SDT)? How are semantic actions different than syntactic actions? Learn how to express logic for parse tree simplification during construction (AKA: syntax directed translation). How are synthesized attributes different than inherited attributes in SDT? How can a λ rule be used to implement forced semantic actions? What is Rule Cloning and what is its downside in parser implementation? Know the (limited) capabilities of semantic actions in a simply implemented LR parser. Understand the delayed execution technique of implementing SDT and semantic actions in an LR parse. |
SDT nitty-gritty details and terms, example of textbook's semantic actions. The trove of lecture slides on SDT implementations in LR parsers. Here is the group worksheet for evaluating your peers' pseudo code solutions. |
|||
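The delayed-execution idea mentioned above, semantic actions firing at each reduce against a stack of semantic values, can be shown with a hand-driven trace. This is my own toy fragment; the reduce order is supplied manually rather than by a real LR engine:

```python
# Semantic values live on a stack; each "reduce" pops the RHS's values and
# pushes the synthesized attribute. The reduce sequence below is the one an
# LR parser would produce for 2 * 3 + 4.

values = []

def shift(token_value):
    values.append(token_value)

def reduce(rule):
    if rule == 'E -> E + T':
        t = values.pop(); e = values.pop()
        values.append(e + t)        # synthesized attribute: the sum
    elif rule == 'T -> T * F':
        f = values.pop(); t = values.pop()
        values.append(t * f)        # synthesized attribute: the product

shift(2); shift(3); reduce('T -> T * F')   # T.val = 6
shift(4); reduce('E -> E + T')             # E.val = 10
print(values)   # [10]
```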
Wed Apr 3 | Quiz 2 |
LG Assignment:
|
Be able to convert a RE (in the book's notation) to an equivalent NFA. Be able to process a parse-tree for REs creating a structure for an equivalent NFA. |
Here is a list of learning goals covered by the second quiz. A review video of the quiz is here. |
|||
Mon Apr 1 | RE Trees to NFAs |
LG Assignment:
|
|
Lecture slides for generating NFAs from RE ASTs. |
|||
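A minimal sketch of building an NFA from an RE's AST bottom-up, in the spirit of the slides. The fragment representation here is my own: each fragment is a (start, accept, edge-list) triple, with `None` standing in for λ:

```python
from itertools import count

new_state = count()   # fresh state numbers
LAMBDA = None         # edge label used for λ transitions

def atom(ch):
    s, f = next(new_state), next(new_state)
    return s, f, [(s, ch, f)]

def concat(a, b):
    s1, f1, e1 = a
    s2, f2, e2 = b
    return s1, f2, e1 + e2 + [(f1, LAMBDA, s2)]

def union(a, b):
    s1, f1, e1 = a
    s2, f2, e2 = b
    s, f = next(new_state), next(new_state)
    return s, f, e1 + e2 + [(s, LAMBDA, s1), (s, LAMBDA, s2),
                            (f1, LAMBDA, f), (f2, LAMBDA, f)]

def star(a):
    s1, f1, e1 = a
    s, f = next(new_state), next(new_state)
    return s, f, e1 + [(s, LAMBDA, s1), (s, LAMBDA, f),    # enter or skip
                       (f1, LAMBDA, s1), (f1, LAMBDA, f)]  # loop or leave

def match(nfa, text):
    """Simulate the NFA: λ-closure, then step on each input character."""
    s, f, edges = nfa
    def closure(states):
        seen, todo = set(states), list(states)
        while todo:
            q = todo.pop()
            for p, ch, r in edges:
                if p == q and ch is LAMBDA and r not in seen:
                    seen.add(r)
                    todo.append(r)
        return seen
    cur = closure({s})
    for c in text:
        cur = closure({r for p, ch, r in edges if p in cur and ch == c})
    return f in cur

# (a|b)* c, built bottom-up the way a walk of its AST would
nfa = concat(star(union(atom('a'), atom('b'))), atom('c'))
print(match(nfa, 'abac'), match(nfa, 'ab'))   # True False
```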
Fri Mar 29 | Lecture continued | LG Assignment: |
What is the difference between a concrete syntax tree and an abstract syntax tree (AST)? What is syntax directed translation (SDT)? Learn how to express logic for parse tree simplification during construction (AKA: syntax directed translation). Be able to express semantic actions in the context of an LL parse where a node's parent, descendents, and left-hand siblings are known. |
Download the slides for RE Grammar and LL(1) SDT to the RE's AST. Lecture slides for generating NFAs from RE ASTs. |
|||
Wed Mar 27 | SDT and ASTs |
What is syntax directed translation (SDT)? Learn how to express logic for parse tree simplification during construction (AKA: syntax directed translation). |
|
lga-lr-itemsets selected solution resources: question 1a, question 1b, the missing states. Download the slides for RE Grammar and LL(1) SDT to the RE's AST. |
|||
Mon Mar 25 | LR Item Sets | LG Assignment: |
How are Item Follow Sets formed? Know what an Item Set and its Progress Marker represent. What is a reduce-reduce conflict in an LR table construction algorithm? What is a shift-reduce conflict in LR table construction? Be able to calculate Closure and Goto for LR Item Sets. How are Follow Sets used in SLR to resolve reduce-reduce or shift-reduce conflicts in LR(0) tables? How are Follow Sets and the grammar's Item Set graph (CFSM) used to construct SLR parsing tables? |
Shift-reduce LR(0) and SLR(1) table generation lecture slides are here. I won't require that you implement LR(0) or SLR(1) table generation logic in this course, but you will be expected to know how LR item sets and the CFSM are formed (which is to say, be able to demonstrate this in LGAs, quizzes, or exams). While you aren't required to implement the LR table generation algorithms, this is Mines and some students will really want to implement this stuff. Here are the algorithms in both prose and pseudo code: closure, goto, and SLR table generation. |
|||
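Closure and Goto over LR(0) item sets can be prototyped directly from their definitions. A sketch on a toy grammar of my own, where items are (lhs, rhs, dot-position) triples:

```python
# Toy grammar (not from the course): S -> ( S ) | x, augmented with S' -> S.
GRAMMAR = {'S': [('(', 'S', ')'), ('x',)]}
NONTERMS = set(GRAMMAR)

def closure(items):
    """Add an item at dot-position 0 for every production of any nonterminal
    that appears just after a progress marker."""
    items = set(items)
    changed = True
    while changed:
        changed = False
        for lhs, rhs, dot in list(items):
            if dot < len(rhs) and rhs[dot] in NONTERMS:
                for prod in GRAMMAR[rhs[dot]]:
                    item = (rhs[dot], prod, 0)
                    if item not in items:
                        items.add(item)
                        changed = True
    return items

def goto(items, symbol):
    """Advance the progress marker over `symbol`, then take the closure."""
    moved = {(lhs, rhs, dot + 1)
             for lhs, rhs, dot in items
             if dot < len(rhs) and rhs[dot] == symbol}
    return closure(moved)

start = closure({("S'", ('S',), 0)})
print(len(start))            # 3: S' -> .S, S -> .(S), S -> .x
print(goto(start, 'x'))      # the single item S -> x.
```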
Sat Mar 16 – Sun Mar 24 |
Spring Break |
||
Fri Mar 15 | No lecture. | ||
Snow day --- have a great Spring Break! As promised, here is a preview of the next programming project. A due date will be set as soon as we finish syntax directed translation. |
|||
Wed Mar 13 | LR Parsing wrap-up |
LG Assignment:
No LGA for Friday, consider reading the text sections for lga-lr-itemsets! |
In terms of deciding which production rule to use, what is the fundamental advantage of LR parsers over LL parsers? Know what an Item Set and its Progress Marker represent. Understand the shift-reduce parsing algorithm (given an LR parsing table or a simple grammar). What is a reduce-reduce conflict in an LR table construction algorithm? What is a shift-reduce conflict in LR table construction? Be able to calculate Closure and Goto for LR Item Sets. What important property do LR parsers have with respect to the DBL language? How are Follow Sets used in SLR to resolve reduce-reduce or shift-reduce conflicts in LR(0) tables? How are Follow Sets and the grammar's Item Set graph (CFSM) used to construct SLR parsing tables? |
Solutions to lga-lr-knitting knitting-algorithm questions: #1, #2, #3, #4, #5 and #6. Shift-reduce LR(0) and SLR(1) table generation lecture slides are here. I won't require that you implement LR(0) or SLR(1) table generation logic in this course, but you will be expected to know how LR item sets and the CFSM are formed (which is to say, be able to demonstrate this in LGAs, quizzes, or exams). While you aren't required to implement the LR table generation algorithms, this is Mines and some students will really want to implement this stuff. Here are the algorithms in both prose and pseudo code: closure, goto, and SLR table generation. |
|||
Mon Mar 11 | The Knitting Parser |
LG Assignment:
Optional reading section 6.3. |
If an LR(k) parsing table can be formed from a grammar G that generates or defines language L, what ambiguity conclusion can be made about L? In terms of deciding which production rule to use, what is the fundamental advantage of LR parsers over LL parsers? Know what an Item Set and its Progress Marker represent. Understand the shift-reduce parsing algorithm (given an LR parsing table or a simple grammar). What do the L, R, and k stand for in LR(k) parsing? What is a reduce-reduce conflict in an LR table construction algorithm? What is a shift-reduce conflict in LR table construction? What type of derivation do "bottom-up" parsers use (generate)? Be able to calculate Closure and Goto for LR Item Sets. What important property do LR parsers have with respect to the DBL language? How are Follow Sets used in SLR to resolve reduce-reduce or shift-reduce conflicts in LR(0) tables? How are Follow Sets and the grammar's Item Set graph (CFSM) used to construct SLR parsing tables? |
LR Parsing Algorithm slides and pseudo code as well. Enjoy! Pseudo code for shift-reduce parsing. Shift-reduce LR(0) and SLR(1) table generation lecture slides are here. I won't require that you implement LR(0) or SLR(1) table generation logic in this course, but you will be expected to know how LR item sets and the CFSM are formed (which is to say, be able to demonstrate this in LGAs, quizzes, or exams). |
|||
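The shift-reduce loop itself is tiny once a table tells you when to reduce. The sketch below (my own) cheats by reducing greedily whenever a handle sits on top of the stack; that happens to work for this toy grammar, but it is not a substitute for a real LR table:

```python
# Shift-reduce parsing for the toy grammar S -> ( S ) | x.
# Shift tokens onto the stack; whenever the top of the stack matches some
# rule's RHS (a handle), pop it and push the LHS (a reduce).

RULES = [('S', ['(', 'S', ')']), ('S', ['x'])]

def parse(tokens):
    stack, tokens = [], list(tokens) + ['$']
    while True:
        for lhs, rhs in RULES:
            if stack[-len(rhs):] == rhs:
                del stack[-len(rhs):]          # pop the handle...
                stack.append(lhs)              # ...and push the LHS
                break
        else:
            if tokens[0] == '$':
                return stack == ['S']          # accept iff fully reduced
            stack.append(tokens.pop(0))        # no handle: shift

print(parse('((x))'))   # True
print(parse('(x'))      # False
```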
Fri Mar 8 | Woeful lecture |
LG Assignment:
No LGA due Monday. Work on LUTHOR! |
Be able to eliminate left recursion from a grammar in order to make it LL(1). How are left recursive rules refactored so that a grammar might be processed with an LL(1) parser? |
As promised, my solutions for ll1-limitations (zip). You were all correct, 1c is not resolvable to an LL1 grammar. I've added a correcting terminal to the grammar and it works now.
I also noticed a bit of a silly error in 3c, but I'll wait till next semester to correct it --- the error doesn't affect the correctness of the original grammar or its rewrite to LL1. The grammar simply isn't the language I intended to write. Ah yes, and Group question: non-trivial left factorization |
|||
Wed Mar 6 | LL Wrap Up | LG Assignment: |
Be able to write a non-trivial grammar with recursive rules. Be able to factor out common prefixes in order to make an LL(1) grammar. Given a grammar and its LL(1) parsing table, show how a parse tree is created from a token stream. What does a production rule written with left recursion look like? What important property do LL(k) parsers have with respect to the DBL language? What is the DBL, and how is it connected to the ubiquitous if-else in programming? How are left recursive rules refactored so that a grammar might be processed with an LL(1) parser? How are grammars with common prefixes that create intersecting predict sets refactored to become LL(1) languages? |
Lecture notes for LL1 limitations and the Dangling Bracket Language. |
|||
Mon Mar 4 | LL(1) Parsing |
LG Assignment:
lga-ll1-parsing (see below for when this LGA will be checked). |
Be able to factor out common prefixes in order to make an LL(1) grammar. Given a grammar and its LL(1) parsing table, show how a parse tree is created from a token stream. How do you build an LL(1) parsing table from a grammar that is LL(1)? How is a predict set calculated from first sets, follow sets, and "derives to λ"? What does a production rule written with left recursion look like? What is a predict set? What important property do LL(k) parsers have with respect to the DBL language? What is the DBL, and how is it connected to the ubiquitous if-else in programming? What property of its predict sets makes a grammar not LL(1)? How are left recursive rules refactored so that a grammar might be processed with an LL(1) parser? How are grammars with common prefixes that create intersecting predict sets refactored to become LL(1) languages? Be able to write a non-trivial LL(1) language with recursive rules and operators (precedence and associativity). |
Submit to the course website two things
Submit both as simple text files or screenshots; please no invites to your repo or URLs to pastebins. CFG for challenge question. show_chl-predict-sets.pdf group question from today. Here are the derives to λ, first, and follow sets for the grammar, if you'd like to check them against your lga-predict-sets CFG group code. lga-ll1-parsing: the coding questions will be checked on Friday, the 8th. The RE Grammar question will be checked at the next lecture. Students may find my python script tree-to-graphvis helpful in visualizing their code results. See the pydoc string in the file for example usage. You will want to
or invoke as an argument of
Lecture notes for LL1 limitations and the Dangling Bracket Language. |
|||
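The table-driven LL(1) algorithm worked through in lecture can be sketched in about a dozen lines. The grammar and table here are my own toy example (balanced parentheses), not the course's:

```python
# Table-driven LL(1) parsing for the toy grammar
#   S -> ( S ) S | λ
# The table maps (nonterminal, lookahead) to a production's RHS.

TABLE = {
    ('S', '('): ['(', 'S', ')', 'S'],
    ('S', ')'): [],          # λ production
    ('S', '$'): [],
}

def parse(tokens):
    tokens = list(tokens) + ['$']
    stack = ['$', 'S']                 # parse stack, start symbol on top
    while stack:
        top = stack.pop()
        if top == tokens[0]:           # terminal (or $): match and advance
            tokens.pop(0)
        elif (top, tokens[0]) in TABLE:
            # predict: replace the nonterminal with its RHS, reversed so the
            # leftmost symbol ends up on top of the stack
            stack.extend(reversed(TABLE[(top, tokens[0])]))
        else:
            return False               # no table entry: syntax error
    return not tokens

print(parse('(())()'))   # True
print(parse('(()'))      # False
```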
Fri Mar 1 | No Lecture | ||
Wed Feb 28 | Midterm Exam | Learning goals covered by the exam are here. A review video is here, forgive my "momentary brain mis-fire" around the predict set of C →λ. I didn't have time to edit the video, apologies. The Quiz 1 review video probably has a better explanation of the operator precedence question at about 9:20 into the video. |
|
Mon Feb 26 | LUTHOR |
LG Assignment:
lga-ll1-tables will be group checked on Monday 4 March. LUTHOR. These LUTHOR-specific slides are a slightly more visual representation of the write-up example explanation. |
Given a grammar and its LL(1) parsing table, show how a parse tree is created from a token stream. How do you build an LL(1) parsing table from a grammar that is LL(1)? What does a production rule written with left recursion look like? How are left recursive rules refactored so that a grammar might be processed with an LL(1) parser? How are grammars with common prefixes that create intersecting predict sets refactored to become LL(1) languages? Be able to cite examples of implementation challenges and techniques for a "production" token scanner. |
Submit to the course website two things
Submit both as simple text files or screenshots; please no invites to your repo or URLs to pastebins. |
|||
Fri Feb 23 | Predict Sets & LL(1) Languages Continued |
Read through the LUTHOR project so we can discuss on Monday. |
Given a grammar and its LL(1) parsing table, show how a parse tree is created from a token stream. How do you build an LL(1) parsing table from a grammar that is LL(1)? What property of its predict sets makes a grammar not LL(1)? |
Reminder: lga-predict-sets will be "checked" at the next lecture. At least one group member should bring a computing device with an Internet connection and capable of running your current code base. Here is pseudo code for the LLT parsing algorithm we'll work through in lecture today. Download lecture slides showing predict calculations and data structure evolution during an LL parse. |
|||
Wed Feb 21 | CFG Algorithms wrap-up | LG Assignment: |
How are top-down parsers that don't use LL tables constructed? How is a predict set calculated from first sets, follow sets, and "derives to λ"? What do the two Ls and the k stand for in "LL(k)" parsing? What is a predict set? What property of its predict sets makes a grammar not LL(1)? |
"Introducing predict sets" slides. |
|||
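The predict-set definition above reduces to a few lines once first sets, follow sets, and derives-to-λ are in hand. The sets below are hand-computed for a toy grammar of my own:

```python
# Predict(A -> α) = First(α), plus Follow(A) if α can derive λ.

def first_of_seq(seq, FIRST, NULLABLE):
    """First set of a grammar sequence, and whether the whole sequence
    can derive λ."""
    out = set()
    for sym in seq:
        out |= FIRST[sym]
        if sym not in NULLABLE:
            return out, False      # this symbol blocks λ
    return out, True               # every symbol (or none) was nullable

def predict(lhs, rhs, FIRST, FOLLOW, NULLABLE):
    out, nullable = first_of_seq(rhs, FIRST, NULLABLE)
    if nullable:
        out |= FOLLOW[lhs]
    return out

# Toy grammar: S -> A b ;  A -> a | λ
FIRST = {'a': {'a'}, 'b': {'b'}, 'A': {'a'}, 'S': {'a', 'b'}}
FOLLOW = {'S': {'$'}, 'A': {'b'}}
NULLABLE = {'A'}

print(predict('S', ['A', 'b'], FIRST, FOLLOW, NULLABLE))  # {'a', 'b'}
print(predict('A', [], FIRST, FOLLOW, NULLABLE))          # {'b'}
```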
Mon Feb 19 – Tue Feb 20 |
Presidents' Day Break - No Lectures |
||
Fri Feb 16 | CFG Algorithms | LG Assignment: |
Determine what terminals are in the first set of a grammar sequence. Determine what terminals are in the follow set of a grammar symbol. Determine whether a "symbol derives to λ". Given a grammar, be able to derive a valid sentence of the grammar. Gain experience and insight into the software design and trade-offs of a context free grammar class (object, API). |
Some pseudo code for derives to lambda, first sets and follow sets; and some lecture slides. For those of you who don't like typing, here is the sample grammar used in the LGA. This LGA is the same assignment for everyone, so there is no reason you need a group to divvy up the work. If your group has already decided on one or two (three?) preferred languages to share code in, you are welcome to complete this LGA pair-programming style. |
|||
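Both derives-to-λ and first sets are fixed-point computations: keep applying the rules until nothing changes. A sketch on a toy grammar of my own (this is not the posted pseudo code):

```python
# A CFG as lhs -> list of RHSs; [] is the λ production.
GRAMMAR = {
    'S': [['A', 'B']],
    'A': [['a'], []],
    'B': [['b'], []],
}
TERMS = {'a', 'b'}

def derives_to_lambda(g):
    """A nonterminal derives λ if some RHS is made entirely of nullable symbols."""
    nullable = set()
    changed = True
    while changed:
        changed = False
        for lhs, rhss in g.items():
            if lhs not in nullable and any(
                    all(s in nullable for s in rhs) for rhs in rhss):
                nullable.add(lhs)
                changed = True
    return nullable

def first_sets(g, nullable):
    """First(A) accumulates First(sym) for each RHS prefix of nullable symbols."""
    first = {t: {t} for t in TERMS}
    first.update({nt: set() for nt in g})
    changed = True
    while changed:
        changed = False
        for lhs, rhss in g.items():
            for rhs in rhss:
                for sym in rhs:
                    new = first[sym] - first[lhs]
                    if new:
                        first[lhs] |= new
                        changed = True
                    if sym not in nullable:
                        break          # a non-nullable symbol ends the prefix
    return first

NULLABLE = derives_to_lambda(GRAMMAR)
print(sorted(NULLABLE))                            # ['A', 'B', 'S']
print(sorted(first_sets(GRAMMAR, NULLABLE)['S']))  # ['a', 'b']
```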
Wed Feb 14 | Grammar Symbol Sets (cont) | LG Assignment: |
Determine what terminals are in the first set of a grammar sequence. Determine what terminals are in the follow set of a grammar symbol. Determine whether a "symbol derives to λ". Given a grammar, be able to derive a valid sentence of the grammar. |
Introduction to CFG algorithm analysis (first and follow sets).... Some pseudo code for derives to lambda, first sets and follow sets; and some lecture slides. Note: many online references put Now, when we get to building a parsing engine, we will need to artificially add |
|||
Mon Feb 12 | First Sets | LG Assignment: |
Determine what terminals are in the first set of a grammar sequence. Determine what terminals are in the follow set of a grammar symbol. Determine whether a "symbol derives to λ". What are terminals, non-terminals, λ, and $ in a CFG? Gain experience and insight into the software design and trade-offs of a context free grammar class (object, API). What is a parse tree, what characteristics does it have? |
CFG implementation in code (group question). Students will want to complete their (previous) learning group evaluations sooner rather than later --- future course uploads (both programming projects and learning group "checks") require your survey to be completed. No reason to stress out over a deadline when you don't need to... Introduction to CFG algorithm analysis (first and follow sets).... |
|||
Fri Feb 9 | Quiz 1 | Here is a list of learning goals covered by the first quiz. A review video has been posted. |
|
Wed Feb 7 | Quiz prep | LG Assignment: |
Determine what terminals are in the first set of a grammar sequence. Determine what terminals are in the follow set of a grammar symbol. What are terminals, non-terminals, λ, and $ in a CFG? Gain experience and insight into the software design and trade-offs of a context free grammar class (object, API). |
CFG implementation in code (group question). Introduction to CFG algorithm analysis (first and follow sets).... |
|||
Mon Feb 5 | Ambiguity in Grammars | LG Assignment: |
Be able to identify (seemingly) ambiguous grammars; what demonstrates an ambiguous grammar? Given a grammar, be able to derive a valid sentence of the grammar. How are left-associativity and right-associativity connected to recursive production rules? When might we suspect that a grammar is ambiguous? Be able to deduce associativity and precedence rules for operators in a CFG. Given a binary operator, △, know how to write its production rule for left or right associativity. Given two binary operators △ and □, know how to write a simple grammar to express their relative precedence. Can a BNF grammar represent different languages than a CFG? What is different between a BNF grammar and a CFG? Be able to write a non-trivial LL(1) language with recursive rules and operators (precedence and associativity). |
Regular volleyball group question slides. Lecture slides for ambiguous grammars. Lecture slides for arithmetic precedence and associativity expressed through context free grammars. |
|||
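As a worked illustration of the associativity and precedence goals above (my own toy grammar, not one from the slides): left recursion makes an operator left-associative, and the operator placed lower in the grammar binds tighter.

```
E -> E + T | T        left recursion: '+' groups left-to-right
T -> T * F | F        '*' sits lower in the grammar, so it binds tighter
F -> ( E ) | id
```

Deriving id + id * id from E forces the multiplication beneath the addition in the parse tree, matching conventional precedence.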
Fri Feb 2 | NFAMATCH |
LG Assignment:
No LGA for Monday --- work on NFAMATCH! (startin' the fire) |
What is a reduced grammar? In what way are REs limited in their description of languages? What features of a language are REs able to describe? Be able to optimize a DFA to a smaller form by reconciling indistinguishable states. Be able to describe an algorithm for removing dead and unreachable states in a DFA. Be able to convert an NFA to an (unoptimized) DFA. How are the terms regular language and regular set connected to DFAs? How are the terms regular language, regular set, and regular expression interrelated? Understand and implement token matching given an optimized DFA transition table. |
RL limitations and the pumping lemma. Reduced grammar group question. |
|||
Wed Jan 31 | No Lecture (Career Day!) | ||
Tue Jan 30 – Wed Jan 31 |
Career Days (No Assessments) |
||
Mon Jan 29 | Context Free Grammars |
LG Assignment:
(startin' the fire) |
Given a grammar, be able to derive a valid sentence of the grammar. Know the difference between a leftmost derivation and a rightmost derivation. Know the notational conventions for CFGs used in the book. What are terminals, non-terminals, λ, and $ in a CFG? What is a sentential form? Be able to design simple languages and grammars with nested productions and recursive rules. What is a regular grammar? What two production rule patterns are associated with regular grammars? In what way are REs limited in their description of languages? What features of a language are REs able to describe? Be able to optimize a DFA to a smaller form by reconciling indistinguishable states. Be able to describe an algorithm for removing dead and unreachable states in a DFA. Be able to convert an NFA to an (unoptimized) DFA. Understand and implement token matching given an optimized DFA transition table. |
Three DFA transition tables for you to test your LGA code on: dfa0-dfa-tt.txt, dfa1-dfa-tt.txt, dfa2-dfa-tt.txt. Submit your aptly named results on the course homepage in the "lgcOptimalDFA" slot. You can upload each result file individually --- just be sure they are named differently! This "learning group check" is worth four participation points for all members in attendance today. My solutions for the learning group check: dfa0-optdfa-tt.txt, dfa1-optdfa-tt.txt, dfa2-optdfa-tt.txt. Challenge question matching nested structures with regular expressions. You may want to download and have at the ready this grammar from the lecture. Here are the context free grammar intro slides. |
|||
Fri Jan 26 | Optimizing DFAs | LG Assignment: |
Be able to optimize a DFA to a smaller form by reconciling indistinguishable states. Be able to describe an algorithm for removing dead and unreachable states in a DFA. What are dead states and unreachable states in FAs? Be able to cite examples of implementation challenges and techniques for a "production" token scanner. |
Merging indistinguishable states example. Pseudo code for DFA "optimization" (you might want to download and have this available during lecture). Slides for "mostly optimizing" DFAs with the MergeStates algorithm on DFAs. |
|||
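Merging indistinguishable states is a partition refinement: start from {accepting, non-accepting} and split any group whose members disagree on which group an input symbol leads to. A sketch on a toy DFA of my own (it accepts strings ending in 'b'):

```python
DELTA = {   # state -> {symbol: next state}
    0: {'a': 1, 'b': 2},
    1: {'a': 1, 'b': 3},
    2: {'a': 1, 'b': 2},
    3: {'a': 1, 'b': 2},
}
ACCEPT = {2, 3}

def minimize(delta, accept):
    states = set(delta)
    groups = [accept, states - accept]          # the initial split
    changed = True
    while changed:
        changed = False
        for g in list(groups):
            # A state's signature: which current group each symbol leads to.
            def sig(q):
                return tuple(next(i for i, h in enumerate(groups)
                                  if delta[q][c] in h)
                             for c in sorted(delta[q]))
            buckets = {}
            for q in g:
                buckets.setdefault(sig(q), set()).add(q)
            if len(buckets) > 1:                # members disagree: split g
                groups.remove(g)
                groups.extend(buckets.values())
                changed = True
    return groups

# 0 and 1 merge, as do 2 and 3: the minimal DFA has two states.
print(sorted(sorted(g) for g in minimize(DELTA, ACCEPT)))   # [[0, 1], [2, 3]]
```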
Wed Jan 24 | NFA to DFA | LG Assignment: |
Be able to convert a DFA into a RE (in the book's notation). How are the terms regular language and regular set connected to DFAs? |
You may want to download (before lecture) some algorithm PDFs so you have ready access to them while we tease out how they work. NFA to DFA algorithm lecture slides. |
|||
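The NFA-to-DFA subset construction can be prototyped with frozensets as DFA states. This sketch (my own toy NFA, written without λ moves to keep the closure step out of the picture) discovers DFA states with a worklist:

```python
NFA = {   # state -> {symbol: set of next states}; recognizes (a|b)*ab
    0: {'a': {0, 1}, 'b': {0}},
    1: {'b': {2}},
    2: {},
}
START, ACCEPT = {0}, {2}

def subset_construction(nfa, start):
    """Each DFA state is the set of NFA states reachable on the same input."""
    def move(states, ch):
        out = set()
        for q in states:
            out |= nfa[q].get(ch, set())
        return frozenset(out)

    alphabet = {c for q in nfa for c in nfa[q]}
    start = frozenset(start)
    dfa, todo = {}, [start]
    while todo:
        s = todo.pop()
        if s in dfa:
            continue                   # already expanded this DFA state
        dfa[s] = {c: move(s, c) for c in alphabet}
        todo.extend(dfa[s].values())   # newly reachable subsets
    return dfa

dfa = subset_construction(NFA, START)
print(len(dfa))   # 3 DFA states: {0}, {0,1}, {0,2}
```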
Wed Jan 24 |
Last Day to DROP |
||
Mon Jan 22 | NFAs | LG Assignment: |
Be able to convert a RE (in the book's notation) to an equivalent NFA. Be able to optimize a DFA to a smaller form by reconciling indistinguishable states. What are dead states and unreachable states in FAs? What is the difference(s) between an NFA and a DFA? Be able to convert an NFA to an (unoptimized) DFA. |
As promised, my solutions for lga-dfa, with my solution to #1 corrected of course. lga-dfa question three's very large NFA, if you're curious. Lecture slides on NFAs and REs→NFAs. The motivating challenge question for NFAs. Note that figure 3.22 of the text is incorrect. There should be a λ transition from the starting node (to the left of the "FA for A" box) to the accepting node to the right of the box. Without this, the figure actually shows the NFA for A+. |
|||
Fri Jan 19 | DFAs ≡ REs | LG Assignment: |
Cite two ways that a FA (DFA or NFA) may be represented in computer memory. Understand their tradeoffs. Given the rules for Lex, be able to write REs compatible with Lex. Understand the basic constructs in a How are the terms regular language, regular set connected to DFAs? |
Wed Jan 17 | Regular Expressions | LG Assignment: |
What are runtime semantics? What features of a language are REs able to describe? Be able to construct a RE (in the book's notation) for particular token patterns. How are the terms regular language, regular set, and regular expression interrelated? Know the (book's) notation and terminology for regular expressions. Know the meaning of RE notations: AB, A|B, A∗, A+ and Aκ (κ a constant positive integer) Know the meaning of λ, single characters s ∈ Σ, and T ⊆ Σ as they relate to the book's RE notation. What are the three basic operations in REs from which all other operations may be derived. |
Tue Jan 16 | |||
Mon Jan 15 |
MLK Holiday - Campus Closes |
||
Fri Jan 12 | Groups! | LG Assignment: |
A context free grammar defines what particular facet of a programming language? What does a CFG guarantee? Complete the following: "CFGs confirm a program has valid ----, static semantics confirm a program is ---." What are static semantics? Cite examples. Other than pure and augmented machine code, what other type of machine code might a compiler generate? What is augmented machine code? How does it differ from pure machine code? What is bootstrapping a compiler? Have all compilers been bootstrapped? What is pure machine code? Be able to cite an example. What is the difference between absolute binary and relocatable binary forms of machine code? What is the difference between the target of a compiler and the kind of machine code generated by the compiler? What other target code format, other than absolute and relocatable, is a common output of compilers? How is an interpreter different than a compiler? Are there advantages to an interpreted language? What are runtime semantics? |
Grammars to Parse Trees to Code (non-semantic checking examples). You can also geek out on the shunting yard algorithm, but this isn't in the learning goals for the course. |
|||
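For anyone geeking out on the shunting-yard algorithm mentioned above, here is a minimal sketch (my own: single-character tokens, left-associative operators only):

```python
# Shunting-yard: infix to postfix using an operator stack. For
# left-associative operators, pop while the stack's top has
# higher-or-equal precedence, then push the incoming operator.

PREC = {'+': 1, '-': 1, '*': 2, '/': 2}

def to_postfix(tokens):
    out, ops = [], []
    for t in tokens:
        if t in PREC:
            while ops and ops[-1] != '(' and PREC[ops[-1]] >= PREC[t]:
                out.append(ops.pop())
            ops.append(t)
        elif t == '(':
            ops.append(t)
        elif t == ')':
            while ops[-1] != '(':
                out.append(ops.pop())
            ops.pop()                 # discard the '('
        else:
            out.append(t)             # operand
    while ops:
        out.append(ops.pop())
    return ''.join(out)

print(to_postfix('a+b*c'))      # abc*+
print(to_postfix('(a+b)*c'))    # ab+c*
```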
Wed Jan 10 | Introductions |
|
Understand how learning groups will be used in this course, and what the expectations are for your shared work. Understand how your course grade will be calculated and how your participation points may affect your grade. Understand the Collaboration Policy for individual assignments. |
What about Canvas? We won't use Canvas in this course. Here are the intro slides from the first lecture. |
|||
Tue Jan 9 |
Classes Start |