
CS 730: Text Mining for Social Media & Collaboratively Generated Content
Lecture 3: Parsing and Chunking (9/27/2010)

Big picture of the course
Language models (word, n-gram, ...)
Classification and sequence models
WSD, part-of-speech tagging
Syntactic parsing and tagging
Semantics
Next week: Information extraction
Week after next: no class (Fall break)
Part II: Social media (research papers start)

Natural Language Semantics

Programming Language Interpreter
What is the meaning of 3+5*6? First parse it into 3+(5*6).
[Parse tree for 3+(5*6) over nonterminals E, F, N, with leaves 3, 5, 6.]

Programming Language Interpreter
What is the meaning of 3+5*6? First parse it into 3+(5*6).
Now give a meaning to each node in the tree (bottom-up): the leaves 3, 5, and 6 denote themselves, the node for 5*6 denotes mult(5,6) = 30, and the root node for 3+(5*6) denotes add(3,30) = 33.

Interpreting in an Environment
How about 3+5*x? Same thing: the meaning of x is found from the environment (it's 6), so the tree again evaluates bottom-up to 33.
Analogies in language?

Compiling
How about 3+5*x? We don't know x at compile time.
The meaning at a node is a piece of code, not a number: the root compiles to add(3, mult(5, x)).
5*(x+1)-2 is a different expression that produces equivalent code (it can be converted to the previous code by optimization).
Analogies in language?
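A minimal sketch (not from the slides) of this interpret-vs-compile contrast, using an illustrative tuple encoding of the parse tree:

```python
# Minimal sketch of bottom-up "interpretation" vs. "compilation" of 3+5*x.
# The tuple-based tree format and function names are illustrative, not from the slides.

def interpret(tree, env):
    """Evaluate a parse tree bottom-up, looking up variables in an environment."""
    if isinstance(tree, int):
        return tree
    if isinstance(tree, str):
        return env[tree]                      # the meaning of x comes from the environment
    op, left, right = tree
    lval, rval = interpret(left, env), interpret(right, env)
    return lval + rval if op == "+" else lval * rval

def compile_expr(tree):
    """Return code (here just a string) instead of a number, since x is unknown at compile time."""
    if isinstance(tree, (int, str)):
        return str(tree)
    op, left, right = tree
    fn = "add" if op == "+" else "mult"
    return f"{fn}({compile_expr(left)},{compile_expr(right)})"

tree = ("+", 3, ("*", 5, "x"))                # parse of 3+5*x
print(interpret(tree, {"x": 6}))              # 33
print(compile_expr(tree))                     # add(3,mult(5,x))
```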

What Counts as Understanding? some notions
We understand if we can respond appropriately:
ok for commands and questions (these demand a response): "Computer, warp speed 5", "throw axe at dwarf", "put all of my blocks in the red box"; imperative programming languages, database queries and other questions.
We understand a statement if we can determine its truth:
ok, but if you knew whether it was true, why did anyone bother telling it to you?
A comparable notion for understanding an NP is to compute what the NP refers to, which might be useful.

What Counts as Understanding? some notions
We understand a statement if we know how to determine its truth:
What are the exact conditions under which it would be true? (necessary + sufficient)
Equivalently, derive all its consequences: what else must be true if we accept the statement? Philosophers tend to use this definition.
We understand a statement if we can use it to answer questions [very similar to the above; requires reasoning]:
Easy: "John ate pizza." What was eaten by John?
Hard: "White's first move is P-Q4." Can Black checkmate?
Constructing a procedure to get the answer is enough.

What Counts as Understanding? some notions
Be able to translate (depends on the target language):
English to English? bah humbug!
English to French? reasonable
English to Chinese? requires deeper understanding
English to logic? deepest; the ultimate goal of NLP/NLU!
"all humans are mortal" = ∀x [human(x) → mortal(x)]
We can use logic-manipulating rules to tell us how to act, draw conclusions, answer questions.
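As a toy illustration (not from the slides) of the "English to logic" target, the formula ∀x [human(x) → mortal(x)] can be checked against a small finite model:

```python
# Toy finite model; the entities and predicate extensions are invented for illustration.
entities = {"socrates", "plato", "rover"}
human = {"socrates", "plato"}
mortal = {"socrates", "plato", "rover"}

# forall x [human(x) -> mortal(x)]
all_humans_mortal = all((x not in human) or (x in mortal) for x in entities)
print(all_humans_mortal)  # True
```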

Levels of NL Processing
Words: concepts/units of meaning; word similarity/semantic distance; ontologies and folksonomies.
Sentences: relationships between concepts; deep parsing/semantic annotation; semantic role labeling.
Documents: events, arguments, themes (next week); discourse processing; event/fact extraction.

Word Similarity

Word Similarity & Distance

Word Clustering

Word Similarity Approaches
Thesaurus-based (using WordNet): these approaches define a similarity score directly.
Distributional similarity (using co-occurrences): these approaches represent words as points in an N-dimensional space and require an appropriate distance measure for that space.
Ontologies & folksonomies (Wikipedia).

WordNet
Large lexical database of English: nouns, verbs, adjectives, adverbs.
Senses grouped into synonym sets (synsets).
Synset (semantic) relations; word (lexical) relations.
http://wordnet.princeton.edu/

Thesaurus Similarity
Basic idea: a thesaurus like WordNet contains all the information needed to compute a semantic distance metric.
Simplest instance: compute distance in WordNet.
sim(s, s') = -log pathlen(s, s'), where pathlen(s, s') is the number of edges in the shortest path between s and s'.
Note: WordNet nodes are synsets (= word senses). Applying this to words w, w':
sim(w, w') = max over s in Senses(w), s' in Senses(w') of sim(s, s').

Path length distance
Path-length distance is the length of the shortest path between nodes in WordNet.
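A minimal sketch of the max-over-senses idea using NLTK's WordNet interface (assuming nltk and its wordnet data are installed); note that NLTK's path_similarity scores 1/(pathlen+1) rather than -log pathlen:

```python
# Word similarity as the max over sense pairs, using NLTK's WordNet interface.
# Requires: pip install nltk, then nltk.download('wordnet').
from nltk.corpus import wordnet as wn

def word_similarity(w1, w2):
    """Max path-based similarity over all synset pairs of the two words."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1)
              for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

print(word_similarity("cat", "dog"))         # relatively high
print(word_similarity("cat", "democracy"))   # lower
```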

Information Content (IC) Similarity

IC Similarity

Lowest Common Subsumer (LCS)

LCS-based Similarity

Problems with Thesaurus Similarity
We need a thesaurus! (not available for all languages)
We need a thesaurus that contains the words we're interested in.
We need a thesaurus that captures a rich hierarchy of hypernyms and hyponyms.

Learning Hyponym Relations

Distributional Similarity

Computing Distributional Similarity

How to compute co-occurrences?

Co-occurrence Examples

Using Grammatical Features

Measuring Association with Context

Pointwise Mutual Information (PMI)

Using PMI for Word Association

Frequencies vs. PMI
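The PMI slides' figures are not reproduced here; as a hedged reminder, the standard definition PMI(w, c) = log [ P(w, c) / (P(w) P(c)) ] can be sketched over toy co-occurrence counts (the data below is invented):

```python
import math
from collections import Counter

# Toy (word, context) co-occurrence pairs; the data is invented for illustration.
pairs = [("cat", "pet"), ("cat", "meow"), ("dog", "pet"), ("dog", "bark"), ("cat", "pet")]

pair_counts = Counter(pairs)
word_counts = Counter(w for w, _ in pairs)
ctx_counts = Counter(c for _, c in pairs)
total = len(pairs)

def pmi(word, context):
    """PMI(w, c) = log P(w, c) / (P(w) P(c)); 0 if the pair never co-occurs."""
    joint = pair_counts[(word, context)] / total
    if joint == 0:
        return 0.0
    return math.log(joint / ((word_counts[word] / total) * (ctx_counts[context] / total)))

print(pmi("cat", "meow"))   # positive: "meow" is strongly associated with "cat"
print(pmi("dog", "pet"))    # near zero or negative: "pet" is shared across words
```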

Measuring Distributional Similarity

Measuring Distributional Similarity (cont'd)
Dot product; normalized dot product (cosine).
KL divergence; KL + JS divergence.
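A minimal sketch (illustrative, not from the slides) of two of these measures, cosine over co-occurrence vectors and Jensen-Shannon divergence over co-occurrence distributions:

```python
import math

def cosine(u, v):
    """Normalized dot product between two co-occurrence vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def kl(p, q):
    """Kullback-Leibler divergence D(p || q); assumes q > 0 wherever p > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: symmetrized KL against the mixture distribution."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy co-occurrence vectors over the same context vocabulary (invented numbers).
cat = [4, 1, 0, 2]
dog = [3, 0, 1, 2]
print(cosine(cat, dog))
total_c, total_d = sum(cat), sum(dog)
print(js([x / total_c for x in cat], [x / total_d for x in dog]))
```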

Using Wikipedia Instead of Thesaurus
Wikipedia-based semantics: from NLP (text) to meaning.
Every Wikipedia article represents a concept (e.g., the article "Leopard" is a concept).
Wikipedia can be viewed as an ontology: a collection of concepts (e.g., Ocean, Leopard, Isaac Newton).
The semantics of a word is a vector of its associations with Wikipedia concepts; in the slides' figure, a word is linked to concepts such as Leopard, Pets, Tennis ball, Cat, Elephant, House, Veterinarian, Mouse.

Explicit Semantic Analysis (ESA) [Gabrilovich & Markovitch, IJCAI 2007; JAIR 2009]
Example concept-to-term associations from the slides:
Panthera: Cat [0.92], Leopard [0.84], Roar [0.77]
Cat: Cat [0.95], Felis [0.74], Claws [0.61]
Tom & Jerry: Animation [0.85], Mouse [0.82], Cat [0.67]
So the word "Cat" gets the concept vector Cat (0.95), Panthera (0.92), Tom & Jerry (0.67).
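A minimal sketch of the ESA idea using the toy numbers above; the cosine step, the sparse-dict layout, and the "leopard" vector are illustrative assumptions, not the paper's full TF-IDF pipeline:

```python
import math

# Toy ESA-style word vectors over Wikipedia concepts. The "cat" entries mirror the
# scores shown above; the "leopard" entries are invented so the example runs.
esa = {
    "cat":     {"Cat": 0.95, "Panthera": 0.92, "Tom & Jerry": 0.67},
    "leopard": {"Panthera": 0.84, "Cat": 0.30},   # illustrative only
}

def relatedness(w1, w2, vectors):
    """Cosine between two sparse concept vectors."""
    v1, v2 = vectors[w1], vectors[w2]
    dot = sum(v1[c] * v2.get(c, 0.0) for c in v1)
    n1 = math.sqrt(sum(x * x for x in v1.values()))
    n2 = math.sqrt(sum(x * x for x in v2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

print(relatedness("cat", "leopard", esa))
```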

Using Wiktionary
Computing semantic relatedness [Zesch et al., AAAI 2008]: Wiktionary > WordNet, GermaNet.
Using Wikipedia and Wiktionary in IR [Mueller & Gurevych, CLEF 2008]: combining the concept space of Wikipedia and Wiktionary.

Levels of NL Processing
Words: concepts/units of meaning; word similarity/semantic distance; ontologies and folksonomies.
Sentences: relationships between concepts; deep parsing/semantic annotation; semantic role labeling.
Documents: events, arguments, themes (next week); discourse processing; event/fact extraction.

Goal: Logical Representation
Three major kinds of objects:
1. Booleans: roughly, the semantic values of sentences.
2. Entities: values of NPs, e.g., objects like this slide; maybe also other types of entities, like times.
3. Functions of various types: a function returning a boolean is called a predicate, e.g., frog(x), green(x). Functions might return other functions! Functions might take other functions as arguments!

Logic: Grounding
Eventually we have to ground out in a primitive term.
Primitive terms are bound to object code.
What is executed by loves(john, mary)?

Logic: Interesting Constants
Thus, we have constants that name some of the entities and functions (e.g., times):
GeorgeWBush: an entity.
red: a predicate on entities; it holds of just the red entities: red(x) is true if x is red!
loves: a predicate on 2 entities, e.g., loves(GeorgeWBush, LauraBush).
Question: what does loves(LauraBush) denote?
Constants are used to define meanings of words; meanings of phrases will be built from the constants.
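One way to read that question (a sketch, not from the slides): under a curried encoding with an illustrative choice of argument order, loves(LauraBush) denotes a one-place predicate over the remaining argument:

```python
# A curried encoding of the two-place predicate loves, over a toy fact set.
# The entity names in FACTS and the argument-order convention are illustrative.
FACTS = {("GeorgeWBush", "LauraBush")}

def loves(x):
    """Partially applying loves yields a one-place predicate over the other argument."""
    return lambda y: (x, y) in FACTS

loves_george = loves("GeorgeWBush")       # the predicate "is loved by GeorgeWBush"
print(loves_george("LauraBush"))          # True
print(loves("LauraBush")("GeorgeWBush"))  # False in this toy fact set
```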

A reasonable representation?
"Gilly swallowed a goldfish."
First attempt: swallowed(Gilly, goldfish).
Returns true or false. Analogous to prime(17), equal(4, 2+2), loves(GeorgeWBush, LauraBush), swallowed(Gilly, Jilly)... or is it analogous?

A reasonable representation?
"Gilly swallowed a goldfish."
First attempt: swallowed(Gilly, goldfish).
But we're not paying attention to "a"! goldfish isn't the name of a unique object the way Gilly is.
In particular, we don't want "Gilly swallowed a goldfish and Milly swallowed a goldfish" to translate as swallowed(Gilly, goldfish) AND swallowed(Milly, goldfish), since it is probably not the same goldfish.

Use a Quantifier
"Gilly swallowed a goldfish."
First attempt: swallowed(Gilly, goldfish).
Better: ∃g goldfish(g) AND swallowed(Gilly, g).
Or using one of our quantifier predicates: exists(λg goldfish(g), λg swallowed(Gilly, g)).
Equivalently: exists(goldfish, swallowed(Gilly)): in the set of goldfish there exists one swallowed by Gilly.
Here goldfish is a predicate on entities; this is the same semantic type as red. But goldfish is a noun and red is an adjective... #@!?
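A small sketch (toy model; the goldfish entities and facts are invented) of exists as a higher-order predicate over two one-place predicates, matching exists(goldfish, swallowed(Gilly)):

```python
# Toy model: entities plus two one-place predicates, with 'exists' as a
# higher-order quantifier. Entity names and facts are illustrative.
ENTITIES = {"Gilly", "Milly", "bubbles", "flipper"}
GOLDFISH = {"bubbles", "flipper"}
SWALLOWED = {("Gilly", "bubbles")}

goldfish = lambda g: g in GOLDFISH
swallowed = lambda x: lambda g: (x, g) in SWALLOWED   # curried: swallowed("Gilly") is a predicate

def exists(restrictor, body):
    """True if some entity satisfies both predicates."""
    return any(restrictor(e) and body(e) for e in ENTITIES)

print(exists(goldfish, swallowed("Gilly")))   # True: Gilly swallowed a goldfish
print(exists(goldfish, swallowed("Milly")))   # False
```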

Tense
"Gilly swallowed a goldfish."
Previous attempt: exists(goldfish, λg swallowed(Gilly, g)).
Improve to use tense: instead of the 2-arg predicate swallowed(Gilly, g), try a 3-arg version swallow(t, Gilly, g) where t is a time.
Now we can write: ∃t past(t) AND exists(goldfish, λg swallow(t, Gilly, g)).
"There was some time in the past such that a goldfish was among the objects swallowed by Gilly at that time."

(Simplify Notation)
"Gilly swallowed a goldfish."
Previous attempt: exists(goldfish, swallowed(Gilly)).
Improve to use tense: instead of the 2-arg predicate swallowed(Gilly, g), try a 3-arg version swallow(t, Gilly, g).
Now we can write: ∃t past(t) AND exists(goldfish, swallow(t, Gilly)).
"There was some time in the past such that a goldfish was among the objects swallowed by Gilly at that time."

Nouns and Their Modifiers
expert: λg expert(g)
big fat expert: λg big(g), fat(g), expert(g)
But: bogus expert.
Wrong: λg bogus(g), expert(g).
Right: λg (bogus(expert))(g); bogus maps expert to a new concept.
Baltimore expert (white-collar expert, TV expert, ...): λg Related(Baltimore, g), expert(g), i.e., an expert from Baltimore.
Or with different intonation: λg (Modified-by(Baltimore, expert))(g), i.e., an expert on Baltimore.
Can't use Related for that case: "law expert and dog catcher" = λg Related(law, g), expert(g), Related(dog, g), catcher(g) = "dog expert and law catcher".

Nouns and Their Modifiers
the goldfish that Gilly swallowed / every goldfish that Gilly swallowed / three goldfish that Gilly swallowed:
λg [goldfish(g), swallowed(Gilly, g)]; the relative clause acts like an adjective: "three swallowed-by-Gilly goldfish".
Or for real: λg [goldfish(g), ∃e [past(e), act(e, swallowing), swallower(e, Gilly), swallowee(e, g)]].

Adverbs
"Lili passionately wants Billy."
Wrong?: passionately(want(Lili, Billy)) = passionately(true).
Better: (passionately(want))(Lili, Billy).
Best: ∃e present(e), act(e, wanting), wanter(e, Lili), wantee(e, Billy), manner(e, passionate).
"Lili often stalks Billy": (often(stalk))(Lili, Billy); many(day, λd ∃e present(e), act(e, stalking), stalker(e, Lili), stalkee(e, Billy), during(e, d)).
"Lili obviously likes Billy": (obviously(like))(Lili, Billy) is one reading; obvious(likes(Lili, Billy)) is another reading.
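A small sketch of the "Best" (event-based) representation above, with events as records of role assertions; the event store and query helper are illustrative, not a formal semantics:

```python
# Events as dictionaries of role assertions, plus a tiny query over them.
EVENTS = [
    {"act": "wanting", "tense": "present", "wanter": "Lili",
     "wantee": "Billy", "manner": "passionate"},
    {"act": "stalking", "tense": "present", "stalker": "Lili", "stalkee": "Billy"},
]

def holds(**constraints):
    """True if some event satisfies all the given role constraints."""
    return any(all(e.get(role) == val for role, val in constraints.items())
               for e in EVENTS)

# "Lili passionately wants Billy"
print(holds(act="wanting", wanter="Lili", wantee="Billy", manner="passionate"))  # True
```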

Speech Acts
What is the meaning of a full sentence? It depends on the punctuation mark at the end.
"Billy likes Lili." : assert(like(B, L))
"Billy likes Lili?" : ask(like(B, L)), or more formally, "Does Billy like Lili?"
"Billy, like Lili!" : command(like(B, L))
Let's try to do this a little more precisely, using event variables etc.

Speech Acts
"What did Gilly swallow?" : ask(λx ∃e past(e), act(e, swallowing), swallower(e, Gilly), swallowee(e, x)).
The argument is identical to the modifier "that Gilly swallowed". Is there any common syntax?
"Eat your fish!" : command(λf act(f, eating), eater(f, Hearer), eatee()).
"I ate my fish." : assert(∃e past(e), act(e, eating), eater(e, Speaker), eatee()).

Compositional Semantics
We've discussed what semantic representations should look like. But how do we get them from sentences?
First, parse to get a syntax tree. Second, look up the semantics for each word.

Third, build the semantics for each constituent, working from the bottom up; the syntax tree is a recipe for how to do it.

Compositional Semantics
Add a sem feature to each context-free rule:
S -> NP loves NP becomes S[sem=loves(x,y)] -> NP[sem=x] loves NP[sem=y]
The meaning of S depends on the meanings of the NPs.
TAG version: the elementary tree for "loves" (S over NP and VP, VP over V "loves" and NP) carries the semantics loves(x,y) for its two substituted NPs x and y; the elementary tree for the idiom "kicked the bucket" carries died(x).
Template filling: S[sem=showflights(x,y)] -> I want a flight from NP[sem=x] to NP[sem=y]
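A minimal sketch of the sem-feature idea (the tiny tree encoding and the single hard-coded rule are illustrative):

```python
# Rule-to-semantics templates in the spirit of S[sem=loves(x,y)] -> NP[sem=x] loves NP[sem=y].
# The (label, children) tree encoding is illustrative.

def sem(tree):
    """Compute the sem feature bottom-up."""
    if isinstance(tree, str):                 # terminal word
        return tree
    label, children = tree
    kids = [sem(c) for c in children]
    if label == "S" and kids[1:2] == ["loves"]:
        x, _, y = kids                        # S[sem=loves(x,y)] -> NP[sem=x] loves NP[sem=y]
        return f"loves({x},{y})"
    if label == "NP":
        return kids[0]                        # NP[sem=x] -> proper name
    raise ValueError(f"no semantic rule for {label}")

tree = ("S", [("NP", ["George"]), "loves", ("NP", ["Laura"])])
print(sem(tree))                              # loves(George,Laura)
```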

Grammar with Semantics

Parse Tree with Semantics

More realistic example
[Figure: parse of "Every nation wants George to love Laura", with semantics built bottom-up from lexical entries.]
Root (START -> Sfin Punc "."): assert(s), yielding assert(every(nation, λx ∃e present(e), act(e, wanting), wanter(e, x), wantee(e, ∃e act(e, loving), lover(e, G), lovee(e, L)))).
Lexical entries recoverable from the figure: "every nation" contributes every(nation, ...); the tense morpheme "-s" = λv λx ∃e present(e), v(x)(e); "want" = λy λx λe act(e, wanting), wanter(e, x), wantee(e, y); "love" = λy λx λe act(e, loving), lover(e, x), lovee(e, y); George = G, Laura = L.

Semantic Grammars

Compositional Semantics
Instead of S -> NP loves NP with S[sem=loves(x,y)] -> NP[sem=x] loves NP[sem=y], we might want general rules like S -> NP VP:
V[sem=loves] -> loves
VP[sem=v(obj)] -> V[sem=v] NP[sem=obj]
S[sem=vp(subj)] -> NP[sem=subj] VP[sem=vp]
Now "George loves Laura" has sem=loves(Laura)(George).
In this manner we'll sketch a version where we still compute semantics bottom-up and the grammar is in Chomsky Normal Form, so each node has 2 children: 1 function and 1 argument. To get a node's semantics, apply the function to the argument!
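A small sketch of that function-application scheme with curried lambda terms; the word-to-semantics lexicon and the binary tuple trees are illustrative:

```python
# Each word's semantics is a (possibly curried) function or constant; each binary node
# applies one child's semantics to the other's. The lexicon is illustrative.
LEXICON = {
    "George": "G",
    "Laura": "L",
    "loves": lambda obj: lambda subj: f"loves({obj},{subj})",
}

def sem(tree):
    """Bottom-up semantics for a CNF tree: leaf -> lexicon; binary node -> apply."""
    if isinstance(tree, str):
        return LEXICON[tree]
    left, right = (sem(child) for child in tree)
    return left(right) if callable(left) else right(left)

# ("George", ("loves", "Laura")) stands for S -> NP VP, VP -> V NP.
print(sem(("George", ("loves", "Laura"))))   # loves(L,G), i.e., sem=loves(Laura)(George)
```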

Lifer Semantic Grammars

Compositional Semantics
[Figure: parse of "George loves Laura", intended to mean that G loves L; let's make this explicit.]
Bottom-up: loves = λx λy loves(x,y); NP "Laura" = L; VPfin "loves Laura" = λy loves(L,y); NP "George" = G; Sfin = loves(L,G); START (with final punctuation ".") = assert(loves(L,G)).

Compositional Semantics
[Same tree with event semantics.]
loves = λx λy ∃e present(e), act(e, loving), lover(e, y), lovee(e, x); VPfin "loves Laura" = λy ∃e present(e), act(e, loving), lover(e, y), lovee(e, L); Sfin = ∃e present(e), act(e, loving), lover(e, G), lovee(e, L).

Semantic Grammar

Semantic Grammars Summary

Summary: From the Words to Assertions
[Figure: the same "Every nation wants George to love Laura" parse as above, assembled from the lexical entries into assert(every(nation, λx ∃e present(e), act(e, wanting), wanter(e, x), wantee(e, ∃e act(e, loving), lover(e, G), lovee(e, L)))).]

Other Fun Semantic Stuff: A Few Much-Studied Miscellany
Temporal logic: "Gilly had swallowed eight goldfish before Milly reached the bowl"; "Billy said Jilly was pregnant" vs. "Billy said, 'Jilly is pregnant.'"
Generics: "Typhoons arise in the Pacific"; "Children must be carried."
Presuppositions: "The king of France is bald."; "Have you stopped beating your wife?"

Pronoun-quantifier interaction (bound anaphora): "Every farmer who owns a donkey beats it." "If you have a dime, put it in the meter." "The woman who every Englishman loves is his mother." "I love my mother and so does Billy."

Semantic Roles
"Sasha broke the window.": Breaker = Sasha, BrokenThing = window.
"Pat opened the door.": Opener = Pat, OpenedThing = door.
Breakers and openers are often animate actors (but what about "the storm broke the window"?).

Grammatical roles and semantic roles
There is no one-to-one dependency between grammatical roles (subject/object) and semantic roles (agent/patient):
"John broke the window." "The window broke into pieces." "John played the piano." "John played a song." "John played." "A song played in the background."

Semantic role labeling
Given a sentence: "He would n't accept anything of value from those he was writing about ."
can you recover the PropBank annotation (see the sketch below)?
[A0 He] [AM-MOD would] [AM-NEG n't] [V accept] [A1 anything of value] from [A2 those he was writing about] .
The roles for "accept" are defined in PropBank as: V: verb; A0: acceptor; A1: thing accepted; A2: accepted-from; A3: attribute; AM-MOD: modal; AM-NEG: negation.

See http://l2r.cs.uiuc.edu/~cogcomp/srl-demo.php
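A small sketch of how the PropBank labeling above can be represented programmatically as labeled token spans; the (start, end) span encoding is an illustrative choice, not the official CoNLL format:

```python
# The example sentence and its PropBank roles for "accept", as labeled token spans.
# Spans are (start, end) token offsets, end exclusive.
tokens = ["He", "would", "n't", "accept", "anything", "of", "value",
          "from", "those", "he", "was", "writing", "about", "."]

annotation = {
    "A0": (0, 1),        # acceptor
    "AM-MOD": (1, 2),    # modal
    "AM-NEG": (2, 3),    # negation
    "V": (3, 4),         # the predicate "accept"
    "A1": (4, 7),        # thing accepted
    "A2": (8, 13),       # accepted-from
}

for role, (start, end) in annotation.items():
    print(role, " ".join(tokens[start:end]))
```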

Frame Semantics a.k.a. Role Labeling

Syntactic variations

Applications of SRL

Dependency trees from CF trees

PropBank/FrameNet

PropBank Example

PropBank Example

PropBank Status
Current release (2005): Proposition Bank I. Verb lexicon: 3,324 frame files; annotation: ~113,000 propositions.
http://www.cis.upenn.edu/~mpalmer/project_pages/ACE.htm
Alternative format: CoNLL-04/05 shared task, represented in table format; it has been used as the standard data set for the shared tasks on semantic role labeling.
http://www.lsi.upc.es/~srlconll/soft.html

SRL Subtasks
Identification: a very hard task: separate the argument substrings from the rest in this exponentially sized set. Usually only 1 to 9 (avg. 2.7) substrings have ARG labels and the rest have NONE for a given predicate.
Classification: given the set of substrings that have an ARG label, decide the exact semantic label.
Core argument semantic role labeling (easier): label phrases with core argument labels only; the modifier arguments are assumed to have label NONE.
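A minimal sketch of that two-stage identification/classification pipeline over candidate constituents; the candidate list and the rule-based "classifiers" are toy placeholders, not a trained system:

```python
# Two-stage SRL over candidate constituents: identification (ARG vs. NONE),
# then classification (which ARG label). Candidates and rules are toy stand-ins.
candidates = [
    {"text": "He", "phrase": "NP", "position": "before"},
    {"text": "would", "phrase": "MD", "position": "before"},
    {"text": "anything of value", "phrase": "NP", "position": "after"},
]

def identify(c):
    """Stage 1: keep only constituents likely to be arguments (toy rule)."""
    return c["phrase"] in {"NP", "PP", "MD"}

def classify(c):
    """Stage 2: assign a role label to an identified argument (toy rule)."""
    if c["phrase"] == "MD":
        return "AM-MOD"
    return "A0" if c["position"] == "before" else "A1"

labels = {c["text"]: classify(c) for c in candidates if identify(c)}
print(labels)   # {'He': 'A0', 'would': 'AM-MOD', 'anything of value': 'A1'}
```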

SRL Processing

Labeling Parse Tree Nodes

Evaluating SRL
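The evaluation slides' figures are not preserved; as a hedged sketch, SRL output is commonly scored with precision/recall/F1 over labeled argument spans, for example:

```python
# Precision/recall/F1 over labeled argument spans (role, start, end).
# The gold and predicted sets are invented for illustration.
gold = {("A0", 0, 1), ("V", 3, 4), ("A1", 4, 7), ("A2", 8, 13)}
pred = {("A0", 0, 1), ("V", 3, 4), ("A1", 4, 6)}

correct = len(gold & pred)
precision = correct / len(pred)
recall = correct / len(gold)
f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
print(f"P={precision:.2f} R={recall:.2f} F1={f1:.2f}")   # P=0.67 R=0.50 F1=0.57
```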


Pre-Processing for Semantic Tasks
Anaphora resolution; coreference resolution.

Anaphora Resolution

Terminology

Salience/Focus

Discourse Coherence

Centering Theory

Anaphora Resolution Preferences

Centering-Based Algorithm

Example

Levels of NL Processing
Words: concepts/units of meaning; word similarity/semantic distance; ontologies and folksonomies.
Sentences: relationships between concepts; deep parsing/semantic annotation; semantic role labeling.
Next week: Documents: events, arguments, themes; discourse processing; event/fact extraction.

Further Readings/Resources
SRL Tutorial: http://www-nlp.stanford.edu/kristina/papers/SRL-Tutorial-post-HLT-NAACL-06.pdf
