Formal semantics and pragmatics: Origins, issues, impact

...is one of the founders of formal semantics, so it's really exciting to have a chance to hear from her. She's currently Distinguished University Professor of Linguistics and Philosophy Emerita at the University of Massachusetts Amherst, so maybe that's why she could find the time to come. Wait, wait, wait, we call it UMass Amherst; Amherst is Amherst College. All right, so we call it UMass. Well, that's the full name, but yeah. [Laughter] [Music] She did her undergraduate work with minors in Russian and philosophy, and then got her PhD in linguistics at MIT, and she went on in her career to pioneer formal semantics. Here is how she describes her own principal research interests: formal semantics and its connections with syntax, formal pragmatics, logic, related issues in philosophy of language and in cognitive science; philosophical interests include the relation between model-theoretic and psychologistic approaches to the foundations of semantics, and more generally various logic-oriented and linguistics-oriented approaches to natural language semantics and problems of compositionality. She's currently working on a book on the history of formal semantics for Oxford University Press. And I will turn things over to her.

Thank you, Adina. It's a great pleasure to be here. I think I've only been at Dartmouth once before, if ever; I'm not sure if I've ever been here before, so I'm delighted to be here. Adina told you what I'm going to talk about. This is related to this project of mine of writing a book on the history of formal semantics; it's taking me forever, but it's a lot of fun to work on. Formal semantics and formal pragmatics, as they've developed over the last fifty years or so, have been shaped by linguists, philosophers, and logicians, affecting and affected by developments in linguistics, philosophy, cognitive science, and also computational linguistics. So, as part of my project on the history of formal semantics, today I'm going to emphasize aspects of that history that concern the relation between language and logic, including the vexing question of whether meanings are in the head. There have been centuries of study of logic and language, and until the late 19th century the disciplines of logic, psychology, and linguistics were not yet separated; issues of logic, thought, and language were often discussed together and were closely intertwined. Today I'll trace some of the histories of these issues, including the history of claims that, quote, natural language has no logic, and how the joint work of linguists, logicians, and philosophers has taken that claim from being what was probably once a majority opinion to, now, thank goodness, a minority opinion.

So "semantics" can mean a lot of different things. Traditionally it meant quite different things to linguists, philosophers, and psychologists, because different fields have different central concerns and different methodologies. Philosophers have long been concerned with truth and reference, with logic, with compositionality (I'll say much more about what that means in a minute), with how meaning is connected with thought, and with the analysis of philosophically important terms. Linguists, influenced by Chomsky, care about what's in the head: what is the unconscious knowledge of the native speaker of a language, and how is it acquired? Psychologists have experimentally studied concept discrimination and concept acquisition,
with the emphasis in meaning, for psychologists, most often at the lexical level: words and concepts. Syntax has influenced linguists' notions of what "logical form" should mean: if you talk to a linguist about the structure of meaning, the first thing that comes into a linguist's head is some kind of tree structure, very much influenced by syntactic tree structures. Logicians build formal systems with axioms and model-theoretic interpretation, and when you talk about logical structure to a logician, that suggests patterns of inference or algebraic structure. So: very different notions.

Formal semantics has its roots in several disciplines, most importantly logic, philosophy, and linguistics. The most important figure in its history was Richard Montague, who had a tragically short life, just from 1930 to 1971; his seminal works in this field date just from the late 60s and the very beginning of the 70s. There's no knowing what he would have done if he had lived longer. There were of course many other important contributors; not all will get their fair treatment today, because I have just a short hour or so: the story is too long and time is too short. But let me back up very briefly to some prehistory: first semantics in linguistics, and then a bit on semantics in philosophy.

Chomsky's Syntactic Structures was 1957, and people kind of take that as a turning point in the history of linguistics. In the 19th century, linguistics existed within philology. In Europe, that's when German philologists and others were discovering the roots of the Indo-European languages and their family trees, which actually had an influence on Darwin in his work. In the United States there was some philology but also a lot of anthropology, because the linguists here were discovering all the Native American languages that looked so completely different from the European languages and from each other, and that got linguistics going as a very active field here. In the 20th century, linguistics emerged as a science. Part of the Chomskyan revolution was to view linguistics as a branch of psychology; that was a radical view when he first started saying it, and it really helped make linguistics one of the root points for cognitive science in general. But there were negative attitudes toward semantics in American linguistics in the 20th century, partly influenced by logical positivism, and, in psychology, by behaviorism: the idea that you could only study what you could directly observe, that you couldn't study anything going on in the head. And semantics seemed very unobservable: you can see the words, you can hear the sounds of a language, you can find the morphemes, but the meanings are a little bit mysterious; what meaning is has been a constantly difficult problem. There was also neglect of semantics in early American linguistics because, working on Indian languages, the fieldwork tradition says: you've got to start with phonetics, you've got to decipher the sounds, then figure out the sound patterns, the phonology, then the morphology, the smallest meaningful units; then maybe you've got a little syntax, and no semantics except in word lists and dictionaries, a little bit. Semantics in logic and philosophy of language in the early 20th century was making a lot of progress (Frege, Russell, people I'll talk about in a bit), but that was relatively unknown to most linguists. In 1954 the
philosopher Yehoshua Bar-Hillel, who was a good friend both of Chomsky and of Montague, wrote an article in Language inviting cooperation between linguists and logicians, arguing that advances in both fields would seem to make the time ripe for an attempt to combine forces and work on syntax and semantics together. He was arguing against the logicians who thought natural language was just too messy to even try to work on; the logicians at that time were mostly developing their logics to use instead of ordinary language, if you wanted to write things in a logically precise way. And he was appealing to linguists to learn about and make some use of the rich methods that logicians were developing for their formal languages. But the very next year, Chomsky, who was then still a PhD student, wrote a reply in Language basically saying "no thank you." He argued that the artificial languages studied by logicians, the languages of logic, were so unlike natural languages in their structure that the methods of the logicians had no chance of being useful for linguistic theory. Chomsky and Bar-Hillel remained good friends, but they put their points of view very clearly in those articles. Bar-Hillel didn't totally give up, though, on trying to get the linguists, logicians, and philosophers together. In 1967 he wrote a letter to Montague, after he had received one of Montague's newest papers. He said: this will doubtless be a considerable contribution to the field, though I remain perfectly convinced that without taking into account the recent achievements in theoretical linguistics, your contribution will remain one-sided. Also in 1967 there was the Third International Congress for Logic, Methodology and Philosophy of Science, and Bar-Hillel organized a symposium in Amsterdam on the role of formal logic in the evaluation of argumentation in natural languages, with Jerry Katz from MIT, with Montague, with Jaakko Hintikka, Max Black, Frits Staal, Erik Stenius, John Lyons, and others. So he kept trying to get these people to learn from each other, to pay attention to each other.

But back to our history. Chomsky's Syntactic Structures, 1957: he concentrated on the native speaker's ability to produce and understand a potentially infinite class of sentences. His conclusion was that linguistic competence must involve some finite description of an infinite class of sentences, and his formulation of the goals of linguistic theory, to characterize how that is possible and what the characterization of this finite knowledge is like, revolutionized the field. But he has always been ambivalent about semantics. He has been skeptical about the possibility of including semantics in a formal grammar and has insisted on what he calls the autonomy of syntax: you study syntax using just syntactic arguments; you don't look at the semantics to get ideas about how you should do the syntax. On the other hand, he has always held that one test of a good syntactic theory is that it should provide a basis for a good semantics, if only we had any idea how to study semantics. That's what I was encountering as a student, and I considered semantics a field that was just too fuzzy to even try to think about. He argued early on (here's a case where syntax is clearly relevant to semantics) that deep structure, the characteristic notion of his syntactic theory, reveals semantically relevant structure that you
don't see when you just look at the surface form of a sentence. So in a sentence like "John is easy to please," on the surface John is the subject and "easy" is the main part of the predicate; but the sentence doesn't really say that John is easy. What it really says is that for someone to please John is easy, and that, for him, was the deep structure of the sentence, not for semantic reasons but just to capture all the co-occurrences of the different possible forms in which you can put those phrases together. And, quoting from Syntactic Structures: "in proposing that syntactic structure can provide a certain insight into problems of meaning and understanding we have entered onto dangerous ground." So he likes to say his syntax is good for semantics, but he also says: beware of semantics. "There is no aspect of linguistic study more subject to confusion and more in need of clear and careful formulation than that which deals with the points of connection between syntax and semantics." And: the real question that should be asked is how the syntactic devices available in a given language are put to work in the actual use of the language. That's very far away from the contemporary view, where semantics is part of the grammar: phonology, syntax, semantics, nowadays we see them all on a par. But that certainly was not the view in Chomsky's early work, and it was not the training that we who were his students were getting.

Katz and Fodor, in the early 60s: they had philosophy degrees from Princeton, but they were spending all their time at MIT and working with Chomsky. They added a semantic component to generative grammar (they thought there ought to be some semantics), and they addressed what they called the projection problem, i.e., compositionality: how to get the meaning of a sentence from the meanings of its parts. To illustrate in a minimal way: at that time, negation and question formation were transformational rules that would map a positive declarative sentence, or simply a declarative sentence, into a negative sentence or into a question. They were prime examples of transformations that very seriously changed the meaning of the sentences they applied to, either negating an assertion or turning it into a question. Katz and Fodor's idea of computing the meaning on the basis of the whole transformational history, or T-marker, of a sentence can be seen as aiming in the same direction as what Montague did later, when he said: we're going to do the semantics by looking at the derivation steps of a sentence and building up the meaning on the basis of how we built up the syntactic structure. So a deep structure like "the airplanes will fly" would turn into a negative sentence by the negation transformation, "the airplanes will not fly," and the T-marker for "the airplanes will not fly" includes the phrase marker for its deep structure plus a graph showing what transformations occurred; and they said you've got to take all of that into account in the semantics. But what was the semantics like? I haven't shown you any of their actual semantic representations. It was very primitive: they had the notion of semantic features, like plus-human or plus-animate, and their semantic representations were bundles of features, suitable at best for decompositions of one-place predicates like nouns or adjectives. All their examples tended to be putting a noun and
an adjective together and just taking the features from both of them and combining them. Quine, in his 1970 Philosophy of Logic, had the sensible statement that logic chases truth up the tree of grammar. Katz and Fodor's position could be characterized as: semantic projection rules chase semantic features up the tree of grammar. What they were trying to capture had nothing to do with truth conditions; this is another important point: their aims were not like the aims of the logicians or philosophers. They were trying to capture properties like ambiguity, synonymy, semantic anomaly, and analyticity, all characterized in terms of how many readings a sentence has (which for them would be how many different semantic representations it would have), or whether two sentences share a reading, and things like that, all just in terms of representations, which in turn were these bundles of features.

The philosophers' reaction to the linguists' semantic representations was most forcefully and clearly articulated by David Lewis, who is really another of the key figures in the foundations of formal semantics. He wrote in 1970, in his paper "General Semantics": we can know the Markerese translation of an English sentence (that's what he called these Katz-and-Fodor representations with their semantic markers) without knowing the first thing about the meaning of the English sentence, namely the conditions under which it would be true. Semantics with no treatment of truth conditions is not semantics. And then he also said: translation into Markerese is at best a substitute for real semantics, relying either on our tacit competence, at some future date, as speakers of Markerese, or on our ability to do real semantics at least for the one language Markerese. So for him, representation with no interpretation was just empty marks on pieces of paper. But linguists did presuppose tacit competence in Markerese. This was part of the general Chomskyan picture of universal grammar and linguistic competence: they took that representation, or some kind of representation (they didn't claim they had the best one yet), as something that must be universal and innate, and many linguists still think that way. They know that in syntax languages differ, and you have to learn that, but there is still a widespread feeling that semantics is universal. I'm not going to get into that issue today, but that was how they justified working with these representations: even if this wasn't yet the best one, the best one would be something that we would just know, because it's part of our innate endowment as humans. To philosophers and logicians doing formal semantics, the language of Markerese looked empty because it was uninterpreted; to linguists in 1970, the philosophers' concern with truth looked puzzling. Linguists were trying to figure out the mental representations that would underlie our linguistic competence; actual truth was correctly considered irrelevant, and the notion of truth conditions was not really understood (I'll get to that a little more later, too). But when the linguistic relevance of truth conditions finally penetrated, the very nature of linguistic semantics changed, not just in terms of the tools used but also in terms of the questions asked and the criteria of adequacy for semantic analysis. So it was a really big change, and it took a while to come.
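To make that contrast concrete, here is a minimal sketch (my illustration, not Katz and Fodor's actual formalism) of marker-style projection: composing an adjective with a noun just amalgamates feature bundles, and, as Lewis complained, nothing in the output says when the phrase is true of anything. The feature names and the project helper are invented.

    # Katz-Fodor-style "projection" as feature-bundle amalgamation (toy sketch).
    bachelor = {"+human", "+male", "+adult", "-married"}
    happy = {"+human", "+evaluative"}          # invented markers, for illustration

    def project(adj, noun):
        """Projection rule: amalgamate the two marker bundles."""
        return adj | noun

    print(sorted(project(happy, bachelor)))
    # ['+adult', '+evaluative', '+human', '+male', '-married']
    # Lewis's point: this output is just more uninterpreted symbols ("Markerese");
    # it gives no conditions under which "happy bachelor" is true of anything.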
Now, there was another theoretically important move, made by the same Jerry Katz, this time with Paul Postal, in a 1964 book; it is separable from the Markerese issue and related to the problem of compositionality, how we get the meaning of the whole from the meanings of the parts. Katz and Postal made an innovation concerning morphemes like negation, which earlier were introduced by a transformation: they argued that that morpheme really belonged as part of the deep structure of a negative sentence, and that the deep structure of a question would have a Q morpheme as part of it. And they argued that there was independent syntactic evidence for doing this, so that they were still respecting the autonomy of syntax in their argumentation. But once you did that, you could get the meaning from deep structure alone. So a sentence like "Mary has not visited Moscow" would have a deep structure like [NEG [Mary has visited Moscow]], and a question like "Has Mary visited Moscow?" would have a deep structure with a Q in it: [Q [Mary has visited Moscow]]. Then you didn't have to look at all the transformations when you were working on the meaning; you could just look at the deep structures, which linguists liked anyway. This led to a really beautiful architecture, which Chomsky laid out in his 1965 classic Aspects of the Theory of Syntax. Phrase structure rules, the simple basic building blocks, context-free rules, generate deep structures; deep structure is the input to semantic interpretation; then transformations turn these deep structures into a surface structure, which is what we actually pronounce. Here's a picture of the same thing: base rules generate deep structures; those deep structures are the input to the semantic component; transformations map them into surface structure; surface structure is the input to the phonological component. So meaning and form are connected in this very pretty way. This big change in architecture rested on Katz and Postal's claim that transformations are meaning-preserving. It was an interesting and provocative claim, and even without what philosophers would call any real semantics at its foundation, it led to a lot of interesting debates, because the question of whether transformations change meaning or not can be looked at independently of what you think meanings are. And the architecture of the theory, with syntax mediating between semantics, for meaning, and phonology, for pronunciation, was elegant and attractive. Chomsky in Aspects added to the elegance by combining all of the kernel sentences underlying a sentence into a single deep structure, with all the different subordinate clauses represented in that one deep structure. So during the brief period, maybe five, certainly less than ten, years when Aspects held sway, there was a rosy optimism. I remember, because '65 was when I got my PhD, and it was a really optimistic period: oh boy, now we understand the form of syntactic theory; our job is just to go out and look at different languages and find out what the substantive universals are and how languages can differ, because we've got the theory down. It didn't last very long. I call it the Garden of Eden period. So in that period, roughly the mid-60s, I think generative grammarians generally believed the Katz-Postal hypothesis that transformations do not change meaning, and the idea that meaning was determined at the deep level was undoubtedly part of the broad appeal of the Chomskyan notion of deep structure beyond linguistics.
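As a way of picturing that architecture, here is a tiny runnable sketch (mine, with invented representations) of the Aspects pipeline with the Katz-Postal NEG morpheme: the semantic component reads only the deep structure, and a transformation realizes NEG in its surface position.

    # Toy Aspects architecture: deep structure feeds semantics; transformations
    # map deep structure to surface structure. All representations invented.
    deep = ("NEG", ("the airplanes", ("will", "fly")))   # NEG present at deep structure

    def interpret(ds):
        """Semantic component: reads deep structure only."""
        if ds[0] == "NEG":
            return "not(" + interpret(ds[1]) + ")"
        subject, (aux, verb) = ds
        return verb + "(" + subject + ")"        # crude placeholder semantics

    def neg_placement(ds):
        """Transformation: realize NEG after the auxiliary on the surface."""
        if ds[0] == "NEG":
            subject, (aux, verb) = ds[1]
            return subject + " " + aux + " not " + verb
        subject, (aux, verb) = ds
        return subject + " " + aux + " " + verb

    print(interpret(deep))         # not(fly(the airplanes))
    print(neg_placement(deep))     # the airplanes will not fly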
Leonard Bernstein's Norton lectures from around that time, The Unanswered Question, talk about deep structure in music and how it gets transformed into the various things that you actually hear, and it contributed to the aura surrounding the notion of language as a window on the mind. So around 1965 there was this very widespread optimism about the Katz-Postal hypothesis that semantic interpretation is determined by deep structure, and the syntax-semantics interface was believed to be relatively straightforward, even without anyone having a very good idea about the nature of semantics.

So what upset that lovely view? What led to the expulsion from the Garden of Eden? Well, this is an oversimplification, but it's a good representative kernel of what happened: linguists discovered quantifiers. Linguists discovered words like "every" and "all" and "some," which hadn't been thought about before. All the examples before had "John" and "Mary" as the noun phrases, and there were lots of transformations which, if you apply them to sentences with names, don't change the meaning. So for instance the deep structure for "John wants to win" was "John wants John to win." Fine, that's nice. But if there's a quantifier there, then by identical-noun-phrase deletion the deep structure for "everyone wants to win" must have been "everyone wants everyone to win," and that does not have the same meaning at all, right? One is selfish, the other is altruistic. So this and lots of similar problems led to the well-known linguistics wars between generative semantics and interpretive semantics. Slightly caricaturing, we can say generative semantics put logical form first, insisting on having deepest structures which would be semantically interpreted, whereas the interpretive semanticists put linguistic form, syntax, first, insisting on an autonomous syntax without too much change in it and being willing to complicate the semantic rules however necessary to account for these sentences with quantifiers and such. So this led to battles in the late 60s and early 70s. Let's leave these battles raging for a while and turn to philosophy and logic for a little bit.

The relevant history in philosophy goes back at least to Aristotle, but I'm not going to try to do all that, so I'll just start with brief mentions of a few key people and then get into the 20th century quickly. The history of formally oriented approaches to the philosophy of language goes back at least to Descartes and Leibniz. Descartes, like the medieval speculative grammarians, believed that underlying all speech there exists a lingua universalis representing the form of human reason, and Leibniz agreed and had ambitious ideas about how it could be described and put to scientific use. Leibniz called the general framework for such a universal language a characteristica universalis, based on an ars combinatoria: a system of symbolization where you would have simple forms for simple concepts and definite rules for putting concepts together, with unambiguous logical forms displaying the logical structures of all complex expressions built from multiple concepts; together, that should provide a logical analysis of all the actual and possible concepts that might arise in science. And then the framework should include a calculus ratiocinator (I don't pronounce Latin right, sorry), a complete system of deduction that would allow new knowledge to be derived from old knowledge. So you just represent things in this
beautiful unambiguous language with simple symbols for simple concepts, apply this calculus, and you get all kinds of new knowledge. Wouldn't that be wonderful? My colleague Angelika Kratzer often teaches a very popular undergraduate course called "The Dream of a Perfect Language," about all the attempts through history to design something with these kinds of features. In the 19th century, George Boole had an algebraic conception for his system governing what he called the laws of thought, a calculus ratiocinator independent of the vagaries of natural language; Boolean algebra turns out to have widespread application to natural language semantics, whether Boole would like that or not. But the greatest foundational figure for formal semantics is Frege. His crucial ideas include the idea that the way a lot of meanings get put together, the core of compositionality, is function-argument structure: some expressions are interpreted as functions, other expressions are interpreted as things that can be their arguments, and as you climb up the syntactic tree you are constantly applying functions to arguments, getting new results which will either be new functions or new arguments to other things. And he's credited with the principle of compositionality, which I keep referring to and had better finally state: the meaning of a complex expression is a function of the meanings of its parts and of the way they are syntactically combined. (The emphasis on syntax came later, when the linguists got involved, but it's the basic Fregean idea.) And Frege introduced the distinction between sense and reference, Sinn and Bedeutung, which philosophers and semanticists have been trying to formalize adequately ever since. One of Frege's great contributions was the logical structure of quantified sentences. Aristotle had done things with quantified sentences, but never with more than one quantifier in a sentence: Aristotle has all these syllogisms with "all men are mortal," etc., but nothing with "every man loves some woman"; that didn't enter Aristotelian logic at all, and Frege, centuries later, corrected that defect. Aristotle was so great that everybody just took Aristotle as a Bible, and it took a renegade like Frege to say: wait, we really need more. That was part of the design of a concept-script, or Begriffsschrift, a logically perfect language to satisfy Leibniz's goals. He didn't see himself as offering an analysis of natural language, although we look to him for a lot of help with that; he saw himself as offering a tool to augment natural language, as the microscope augments the eye. He was very respectful of natural languages, just as the inventor of the microscope is very respectful of what the eye can do; but for some purposes the microscope does better, and for some purposes Frege's logical language does better. Frege also figured out a systematic semantics for variable binding, more compositionally than what Tarski did fifty years later. And he rejected the psychologism of many of his predecessors. These interesting arguments about the foundations of semantics and logic arise already back then, exemplified for instance by John Stuart Mill. Mill said logic is a branch of psychology (nobody says that now, but he had a very powerful influence in the 19th century): so far as it is a science at all, it is a part or branch of psychology, differing from it on the one hand as the part differs from the whole, and on the other as an art differs from a science.
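Frege's function-argument idea from a moment ago can be made concrete in a few lines; this is my own toy model, with an invented two-individual domain, not anything from the talk. Meanings of predicates are functions, and interpreting a sentence is applying functions to arguments as you climb the tree.

    # Function-argument structure: predicates are functions to truth values.
    john = "John"                                  # an individual
    runs = lambda x: x in {"John"}                 # a one-place predicate
    loves = lambda y: lambda x: (x, y) in {("John", "Mary")}   # curried two-place

    print(runs(john))              # [[John runs]] = [[runs]]([[John]]) -> True
    print(loves("Mary")(john))     # [[John loves Mary]] = [[loves]]([[Mary]])([[John]]) -> True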
He considered psychology a science and logic an art: its theoretical grounds are wholly borrowed from psychology and include as much of that science as is required to justify its rules of art. So "logic" back then really meant how to avoid fallacies and the like; it wasn't what you study in a formal logic class now. But Frege had as one of his main theses that mathematics and logic are not part of psychology: the objects and laws of mathematics and logic are not defined, illuminated, proven true, or explained by psychological observations and results. One of Frege's central arguments is the consideration that whereas mathematics is the most exact of all sciences, psychology is imprecise and vague. Frege claims that in the realm of logic we do find both descriptive and prescriptive laws, but the former, the descriptive laws, are the foundation for the prescriptive ones; as he says, every law that states what is can be apprehended as prescribing what one ought to think in accordance with it, and this holds of geometrical and physical laws no less than of logical laws. So Frege's main criticism of psychologistic logic is that it conflates being true and being thought to be true. So that was fun.

Now, some key 20th-century developments in logic and semantics. Bertrand Russell was one of the great forerunners: he introduced logical types to avoid paradox and made many foundational contributions to the logical analysis of natural language, but more with the aim of replacing natural language by a logical language in formal argumentation. You see this over and over in Leibniz and Frege and Russell: you study natural language, you analyze how certain things work, and then you try to build an artificial language that will do those things more unambiguously, clearly, and precisely. The early Carnap used Russell's theory of types syntactically for the grand project of the logical construction of the world and the logical construction of language; the later Carnap developed a semantic approach where meaning equals truth conditions, an idea he got from Wittgenstein. Carnap introduced possible worlds into semantics as state descriptions and analyzed intensions as functions from possible worlds to extensions. If I don't have time to say more about that, you're welcome to ask questions about any of these things that I say too briefly. Tarski then developed model theory, based in set theory, and with it made major advances in providing a semantics for logical languages like first-order predicate logic and modal logic, including his semantical definition of truth.

Around this time a war began within philosophy of language, so the linguists had one war and the philosophers had a different one: the ordinary language versus formal language war. The ordinary language philosophers rejected the formal approach and urged attention to ordinary language and its uses; this includes the later Wittgenstein, Strawson, and a number of other philosophers, mostly British. Strawson, in "On Referring" (1950), reacting to Russell's great paper "On Denoting," with its theory of definite descriptions, says: the actual unique reference made by a definite description, if any, is a matter of the particular use in the particular context; neither Aristotelian nor Russellian rules give the exact logic of any expression of ordinary language, for ordinary language has no exact logic.
Russell, in 1957, in "Mr. Strawson on Referring," wrote: "I may say, to begin with, that I am totally unable to see any validity whatever in any of Mr. Strawson's arguments." But then, toward the end of the article: "I agree, however, with Mr. Strawson's statement that ordinary language has no logic." So both sides in this war, and Chomsky as well, later, were in agreement that the logical methods of formal language analysis do not apply to natural language. In some quarters that war continues. But the interesting response of some formally oriented philosophers was to try to analyze ordinary language better, including its context-dependent features, which were part of what sometimes made it seem illogical. The generation that included Arthur Prior, who did great work on tense logic; Bar-Hillel, who did a lot of work on demonstratives; Reichenbach, who did work on tense and aspect; Haskell Curry; and Montague gradually became more optimistic about being able to formalize the crucial aspects of natural language.

So, Montague. Montague was a student of Tarski's, and he was an important contributor to all of these developments. His higher-order typed intensional logic unified tense logic and modal logic, and more generally unified formal pragmatics with intensional logic. He gave an analysis of words like "I" and "you" in terms of their context dependence, along with the semantics of the words that are not context-dependent, and all of it compositionally. Montague treated both possible worlds and moments of time as components of what he called indices, and treated intensions as functions from indices to extensions, generalizing from what Carnap had done. The strategy of adding more indices, as you find more things that the reference of an expression of natural language depends on, came from Dana Scott's paper "Advice on Modal Logic." And Montague generalized the intensional notions of property, proposition, individual concept, etc. into a typed intensional logic, extending Carnap, Church, and Kaplan, putting together Frege's function-argument structure with the treatment of intensions as functions to extensions. So expressions of all kinds of syntactic categories had an extension, which tells us what they pick out in the actual world, or at some particular time and possible world, and an intension, which is closer to capturing the meaning: what they would pick out in any possible world at a given time. In his work on pragmatics and intensional logic, Montague distinguished between possible worlds and possible contexts, something that had not been clarified earlier, and applied his logic to the analysis of a range of philosophically important notions like events and obligations. This was just as he started working on the analysis of natural language a little bit. That work, like most of what preceded it, still followed the tradition of not trying to formalize the relation between natural language constructions and the logic of the semantic analyses the philosopher or logician was giving of them: the philosopher-analyst served as a bilingual speaker of both English and the formal language used for analysis. The goal was not to analyze natural language but to develop a better formal language, one that would have all the best properties of natural language, as rich as natural language but more precise, and one that you could work with formally to derive only valid inferences. Montague, in a 1969 article coming from that congress where Bar-Hillel had organized the symposium, continued to maintain that the goal of developing a better formal language was more important than the goal of analyzing natural language.
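The picture of intensions as functions from indices to extensions, described above, is easy to sketch. This is my own illustration; the worlds, times, and denotations are invented.

    # Carnap/Montague-style intensions: an index (world, time) -> an extension.
    from collections import namedtuple

    Index = namedtuple("Index", ["world", "time"])
    Context = namedtuple("Context", ["speaker"])

    DOGS = {                                    # invented data
        Index("w1", "t1"): {"Fido", "Rex"},
        Index("w2", "t1"): {"Rex"},
    }
    dog = lambda i: DOGS.get(i, set())          # the intension of "dog"

    def I(context):                             # "I" depends on context, not world
        return context.speaker

    print(sorted(dog(Index("w1", "t1"))))       # ['Fido', 'Rex']
    print(sorted(dog(Index("w2", "t1"))))       # ['Rex']
    print(I(Context(speaker="the speaker")))    # context-dependent reference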
I don't know if he still believed that at his death, but he was putting more and more of his effort into analyzing natural language. So a lot of us wondered for a long time: why did Montague, who had been doing logical work and work in recursive function theory and the foundations of set theory, the kind of work logicians typically do, start working on analyzing natural language explicitly? My research has taken me to the Montague archives at the UCLA library, and there I found a new clue. There was a handout there from a talk he gave in Vancouver, one of his earliest talks on his work on "English as a Formal Language," and there was a page that was not part of the handout, just a yellow sheet of paper, the sort of thing that means "this is what I'm going to say before I start the actual talk." It was partly in shorthand (Montague used shorthand a lot), so I had to work to figure out what it said, but I think I've got it right. He said: this talk is the result of two annoyances: one, the distinction that some philosophers, especially in England, draw between formal and informal languages; the other, the great sound and fury that nowadays issues from MIT (that means Chomsky, right?) under the label of mathematical linguistics or "the new grammar," a clamor not, to the best of my knowledge, accompanied by any accomplishments. When I was first trying to decipher it, I thought: is it "not by many accomplishments"? But as I got better at the shorthand I realized: no, it really says "any." This is still Montague continuing: I therefore sat down one day and proceeded to do something that I previously regarded, and continue to regard, as both rather easy and not very important: that is, to analyze ordinary language. Imagine how that hit me, a lifelong linguist, in the pit of my stomach: this is just some little thing, I'm just going to sit down and do it today. "I shall, of course, present only a small fragment of English, but I think a rather revealing one." He inserted a note about other creditable work, including traditional grammar, the Polish logician Ajdukiewicz, a pair of computational linguist-logicians, Bohnert and Backer, and his student Hans Kamp. Later notes, from 1970, luckily suggest he eventually found it not entirely easy; otherwise it would make all of us feel like: why does it take us so long to get a good analysis of this or that? His first work on natural language was the provocatively titled "English as a Formal Language"; he had taught that material at UCLA and at the University of Amsterdam in '66. It famously begins: "I reject the contention that an important theoretical difference exists between formal and natural languages." As noted by Emmon Bach, the term "theoretical" here must be understood from a logician's perspective, not a linguist's. What he was denying was the central presupposition of the formal language versus ordinary language wars: that there is a mismatch between linguistic form and logical form in natural language. What he was proposing, here and in his "Universal Grammar" paper, was a framework for describing syntax and semantics and the relation between them, one he considered compatible with existing practice for formal languages and an improvement on existing practice for the description of natural language. And the Fregean principle of compositionality was central to what he was doing and
remained central in formal semantics. For him, the syntax-semantics interface has a very different look from the representations the linguists were playing with. Syntax is an algebra; semantics is a different algebra; they have different stuff in them: the syntactic algebra has expressions, the semantic algebra is an algebra of meanings of some sort; and compositionality is the requirement that there be a homomorphism from the syntactic algebra to the semantic algebra, that is, a very systematic mapping. The nature of the elements of the syntactic and semantic algebras is left open: he will let the linguists argue about what the crucial pieces are that make up the syntactic structures, and he'll let the philosophers and linguists and logicians continue to argue about what meanings are; what he cared about was the structure of them and the relation between them. And the difference between his fancy higher-order typed intensional logic and the first-order predicate logic of Russell, which is all that linguists and most philosophers knew and worked with (or maybe extensions with some modal operators), made a crucial difference to the very possibility of giving a compositional semantics based on a relatively conservative syntax of English, where the structures posited look pretty close to the structures you actually see on the surface. I'll give at least one illustration of that. Once he had shown what could be done with model-theoretic techniques for compositional semantics and with a higher-order intensional logic, both the linguistics wars and the philosophy-of-language war could in principle be peacefully resolved by removing their presuppositions. That doesn't mean they ended; it doesn't mean everybody immediately said, "oh good, we can forget all the rest of that." But I was lucky enough to be at UCLA and sitting in on his classes at that time. I just had very good luck with place and time: I got syntax in Chomsky's first classes at MIT, and then my first job was at UCLA, and there was Montague, and David Lewis, an old friend of mine, was there to help me understand what Montague was saying. Because if a linguist tries to read Montague's papers unaided, it's just about impossible; but sitting in on his courses, and being able to get David Lewis to answer my stupid questions like "what's a lambda?", helped me understand what Montague was doing. And the way I saw it was: oh my heavens, if we could do things this way, then we could have a semantics that would not be mushy; it would be serious; we could see what's giving right and wrong answers, and we could really go to town. The details of Montague's own analyses have in many cases been superseded, but in overall impact, his paper "The Proper Treatment of Quantification in Ordinary English" was as profound for semantics as Chomsky's Syntactic Structures was for syntax. Emmon Bach summed up their cumulative innovations by saying: Chomsky's thesis was that English can be described as a formal system; Montague's thesis was that English can be described as an interpreted formal system, where the semantics can be part of the formal description. Truth conditions and entailment are basic: these are minimal data that have to be accounted for to reach what Chomsky would have called observational adequacy, the ground level of getting things right. That principle, inherited from logic and model theory, is at the heart of Montague semantics and is one of the defining properties of formal semantics.
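Here is one way (my sketch, with an invented mini-grammar) to picture Montague's requirement, described above, that interpretation be a homomorphism from a syntactic algebra to a semantic algebra: each syntactic rule is paired with a semantic operation, and interpreting a complex expression is applying the paired operation to the interpretations of the parts.

    # Compositionality as a homomorphism h: h(F(a, b)) = G(h(a), h(b)).
    LEX = {"John": "John", "runs": lambda x: x in {"John", "Mary"}}
    SEM_OPS = {"S": lambda subj, pred: pred(subj)}   # rule S paired with application

    def h(term):
        """Interpret a term by interpreting its parts and applying the
        semantic operation paired with its top syntactic rule."""
        if isinstance(term, str):
            return LEX[term]
        rule, *parts = term
        return SEM_OPS[rule](*[h(p) for p in parts])

    print(h(("S", "John", "runs")))   # True: a structure-preserving interpretation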
Cresswell put it in the form of his "most certain principle": we may not know what meanings are, but we know that if two sentences are such that we can imagine a situation in which one of them is true and the other false, then those two sentences do not have the same meaning. That seems so trivial that it's hard to appreciate how useful it can be; just starting from basic observations, you can go a long way if you respect that principle, and many decisions about semantic analysis, both in general architecture and in particular instances, can be seen to follow from it. So the advent of truth conditions and the tools of model theory made semantics an incomparably more powerful discipline than it had been before. It may be hard to realize now how surprising and controversial an idea it was to linguists in the early 70s that we should think about truth conditions rather than just ambiguity, semantic anomaly, and synonymy.

Okay, so then, going on from there (I cannot see my slide number without my glasses; I need to find some things I can leave out, and I will): Montague was doing his work on natural language at the height of the linguistics wars, though he and the semanticists in linguistics had no awareness of each other then. And he didn't work single-handedly; there were lots of other important people. His last paper was delivered at a workshop at Stanford in fall 1970, and he was murdered in March of 1971. So he had done only a few papers in this area, but they were really rich and deep, and they became extremely influential. After his death, we mere mortals had to carry on and try to see what we could do to understand it and do more with it. I already said I was at UCLA; after Montague's death I worked on understanding his logic and semantics and on looking for ways to combine Montague's semantics with Chomsky's syntax, and I had a lot of help from logicians and philosophers in the process. The introduction of Montague's work to linguists came through some papers I started writing in 1973, and through Rich Thomason, philosopher and logician, who published Montague's collected works, with a long, nice pedagogical introductory chapter, in 1974. Two exciting events that I don't have time to tell you about, but would be delighted to answer questions about, were a 1971 institute at Irvine on semantics and philosophy of language, and the 1974 Linguistic Institute at UMass Amherst, which had lots of semanticists and philosophers of language gathered together. Those things helped give a big push to this later cooperation.

But I do want to say a little bit about this foundational issue of psychologism versus anti-psychologism; I mentioned Frege and John Stuart Mill arguing about it. There's a similar tension between Chomsky's view of what we're studying when we're studying grammar and Montague's view, which is a much more direct descendant of the Fregean view. The Chomskyan revolution included putting human linguistic competence at the center of study: what we're studying is what's in the head of the native speaker of a language, and the huge question of what endowment we come with that enables children to learn a language. When some of us took on the challenge of combining Montague's approach to semantics with Chomsky's approach to syntax, technical and substantive progress was rapid and successful.
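Cresswell's most certain principle, stated above, is directly usable as a procedure. In this little sketch (mine; the situations are invented), two sentences are shown to differ in meaning because some imaginable situation separates their truth values.

    # Cresswell's principle: a situation where one sentence is true and the
    # other false shows the two sentences differ in meaning.
    situations = [
        {"dogs": {"Fido"}, "runners": {"Fido"}},
        {"dogs": {"Fido", "Rex"}, "runners": {"Fido"}},
    ]
    every_dog_runs = lambda s: s["dogs"] <= s["runners"]
    some_dog_runs = lambda s: bool(s["dogs"] & s["runners"])

    def never_separated(p, q, sits):
        """True only if no situation assigns p and q different truth values."""
        return all(p(s) == q(s) for s in sits)

    print(never_separated(every_dog_runs, some_dog_runs, situations))
    # False: the second situation separates them, so they differ in meaning.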
You could just not worry about psychology versus not-psychology; you just worry about how relative clauses combine, how the semantics of comparatives works, how sentences with quantifiers get interpreted. That progressed by leaps and bounds. But while the progress went on, there was still this apparent incompatibility about what we were doing: Chomsky's view of linguistics as a branch of psychology, and the anti-psychologistic Fregean tradition viewing meanings as abstract objects. In hindsight, what I once saw as a problem I now see as a good thing: semantics pushes us toward a less narrow view of competence and psychological reality. We don't have to choose between these two things; we can find a better view of competence and psychological reality that makes them much more compatible than they seemed to be. What should competence and performance mean in semantics, and how might that differ from competence and performance in syntax? I don't know if you all know Chomsky's competence-performance distinction: performance limitations might stop us from being able to produce or understand a sentence that's a thousand words long, but the grammar generates sentences of unbounded length; according to our competence, there are all these infinitely many sentences, and they are what the grammar says they are. It's kind of like knowing the rules of multiplication: you can't multiply extremely long numbers in your head, but you've got the competence; you just have some limitations of time and memory space and that kind of thing. And how have ideas about what "in the head" means changed, in the intervening decades, from Fodor's methodological solipsism, prominent in the 70s? Putnam famously argued in '75 that meanings ain't in the head. Stalnaker argued in '89 that meanings are in the head, but they're in the head the way footprints are in the sand, and that's very different from a container and its contents; the problem was in taking too narrow a view of "in the head." Let me expand on that slightly. Semantics is the part of linguistics most affected by these ideas, advanced by Stalnaker and then by Tyler Burge, about what "in the head" means, and I now suspect that recent advances in philosophy of mind, which I had not been paying attention to (I would pay some attention to what was going on in philosophy of language, but I wasn't watching philosophy of mind), go a long way toward changing the suppositions on which the earlier arguments about psychological reality and competence rested. For Chomsky, competence is defined by the unconscious knowledge of the speaker. If two speakers differ in their internalized syntactic rules, then we say they simply speak different idiolects, slightly different languages; what your language is, is determined by these unconscious rules that are in your head. There's no such thing as not knowing the syntactic rules of the language you speak: what you know defines what language you speak, and if you are a master of multiple dialects, then in some sense you have multiple systems of rules in your head; most of us are multi-dialectal, or multilingual. This view, shared by many linguists, takes the
central goal of linguistic theory to be an account of what the native speaker of a language knows when she knows a language, and how that knowledge is acquired; that's part of the view that linguistics is a branch of psychology. Some of the most fundamental difficulties in trying to reconcile the goals of formal semantics and Chomskyan linguistics arise when we try to characterize a notion of semantic competence compatible with both, because we quickly realize that semantics and syntax are quite different in this respect. Let me back up and say a little more. There's no such thing as not knowing the syntactic rules of your language; but if I ask whether you know the meanings of all the words of your language, that's a very different issue. We often depend on experts; there's a lot of division of labor, and we cooperate: as long as we have approximately similar enough meanings, we can speak together. If I use a technical term like "electromagnetism," maybe I slightly knew what it meant when I was a senior in college, but I certainly don't now; I depend on the experts: I'm using that word to mean whatever the people who really know about it mean. So in semantics we depend on each other; our intentions are to be speaking the same language that other people are speaking. And think about how we acquire words; I don't know if you remember this consciously from your own acquisition. I remember, when I was a graduate student, Chomsky sometimes used the phrase "ad hoc" to characterize certain people's arguments, and I had never heard it before; just gradually, by seeing what he applied it to, I picked up what kind of property of an argument that was supposed to be. Kids are doing this a lot; things are not so simple as learning that "dog" means dog, especially with the abstract terms. So semantics is different: we've got a lot of words that we use in our language whose meanings we do not completely know, so we can't really say that the meanings of everything in our language are just determined by what's in our heads in the narrow sense of content. Putnam was arguing just this when he said that the theory of meaning came to rest on two unchallenged assumptions (I'm going to have to stop within five minutes): that knowing the meaning of a term is just a matter of being in a certain psychological state (that's the Chomskyan side), and that the meaning of a term, in the sense of its intension, determines its extension, determines what it applies to and what it's true of, in the sense that same intension determines same extension. And Putnam went on to argue that those two assumptions are not jointly satisfied by any notion, let alone any notion of meaning. He has the famous Twin Earth example, where he takes a word like "water" and says there could be a possible world just like ours, where all of our experience and everything in our heads is just as it is for us with real water, except that on Twin Earth the stuff wasn't H2O, it was XYZ, a totally different thing. So what we know about the meaning of "water" does not determine that it's H2O rather than XYZ; there's a lot else that goes into determining what water actually is. So Stalnaker then argued, in a 1989 paper, that, wait, meanings are in the head; they're in the head the way footprints are in the sand; the problem is too narrow a view. A representational system, for Stalnaker, is a system that's capable of being arranged in
a range of alternative internal states that tend to be causally dependent on the environment in a systematic way. So the difference between us and the people on Twin Earth is that, whatever our internal ideas about water are, they were caused by the stuff that's in this world, namely water, while our Twin Earth counterpart's internal states were caused by XYZ. So the representational states are different, because footprints in the sand could be footprints of different creatures. Meanings in the head are representations, but they are representations of things, and you just can't leave out the causal connections. Okay, so I won't try to talk about Burge and Chomsky. (Is this the end? Yeah? Okay, good.) So: Chomsky defeated behaviorism and helped to inaugurate the study of cognitive science, and he showed that languages can be analyzed as formal systems governed by rules and principles that are in the minds of their users as unconscious knowledge. Montague showed that natural languages can be analyzed as interpreted formal systems: they're not illogical. And there was this remaining foundational problem about knowing a language: if we think about it in the wrong way, then it seems that a natural language, interpreted as Montague understood it, could not be known by a human being, could not be in the head. But if we follow Stalnaker and Burge in drawing insight from how perception works, and how it gives fallible but veridical knowledge prior to any reasoning, we can see semantics as indeed a particularly important and fruitful branch of psychology: not narrow psychology in Fodor's old methodological-solipsism sense, but psychology that looks at the interaction between the environment and the knowing, perceiving subject. I think that's it, and then I thank lots of people who've helped me learn more about the history of my field, and the Special Collections archivists at UCLA. And I think Adina said the slides will go up later, so if you want some references, there are some references there, and I can readily supply more. [Applause]

[Question from the audience, partly inaudible, about generative semantics and who should get credit.] Yes, that's nice; so I'm useful. So: generative semantics is associated with Lakoff, McCawley, Haj Ross, and Paul Postal, among others. When we all discovered the problem with the quantifiers, they said: the deep structure for "everyone wants to win" can't be "everyone wants everyone to win"; there's got to be something more like a variable in that second position, the way it's done in first-order logic: for every x, if x is a person then x wants x to win, something like that. So they made deep structures that looked in some ways like first-order logic: they would have just one occurrence of that "everyone," and then they would have two noun phrases with variables in them, and then they had a bunch of syntactic rules to get from there to surface structure, by quantifier lowering, putting the "everyone" in the position of the subject and, in this case, just deleting the second variable, to get to "everyone wants to win." And they discovered all kinds of interesting things; they made a lot of progress. But the syntacticians found their
syntax somewhat implausible. And when we go back and look at what we do now, which has some things in common with that, we can see that they were hobbled by the fact that the only logic they had to work with, in making those structures more "logical," so to speak, was first-order logic. (Can I use the blackboard? Will it show up in your pictures? Okay.) So here's a very simple example that I like to show students. Take "every dog runs." The way Russell's first-order logic would represent that is: for all x, if x is a dog then the predicate "run" is true of x; that is, ∀x (Dog(x) → Run(x)). Now we look at that and ask: where in that formula is the meaning of "every dog"? Well, certainly the "∀x" is part of it; certainly the "Dog(x)" is part of it; the "if-then" is part of it, because if it were "some dog runs" you'd put an "and" in there instead: ∃x (Dog(x) ∧ Run(x)), for some x, x is a dog and x runs. And this occurrence of x as an argument of "run" is part of it, because that's the position that "every dog" came from. So in first-order logic, the meaning of this nice syntactic noun phrase "every dog" is everything in the formula except "Run." This is part of why Russell thought natural language was so illogical. Russell said: look what natural languages do, they take "every dog" and treat it the same way as "John," they make it a noun phrase; and for Russell "the king" comes out as something even more complicated-looking than the case with "every." So Russell thought natural languages were crazy to have "John," "a man," "every man," "the man" all in the same syntactic category. And the generative semanticists were stuck making deep structures in which those different noun phrases look very different from one another. They had a noble goal, but trying to accomplish it with first-order logic, as your only way to give the deep structures logical forms that would be more semantically transparent, was hampered by the big mismatch between first-order logic and natural language. This is the part that was right about Chomsky's reply to Bar-Hillel: the structure of those logical languages just looks very, very different from the structure of any natural language. Oh, and I didn't have the slides in here that showed how Montague solved that; I forgot, I left that out. For Montague: he sort of starts by observing this meaning and then says, okay, if we have higher-order expressions, we can abstract on this predicate position and make an expression, λP..., which you can think of as denoting, sort of, the set of all properties P such that... Let me first say it this way: he's going to let "every dog" denote the set of all those properties that every dog has. That's the way he's going to unify all the noun phrases: "John" is going to become the set of all the properties that John has. So "every dog" is going to be λP.∀x (Dog(x) → P(x)); that's one way of writing it (you don't have to use first-order logic in writing it). So this is a constituent in the semantic structure: corresponding to the syntactic constituent "every dog" there is now a semantic constituent. It's of a pretty fancy type, sets of properties of individuals; and this thing is really a function (I said you could think of it as a set, but really it's a function) that will apply to any property and give you something that has a truth value.
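Here is a runnable version of that move (my own toy finite model, with an invented domain): "every dog" denotes a function from properties to truth values, the set of properties every dog has, and a name like "John" can be lifted to the same type, which is how the noun phrases get unified.

    # Montague-style generalized quantifiers over a tiny invented model.
    DOMAIN = {"John", "Fido", "Rex"}
    dog = lambda x: x in {"Fido", "Rex"}
    run = lambda x: x in {"John", "Fido", "Rex"}

    every_dog = lambda P: all(P(x) for x in DOMAIN if dog(x))  # lambda P. Ax(dog(x) -> P(x))
    some_dog = lambda P: any(P(x) for x in DOMAIN if dog(x))   # lambda P. Ex(dog(x) & P(x))
    john = lambda P: P("John")                                 # "John", lifted to the same type

    # Applying each noun-phrase meaning to "run" is the lambda-conversion step:
    print(every_dog(run), some_dog(run), john(run))   # True True True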
So that's the sense in which what Montague did helped to accomplish the goals of the generative semanticists and the interpretive semanticists. They were trying to keep the same syntactic structures but figure out which parts should be interpreted at the level of deep structure and which parts at the level of surface structure — like scope of quantifiers, maybe — and what kinds of things might have to be interpreted along the way. So the syntax was much cleaner, but the semantics was more complicated.

I see one in the back corner — say again? Yeah, that's a nice question. Why didn't Montague see how hard this would be? I don't know exactly. For one thing, I think he thought he was pretty smart. For another, he had already focused on a number of specific problems, like how to do these quantifiers. For a long time he didn't want to go to higher-order logic, but I even found the day on which he decided that's the way he was going to do the quantifier phrases. So he'd solved that problem, he'd solved a lot of problems about intensions and extensions, and he thought there were probably a few more problems to work on, and then he'd have it all.

But in the papers I found in the archives from 1970, when he was working on that last proper-treatment-of-quantification problem, one of the things I saw was that the paper called "The Proper Treatment of Quantification in Ordinary English" actually has only three different kinds of quantifier phrases, if you can even call them all that: "every man," "the man," and "a man" — only those three. In the notes he's got lots more, but he ran into the problem of plurals: how singular and plural work in English, when plural really means plural and when it doesn't. There are these puzzling sentences like "unicycles have wheels," which is true, even though each unicycle has one wheel. If you say "this unicycle has wheels," then "has wheels" should mean more than one; but when you say "unicycles have wheels," it doesn't mean more than one. So he noticed that there are lots of problems about plurals — plural doesn't just always mean more than one. In fact, if I ask "do you have any children?" and you have one child, the correct, true answer is "yes," even though the question says "children." So there are lots of puzzles about singular and plural, and he noticed them. And I saw that he noticed a lot of puzzles about the determiner "any" — Hintikka tried to analyze "any," Hans Kamp tried to analyze "any" — and at that 1974 Linguistic Institute that I mentioned, we had a workshop with a lot of linguists and a lot of philosophers of language, and I can't remember now which philosopher it was who presented a theory of "any."
But the linguists were ready with the examples, because we already knew how hard "any" was, and so for any theory of "any" we had the crucial example that would show why that theory wouldn't work. So I think at first he hadn't noticed some of those problems, and gradually he did. Hans Kamp worked closely with him through much of this time, and Hans said he had the feeling that Montague was getting more and more appreciative of the fact that this was non-trivial.

I think another part of it was that Montague was a student of Tarski, and Tarski was a very demanding, very exacting professor who wanted all his students to do great things. I think he made Montague revise his dissertation multiple times before he would accept it. Dana Scott left Berkeley because he was not going to submit himself to having to satisfy Tarski. Tarski was brilliant, but he treated his graduate students as something between disciples and slaves: they had to help him a lot, and they had to please him with their work. So Hans Kamp also thinks that Montague had internalized this feeling that all that matters is proving non-trivial theorems — you should find some really difficult problems in logic and prove some theorems that mere mortals would not manage to prove. And just to analyze ordinary language — we don't have deep theorems in this work. There are tough problems, and we try to get a description, and it's hard; but what have we got when we've got it? It's not a theorem that can be used to prove more theorems.

Would that be like saying we're in a new Garden of Eden? No, no — the theory keeps undergoing interesting changes. For instance, my PhD student Irene Heim did a dissertation on definite and indefinite noun phrases in English. This Montague idea that all noun phrases can be treated in this kind of way — what we call generalized quantifiers — had been getting lots and lots of mileage; in Irene's theory, definite noun phrases and indefinite noun phrases instead both just denote variables, with certain conditions put on them. And it was a complex theory, because it wasn't just changing the denotations — it was changing the meta-theory. She said you can never get it right if you just look at truth conditions of sentences. You've got to go back to something Stalnaker had done earlier in his work on assertion: you have to think about speaker and hearer communicating with each other. We share a certain set of presuppositions — we have something like a common ground — when we start talking to each other, and if I say something and you accept it, then we update the common ground to include that. And Irene said: not only is that right and important, but that's what happens a lot with the use of indefinite and definite noun phrases. If I say something like "a cat walked in," we update our common ground to include something — you can think of it as a variable, you can call it a discourse referent, whatever you want to call it — some kind of representation of an entity which is understood to be a cat. It's not denoting any particular cat, but it's there, at least locally, in the new common ground. And then somebody else can say, "Is
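To give just the flavor of context change potentials, here is a loose sketch in Haskell — emphatically not Heim's actual file change semantics; the Context type and the function names are invented for the example. Indefinites update the common ground with a fresh discourse referent; definites presuppose that a matching one is already there:

```haskell
import qualified Data.Map as M

type Referent = Int
-- A crude "common ground": discourse referents with descriptive conditions.
type Context  = M.Map Referent [String]

-- Indefinite ("a cat walked in"): update the context with a fresh referent.
indefinite :: String -> Context -> Context
indefinite noun ctx = M.insert fresh [noun] ctx
  where fresh = if M.null ctx then 0 else 1 + fst (M.findMax ctx)

-- Definite ("the cat"): presupposes a unique, salient matching referent.
definite :: String -> Context -> Maybe Referent
definite noun ctx =
  case M.keys (M.filter (noun `elem`) ctx) of
    [r] -> Just r      -- presupposition satisfied
    _   -> Nothing     -- presupposition failure: no (unique) such referent

main :: IO ()
main = do
  let ctx = indefinite "cat" M.empty   -- "A cat walked in."
  print (definite "cat" ctx)           -- Just 0 : "Is the cat black?" is fine
  print (definite "dog" ctx)           -- Nothing: "the dog" finds no referent
```

The design point is the one in the text: the semantic value of a sentence here is a function from contexts to contexts (or a test on contexts), not a bare truth condition.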
the cat black?" — and that's OK. "A cat" means this is a new thing entering the common ground; "the cat" says: you'd better find a cat in the common ground, and there should be just one salient cat there. So some noun phrases update the common ground; other noun phrases require that the update has happened. So she said the basic semantic values are not truth conditions but context change potentials. That way of looking at things is in some ways the beginning of formal pragmatics — or one of the steps in the beginning of formal pragmatics, the interaction of meaning and context. That was a very big change; it changed the way things are looked at.

One thing I didn't talk about at all, and that it's always a worry we still don't have any good theories of, is lexical meaning. What all of this is good for is compositionality: give me the meanings of the words and the syntactic structure of the sentence, and I'll give you the meaning of the sentence. So all of it starts from the words as given. Montague himself said working on word meaning is a different kind of job — somebody else's job. He just let each word denote some constant of the right type in his logic and went from there: he had constants and variables at the bottom of the trees, and then he showed us how to build up the meanings of whole sentences. But lexical semantics is an interesting, difficult problem, and there are a lot of different approaches to it, and they look totally different from one another.

This is one of the areas that's really big in computational linguistics now. I should mention, in case there are people interested in computational things, that computational formal semantics is a big field now, and Google is hiring formal semanticists. For a long time I thought Google would never want us, because Google just works on co-occurrences — which words occur next to each other. But it turns out you can have all the probabilities in the world and they won't help you know what to do with a sentence that has a "not" in it. So you really have to have some of the serious structure as well. But computational linguists do a huge amount with lexical semantics just in terms of looking at huge corpora of language and which words occur together, giving each word a profile in terms of all the other words it occurs with. Our first reaction might be: wait, but that's not meaning. But then I start saying: well, some of the words we learn, we really do learn in the context of other words. Some words we learn in connection with the world, and some words we learn in terms of what you can say with them — I don't know how to say that too clearly. Anyway, that's one approach: looking at how words co-occur.
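A toy version of that co-occurrence-profile idea — a sketch of my own in Haskell, nothing like the scale or sophistication of real distributional models: each word gets a profile of the words it occurs with, and profile similarity stands in, very crudely, for similarity of meaning.

```haskell
import qualified Data.Map as M

type Profile = M.Map String Int

-- Profile of a word: counts of the other words in the sentences it occurs in.
profileOf :: [[String]] -> String -> Profile
profileOf sentences w =
  M.fromListWith (+) [ (w', 1) | s <- sentences, w `elem` s, w' <- s, w' /= w ]

-- Cosine similarity between two profiles.
cosine :: Profile -> Profile -> Double
cosine p q = dot / (norm p * norm q)
  where dot    = fromIntegral . sum . M.elems $ M.intersectionWith (*) p q
        norm a = sqrt . fromIntegral . sum . map (^ 2) $ M.elems a

main :: IO ()
main = do
  let corpus = [ ["the","dog","runs"], ["the","cat","runs"]
               , ["the","dog","barks"], ["the","cat","meows"] ]
  -- "dog" and "cat" get similar profiles (both occur with "the", "runs"),
  -- which a distributional approach reads, crudely, as similar meaning.
  print (cosine (profileOf corpus "dog") (profileOf corpus "cat"))
```

Note that this illustrates the speaker's caveat too: a word like "not" would get a profile of the same kind, but no profile tells you what negation does to a sentence — that takes compositional structure.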
There's also lots of psycholinguistic research. One line of psycholinguistic research says we don't have these sharp concepts with all-or-none conditions, but something like prototypes, and resemblance to prototypes, for some kinds of words. Then that gets argued against by saying: yes, but we don't know how to do anything compositional with prototypes — oh, what's his name — Andy, that's you there, yes — the prototype paper, right. So there are a lot of different ideas about lexical meaning, but nothing that gets you to the same sort of abstraction as the things we're talking about here, and I really don't know where it's going to come from. I think it needs linguists and psychologists and computational linguists and philosophers all to work on it together. I mean, Jerry Fodor believes every word is just a primitive — you don't try to look inside it at all — but then you have to say what it refers to, and Chomsky is always saying that that's impossible: "he did it for my sake" — OK, what's a "sake"?

[An audience question about the Garden of Eden days — had these problems been noticed then?] The Garden of Eden was really syntax, and only the Katz-and-Fodor kind of semantics, which was really only semantic features — things that are like decompositions of predicates. In 1965 — I can think back to then — no demonstratives, no. But the philosophers were worrying about it already. At that conference that Montague went to, the one Bar-Hillel organized, context dependence was seen as one of the big stumbling blocks, and people said nobody understands demonstratives — and Montague said, "I do, I do." That was '68, and his work hadn't even been published yet.

[Another audience question, about cross-linguistic variation.] Yes — in fact, I'm happy you asked that, because linguists love to work on typology: that's the way to find out what's universal and what can vary, which can help answer the question of how much is part of the innate given structure. Not surprisingly, first there was lots of typology for phonology, then for morphology, then for syntax, and for a long time there wasn't much serious work in semantics. But Emmon Bach, Angelika Kratzer, Eloise Jelinek, and I had a project in the late '80s working on quantification across languages, and one of the driving questions was: is this kind of interpretation — treating noun phrases, quantificational ones and non-quantificational ones alike, as generalized quantifiers — universal? And it turns out, from work that several people did at that time, really in response to our asking the question loudly to the whole linguistic world, that some examples were found where the answer is no. Eloise Jelinek herself studied some Salish languages of the Northwest Coast and found that no, they didn't have them. What seems to be more nearly universal is something David Lewis had studied, called adverbs of quantification, as in a sentence like "a quadratic equation always has two roots." That "always" is not temporal — it's really a way of saying every quadratic equation has two roots. "Always," "sometimes," "usually" can all be used to quantify over cases of all different sorts, not just over times. I don't know if that's absolutely universal, because nobody's directly asked, but the languages that don't have these generalized quantifiers — that don't have determiners like "every" or "most" — do tend to have something more adverbial that can be used to express the same kinds of things.
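The parallel is easy to see in a sketch — again Haskell, again my own illustration: a determiner like "every" relates two properties of individuals, while an adverb like "always" does the same quantificational work over "cases" of whatever sort.

```haskell
-- Determiner quantification: "every" relates two properties of individuals.
every :: [e] -> (e -> Bool) -> (e -> Bool) -> Bool
every dom restrictor scope = all (\x -> not (restrictor x) || scope x) dom

-- Adverbial quantification: the same force, but over "cases" of any sort --
-- here, equations rather than moments of time.
always :: [c] -> (c -> Bool) -> (c -> Bool) -> Bool
always = every

-- "A quadratic equation always has two roots": quantify over equations-as-cases.
realRoots :: (Double, Double, Double) -> Int   -- real roots, with multiplicity
realRoots (a, b, c)
  | b * b - 4 * a * c >= 0 = 2   -- a double root counts twice
  | otherwise              = 0

main :: IO ()
main =
  let cases = [(1, 0, -1), (1, 2, 1), (2, -3, 1)] :: [(Double, Double, Double)]
      isQuadratic (a, _, _) = a /= 0
  in print (always cases isQuadratic (\q -> realRoots q == 2))  -- True
```

The point is the shared tripartite structure — quantifier, restrictor, scope — with only the domain of quantification changing.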
And there's lots of work on negation. I'm sitting in on a seminar at UMass just this semester about the words that begin with n — words like "no one," "nothing," etc. — across different languages, and there's tremendous variation in how they act. English is in the minority in not liking sentences like "nobody said nothin'" or "nobody didn't say nothing"; we consider that a substandard dialect. But there are many more languages that do it that way than that do it our way, so the emergence of dialects doing it that way seems to be the emergence of a very general linguistic property.