Category Theory for the Working Hacker by Philip Wadler

Video Statistics and Information

Reddit Comments

As much as Prof. Wadler is a great presenter and scientist, I disliked this particular presentation: it does not convey the power of Category Theory (arguably this is very hard to do in such a short lecture), and instead superficially presents some constructs that seem to be conjured out of thin air, with little to no application.

To really appreciate Category Theory as a programmer, the learning curve is unfortunately very steep. I would advise Bartosz Milewski's great series of lectures on the topic, as they arguably do a much better job at explaining some of the connections.

Still, it will be years before we learn how to explain Category Theory to programmers in a way that is quickly useful. Given that the most powerful uses usually involve easier ways to formulate abstract concepts and their composition (I was recently blown away by the beauty of building monads out of adjunctions!), and that understanding the immense power of constructs such as monads is complex in itself, this looks like a challenging struggle. Moreover, if we start showing monads as a tool for solving real-world problems (shameless plug: https://medium.com/@giuseppemaggiore/type-safe-monads-and-react-499b4a2f56d7) instead of just a tool for building parsers, then maybe we can turn more people to the light side.

👍︎︎ 22 👤︎︎ u/giuseppemag 📅︎︎ Sep 24 2017 🗫︎ replies

A couple of minor wibbles coming from working with slides that are unfortunately not typechecked:

At 38:15 or so he talks about distributive categories, but the type he shows is a little wrong. It should be:

A*C+B*C <-> (A+B)*C

Notice how the * and + are flipped from his presentation.

At 40:50 or so he says "exponential is left adjoint to product" to be pithy, but it's actually right adjoint.

Very minor issues, but I figured I should mention them in case someone else was trying to follow along and do all the proofs as they go.

👍︎︎ 23 👤︎︎ u/edwardkmett 📅︎︎ Sep 24 2017 🗫︎ replies

I have my own theory about professors who presume to know anything about working programmers/hackers as a category.

👍︎︎ 52 👤︎︎ u/DoctorOverhard 📅︎︎ Sep 24 2017 🗫︎ replies

I hope to find the time to get a basic working knowledge of category theory. I actually found programming extremely dull until I took a class in abstract algebra and they explained what functions, operations, and algebraic structures actually were. And I really enjoyed learning functional programming; even though I prefer objects, it has influenced how I program a lot, for the better.

👍︎︎ 2 👤︎︎ u/[deleted] 📅︎︎ Sep 24 2017 🗫︎ replies

Did they get product types (aka pairs) in Java 9 now?

👍︎︎ 2 👤︎︎ u/quiteamess 📅︎︎ Sep 24 2017 🗫︎ replies
Captions
Thank you for that nice introduction. Throughout the day people keep coming up and saying, oh, thank you for coming, and I keep saying, well, thank you for inviting me. But actually there's something more profound than that: I don't just want to thank you for inviting me, I want to thank you for existing. I want to thank all of you who've shown up, especially this late in the day (I guess it's early by Spanish time), all of you who have taken the trouble to come to this conference. The notion that 260 people in industry, trying to get work done, want to come together and learn about functional programming... if I could go back and tell the Haskell committee that 30 years ago, they would have laughed.

But there's a lesson here which I'd like you to take away: if you want to change the world, and that's what you should do, you should want to change the world, you can do it. You can do it the same way we did it, which is come up with a good idea and find other great people to work on it with you. You praised me for type classes, but you wouldn't use type classes if it wasn't for the additional work that Mark Jones did on top of what I did. Bota was praising me for monads before, but I never would have found monads if it wasn't for the work that Kleisli had done and the work that Eugenio Moggi had done. It's always a chain, so I am inviting you to join this chain. That is what you should do, and you can do it just like the Haskell committee did it: find great people to work with, come up with great ideas, and then, here's the important point, everybody will ignore you. This is what I've learned: don't worry about it, just keep doing it for 30 years. That is what I'm enjoining you to do: come up with a great idea and stick with it for 30 years, and then you will have 260 squared people building on your ideas.

Okay, so what am I going to talk about today? I've given the talk on propositions as types, and I was very pleased to see that the two houses here were named Curry and Howard, because one of the names for propositions as types is the Curry-Howard isomorphism. Some people don't like that name, because they point out that the intuitionists, Brouwer and the like, had the same ideas even earlier, and that in terms of practical application people like de Bruijn had the same idea independently. So there are many, many different names you could give this, and just to avoid getting into fights I've taken to calling it propositions as types. But if you want to call it Curry-Howard, that's fine; that's one of the many names. You know an idea is good if it has lots of names, and indeed Curry-Howard is a double-barrelled name that guarantees you will have other double-barrelled names.

What does Curry-Howard tell us? It tells us that there's a strong correspondence between logic on the one hand and computing on the other hand. You can look at the rules of logic and you can look at the rules of a programming language, in particular natural deduction on the one hand and simply typed lambda calculus on the other hand, and hey, they're exactly the same. And hey, it only took about 40 years between when Gentzen formulated natural deduction (and Church, within a few years, had formulated simply typed lambda calculus; they both came along at the same time) and 1980, when William Howard finally published the paper pointing out that, by the way, they are both the same thing. As soon as you see it, it's completely obvious.
It seems completely obvious, but it took 40 years to see, and if you talk to William Howard, which I've been fortunate to do, you discover it was anything but obvious at the time. If you don't know about propositions as types, I'm not going to speak about it at any greater length today; I'm going to almost assume that you know what it is, and that's not really fair. How many people do have some idea of what propositions as types is? Not bad, about half of you. It's this close correspondence: there are these two halves, and they're exactly the same. And since it's logic on the one hand and computing on the other hand, you're guaranteed that every good idea will be discovered twice, once by a logician and once by a computer scientist. Well, computer science is a horrible name; let's say somebody who does computation, or informatics. Examples of this are the Hindley-Milner type system, which underlies Standard ML and Haskell and many other languages, which was discovered both by the logician Roger Hindley and by the computer scientist Robin Milner. And if you want a more powerful type system that's similar, you have the Girard-Reynolds type system, also called System F, which is in fact the basis for how compilation in Haskell works, as well as finally being exposed to users in some extensions, and that was discovered by the logician Jean-Yves Girard and the computer scientist John Reynolds. So every good idea gets discovered twice, and the fact that it's found twice independently is what tells you that it was discovered, not invented. Philosophers can argue about this sort of thing, and I don't want to do that; I want to treat it in a very practical way. I would claim that there is a difference in how you feel using a programming language that is invented from one that is discovered. Most of you, in your day-to-day lives, the ones that aren't lucky enough to be using functional languages, are using programming languages that are invented, and you can tell. Functional languages have lots of bits in them that are invented, but there is a core to them, lambda calculus, which we know is discovered, precisely because of the Curry-Howard isomorphism.

So we've got these two halves, logic and computation. Today my goal is to tell you about the third half: category theory. Category theory is the secret sauce underneath Curry-Howard, and indeed category theory is responsible for lots of the things that many of you have ended up using today, things like monads (that came directly out of category theory) and things like functors; the name functor and the name monad come from category theory. So I'm going to get you started on the path where you could learn about this very basic idea, and I will try to get just far enough to show you one of the coolest things I know. The very coolest thing I know, of course, is that natural deduction and lambda calculus correspond to each other, but today I will show you something that I think is equally cool. Okay, have I built up the talk enough?

So I'm going to explain category theory to you. Category theory has a reputation of being hard. I might say something that you don't understand; that's probably not because you're stupid, it's probably because I didn't explain something well enough, because I'm in the unfortunate position that I know category theory, so the bits that aren't obvious to you are obvious to me. You can help this talk work: if I say something that you don't understand, please ask a question.
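For those following along, here is a minimal Haskell sketch of the propositions-as-types reading he alludes to: a type is read as a proposition, and a program of that type is a proof of it. The names proofK and modusPonens are mine, purely for illustration.

```haskell
-- "A implies (B implies A)": the proof is the K combinator.
proofK :: a -> b -> a
proofK x _ = x

-- "((A implies B) and A) implies B": modus ponens is just application.
modusPonens :: (a -> b, a) -> b
modusPonens (f, x) = f x
```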
If I say something that you disagree with and want to argue about, please save that for the end; but if I say something that you don't understand, please ask right away. Will you do that? Okay, great, thank you.

Categories are about something very simple, which is composing functions; we just have funny names for all these things that you know perfectly well. A, B and C here are called objects. This is not object as in object-oriented programming (you should object to that kind of object); object here is just a different name for type. Then we have these things that go between objects, called arrows. If f is an arrow from A to B, that's just saying f is a function that accepts arguments of type A and returns results of type B. In category theory, A and B would often be some kind of mathematical structure, like an algebra, and f would be a morphism between algebras that maintains that algebraic structure. There are lots of examples with that property, and if you know about algebraic structures and the homomorphisms that maintain their properties, you can keep that in mind; but if you don't, and you just happen to know functional programming, and you know about types and functions, that will be enough to be getting on with.

Then there are a couple of things we can do with arrows. There is the identity arrow from A to A, and you can guess what that does: it does nothing at all, it leaves its argument unchanged. And there's composition, which says that if we have f going from A to B and g going from B to C, then of course we can compose them and go from A to C directly, and that's what f;g does. The semicolon is just composition of two arrows; it literally means one arrow followed by the other arrow.

And this will satisfy some laws. If you do nothing and compose that with f, that's the same as doing f and following it by doing nothing. Note that they're different nothings: if f goes from A to B, the first nothing is the identity on A, but the second nothing is the identity on B, and of course both are equal to f. (I neglected to write that on my slide; I would update it, except these slides were made by hand and I scanned them in, and I didn't bring my scanner with me, so any errors, we're stuck with them.) The other law is that if you compose three things, there are two ways to do it: I can compose f with g and then compose that with h, or I can compose g with h and then compose f onto the front, and either way it's the same; where you put the parentheses doesn't matter. That should be an idea that's familiar to you; indeed, many of you will recognize this as the structure of a monoid, but don't worry about that. Any questions yet? I thought I said something somebody didn't understand. Everybody understood everything? Cool, okay, we'll see if we can make it harder than that.
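Here is a minimal sketch, in Haskell, of the category he is describing, taking objects to be types and arrows to be ordinary functions. The names identity and andThen are illustrative, and andThen is written in the diagrammatic order (f;g) he uses.

```haskell
identity :: a -> a
identity x = x

-- f ; g : first do f, then do g (diagrammatic order).
andThen :: (a -> b) -> (b -> c) -> (a -> c)
andThen f g = g . f

-- The laws he states, as comments (not checked by the compiler):
--   identity `andThen` f        = f
--   f `andThen` identity        = f
--   (f `andThen` g) `andThen` h = f `andThen` (g `andThen` h)
```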
So the first thing is the product. Actually, I'm not going to start with the category theory; I'll go to the end and show you product in Java. What is product? It's just a record with fields, and we're going to give specific names to those fields and call them first and second. The product of A and B has two fields, one called first of type A and one called second of type B; you can create a new product, and then getFirst and getSecond return the two fields. Any questions about that? Feel free to point out if I made an error in my Java, other than using Java.

And then here's the same thing in the type rules of Haskell, just to show you what's happening. Gamma is some free variables that are available in our context. This notation with gamma and the funny symbol, which is called a turnstile: how many people have seen something like that? Okay, about half of you; so now you've all seen it. It's not something to panic over; it's just a standard way of writing down type rules. It says M is a term (you know about terms, these are expressions you write in your programming language), A is a type (you know about types), and then gamma, what's gamma? That must be hard, because it's written as a big Greek letter. Gamma is just our name for all the free variables and their types. So I might have x of type A in the context and y of type B in the context, and now I can build up a term, a pair (M, N): if gamma is that context I just described, M might be x, whose type is A, and N might be y, whose type is B, and then an (x, y) pair is an A×B product. And then if L is an A×B pair, first of L picks out the first component, whose type is, guess what, A, and second of L picks out the second component, whose type is, guess what, B. I'll explain the other bit when we get to it. Any questions about that? Was it way too easy? You'll all be bored.

So here's the product categorically. Product is just what I said: we've got an A×B product, and then there are two morphisms, first, which goes from A×B to A, and second, which extracts the second component and goes from A×B to B. Now, the interesting thing is how category theory describes creating one of these things. We have to do it in terms of arrows, which can be kind of annoying: we're used to working with terms like M and N, but now I have to do it using arrows. It turns out we will get a big benefit from using arrows at the end, but right now the category theory is going to look like what I thought of category theory when I first learned it: easy stuff that I understand really well, made difficult. So let's take products, which we all understand really well, and make them difficult. The way we're going to make them difficult is that instead of being able to build a pair (M, N) from two terms, we have to do it from arrows. How are we going to do that? Well, let's pick some other object C, and let f be an arrow from C to A and g be an arrow from C to B. So if I apply f to this thingy in C I get a thingy in A, and if I apply g to this thingy in C I get a thingy in B. This funny notation, angle bracket f comma g close angle bracket, says I'm going to build a pair. How am I going to build that pair? Take your C, and use f to get the A component and g to get the B component. Now this diagram says a couple of other things. First of all, notice that following the diagram you then go through first and second, and look, there are two ways to go from C to A. (That's probably going to be easier to say if I have a pointer, a big pointer, a stick; I don't trust laser pointers. Somebody's gone off to find a big stick; meanwhile, if I point with this little hand, can you see what I'm doing? Yes? Okay.) We can go either from C directly to A via f, or we can go from C to A×B via ⟨f, g⟩ and then go from A×B to A via first. So there are two ways you can go in this diagram.
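A sketch of the pairing arrow in Haskell, assuming arrows are ordinary functions; fork is my name for the angle-bracket construction ⟨f, g⟩, not a name from the talk.

```haskell
fork :: (c -> a) -> (c -> b) -> (c -> (a, b))
fork f g = \c -> (f c, g c)

-- The two ways through the diagram he points at:
--   going c -> (a, b) -> a  is   fst . fork f g
--   going c -> a directly   is   f
-- The commuting diagram, explained next, says these are equal.
```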
You find these diagrams in category theory all the time. They're sort of cool, because they're a graphic way of writing things down, and this one is called a commuting diagram, because the rule is that when you see one of these things, if there are two different paths through it, they are both equal. So what this diagram tells us is that building a pair with ⟨f, g⟩ and then taking the first component is the same as just going from C to A with f. Do I believe that? Well, let's see: ⟨f, g⟩ applies f to C to get the A component and g to C to get the B component, and then first just takes the A component. (Brilliant, thank you. Literally, we are being led by the blind. Thank you.) Right, so if we apply ⟨f, g⟩ we build up this pair, and first extracts the first component of the pair; of course that's just the same as applying f directly. And similarly the other way around: we build up the pair and then take the second component, and that's just the same as using g directly. That's what these two equations are telling us.

Now, this line is written with dots. What does that mean? It says that any arrow you could put in here that makes this diagram commute must be ⟨f, g⟩. That's the only way to get from C to A×B: if it commutes, that is, if its first component is gotten by f and its second component is gotten by g, the only way to get there is to apply f and g to build up a pair. This is called a universality condition. As I told you, it's obvious stuff made hard. It just says: if you've got some arrow h from C to A×B, and h composed with first is f (that's half of saying the diagram commutes), and h composed with second is g (that's the other half), so if the diagram commutes, then h must be ⟨f, g⟩. Wow, that makes my brain hurt; that's kind of hard. Why would I want to know such a thing? But it is sort of cool: if there's any arrow that makes the diagram commute, it must be that one. I've just been reading Thinking, Fast and Slow, and you leap to conclusions when you see something. ⟨f, g⟩ goes from here, so if you see some other arrow going from here, you leap to the conclusion: oh, maybe it's ⟨f, g⟩. Of course, often when we leap to conclusions we're incorrect. What universality tells us is that if we leap to this conclusion, it is correct: the only thing that goes here must be ⟨f, g⟩.

Is that good for anything? Well, yes, it actually tells you useful stuff. An example: I can pick C to be anything I want, so let me pick C to itself be A×B. What, A×B is already down here? Yes, but I can still pick C to be it as well. That kind of trick, where you make something as simple as possible, crops up in category theory all the time; you can get a long way just by making everything as simple as possible. So this is the simplest possible way of instantiating the previous diagram. What will I use for f? Well, first goes from A×B to A, so let's use first; and second goes from A×B to B, so let's use second. So now our thing to build a pair is ⟨first, second⟩. Wait, that goes from A×B to A×B, and the identity goes from A×B to A×B. Ah, okay: so first paired with second must be the same as the identity. You may all now say "oooh". And this is actually useful: you will all know programming languages where this is not true. Even in Haskell this is not true, because we goofed; we could have made it true, and we should have.
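A sketch of the instantiation he just described, in Haskell; surjectivePairing is my name for ⟨first, second⟩.

```haskell
surjectivePairing :: (a, b) -> (a, b)
surjectivePairing p = (fst p, snd p)

-- Universality says surjectivePairing = id. In Haskell this is not quite
-- true: for p = undefined, surjectivePairing p is (undefined, undefined),
-- which seq can tell apart from undefined itself. In a total language
-- such as Agda, Coq or Idris the law holds on the nose.
```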
There are many languages where this is not true: in any strict language it will not be true. It's hard, impossible at the least to make it tricky, to make this true in a strict language where one half might not terminate. If everything terminates, in a language like Idris or Agda or Coq, then you can make this true; it is only true in the very best of programming languages. But in pretty much all logics it will be true; except for very low-class logics that are actually trying to deal with real computing languages, any logic worth its name can do that. Okay, any questions? Yep? Wow, okay, let me go back and explain something. Here, I didn't explain this wee bit of notation, and nobody asked about it; you're not doing your job. If I write curly C, that stands for the category, and curly C of A and B just means the set of all the arrows from A to B in C. We're going to use that right now. Another way of reading what I've just told you is this: what are the arrows from C to A×B? They're in one-to-one correspondence with the arrows from C to A and the arrows from C to B. In other words, the arrows in curly C (sorry, I've used the same name twice) from the object C to the object A×B are in one-to-one correspondence with pairs of arrows. If you have one of these arrows, what does it mean? You've got an ⟨f, g⟩ here; we know it must be an ⟨f, g⟩, because any arrow that makes this commute must be equal to an ⟨f, g⟩. So those arrows are in one-to-one correspondence with an arrow from C to A (that's our f) and an arrow from C to B (that's our g). That little equation tells us what products are. And we've got lots of different products here: this is the product of two objects in our category, and this is the product of two sets of arrows in set theory, so this one is just Cartesian product; but Cartesian product is itself an instance of categorical product, so we've got categorical product used in two different ways here. I wrote one in black to make that a bit easier. Now your head should start to hurt a little bit.
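The one-to-one correspondence he states, written as a pair of Haskell functions; this is a sketch and the names are mine.

```haskell
-- C(c, a x b)  ~  C(c, a) x C(c, b)

toComponents :: (c -> (a, b)) -> (c -> a, c -> b)
toComponents h = (fst . h, snd . h)

fromComponents :: (c -> a, c -> b) -> (c -> (a, b))
fromComponents (f, g) = \c -> (f c, g c)

-- toComponents and fromComponents are mutually inverse (up to the usual
-- caveat about laziness), which is what the isomorphism of hom-sets says.
```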
And then the way we can use this construct: what does it mean to give a semantics of this Haskell term in terms of categories? Gamma, remember, holds all our free variables, M is our term and A is our type, and gamma has the types of each of the free variables. So if gamma is something like x has type A and y has type B, it will correspond to the product of the types A and B, and then to get x you have to know to fetch the first one, and to get y you have to know to fetch the second one. We might have 25 free variables, so then we'll take the product of 25 things. Oh wait, which way do the parentheses go? Oh wait, product is associative; it doesn't matter. Good. So f goes from gamma to A; we're just saying f is the meaning of the term M. And g goes from gamma to B, so g is the meaning of the term N. What does such a term do? It says: give me values for all the free variables and I'll give you a value of type A, and that's what f does; give me one value for each of the types of the free variables and I'll give you a value of type B, and that's what g does. Then what does pairing do? It's of course just this construct, the ⟨f, g⟩ pair. And what does first do? Well, if h goes from gamma to A×B, then first evaluates h to give us an A×B, and then applies first; and similarly, second is h and then second. So that's a way of giving semantics. Are your heads hurting now? Good. Are there any questions? Yes?

[Audience:] In the productions on the right side, it looks like the arrows are coming from this initial... gamma, I guess? Does that mean that gamma is sort of standing in for any possible type you could be talking about, or any possible object?

Any possible environment. The environment will be a bunch of free variables paired with a bunch of types.

[Audience:] Got it, okay, that makes sense.

And notice how easy it was for me to make this really difficult for you, just by taking a letter, gamma, that you don't recognize and can't pronounce. Okay, we know what products are; now we'll do sums. Sums, maybe fewer of you have seen. How many of you are familiar with data types in Haskell? Okay, so all of you have seen sums before. How many of you are familiar with record variants? Not too many. Here, let me show it to you in Java. Yeah, it gets kind of messy, but once you've seen this you'll understand what it is. So, the sum of A and B: a sum is either a value which we call inLeft of A, or inRight; those are the two ways of forming a sum over A and B. An inLeft has a component of type A, and an inRight has a component of type B, so an A+B thing is either an A or a B. And then we'll use a case expression, which takes two functions (I'm making use of the fact that Java has lambdas now): a function from A to C and a function from B to C, called f and g. If the thing is on the left and we do the case expression, we apply f to that value x contained in the sum; but if it's a thing on the right, then we use g and apply it to y. So now that I've made it look really hard by writing it out in Java: how many people have seen sums before? Not all of you, okay. Sums are actually as basic as products, so it's sort of weird that a lot of our programming languages make products really easy but sums not so easy. Yes?

[Audience:] ...but we haven't said in our preconditions that our functions are bijective, so there could be more than one inverse, or no inverse at all.

Okay, now my head's hurting; I'm sorry, I didn't follow that. Was that a question or a statement? Are there any other requirements hiding in here? Because all we are saying is that f is a function from A to C and g is a function from B to C; that's it. Okay.

In Haskell, the way you write these is: inLeft of M is of type A+B if M is of type A, and inRight of N is of type A+B if N is of type B. And if L is an A+B, and gamma is this environment, which we're now going to extend by adding a free variable x of type A, then P might refer to anything in gamma but might also refer to x, and it must have type C. Similarly, we extend gamma with y of type B, and Q can refer to any of those things, including y, and it must have type C. So then: case L of inLeft x will bind x to the value stored in L if L is a left sum, and then we'll evaluate P, so we've got all the free variables plus x, which is bound to the left thing in L. Or, if there is no left thing in L, then it'll be a right thing, and we take the second line instead: we bind y to the right thing and then go off and evaluate Q.
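The same sum, written back in Haskell as a sketch. Haskell's built-in Either plays exactly this role; Sum, InLeft, InRight and caseSum are illustrative names, not from the talk.

```haskell
data Sum a b = InLeft a | InRight b

-- The case expression that gets you out of a sum: given a way to handle
-- each side, produce a C.
caseSum :: (a -> c) -> (b -> c) -> Sum a b -> c
caseSum f _ (InLeft  x) = f x
caseSum _ g (InRight y) = g y
```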
So now let's see what that looks like categorically. Remember, here's our sum: there are two ways of building a sum, with inLeft or with inRight, and then we need a case expression to get out of it. And again we don't have terms, we have arrows. Interestingly, in Haskell we wrote terms, while in Java we were actually forced to write arrows. Now I'm going to write f and g with square brackets rather than angle brackets, and that means the thing that goes out of a sum. And again the diagram commutes: if I take an A, inject it via inLeft into A+B, and then go out via the [f, g] case statement to get a C, that's just the same as taking f and applying it to the A directly. Similarly, if I take a B, inject it into the sum, and then go out with [f, g], that's the same as applying g directly. That gives you these two equations. And again it's a dotted line, so again it's universal: any arrow here that makes this diagram commute must be [f, g]. And again, in a programming language where everything terminates, like Idris or Agda or Coq, it's easy to make all this true, but otherwise you can't: if something might not terminate then, as far as I know, you can't have both true sums and true products; one of these equations must give way. So really only the very best of programming languages will have this property. And just like before, we can do the same trick and show that the identity must be the same as a case expression that does inLeft or inRight. That might be a little bit hard to think about. What does this case say? It looks at this thing that's either an A or a B; if it's an A, it turns it back into an A+B by injecting that A on the left, and if it's a B, it turns it back into an A+B by injecting it on the right. And yeah, that hasn't really changed very much, has it? That's why this is the identity function. Okay, any questions about that? And again, what is this telling us? It's saying that an arrow from A+B to C is isomorphic to an arrow from A to C paired with an arrow from B to C. So this is product again: here we've got sum, here we've got product. What are the ways of getting from A+B to C? You must have a way of getting from A to C, and you must have a way of getting from B to C.
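A sketch of the square-bracket construction and the correspondence in Haskell terms; going out of a sum with [f, g] is what the Prelude function either does.

```haskell
-- Laws, as comments:
--   either f g . Left  = f
--   either f g . Right = g
--   any h satisfying both must equal either f g, so in particular
--   either Left Right = id.

-- The one-to-one correspondence C(a + b, c) ~ C(a, c) x C(b, c):
splitCases :: (Either a b -> c) -> (a -> c, b -> c)
splitCases h = (h . Left, h . Right)

joinCases :: (a -> c, b -> c) -> (Either a b -> c)
joinCases (f, g) = either f g
```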
And then here's the categorical semantics. InLeft and inRight are obvious; case is actually, to use the technical term, bloody difficult. Our term L here: its semantics would be an arrow h going from gamma to A+B. Then P corresponds to an arrow, call it f, from gamma×A to C, because we've got an additional free variable of type A, and Q corresponds to an arrow g from gamma×B to C. And then, how do we make this work? Well, actually we won't use h here, we'll use the identity paired with h, because we still need gamma when we evaluate P or Q; so we use products to make gamma×(A+B), so we've got our A+B value but we also still have gamma. Now, one other thing: by the distributive law, gamma×(A+B) must be the same as gamma×A + gamma×B. This is something you learned in high school, and it turns out it's still true in category theory; actually it's not true for all categories, only for categories called distributive categories, which are ones for which, guess what, the distributive law holds, and almost every category is distributive if it has sums and products. Now we've got a sum of two things, so we can go out with [f, g]. Oh, but wasn't f supposed to take a gamma? Oh, we got the gamma by doing the distribution; we're okay. So that gives us the semantics again. And it turns out that as long as you've got exponentials, which I'll show you in a minute, which match lambda terms, you can actually build particular arrows that make these two objects isomorphic. That just means there's an arrow from one to the other and an arrow back, and if you compose the two arrows, either way around, you get the identity; so there's a one-to-one correspondence, or what we call an isomorphism, between the type we've written on the left and the type we've written on the right (or object, if you prefer the proper categorical term). I won't go into that in detail. You know, I just write this up and say it's obvious; it took me two hours to work out what these arrows were and to confirm that it worked properly.

Okay, so the last thing is how you do lambda expressions, and this is kind of bizarre; the first time I saw it I thought, wow, that's taking lambda expressions and making them really, really difficult. What we do is we say: okay, I've got gamma extended by A (that's very similar to the trick we did with the case expressions), and f is defined on everything in gamma and also A, and it goes to B. Well, that's the same as taking curry of f, which is basically going to be the lambda abstraction. Because if I've got some term with a bunch of free variables, including one of type A, call it x, and it goes to B, then lambda x of type A, dot, whatever this term is, has type... well, once I lambda-abstract, it's a function from A to B, which you get by binding the argument to this x. So curry of f goes from gamma to an A-arrow-B; I've written the arrow in a slightly funny way to distinguish it, and it's called an exponential. And then we pair this with the identity function on A, so this says: okay, we've got two things, this one changed by curry of f, this one just left unchanged. Now we've got an A-to-B thing and an A, and we can apply the A-to-B function to the A to get a B. So this is function abstraction, and this is function application. And what this construct tells us (I've not been able to write it here, but this again is dotted) is that any arrow h that makes this commute must be equal to curry of f. So what this says is that the arrows from gamma to A-arrow-B are the same as the arrows from gamma×A to B, or we can just say exponential is left adjoint to product, if we really want to make our heads explode.
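A sketch in Haskell of the exponential structure and the distributive arrow used above; curryArrow, apply and distribute are my names, chosen to avoid clashing with the Prelude.

```haskell
-- curry(f) : gamma -> (a => b), given f : gamma x a -> b
curryArrow :: ((g, a) -> b) -> (g -> (a -> b))
curryArrow f = \g -> \a -> f (g, a)

-- apply : (a => b) x a -> b, function application as an arrow
apply :: (a -> b, a) -> b
apply (f, x) = f x

-- Universality says that the arrows g -> (a -> b) correspond one-to-one
-- with the arrows (g, a) -> b, which is the adjunction he mentions.

-- The distributive law gamma x (a + b) ~ (gamma x a) + (gamma x b),
-- used in the semantics of case:
distribute :: (g, Either a b) -> Either (g, a) (g, b)
distribute (g, Left  x) = Left  (g, x)
distribute (g, Right y) = Right (g, y)
```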
This is a little bit easier to follow in programming-language terms. These are just variables, and gamma is a product of a bunch of variables. How does that work? Well, if it's just one thing, it would be the identity. If it's a bunch of stuff, and we've added an x on the left, and whatever N is, it's going to be a variable (eventually we'll use this rule), but we've added an extra x that we're going to ignore, then we can just throw that away: using first we pick out gamma and then do whatever the semantics of that variable was, and similarly for the rest. So this just says that to take a variable out of the environment we need a chain of firsts and seconds, and this builds up the appropriate chain of firsts and seconds to pull the variable out. Then we can do the proper rules for exponentials: here's the rule I was telling you about for building a lambda expression, and here's the rule for application, and the semantics is very easy. If f is the semantics of the top line, curry of f is the semantics of the bottom line; and if f and g are the semantics of the two terms here, we just pair f with g to get an A-to-B function paired with an A, and apply it. So the categorical semantics are actually pretty easy.

One thing I've skipped over, but I'll just make the point briefly (oh, I've not used the colours properly): if you just look at the types and ignore the terms, this says that if you have an A and you have a B, then you have an A×B. Well, in logic, if you can prove A and you can prove B, you've proved A and B. So that's Curry-Howard: pairs correspond to logical conjunction. If you know A and B, then from that you can conclude A, and from that you can conclude B; so product is just logical conjunction. Sum says: if I know A, then I know A+B. Oh, that's "or": if I know A, then A or B must be true, because A is true; if I know B, then A or B must be true, because B is true. And what do I do if I know A or B? Well, if assuming A I can prove C, and if assuming B I can prove C, and I know A or B, then I know C without assuming anything, because either A is true or B is true, and in either case I have a proof of C. These are Gentzen's logical rules for "or", and you can see that here we get the categorical interpretation of Gentzen's rules for "or". So categorically, sums give us disjunction. And what this shows us is that, categorically, what are called exponentials give us implication. This is the rule: how do I know A implies B? Well, if assuming A I can prove B, then I know A implies B. And if I know A implies B, and I know A, I can conclude B; that's called modus ponens. So the categories give us the right thing.

I must be close to time, if not over, so let me wrap up. That's it; that's what I wanted to show you, except for the cool stuff. You've seen all the hard stuff, you've done all the work, and now I'm going to show you the cool part. I've already shown you one of the coolest things I know: the correspondence between logic on the one hand, computing on the other hand, and categories on the third hand, so we've got a sort of alien being here. Here's our diagram for product, and here's our diagram for sum. You sort of go, oh yeah, they look kind of similar. No, they do not look kind of similar: they are exactly the same, if you just flip all the arrows. This diagram and this diagram are the same; it's just that here the arrow goes from C to A and here the arrow goes from A×B to A, while here the arrow goes from A to C and here the arrow goes from A to A+B. You just flip all the arrows, and that gives you the other thing. This is called a dual, and duals are often indicated with the word "co". The dual of a dual is the thing you started with: if you flip all the arrows and then flip them back, you're where you started. So duals have this nice property. Negation in logic is an example of duality, and yes, flipping all the arrows in the categorical statement that corresponds to a logical statement gives you its negation, so it is just like negation in logic: the dual of a dual is back where you started.
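The duality he is pointing at, summarised in Haskell terms; fork and caseSum refer to the earlier sketches, and this is a recap rather than something from the slides.

```haskell
--   fork    :: (c -> a) -> (c -> b) -> (c -> (a, b))
--   caseSum :: (a -> c) -> (b -> c) -> (Sum a b -> c)
--
--   fst . fork f g  = f            caseSum f g . InLeft   = f
--   snd . fork f g  = g            caseSum f g . InRight  = g
--   fork fst snd    = id           caseSum InLeft InRight = id
--
-- Flip every arrow in the product laws and you get the sum laws.
```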
Category theory joke: what does the category theorist call a coconut? Ha ha. And here's the other cool thing. What did we say? What does it mean to have an arrow from C to A×B? It means you have a pair of things: an arrow from C to A and an arrow from C to B. What does it mean to have an arrow from A+B to C? It means you have a pair of things: an arrow from A to C and an arrow from B to C. What does it mean to have an arrow from C to A-funny-arrow-B, the exponential? It means you've got an arrow from C×A to B. Those of you who are very symmetrically minded are thinking: oh, what happens if you have a plus here, do you get something? Yes, you do, but it turns out to be quite a limited logic; very few things satisfy it. It gives you exactly Boolean algebras, and classical logic, if you also have the other thing. But these equations, you know, you solved them in high school. Let's write the arrows from C to A by saying A raised to the power C. That kind of makes sense, because if C is a finite set with exactly c things in it, and A is a finite set with exactly a things in it, how many ways are there to get from C to A? Well, let's see: I can pick one thing in A for the first thing here, and one thing in A for the second thing here, and so on, so if there are c things here, there are a-to-the-c different ways of getting from C to A. And how many functions are there from n things to two things? Two to the n; you're all familiar with that. So this is just written as A raised to the power C, and if you take those three equations and rewrite them in that way, you knew them a long time ago. Notice I'm cheating a bit here: I've written product for both the product in black and the product I wrote in blue, and I've done a similar thing with the exponential, so I've mushed together things that are different. Category theory teases those apart in a very useful way, but the basic idea you knew back in high school. You may now all say "mmm" again.

I'm done. I will just leave you with this thought: life is cool, do something cool and stick to it for 30 years, and always remember, when you have a tough job to do, that this is a job for lambda calculus.
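Those three equations, written out alongside their high-school counterparts (a summary in standard notation, not copied from the slides):

$$\mathcal{C}(C,\,A\times B)\;\cong\;\mathcal{C}(C,A)\times\mathcal{C}(C,B)\qquad\text{just as}\qquad (a\,b)^{c}=a^{c}\,b^{c}$$
$$\mathcal{C}(A+B,\,C)\;\cong\;\mathcal{C}(A,C)\times\mathcal{C}(B,C)\qquad\text{just as}\qquad c^{\,a+b}=c^{a}\,c^{b}$$
$$\mathcal{C}(C,\,B^{A})\;\cong\;\mathcal{C}(C\times A,\,B)\qquad\text{just as}\qquad (b^{a})^{c}=b^{\,a\,c}$$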
Info
Channel: Lambda World
Views: 77,786
Rating: 4.8819914 out of 5
Keywords: Philip Wadler, Lambda Man, Lambda World, functional programming, tech talks, technology, coding, programming
Id: V10hzjgoklA
Channel Id: undefined
Length: 50min 52sec (3052 seconds)
Published: Fri Nov 11 2016