We're all familiar with the phrase 'rules are made to be broken.' But today, by examining the work of philosophers and of activists like Rosa Parks, I want to ask: 'Is the rule of law also made to be broken?'

Part 1: Breaking the Rules

In his book 'To Save Everything, Click Here,' writer Evgeny Morozov asks us to imagine a law that could never be broken. Not in the sense that breaking it is logically impossible, just that in practice it's so thoroughly enforced that nobody can break it.

For instance, imagine a version of the United States that is the same in all respects as the one we know, except it has always been about 70 years ahead technologically. So in 1950 they have the technology that we will have in 2020: social media, smartphones, algorithms, facial recognition. That sort of thing. Morozov says we can imagine them building a system, mounted on top of a bus, that would scan the faces of passengers waiting at the bus stop and compare them to a database of known troublemakers. If you've got a history of causing trouble on the bus, like not paying for your ticket or causing a bit of a ruckus, the turnstile won't turn and you can't get on. What a brilliant system! Since we started using it, crime and confrontations on the city's bus service have gone down hugely.

If you're very clever you'll already have seen where this is going, because in 1955 buses in certain parts of the United States had racially segregated seating. There was a section for white people, and a section for everyone else. A lot of people have heard of Rosa Parks, a black woman who was arrested in Alabama in 1955 for breaking those rules. She became a huge symbol of civil rights activism and is still rightly remembered today. But a lot of people, I think, also have the idea that Parks was just some lady trying to get the bus that day who was tired and didn't want to give up her seat to a white person.
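Morozov's "unbreakable law" really is just a few lines of logic. Here's a minimal sketch of the hypothetical turnstile; the source describes the system only in outline (scan waiting passengers, match against a database of troublemakers, refuse entry on a match), so every name and ID below is invented for illustration.

```python
# Hypothetical sketch of Morozov's perfectly enforced bus rule.
# The face IDs and database contents are made up for the example.

KNOWN_TROUBLEMAKERS = {"face_id_1041", "face_id_2287"}  # fare-dodgers, ruckus-causers

def scan_face(passenger):
    """Stand-in for the facial-recognition step: returns a face ID."""
    return passenger["face_id"]

def turnstile_admits(passenger):
    # Perfect enforcement: a database match means the turnstile simply
    # never turns, so the rule can never be broken in practice.
    return scan_face(passenger) not in KNOWN_TROUBLEMAKERS

print(turnstile_admits({"face_id": "face_id_9000"}))  # True: allowed on
print(turnstile_admits({"face_id": "face_id_1041"}))  # False: locked out
```

Note that the code has no branch for protest, appeal, or exception; that inflexibility is exactly the point of the thought experiment.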
In fact, the popular image of Parks as a lady who just had tired feet and didn't want to get up is false. Parks was already a committed activist, and had been arrested before. She was secretary of the Montgomery chapter of the NAACP, no less. Her protest was deliberate and pre-planned, and she was chosen for the central role. She wasn't even the first person to be arrested for breaking those rules: sixteen-year-old Claudette Colvin had been, nine months previously.

In other words, Parks already had a long history of activism and of breaking the law. Nowadays, the vast majority of people realize what she and the other civil rights activists back then already knew: that that history of breaking the rules, breaking the law, was a good thing. Because those were bad, racist laws. The people who made them and enforced them were wrong to do it, and breaking them helped make the United States a better place.

However, in the scenario with the technologically advanced bus system, the computer would have recognized her as Rosa Parks, prominent civil rights activist, and it would never have let her on, so she could never have made her protest. A law that morally should have been broken couldn't be. The point is, an unbreakable law allows no room for activism. As Parks and many other activists throughout history have demonstrated, sometimes breaking a law is necessary to show people that it's a bad one.

Trouble is, if you're not affected very much by a bad law, or even benefit from one, it can be easy to see that only in hindsight. A lot of white people at the time thought of Parks as creating a conflict, creating a problem. In reality, what most people nowadays (though not all) realize is what she already knew: activists don't create conflict. They just bring to light conflict that has been ignored. And breaking the rules might be not only morally necessary, but practically necessary to get them struck down. Browder v.
Gayle was the case that went all the way to the Supreme Court of the United States and finally got bus segregation in Alabama ended. But of course, judges can only decide on cases that appear before them. An unbreakable law can never be reviewed in court. Which means that if it's an unjust law that's popular with those who have a lot of influence, like racial segregation was, it's gonna stay on the books and continue to be enforced.

Morozov says that the law, which includes both courts and policing, must allow room for a moral environment. It must allow people some opportunity to break the law, so that we have the opportunity to dissent and to revise the laws that are bad. Having at least some crimes, of at least some kinds, is a feature, not a bug, of a good legal system. So it looks like the rules really are made to be broken.

Part 2: Making the Rules

There's a famous conflict in legal philosophy between the philosophers H.L.A. Hart and Ronald Dworkin over what the law is. Hart thought that the law is essentially just a system of rules. They're special rules, admittedly. They're not like the rules of Scrabble, because if you break the rules of the law you will be violently punished. And I don't know about you, but Scrabble never got that serious in my house.

What distinguishes an ordinary rule from a rule of law, Hart says, is that laws come under a master rule called the 'rule of recognition,' which determines what is and isn't law. The rule of recognition might be something very simple, like: everything in the Constitution is the law. If you want to know whether something's legal, check whether it's constitutional. But it might be something very complex, or even unspoken. Hart viewed the law as a closed system, like the rules of chess, where a good or a bad move can be determined by reference only to where the pieces are and the rules of the game. This position is called legal positivism. So, for instance, a judge only has to think about what the law says.
Contrast Hart's positivism with another view: that of Ronald Dworkin. Hart and Dworkin disagreed on almost everything. Dworkin said that the law is more than just a system of rules, and applying the law is more than just following the rules. When judges decide on a case, they don't just think about what the rules are. They think about what's fair, and what's just. Sometimes there's an entirely new legal situation with no precedent, and in those cases Hart says that judges use their discretion to create new law, because there are no rules to tell them what to do. But Dworkin is more of the opinion that judges never create new law out of thin air. Even if there are no hard and fast rules to go on, they always have principles and the history of previous decisions to guide them. The job of judges isn't just to inflexibly apply ironclad rules, but to think about what the law as an institution is for, and to make a decision that aligns with the goal of dispensing justice and painting that institution in the best possible light. Dworkin calls this 'law as integrity.' And we can see that it offers a little bit of a bigger picture, a little bit more flexibility, and an encouragement, not just to citizens and activists but to the courts, to consider the moral environment.

Part 3: Minority Report

I'm curious about what happens when we throw technology into the mix, because computers have a reputation for being notoriously inflexible. Let's imagine that you and I are software developers, and we've written a new computer program that's going to help the police. Our special program takes all of the crime reports, all of the arrest records and the police data from the last few years, plugs them into an algorithm, and then says: 'Better send a few more officers down Green Street, there's been a bunch of burglaries down there lately.'
The algorithm spots patterns in crime that the police might miss, predicts where crime is most likely to occur next, and sends the cops patrolling round there. They might catch somebody in the act, but, because the cops are there anyway, anybody thinking about committing a crime is gonna think twice. So we might actually even prevent some crimes from ever happening. We can call it 'smart policing,' or 'predictive policing.'

You and I are going to pitch this idea to the mayor of our city, and we're gonna say: it's gonna cut crime, it's gonna cut policing costs, it's gonna make people safer. The mayor is almost certainly gonna get reelected, and we'll employ people from the local area to build the software and run our offices, so we're job creators, too. Give us a grant of taxpayers' money to develop this technology. Roll it out. Doesn't it sound wonderful?

Of course it does. Until you remember that the police in our city have a history of disproportionately stopping black people. And disproportionately arresting black people for the same crimes as white people. And are more likely to interpret the innocent actions of black people as criminal. And white citizens of our city are more likely to call the police on black citizens who aren't actually doing anything wrong. Our algorithm is going to take all of that data, and it's going to recommend that we more heavily police predominantly black neighborhoods. So unless we find some way of debiasing our dataset, we've essentially just made racist policing more efficient.

Now, you and I, the software developers: are we gonna admit that? No. If we even realize it. Because if we admit it, then we're out of a job. All of our public-facing communications have to say that our product is objective, and it's great, and it works.
And the mayor isn't going to admit that they let racist computers run the police department, and the police sure as hell aren't gonna admit that they're institutionally racist. And this algorithm that we wrote belongs to our private software company. It's our intellectual property. We can't just publish it; somebody might steal it. So the people in our city are not allowed to challenge, or even know about, the system that is being used to police them. And those of us who are in a position to know about it and challenge it all have a very strong incentive not to.

By the way, in case predictive policing sounds like science fiction... I hate to tell you, it's already being tested in San Francisco and Santa Cruz, in LA, in New Orleans, and in many other cities in the US. In my country, it's being tested in Kent. Private companies like Palantir, Ripjar, and PredPol are poised to sell this tech, for taxpayers' money, to the cops and make bank doing it. And as with any new policing technique, be it a fancy algorithm or just a thousand more dudes with guns on the streets, nobody's gonna ask your permission before they roll it out on you.

The point is, just because the law is being enforced, and just because crime is going down, doesn't mean that everything's going well, because those things can happen whilst the moral environment is being eroded. Whilst we have less opportunity to challenge, change, and even know the rules of the game.

[Acapella, to the tune of 'Folsom Prison Blues']
I hear the sirens comin', comin' for me, I'll bet
They're gunnin' to arrest me for what I ain't done yet
Going down for activism, I'll be inside so long
But them laws need to be broken 'cause all them laws was wrong
My momma told me, sonny, don't hang around with fools
Always be a good boy, don't ever break the rules
Well I sure am sorry, momma, now I am doing time
Well I broke my momma's heart with my necessary crime
[lively acapella guitar solo]