Alison Roman Answers Your Tech Questions | Hard Fork Podcast

Captions
We reached out to Alison Roman, and something about me is that on YouTube I will watch any human cook any dish. I don't know what it is; I love watching people make food. I make food occasionally, but not all that often relative to how much YouTube I watch, and there is nobody better at cooking on YouTube than Alison Roman. And I should say it's not just YouTube. She's also a chef and a food writer, known for some of her viral recipes, like The Stew and The Cookies, and some bestselling cookbooks, including my favorite, Nothing Fancy, and more recently Sweet Enough. She kind of has a finger in just about everything.

Yeah, and I think most relevant to us is that Alison is what I would consider a very online person. She came up during the social media age as someone who developed a very relatable online persona, and she's also had a career that's been shaped in some ways by the internet. She experienced an episode of criticism, or backlash, online a few years ago, which led her to leave her job at The New York Times and eventually start her own newsletter. So she's experienced the highs of online fame and also some of the lows. And maybe most relevant to us and what we're going to do on the show today is that she loves to give advice.

That's right. She recently started an advice podcast called Solicited Advice, where she answers listeners' questions, not unlike what we do on this segment, which we call Hard Questions. So we're thrilled to have her here today. Let's bring her in.

Alison Roman, welcome to Hard Fork.

Hello, happy to be here. How are you?

I'm well, how are you?

I'm great. I'm so thrilled that you are here today. Thank you so much for doing this.

Oh my gosh, of course. I love podcasts. I love your podcast.

This one comes from a listener whose name is also Alison, from Arizona. She's very concerned about protecting her child's image from facial recognition technology, but recently, when she went to register her kid for daycare, she learned that she might have to choose between getting child care and her kid's privacy. Here's Alison:

Hi, Hard Fork. After months of being on a waitlist, a spot for my two-year-old opened at one of my top choices of daycare. As a precondition of registration, I was asked to sign a photo release giving the daycare rights to use my child's image in promotional materials, including online content. I raised concerns that images uploaded to the internet could be available to train AI facial recognition technology, but the response was that I could sign the photo release or not take the daycare spot. Why is it okay for child care providers to demand, through a photo release, the ability to provide this training data to the algorithms, potentially impacting their future privacy? And is there anything that can be done?

Ooh. Alison Roman, what do you think? What do you make of this?

I mean, it's sort of two questions, right? Is this okay, and also, what do I do? And I don't know how hard it is to get into these schools; I'm sure it varies. You sort of have to weigh what they are going to do with the information, and does that con outweigh the pro of them going to this school that I really want them to go to, that feels like it's going to enrich their life and make my life better as well as a parent. I don't know. Not to sound nihilistic, but I sort of feel like with all the surveillance and AI and facial recognition and cameras and all that stuff, we've already lost, you know?
There are cameras on every corner; people are taking pictures of them. When I go to the airport, I belong to Clear, Delta One, TSA... I'm like, get me on the plane, you know what I mean? At this point, I don't know that not signing up for Clear is going to be the thing that takes it down. They're going to do it anyway, so I don't know.

Yeah, that's actually a thing that people have written about, the concept of privacy nihilism, where it's like, we already lost, so why does it matter? I have a kid. I do post photos of my kid on a private social media account, but I don't put them publicly on the internet, and there are a couple of reasons my wife and I have chosen to do that. One of them is the AI of it all, but there's also stuff that has nothing to do with AI. There are lots of predators on the internet, and that's something that a lot of parents are worried about as well. And the thing that actually tipped me over the line on this, because we were grappling with what to do, was seeing that Mark Zuckerberg did not post photos of his kids' faces on Instagram. He would put little emojis over them, which a lot of celebrities also do with their kids, and I was like, well, if Mark Zuckerberg has chosen not to put his kids' faces on the internet, that probably indicates something.

If Mark Zuckerberg isn't like, "it's fine," you know...

I mean, we know it's not fine, but, like, even when he's like that, it's not fine.

Yeah. I guess my advice for our listener would be that I think it's a red flag that the daycare won't accommodate this request.

Right. As a parent, that is a great rebuttal.

Yes, it's a pretty reasonable thing to ask. A condition of sending my kid to daycare shouldn't be that their images get used on the internet. So I would drill down a little bit to figure out: what exactly are you planning to use this for? Is it a promotional Instagram account? Could you just use photos of the other kids? Why do you need my kid's face specifically? But yeah, that seems to me like an indication that this is not a place that may be willing to work with you on other things. If you can't bring peanut butter to class, you should also be able to opt out of the AI facial recognition, you know?

Yeah, I agree with you on that, Kevin. If they're going to be unreasonable about this, they're going to be unreasonable about something else. And I do think it's worth making a stink over, because it's not just the AI training; it's also, you're going to take my child's gorgeous face and use it to sell your preschool? I should actually have a say in that, right? Or give me a discount on your daycare or something. Am I getting compensated for this usage?

Yeah, what's your day rate? That's what I would do. But I would do a lot for free or reduced-price daycare. That stuff is expensive, you know?

Yeah. But I also do want to underline something that Alison said, which is that a camera is going to take a photo of your child's face somewhere, and you're not going to be able to opt out. And I don't want to sound like a nihilist, because I do think that we should fight against surveillance everywhere, but that's probably something to keep in mind.

All right, our next question is about what happens when people are rude to digital assistants like
Siri. Let's hear the clip:

Hi, my name is Nick, and I live in Oakland, California. My son loves tools, so I recently took him to a hardware store to look around, where he became fixated on a magnetic grabber tool. While in the process of explaining to my son what magnets were and how they worked, I accidentally triggered Siri via my Apple Watch. It misheard what I said about magnets and thought I had instead called it a homophobic slur. Siri lightly scolded me for my behavior and sent me some links to resources from the Human Rights Campaign. This mishap made me realize how easily digital assistants misunderstand us, but it also raised a fear: does how we treat our digital assistants get tracked and influence how they interact with us? For example, if someone keeps telling Siri to shut up, will it learn to give shorter responses over time? As a result of this fear, I even caught myself scolding others for being rude to my digital assistants. Am I weird for thinking about this, or is this a legitimate concern?

I mean, first of all, this man's Apple Watch is not just going to sit around and take it when slurs are being uttered in the Home Depot.

What? Alison, how do you feel about the way that people talk to their digital assistants? Have you ever heard somebody say something to their digital assistant and thought, I don't like the tone?

No, but I did used to date someone who was very codependent on their digital assistant, which is a very generous term. It was Siri and Alexa, and I was like, can't you do anything on your own? Just Google it. If I'm either Siri or Alexa, I'm going to be like, read a book. But I think people being rude in general is really upsetting to me, and we shouldn't be rude even to our phones, because if you're speaking out loud into your phone, someone around you is going to hear you, and even just hearing someone be rude... It's like, have you ever been to dinner with someone and they're shitty to the server? Nothing worse. Truly nothing worse than that.

Yeah, such a dealbreaker for any sort of relationship, friendship, partnership, anything.

I can't imagine. And if you're like that with your digital assistant, then you're probably a shitty person to a server, I imagine.

Yeah, I agree. I am polite to all of my AI and digital assistants. I even caught myself... I got out of a Waymo, a self-driving car, the other day, and...

Oh my gosh, you're not getting in those? They're so scary. Are you both in San Francisco?

Yes. Yeah. Wait, did you take a ride in one while you were here?

No, but I saw them. I was on Mission, and I look over and there's no one in the car, and I'm thinking, well, that's not going to end well.

No, I take them all the time, and I caught myself the other day, as I was getting out, thanking the driver of the self-driving car. I was like, thanks!

One time I told an Uber driver I loved them. I was getting out and I was like, "Love you, bye." I was like, what?

Well, here's what I do think: I think that the increase of digital assistants, be it an actual vehicle or a Siri-type thing, will make us more rude as people, because we are going to have less and less empathy for the person that we're interacting with, because it's not a person. So we're going to think of them as a machine, as a tool, as a whatever. Then you get in an Uber and you don't say thank you, because you're so accustomed to being in a Waymo, or whatever those are called, and you're not saying thank you
because there's no one there. Or you're like, oh well, I'm just asking Siri something, so I'm going to talk that way to my barista, you know? I think it's going to desensitize us a bit.

I also think, though, that these systems will just mimic human beings over time, and most empathetic people are going to have trouble being mean to something that sounds like a person, even if you sort of know in your heart that it's not.

Yeah, it's like... do you remember when those videos came out of the robot dogs, and people kicking the robot dogs? These robots, you know, the Boston Dynamics ones...

I can't believe they caught you doing that on video, by the way, Kevin.

But people were kicking the robot dogs to show off how they could get back on their feet, and people were really upset that there were videos of people kicking robot dogs. I think you're right. As these assistants get smarter and smarter, we're just going to feel stranger and stranger talking to them in such a short and, I don't know, unkind way.

Well, Kevin, let me ask you one thing, because I've heard parents say that they are extra nice to the digital assistants in their house because they don't want their kids to hear them being rude. So are you doing a lot of please and thank you to make an impression on your child?

I do, yeah, because he's listening now and he's watching me, and if I start ordering around Alexa, then he's going to think it's okay to start ordering around his friends at daycare. That's my fear. And then I'll get kicked out and I'll have to go to the daycare where you have to sign up for surveillance.

Yeah, where their retinas get scanned at age two.

Yeah. All right, let's move on to the next question. This one comes from listener Amanda Darby, who wants to know what we make of the risks of using a period tracker app. Let's play the clip:

Hi, Hard Fork. For context, I have been using a period tracking app for more than 10 years, so it knows a lot about my health, like when I didn't have periods, and other aspects of my health like ovulation, mood swings, any irregularities in my cycle. But I don't think it's a stretch to say that someone could look at this data, perhaps if it were subpoenaed in a particularly red state, and make inferences about decisions I've made regarding my reproductive health or experiences I've had that relate to my reproductive life. So my question is: what should I and other people who use period trackers be aware of and be thinking about when we think about this question? Thanks.

So, obviously, Casey and I...

Yeah, I was going to say, you guys want to take this one? Period tracker apps.

But I'm curious what both of you make of this question.

Yeah, I mean, I do, and the way that I treat it is very much like a calendar. It prompts me all the time to ask how I'm feeling, my mood swings, my symptoms, my this, my that, and I never answer. And it'll ask me pointed things, like it'll go on a slide and be like, what about this? Have you done this today? Have you blah blah blah? And I never participate.

I had no idea that these apps were that nosy. What do they want all that information for?

Well, I mean, they say they're using it to give the user a better experience. So if you're like, oh, it's day three of my period and it's really, really heavy, I feel like they'll be like, that's
really common; 83 percent of our users feel the same way. They use it as a way to say, here's what's going on with your body, you have questions, and here's our data from everyone who's using this. You know, only 4 percent of women have felt this way, so you might want to go to a doctor.

Yeah. I did a little looking into this issue, because I was curious about these apps, and it turns out this has been an area where a lot has come out over the past few years, especially after Roe v. Wade was overturned. A lot of people said, you know, delete your period tracker app, because this is going to be able to be subpoenaed and you could get in trouble for using one of these apps. There are privacy risks associated with these apps, some more than others. My colleague Kashmir Hill actually wrote a great article basically about whether this is a thing that people should be worried about, and the conclusion of the article was that if governments, or groups that wanted to incriminate people based on information about their periods, wanted that information, there are lots of other ways they could get it. Some of these apps can track location; if you're texting a friend, "Oh, I just found out I'm pregnant," or maybe you did a Google search for Plan B, those are the kinds of things that can be used. And so the conclusion of these privacy experts was, let's keep things in proportion. This is not the only way that, if someone wanted to find out information about you and your pregnancy status, they could do it. But I'm curious if either of you finds that persuasive.

Yeah, I mean, that is persuasive to me. The thing that comes to mind is that we need a national privacy law. We don't want to get too wonky today, but something we've talked about before is that America is basically unique among developed nations in not having any national privacy law. So this is a case where we could just write legislation that says, hey, if you collect this kind of very intimate personal health data, you're actually not allowed to sell it to an advertiser without someone's consent, or something like that. So, yet another reason I wish we had such a law.

Yeah. All right, this next one comes to us from a teacher named Margo, who lives in Ohio. She comes to us with a story about how some of her colleagues became concerned over how she used AI in an assignment she gave to her students, and she wants to know: are her colleagues right? Here's Margo:

So, right before state test prep, I wanted to have the students analyze two pieces of poetry and compare and contrast the points of view, which is a common standard in Ohio. This was not meant for deep analysis; this wasn't a poetry unit. I just needed two quick pieces to practice the skill before the state test, so I hopped onto ChatGPT, GPT-4 at the time. I'm always trying to bring marginalized voices into my classroom, so I had it generate a poem from the perspective of an Indigenous person during the Trail of Tears, and then another poem from the perspective of a white colonizer, and I was going to have the students compare and contrast the points of view through the pieces. I shared this idea with some of my colleagues and was met with a little bit of concern, and one of my colleagues suggested that this wasn't a great use of AI, because I was, in essence, silencing the
voices of already marginalized people by using a robot to create the piece of poetry. Voices that are already marginalized, silenced, and ignored... you know, to go above and beyond to create something from their perspective, it just wasn't sitting right with them, and it really gave me pause. So I would love to hear your perspective on when, if at all, it is okay to use AI to generate literature for the students to read and analyze.

Oh my God, I have so many thoughts about this, but let's let our guest go first.

Oh my God, I have to go first? Okay. Well, my thought is, this doesn't sit well with me, is what I'll say, and for so many reasons. Most of all, there is so much literature that does exist that is authentic to people's real experiences, and the second we decide to just cut out the human and be like, well, we're just going to approximate it with this AI situation, that freaks me out so bad, I've got to say.

Yeah, I really agree with you on that one. When you think about what a large language model is: the people that built them took all the literature they could find, all the marginalized voices they could find and the non-marginalized voices, fed it into this model, didn't compensate any of those people, and now they're making a product that they sell for $20 a month. So it's sort of hard for me to imagine a worse insult to injury than, we're just going to use ChatGPT to make a poem. So look, we have talked often on the show about how we think there are lots of cool and fun ways that you can use AI, and we will continue to explore those, but one place I think we can hold sacred, at least for now, is literature class. Let's just show the kids real literature and not the AI slop.

I'm going to defend our listener here a little bit. I share the concern about the replacement of art, and especially art by marginalized people, with AI slop. But in this case, this is not something that she is offering as a commercial product. She is not making any money on this. This is a classroom exercise to teach students how to compare and contrast pieces of poetry. It's not like somebody would be getting a royalty on this that they are being deprived of, so I think we really have to look at the economic harm here. But I do think it's important to disclose to the students that the thing they are reading is AI, and not to make up some name, and maybe that would create a fruitful discussion in the classroom about the ethics of doing this. My take on all the stuff about AI in the classroom, which has been the subject of so much debate over the last year or two, is just that we should be having the discussion, teach the controversy, essentially, and I think this would be a really good question to pose to the students.

But Kevin, do you know how many poems there are? Why do we need to use the AI for this? There's not a poem shortage in this country.

Listen, these are teachers. They're strapped for time. What are they supposed to do, go through a poetry anthology looking for the exact right piece?

It feels like microwave dinner to me. It's like, yeah, we're going to eat, and sure, we don't have the time, so okay, at least we're eating dinner. But no, there's a difference. And I do think that if you have the opportunity to educate your students, also: here are five really great Indigenous poets that you
should be aware of. If this spoke to you, if you're interested in this story, their stories, here are some people you can look up. But why not just go with one of those poets in general? Just pick one. I think the second that we start being like, does this feel weird? Is this gross? Am I doing something wrong? The answer is probably yes, right? And it feels a little sticky to expose young people who are supposed to be learning and expanding their brains... not that, 15 years from now, they're going to be like, wow, that one poem really changed my life. But it might, for one person.

Yeah, sure. Well, I'm outvoted here on this show, but Margo, I just want to let you know, I'm with you. Just know that I've got your back.

Hey, that's the end of this clip. If you liked what you saw, head on over to our page and subscribe, and you can get the full podcast. We do a show like this almost every week on tech and the future. Head on over there now and subscribe.
Info
Channel: The New York Times
Views: 3,681
Keywords: Alison Roman, Hard Fork, recipe, advice, podcast
Id: 4oVTB4ganoY
Length: 21min 2sec (1262 seconds)
Published: Fri Jul 12 2024