DEF CON 25 Wifi Village - Balint Seeber - Hacking Some More of the Wireless World

Captions
All right, if you know what's good for you, now's the time to be quiet, as this man is literally invited just so that I can watch him, and he's awesome, so we will throw you out. All right, ladies and gentlemen, I would like to present Quadling — no, I would like to present Balint Seeber. He is literally invited every year so that all of us running the contest can stare in his direction with our mouths open, wondering what the hell just happened. His bio is kind of lame: a software engineer by training, Balint is a perpetual hacker, director of vulnerability research at Bastille Networks, and the guy behind spench.net. What you should really know is that while most of you were discussing burner phones on Twitter, this man came into our village — I think it was three years ago — and said, "I need to redirect a space probe real quick, can anybody loan me a cell card?" He borrowed a cell card from a random stranger and beamed commands to something flying in space to try to redirect it around the planet. So yeah, if you're all scared at DEF CON while he's redirecting spacecraft, seriously, sit down and have a good old time. I would like, with the proudest feeling in my chest, to introduce Balint Seeber and "Hacking Some More of the Wireless World", an extension of his talk on hacking the wireless world last year. This will be cut off when he damn well feels like it, or sometime late after lunch, so sit down, make yourselves at home, meet your neighbors, and thank you, Balint. [Applause]

Thank you for that very kind introduction, Zero, and thank you all for coming to the Wireless Village over some of the other excellent talks going on at DEF CON at the moment. We just heard an interesting story from two DEF CONs ago — is my voice level all right, can you hear me okay up the back? Thumbs up, great. Where's Anders — where are you sitting? Anders is the man that let me tether to his phone so that I could get on the internet and talk to the computer at the Arecibo radio telescope to beam commands to ISEE-3, so thank you very much, Anders, once again, for trusting me with your internet connection.

My name is Balint, I work with the vulnerability research team at Bastille, and I'm going to talk to you today about three things. Some of them I started introducing last year, and I'd like to show you some of the additional work I've done on them since then. One of them, briefly, will be Inmarsat; another is implementing something we saw at a Cyberspectrum previously — Cyberspectrum is a meetup we run for people interested in sharing projects in software-defined radio, in the Bay Area in particular — and then some new work that I've done with radar. My aim today is that everybody understands a little bit more about how radar processing actually works, so if you have any questions at any point, please feel free to ask. I will leave here satisfied if I see nodding heads and smiles as opposed to quizzical looks, so let me know how that goes. Again, most of this was done with GNU Radio, and I'll show you some flowgraphs and the general concepts behind them.

The first item is Inmarsat, and later on I'll also be encouraging audience participation — I've got some live demos here, so hopefully that will go well, the gods willing. Just a very quick recap; if you're interested in more of the details you can watch last year's talk. Inmarsat is a constellation of birds offering a number of services, one of which is Inmarsat Aero, used by airplanes that travel outside the terrestrial coverage of normal VHF ground networks, particularly over the oceans. Inmarsat is a bent-pipe design, so transmissions are relayed between aircraft and ground stations. It's actually pretty easy to listen to these on the C-band downlink: just get a dish, make your own feed, and use an LNA and a bandpass filter, and then you're able to decode at least the P channel, which is what I looked at to begin with — basically decoding ACARS messages going over this link. I have to give credit here to my friend Ian Buckley; this is a setup he made so that we could receive these signals.

Again, I won't go through the detail — I covered it last time — but you can start demodulating (it's a simple GMSK signal) and then work through the process to turn what appears to be noise into intelligible data. The first step is deinterleaving, and I want you to remember the idea of an interleaver or deinterleaver: you write data into the rows, and once the memory cell is full you read it out from the columns. The purpose of this, for the transmitter, is to redistribute the information bits over time: if you have a burst noise source, it won't knock out all the bits in a single packet; those missed bits will be distributed over many packets, and then you can use convolutional decoding — forward error correction — to fix those individual erroneous bits, and therefore you get a much better chance of recovering any corrupted data. Once you do the forward error correction and some descrambling, things look much more structured. The P channel here is a coordination channel, and some of the messages are actually user messages containing text. If you're a plane spotter and you like to listen to ACARS, this is essentially the equivalent, but coming down from a geostationary satellite, and it's interesting because there's a slightly different flavor to the messages, in terms of navigating and sending directions to planes that are, say, over the Pacific. Just some examples to give you a feel: random human-readable text messages to the crew, weather reports in standard notation, and a different form of weather report commonly used in aviation.
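The row-write/column-read interleaving scheme described earlier can be sketched in a few lines (the 3×4 dimensions here are arbitrary, purely for illustration — the real Inmarsat parameters differ):

```python
import numpy as np

def interleave(bits, rows, cols):
    """Block interleaver: write bits row by row, read them out column by column."""
    return np.asarray(bits).reshape(rows, cols).T.flatten()

def deinterleave(bits, rows, cols):
    """Inverse operation: undo the column-wise readout."""
    return np.asarray(bits).reshape(cols, rows).T.flatten()

# A burst of channel errors hits consecutive interleaved bits, but after
# deinterleaving those errors land far apart in the frame, where the
# convolutional decoder can correct each one individually.
frame = np.arange(12)
assert np.array_equal(deinterleave(interleave(frame, 3, 4), 3, 4), frame)
```

Note how the first three transmitted symbols (`0, 4, 8`) come from positions four apart in the original frame — that spacing is exactly what protects against burst noise.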
You've got AFN, ADS-C and CPDLC, which are different sorts of encodings for messages pertaining to routing, traffic flow and various other things. And this is a good one that I saw — remember the Galaxy Note 7? They would remind everybody over the Inmarsat Aero P channel as well. A lot of these messages are also printed out on the cockpit printer or sent to the captain, for example informing them about preferred routes for their flight plan. This one is a custom message saying there's really bad weather somewhere, and this one is interesting — it's actually a step-by-step procedure to, I think, reset one of the displays or subsystems on the aircraft or in the cockpit. This one would be a message that you print out: just before they close the doors, they print out a manifest and the captain signs it, so I'm guessing they print this out and the captain signs it on the line there. Here you'll recognize these numbers — who has a guess as to what these codes are? AKE91536Q — any guesses? I don't think they're tail numbers; no, not railway waypoints either. I might be wrong here, but I think they're the codes for the luggage cargo containers, because if you look on the side of those they're all uniquely coded, so I think these are some of the containers they have in the belly of the plane. Here there's an alert that the gross takeoff weight exceeds some threshold; again routing information, to reduce delay; and this is interesting — a call to the gateway to have them bring the fuel up to some value to take into consideration a fuel burn increase, so they're modeling different aspects of the flight. This is data regarding all sorts of different subsystems — the autobrake and various other telemetry sent back to airline operations and the manufacturers so they can keep track of the equipment, because a lot of the stuff there, like the engines, is all "power by the hour", so that's all sent back to Rolls-Royce or the other manufacturers. And then various things left to the discretion of the captain as to takeoff or what have you.

So that's all well and good. I also spoke previously about decoding all eight channels at the same time, and once you do that you can amass quite a lot of information. You might not want to read all of it, but maybe you want to look at what was just pointed out here: the waypoints that are part of the flight plans. If you look at this, you see there are north and west coordinates — latitude and longitude — and then there are these five-letter character combinations. Anybody have any ideas what these five-character codes are? That's right: they're navigational waypoints on the aviation navigation charts established around the world. They often don't actually send the raw latitude/longitude coordinates; they just summarize them with these five-letter codes, because those are well defined. So what you can do is parse that and plot it. (At any point when the blue arrow appears, I'm going to show you something outside the presentation — if for some reason I get caught up, the arrow is there and I haven't shown you whatever I intended to, yell at me and I'll switch.) So you can amass all that data, parse it, and get something like this: I parsed all that information and converted it into KML for display in Google Earth. The satellite I was listening to is actually servicing the Pacific Ocean area, so you would expect most of the information flowing through to be regarding planes flying over the Pacific, and if you zoom out you can almost get a feel for the satellite footprint. These are all the flight plans that have been sent to all of the aircraft flying through — it's particularly busy over Japan, say. If you zoom in a little more, you get all the five-character codes coming up, and you can see — this, I guess, is the registration number for the aircraft — what flight plan it was actually taking. I think the red dots are the ones sent with the raw latitude/longitude coordinates, which you can also parse. Just to show you what that looks like in the raw: the output of the GNU Radio decoder dumps all this raw information, with the frequency here because there are eight channels, and if you parse that a little you can get the sorts of excerpts I showed you before. What I did was just google these five-letter codes, and through various matches online you find listings for different areas — for Russian airspace there was this kind of text listing, so I just threw that into a Python file, and at the bottom I go through and parse it, and it builds up a list of waypoints, and the same for Hong Kong and Japan.
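The lookup-and-plot step amounts to scanning decoded text for five-letter codes and emitting KML placemarks. A minimal sketch (the waypoint names and coordinates below are made up for illustration, not taken from any real chart):

```python
import re

# Hypothetical waypoint database, as would be built from the scraped listings
WAYPOINTS = {"ABCDE": (35.0, 139.0), "FGHIJ": (36.5, 141.2)}

def message_to_kml(text):
    """Find known five-letter waypoint codes and emit KML placemarks."""
    marks = []
    for code in re.findall(r"\b[A-Z]{5}\b", text):
        if code in WAYPOINTS:
            lat, lon = WAYPOINTS[code]
            marks.append("<Placemark><name>%s</name><Point>"
                         "<coordinates>%f,%f,0</coordinates></Point></Placemark>"
                         % (code, lon, lat))  # KML wants lon,lat order
    return "<kml><Document>%s</Document></kml>" % "".join(marks)

print(message_to_kml("ROUTE ABCDE FGHIJ ZZZZZ"))
```

Unknown codes (like `ZZZZZ` above) simply fall through, which matches the behavior you want when a flight plan references a chart region you haven't scraped yet.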
Taiwan actually had a JavaScript listing, so I did a find-and-replace to make it valid Python, and built a sort of self-reflexive object to parse all that. The waypoints file just includes all the points, and the parser takes the output of the decoder, looks for all those coordinates and five-character waypoints, and spits out the KML. It's kind of interesting to see the system actually working, visualized that way. So that was the update on Inmarsat — any questions about that before I move on to the next thing? No? Okay.

The next thing is also aviation-related: it's an implementation of what's been termed an "unselective AM receiver", and I have to give full credit to Kevin Reid, who at Cyberspectrum number 15 demonstrated this with his ShinySDR platform, a really neat web-based — excuse me — really neat web-based SDR receiver that you can use remotely from the actual hardware. What he did — and he's documented this on his blog at switchb.org, which I encourage you to have a look at — is a really nifty reversal of the common AM demodulator concept. With a normal AM demodulator, if you want to listen to aircraft — the interactions between pilots and the tower for takeoff, approach or taxiing, if you're a bit of a plane spotter — you usually have a receiver, a software-defined radio, and with that thick arrow there it's receiving a very large chunk of spectrum. Say you might be receiving 10 MHz worth, so you get a nice big chunk of the air band and see all the little transmissions there. Then you select a channel to listen to, that gets downsampled and filtered, and then you go complex-to-magnitude (actually, that should be a thin arrow there, I didn't fix that). Once you do the AM demod — which is really that step of complex-to-magnitude — you output it to your speaker and hear the audio.

What Kevin Reid did was swap those two stages: he swapped the downsampling-and-filtering with the complex-to-mag. What that means is you essentially take the magnitude of your entire AM spectrum — the whole 10 MHz or so that you're receiving — and then downsample and filter the entire thing. You never actually select a channel to downsample to; you demodulate the entire band, but because it's AM, the side effect is that you end up hearing the strongest transmitter in the band, which is really neat. (Thanks, Russ — there we go.) So the idea is that you end up hearing the strongest transmission, and this works because it's AM: you're demodulating AM simply by looking at the power level of your carrier wave, and if you treat the entire spectrum as your carrier wave, the strongest signal there will cause the largest ripples, and that's what you'll end up hearing. So he swapped them like this, and it's actually really easy to implement. He then did something really clever: take the receiver output — the raw complex baseband — and apply a slope filter to the spectrum. On one path you say, "I want the left-hand side of the spectrum louder than the right," and on the other path you want the right side louder than the left. Once you've shaped your spectrum like that, you do the same process — complex-to-mag, i.e. AM-demodulate the entire spectrum, then downsample — and you play the two paths into the left and right speakers respectively. What ends up happening is that stuff on the left-hand side of the spectrum comes out of the left speaker, and stuff on the right side comes out of the right speaker. It spatializes the entire AM spectrum for you, and if you're wearing headphones it sounds really cool. I've got some speakers here.
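The swap can be sketched in NumPy: magnitude first over the whole band, then a crude low-pass-and-decimate down to audio rate. (The boxcar filter here is a stand-in for a proper decimating FIR; the signal parameters are invented for the demo.)

```python
import numpy as np

def unselective_am(iq, decim):
    """AM-demodulate the ENTIRE received band, then downsample.
    No channel is ever selected; the strongest carrier dominates the audio."""
    mag = np.abs(iq)                 # complex-to-mag over the full bandwidth
    mag -= mag.mean()                # DC block
    lpf = np.ones(decim) / decim     # crude boxcar low-pass
    return np.convolve(mag, lpf, mode="same")[::decim]

# A strong AM carrier at +30 kHz, modulated with a 1 kHz tone, comes out
# as 1 kHz audio without the receiver ever being tuned to that channel.
fs = 240_000
t = np.arange(4800) / fs
iq = (1 + 0.5 * np.cos(2 * np.pi * 1000 * t)) * np.exp(2j * np.pi * 30_000 * t)
audio = unselective_am(iq, 5)        # 48 kHz audio out
```

The key point is in the first two lines of the function: `np.abs` runs before any filtering, which is exactly the reversal of the conventional chain.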
So in a moment I'll ask for absolute quiet, because these speakers aren't very loud and they need to be heard in stereo, but first I'm going to show you a video, and then I'll show it to you interactively. Just before the video, I'll explain what's going on here. This is the flowgraph, this is the whole spectrum, and this is a representation of the frequency response of the shaping filters. This is the raw baseband that's coming in, and this is the shaping — there's a blue one and a green one, the left and right channels. The reason it has these notches in it — and you see those notches reflected on the baseband — is that you've often got birdies and/or spurs, which are either internal to the receiver, or from man-made noise, or from VORs (which are always transmitting something), or just other things you don't want to listen to. Sometimes they can be powerful and will drown out the legitimate transmissions you do want, so I thought it would be cool if you could interactively notch out the frequencies you didn't want to listen to. What this does is let you click on the baseband spectrum on the things you want to notch out; it recomputes the filter with those notches included and applies that to the left and right slopes. It does it globally, so you can change the frequency of your SDR and it'll recompute everything in your passband and always keep out the stuff you don't want to hear.

So, if I can have it as quiet as you can manage, I'll turn the volume up here so you can hopefully hear it. Notice I clicked on that and the tone went away — I don't know if you can make out the stereo; if you're close you can. See, I click there, it applies the notch and recomputes, and as I change the baseband frequency it shifts all the notches along to keep them on the right frequencies. Here I've moved to where there's a carrier that's very loud and powerful that I don't want to listen to, so I notch that out too. It's cool, right? Because I'm not actually clicking on the channel I want to listen to — it all just comes in and you hear the most powerful thing. And what's nice is that you can click "remove" if you want to remove a notch, then click on the baseband spectrum, and that will remove the notch and recompute the filter — in this case for one I don't want to take back out.

Now, I mentioned that slope filter, and it wasn't immediately obvious to me how you generate it, but it turns out to be really easy — it's done in a single line of Python in a GRC block that I'll show you. Basically you use NumPy's linspace to create a ramp between zero and one over the number of taps that you want, you do an FFT shift (which rearranges the two halves back-to-back), you take an inverse Fourier transform, and then you shift it back. Essentially you say, "I want this frequency response," take the inverse FFT, and it gives you the time-domain taps that you can apply in a normal FIR filter. That's all it is — really basic. I've got a blue arrow, which means I'll show you the actual thing. Any questions so far? Who actually listens to the air band — the cockpit and tower and all that stuff? A few of you. Was there a question over there? [Audience] "Just curious — with the interactive notch, could you do equalization to null out tones that are continuous, or is that too computationally intensive across that wide spectrum?" That's a very good question — and I don't have to repeat it because he's on the mic. If you remember, that line was using linspace to create the taps; you can decrease the number of taps if you're using a higher bandwidth.
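The single line of tap generation he describes — linspace ramp, FFT shift, inverse FFT, shift back — amounts to something like this (the function name and default tap count are mine; the talk mentions 512 taps):

```python
import numpy as np

def slope_taps(ntaps=512):
    """Desired |H(f)|: a linear ramp from 0 at one band edge to 1 at the
    other. fftshift puts the response into FFT bin order, ifft yields
    time-domain FIR taps, and the final shift centers the impulse response."""
    response = np.linspace(0.0, 1.0, ntaps)
    return np.fft.fftshift(np.fft.ifft(np.fft.fftshift(response)))

right = slope_taps()   # louder toward the right-hand side of the band
left = right[::-1]     # the other channel is just the same taps reversed
```

Taking the FFT of the taps recovers the ramp exactly, which is why so few lines suffice: the frequency-sampling method is doing all the work.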
Naturally, the larger the bandwidth, the harder this thing has to work — I think I take 6.25 MHz off of the USRP and it makes it work pretty hard — and I chose a high tap count so that it would look really good in the frequency response plot. If you make that much smaller it looks far uglier — the response is visibly rippled — but it works okay, so you can manage it that way. So let me find the flowgraph here — I'm going to have a resolution problem, so let me just make the thing fit. I'm going to just do a live demo of what I put together. So that's it, and at the moment I've got a single notch in there, and if I take it out you can hear the tone — you don't want that. If you're watching the console there, I've got this notches JSON file that contains the list, so if I click this it'll load those notches, calculate the complex filter, rotate it up to the baseband frequency that I selected, and apply it here. What's nice is that you can also change the number of taps — if I made it 13 instead of 512, it obviously changes the look of your frequency response. One other thing my friend Ian Buckley observed is that if you add a delay between the left and the right channel, especially on headphones, it emphasizes the stereo effect even more. I don't know whether you'll necessarily be able to tell here — you'll need to play with this yourself — but you can add a delay to emphasize that, and you can mute the left or the right. If you look at the actual response of the output of each filter, you can see the baseband plus the filter showing that characteristic roll-off to one side or the other, and the green lines that are appearing are just selecting the FFT bin that has the strongest energy in it — most likely the audio that you're hearing at that frequency — which is being updated at the bottom there: max frequency, right then left. Again, this is just my implementation; the original idea was Kevin's, but I thought it might be fun to implement in GNU Radio.

I can take you through the flowgraph very quickly. I've got a file source here giving the complex baseband output, and as I mentioned — whoops — you take one line up to a filter that contains the right-channel taps, and you take the baseband to the other filter that has the left-channel taps. What's nice is that you only need to compute one set of taps, because the other channel is just the same taps in reverse, which is handy. Once you've done the filtering, you merely do the complex-to-mag — the AM demod step — and then you resample to match the sample rate of your sound card. I've got a DC block in there as well, to remove any DC from the demodulated signal; an AGC, so that the audio level coming out of the speaker is always more or less normalized; a low-pass filter — AM is quite narrowband, so this focuses on the audio passband of the one channel — and then it just goes into one channel of the audio sink. That's all there is to it.

The additional complexity comes with the interactive part. The graph there is a sort of real-time matplotlib wrapper I made, and using an embedded Python block you implement the graph and it all comes up on the display. Every time the variable dependencies — the left or right taps — change (this line means "create a callback for when those variables change"), it runs this Python, actually multiple statements separated by semicolons: the first part uses SciPy to calculate the frequency response of the left taps, then the frequency response of the right taps, and then sets the data on the real-time graph with the FFT-shifted log-ten magnitude of the responses. So that line, whenever the taps change, updates the graph. In terms of actually clicking on things, there's a callback mechanism: with these FFT GUI sinks you can specify a variable to take the frequency that you clicked on, so when that variable changes, I have a little bit of Python — a class with an add-notch function — that takes the clicked frequency; if "remove" is disabled it adds the notch, and if "remove" is enabled it removes the notch at that frequency. So there's just a little bit of Python managing the global database of values. Once that happens — I think it's here — the tap generator class is initialized; it takes the initial set of taps, which is just that initial left and right slope, and self.taps is, I think, a reference to the actual taps that the generator will update. The generator generates new taps and calls a callback registered within GRC to update the taps, which then updates the FFT filter and the graph. That's it, really — conceptually, at least, it's pretty elegant, and again, props to Kevin for figuring that out. Any questions on any of that so far? Okay.

So, next on the agenda — the third and final part — is radar, particularly FMCW radar. I spoke a little bit about this last year, but I'd like to show you some more and explain some more concepts, and do a bit of a recap. Who's seen these primary surveillance radars at airports? When you fly, you see them rotating there.
This is a basic kind of CW radar, and if you watch here as it rotates — I'm sitting on a hill with a USRP on its frequency — on the left-hand side it's plotting the magnitude of the received signal, and you can see that it's triggering (play it again) on the initial transmission from the radar. The radar then switches to receive mode to listen for any echoes coming back, in this case from airplanes, and you can see there are also some responses coming back very close in time: I'm picking up both the initial burst of the radar and what turns out to be mainly nearby ground clutter.

So what is a radar return, and how does that work? This is an example return down the bottom — complex IQ that's again been turned into a magnitude, essentially AM. The way it works is that the radar sends out an initial short pulse — in this case it's CW, just a continuous wave at a single frequency — then switches to receive very quickly and waits for the echoes to come back, because the radio energy will reflect off, for example, aircraft fuselages, and some of that energy, just a small portion, comes back and is collected by that massive dish rotating on the radar. It's so massive because the signals are so weak, so you want a lot of gain at the antenna. Once that period has elapsed — the pulse repetition period — a new pulse is sent out and the process continues, and this can happen hundreds or thousands of times per second. There are some radar concepts that are important to get hold of. One is the pulse repetition interval, or frequency: the amount of time between pulses, or how many times per second a pulse is sent out — I'm going to call that the PRF. The width of the actual transmitted pulse — for how long the radar is actually transmitting — is also important. The way you figure out how far away an object is is by looking at the delay between when the signal went out and when the echo came back; this is called the round-trip time. I'll explain that in a little more detail in a moment, but consider these two diagrams from RF Cafe: these thick black lines are when the transmitted pulses went out, and then you have this return here that comes back, and there's a notion of unambiguous versus ambiguous returns in radar. In diagram A, the return occurs before the next transmitted pulse, so you can say it's an unambiguous return. In diagram B, the real return actually arrives after the second pulse — in excess of that interval — so if you didn't know any better, you would assume the target is really close to your radar, when in actual fact it's even further away. You get that sort of false return, and this is ambiguous range. What's important to note is that the way you design the parameters of your radar will constrain you, and will set the unambiguous range that you can compute. The easy way to think about it is that the round-trip time, times the speed of light, divided by two, is your virtual range — it's not necessarily the real range, so just keep that in mind.

Now, with that spinning primary surveillance radar and my recording: each of these scan lines is triggered — there's this red part on the left-hand side — when it hears a really loud burst, attributed to the radar making its transmission, and then it keeps listening for a fixed period of time for the return. This is all just me sitting on the hillside with a little whip antenna on the USRP, not a big radar dish; it captures a fixed number of samples and then it waits for the next radar transmission as it rotates.
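The round-trip arithmetic, plus the unambiguous-range limit the PRF imposes, comes down to two one-line formulas (function names are mine):

```python
C = 299_792_458.0  # speed of light, m/s

def virtual_range(round_trip_s):
    """Apparent target range: half the round-trip delay times c
    (the signal travels out AND back, hence the divide-by-two)."""
    return round_trip_s * C / 2.0

def unambiguous_range(prf_hz):
    """Echoes from beyond this range arrive after the NEXT pulse has gone
    out and fold back, appearing (ambiguously) much closer than they are."""
    return C / (2.0 * prf_hz)

print(virtual_range(1e-3))        # a 1 ms round trip -> ~150 km
print(unambiguous_range(1000.0))  # a 1 kHz PRF -> ~150 km unambiguous
```

This is the design constraint from the diagrams: raising the PRF gives you more looks per second but shrinks the range you can report without ambiguity.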
waits for the next two radar transmission as it rotates and what amazed me is that you can actually still pick up with a little weapon with antenna all of these structures that are a little bit further out from the ground clutter which is causing these immediate returns on the left-hand side and so this is just each scanline and then I just recorded it for a number of rotations I think this is actually just a single rotation this is 50 mega samples per second capture into ramdisk and what's nice is that taking that that linear plot and if you unwrap it into into polar space and place it on the radar position itself this is the Bay Area here you can see the ground clutter actually lines up quite neatly with you know the ground and not in the bay what blew my mind is that you actually get returns from the power pot power pylons that cross the the bay there and also from the bridge so these large structures return enough energy to be received by just a width antenna on their user so you know that that's the simple concept of of CW radar returns a bit of theory here from from Wikipedia this is to illustrate why CW radar is is not the best kind of radar there there are better forms so again this is the plot where you have time and the red CW tone here is the brief one is what's transmitted by the radar and these blue patches here are the echoes that come back and so if you take a match filter which is in this case just a filter for the frequent see that was sent out then that means you can ignore everything else and if you apply the filter then you get this plot on the on the right so you ignore the transmitter pulse and then the peak of the response here is is at the time where where that target was and then this other target is further out and so you get a lower amplitude response but again you get the peak there and this is not not the best right because you have this ramp up and ramp down and you like to have a much more accurate definition in time and the time 
domain of where that target response actually is and the bigger issue is that if you have two targets that are close together like this so those two targets are now not separated by as much distance and therefore time then the response the echo actually blurs together and so now it looks like there's only a single target if you apply a naive protection strategy so that's why CW has these drawbacks and now a simple alternative and there are many other radar alternatives is FM CW and what that means is it's frequency modulated carrier wave so instead of just sending out a fixed carrier wave you actually send out a carrier wave that increases in frequency or might decrease in frequency and this is what it looks like in the time domain so you start at a low frequency and then you quickly ramp up and it might be linear it might not be but you can ramp up and by the end of your transmission you're you're actually transmitting a high frequency and the reason why FM Sedova is good is because these chirps this is known as a chirp when when you have something increasing or decreasing in frequency it has the property of strong self correlations it's got a really good autocorrelation property and what that means is if you take that signal and you mix it with a copy of itself then you'll get a really good response but if you mix it with anything else then you get a really bad response which is not the case for what we just saw with the CW tone because you get this mix here and the ambiguous response so I have a demo for that we're going to actually listen to what a chirp is so that you can get a better idea of what that is like and you can see it so this is again in the audio domain I want to emphasize that these are general concepts and these are transferable between audio and radio and other things so you will hear the chirp and this is actually the response from my microphone in my laptop so as I'm speaking you can see there does anybody want to whistle or make a tone there 
you go, just to prove to you that it's live. And now I'm going to let you hear the tone, and you'll see it on the spectrum. So, we can't even hear that, but the microphone and the speakers still generate a response. Okay, so this is a chirp. It's a very slow chirp, but it's a chirp nonetheless; the radar chirps are much, much quicker, and I'll show you what that sounds like and looks like in a moment. So this is the frequency response in the time domain. A common construction is to use a sawtooth wave: usually with a signal source in GNU Radio you might have a sine wave, but here I'm using a sawtooth, and I hook that up to a VCO, a voltage-controlled oscillator. What I'm effectively doing is putting in a linear ramp for linear voltage control, and then the VCO turns that into a frequency-modulated continuous wave. We can see what that looks like here. I'm gonna speed up the chirp here with the, where is it, delay multiplier, low-pass cutoff, here. So this would take ten seconds per chirp; I'm gonna make it do it every second, right, I'm gonna make it do it ten times a second. Okay, now if you look at it in the time domain, this is the output of the sawtooth generator, so this is not the frequency output yet, not the FMCW yet; this is just what's controlling the VCO. Once you feed that through the VCO, it looks like this. Let me slow that down so you can see a little better: this is the time-domain plot of the complex output of a VCO block, and you can see that you get a complex sinusoid that's increasing in frequency and then resetting. So this is the basis of a chirp radar, and we're listening to the chirp here. All right, so again, this should be familiar to you; we just saw it in the complex domain, but here it's in the real domain. Now, in this case we've been listening to a full-duty-cycle chirp generator, so it's constantly generating the chirp, and as soon as one finishes, the next one starts.
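The sawtooth-into-VCO construction can be sketched in a few lines of NumPy. This is a hedged illustration of the concept, not the actual GRC flowgraph; the sample rate and frequencies are made up for an audio-domain demo like the one on screen:

```python
import numpy as np

fs = 48_000                 # audio-style sample rate (Hz), assumed
T = 0.1                     # chirp duration (s)
f0, f1 = 1_000, 8_000       # start/stop frequencies (Hz), assumed

t = np.arange(int(fs * T)) / fs
# The sawtooth is a linear ramp; a VCO integrates its control input
# into phase, so we integrate the instantaneous frequency over time.
inst_freq = f0 + (f1 - f0) * t / T
phase = 2 * np.pi * np.cumsum(inst_freq) / fs
chirp = np.exp(1j * phase)  # complex baseband chirp (the VCO output)
```

The magnitude stays constant and only the phase accelerates, which is exactly what the complex time-domain plot of the VCO block shows.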
In the radar scenario we looked at before, we sent out a chirp, stayed quiet for a while, and then sent another chirp. You can do both; they're both equally valid, they just require different sorts of hardware configurations. I'm going to talk about the full-duty-cycle, continuous version now. Now, the other thing to remember here, and this is really important too, is that when we talk about filters, filters need taps, right? Before, with the left/right channel-selective receiver, we generated two series of taps to define two filters, and they imposed their shape on the spectrum. Here we'll consider a filter whose taps are the chirp. Usually there are low-pass filters and high-pass filters, and you can think about how they might shape the spectrum, but you can make a filter out of whatever you want; you can even use filters in the digital domain, for example matching the access code word or the preamble of a digital packet. Here our filter is actually going to be the chirp itself. The idea is that whenever the receiver hears a chirp and you pass it through this filter, you'll get the maximum response from your filter when the incoming chirp lines up with the chirp that we've defined in the filter. And this is what makes it, I think, really click: this is like the diagram we had before, so we have the transmitted chirp here, and we have these two echoes, because the echoes are going to be reflections of what we transmitted, and because we're using this special filter that has this great autocorrelation property, we get a really nice, well-defined peak here in time that lines up with the exact response, the echo, that came back. So you get the first target, the second target, and, this is the really cool thing, then all this noise is added, which is basically completely swamping out the echoes, and amazingly, when you pass that through the filter with the chirp, the noise disappears and you're still left with the responses, the peaks at the two targets. And even
if they were overlaid on one another, like we had in the previous plot that was just the CW case, you would still be able to disambiguate those two targets. So FMCW is pretty simple. There are radar techniques that use coded transmissions, but this is like an in-between level where it's still fairly doable with easy construction, even in GRC. So, any questions so far? Question: no question about the mapping, but how did you get the, I guess, rotational rate of that radar, to project it? This is going back a couple of slides, yeah. You know, I guessed. I mean, okay, I recorded, I think I can record 70 seconds into 16 gigabytes of RAM, and I think I guessed. The other one was: does the distance from the actual radar receiver, you know, on the hill, give some kind of distortion? Yes, exactly, so I can show you later, but if anyone's interested in this notion of the distortion and the actual physical model of the radar system, considering the path propagation, I'll talk about that just briefly. In a Black Hat talk I did a couple of years back, I show a slide where I actually wrote a program to create a rasterization, a visualization, of that distortion map: you basically give it the offset and angle between the receiver and the transmitter, and it will calculate a distortion grid. I didn't do it, but you would then apply that to your returns and it would undistort everything. So it's possible to do, but that's a very good point, in that we're always talking about virtual range and virtual time echoes, so you have to keep in mind how the geometry of your radar is set up. Next question: could you just go back a slide? There you've shown the filtering, and this is from Wikipedia by the way, yeah, I realized that. I was gonna ask: the previous Wikipedia slide had the two pulses really close together, yep, so is there a limit for the chirp, like decoding or filtering the chirp, in terms of frequency or something else, for you to be able to disambiguate?
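To make the autocorrelation point concrete, here's a small NumPy sketch (all numbers invented for illustration) that buries two closely spaced chirp echoes in heavy noise and recovers both with a matched filter, the "filter whose taps are the chirp":

```python
import numpy as np

rng = np.random.default_rng(0)
fs, T = 48_000, 0.02
t = np.arange(int(fs * T)) / fs
# Up-chirp sweeping 1 kHz -> 7 kHz over 20 ms
chirp = np.exp(1j * 2 * np.pi * (1_000 * t + (6_000 / (2 * T)) * t**2))

# Two overlapping echoes starting at samples 500 and 560, swamped by noise
rx = np.zeros(4_000, dtype=complex)
rx[500:500 + len(chirp)] += chirp
rx[560:560 + len(chirp)] += 0.8 * chirp
rx += 2.0 * (rng.standard_normal(4_000) + 1j * rng.standard_normal(4_000))

# Matched filter = correlate against the chirp (np.correlate conjugates it)
mag = np.abs(np.correlate(rx, chirp, mode="valid"))
p1 = int(np.argmax(mag))                 # strongest echo
mag2 = mag.copy()
mag2[max(0, p1 - 10):p1 + 10] = 0        # blank that peak's main lobe
p2 = int(np.argmax(mag2))                # the second echo pops out of the noise
```

Even though the two echoes overlap in time (like the ambiguous CW case), the chirp's sharp autocorrelation separates them into two distinct peaks.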
Yes, so again, that's all set and constrained by the parameters of your radar system: how wide the chirp is, what your signal bandwidth is, what the sample rate of the ADC is, all that stuff. And I've got a little Jupyter notebook that I'll put online, and you can enter all of those parameters and it'll tell you what all the constraints of your radar system are. It's just that the Wikipedia page showed those signals close together, yes, so I was intrigued to see, yeah. I mean, that was some simulation that someone put up, but again, it's down to the numbers. Just a question: would an autocorrelation give you that rotation rate? Maybe, let's actually go look at it and get it. Yes, yeah, it would. I mean, you can; each capture is like 15 gigs or something, so you could do it, it would take a little while, but that's exactly right. The point raised is that if you wanted to figure out the rotation rate, and I guessed because I didn't want to do it, what you would do, completely correct, is take the capture and run a correlation of it with itself, and as soon as similar features match up again, you'll get a response at the time where the radar has resumed and completed a single rotation. Thank you very much. Any other questions? That's great, thank you for these questions, it's really good. So, with this continuous, or full-duty-cycle, radar system, you have your chirp generator, and that's transmitted; as you heard here, it's being transmitted out of the speakers, and the receiver in this demonstration was the microphone in my laptop. There I then mixed the input with the chirp, and because we're receiving and generating this simultaneously on the same hardware, it means the clocks are synchronized, but we're using audio hardware that doesn't allow you to start the transmit and receive streams at the same time; that's
different on, say, a USRP, which I'll show you next, and that's why I have this delay slope factor that you can change, and I'll show you that in a minute. But you mix the two together, and then you get the de-chirped signal, and the de-chirped signal holds wonders that I'll try to explain to you; it means you can do some really sophisticated processing in a pretty simple way and get much richer information out about what's going on in the space around you. What's important to note here is that once you de-chirp the received signal, you get constant tones: when the chirp de-chirps itself, you'll just get a DC tone, and then any other reflections will end up generating other tones at different frequencies. And that's the key: the cool thing about chirp radar is that once you de-chirp, your reflections become tones at different frequencies, and each frequency is controlled by the distance from your radar, which is kind of mind-blowing when you think about it. It's just a really nice way all the math falls out, and then you can just use FFTs and do cool stuff. I don't think so, because you're still just doing it on the basis of the responses and the filter, so you're still looking for a peak response. I mean, you could send out any kind of signal you wanted to, and I'll demonstrate that: the last thing I'll show you is passive radar, using exactly these concepts, using digital television signals bouncing off aeroplanes, so stick around if you want to see that; that's what this is building up to. So what's interesting is that in properly deployed radar systems, what often happens is there's more sophisticated RF plumbing. When you have a receiver and you're continually transmitting, you don't want to have to ingest your local signal, because it's always going to be the strongest thing you hear, and considering that it's eventually fed into an analog-to-digital converter that has a fixed dynamic range, you don't want to blow out your
receiver, you don't want to blow out the ADC. So what you can do is actually mix out, or null out, the tone that you're transmitting, just using analog RF mixers and blocks, DC blockers for example, and that means you take the power out, not digitally but in the actual physical electromagnetic domain, and then you're left with just the responses, the echoes, and that way you can much better utilize the dynamic range of your ADC and see reflections further out. From Wikipedia as well, this is a nice diagram to illustrate that kind of arrangement: you've got your generator, a power divider, you transmit that out and split it off, bring in the output of your receiver through a preamplifier, mix out the signal that you transmitted, low-pass filter, amplify again into the ADC, and then you do your digital processing. So that's the simple structure. What I'm going to try to do now is take you through how you do range calculations, and then, this is the magical new part, how to do Doppler calculations using FMCW and GRC, and it's basically all FFTs. This appeared like a massive mystery black box to me for the longest time; when I finally sat down and looked at it, it's actually really, really elegant, so my goal here today is to get you to understand at least a little bit of this. So this is a plot in the frequency domain: we talked about chirps, and we know that they look like this on the spectrum. Here you've got time going from left to right and frequency from bottom to top, and the red signal is the chirp that you're transmitting and the green signal is the echo. Remember, an echo is just going to send back what you transmitted, but with a delay in time, so as you can see, the green is delayed a little bit, delta T, from the original transmission. Now, the beautiful thing about this, as I mentioned, is that any time delay will produce a frequency shift.
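The "time delay becomes a frequency shift" claim is easy to verify numerically. This sketch (the parameter values are mine, not from the talk) mixes a delayed copy of a chirp against the transmitted chirp and reads the delay straight off the resulting beat tone:

```python
import numpy as np

fs = 1_000_000           # sample rate (Hz), assumed
T = 0.001                # chirp duration (s)
B = 200_000              # swept bandwidth (Hz)
slope = B / T            # chirp slope, Hz per second
n = int(fs * T)
t = np.arange(n) / fs
tx = np.exp(1j * np.pi * slope * t**2)       # transmitted up-chirp

delay = 40                                   # echo round-trip delay, in samples
rx = np.roll(tx, delay)                      # crude circular stand-in for the echo
beat = rx * np.conj(tx)                      # the de-chirp mix

# The mix is a (nearly) constant tone at slope * (delay / fs)
spec = np.abs(np.fft.fft(beat))
f_beat = abs(np.fft.fftfreq(n, 1 / fs)[int(np.argmax(spec))])
tau = f_beat / slope                         # recovered round-trip time
range_m = 3e8 * tau / 2                      # one-way range for an RF system
```

Here the 40-sample (40 µs) round trip shows up as an 8 kHz beat tone, which maps back to a 6 km one-way range: the reflection has become a tone whose frequency is set by distance.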
At any single sample in time: if we look at time t1, you've got a frequency shift from your transmitted tone at that particular point in time to your received echo. And what happens is, what was I going to demonstrate, maybe the whole thing here, yeah, that's what I'll show you. Going back to the audio demo, we had this, and we know what the VCO input looks like and what the VCO output looks like, and now we're going to look at the FFT of the output of the de-chirping, of the mixing. What this is doing is taking the chirp and multiplying it by what is being received from the microphone, and what you'll notice here, if you compare this one to this one, and hopefully you'll see this, is that it looks like the spectrum is being rotated over and over again at a fixed rate. See how it looks like the entire spectrum is shifting? So, on the left-hand side here, actually, do people want to whistle? You'll see your whistles come out, but they'll be diagonal lines here. See, this is me, and this is the cool thing: our chirp signal, remember I was telling you, turns into a fixed tone. Can you all see the fixed tone here? It's that line coming down the middle at roughly DC. That is the tone that we want to receive, and any other tones, in this case to the left of it, will be echoes that have come back from around the room into the microphone on the laptop. And that frequency difference, because remember, this is in the frequency domain, that frequency difference is going to give you your virtual range information, which is pretty neat. Now, what does this look like? I showed this last year, but hopefully I've explained things a little better so that it makes more sense. What I'm going to show you now is the same thing, but actually running in the proper mode. See, this is the chirp again; it's running at a higher rate because we want to be able to update the display more, and you
see, when I'm talking it's still picking me up, and because it's constantly rotating the spectrum, it doesn't look like a proper frequency response now. So what I'm gonna do is get my other laptop to be the target, and I'm gonna set my delay slope so that you see the DC part of it somewhere. Okay, apparently Nate Temple, the genius here, pointed out that I'm trying to move my laptop in front of speakers that aren't actually outputting anything; I want to use the speakers in my laptop, because that's where the microphone is, not those speakers. Thanks, Nate. Nate is an absolute legend, by the way; if you didn't get to see his Cyberspectrum talk from the other night, I recommend you go online and watch it, because he has built some amazing stuff. All right, here we go, this is better now. So what I'm going to do is, this is the DC component, and because I'm talking, we're just going to get this noise showing up here, but if I put my laptop here, then you can see there's that main response there, and that is directly related to the height of my laptop. Don't clap, though, because then you'll upset the microphone. So that's, all right, I planned that. What's happening here: a chirp is being sent out from the laptop, the air molecules are being vibrated, and then that audio wave is moving up, hitting the bottom side of my laptop, and being reflected back down into the microphone. And amazingly, even though the microphone is hearing itself, it can still discern the response from an object way out here. So that's the power of FMCW. Now, there was another little thing that was on my screen; I will show you that. That's actually the Doppler plot, and I'll come to that. What that gives you is not only range information but velocity information, so you can tell how quickly a target is moving, and that's really cool when it comes to aircraft. So what I want to show you here now is this plot where, instead of looking at the
original frequency domain of what we receive, this is once we've done the de-chirping, so we have that DC component here, and then the green line is now our target return tone, because we've de-chirped, and so you have this difference in frequency. The frequency change implies the time delay, and therefore virtual range. Now, what's neat is that you can take the de-chirped output and pass it through an FFT, because an FFT will give you the energy in each bin, and the trick here is in how you size the FFT: you have a certain number of points that you transform, and you can make the number of points in your FFT the same as the number of samples that make up a single chirp. So you decide how long your chirp is, and then you set your FFT, in terms of samples, to be exactly the same amount of time, and what that means is that every single time you do one full chirp, you take one FFT, and I'm trying to illustrate that here by these lines. If you think about these over time, like a waterfall, these are your frequency bins that you get out of your FFT transform, and in this case the echo return, the green line, ends up falling into, in this case, arbitrary bin number five. And remember how we said that differences in frequency imply your virtual range: the bin that the return falls into gives you the range information for that particular bin. So here you can see bin 5, and we can calculate how far away that is. To calculate how far away that is, you know how long one sample lasts, because you've set a sample rate on your SDR, 10 megahertz or 1 megahertz or whatever, so you can take 1 over that to get the period of a single sample, and each bin, in terms of duration, maps to the duration of a single sample. And if you know how long that is, if we're talking about RF now rather than audio, you can multiply it by the speed of light, and that gives you your round-trip distance, and you can calculate
your real range. So it's cool, because you can map to frequency really easily just by doing an FFT, and then you can map that directly back to virtual range. Does that kind of make sense? I see people nodding, which is good. Now, the wild thing is, in this instance this is a target at a fixed range and it's not changing, but you might have two different targets moving toward and away from your radar receiver, and so you would expect a plot like this, and we kind of saw that before with the FM chirp. Actually, one thing I was going to try, I don't have another laptop, but maybe if somebody has a laptop we can try at the end: we could have two here, and if you moved them separately you'd obviously get two different crossing returns like that. Maybe I'll demo that later, because time is short. So again, as I was saying, the sample rate sets the duration of a single sample and therefore limits your range resolution, because you get discretized bins coming out of an FFT, and the total length of that is going to be your unambiguous possible round-trip time. So your pulse repetition frequency sets your unambiguous range, but the sample rate also sets the distinct range that you can resolve within a single bin: as you step from one bin to the next, that's some fixed amount of range that you move. What you have to consider is that targets, depending upon your radar system, might be differently spaced but still fall within the range of a single bin, so you'll get one return for multiple targets in a single bin, and that's ambiguous. So you want to do something, which is Doppler processing, to make that unambiguous. And as I was saying, you've got different radar geometries: you've got a monostatic system, where the transmitter and receiver are co-located, so it's just a direct line-of-sight round-trip time, but you can also have a bistatic radar system.
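Those two constraints, range per bin from the sample rate and unambiguous range from the pulse repetition frequency, are one-liners. This is roughly the kind of calculation the Jupyter notebook mentioned earlier would do (my own sketch, with made-up example numbers):

```python
C = 3e8  # speed of light, m/s

def radar_constraints(sample_rate_hz, prf_hz):
    # One sample of round-trip time maps to this much one-way range
    bin_range_m = C / sample_rate_hz / 2
    # The time between chirps bounds the unambiguous round-trip time
    max_unambiguous_range_m = C / prf_hz / 2
    return bin_range_m, max_unambiguous_range_m

# e.g. a 10 MS/s SDR sending 1,000 chirps per second
bin_m, max_m = radar_constraints(10e6, 1e3)  # 15 m per bin, 150 km unambiguous
```

This makes the trade-off visible: raising the sample rate shrinks the range bins, while raising the PRF shrinks the unambiguous range.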
In a bistatic system, where the receiver is separate from the transmitter, you have to take into account the geometry, because you might have a target way out here with a single bounce out and back, or it might be in between the two, and so on, so it's important to keep that in mind. What I was talking about there is that with RF, the speed of light, we know what that is, and for a single sample that's going to be a very large distance, because we're talking about the speed of light, 300,000 kilometers per second, so even at a high sample rate, megahertz, a single sample is still going to allow the light, or in this case the RF energy, to travel a long way. So you can increase the sample rate to give you better range resolution, but there comes a point where, if you're just using an SDR and a laptop, you can't go to gigahertz worth of bandwidth; you will end up being constrained by your bus and by your processing speed. What you should consider then, with an FFT, is that you have these range bins, and every time the chirp finishes it does the transform and then it just starts again and keeps cycling like that, over and over, so you can see that cycle there. And we're talking about multiple targets fitting into a single range bin, so once you do the FFT and take the magnitude, you basically get an energy response at each range. As I'm saying, you might have two echoes that fall into the same range bin; they're not the same target, they're slightly different, but from the point of view of your receiver they end up giving you the same round-trip time. And the other problem, too, is that you might have a radar system, and I'll show you this with the passive one, where the transmitter is so strong, even out to many range bins, that it completely swamps your receiver, if you were just doing this transform. And what's nice is that
you can do Doppler processing, which reveals this hidden information lurking in the phase information that's output from the Fourier transform, and that will actually show you these targets even though ordinarily you wouldn't see them, because you're swamped by your local transmitter or by ground clutter. So, yeah, the clutter could affect that bin so you don't see them if you're just looking at range information, or it'll take out your entire transform. So, the Doppler effect: we've heard about this, and we know that it will usually cause a shift in frequency, and the classic example is an ambulance driving by. In this case, what I'm gonna try to illustrate to you is how it changes the phase of the echo that comes back to the radar; that's the key here. We get a phase change due to motion, and again, if anybody has any questions or comments so far, or if you want me to repeat anything, please ask; don't be afraid, as it took me a while to figure this out. Yeah, so it depends on how you conceptualize it and how you receive it. In this particular case, I'll illustrate it here and hopefully it'll make sense. The idea with Doppler processing is that you receive not just the echoes from a single chirp, but you do it for multiple chirps over some arbitrary period that you select, and that'll also have implications on what you can resolve; this is referred to as the integration time. So let's say you have an integration time of a second, and let's say you're sending out a hundred pulses per second: you collect a hundred of those returns, and then you process them in one batch, and from that you extract Doppler information. The key here, and I'll illustrate this for you, is that instead of just ending where we were before, where you do an FFT of your range response bins, you build up a hundred of those range response transforms. You imagine they're in rows, and then you run FFTs
on the columns. So what you end up doing is a second series of FFTs, on each individual range bin over the integration period, and I'll show you what that looks like. The other thing is to remember, again, the geometry: this is Doppler velocity with respect to your radar system, not with respect to the moving object. If you consider the scenario on the left here, you've got a radar transmitter/receiver that sends out a pulse, and the aeroplane is moving tangentially at that moment to the radar receiver, so it's not going to impart any Doppler shift on the return, because there's no velocity component along the line of your radar signal. Whereas in the second scenario, in the middle, with the plane moving toward the radar receiver, it's going to impart the biggest Doppler shift on the return. And if it's offset by some amount, then even though it's not entirely toward you like in the second scenario, it's still gonna impart a little bit, because there is still a component that lines up with your radar. So with Doppler processing, consider the scenario where at any one time the target is moving toward the radar: the pulse is sent out, at round-trip time divided by 2 the reflection is sent back, and at the round-trip time you get the return, and you can see that throughout that period the signal amplitude decreases, because of course there's loss through free space and at the reflector. What I want to illustrate to you here is that when that return comes back at a particular point in time, and it's always going to come back at a particular point, which implies a particular range, there's going to be a phase difference, which might be zero, effectively no phase difference, or some phase difference, between the signal that you sent out and the signal that comes back. And if you consider the actual propagation of a sine wave, even though you know
they're photons and whatnot, then depending upon where that plane is, if it's here, or if it's, like, a meter over here, offset, the wavefront will hit it, and if you imagine that wavefront hitting it, it'll hit at a particular phase and then reflect at that phase. And if you move the plane a little bit, the tone will hit the plane at a slightly different phase and reflect slightly differently. In this particular case we're lucky, because everything happens to line up on the phase boundary, so when you get the response it actually ends up being the same, so here it's zero degrees. And remember, this is all in a single range bin. Now, you could always say that if a plane is traveling you'll see it move through the range bins, and that implies that it's moving with some velocity, but this is all within a single range bin, because we want to disambiguate targets and get velocity information. So you've got a transmitted pulse, a received pulse, no phase difference. Now let's consider the plane has moved a meter, and it also depends upon your wavelength, that has implications too, but here we're talking about the same scenario, and if you look carefully here, the wave that is returned is, oops, slightly offset, see that, see there and there, it's a slightly different phase on that wave, and so when it comes back, the phase difference is going to be slightly different as well, and when we compare them we get some arbitrary phase offset. Now, these are two distinct pulses; let's consider four in more or less successive time order. The first one comes back; the second one comes back with a difference; the third one comes back, and in this case it's the same, because the plane moved again and imparted another phase change; and then again we've got a different phase offset in the fourth period. And remember, every chirp period that gets sent out, we get this different energy coming back to
the receiver. It's the same chirp, the same signal; the magnitude will look the same, but the phase will be different. And what you have to think about is, when you do FFTs, when we look at them on the waterfall, or when we usually think about them, what are you doing? You're taking the FFT, which always gives you a complex output, and you always end up taking the magnitude of that complex output. And the effect of that, anyone? You lose data. What do you lose? You lose the phase information. Here we want to use that, and the integration period is important because that's the period over which you're monitoring these phase changes in each of the range bins. Does that make sense? So the idea is to get that phase information from each FFT bin over the integration period, and then think about this: what is a changing phase over time? It's a rotating, what's the magic P word? It's a phasor. All right, so you've got a phasor going around, and I think I have a blue arrow here, yes. So, just to illustrate that very simply in GNU Radio, because we usually use fast sample rates, and fast in terms of spinning, everything is high frequency here: I've got a signal source which is a sine wave whose frequency is 0.2 Hertz, okay, so that phasor is rotating very slowly, and we can very easily illustrate that. This is an IQ plot; it's a scope plot in XY mode, and you can see the samples. This is a normal sine wave, right, where are you, no, it takes a long time because it's slow, there it is, let's go. So we have the normal sine wave that's being generated, and if you look at it in XY mode, it's just a phasor rotating at a fixed rate around the unit circle on the complex plane. What I want you to think about is exactly this in a single range bin: your echo is coming back, changing in phase, so you'll get a bit of this rotation in that range bin.
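Here is that idea as a toy NumPy calculation (the wavelength, PRF, and velocity are invented, not from the talk): a target closing on the radar advances the echo's phase by the same amount every pulse, and an FFT across the pulses in one range bin turns that rotating phasor into a velocity estimate:

```python
import numpy as np

wavelength = 0.03     # e.g. a ~10 GHz radar (assumed)
prf = 1_000.0         # chirps per second
velocity = 5.0        # target closing speed, m/s
n_pulses = 64         # 64 ms integration period

# Between pulses the round-trip path shortens by 2 * v / prf metres,
# which appears as a fixed phase step on the echo in its range bin
d_phase = 2 * np.pi * (2 * velocity / prf) / wavelength
echoes = np.exp(1j * d_phase * np.arange(n_pulses))   # the rotating phasor

# FFT across the pulses: the rotation becomes a peak in a Doppler bin
k = int(np.argmax(np.abs(np.fft.fft(echoes))))
f_dop = np.fft.fftfreq(n_pulses, 1 / prf)[k]
est_velocity = f_dop * wavelength / 2                 # close to 5 m/s, bin-quantised
```

The estimate lands in the nearest Doppler bin, so its resolution is set by the integration period, exactly the trade-off described above.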
And of course, what happens when you take the FFT of a sine wave? In this case, you'll get a response in some bin that maps to the frequency of the thing. So what does that imply? Can anybody tell me, once you end up taking the FFTs of the columns? Maybe I'll show you the next slide and somebody can tell me. So we did all that, okay, so, oh, the microphone stopped working, hello, oh, there we go, testing, it's not even wireless, so we can't blame somebody trying to DoS the mic, right. So consider this then: each row here is a single FFT that we took on an echo that came back, and over our integration period we build up multiple rows, so these are all the transforms here; we're just taking an 8-point transform, and so we build up each row, and we have a number of columns that maps to our integration period. Then, to get the Doppler information, what do you do? You look at the phase, but what's the next step here? I mean, I've already said it, so I'm hoping somebody was listening. You take FFTs of, what? The columns. So you build up this memory cell, and then once you've populated all of these with the complex outputs of the first FFT part, the range information, you take the FFTs column-wise, and what you're doing is an FFT to look at the phase changes of the returns at each range bin. And what does that mean? The Doppler shift, the movement of the target, imparts the phase change, and that causes that phasor to rotate, and how fast or slow, and whether it's rotating forward or in reverse, the heading of that phasor, tells you how fast the object is moving. So if you have two targets that ambiguously end up in the same range bin, but they're flying in different directions or at different speeds, you can use Doppler processing to disambiguate them. So instead of just having a 1D plot, you end up with a 2D plot, and I'll show you what that looks like. So you run the FFTs for each of these columns, and then you get, oops, now remember how I
said, remember this slide from earlier in the presentation? What was I showing you when I showed that slide? An interleaver. Look familiar? An interleaver. So what you can do in GNU Radio, I just wrote a simple row/column interleaver block, and it takes the output of the first FFT stage, and then once it fills the interleaver, it outputs everything into another FFT, and then you've got all your data, the range and velocity information. Okay, so, interleaver. And this is a neat little tweak that I made to my interleaver: usually with an interleaver you need to fill up the entire interleaver and then you can read it out, so you fill up all the rows and then read out all the columns. What I did with my interleaver is, you can set, I can't remember what the variable is, but you can actually cause it to read out more quickly. Let's say there are eight rows: instead of outputting all the data once it's filled up eight rows, it fills four rows, moves everything up, fills the last four rows, moves everything up, and then outputs all eight; and then it fills the next four, outputs all eight, and then fills four and outputs, and so effectively you get twice the output coming out of the interleaver. Which means that even though you might have a long integration period, like 10 seconds, we don't want to be waiting 10 seconds to update a pretty plot, we want to update it much more frequently, so you have your interleaver spit the data out more frequently. So the de-chirped signal goes into an FFT, you get your range information, that goes into the interleaver, you do another FFT, and then magically you get your velocity information. So now what I'm going to show you is the original audio demo that I had with the laptop here, and then that plot that I didn't show you is going to be the output of all of this, a 2D image that gives you the Doppler information [Music] okay, where are we, audio, tell me, and I will take Nate's wonderful advice again and use my laptop speakers and microphone.
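The whole range-plus-Doppler pipeline reduces to a tiny 2D example in NumPy (toy sizes, mirroring the 8-point transform on the slide; the column-wise FFT here stands in for the interleaver-plus-second-FFT, not the actual GRC block):

```python
import numpy as np

n_range, n_pulses = 8, 16

# Pretend the per-chirp range FFTs have already run: a single target sits
# in range bin 5, and its motion rotates the echo phase by 3 full cycles
# over the integration period (the complex rows the interleaver collects)
rows = np.zeros((n_pulses, n_range), dtype=complex)
rows[:, 5] = np.exp(1j * 2 * np.pi * 3 * np.arange(n_pulses) / n_pulses)

# The column-wise second FFT is the Doppler transform
range_doppler = np.fft.fft(rows, axis=0)
peak = np.unravel_index(int(np.argmax(np.abs(range_doppler))),
                        range_doppler.shape)
# peak -> (Doppler bin 3, range bin 5)
```

Two targets in the same range bin but with different phase-rotation rates would land in different Doppler bins of the same column, which is exactly the disambiguation described above.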
Let me just fix up the, see, before, this is the DC component actually wrapped around: the microphone on the audio card starts sampling at a different time to when the transmit side, the output digital-to-analog converter, started, so they're offset in frequency space because they're offset in time, and I need to adjust for that manually myself. You just do that, yeah, OK. All right, that should look familiar, and now let me show you the magic here, hopefully it'll work. All right, here we go. So this is the Doppler plot; you're going to see a little bit more of this, and I'm just going to show you some brief previous experiments after this. What I want you to think about here is that the waterfall behind it is showing you, effectively, virtual range over time, right, and in the Doppler plot the vertical axis is the range information and the horizontal axis is the velocity information. So, as I'm moving around, I'm actually causing the echo to change; it's surprisingly sensitive. See how we've got the strong DC part there on the left? That maps, vertically, to that strong bright spot at the top there. So what I'm going to do is, I might need to change the color, yeah, that's better. All right, let's see how this works. Remember, what you'll see is I'm going to be moving the target at a particular velocity, you know, up and down, a little slower, a little faster, and you'll see a bright spot up here, either on the left or the right depending on whether I'm going forward or backward, and the distance from the center point is going to imply how quickly I'm moving it. Let's see if this works. So there's my target, a bright spot, right? Notice that I moved it here and now it's settled. So I'm going to move it more, see, now it's on the left because I'm moving it up, and I'm going to stop, and it's going to come back, because I've stopped moving it. Now you still get the range return, see the bright spot in
the middle, on the center line, but I'm not moving it, so there's no Doppler. Well, actually there is, because I can't keep it totally steady, but it's that sensitive that even as it's waving here you can see it moving left and right. So if I go like this, you get the Doppler response, see? And that's just using audio. Now, apart from the multiple echoes, who can tell me why we are getting multiple echoes here? Wait, Skylar, hang on a second. Anyone else, with a hand up at the back? Yeah. To relay it, because we didn't have a microphone: what the two gentlemen were saying was that the laptops are so close that the audio is going to bounce off the bottom laptop, enter the microphone, but also bounce off the bottom laptop again and keep bouncing back and forth, so we get the multiple returns; the strongest one is going to be the direct, shortest path. The other thing is, look carefully at what happens to the bright spot in relation to the edges of this image as I move it. So I'm going to move very slowly to keep it in there, maybe a bit further out so we don't get all the returns; so now it's going to be on the left, and now I'm going to move it fast, and watch what happens. Who said that? That's right, it's aliasing. Why are we getting aliasing? Yeah, that's exactly right. What is happening here is that the pulse repetition frequency, how often we send out pulses, basically constrains the maximum unambiguous velocity that you can make out. Once you exceed that velocity, just like a normal signal above Nyquist over two (or Nyquist, if it's complex), it will wrap around, because you can't represent anything above that; it's exactly the same principle. So this is another constraint on the radar system that depends on the parameters you pick initially. As a radar designer you have to define what you want your maximum unambiguous range to be, and then you have to
balance that with your maximum unambiguous velocity, because you can't have both; it's got to be one way or the other. Yep, I need the sound on again, very good advice from the audience. So, any questions about this? No? Just a reminder then: remember how we were basically doing the interleaving and then the FFT? That plot that we saw is the output of that entire process, and each of the rows was the output of an FFT. So that's the audio. It's almost time, so if you're interested I'll show you an SDR version afterwards; I've got some directional antennas here. What I've done is take the bins that map to zero range, because light travels really fast, right? Even though I'm using, like, a one megahertz sample rate, one over one million is still going to be like 100 meters or 60 meters or whatever it is, so we wouldn't be able to demonstrate a change in range; but we can demonstrate Doppler, because Doppler will still work in the zero-range bin when something is nice and close. What I do is an inverse transform of that zero bin of the FFT, and then I put that to the speakers, and it makes something like a theremin, so you can wave something back and forth and turn it into a musical instrument. I'll show you that last, in case we run out of time; you can stick around if you want to see it. Right, back to the slides. OK, so that's stuff that you can do at home, and now I want to show you some stuff where it's used in practice and how you can actually decode that stuff as well. One example is CODAR, which is HF radar used to map the surface of the ocean and map currents; you can go on various universities' websites that have these set up and look at interactive live plots of ocean currents. The idea is that you send out an HF wave, it hits the crests of ocean waves, which reflect it back to the receiver.
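The range/velocity tradeoff just described falls out of a few one-line formulas. Here is a back-of-the-envelope calculator in the spirit of the notebook shown later in the talk; this is my own sketch using the standard radar relations, and the example numbers are made up, not the speaker's actual parameters:

```python
# Standard monostatic radar bookkeeping: pick a carrier frequency, a
# bandwidth, and a pulse repetition frequency (PRF), and the resolution
# and ambiguity limits follow directly.
C = 3e8  # speed of light, m/s

def radar_params(freq_hz, bandwidth_hz, prf_hz):
    """Return (wavelength, range resolution, max unambiguous range,
    max unambiguous velocity) for a simple monostatic radar."""
    wavelength = C / freq_hz
    range_res = C / (2 * bandwidth_hz)   # finer range needs more bandwidth
    max_range = C / (2 * prf_hz)         # echo must arrive before the next pulse
    max_vel = wavelength * prf_hz / 4    # Doppler (2v/wl) must stay under PRF/2
    return wavelength, range_res, max_range, max_vel

# e.g. a 1 MHz-wide sweep at 2.4 GHz, repeated 1000 times per second
wl, rres, rmax, vmax = radar_params(2.4e9, 1e6, 1e3)
# Note rmax * vmax == C * wl / 8, a fixed product: raising the PRF buys
# unambiguous velocity but costs unambiguous range, which is the tradeoff.
```

The product in the final comment is why the designer can't have both: any PRF choice trades one ambiguity limit against the other.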
And those ocean waves will also have a velocity component, which changes the phase, and once you do Doppler integration over time you get both range and velocity information for the ocean waves, which are distributed in certain ways. And they did something interesting, which is gating the transmitter. On an ordinary plot it looks like it's just a chirp, but if you look very closely at the right parameters you can actually see they gate the transmitter on and off to give it dead time to receive a return. The side effect of this is that when you take an FFT you end up with sidebands on your main signal, and you might think they're returns, but they're just a result of taking the frequency transform of this gated waveform; if you do the math they're like millions of kilometers away in terms of range, and the stuff you actually want to see is really close in. So that's what it looks like: you've got your main chirp, and hidden around the middle are those symmetric lines going out, the AM
sidebands that you can forget about. So, I showed this before: there are a bunch of HF CODAR stations running on the spectrum all the time, so you can pick any one of them; I basically talked about all that. And a shout-out again to Pieter and Moe Wheatley of RF Space. Pieter made these SDRs with all the antennas, really cool if you want to check them out after the talk, and they've used SpectraVue and his SDRs to produce these amazing plots where you're not actually looking at the ocean returns anymore, you're looking at returns from the ionosphere, because it's HF and it will still travel all the way up to the ionosphere and come back as a return. This is one of RF Space's tweets, really cool stuff, and this is SpectraVue producing the plot. So in this way you can use the signal in an unintended way to look at natural phenomena that are going on around you. Another very amazing SDR guy has done some incredible stuff: he's used this kind of UHF radar to map the surface of the moon, an amazing paper, and he's done really cool stuff with ionosondes. I tried my hand at this, and you also get those multiple returns. This is frequency over time as an ionosonde sweeps the entire HF spectrum; you can similarly de-chirp it, because it's just a chirp, and then get these multiple reflections from the ionosphere. We went out hiking, and after the hike my wife was very kind and just read her book in the car while I set my laptop up in the boot and received CODAR from around that area. You may recall I showed you this previously: the x-axis is time, the y-axis is frequency, so this is the CODAR over time, and you can see the AM sidebands again, and there's some information hidden in there; we're only interested in the center one. If you zoom in, in this case I don't think I had a GPSDO doing the disciplining, and you want to have a GPSDO so that the phase information is correct.
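The gating effect behind those AM sidebands is easy to reproduce: switching a tone (or chirp) on and off multiplies it by a square wave, and the spectrum of that product has symmetric sidebands at odd multiples of the gating rate. A tiny NumPy demonstration, with synthetic numbers of my own choosing rather than CODAR's actual parameters:

```python
import numpy as np

fs, n = 8000, 8000                       # 1 second of samples
t = np.arange(n) / fs
tone = np.exp(2j * np.pi * 1000 * t)     # stand-in for the radar carrier/chirp
gate = ((np.arange(n) // 80) % 2 == 0)   # on 80 samples, off 80 -> 50 Hz gating
spec = np.abs(np.fft.fft(tone * gate))
freqs = np.fft.fftfreq(n, 1 / fs)

def power_at(f_hz):
    """Magnitude of the FFT bin nearest f_hz."""
    return spec[np.argmin(np.abs(freqs - f_hz))]

# The carrier survives at 1 kHz, but gating adds symmetric sidebands at
# odd multiples of 50 Hz on either side (950, 1050, 850, 1150, ...),
# which look like extra returns but are pure artifacts of the gating.
```

If you convert those sideband offsets to range with the radar's sweep parameters, they land absurdly far away, which is how you know to ignore them.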
Because remember, if you're running your interleaver and the Doppler processing, you want to make sure everything is perfectly aligned and stable, otherwise you'll be introducing phase shifts that will actually corrupt your velocity information. So there's something going on there, and I didn't know what it was; that's range information, effectively, in the middle, so you're getting returns somehow. Then we went out one night and set up and did some captures with a GPSDO, and we had a long-wire setup and a custom-made dipole. I'm going to show you a video here. This is again the same kind of output from GNU Radio, using that same plotting sink, but modified so it also outputs a series of bitmaps. So this is everything that I've spoken about so far, exactly the same process, but applied to the HF CODAR returns that I've just received on my SDR. Now, the disclaimer is I have no idea exactly what we're looking at. Obviously there's range and velocity information in there, and you can see some cool stuff happening: there are these returns here, and in terms of velocity, which is on the y-axis, things are moving in directions away from and toward the receiver, and there are weaker responses out there. I don't know whether this is purely from the ocean waves or purely from the ionosphere; I meant to do the calculations but didn't get a chance, so if you know what's contributing to this I'd be really curious. But you can basically take the parameters of the radar system and calculate how far those returns are and what the implications are. And it's this information that they take in the CODAR system to figure out ocean currents from data gathered at multiple points, like you saw on that map before. So, you can generate cool plots. The last
thing I want to show you is the television version, with aeroplanes. This is passive radar again: with CODAR I was using someone else's signal, and in this case I'm using the digital ATSC terrestrial TV signal. Remember, like I said with the chirp, if you have a fixed thing that you're looking for, you want to run it through your filter. In the case of ATSC, happily, in the middle of the picture information there is a synchronization sequence of a fixed length, and it's documented, so this now becomes the filter; it's not FMCW anymore, but the standards-based synchronization sequence for ATSC that we end up filtering and correlating on, and this is in all the TV signals. What's nice is they use incredibly powerful transmitters that broadcast the signal over a broad area all the time. I had a setup at my house where I had a directional antenna pointing out to a highway; this is the view from our place, and there's the 280 highway in San Francisco that runs there. Cars are moving back and forth, so they might end up being decent radar returns; you can see how they're moving, one direction of traffic on top, the other on the bottom, and they're metallic, so we might get some returns. That's the receiver site; the transmitter site is over the hill, which is nice, because we don't get the direct path from the transmitter, we get the reflections from stuff, and the direct path is attenuated. This is where I want to illustrate the power of Doppler processing, because in this case Doppler, velocity, is on the y-axis and range is on the x-axis. What this is doing is correlating that known synchronization sequence in the signal, and the initial, strongest return from the filter ends up being in the center there, right, so you can ignore everything to the left, because that's all picture information, and then the stuff that's here, this is basically time zero in terms of the synchronization
signal. Now, if we weren't doing any Doppler processing, if we didn't care about velocity, we'd only have the range information; look at this range line, the green line, there's nothing discernible going on there. You will get reflections from static structures that are large, which reflect a lot of the signal, but with cars and airplanes the radar cross-section is so small that they can't send any appreciable signal back to that small directional antenna, so there's nothing there. Once you add velocity information, you can see the green, you know, we go from white to red to green, it's the rainbow color spectrum, and the black and the blue are down at the other end of your power range, still within the dynamic range of your ADC, but now it's much easier to discern stuff; so actually, that's a target right there. I'll go back to the beginning, because I think there was, look, did you see it? No, that's not on my display, let me try that one more time. At the beginning some funky stuff happens as the interleaver fills up, because there's still empty data in the interleaver, and then it settles. Just look here and you'll see little spots appear; I'm pretty sure they're cars on the highway that are reflecting the TV signal. There's another one there. They appear very briefly because, if you think of them like a mirror, you end up seeing the specular reflection off the car, and the car has to be at the right angle, the door or the windshield or whatever it is, both with respect to my receiver and the TV transmitter, to produce the reflection. Another important point here is that I have a GPSDO disciplining the oscillator in my receiver, but that's still not good enough; there is still some phase offset. So there's an additional series of blocks I have that measure the number of samples between each successive primary return of that ATSC synchronization sequence.
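The correlate-then-Doppler processing described here can be sketched end to end on synthetic data. This is a toy of my own construction: a random ±1 sequence stands in for the real ATSC sync, and the delays, amplitudes, and Doppler are invented to show a weak moving target popping out of the range-Doppler map away from the zero-velocity line:

```python
import numpy as np
rng = np.random.default_rng(0)

period, n_pulses, n_sync, delay = 512, 64, 64, 100
doppler = 0.05                                 # target Doppler, cycles per pulse
sync = rng.choice([-1.0, 1.0], size=n_sync)    # stand-in for ATSC's sync sequence

# Toy received signal: strong direct path at lag 0, plus a target return
# 20 dB weaker, delayed by `delay` samples, with sweep-to-sweep phase rotation
rx = np.zeros(period * n_pulses, dtype=complex)
for p in range(n_pulses):
    s = p * period
    rx[s:s + n_sync] += sync
    rx[s + delay:s + delay + n_sync] += 0.1 * sync * np.exp(2j * np.pi * doppler * p)
rx += 0.01 * (rng.standard_normal(rx.size) + 1j * rng.standard_normal(rx.size))

# 1) Matched-filter each pulse against the known sequence -> range profiles
profiles = np.array([np.correlate(rx[p * period:(p + 1) * period], sync, mode='valid')
                     for p in range(n_pulses)])
# 2) FFT down each range column (slow time) -> range-Doppler map; the moving
#    target appears off the zero-velocity row, at its true lag of 100 samples
rd = np.fft.fftshift(np.fft.fft(profiles, axis=0), axes=0)
```

The direct path dominates the zero-Doppler row, exactly as in the talk; masking that row out is what makes the weak mover visible.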
And there's a polyphase resampler, but it uses a long-double ratio; it's not a float, it's not a double, it's a long double, because there are that many decimal places required as it adjusts the rate to lock exactly onto the signal, or else the range information wanders; and then you have to track the phase of the primary return. It gets a bit messy, but it does the job, and you can see there's stuff going on in there. Now, to prove this a little bit more with the aeroplanes, we went out to this car park and parked there, in this park here, which happens to be almost right on the approach path for planes coming in to land at San Francisco, so the geometry there is really nice: we've got a TV transmitter elsewhere in the north of the bay, and we're down here receiving the reflections quite close to the aircraft. That's the kind of setup there; we're just using a normal TV antenna plugged into the USRP, and that's the stuff going on there. So I've got a couple of videos and we'll see some interesting things. So, that's a plane, see that? And it's aliasing, because we can't control the pulse repetition frequency, and so we can't control the constraint on unambiguous velocity, because that's defined by the ATSC spec. But you can see these returns that are coming off planes, and what's interesting is that in terms of range these ones are actually further out; we wouldn't see them, again, if we didn't do the Doppler processing, because we basically ignore the zero-velocity line, and these targets are moving, so they make themselves apparent on the rest of the plot. Let me go to the next one; some of them are more obvious than others. Yeah, so you can see that line there, and the reason why it's streaked like that: remember, I'm reading out of my interleaver faster than the integration period, so you basically end up smearing together, smoothing together, one integration period with the next,
and so you'll basically get a blending of velocity information. To the earlier question, I haven't got my head around that; it's some artifact of the system. Oh look, there's something there. Yeah, I have to think about that a bit more; it wasn't immediately obvious to me why that's there, any ideas from anyone? Yeah, so there's a return down there; some of them are really elusive. What was interesting is that the planes that are overhead are obviously going to be very close, and I think one pixel here is 30 meters, maybe 40 meters, so they would have been quite close, especially as they're flying overhead. But what really blew my mind later, and I can't remember which of these videos it is, unfortunately, it might just appear, is that there were dots that appeared further out, and they stayed there with slowly changing velocity. As it turned out, we looked over the bay, and of course, what's on the other side of the bay from San Francisco? Oakland. It was a plane that was just smoothly cruising in to land at Oakland, but because of the geometry it was headed in a slightly different direction than our planes coming in at SFO, and so it appeared differently on the plot, both in terms of velocity and range. OK, that would be something out there. So, any questions so far? You saw that streak there; I think this one might be a decent one. Question? Correct, yeah: if you're transmitting and receiving from the same piece of hardware, everything is inherently going to be synchronized, so you don't need to worry about using a GPSDO, and I'll show you that if you want to stick around. You mean that vertical line there? I think at the moment that's because a lot of artifacts happen during that disciplining process, not the GPSDO but the other one that I talked about, where it measures the number of samples between two of the primary returns and measures the phase; there's some additional
processing going on there, and as it's doing that very fine adjustment of the resampling it produces all these artifacts in the FFTs, and that's why you get crud. So, I don't remember the exact frequency of this television station, but there's a plethora of stations throughout the UHF band, and the frequency has implications for how the radar cross-section will look, so everything is interrelated there, and here we're obviously constrained by the station frequency and the spec. So anyway, that brings me to the end of the talk. If you want to hang around, I'll show you the SDR version; it'll just take a couple of minutes, or you can clap now and head to lunch, totally up to you. All right, I'll do it quickly. So here we've got, one thing I want to show you quickly, this is the live processing of the CODAR, right, the HF one, so this should look familiar from the video. What I wanted to show you here is in this plot: you know how, usually, when we take FFTs, we always take the magnitude and we always forget the phase information? This plot is showing you phase information only. Consider it like a waterfall, but instead of the magnitude it's phase. And remember how we had those interesting shapes coming up, and there are strong returns, so there's going to be some information there? This looks like noise, because there's nothing strong there, but amazingly you can see this phase structure in these range bins in the return. I'd never looked at a phase plot like that before, a phase waterfall, and it kind of looks cool. This is two megasamples per second. Oh, let me show you that workbook; it's somewhere, all right, here it is. So this is a Jupyter notebook: you basically put in the speed of light, put in your frequency, put in your bandwidth, which is effectively your sample rate, and then it'll calculate range
resolution. You put in your pulse repetition frequency; you can basically plug anything you want into the equations and it'll output all this information. So for a given frequency it gives you the wavelength and the range resolution, and for the given pulse repetition frequency, pulse duration, and bandwidth it will give you the unambiguous range and the unambiguous velocity, and then you can figure out the scales on all of these pictures from that. Let's see what's happening here; oh, it's running, cool. So what I've got is this flow graph here, and then, yeah, nothing's showing, which is a little bit, oh no, of course nothing's showing. Skylar correctly points out, thank you, another audience contribution, that the default configuration here is everything off, because I don't want to blow anything up. And I've got these two directional antennas here, one is the transmitter, one is the receiver, so I can put them next to each other, and I'm just going to do a very quick setup and put my bag in between; hopefully that won't affect things. I'm using a B200mini here in a custom 3D-printed case. We tried doing this, and OK, now, no guarantees. It works a lot better if you're using separate transmit and receive USRPs; for example, I did it with USRPs joined by a MIMO cable, so you can synchronize the clocks, but the actual boards are different, and they don't interfere with themselves as easily as a single unit does. So what I'm going to do is set the gain, I think I used about 55, same gain, I'll go around 33, and then I'm going to set, let me bring it over here, my amplitude of the signal, 0.7. And what we've got now, hopefully this is familiar to you: the spectrum is being rotated, so this is the FFT of the output of the de-chirping, and we've got the tone there, which is what the system is hearing; it's the DC tone, it's hearing itself, basically. Unfortunately that's
really high, and it shouldn't be that high, because we won't be able to resolve anything otherwise. OK, it might be too much. Yeah, oh, there it is; see how we've got the center line, which is actually offset? I haven't figured out why that's offset yet. But as I move my laptop toward and away from the receiver, you can see that even though it's at zero range, and it's still overloading, you still get velocity information, and that's basically how RF radars work. Now let's see, because I promised you music. Music! Let me plug in there. So it's a little bit finicky in that I need to do that, and then I need to change the offset here so that it ends up going into the zero bin, because that's the one that's being transformed. Do I have the audio actually on? Sorry, what's that? Thank you, more audience participation. Nate, what would I do without you all? [Music]
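The "theremin" trick demonstrated at the end has a simple signal-processing core: the zero-range bin of each de-chirped sweep, taken as a series across sweeps, oscillates at exactly the Doppler frequency, so it can be sent straight to the sound card. A hedged sketch on synthetic data; the sweep rate, Doppler value, and all names here are my own invented example, and in the real demo this time series would feed the speakers rather than a peak finder:

```python
import numpy as np

n_sweeps, sweep_len = 2000, 256
prf = 2000.0            # sweeps per second; doubles as the audio sample rate
doppler_hz = 440.0      # pretend the target motion produces an A4 tone

# Fake de-chirped sweeps: a very close target (zero beat frequency, so it
# lands in the zero-range bin) whose phase advances sweep to sweep
phase = np.exp(2j * np.pi * doppler_hz * np.arange(n_sweeps) / prf)
sweeps = phase[:, None] * np.ones(sweep_len)

range_profiles = np.fft.fft(sweeps, axis=1)   # per-sweep range FFT
audio = range_profiles[:, 0].real             # zero-range bin across sweeps

# That slow-time series is a tone at the Doppler frequency; playing it back
# is the theremin. Here we just locate the spectral peak instead.
f = np.fft.rfftfreq(n_sweeps, 1 / prf)
peak_hz = f[np.argmax(np.abs(np.fft.rfft(audio)))]
```

Waving a hand faster or slower changes `doppler_hz`, which is why the demo sounds like a theremin: pitch tracks hand speed.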
Info
Channel: DEFCONConference
Views: 43,228
Keywords: DEF, CON, DEF CON 2017, DEF CON 25, DEF CON, DC25, hackers, security conference, balint seeber, wifi village, wireless security, wifi
Id: psuEzxFJnZY
Length: 106min 31sec (6391 seconds)
Published: Thu Oct 19 2017