CEEN 545 - Lecture 14 - Time History Development

Captions
Hello everyone. In today's lecture we're going to introduce something that's pretty exciting. We've talked a lot about statistical modeling in this class, and we've done a lot with different sorts of data, but in this lecture we're going to introduce how we can generate and prepare actual earthquake time histories to use in a response analysis. A lot of different engineers in industry use these skills, whether they're structural engineers or geotechnical engineers, so hopefully you'll find the information you get from this lecture useful. One thing that I want to say right at the start: please, please, please do not think that the things you're going to learn in this lecture, and in coming lectures where we do some demonstrations with time history development, will make you an expert right off the bat. My goal is to train you and teach you enough to make you dangerous, but not enough to make you an expert. What you really need is to go and work under the tutelage and guidance of an experienced and reputable earthquake engineering mentor out in industry. To learn this stuff really well takes years of practice, experience, and building your own reputation, and the things you're going to learn in this lecture and the lectures to come on time history development will assist you in gaining some reputation within your future company and impressing that potential mentor, who will train you and teach you more and more. But please don't think that you're going to be an expert immediately when you walk out of this class. The things we're going to be talking about here are pretty sophisticated and advanced, and they take a lot of practice and a lot of criticism to fine-tune your skills and get better. So I want to just pause and point out how far you've come this semester. We've done a lot of stuff. You've learned a lot about seismology,
about the Earth, about earthquakes and different wave types. You've learned about ground motion parameters and how we can quantify an earthquake. You've learned about attenuation relationships and how we can use those models to predict ground motion parameters for some future earthquake at a site, in either a deterministic or a probabilistic manner. You've learned about different types of spectra: Fourier spectra, response spectra. All of these are really advanced topics, so you should be proud of yourself and see just how far you've come. Today's topic is going to be new to a lot of you, but it's going to be exciting, because we're going to start to make use of all of those time histories that have been recorded since the 1940s. Time history analysis is when we build some sort of response model, whether structural or geotechnical, and we use actual time histories in the model to see what's going to happen. Usually these models are a time-increment based analysis, or in other words they work in the time domain: the model steps through each time increment and computes the response of the different nodes of the structure or the soil. For instance, the picture you see on this slide is just an example of a dynamic SAP2000 model of some steel structure. When you look down at the base of each of the columns you see little springs; that's where the ground motion is going to come into the model, and it's going to shake the entire structure. So when we do a time history analysis, the goal is to use time histories that are representative of the potential design earthquake we're expecting at our site. But that's a problem, right? It's hard enough to try to predict ground motion parameters from some future earthquake that hasn't happened yet, but now try predicting an entire earthquake time history for an earthquake that hasn't happened yet. That's a really tall order. So one of the things that we commonly do today to try to guide our
development of time histories is to use the design response spectrum as a target for our various time histories. You could ask, well, why do we do that? Take a look at this picture. In this line you see that we have a design spectrum where the arrow is pointing; maybe this was developed from the IBC code, so that's what the code is saying is the standard we should be shooting for in designing whatever structure we're designing. Below it you see numerous response spectra from real earthquakes, a whole bunch of them. If you compare the response spectra from the real earthquakes with our design response spectrum, you can see that for the most part all of the real earthquake spectra fall below, and in most cases well below, our design response spectrum. So what would the problem be in this scenario? The problem would be that the time histories we'd be introducing to our model would be inadequate compared to the design spectrum, or compared to the level of seismic loading that the code says our structure should be designed for. Ideally we would want time histories that get as close as possible to matching that target, certainly across the periods of interest if not across the entire spectrum. To do this, there are really two main methods that we use today to develop time histories for these types of analyses. The first method, which we'll talk about briefly, is called time history scaling. This is a good method where we take real, raw time histories and just modify or tweak them slightly to get them to where we want them to be; I'll explain more when we get there. The second method is called spectral matching. This is a more rigorous, more analytical approach where we use a computer algorithm to modify real time histories to get them to closely match a target response spectrum. Let's
first talk about time history scaling. When I say scaling, really what I'm talking about is adjusting the time history's scaling factor, which scales its spectral accelerations across the board, and/or adjusting the time history's time step to shrink or stretch the periods of the response spectrum to meet some sort of target. Here's a great example. Let's say that a particular structure we're designing has a natural period, or a period of interest, of two seconds. I can go and select, say, 20 time histories, and each of the little gray lines you see here is the response spectrum from one of those twenty records. There's this big black line, which I'm now outlining in red (and which I'm going to have to erase, because it's confusing to have multiple red lines on there); that is the target spectrum we're shooting for, say from the building code or something like that. Okay, so the goal is: how do we pick time histories that are going to get us really close to our target codified spectral acceleration at our period of interest? Imagine that for every one of these little gray lines, the individual response spectra, we multiply or divide the entire response spectrum by a factor such that at the period of two seconds it corresponds with my target. In other words, we're going to adjust all of our response spectra so that they run through that exact same point at our period of interest. The reason we do that is so that we can hit our structure, in our model, with the right spectral acceleration at the critical period. But now we've got a challenge, though depending on who you're talking to, some people call it a challenge and other people call it a good thing. If you look at any other period along the spectrum, you can see that we have a really broad range of spectral accelerations to deal with, such that anywhere in this range we could expect to get a spectral acceleration at those different periods. Some would argue all that variability is a good thing, because it's introducing natural variation from earthquakes to our model. Other people would say, how in the world could we possibly interpret anything with all that variation in there? We would have no way to know, if something performs inadequately, whether it's because it got hit by an unrealistic ground motion or because something's wrong with the structure. How are we going to know? Both arguments have really good points. The idea is that if we take the median or the averaged response spectrum from all the different time histories we have in here (it's hard to see because I'm drawing over a red line with a red pen, so let's change the color to blue), it's going to draw something that looks like this, and that average response spectrum should be pretty close to our target response spectrum in the first place. So we're not so worried about individual response spectra; rather, the average of the whole suite we're looking at should be close to our target. What this slide is talking about is what I was explaining before: when we look at that smooth average spectrum, it should fall within some specified tolerance of the target spectrum. Of course we know it's not going to fall right on the target across all periods, but we'd like it to get as close as possible. Here's the deal: you can see that right around our target period, our average spectrum from all the individual response spectra is pretty close to the target spectrum, but if we reduce the period, all of a sudden we have some pretty big deviations. So it makes sense that we're not going to hit our target across all periods; that would be impossible and unrealistic to expect. The idea is to focus on a period range of interest. In other words, if this is my target period, then maybe I can draw a line here and here and say that within this window, my average response spectrum from all the individual response spectra is going to be within some specified tolerance of my target. A typical window used by a lot of engineers is the range from 0.2 times the natural period up to one and a half times the natural period, and the idea with this window is that we're trying to account for the possibility that we can't accurately predict the true natural period of our structure before we actually build it. Let's go ahead and review some of the advantages and disadvantages of time history scaling. One of the advantages is that it uses real earthquake time histories, and I have "real" in parentheses because these are the raw, true recorded time histories from actual earthquakes. That gives a lot of comfort to engineers: if I were to try to just make up a time history, which some people can do, I may miss some aspects of the seismic waves that we don't understand yet or don't know, but if we're using real earthquake time histories, then all of those fundamentals, all of that physics, is inherently built in. Along with that, these scaled time histories tend to retain their natural variance, which is in many ways a really good thing, because we know that earthquakes have a lot of variability and our ability to predict them with confidence is, well, stinky at best. When we have a wide range of spectral accelerations in our response spectra, that may be more representative of our true uncertainty in the problem, so a lot of people like the wider, natural variance that's built into these scaled time histories. It can also allow us to focus on one specific target period of interest, or even a window if we're interested in that. And of course, one of the most important things with this approach is that it's relatively easy to do, especially if you have the proper tools. It used to be hard, don't get me wrong; applying a factor uniformly across the time history in and of itself isn't bad, but picking the initial time histories to scale in the first place was really hard and time-consuming. Now we have some new tools to help us, like the NGA-West2 online database housed at the University of California at Berkeley. It's got some really cool search tools that we're going to demonstrate on Wednesday in our lecture, so with these new tools available to us, time history scaling is really not that bad. There are some disadvantages, though, that we need to talk about. The first disadvantage is that scaling, because there's so much variation, is going to require a whole lot more time history records than if we had records that were really tight and close to our target response spectrum. The more variation we have, the more time histories we need so that we can capture all the possible variation. And with that greater variance in the time histories, you can expect to get a greater variance in your computed response, which goes back to the earlier concern: what if some of your models show your structure performing inadequately? What are you going to point the finger at? Is it poor design, or did it happen to be something in the time history itself? Was the ground motion unreasonably large? Who knows. It can be really difficult to find the culprit in these instances. And finally, even though it's pretty easy to do, I didn't say it was quick. Doing time history scaling the right way, because we're using lots of time histories, can take a lot of time to do.
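To make the scaling bookkeeping concrete, here is a minimal sketch in Python. Everything numeric here is invented for illustration (the target spectrum, the three candidate spectra, the plus-or-minus 10% tolerance); in practice the spectra would come from a response spectrum calculation or from the PEER database.

```python
# Sketch of amplitude scaling: each record's response spectrum is scaled so
# it passes exactly through the target Sa at the period of interest, then the
# mean of the scaled suite is checked against the target over the
# 0.2*T to 1.5*T window. All numbers below are hypothetical.

T_STAR = 2.0          # period of interest (s)
TARGET_SA = 0.45      # target spectral acceleration at T_STAR (g), assumed

periods = [0.4, 1.0, 2.0, 3.0]           # common period grid (s)
target  = [0.90, 0.70, 0.45, 0.30]       # target spectrum on that grid (g)

# Hypothetical unscaled spectra for three candidate records (g)
records = {
    "rec_a": [0.50, 0.40, 0.30, 0.20],
    "rec_b": [1.10, 0.60, 0.50, 0.25],
    "rec_c": [0.70, 0.35, 0.15, 0.10],
}

def scale_factor(spectrum):
    """Factor that forces the spectrum through TARGET_SA at T_STAR."""
    sa_at_t_star = spectrum[periods.index(T_STAR)]
    return TARGET_SA / sa_at_t_star

scaled = {name: [sa * scale_factor(spec) for sa in spec]
          for name, spec in records.items()}

# Mean scaled spectrum, ordinate by ordinate
mean_spec = [sum(spec[i] for spec in scaled.values()) / len(scaled)
             for i in range(len(periods))]

# Check the mean against the target within the 0.2T-1.5T window
lo, hi = 0.2 * T_STAR, 1.5 * T_STAR
tolerance = 0.10  # +/-10% of target, an assumed tolerance
ok = all(abs(mean_spec[i] - target[i]) <= tolerance * target[i]
         for i, T in enumerate(periods) if lo <= T <= hi)
```

With these made-up numbers, every scaled spectrum passes exactly through the target at two seconds, but the mean of the suite misses the target at 0.4 seconds, so `ok` comes out false; that mirrors the point in the lecture that the match is only enforced near the period of interest and deviations elsewhere are expected.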
So for the homework assignments we're going to be having, it's not going to be too bad, but if you're doing this and getting paid for it in industry, you're going to spend quite a bit of time getting your time histories ready. Let's talk about the second method. Before I get to it, I do want to point out that there are other methods besides time history scaling and spectral matching for developing time history records for dynamic modeling. For instance, some experts and researchers use what are called Green's functions to develop completely synthetic motions. When we go and visit the engineers who helped design the seismic retrofit for the Utah State Capitol, you'll hear them say that the time histories they used in their models were all completely synthetic. I'm not a fan of synthetic time histories, for some of the reasons we've already discussed. Personally, I think there are a whole lot of things about earthquake ground motions that we really don't understand very well, and even though the process is pretty challenging, at the end of the day we're still basically taking a wave, adding some wiggles to it, and calling it an earthquake. I'm a lot more comfortable when we use actual earthquake recordings and ground motions, because I feel like at least that's grounded in reality and in nature. With that being said, there are benefits to the synthetic approach: we purportedly can synthesize and simulate motions from scenarios that we may not have a lot of data from. But then I would counter: if we don't have a lot of data from it, how do we know the synthesis is realistic or representative of what really would happen? There are also some other methods that deal with what we call physics-based prediction of time histories, where they try to simulate the development of time histories and the spread of seismic waves all the way from the rupture of the rock, and then as the waves attenuate to your site. They're
really sophisticated and advanced models, pretty amazing and cool, but they're still definitely in development and definitely a research thing right now, so I don't see them being used in practice or design any time soon. Okay, that was my little detour; let's get back to spectral matching. Spectral matching is where we take a real, initial time history in the time domain and modify its accelerations; we modify the acceleration time history using things called wavelets. In other words, if this axis is time and this is acceleration, and we have some earthquake time history, however it looks, it's going to plot some response spectrum according to that actual time history. We may come in and say, oh man, I have some target that looks like this; it'd be really nice if this point were up closer to my target. Okay, well, here's the deal. This axis is period (that symbol looks like a tau; sorry, this is period T) and this is spectral acceleration. So we know the period associated with that point, and we know we need to get it up. The idea is, what if we took another wave? We use what are called wavelets. These are just little guys that look like this; it almost looks like an echocardiogram, something you get from your heart. The point is that the spacing of the wavelet's oscillations is a function of the period in the response spectrum, and its amplitude is a function of the spectral acceleration in the response spectrum. So we can add or subtract these wavelets to the actual acceleration time history to get this point up closer to our target, at least within a specified tolerance, but we've got to do it in a way that is mathematically feasible and doesn't create a scary Frankenstein time history.
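To give a feel for what a wavelet adjustment looks like, here is a toy sketch. This is not the actual spectral matching algorithm used in practice, just the shape of the idea: a tapered cosine packet, oscillating at the spectral period being adjusted, gets added to the acceleration trace at a chosen arrival time. The seed motion, the wavelet amplitude, and the arrival time below are all assumed for illustration.

```python
import math

DT = 0.01        # time step (s), assumed
N = 1000         # number of samples (a 10 s record), assumed
T_ADJUST = 2.0   # spectral period whose ordinate we want to nudge (s)

def tapered_cosine_wavelet(t_center, period, amplitude, n_cycles=3):
    """Cosine at the target period under a cosine-squared taper envelope,
    centered at t_center and zero outside n_cycles/2 periods on each side."""
    half_width = n_cycles * period / 2.0
    w = [0.0] * N
    for i in range(N):
        t = i * DT - t_center
        if abs(t) <= half_width:
            taper = math.cos(math.pi * t / (2.0 * half_width)) ** 2
            w[i] = amplitude * taper * math.cos(2.0 * math.pi * t / period)
    return w

# Hypothetical seed motion: a low-level sine standing in for a real record
seed = [0.05 * math.sin(2.0 * math.pi * 1.0 * i * DT) for i in range(N)]

# Add a wavelet centered at t = 5 s to raise response near T_ADJUST
wavelet = tapered_cosine_wavelet(t_center=5.0, period=T_ADJUST, amplitude=0.02)
adjusted = [a + w for a, w in zip(seed, wavelet)]
```

A real matching code chooses the wavelet's amplitude, sign, and arrival time from the computed response of a single-degree-of-freedom oscillator at the target period, and iterates over many periods at once; this sketch only shows the single add-a-wavelet step.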
This idea of using wavelets to modify the acceleration time history was originally proposed by a couple of gentlemen named Lilhanand and Tseng in 1987 and 1988, but it really wasn't until the famous seismologist Dr. Norm Abrahamson developed a code called RSPMatch in 1993, and made it available to the engineering and seismological community, that this method really started catching fire and a lot of people started using it. RSPMatch was a useful program, but you see, Dr. Abrahamson is another one of these guys like Dr. Jack Baker; he's just a genius. He probably graduated from college when he was 10 years old or something. The fact of the matter is that RSPMatch was really clear to nobody but him, and he made no bones about that; he was straightforward and said, look, I'm happy to share this code with you, but I don't really have time to write documentation for it or to debug or troubleshoot your problems with the software I create. So there were a lot of spin-offs of RSPMatch and different versions developed by different individuals. The most recent update that I'm aware of was done by Hancock and others in 2006, so it's been about 10 years; they developed a modification of RSPMatch that's a little more stable and, I think, a little faster. The advantage of spectral matching is that we can take response spectra that look like the one we drew here, and instead end up with something that follows our target really, really closely. So what's the advantage of that? Well, remember, the whole point of doing numerical modeling of our structure or our soil is to try to hit it with a time history that is representative of the design ground motions the code says it should be capable of resisting, and then to look at the system for flaws or defects. When we do time history scaling, the variability in the time histories we introduce can be so large that looking for the source or the cause of defects can be really difficult. But when we have time histories that are really close to our target, all of a sudden, if we see a flaw or a defect appear in our system, we can say, okay, this may not necessarily be due to the time history; I think there is a flaw or a defect in our structural system. So the main advantage of spectral matching is that we're trying to reduce the variability in the response. And why would using spectrally matched time histories reduce the variability in the response? Well, look: I have three time histories plotted with my target, and look how close the three time histories are to my target. If I plot the upper and lower bound from all three of these response spectra, they're practically right on top of my target spectrum. That basically means that when I run my simulations in my model, if I think of my model as a single-degree-of-freedom oscillator, it's still getting hit by the same peak acceleration in every single one, across all periods of the spectrum. Now, the individual motions are going to be different, true, but the peak motions are going to be very similar between all three records, and that's really what we're after. So let's review the advantages and the disadvantages of spectral matching. In terms of advantages, we have less variability. Some people would call this a disadvantage, but again, if the goal of your modeling is to zero in on potential flaws of your structure, and how those flaws relate to the design spectral acceleration the code is telling you to design to, then less variability could be a good thing. Because there's less variability in the response spectra, it usually requires fewer ground motions. Where a code may require at least seven time histories for scaling, the code used to require as few as three time histories for spectrally matched ground motions. I'm not sure if that provision still stands; they may have just uniformly said you need at least seven motions regardless of which approach you use today. And like we said earlier, sometimes it's easier to spot design flaws in our structure or our system when we have less variance in the input motions. Now the disadvantages. When we start adding or subtracting wavelets to the acceleration time history, we can get some pretty unrealistic and wonky time histories. I like to call these Frankenstein motions, because while we didn't synthetically create the motion from scratch, we took a real motion and we may have altered it beyond recognition, to the point where it's not a motion that we suspect any real earthquake would actually produce. That's especially true, by the way, if we're matching to the uniform hazard response spectrum, for the reasons we discussed in the conditional mean spectrum lecture: if I force an earthquake time history to match a uniform hazard response spectrum, that's a tall order for nature to produce. Another disadvantage is that if we're modifying the actual time history of the earthquake, some of the valuable aspects that were inherently preserved in the time history might get removed. A great example: let's draw a time history of, say, velocity, and let's say that I have a big pulse at the beginning of my velocity time history, a directivity pulse. I think that might be important; if I suspect that my structure could be exposed to a forward-directivity pulse, then I might want such pulses in the time histories I use in my modeling. But one of the things I commonly see is that after spectral matching, by the time we're done, that velocity pulse might look something like this: it gets filtered out or damped or otherwise modified to the point that we've essentially removed the pulse from our record. If you chose that record because you like the pulse that's in it, then you want to keep your pulse. So we want to be careful with our spectral matching to try to retain the aspects of the time history that are valuable to us. And again, similar to time history scaling, spectral matching can take a lot of time to do correctly. You're going to see that it doesn't take a whole lot of time to run the spectral matching algorithm, but when you start asking whether a result is a feasible, usable earthquake time history, you might find yourself throwing out a lot of the time histories you're developing. That's really what can take a lot of time: running, rerunning, and rerunning again until you get a time history that you feel you can use and rely upon. And finally, spectral matching is pretty much a black box; very few people, myself included, understand very well the ins and outs of what's going on in the spectral matching algorithm. As with any black box, it's pretty easy to screw up and do something wrong if you don't know what you're doing. I've seen this time and time again as a third-party reviewer for time histories developed for different applications, and it's rough. So part of the challenge in developing good time histories is selecting appropriate time histories to use, and then knowing when to throw out a bad or wonky time history. Let's talk about some of the factors we should consider when we select time histories for our initial modeling. To be clear, I'm now talking about selecting the seed time histories to use in either my scaling or my spectral matching: we need initial time histories, so which ones are we going to pick? So here are some
things that we should consider when we select time histories. One, of course, we would want the recorded earthquakes to come from magnitudes similar to the earthquake magnitude associated with our design earthquake, typically within plus or minus one on the moment magnitude scale. It would be nice if they were closer than that, but we'll take what we can get. Distance, of course, is important: we typically want the source-to-site distance to be within plus or minus ten kilometers of what our site is from the anticipated earthquake source. We'd sure like the faulting mechanism to be similar, but this is where it gets tricky. Say I'm in Utah; the Wasatch fault is a normal fault, so I go to the PEER database, the NGA database, and say I want time history records from normal faults. You turn on that filtering criterion and guess what, you're not going to get a whole lot of time histories, because we don't have a whole lot of recorded normal faulting events. So in some cases this isn't going to be as important. If I'm dealing with reverse or thrust faults, I think it's a little more important, and certainly for subduction zones, though the NGA database does not have any subduction zone time histories in it. If I were designing for a subduction zone event, I would want to select time histories from a subduction zone source, because subduction zone events have much longer durations than crustal faulting events. We would also like the spectral acceleration at our target natural period to be within about twenty to thirty percent of the target. This one, though, is not quite as important as you'd think; when we do spectral matching, as you'll see, it can be further away, but typically the closer the earthquake response spectrum is to our target initially, the better the match. Now, this one's pretty important: we want to make sure that our ground motion was recorded on the same type of soil or the same type of rock, and that's usually based on site class. Remember, this is Site Class A, B, C, D, or E; we can filter the recorded ground motions by site class, and that way we can specify to get ground motions of just the same site class as our site. Finally, we can specify whether or not we want to look for directivity effects. In the NGA database these are referred to as pulse motions, meaning whether or not your time history has a pulse in it. As I've mentioned a few times before, I really like to use time histories from the PEER NGA database, because they've undergone a lot of review; a lot of people have looked at them and vetted them, and the search functions they provide are very handy. This used to be their database (I'm not sure if it's still active); the new NGA-West2 database is this one that I'm circling right here. We've already introduced and talked about it, but that's the database you go to to search for earthquake time histories from the NGA-West2 crustal source database. To access this database you do have to create an account and sign in; it's free, there's no charge, they just want to track who's using their database. Now, this is interesting: how can you tell if your time history is good or not? We talked about picking and choosing the seed time histories, the initial time histories, at the beginning of your analysis. Now that you're done with the analysis, you've created some time histories that are maybe close to, or match, your target response spectrum. How do you know whether the time history you created is a good time history? How do you know whether it's a keeper? I was at a conference in Sacramento, California in 2008, and Norm Abrahamson was doing a question and answer session, and somebody asked him a very interesting question.
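The screening criteria listed a moment ago (magnitude within plus or minus one, distance within plus or minus ten kilometers, matching site class, a pulse if directivity matters) can be sketched as a simple catalog filter. The records and metadata below are invented for illustration; in practice these fields would come from the NGA-West2 flatfile or the online search tool.

```python
# Sketch of seed-record screening as a filter over a (hypothetical) catalog.
design = {"mag": 7.0, "dist_km": 15.0, "site_class": "D", "want_pulse": True}

catalog = [
    {"name": "rec_1", "mag": 6.6, "dist_km": 12.0, "site_class": "D", "pulse": True},
    {"name": "rec_2", "mag": 7.4, "dist_km": 40.0, "site_class": "D", "pulse": False},
    {"name": "rec_3", "mag": 5.2, "dist_km": 18.0, "site_class": "C", "pulse": True},
    {"name": "rec_4", "mag": 7.2, "dist_km": 22.0, "site_class": "D", "pulse": True},
]

def acceptable(rec, design):
    """Apply the rules of thumb from the lecture: magnitude within +/-1,
    distance within +/-10 km, same site class, and a pulse when directivity
    is a concern for the site."""
    if abs(rec["mag"] - design["mag"]) > 1.0:
        return False
    if abs(rec["dist_km"] - design["dist_km"]) > 10.0:
        return False
    if rec["site_class"] != design["site_class"]:
        return False
    if design["want_pulse"] and not rec["pulse"]:
        return False
    return True

candidates = [r["name"] for r in catalog if acceptable(r, design)]
```

With this made-up catalog only the first and last records survive, which reflects the lecture's warning that strict criteria (for example, requiring normal-faulting records) can thin the candidate pool quickly.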
They asked: Dr. Abrahamson, when you do spectral matching, how many of the time histories that you develop do you end up throwing out? This is the creator of the software, the guy who probably knows more about spectral matching than anybody alive today, and his response was: I throw out about 80% of my spectrally matched time histories. You could hear the whole audience just go whoa, because if the master is throwing out 80% and only keeping 20% of the time histories he develops, well, most of the people in that room, I guarantee, weren't throwing out 80% of their time histories. I think his point was: maybe you should put more thought into what you're calling a good time history, and make sure you're not giving your clients garbage. Why is this so important? Because the time histories we develop should be as representative of real earthquakes as possible. The moment we start to deviate from what we know is real is when we're starting to extrapolate and just flat-out guess, and we want to keep that to a minimum. So here are a couple of things that I was taught to look for when evaluating your time history results. Again, you're not going to find any of this written down in a journal article or a manual; these are the kinds of things you learn under the training and tutoring of someone who's a lot smarter or more experienced than you, and that was certainly the case for me when I worked at Kleinfelder. One of the first things they taught me was to plot the acceleration, the velocity, and the displacement time histories basically right on top of one another. So you have your three plots: one is acceleration, one is velocity, one is displacement, and all have the same time scale. The idea (and this is going to get ugly) is that you want to compare how the velocity and the displacement time histories compare with one another, but then what you want to do is plot the spectrally matched time history on top of the same plot, something that maybe looks like this. You want to plot the pre-matching and then the post-matching time histories on the same plot, and that way you can see very clearly where there are deviations, in other words how the spectral matching caused your time histories to change. What we're really looking for is, one, we want to make sure that significant peaks in our time histories aren't drastically shifted or altered in a way that fundamentally changes the ground motion. Number two, we want to make sure that all the desired aspects of the ground motions, such as directivity, are still there after the spectral matching. Going back to the earlier example, if my initial velocity record had a big pulse in it like that, I want to make sure that my spectrally matched time history also has somewhat of a pulse there as well; I want to maintain and keep that ground motion pulse in the record. Finally, we want to compare the pre- and post-matching Arias intensity plots. In SeismoSignal, when you look at your ground motion parameters, it gives you an option: you have actual Arias intensity, which has units associated with it, or percent Arias intensity, and what you want to plot is the percent Arias intensity. This is going to be really similar to what we already did with the other plots; in fact, a lot of people like to just line it up as a fourth plot, with time on the same scale, but on this one the vertical axis is percent Arias intensity. Arias intensity is usually represented with a capital I sub a. The idea here, again, is that we're comparing the pre- and post-spectral-matching Arias intensity, and Arias intensity, when you plot it in terms of percent, is going to vary between 0 and 100%, so all it is
is an indicator of the energy buildup in your earthquake and it eventually will top out at a hundred percent what we want to see when we look at the post spectral matching you know you might see a little bit of difference something like that we want to see that the area's intensity didn't deviate too much because really what areas intensity is indicative of is the energy run-up in our ground motion it tells us where in the earthquake is accruing energy some of you may be wondering what the world I'm talking about when I say Arius intensity go back to the ground motion parameter lecture and I talked about what areas intensity is I mean really areas intensity all we're doing is are measuring the area beneath each one of these cycle loops and we're essentially summing up that area so the more loops there are on each side of the x-axis the more energy is being emitted by the time history so you know this this run up here is just like the integral or the accumulation of that energy that's what arias intensity is so those are some of the things we look for so some actual examples for instance here's some plots I created so these plots just show a couple velocity plots so you can see I'm in the dark black line that was my spectrally match time history and then the little light dashed line that was the pre spectral matching velocity time history so you can see you know in this instance I have some deviation let's change my pen color you say I have some deviation here I have some deviation there I have a lot of deviation here and here you know when you start to see big deviations like that that's not a good thing also you know in my initial record I had a peak down here and that lined up well with that peak so that's good but then I had a big peak up here but my spectrally match time history doesn't peak till it's over here so I actually had a time shift in my in my Peaks and so one could argue that we fundamentally changed the time history again so this is why you know I 
call this stuff bad that would be an indication that yeah I would probably throw this time history out and try again now here's an example down here of a good match so you can see that the peaks they line up pretty well across the matched and the pre-match velocity time record you know there's there's distance yeah there's always gonna be some distance between them the point is you know you want to use your judgement and say you know that's not too bad certainly you could see how that deviation of that distance is in there as bad as that deviation from the blocks above so you know if it were me I would say yeah you know that's a pretty good match I think I could keep that time history so here's a couple of tips in the coming week here on your homework you're gonna be running a software called seismo match on Wednesday I plan to do a demonstration with seismo match and it would be really helpful if you guys had your computers with you and we're ready to go to help you know to follow along in that demonstration a couple of things that are going to help you in seismo match you can specify the tolerance the tolerance is how much deviation from your target spectrum you can allow if you're developing a lot of time history say see seven sets of time histories then you can have a pretty loose tolerance twenty to thirty percent and you're going to be fine but if you have you know a smaller number of time histories you're developing then you might want that tolerance to get tighter maybe something around ten to fifteen percent and I'll tell you though that the tighter your tolerance is the longer it's going to take for you to run your analyses that you know not much longer it'll just be a few minutes but the less likely you're going to get a convergence on your spectral matching and that leads me to my second tip a lot of people make the mistake where they try to match the time history to the entire response spectrum in one run and that's a really tall order for the 
algorithm to do I usually prefer breaking up your analysis into two different passes so in the first pass specifying that your go to match just between the periods of point zero one second and one second and and everything greater than one second you're not worried about that you're telling the program don't worry about that don't match that so you're going to run that analysis and you're gonna get your spectrally matched time history that's gonna be past number one then you're going to take that that first matched time history and use it as the input for your second pass where now this time you're gonna specify you know your your larger periods huh doesn't need to be ten seconds you can match it out to four seconds or whatever your response vector means you're interested in but the point is that in the second pass you are going to increase your period window or your period range for your matching so this range from point zero one to one second is really going to get matched twice and the reason is if you look at any response spectrum you know point zero one second is really close to zero and one second is out here generally this is the portion of the response spectrum where you see the curve and it's coming around the top everything greater than 1 second is usually pretty logarithmic and just kind of spreads out so because we have this curvy section up here that's why we're doing two passes on it because it's it's kind of hard for our computer program to try to match the response spectrum to that curvy portion so in order to do this just be aware that you may need to get into your default settings with an RSP match too and to specify these periods so a final word of warning software like seismo match and there's other software programs out there that do spectral matching they've been simplified to the point that almost you know anybody with a mouse and any basic understanding of a time history even is can come in and do a spectral matching job but if they treat it 
like a black box and they don't really have an idea of what's going on or the dangers or limitations of spectral matching then they it's like having a gun if you don't know what you're doing it's really easy to hurt yourself or to hurt others so you know again the things I'm teaching you in this class they're gonna be useful to you and and it's going to be really nice for you to have this understanding but I strongly recommend that you don't go out and start applying this stuff in practice and trying to sell these services unless you get under the guidance of some mentor that has some reputation and experience and they can give you some training and counsel on how to do all this stuff properly so with that that's the end of this lecture it's been fun thanks for watching and feel free to leave any comments down below or go ahead and shoot me an email and I'll be happy to respond have a great day
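The percent Arias intensity (Husid) curve discussed in this lecture is straightforward to compute yourself. Here is a minimal sketch in Python; the function name, the synthetic record, and the use of a simple cumulative sum for the integral are my own illustrative choices, not taken from SeismoSignal or any other tool:

```python
import numpy as np

def percent_arias_intensity(acc, dt):
    """Husid curve: cumulative Arias intensity as a percentage (0-100%).

    Arias intensity is proportional to the running time-integral of the
    squared acceleration; the leading constant pi/(2g) cancels when we
    normalize, so the percent curve needs only the acceleration trace.
    """
    energy = np.cumsum(acc ** 2) * dt      # running integral of a(t)^2
    return 100.0 * energy / energy[-1]     # normalize so it tops out at 100%

# Synthetic example record: a decaying sinusoid sampled at 100 Hz
dt = 0.01
t = np.arange(0.0, 20.0, dt)
acc = np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 2.0 * t)
husid = percent_arias_intensity(acc, dt)
```

To use this the way the lecture suggests, you would plot `husid` against `t` as the fourth stacked plot, once for the pre-matching record and once for the post-matching record, and visually compare the two energy run-up curves.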
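For the stacked acceleration/velocity/displacement comparison, you need velocity and displacement traces consistent with each acceleration record. A minimal sketch under simplifying assumptions: evenly sampled acceleration, zero initial conditions, and plain trapezoidal integration, with no baseline correction or filtering (which real processing tools apply and which this sketch deliberately omits):

```python
import numpy as np

def integrate_record(acc, dt):
    """Integrate an acceleration record to velocity and displacement.

    Uses trapezoidal integration with zero initial velocity/displacement.
    Without baseline correction, long-period drift can accumulate, so
    treat this as a quick-look tool, not production processing.
    """
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) / 2.0) * dt))
    disp = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) / 2.0) * dt))
    return vel, disp

# Sanity-check example: constant 1 m/s^2 acceleration for 1 second
dt = 0.001
acc = np.ones(1001)
vel, disp = integrate_record(acc, dt)
```

With pre- and post-matching acceleration records run through the same function, you can then lay out the three quantities as subplots sharing one time axis and overlay the matched trace on the original, as described above.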
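One of the "bad match" symptoms described above, a time-shifted peak, can also be screened numerically before you eyeball the plots. This is a rough heuristic of my own, not a published acceptance criterion, and the 1.0-second threshold is purely illustrative; engineering judgment on the full plots still governs:

```python
import numpy as np

def peak_time_shift(pre, post, dt):
    """Time difference (s) between the absolute peaks of two records."""
    return abs(int(np.argmax(np.abs(post))) - int(np.argmax(np.abs(pre)))) * dt

# Synthetic example: two Gaussian velocity pulses, peaks at t = 3 s and t = 5 s
dt = 0.01
t = np.arange(0.0, 10.0, dt)
pre = np.exp(-(t - 3.0) ** 2)   # pre-matching record, peak near 3 s
post = np.exp(-(t - 5.0) ** 2)  # post-matching record, peak near 5 s
shift = peak_time_shift(pre, post, dt)
flagged = shift > 1.0           # threshold chosen only for illustration
```

A flagged pair would be a candidate for the "throw it out and try again" bin; an unflagged pair still needs the visual peak-by-peak and Arias-intensity checks.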
Info
Channel: Office Hours
Views: 16,100
Keywords: kevin franke, office hours, CEEN 545, geotechnical earthquake engineering, time history development, time history scaling, spectral matching, SeismoMatch
Id: 2lC1wMq4c_w
Length: 55min 57sec (3357 seconds)
Published: Fri Feb 24 2017