The World After Silicon - From Vacuum Tubes to QUANTUM

Captions
The transistor was first patented in 1925, but it wasn't until 1947 that Bell Labs created a practically implementable device, for which its three inventors were granted a Nobel Prize in Physics in 1956. In the following year, Egyptian engineer Mohamed Atalla, also at Bell Labs, proposed a new method of semiconductor device fabrication: coating a silicon wafer with an insulating layer of silicon oxide so that electricity could reliably penetrate to the conducting silicon below. This was the birth of the semiconductor industry as we know it today. Two years later, Atalla developed the metal-oxide-semiconductor, or MOS, process, which he believed could be used to create a field-effect transistor, or FET. The MOSFET has since become the most widely manufactured device in history. In 1963 the CMOS, or complementary MOS, process was invented at Fairchild Semiconductor, and in 1989 a team of researchers at Hitachi created the fin field-effect transistor, or FinFET, which is currently used in processors like the Ryzen 3000 series. It was precisely a former Fairchild Semiconductor employee, Intel co-founder Gordon Moore, who predicted that the number of transistors in integrated circuits would double every year. He revised that observation ten years later to a cadence of every two years, and more recently, in 2015, Gordon Moore stated that he expected the so-called Moore's law to be dead within a decade. Will transistors really stop scaling? What happens after one nanometer? And what materials can replace silicon in the microprocessors of the future?

This video is sponsored by PCBWay. PCBWay offers quick-turnaround PCBs with custom specs for low-volume production or prototyping, as well as highly specialized precision PCBs and large-scale production. Get 10 free boards with your first order over at PCBWay.com, link in the description.

The first thing we should address is whether Moore's law is indeed dead. No, it is not. But is it slowing down, as many claim? Well, if we are talking strictly about density, then the answer is also no: Moore's law will continue for several years. But if we talk about Moore's law as a measurement of steadily increasing computing performance, as is commonly done today, then yes, it is indeed slowing down, but mostly because of bottlenecks in memory and overall fabrication costs, and not necessarily because of transistor density. General computer performance is a byproduct of Moore's law, not the law itself, which is not even a law but an observation. It's important to remember that the rate of technological progress is controlled primarily by financial realities, and the financial state of the world doesn't stay constant. We've gone through periods of stability and incredible prosperity, but also of wars and financial crises; there are bubbles and there are recessions. As a consequence, the rate of technological change varies over time. Just because there is a slowdown in technological advancement now doesn't mean that will always be the case; conversely, technological innovation typically leads to an increase in prosperity.

In a recent presentation at the Hot Chips conference, AMD CEO Lisa Su talked about Moore's law, and particularly about how much it has historically dictated performance increases. In this slide from that presentation we can see how Moore's law is now at a three-year cadence when it comes to transistor density; that's the triangles plotted here. This other slide shows the microarchitecture techniques, with AMD claiming their process technology was responsible for only 40 percent of the performance speed-up over the last 10 years.
Even though I don't have the numbers that AMD used to produce this slide, I feel like it is wildly biased towards the chip-design side of things. Traditionally, process technology accounts for closer to 90 percent of the pie when it comes to performance speed-up over the years, and the last 10 years were no exception.

In my last video I talked about how new advancements in packaging, in concert with data-centric architecture designs, will bring about massive improvements in performance in the next couple of years. In today's video we will be focusing primarily on the other side of the coin: the process and the materials. We will look at the near-future changes and also into the somewhat distant future, including quantum computing. As it stands, this is the current roadmap from the main fabs. 7-nanometer is out now, with Ryzen 3000 being the most prominent example as far as our niche is concerned, and it's still using good old FinFET transistors. 5-nanometer is scheduled for production next year, both at TSMC and Samsung, using FinFET transistors but now with EUV lithography; Ryzen 4000, or Zen 3, will not be on this node, instead coming out on a refined version of 7-nanometer. 3-nanometer is expected to be in risk production in 2021, both at TSMC and Samsung, and it marks the introduction of a new type of transistor to replace the FinFET: the gate-all-around FET, which has been in development for several decades and is now ready for large-scale production. This transistor goes by different names depending on who you ask, like nanowire or nanosheet transistor. Without going into too much detail, the idea is to create a three-layer channel completely surrounded by the metal gate in order to better control the leakage of electrons at such a small scale. The FinFETs we have today don't work properly at this scale, so this advancement in transistor design was necessary to keep us pushing forward.

If you've been watching my videos for a while, you'll know that 7-nanometer, 5-nanometer, et cetera are just marketing terms; these process names bear no connection to the actual physical size of transistors. But I'm sure you've wondered how small we can make a transistor. From our current knowledge of physics, the only real limit is the size of a single atom, and if that's the only limitation, then there's probably at least another twenty to thirty years' worth of scaling down left. Then why are companies like Intel and AMD saying that Moore's law is coming to an end? Well, that's the reality within the current context: the current fabrication processes, the current materials and, more importantly, the current economics. I've talked in past videos about the S-curve. Moore's law can also be modeled as a learning curve, except that the learning curve for Moore's law is linear, provided the two original conditions are met: Gordon Moore stated that every year the number of transistors would double and the cost of making chips would also go down. But remember, Moore adjusted the projection in 1975 to every two years instead of every year, then in 1997 it was adjusted down again to every 18 months, and like we saw earlier, today we are closer to three years.
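To make those cadences concrete, here's a minimal sketch of how much the doubling period changes the outcome over a decade. The one-billion-transistor starting point is an assumed, illustrative baseline, not a figure from the video:

```python
# Illustrative only: how much the doubling cadence matters over one decade.

def transistors(start: float, years: float, doubling_period: float) -> float:
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

start = 1e9  # hypothetical 1-billion-transistor chip as a baseline
for period in (1, 1.5, 2, 3):  # the 1965, 1997, 1975, and present-day cadences
    print(f"doubling every {period} years -> "
          f"{transistors(start, 10, period):.2e} transistors after 10 years")
```

Over a single decade, the gap between a one-year cadence (about 1.0e12) and a three-year cadence (about 1.0e10) is two orders of magnitude, which is why the cadence matters far more than the mere existence of exponential growth.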
That's the interesting thing when it comes to transistors: Moore's law existed long before Moore observed it. If we look beyond MOSFETs and separate the devices into several eras, let's call them, we have 1941 to 1955 as the vacuum-tube era, 1955 to 1966 as the discrete-transistor era, and then 1966 until today as the integrated-circuit era, and this is where Moore's law takes place. If we chart the cost per transistor over time, adjusted for inflation, guess what: the learning curve of the semiconductor industry has been linear since its inception, and it will continue to be that way. What is happening is that people are stuck on this idea of the silicon transistor and its limitations. Imagine, back in 1955, someone saying that the industry was going to stagnate because vacuum tubes had reached their limits. Sounds a bit short-sighted, doesn't it?

Costs are influenced by the current financial and political climate, and what AMD and Intel are referring to in these recent presentations is the point of diminishing returns in semiconductors. As things stand, we reach a point of diminishing returns where increasing the investment makes the chips more expensive rather than cheaper. You can imagine this as us running out of cheap energy sources here on Earth and having to invest massive energy resources to go and get more energy from, say, mining asteroids in space: we would be using too much energy for very little energy in return. And if we have to go to asteroids that are too far away, we go from diminishing returns to negative returns, spending more energy getting there than we get back from mining these hypothetical asteroids. Going down to 7 nanometers has been an enormous expense just to get a small jump in performance; going down to 5 nanometers will be even more expensive, again with diminishing returns; and 3 nanometers and then 1 nanometer may become prohibitively expensive for most companies, given that the returns will be minimal if we continue to do things as we have been doing so far.

So that sets the stage for what we will talk about next: what does the future hold? We went from vacuum tubes to discrete transistors to integrated circuits; what will the next era be like? TSMC suggests that being more creative with packaging, like using 3D stacking, will guarantee high returns on these growing investments, and that's where we left off in my last video. At Hot Chips, TSMC talked about its chip-on-wafer-on-substrate integration, or CoWoS. This thick boy is similar to what I discussed in my last video, although it is a much simpler chip than what I was proposing a couple of weeks ago. Going down to 5 nanometers, this is what we will see: chiplets and 3D stacking with data-centric designs, as I discussed in that video. Somewhere between 5 nanometers and 3 nanometers we will see something that I also discussed in the past, and that is massive amounts of on-chip memory, with one gigabyte of cache starting to become the norm; this is your Level 4 cache. The most significant change here will be that HBM will now be stacked on top of, rather than next to, the logic die. This period will see a maturation of EUV, and a new generation of lithography called high-numerical-aperture EUV, or high-NA EUV, will be introduced in fabs. The gate-all-around transistor will also start to replace FinFETs in these next five years.

So while these short-term changes might not seem too drastic, things will get very interesting very quickly after 3 nanometers. 2025 will see the introduction of the 2.5-nanometer node, which will bring innovations like the vertical gate-all-around transistor and up to 4 gigabytes of on-chip L4 cache. That's right, four gigabytes of cache on chip! And thanks to advancements in interconnects and, of course, new types of memory, we will enter a stage where memory and logic will converge.
There are several types of next-generation memory being worked on that are candidates for this integration: spin-torque RAM, phase-change memory, resistive RAM (which has been the subject of a lot of attention recently), conductive-bridging RAM, and ferroelectric RAM. Some of these, like spin-torque, are already being used in products today, and they will become more prevalent as we combine logic and memory. There are of course more exotic types of memory, like DNA-based memory, but those will probably have applications in mass storage rather than on-chip memory. It's too early to say which of these will win out, but my guess would be that MRAM, or more specifically STT-MRAM, will be the one to bridge on-chip SRAM and logic.

Sometime between 2026 and 2030 we will transition from the 2.5-nanometer node to the 2-nanometer node and then the 1.5-nanometer node, and I expect that carbon nanotubes will be used to make transistors on these nodes. In 2028 we will see mobile CPUs reach their frequency wall at around 4 gigahertz, after which point frequencies will begin to regress as mobile CPUs increase core counts. For reference, the A12 in Apple's current phones has a max frequency of 2.5 gigahertz, so we will see a gradual increase over the next 10 years to around double the frequency of today's mobile chips. During this period we will probably see the gate-all-around, or nanosheet, transistor design reach its limits, so we will see the start of stacking nanosheets and a move to the 3D-stacked nanosheet transistor; the tentative name for this transistor right now is CFET. By 2030 we will finally reach the 1-nanometer node (again, this is just a marketing term). Here, carbon nanotubes will be prevalent in 3D form, and photonics will possibly replace copper wires.

Before we look at what comes after 2030, let's have a quick look at some of the materials and advancements that will get us this continued Moore's-law scaling in the period we just looked at. The carbon nanotube transistor was first demonstrated in 1998. In simple terms, carbon nanotubes are a single layer of graphene rolled into a tube, and the idea here is to change the channel in MOSFETs to one of these tubes, at least for the first implementations, or to an array of these tubes, to better control the flow of electrons. In practical terms, a carbon nanotube transistor can reliably switch using much less power than a silicon-based device, so this will result in much lower power consumption in your Ryzen 6000 or 7000 series processors compared to the silicon-based FinFET transistors we have today. But to be clear, silicon will still be used alongside carbon nanotubes for a long while, probably as a base layer. We will likely eventually see a transistor made with no silicon, using only carbon nanotubes, but it's likely that silicon will stay around for several decades still, in one form or another.

Alongside silicon-based semiconductors, we are seeing the emergence of compound semiconductors using gallium and other elements, like nitrogen, to form gallium nitride. The 5G phones that you will be buying in the next couple of years will already feature compound semiconductors, and these will continue maturing over the next ten years. It's also possible that modifying the gate with ferroelectric properties brings about a negative-capacitance transistor; GlobalFoundries has been experimenting with ferroelectrics on their 14-nanometer process, but so far the devices have been unreliable.
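Circling back to the mobile frequency trajectory a couple of paragraphs up, here's the implied arithmetic as a quick sketch. The target frequencies are the video's loose figures (a ~4 GHz wall, and "around double" today's 2.5 GHz), not measurements:

```python
# Rough check: what annual growth rate takes a mobile CPU from 2.5 GHz
# to the ~4-5 GHz "frequency wall" over roughly a decade?

def implied_cagr(start_ghz: float, end_ghz: float, years: float) -> float:
    """Compound annual growth rate implied by a start/end frequency pair."""
    return (end_ghz / start_ghz) ** (1 / years) - 1

for target in (4.0, 5.0):  # the video's ~4 GHz wall and "double 2.5 GHz"
    rate = implied_cagr(2.5, target, 10)
    print(f"2.5 GHz -> {target} GHz in 10 years implies ~{rate:.1%} per year")
```

That works out to roughly 4.8 to 7.2 percent per year, which is why the video calls it a gradual increase rather than a leap.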
In addition to gallium nitride, there's also work being done with other III-V compounds, namely gallium arsenide. These have so far been shown to be pretty good for things like RF power amplifiers, but not so much as candidates for the CPU roadmap.

Going beyond 2030, we'll first look at photonics. In simple terms, photonics aims to use light as a medium for switching in transistors, instead of using electrons like we have been doing forever now. The problem with photonics so far has been with the practical implementation: it's just too expensive and impractical to build a full photonics system right now, so at least in the short term the most likely application for photonics is related to wiring. You see, copper wires don't scale down the way transistors do, meaning that when we go down to 5 nanometers, 3 nanometers, 1 nanometer and beyond, copper wires will have the same characteristics as they have today; they won't be any faster, and they won't use less energy. Photonics is one of the few candidates for future architectures that can solve this energy-related problem. Also, at a recent exhibition here in Europe, I found this really interesting discovery of luminescence in graphene. It turns out that you can use a laser to open up a bandgap in graphene, which results in light emission in the same spectrum as the sun. Well, if we're going to use graphene to build transistors anyway, and if it exhibits luminescence when excited by a laser, why not combine lasers and graphene to create a photonic system? The transistor laser is actually something that is being researched and developed. It's in the same area as photonics, with the difference being that the photon-to-electron conversion can happen within the laser. And of course, the advantage here is that light travels much faster than electrons, which means we could see speed-ups of hundreds of times compared to our current computers. A more likely scenario, though, like I said, is that photonics in one form or another only gets used to replace copper wires.

You will often hear very smart people in the semiconductor industry say that quantum computing will never be a real thing. These people either have a bias or just haven't looked at the history of this industry. IBM, Intel, Google and a few others, like Rigetti, are in an arms race to create quantum computers that take processing far beyond traditional binary silicon-based systems. The reason so many are skeptical is that so far quantum hasn't been demonstrated to be better than a conventional semiconductor processor at, well, anything. You could say that quantum computing right now is not even at the level of progress that ENIAC, the massive vacuum-tube computer, was at in the 1940s when it was first introduced, but there are enough similarities for people to get excited about quantum. My personal opinion is that, once fully realized, quantum computing will be the single biggest advancement in human history. Any attempt to explain the scale of data that quantum computing can process will always fail to demonstrate how absolutely massive it is, but here's an example. Let's say I want to take coffee to the next level and make the best coffee there has ever been. You can either go through millions of permutations in a test tube to test every possible molecular permutation, which would take you several lifetimes, or you can model the molecule of caffeine in a computer exactly as it exists in nature, so that the computer can run simulations to figure out what the best permutation is. How much information would that take? Can you guess? For an accurate model of caffeine, it's roughly 10^48 bits of information. That's a one with 48 zeros. That's a big number, right? How big?
The number of atoms in the Earth is estimated to be between 10^49 and 10^50, so the number of bits, that's zeros and ones, needed just to represent our caffeine molecule in a single instant is roughly 10 percent of the number of atoms in our planet. And if you are drinking a coffee while you are watching this video, nature is modeling trillions of these molecules in that cup of coffee right now; the information is all there, being processed by nature without much effort.

When we talk about quantum computing, we have to let go of most of what we know about computing today, and that's why most people fail to understand quantum and why there is so much skepticism. The problem with bit-based computers, like the one you are using to watch this video, is that they deal with relatively small amounts of data. When we're dealing with quantum mechanics and the laws that govern molecules and atoms and electrons, we need a machine that operates under the same laws. There's nothing particularly magical about quantum computers versus classical computers; it's just a different framework. You have to let go of the notion of logic gates and ANDs and ORs, because those belong to a particular framework that doesn't scale to the amount of data that nature deals with. Qubits are artificial quantum particles that operate under the same laws of nature that govern quantum mechanics, so those are the tools this different framework gives us to help solve these large-scale problems. It's not so much that quantum computers will replace classical computers; they are a new tool for large-scale problems, especially ones with massive parallelism.

The two most important questions regarding quantum computing are not how we will get it working, because the math is already something we understand, but rather when it will become useful and what exactly it will enable. As far as when: most people estimate that it will take another 30 to 50 years for quantum computers to have the same sort of impact on society that classical computers have today; I think it's more like 15 to 20 years. You see, the rate at which qubits are increasing is not exponential but double exponential. That means quantum supremacy can be achieved within a few years, and after that we can have a "transistor moment" where quantum has an enormous impact on our society, just like the transistor did. As far as you having your own personal quantum computer, I think that will certainly happen in our lifetime, although it's anyone's guess as to when exactly. As far as what sort of things quantum computers can enable, these range from simple things in the short term, like discovering new materials, to more complicated things, like modeling molecules to cure diseases, and eventually to more exotic applications in a more distant future, like teleportation. Sounds crazy? Imagine telling the creators of ENIAC in the 1940s that within decades thousands of people would be looking at a picture of it, all at the same time, in different parts of the globe, some even while taking a dump. They would have called you crazy too.
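To ground those two claims, the 10^48-bit caffeine figure and the double-exponential qubit trend (sometimes called Neven's law), here's a minimal sketch. The growth loop's starting point and rate are assumptions purely to show the shape, not real hardware data:

```python
import math

# 1) If an exact classical description of caffeine takes ~10^48 bits, how many
#    qubits could in principle span a state space that large? n qubits span
#    2^n amplitudes, so we need 2^n >= 10^48.
bits_needed = 1e48
qubits_equivalent = math.log2(bits_needed)
print(f"2^n >= 10^48 requires n >= {qubits_equivalent:.0f} qubits")  # ~160

# 2) Exponential vs double-exponential growth: with double-exponential
#    growth, the exponent itself doubles each step.
for year in range(6):
    exponential = 2 ** year         # doubles every step
    double_exp = 2 ** (2 ** year)   # the exponent doubles every step
    print(f"year {year}: exponential={exponential:>2}, "
          f"double-exponential={double_exp:,}")
```

The point of the second loop is just the shape: within a handful of steps, double-exponential growth dwarfs ordinary exponential growth, which is why the 15-to-20-year estimate hinges entirely on that trend holding.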
If quantum computing makes your head hurt, what about thinking of what comes after that? To finish our journey, at least for now, I'll ask you to think about your favorite song. When you think of the chorus section in that song, you don't have to sing the whole song to remember it; the chorus just comes to you, and you start singing it in your head, or an approximation of it. How did that process happen? As far as we know, our brains don't catalog different parts of songs so that we can recall them. In fact, most of the things we recall aren't actually stored anywhere in our brains; we just formulate them as needed. Strange, isn't it? Using information normally takes some amount of space or energy, be it bits in classical computers or qubits in quantum ones, but the brain, which is the most complex system we know of, doesn't actually operate this way. Neuromorphic computer architectures aim to replicate this by using hyperdimensional computing. One of the fundamental principles here is that, while classical and quantum computers are programmed to do specific functions, brains, both human and animal, are flexible and adapt to what each situation and moment in time requires of them. It's possible that a traditional von Neumann computer architecture, that's RAM, a processor, inputs and outputs, can be used to create a neuromorphic system, but it's more likely that a new paradigm will be required to build a brain-like computer. What will that be? No one knows at this point. Such a system will probably take decades to materialize; it's likely that some form of bio-transistor will be required, and that a completely different type of architecture will need to be conceptualized. I'll leave a link in the description to a white paper that proposes such a neuromorphic system, if you are interested in looking that far into the future.
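Since "hyperdimensional computing" can sound hand-wavy, here's a minimal sketch of its core trick: representing symbols as very high-dimensional random vectors and recovering them by similarity rather than by address. Everything here (the dimensionality, the "song parts" encoded) is an illustrative assumption, not something from the video:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervectors: very high-dimensional random +/-1 vectors

def hv():
    """Random bipolar hypervector; random pairs are nearly orthogonal."""
    return rng.choice((-1, 1), size=D)

def similarity(a, b):
    """Normalized dot product: ~0 for unrelated vectors, ~1 for identical."""
    return a @ b / D

# Encode a tiny "memory": bind each role to its filler with elementwise
# multiplication, then bundle (sum) the pairs into one distributed vector.
roles = {"verse": hv(), "chorus": hv()}
parts = {"melody_a": hv(), "melody_b": hv()}
song = roles["verse"] * parts["melody_a"] + roles["chorus"] * parts["melody_b"]

# Recall: unbinding with the "chorus" role makes the chorus melody pop out,
# even though it was never stored at a dedicated address.
probe = song * roles["chorus"]  # multiplication is its own inverse for +/-1
for name, vec in parts.items():
    print(name, round(similarity(probe, vec), 2))
# melody_b scores near 1.0; melody_a scores near 0.0.
```

The song analogy maps on loosely: nothing is filed away at a location; the pattern is reconstructed from the distributed whole when the right cue is applied.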
As a parting note, and looking only at the next decade, the biggest takeaways as far as the future of computing performance is concerned are that we've entered a stage of hybrid scaling in regards to Moore's law: it's not just an increase in transistor density anymore; it's the packaging that will change, the architecture, the materials, and the transistors themselves. And no, Moore's law is not dead. In fact, while many, including Moore himself, predicted that Moore's law would end in this coming decade, the reality is that we're entering the most exciting period in computer history, and the best is yet to come.

If you've enjoyed this video and want to help my channel survive the ad-pocalypse, join my Patreon and get exclusive access to the Coreteks Discord server, where you can talk to me directly about these topics. You will also get access to exclusive videos and presentations on computer architecture, and you will be joining a community of like-minded individuals. If you can't contribute financially at this time, then please share this video on social media, as that really helps. Thanks for watching, and until the next one!

Info
Channel: Coreteks
Views: 375,747
Keywords: tsmc, photonics, quantum, moore's law, amd, intel, nvidia, silicon, global foundries, samsung, euv, gaafet, mosfet
Id: maZnwuGxAA8
Length: 26min 44sec (1604 seconds)
Published: Thu Aug 29 2019