Are Brains Analogue or Digital? | Prof Freeman Dyson | University College Dublin
Video Statistics and Information
Channel: UCD - University College Dublin
Views: 55,299
Keywords: UCD, University College Dublin, Dublin, Ireland, UCD - University College Dublin, myucd, Freeman Dyson (Author), Theoretical Physics (Field Of Study), Mathematics Education (Website Category), Maths, Biology (Field Of Study), Artificial Intelligence (Industry), Computing (Exhibition Subject), Computer Science (Industry), Brain (Literature Subject), Robotics (Invention), Quantum Theory (Video Game), Quantum Computer (Algorithm Family), Analogue brain, digital brain, brain
Id: JLT6omWrvIw
Length: 50min 40sec (3040 seconds)
Published: Mon May 26 2014
An analog system can absolutely be simulated (or at least approximated) by a digital system; it just takes more power.
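A toy illustration of that point (my own sketch, not from the video): sample a continuous signal at discrete instants and reconstruct it by linear interpolation. The reconstruction error shrinks as the sampling rate rises, at the cost of doing more work per second of signal.

```python
import math

def sample_and_reconstruct(f, t, rate):
    """Approximate the "analog" signal f at time t by linear interpolation
    between digital samples taken at `rate` samples per second."""
    step = 1.0 / rate
    k = math.floor(t / step)
    t0, t1 = k * step, (k + 1) * step
    y0, y1 = f(t0), f(t1)
    return y0 + (y1 - y0) * (t - t0) / step

signal = lambda t: math.sin(2 * math.pi * t)  # stand-in analog signal

# Error at an off-grid point shrinks as the sampling rate rises --
# the price of fidelity is more samples, i.e. more work and power.
for rate in (10, 100, 1000):
    err = abs(sample_and_reconstruct(signal, 0.123, rate) - signal(0.123))
    print(rate, err)
```

The signal and the evaluation point are arbitrary choices for illustration; any smooth signal shows the same trend.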
I think physicists like Freeman Dyson and Carver Mead make a good point: AI will come from massively parallel special chips that will include silicon transistors used in their natural analog form.
I agree, digital processes can't replicate analog systems perfectly.
If, however, you throw enough raw power at it, you'll get a fidelity of simulation so high that you'd need to run both systems for a few billion years to notice any drift. That's good enough for a brain upload, and even if you lose some information during the process, that happens daily in an analog brain anyway.
I am definitely a luddite who is disturbed by the implications of uploaded consciousness, but yeah... Gotta disagree.
I think of it this way: the fire-control computers used during WWII were amazingly complicated analogue machines. They could constantly take in several dozen inputs: course, speed, wind at different altitudes, spin of the Earth, humidity, age of the shell, etc. These factors would be continuously computed, by a series of cams and gears, into the bearings at which the guns should be aimed, so that as soon as they were reloaded and ready to fire, they could match the calculated bearing and elevation and let off a salvo.
Analogue computers were better at this task up through the '50s because they could constantly compute new results, while digital computers would run a calculation, spit out a result, and then begin running it again with new, slightly adjusted inputs. Initially, these calculations took too long to be useful. However, once digital computers became fast enough to run the calculations at a speed great enough to approximate their analogue predecessors, they took over.
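A hypothetical, drastically simplified sketch of that digital loop (the target path, units, and tick sizes are invented for illustration): the firing solution is recomputed only at discrete ticks, so between ticks it is stale, and the staleness shrinks as the tick shrinks, i.e. as the digital machine gets faster.

```python
import math

def bearing_to(target_x, target_y):
    """Bearing (radians) from a gun at the origin to the target."""
    return math.atan2(target_y, target_x)

def target_at(t):
    """A target moving at constant velocity: the 'true' analog quantity
    varies continuously with time t (seconds, positions in metres)."""
    return (1000.0 - 15.0 * t, 2000.0 + 5.0 * t)

def digital_solution(t, tick):
    """Digital loop: the solution was last recomputed at the most recent
    tick, so between ticks the gun is aimed at where the target WAS."""
    t_last = math.floor(t / tick) * tick
    return bearing_to(*target_at(t_last))

true_bearing = bearing_to(*target_at(7.3))
for tick in (1.0, 0.1, 0.01):
    staleness = abs(digital_solution(7.3, tick) - true_bearing)
    print(tick, staleness)
```

Real fire-control problems included ballistics, wind, and ship motion; this keeps only the discrete-update structure the comment describes.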
I think that purely digital beings will quickly cease to think in a manner that we see as properly human, but it won't be their architecture that is the limiting factor.
It's always worth remembering that while digital approaches are undoubtedly powerful, they are still an approximation of the underlying analog problem. This is not ideal for time-varying problems, where a digital system necessarily has to approximate the problem and evaluate it at a single instant before moving to the next discrete time step. With increasing computational performance you can make the approximation better by making the steps smaller, but this requires more power and therefore produces more heat.
For the sort of continuous differential problems associated with artificial intelligence and robotics this causes problems as it may not be possible to achieve suitable performance to deal with complex dynamic situations. Analog processing is however better suited for solving these types of problems in an energy efficient way. Since many people believe that the feedback loop between brain, body, environment and sensors is a requirement for true intelligence this suggests that perhaps analog processing is also required.
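The step-size trade-off behind the two comments above can be seen in a minimal sketch (mine, not from the thread): explicit Euler integration of the continuous problem dy/dt = -y. Shrinking the step improves accuracy, but the number of updates, and hence the work and power, grows in proportion.

```python
import math

def euler(dt, t_end=1.0):
    """Integrate dy/dt = -y with y(0) = 1 using explicit Euler steps
    of size dt; return the final value and the number of steps taken."""
    n = round(t_end / dt)      # discrete steps needed to reach t_end
    y = 1.0
    for _ in range(n):
        y += dt * (-y)         # Euler update: y_{k+1} = y_k + dt * f(y_k)
    return y, n

exact = math.exp(-1.0)         # true solution y(1) = e^-1
for dt in (0.1, 0.01, 0.001):
    y, steps = euler(dt)
    print(dt, steps, abs(y - exact))
```

Euler's method is first-order, so a 10x smaller step gives roughly a 10x smaller error at 10x the cost; an analog integrator evaluates the same dynamics continuously without stepping at all.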
This type of approach was initially called neuromorphic as it attempted to model how biological neurons work, though it has more recently expanded to be called cytomorphic.
They simulate analog through digital all the time.
Didn't some mathematician prove, mathematically, that all information can be represented by numbers? If so, then (by the definition of "number") digital can simulate analog.
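The comment is presumably gesturing at Gödel-style numbering: any finite string of symbols can be mapped reversibly to a single integer. A minimal sketch (the 0x01-prefix trick is just one of many possible encodings, chosen here so leading zero bytes survive the round trip):

```python
def to_number(data: bytes) -> int:
    """Map any byte string to a unique non-negative integer.
    The 0x01 prefix preserves leading zero bytes across the round trip."""
    return int.from_bytes(b"\x01" + data, "big")

def from_number(n: int) -> bytes:
    """Invert to_number: recover the byte string, dropping the prefix."""
    return n.to_bytes((n.bit_length() + 7) // 8, "big")[1:]

msg = b"analog or digital?"
n = to_number(msg)
assert from_number(n) == msg
```

This shows digital representability of finite discrete data; whether a number can capture a genuinely continuous physical state is exactly the point the thread is arguing about.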
To everyone in this conversation:
Nobody should be talking about precision. I mean, almost everybody here is, but that's the crux of the issue. The claim is not that analog is more precise; the claim is that there are things it can practically calculate that a Turing machine can't. It's similar to how a quantum computer can be simulated by classical computers, yet I can easily conjure up a problem (e.g. factorization) that a classical computer can't practically solve but a quantum computer can. It does not matter that, given infinite time, a classical computer could solve the same problem. The question is always practicality.
Hey look, it's Clarke's first law.
If this is true, that throws a wrench into the mechanism: no more brain uploading, unless someone invents good analog computers.