NSA Whistle-Blower Tells All: The Program | Op-Docs | The New York Times
Video Statistics and Information
Channel: The New York Times
Views: 291,908
Rating: 4.9161863 out of 5
Keywords: William Binney, William Binney interview, United States National Security Agency (Organization), NSA, n.s.a., laura poitras, filmmaker, film, documentary, top secret, top secret nsa, top secret nsa program, edward snowden, data collection, personal data collection, The New York Times, NY Times, NYT, Times Video, nytimes.com, news, newspaper, feature, reporting, Whistleblower, Edward, Snowden, Russia, Phone tapping
Id: r9-3K3rkPRE
Length: 8min 27sec (507 seconds)
Published: Wed Aug 29 2012
Here's a leaked video from Raytheon of a piece of software called Riot that allows you to track people via social media. Seems relevant: https://www.youtube.com/watch?v=im2HycUMSbM
I heard it was called PRISM? Can anyone explain the difference between PRISM and Stellar Wind?
pretty fucked up
He also spoke at the 29th Chaos Communication Congress in Hamburg on the "Enemies of the State" panel with Jesselyn Radack and Thomas Drake.
Here's the video: https://www.youtube.com/watch?v=qBp-1Br_OEs (he starts at ~0:53:00)
Seriously NYT? 480p is the best you can do?
http://www.imgur.com/oDuDB2B.jpeg
Yet they couldn't prevent the Boston bombing... I'm not gonna concern myself with the dangers of a system that's been hyped up to be something it's not.
There are so many things I don't understand about this.
Firstly, how does all this data get from the sources to whatever systems the NSA supposedly has to process it all? Let's take banking, one of the "domains" mentioned in the video, as an example. I worked for an investment bank as a software developer for a while, and even transferring data between systems within the same bank was a monumental task. It required many teams of software developers, network engineers and systems administrators, not to mention millions of dollars' worth of hardware, just to move data around internally. Moving the amount of data a bank generates about its customers in even a single day to an external data center would take even more effort.

Who's writing the software to collate all this data from the various databases and software systems in the bank? Who's maintaining the hardware in the bank that this software runs on? Who's responsible for administering those systems? Who's paying for and maintaining the big fat network links that would be required to move that amount of data? And how the hell does all this happen without hundreds of the bank's employees knowing about it?

Now imagine the same problems across every single bank, ISP, telecoms company and wherever else they're acquiring data from. I can't understand how they would manage this without large teams of insiders inside every single one of these companies. The entire board of each company would have to be in on it too. It's not like they can just flip a secret little switch and magically start acquiring all of this data.
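For a sense of scale, here is a minimal back-of-envelope sketch in Python of the data-movement question in the paragraph above. Every figure (customer count, records per day, record size) is an assumption chosen purely for illustration, not a real carrier's, bank's or agency's number, and the result swings by orders of magnitude depending on whether you imagine metadata only or full content.

# Hedged back-of-envelope sketch of the data-movement question above.
# Every number is an assumption for illustration only.

CUSTOMERS = 100_000_000          # assumed customers at one large carrier or bank
RECORDS_PER_DAY = 50             # assumed transactions/calls per customer per day
BYTES_PER_RECORD = 500           # assumed size of one metadata record

daily_bytes = CUSTOMERS * RECORDS_PER_DAY * BYTES_PER_RECORD
daily_tb = daily_bytes / 1e12

SECONDS_PER_DAY = 86_400
gbit_per_s = daily_bytes * 8 / SECONDS_PER_DAY / 1e9

print(f"~{daily_tb:.1f} TB of records per day")
print(f"~{gbit_per_s:.2f} Gbit/s sustained to move it off-site")

With these assumed inputs the sketch prints roughly 2.5 TB per day and a fraction of a gigabit per second; swap in full call or session content instead of metadata and both numbers grow by orders of magnitude.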
Secondly, who designs the protocols and data formats? If you've ever been involved in a merger of two companies as IT staff, an engineer or a software developer, you'll be well aware of the pain of getting two different systems to talk to each other. One merger I saw still wasn't entirely complete even after three years: there were still multiple versions of systems that did much the same thing, and other systems that had to be able to talk to both of them at the same time. So to build a system that can successfully talk to the systems of hundreds of different companies? That's not something you can just do. It takes a huge amount of effort: you have to design protocols, procedures, data back-up policies, error handling, fail-over systems, redundancy and so on. Not to mention then processing all of the data to import it into your own system!
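To make the integration pain concrete, here is a minimal Python sketch assuming two invented export formats (a fixed-order CSV and a vendor JSON with different field names and units) being mapped onto one common schema; nothing here reflects any real bank's or agency's formats, it only shows why every source system needs its own adapter.

# Minimal sketch of per-source adapters onto a common schema.
# Both input formats below are invented for illustration.
import json
from datetime import datetime

def from_legacy_csv(line: str) -> dict:
    """Hypothetical fixed-order CSV export: account,amount_cents,YYYYMMDD."""
    account, amount_cents, date = line.strip().split(",")
    return {
        "account": account,
        "amount": int(amount_cents) / 100,
        "timestamp": datetime.strptime(date, "%Y%m%d").isoformat(),
    }

def from_vendor_json(blob: str) -> dict:
    """Hypothetical vendor JSON export with different field names and units."""
    rec = json.loads(blob)
    return {
        "account": rec["acctNo"],
        "amount": float(rec["value"]),
        "timestamp": rec["postedAt"],  # already ISO 8601 in this made-up format
    }

# Two records describing the same kind of event, in incompatible shapes:
print(from_legacy_csv("DE0012345,129900,20120829"))
print(from_vendor_json('{"acctNo": "DE0012345", "value": 1299.0, "postedAt": "2012-08-29T00:00:00"}'))

Every additional source format means another adapter like these, plus its own error handling, testing and maintenance, which is the commenter's point about the scale of the integration work.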
Thirdly, I don't see how they can have enough storage and enough systems to hold 100 years' worth of that amount of information. Take a second to think about how much data Google stores about each person; now go and look at how many data centers they have, how many staff they employ, and how many failed hard drives they replace every single day. They've basically had to design their own data centers to cope. And we're supposed to believe that the NSA can store and process probably several orders of magnitude more in a single data center, and keep it a complete secret? Really?
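The same kind of hedged arithmetic applies to the storage question. The sketch below (Python, with every input assumed for illustration) only shows how sensitive the total is to what you assume gets stored per person per day; it is not an estimate of any real facility.

# Hedged storage arithmetic for the "100 years" question above.
# All inputs are assumptions; the total moves by orders of magnitude with them.

PEOPLE = 300_000_000                  # assumed population covered
BYTES_PER_PERSON_PER_DAY = 10_000     # assumed metadata footprint per person per day
YEARS = 100

total_bytes = PEOPLE * BYTES_PER_PERSON_PER_DAY * 365 * YEARS
petabytes = total_bytes / 1e15

DRIVE_TB = 4                          # assumed capacity of one commodity drive
drives = total_bytes / (DRIVE_TB * 1e12)

print(f"~{petabytes:,.0f} PB total, roughly {drives:,.0f} drives of {DRIVE_TB} TB each")

Under these assumptions the total is on the order of a hundred petabytes; assume richer data per person (voice, video, full content) and the figure, and the hardware and staff needed to manage it, grows dramatically.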
tl;dr The technical challenges of acquiring, moving, sorting, processing and then storing that amount of data every day, from so many different companies and sources, make the whole thing wildly impractical. Doing it in secret is practically impossible. We're talking about a system that would be several times bigger than Google, which employs over 30,000 staff across the world.
Edit: Why the downvotes? If there's something I'm missing, tell me...
Anybody remember all those Cold War movies where people in the USSR were always paranoid that telephones, radios or TVs were being used as listening devices? Even the Russians never figured out how to get people to carry the bugs with them everywhere they went, or to post their personal information in easily accessible databases.