US whistleblower Edward Snowden, who honed his hacking skills in India, used inexpensive and widely available software to “scrape” the National Security Agency’s networks, according to American intelligence officials probing his high-profile case. Using “web crawler” software designed to search, index and back up a website, 30-year-old Snowden “scraped data out of our systems” while he went about his day job, The New York Times quoted a senior intelligence official as saying.
“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process by which Snowden gained access to a huge trove of the country’s most highly classified documents was “quite automated” and the former CIA contractor kept at it even after he was briefly challenged by agency officials.
The findings are striking because the NSA’s mission includes protecting America’s most sensitive military and intelligence computer systems from cyber attacks, especially the sophisticated attacks that emanate from Russia and China, the report said. In contrast, Snowden’s “insider attack” was hardly sophisticated and should have been easily detected, investigators found. Snowden had broad access to the NSA’s complete files because he was working as a technology contractor for the agency in Hawaii, helping to manage the agency’s computer systems in an outpost that focuses on China and North Korea.
A web crawler, also called a spider, automatically moves from website to website, following links embedded in each document, and can be programmed to copy everything in its path. Snowden appears to have set the parameters for the searches, including which subjects to look for and how deeply to follow links to documents and other data on the NSA’s internal networks. US intelligence officials said that he accessed roughly 1.7 million files.
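The crawl-and-copy behaviour described above can be sketched in a few lines. This is a minimal illustration, not Snowden's actual tool: the `site` mapping stands in for network fetches of internal wiki pages, and all URLs and page contents are hypothetical.

```python
from html.parser import HTMLParser
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start, max_depth=2):
    """Breadth-first crawl starting at `start`, copying every page
    reached within `max_depth` link-hops. `site` is a url -> HTML
    mapping standing in for a real network fetch."""
    copied = {}                      # "copy everything in its path"
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        url, depth = queue.popleft()
        page = site.get(url)
        if page is None:
            continue
        copied[url] = page
        if depth >= max_depth:       # how deeply to follow links
            continue
        parser = LinkExtractor()
        parser.feed(page)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return copied

# Hypothetical internal "wiki": home links to a and b; a links to c.
site = {
    "/wiki/home": '<a href="/wiki/a">a</a> <a href="/wiki/b">b</a>',
    "/wiki/a": '<a href="/wiki/c">c</a>',
    "/wiki/b": "no links here",
    "/wiki/c": "leaf page",
}
```

With `max_depth=1`, `crawl(site, "/wiki/home", max_depth=1)` copies the home page and its direct neighbours but stops before `/wiki/c`; raising the depth limit pulls in the deeper page too — the "parameters" an operator sets before letting the programme run unattended.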
According to media reports, Snowden had traveled to India in 2010. He spent six days in New Delhi, taking courses in “ethical hacking,” where he learned advanced techniques for breaking into computer systems and exploiting flaws in software, the reports said. Among the materials prominent in the Snowden files are the agency’s shared “wikis,” databases to which intelligence analysts, operatives and others contributed their knowledge. Some of that material indicates that Snowden “accessed” the documents. But experts say they may well have been downloaded not by him, but by the programme acting on his behalf.