
Snowden used ‘inexpensive’ web crawler software to access 1.7 mn secret NSA files

Snowden used web crawler software to scrape sensitive data out of the NSA system while he went about his day job.


Snowden is reported to have accessed nearly 1.7 million NSA files. (AP)

US whistleblower Edward Snowden, who honed his hacking skills in India, used inexpensive and widely available software to “scrape” the National Security Agency’s networks, according to American intelligence officials probing his high-profile case. Using “web crawler” software designed to search, index and back up a website, 30-year-old Snowden “scraped data out of our systems” while he went about his day job, The New York Times quoted a senior intelligence official as saying.

“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process by which Snowden gained access to a huge trove of the country’s most highly classified documents was “quite automated” and the former CIA contractor kept at it even after he was briefly challenged by agency officials.

The findings are striking because the NSA’s mission includes protecting America’s most sensitive military and intelligence computer systems from cyber attacks, especially the sophisticated attacks that emanate from Russia and China, the report said. In contrast, Snowden’s “insider attack” was hardly sophisticated and should have been easily detected, investigators found. Snowden had broad access to the NSA’s complete files because he was working as a technology contractor for the agency in Hawaii, helping to manage the agency’s computer systems in an outpost that focuses on China and North Korea.


A web crawler, also called a spider, automatically moves from website to website, following links embedded in each document, and can be programmed to copy everything in its path. Snowden appears to have set the parameters for the searches, including which subjects to look for and how deeply to follow links to documents and other data on the NSA’s internal networks. US intelligence officials said that he accessed roughly 1.7 million files.
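The crawling process described above — following embedded links to a set depth and copying pages that match chosen subjects — can be sketched in a few dozen lines of standard-library Python. This is a generic illustration of how any such crawler works, not a reconstruction of the software Snowden actually used; the `fetch` callable, the `keywords` filter and the `max_depth` parameter are assumptions introduced here to mirror the "subjects" and "how deeply to follow links" parameters the report mentions.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def crawl(start_url, fetch, keywords, max_depth=2):
    """Breadth-first crawl from start_url, following links up to max_depth
    and keeping any page whose text contains one of the subject keywords.

    `fetch` is a callable url -> HTML string, injected so the sketch can
    run against canned pages instead of a live network.
    """
    seen, results = set(), {}
    frontier = [(start_url, 0)]
    while frontier:
        url, depth = frontier.pop(0)
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        html = fetch(url)
        if any(k.lower() in html.lower() for k in keywords):
            results[url] = html  # "copy everything in its path"
        parser = LinkExtractor(url)
        parser.feed(html)
        for link in parser.links:
            frontier.append((link, depth + 1))
    return results
```

Because the loop is fully automated, a run like this can harvest pages continuously while the operator does other work, which is why investigators described the downloads as "quite automated" rather than manual, one-at-a-time access.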

According to media reports, Snowden had traveled to India in 2010. He spent six days in New Delhi, taking courses in “ethical hacking,” where he learned advanced techniques for breaking into computer systems and exploiting flaws in software, the reports said. Among the materials prominent in the Snowden files are the agency’s shared “wikis,” databases to which intelligence analysts, operatives and others contributed their knowledge. Some of that material indicates that Snowden “accessed” the documents. But experts say they may well have been downloaded not by him, but by the programme acting on his behalf.

First published on: 09-02-2014 at 08:07:07 pm