
Snowden Files 

The New York Times reported that, according to a senior National Security Agency official involved in the investigation, former NSA contractor Edward Snowden used inexpensive, widely available software to gain access to a large number of top-secret and classified files.

According to the official, the collection of the information was largely automated. Snowden used web crawler software designed to search for files, back them up and index them, so he did not have to hunt for documents manually; the program kept running while he went about his other daily activities.
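A crawler of this kind can be left to run on a schedule while the operator does other work. The sketch below is a minimal illustration of that idea, not a reconstruction of the actual tool: it fetches a set of pages, backs each one up to disk and records it in a small index. The URLs and file names are hypothetical placeholders.

```python
import os
import sqlite3
import urllib.request
from urllib.parse import urlparse

# Pages to visit on each pass; placeholder URLs for illustration only.
SEED_URLS = [
    "http://intranet.example/wiki/index.html",
    "http://intranet.example/docs/reports.html",
]

ARCHIVE_DIR = "archive"


def fetch(url):
    """Download a page and return its bytes, or None on failure."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()
    except OSError:
        return None


def save_and_index(url, data, db):
    """Back the page up to disk and record it in a small index."""
    name = urlparse(url).path.strip("/").replace("/", "_") or "index"
    path = os.path.join(ARCHIVE_DIR, name)
    with open(path, "wb") as f:
        f.write(data)
    db.execute(
        "INSERT OR REPLACE INTO pages (url, path, size) VALUES (?, ?, ?)",
        (url, path, len(data)),
    )
    db.commit()


def main():
    os.makedirs(ARCHIVE_DIR, exist_ok=True)
    db = sqlite3.connect("index.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, path TEXT, size INTEGER)"
    )
    for url in SEED_URLS:
        data = fetch(url)
        if data is not None:
            save_and_index(url, data, db)


if __name__ == "__main__":
    main()  # meant to be run unattended on a schedule (e.g. via cron)
```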

According to the official, no one could have sat at a computer day and night searching for and downloading that many files by hand.

NSA investigators concluded that Snowden's approach was not especially sophisticated and that the agency's monitoring systems should have detected it, but failed to do so. Web crawlers can be set to move from site to site by following embedded links and addresses, copying everything they encounter along the way.

Accordingly, the crawler has to be configured correctly so that it copies the right material: its settings determine which subjects are collected and place limits on how far the program follows links. The report also indicated that Snowden obtained, along with the files, other sensitive material from the NSA's internal networks, revealing how the agency analyses and shares intelligence around the world.
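The sketch below illustrates, under stated assumptions, how such settings might look in practice: a breadth-first crawler that follows embedded links only to a configured depth and copies only pages matching a list of subjects. The seed URL and keywords are hypothetical; this is not the software described in the report.

```python
import re
import urllib.request
from urllib.parse import urljoin
from collections import deque

# Very naive link extraction for illustration only.
LINK_RE = re.compile(rb'href="([^"]+)"', re.IGNORECASE)


def crawl(seed_url, keywords, max_depth=2, max_pages=100):
    """Breadth-first crawl: follow embedded links, copy pages whose
    content matches the configured subjects, stop at a depth limit."""
    seen = {seed_url}
    queue = deque([(seed_url, 0)])
    copied = {}

    while queue and len(copied) < max_pages:
        url, depth = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read()
        except OSError:
            continue

        # Copy the page only if it matches one of the configured subjects.
        text = body.decode("utf-8", errors="ignore").lower()
        if any(k.lower() in text for k in keywords):
            copied[url] = body

        # Follow embedded links, but only up to the configured depth.
        if depth < max_depth:
            for match in LINK_RE.finditer(body):
                link = urljoin(url, match.group(1).decode("utf-8", errors="ignore"))
                if link.startswith("http") and link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))

    return copied


if __name__ == "__main__":
    # Hypothetical seed page and subjects, purely for illustration.
    pages = crawl("http://intranet.example/wiki/", ["surveillance", "report"])
    print(f"copied {len(pages)} matching pages")
```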

Snowden's job as an NSA contractor gave him broad access to the agency's files; he managed computer systems directed at China and North Korea. NSA officials said he was able to access the files easily because the Hawaii facility had not been upgraded with the latest security protections and measures.

According to analysts, the web crawler the former contractor used was a simple one, far less sophisticated than the crawler Google uses in its search engine to index millions of web pages.

Asked about his role, officials said Snowden's work was system administration: he carried out routine network maintenance and was responsible for ensuring that data on the systems and servers was fully backed up.

The whistleblower did raise some red flags while working in Hawaii, prompting questions about his activity, but he was able to deflect the scrutiny.

In June, the former contractor acknowledged that he held a large number of documents, which many media houses relied on to publish stories about the NSA and Britain's GCHQ. Snowden was subsequently granted asylum in Russia.

The documents revealed that US and UK agencies had been conducting surveillance across digital platforms.
