Monday, November 14, 2011

Legal Issues with Cloud in the UK


Cloud computing did not exist when data protection regulations came in. John Roberts of Redstone explains how to keep within the law
The UK Data Protection Act (DPA) is often regarded as the world's leading law on protecting personal data. But many UK companies now adopting cloud services are not only putting data at risk, but also themselves, by breaching data protection laws. How do you comply with the DPA, whilst maintaining a cloud presence? When the UK government passed the DPA in 1998 it was heralded as the definitive way to guarantee personal data was protected. Over the following decade, refinements to the act ensured that personal data was not just secure, but more specifically, it was secure online. This worked well when data was held on-premise, within a company's own data centre, but the advent of cloud technology has changed all that.

What do we mean by cloud?

Just to be clear, in this context 'cloud' means infrastructure as a service. Ask many cloud service providers (CSPs) where a specific piece of data is held, and it would take them a while to answer. In most instances the cloud does not recognise national boundaries: CSPs simply move data across their often globally dispersed infrastructure at will, in whatever way is most efficient for them. This means the IT director no longer knows where his or her data is, and so cannot comply with the DPA. With data being streamed and stored across national territories, it also risks falling foul of other countries' legislation.

When George W Bush signed the US Patriot Act into law in 2001 following 9/11, no one could have predicted the data protection conflict that would result between the UK and US. The two acts stand in direct opposition to each other: the UK DPA prohibits organisations from passing personal data on to another party, yet the US Patriot Act expressly permits the US government to access and examine any data, personal or otherwise, held by a US company.

Security has long been a real concern for IT directors considering cloud infrastructure, but previous anxieties have focused on data loss rather than data location, which is the legal requirement enforceable under the DPA. Location of data has to become a priority, given that Microsoft UK MD Gordon Frazer admitted this summer that the US Patriot Act takes precedence over the DPA. Not only does this spell trouble for UK companies using cloud services where data is stored in the US; the data of US companies operating outside US borders is also subject to this priority, affecting some of the world's largest CSPs, from Microsoft and Salesforce to Google and Amazon. The EU, UK and other nations are debating the issue, and the EU has negotiated a Safe Harbour agreement with the US to protect data.
However, since most CSPs are unable to assure customers where data is located, the bigger question has to be: just whose responsibility is data storage when operating in the cloud? The Information Commissioner's Office (ICO) is responsible for enforcing the DPA, and its latest annual tracking survey found that one in four companies is still unaware of the need to comply with the DPA.

While many companies may plead ignorance, we've found a more concerning trend. When it comes to the cloud, many data owners believe data protection responsibility lies with the CSP or, more worryingly, are simply using the cloud as a way to abdicate responsibility for storing and protecting their data. This apathy towards data protection is widespread, and we know it from experience: rarely are we asked by prospective customers to ensure that data held within our cloud service is stored in the UK. Compliance is simply not considered an issue when buying cloud services.

With power comes responsibility

Many cloud services are problematic because they provide a generic, one-size-fits-all solution. Yet as cloud services have evolved alongside customer needs, more tailored solutions have appeared, including UK-specific, DPA- (and PCI-) compliant services. With these services, 'control', a concern cited by many IT directors when first considering cloud services, has been given back to the IT department. With that control, however, comes the responsibility for data protection.

The other failure lies with the law itself. While the DPA stipulates requirements for the protection of data, it is enforced retrospectively, not proactively: companies are only prosecuted once a breach has occurred. The ICO has no power to audit private sector companies' compliance to ensure that a data breach doesn't happen in the first place. Having no audit control over the private sector makes it impossible to proactively regulate and enforce the DPA, and it's generally accepted that the private sector generates the most data protection complaints. As a result, the Information Commissioner, Christopher Graham, recently called for compulsory audit powers over the private sector. Data audits need to become a requirement within the financial and legal audit processes if companies are to be held accountable for data protection.

We think the solution may be simpler. What the industry (and companies operating in the cloud) needs to assist compliance is a series of DPA standards. Comparable to ISO 9000, a simple checklist of standards would give companies a way to measure themselves effectively as part of any risk assessment or business continuity plan. We've seen how well such standards work for quality management, so now it's time to apply the same theory to data protection.
Source: eWeek


Code Search for Open Source Holes

Tools such as Google Code Search can provide hackers with a wealth of information hidden in open source code, writes Eric Doyle

The downside of open source is its very openness. Hackers are using Open Source Intelligence (OSINT) to find personal information, and even passwords and usernames, to plan their exploits.

Organisations like Anonymous and LulzSec have been using Google Code Search - a public beta in which Google lets users search for open source code on the Internet - according to Stach & Liu, a penetration testing firm. In Code Search they can unearth information to assist their exploits: passwords for cloud services that have been embedded in code, configuration data for virtual private networks, or simply vulnerabilities that leave the system open to other hacking ploys, such as SQL injection.
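The kind of query described above can be sketched in a few lines. The patterns below are illustrative examples of secrets that commonly end up hardcoded in published source, not Stach & Liu's actual search terms; the function simply scans text for matches, which is in spirit what a code-search query automates across every indexed repository.

```python
import re

# Illustrative patterns for secrets that commonly end up hardcoded in
# published source code. These are examples in the spirit of the queries
# described above, not the firm's actual search terms.
SECRET_PATTERNS = {
    "password assignment": re.compile(r"""(?i)(password|passwd|pwd)\s*=\s*['"][^'"]+['"]"""),
    "AWS-style access key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private key header": re.compile(r"-----BEGIN (?:RSA |DSA )?PRIVATE KEY-----"),
    "credentials in URL": re.compile(r"\b\w+://[^\s:]+:[^\s@]+@[\w.-]+"),
}

def scan_source(text):
    """Return (label, matched_text) pairs for anything that looks like a secret."""
    findings = []
    for label, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((label, match.group(0)))
    return findings

sample = 'db_password = "hunter2"\nurl = "ftp://admin:letmein@files.example.com"\n'
for label, matched in scan_source(sample):
    print(label, "->", matched)
```

The point of the exercise is how little effort it takes: anything matching patterns like these in public code is effectively already disclosed.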

Google Hacking

The Google service is due to be switched off next year as part of the company's rationalisation of its research efforts with the closure of Google Labs but that does not mean that exposed code on the Internet will be safer. There are several sites which provide similar services.

Google's BigTable is the repository for most of what the company gleans from its searches, and searching it for nefarious purposes is known as Google Hacking.

A-Team, a white-hat hacking group which appears to have the sole purpose of exposing Anonymous and its various subgroups, wrote a highly critical, sneering condemnation of Google Hacking.

"LulzSec and Anonymous [are] believed to use Google Hacking as a primary means of identifying vulnerable targets," the group blogged in June this year. "Their releases[revelations] have nothing to do with their goals or their lulz [fun]. It's purely based on whatever they find with their 'google hacking' queries and then release it."

Mark Stockley, an independent Web consultant, wrote on the Naked Security blog, "While the findings provide a much-needed wake-up call to online businesses, admins and developers, they also offer a fascinating insight into the motivation of hacking collectives such as Anonymous and LulzSec...

"Rather than being motivated by politics or injustice, hacking groups may simply be targeting organisations because Google Code search has turned up a vulnerability too tempting to ignore, making them less political action groups, more malicious 21st century Wombles," he said.

The best protection is to ensure that nothing useful to a hacker is included in code. If that is unavoidable, then the information should be stored separately and encrypted.

Colin Tankard, managing director of encryption and security specialist Digital Pathways, advised, "Obviously if the data is encrypted it protects that data wherever it goes as long as the key is never stored with the data. This adds extra control of who or what application is allowed access to the data. By applying encryption with access control organisations can define who or what is allowed access to data."
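The principle Tankard describes, ciphertext in one place and the key somewhere else entirely, can be illustrated with a minimal sketch. The stream cipher below is a toy built from SHA-256 purely so the example is self-contained; it is not production-grade cryptography, and in practice you would use a vetted library and a key-management service. The key name `DATA_KEY` and the nonce are illustrative assumptions.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key+nonce+counter (toy PRG)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(plaintext: bytes, key: bytes, nonce: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# The key lives outside the stored record -- in an environment variable,
# a hardware module, or a key-management service -- never with the data.
key = os.environ.get("DATA_KEY", "demo-key-only").encode()
nonce = b"record-0001"
record = encrypt(b"name=Alice;nationality=UK", key, nonce)

# Only `record` (ciphertext) goes to the cloud provider; without the key,
# whoever holds or seizes the stored data sees only noise.
assert decrypt(record, key, nonce) == b"name=Alice;nationality=UK"
```

Access control then reduces to key control, as the quote suggests: whoever can fetch the key can read the data, and no one else can, wherever the ciphertext travels.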

Source: eWeek 



Tuesday, October 25, 2011

eWeek Europe : Facebook Offers Developers HTML5 Resource Centre

Facebook has opened an HTML5 Resource Centre to help developers build and test HTML5 apps

To continue reading
http://www.eweekeurope.co.uk/news/facebook-offers-developers-html5-resource-centre-43310

Cloud To Drive Major IT Spend

Spending by public cloud service providers will grow at a sharp rate over the next few years, according to IDC
To continue reading: http://www.eweekeurope.co.uk/news/study-cloud-to-drive-major-it-spend-4334

XML Encryption flaw with no simple patch

Security researchers have cracked the XML Encryption standard used to protect data in major web applications. Two researchers from Germany's Ruhr-University demonstrated a practical attack against XML Encryption's cipher block chaining (CBC) mode at the ACM Conference on Computer and Communications Security in Chicago on 19 October. The technique affects messages encrypted with any of the algorithms supported by the XML Encryption standard, including the popular AES and DES.

'No simple patch'

"We were able to decrypt data by sending modified ciphertexts to the server - by gathering information from the received error messages," the researchers said in a statement.XML, or "eXtensible Markup Langugage", is used for storing and transporting data and is widely used for web applications such as business communications, e-commerce, financial services, healthcare, and government and military infrastructure. Standardised in 2002 by the W3 Consortium, an Internet standards group, XML Encryption is widely used by Apache, Red Hat, IBM and Microsoft in their XML frameworks. "There is no simple patch for this problem", said Juraj Somorovsky, one of the researchers, adding, "We therefore propose to change the standard as soon as possible." Researchers proposed replacing the CBC module in XML Encryption with a mechanism focused on both message confidentiality and message integrity. Adopting a new approach and changing the standard, however, would likely affect existing deployments and create backwards compatibility issues with older applications, the researchers said. A potential attack vector involves sending bogus messages to a targeted system and then using the information returned by the system to crack the encryption. "We show that an adversary can decrypt a ciphertext by performing only 14 requests per plaintext byte on average," they said. "This poses a serious and truly practical security threat on all currently used implementations of XML Encryption."

Workarounds

The German team notified all affected XML framework providers, including Amazon.com, IBM, Microsoft and Red Hat, via the W3C mailing list before releasing their paper, and has engaged in "intensive discussions on workarounds" with some of the affected organisations.

Amazon.com acknowledged the issue on 20 October and said it had fixed the related vulnerabilities in the XML-based messaging protocol SOAP (Simple Object Access Protocol) in its Elastic Compute Cloud (EC2) infrastructure. The company also checked to ensure no customers had been targeted by potential attackers. "The research showed that errors in SOAP parsing may have resulted in specially crafted SOAP requests with duplicate message elements and/or missing cryptographic signatures being processed," the company wrote in its Amazon Web Services security bulletin. "If this were to occur, an attacker who had access to an unencrypted SOAP message could potentially take actions as another valid user and perform invalid EC2 actions," the advisory continued.

Amazon said it would generally be difficult for attackers to obtain a pre-signed SOAP request or a signed certificate, but admitted it was possible if a customer was sending SOAP requests over plain HTTP connections instead of the more secure HTTPS. The researchers also disclosed cross-site scripting flaws that would have allowed attackers to obtain the certificate, according to Amazon.

This is not the first time the CBC mode in encryption protocols has been targeted. Last year two researchers developed a "padding oracle attack" to decrypt encrypted cookies and hijack users' secure sessions, a technique that affected Microsoft's ASP.NET framework and forced an emergency patch from Microsoft to close the hole.

Source: eWeek




Thursday, October 6, 2011

The Obama administration says switching some federal computer networks to cloud computing will save taxpayers money, but members of Congress are worried about the security risks.


Cloud computing refers to computer networks that are hosted by outside vendors and are accessible over the Internet.

Until now, the federal government has kept all its networks within its own computer systems.

Last week, the Department of Homeland Security granted a five-year, $5 million contract to computer company CGI Federal Inc. to manage some of its public Web sites. They include DHS.gov, FEMA.gov and USCIS.gov.

Members of the House Homeland Security subcommittee on cybersecurity want to know whether computer hackers who have broken into other Web sites could hack the government Web sites hosted by private companies.

The subcommittee plans a hearing Thursday on the risks of cloud computing.

"In light of the administration's 'Cloud First Policy' and the announced transition by the Department of Homeland Security to cloud computing, my subcommittee will be examining how government information is being managed and secured in the cloud environment," said Daniel E. Lungren (R-CA), chairman of the cybersecurity subcommittee.

The Cloud First Policy refers to President Barack Obama's plan to switch government Web site management to private companies when it can be done at lower cost without security risks.

The Homeland Security Department contract last week was the first of many planned for federal agencies.

Computer networks that contain classified information or represent a public threat if they are hacked will be served only by the government's servers and systems under the Cloud First Policy.

"We also want to hear how the private sector is implementing this shared technology option, its cost savings and risk concerns," Lungren said.

Cloud computing offers its customers easier updates to Web sites, less maintenance and lower costs for equipment and personnel.

The controversies for the government include the potential for layoffs among its computer staff and whether private contractors can be trusted to properly manage the government networks.

Homeland Security Department spokesman Larry Orluskie said his agency's contract with CGI Federal "maintains requisite security for the government's needs and delivers best-in-class return on investment for the citizens of the United States."

CGI Federal said in a statement that its computer management service "contains all of the required enterprise-wide security" the government requires.

The House hearing Thursday could influence whether Obama's Cloud First Policy gets carried out.

Republicans, who hold a majority in the House, must eventually approve funding for the program.

Private companies confront the same issues as the government, but are still making a big push toward cloud computing.

Four out of five businesses plan to switch to cloud computing soon, according to a survey of more than 900 large companies announced this week by the business consulting firm KPMG.

Ten percent of the companies surveyed reported they had already moved their core information networks from internal computers to cloud computing.

A grocery mentioned by KPMG in its study reported it could maintain its inventory better and increase sales by linking its suppliers through a cloud computing ordering network.

Cloud computing is "quickly shifting from a competitive advantage to an operational necessity," said Steve Hill, KPMG's vice chairman of strategic investments.

The cloud computing industry is expected to generate $177 billion in revenue by 2015, compared with $89.4 billion this year, according to industry forecasts.



Read more: http://www.allheadlinenews.com/articles/90061978?Congress%20to%20examine%20Obama%26%23146%3Bs%20%26%23147%3BCloud%20First%20Policy%26%23148%3B%20for%20computers#ixzz1a0bH5wTk