Saturday, January 29, 2011

Gartner Identifies the Top 10 Strategic Technologies for 2011

Cloud computing is at the top of the list. Please see the excerpts from Gartner's report. 

"Gartner defines a strategic technology as one with the potential for significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major dollar investment, or the risk of being late to adopt.

A strategic technology may be an existing technology that has matured and/or become suitable for a wider range of uses. It may also be an emerging technology that offers an opportunity for strategic business advantage for early adopters or with potential for significant market disruption in the next five years.   As such, these technologies impact the organization's long-term plans, programs and initiatives.

"Companies should factor these top 10 technologies in their strategic planning process by asking key questions and making deliberate decisions about them during the next two years," said David Cearley, vice president and distinguished analyst at Gartner.

"Sometimes the decision will be to do nothing with a particular technology," said Carl Claunch, vice president and distinguished analyst at Gartner. "In other cases, it will be to continue investing in the technology at the current rate. In still other cases, the decision may be to test or more aggressively deploy the technology."

The top 10 strategic technologies for 2011 include:

Cloud Computing. Cloud computing services exist along a spectrum from open public to closed private. The next three years will see the delivery of a range of cloud service approaches that fall between these two extremes. Vendors will offer packaged private cloud implementations that deliver the vendor's public cloud service technologies (software and/or hardware) and methodologies (i.e., best practices to build and run the service) in a form that can be implemented inside the consumer's enterprise. Many will also offer management services to remotely manage the cloud service implementation. Gartner expects large enterprises to have a dynamic sourcing team in place by 2012 that is responsible for ongoing cloudsourcing decisions and management.

Mobile Applications and Media Tablets. Gartner estimates that by the end of 2010, 1.2 billion people will carry handsets capable of rich mobile commerce, providing an ideal environment for the convergence of mobility and the Web. Mobile devices are becoming computers in their own right, with an astounding amount of processing ability and bandwidth. There are already hundreds of thousands of applications for platforms like the Apple iPhone, in spite of the limited market (only for the one platform) and the need for unique coding.

The quality of the experience of applications on these devices, which can apply location, motion and other context in their behavior, is leading customers to interact with companies preferentially through mobile devices. This has led to a race to push out applications as a competitive tool to improve relationships and gain advantage over competitors whose interfaces are purely browser-based.

Social Communications and Collaboration. Social media can be divided into: (1) Social networking: social profile management products, such as MySpace, Facebook, LinkedIn and Friendster, as well as social network analysis (SNA) technologies that employ algorithms to understand and utilize human relationships for the discovery of people and expertise. (2) Social collaboration: technologies such as wikis, blogs, instant messaging, collaborative office, and crowdsourcing. (3) Social publishing: technologies that assist communities in pooling individual content into a usable and community-accessible content repository, such as YouTube and Flickr. (4) Social feedback: gaining feedback and opinion from the community on specific items, as witnessed on YouTube, Flickr, Digg, Del.icio.us, and Amazon. Gartner predicts that by 2016, social technologies will be integrated with most business applications. Companies should bring together their social CRM, internal communications and collaboration, and public social site initiatives into a coordinated strategy.

Video. Video is not a new media form, but its use as a standard media type in non-media companies is expanding rapidly. Technology trends in digital photography, consumer electronics, the web, social software, unified communications, digital and Internet-based television and mobile computing are all reaching critical tipping points that bring video into the mainstream. Over the next three years Gartner believes that video will become a commonplace content type and interaction model for most users, and by 2013, more than 25 percent of the content that workers see in a day will be dominated by pictures, video or audio.

Next Generation Analytics. Increasing compute capabilities of computers including mobile devices along with improving connectivity are enabling a shift in how businesses support operational decisions. It is becoming possible to run simulations or models to predict the future outcome, rather than to simply provide backward looking data about past interactions, and to do these predictions in real-time to support each individual business action. While this may require significant changes to existing operational and business intelligence infrastructure, the potential exists to unlock significant improvements in business results and other success rates.

Social Analytics. Social analytics describes the process of measuring, analyzing and interpreting the results of interactions and associations among people, topics and ideas. These interactions may occur on social software applications used in the workplace, in internally or externally facing communities or on the social web. Social analytics is an umbrella term that includes a number of specialized analysis techniques such as social filtering, social-network analysis, sentiment analysis and social-media analytics. Social network analysis tools are useful for examining social structure and interdependencies as well as the work patterns of individuals, groups or organizations. Social network analysis involves collecting data from multiple sources, identifying relationships, and evaluating the impact, quality or effectiveness of a relationship.

Context-Aware Computing. Context-aware computing centers on the concept of using information about an end user or object's environment, activities connections and preferences to improve the quality of interaction with that end user. The end user may be a customer, business partner or employee. A contextually aware system anticipates the user's needs and proactively serves up the most appropriate and customized content, product or service. Gartner predicts that by 2013, more than half of Fortune 500 companies will have context-aware computing initiatives and by 2016, one-third of worldwide mobile consumer marketing will be context-awareness-based.

Storage Class Memory. Gartner sees huge use of flash memory in consumer devices, entertainment equipment and other embedded IT systems. It also offers a new layer of the storage hierarchy in servers and client computers that has key advantages — space, heat, performance and ruggedness among them. Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed. In that way, it looks more like disk drives where information is placed and must survive power-downs and reboots. Given the cost premium, simply building solid state disk drives from flash will tie up that valuable space on all the data in a file or entire volume, while a new explicitly addressed layer, not part of the file system, permits targeted placement of only the high-leverage items of information that need to experience the mix of performance and persistence available with flash memory.  

Ubiquitous Computing. The work of Mark Weiser and other researchers at Xerox's PARC paints a picture of the coming third wave of computing where computers are invisibly embedded into the world. As computers proliferate and as everyday objects are given the ability to communicate with RFID tags and their successors, networks will approach and surpass the scale that can be managed in traditional centralized ways. This leads to the important trend of imbuing computing systems into operational technology, whether done as calm technology or explicitly managed and integrated with IT. In addition, it gives us important guidance on what to expect with proliferating personal devices, the effect of consumerization on IT decisions, and the necessary capabilities that will be driven by the pressure of rapid inflation in the number of computers for each person.

Fabric-Based Infrastructure and Computers.  A fabric-based computer is a modular form of computing where a system can be aggregated from separate building-block modules connected over a fabric or switched backplane. In its basic form, a fabric-based computer comprises a separate processor, memory, I/O, and offload modules (GPU, NPU, etc.) that are connected to a switched interconnect and, importantly, the software required to configure and manage the resulting system(s). The fabric-based infrastructure (FBI) model abstracts physical resources — processor cores, network bandwidth and links and storage — into pools of resources that are managed by the Fabric Resource Pool Manager (FRPM), software functionality. The FRPM in turn is driven by the Real Time Infrastructure (RTI) Service Governor software component. An FBI can be supplied by a single vendor or by a group of vendors working closely together, or by an integrator — internal or external."

Thursday, January 27, 2011

Interesting stats on the pageviews of my blog

It seems that IE and Firefox hold the major market share among tech users.

 

China is missing from the list due to the language barrier and the fact that Baidu is the main search engine in China.

 

Windows accounts for 78% of pageviews.

 

Flipboard, my favorite iPad app, is also in the list.

 

I will publish another set of stats next year to see what has changed.

 

Pageviews by Countries

United States     748
United Kingdom    102
India              71
Canada             56
Netherlands        51
France             42
Germany            28
Russia             27
Australia          25
Italy              19

 

Pageviews by Browsers

Internet Explorer        643 (44%)
Firefox                  511 (35%)
Chrome                   186 (12%)
Safari                    78 (5%)
Opera                     10 (<1%)
FlipboardBrowserProxy      9 (<1%)
Mobile                     9 (<1%)
FlipboardProxy             5 (<1%)
SearchToolbar              2 (<1%)
Shiretoko                  2 (<1%)

 

 

Pageviews by Operating Systems

Windows                1,148 (78%)
Macintosh                151 (10%)
Other Unix               117 (7%)
iPhone                    18 (1%)
iPad                      13 (<1%)
Linux                     11 (<1%)
BlackBerry                 4 (<1%)
iPod                       3 (<1%)
Windows NT 6.1             1 (<1%)


Tuesday, January 25, 2011

SLA for your data on the cloud

The following post from eWeek indicates why it is important to have an SLA for your company's data stored at a cloud provider. You will first need to decide what data should be stored: in Honda's case, the VIN is not needed for web mail and as such should not be stored in the cloud, and a privacy impact analysis before moving data to the cloud would certainly have found this security hole. Then you need to make sure that sensitive data is stored in encrypted form and that the key management for the encryption meets the FIPS 140-2 standard. Third, you need to ask your provider to implement a proper identity and access management solution to limit access to your data, and finally you need to review and audit the security controls in place. Also, please read my checklist for an SLA for cloud services.

The following is from eWeek. 


In light of recent data breaches, including a December 2010 incident which affected 2.2 million Honda customers, IT managers need to limit what data is actually shared with cloud service providers.

Corporations partnering with cloud service providers need to think carefully about what data is being shared to adequately protect consumer privacy, according to security experts.

As companies outsource various business functions, which can range from e-mail marketing to e-discovery and e-mail archiving, it's important to remember that the data leaving the corporate network still needs to be protected. The protection should be worked into the contract, ensuring a standard level of security, before the company hands over the data.

There are many benefits to the cloud, but "all that goes out the window when there is a data breach," Ben Goodman, principal strategist for identity, security, and compliance at Novell, told eWEEK. When the cloud provider gets breached, the company that hired the provider is held responsible, he said. Companies "outsource the job, not the responsibility," Goodman added.

That was the case when an e-mail marketing firm that Honda partnered with had a data breach in late December. Criminals stole a database containing names, login names to a Honda portal, e-mail addresses, and 17-character Vehicle Identification Numbers for 2.2 million Honda customers, according to a Dec. 28 report in Columbus Dispatch. A separate list of 2.7 million Acura customer e-mail addresses was also stolen from the same marketing firm, but that list did not have any other customer data.

The inclusion of VINs in the stolen data was surprising, as Honda shared its customer information with the firm for e-mail marketing services. People understand that a certain level of "data granularity is required" when the data is stored on-premise, but when that information "crosses the firewall," it's not acceptable, said Goodman. Enterprises should be exercising "minimum disclosure," and not giving external providers "more data than they need," he said.

Enterprises are "not as concerned as they should be" about how other providers are using their data, Brian Singer, senior security management solution manager at Novell, told eWEEK. Honda's relationship with the e-mail marketing provider was analogous to the one an enterprise would have with a cloud provider, Singer said, and similar rules applied. Enterprises need to be careful what data they provide to cloud service providers, because the providers may not have the kind of security controls that would exist internally, he said.

"Don't take for granted the provider will secure the data," Goodman said.

Companies should discuss security measures such as access control, network monitoring, and regular audits when negotiating the partnership, Singer said. It should be clearly part of the contract that the provider needs to make sure the data is being secured, he said.

There is no reason to hand the information over to the provider as is, said Singer, even if it takes some extra time to remove unnecessary data fields from the files. Prior to sending the data, it needs to be examined to verify that the bare minimum of what the provider needs is sent, and nothing else, he said. In Honda's case, the VINs were clearly unnecessary for the firm to send customers a welcome e-mail after they bought a car at a dealership or created an account with Honda Financial Services.

Cloud applications deployed "under the radar" without IT approval or supervision are also a challenge for IT managers who need to ensure proper data granularity, Goodman said. "All the effort to put in access control and security" measures is compromised when the data is passed on to these cloud applications without thinking about whether it's really necessary to share all that information, Goodman said.

Another concern with cloud service providers is the fact that they are multi-tenanted, Singer said. It is difficult for the enterprise to know the extent of a data breach that hits a cloud service provider, and whether the data of one company or of multiple companies was compromised by the breach. "Companies don't know which companies are affected," he said.
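Singer's "bare minimum" advice above can be sketched as a simple field whitelist applied before any record leaves the corporate network. This is an illustrative sketch only; the field names and the allowed set are hypothetical:

```python
# Sketch of "minimum disclosure": strip fields a marketing provider
# does not need before the data leaves your network. Field names and
# the allowed set are hypothetical.
ALLOWED_FIELDS = {"first_name", "email"}   # all the mailer actually needs

def minimize(record):
    # Keep only the whitelisted fields; everything else stays in-house.
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

customer = {
    "first_name": "Alice",
    "email": "alice@example.com",
    "login_name": "alice01",
    "vin": "1HGCM82633A004352",   # never needed for a welcome e-mail
}
print(minimize(customer))
```

The point of a whitelist (rather than a blacklist) is that any new field added later is excluded by default, which matches the minimum-disclosure principle.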



Saturday, January 22, 2011

Checklist for drafting a Service Level Agreement for a Cloud Service

This is an initial list, for reference only and in no particular order. I welcome comments and insights and will update it as needed.

1: Name and content of the Service: specify the type of service (SaaS, PaaS, IaaS, or Data as a Service) and the business purpose of the Service.


2: How long is this Agreement valid? What happens when it expires? What is the renewal clause?


3: What is the scope of this agreement? What is included and what is excluded?


4: Specify the uptime and downtime. What is the maintenance window for the Service?


5: What is the procedure for prolonged changes to scheduled uptime and downtime?


6: Whom do you contact when the service is down?


7: What is the service availability target level? In a Cloud Computing environment, the service availability level should in principle approach 100%, but in reality this is not always possible, so setting a realistic target level is very important.
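To make an availability target concrete, it helps to translate the percentage into allowed downtime. A minimal sketch (the target percentages below are illustrative, not from any particular provider's SLA):

```python
# Translate "nines" of availability into allowed downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes (ignoring leap years)

def downtime_minutes_per_year(availability_pct):
    # The unavailable fraction of the year, expressed in minutes.
    return MINUTES_PER_YEAR * (1 - availability_pct / 100.0)

for pct in (99.0, 99.9, 99.99, 100.0):
    print(f"{pct}% -> {downtime_minutes_per_year(pct):.1f} min/yr")
```

Seeing that 99.9% still allows roughly 8.8 hours of downtime per year makes the negotiation over the target level much more tangible.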


8: What is your Recovery Time Objective (RTO) for the Cloud Service, and how does your Cloud Provider meet it? The RTO is the duration of time, and a service level, within which the Cloud Service must be restored after a disaster or disruption in order to avoid unacceptable consequences associated with a break in business continuity.


9: What is the Recovery Point Objective (RPO) for your data in the Cloud and how does your Cloud Provider meet your RPO? RPO is the point in time to which your organization must recover data in the Cloud.


10: What is your acceptable value for Mean Time Between Failures (MTBF), and how does your Cloud Provider meet this requirement? MTBF measures the predicted elapsed time between inherent failures of a Cloud Service during operation. What defines a failure for your Cloud Service? How do you and/or your Cloud Provider monitor and audit failure events?
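The MTBF definition above is straightforward to compute once "failure" is defined. A minimal sketch, with illustrative numbers:

```python
# MTBF = total operating time / number of observed failures.
def mtbf_hours(total_operating_hours, failure_count):
    if failure_count == 0:
        return float("inf")   # no failures observed yet
    return total_operating_hours / failure_count

# e.g. one year of operation (8,760 hours) with 4 qualifying failures
print(mtbf_hours(8760, 4))  # 2190.0
```

The hard part in an SLA is not this arithmetic but agreeing on what counts as a "failure" and who records it, which is why the item above asks those questions explicitly.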


11: Do you get any credit for service downtime? How is it calculated? The more precise the algorithm, the better the SLA.
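A precise credit algorithm can be written down directly in the SLA. Here is a sketch with a hypothetical tiered credit schedule; the tiers and percentages are invented for illustration and must come from the actual contract:

```python
# Hypothetical credit schedule: (minimum monthly uptime %, credit % of fee).
# Real tiers come from your provider's SLA.
CREDIT_TIERS = [
    (99.95, 0),     # met the target: no credit
    (99.0, 10),
    (95.0, 25),
    (0.0, 100),
]

MINUTES_IN_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def monthly_uptime_pct(downtime_minutes):
    return 100.0 * (MINUTES_IN_MONTH - downtime_minutes) / MINUTES_IN_MONTH

def service_credit_pct(downtime_minutes):
    uptime = monthly_uptime_pct(downtime_minutes)
    for threshold, credit in CREDIT_TIERS:
        if uptime >= threshold:
            return credit
    return 100

# ~21 minutes of downtime keeps uptime above 99.95% -> no credit;
# 500 minutes drops uptime to ~98.8% -> 25% credit under these tiers.
print(service_credit_pct(21), service_credit_pct(500))
```

Writing the schedule as a table like this removes any ambiguity about how a given outage translates into a credit.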


12: Define the various levels of support. For example, what will Help Desk support cover? What is second-level or third-level support if first-level support cannot solve the problem?


13: What are the escalation procedures for support issues?


14: Where is the data located? Should the data be located in a certain geographic area?


15: Who can access the Service? Can someone from another country access the service?


16: Who will operate and maintain the Cloud Service? Can it be an offshore provider?


17: What is the price per user, and for additional users? If the user count reaches a certain level, do you get a volume discount? Certain Cloud Service Providers do not provide volume discounts, and you need to know this up front.


18: If it is an IaaS service, specify the price for CPU time, network bandwidth, storage capacity, etc.


19: If it is a PaaS service, specify the OS and its version, database vendor and version, IDE tool versions, etc.


20: If it is a SaaS service, specify how this Service integrates with other services inside or outside of the Cloud Provider. Even if the SaaS is standalone, is there any published API for the customer to integrate in-house applications with this service in the cloud?


21: Who will create users, and how are they created? Who will provide User Provisioning and De-Provisioning services? Does your Cloud Provider support Service Provisioning Markup Language (SPML)? SPML is an XML-based framework, developed by OASIS, for exchanging user, resource and service provisioning information.


22: If Identity Federation is used, what kind of Federation? Who will provide the Secure Token Service? Who will be the Identity Provider? Does your Cloud Provider support SAML? Security Assertion Markup Language (SAML) is an XML-based open standard for exchanging authentication and authorization data between security domains, that is, between an identity provider (a producer of assertions) and a service provider (a consumer of assertions). SAML is a product of the OASIS Security Services Technical Committee.
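As an illustration of the kind of data SAML carries between an identity provider and a service provider, here is a minimal sketch that extracts the subject's NameID from a toy, unsigned assertion fragment using Python's standard library. Real assertions are digitally signed and carry conditions, audience restrictions, and more; this is not a SAML implementation:

```python
import xml.etree.ElementTree as ET

# A toy SAML 2.0 assertion fragment (illustrative only).
ASSERTION = """\
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Subject>
    <saml:NameID>alice@example.com</saml:NameID>
  </saml:Subject>
</saml:Assertion>
"""

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

def extract_name_id(assertion_xml):
    # Parse the assertion and pull out the federated user identifier.
    root = ET.fromstring(assertion_xml)
    node = root.find("./saml:Subject/saml:NameID", NS)
    return node.text if node is not None else None

print(extract_name_id(ASSERTION))  # alice@example.com
```

In a real federation the service provider would validate the signature and conditions before trusting the NameID; the sketch only shows the shape of the exchanged data.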


23: How is Access Management handled? Does your Cloud Provider support XACML? XACML stands for eXtensible Access Control Markup Language; it is a declarative access control policy language implemented in XML, together with a processing model describing how to interpret the policies.
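XACML policies boil down to attribute-based Permit/Deny decisions with a default of Deny. The following toy sketch mimics that decision model in plain Python; it is not real XACML, and the policy structure here is hypothetical:

```python
# A toy attribute-based access decision in the spirit of XACML's
# Permit/Deny model. The policy rules are invented for illustration.
POLICY = [
    # (required role, resource prefix, action) -> Permit
    ("admin", "/", "write"),
    ("analyst", "/reports/", "read"),
]

def decide(role, resource, action):
    for req_role, prefix, req_action in POLICY:
        if role == req_role and resource.startswith(prefix) and action == req_action:
            return "Permit"
    return "Deny"   # deny by default, as in XACML's deny-unless-permit style

print(decide("analyst", "/reports/q1", "read"))   # Permit
print(decide("analyst", "/reports/q1", "write"))  # Deny
```

The value of asking about XACML in an SLA is exactly this: it makes access decisions declarative and auditable rather than buried in application code.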


24: How does your Cloud Provider support elasticity and usage-based pricing? How frequently is the bill generated (per day, per month, or per year)? Is there any prorated calculation in the bill?
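Proration is easy to specify precisely in the agreement. A sketch with hypothetical rates (the dollar figures are invented for illustration):

```python
# Sketch of usage-based billing with proration, under assumed rates.
HOURLY_RATE = 0.12       # hypothetical $ per instance-hour
GB_MONTH_RATE = 0.10     # hypothetical $ per GB-month of storage

def prorated_storage_cost(gb, days_used, days_in_month=30):
    # Storage is billed per GB-month, prorated by days actually used.
    return gb * GB_MONTH_RATE * (days_used / days_in_month)

def compute_bill(instance_hours, storage_gb, storage_days):
    return round(instance_hours * HOURLY_RATE
                 + prorated_storage_cost(storage_gb, storage_days), 2)

# 100 instance-hours plus 50 GB stored for half a 30-day month
print(compute_bill(100, 50, 15))  # 14.5
```

Spelling the formula out like this in the SLA avoids disputes over whether partial-month usage is rounded up to a full billing period.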


25: How are security issues handled by the Cloud Provider? This is a very broad topic that can vary depending on the type of cloud service, so seek a Cloud Security Expert before drafting the agreement.


26: How is Change Management handled by the Cloud Provider?


27: What is the response time, and how does your Cloud Provider meet your response time requirement?


29: What is the process and procedure for getting downtime or outage credit? When can you get the credit, and how is the credit processed and applied?


30: What is the Domain Name Service for your Cloud Service, and what is the availability of that service?


31: Does your Cloud Provider meet the compliance requirements for PCI DSS, SOX, HIPAA, FISMA, GLBA, NERC CIP, GCSx, and GPG 13?


32: How is physical security handled by your cloud provider?


33: Is any encryption needed for your data in the cloud, and how is it handled? Does the encryption meet FIPS 140-2?


34: How easy is it for you to migrate your data and applications from one cloud provider to another?


35: Does your cloud provider support, or plan to support, the Distributed Management Task Force (DMTF) Open Cloud Standards Incubator standards, such as the Open Virtualization Format (OVF) and Systems Management Architecture for Server Hardware (SMASH), and the underlying DMTF management data model, the Common Information Model (CIM)?


36: Has your Cloud Provider had any recent high-profile outages? What are its incident response processes and procedures, and how can outages be avoided or minimized in the future?


37: How are Service Level Objectives (SLOs) defined? SLOs are needed to actually provide a tool for monitoring the SLA.


38: How are your data and applications isolated from those of other customers? Is this logical isolation or physical isolation?

 

Friday, January 21, 2011

HTTP Session Tracking Mechanism and the Security

HTTP is a stateless protocol: without a session tracking mechanism, the server cannot associate multiple HTTP requests with the same user.

There are four common approaches a server can use to track a user.

The following table describes each approach and its security impact:

 

Hidden Fields
Description: HTML hidden form fields carry the user's unique session ID.
When to Use: Rarely used for session tracking due to security concerns.
Security: Not a secure method, as the hidden fields can be intercepted and modified by an attacker.

URL Rewriting
Description: With URL rewriting, every local URL the user might click on is dynamically modified, or rewritten, to include extra information. The extra information can be in the form of extra path information, added parameters, or some custom, server-specific URL change. Due to the limited space available in rewriting a URL, the extra information is usually limited to a unique session ID.
When to Use: If the user's browser disables cookies, this is one of the best alternative methods.
Security: Can enable session fixation or man-in-the-middle attacks, because the session ID is exposed in the URL.

Persistent Cookie
Description: A cookie that is intended to maintain information over more than one browser session.
When to Use: When the application needs to track the user across sessions; widely used by shopping and news websites to remember user preferences.
Security: Can raise privacy and security concerns if the cookie, saved in persistent storage such as a file system or database, is revealed to an attacker.

Session Cookie
Description: A cookie that is intended to be used only in the browser session in which it is created.
When to Use: To track a user within one session; widely used by most commercial websites.
Security: Compared with the other approaches, this has proved somewhat more secure, although it is still vulnerable to session hijacking and cookie poisoning attacks. See http://en.wikipedia.org/wiki/HTTP_cookie
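For the session cookie approach above, here is a minimal sketch of generating an unguessable session ID and emitting a Set-Cookie header with the HttpOnly and Secure flags, using only Python's standard library (the cookie name is arbitrary):

```python
from http.cookies import SimpleCookie
import secrets

def make_session_cookie():
    # Generate an unguessable session ID (mitigates session prediction).
    sid = secrets.token_urlsafe(32)
    cookie = SimpleCookie()
    cookie["SESSIONID"] = sid
    morsel = cookie["SESSIONID"]
    morsel["httponly"] = True   # not readable from JavaScript (limits XSS cookie theft)
    morsel["secure"] = True     # only sent over HTTPS (limits interception)
    morsel["path"] = "/"
    # No "expires"/"max-age": this is a session cookie,
    # discarded when the browser closes.
    return sid, morsel.OutputString()

sid, header = make_session_cookie()
print("Set-Cookie:", header)
```

The HttpOnly and Secure attributes address the interception and script-theft risks noted in the table; session hijacking via a leaked ID remains possible, which is why the ID must be random and the channel encrypted.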

 

Thursday, January 6, 2011

Litigation Win: Google vs. Microsoft on DoI Cloud Service worth $60 Million

According to Bizjournals, quoted here:


"A federal court ordered the Interior Department to halt all plans to award a contract for web-based email, following a lawsuit filed by Google Inc. that claimed the solicitation broke procurement rules.

U.S. Court of Federal Claims Judge Susan Braden filed an injunction on Tuesday that barred the Interior Department from proceeding with or awarding a contract to implement Microsoft Corp.'s suite of online email and collaboration tools, and remanded the procurement — which seeks to consolidate 13 different e-mail systems into a single web-based platform — back to Interior "for additional investigation or explanation." The contract is worth nearly $60 million.

Google sued Interior in October for allegedly violating procurement rules by specifying in its bid request that "only the Microsoft Business Productivity Online Suite" for federal government could be proposed."


This is one of the battles between Google and Microsoft over cloud services, and it seems that Google is winning the first round by obtaining an injunction. Overall, it will delay the government's cloud adoption and intensify the competition in cloud offerings among big companies such as Google, Microsoft, Amazon, etc. It will be interesting to follow any new developments in this litigation.



Read more: Google gets judge to halt Microsoft contract | Washington Business Journal 
http://www.bizjournals.com/washington/blog/2011/01/court-halts-fed-contract-with-microsoft.html?ed=2011-01-06&s=article_du&ana=e_du_pap


Tuesday, January 4, 2011

Dell to expand into Cloud Security space with purchase of Atlanta's SecureWorks



According to SecureWorks home page: 
Dell Announces Intent to Acquire SecureWorks, Inc
SecureWorks is a US-based managed security services provider headquartered in Atlanta with secure operations centers in Atlanta, Myrtle Beach and Chicago. SecureWorks provides managed security services to companies seeking to protect computer, network and information assets from malicious activity (cybercrime). SecureWorks' services are used by organizations to fully outsource management and monitoring of security, co-manage security devices (such as firewalls, intrusion prevention systems and intrusion detection systems), monitor 24x7 for security incidents, provide a hosted security information management (SIM) platform for companies who want to monitor security activity themselves, or any combination of the above.


For more information, please see: 
http://www.secureworks.com/
With other recent news on Dell's agreement to purchase Fluid Data storage (see link below), it appears to me that Dell is very serious about its Cloud Service offerings.
http://www.compellent.com/About-Us/News-and-Events/Press-Releases/2010/101213-Dell-CML.aspx?ref=HPDellPR