The Cloud Goes Underground

Extreme weather, seismic events, and even rodents have compromised the physical security of cloud servers and other data center infrastructure. Selecting an underground colocation facility built to above-industry standards provides a solution to these and other threats.

With cyberattacks such as Petya and WannaCry making big headlines recently, it’s understandable that fortifying cybersecurity is top-of-mind for many CIOs. Last May, WannaCry infected 200,000 computers in 150 countries, including the U.S., UK, Russia, and China, attacking hospitals, banks, and telecommunications companies.

A mere six weeks later, Petya struck — first hitting targets in Ukraine, including its central bank, main airport, and even the Chernobyl nuclear power plant, before quickly spreading and infecting organizations across Europe, North America, and Australia. Its victims included a UK-based global advertising agency, a Danish container ship company, and an American pharmaceutical giant.

Virtual Security Is Only Half the Equation

According to the 2017 BDO Technology Outlook Survey, 74% of technology chief financial officers say cloud computing will have the most measurable impact on their business this year, while IDC predicts that at least 50% of IT spending will be cloud-based in 2018. Although cyberattacks remain a significant threat in this environment, it’s important to remember that virtual security is only half of the equation. With the cloud growing ever more critical to businesses, ensuring the physical security of cloud servers is also essential.

Physical security at the colocation or data center facility is critical to effectively safeguarding not only cloud computing, of course, but also mission-critical business applications, data storage, networking, and computing related to Big Data analytics and emerging technologies such as artificial intelligence and IoT-enabled devices. To be fully secure, companies must ensure that their colocation provider can deliver a high level of physical resilience on-site. As evidenced by the devastation wreaked by Hurricanes Harvey and Irma, these physical threats include not only extreme weather events but also seismic disturbances, breaches by unauthorized intruders, and, given the current geopolitical climate, terrorism.

Explosives and Squirrels

In recent years, many customers have pushed physical security down their data center to-do lists. However, physical threats remain real and have the potential to become much more sophisticated. As the late Uptime Institute founder Kenneth Brill wrote, “The oldest cyber frontier is actual physical attack or the threat of attack to disable data centers. Previously in the realm of science fiction, asymmetrical physical attacks on data centers by explosives, biological agents, electromagnetic pulse, electric utility, or other means are now credible.”

While electromagnetic pulses do sound like the stuff of science fiction, some physical security breaches perpetrated against data centers have been more suggestive of a Quentin Tarantino crime drama, and others, a Pixar animated movie for children starring woodland creatures. This is not to minimize the economic impact of these attacks or the damage they have caused to business reputations.

Consider the Chicago-based data center that experienced a physical security breach not once but twice in the span of two years. In the first breach, a lone IT staffer working the graveyard shift was held hostage and his biometric card reader taken from him, allowing the masked assailants to freely enter the facility. They made off with computer equipment estimated at upwards of $100,000. In the second, resourceful miscreants managed to break through a wall using a chainsaw and stole servers.

Yahoo once saw half its Santa Clara data center taken down by squirrels that managed to chew their way through power lines and fiber-optic cables. Google “Yahoo and frying squirrels” if you think this episode is referenced merely for entertainment purposes. It is not.

Among the most infamous physical breaches to have taken place was a 2011 attack on Vodafone’s data center in Basingstoke, England. A gang broke in and stole servers and networking equipment, causing systems to go down and the telecom company’s business reputation to suffer greatly.

The Rock-Solid Safety of Colocating Underground

For some companies, the cloud and IT infrastructure altogether have moved even farther from the skies into underground data centers. Data center operators have been retrofitting underground bunkers into functional data centers for many years. But as security and energy demands, as well as concerns about terrorism, have intensified, there is an increasing trend toward building subterranean colocation facilities to host mission-critical infrastructure and data.

Today, you’ll find underground data center facilities in Lithuania, the Netherlands, Switzerland, Ukraine, the United Kingdom, and Sweden, as well as the U.S. Some of these facilities were previously the site of mining operations while others were originally Cold War era bunkers designed to protect citizens in the event of a nuclear attack.

Surrounded by rock, underground data centers are highly physically secure, and since subterranean temperatures are naturally regulated, environmental conditions are more efficient. But not all underground data centers are created equal. Key design factors to consider during the site selection process include utilities infrastructure, availability and capacity of fiber-optic systems, exposure to natural and man-made hazards, and how well the physical perimeter of the facility can be secured.

The issue of location is especially critical, and any data center selection needs to consider whether the facility is in a flood zone or if the region has an unstable seismic profile. The most fortified underground data centers also implement multi-layered security access methods, including visual inspections from multiple 24×7 guard stations, keycard access, video monitoring, and biometric scanning. Best practices incorporate mantraps and restrictive access policies for each customer’s space, providing security within each zone of the facility.

To ensure business continuity, it’s advantageous for all critical infrastructure of a subterranean data center to be located underground. This extends to dual utility feeds backed up by 2 MW generators and N+1 critical infrastructure components, including UPS, chillers, and battery backup. Such a design is further strengthened when the facility is SOC 2 Type 2 certified, giving customers confidence in any 100% uptime guarantee the provider offers.

And because connectivity means everything, subterranean facilities should also have access to high-speed, carrier-class internet and data services through a fiber network that runs in and out of the data center via multiple fiber paths and entrances.

From presidential bunkers and NORAD facilities hosting military analysts, to scientists studying astrophysics in subterranean laboratories, to Warner Brothers film archives stored safely away from the elements, some of the world’s most essential personnel, valued assets, and activities are located underground, protected from natural and most man-made disasters. So why should your cloud servers and critical data be any different?

But the crux of the matter is this: with cyberattacks on the increase and the cloud ever more critical to business, the importance of physical data center security cannot be overstated. The underground data center is a prime example of using the earth itself to offer protection from natural and unnatural disasters.

Powered by WPeMatico

What You Don’t Know About Public Cloud Might Hurt You

Companies seem to be moving to public cloud in droves, as conventional wisdom holds that it’s more user-friendly, scalable, and affordable than private cloud. So is it ‘bye-bye’ to on-premises storage? Not necessarily.

While public cloud adoption has grown steadily over the last several years, organizations are now waking up to the hidden costs involved, both in fees beyond the quoted price per GB and in the potential cost to a business’s data security. Companies are finding that data transit fees for public cloud storage can double the cost of basic storage, and can run three or four times higher if data is moved often. Yet these fees are frequently ignored when organizations consider the use of public cloud, leading to gross miscalculations and overspending. Then there is the question of data security and availability. Breaches of public cloud have become commonplace, and yet organizations are led to believe the public cloud is safe for the most sensitive data. How can this be?

The answer is twofold. First, the major public cloud providers have huge marketing budgets, and with that comes the ability to dominate the airwaves with their message and get in front of a large set of customers. Second, public cloud can be the right solution under certain circumstances, primarily for short-term storage. But with recent headline-grabbing public cloud outages from the likes of AWS S3 and Azure, and related data leak risks coming to light, the fundamental importance of keeping greater control over the most critical data has come back into focus.

A related problem is that CIOs and IT leaders are failing to read the fine print on their public cloud contracts. In many of these contracts, the vendor has very little obligation to the customer. In reality, durability and availability should be managed just as they would be for on-premises storage infrastructure, with the CIO superimposing an architecture on top of the cloud to establish the desired level of certainty.

This all leads to organizations investing heavily in public cloud solutions that not only sacrifice control over security and data locality but also, in the long run, cost more.

So why does everyone think public cloud is cheaper?

Perception and reality of the public cloud do not always align. Although the public cloud may appear more affordable, and is certainly marketed that way, the reality is that once organizations are tied into recurring monthly fees, the outlay becomes expensive, and transit and other fees can be vastly higher than anticipated. Some cloud service providers also charge per user, so although public cloud promotes unlimited scalability, that scalability can come with a heavy price tag.
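To make the point concrete, here is a back-of-the-envelope cost model. The per-GB prices are hypothetical round numbers chosen for illustration, not any vendor's actual rates.

```python
# Back-of-the-envelope comparison of storage vs. data transit (egress) fees.
# The per-GB prices are illustrative placeholders, not real vendor rates.

def monthly_cost(stored_gb, egress_gb, storage_per_gb=0.02, egress_per_gb=0.09):
    """Return (storage_cost, egress_cost) in dollars for one month."""
    return stored_gb * storage_per_gb, egress_gb * egress_per_gb

# 10 TB stored; the full data set read back out twice during the month.
storage, egress = monthly_cost(stored_gb=10_000, egress_gb=20_000)
print(f"storage: ${storage:,.2f}  egress: ${egress:,.2f}")
# At these rates the transit fees alone are several times the quoted
# storage price, which is exactly the hidden cost described above.
```

The quoted price per GB covers only the first term; the second grows with how often data moves, which is why it is so easy to miscalculate.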

Many companies also use public cloud as a way to increase data resilience, since data stored in the public cloud can use a form of data protection called erasure coding. However, erasure coding only protects against certain hardware failures. It does not protect against input errors, human maliciousness, ransomware and other malware, or software corruption. Erasure coding can also add significant latency, affecting application performance and response time. It is also common for public cloud vendors such as Amazon to charge for replication, the copying of data across multiple data centers, adding further to the cost. As a result, IT teams often end up selecting a less sophisticated public cloud vendor in an effort to save costs, but this introduces more risk, as these smaller vendors also have less sophisticated protection.
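For readers unfamiliar with erasure coding, the sketch below shows its simplest form: a single XOR parity shard, as in RAID-5. Production systems use Reed-Solomon codes that tolerate multiple losses, but the principle, and the limitation described above, is the same.

```python
# Minimal sketch of the simplest erasure code: one XOR parity shard
# (RAID-5 style). Losing any single shard is recoverable; corrupting
# a shard in place is not detected, which is the gap noted in the text.
from functools import reduce

def add_parity(shards):
    """Append one parity shard: the byte-wise XOR of all data shards."""
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shards))
    return shards + [parity]

def reconstruct(shards, lost_index):
    """Rebuild the shard at lost_index by XOR-ing the surviving shards."""
    survivors = [s for i, s in enumerate(shards) if i != lost_index]
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))

data = [b"clou", b"d st", b"orag"]        # three equal-size data shards
coded = add_parity(data)                  # four shards; any one may be lost
assert reconstruct(coded, 1) == b"d st"   # a lost data shard is recovered
# If ransomware overwrites a shard in place, the scheme happily serves
# and re-protects the bad data; erasure coding is not a backup.
```

This is why the text stresses that erasure coding guards against hardware loss, not against malware, input errors, or software corruption.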

In contrast, on-premises private cloud can provide the same agility as the public cloud, but within the organization’s own environment, offering more functionality such as higher performance, local control and protection against malware.

Data is the lifeblood of an organization and critical to its success, which is why more businesses are retaining information for longer and using this to gain insight into customer behavior and trends. Storage is therefore critical and companies need a comprehensive IT infrastructure that is built to offer the same agility of the public cloud, such as seamless file sharing, but with the added security and capability on-premises offers, to safeguard a company’s most sensitive data long-term. 

Availability and security

Relying on third-party public cloud providers also brings availability and security into question. As the aforementioned outages demonstrate, public cloud services are not immune to downtime. This is something smaller organizations especially cannot afford. AWS, for example, recovered from its outage several months ago with only a small impact on its revenue, but it’s not the same story for SMBs, which may not be able to recover at all. Downtime in any form can have critical consequences for smaller organizations.

So is public cloud too risky? To be clear, public cloud isn’t going anywhere, and it provides crucial benefits to many businesses. However, before opting to trust a public cloud service provider with all of your data, it’s important to understand which data is most critical to business survival. For that data, too much is at stake to place it in the hands of an outside party, and the additional cost makes doing so even harder to justify. The only way to maintain full control while also minimizing expenses is to include an on-premises solution as part of your infrastructure. This way, organizations can achieve the agility of the cloud at a lower cost, with guaranteed control over data privacy, availability, and security when it matters most.

It’s time we updated the conventional wisdom on the role, limitations, and true cost of the public cloud.

Multi-Cloud In The Context Of Market Verticals

As is well known, the cloud market is dominated by large cloud service providers (CSPs) such as Amazon, Google, and Microsoft; Amazon Web Services alone holds 47.1% of the public cloud market. Smaller CSPs increasingly have to diversify their offerings and collaborate, rather than compete, with large and other small providers, for example to deliver specialised services for a particular market vertical.

The concept of cloud computing is being defined and redefined by the larger players, and the smaller players are continually playing catch-up. The agility and functionality of the smaller players, however, is increasing to support a more flexible approach to integrating and working with public cloud providers. The large CSPs are interested in providing cost-effective, profitable computing services, and more niche specialist requirements are largely ignored; this has handed a partnership advantage to the smaller CSPs. It is facilitated by tools such as AWS Direct Connect and Azure ExpressRoute, which allow on-premises or data centre providers to scale into the public cloud on demand.

The concept of multi-cloud in the context of smaller providers now spans multiple dimensions:

  • Hybrid cloud: Providing customers the ability to hyperscale from their private cloud environments into public cloud providers.
  • Network connectivity: Providing interconnects to restricted networks such as the Health and Social Care Network (HSCN), which large public cloud providers cannot serve directly.
  • Backup and disaster recovery: Using geographically diverse and distinct CSPs/data center operators to deliver enterprise solutions.

Smaller CSPs need to provide a comparable specialised service and toolset to match the agility that the larger players offer.

Collaborate, not compete

The domain-specific knowledge that smaller CSPs now hold is driving larger CSPs to collaborate with them to close the gap in their own expertise.

As an example, in the healthcare domain, CSPs have gained domain-specific knowledge because they are close to the end users. Often CSP domain experts are engaging not with technical people but with clinicians who are at the cutting edge of adopting new and innovative cloud-based technologies as part of their service to patients. It is not just a case of CSPs selling equipment as a service; they need to create a service that meets the needs of the end user, and this cannot be done one step removed from the user’s domain experts. An overall service offering that matches all of the client’s requirements for availability, performance, security, privacy, and so on will drive increasing demand for alliances with other CSPs, and providers will need to be prepared to enable and facilitate such scenarios.

Reputation

Larger cloud service providers, particularly those delivering to the public sector, must consider the reputational risk associated with their business relations with smaller CSPs. Recently, DataCentred, a Manchester, UK-based data center provider, went into administration after its only customer, the UK government’s HM Revenue and Customs (HMRC), opted to move its entire environment into the AWS public cloud. It can be argued that building a business on one, admittedly large, customer is high-risk and that DataCentred should have diversified and built a wider customer base. However, even with a more diverse customer portfolio, the loss of this disproportionately large customer would have forced a major restructuring and downsizing, and even then bankruptcy may not have been avoided. The episode has fueled public concern about predominantly American providers delivering data center services to EU-based businesses and governments. To counter this perceived problem, many public cloud providers are opting to work with local CSPs to front services that are backed by the public cloud.

Security and Privacy

At the CloudWATCH Summit in September 2017, Nicola Franchetto, senior associate and data protection officer at ICT Legal Consulting, explained that the upcoming EU General Data Protection Regulation (GDPR) will have broader territorial reach than the current regime. The new regulation will apply not only to data controllers and processors in the EU, but also to processors outside the EU that offer goods or services to data subjects in the EU and/or monitor the behaviour of data subjects in the EU.

In this context, small providers that master local regulations and data protection in the cloud will play a valuable role as allies of the big players, from within and outside the EU, that need to demonstrate adherence to the EU GDPR. Further, application and service providers in privacy-demanding vertical domains will be inclined to opt for cloud offers that ease their compliance and supply the necessary security and privacy controls, such as those provided by the MUSA framework.

In conclusion, the picture of cloud service provision is not as simple as the marketing of the larger CSPs would have us believe. Domain expertise and local expertise are frequently cited as reasons for a collaborative approach to providing services, leading to an increase in multi-cloud service provision. The future impact of GDPR cannot be ignored by large CSPs and will further increase the need for partnership alliances. Large CSPs are finding that it is often easier to collaborate with smaller local CSPs than to ignore them. After all, cloud computing is a collaboration between customer and supplier, and expanding that relationship is natural.

How The Cloud Can Help Your Business Get Compliant With GDPR

The UK’s Brexit planning has started in earnest, and companies and organizations are rightly looking at what leaving the European Union (EU) will mean for their operations and staff.

However, amid wide-ranging business concerns is a new piece of legislation affecting personal data that could have similar aftershocks: the General Data Protection Regulation (GDPR), which will apply to the UK from May 25, 2018.

GDPR is intended to strengthen data protection for individuals within the EU while imposing restrictions on the transfer of personal data outside the European Union to third countries or international organizations.

It would be a mistake for any data controller or processor to assume that, because they know and adhere to the existing Data Protection Act 1998 (DPA), GDPR will be similar and no additional compliance work is required.

GDPR brings a set of new and different requirements, and anyone with day-to-day responsibility for data protection must monitor the regulations and ensure their organization is compliance-ready ahead of next year.

Compliance requires investment as well as specialist knowledge, and many business leaders are looking at how the cloud can help with their data storage, protection, and management while meeting GDPR compliance as well.

GDPR is the biggest challenge facing data management in the last 20 years; it’s no exaggeration to say that it is presenting business leaders with a headache.

A survey from analyst firm Gartner earlier this year showed that around half of those affected by the legislation, whether in the EU or outside, will not be in full compliance when the regulations take effect.

The message coming forward is that the cloud is the preferred option to help with the upgrading of data security practices and data protection standards in line with the regulations.

As the May 2018 deadline draws ever closer, moving data to the cloud can help ease the burden faced by senior IT leaders, many of whom see GDPR compliance as their top priority.

As a leading cloud services provider, we are increasingly being asked about GDPR considerations from concerned clients migrating to the cloud.

We believe that the task of migrating people’s data such as emails, contacts, files, calendars, and tasks over to Office 365 will make compliance easier for organizations.

During any cloud migration process, the most important outcome, particularly with GDPR compliance ahead, is that data sovereignty is maintained and that full control with comprehensive reporting is provided.

After migration comes management, the next phase of cloud adoption that is vital to GDPR compliance, addressing both security and data protection.

Organizations and service providers require a tool that can control multi-tenant Office 365 users in an intuitive and cost-effective way.

Alongside the need for increased security, capabilities such as bulk transaction processing, advanced hierarchical management, and role-based access control will all help companies comply with the increasingly stringent access controls required by GDPR.
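As a rough illustration of what role-based access control means in practice, the sketch below attaches permissions to roles and checks every action against a user's role set. The roles and permissions are invented examples; a real deployment would map them to the product's actual admin roles.

```python
# Minimal role-based access control (RBAC) sketch: permissions attach to
# roles, users hold roles, and every action is checked against the roles.
# Role and permission names here are hypothetical examples.
ROLE_PERMISSIONS = {
    "helpdesk":     {"reset_password"},
    "tenant_admin": {"reset_password", "create_user", "export_mailbox"},
    "auditor":      {"view_audit_log"},
}

USER_ROLES = {
    "alice": {"tenant_admin"},
    "bob":   {"helpdesk", "auditor"},
}

def is_allowed(user, permission):
    """True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert is_allowed("alice", "export_mailbox")
assert not is_allowed("bob", "export_mailbox")  # helpdesk can't export data
```

The value for GDPR is that access decisions live in one auditable table rather than being scattered through application code.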

GDPR compliance before May 25, 2018 isn’t an option for those doing business with EU countries, it’s a necessity. Organizations will need to look across their business and manage their data holistically to ensure compliance and avoid sanctions. With GDPR coming into effect in a matter of months, the time to act is now.

5 Ways To Protect Your Business From Being Hacked

The past couple of months have been a huge wake-up call for businesses in terms of their cybersecurity. With large enterprises such as Equifax being successfully attacked, costing the company billions as well as its reputation as a reliable service to trust, there are no longer any doubts about the importance of protecting company-sensitive information.

Protecting your business from cyber criminals is easier said than done. For every new security measure you integrate into your business, cyber criminals mastermind a new method to circumvent your security and access critical data.

This doesn’t mean you can’t protect your data from hackers. Here are some techniques which you can implement that will greatly diminish the chance of being successfully hacked.

Encrypt Business Critical Data to Mitigate the Risk of Being Hacked

Think about all the customers your company has gathered personal credentials from in the last year alone. Imagine having to explain that their data has been breached and is in the grasp of an unidentified cybercriminal. This is not a conversation that any CEO should have to initiate.

It is important to encrypt all sensitive data. Good data protection practice also includes making use of trusted, credible financial service providers such as PayPal or Google Wallet.
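As a small illustration of protecting credentials at rest with only the standard library, the sketch below stores a salted, slow hash rather than the password itself; encrypting other sensitive fields would typically rely on a vetted cryptography library instead.

```python
# One standard at-rest practice for stored credentials: never keep
# passwords in plaintext; derive a salted hash instead.
import hashlib
import hmac
import secrets

ITERATIONS = 200_000  # deliberately slow to resist brute-force attacks

def hash_password(password):
    """Return (salt, digest) for storage; the password itself is discarded."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Even if the database leaks, the attacker holds salted hashes rather than usable credentials, which is the point of the practice.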

Ensure That Your Security Software is Up To Date

Your employees are immensely valuable to the daily operations of your business. At the same time, due to their daily interaction with company-sensitive information, they are also your Achilles heel.

In order to reduce the risk from threats such as malware and spyware, it is important to keep your antivirus software up to date. Employees are known to browse “shady” websites for entertainment now and then during the workday; it is inevitable in most establishments.

However, if you wish to mitigate the risk to your company, it is important to limit the sites that can be accessed through work terminals to diminish the window of opportunity for an attack against your company.

Physical Security Has Never Been More Important

The mistake that many businesses make is focusing solely on online security and neglecting to secure their physical assets. This doesn’t mean you have to hire people to guard your computers, but you should treat the physical security of your team and your workplace as just as important as your cybersecurity.

Doing this reduces the chance of a physical breach of data, such as stolen hardware. It may not stop the determined thief, but it will make them reconsider whether the job is worth the time and effort.

Protect Yourself from Common Cybercriminal Techniques

Cyber criminals work diligently to find vulnerabilities in a company’s security. The smallest flaw in your security is a door wide open for any looming, knowledgeable cybercriminal. The good news is that there are fairly easy patches to rectify vulnerabilities in your security system.

One common form of cyberattack is SQL (structured query language) injection. Essentially, criminals feed malicious queries into your e-commerce site’s input fields in order to extract sensitive information from your database.

Another frequently used technique is known as cross-site scripting. This attack injects malicious scripts into your pages, which then run in visitors’ browsers, stealing their data or redirecting them to a malicious website.

Both of these attack strategies can cause irreparable harm to your business, which underlines the importance of using a trusted e-commerce platform. The good news is that because these are web-platform techniques, you can reduce the risk by deploying a web application firewall.
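The two mitigations described above can be sketched with the standard library alone: bound parameters neutralize SQL injection, and output escaping defuses cross-site scripting. The table and values below are invented for the example.

```python
# Sketch of the two standard mitigations: parameterized queries against
# SQL injection, and output escaping against cross-site scripting (XSS).
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111-1111-1111-1111')")

# A classic injection payload...
evil = "x' OR '1'='1"
# ...is harmless when passed as a bound parameter instead of being
# spliced into the SQL string: it matches nothing rather than everything.
rows = conn.execute("SELECT card FROM users WHERE name = ?", (evil,)).fetchall()
assert rows == []

# XSS: escape untrusted input before echoing it back into a page, so the
# browser renders it as text instead of executing it as a script.
comment = "<script>steal(document.cookie)</script>"
safe = html.escape(comment)
assert safe == "&lt;script&gt;steal(document.cookie)&lt;/script&gt;"
```

Had `evil` been concatenated directly into the query string, the `OR '1'='1'` clause would have matched every row and dumped the table, which is exactly the attack the text describes.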

Secure Your Data in Case of Emergencies

Oftentimes, ransomware is injected into a business database. When criminals hit your business, their main objective is to blackmail you with your own data. The ransom varies with the malicious intent of the hacker, but one thing is for certain: it will be costly to your business.

If you’re starting to wonder how you can afford to make these changes, consider cutting back costs in your business to give you some more spending money. For example, consider automating your accounts payable department. This will prevent you from overpaying your employees to do a task that can be done through payment software.

Malicious breaches will also drastically diminish the trust that customers place in your establishment. In order to reduce the risk of being blackmailed with ransomware, businesses are advised to implement an effective data recovery tool.

Not only will this protect you from the consequences of cybercrime, but it will also mitigate the risk of losing data during severe incidents such as natural disasters.

Stay on Top of it and Embrace Security

Unfortunately, your business can still be hacked no matter how hard you try to avoid it. Keeping your staff educated is one of the best ways to keep your business as safe as possible. The last thing you want is for your employees to start caring only after an attack has already taken place. As long as you are prepared, you may be able to stop an attack before it ruins all of your data.

Why You Should Consider Moving Unified Communications To The Cloud

In today’s evolving business landscape, executives are looking to technology to help transform their operations, enabling them to be more agile and efficient. To help achieve this, executives are increasingly incorporating cloud solutions into their business strategies to help them stay competitive. Whether it is virtualizing the data center, deploying new applications or extending network capacity, cloud solutions are becoming critical for today’s enterprise companies. With this in mind, enterprises are increasingly considering moving unified communications (UC) into the cloud as well.

Cloud services for unified communications can offer measurable value for organizations when compared to traditional PBX services. As communications equipment becomes outdated or needs to be replaced, a cloud service can look especially attractive as it can offer a host of benefits that are essential in executing a wide range of digital transformation initiatives. Outlined below are several reasons why you should consider choosing a cloud-based unified communications model as your next unified communications solution.

Cost-Effectiveness

One of the main reasons companies look to cloud solutions for their unified communications needs is the cost advantage. Today, it is very cost-effective to host a phone system over the internet because businesses are charged for the service rather than for expensive switching equipment located on premises. This eliminates the installation and maintenance costs that a traditional phone system would require. Additionally, most cloud phone systems offer unlimited local and long distance calling, which is also a substantial benefit for businesses looking to minimize expenses.

Ease of Use

Because of the complexity of today’s communications systems, it can sometimes take an entire IT department or a third-party vendor just to manage the upkeep of a traditional phone system. Cloud-based communications can help alleviate the burden by eliminating maintenance, IT workload, and some of the more costly internal infrastructure. Having a standardized point of contact and connectivity can streamline operations for IT teams, enabling them to focus on driving future business initiatives instead of maintaining current systems.

Quality of service

For every business, uptime is pivotal, and reputable cloud providers deliver it through redundant infrastructure. To keep operations running smoothly, businesses rely on the ability to leverage remote work teams, manage multiple office spaces, or serve customers from anywhere in the world. For businesses requiring this flexibility, cloud communications maximizes coverage through multiple data centers, helping them avoid costly interruptions and potential downtime.

Functionality

On-premise phone systems can bring challenges to an organization that is expanding quickly or has varying needs. Alternatively, cloud-based unified communications solutions can provide the flexibility and scalability that a business needs whether it is adding a new office space, moving locations, or sizing up or down now or in the future. When using cloud-based systems, businesses can access and add new features without any new hardware requirements, offering quick and easy solutions for both installing and maintaining unified communications systems.

Digital transformation and cloud solutions are revolutionizing multiple industries, and unified communications is no exception. To ensure your business is adequately prepared for a digital transformation, it is critical to begin integrating cloud-based applications into your current unified communications model. As businesses continue to evolve and adopt new technologies, remaining agile and scalable based on need is becoming increasingly important. With every business looking to keep up, offering solutions that will provide considerable business value will be beneficial now and in the future.

AWS’s Kubernetes support is a step in the right direction

Kubernetes is open source technology originally built by a team at Google that has since received support from several enterprises, including Microsoft, Oracle, and IBM. This container orchestration layer has many advantages, including the ability to run an application on any public cloud, which makes it easy to migrate Kubernetes workloads from one cloud vendor to another.
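One way to see why workloads are portable is that a Kubernetes object is just a provider-neutral manifest; kubectl accepts JSON as well as YAML, so the same document can be applied on any conformant cluster. The names and image below are hypothetical examples.

```python
# Sketch of Kubernetes portability: the desired state is a plain,
# provider-neutral document. The same Deployment can be applied
# unchanged on any conformant cluster, regardless of cloud vendor.
import json

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "web"}},
        "template": {
            "metadata": {"labels": {"app": "web"}},
            "spec": {"containers": [
                {"name": "web", "image": "example/web:1.0"}
            ]},
        },
    },
}

manifest = json.dumps(deployment, indent=2)
# e.g. `kubectl apply -f deployment.json` on one provider's cluster,
# then the identical file on another provider's cluster.
assert json.loads(manifest)["spec"]["replicas"] == 3
```

Nothing in the manifest names a cloud vendor, which is precisely what makes migrating between providers straightforward.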

At the AWS re:Invent show last week, Amazon’s announcement that AWS will support Kubernetes was warmly received by the AWS user base. The fear had been that AWS would wander off in its own proprietary direction on container orchestration, even though Kubernetes was the de facto standard.

The ability for AWS to adapt to dynamic movements in the technology market, even if it means moving away from technologies it created, is a check in the plus column. The fact of the matter is that public cloud providers need to serve what enterprises need, and not look after their own “vision” or selfish interests.

Cloud providers should offer the market many places to get the same technology. The fact that Kubernetes is available from Microsoft, Google, IBM, Oracle, and now AWS means that enterprises have a choice as to which they want to rent from.

In the larger market, this approach means the playing field gets a bit more level. Cloud providers can adopt best-of-breed tools, both proprietary and open source, and work from the needs of their enterprise customers to the technology they offer. This is not only the right thing to do, but it should drive even more enterprises into the awaiting arms of public cloud providers.

The struggle is not over, however. I suspect that the major public cloud providers will still try to own markets using proprietary technology that they don’t share with their competition. In fact, this is what the major public cloud providers already do a lot today. Such proprietary offerings differentiate them from other public cloud providers, but I suspect that enterprises would prefer them to be more alike.
