
Why we support the public cloud too

by John Grange

This summer has absolutely flown by. The highlight for us has been finally rolling out our managed public cloud service. It was a long time in the making, and we received great early customer input that helped us shape the offering. Now that we’ve been providing our services on AWS and Azure for a couple of months, I’ve noticed some recurring questions and misunderstandings that I thought I would address.

Given the positive reception to our nascent public cloud services, and in the interest of providing a clearer definition, I thought I’d give a quick rundown of what our public cloud services are and why companies need managed public cloud.

Why is there a need?

As companies rebuild and replatform their line-of-business apps to leverage new technologies, the public cloud is a natural choice because of its scale, flexibility, and ever-evolving tool sets. There are definitely reasons to run certain things, like your ERP or other core production workloads, in your on-premises datacenter or a private cloud, but for less critical systems or cloud-native applications the public cloud provides many benefits.

Despite all the fancy interfaces and capabilities, businesses still need to ensure data security, privacy, and governance. In the prevailing shared responsibility model, the burden still falls on the customer to enforce security above the infrastructure layer. With tools, concepts, and capabilities that are vastly different from in-house environments, companies now need new processes and staff with a different set of skills.
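To make that a little more concrete, here’s a minimal sketch of the kind of task that sits on the customer’s side of the shared responsibility model: scanning S3 bucket ACLs for public grants with boto3. The credentials and buckets are assumed to come from your own environment; this is purely illustrative, not a description of our service tooling.

```python
# Minimal sketch: flag S3 buckets whose ACLs grant access to "AllUsers".
# Assumes AWS credentials are already configured in the environment.
import boto3

ALL_USERS_URI = "http://acs.amazonaws.com/groups/global/AllUsers"

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    acl = s3.get_bucket_acl(Bucket=name)
    public_grants = [
        grant for grant in acl["Grants"]
        if grant.get("Grantee", {}).get("URI") == ALL_USERS_URI
    ]
    if public_grants:
        print(f"WARNING: {name} is publicly accessible: "
              f"{[g['Permission'] for g in public_grants]}")
```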

A managed public cloud provider alleviates many of these issues by providing the setup and day-to-day maintenance so that internal staff can focus on the application itself.

We enable public cloud adoption in a supported, secure, and enterprise-ready way

Since we already provide hands-on support, advanced monitoring, and secure, compliant configurations on our own infrastructure, it wasn’t much of a stretch for us to extend that service onto Azure or AWS. The biggest challenge was building processes around the PaaS elements and ancillary services such as Azure Backup and Azure Site Recovery.

In the end, we provide our clients with an instant ops team to set up, configure, and secure their environments, along with a support capability to monitor and respond to incidents. We take away a ton of the risk while maximizing the value of the public cloud’s inherent scale and tooling.

Key service attributes:

- Best-practice environment configuration

- 24x7 support

- Health and performance monitoring (see the sketch after this list)

- Hardened OS configurations and user access controls
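As an illustration of the kind of monitoring that attribute refers to, here’s a minimal sketch of creating a CPU alarm for an EC2 instance with boto3 and CloudWatch. The instance ID, SNS topic, and threshold are hypothetical examples, not our actual standards.

```python
# Minimal sketch: a CloudWatch alarm that notifies on sustained high CPU.
# The instance ID, SNS topic ARN, and threshold below are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="prod-web-01-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                 # evaluate 5-minute averages
    EvaluationPeriods=2,        # two consecutive breaches before alarming
    Threshold=80.0,             # percent CPU
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
)
```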

Doesn’t this negate the cost advantages of public cloud?

I hear this a lot, but typically not from actual clients. Most organizations exploring the public cloud are doing so because of the operational efficiency that comes with the scale and tool sets available on those platforms. The cost of the actual server resources is really only a small part of the equation. If a company can focus its internal resources on directly supporting its users rather than on servers and maintenance, the benefits of the public cloud become substantial.

Adding management services to cover the customer’s day-to-day responsibilities in a public cloud environment allows companies to move faster because the key “boxes” are checked. The public cloud lets you move faster and can be secure; our goal is to make it easier for companies to get there.

3 tips for making your multi-cloud approach wildly successful

by John Grange

As I speak with customers and partners, it’s striking how many of them are no longer choosing between infrastructure in their own datacenters and going all-in on one of the public clouds. More and more companies are taking hybrid or multi-cloud approaches to their applications and infrastructure, a practice that maximizes the value and utility of the cloud. When you can right-size your infrastructure to match your technical and cost requirements, you run more efficiently and retain more flexibility as time goes on and requirements change. In IT, things are always changing, so it’s wise to put a high premium on flexibility.

So why isn’t everybody right-sizing their workloads through a hybrid cloud model? Like everything else in IT, it really comes down to inertia and fear. The inertia stems from the tendency of organizations to keep doing what they’ve always done; it’s an easy route to take because it’s generally harder to get fired for a decision not made than for the decision to blaze a new path. The fear comes not only from change itself, but from enterprise concerns over data security in the cloud. Recent analysis shows that data governance and security are major concerns for companies considering cloud computing.

If the multi-cloud approach, with its efficient, right-sized workloads and variable cost model, is so obviously advantageous, what’s the best way to overcome the organizational inertia and fear and adopt it? We have a lot of experience in this realm since we offer public, private, and hybrid cloud services. Here are some things we see successful companies doing to adopt a multi-cloud approach:

1. Find the low-hanging fruit

All workloads aren’t created equal. To make your first foray into the public cloud successful, start with an application that would be relatively easy to move into a new environment. A good example is a web application that runs on common database software and uses a fairly vanilla configuration. Oftentimes the “low-hanging fruit” is a non-essential or internal application. If your first migration to the public cloud is successful, it will be easier to get organizational support to move other applicable workloads there as well.

2. Leverage vendors and tools

Just as you use a wide range of software tools and vendors to run your datacenter, you should do the same to manage a multi-cloud environment. As any good engineer will tell you, sometimes it’s about having the right tools for the job. Leveraging vendors can help you ensure security, monitor performance, troubleshoot problems, and increase the general reliability of your applications. The most powerful reason to do this is that it reduces the burden on your team and ultimately allows you to do much more with less.

3. Enforce consistency

Consistency is really important. Whether it’s OS configurations, access methodologies, or deployment processes, consistency increases stability and enhances security. As a matter of security and organization, your public cloud presence should be consistent with your private cloud. This doesn’t mean they have to be exact replicas; it means that the general processes, guidelines, and procedures you use everywhere else are part of your public cloud environment, regardless of whether you use the exact same tools to achieve that parity. Enforcing consistency will save you headaches by minimizing mistakes and ensuring enterprise security regardless of where the data sits.
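As a purely illustrative sketch of what enforcing that consistency can look like, the snippet below compares an sshd_config file against a small baseline of expected settings. The baseline values and file path are hypothetical examples, not a recommended hardening standard.

```python
# Minimal sketch: audit an OS config file against a baseline of expected
# settings. The baseline and path are hypothetical, for illustration only.
BASELINE = {
    "PermitRootLogin": "no",
    "PasswordAuthentication": "no",
    "X11Forwarding": "no",
}

def parse_sshd_config(path):
    """Return the key/value pairs found in an sshd_config file."""
    settings = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            parts = line.split(None, 1)
            if len(parts) == 2:
                settings[parts[0]] = parts[1].strip()
    return settings

def audit(path):
    actual = parse_sshd_config(path)
    for key, expected in BASELINE.items():
        found = actual.get(key, "<unset>")
        status = "OK" if found == expected else "DRIFT"
        print(f"{status:5} {key}: expected {expected!r}, found {found!r}")

if __name__ == "__main__":
    audit("/etc/ssh/sshd_config")
```

The same idea applies to any configuration you care about keeping identical across private and public environments: define the baseline once, then check every environment against it.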

What to think about when considering data-at-rest encryption

by John Grange

In our previous post in our security best-practices series we addressed data-in-flight encryption, what it means, and offered some tips for implementing it in your environment. Like data-in-flight, many compliance regulations require your data-at-rest to be encrypted as well. Data-at-rest is the inactive data that’s being digitally stored on your servers. While keys, access policies, and audits are also critical, encryption is the front line in protecting your data-at-rest.

Encrypting your data while it’s at rest can be a much more complex and costly operation than encrypting your data-in-flight. Oftentimes, depending on a number of factors, encrypting the data you’re storing can require changes to physical hardware or adjustments to your application so it can interact with an encrypted file system.

Forecasts suggest that by 2017, two-thirds of all workloads will be processed in the cloud. Protecting that data is challenging because the popular cloud hosting platforms vary in their security practices, customization options, and capabilities. Understanding your data footprint and the available encryption options is key to avoiding a costly data breach and meeting compliance regulations. Ensuring your cloud hosting vendor offers compliance options that include encryption of your data-at-rest is a good way to find a vendor with a strong orientation toward security and compliance.

Here are a few things to consider when looking to encrypt your data-at-rest:

If possible, use Self-Encrypting Drives (SEDs)

A SED has a built-in ability to encrypt any data coming in and decrypt any data going out. Self-encrypting disks are easy to implement, and their use is essentially invisible to users. Because the encryption is native to the disk itself, you can achieve very high performance despite the encrypting and decrypting of data as it’s written and read. SEDs are more expensive than regular drives but are a sure-fire way to protect your data.

Choose software-based full-disk encryption wisely

There are countless full-disk encryption packages out there, and some are better (much better) than others. Make sure to choose software from a vendor that’s stable and will continue to support its product. Your solution should also use industry-standard encryption algorithms, not proprietary ones, and provide key management. Finally, if you need to encrypt data that’s already in place, be sure to choose an option that doesn’t require re-partitioning the server (something Microsoft’s BitLocker requires). Software-based full-disk encryption is a less expensive proposition than SEDs, but it degrades server performance and introduces complexity.
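Full-disk encryption itself isn’t something you implement in a few lines of application code, but as a rough illustration of the “industry-standard algorithms plus key management” point, here’s a minimal sketch using the Python cryptography library’s Fernet recipe (AES under the hood). The key handling shown is deliberately simplistic; in practice the key would live in a proper key-management system, separate from the data it protects.

```python
# Minimal sketch: encrypting data at rest with an industry-standard
# recipe (Fernet, from the "cryptography" library) rather than a
# homegrown algorithm. Key storage here is simplistic and illustrative;
# real deployments should use a dedicated key-management system.
from cryptography.fernet import Fernet

# Generate a key once and store it securely, separate from the data.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"customer record: account 12345, balance 987.65"
ciphertext = fernet.encrypt(plaintext)   # what actually lands on disk
recovered = fernet.decrypt(ciphertext)   # requires the same key

assert recovered == plaintext
```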

Native solutions are your best bet

Native solutions are implementations of encryption that are “built in” to a system. A SED is a native solution in that the encryption is actually built into the disk itself, which is why SEDs perform well and are simpler to implement. There are also encryption options that are a component of the file system your operating system uses. While these implementations can require additional software, NTFS (Windows) and ext4 (Linux) are common file systems that have native encryption capabilities. Encrypting and decrypting data as it’s written or read creates performance overhead, and the closer that process happens to the actual disk, the better the performance you get.