
A Buyer’s Guide to Cloud


Most businesses have discovered the value that cloud computing can bring to their IT operations. Some have found that it helps them meet regulatory compliance priorities by placing workloads in a SOC 2 audited data center. Others see a cost advantage as they approach a server refresh and costly hardware needs to be replaced; they recognize the benefit of treating that infrastructure as an operational expense rather than the large capital expense they would otherwise make every three years. No matter the business driver, the typical business person isn’t sure where to start to find the right cloud provider. In this fast-paced and ever-changing technology environment, these IT managers may wonder: is there a buyer’s guide to the cloud?

Where Exactly is the Cloud?…and Where is My Data?

Except for the cloud hyperscalers (Amazon AWS, Microsoft Azure, and Google Cloud), cloud providers create their product in a multi-tenant data center. A multi-tenant data center is a purpose-built facility designed specifically for the needs of business IT infrastructure and accommodates many businesses. These facilities are highly secured and often unknown to the public. Many offer additional colocation services that allow their customers to enter the center to manage their own servers. This is a primary difference from the hyperscalers, which offer no possibility of customers seeing the sites where their data resides. The hyperscale customer doesn’t know where their data is beyond a region of the country or an availability zone. The hyperscaler’s customer must base their buying decision on trusting the security practices of the large technology companies Google, Amazon, and Microsoft. These are some of the same organizations currently under scrutiny from governments around the world for data privacy concerns. For most cloud seekers, the buying decision should therefore start at the multi-tenant data center, and a buyer’s guide for the cloud should begin with the primary characteristics to evaluate in that data center, listed below.

  1. Location – Location is a multi-faceted consideration in a data center. First, the data center needs to be close to a highly available power grid and, ideally, to alternate power companies. Similarly, the telecommunications bandwidth needs to be abundant, diverse, and redundant. Finally, the proximity of the data center to its data users is crucial because speed matters. The closer the users are to the data, the lower the latency, which means happier cloud users.
  2. Security – As in all areas of IT today, security is paramount. It is important to review the data center’s security practices, both physical and technical.
  3. People behind the data – The support staff at the data center creating and servicing your cloud instances can be the key to success. They should have the proper technical skills, be responsive, and be available around the clock.

Is My Cloud Infrastructure Portable?

The key technology that has enabled cloud computing is virtualization. Virtualization inserts a software layer between the physical hardware and the operating systems, called a hypervisor, that allows hardware resources to be shared. This allows multiple virtual servers (VMs) to be created on a single hardware server. Businesses have used virtualization for years, with VMware and Microsoft Hyper-V being the most popular choices. If you are familiar with, and have some secondary or backup infrastructure on, the same hypervisor as your cloud provider, you can create a portable environment. A solution where VMs can be moved or replicated with relative ease avoids vendor lock-in. One primary criticism of the hyperscalers is that it can be easy to move data in but much more difficult to migrate it out. This lack of portability is reinforced by the proprietary nature of their systems. One technology the hyperscalers are beginning to use to become more portable is containers. Containers are similar to VMs, but they don’t require a guest operating system for each virtual server. So far this has had a limited effect on portability because containers are a leading-edge technology and have not yet met widespread acceptance.

What Kind of Commitment Do I Make?

The multi-tenant data center offering a virtualized cloud solution will include an implementation fee and require a commitment term with the contract. Its customized solution will require pre-implementation engineering time, so it will look to recoup those costs. Both fees are typically negotiable, and this is a good example of where an advisor like Two Ears One Mouth can guide you through the process and save you money.

The hyperscaler requires neither charge: it doesn’t provide custom solutions, and because it is difficult to leave, a term commitment isn’t needed. The hyperscaler will, however, offer a discount with a contract term as an incentive for a commitment; these offerings are called reserved instances. With a reserved instance, the monthly recurring charge (MRC) is discounted in exchange for a one- or three-year commitment.

Finding the best cloud provider for your business is a time-consuming and difficult process. When considering a hyperscaler, the business user will receive no support or guidance. Working directly with a multi-tenant data center is more service-oriented but can consume a great deal of the cloud buyer’s time. The cloud consumer can work with a single data center representative who states “we are the best” and simply trust them. Alternatively, they can interview multiple data center representatives and build the ambiguous “apples to apples” spreadsheet of prospective vendors. Neither approach is effective.

At Two Ears One Mouth IT consulting we will listen to your needs first and then guide you through the process. With our expertise and market knowledge, you will be comforted to know we have come to the right decision for your company’s specific requirements. We save our customers time and money and provide our services at little or no cost to them!

If you would like assistance in selecting a cloud provider for your business contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

we listen first…

Are End User Compute and Virtual Desktop the Same Thing?

One of the challenges in staying current with business technologies in the market today is the fact that many of the technologies overlap. Naming conventions complicate it even further. I’ve recently come across this again as I have studied End User Compute (EUC) and Virtual Desktop Infrastructure (VDI). VDI is a subset of technologies of the broader category of EUC.

Both are important components of the cloud and data center world, so I feel compelled to provide a better description and comparison here.

End User Compute

EUC can be defined or thought of as the “last mile” of the data center, a term borrowed from the telecommunications industry. For this reason, it piques the interest of many data center mavens. EUC helps organizations use servers or software in the data center to automate or improve the end users’ desktop experience.

There have always been two primary providers in EUC: VMware and Citrix. Not by coincidence, these have also been leaders in software for the business data center. VMware’s product, Horizon View, provides virtual desktop capabilities built on VMware’s virtualization technology. Citrix’s XenDesktop is thought by many to be a more feature-rich product with better mobile integration and administration.

Recently, the large public cloud providers have started to embrace this technology. Amazon Web Services (AWS) has created a development platform for its partners to help them specialize in certain technologies, and one of the first technologies addressed was EUC.

Virtual Desktop Infrastructure  

VDI is defined as the software-based technology that hosts a desktop operating system (OS) on a centralized server in the data center. It is a form of server-based computing, where the end users don’t have a unique and independent desktop. An image of the desktop is stored on the server and delivered to the desktop device on bootup.

VDI creates an environment where the end user has very little control of their desktop. For the IT manager, this means fewer headaches and fewer help desk tickets, which can mean lower costs for the business. Additional savings can come from the desktop devices themselves: re-purposed PCs or “dumb” terminals can be used at the desktop because the storage, processing, and RAM requirements for the workstation are lower with VDI.

For as long as I can remember, VDI has been the technology on the brink of mass adoption. I believe one of the reasons we in the IT industry had this perception is that IT managers recommended it: VDI can make their jobs easier and help eliminate the burden of shadow IT.

One issue holding back VDI is its cost in two primary areas: bandwidth and storage. In many cases, some or all of the VDI servers will be hosted off site, and moving this data back and forth can prove to be a challenge. The only way to be sure you have the right amount of bandwidth is to perform a valid proof of concept.

Storage can also be an issue, more with regard to the type of storage than the quantity. Faster storage with greater input/output operations per second (IOPS), such as solid-state storage, is required to avoid problems like a “boot storm.” A boot storm is the degradation of service that occurs when a significant number of end users boot up within a very narrow time frame and overwhelm the storage and network with data requests.
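To put the boot-storm concern in rough numbers, here is a minimal back-of-the-envelope sketch in Python. The per-desktop IOPS figures are illustrative assumptions, not measurements or vendor specifications, so substitute numbers from your own proof of concept.

```python
# Rough boot-storm sizing sketch; all per-desktop figures are assumptions.
DESKTOPS = 500            # users booting within a narrow window
STEADY_STATE_IOPS = 10    # assumed average IOPS per desktop after login
BOOT_IOPS = 60            # assumed IOPS per desktop while the OS boots

steady_total = DESKTOPS * STEADY_STATE_IOPS
boot_total = DESKTOPS * BOOT_IOPS

print(f"Steady-state load: {steady_total:,} IOPS")
print(f"Boot-storm peak:   {boot_total:,} IOPS")
# Storage sized only for the steady state would be overwhelmed at boot time,
# which is why faster solid-state tiers are typically recommended for VDI.
```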

Are VDI and EUC really the same thing, with the difference being merely semantics? Possibly; they are very similar technologies that overlap considerably. Another thing the two have in common is that they are perfect examples of my golden rule of technology implementation. When considering new solutions, an organization needs to start with these questions: is there a problem to be solved, and what is the return on investment? Buying technology for technology’s sake, or to satisfy a limited number of users, does not produce results that support the overall business.

If you would like to talk more about strategies for cloud solutions contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

we listen first…

Azure enabling precision agriculture


FarmBeats is an AI for Earth Lighthouse project aimed at showcasing the benefits of Azure in a variety of applications. Water scarcity and pollution are threatening the livelihood of farmers around the world; they are under immense pressure to produce. Through sensors, drones, data analytics, AI, and connectivity solutions, Microsoft is enabling precision agriculture to improve yield while reducing resource consumption.

Active Directory (AD) in the Cloud

 


Most business users’ first experience with the cloud is through Software as a Service (SaaS) such as Microsoft Office 365. Many times, as the business’s cloud presence grows and its infrastructure becomes more balanced between cloud and on-premises systems, new challenges emerge. A common challenge of this integration is synchronizing the directory of users across all parts of the network. Microsoft manages the directory of users with Active Directory (AD). Most Microsoft-based networks have had an AD server on the premises that manages user identification and authentication. As connectivity outside the premises has increased through e-commerce and cloud computing, new technologies have been developed for AD. Office 365 users, whether they realize it or not, use the AD system within Microsoft Azure. Microsoft offers this same cloud-based AD system to all Azure users as Azure AD. Before detailing a cloud-based AD strategy, I will briefly review the benefits of an AD system on the corporate network.

1.      Identify the Environment – AD creates central identification and authentication across all platforms and locations of the corporate network.

2.      Enable Users – AD gives users more of a self-service experience, less dependent on corporate IT resources. Users can also receive benefits like single sign-on (SSO) when logging on to multiple applications or services.

3.      Protect Corporate Data – Authentication is the most basic form of network security; it verifies users on a network much as a passport verifies travelers entering a country.

All public cloud providers offer some form of directory service; in this article I will focus on Microsoft Azure. Most administrators who consider Azure AD are concerned that it will create another complicated layer on top of the on-premises AD server. Actually, the opposite is true; it can offer a kind of “AD lite.” It assists by breaking a user’s identification down into simple fields such as name, tenant, role, and password.

The same Microsoft Azure AD that is used as the directory for Office 365 is free to Azure users. However, there are premium tiers that offer additional functionality. These premium levels offer value-added features such as company branding and user self-service features like password reset.

By storing a business’s directory services and authentication in the public or private cloud, a business creates a secure and always-available directory service. Azure AD is completely scalable and can be integrated with other services through APIs and web-based protocols. This also allows integration with on-premises AD servers and single sign-on for all applications. Azure AD can be thought of as identity as a service.
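To make “identity as a service” concrete, here is a minimal Python sketch that signs a user in against Azure AD with Microsoft’s MSAL library. The tenant ID, client ID, and scope are hypothetical placeholders; you would substitute the values from your own Azure AD app registration.

```python
# Minimal Azure AD sign-in sketch using MSAL (pip install msal).
# TENANT_ID and CLIENT_ID below are hypothetical placeholders.
import msal

TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-app-registration-client-id"

app = msal.PublicClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
)

# Opens a browser window; Azure AD performs the actual authentication.
result = app.acquire_token_interactive(scopes=["User.Read"])

if "access_token" in result:
    print("Signed in; the token can now be presented to Microsoft Graph or your own API.")
else:
    print("Sign-in failed:", result.get("error_description"))
```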

Azure AD services can be managed directly in the Azure portal for simple configurations. More sophisticated deployments may be managed with familiar tools such as Active Directory Domain Services (AD DS), Lightweight Directory Access Protocol (LDAP), and Active Directory Federation Services (AD FS).

Office 365 provides a basic outline of how these identity services work. Its directories can be configured three ways: cloud only, synchronized identity, and federated identity. A cloud-only identity is created within the Office 365 admin portal and managed behind the scenes through Azure AD. Synchronized identity accounts are created on an on-premises AD server, with passwords kept in sync with the cloud; this method uses the cloud as the ultimate basis for the directory. Federated identity is more complex: users are based in the on-premises directory and kept in sync with the cloud, but ultimate verification is retained by the on-premises AD services.

Cloud services benefit most IT infrastructure environments, although they may also create complications. Employing a synchronized directory for all users and applications, on and off premises, creates a stable foundation to identify and protect all users and data on the corporate network.

 

If you would like to talk more about strategies to migrate data to the cloud contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

we listen first…

How to Get Started with Microsoft Azure


image courtesy of Microsoft.com

A few months ago I wrote an article on getting started with Amazon Web Services (AWS); now I want to follow up with the same for Microsoft Azure. Microsoft Azure is the public cloud offering deployed through Microsoft’s global network of data centers. Azure has continued to gain market share on its chief rival, AWS. Being in second place is not something Microsoft is used to with its offerings; however, in the cloud, as with internet browsers, Microsoft got off to a slow start. Capturing market share will not prove as simple against AWS as it was against Netscape in the web browser market of the ’90s, but in the last two years progress has been made. Much of that progress can be attributed to Satya Nadella, Microsoft’s current CEO, who proclaimed from his start a commitment to the cloud. Most recently, Microsoft has expressed a commitment to supporting Linux and other operating systems (OS) within Azure. Embracing other operating systems and open source projects is new for Microsoft and seems to be paying off.

Like the other large public cloud providers, Microsoft has an easy-to-use self-service portal for Azure that makes it simple to get started. In addition to the portal, Microsoft entices small and new users with a free month of service. The second version of the portal, released last year, has improved the user experience greatly. Microsoft’s library of pre-configured cloud instances is one of the best in the market: a portal user can select a preconfigured group of servers that makes up a complex solution like SharePoint. The SharePoint instance includes all the components required: the Windows Server, SQL Server, and SharePoint Server. What previously would have taken hours can now be “spun up” in the cloud with a few clicks of your mouse, and there are dozens of pre-configured solutions like this SharePoint example. The greatest advantage Microsoft has over its cloud rivals is its deep and long-established channel of partners and providers. These partners, and the channel Microsoft developed for its legacy products, allow it to provide the best support of all the public cloud offerings.

Considerations for Getting Started with Microsoft Azure

  1. Decide the type of workload – It is very important to decide not only which workloads can go to the cloud but also which applications you want to start with. Start with non-production applications that are not critical to the business.
  2. Define your goals and budget – Think about what you want to achieve with your migration to the cloud. Cost savings? Transferring IT from a capital expense to an operational expense? Be sure to calculate a budget for your cloud instance; Azure has a great tool for cost estimation, and a rough worked example follows this list. In addition, check your costs as you go. The cloud has developed a reputation for starting out low-cost and growing quickly.
  3. Determine your user identity strategy – Most IT professionals are familiar with Microsoft Active Directory (AD), Microsoft’s application that authenticates users to the network behind the corporate firewall. AD has become somewhat dated, challenged not only by the cloud’s off-site applications but also by today’s limitless mobile devices. Today, Microsoft offers Azure Active Directory (AAD), which is designed for the cloud and works across platforms. At first, you may implement a hybrid approach between AD, AAD, and Office 365 users, starting with a synchronization of the two authentication technologies. At some point, you may need to add federation, which provides additional connectivity to other applications such as commonly used SaaS applications.
  4. Security – An authentication strategy is a start for security, but additional work will need to be done. A future article will detail cloud security best practices; in the meantime, while it is always best to have a security expert recommend a solution, there are some general best practices worth mentioning here. Use virtual machine appliances whenever possible: virtual firewall, intrusion detection, and antivirus appliances add another level of security without adding hardware, and devices such as these can be found in the Azure Marketplace. Use dedicated links for connectivity where possible; they cost more but eliminate threats from the open Internet. Disable remote desktop and secure shell access to virtual machines; these protocols exist to make managing virtual machines over the Internet easier, and once they are disabled, use point-to-point or site-to-site Virtual Private Networks (VPNs) instead. Finally, encrypt all data at rest in virtual machines.
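As a companion to the budgeting step in item 2 above, here is a rough monthly cost estimate sketched in Python. The hourly VM rate and egress price are illustrative assumptions only; always confirm current figures with the Azure pricing calculator.

```python
# Back-of-the-envelope Azure cost estimate; rates below are assumptions, not Azure's price list.
VM_HOURLY_RATE = 0.10       # assumed pay-as-you-go rate per VM hour (USD)
VM_COUNT = 4                # virtual machines planned for the first migration
HOURS_PER_MONTH = 730
EGRESS_GB = 500             # assumed outbound data per month
EGRESS_RATE_PER_GB = 0.087  # assumed per-GB egress rate (USD)

compute = VM_HOURLY_RATE * VM_COUNT * HOURS_PER_MONTH
egress = EGRESS_GB * EGRESS_RATE_PER_GB

print(f"Estimated compute: ${compute:,.2f}/month")
print(f"Estimated egress:  ${egress:,.2f}/month")
print(f"Estimated total:   ${compute + egress:,.2f}/month")
# Re-run the estimate against actual consumption each month; cloud bills tend to drift upward.
```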

Practically every business can find applications to migrate to a public cloud infrastructure such as Azure. Very few businesses put their entire IT infrastructure in a public cloud environment. A sound cloud strategy, and determining which applications to migrate enables the enterprise to get the most from a public cloud vendor.

If you would like to learn more about Azure and a cloud strategy for your business contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

 

3 Reasons to Use Your Hometown’s Datacenter and Cloud Provider


photo courtesy of scripps.com

Now that the business cloud market has matured, it has become easier to recognize the leaders of the technology as well as the providers that make the most sense to partner with your business. There are many large public cloud providers, and most agree on three leaders: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud. Google has been an uncharacteristic laggard in the space and seems to be struggling with the Business-to-Business (B2B) model. Clearly, a B2B strategy can evolve from a Business-to-Consumer (B2C) strategy; one need look no further than the public cloud leader, AWS.

Whether Google Cloud can succeed is unclear. What is clear, however, is that there will always be a place for the large public cloud providers. They have fundamentally changed how IT in business is done. The mentality the public cloud helped to create, “move fast and break things,” has been an important concept for the enterprise IT sandbox.

Where Does the Local Data Center Fit in?  

I also believe there will always be a place in business IT for the local data center and cloud provider. The local data center and cloud provider mentioned here is not an engineer putting a rack up in his basement, or even the IT service provider whose name you recognize but who is hosted in someone else’s data center. The local data center I am referencing has been in business many years, most likely before the term “cloud” was invented. My hometown of Cincinnati, Ohio has such a respected data center, 3z.net. 3z has been in business for over 25 years and offers its clients a 100% uptime Service Level Agreement (SLA). It has all the characteristics a business looks for in an organization it trusts with its data: generator backup, multiple layers of security, and SOC 2 compliance. It uses only top-tier telecom providers for bandwidth, and its cloud infrastructure is built on technology leaders such as Cisco and VMware. Most of all, 3z is easy to do business with.

Following are three primary reasons to use a local data center.

  1. Known and Predictable Cost – The local data center’s cloud cost may appear more expensive in the initial evaluation; however, it is often less expensive in the long run. There are many reasons for this, but most often it comes down to the rate charged for transmitting and receiving data to and from your cloud. Large public clouds charge fees per gigabyte of outbound data; while it is pennies per gigabyte, it adds up quickly, and with per-gigabyte charges the business doesn’t know all of its costs up front. The local data center will typically charge a flat fee for monthly bandwidth that includes all the data coming and going. This creates an “all you can eat” model and a fixed cost (see the comparison sketch after this list).
  2. Customized and Increased Support for Applications – Many of the applications the enterprise will run in the cloud may require customization and additional support from the cloud provider. A good example is Disaster Recovery (DR), or Disaster Recovery as a Service (DRaaS). DRaaS requires a higher level of support in the planning phases, as most IT leaders have not been exposed to DR best practices. Additionally, IT leaders want the assurance of a trusted partner to rely on in the unlikely event they declare a DR emergency. At many of the local cloud providers and data centers I work with, the president of the data center will happily provide his private cell phone number for assistance.
  3. Known and Defined Security and Compliance – Most enterprise leaders feel a certain assurance in knowing exactly where their data resides. This may never change, at least not for an IT auditor. Knowing the location and state of your data also helps the enterprise “check the boxes” for regulatory compliance. Many times the SOC certifications are not enough and more specific details are required. 3z in Cincinnati will encrypt all of your data at rest as a matter of standard process. Additional services like these can ease the IT leader’s mind when the time for an audit comes.
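To make the bandwidth-pricing point in item 1 concrete, here is a minimal Python comparison. The per-gigabyte rate and flat monthly fee are illustrative assumptions, not quotes from any provider.

```python
# Compare metered public-cloud egress with a flat local data center bandwidth fee.
# All prices are illustrative assumptions.
EGRESS_GB_PER_MONTH = 2_000   # data sent out of the cloud each month
PER_GB_RATE = 0.09            # assumed hyperscaler egress rate (USD per GB)
FLAT_BANDWIDTH_FEE = 150.00   # assumed local data center flat monthly fee (USD)

metered_cost = EGRESS_GB_PER_MONTH * PER_GB_RATE
print(f"Metered egress cost: ${metered_cost:,.2f}/month")
print(f"Flat-fee bandwidth:  ${FLAT_BANDWIDTH_FEE:,.2f}/month")
# The flat fee stays fixed as outbound traffic grows, which is what makes budgeting predictable.
```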

It is my opinion that the established local data center will survive and flourish. However, it may need to adjust to stay relevant and competitive with the large public cloud providers. For example, it will need to emulate some of the popular public cloud offerings, such as an easy-to-use self-service portal and a “try it for free” cloud offering. I believe the local data center’s personalized processes are important, and I am rooting for 3z and its competitive peers to prosper in the future.

If you would like to learn more or visit 3z please contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

First Ohio Datacenter with AWS Direct Connect Now Open


It’s beginning to feel more like Silicon Valley in Central Ohio.

If you haven’t seen or heard about the new Cologix datacenter, take a minute to read on.

Cologix has been in the Columbus area for many years and operates 27 network-neutral data centers across North America. Its newest facility, COL3, is the largest multi-tenant data center in Columbus and resides on the same 8-acre campus as its existing data centers, COL1 and COL2. It offers over 50 network service providers, including the Ohio-IX Internet Exchange peering connection.

Most exciting of all are its 20+ cloud service providers, which include a direct connection to the market-leading Amazon Web Services (AWS). This is the first AWS Direct Connect location in the region, providing customers with low-latency access to the AWS US East 2 (Ohio) Region. With Direct Connect, AWS customers create a dedicated connection to the AWS infrastructure in their region. When AWS Direct Connect is available in the same data center where your IT infrastructure resides, such as Cologix, all that is needed for connectivity is a small cross-connect fee.
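For readers curious what the AWS side of a Direct Connect looks like programmatically, the short Python sketch below uses boto3 to list existing Direct Connect connections in the Ohio region. It is read-only and assumes AWS credentials are already configured on the machine.

```python
# List AWS Direct Connect connections with boto3 (pip install boto3).
# Assumes AWS credentials are configured via environment, profile, or IAM role.
import boto3

dx = boto3.client("directconnect", region_name="us-east-2")  # AWS US East 2 (Ohio)

for conn in dx.describe_connections()["connections"]:
    print(conn["connectionName"], conn["connectionState"], conn["bandwidth"])
```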

Here are some pertinent specifications of Cologix COL3:

Facility

    • Owned & operated 200,000+ SQF purpose-built facility on an 8-acre campus
    • Rated to Miami-Dade hurricane standards
    • 4 Data Halls – up to 20 Megawatts (MW)
    • 24” raised floor with anti-static tiles
    • 150 lbs/SQF floor loading capacity with dedicated, sunken loading deck

Power:

  • 2N Electrical, N+1 Mechanical Configurations
  • 2N diverse feeds from discrete substations
  • Redundant parallel IEM power bus systems serve functionality and eliminate all single points of failure
  • 2N generator configuration- Two (2) MW Caterpillar side A and Two (2) MW Caterpillar side B
  • On-site fuel capacity for 72 hours run time at full load
  • Redundant 48,000-gallon tanks onsite, priority refueling from diverse supplies & facility exemption from emergency power

Cooling:

  • Raised floor cold air plenum supply; return air plenum
  • 770 tons per Data Hall cooling capacity
  • Liebert, pump refrigerant DSE
  • Concurrently maintainable, A &B systems

Network:

  • 50+ unique networks in the Cologix-controlled Meet-Me-Room
  • Network neutral facility with 16+ fiber entrances
  • Managed BGP IP (IPv4 & IPv6); multi-carrier blend with quad-redundant routers & Cologix provided customer AS numbers & IP space
  • Most densely connected interconnect site in the region including dark fiber network access
  • Connected to the Columbus FiberNet system plus fiber feeds reaching all 88 Ohio counties
  • Metro area dark fiber available

 

If you would like to learn more or visit COL3 please contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net  

www.twoearsonemouth.net

 

Wireless Account Optimization and Management; 3 Things to Look for


There are tasks in which large enterprises can engage that have a huge financial impact on their communications costs. Few are as financially rewarding, yet incredibly mundane, as Wireless Account Optimization (WAO). This is a prime example of a service that is better for the enterprise to outsource. First, it is time-consuming and tedious; typically the business would not want its own people on this type of project. Second, the WAO provider will have a much better toolset and more expertise to handle it. To optimize a large corporate wireless account properly, you need to be aware of all the available plans offered by the wireless vendors such as Verizon, AT&T, or Sprint. Available plans include not only current plans but also legacy plans that even your vendor’s account representative wouldn’t recognize. These legacy billing plans may have been abandoned by the carrier years ago for various reasons, such as technology or pricing changes. Seldom do wireless carriers remove these legacy plans from their billing platforms; they are just “retired” and assumed unavailable. A WAO consultant will be aware of these plans and have them included in their automated account optimization system. This provides financial benefits well beyond what can be obtained with an analysis by the company’s internal staff.

The primary benefits to be recognized by a WAO partner would be:

Optimization –  cost savings from a rate plan analysis.

Procurement – includes ordering all new devices and upgrades from existing wireless carrier(s).

Inventory Documentation – Includes all device information (i.e. phone, model, ESN, IMEI, current rate plan, daily usage data, contract expiration and upgrade eligibility dates)
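To illustrate the rate-plan analysis behind the Optimization benefit above, here is a deliberately simplified Python sketch. The plans, prices, and usage figures are invented for illustration; real WAO tooling evaluates far more variables, such as pooled data, overage tiers, and retired legacy plans.

```python
# Toy rate-plan optimization: pick the cheapest plan for each line's monthly usage.
# All plans, prices, and usage figures are invented for illustration.
PLANS = [
    {"name": "Legacy 2GB", "monthly": 25.00, "included_gb": 2,     "overage_per_gb": 15.00},
    {"name": "5GB",        "monthly": 40.00, "included_gb": 5,     "overage_per_gb": 10.00},
    {"name": "Unlimited",  "monthly": 65.00, "included_gb": 10**6, "overage_per_gb": 0.00},
]

def plan_cost(plan, usage_gb):
    overage = max(0.0, usage_gb - plan["included_gb"])
    return plan["monthly"] + overage * plan["overage_per_gb"]

lines = {"555-0101": 1.2, "555-0102": 6.8, "555-0103": 22.5}  # GB used per line

for number, usage in lines.items():
    best = min(PLANS, key=lambda p: plan_cost(p, usage))
    print(f"{number}: {usage} GB -> {best['name']} (${plan_cost(best, usage):.2f}/month)")
```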

3 Things to Look for in a WAO Partner:

  1. Near real-time optimization technology – up to the minute monitoring of plans optimized prior to the billing cycle. No surprise overages with at least a 30% savings over current billing.
  2. Friendly and easy to use helpdesk and customer web portal.
  3. 100% ROI pricing model – performance based fee with no up-front cost.

If you’re looking for this type of service and have over 100 corporate-liable wireless lines, we can help you select the right partner.

Please contact us at (513) 227-4131 or jim.conwell@outlook.com

#wireless #wirelessaccount #wirelessaccountreview #wirelessaccountoptimization

#itconsulting #twoearsonemouthitconsulting

Containers, the Forecast for Cloud?

image courtesy of kubernetes.io

One of the most exciting and simultaneously challenging things about working in technology is the speed at which change occurs. The process from a cutting-edge technology to a ubiquitous and commoditized product can happen in the blink of an eye. Now that the cloud has made its way into all sizes and types of business the next related technology has emerged: containers.
How we got to this port
VMware’s introduction of virtualization was thought by many to be the predecessor of cloud as we know it today. This revolutionary technology allowed early adopters to reduce costs and enhance their IT agility through virtualization software. The days of a dedicated physical server for each application are over. Cloud technology has since evolved from software run by a single enterprise into an outsourced product provided to businesses by major technology institutions like Amazon, Microsoft, and Google. Most recently, containers have emerged as the next step for cloud, developed largely to suit the needs of software developers.
The difference between Virtual Machines (VMs) and Containers
A container is defined by Docker as a stand-alone executable software package that includes everything needed to run an application: code, runtime, system libraries, and settings. In many ways, that sounds like a VM; however, there are significant differences. Above the physical infrastructure, a VM environment uses a hypervisor to manage the VMs, and each VM has its own guest operating system such as Windows or Linux (see Image 1). A container instead uses the host operating system and the physical infrastructure, which supports the container platform such as Docker; Docker then supports the binaries and libraries of the applications (see Image 2). Containers do a much better job of isolating an application from its surroundings, and this allows the enterprise to use the same container instance from development to production.

(Image 1 and Image 2 courtesy of docker.com)
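For readers who want to see how lightweight a container is compared with booting a full VM, here is a minimal sketch using the Docker SDK for Python. It assumes a local Docker engine is installed and running.

```python
# Run a throwaway container with the Docker SDK for Python (pip install docker).
# Assumes a local Docker engine is installed and running.
import docker

client = docker.from_env()

# No guest OS to install or boot: the container shares the host kernel and starts in seconds.
output = client.containers.run("alpine:latest", ["echo", "hello from a container"], remove=True)
print(output.decode().strip())
```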
How can Containers be used in the Enterprise today?
Docker is currently the most popular company driving the movement toward container-based solutions in the enterprise. The Docker platform enables independence between applications and infrastructure, allowing applications to move from development to production quickly and seamlessly. By isolating software from its surroundings, it can also reduce conflicts between teams running different software on the same infrastructure. While containers were originally designed for software developers, they are becoming a valuable IT infrastructure solution for the enterprise.
One popular platform allowing the enterprise to benefit from container technology is Kubernetes. Kubernetes is an open-source system originally designed by Google and later donated to the Cloud Native Computing Foundation (CNCF). Kubernetes assists with three primary functions in running containers: deployment, scaling, and monitoring. Finally, open source companies such as Red Hat are developing products to help businesses of all types utilize these tools. OpenShift, designed by Red Hat, is a container application platform that has helped simplify Docker and Kubernetes for the business IT manager. The adoption of new technology, such as cloud computing, often takes time in the enterprise; containers seem to be avoiding that trend and have been accepted and implemented quickly in businesses of all types and sizes.
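For anyone who wants a small, hands-on taste of Kubernetes’ deployment and monitoring role, the Python sketch below uses the official Kubernetes client to list Deployments and their replica counts. It assumes a working kubeconfig pointing at an existing cluster.

```python
# List Deployments with the official Kubernetes Python client (pip install kubernetes).
# Assumes ~/.kube/config points at a reachable cluster.
from kubernetes import client, config

config.load_kube_config()   # use local kubeconfig credentials
apps = client.AppsV1Api()

for dep in apps.list_deployment_for_all_namespaces().items:
    ready = dep.status.ready_replicas or 0
    print(f"{dep.metadata.namespace}/{dep.metadata.name}: {ready}/{dep.spec.replicas} replicas ready")
```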