
5G or Not 5G: What, When, and How It Affects IT and Cloud

 

Before entering the cloud and IT business, I spent more than a decade working with wireless technologies for business. During that time, I saw the advent of data on cell phones and the transitions between the generations of data offerings that have been delivered. Second-generation (2G) networks brought very rudimentary data to cell phones, such as text messaging. 3G brought the internet and its many applications, such as mobile email. 4G brought us the high-speed internet we use today, offering instant access to applications such as real-time video. With each transition, corporate marketing and spin grew more extravagant, widening the gap between a product's introduction and its practical delivery. Now comes 5G, and I expect this trend to continue. Although wireless carriers already advertise the availability of 5G, the products are not fully developed for practical use and are likely years away from business applications.

What is 5G and who will provide it?

5G wireless data will be provided by the same carriers that delivered wireless service in the past: AT&T, Verizon, and Sprint. Although the primary standards for 5G have been set, there is still much to be developed in the technology, and it will likely be introduced as different versions. This will be similar to 4G when it first launched with its distinct alternatives, WiMAX and LTE. 5G has already been split into different delivery types: 5G, 5GE, and 5 GHz. Verizon's first introduction of 5G is designed for the home and small office, while AT&T is focused on mobile devices in very limited markets. Most believe there will also be fixed wireless versions for point-to-point circuits for business. At this point, it isn't clear which versions each provider will offer as 5G matures and becomes universally delivered.

The technology of 5G

As with all previous generations in the evolution of wireless data, 5G offers greater speed as its primary driver for acceptance. What may slow widespread deployment of 5G is the fact that 4G technology continues to improve and provide greater speeds for users. However, the wireless spectrum available to 4G providers is running short, so the transition to 5G is imminent. Much of the 5G technology will be provided on an alternate swath of wireless spectrum, above 6 GHz, that has not previously been offered to wireless consumers. This new spectrum will offer much greater capacity and speed, but it won't come without its own challenges. To achieve these higher speeds the carriers will need to use much higher frequency transmissions called millimeter waves. Millimeter waves cannot penetrate buildings, foliage, and weather as well as the lower frequencies used previously. To overcome this, wireless carriers will need to implement additional, smaller cell sites called microcells. Many wireless carriers have already deployed microcells to complement the macrocells used in previous offerings of wireless service. Building out additional network capacity and cell sites such as microcells is expensive and time-consuming, which will add to the delay of a fully implemented 5G offering from the carriers.

Business advantages of 5G

To say that one of the advantages of 5G is greater data speed would be true, but there is much more to it for business applications. The following are the primary speed-related advantages that 5G will provide for business cloud computing.

  • Lower latency – 5G networks will greatly decrease latency, the time it takes a data packet to travel across the network. This will benefit many business applications such as voice, video, and artificial intelligence (AI).
  • Multiple connections – 5G base stations, or cell sites, will handle many more simultaneous connections than 4G. This will increase speed for users and capacity for providers.
  • Full duplex transmission – 5G networks can transmit and receive data simultaneously. This full duplex transmission increases the speed and reliability of wireless connectivity, enabling new applications and enhancing existing ones.

Cloud and business solutions enhanced by 5G

It is difficult to say exactly how businesses will benefit from 5G service since it is still being developed. However, the advantages listed above lend themselves to several applications which are sure to be enhanced for business.

The increased speeds and decreased latency 5G offers will expand the options available to businesses for disaster recovery (DR) and data network backups. When speeds previously available only over wireline can be delivered without wires, business continuity will improve. Many business outages today are caused by accidental cable cuts and power outages, risks that wireless 5G can help eliminate. It is also possible that wireless point-to-point circuits could replace traditionally wired circuits for a business's primary data and internet service.

The growing number of Internet of Things (IoT) applications will also be enhanced by 5G. The increased speed and connection capacity will allow this ubiquitous technology to continue to grow. Similarly, the trend toward more and faster edge computing connectivity will benefit. This will enhance applications such as autonomous vehicles that require instant connectivity to networks and other vehicles. Content delivery networks, like those used to deliver Netflix, will be able to deliver their products faster and more reliably. These are just a few examples of today's technologies that demand 5G's advantages and will expedite its availability.

While the technology to deliver 5G is mostly complete, the timing of widespread implementation for business is still unclear. This is due in part to improving 4G speeds, which continue to satisfy today's consumers' needs. More importantly, new technologies are not accepted in the marketplace simply because the technology is ready, but rather because business applications demand them. 5G will be driven by many business applications, but widespread acceptance won't occur for at least another two years. If you want to consult with a partner that has expertise in all aspects of telecom, wireless, and cloud technologies, give us a call and we will be glad to find the right solution for your business.

Contact: Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

we listen first…

 

Getting Started with Amazon Web Services (AWS)


Amazon Web Services (AWS) is a little-known division of the online retail giant to everyone except those of us in the IT business. It's interesting to see that profits from AWS represented 56 percent of Amazon's total operating income, with $2.57 billion in revenue. While AWS amounted to about 9 percent of total revenue, its margins and sustained growth make it stand out on Wall Street. As businesses make the move to the cloud, they may wonder what it takes to get started with Amazon Web Services.

When we have helped organizations evolve by moving part or all of their IT infrastructure to the AWS cloud, we have found that planning is the key to their success. Most businesses already have some cloud presence in their IT infrastructure. The most common, Software as a Service (SaaS), has led the hypergrowth of the cloud. What I will consider here with AWS is how businesses use it for Infrastructure as a Service (IaaS). IaaS is a form of cloud computing that relocates applications currently running on a business's own servers to a hosted cloud provider. Businesses consider this to reduce hardware cost, become more agile with their IT, and even improve security. Below are the five simple steps we have developed to move to IaaS with AWS.


1)      Define the workloads to migrate – The first cloud migration should be kept as simple as possible. Do not start your cloud practice with any business-critical or production applications. A good starting point, and where many businesses begin, is a data backup solution. You can use your existing backup software or one that currently partners with AWS; these include industry leaders such as Commvault and Veritas, and if you already use one of these solutions that is even better. Start small and you may even find you can operate within the free tier of Amazon virtual servers, or instances (https://aws.amazon.com/free/).
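For a very small first test of cloud-based backup, outside of a packaged product like Commvault or Veritas, a backup archive can be copied to Amazon S3 with a few lines of Python. This is a minimal sketch using the boto3 library; the bucket name and file path are placeholders, and it assumes AWS credentials are already configured on the machine running it.

```python
# Minimal sketch: copy a local backup archive to an S3 bucket with boto3.
# Assumes AWS credentials are configured (e.g., via `aws configure`) and
# that the bucket name and file path below are replaced with your own.
import boto3

s3 = boto3.client("s3")

BUCKET = "example-backup-bucket"                   # placeholder bucket name
LOCAL_FILE = "backups/nightly-2019-01-01.tar.gz"   # placeholder archive path

# Upload the archive; the object key mirrors the local path here.
s3.upload_file(LOCAL_FILE, BUCKET, LOCAL_FILE)

# List the bucket contents to confirm the upload landed.
response = s3.list_objects_v2(Bucket=BUCKET)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```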

2)      Calculate cost and Return on Investment (ROI) – Of the two primary types of costs used to calculate ROI, hard and soft costs, hard costs represent the greatest savings as you first start your cloud presence. These costs include the server hardware used, if cloud isn't already utilized, as well as the time needed to assemble and configure it. When configuring a physical hardware server, a technician must estimate the application's growth in order to size the server properly. With AWS it's pay as you go: you rent only what you actually use. Other hard costs, such as power consumption and networking, will be saved as well. When starting small, it often doesn't take a formal ROI process or documentation of soft costs, such as customer satisfaction, to see that it makes sense. Finally, another advantage of starting with a modest presence in the AWS infrastructure is that you may be able to stay within the free tier for the first year. This offering includes certain types of storage suitable for backups and the networking needed for data migration.
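As a back-of-the-envelope illustration of the hard-cost comparison described above, the sketch below totals a hypothetical on-premises server purchase against a pay-as-you-go cloud estimate. Every number in it is a placeholder; real figures would come from your hardware quotes and the AWS pricing pages.

```python
# Back-of-the-envelope hard-cost comparison (all figures are placeholders).
# On-premises: hardware bought up front and sized for projected growth.
server_hardware = 6000.0        # hypothetical server purchase
setup_labor_hours = 16
labor_rate = 85.0               # hypothetical hourly rate
annual_power_cooling = 600.0    # hypothetical power/cooling cost
years = 3

on_prem_total = (server_hardware
                 + setup_labor_hours * labor_rate
                 + annual_power_cooling * years)

# Cloud: pay only for what is actually used, month by month.
monthly_instance_cost = 75.0    # hypothetical instance + storage cost
cloud_total = monthly_instance_cost * 12 * years

print(f"3-year on-premises hard cost: ${on_prem_total:,.0f}")
print(f"3-year cloud hard cost:       ${cloud_total:,.0f}")
print(f"Estimated hard-cost savings:  ${on_prem_total - cloud_total:,.0f}")
```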

3)      Determine cloud compatibility – There are still applications that don't work well in a cloud environment, which is why it is important to work with a partner that has experience in cloud implementation. It can be as simple as an application that demands a premium of bandwidth or is sensitive to data latency. Additionally, industries that are subject to regulation, such as PCI DSS or HIPAA, have a further incentive to understand what is required and the associated costs. For instance, healthcare organizations are bound to secure their Protected Health Information (PHI); this regulated data should be encrypted both in transit and at rest. This encryption example wouldn't necessarily change your ROI, but it needs to be considered. A strong IT governance platform is always a good idea and can assure smooth sailing for the years to come.

4)      Determine how to migrate existing data to the cloud – Amazon AWS provides many ways to migrate data, most of which will not incur any additional fees. These proven methods not only help secure your data but also speed up the implementation of your first cloud instance. The most popular methods follow.

  a) Virtual Private Network – This common and secure transport method moves data over the internet and is suited to data that is not sensitive to latency. In most cases a separate virtual server running an AWS storage gateway will be used.
  b) Direct Connect – AWS customers can create a dedicated telecom connection to the AWS infrastructure in their region of the world. These pipes are typically either 1 or 10 Gbps and are provided by the customer's telecommunications provider. They terminate at an Amazon partner datacenter; for the Midwest, for example, this location is in Virginia. The AWS customer pays for the circuit as well as a small recurring cross-connect fee for the datacenter.
  c) Import/Export – AWS allows customers to ship their own storage devices containing data to AWS to be migrated to their cloud instance. AWS publishes a list of compatible devices and returns the hardware when the migration is completed.
  d) Snowball – Snowball is similar to Import/Export except that Amazon provides the storage device. A Snowball can store up to 50 terabytes (TB) of data and can be combined in series with up to four other Snowballs. It also makes sense for sites with little or no internet connectivity. This unique device ships as-is; there is no need to box it up. It can encrypt the data and has two 10 Gigabit Ethernet ports for data transfer. Devices like the Snowball are vital for migrations with large amounts of data. Below is a chart showing approximate transfer times depending on the internet connection speed and the amount of data to be transferred (a rough worked estimate follows the chart). It is easy to see that large migrations couldn't happen without these devices. The final column shows the amount of data at which it makes sense to "seed" the data with a hardware device rather than transfer it over the internet or a direct connection.
    Company's internet speed | Theoretical days to transfer 100 TB at 80% utilization | Amount of data at which to consider a device
    T3 (44.73 Mbps) | 269 days | 2 TB or more
    100 Mbps | 120 days | 5 TB or more
    1,000 Mbps (1 Gbps) | 12 days | 60 TB or more
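As a rough sanity check on the chart above, transfer time can be estimated directly from the data size and the usable line rate. The short sketch below does that arithmetic; it assumes decimal terabytes and ignores protocol overhead, which is why it lands slightly under the chart's more conservative figures.

```python
# Rough transfer-time estimate: data size / usable line rate.
# Assumes decimal terabytes (1 TB = 8e12 bits) and ignores protocol
# overhead, so real-world transfers will take somewhat longer.

def transfer_days(terabytes, link_mbps, utilization=0.8):
    bits = terabytes * 8e12                      # total bits to move
    usable_bps = link_mbps * 1e6 * utilization   # bits/second actually available
    return bits / usable_bps / 86400             # seconds -> days

for label, mbps in [("T3 (44.73 Mbps)", 44.73),
                    ("100 Mbps", 100),
                    ("1000 Mbps (GIG)", 1000)]:
    print(f"{label:18s} ~{transfer_days(100, mbps):5.0f} days to move 100 TB")

# Prints roughly 259, 116, and 12 days -- in line with the chart above.
```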

5)      Test and monitor – Once your instance is set up and all the data is migrated, it's time to test. Best practice is to test the application in the most realistic setting possible: during business hours and in an environment where bandwidth consumption will be similar to the production environment. You won't need to look far to find products that can monitor the health of your AWS instances; AWS provides a free utility called CloudWatch. CloudWatch monitors your AWS resources and the applications you run on AWS in real time. You can use CloudWatch to collect and track metrics, which are variables you can measure for your resources and applications. CloudWatch alarms send notifications or automatically make changes to the resources you are monitoring based on rules that you define. For example, you can monitor the CPU usage and disk reads and writes of your Amazon instances and then use this data to determine whether you should launch additional instances to handle increased load. You can also use this data to stop under-used instances to save money. In addition to monitoring the built-in metrics that come with AWS, you can monitor your own custom metrics. With CloudWatch, you gain system-wide visibility into resource utilization, application performance, and operational health.
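To make the CloudWatch description above concrete, here is a small boto3 sketch that creates a CPU-utilization alarm on a single EC2 instance. The instance ID and SNS topic ARN are placeholders, and the thresholds are illustrative only.

```python
# Minimal sketch: a CloudWatch alarm on EC2 CPU utilization with boto3.
# The instance ID and SNS topic ARN below are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-example",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                 # evaluate in 5-minute windows
    EvaluationPeriods=2,        # two consecutive breaches trigger the alarm
    Threshold=80.0,             # CPU percent
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder topic
    AlarmDescription="Example alarm: sustained CPU above 80%",
)
```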

To meet and learn more about how AWS can benefit your organization, contact me at (513) 227-4131 or jim.conwell@outlook.com.

 


Enabling Precision Agriculture with IoT

FarmBeats is an AI for Earth Lighthouse project aimed at showcasing the benefits of IoT in a variety of applications. Water scarcity and pollution are threatening the livelihood of farmers around the world, and they are under immense pressure to produce. Through sensors, drones, data analytics, AI, and connectivity solutions, Microsoft is enabling precision agriculture to improve yield while reducing resource consumption.

The AWS & VMware Partnership


image courtesy of eweek.com

In the world of technology, partnerships are vital, as no provider does everything well. Some partnerships appear successful at first glance, but others require more of a wait-and-see approach. When I first heard that VMware and Amazon Web Services (AWS) were forming a partnership, I wanted a better explanation of how it would work before deciding on its merits. My cynicism was primarily founded on VMware's previous attempts to play in the public cloud market, such as the failed vCloud Air. After learning more, I'm still not convinced it will work, but the more I understand, the more sense it makes.

It can be said that VMware invented the cloud through its pioneering of virtualization technology. Beginning in the late 1990s, virtualization allowed the enterprise to spend less money on IT hardware and infrastructure. VMware taught users how to build and add to an IT infrastructure in minutes rather than weeks, and it taught IT departments how to be agile. In a similar way, AWS built an enormous and rapidly growing business from nothing. It had the foresight to take its excess IT infrastructure and sell it, or more precisely rent it, an offering made possible because that infrastructure was built on Amazon's own flavor of virtualization. For these two to join forces does make sense. Many businesses have built their virtualized IT infrastructure, or cloud, with the VMware hypervisor, whether on premises, in another data center, or both. With the trend for corporate IT infrastructure to migrate off-site, the business is left with a decision: should they take a "lift and shift" strategy to migrate data off-site, or should they redesign their applications for a native cloud environment? The lift and shift strategy refers to moving an application or operation from one environment to another without redesigning the application. When a business has invested in VMware and management has decided to move infrastructure off-site, a lift and shift strategy makes sense.

To follow is a more detailed look at a couple of the advantages of this partnership and why it makes sense to work with VMware and AWS together.

Operational Benefits

With VMware Cloud on AWS, an organization that is familiar with VMware can create a simple and consistent operational strategy for its multi-cloud environment. VMware's feature sets and tools for compute (vSphere), storage (vSAN), and networking (NSX) can all be utilized. There is no need to change VMware provisioning, storage, or lifecycle policies. This means you can easily move applications between your on-premises environment and AWS without having to purchase new hardware, rewrite applications, or modify your operations. Features like vMotion and VMware Site Recovery Manager have been optimized for AWS, allowing users to migrate and protect critical applications across all their sites.

Scalability and Global Reach

The vCenter web client and VMware's unique features, such as vMotion, enhance AWS. AWS's inherent benefits of nearly unlimited scale and multiple Availability Zones (AZs) fit hand in glove with VMware's cloud management. A prime example is an East Coast enterprise opening a West Coast office. The AWS cloud allows a user to create infrastructure in a West Coast Availability Zone on demand, in minutes, and VMware's vCenter web client allows management of the new site as well as the existing primary infrastructure from a single pane of glass. This example shows not only how the enterprise can take advantage of the partnership but also that the partnership will appeal to the needs of the larger enterprise.
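For comparison, the underlying AWS capability of standing up capacity in a new region on demand looks like the minimal boto3 sketch below. The AMI ID and key pair name are placeholders, and this is plain EC2 rather than VMware Cloud on AWS, which would be driven through vCenter and the VMware console instead of raw EC2 calls.

```python
# Minimal sketch of AWS's on-demand, region-by-region capacity:
# launch a small instance in a US West region. The AMI ID and key
# pair are placeholders; this is plain EC2, not VMware Cloud on AWS.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",    # placeholder AMI
    InstanceType="t3.micro",
    KeyName="example-keypair",          # placeholder key pair
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id} in us-west-2")
```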

The benefit above, as with the solution as a whole, is built on the foundation of an existing VMware infrastructure. This article has touched on just a couple of the advantages of the VMware-AWS partnership; there are many more. It should be noted that cost is not one of them. This shouldn't surprise many IT professionals, as large public cloud offerings don't typically reduce cost, and VMware has never been known as an inexpensive hypervisor. The enterprise may, however, realize soft-cost reductions by removing much of the complexity, risk, and time associated with moving to the hybrid cloud.

Both AWS and VMware are leaders in their categories and are here to stay. Whether this partnership survives or flourishes, however, only time will tell.

If you would like to learn more about a multi-cloud strategy for your business contact us at: Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

 

 

Getting Started with Microsoft Azure

 


image courtesy of Microsoft.com

A few months ago I wrote an article on getting started with Amazon Web Services (AWS); now I want to follow up with the same for Microsoft Azure. Microsoft Azure is the public cloud offering deployed through Microsoft's global network of datacenters. Azure has continued to gain market share from its chief rival, AWS. Being in second place is not something Microsoft is used to with its offerings; however, in the cloud, as with internet web browsers, Microsoft got off to a slow start. Capturing market share will not prove as simple with AWS as it was with Netscape and the web browser market in the '90s, but in the last two years progress has been made. Much of that progress can be attributed to Satya Nadella, Microsoft's current CEO, who proclaimed from his start a commitment to the cloud. Most recently, Microsoft has expressed its commitment to support Linux and other operating systems within Azure. Embracing other operating systems and open source projects is new for Microsoft and seems to be paying off.

Like the other large public cloud providers, Microsoft has an easy-to-use self-service portal for Azure that makes it simple to get started. In addition to the portal, Microsoft entices small and new users with a free month of service. The second version of the portal, released last year, has greatly improved the user experience. Microsoft's library of pre-configured cloud instances is one of the best in the market: a portal user can select a preconfigured group of servers that creates a complex solution like SharePoint. The SharePoint instance includes all the components required: the Windows Server, SQL Server, and SharePoint Server. What previously took hours can now be "spun up" in the cloud with a few clicks of your mouse, and there are dozens of pre-configured solutions like this SharePoint example. The greatest advantage Microsoft has over its cloud rivals is its deep and long-established channel of partners and providers. These partners, and the channel Microsoft developed for its legacy products, allow it to provide the best support of all the public cloud offerings.
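The portal is the easiest way in, but the same first steps can also be scripted. Below is a minimal sketch using Microsoft's Python SDK (azure-identity and azure-mgmt-resource) to create a resource group, the container every Azure deployment starts with. The subscription ID, group name, and region are placeholders, and it assumes you are already logged in, for example via the Azure CLI.

```python
# Minimal sketch: create an Azure resource group with the Python SDK.
# Assumes you are logged in (e.g., `az login`) and that the subscription
# ID, group name, and region below are replaced with your own values.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# A resource group is the container that holds VMs, storage, networking, etc.
group = client.resource_groups.create_or_update(
    "rg-getting-started",       # placeholder resource group name
    {"location": "eastus"},     # placeholder region
)

print(f"Created resource group {group.name} in {group.location}")
```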

Considerations for Getting Started with Microsoft Azure

Decide the type of workload

It is very important to decide not only what workloads can go to the cloud but also what applications you want to start with. Start with non-production applications that are non-critical to the business.

Define your goals and budget

Think about what you want to achieve with your migration to the cloud. Cost savings? Shifting IT from a capital expense to an operational expense? Be sure to calculate a budget for your cloud instance; Azure has a great tool for cost estimation. In addition, make sure you check costs as you go: the cloud has developed a reputation for starting out with low costs that then increase quickly.
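As a simple illustration of budgeting before you migrate, the sketch below adds up a hypothetical monthly Azure bill from a handful of line items. All of the rates are placeholders; real prices come from the Azure pricing calculator and should be rechecked as usage grows.

```python
# Back-of-the-envelope monthly cloud budget (all rates are placeholders;
# use the Azure pricing calculator for real numbers).
HOURS_PER_MONTH = 730

line_items = {
    # description: (quantity, unit cost)
    "2 general-purpose VMs (per VM-hour)": (2 * HOURS_PER_MONTH, 0.10),
    "500 GB managed disk storage (per GB-month)": (500, 0.05),
    "200 GB outbound data transfer (per GB)": (200, 0.08),
}

total = 0.0
for item, (quantity, unit_cost) in line_items.items():
    cost = quantity * unit_cost
    total += cost
    print(f"{item:45s} ${cost:8.2f}")

print(f"{'Estimated monthly total':45s} ${total:8.2f}")
```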

Determine your user identity strategy

Most IT professionals are familiar with Microsoft Active Directory (AD), Microsoft's service for authenticating users to the network behind the corporate firewall. AD has been outpaced not only by the cloud's off-site applications but also by today's limitless mobile devices. Today, Microsoft offers Azure Active Directory (AAD), which is designed for the cloud and works across platforms. At first, you may implement a hybrid approach spanning AD, AAD, and Office 365 users. You can start this hybrid approach by synchronizing the two directories. At some point, you may need to add federation, which extends authentication to other applications such as commonly used SaaS offerings.

Security

An authentication strategy is a start for security, but additional work will need to be done. A future article will detail cloud security best practices in more depth. While it is always best to have a security expert recommend a solution, there are some general best practices we can mention here. Use virtual machine appliances whenever possible: virtual firewall, intrusion detection, and antivirus appliances add another level of security without adding hardware, and devices such as these can be found in the Azure Marketplace. Use dedicated links for connectivity if possible; these incur a greater expense but eliminate threats from the open internet. Disable remote desktop and secure shell access to virtual machines. These protocols exist to offer easier access for managing virtual machines over the internet; after you disable them, use point-to-point or site-to-site Virtual Private Networks (VPNs) instead. Finally, encrypt all data at rest in virtual machines to help secure it.

Practically every business can find applications to migrate to a public cloud infrastructure such as Azure, though very few businesses put their entire IT infrastructure in a public cloud environment. A sound cloud strategy and a clear determination of which applications to migrate enable the enterprise to get the most from a public cloud vendor.

If you would like to learn more about Azure and a cloud strategy for your business contact us at:

Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net

www.twoearsonemouth.net

Should We Eliminate or Embrace Shadow IT?


With cloud computing's acceptance in business, coupled with the ease of entry and setup of public cloud offerings, the term Shadow IT has reemerged. Wikipedia defines Shadow IT as "a term often used to describe information-technology systems and solutions built and used inside organizations without explicit organizational approval."

If the cloud initiated this reemergence, the Internet of Things (IoT) and the Bring Your Own Device (BYOD) phenomena have exacerbated it. When employees started bringing their mobile phones and tablets to the office, they began integrating applications from their personal lives into the business. Likewise, Machine Learning (ML) applications have influenced corporate IT and its guidelines throughout the enterprise. Opponents say Shadow IT challenges IT governance within the organization, but what may appear to be a disadvantage to the IT department may be advantageous to the company. Below are some of the advantages and disadvantages of Shadow IT.

Advantages

  • Increased agility – departments within an organization can create their own IT resources without depending on the lag time and processes of the IT department.
  • Empowering employees – employees will be more productive when they feel they have the power to make decisions, including IT selections, on their own.
  • Increased creativity – putting the process of creating IT resources in the hands of the user often creates a better product and experience for that user.

Disadvantages

  • Security – Employees outside the IT department rarely consider security when implementing IT services.
  • Cost – When IT resources can be implemented at the employee level, as opposed to being purchased centrally, there will be wasted resources.
  • IT governance and compliance – Outside of the IT department, purchasers will not consider regulatory concerns and governance. Processes and rules for IT need to be in place regardless of whether the resources are centrally implemented.

IT departments are not wrong to have contempt for the concept of Shadow IT. However, we believe they can learn to work with aspects of it. If a business can communicate across all departments and overcome the disadvantages listed above, we believe Shadow IT can be a win/win for the entire enterprise.

If you need assistance designing your evolution to the cloud or data center

please contact us at Jim Conwell (513) 227-4131      jim.conwell@twoearsonemouth.net      www.twoearsonemouth.net

 

What is a Software Defined Wide Area Network (SD-WAN)?


image courtesy of catchsp.com

The trend for software or applications to manage technology and its processes has become commonplace in the world of enterprise IT.  So common, in fact, that it has created its own prefix for IT solutions, Software Defined or SD. Virtualization software from companies like VMware revolutionized the way the enterprise built datacenters and coined the phrase “software defined network”. Today this concept has expanded out from the corporate datacenter to the Wide Area Network (WAN), and ultimately to the enterprise branch offices and even to customers. The Software Defined WAN (SD-WAN) can simplify management of the WAN and significantly reduce the cost of the telecommunication circuits that create the WAN.

What’s a WAN?

A Wide Area Network, or WAN, allows companies to extend their computer networks to connect remote branch offices to data centers and deliver the applications and services required to perform business functions. Historically, when companies extend networks over greater distances, and sometimes across multiple telecommunication carriers' networks, they face operational challenges. Additionally, with the increase of bandwidth-intensive applications like Voice over Internet Protocol (VoIP) and video conferencing, costs and complications grew. WAN technology has evolved to accommodate these bandwidth requirements: in the early 2000s, Frame Relay gave way to Multi-Protocol Label Switching (MPLS). More recently, however, MPLS has fallen out of favor, primarily because it remains costly and ties the enterprise to a single carrier.

Why SD-WAN?

MPLS, a very mature and stable WAN platform, has grown costly and less effective with age. The business enterprise needs to select one MPLS vendor and use it at all sites. That MPLS provider must then rely on a local telecom provider for the last mile to remote branches, and possibly even the head end. This has historically brought unwelcome blame and finger-pointing when a circuit develops trouble or goes out of service, and it creates a very slow implementation timeline for a new site. MPLS solutions are typically designed with one internet source at the head end that supports web browsing for the entire WAN. This creates a poor internet experience for the branches and many trouble tickets and frustrations for the IT team at the head end. SD-WAN can eliminate these problems, although if it is not designed correctly it has the potential to create problems of its own.

SD-WAN uses broadband internet connections at each site for connectivity. The software component of the solution (the "SD") allows for the management and monitoring of these circuits even when they are provided by multiple vendors. The broadband connections, typically provided by local cable TV providers, are ubiquitous and inexpensive; they offer more bandwidth and cost much less than an MPLS node. Additionally, broadband circuits can be installed in weeks instead of the months required for a typical new MPLS site. In an SD-WAN deployment, each site gets its own internet access over the same broadband circuit that delivers WAN connectivity. This greatly increases branch users' satisfaction with internet speed and reduces total traffic over the WAN. However, it creates a challenge for the cyber security of the enterprise: when each remote site has its own internet access, each site needs its own cyber security solution, and producing a valid cyber security solution can offset part of the cost savings that broadband internet provides.

Gartner has recently labeled SD-WAN a disruptive technology due to both its superior management of a WAN and its reduced costs. Implementing SD-WAN requires a partner with expertise. Some providers today pride themselves on having the best database for finding the cheapest broadband circuits for each site. However, it is vital to pick a partner that can also provide ongoing management of the circuits at each site and a deep understanding of the cyber security risks of an SD-WAN solution.

If you need assistance designing your SD-WAN Solution please contact us at:

Jim Conwell (513) 227-4131      jim.conwell@outlook.com      www.twoearsonemouth.net

#sdwan #sd-wan