Population Health: What is it, and what's the BUZZ?


A common term heard today by those of us who work in, or closely with, the healthcare industry is population health. Population health is a term whose meaning today goes far beyond its literal definition. Wikipedia defines population health as "the health outcomes of a group of individuals, including the distribution of such outcomes within the group. It is an approach to health that aims to improve the health of an entire human population." Population health has come to represent something much larger to many of us: a shift in who is leading and defining healthcare services. The current transition happening within healthcare affects all the players, but none more than the patients. This evolution is happening in healthcare now and goes beyond the government's attempts to fix the processes. First, let me define who the players are in the healthcare model:

• Payers are insurance providers: the institutions that can dictate which doctor or hospital we go to.

• Providers deliver care services to patients: they are typically the doctors, nurses or institutions that provide the care to us.

• Patients are those of us who receive the care from providers.

One of the key drivers of these changes, and of the evolution to population health, is technology, specifically big data. Big data and analytics are giving the payers and providers the information they need to create a more proactive approach to healthcare. I recently heard the CEO of a large healthcare software vendor state that it takes the healthcare industry 15 to 17 years to reach a 50% acceptance rate for a new innovation. If that pace of innovation is changing for the better, and I believe it is, it is in large part due to the Centers for Medicare & Medicaid Services (CMS) driving the change. CMS promoted the acceptance and implementation of technology through its Meaningful Use program. Meaningful Use, and now the Merit-Based Incentive Payment System (MIPS), offers providers monetary rewards for the implementation of certain technologies in predetermined time frames. The race for CMS Meaningful Use dollars seemed to change the mindset of healthcare providers as it relates to the adoption of new technology.

For instance, it was determined through data analytics that 20% of re-admitted patients, back for the same condition, had failed to have their first prescription filled after the initial stay. This finding has driven the development of technology that contacts and reminds patients to fill their prescriptions through personalized email and text messaging. Additionally, Qualcomm and other technology companies have developed devices that can send home test information, such as blood pressure or blood sugar levels, with little or no setup by the patient. This data is transferred to the provider for immediate review.

The payers are getting into the act as well. One large insurance company has stated it will set the same co-pay for a video call with a provider as for an office visit. This will create greater demand for video appointments, greater efficiency in the use of doctors' time and, most importantly, a better level of service to the patient.

Another recent and substantial disruption in healthcare services has been the arrival of primary-care-type services in retail stores such as Walgreens and Kroger. Walmart is next to provide healthcare services; it has already stated it will approach healthcare as it does all of its products and services, with the number one concern being the consumer. This not only provides patients with more options but also changes the way they view healthcare. Maybe healthcare is similar to other consumer goods: you can buy the cheap generic version for less money or spend more for a higher level of quality or service. If healthcare is to complete this transition to a more patient-driven model, the ultimate goal will be to let the market decide who the patient sees for their care. Ultimately, I think that is the final result of population health: providers become more efficient while patients have more choices and a higher quality of care.

Contact us to discuss the IT challenges in your practice. We may have helped another practice with the same.

Jim Conwell (513) 227-4131      jim.conwell@outlook.com      www.twoearsonemouth.net


Getting Started with Amazon Web Services (AWS)



Amazon Web Services is a little-known division of the online retail giant, except to those of us in the business of IT. It's interesting to see that profits from AWS represented 56 percent of Amazon's total operating income, on $2.57 billion in revenue. While AWS amounted to about 9 percent of total revenue, its margins and sustained growth make it stand out on Wall Street.

When we have helped organizations evolve by moving part or all of their IT infrastructure to the AWS cloud, we have found that planning is the key to their success. Most businesses already have some cloud presence in their IT infrastructure. The most common form, Software as a Service (SaaS), has led the hyper-growth of the cloud. What I will consider here is how businesses use AWS for Infrastructure as a Service (IaaS). IaaS is a form of cloud computing that relocates applications currently running on a business's own servers to a hosted cloud provider. Businesses consider this to reduce hardware costs, become more agile with their IT and even improve security. Below are the five simple steps we have developed for moving to IaaS with AWS.

1)      Define the workloads to migrate- The first cloud migration should be kept as simple as possible. Do not start your cloud practice with any business-critical or production applications. A good starting point, and where many businesses begin, is a data backup solution. You can use your existing backup software or one that currently partners with AWS; these include industry leaders such as Commvault and Veritas, and if you already use one of these solutions, so much the better. Start small and you may even find you can operate in the free tier of Amazon virtual servers, or instances (https://aws.amazon.com/free/).

2)      Calculate cost and Return on Investment (ROI)- Of the two primary types of costs used to calculate ROI, hard and soft costs, hard costs seem to offer the greatest savings as you first start your cloud presence. These costs include the server hardware used, if cloud isn't already utilized, as well as the time needed to assemble and configure it. When configuring a physical server, a hardware technician has to estimate the application's growth in order to size the server properly. With AWS it's pay as you go: you rent only what you actually use. Other hard costs, such as power consumption and networking, are saved as well. Often, when starting small, it doesn't take a formal ROI process, or documenting soft costs such as customer satisfaction, to see that the move makes sense. Finally, another advantage of starting with a modest presence in the AWS infrastructure is that you may be able to stay within the free tier for the first year. This offering includes certain types of storage suitable for backups and the networking needed for data migration.
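As a back-of-the-envelope illustration of that hard-cost comparison, here is a minimal Python sketch. Every dollar figure in it is a made-up placeholder for illustration, not actual AWS or hardware pricing:

```python
# Rough hard-cost comparison: upfront on-premises server vs. pay-as-you-go cloud.
# All figures below are illustrative placeholders, not real AWS pricing.

def on_prem_cost(hardware, setup_hours, hourly_rate, monthly_power, months):
    """Total hard cost of buying and running a physical server."""
    return hardware + setup_hours * hourly_rate + monthly_power * months

def cloud_cost(hourly_instance_rate, hours_used_per_month, months):
    """Pay-as-you-go cost: you rent only the hours you actually use."""
    return hourly_instance_rate * hours_used_per_month * months

# Example: a backup workload that only needs to run 8 hours a night.
on_prem = on_prem_cost(hardware=4000, setup_hours=16, hourly_rate=100,
                       monthly_power=50, months=36)
cloud = cloud_cost(hourly_instance_rate=0.10, hours_used_per_month=8 * 30, months=36)

print(f"3-year on-prem hard cost:  ${on_prem:,.0f}")
print(f"3-year pay-as-you-go cost: ${cloud:,.0f}")
```

For a part-time workload like a nightly backup, the pay-as-you-go column wins easily; for a server running flat out around the clock, the comparison narrows and is worth rerunning with your own numbers.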

3)      Determine cloud compatibility- There are still applications that don't work well in a cloud environment, which is why it is important to work with a partner that has experience in cloud implementation. It can be as simple as an application that requires a great deal of bandwidth or is sensitive to data latency. Additionally, industries that are subject to regulation, such as PCI DSS or HIPAA, are further incentivized to understand what is required and the associated costs. For instance, healthcare organizations are bound to secure their Protected Health Information (PHI); this regulated data should be encrypted both in transit and at rest. Encryption of this kind wouldn't necessarily change your ROI, but it needs to be considered. A strong IT governance platform is always a good idea and can assure smooth sailing for the years to come.

4)      Determine how to migrate existing data to the cloud- Amazon AWS provides many ways to migrate data, most of which will not incur any additional fees. These proven methods not only help secure your data but also speed up the implementation of your first cloud instance. The most popular methods follow.

  a) Virtual Private Network- This common but secure transport method is available to move data via the internet that is not sensitive to latency. In most cases a separate virtual server running an AWS storage gateway will be used.
  b) Direct Connect- AWS customers can create a dedicated telecom connection to the AWS infrastructure in their region of the world. These pipes are typically either 1 or 10 Gbps and are provisioned by the customer's telecommunications provider. They terminate at the far end in an Amazon partner datacenter; for example, in the Midwest this location is in Virginia. The AWS customer pays for the circuit as well as a small recurring cross-connect fee for the datacenter.
  c) Import/Export- AWS will allow customers to ship their own storage devices containing data to AWS to be migrated to their cloud instance. AWS publishes a list of compatible devices and will return the hardware when the migration is completed.
  d) Snowball- Snowball is similar to Import/Export except that Amazon provides the storage devices. A Snowball can store up to 50 terabytes (TB) of data and can be combined in series with up to four other Snowballs. It also makes sense for sites with little or no internet connectivity. This unique device ships as is: there is no need to box it up. It can encrypt the data and has two 10 Gbps Ethernet ports for data transfer. Devices like the Snowball are vital for migrations with large amounts of data. Below is a chart showing approximate transfer times depending on the internet connection speed and the amount of data to be transferred; it is easy to see that large migrations couldn't happen without these devices. The final column shows the amount of data at which it makes sense to "seed" the data with a hardware device rather than transfer it over the internet or a direct connection.
    Company's Internet Speed | Theoretical Days to Transfer 100 TB @ 80% Utilization | Amount of Data to Consider a Device
    T3 (44.73 Mbps)          | 269 days                                              | 2 TB or more
    100 Mbps                 | 120 days                                              | 5 TB or more
    1000 Mbps (GIG)          | 12 days                                               | 60 TB or more
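The chart's figures can be sanity-checked with a little arithmetic. Here is a minimal Python sketch, assuming decimal units (1 TB = 10^12 bytes) and the chart's 80% utilization; the published numbers run slightly more conservative, presumably allowing for protocol overhead:

```python
def transfer_days(data_tb, link_mbps, utilization=0.8):
    """Theoretical days to move `data_tb` terabytes over a `link_mbps` link.
    Assumes decimal units: 1 TB = 10**12 bytes, 1 Mbps = 10**6 bits/second."""
    bits_to_move = data_tb * 10**12 * 8           # total payload in bits
    effective_bps = link_mbps * 10**6 * utilization
    seconds = bits_to_move / effective_bps
    return seconds / 86400                        # 86,400 seconds per day

# 100 TB over each of the chart's link speeds:
print(round(transfer_days(100, 44.73)))   # T3: ~259 days
print(round(transfer_days(100, 100)))     # ~116 days
print(round(transfer_days(100, 1000)))    # ~12 days
```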

5)      Test and Monitor- Once your instance is set up and all the data migrated, it's time to test. Best practice is to test the application in the most realistic setting possible: during business hours and in an environment where bandwidth consumption will be similar to the production environment. You won't need to look far for products that can monitor the health of your AWS instances; AWS provides a free utility called CloudWatch. CloudWatch monitors your AWS resources, and the applications you run on AWS, in real time. You can use CloudWatch to collect and track metrics, which are variables you can measure for your resources and applications. CloudWatch alarms send notifications or automatically make changes to the resources you are monitoring based on rules that you define. For example, you can monitor the CPU usage and disk reads and writes of your Amazon instances and then use this data to determine whether you should launch additional instances to handle increased load. You can also use this data to stop under-used instances to save money. In addition to monitoring the built-in metrics that come with AWS, you can monitor your own custom metrics. With CloudWatch, you gain system-wide visibility into resource utilization, application performance, and operational health.
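The kind of scaling rule described above can be sketched as plain code. This is an illustrative stand-in, not the CloudWatch API: the thresholds and the three-consecutive-samples choice are assumptions, though they mirror how CloudWatch alarms combine a threshold with a number of evaluation periods:

```python
# A minimal sketch of the threshold rule a CloudWatch alarm encodes.
# Thresholds and metric values are made up for illustration; in practice
# CloudWatch collects the metrics and evaluates the rule for you.

def scaling_action(cpu_samples, high=80.0, low=20.0, periods=3):
    """Decide scaling from the most recent CPU-utilization samples (percent).
    Fires only when the threshold is breached for `periods` consecutive
    samples, mirroring CloudWatch's evaluation-periods behavior."""
    recent = cpu_samples[-periods:]
    if len(recent) < periods:
        return "insufficient-data"
    if all(s >= high for s in recent):
        return "scale-out"      # launch additional instances for load
    if all(s <= low for s in recent):
        return "scale-in"       # stop under-used instances to save money
    return "ok"

print(scaling_action([50, 85, 90, 95]))  # scale-out
print(scaling_action([15, 10, 5, 8]))    # scale-in
```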

To learn more about how AWS can benefit your organization, contact me at (513) 227-4131 or jim.conwell@outlook.com.


3-2-1 Backup! Data Backup’s Evolving Technologies

infographic courtesy of Veeam software

Like all IT technologies, data backup software is quickly evolving, becoming simpler and more useful to businesses. This is critical now that we have entered an era of massive data breaches and cyber terrorism, in addition to the normal IT outages that can threaten to eliminate businesses of any size.

When backup software was first introduced it was based on file-level technology: backing up individual files. This was effective when a user accidentally deleted a spreadsheet or an email that needed to be recovered. However, in the event of the failure of an entire server, or multiple servers, the restore process was painful. Many times the operating systems (OS) were not backed up. The IT professional would need to rebuild the system hardware, load the proper version of the operating system and restore files one by one. This process was prone to errors and very time consuming; it could easily take a week to restore one server.

The exponential growth of data storage, and the widespread use of server virtualization, allowed backup software manufacturers to offer Recovery Time Objectives (RTOs) that were never dreamt of before. Virtualization technology, such as industry leader VMware, transforms physical servers into virtual machines (VMs) that can be backed up at the image level. Image-level backups of a VM include the OS and can be restored quickly with much less concern about matching hardware requirements. With all this growth in data, most larger organizations have separated their backup and replication processes, both technologically and geographically.

Innovative companies such as Veeam Software have developed solutions that focus on virtual machines (VMs) and make creating backups simple and repeatable. Other forward-thinking companies like Zerto have focused on replication, that is, on the Disaster Recovery (DR) process. What was once a week-long process of recovering a lost server now takes place in minutes at a relatively low cost!

So now that the technology has simplified the recovery process for an IT outage, it is important to set processes and metrics for your backups. A good place to start is the 3-2-1 rule. Three copies, on two different media, with one off site.

Having a tape backup and taking it to a safe deposit box every week doesn't provide the business continuity assurances needed in today's business environment. It is widely accepted as best practice to have the first backup copy (of three) onsite and always on media that can be restored quickly. An example would be a disk or Network Attached Storage (NAS) device that is available to restore the "deleted file" quickly, without needing to look any further than the company's own server room.

Although tape has outlived some of its usefulness to the company's backup process, it is still frequently the second type of media used in the 3-2-1 rule. Tape is a good, reliable medium for archiving data that may not need to be restored quickly. Data that's required to be saved for years, such as data governed under regulatory compliance, can be archived and stored on tape at a fraction of the cost of disk or cloud.

Finally, it is a universally accepted best practice that a company should have (at least) one backup copy off site. Today's off-site copy typically isn't kept in a safe deposit box or a safe at the CEO's home. Most companies today keep at least a portion of their backups in a hardened, audited, multi-tenant datacenter such as Microsoft Azure or Amazon AWS. These providers have pre-configured virtualized platforms so that their customers can transmit their backups automatically. This requires a secure internet connection, or a dedicated point-to-point connection for increased security and guaranteed speed. A solution of this type allows a company to run its applications directly from the datacenter backup minutes after a power outage or building disaster has occurred. It is important to take the time to become familiar with current backup and DR technologies and to work with best practices such as the 3-2-1 rule.
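The 3-2-1 rule is simple enough to express as a checklist in code. A toy Python sketch, where the backup-inventory format is invented purely for illustration:

```python
# A toy validator for the 3-2-1 rule: three copies, on two different media,
# with at least one copy off site. The inventory format is invented here.

def satisfies_3_2_1(copies):
    """`copies` is a list of dicts like {"media": "disk", "offsite": False}."""
    return (len(copies) >= 3                              # 3 copies
            and len({c["media"] for c in copies}) >= 2    # 2 media types
            and any(c["offsite"] for c in copies))        # 1 off site

inventory = [
    {"media": "disk",  "offsite": False},   # NAS in the server room
    {"media": "tape",  "offsite": False},   # archive copy
    {"media": "cloud", "offsite": True},    # replicated to a datacenter
]
print(satisfies_3_2_1(inventory))  # True
```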

For more information on building your back-up and recovery strategies, reach me by phone or email below.

Jim Conwell (513) 227-4131 jim.conwell@outlook.com   www.twoearsonemouth.net


Why have my employees gone phishing? Why can't data breaches be stopped?


From Anthem Healthcare to the Democratic National Committee, data breaches have grown more widespread and costlier with time. At the start, phishing attempts were mostly laughable: misspelled words in broken English about Nigerian princes. Today phishing campaigns can produce emails and websites that can't be distinguished from those they emulate. Still, the good guys in technology have stayed mostly current with their counterparts who lie, steal and sometimes ruin lives with their crimes. However, one part of the story that has kept email phishing and data breaches alive and well is the potential victims themselves. People, employees who haven't received proper training and don't take the time to stop and think before they click inside an email, help perpetuate the spread of malware.

It's interesting to learn that most phishing sites are up and running for less than 24 hours. This gives traditional methods of protection, such as updates to operating systems and security software, very little chance of staying current. Additionally, phishing sites are no longer dedicated to that purpose but are hidden within benign domains.

So as phishing emails increase, it is imperative that the employees of an organization learn how to "not take the bait."

The following are some of the most common tips offered to help email users:

  • Stay aware! – If you receive an email from someone you know, but it seems out of character, check with the sender for its authenticity.
  • Check it out! – If you're unsure about an email and the links within it, hover your mouse over each link and it will show you the URL it contains. If it looks deceptive, delete the email right away. If the web address appears to be right, check it again, making sure everything is spelled correctly.
  • No unapproved downloads.
  • Do not click any links within email without a thorough vetting process of the link. Remember the mantra: when in doubt, throw it out!
  • Be aware that over 60% of the impersonated companies are either in the fields of technology or finance. Be extra cautious as you receive offers in email from these types of companies.
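The "hover and check the URL" advice can even be automated. Here is a minimal Python sketch that compares a link's hostname against a list of trusted domains; the domain names are examples only, and a real filter would also need to handle redirects, homoglyphs and newly registered lookalike domains:

```python
# A minimal sketch of "hover and check the URL" as code: compare a link's
# hostname against domains you trust. Domain names here are examples only.
from urllib.parse import urlparse

TRUSTED = {"paypal.com", "microsoft.com", "mybank.com"}

def looks_trustworthy(url):
    """True only if the link's hostname is exactly a trusted domain
    (or a subdomain of one). Lookalikes such as 'paypa1.com' fail."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in TRUSTED)

print(looks_trustworthy("https://www.paypal.com/signin"))   # True
print(looks_trustworthy("https://paypa1.com/signin"))       # False: digit 1
print(looks_trustworthy("http://paypal.com.evil.net/x"))    # False: evil.net
```

Note the third case: the trusted name appearing at the *front* of a hostname means nothing; only the registered domain at the end counts.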

Strengthening an organization's IT security policy means moving beyond technologies designed to detect the older "static" phishing domains to more advanced and automated technologies. The company's employees also have to be engaged in the fight, through awareness training, to catch the attacks that get past the first and second lines of defense.

You can contact me at (513) 227-4131 or jim.conwell@outlook.com for specific strategies to join the fight like training your users what to look for!


I found that giving back can give you more.

I remember hearing our president at the time, George W. Bush, speak to a national audience on ways our nation could be made greater. He suggested that all citizens volunteer in some way to serve their local communities.

I liked that idea when I heard it. At the time I was struggling with my desire to give to charity. The cynic in me always questions what percentage of my donation to a charity actually reaches the people in need, and how quickly it reaches them. Volunteering would allow me to see firsthand the effect of my contribution on those in need in my community. My next thought was: what could I do, what do I have of value to give?

I thought of my relationship with my grown daughters, both of whom turned out to be amazing women. I loved being a parent to my daughters and always wanted a son to share some of my more "guy" attributes. I began to consider mentoring a child as a way to give back to the community. The thought of changing the life of a child who needed help seemed perfect. A friend of mine who mentors directed me to Big Brothers Big Sisters of Greater Cincinnati (BBBS).

I was impressed with BBBS from the start. Beyond the required background check for mentors, they conducted a detailed interview in my home to identify the traits that would make a good match with a child in their portfolio. Through this process Julie, the caseworker, determined not only that I enjoy watching sports but also that I like participating in them. This, in addition to other characteristics, would be used to find my match.

Into my life walks, or should I say runs, Jazques Peyton. I met Ques for the first time when he was 8 years old. We met at his home with the BBBS caseworker and his mother, Keya. Ques was very quiet and apprehensive, but I could see in his eyes an excitement to start this process. We started out slowly with one-hour meetings about once a week. Slowly but surely our time together increased, then really picked up as we began signing Ques up for sports teams. Ques took to organized sports immediately and with great enthusiasm. He had never participated before because he didn't have a father present to assist. Ques enjoys all three major sports: football, basketball and baseball. I've been able to help coach Ques' baseball team, which has given me the opportunity to see how he interacts with his peers. I also get a better perspective on his community and culture. Ques and I have developed an incredible relationship; I love him like he is my own son.

A second benefit I had hoped for, and have received, was closer exposure to a culture other than my own. I have never considered myself or my family racist, yet I did have an ignorance based on my lack of interaction with inner-city and low-income communities. This ignorance lent itself to a lack of compassion for these communities, and I assumed they would feel the same toward my culture. Ques' family has treated me like a family member, always making me feel welcome and going out of their way to be helpful. I receive the same treatment from his friends and neighbors. My ignorance and ill-conceived judgments faded quickly.

My goal in becoming a mentor to a young man was to enhance and change a life. Although I am not near completion of the process, my goal has been achieved. It has enhanced and changed my life; it provides a fulfillment like none I have ever experienced before.

If you would like to learn more about mentoring or my experience, please give me a call: (513) 227-4131.

Who-Dey!

Why outsource my IT? What's an MSP?


Most organizations, big and small, have gone through this exercise with information technology, as well as with other services: "Should I hire a dedicated person, assign IT to someone in the organization as an additional responsibility, or outsource?" A better term for outsourcing IT is engaging a Managed Service Provider (MSP). When posing this question for IT services, size matters! In this exercise, we will assume there are between 20 and 100 IT users in the organization considering an MSP.

When a company I consult with is near the lower end of this user count, many times they will tell me that an employee's relative, a brother, sister or husband, does their IT work. I call this type of IT provider a trunker, as their office and tools are in the trunk of their car. A trunker can be a smart way to go, providing prompt and personalized service. However, it is important that the trunker has a way to stay current with technology, and that at least one employee of the organization is aware of all he or she does and documents all passwords and major tasks.

I've seen that the same level of service can be achieved with an IT MSP as the organization outgrows the trunker. The MSP will typically charge an upfront cost to inspect and become familiar with the IT infrastructure. Then there will be a recurring charge, monthly or quarterly, for help-desk support that is handled either remotely or on the customer's site. With few exceptions, organizations of 100 employees or fewer are serviced satisfactorily with a remote agreement. When an issue calls for onsite service, they pay a predetermined labor rate. Another factor determined up front is the Service Level Agreement (SLA), which defines how quickly the MSP will respond. As with the trunker mentioned before, it's up to the organization to keep track of the IT provider and their tasks. This is made easier by the fact that an MSP, because it engages multiple technicians for one customer, needs to document everything for its own benefit.

The MSP is the system I see work most often. So let me answer my original question: why outsource my IT?

1)   Consistency and predictability of service. Based on the MSP's reputation and the SLAs provided, most organizations experience responsive service with high continuity. When the agreement ends, they can expect a smooth transition to the new vendor or person. I have witnessed many times a trunker relationship ending poorly, leaving the organization with no documentation and without even the passwords to access its own systems.

2)   Transparency. Most MSPs, as part of their service, offer dashboards showing the real-time status of devices on the network. Many even offer your business remote access to monitor your own network. This is a major cost reduction compared with the cost of hosting or maintaining monitoring yourself.

3)   Expertise. There is knowledge in numbers. Although you may only see or speak with one person as the face of your IT partner, you're working with a team with vast experience and knowledge. The technical staff of an MSP will always have a greater level of experience and a better knowledge of trends in technology. This is particularly true in regulated organizations such as healthcare and financial businesses.

Contact us for a free analysis of your business and what will serve it best.

The Business Cloud Roadmap


Cloud computing has taken a foothold in the business IT space, serving everyone from small companies to the largest enterprises. Almost all organizations, including regulated ones like healthcare, utilize the cloud in some form. However, very few have moved their entire infrastructure to a public cloud. The trend toward partial or "hybrid" cloud seems likely to continue.

The cloud computing concept is said to have originated when companies such as Google and Amazon saw the opportunity to sell, or rent, some of the surplus storage in their infrastructure. With the advent of virtualization software like VMware, the private or dedicated cloud became popular with organizations with multiple servers. Private cloud refers to a dedicated infrastructure of hosts (high-capacity servers) running virtualization software (a hypervisor) to create many more virtual servers. Public cloud is a resource pool of shared infrastructure supplied by a cloud provider, allowing customers to rent virtual servers. A popular form of hybrid cloud occurs when an organization runs its own private cloud and uses a public cloud provider to add virtual machines for additional workloads, backup or disaster recovery (DR). It is applications like DR and DevOps that have driven and sustained cloud growth. The term DevOps comes from the combination of development and operations, and refers to the ability to integrate these two groups of an organization. The promise of DevOps is to take a project from development to full implementation seamlessly and automatically.

Cloud Types

Current business cloud offerings can be classified into three categories: Software as a Service (SaaS), Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). SaaS is seen most often when a software company hosts its software and provides it to its customers, usually licensed by the user. Primary examples of SaaS are Microsoft Office 365 and SalesForce.com, two leaders in enterprise software. These types of offerings, with continued growth and popularity, assure that the enterprise won't have all its cloud "eggs in one basket". IaaS occurs when a cloud provider supplies virtual servers, or virtual machines (VMs), to its clients. This offers several benefits, including the elimination of CAPEX purchases of server hardware and increased agility and scalability for the IT network. IT agility has come to mean deploying new infrastructure for new applications quickly and embracing new trends such as DevOps. PaaS is a system of parts that developers can use to simplify and expedite building and implementing applications. A desirable PaaS platform is one that is highly reliable and very scalable, such as Amazon's Elastic Beanstalk.

Current Cloud Market Share Leaders

From the beginning, the leader among enterprise cloud providers has been Amazon Web Services (AWS). AWS is by far the fastest growing and most profitable part of Amazon's vast business, although not as well known as its consumer products. AWS grossed more than 7 billion dollars of revenue in 2015, year-over-year growth of nearly 80% from the previous year. Second place in business cloud providers is Microsoft, predicted by the analysts who follow them to exceed 8 billion dollars of revenue in 2016. This number includes Microsoft Office 365 and their AWS-like platform, Azure. Microsoft Office 365 has allowed Microsoft to claim a large share of the cloud market through its $5.00-per-month email inbox with virtually unlimited storage. The cloud trend drove Microsoft to cannibalize much of this share from itself and its Exchange Server software. However disruptive, it is the right direction, and Microsoft has committed to moving its products to the cloud and is doing so successfully under its new leadership. Recently Microsoft has been promoting the benefits of integrating the Office 365 product with Azure. This, coupled with the fact that Microsoft has a large partner network to support its existing enterprise software, should allow Microsoft to start taking market share from AWS, even though most business technology insiders believe AWS is a superior platform to Azure. Both of these companies face the enormous challenge of assisting and supporting their enterprise customers as their cloud infrastructures scale larger and larger. Third place in business cloud market share is Google, a position they are not used to being in. While still far behind, they have a very aggressive forecast for their business cloud product: Google predicts its cloud revenue will grow to exceed its search revenue within five years. Given that the primary component of a business cloud offering is a scalable infrastructure, I wouldn't count Google out.

Future Cloud Forecast (Trends)

With the top two business cloud providers already household names, and a distant third place held by a contender like Google, where do we go from here? I haven't yet mentioned the thousands of business cloud providers worldwide battling for what's left of the market, which by any standard is huge. These providers range from telecom carriers to managed service providers to smaller multi-tenant datacenters that have evolved from traditional colocation services. Many of the telecom carriers, whose original intent was to compete head to head with the big three, have dropped out. The multi-tenant datacenters, however, have developed a different model and have a much greater success rate. They provide a higher level of service through the initial consultation, implementation, and post-installation management and support. This model also includes Service Level Agreements (SLAs) that are explained, easily understood and measurable. Their pricing models are more consistent and easily understood as well; the big three's pricing can be confusing and is often filled with variables, like transaction costs, that can increase costs quickly. Another driver that causes the enterprise to look to the small datacenter is that it is local. The leadership of an enterprise, particularly a regulated one, wants to know exactly where its data resides and who is monitoring it.

I believe the future of the business enterprise cloud lies in a combination of the two models: large, scalable, worldwide providers like AWS and Azure, complemented by smaller, customer-focused data-center cloud providers. I also see these consultative providers often managing the entire cloud infrastructure of the business, their own portion as well as the AWS or Azure portion. There are a couple of ways this could happen. First, Amazon, Microsoft, and the like may focus on building a partner network of these service-minded data centers, allowing them to resell the big providers' products at an acceptable margin. Even more likely, these service providers may develop systems or portals that let their customers add services from any provider while the service provider manages the cloud infrastructure and the entire process.

However it plays out, the cloud is here to stay, along with a mix of business cloud providers that vary in the level of service and consultation they offer their clients.

Is the Small & Medium Business Market (SMB) ready to embrace the Cloud?

Microsoft thinks the SMB market is ready to embrace the cloud and is betting on it. Microsoft has recently adjusted its delivery channel to allow its smaller solution providers to easily sell and implement its cloud offering, Azure. These same providers have succeeded most recently with Office 365, and the Windows operating system (OS) and Office applications have been mainstays of the SMB network for years. Cloud solutions have made sense for the enterprise and larger businesses for some time; their larger budgets and greater internal expertise helped facilitate adoption. The small business market, however, has not widely participated, except for some low-entry-cost solutions like Microsoft Office 365 (O365).

Let's identify the different types of business cloud services, which are defined in large part by how much of the solution the cloud provider manages.

  1. Software as a Service– (SaaS)

In this framework everything is managed by the cloud provider. The components provided include networking, servers, storage, virtualization software (the hypervisor), the server operating system (OS), and the application itself. A good example is Microsoft Office 365 (O365), which provides its subscribers hosted email, OneDrive storage, and all the Office applications. Before O365, Office applications like Word and Excel were purchased and licensed outright.

  2. Platform as a Service– (PaaS)

In this framework the cloud provider manages less of the solution; the customer manages their own application and the data that supports it. The cloud provider still supplies the networking, servers, storage, hypervisor, and OS. A good example is a test and development environment: the customer may want to test new software or an application in a "live" environment before going online with it.

  3. Infrastructure as a Service– (IaaS)

In this framework the customer manages most of the solution, renting the hardware infrastructure from the cloud provider instead of buying and supporting the hardware themselves. Typically the cloud provider plays a more passive role here, which, in most cases, is what the customer wants. The cloud provider supplies the networking, servers, storage, and hypervisor, while the customer provides the OS, the application, and its data. Since the customer provides the OS, they can create their own servers, called virtual machines or VMs, in a matter of minutes without any interaction from the cloud provider. This "agility" the IT department can offer the business is one of the greatest technical benefits of the cloud. When most IT professionals speak of the cloud, they are usually talking about IaaS.
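The split of responsibilities across the three models above boils down to where the line is drawn in the stack. As a rough illustration only (not any provider's actual API or contract), it can be encoded as a simple lookup:

```python
# Illustrative only: which layers of the stack the provider manages under
# each service model described above; the customer manages the rest.
STACK = ["networking", "servers", "storage", "hypervisor",
         "OS", "application", "data"]

PROVIDER_MANAGED = {
    "SaaS": STACK[:6],  # everything through the application itself
    "PaaS": STACK[:5],  # through the OS; customer owns the app and data
    "IaaS": STACK[:4],  # hardware and hypervisor only
}

def customer_managed(model):
    """Layers the customer is responsible for under a given model."""
    provided = set(PROVIDER_MANAGED[model])
    return [layer for layer in STACK if layer not in provided]

print(customer_managed("IaaS"))  # → ['OS', 'application', 'data']
```

Reading the table top to bottom, each step from SaaS to IaaS hands one more layer back to the customer, which is exactly why IaaS demands the most in-house expertise.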

What barriers are dropping for SMB cloud growth?

  1. Price- Price is usually one of the primary purchase considerations for an SMB. Cloud, like other technology products that have become commodities, is falling in price quickly. Observing the business cloud market over the past five years, we have seen pricing drop between 30 and 50 percent, and as Microsoft becomes more aggressive in the SMB cloud market we expect that trend to continue, if not accelerate.
  2. In-house expertise- One of the barriers that held SMBs back from embracing the cloud was the level of internal expertise an IaaS implementation requires. That expertise is becoming more common as the cloud matures, and advancements in software have made cloud infrastructure easier to manage. Once a technical mystery, the cloud is becoming a commodity that many more organizations can manage themselves.
  3. Proprietary applications and regulatory challenges- The cloud has become more application "agnostic" as it has matured. Today's cloud providers accept all operating systems and their applications, so there are very few technical barriers left. Even more important, most cloud providers now offer complete industry and regulatory compliance. This has come as providers have built more compliant cloud infrastructure and as prospective customers have become better educated about how compliance is achieved in the cloud. Cloud providers need to consider United States as well as many international standards. U.S. government regulations that have begun to embrace the cloud include HIPAA/HITECH (healthcare), FERPA (education), and regulations for the departments of treasury, defense, state, and justice.

What are the predominant applications for SMB cloud?

As the barriers to SMB cloud adoption come down, most companies decide to start slowly, moving just one or two workloads to the cloud at first, and they almost always start with non-production applications. Below are the three most popular applications we see today:

  1. Test and Development- Organizations typically want to give new business applications, or even significant upgrades, a real-life test before going live to production. A cloud-based "test & dev" server is ideal for this: it's fast, and inexpensive compared to purchasing new hardware for a lab environment.
  2. Backup and Disaster Recovery (DR)- Backup and DR are well suited to the cloud. Virtualization, with its agility, seems designed for them: the best results, and the fastest recovery time objectives (RTO) and recovery point objectives (RPO), occur when the entire environment is virtualized.
  3. Integration with the Internet of Things (IoT)- When you need to manage devices and apps that live on the internet, what better way than with a cloud server that resides there too? Devices that measure and report on energy usage, employee productivity, or asset location and utilization are all well suited to cloud-based reporting and analytics.
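To make the RTO/RPO terms above concrete: RPO bounds how much data you can afford to lose (so backups must run at least that often), while RTO bounds how long a restore may take. A minimal sketch, with purely hypothetical numbers rather than any vendor's benchmark:

```python
# Hypothetical numbers, for illustration only: check whether a backup/DR
# plan meets its recovery objectives. The RPO caps acceptable data loss,
# so the backup interval must not exceed it; the RTO caps restore time.
def meets_objectives(backup_interval_hrs, restore_hrs, rpo_hrs, rto_hrs):
    """True if the plan satisfies both recovery objectives."""
    return backup_interval_hrs <= rpo_hrs and restore_hrs <= rto_hrs

# Nightly backups with a 4-hour restore vs. a 24h RPO / 8h RTO target:
print(meets_objectives(24, 4, rpo_hrs=24, rto_hrs=8))  # → True
```

A fully virtualized environment improves both numbers at once: VM snapshots can be taken more often (tightening RPO) and restored as whole machines (tightening RTO).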

These applications, along with the migration of a company's full range of internal and commercial applications, assure the continued growth of cloud services for businesses of all types and sizes.

You can contact me at (513) 227-4131 or jim.conwell@outlook.com to determine whether twoearsonemouth would be the right IT partner for your organization. We can migrate your business applications to Office 365 or Azure.

Anniversary of a Loss (A Personal Story)

It amazes me how our subconscious mind can be aware of things our conscious mind hasn't yet realized. This happens to me when the anniversary of a major life event is approaching and I start feeling the feelings before my conscious mind knows where they are coming from.

On August 5, 2016, my sister Pat Gateff succumbed to Parkinson's disease after a long battle. Parkinson's is a disorder of the central nervous system that affects movement and sometimes leads to dementia.
For as long as I can remember, Pat and I were close. She was older than me, and our relationship was often more maternal than that of siblings. One of my first memories, and one that remains clear to this day, is of Pat taking me to register for my first day of kindergarten while my mother was ill and in the hospital. As I grew older our relationship evolved into more of a sibling one, although Pat always played the role of a mentor or teacher. On the anniversary of her death I found myself thinking about some of the most important lessons Pat imparted to me through the way she lived her life.

1. Compassion- When I was younger my family had several elderly single aunts who were reaching the age of requiring special care. Pat would consistently spend many hours each week caring for them. I was young at the time, and I would wonder: why does Pat do this? Did my father make her? Was she being paid? I now know that she was being paid, although not in the sense I was thinking. The lesson learned, and one of the greatest life has to offer, is the benefit we receive by helping others. Setting aside your own needs to help those less fortunate can bring the greatest riches.

2. Forgiveness- Because of the trust my father had in her, Pat was the executor of his estate. As my father's death became imminent and her responsibilities became overwhelming, she asked me for help with some tasks. Although I was capable of doing what I was asked, my ego took over in the form of control and I began to overstep my bounds. I wasn't being helpful. Pat became very frustrated with me and gave me a stern "talking to"! She spoke to me in a tone, and with words, she had never used with me before. I remember thinking, "well, I'll never speak to her again!" It took a while, but I came to realize that I was at fault and needed to ask for forgiveness. Finding the courage to say I was sorry did not come quickly either. Finally, I was ready to make my amends, and it was amazing to witness firsthand what true forgiveness is and how it feels. When I finished stumbling through the words of my apology, her words and expression let me know I was forgiven, and had been forgiven before I ever started. As time went on my mistake was never mentioned again, as if it had never happened. Pat was able to forgive and forget!

3. Unconditional Love- No matter how I acted or spoke, Pat treated me the same, with respect and love. Watching Pat in her relationships with her husband and children gave me a better understanding and appreciation of what love looks like.

I try to carry all these characteristics into my own relationships. I believe that once in your lifetime you are given the person who will have the most profound impact on your life. Mine was my sister Pat. I will always be grateful for my relationship with her, and sad that I lost her too soon.