Choosing a cloud service provider in an evolving marketplace

Last year, Gartner axed 14 cloud vendors from its Magic Quadrant for Cloud IaaS, choosing to focus only on global vendors currently offering, or developing, hyperscale integrated IaaS and PaaS offerings. This bit of spring cleaning left behind a more manageable roster of six companies classified into two distinct segments: the Leaders, comprising Amazon Web Services, Microsoft, and Google, and the Niche Players, represented by Alibaba, Oracle, and IBM.


Even within this simplified field, Amazon continues to be the clear leader in terms of revenue and market share. 


Now AWS may dominate the cloud market, but Microsoft and Google are growing much faster.

According to Q1 2019 figures, these challengers grew at 75 and 83 percent respectively, against a comparatively middling 41 percent for AWS.

AWS's market share has also remained flat at 33 percent between Q1 2018 and Q1 2019.

Additionally, a Gartner scorecard that evaluates public IaaS cloud providers across 263 required, preferred and optional criteria found that Azure has pulled ahead of AWS in the required criteria for the first time since the scoring started.


Going multicloud and hybrid

But even as the cloud IaaS market coalesces into a “Big Three vs Others” comparison, RightScale’s 2019 State of the Cloud survey revealed that businesses are combining public and private clouds in hybrid multi-cloud strategies spanning nearly five clouds on average.

Another study from Kentik found that the most common cloud combination was AWS and Azure, with the AWS-Google Cloud combination trailing not far behind.

Though public cloud remains the top priority across enterprises, the number of companies deploying a hybrid public plus private cloud strategy is increasing.

At the same time, the share of companies with a multi-cloud strategy, combining multiple public or private clouds, has decreased. 

All this would suggest that the cloud market is not a zero-sum play. Businesses are using a combination of cloud providers, including the niche players, to design solutions that deliver the best outcomes. And cloud providers will have to take this preference for hybrid clouds into consideration while developing solutions for their customers. 

More importantly, none of this makes it any easier for customers to narrow down the right vendors for their workloads, given the absence of any common framework for assessment. But it is possible to outline a template of key considerations that should drive the choice of cloud provider. 

Choosing a cloud service provider

There are several factors that can influence a company’s choice of cloud provider, including the platform’s technology and architecture, data security, compliance and governance policies, interoperability, portability and migration support, and the services development roadmap.

The Cloud Industry Forum, a UK-based not-for-profit organization promoting cloud adoption, has a fairly comprehensive list of eight criteria for selecting the right cloud service provider. At Vinnter, we believe four aspects matter most when picking a cloud provider:

Location proximity: There are two reasons to ensure cloud service providers have actual operations in the customer’s target market. The first is latency, which some have even referred to as the Achilles heel of cloud adoption.

One study of the global network performance of AWS, Google Cloud and Microsoft Azure found that data center location directly affects network latency, with performance varying between providers when connecting across regions.

This lag can have serious performance implications for the many modern business and IoT applications that depend on low latency.
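As a rough illustration of how distance shows up as latency, the time to open a TCP connection to a regional endpoint can be measured directly. This is only a sketch: the hostnames in the comment are examples, and a production measurement would average many samples rather than trust a single connection attempt.

```python
import socket
import time

def tcp_connect_latency_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Time taken to open a TCP connection -- a rough proxy for network latency."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close immediately
    return (time.perf_counter() - start) * 1000.0

# Compare a few regional endpoints (illustrative hostnames -- substitute your own):
# for host in ["ec2.eu-north-1.amazonaws.com", "ec2.us-east-1.amazonaws.com"]:
#     print(host, round(tcp_connect_latency_ms(host), 1), "ms")
```

Running this from the target market against each candidate provider's nearest region gives a quick first-order comparison before any formal benchmarking.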

The second critical factor is data sovereignty. Many countries, including Russia, China, Germany, France, Indonesia and Vietnam, have data residency regulations that require data to be stored within their borders. The GDPR likewise imposes strict requirements on the collection and processing of EU residents’ data. 

Cloud providers are responding to these mandates of latency and sovereignty by opening up multiple regional data centers. For instance, AWS announced plans to open an infrastructure region in Italy, the company’s sixth in Europe, that would address both low latency and data residency requirements.

In Germany, Microsoft has placed customer data in its data centers under the control of an independent German data trustee, making it difficult for anyone, including Microsoft or US authorities, to access it without the customer’s permission.

Transparent pricing: Optimizing cloud costs continues to be a top priority, even among advanced cloud users, with one study estimating waste at about 35 percent of cloud spend. 

The study also identified four causes of this waste: the complexity of cloud pricing, a better-safe-than-sorry approach that leads to overprovisioning, lack of visibility into cost implications, and lack of adequate tools for optimizing spend.

As cloud providers launch new services and pricing models, pricing is only expected to get more complicated, with no common basis for comparison across services.

In fact, 2018 was predicted to be the year when cloud providers would consolidate and simplify their offerings and pricing structures.

A transparent pricing model can address almost all of the waste factors mentioned above. It gives customers a common, objective basis for comparing services, choosing the one that best suits their workloads, and matching provisioning to actual demand. 
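The arithmetic behind the waste estimate is simple. Assuming cost scales linearly with provisioned capacity (a simplification: real cloud pricing is tiered and discounted), the gap between what is provisioned and what is actually used translates directly into wasted spend:

```python
def wasted_spend(provisioned_units: float, used_units: float, unit_cost: float) -> float:
    """Cost of capacity paid for but never used (simple overprovisioning model)."""
    idle = max(provisioned_units - used_units, 0.0)
    return idle * unit_cost

def waste_fraction(provisioned_units: float, used_units: float) -> float:
    """Share of spend that is waste, assuming cost is linear in capacity."""
    if provisioned_units <= 0:
        return 0.0
    return max(provisioned_units - used_units, 0.0) / provisioned_units

# Example with made-up numbers: 100 provisioned vCPU-hours, 65 used, at $0.05 each
# gives $1.75 wasted -- a 35% waste fraction, matching the study's estimate.
```

Even this toy model shows why visibility into utilization matters: without the `used_units` figure, the waste is invisible on the invoice.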

Accessible documentation and support: Documentation and customer support are critical factors, often the difference between productivity and waste, in an evolving area such as cloud computing. Extensive, accessible documentation makes it easier for customers to implement and manage their cloud services optimally. This has to be backed by 24/7 customer service and dedicated account managers to help customers resolve their cloud service problems and queries.

Both Google Cloud and AWS provide comprehensive documentation as well as community forums where customers can raise implementation or performance issues. All three leading vendors offer a basic level of support by default and charge for anything more; Google and AWS, for instance, offer different levels of support at different price points.

Going forward, the documentation and support offered by a cloud service provider could become a key point of differentiation in customers’ choice of cloud platform. 

Finding the right cloud skills: As more and more businesses move their workloads to the cloud, there is growing demand for people with the skills to develop, operate and maintain end services deployed in a cloud environment. This will be a critical factor irrespective of the choice of cloud service provider.

But acquiring the right cloud skills has reportedly become a full-blown crisis, with a significant majority of IT managers finding it at least “somewhat difficult” to find cloud management talent. 

In order to deal with this crisis, Deloitte advises businesses to start with an inventory of cloud computing skills in the company across different areas such as architecture, security, governance, operations and DevOps as well as cloud brand-specific skills. 

The next step is to define the skills related to these same areas that are required to get the company to where it wants to be in terms of cloud technologies. Finally, train, hire and/or replace talent to build a cloud-first approach to technology. 

Today, it is simply not possible to create a like-for-like comparison of all major service providers that would be relevant to every business selecting a cloud platform. But there are broader themes that apply across services and should be weighed against each company’s technology and workload profile, usage pattern, technical maturity and budget.

More importantly, the cloud is not the plug-and-play environment it promised to be. Even post-adoption, every business will need in-house cloud skills to accelerate development and innovation. That may be the biggest challenge of all.

The analytical challenges of IoT data

If data is indeed the new oil, then we are still a long way from mastering the science of extracting, refining and deploying it as a strategic enterprise asset. That, in short, is the conclusion that emerges from two separate studies, and much of it traces back to the analytical challenges of IoT data.

The first study, from Gartner, classifies 87% of businesses as having low BI and analytics maturity. Organizations in this low-maturity group are further divided into two levels: a basic level characterized by spreadsheet-based analytics, and a higher, opportunistic level where data analytics is deployed but piecemeal, without central leadership or guidance.

In the second study, from New Vantage Partners, a majority of C-suite executives conceded that they had yet to create either a data culture or a data-driven organization. More worryingly, the proportion of companies that self-identified as data-driven appeared to be declining.

Companies may not yet be data-driven but the data flow shows no signs of slowing down.

According to IDC, the global datasphere will grow to 175 zettabytes (ZB) by 2025, up from 23 ZB in 2017. Even as that happens, the consumer share of this data will drop from 47% in 2017 to 36% in 2025. The bulk of this data surge, in other words, will be driven by what IDC calls the sensorized world: the IoT. 


Key challenges of IoT data

The surge of IoT data comes with a lot of economic value, estimated at around $11 trillion by 2025. But it also comes with some significant challenges in terms of aggregating data from disparate, distributed sources and applying analytics to extract strategic value. 

The primary challenge of IoT data is its real-time nature. By 2025, 30% of all data will be real-time, with IoT accounting for nearly 95% of that; 20% of all data will be critical, and 10% hypercritical. Analytics will have to happen in real time for companies to benefit from these types of data.

Then there is the issue of time series data: any data carrying a time stamp, from smart metering readings in the IoT space to stock prices. A company’s IoT infrastructure must be capable of collecting, storing and analyzing huge volumes of time series data, and the challenge is that most conventional databases are not equipped to handle it.
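To see why ordinary row stores struggle, consider what even a minimal time series store has to do: keep points ordered by timestamp and answer time-range queries efficiently. The sketch below is illustrative only; a real IoT deployment would use a purpose-built time series database.

```python
import bisect
from typing import List, Tuple

class TimeSeries:
    """Minimal append-only time series store with binary-search range queries."""

    def __init__(self) -> None:
        self._timestamps: List[float] = []
        self._values: List[float] = []

    def append(self, timestamp: float, value: float) -> None:
        """Append a point; timestamps must arrive in non-decreasing order."""
        if self._timestamps and timestamp < self._timestamps[-1]:
            raise ValueError("timestamps must be non-decreasing")
        self._timestamps.append(timestamp)
        self._values.append(value)

    def range(self, start: float, end: float) -> List[Tuple[float, float]]:
        """Return all (timestamp, value) points with start <= timestamp <= end."""
        lo = bisect.bisect_left(self._timestamps, start)
        hi = bisect.bisect_right(self._timestamps, end)
        return list(zip(self._timestamps[lo:hi], self._values[lo:hi]))
```

Notice that the store exploits arrival order for fast range scans, and that late-arriving (out-of-order) points break the model entirely; handling them gracefully is one of the things dedicated time series databases are built for.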

The distributed nature of IoT data, where most of the data is created outside enterprise data centers, presents its own set of analytics challenges. Chief among them is the need to process at least some of this data, especially critical and hypercritical data, outside the data center. IoT analytics itself will therefore have to become distributed, with some analytics logic shifting out of the cloud to the edge and spread across devices, edge servers, gateways and central processing environments. In fact, Gartner predicts that half of all large enterprises will integrate edge computing principles into their IoT projects by 2020.
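A minimal sketch of this split: readings above a criticality threshold are handled immediately at the edge, while the rest are batched for upload to central analytics. The field names and threshold here are invented for illustration; real edge frameworks express the same idea with rules engines rather than hand-written routing.

```python
from typing import Dict, List

def route_reading(reading: Dict, critical_threshold: float,
                  local_alerts: List[Dict], cloud_batch: List[Dict]) -> None:
    """Act on critical readings at the edge; defer the rest to the cloud."""
    if reading["value"] >= critical_threshold:
        local_alerts.append(reading)   # handled locally, low latency
    else:
        cloud_batch.append(reading)    # uploaded later for central analytics
```

The design point is that only the small critical fraction needs millisecond handling; everything else can tolerate the latency (and enjoy the scale) of the cloud.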

These are just a few of the characteristics that differentiate IoT data from conventional data sets. Traditional analytics technologies and capabilities are not designed to handle its volume, variety and complexity, and most companies will have to revamp their analytics stack to include IoT-specific capabilities such as streaming analytics, the ability to identify and prioritize between different data types and formats, and edge analytics. 

CSPs take the lead in IoT data analytics

Many companies are turning to cloud-based IoT platforms that offer rich data services alongside their core IoT offerings. Customers are looking for real-time capabilities across data ingestion, transformation, storage, processing, and analysis. Some cloud vendors are even offering their own hardware to improve interoperability and performance between IoT devices and the data processed in the cloud. 

According to a Bain & Company study, CSPs (cloud service providers) are seen as leaders in providing a comprehensive set of tools that address all the IoT data analytics needs of the enterprise. These CSPs, according to the same study, are also playing a key role in lowering barriers to IoT adoption, facilitating simpler implementations and enabling customers to design, deploy and scale new use cases as quickly as possible. 

AWS takes the lead among IoT CSPs

Among the big brand CSPs, Amazon AWS has consistently been ranked as the platform of choice, followed by Microsoft Azure and Google Cloud Platform, in the annual IoT Developer Survey conducted by the Eclipse IoT Working Group. 


With data collection and analytics remaining a top-three concern among developers, Amazon AWS offers arguably the most robust cloud-based IoT analytics solution on the market today.

The AWS IoT Analytics platform is a managed service that removes the complexity of operationalizing sophisticated analytics for massive volumes of IoT data. With AWS Lambda, developers also have access to a functional programming model for building and testing IoT applications for both cloud and on-premises deployments. 

For data collection and analytics, Amazon offers two distinct services: AWS IoT Analytics and Kinesis Data Analytics. 

AWS IoT Analytics covers the capabilities required for a wide range of IoT applications, with built-in AWS IoT Core support to simplify setup. It makes it much easier to cleanse bad data and to enrich data streams with external sources, and it gives data scientists access to raw and processed data, the facility to save and retrieve specific subsets of data, and rule-based routing of data across multiple processes.

Kinesis Data Analytics is better suited to real-time data ingestion applications, such as remote monitoring and process control, that require response times in the range of milliseconds. The service integrates with other AWS tools, including Amazon DynamoDB, Amazon Redshift, AWS IoT and Amazon EC2, to streamline the analytics process. The Kinesis suite comprises several services: Kinesis Data Streams, Kinesis Data Firehose, Kinesis Data Analytics and Kinesis Video Streams. Amazon Kinesis Data Streams enables continuous capture of large-volume, real-time data feeds and events of different kinds; raw data from Kinesis can then be cleaned and processed through AWS Lambda or Amazon ECS. Kinesis Data Firehose prepares and loads streaming data into S3, Redshift or Elasticsearch for near-real-time processing and analytics. 
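As a sketch of what ingestion into Kinesis Data Streams looks like from a device gateway, the snippet below serializes a sensor reading and puts it on a stream with boto3. The stream and device names are hypothetical, and the `put_record` call requires AWS credentials and an existing stream; only the serialization helper runs standalone.

```python
import json
import time

def encode_reading(device_id: str, value: float) -> bytes:
    """Serialize one sensor reading as a JSON payload for a Kinesis record."""
    return json.dumps({
        "device_id": device_id,
        "value": value,
        "ts": time.time(),
    }).encode("utf-8")

def send_to_kinesis(stream_name: str, device_id: str, value: float) -> None:
    """Put one reading on a Kinesis data stream (needs AWS credentials)."""
    import boto3  # imported here so the encoding helper stays testable offline
    client = boto3.client("kinesis")
    client.put_record(
        StreamName=stream_name,            # hypothetical, pre-created stream
        Data=encode_reading(device_id, value),
        PartitionKey=device_id,            # keeps one device's records ordered
    )
```

Using the device ID as the partition key is a common pattern: it preserves per-device ordering while letting Kinesis spread load across shards.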

While Kinesis offers developers more flexibility in development and integration, AWS IoT focuses on simplifying deployment using prebuilt components. It is possible to combine these two solutions to build a comprehensive IoT solution encompassing streaming as well as at-rest data.  

Late last year, Amazon AWS announced four new capabilities that make it easier to ingest data from edge devices. AWS IoT SiteWise is a managed service that makes it easy to collect, structure, and search data from industrial equipment at scale. With AWS IoT Events, customers can detect and respond to events from large numbers of IoT sensors and applications. AWS IoT Things Graph enables a no-code approach to IoT development, with a visual drag-and-drop interface that lets developers build IoT applications by connecting devices and services and defining their interactions. And finally there is AWS IoT Greengrass Connectors, a service that enables developers to connect devices to third-party applications, on-premises software, and AWS services through cloud APIs.

Over and above all this, AWS has built a strong partner network of edge-to-cloud service providers and device manufacturers to offer customers the deep technical and domain expertise required to mitigate the complexity of IoT projects. 

Apart from being a developer favorite, AWS IoT has also built up a client roster of some of the biggest brands in the industry including LG, Bayer, NASA, British Gas and Analog Devices, to name just a few. 

Notwithstanding the challenges of big data and analytics, there have been many successful IoT implementations across diverse sectors. Here are a few success stories of how companies and their IoT partners have harnessed the power of big data analytics in IoT. 

IoT data analytics success stories

Bayer & AWS IoT Core: Bayer Crop Science, a division of Bayer, provides a range of products and services that maximize crop production and enable sustainable agriculture for farmers worldwide. The company used IoT devices on harvesting machines to monitor crop traits, with the data then manually transmitted, over several days, to its data centers for analysis. The lack of real-time collection and analytics meant that Bayer could not immediately address issues with equipment calibration, jamming, or deviations to help with routing plans for subsequent runs. 

Already an AWS customer, Bayer’s IoT team decided to move its data-collection and analysis pipeline to AWS IoT Core. The company built a new IoT pipeline to manage the collection, processing, and analysis of seed-growing data. 

The new solution captures multiple terabytes of data, at an average of one million traits per day during planting or harvest season, from the company’s research fields across the globe. This data is delivered to Bayer’s data analysts in near real-time. The AWS IoT solution also provides a robust edge processing and analytics framework that can be scaled across a variety of IoT use cases and IoT initiatives. 

Bayer is now planning to use AWS IoT Analytics to capture and analyze drone imagery and data from environmental IoT sensors in greenhouses for monitoring and optimizing growing conditions.

Microsoft Azure IoT Hub & ActionPoint: Many manufacturers still use paper checklists, manual processes, human observation and legacy closed-loop technologies to monitor and maintain their equipment. Even in modernized plants, manufacturers often lack the right sensors to provide all the data required, or have no analytics solution to make sense of the sensor data.

Custom software developer ActionPoint partnered with Microsoft and Dell Technologies to develop IoT-PREDICT, an industrial IoT solution for predictive maintenance that incorporates machine learning, data analytics, and other advanced capabilities. The solution is powered by the Microsoft Windows 10 IoT Enterprise operating system running on Dell Edge Gateway hardware, and combined with the Microsoft Azure tool set to provide state-of-the-art edge computing. 

The combination of Windows 10 IoT Enterprise and Azure delivers a highly effective IoT solution that customers can deploy in minutes. It also gives the IoT-PREDICT solution the flexibility and scalability that allows manufacturers to start small with IoT and grow at their own pace.

IoT-PREDICT helps manufacturers quickly reduce downtime, lower costs, and increase the overall efficiency of their equipment and operations. It maximizes the impact of manufacturer data by using the Microsoft Azure IoT Hub to gather data and make it available to several Azure services, including Azure Time Series Insights and Azure Stream Analytics. Manufacturers can explore the data with Time Series Insights, or act on it with Stream Analytics by setting up queries and alerts based on various performance thresholds.
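The kind of threshold rule that Stream Analytics expresses as a SQL query can be sketched in a few lines of Python. The machine names, field names and limits below are invented for illustration only:

```python
from typing import Dict, Iterable, List

def threshold_alerts(readings: Iterable[Dict], limits: Dict[str, float]) -> List[Dict]:
    """Emit an alert for every reading that breaches its machine's limit --
    the same shape of rule a streaming-analytics query would express in SQL."""
    alerts = []
    for r in readings:
        limit = limits.get(r["machine"])
        if limit is not None and r["vibration"] > limit:
            alerts.append({
                "machine": r["machine"],
                "vibration": r["vibration"],
                "limit": limit,
            })
    return alerts
```

In a real predictive-maintenance pipeline the readings would arrive as an unbounded stream and the alerts would trigger work orders, but the per-record logic is exactly this simple comparison.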

IoT data analytics has unique characteristics and challenges that conventional analytics technologies cannot address. But as in any analytics operation, the primary objective remains the same: to generate actionable insights that create positive business value. It is not just a matter of choosing a sensor, a connectivity protocol or a CSP; to ensure that every IoT project leads to demonstrable business value, organizations have to protect the integrity of what McKinsey calls the insights value chain, end to end. 

Pentair & AWS: Pentair is a water treatment company that offers a comprehensive range of smart, sustainable water solutions to homes, businesses and industries around the world. The company relies on connected systems to monitor and manage its product installations, most of which are in remote locations. Traditionally, the company took the custom-build route to develop its connected systems, which came with its own set of disadvantages. 

Pentair needed a powerful, flexible IoT platform, with high availability and scalability and a high degree of reuse across all lines of its business. Pentair also wanted a comprehensive solution that covered everything from IoT data ingestion, to analysis and visualization. 

The company teamed up with AWS Partner Network (APN) Advanced Technology Partner and IoT Competency Partner Bright Wolf to evaluate potential technology providers including Amazon, GE, IBM, Microsoft and others against a set of platform characteristics. This included data ingestion, infrastructure controls, deployment options, machine learning and visualization tools, development tools and the overall openness of each platform.

“AWS came out on top when it came to the raw scoring,” says Brian Boothe, the lead for Pentair’s Connected Products Initiative.

To date, Pentair has deployed three different connected solutions using the AWS IoT platform and a flexible, scalable, and reusable reference architecture developed by Bright Wolf. The benefits, according to Pentair, include accelerated time to market for value-added services, simpler integration, cost savings from deploying commodity edge devices on the open AWS IoT platform, and enterprise-grade scalability and availability. 

Recruitment success at Vinnter

During the spring and summer, we at Vinnter have been working extremely hard to find new employees for the best workplace in Göteborg (yes, we might be biased in this judgement… 😁). And our recruitment efforts have finally paid off.

We have been working with several tools and services in parallel since the autumn of 2018, but in early spring we decided to give LinkedIn job ads a chance. This has proven to be the most cost-effective way of finding the best possible applicants.

A 3% hit rate may sound low, but we are extremely satisfied. We have held around 50 initial interviews and approximately 20 second interviews. Our best move, though, was deciding to develop our own test for applicants who pass the first interview. It has been a great help in identifying those who really are ready to start with us from a competence perspective.

If you are interested in what opportunities you might find here, please head over to our career pages.

In the end, it has resulted in us signing four new employees! We are super excited to get them on board (some have already started) and have them contribute to our customers’ success and satisfaction with Vinnter as their chosen partner for the development of IoT and connected things.

Curious about who we are? Just head over to an overview of our team! You can hire a team of these great people if you want!

Adam Grönberg

Adam is one of our latest recruits at Vinnter. During 2019 we have worked hard to find the right skills and mindset in applicants. Adam is a person who thinks things through, and his heart lies in backend development and agile methodologies.

Although his heart lies in Java and backend development, he has a proven track record of delivering great results in frontend work as well as in Android app development.

Benedikt Ivarsson

Benedikt is a cool, calm and collected superhero who came to Sweden from our neighbour Iceland. He has extensive experience from a range of different roles and companies, and his competence is broad.

He is an ambitious and honest leader who does everything within his power to deliver a satisfied customer and a job well done. He finds it easy to keep the bigger picture in view when selecting a solution for a project.

With his positive attitude he makes a good team player as well as a good leader. His educational background is in Software Engineering, Project Management, IT Infrastructure and Enterprise

Specialties: Finding solutions, Flexible management, Project management, Product development, Software architecture, Pre-sales, Consulting, IoT, Python, C#, C++, C, Java, Hosting Services, Integration, ATMs, Provisioning, Mediation, Automotive industry, Health care systems, Fintech solutions, Telco solutions, Military Defence solutions, Financial Management.

Wishing you all a great summer!

2019 has been a great year for Vinnter. Lots of things have happened, and much more is still to change before we may consider ourselves “done”. Of course, we never get “done”, since this is not how the world works. Our customers are in constant change, and so must we be. But it is not change for the sake of changing; it is about adapting to new circumstances in our business as well as in our customers’ business.

Vinnter started out focused on developing electronics hardware for our customers, accompanied by the software needed to adapt the electronics to customer requirements and connect them to the Internet. Today our main business and our customers’ requirements have shifted towards more software-related development, both embedded and cloud, and it is abundantly clear that cloud services are here to stay when it comes to connected things.

During the year we have signed three new employees, and we are looking to employ even more. We have more to do than ever before and are looking forward to a great year, both financially and assignment-wise.

We look forward to interacting with you all again after the summer vacations! Meanwhile, we wish you a great summer focused on relaxation and regaining energy through great activities with friends and family.

From all of us; HAVE A GREAT SUMMER!

Magnus Ivarsson

From the start of the IT boom, Magnus has been paving his way towards greater knowledge and experience in embedded software development. For several years he worked as an embedded developer at Assa Abloy, and somewhere along the way a strong commitment to test-driven development took firm hold in him.

As a subcontracted consultant to Vinnter, Magnus has been dedicated to working with us for so long that we consider him “ours”. At Vinnter we do not leave any consultants behind… they are as much a part of our community as they themselves want to be. And we always welcome Magnus’s great sense of humor and his insightful contributions to the problems we need to solve.

Ulf Eliasson

Ulf focuses much of his attention on embedded software development, while not excluding any other part of the IoT value chain. He is very versatile and takes great pride both in writing beautiful code and in putting end-user requirements and experience first.

He does not start building an embedded software solution before the first test is written. According to Ulf, this is one of the most important factors behind great software that just works.

His experience before starting his career with us at Vinnter was gathered from working with software solution development within a larger automotive company located in the Göteborg area. Any guess which?

Magnus Bergqvist

There is no doubt that Java has changed the world of backend development since it arrived. Magnus is one of the believers, and he has grown into a senior software developer with a special edge in Java as well as in many other segments of advanced server system development.

Since 2012 he has been climbing the ladder of backend development, and his latest work mainly involves cloud service development.

Today Magnus is a recognized expert in architecture as well as a high-performing senior developer of advanced server systems.