Using design thinking when developing IoT solutions

A McKinsey analysis of over 150 use cases estimated that IoT could have an annual economic impact in the $3.9 trillion to $11.1 trillion range by 2025. At the top end, that’s a value equivalent to 11% of the global economy. We believe that using design thinking when developing IoT solutions is what will allow the industry to capture that value.

SOURCE: https://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/the-internet-of-things-the-value-of-digitizing-the-physical-world

It is not that difficult to envisage that degree of value-add given the continually evolving technical and strategic potential of IoT; there are new standards, components, platforms, protocols, etc. emerging almost daily. It is now possible to combine these options in seemingly endless ways, to address different use case requirements of connectivity, bandwidth, power consumption, user interaction, etc. to suit almost every potential application and user need out there. 

There will, however, be several technical, regulatory and human resources challenges that will have to be addressed before we can extract the real value of IoT. But perhaps the biggest challenge will lie in the approach that IoT companies take to identifying user needs and developing solutions that represent real value. 

Every technology cycle, from the dot com boom to the current AI gold rush, produces its own set of quirky, weird and downright pointless applications. And IoT is no different, with many products boasting connectivity features that may qualify them for a “smart” tag but offer no real benefits whatsoever. Every premier industry event like CES is followed by a slew of news roundups describing bewilderingly absurd “smart” solutions, from smart dental floss to $8,000 voice-activated toilets. 

But we believe that IoT’s true potential and value will only emerge when the focus is squarely on leveraging the power of IoT to address what are known as wicked problems. 

The concept of wicked problems was first defined by design theorist Horst Rittel in the context of social planning in the mid 1960s. It refers to complex issues, characterized by multiple interdependent variables and disparate perspectives, which seem impossible to solve. These problems do not necessarily lend themselves to traditional linear problem-solving processes and methodologies and require a new approach that can handle the inherent ambiguity and complexity of these issues. It was design theorist and academic Richard Buchanan who, in 1992, referenced design thinking as the innovation required to tackle wicked problems. 

Notwithstanding smart litter boxes that can text and smart garbage cans that automate shopping lists, the focal point of IoT has to be on identifying and addressing intractable problems and design thinking is the approach that will enable the IoT industry to do just that.  

Design thinking – A brief history

For many in the industry, design thinking is almost inextricably linked to Tim Brown and IDEO, and both played an important role in mainstreaming both the term and the practice. But as IDEO helpfully clarifies on its website, though they are often credited with inventing the term, design thinking has roots in a global conversation that has been unfolding for decades.

To understand how that conversation unfolded, we turn to Nigel Cross, Emeritus Professor of Design Studies at The Open University, UK, and his 2001 paper Designerly Ways Of Knowing: Design Discipline Versus Design Science. The paper traces the roots of what would eventually evolve into design thinking to the 1920s, and the first modern design movement. According to Cross, the aspiration was to “scientise” design and produce works of art and design that adhered to key scientific values such as objectivity and rationality. 

These aspirations surfaced again, in the 1960s, but the focus had evolved considerably. If formerly the emphasis was on scientific design products, the design methods movement of the 60s focused on the scientific design process and design methodology emerged as a valid subject of inquiry. The decade was capped by cognitive scientist and Nobel Prize laureate Herbert Simon’s 1969 book, The Sciences of the Artificial, which refers to techniques such as rapid prototyping and testing through observation that are part of the design thinking process today.

This “design science decade” laid the groundwork for experts from various fields to examine their own design processes and contribute ideas that would move the aspiration to scientise design along.  IDEO came along in the early 90s with a design process, modeled on the work developed at the Stanford Design School, that even non-designers could wrap their head around, thus providing the impetus to take design thinking mainstream. By 2005, Stanford had launched its own course on design thinking. Today, there are several leading educational institutions offering design thinking courses and a whole range of non-design businesses that rely on design thinking to resolve some of their wickedest problems. 

So, what is design thinking?  

Let’s start with a slice of history again. 

In the 80s, Bryan Lawson, professor at the School of Architecture of the University of Sheffield, United Kingdom, conducted an empirical study to understand how the approach to problem-solving varies between scientists and designers. The study revealed that scientists used problem-focused strategies as opposed to designers who employed solution-focused strategies. Scientists solve by analysis whereas designers solve by synthesis. 

A problem-focused approach relies on identifying and defining all parameters of a problem in order to create a solution. Solution-focused thinking, on the other hand, starts with a goal, say an improved future result, rather than focusing only on resolving the problem. 

Design thinking is a solution-focused methodology that enables the creative resolution of problems and creation of solutions, with the intent of an improved future result. It’s an approach that values analysis as well as synthesis. It is an integrated cognitive approach that combines divergent thinking, the art of creating choices, with convergent thinking, the science of making choices. Design thinking provides non-designers with elements from the designer’s toolkit that allows them to take a solution-focused approach to problem-solving. 

SOURCE: https://www.ideou.com/pages/design-thinking

IDEO’s definition of design thinking as a human-centered approach also includes what is often referred to as the three lenses of innovation: desirability, feasibility and viability. Human-centered design always begins by establishing desirability, defining what people want. The next stage is to establish if it is technically feasible to deliver what people want. And finally, even a desired and technically feasible solution must be commercially viable for a business. Design thinking, then, is a process that delivers innovative solutions that are optimally positioned at the overlap between desirability, feasibility and viability.

This framework should be the ideal starting point for product development in the IoT industry. Today, a lot of solutions seem to take desirability and viability for granted just because it is technically feasible to embed almost anything with connectivity. But is this the right approach to IoT innovation?  

The 5-stage design thinking model 

The design thinking process guide from the Hasso-Plattner Institute of Design at Stanford (d.school) prescribes a 5-stage model that progresses as follows:

SOURCE: http://longevity3.stanford.edu/designchallenge/design-thinking-process/

EMPATHIZE: Empathy is a critical component of the human-centred design process as it rarely if ever begins with preconceived ideas, assumptions and hypotheses. This stage allows enterprise teams to better understand the people that they are designing for; understand their needs, values, belief systems and their lived experience. As the process guide puts it, the best solutions come out of the best insights into human behavior. Design thinking encourages practitioners to observe how people interact with their environment in the context of the design challenge at hand. Designers should also directly engage with end users, not in the form of a structured interview but as a loosely bounded conversation. Both these approaches can throw up insights that may not necessarily be captured by historical data or expert opinions. 

DEFINE: This stage is more about defining the design challenge from the perspective of the collected end-user insights rather than defining a solution. The “define” stage enables the synthesis of vast amounts of data, collected in the previous stage, into insights that can help focus the design challenge. At the end of this stage, it must be possible to articulate an actionable problem statement that will inform the rest of the process.

IDEATE: The purpose of ideation is not to home in on the right idea but to generate the broadest range of possible ideas that are relevant to the design challenge. Finding the right idea will happen in the user testing and feedback stage. In the meantime, use as many ideation techniques as possible to move beyond the obvious into the potentially innovative. Most important of all, defer judgement, as evaluating ideas as they flow can curb imagination, creativity and intuition. At the end of the ideation process, define quality voting criteria to move multiple ideas into the prototyping stage.

PROTOTYPE: Build low-resolution (cheap and quick) prototypes as it means that more prospective ideas can be tested. Use these prototypes to elicit feedback from users and the team that can then be looped back into refining these solutions across multiple iterations. A productive prototype is one that communicates the concept of the proposed solution, stimulates conversation and allows for the quick and cheap failure of unworkable ideas. 

TEST: Prototyping and testing often work as two halves of the same phase rather than as two distinct phases. In fact, the prototype design will have to reflect the key elements that must be tested and even how they will have to be tested. Testing need not necessarily focus only on users’ feedback to the presented prototype. In fact, this stage can sometimes generate new insights as people interact with the prototype. Rather than telling users how to use the prototype, allow them to interact freely and compare different prototypes. 

And finally, there is iteration. This is not so much a stage as a golden rule of design thinking. The point of design thinking is to create an iterative learning loop that allows teams to refine and refocus ideas or even change direction entirely.

Of course, the Stanford model is not the only design thinking framework in circulation today. Those interested in more options can find an introductory compilation at 10 Models for Design Thinking. Though these frameworks may vary in nomenclature and process structure, some central design thinking concepts such as empathy and iteration remain common to most.

Is design thinking effective? 

According to one source, only 24% of design thinking users measure the impact of their programs. Even a survey from Stanford’s d.school found that organizations struggled to determine ROI.  

However, in an excellent article in the Harvard Business Review, Jeanne Liedtka, professor of business administration at the University of Virginia’s Darden School of Business, concludes, after a seven-year, 50-project, cross-sectoral qualitative study, that “design thinking has the potential to do for innovation exactly what TQM did for manufacturing: unleash people’s full creative energies, win their commitment and radically improve processes.”

A more quantitative study by Forrester on The Total Economic Impact Of IBM’s Design Thinking Practice provides a litany of quantified benefits that includes the realization of $20.6 million in total value due to a design thinking-led reduction in design, development and maintenance costs.  

But the limited availability of quantitative data has been offset by the steady stream of success stories of world-leading companies transforming elements of their business with design thinking. 

Design thinking offers the framework that, at a fundamental level, will enable the IoT industry to reorient itself away from a “what can I connect next to the internet” mindset to a “where do users need help the most” approach. Its human-centric empathy-driven approach enables businesses to identify and understand potential contexts and problems from the perspective of the end-user rather than from the point of view of the possibilities afforded by technology. Companies can now use the three lenses of innovation to evaluate the practical, technical and commercial value of the solutions that they plan to deploy. And finally, the inclusive and iterative design process will ensure a much higher probability of success while enabling real value for customers. 

Access Control & IoT Security: Challenges and Opportunities

IoT, the new attack vector

IoT attacks increased by over 217% in 2018. But a report with the provocative title of IoT Cyberattacks Are The Norm, The Security Mindset Isn’t found that only 7% of organizations consider themselves equipped to tackle IoT security challenges. If that sounds wanting, consider this: 82% of organizations that develop IoT devices are concerned that the devices are not adequately secured from a cyberattack. Another study found that only 43% of enterprise IoT implementations prioritize security during the development/deployment process and only 38% involve security decision-makers in the process. Access control is considered the first line of defence when it comes to IoT security.

Now, those broad trend indicators can possibly apply to any nascent technology. But there are two factors that make the IoT scenario particularly precarious. The first is the fact that, by all indications, the IoT is emerging as a potentially preferred attack vector for launching botnet assaults or even infiltrating enterprise networks. The second is that thus far, the IoT industry, from device developers to enterprise IT organizations, seems oblivious or ill-equipped to even secure access control and authentication, one of the fundamental components of any technology security strategy. 

Key IoT security challenges

However, an objective analysis of the scenario cannot but mention some of the unique characteristics of IoT networks that make security much more of a challenge than with other technology environments.  

First off, there’s the attack surface. An estimated 20 billion devices will be connected to the IoT by 2020; that’s 20 billion potential endpoint targets for malicious intent. A lot of these devices will be deployed in areas where it may be impossible or impractical to provide physical security, which makes it easier for bad actors to physically compromise devices on the network. Apart from the physical device, each IoT system comprises multiple edges and tiers including mobile applications, cloud and network interfaces, backend APIs, etc. Each one of these elements represents a potential vulnerability and just one unsecured component can be leveraged to compromise the entire network.

Second, there’s the sheer heterogeneity of IoT networks, with a range of different hardware and software stacks, governed by different access-control frameworks and with varying levels of privileged access. This means that there is no one-size-fits-all approach to security and that the IoT security strategy will have to be designed around the characteristics of participating entities on each network.

And finally, most IoT devices have limited power, storage, bandwidth and computational capabilities. So conventional security methods that are effective in other computing systems will be too complex to run on these constrained IoT devices. 

Device visibility precedes access control 

It is this distributed nature of IoT, where large volumes of devices communicate autonomously across multiple standards and protocols, that makes security more complex than it is in other more monolithic computing environments. That’s also why the IoT industry will need to reimagine conventional access control and authentication models and protocols and repurpose them for this new paradigm. The right access control and authentication frameworks enable companies to identify IoT devices, isolate compromised nodes, ensure the integrity of data, authenticate users and authorize different levels of data access.

Since access control is the first point of contact between a device and the IoT network, these technologies must be able to recognize devices in order to determine the next course of action. IoT devices have to be visible before access control and authentication can kick in and do their job. But most enterprises currently do not fare very well on the IoT device visibility score; a mere 5% keep an inventory of all managed IoT devices and only 8% have the capability to scan for IoT devices in real-time. But 46% are making it a priority in 2019 to enhance IoT discovery, isolation and access control, and that provides the starting point for a discussion on the merits of the different access control models available today.

There are several types of access control models that can be considered for different IoT scenarios; from the basic ACL (Access Control List) model, to the slightly more advanced MAC (Mandatory Access Control) model used primarily in military applications, to the still-evolving and sophisticated Trust Attribute-Based Access Control model that builds on the ABAC (Attribute-Based Access Control) model to address requirements specific to IoT.

Types of access control and authentication models 

But for the purposes of this article, we shall focus on the more mainstream models: RBAC (Role-Based Access Control), ABAC, CapBAC (Capability-Based Access Control) and the UCON (Usage Control) model.

RBAC: As the name suggests, this model manages resource access based on a hierarchy of permissions and rights assigned to specific roles. It allows multiple users to be grouped into roles that need access to the same resources. This approach can be useful in terms of limiting the number of access policies but may not be suitable for complex and dynamic IoT scenarios.  However, it is possible to extend RBAC to address fine-grained access control requirements of IoT though this could result in “role explosion” and create an administrative nightmare. 
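As a minimal illustration (the role names, permissions and users below are invented for the example), an RBAC check reduces to looking up whether any of the requester's roles carries the required permission:

```python
# Minimal, illustrative RBAC check: roles map to sets of permissions,
# users are assigned roles, and a request is allowed only if one of the
# user's roles carries the required permission. All names are made up.

ROLE_PERMISSIONS = {
    "building_admin":   {"lock.read", "lock.write", "camera.read", "camera.configure"},
    "security_officer": {"lock.read", "camera.read"},
    "maintenance":      {"hvac.read", "hvac.write"},
}

USER_ROLES = {
    "alice": {"building_admin"},
    "bob":   {"security_officer", "maintenance"},
}

def is_allowed(user: str, permission: str) -> bool:
    """Return True if any of the user's roles grants the permission."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

print(is_allowed("bob", "camera.read"))   # True
print(is_allowed("bob", "lock.write"))    # False
```

The role-explosion problem mentioned above is visible even in this toy example: every new combination of fine-grained device permissions would require yet another role definition.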

The OrBAC (Organizational-Based Access Control) model was created to address issues related to RBAC and to make it more flexible. This model introduced new abstraction levels and the capability to include different contextual data such as historic, spatial and temporal data. There has also been a more recent evolution along this same trajectory with Smart OrBAC, a model designed for IoT environments that offers context-aware access control. 

ABAC: In this model, the emphasis shifts from roles to attributes on the consideration that access control may not always have to be determined by just identity and roles. Access requests in ABAC are evaluated against a range of attributes that define the user, the resource, the action, the context and the environment. This approach affords more dynamic access control capabilities as user access and the actions they can perform can change in real-time based on changes in the contextual attributes.  

ABAC provides more fine-grained and contextual access control that is more suited for IoT environments than the previous RBAC. It enables administrators to choose the best combination of a range of variables to build a robust and comprehensive set of access rules and policies. In fact, they can apply access control policy even without any prior knowledge of specific subjects by using data points that are more effective at indicating identity. The biggest challenge in this model could be to define a set of attributes that is acceptable across the board. 
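A minimal sketch of the idea, with invented attribute names and policies: each rule is a predicate over subject, resource, action and environment attributes, and access is granted if any rule holds.

```python
# Minimal, illustrative ABAC evaluation: each policy is a predicate over
# subject, resource, action and environment attributes. Access is granted
# if any policy evaluates to True. Attribute names are made up.

from datetime import time

POLICIES = [
    # Technicians may write to sensors at their own site, but only during work hours.
    lambda s, r, a, e: (
        s.get("role") == "technician"
        and a == "write"
        and r.get("type") == "sensor"
        and r.get("site") == s.get("site")
        and time(8, 0) <= e.get("time") <= time(18, 0)
    ),
    # Anyone authenticated may read non-sensitive telemetry.
    lambda s, r, a, e: a == "read" and not r.get("sensitive", False),
]

def is_allowed(subject, resource, action, environment) -> bool:
    return any(policy(subject, resource, action, environment) for policy in POLICIES)

print(is_allowed(
    {"role": "technician", "site": "plant-7"},
    {"type": "sensor", "site": "plant-7", "sensitive": True},
    "write",
    {"time": time(14, 30)},
))  # True
```

Note how the same two rules cover many users and devices without enumerating them, which is what makes the model attractive for dynamic IoT environments.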

CapBAC: Both RBAC and ABAC use a centralized approach to access control, meaning that all authorization requests are processed by a central authority. Though these models have been applied in IoT-specific scenarios, achieving end-to-end security using a centralized architecture on a distributed system such as the IoT can be quite challenging.

The CapBAC model is based on a distributed approach where “things” are able to make authorization decisions without having to defer to a centralized authority. This approach accounts for the unique characteristics of the IoT such as large volume of devices and limited device-level resources. Local environmental conditions are also a key consideration driving authorization decisions in this model, thus enabling context-aware access control that is critical to IoT. 

The capability, in this case, refers to a communicable, unforgeable token of authority that uniquely references an object as well as an associated set of access rights or privileges. Any process with the right key is granted the capability to interact with the referenced object as per the defined access rights. The biggest advantage of this model is that distributed devices do not have to manage complex sets of policies or carry out elaborate authentication protocols which makes it ideal for resource constrained IoT devices.
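CapBAC implementations vary; as one minimal sketch, under the assumption of a symmetric secret shared between the token issuer and the device, the capability below is an HMAC-signed record of object, rights and expiry that a constrained device can verify locally, without consulting a central policy store.

```python
# Illustrative capability token: the issuer signs (subject, object, rights, expiry)
# with a secret shared with the device. The device only needs to verify the HMAC
# and check rights/expiry locally. Field names and secret handling are illustrative.

import hashlib
import hmac
import json
import time

ISSUER_SECRET = b"device-shared-secret"  # placeholder; provisioned securely in practice

def issue_capability(subject, object_id, rights, ttl_seconds=300):
    body = {"sub": subject, "obj": object_id, "rights": rights,
            "exp": int(time.time()) + ttl_seconds}
    payload = json.dumps(body, sort_keys=True).encode()
    tag = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def device_check(capability, requested_action, object_id):
    payload = json.dumps(capability["body"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, capability["tag"]):
        return False  # forged or tampered token
    body = capability["body"]
    return (body["obj"] == object_id
            and requested_action in body["rights"]
            and body["exp"] > time.time())

token = issue_capability("app-42", "lock-frontdoor", ["unlock"])
print(device_check(token, "unlock", "lock-frontdoor"))     # True
print(device_check(token, "configure", "lock-frontdoor"))  # False
```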

UCON: This is an evolution of the traditional RBAC and ABAC models that introduces more flexibility in handling authorizations. In the traditional models, subject and object attributes can be changed either before the authorization request begins or after it is completed, but not once the subject has been granted permission to interact with an object.

The UCON model introduces the concept of mutable attributes as well as two new decision factors, namely obligations and conditions, to go with authorizations. Mutable attributes are subject, object or contextual features that change their value as a consequence of usage of an object. By enabling continuous policy evaluation even when access is ongoing, UCON makes it possible to intervene as soon as a change in attribute value renders the execution right invalid.
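A toy sketch of that continuous-evaluation idea, with made-up attribute names: the usage right is re-checked while the session is active, and a mutable attribute that changes as a consequence of usage eventually revokes it.

```python
# Illustrative UCON-style usage control: the policy is checked before each use
# (ongoing authorization), and a mutable attribute, here the data volume already
# consumed, changes as a consequence of usage and can invalidate the right mid-session.

class UsageSession:
    def __init__(self, quota_mb):
        self.attributes = {"consumed_mb": 0, "quota_mb": quota_mb}  # mutable attributes

    def policy_holds(self):
        return self.attributes["consumed_mb"] < self.attributes["quota_mb"]

    def transfer(self, mb):
        if not self.policy_holds():                 # ongoing authorization check
            raise PermissionError("usage right revoked: quota exhausted")
        self.attributes["consumed_mb"] += mb        # usage mutates the attribute

session = UsageSession(quota_mb=10)
for chunk in [4, 4, 4, 4]:
    try:
        session.transfer(chunk)
        print("transferred", chunk, "MB")
    except PermissionError as err:
        print(err)
```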

 

Apart from these mainstream models, there are also several models, such as Extensible Access Control Markup Language (XACML), OAuth, and User-Managed Access (UMA) that are being studied for their applicability to IoT environments. But it is fair to say that the pace of development of IoT-specific access control models is seriously lagging development efforts in other areas such as connectivity options, standards and protocols. 

The other worrying aspect of the situation is that enterprise efforts to address IoT security concerns do not show the same urgency as those driving IoT deployments. All this even after a large-scale malware attack in 2016 hijacked over 600,000 IoT devices using just around 60 default device credentials. A robust access control and authentication solution should help thwart an attack of that intensity. But then again, access control is just one component, albeit a critical one, of an integrated IoT security strategy. The emphasis has to be on security by design, through hardware, software and application development, rather than as an afterthought. And that has to happen immediately considering that the biggest IoT vulnerability according to the most recent top 10 list from the Open Web Application Security Project is Weak, Guessable, Or Hardcoded Passwords.

From Smart To Helpful – The Next Generation Connected Home

“No one asked for smartness, for the smart home.” That’s the head of Google’s smart home products explaining the company’s decision to focus on delivering a helpful home that provides actual benefits rather than a smart home that showcases technology. This is key; the next generation connected home must provide convenience and actual benefit.

SOURCE: https://www.cbinsights.com/research/smart-home-market-map-company-list/

Smart home, helpful home, what’s in a name when the industry is growing at a CAGR of almost 15% and is expected to more than double in value, from USD 24.10 billion in 2016 to USD 53.45 billion in 2022? Growing acceptance of connected home devices powered global shipments to over 168 million in the first quarter of 2019, up 37.3% from the same quarter the previous year. IDC estimates that shipments will continue to grow at almost a 15% CAGR, from 840.7 million units at the end of 2019 to 1.46 billion units by 2023.


There are a lot of factors fueling the increasing acceptance of connected home devices. A majority of consumers expect their next home to be connected and are willing to pay more for a helpful home. Though this trend may be spearheaded by digital-savvy millennials and their penchant for tech innovations, the convenience afforded by these smart solutions is drawing in the older generations as well. Fairly recent innovations like voice-enabled interfaces are simplifying the adoption process for a larger proportion of consumers. At the same time, increasing competition and falling device prices, rising interest in green homes and sustainable living, have all, to varying degrees, helped convert consumer interest into action. 

But of course, there has to be underlying value to all these trends and preferences. 

Key value drivers in smart home systems

There are broadly three layers of value in a smart home system. The first is the convenience of anytime-anywhere accessibility and control, where consumers can change the state of their devices, locking a door or turning off the lights, for example, even remotely, through a simple voice or app interface.

The second layer enables consumers to monitor and manage the performance of these systems based on the data they generate. For instance, consumers can manage their energy consumption based on the smart meter data or create a fine-grained zone-based temperature control using smart thermostats to control costs. 

The final layer is automation, which is the logic layer that enables consumers to fine tune and automate the entire system based on their individual needs and preferences. 

To date, there have been some empirical quantifications of value, in terms of how many smart homeowners in the US save 30 minutes a day and $1,180 every year, or how smart thermostats can cut temperature control costs by 20%. It is also possible, at least theoretically, to link adoption to value, as smart home segments such as energy and security management, with tangible value propositions of cost savings and safety, have traditionally experienced higher rates of adoption.

But as the smart home market evolves beyond the hype and adoption cycle, the dynamics of value are changing. And Google’s pivot from smart to helpful reflects this shift in the connected home market. It is no longer about the technology but about the value it can deliver.

The future value of smart home technologies

Customers get smart home tech. In the US, a key market for this emerging technology, most consumers already use at least one smart home device. According to one report, US broadband households now own more than 10 connected devices, with purchase intention only getting stronger through the years. The global average for smart home devices per household is forecast to reach 16.53, up from the current 5.35.

Along with device density, consumer expectations of the technology are also rising. Almost 80% of consumers in a global study expect a seamless, personalized and unified experience where their house, car, phone and more all talk to each other. They expect emerging technologies like AI to enhance their connected experience. And they expect all this to be delivered without compromising privacy or security. 

There is a similar shift on the supply side of the market too. 

If the emphasis thus far was on getting products into consumers’ homes, the future will be about creating a cohesive experience across all these devices. In this future, services, rather than devices, will determine the value of an IoT vendor. With device margins fading away, the leaders will be determined by their ability to leverage the power of smart home device data to deliver services that represent real value for consumers.  

So a seamless cohesive cross-device experience is what consumers expect and is also what will drive revenue for smart home solution providers. And the first step towards realizing this future will be to address the systemic issue of interoperability in smart homes. 

Interoperability in smart home technologies

Interoperability over brand loyalty: that seems to be the consumer stance according to a report from market research and consulting firm Parks Associates. When it comes to purchasing new devices, more people prioritize interoperability with their current smart home setup over matching brands to their existing products.

SOURCE: https://www.parksassociates.com/blog/article/pr-03272019

The true smart home is not a loosely connected set of point solutions. It is an integrated ecosystem of smart devices that delivers a seamless and cohesive smart home experience. 

For smart home vendors, interoperability creates the data foundation on which to build and monetize new solutions and services that add value to the consumer experience. Ninety-seven percent of respondents to a 2018 online survey of decision-makers in the smart home industry believed that shared data and communication standards would benefit their business. These benefits included the ability to create new solution categories (54%), capture and correlate richer data sets (43%), focus on core strengths rather than grappling with integration issues (44%) and accelerate adoption (48%).

There are two consequences of the limited interoperability standards in the smart home market today. The first is the integration challenge it creates for consumers trying to build a cohesive ecosystem out of an extensive choice of solutions fragmented by different standards and protocols.

There are a few ways in which consumers can address this challenge. The rapid rise of smart speakers, the fastest-growing consumer technology in recent times, and of voice-enabled interfaces has helped streamline adoption and simplify integration to a certain degree. The next option is to invest in a dedicated smart home hub, like the Insteon Hub or Samsung SmartThings Hub, that ties together and translates the various protocol communications from smart home devices. Many of these hubs can now be controlled using Amazon Alexa and Google Assistant voice controls. Universal control apps such as IFTTT and Yonomi also enable users to link their devices and define simple rule-based actions, with the caveat that the devices must first have been integrated by their manufacturers. Many device vendors have also launched “works with” programs to expand compatibility and enable consumers to create a more or less unified smart home solution.

Though each of these approaches has its merits, collectively they mitigate the symptoms of fragmentation rather than enforce interoperability by design. A shared standard would go a long way towards enabling organic interoperability in smart homes.

OCF and open source, open standard interoperability

OCF (Open Connectivity Foundation) is an industry consortium dedicated to ensuring secure interoperability for consumers and IoT businesses. Its members include tech giants such as Microsoft, Cisco, Intel and appliance majors such as Samsung, LG, Electrolux and Haier.      

For businesses, OCF provides open standard specifications, code and a certification program that enable manufacturers to bring OCF Certified products to market with broad-scale interoperability across operating systems, platforms, transports and vendors. The Foundation’s 1.0 specification was ratified last year and will soon be published as an ISO/IEC standard. OCF also provides two open source implementations, IoTivity and IoTivity Lite, for manufacturers looking to adopt the ratified standard and maximize interoperability without having to develop for different standards and devices.

OCF’s latest 2.0 specification introduces several new features, including device-to-device connectivity over the cloud, something that was not possible in 1.0. The 2.0 specification will be submitted for ISO/IEC ratification later this year.

With key partners like Zigbee and a specification that now enjoys worldwide recognition, OCF continues to advance the development of a truly open IoT protocol, equipping developers and manufacturers in the IoT ecosystem with the tools they need to provide a secure, interoperable end-user experience.

OCF works with key partners, such as Zigbee, Wi-Fi Alliance, World Wide Web Consortium (W3C), Thread, and Personal Connected Health Alliance (PCHAlliance), and with over 400 members from the industry to create standards that extend interoperability as an operating principle.  

Interoperability, however, is often only the second biggest concern of smart home consumers. The first is security, relating to hacked or hijacked connected home systems, and privacy, relating to how consumer data is collected, used and shared.

Security & privacy in smart homes

In July this year, there were news reports about a massive smart home breach that exposed two billion consumer records. This was not the result of a sophisticated or coordinated attack but rather the consequence of one misconfigured Internet-facing database without a password. It was a similar situation with the Mirai attack of 2016, where consumer IoT devices such as home routers, air-quality monitors and personal surveillance cameras were hijacked to launch one of the biggest DDoS attacks ever. Then too, there was no sophistication involved. The attackers simply used 60 commonly used default device credentials to infect over 600,000 devices.

IoT, including consumer IoT, offers some unique challenges when it comes to security. But the security mindset has yet to catch up with the immensity of the challenge. 

It’s a similar situation when it comes to privacy. Globally, most consumers find the data collection process creepy, do not trust companies to handle and protect their personal information responsibly and are significantly concerned about the way personal data is used without their permission. 

The situation may just be set to change as the first standards for consumer IoT security start to roll in. 

Earlier this year, ETSI, a European standards organization, released a globally applicable standard for consumer IoT security that defines a security baseline for internet-connected consumer products and provides a basis for future IoT certification schemes. The new standard specifies several high-level provisions, including a pointed rejection of default passwords. The ETSI specification also mandates a vulnerability disclosure policy that would allow security researchers and others to report security issues.

Security is an issue of consumer trust, not of compliance. The smart home industry has to take the lead on ensuring the security of connected homes by adopting a “secure by design” principle. 

Emerging opportunities in smart homes

As mentioned earlier, consumers really expect their smart home experience to flow through to their outdoor routines, their automobiles and their entire daily schedules. Smart home devices will be expected to take on more complex consumer workloads, like health applications for instance, and AI will play a significant role in making this happen. AI will also open up the next generation of automation possibilities for consumers and play a central role in ensuring the security of smart home networks.

Data will play a central role in delivering a unified, personalized and whole-home IoT experience for consumers. Companies with the capability to take cross-device data and convert it into insight and monetizable services will be able to open up new revenue opportunities. However, these emerging data-led opportunities will come with additional scrutiny on a company’s data privacy and security credentials. 

Evaluating the Top Three IoT Platforms Against Three Critical IoT-Specific Capabilities

The cloud market is currently dominated by three platforms – Amazon Web Services, Microsoft Azure, and Google Cloud Platform – that control nearly 65% of the global market. But as the core cloud computing market matures, new technologies such as artificial intelligence, machine learning and IoT are opening up a new front in a renewed battle for dominance. These upsell technologies could well provide the strategic differentiator that will shake up the current rankings. Evaluating and comparing the IoT capabilities of these platforms, however, can be cumbersome.

The value of the global IoT platforms market, comprising both cloud-based and on-premise software and services, is estimated to reach USD 6.11 billion by 2024. The market is currently growing at almost 29% CAGR and CSPs (Cloud Service Providers) have played a key role in lowering barriers to IoT adoption. By standardizing components that can be shared across vertical applications, CSPs are lowering costs, simplifying implementations and empowering customers to experiment with and quickly scale up new use cases.

CSP IoT offerings are still focused on delivering broad horizontal services with little potential for industry-specific optimizations. But that will change as the market matures and the need for more nuanced and sophisticated solutions opens up. In the meanwhile, let’s find out how the top three cloud platforms fare when it comes to IoT.  

In order to make this a bit more objective, we will be looking at how these platforms perform in terms of three components that are critical for any IoT solution: 

  1. Core IoT
  2. Edge Computing
  3. Data management & analytics

These categories are by no means perfectly mutually exclusive, and there can be a bit of overlap, but they do provide a more like-for-like basis for comparison in terms of fundamental IoT capabilities.  

Core IoT

Amazon Web Services:

AWS IoT Core is a managed cloud service that allows for the easy and secure connection and interaction between devices and cloud applications. It supports billions of devices across multiple industry-standard and custom protocols. The service stores the latest state of every connected device, allowing applications to track, communicate and interact with devices even when they are disconnected. AWS IoT Core allows users to implement new device and application features by simply defining and updating business rules in real-time. The service supports a variety of communication protocols including HTTP, WebSockets, and MQTT.

Authentication and end-to-end encryption across connection points ensures that data is never exchanged between devices and AWS IoT Core without first establishing identity. Users can further secure access by applying policies with granular permissions.
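As a minimal sketch of what that looks like from the device side, the snippet below assumes the AWS IoT Device SDK for Python (AWSIoTPythonSDK); the endpoint, certificate file names and topic are placeholders for values provisioned in a real AWS IoT Core account.

```python
# Minimal device-to-cloud sketch: connect to AWS IoT Core over MQTT with mutual
# TLS (port 8883) and publish one telemetry message. Endpoint, certificate files
# and topic are placeholders for values from your own AWS IoT Core account.

import json
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

ENDPOINT = "example-ats.iot.us-east-1.amazonaws.com"   # placeholder endpoint
TOPIC = "sensors/thermostat-01/telemetry"              # placeholder topic

client = AWSIoTMQTTClient("thermostat-01")
client.configureEndpoint(ENDPOINT, 8883)
client.configureCredentials(
    "AmazonRootCA1.pem",   # root CA
    "private.pem.key",     # device private key
    "device.pem.crt",      # device certificate (identity)
)

client.connect()
payload = json.dumps({"temperature_c": 21.4, "humidity_pct": 48})
client.publish(TOPIC, payload, 1)   # QoS 1
client.disconnect()
```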

With AWS IoT Core, users can easily connect to a range of other AWS services, like AWS Lambda, Amazon Kinesis, Amazon S3, Amazon SageMaker, Amazon DynamoDB, Amazon CloudWatch, AWS CloudTrail, Amazon QuickSight, and Amazon Elasticsearch Service, without having to manage any infrastructure. 

Microsoft Azure:

Azure IoT offers two frameworks for building IoT solutions to address different sets of customer requirements.  

Azure IoT Central is a fully managed SaaS solution that uses a model-based approach to help users without expertise in cloud-solution development build enterprise-grade IoT solutions. Then there are the Azure IoT solution accelerators, a collection of enterprise-grade, customizable starting points that can help speed up development of custom IoT solutions. Both of these offerings build on Azure IoT Hub, the core Azure IoT PaaS.

The capabilities of Azure IoT Central can be categorized in terms of the four personas who interact with the application. 

The Builder uses web-based tools to create a template for the devices that connect to the IoT application. These templates can define several operational variables such as device properties, behavior settings, business properties and telemetry data characteristics. Builders can also define custom rules and actions to manage the data from connected devices. Azure IoT Central even generates simulated data for builders to test their device templates. 

The Device Developer then creates the code, using Microsoft’s open-source Azure IoT SDKs, that runs on the devices. These SDKs offer broad language, platform and protocol support to connect a range of devices to the Azure IoT Central application.

The Operator uses a customizable Azure IoT Central application UI for day-to-day management of the devices, including provisioning, monitoring and troubleshooting. 

The Administrator is responsible for managing access to the application by defining user roles and permissions. 
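To make the Device Developer step above concrete, here is a minimal sketch assuming the azure-iot-device Python SDK; the connection string and telemetry fields are placeholders for the credentials and device template defined in the actual application.

```python
# Minimal device-side sketch using the azure-iot-device SDK: connect with the
# device's credentials and send one telemetry message that matches the fields
# defined in the device template. The connection string is a placeholder.

import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=thermostat-01;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

telemetry = Message(json.dumps({"temperature_c": 21.4, "humidity_pct": 48}))
telemetry.content_type = "application/json"
telemetry.content_encoding = "utf-8"
client.send_message(telemetry)

client.disconnect()
```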

Google Cloud Platform:

Google’s Cloud IoT Core is a fully managed service for easily and securely connecting, managing, and ingesting data from millions of globally dispersed devices. There are two main components to the solution, a device manager and a protocol bridge.  

The device manager enables the configuration and management of individual devices and can be used to establish the identity of a device, authenticate the device, and remotely control the device from the cloud. The protocol bridge provides connection endpoints with native support for industry standard protocols such as MQTT and HTTP to connect and manage all devices and gateways as a single global system. 
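To illustrate how a device talks to the protocol bridge, here is a hedged sketch based on how the Cloud IoT Core MQTT bridge was documented to work: the device signs a short-lived JWT with its private key and presents it as the MQTT password. Project, region, registry, device and key file names are placeholders, and the snippet assumes paho-mqtt 1.x and PyJWT with the cryptography package installed.

```python
# Illustrative connection to the Cloud IoT Core MQTT bridge: the client ID encodes
# project/region/registry/device, and the password is a short-lived JWT signed with
# the device's private key. All identifiers below are placeholders.
# Note: assumes paho-mqtt 1.x; the 2.x release changed the Client constructor.

import datetime
import json
import jwt                      # PyJWT
import paho.mqtt.client as mqtt

PROJECT, REGION, REGISTRY, DEVICE = "my-project", "us-central1", "my-registry", "thermostat-01"

def make_jwt(project_id, private_key_file, algorithm="RS256"):
    now = datetime.datetime.utcnow()
    claims = {"iat": now, "exp": now + datetime.timedelta(minutes=20), "aud": project_id}
    with open(private_key_file, "r") as key_file:
        return jwt.encode(claims, key_file.read(), algorithm=algorithm)

client_id = f"projects/{PROJECT}/locations/{REGION}/registries/{REGISTRY}/devices/{DEVICE}"
client = mqtt.Client(client_id=client_id)
client.username_pw_set(username="unused", password=make_jwt(PROJECT, "rsa_private.pem"))
client.tls_set(ca_certs="roots.pem")    # Google root certificates

client.connect("mqtt.googleapis.com", 8883)
client.loop_start()
client.publish(f"/devices/{DEVICE}/events", json.dumps({"temperature_c": 21.4}), qos=1)
client.loop_stop()
client.disconnect()
```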

Google has also launched a Cloud IoT provisioning service, currently in early access, that leverages tamper-resistant hardware-based security to simplify the process of device provisioning and on-boarding for customers and OEMs. 

The Cloud IoT Core service runs on Google’s serverless infrastructure, which scales instantly and automatically in response to real-time changes. 

Edge Computing

Amazon Web Services:

AWS provides two solutions for edge computing: Amazon FreeRTOS, to program, deploy, secure, connect, and manage small, low-power edge devices, and AWS IoT Greengrass, for devices that can act locally on data while still using the cloud for management, analytics, and storage.

Amazon FreeRTOS is a popular open source operating system for microcontrollers that streamlines the task of connecting small, low-power devices to cloud services like AWS IoT Core, to more powerful edge devices running AWS IoT Greengrass, or even to a mobile device via Bluetooth Low Energy. It comes with software libraries that make it easy to configure network connectivity options, program device IoT capabilities and secure device and data connections.

With AWS IoT Greengrass, devices can be programmed to filter device data locally and transmit only the data required for cloud applications. This helps reduce cost while simultaneously increasing the quality of data transmitted to the cloud. AWS IoT Greengrass enables connected devices to run AWS Lambda functions, execute machine learning models and connect to third-party applications, on-premise software and AWS services using AWS IoT Greengrass Connectors. Device programming also becomes extremely easy as code can be developed and tested in the cloud and then be deployed seamlessly to the devices with AWS Lambda. 
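As a sketch of that local-filtering pattern, the Lambda handler below assumes the Greengrass Core SDK for Python (greengrasssdk) and made-up topic names and thresholds; it forwards a reading to the cloud only when it crosses a threshold.

```python
# Illustrative AWS IoT Greengrass Lambda: runs on the edge device, inspects each
# local sensor reading and republishes to the cloud only when a threshold is
# crossed, reducing the volume of data sent upstream. Topic names are made up.

import json
import greengrasssdk

iot_client = greengrasssdk.client("iot-data")
TEMPERATURE_ALERT_C = 75.0

def function_handler(event, context):
    """Invoked locally for each message on the subscribed local topic."""
    reading = event.get("temperature_c", 0.0)
    if reading >= TEMPERATURE_ALERT_C:
        iot_client.publish(
            topic="factory/line-3/alerts",   # cloud-bound topic (illustrative)
            payload=json.dumps({"temperature_c": reading, "alert": True}),
        )
    # Below the threshold nothing is forwarded; the raw stream stays local.
    return
```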

Microsoft Azure:

Azure IoT Edge is a fully managed service built on Azure IoT Hub that extends cloud workloads, including AI, analytics, third-party services and business logic, to edge devices via standard containers. For instance, users have the option of leveraging Project Brainwave, a deep learning platform from Microsoft for real-time AI serving in the cloud, to deliver real-time AI to the edge. Processing data locally and transmitting back only the data required for further analysis can reduce the cost and enhance the quality of data. The solution enables AI and analytics models to be built and trained in the cloud before they are deployed on-premise. All workloads can be remotely deployed and managed through Azure IoT Hub with zero-touch device provisioning. 

As with AWS, Azure IoT Edge also offers device management capabilities even when devices are offline or have only intermittent connectivity. The solution automatically syncs the latest device states when devices are reconnected to ensure seamless operability.

Google Cloud Platform:

Google’s IoT edge service strategy is centered on two components: Edge TPU, a new hardware chip, and Cloud IoT Edge, a software stack that extends Google Cloud AI capabilities to gateways and connected devices.

Edge TPU is a purpose-built ASIC chip designed and optimized to run TensorFlow Lite ML models at the edge and within a small footprint. Edge TPUs complement Google’s cloud IoT capabilities by allowing customers to build and train machine learning models in the cloud and then run the models on Cloud IoT Edge devices. The combination extends Google Cloud’s powerful data processing and machine learning capabilities to IoT gateways and end devices even while increasing operational reliability, enhancing device and data security and enabling faster real-time predictions for critical IoT operations.
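A sketch of that cloud-train, edge-run pattern, assuming the tflite_runtime package and the Edge TPU runtime (libedgetpu) are installed on the device; the model file name is a placeholder for an artifact trained in the cloud and compiled for the Edge TPU.

```python
# Illustrative edge inference: load a TensorFlow Lite model compiled for the
# Edge TPU and classify a single input frame locally, with no cloud round trip.
# The model file is a placeholder for one trained in the cloud and deployed here.

import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],  # route ops to the Edge TPU (Linux name)
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# A dummy input with the shape/dtype the model expects (e.g. a camera frame).
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()                          # inference runs on-device
scores = interpreter.get_tensor(output_details[0]["index"])
print("top class:", int(np.argmax(scores)))
```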

The company is working with semiconductor manufacturers and device makers to embed its IoT edge innovations in the development of intelligent devices and gateways. 

IoT Analytics

Amazon Web Services: 

AWS IoT Analytics is a fully managed service that automates every stage of the IoT data analytics process. The service can be configured to automatically filter data based on need, enrich data with device-specific metadata, run scheduled or ad hoc queries using the built-in query engine, or perform more complex analytics and machine learning inference. Users can also schedule and execute their own custom analysis, packaged in a container, and the service will automate the execution.
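As a brief sketch of interacting with such a pipeline through boto3, the snippet below pushes a small batch of raw messages into a channel and then materializes a SQL dataset; the channel and dataset names are placeholders and assume resources that were created beforehand.

```python
# Illustrative use of AWS IoT Analytics via boto3: push raw messages into a
# channel feeding the pipeline, then trigger and inspect a dataset built from
# the processed output. Channel/dataset names assume pre-existing resources.

import json
import boto3

iota = boto3.client("iotanalytics")

# Ingest raw device messages into the channel.
iota.batch_put_message(
    channelName="thermostat_channel",
    messages=[
        {"messageId": "1", "payload": json.dumps({"device": "t-01", "temp_c": 21.4}).encode()},
        {"messageId": "2", "payload": json.dumps({"device": "t-02", "temp_c": 22.9}).encode()},
    ],
)

# Kick off (re)materialization of the SQL dataset and check its status.
iota.create_dataset_content(datasetName="daily_temperature_summary")
content = iota.get_dataset_content(datasetName="daily_temperature_summary", versionId="$LATEST")
print(content["status"]["state"])   # poll until SUCCEEDED, then read entries[0]["dataURI"]
```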

AWS IoT Analytics stores device data in an IoT-optimized time-series data store and offers capabilities for time-series analysis. The company also offers a fully managed, serverless time series data service called Amazon Timestream that can process trillions of events per day, at up to 1,000 times the speed and as little as one-tenth the cost of conventional relational databases.

AWS also offers real-time IoT device monitoring either as an out-of-the-box feature of its Kinesis Data Analytics solution or as a reference implementation for building custom device monitoring solutions. 

Microsoft Azure:

Azure Stream Analytics is a fully managed serverless PaaS offering designed to analyze and process streaming data from multiple sources simultaneously and in real-time. It integrates with Azure Event Hubs and Azure IoT Hub to ingest millions of events per second from a variety of sources. The service can be configured to trigger relevant actions and initiate appropriate workflows based on the patterns and relationships identified in the extracted information. 

Azure Stream Analytics on Azure IoT Edge enables the deployment of near-real-time intelligence closer to IoT devices to complement big data analytics done in the cloud. A job can be created in Azure Stream Analytics and then deployed and managed using Azure IoT Hub. 

The Microsoft IoT platform also offers Time Series Insights, a fully managed, end-to-end solution to ingest, store and query highly contextualized, IoT time series data. Time Series Insights seamlessly integrates with Azure IoT Hub to instantly ingest billions of events for analytics. Data and insights from this solution can be integrated into existing applications and workflows or new custom solutions can be created with the Time Series Insights Apache Parquet-based flexible storage system and REST APIs.

Google Cloud Platform:

Google Cloud IoT offers a range of services, at the edge and in the cloud, to extract real-time insights from a distributed network of IoT devices. The device data captured by Cloud IoT Core is aggregated into a single global system and published to Cloud Pub/Sub, part of Google Cloud’s stream analytics program, for downstream analytics. Cloud Pub/Sub ingests event streams and delivers them to Cloud Dataflow, a serverless fully managed data transformation and enrichment service, to ensure reliable, exactly-once, low-latency data transformation. The transformed data is then analyzed with BigQuery, a serverless cloud data warehouse with built-in in-memory BI Engine and machine learning.     
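A small sketch of the first hop in that pipeline, publishing a device event to Cloud Pub/Sub with the google-cloud-pubsub client; the project and topic names are placeholders, and the downstream Dataflow and BigQuery stages are assumed to be configured separately.

```python
# Illustrative publish of a device event into the analytics pipeline: Cloud
# Pub/Sub ingests the event, from where Dataflow and BigQuery (configured
# separately) pick it up. Project and topic names are placeholders.

import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-iot-project", "device-telemetry")

event = {"device_id": "thermostat-01", "temperature_c": 21.4, "ts": "2019-08-01T12:00:00Z"}
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    device_id="thermostat-01",   # message attribute usable for routing/filtering
)
print("published message id:", future.result())
```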

Using Cloud IoT Edge, discussed earlier in this article, all these data processing, analytics and machine learning capabilities can then be extended to billions of edge devices. 

Each of these platforms offers a vast range of IoT-specific tools, solutions and services, plus another layer of complex cloud services and third-party integrations, which makes an exhaustive comparison almost impossible. But features such as device provisioning and management, real-time streaming analytics and edge computing are critical to every IoT implementation irrespective of application or vertical. Of course, there are other factors, like pricing and security, that also come into play. But looking at a platform’s core IoT, edge computing and real-time analytics capabilities affords a like-for-like comparison and provides the context for a more detailed drill-down.

Enterprise IoT: Why Securing IoT Devices Needs to Be the Number One Priority

The number of IoT devices around the world keeps on growing. Globally, there are now more than 26 billion connected devices, according to research from Statista – up from 15 billion in 2015 – with the number projected to rise to over 75 billion by 2025. In 2018, the global IoT market stood at about $164 billion, and is expected to increase almost tenfold over the next six years, reaching around $1.6 trillion by 2025. The popularity of IoT technology is drastically transforming how society functions and how businesses are run. Be it manufacturing, transportation, telecoms, logistics, retail, insurance, finance or healthcare, the vast proliferation of IoT technology is on course to disrupt practically every industry on the planet. However, as more and more IoT devices are deployed across the enterprise, new challenges emerge for developers – and securing IoT systems is chief among them. 

(Image source: statista.com)

IoT in the Enterprise

Although much media attention surrounding IoT has focused on consumer products – smart speakers, thermostats, lights, door locks, fridges, etc. – some of the most exciting IoT innovations are coming from the business sector. The combination of sensor data and sophisticated analytical algorithms is allowing companies in a broad range of industries to streamline operations, increase productivity, develop leading-edge products, and solve age-old business problems. Consider the performance of all types of equipment and machinery – from jet engines to HVAC systems – being constantly monitored with sensors to predict the point of failure and avoid downtime automatically. Or how about driver speed behavior information being shared in real-time with an insurer – or geolocation beacons pushing targeted advertisements and marketing messages to customers when they are in or near a store. Usage of data from IoT sensors and controllers for better decision making – combined with automation for better efficiencies – is enormously valuable. As such, more and more businesses are getting on board with the IoT revolution.  

84% of the 700+ executives from a range of sectors interviewed for a Forbes Insights survey last year said that their IoT networks had grown over the previous three years. What’s more, 60% said that their organizations were expanding or transforming with new lines of business thanks to IoT initiatives, and 36% were considering potential new business directions. 63% were already delivering new or updated services directly to customers using the Internet of Things.   

By industry, nearly six in ten (58%) executives in the financial services sector reported having well-developed IoT initiatives, as did 55% of those in healthcare, 53% in communications, 51% in manufacturing, and 51% in retail.

(Image source: info.forbes.com)

The survey also showed that leveraging IoT as part of a business transformation strategy increases profitability. 75% of leading enterprises credited IoT with delivering increased revenue. 45% reported that the Internet of Things had helped boost profits by up to 5% over the previous year, another 41% said that it had boosted profits by 5% to 15%, and 14% had experienced a profit boost of more than 15% – and all anticipated IoT to have a significant profit-boosting impact in the year ahead. 

(Image source: info.forbes.com)

However, key to profitability and business success with IoT technology is security. Indeed, along with developing and maintaining appropriate algorithms and software, and speed of rollout, securing IoT was cited by the executives as one of the top three IoT challenges. How do organizations ensure the integrity of their IoT data? How do they ensure that the various operational systems being automated with the technology are controlled as intended? These are questions that need to be answered, for a lot of hard IoT security lessons have been learned in recent years.

Securing IoT in the Enterprise – An Ongoing Challenge 

As the number of connected IoT devices in the enterprise increases, new threats emerge. Distributed Denial of Service (DDoS) attacks provide a number of high-profile examples. Here, vulnerable connected devices are hijacked by hackers and used to send repeated and frequent queries that bombard the Domain Name Server (DNS), causing it to crash. For instance, the Mirai botnet in 2016 shut down major internet providers in North America and Europe by taking over hundreds of thousands of IoT devices – mainly IP security cameras, network video recorders and digital video recorders – and using them for a DDoS attack.

Mirai was able to take advantage of these insecure IoT devices in a simple but clever way – by scanning big blocks of the internet for open Telnet ports, then attempting to log in using 61 username/password combinations that are frequently used as the default for these devices and never changed. In this way, it was able to amass an army of compromised CCTV cameras and routers to launch the attack. Perhaps most concerning of all, however, is that the Mirai botnet source code still exists “in the wild”, meaning that anyone can use it to attempt to launch a DDoS attack against any business with IoT implementations – and many cybercriminals have done just that. 

Another example involves a US university in 2017, which suddenly found over 5,000 on-campus IoT devices – including vending machines and light bulbs – making hundreds of DNS queries every 15 minutes to sub-domains related to seafood. The botnet spread across the network and launched a DDoS attack, resulting in slow or completely inaccessible connectivity across the campus. Again, it was weak default passwords that left these devices vulnerable. 

One of the main problems with IoT devices being used in workplace environments is that many are not inherently secure. Part of the issue is that there are literally thousands of individual IoT manufacturing companies – many of which started life in the consumer market – with very little consistency between them. What this means is that each IoT device that ends up in the workplace – be it a lightbulb, vending machine, or CCTV camera – will likely have its own operating system. Each will likely have its own security setup as well – which will be different from every other connected thing in the office – and a different online dashboard from which it is operated. Many of these devices are also shipped with default usernames and passwords, making them inherently hackable. The manufacturers, meanwhile, take little or no responsibility if any of these devices are hacked, meaning the onus for securing IoT in all its forms falls entirely upon an organization’s IT department – and too often no one is assigned to this critical task. 

What makes it so critical? Well, thanks to Shodan – a specialized search engine that lets users find information about IoT devices (including computers, cameras, printers, routers and servers) – anyone, including hackers, can locate devices that use default logins with a simple web search. However, what’s good for hackers can be seen as being good for enterprises, too. Though the very existence of Shodan is perhaps scary, IT professionals should be using the search engine proactively as a security tool to find out if any information about devices on the company’s network is publicly accessible. After that, securing IoT is down to them. 
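As a defensive illustration of that proactive use, the snippet below assumes the shodan Python package and an API key (some query filters may require a paid plan); the organization name is a placeholder, and the goal is simply to list publicly indexed services attributed to your own organization so exposed devices can be tracked down and secured.

```python
# Defensive use of the Shodan API: enumerate internet-facing services that
# Shodan attributes to your own organization, so exposed IoT devices can be
# located and locked down. API key and organization name are placeholders.

import shodan

api = shodan.Shodan("YOUR_API_KEY")

# Search for hosts attributed to the organization (query syntax per Shodan docs).
results = api.search('org:"Example Corp"')

print("publicly indexed services:", results["total"])
for match in results["matches"][:10]:
    print(match["ip_str"], match["port"], match.get("product", "unknown service"))
```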

(Image source: shodan.io)
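
As an illustration of what such a proactive check might look like, here is a minimal sketch using the official Shodan Python library. The API key, netblock and printed fields are placeholders for illustration, not a prescribed audit procedure.

# pip install shodan
import shodan

API_KEY = "YOUR_SHODAN_API_KEY"  # placeholder - issued with a Shodan account
api = shodan.Shodan(API_KEY)

# Hypothetical query: list devices Shodan has indexed inside your own
# public address range (replace with your organization's netblock).
results = api.search('net:"203.0.113.0/24"')

print("Exposed devices found:", results["total"])
for match in results["matches"]:
    print(match["ip_str"], match.get("port"), match.get("product", "unknown"))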

Another issue that renders securing IoT devices absolutely essential is the threat of spy tech and ransomware. Many IoT devices incorporate microphones, cameras, and the means to record their location, leaving organizations vulnerable to sensitive data being stolen or company secrets being exposed and held to ransom. Things like IoT-enabled building management systems can also be left open to surveillance or meddling from malicious third parties. A hacker could, for instance, lock all the doors in an office building or cut all the power. As an example, researchers at Def Con demonstrated how such a system can be targeted with ransomware by gaining full remote control of a connected thermostat. In a real-life scenario, such an attack could result in an office becoming uninhabitable, opening an organization up to ransom demands to regain control. 

In short, as the number of IoT devices an organization relies upon keeps increasing, the attack surface grows in kind – as does the unpredictability of how hackers may seek to exploit them.

The Huge Costs of Not Securing IoT 

Securing IoT should be a top priority for practically all businesses for the simple reason that practically all businesses are invested in IoT. In fact, according to recent research from DigiCert – State of IoT Security Survey 2018 – 92% of organizations report that IoT will be important to their business by 2020. The executives interviewed cited increasing operational efficiency, improving the customer experience, growing revenue, and achieving business agility as the top four goals of their IoT investments. 

(Image source: digicert.com)

However, securing IoT remains the biggest concern for 82% of these organizations. And it’s no wonder – a full 100% of bottom-tier enterprises (i.e. enterprises that are having the most problems with IoT security issues) had experienced at least one IoT security incident in 2018. Of these, 25% reported related losses of at least $34 million over the previous two years. 

(Image source: digicert.com)

These bottom-tier companies are much more likely to experience data breaches, malware/ransomware attacks, unauthorized access/control of IoT devices, and IoT-based DDoS attacks than top-tier companies (i.e. companies that are best prepared in terms of IoT security). So – what are top-tier companies doing differently? Well, DigiCert found that they all had five key behaviors in common – they were all ensuring device data integrity (authentication), implementing scalable security, securing over-the-air updates, utilizing software-based key storage, and encrypting all sensitive data. 

Speaking to Security Now, Mike Nelson, Vice President of IoT Security at DigiCert, comments on the findings: “The security challenges presented by IoT are similar to the many IT and internet security challenges industries have faced for years. Encryption of data in transit, authentication of connections, ensuring the integrity of data – these challenges are not new. However, in the IoT ecosystem these challenges require new and unique ways of thinking to make sure the way you’re solving those challenges works. Regarding evolution of security challenges, the biggest challenge is simply the scale and the magnitude of growth. Having scalable solutions is going to be critical.”

(Image source: digicert.com)

Final Thoughts

IoT has the potential to open up many new opportunities for growth and agility within the enterprise. However, securing IoT devices remains absolutely crucial. Organizations need to take the necessary steps to ensure that their devices and data are adequately protected from end to end. This starts with a thorough review of the current IoT environment, an evaluation of the risks, and a prioritized list of the security concerns that need to be addressed first. Strong, unique passwords must be mandatory for every device. Firmware must be kept up to date, and only secure web, mobile and cloud applications with strong encryption and data protection features should be used. All data must be encrypted – both at rest and in transit – with end-to-end encryption made a product requirement for every device that connects. That data must also be stored and processed securely after it has been transmitted across the network. Device updates must be monitored and managed around the clock, all year round. Finally, the security framework and architecture must be scalable enough to support IoT deployments both now and in the future. As such, working with third parties that have the resources and expertise to manage scaling IoT security programs will be invaluable. 
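
To make the points about per-device credentials and encryption in transit more concrete, here is a minimal sketch of a device publishing one telemetry message over TLS with certificate-based (mutual) authentication, using the paho-mqtt 1.x client. The broker hostname, topic, certificate paths and payload are assumptions for illustration only, not a reference implementation.

# pip install "paho-mqtt<2"  (sketch uses the 1.x client API)
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="sensor-001")  # one unique identity per device

# Mutual TLS: the broker verifies the device certificate and vice versa,
# so no shared default password ever ships with the device.
client.tls_set(
    ca_certs="ca.pem",           # CA that signed the broker certificate
    certfile="device-cert.pem",  # per-device X.509 certificate
    keyfile="device-key.pem",    # per-device private key
)

client.connect("broker.example.com", 8883)  # hypothetical broker, TLS port
client.loop_start()

info = client.publish("plant/line1/temperature", '{"celsius": 21.4}', qos=1)
info.wait_for_publish()                     # block until the broker acknowledges

client.loop_stop()
client.disconnect()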

Vinnter serves as an enabler for developing new business and service strategies for traditional industries, as well as fresh start-ups. We help companies stay competitive through embedded software development, communications and connectivity, hardware design, cloud services and secure IoT platforms. Our skilled and experienced teams of developers, engineers and business consultants will help you redefine your organization for the digital age, creating new, highly-secure connected products and digital services that meet the evolving demands of your customers. Get in touch to find out more. 

Making sense of IoT connectivity protocols

IoT, short for the Internet of Things, refers to a connected system of devices, machines, objects, animals or people with the ability to communicate autonomously across a common network, without the need for human-to-human or human-to-computer interaction. This relatively recent innovation is already revolutionizing many sectors with its ability to add connected intelligence to almost everything, including smart homes, smart automobiles, smart factories, smart buildings, smart cities, smart power grids, smart healthcare, smart agriculture and smart livestock farming, to name just a few.  

IoT is still a nascent innovation, but it has an evolutionary trail that leads back, according to the ITU (International Telecommunication Union), to the early years of the last century. 

A brief history of telemetry, M2M and IoT

A 2016 Intro to Internet of Things presentation from the ITU charts the legacy of the modern IoT revolution back to 1912, when an electric utility in Chicago developed a telemetry system to monitor electrical loads in the power grid using the city’s telephone lines. The next big milestone, wireless telemetry using radio transmissions rather than landline infrastructure, was passed in 1930 and used to monitor weather conditions from balloons. Then came aerospace telemetry with the launch of Sputnik in 1957, an event widely considered the precursor to today’s modern satellite communications era.     

At this point M2M as we know it was still some years away, awaiting two landmark breakthroughs, almost three decades apart, to propel it into the mainstream. 

The first breakthrough came in 1968, when Greek-American inventor and businessman Theodore G. Paraskevakos came up with the idea of combining telephony and computing – the theoretical foundation of modern M2M technologies – while working on a caller line ID system. The second happened in 1995, when Siemens launched the M1, a GSM data module that allowed machines to communicate over wireless networks. From there on, regular improvements in wireless connectivity, and the Federal Communications Commission’s advocacy for the use of spectrum-efficient digital networks over analog networks, paved the way for more widespread adoption of cellular M2M technologies.  

IoT is the most recent mutation in this extended evolutionary chain of autonomous machine-to-machine connectivity. However, though both approaches share the same foundational principles, there are some marked differences as shown in the chart below. 

SOURCE: https://ipwithease.com/internet-of-things-vs-machine-to-machine-iot-vs-m2m/

Perhaps one of the most significant distinctions between M2M and IoT is in terms of ambition and scope. Current estimates indicate anywhere between 22 and 25 billion connected IoT devices by 2025. But before we have even tapped into the potential of networking billions of physical objects, industry aspirations are already visualizing an Internet of Everything, where not just objects and devices but everything – people, processes, data and things – is connected into one seamless and intelligent ecosystem. 

But whatever the breadth of the ambitions for IoT, the availability of quality connectivity options will eventually determine the value of the outcome. Today there is an overwhelming range of connectivity technologies on offer, with capabilities suited to different IoT applications. 

Classifying IoT connectivity technologies

When it comes to IoT connectivity, technology is constantly changing, with existing options being updated and upgraded and new alternatives being continually introduced. And given the diversity of the IoT applications market, available solutions can be classified across a complex matrix of characteristics including range, bandwidth, power consumption, cost, ease of implementation, security, etc. But it is possible to classify these solutions using a simple four-part taxonomy, namely:

  1. Classic connectivity solutions, comprising traditional short-range wireless solutions 
  2. Non-cellular IoT, proprietary technologies deployed by industry players/consortia.
  3. Cellular IoT, standardized technologies that operate in the licensed spectrum.
  4. Satellite IoT, for areas that cannot be covered by any of the above. 
SOURCE: https://www.counterpointresearch.com/lpwans-will-co-exist-no-war-brewing-between-cellular-non-cellular/

Both cellular and non-cellular IoT technologies fall under the broad, and rather self-explanatory, category of low-power wide-area networks or LPWANs. While the former is a standardized technology provided in the licensed spectrum by mobile network operators, the latter refers to private proprietary solutions operating in unlicensed radio frequencies. Both solutions, however, are purpose-designed for IoT and are capable of transmitting small packets of data across long distances, over an extended period, with very limited resource usage. The forecast for LPWAN technologies is that they will cover 100 percent of the global population by 2022. 

1. Classic Connectivity: 

There are a range of technologies that fall under this category, including Wi-Fi, Bluetooth and Bluetooth Low Energy, NFC, RFID, and mesh technologies such as ZigBee, Thread and Z-Wave. As mentioned earlier, these are all short-range solutions that are ideal for bounded environments such as smart homes. But if short range seems like a limitation, these solutions make up for it by enabling high-bandwidth transmissions at low power consumption rates. Most of these solutions may not have been designed specifically for IoT, but as long as the requirement does not include long-distance data transmission, they can still serve as a crucial hub in a larger hybrid IoT environment. 

 2. Non-Cellular IoT: 

There are currently two popular LPWAN solutions, LoRaWAN and Sigfox, in this space.  

  • LoRaWAN is an open IoT protocol for secure, carrier-grade LPWAN connectivity. It is backed by the LoRa Alliance, a global nonprofit association of telecoms, technology blue chips, hardware manufacturers, systems integrators, and sensor and semiconductor majors. 

The protocol wirelessly connects battery-operated ‘things’ to the internet, enabling low-cost, low-power, mobile and secure bi-directional communication. The solution can also scale from a single gateway installation to a global network of devices across IoT, M2M and other large-scale smart applications. Though the LoRaWAN protocol defines the technical implementation, it does not place any restrictions on the type of deployment, giving customers the flexibility to innovate. One of the arguments challenging the technology’s open-standard credentials has focused on implementations being tied to chips from LoRa Alliance member Semtech. However, other suppliers have recently announced an interest in adopting LoRa radio technology. 

SOURCE: https://lora-alliance.org

LoRaWAN already has a massive global footprint, with over 100 network operators having deployed LoRaWAN networks across the world by the end of 2018. The alliance also announced that it has tripled the number of end devices connecting to those networks.    

  • Sigfox was one of the first companies to create a dedicated IoT network that used Ultra Narrow Band modulation in the 200 kHz public band to exchange radio messages over the air. The company’s stated ambition is to mitigate the cost and complexity of IoT adoption by eliminating the need for sensor batteries and reducing the dependence on expensive silicon modules. 

The company’s proprietary protocol is designed for IoT applications that transmit data in infrequent short bursts across long distances, while ensuring low connectivity costs, and reducing energy consumption. It works with several large manufacturers such as STMicroelectronics, Atmel, and Texas Instruments for its endpoint modules in order to ensure the lowest cost for its customers.

The Sigfox network is currently operational in 60 countries, covering an estimated 1 billion people worldwide, connecting 6.2 million devices and transmitting 13 million messages each day. Sigfox has also teamed up with satellite operator Eutelsat to launch a satellite that will enable global coverage.  

There are a few other players, like Link Labs and Weightless SIG, offering their own LPWAN technologies. But LoRaWAN and Sigfox dominate the market, accounting for nearly two-thirds of low-power wide-area network deployments. 

There is, however, a significant challenge emerging from their counterparts in cellular IoT with technologies like NB-IoT and LTE-M. 

3. Cellular/Mobile IoT: 

Proprietary technologies operating in the unlicensed spectrum may seem to have the market cornered, but cellular/mobile IoT is rapidly catching up.  Earlier this year the GSMA announced the availability of mobile low-power wide-area IoT networks in 50 markets around the world with a total of 114 launches as of May 2019. 

SOURCE: https://www.gsma.com/iot/deployment-map/

These launches include both LTE-M (LTE Cat-M/eMTC) and NarrowBand IoT (NB-IoT/LTE Cat-NB), a set of complementary, IoT-optimized cellular standards developed by the 3GPP (3rd Generation Partnership Project). Both these Mobile IoT networks are ideal for low-cost, low-power, long-range IoT applications and together they are positioned to address the entire spectrum of LPWAN needs across a range of industries and use cases. Operators have a choice of cellular technologies to ensure that they can provide clearly differentiated IoT services based on the market dynamics in their regions. And both these technologies can coexist with 2G, 3G and 4G networks. There are, however, some key distinctions between the two, stemming primarily from the focus on covering as wide a range of IoT applications as possible.  

  • LTE-M (Long Term Evolution for Machines) enables the reuse of existing LTE mobile network infrastructure base while reducing device complexity, lowering power consumption and extending coverage, including better indoor penetration. LTE-M standards are designed to deliver a 10X improvement in battery life and bring down module costs by as much as 50 percent when compared to standard LTE devices. 

One significant development in the LTE-IoT market has been the launch of the MulteFire Alliance, a global consortium that wants to extend the benefits of LTE to the unlicensed spectrum. The group’s MulteFire LTE technology is built on 3GPP standards and will continue to evolve with those standards but operates in the unlicensed or shared spectrum. The objective is to blend the benefits of LTE with ease of deployment. Key features of the latest MulteFire Release 1.1 specifications include optimization for Industrial IoT, support for eMTC-U and NB-IoT-U, and access to new spectrum bands.  

  • NarrowBand IoT or NB-IoT is based on narrowband radio technology and is targeted at low-complexity, low-performance, cost-sensitive applications in the Massive IoT segment. The technology is relatively easy to design and deploy as it is not as complex as traditional cellular modules. In addition, it enhances network capacity as well as efficiency to support a massive number of low-throughput connections over just 200 kHz of spectrum. NB-IoT can also be significantly more economical to deploy compared with other technologies, as it eliminates the need for gateways by communicating directly with the primary server.

Both these technologies are already 5G-ready. They will continue to evolve to support 5G use cases and coexist with other 3GPP 5G technologies.

The race for 5G deployments has already begun in earnest. Following the launch of 5G services in South Korea and the US earlier this year, 16 more markets are expected to join this as-yet exclusive club before the end of the year. 

The emergence of 5G, the fifth generation of wireless mobile communications, will no doubt have a major impact on how these services are delivered. These fifth generation networks, with their promise of higher capacity, lower latency and energy/cost savings, have the potential to support more innovative bandwidth-intensive applications and massive machine-type communications (mMTC). 

4. Satellite IoT:

This is ideal for remote areas that are not covered by cellular service. Though that may seem like a niche market, some reports indicate that there may be as many as 1,600 satellites dedicated to IoT applications over the next 5 years. Satellite communications company Iridium has partnered with Amazon Web Services to launch Iridium CloudConnect, the first satellite-powered cloud-based solution for Internet of Things (IoT) applications. 

All of which brings up the question, which IoT protocol is right for you? Every technology discussed here has its USPs and its limitations. Every IoT application has its own requirements in terms of data rate, latency, deployment cost etc. A protocol that works perfectly well for a particular use case may prove to be completely inadequate for another. 

So there is no one-size-fits-all protocol that can be prescribed by application or even by industry. As a matter of fact, sticking to just one technology standard doesn’t make sense in many IoT implementations – and that’s according to Sigfox.

Choosing a cloud service provider in an evolving marketplace

Last year, Gartner axed 14 cloud vendors from its Magic Quadrant for Cloud IaaS, choosing to focus only on global vendors currently offering, or developing, hyperscale integrated IaaS and PaaS offerings. This bit of spring cleaning left behind a more manageable roster of six companies classified into two distinct segments: the Leaders, comprising Amazon Web Services, Microsoft, and Google, and the Niche Players, represented by Alibaba, Oracle, and IBM.

SOURCE: https://www.bmc.com/blogs/gartner-magic-quadrant-cloud-iaas/

Even within this simplified league table of three, Amazon continues to be the clear leader in terms of revenue and market share. 

SOURCE: https://www.parkmycloud.com/blog/aws-vs-azure-vs-google-cloud-market-share/

Now AWS may dominate the cloud market, but Microsoft and Google are growing much faster.

According to Q1 2019 figures, these challengers are growing, respectively, at 75 and 83 percent against a comparatively middling 41 percent growth for AWS.

AWS market share has also remained stagnant at 33 percent between Q1 2018 and Q1 2019.

Additionally, a Gartner scorecard that evaluates public IaaS cloud providers across 263 required, preferred and optional criteria found that Azure has pulled away from AWS in the required criteria for the first time since the scoring started.

SOURCE: https://info.flexerasoftware.com/SLO-WP-State-of-the-Cloud-2019
SOURCE: https://blogs.gartner.com/elias-khnaser/2018/08/01/just-published-new-scorecards-for-aws-azure-gcp-and-oci-cloud-iaas/

Going multicloud and hybrid

But even as the cloud IaaS market coalesces into a “Big Three vs Others” comparison, RightScale’s 2019 State of the Cloud survey revealed that businesses are combining public and private clouds in a hybrid multi-cloud strategy that leverages almost 5 clouds on average.

Another study from Kentik found that the most common cloud combination was AWS and Azure, with the AWS-Google Cloud combination trailing not far behind.

Though public cloud remains the top priority across enterprises, the number of companies deploying a hybrid public plus private cloud strategy is increasing.

At the same time, the share of companies with a multicloud strategy – combining multiple public or private clouds – has decreased. 

All this would suggest that the cloud market is not a zero-sum play. Businesses are using a combination of cloud providers, including the niche players, to design solutions that deliver the best outcomes. And cloud providers will have to take this preference for hybrid clouds into consideration while developing solutions for their customers. 

More importantly, none of this makes it any easier for customers to narrow down the right vendors for their workloads, given the absence of any common framework for assessment. But it is possible to define a template of key considerations that should drive the choice of cloud provider. 

Choosing a cloud service provider

There are several factors that can influence a company’s choice of cloud provider including the platform’s choice of technology and architecture, data security, compliance and governance policies, interoperability, portability and migration support, services development roadmap, etc.

The Cloud Industry Forum, a UK-based not-for-profit organization promoting cloud adoption, has a fairly comprehensive list of eight criteria for selecting the right cloud service provider. At Vinnter we believe that there are four important aspects to picking a cloud provider:

Location proximity: There are two reasons to ensure cloud service providers have actual operations in the customer’s target market. The first is latency, which some have even referred to as the Achilles heel of cloud adoption.

One study on the global network performance of AWS, Google Cloud and Microsoft Azure found that data center location directly affected network latency, with network performance varying between providers when connecting across different regions.

This lag can have huge performance implications for many modern business and IoT applications that depend on and expect low latency.    
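
One rough way to sanity-check latency before committing to a region is to measure round-trip times from the locations where devices and users actually sit. The sketch below uses plain TCP connect times as a proxy; the endpoint hostnames are placeholders to be replaced with the candidate regions’ real endpoints.

import socket
import time

# Placeholder regional endpoints - substitute the hostnames of the
# regions actually being evaluated.
ENDPOINTS = {
    "eu-north": "eu.example-cloud.example",
    "us-east": "us.example-cloud.example",
    "ap-south": "ap.example-cloud.example",
}

def tcp_rtt_ms(host, port=443, samples=5):
    """Median TCP connect time in milliseconds - a rough latency proxy."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return sorted(times)[len(times) // 2]

for region, host in ENDPOINTS.items():
    try:
        print(region, round(tcp_rtt_ms(host), 1), "ms")
    except OSError as exc:
        print(region, "unreachable:", exc)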

 The second critical factor is that of data sovereignty. Many countries, including Russia, China, Germany, France, Indonesia and Vietnam, have data residency regulations that require data to be stored in the region. GDPR sovereignty requirements have strict mandates on the collection and processing of EU residents’ data. 

Cloud providers are responding to these mandates of latency and sovereignty by opening up multiple regional data centers. For instance, AWS announced plans to open an infrastructure region in Italy, the company’s sixth in Europe, that would address both low latency and data residency requirements.

In Germany, Microsoft has placed customer data in its data centers under the control of an independent German data trustee, making it difficult for anyone, including Microsoft or US authorities, to access it without the customer’s permission.  

Transparent pricing: Optimizing cloud costs continues to be a top priority, even among advanced cloud users, with one study estimating wastage at about 35 percent of cloud spend. 

The study also identified four reasons for this wastage: the complexity of cloud pricing, a better-safe-than-sorry approach leading to overprovisioning, a lack of visibility into cost implications, and a lack of adequate tools for optimizing spend.

As cloud providers launch new innovations and pricing models, pricing is only expected to get more complicated, with no common basis for comparison across services.

In fact, 2018 was predicted to be the year when cloud providers would consolidate and simplify their offerings and pricing structures.  

A transparent pricing model can address almost all of the waste factors mentioned earlier. It provides customers with a common and objective basis for comparing and choosing the service that best suits their workloads, as well as for optimizing provisioning to actual demand. 

Accessible documentation and support: Documentation and customer support are critical factors, often the difference between productivity and wastage, in an evolving area such as cloud computing. Extensive and accessible documentation makes it easier for customers to implement and manage their cloud services effectively. This has to be backed by 24/7 customer service and dedicated account managers to help customers resolve their cloud service problems and queries.

Both Google Cloud and AWS provide access to comprehensive documentation as well as community forums where customers can address implementation or performance related issues. All three top vendors offer a basic level of support by default but expect customers to pay for anything more. For instance, Google and AWS offer different levels of support at different price points.  

Going forward, the documentation and support offered by a cloud service provider could become a key point of differentiation in customers’ choice of cloud platform. 

Finding the right cloud skills: As more and more businesses move their workloads to the cloud, there is a growing demand for people with the skills to develop, operate and maintain end-services deployed in a cloud environment. This will be a critical factor irrespective of choice of cloud service provider.  

But acquiring the right cloud skills has reportedly become a full-blown crisis with a significant majority of IT managers finding it “somewhat difficult” to find cloud management talent. 

In order to deal with this crisis, Deloitte advises businesses to start with an inventory of cloud computing skills in the company across different areas such as architecture, security, governance, operations and DevOps as well as cloud brand-specific skills. 

The next step is to define the skills related to these same areas that are required to get the company to where it wants to be in terms of cloud technologies. Finally, train, hire and/or replace talent to build a cloud-first approach to technology. 

Today, it is simply not possible to create a like-for-like comparison template of all major service providers that would be relevant to every business looking to select a cloud platform. But there are broader themes that apply across services and that need to be considered to determine the correct fit for each company’s technology and workload profile, usage pattern, technical maturity and budget.

Rather more importantly, the cloud is not the plug-and-play environment that it promised to be. Even post-adoption every business will need in-house cloud skills to accelerate development and innovation. That perhaps will be the biggest challenge of all.   

The analytical challenges of IoT data

If data is indeed the new oil, then we’re still a long way off from mastering the science of extracting, refining and deploying it as a strategic enterprise asset. That, in short, is the conclusion that emerges from two separate studies. Part of the reason may be the analytical challenges that come with IoT data.

The first study, from Gartner, classifies 87% of businesses as having low BI and analytics maturity. Organizations within this low-maturity group are further divided along two levels: a basic level characterized by spreadsheet-based analytics, and a higher opportunistic level where data analytics is deployed, but only piecemeal and without any central leadership or guidance.  

In the second study from New Vantage Partners, a majority of C-suite executives conceded that they were yet to create either a data culture or a data-driven organization. Rather more worryingly, the proportion of companies that self-identified as data-driven seemed to be on the decline.

Companies may not yet be data-driven but the data flow shows no signs of slowing down.

According to IDC, the global datasphere will grow to 175 zettabytes (ZB) in 2025, up from 23 ZB in 2017. Even as that happens, the consumer share of this data will drop from 47% in 2017 to 36% in 2025. This means that the bulk of this data surge will be driven by what IDC refers to as the sensorized world, or the IoT. 

SOURCE: https://www.seagate.com/files/www-content/our-story/trends/files/idc-seagate-dataage-whitepaper.pdf

Key challenges of IoT data

The surge of IoT data comes with a lot of economic value, estimated at around $11 trillion by 2025. But it also comes with some significant challenges in terms of aggregating data from disparate, distributed sources and applying analytics to extract strategic value. 

The primary challenge of IoT data is its real-time nature. By 2025, 30% of all data will be real-time, with IoT accounting for nearly 95% of it; 20% of all data will be critical and 10% hypercritical. Analytics will have to happen in real time for companies to benefit from these types of data.

Then there is the issue of time series data. This refers to any data that carries a time stamp, from the readings of a smart metering service in the IoT space to stock prices. A company’s IoT infrastructure must be capable of collecting, storing and analyzing huge volumes of time series data, and the challenge here is that most conventional databases are not equipped to handle this type of data.
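
As a tiny illustration of what makes time series data different, the snippet below rolls a handful of irregular, time-stamped meter readings up into fixed 15-minute buckets with pandas; a purpose-built time-series store has to perform this kind of aggregation continuously and at vastly larger scale. The readings are made-up sample values.

import pandas as pd

# Hypothetical smart-meter readings arriving at slightly irregular intervals.
readings = pd.DataFrame(
    {"kwh": [0.12, 0.15, 0.11, 0.18]},
    index=pd.to_datetime([
        "2019-06-01 12:00:07", "2019-06-01 12:05:02",
        "2019-06-01 12:09:58", "2019-06-01 12:15:04",
    ]),
)

# Downsample the raw points into fixed 15-minute buckets.
print(readings.resample("15min").sum())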

The distributed nature of IoT data, where most of the data is created outside enterprise data centers, also presents its own set of analytics challenges. Chief among them is the need for at least some of this distributed data, especially hypercritical and critical data, to be processed outside the data center. IoT analytics itself, therefore, will have to become distributed, with some analytics logic shifting out of the cloud to the edge. IoT analytics will have to be spread across devices, edge servers, gateways and central processing environments. In fact, Gartner predicts that half of all large enterprises will integrate edge computing principles into their IoT projects by 2020.    

These are just a few of the characteristics of IoT data that differentiate it from conventional data sets. And traditional data-analytics technologies and capabilities are not designed to handle the volume, variety and complexity of IoT data. Most companies will have to completely revamp their analytics capabilities to include IoT specific capabilities such as streaming analytics, the ability to identify and prioritize between different data types and formats, and edge analytics. 

CSPs take the lead in IoT data analytics

Many companies are turning to cloud-based IoT platforms that offer rich data services alongside their core IoT offerings. Customers are looking for real-time capabilities across data ingestion, transformation, storage, processing and analysis. Some cloud vendors are even offering their own hardware to enhance the interoperability and performance between IoT devices and the data that is processed in the cloud. 

According to a Bain & Company study, CSPs (cloud service providers) are seen as leaders in providing a comprehensive set of tools that address all the IoT data analytics needs of the enterprise. These CSPs, according to the same study, are also playing a key role in lowering barriers to IoT adoption, facilitating simpler implementations and enabling customers to design, deploy and scale new use cases as quickly as possible. 

AWS takes the lead among IoT CSPs

Among the big brand CSPs, Amazon AWS has consistently been ranked as the platform of choice, followed by Microsoft Azure and Google Cloud Platform, in the annual IoT Developer Survey conducted by the Eclipse IoT Working Group. 

SOURCE: https://iot.eclipse.org/resources/iot-developer-survey/iot-developer-survey-2019.pdf

With data collection and analytics remaining a top three concern among developers, Amazon AWS offers arguably the most robust cloud-based IoT analytics solution in the market today.

The AWS IoT Analytics platform is a managed service that eliminates the complexity of operationalizing sophisticated analytics for massive volumes of IoT data. With AWS Lambda, developers also have access to a functional programming model that enables them to build and test IoT applications for both cloud and on-premise deployments. 

In terms of data collection & analytics, Amazon offers two distinct services in the form of AWS IoT Analytics and Kinesis Data Analytics. 

AWS IoT Analytics has the capabilities required for a range of IoT applications, with built-in AWS IoT Core support to simplify the setup process. With AWS IoT Analytics, it becomes much easier to cleanse bad data and to enrich data streams with external sources. AWS IoT Analytics gives data scientists access to raw and processed data, the facility to save and retrieve specific subsets of data, and the flexibility of rule-based routing of data across multiple processes.  

Kinesis Data Analytics is more suited to real-time data ingestion applications, like remote monitoring and process control, that require low-latency response times in the range of milliseconds. The service integrates with other AWS tools like Amazon DynamoDB, Amazon Redshift, AWS IoT and Amazon EC2 to streamline the data analytics process. The Kinesis suite comprises a raft of services including Kinesis Data Streams, Kinesis Data Firehose, Kinesis Data Analytics and Kinesis Video Streams. Amazon Kinesis Data Streams enables the continuous capture of large-volume, real-time data feeds and events of different kinds. Raw data from Kinesis can then be cleaned and processed through AWS Lambda or Amazon ECS. Kinesis Data Firehose prepares and loads streaming data to S3, Redshift or Elasticsearch for near real-time processing and analytics. 
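
For a sense of what the ingestion side looks like, here is a minimal sketch that pushes a single device reading into a Kinesis Data Stream with boto3; the stream name, region and payload fields are assumptions for illustration.

# pip install boto3
import json
import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-1")  # assumed region

# Hypothetical smart-meter reading.
reading = {"device_id": "meter-42", "ts": "2019-06-01T12:00:00Z", "kwh": 3.7}

kinesis.put_record(
    StreamName="iot-telemetry",         # assumed stream name
    Data=json.dumps(reading).encode(),  # Kinesis records are raw bytes
    PartitionKey=reading["device_id"],  # keeps each device's data in order
)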

While Kinesis offers developers more flexibility in development and integration, AWS IoT focuses on simplifying deployment using prebuilt components. It is possible to combine these two solutions to build a comprehensive IoT solution encompassing streaming as well as at-rest data.  

Late last year, Amazon AWS announced the launch of four new capabilities that would make it easier to ingest data from edge devices. AWS IoT SiteWise is a managed service that makes it easy to collect, structure, and search data from industrial equipment at scale. With AWS IoT Events, customers can now easily detect and respond to events from large numbers of IoT sensors and applications. AWS IoT Things Graph enables a no-code approach to IoT development with a visual drag-and-drop interface that allows developers to build IoT applications by simply connecting devices and services and defining their interactions. And finally there was AWS IoT Greengrass Connectors, a service that would enable developers to connect devices to third-party applications, on-premises software, and AWS services through cloud APIs.

Over and above all this, AWS has established a strong partner network of edge-to-cloud service providers and device manufacturers to offer customers the deep technical and domain expertise required to mitigate the complexity of IoT projects. 

Apart from being a developer favorite, AWS IoT has also built up a client roster of some of the biggest brands in the industry including LG, Bayer, NASA, British Gas and Analog Devices, to name just a few. 

Notwithstanding the challenges of big data and analytics, there have been many successful IoT implementations across diverse sectors. Here, then, are a few success stories of how companies and their IoT partners were able to use the power of big data analytics in IoT. 

IoT data analytics success stories

Bayer & AWS IoT Core: Bayer Crop Science, a division of Bayer, provides a range of products and services that maximize crop production and enable sustainable agriculture for farmers worldwide. The company used IoT devices on harvesting machines to monitor crop traits, with the collected data then manually transmitted, over several days, to its data centers for analysis. The lack of real-time data collection and analytics meant that Bayer could not immediately address any issues with equipment calibration, jamming, or deviations to help with routing plans for subsequent runs. 

Already an AWS customer, Bayer’s IoT team decided to move its data-collection and analysis pipeline to AWS IoT Core. The company built a new IoT pipeline to manage the collection, processing, and analysis of seed-growing data. 

The new solution captures multiple terabytes of data, at an average of one million traits per day during planting or harvest season, from the company’s research fields across the globe. This data is delivered to Bayer’s data analysts in near real-time. The AWS IoT solution also provides a robust edge processing and analytics framework that can be scaled across a variety of IoT use cases and IoT initiatives. 

Bayer is now planning to use AWS IoT Analytics to capture and analyze drone imagery and data from environmental IoT sensors in greenhouses for monitoring and optimizing growing conditions.

Microsoft Azure IoT Hub & ActionPoint: Many manufacturers still use paper checklists, manual processes, human observation and legacy closed-loop technologies to monitor and maintain their equipment. Even in modernized plants, manufacturers often do not have the right sensors in place to provide all the data required, or they have no analytics solution with which to analyze the sensor data.  

Custom software developer ActionPoint partnered with Microsoft and Dell Technologies to develop IoT-PREDICT, an industrial IoT solution for predictive maintenance that incorporates machine learning, data analytics, and other advanced capabilities. The solution is powered by the Microsoft Windows 10 IoT Enterprise operating system running on Dell Edge Gateway hardware, and combined with the Microsoft Azure tool set to provide state-of-the-art edge computing. 

The combination of Windows 10 IoT Enterprise and Azure delivers a highly effective IoT solution that customers can deploy in minutes. It also gives the IoT-PREDICT solution the flexibility and scalability that allows manufacturers to start small with IoT and grow at their own pace.

IoT-PREDICT helps manufacturers quickly reduce downtime, lower costs, and increase the overall efficiency of their equipment and operations. It helps maximize the impact of manufacturer data by using the Microsoft Azure IoT Hub to gather data and make it available to several Azure services, including Azure Time Series Insights and Azure Stream Analytics. Manufacturers can explore the data using Time Series Insights, or use Stream Analytics to act on the data by setting up queries and alerts based on various performance thresholds.
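
To illustrate the kind of device-side code that feeds such a pipeline, here is a minimal sketch that sends a single telemetry message to Azure IoT Hub using the azure-iot-device Python SDK; the connection string and payload fields are placeholders, not details of the ActionPoint solution.

# pip install azure-iot-device
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder - issued when the device is registered in IoT Hub.
CONN_STR = "HostName=<hub-name>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()

# Hypothetical machine-health reading from a production-line sensor.
msg = Message('{"vibration_rms": 0.42, "bearing_temp_c": 61.0}')
msg.content_type = "application/json"
msg.content_encoding = "utf-8"
client.send_message(msg)

client.disconnect()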

IoT data analytics has certain unique characteristics and challenges that cannot be addressed by conventional analytics technologies and capabilities. But as in any analytics operation, the primary objective remains the same: to generate actionable insights that enable positive business value. It is not just about the choice of sensor, connectivity protocol or CSP. To ensure that every IoT project leads to demonstrable business value, organizations have to ensure the integrity of what McKinsey defines as the insights value chain, from end to end. 

Pentair & AWS: Pentair is a water treatment company that offers a comprehensive range of smart, sustainable water solutions to homes, businesses and industries around the world. The company relies on connected systems to monitor and manage its product installations, most of which are in remote locations. Traditionally, the company took the custom building route to develop its connected systems, which came with its own set of disadvantages. 

Pentair needed a powerful, flexible IoT platform, with high availability and scalability and a high degree of reuse across all lines of its business. Pentair also wanted a comprehensive solution that covered everything from IoT data ingestion, to analysis and visualization. 

The company teamed up with AWS Partner Network (APN) Advanced Technology Partner and IoT Competency Partner Bright Wolf to evaluate potential technology providers including Amazon, GE, IBM, Microsoft and others against a set of platform characteristics. This included data ingestion, infrastructure controls, deployment options, machine learning and visualization tools, development tools and the overall openness of each platform.

“AWS came out on top when it came to the raw scoring,” says Brian Boothe, the lead for Pentair’s Connected Products Initiative.

To date, Pentair has deployed three different connected solutions using the AWS IoT platform and a flexible, scalable, and reusable reference architecture developed by Bright Wolf. The benefits, according to Pentair, include accelerated time to market for value-added services, simpler integration, cost savings from deploying commodity edge devices on the open AWS IoT platform, and enterprise-grade scalability and availability. 

Recruitment success at Vinnter

During the spring and summer we at Vinnter have been working extremely hard to find new employees for the best workplace in Göteborg (yes, we might be biased in this judgement… 😁 ). And the Vinnter recruitment success has finally arrived.

We have been working with several tools and services in parallel since the autumn of 2018, but during the early spring we decided to give LinkedIn job ads a chance. This has proven to be the most cost-effective way of finding the best applicants possible.

A 3% hit rate may sound low, but we are extremely satisfied. We have held around 50 initial interviews and approximately 20 second interviews. The best move of all, though, was deciding to develop our own test to put in the hands of applicants who pass the first interview. It has been a great help in filtering out those who really are ready to start with us from a competence perspective.

If you are interested in what opportunities you might find here, please head over to our career pages.

In the end it has resulted in us signing four new employees! We are super excited to get them on board (some have already started) and have them contribute to our customers’ success and satisfaction with Vinnter as their chosen partner for the development of IoT and connected things.

Curious about who we are? Well, just head over to an overview of our team! You can hire a team of these great people if you want!

Benedikt Ivarsson

Benedikt is a cool, calm and collected superhero who has come to Sweden from our neighbour Iceland. He has extensive experience from a range of different roles and companies, and his competence is broad.

He is an ambitious and honest leader who does everything within his power to achieve a satisfied customer and a job well done. He finds it easy to look at the bigger picture when selecting a solution for a project.

With his positive attitude he makes a good team player as well as a good leader. His educational background is in Software Engineering, Project Management, IT Infrastructure and Enterprise

Specialties: Finding solutions, Flexible management, Project management, Product development, Software architecture, Pre-sale, Consulting, IoT, Python, C#, C++, C, Java, Hosting Services, Integration, ATMs, Provisioning, Mediation, Automotive industry, Health care systems, Fintech solutions, Telco solutions, Military Defence solutions, Financial Management.