Dilan Ustunyagiz

Dilan believes that to create value in today’s saturated markets, an unconventional cross-field perspective is needed, which is where the real creativity lies. With that in mind, she has been training herself across fields. She holds a BSc in Industrial Engineering, a BA in Media and Visual Arts, and an MSc in Entrepreneurship and Business Design. A devoted lifelong learner, she also holds certificates in a range of topics from Politecnico di Milano, Tufts University, and Harvard University.

Her professional background is just as colorful as her education, ranging from project management in education and government relations to service development in banking and business development in the transportation industry.

Steered by a curious and human-centric nature, she is keen on creating and claiming value hands-on wherever technology meets business and analytics meets strategy.

Erik Manfredsson

Erik is a dedicated collaborator who puts the team’s progress first. Coming from a banking background, he has gained extensive interpersonal skills and a profound understanding of attending to customer needs, which he now uses to create a great end-user experience.

Erik finds the creativity of problem solving the most engaging part of his job and thrives in a flexible and dynamic work environment.

Being a goal-oriented person, he likes to take on challenging tasks to continue developing his technical abilities. One of his biggest motivators is the continuous learning the job offers, which is why software development has quickly become a big passion of his.

His software development experience centres on high-level languages such as Java and Python, as well as serverless cloud solutions.

Carlos Delgado

Carlos is passionate about creating products that solve real problems.

For the last two years he has been working on web and mobile applications, and he has extensive experience in product development for wearable technologies. After running his own IT business for a couple of years, he decided to change career and become a coder.

His main tools are JavaScript and its related libraries and frameworks.

Carlos is always eager to learn new things, and loves to improve the work environment by bringing positive values to the team.

Other interests of his are blockchain and smart contracts, which are becoming a crucial part of developing connected things.

Wouter Dankers

Wouter is an enthusiastic and driven person who is eager to learn and experiment with new technologies. He received his MSc in embedded and intelligent systems from Halmstad University, Sweden, and his BSc from the Katholieke Hogeschool Kempen, Belgium. Wouter is an experienced and versatile software engineer who takes great pride in writing well-functioning, clean code that scales and ages well.

In his free time, Wouter likes to tinker with his pet projects, ranging from embedded systems to web design. When he is not tinkering, he likes to spend time outside.

If you want to know more about Wouter and his experiences, take a look at his profile on LinkedIn.

Linus Nibell

Linus is a technical person with an eye for complexity and system architecture. He always strives to learn and understand more about how systems work and communicate.

He strongly believes in quality in all aspects, which is reflected in how he views software development. For him, it is not about building the perfect system, but the right system for the job.
 
Another aspect of his way of working is being a key player and seeing which kind of pattern or structure suits the task best.

Linus is not afraid to fail and learn from it. For him, a happy customer and a great product for the job are worth more.

Servitisation: Benefits And Best Practices

Servitisation is a complex-sounding word that encapsulates a simple idea: shifting from a reliance on products as a driver for economic growth, to an emphasis on delivering services that complement a particular product, and give the consumer a more rounded package.

So instead of focusing on selling new products, servitisation concentrates on giving the consumer the outcome that they would associate with a product — in terms of what it can do, how its top level of performance can be sustained, and through the addition of value-adding features or services that expand upon this performance in various ways.

Why Servitisation Matters

Makers and sellers of goods must now operate in an environment that’s in a continual state of disruption, due to technological advancements and product or process innovations. In this climate, the challenge is to find new ways to differentiate your brand, and to adjust your product line or portfolio so as to remain competitive.

Products in today’s market also face the risk of quickly becoming commodities, as customer behaviour and expectations alter, the actual lifespans of products decrease, and pressures increase from a global market in which access to information and intellectual property make it easy for players to replicate items — in turn making it difficult for producers to distinguish their own offerings from the rest of the pack.

Historically, manufacturers have been responsible for providing the hard goods, with the consumer taking responsibility for ongoing maintenance and repairs. However, as consumers have become more aware of the possibilities offered by new materials and fabrication techniques, there is no longer a viable demand for products that require constant repair.

A sustainable business model for these times is one that benefits both manufacturer and consumer, and keeps their interests in line with each other. The shift from a product-oriented strategy to one that’s based on services can accomplish this in a number of ways.

You might for example add a service like maintenance or monitoring to an existing product. Making products available to consumers on a rental basis would be another strategy. Or you might offer a service-based alternative to the simple product itself, with expertise and consulting to perform the functions that the customer initially bought the item to accomplish.

The classic example of servitisation in this sense is the “power by the hour” model adopted by Rolls-Royce for its jet engines — a model that gives consumers the product together with an ongoing maintenance and monitoring programme and related services. The pay-per-copy service that Xerox offers is an example of another type, with the company capitalising on the output that its products deliver.

The Benefits That Servitisation Offers

Over the last decade, the value found in production alone has been declining, while the value contribution derived from services has been on the rise. Market analysis also indicates that services lead to increased revenue while also having higher profit margins than selling products. Manufacturers who can make the shift to providing innovative and worthwhile services alongside their products therefore stand to capture more of their customer value.

Due to the long-term connection with consumers that ongoing service provision makes possible, servitisation offers organisations a more continuous revenue stream, and greater financial stability. Over time, opportunities to sell additional products or services may come to light.

The extended interaction with customers also facilitates the nurturing of good relationships, and fosters customer loyalty. This aids in customer retention — one of the foundation stones for economic success.

By packaging both a product and a set of relevant services, the servitisation model enables manufacturers to market a complete solution to the customer’s requirements. This generates revenue for the organisation from both the product and solution sides, while giving consumers the complete offerings that they expect and desire.

Putting Servitisation Into Best Practice

Embracing servitisation requires an initial change in mind-set at the level of the organisation. Rather than viewing yourself as either a product manufacturer or a service provider, the trick is to establish and maintain the optimum balance between the two sets of activities.

Making the switch will result in changes that affect the entire organisation, both in its operational and commercial aspects. In all likelihood, the organisational focus, procedures, and structure will alter in significant ways, and it’s important to communicate and manage these changes effectively.

One way of handling this is to set up a dedicated change management team, which takes responsibility for situational monitoring, developing strategies for implementation, making adjustments as necessary, and communicating the results to all stakeholders in the enterprise.

Staff training should also be a part of the transition. Training programmes should address new methods and modes of delivery for services, customer interaction, how to access and use the knowledge bases connecting products and services, etc.

Since servitisation is a customer-centric endeavour, it’s important to monitor customer data, and to keep an eye on trends in consumer behaviour and expectations. An awareness of customer requirements and pain points will empower you to adjust the nature of your service offerings, so that they remain relevant and unique.

From a financial perspective, it’s important to assign the expenses resulting from the sale of various services to relevant business functions, and to adopt relevant metrics or performance indicators to keep track of these flows of funds.

In some instances, the organisation may have to make financial provisions not only for the costs associated with maintenance, but also for any penalties specified in risk sharing contracts. Wherever possible, financial obligations should be at least partly transferred to the supply chain, through the design of contractual agreements that formalise reliability and quality requirements. Here, it may be necessary to implement performance management processes that can assist your suppliers in sticking to their contract requirements.

Maintenance and monitoring are two core requirements of the servitisation economy, and to implement them successfully, it’s crucial to have the right technologies available. Predictive maintenance analytics and technology are particularly useful in this regard. Technologies like digital twins and virtualisation enable manufacturers to optimise the design of their equipment and facilities, along with production and maintenance processes.

In addition, sustaining an effective flow of information across the supply chain may require a software platform that enables scalable and real-time performance monitoring, with advanced machine learning algorithms for process automation. This becomes particularly relevant for servitisation implementations involving connected technology and devices.

At Vinnter, we provide our customers with highly skilled and experienced teams of software and business developers in the area of connected things. Our activities cover applications ranging from electronics design to the development of cloud services and mobile applications.

Even more important for success is the user and business perspective on an idea. Therefore we also have people skilled in asking the hard question of why. Once we know the why, it is much easier to work out the how and the what, which in turn lead to the whom and the when.

If you would like to know more about how Vinnter can assist in your servitisation efforts, get in touch with us.

Why The Future Of Smart Connected Things Is Tiny Machine Learning

 

by Gustav Evertsson, Vinnter AB

Tiny machine learning or TinyML is altering the shape and nature of the machine learning landscape.

During the last two decades we have seen a boom in machine learning like never before. As a technology, machine learning is actually much older than that; recently, however, major research advances like long short-term memory (LSTM) networks, ImageNet, and the introduction of GPUs have made machine learning a feasible option for many problems. Faster internet connections and ever larger memory devices (both storage and RAM) have also made the data needed to train machine learning models more available. Companies like Google, Amazon, Facebook, and others have in many cases been open with technology they originally developed for in-house use cases, and are now driving the development of new machine learning algorithms in many different ways. Cloud providers also offer machine learning environments, making it very easy for organisations both to get started with the technology and to scale up when needed.

In parallel with this we have also seen a growth spurt in the Internet of Things, with computing power becoming cheaper and less power hungry, so that it can now be added to a wide range of things to make them smarter. In the IoT boom we have also seen how sensors of all kinds can be used to monitor a diverse range of conditions — for example the environment, our own health, or the device itself.

The standard way to handle all this data has been to send it to the cloud for processing. However, because bandwidth is normally expensive or limited, and sensors can generate a lot of data in a short time, much of this information never makes it to the cloud at all. But if machine learning analysis can be applied locally on IoT devices, these losses can be avoided and new possibilities open up.

These two technologies are now combining into what is called tiny machine learning (TinyML) — an environment in which processing power is sufficient to run machine learning models even in small, power-constrained applications, with direct access to sensor data. On the software side, improvements in machine learning models have not only extended their capabilities, but also made them more efficient when applied to the simpler tasks more often associated with IoT devices.

TinyML Processing

The algorithms used in tiny machine learning are in essence much the same as those in traditional ML operations, with initial model training typically occurring on a local computer, or in the cloud. After this initial training, the model is condensed to produce a more compact package, in a process called deep compression. Two techniques often employed at this stage are pruning and knowledge distillation.
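To make the pruning step more concrete, here is a minimal sketch using the TensorFlow Model Optimization Toolkit — an assumption on tooling, since the article does not name a specific pruning library. The model architecture and training data are placeholders; the point is simply that weights are gradually zeroed out during fine-tuning so the network compresses better later.

```python
# Minimal pruning sketch (assumes TensorFlow 2.x and tensorflow-model-optimization).
# The model and data below are placeholders, not a real TinyML workload.
import numpy as np
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# A small Keras model standing in for a network trained on a desktop or in the cloud.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])

# Dummy training data, just to make the example runnable end to end.
x_train = np.random.rand(256, 10).astype("float32")
y_train = np.random.rand(256, 1).astype("float32")

# Gradually drive 80% of the weights to zero during fine-tuning.
pruning_params = {
    "pruning_schedule": tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0, final_sparsity=0.8,
        begin_step=0, end_step=1000,
    )
}
pruned = tfmot.sparsity.keras.prune_low_magnitude(model, **pruning_params)
pruned.compile(optimizer="adam", loss="mse")

# The callback updates the pruning masks at each training step.
pruned.fit(x_train, y_train, epochs=2,
           callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# Strip the pruning wrappers before export, leaving only the sparse weights.
final_model = tfmot.sparsity.keras.strip_pruning(pruned)
```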

Once this distillation is complete, the model is quantised to reduce the precision of its weights and shrink its storage footprint, and converted into a format compatible with the connected device. Encoding may also be applied if the learning model needs to be made smaller still.

The model is then converted into a format which can be interpreted by a light neural network interpreter, such as TensorFlow Lite (TF Lite).

TensorFlow by Google is one of the most popular machine learning libraries, and in 2017 TensorFlow Lite was released, targeting mobile applications. TensorFlow Lite Micro (released in 2019) targets even smaller microcontroller applications. These two platforms have made the process of shrinking a model to fit embedded devices a lot easier. It is now possible to develop and train machine learning models on high-performance desktop or cloud machines, then deploy them on embedded platforms while still using the same API.
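As a sketch of what the conversion and quantisation steps look like in practice (assuming a TensorFlow 2.x environment; the small Keras model is a stand-in for a trained network), the following converts a model to the TensorFlow Lite flatbuffer format with post-training quantisation enabled, then loads it back into the TF Lite interpreter as a sanity check. On a microcontroller, the same flatbuffer would typically be compiled in as a C byte array and executed with TensorFlow Lite Micro.

```python
# Minimal conversion sketch: Keras model -> quantised TensorFlow Lite flatbuffer.
import tensorflow as tf

# Placeholder for a model that has already been trained (and possibly pruned).
final_model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(final_model)
# Default optimisations apply post-training quantisation to shrink the model.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write the flatbuffer to disk; on an embedded target this file would be
# embedded in the firmware image instead.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# Load the converted model into the TF Lite interpreter on the desktop to
# verify that it still initialises correctly before deployment.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
```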

Edge Processing For TinyML

As IoT devices and applications become more integrated with mission and business-critical use cases, the response time from information processing at data centres or in the cloud will not be quick enough. There may also be situations where hundreds of IoT sensors need to connect to the cloud simultaneously, creating network congestion.

Processing the data at the edge brings several benefits. From the point of view of privacy and data protection laws, auditing and compliance are much easier to handle when the data never leaves the device. Securing the information is also easier, because it can be very short-lived when it is consumed as soon as it is read from the sensor.

Guaranteeing Energy Efficiency For Tiny Machine Learning

In many cases, processing data locally at the network edge consumes a lot less energy than transmitting it to the cloud or data centre, so battery life can improve. Some TinyML devices are capable of operating continuously for a year, running on a battery the size of a coin. This introduces options for remote environment monitoring applications in areas like agriculture, weather prediction, or the study of earthquakes.

Network latency can also be reduced when the data does not have to be transmitted back and forth to the cloud. Augmented reality, for example, is data-intensive, and any delay in video processing becomes very noticeable.

Looking into the future, the power cost of CPU and memory is expected to continue to fall, but not that of radio transmission, where we seem to be closer to the physical limit of how much data per Wh we can send. This will only make the case for TinyML stronger, and we will likely see ultra-low-power ML devices running for years on small cell batteries, needing to transmit data to the cloud only when anomalies are detected. We are also beginning to see microprocessors built specifically for machine learning applications, like the Syntiant NDP100, with a footprint of only 1.4 x 1.8 mm and power consumption of less than 140 μW while still performing voice recognition. Another example is the Edge TPU by Google, an ASIC chip made to run ML models while consuming only a few watts of power.

 

 

Hena Hodzic

Hena is an extremely positive, curious, brave and driven person who is not afraid to try different things, take on new challenges and embrace knowledge. She has been attracted to electronics and technology since she was a child, and is still highly motivated to improve her skills and learn more in that field.

She was born in Sarajevo, where she completed a master’s degree in the Department of Automatic Control and Electronics. After that she came to Sweden, where she completed a master’s programme in Intelligent Embedded Systems. Holding two master’s degrees in different areas, one of her advantages is a good understanding of both hardware and software.

In her opinion, the desire to learn, accompanied by a pleasant atmosphere and surroundings, is enough for success. Hena puts high demands on herself, and her inspiration lies in the famous quote: “I have not failed. I have just found 10,000 ways that won’t work.” With her great ambitions and the support of the excellent Vinnter team, there is no doubt that she will reach for the stars and become an excellent team player and engineer.

Justifying Connected Things – Business Models

How come so many Internet of Things (IoT) ventures fail? 

According to data from McKinsey, about 75% of IoT-based businesses don’t make it off the ground. That’s very significant indeed, especially considering all the hype that the technology has received over the past decade. 

Is it because the scope for object connectivity is more limited than we first thought, or is it that companies are jumping into the market with business models that are unsuitable for IoT and don’t maximise the opportunities it offers? 

This latter suggestion wouldn’t be unheard of amongst early adopters of any new technology — but it is a particular hurdle in the IoT space, which represents such a significant shift away from the status quo. Below, we’ll take a look at why this ‘new tech, old business model’ approach is stymying IoT development and explore some solutions that move away from it.

 

What are IoT companies doing wrong currently? 

One of the major pain points IoT companies come up against is trying to impose a hardware-based business model onto a technology that centres around connectivity and service provision.

This hardware-centric approach focuses, much like manufacturing has always done until recently, on the traditional design-build-sell process.

The issue with this is that for an IoT business to succeed, it needs to provide continuous value for consumers. The product itself is just the start — companies need to plan for networks on which their products operate as well as service platforms which collect and manage data. The margins here are less clear-cut and, given how young a technology IoT really is, there’s not a huge pool of knowledge on how to do this. 

In other words, it’s easy enough for IoT companies to build something that works, but much more difficult to predict, forecast and follow through with making it profitable. 

Why do connected business models work best for IoT companies?

IoT works at its best when it provides continuous value for customers — and this is achieved through the platforms and networks that allow people to process the data your physical product provides. 

In this sense, the connectivity of the product is as important as the product itself. Business plans which centre this, rather than being explicitly product-focused, are more likely to support the continuous costs associated with IoT technology.

In terms of what this looks like on a practical level, connected business plans should consider: 

 

Subscription-based

One simple way to set up your business for the ongoing costs you’ll face as an IoT company is to use a subscription-based model for payments. 

The good news is that companies are increasingly used to paying for key technologies on a subscription basis, thanks to the recent SaaS boom. Key to IoT companies’ success is transferring this concept onto what people view as a physical product.

Facilitation of pilot projects

As well as being relatively new as a technology, IoT solutions can take a little while to realise a positive ROI — the nature of the tech means that you’re in it for the long game. 

This means that helping customers set up pilot projects should be central to all connected business plans. Accompanying customers on the first stage of their journey, with advice on getting the most out of the data they produce, could help along an investment they’d otherwise be hesitant to make.

Circular, not linear

One of IoT’s biggest appeal points is that it can help companies shift to what we call the ‘circular economy’. This reimagines the traditional product lifecycle (buy, use, dispose, replace) into something much more sustainable. 

The circular economy centres on reuse, maintenance and recycling to create a system in which products last longer, less waste is produced and fewer raw materials are needed. As more companies look to a sustainable future, connected IoT products — with their ability to self-regulate and their potential to assist with key maintenance tasks — look set to play an increasingly important role.

Look towards the future with your IoT business plan and consider how your product could fit into such a system. 

What connected IoT business models are out there and how do they work? 

The good news is that despite these teething problems, IoT looks set to stick around. 

We can predict this because, despite the issues experienced by many ventures at the moment, there are a number of proven connected business models out there right now which companies use very successfully. 

The most successful of these business models include: 

  • Compliance monitoring: compliance is a huge expense for manufacturers (each year US manufacturers spend an estimated $192 billion on it). Using a connected device to monitor key compliance metrics like emissions is cheaper and more efficient than having someone come to check every quarter — and more transparent too.  
  • Predictive maintenance: we all understand the delays, frustration and losses that broken equipment can cause, even on a small scale. Devices that monitor activity levels, stress and other key metrics can issue automated alerts when they start to underperform, so that supplier technicians can fix the issue before it becomes any bigger. 
  • Remote diagnostics: condition monitoring and optimisation can now be automated using smart sensors. Examples here include warehouse temperatures for perishable goods and soil conditions for plants. 
  • Asset tracking: microcontrollers connected to mobile internet can track and monitor an asset from anywhere on Earth. This is an unprecedented level of transparency, and is particularly useful for supply chains looking to reduce loss and theft and improve fleet efficiency and demand forecasting. 
  • Automatic fulfillment: smart devices can be programmed to order certain products automatically when they run out — see the Amazon Dash button and the ‘smart’ appliances (like fridges and dishwashers) currently making waves in the consumer space. 

A few final thoughts…

Estimates suggest that the IoT has a potential economic impact of between $3.9 and $11.1 trillion by the year 2025. To realise that, many IoT ventures will need to rethink how they structure their business. 

IoT products are physical products, so to many it would seem entirely natural to treat them as you would other consumer or commercial electronics. Yet because of the nature of these devices and how central the ‘service’ side of them is to customer success, a hardware-based business model will yield limited results in the long term.  

Instead companies should look to centre connectivity in their IoT business models. Ultimately, the appeal of IoT technology is the continuous value it delivers to customers, and business models that capitalise on this will bring the most success. 

What Does The Current Gartner Hype Curve Tell Us?

For the IT industry, the Gartner Hype Curve provides a graphical representation of the maturity and adoption of various technologies and applications. The analysis accompanying Gartner Hype Cycles also gives an indication of how potentially relevant these solutions are to resolving real life business problems, and the opportunities that they can provide to businesses, for expanding or improving their operations, and gaining a competitive edge.

As with all kinds of analysis however, it’s important for anyone reading and interpreting the data to appreciate the underlying principles guiding the research, and to understand the full implications of all the observations that the analysis brings to light.

Gartner Hype Cycle Methodology

According to Gartner, Inc., the methodology that the research firm uses in preparing their Gartner Hype Cycle, “gives you a view of how a technology or application will evolve over time, providing a sound source of insight to manage its deployment within the context of your specific business goals.”

Each Hype Cycle looks in depth at five key phases in the life cycle of a particular technology or application:

  1. Innovation Trigger: This is a breakthrough or discovery that gains public attention and media coverage. Often at this ideas stage, no usable products or viable business models are available. Much of the hype comes from proof of concept (PoC) evidence, and the potential implications of the new technology.
  2. Peak of Inflated Expectations: As a result of all the early publicity, a number of success stories often appear at this point — but tales of failure may be just as common. Some companies will take action on the basis of these early successes, but many will not.
  3. Trough of Disillusionment: During this phase of the cycle, interest in the new technology declines and cynicism begins to set in, as experiments and implementations fail to deliver the promised results. However, if surviving providers of the new technology manage to improve their products and services to the satisfaction of early adopters, investment in the development may continue.
  4. Slope of Enlightenment: As time goes on, second and third-generation products or services appear from the technology providers. More instances of how the new technology can benefit the enterprise begin to emerge, and observers now have a better understanding of how it works. As a result, more enterprises begin to invest and fund pilot schemes. However, more conservative elements still remain on the fence.
  5. Plateau of Productivity: At the final stage of Hype Cycle maturity, mainstream adoption of the new technology starts to take off. For businesses looking to invest or implement development projects of their own, there’s a clearer understanding of the criteria for assessing provider viability.

A typical Gartner Hype Curve might look like this:

 

(Image source: Gartner)

What The Current Gartner Hype Curve Suggests

The current Gartner Hype Curve considers five technology trends which are “revolutionising how customers experience digital”, and should provide food for thought for businesses making their strategic plans for 2020 and beyond.

1. Multiexperience

Observers in retail and other industries whose consumers take a multi-platform approach to interacting with brands will already be familiar with what Gartner calls “multiexperience.” It’s a blanket term for the various devices and apps that people use on their many digital journeys. This typically involves a combination of interaction modes and touch points, ranging from web and mobile apps, through natural-language-based chat and voice interfaces, to gestures used in 3D or virtual environments.

For businesses wishing to keep pace with this trend, their in-house development teams or external contractors should master mobile app design, development, and architecture. These teams should create mobile apps with modalities based on specific touch points, while engineering a consistent and unified user experience (UX) across web, mobile, wearable devices, conversational interfaces, and immersive experiences.

2. Machines Without Interfaces

So-called “interfaceless” machines are becoming more widespread, as manufacturers in various sectors are phasing out on-board instrument panels in favour of apps that run on their handler’s mobile devices. Device control is being enhanced by the large, high-resolution screens now common on mobile devices. Meanwhile, control software design is easier with the availability of configurable APIs (application programming interfaces).

3. Agent Interfaces

As interface design evolves across a range of industries, interfaces incorporating Artificial Intelligence (AI) are enabling developers to predict what users intend to do, on the basis of information gleaned from past interactions.

Conversational UIs (or chatbots) are an example of these intelligent agent interfaces, which have the potential to greatly influence how enterprises interact with their consumers, offer services, and provide tools to their employees.

4. Facial Recognition Payment Systems

Pioneered and gaining popularity in China, facial recognition payment systems use QR codes and the scanning / recognition capabilities of mobile device cameras and sensors to bypass traditional cash and card-based mechanisms.

Though the technology requires a high degree of confidence and trust in the payment service provider, these systems are gaining adoption outside of China. Apple’s Face ID with Apple Pay is one example.

5. Inclusive Design

As diversity becomes a key issue both in and outside the workplace, designers must give consideration to all potential users of their products and services. By taking into account the special needs of all possible communities, inclusive design can serve the broadest possible population of users. To ensure this, the data sources used in design efforts must reflect all potential user segments, and avoid data sets that are too narrow or non-inclusive.

Should We Take This At Face Value?

Gartner, Inc. places the emphasis on Chief Information Officers (CIOs), as the business leaders who most need to understand how digital experiences are developed and delivered. The research firm’s clients use the Gartner Hype Curve and its implications as the basis for understanding the promise of an emerging technology within the context of their particular industry, and each individual enterprise’s appetite for risk.

Early adopters need to weigh the balance of a potentially risky investment in largely untested technology against the success that could emerge from getting ahead of the rest of the market.

Executives with a more modest approach to risk-taking will generally insist on a sound cost / benefit analysis of new technologies or methods, before making any financial commitments.

In the case of technologies and services with too many unanswered questions concerning their commercial viability, it might be better to adopt a more conservative stance, and wait until others in your sector have been able to deliver tangible value.

Industry analyst Elaine Burke proposes an additional phase to the Gartner Hype Cycle after the plateau of productivity, to reflect the practical reality of when everyday technology becomes a source of everyday frustration.

Burke argues that the Morass of Malfunction should be included, to take account of the stage in a technology’s maturity when a disconnect occurs between user expectations and the technology provider’s development plan. A typical example would be the experience of waiting for a website to load new elements while you are scrolling: just as you click or tap on the thing you were looking for, the whole layout jumps, and you’re instantly transported somewhere you didn’t want to go.

By including some concession to the usability issues of a technology after it gains widespread acceptance, the Gartner Hype Curve could give a more complete picture of its life cycle.