Blog posts

2023

Data Platform - the centerpiece where it all comes together - Part 1

8 minute read

Published:

A data platform serves as the “link” between IoT connectivity and AI services. It enables the seamless integration and consolidation of data from various sources while acting as a robust foundation for AI services. Additionally, a data platform can provide data reliability, scalability, and robustness while also considering security, governance, and cost constraints. In this blog post I will describe some basics of a data platform and the requirements a data platform should fulfill. In the following blog post (part 2) I will describe the components of a data platform as well as its functionalities.

Connectivity - the basis for local and global data exchange - Part 2

19 minute read

Published:

In the first part of this blog post series, fundamental and strategic questions about connectivity were addressed. This second part delves into the operationalization of the hardware components and communication protocols essential for cloud connectivity.

Connectivity - the basis for local and global data exchange - Part 1

11 minute read

Published:

WHY move the local machine world into the cloud at all? Is local connectivity on the machine side not sufficient? And if I want to bring my local machine world into the cloud, which data integration strategy is the right one for me? These are precisely the questions I address in this blog post and the underlying whitepaper.

How to get from Sensor Data to a digital product

5 minute read

Published:

“We should digitalise our product! Digital twin, predictive maintenance, intelligent quality control: can’t we offer that too?” Service managers in the mechanical engineering sector are increasingly confronted with questions like these. Do they sound familiar to you? This blog post, together with its follow-up posts and the underlying whitepaper, provides technical and strategic answers as well as a fact-based introduction to the world of product digitalisation.

2022

Using AWS Lambda with custom container image for Machine Learning inference

8 minute read

Published:

Serverless offerings from cloud providers have many advantages, such as automatic scaling and no provisioning or managing of servers. But serverless functions also come with some limitations, like complicated deployment package management, limited package size, and limited RAM. Especially if you want to use Lambda functions for Machine Learning model inference, these limitations can be quite restrictive. AWS recognized this as well and in December 2020 extended its Lambda offering with container image support. In this post I’ll describe my experiences using AWS Lambda with custom container images and the ups and downs of this approach.
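To make the inference pattern concrete, here is a minimal sketch of a Lambda handler as it might sit inside such a custom container image. The stand-in “model” and the `features` event key are illustrative assumptions, not the actual code from the post; in a real image the model would be baked into the container and deserialized with something like joblib or torch.

```python
import json

def load_model():
    # Illustrative stand-in: a real image would deserialize a model
    # file shipped inside the container (e.g. joblib.load("model.pkl")).
    return lambda features: sum(features)

# Loaded once per container instance and reused across warm invocations,
# so the (potentially slow) model load is not paid on every request.
MODEL = load_model()

def handler(event, context):
    # Lambda passes the invocation payload as a dict in `event`.
    features = event.get("features", [])
    prediction = MODEL(features)
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": prediction}),
    }
```

Loading the model at module scope rather than inside `handler` is the usual trick for keeping warm-invocation latency low, which matters even more once models grow toward the container image size limit.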