Pavel Sklenar

Azure Functions in Kubernetes Example

Introduction

This tutorial shows how to deploy two simple examples of Azure Functions written in Python to Kubernetes (AKS is used here, but it is not required).

Environment setup

Install prerequisites:

- Docker (e.g. https://www.docker.com/products/docker-desktop)
- Kubectl (https://kubernetes.io/docs/tasks/tools/install-kubectl/)
- Azure CLI (https://docs.microsoft.com/cs-cz/cli/azure/install-azure-cli)
- Func CLI (https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local)

Download the Functions source code:

```bash
git clone git@github.com:pajikos/cloud-samples.git
cd cloud-samples/azure-functions-in-k8s
```

Set up your custom properties: edit the file .env and fill in the correct values. Then load the environment properties:

```bash
# Load env properties
source .env
```

Log in to the Docker registry: you need a Docker registry with push rights to be able to upload the images with the two functions.
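From there, the deployment typically amounts to building and pushing the function images and letting the Functions Core Tools generate the Kubernetes manifests. A minimal sketch of that flow, assuming .env defines a DOCKER_REGISTRY variable; the image and deployment names are illustrative, not the repository's actual values:

```bash
# DOCKER_REGISTRY is assumed to come from .env; names below are illustrative.
docker login "$DOCKER_REGISTRY"

# Build and push the image for one of the functions
docker build -t "$DOCKER_REGISTRY/azure-function-example:latest" .
docker push "$DOCKER_REGISTRY/azure-function-example:latest"

# Generate Kubernetes manifests for the pre-built image and apply them
func kubernetes deploy --name azure-function-example \
  --image-name "$DOCKER_REGISTRY/azure-function-example:latest" \
  --dry-run > deploy.yml
kubectl apply -f deploy.yml
```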

Creating two node Nomad cluster

Nomad is an interesting alternative to Kubernetes. As the authors of Nomad say, it is a supplement to Kubernetes and offers some great features. Here you can find a guide on how to create a fresh Nomad cluster with two nodes. The first node acts as both a server and a client, the second node acts as a client only. I know this architecture is not recommended for production use, but I only want to test it.
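The split between the two nodes comes down to which blocks are enabled in each agent's configuration. A minimal sketch of the idea, with placeholder IPs, paths and file names rather than values from the guide itself:

```bash
# Node 1: server and client in one agent (placeholder paths and IPs)
cat > /etc/nomad.d/node1.hcl <<'EOF'
data_dir = "/opt/nomad"
server {
  enabled          = true
  bootstrap_expect = 1
}
client {
  enabled = true
}
EOF
nomad agent -config=/etc/nomad.d/node1.hcl

# Node 2: client only, pointing at the server's RPC port (4647)
cat > /etc/nomad.d/node2.hcl <<'EOF'
data_dir = "/opt/nomad"
client {
  enabled = true
  servers = ["<node1-ip>:4647"]
}
EOF
nomad agent -config=/etc/nomad.d/node2.hcl
```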

Extend local network to cloud with Nebula

Nebula Network

Nebula is a scalable overlay networking tool with a focus on performance, simplicity, and security. It lets you seamlessly connect computers anywhere in the world. This post is about extending a local network with another server running anywhere in the world, with everything secured by the Nebula network. Devices on the local network should be able to access devices on the Nebula network, and some devices need to access devices on the local network as well.
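At its core this means running a small certificate authority, signing a certificate per node, and starting the Nebula agent everywhere. A rough sketch of those steps, where the CA name, host names and the 192.168.100.0/24 overlay range are illustrative assumptions:

```bash
# Create a certificate authority for the overlay network (name is illustrative)
nebula-cert ca -name "home-network"

# Sign a certificate for the lighthouse (the publicly reachable node)
nebula-cert sign -name "lighthouse" -ip "192.168.100.1/24"

# Sign a certificate for a remote server joining the overlay
nebula-cert sign -name "cloud-server" -ip "192.168.100.10/24"

# Start the agent on each node with its own config.yml
# (the lighthouse address and any unsafe_routes to the local LAN
# are configured there)
nebula -config /etc/nebula/config.yml
```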

Transfer data between two systems using Azure functions

Introduction

This simple demo shows a possible integration between two systems (system A and system B) using Azure Functions. Note: the full example with all resources can be found here on GitHub.

The architecture constraints:

- All updates from system A must be transferred into system B.
- System A listens on HTTP with a REST API.
- System B listens on HTTP with a REST API.
- System B is not fully compatible in its message definitions, so field mapping must be used.
- The mapping must be saved in a DB (I chose CosmosDB).
- Because system A has no push notifications, its API must be polled periodically.
- System B may occasionally be offline, so some type of persistent bus should be used.
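Given these constraints, a natural shape is a timer-triggered function polling system A and a queue-triggered function delivering mapped messages to system B, with the queue acting as the persistent bus. A sketch of scaffolding that with the Functions Core Tools; the project and function names are illustrative, not taken from the demo:

```bash
# Scaffold a Python Functions project (names are illustrative)
func init integration-demo --worker-runtime python
cd integration-demo

# Periodically polls system A's REST API (no push notifications available)
func new --name PollSystemA --template "Timer trigger"

# Reads from the persistent queue and pushes mapped messages to system B,
# which may be temporarily offline
func new --name ForwardToSystemB --template "Azure Queue Storage trigger"
```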

Creating Azure Red Hat OpenShift 4 cluster

Here you can find a guide on how to create an Azure Red Hat OpenShift 4 (ARO) cluster, how to set up connectivity to the new ARO cluster, and how to deploy an example application. Note: the full example with all resources can be found here on GitHub.

Create Azure Red Hat OpenShift 4 (ARO) cluster

Prepare environment

Set the correct subscription:

```bash
az account set -s TestingSubscription
```
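From there, provisioning boils down to a handful of az commands. A condensed sketch of the main steps, assuming the virtual network and its two subnets already exist; all resource names and the location are placeholders, not values from the guide:

```bash
# Register the resource provider once per subscription
az provider register -n Microsoft.RedHatOpenShift --wait

# Placeholder names; the vnet and subnets must exist beforehand
az group create --name aro-rg --location westeurope
az aro create \
  --resource-group aro-rg \
  --name aro-cluster \
  --vnet aro-vnet \
  --master-subnet master-subnet \
  --worker-subnet worker-subnet

# Retrieve the kubeadmin credentials and the console URL
az aro list-credentials --resource-group aro-rg --name aro-cluster
az aro show --resource-group aro-rg --name aro-cluster \
  --query "consoleProfile.url"
```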

Jenkins - Creating Dynamic Project Folders with Job DSL

This post is about dynamically creating project folders in Jenkins using the Job DSL Plugin. The newly created project folders will be accessible only by a specific group or user, so your Jenkins instance can be used by multiple teams, and each team will have its own folder for its projects. Before we can start, the following plugins must be installed:

- Matrix Authorization Strategy
- Job DSL Plugin

Setting Up the Correct Authorization Type

The first step is to set up the correct authorization type in the Configure Global Security menu:
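For completeness, the two plugins can also be installed from the command line via the Jenkins CLI. A small sketch, where the URL and credentials are placeholder assumptions:

```bash
# matrix-auth = Matrix Authorization Strategy, job-dsl = Job DSL Plugin;
# the URL and the admin API token below are placeholders
java -jar jenkins-cli.jar -s http://localhost:8080/ -auth admin:API_TOKEN \
  install-plugin matrix-auth job-dsl -restart
```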