Configure, Manage, and Simplify Your Observability Data Pipelines with the Calyptia Core Docker Extension


Written by Eduardo Silva in NewsCalyptia Coreon December 16, 2022


This post was co-written with , DevRel Manager at Docker. It also appears on the Docker Blog.

Applications produce a lot of observability data. And it can be a constant struggle to source, ingest, filter, and output that data to different systems. Managing these observability data pipelines is essential for being able to leverage your data and quickly gain actionable insights.

In cloud and containerized environments, Fluent Bit is a popular choice for marshaling data across cloud-native infrastructure. A super fast, lightweight, and highly scalable logging and metrics processor and forwarder, it recently surpassed three billion downloads.

Calyptia Core, from the creators of Fluent Bit, further simplifies the data collection process with a powerful processing engine. Calyptia Core lets you create custom observability data pipelines and take control of your data.

And with the new Calyptia Core Docker Extension, you can build and manage observability pipelines within Docker Desktop. Let’s take a look at how it works!

architectural illustration

What is Calyptia Core?

Calyptia Core plugs into your existing observability and security infrastructure to help you process large amounts of logs, metrics, security, and event data. With Calyptia Core, you can:

  • Connect common sources to the major destinations (e.g., Splunk, Datadog, Elasticsearch, etc.)

  • Process 100k events per second per replica with efficient routing.

  • Automatically collect data from Kubernetes and its various flavors (GKE, EKS, AKS, OpenShift, Tanzu, etc.).

  • Build reliability into your data pipeline at scale to debug data issues.
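Calyptia Core's pipelines are built on Fluent Bit, so the source-to-destination routing described above can be sketched in open-source Fluent Bit's classic config syntax. This is an illustrative sketch only, not configuration generated by Calyptia Core; the file path, hostnames, and the `SPLUNK_TOKEN` variable are placeholders:

```
# Tail application logs and tag them for routing
[INPUT]
    Name   tail
    Path   /var/log/app/*.log
    Tag    app.logs

# Route matching records to Elasticsearch
[OUTPUT]
    Name   es
    Match  app.*
    Host   elasticsearch.internal
    Port   9200

# Route the same records to Splunk
[OUTPUT]
    Name   splunk
    Match  app.*
    Host   splunk.internal
    Port   8088
    Splunk_Token ${SPLUNK_TOKEN}
```

Because both outputs match the `app.*` tag, each record is delivered to both destinations.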

Why Calyptia Core?

Observability as a concept is common in the day-to-day life of engineers. But the different data standards, data schemas, storage backends, and dev stacks contribute to tool fatigue, resulting in lower developer productivity and increased total cost of ownership.

Calyptia Core aims to simplify the process of building an observability pipeline. You can also augment the streaming observability data to add custom markers and discard or mask unneeded fields.
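Since Calyptia Core's processing engine builds on Fluent Bit, that kind of augmentation can be sketched with Fluent Bit's stock `modify` filter. The field names here (`environment`, `password`) are hypothetical examples, not part of any real pipeline:

```
# Add a custom marker to every record and drop a sensitive field
[FILTER]
    Name    modify
    Match   *
    Add     environment staging
    Remove  password
```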

Why run Calyptia Core as a Docker Extension?

Docker Extensions help you build and integrate software applications into your daily workflows. With Calyptia Core as a Docker Extension, you now have an easier, faster way to deploy Calyptia Core.

Once the extension is installed and started, you’ll have a running Calyptia Core instance. This allows you to easily define and manage your observability pipelines and concentrate on what matters most — discovering actionable insights from the data.

Getting started with Calyptia Core

Calyptia Core is available in the Docker Extension Marketplace. In the tutorial below, we’ll install Calyptia Core in Docker Desktop, build a data pipeline with mock data, and visualize it with Vivo.

Initial setup

Make sure you’ve installed the latest version of Docker Desktop (v4.8 or later). You’ll also need to enable Kubernetes under the Preferences tab. This starts a single-node Kubernetes cluster when Docker Desktop launches.

Screen capture

Installing the Calyptia Core Docker Extension

Step 1

Open Docker Desktop and click “Add Extensions” under Extensions to go to the Docker Extension Marketplace.

screen capture

Step 2

Install the Calyptia Core Docker Extension.

Extensions marketplace screen capture

By clicking on the details, you can see what containers or binaries are pulled during installation.

extension installation screen capture

Step 3

Once the extension is installed, you’re ready to deploy Calyptia Core! Select “Deploy Core” and you’ll be asked to log in and authenticate the token for the Docker Extension.

welcome screen

In your browser, you’ll see a message asking to confirm the device.

device confirmation screen
all set screen

Step 4

After confirming, Calyptia Core will be deployed. You can now select “Manage Core” to build, configure, and manage your data pipelines.

Core desktop extension screen

You’ll be taken to, where you can build your custom observability data pipelines from a host of source and destination connectors.


Step 5

In this tutorial, let’s create a new pipeline and set docker-extension as the name.


Add “Mock Data” as a source and “Vivo” as the destination.

NOTE: Vivo is a real time data viewer embedded in the Calyptia Core Docker Extension. You can make changes to the data pipelines like adding new fields or connectors and view the streaming observability data from Vivo in the Docker Extension.
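Conceptually, the “Mock Data” source is analogous to open-source Fluent Bit’s `dummy` input plugin, which emits a fixed record at a regular interval. A rough open-source equivalent of this pipeline, with `stdout` standing in for Vivo (Vivo itself is specific to Calyptia), might look like:

```
# Emit a mock record once per second
[INPUT]
    Name   dummy
    Dummy  {"message": "mock event"}

# Print each record to standard output
[OUTPUT]
    Name   stdout
    Match  *
```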


Step 6

Hit “Save & Deploy” to create the pipeline in the Docker Desktop environment.


With the Vivo Live Data Viewer, you can view the data without leaving Docker Desktop.



The Calyptia Core Docker Extension makes it simple to manage and deploy observability pipelines without leaving the Docker Desktop developer environment. And that’s just the beginning. You can also use automated logging in Calyptia Core for automated data collection from your Kubernetes pods and use metadata to apply processing rules before the data is delivered to the chosen destination.
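In open-source Fluent Bit terms, that automated Kubernetes collection and metadata enrichment is roughly what the `tail` input and `kubernetes` filter provide. This is a sketch of the general pattern, not the exact configuration Calyptia Core generates:

```
# Tail container logs written by the kubelet
[INPUT]
    Name  tail
    Path  /var/log/containers/*.log
    Tag   kube.*

# Enrich each record with pod metadata (namespace, labels, annotations)
[FILTER]
    Name   kubernetes
    Match  kube.*
```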

Give the Calyptia Core Docker Extension a try, and let us know what you think at
