
Cribl: Send Slack Notifications

What

This is the first in a series of posts where I want to show some private and/or "Smart Home" use cases that involve Cribl.

Cribl (Stream) is a Log/Data Streaming Platform that allows you to route, enrich, reduce (and more) the data that you send through it. Check cribl.io, their Docs, Courses and Community for more details. 

I got to know Cribl during my day-to-day job as a Big Data Engineer and started using it for some private stuff as well.

  1. Because of its flexibility (countless sources and destinations and even more ways to interact with the data in between)
  2. Because they have a free SaaS tier: you get an account on their Cloud Platform and can send up to 1TB per day to your instance. It also includes access to Cribl Edge and Cribl Search, two different but related Cribl Products.
    1. Yes, free.
  3. Not because Cribl pays me for these posts. They don't (yet). 
Update: Here's the second Cribl-related post, about sending data to Google Sheets.

What Exactly

I had some use cases where I wanted to be notified about certain situations. As there is no "send to your mobile" integration (yet) in Cribl, I set up a Webhook Destination that is able to send a message to a configured Slack channel.
Having Slack on your mobile with notifications enabled for this channel lets you be notified right when the situation you monitor triggers.
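
To make the message format concrete: a Slack Incoming Webhook simply expects an HTTP POST with a JSON body containing a "text" field. Here is a minimal sketch of what effectively goes over the wire (outside of Cribl, with a placeholder URL); the Webhook Destination we configure below does this POST for us:

```typescript
// Minimal sketch of a Slack Incoming Webhook call (placeholder URL).
// Slack expects a JSON body with a "text" field -- exactly the field
// our Cribl Pipeline will create later on.
const webhookUrl = "https://hooks.slack.com/services/T000/B000/XXXX"; // placeholder

async function notify(text: string): Promise<void> {
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  if (!res.ok) throw new Error(`Slack webhook returned ${res.status}`);
}

notify("booking for next season is open!").catch(console.error);
```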

To give you a real-life example: we wanted to book a camping site that is usually sold out pretty quickly (at least the good spots at the lake), and the issue is that they open the booking for the next season on a random day. First come, first served.

You could click refresh many times a day for weeks - or you combine Cribl and Slack.

Fortunately, the booking page was a pretty simple HTML site, so I was able to set up a REST Collector in Cribl (more on that in a later post) to fetch it. From the data pulled in by this Collector I could parse what's written on the booking page. It said (in German, and at least something like this): "booking for next season opens soon".

So, this Collector was running every hour. A Pipeline was dropping all events, all the time, except when an event did not include that string ("booking for next season opens soon"), because its absence most probably meant that booking was open (it was).
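
To make that logic explicit, here is a sketch of the keep/drop condition. In Cribl this would live in a Drop function whose filter is a plain JavaScript expression over the event (that the page content lands in _raw is an assumption about the Collector setup); it is mirrored here as a standalone function:

```typescript
// Sketch of the keep/drop decision (mirrors a Cribl Drop function filter).
// Assumption: the raw HTML of the booking page lands in the event's _raw field.
function shouldKeep(raw: string): boolean {
  // Drop while the page still announces the opening; once the string
  // disappears, keep the event -- booking is most probably open.
  return !raw.includes("booking for next season opens soon");
}
```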

This "event" (Data from the Collector run that was not dropped) was shaped by a pipeline and finally sent to a Webhook that was configured to send it to my Slack Chanel. I saw the notification on my mobile and started my book-the-spot-at-the-lake Pipeline (well, this is still a pretty manual process)... 

More use cases that I actually solved with this setup:
  • Summary of daily energy stats about my PV system and grid consumption.
  • Be notified when a book is due at the library
  • Follow Covid metrics of my city/region
  • Follow the progress of an IndieGoGo campaign

Think IFTTT, but more flexible (though a little less click-and-play).

How

The Collector part will be explained in another post. This one is about the configuration in Slack and for the Cribl Webhook Destination.

  • Configure an Incoming Webhook for your Slack channel. The description in Slack's documentation is straightforward, so I won't try to do it better.
  • Configure a Cribl Webhook Destination with the URL received from step 1. Take the settings as inspiration, not the final truth; the sketch above shows what effectively goes over the wire.
  • Configure a Pipeline in Cribl. The details depend on your data, but you basically have to parse the relevant values from your events and, at the end, create a field called "text". This will end up as the message in your channel.
    • You can also format the message and do more fancy stuff. 

      Here is an example where I assign two values that I parsed in earlier steps of the Pipeline to a new field called "text" (sketched below). I am not sure whether you have to, but I throw away all the other fields besides "text" in "Remove Fields".
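
A rough sketch of that Eval step in code form (the field names pv_kwh and grid_kwh are made up, stand-ins for the two parsed values; in Cribl, the value of the "text" field is a JavaScript expression such as a template literal):

```typescript
// Rough sketch of the Eval step; pv_kwh and grid_kwh are made-up names
// for two values parsed earlier in the Pipeline. In Cribl, the "text"
// value is a JavaScript expression (e.g. a template literal), and
// "Remove Fields" then throws away everything else.
interface ParsedEvent { pv_kwh: number; grid_kwh: number; }

function toSlackEvent(e: ParsedEvent): { text: string } {
  return {
    text: `PV production: ${e.pv_kwh} kWh, grid consumption: ${e.grid_kwh} kWh`,
  };
}
```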

  • Configure a Data Route in Cribl that connects your Data Source with the Slack Webhook Destination, routing it via the Pipeline.

    • The Filter of the Route depends on your Source. If everything comes from a REST Collector, as in my example, you can simply use its Input ID as the filter.
    • The Pipeline is the one created earlier. 
      • It could also be set up as a Post-Processing Pipeline of the Slack Webhook, so everything sent to this Webhook goes through this pipe. Whether that makes sense depends on your use case and setup.
    • The Destination is the Slack Webhook 
    • Here is an example of a Route with a filter that matches a specific string in the Input ID (because I was too lazy to look up the whole name of the Collector); see the sketch below.
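
A hedged sketch of such a filter: in Cribl, the Route filter is a JavaScript expression evaluated against each event, and the internal __inputId field identifies the source; "camping" is a made-up fragment of my Collector's name:

```typescript
// Route filter sketch. In Cribl you would type the bare expression
//   __inputId.includes('camping')
// into the Route's Filter field; here it is mirrored as a function.
// "camping" is a made-up fragment of the Collector's name.
function routeMatches(event: { __inputId: string }): boolean {
  return event.__inputId.includes("camping");
}
```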

Example of a Cribl-created message in Slack (Covid metrics).





Conclusion

  • With this Webhook Destination you can use Cribl to notify yourself via Slack about certain situations/events. 
  • It is not meant to be a data store, just a free and (once set up) simple way to get notifications to the client of your choice (at least when Slack runs on it).
  • Yes, everything described here could be done with a simple script in the crontab of a Raspberry Pi. But once you have the Destination set up in Cribl, you can quickly set it up (and/or clone it) for various use cases and control it from the mighty CLOUD.
  • Feel free to comment with how you utilized this setup.
  • For me personally, the outcome was a good spot at the lake/sea :)

