Webhooks to Delta Lake

This page provides instructions for extracting data from webhooks and loading it into Delta Lake. (If this manual process sounds onerous, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)

What are webhooks?

A webhook is a way for one application to provide other applications with real-time information. Webhooks deliver data through user-defined HTTP POST callbacks, which means an application that uses webhooks can POST data to a specified endpoint (web address) whenever an event occurs.

What is Delta Lake?

Delta Lake is an open source storage layer that sits on top of existing data lake file storage, such as AWS S3, Azure Data Lake Storage, or HDFS. It uses versioned Apache Parquet files to store data and a transaction log to keep track of commits, which provides capabilities like ACID transactions, data versioning, and audit history.

Getting data out of webhooks

Different applications have different ways to set up webhooks. Often, you can use a management console to define the webhook and specify the endpoint to which you want data delivered. You must make sure that the specified endpoint exists on your server.
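As a minimal sketch of what that endpoint might look like, here's a small Python receiver built with Flask. The route path, port, and staging directory are assumptions for illustration; it accepts each POST, parses the JSON body, and writes the raw payload to a staging directory that a downstream job can pick up.

```python
# A minimal webhook receiver sketch, assuming Python and Flask.
# The route path, port, and STAGING_DIR are hypothetical; adjust to your setup.
import json
import os
from datetime import datetime, timezone

from flask import Flask, request

app = Flask(__name__)
STAGING_DIR = "webhook-staging"  # later read by a loading job

@app.route("/webhooks/events", methods=["POST"])
def receive_event():
    event = request.get_json(force=True)  # parse the JSON body of the POST
    os.makedirs(STAGING_DIR, exist_ok=True)
    # Write each delivery to its own timestamped file for downstream loading
    filename = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%f") + ".json"
    with open(os.path.join(STAGING_DIR, filename), "w") as f:
        json.dump(event, f)
    return "", 204  # acknowledge quickly so the sender doesn't retry

if __name__ == "__main__":
    app.run(port=8000)
```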

What does webhook data look like?

Webhooks post data to your specified endpoints in JSON format. It's up to you to parse the JSON objects and decide how to load them into your data warehouse.
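Every application defines its own payload, so the fields you receive will vary. As a purely hypothetical example, an order event might arrive as nested JSON that your code then has to parse and flatten:

```python
import json

# A hypothetical webhook payload; real field names vary by application.
raw = '''{
  "event": "order.created",
  "created_at": "2021-06-01T12:00:00Z",
  "data": {"order_id": 1234, "total": 99.95, "currency": "USD"}
}'''

event = json.loads(raw)
print(event["event"], event["data"]["order_id"])  # order.created 1234
```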

Loading data into Delta Lake on Databricks

To create a Delta table, you can use existing Apache Spark SQL code and change the format from parquet, csv, or json to delta. Once you have a Delta table, you can write data into it using Apache Spark's Structured Streaming API. The Delta Lake transaction log guarantees exactly-once processing, even when there are other streams or batch queries running concurrently against the table. By default, streams run in append mode, which adds new records to the table. Databricks provides quickstart documentation that explains the whole process.
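Here's a rough sketch of both steps in PySpark, assuming the staged JSON files from the endpoint above land in a path Spark can read; the paths and checkpoint location are assumptions, not fixed conventions.

```python
from pyspark.sql import SparkSession

# On Databricks a `spark` session already exists; this line is for running elsewhere.
spark = SparkSession.builder.appName("webhooks-to-delta").getOrCreate()

# Batch load: read the staged JSON and write it out in Delta format.
events = spark.read.json("/mnt/webhook-staging/")
events.write.format("delta").mode("append").save("/delta/webhook_events")

# Streaming load: continuously append new files as they arrive.
# Streaming JSON sources require an explicit schema; reuse the batch schema here.
stream = (
    spark.readStream
    .schema(events.schema)
    .json("/mnt/webhook-staging/")
)
(
    stream.writeStream
    .format("delta")
    .outputMode("append")  # the default: new records are appended to the table
    .option("checkpointLocation", "/delta/_checkpoints/webhook_events")
    .start("/delta/webhook_events")
)
```

The checkpoint location is what lets the Delta Lake transaction log deliver the exactly-once guarantee mentioned above: on restart, the stream resumes from its last committed offset instead of reprocessing files.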

Keeping data from webhooks up to date

Once you've set up the webhooks you want and have begun collecting data, you can relax, as long as everything continues to work correctly. You still have to keep an eye on any changes the source applications make to the data they deliver, and watch out for cases where your script doesn't recognize a new data type. And since you're responsible for maintaining the script, you'll have to modify it every time your users want slightly different information.

Easier and faster alternatives

If all this sounds a bit overwhelming, don’t be alarmed. If you have all the skills necessary to go through this process, chances are building and maintaining a script like this isn’t a very high-leverage use of your time.

Thankfully, products like Stitch were built to move data from Webhooks to Delta Lake automatically. With just a few clicks, Stitch starts extracting your Webhooks data, structuring it in a way that's optimized for analysis, and inserting that data into your Delta Lake data warehouse.