
Using REST APIs to Build Pipelines & Stream Data with Postman

Hubert Dulay
Decodable

In this blog, we build a streaming data pipeline with Decodable's REST APIs, using Postman to invoke the APIs from a GUI, and then use Postman to send streaming data into the new pipeline via Decodable's HTTP connector.

We'll also go over how to set up Postman with the necessary authorization and variables, all within a collection: a set of saved requests that can be loaded and shared with others.

Let’s get started.

Use Cases

CI/CD

Decodable's REST APIs make it easy to incorporate pipeline deployment into your CI/CD workflow using an automation server like Jenkins. You can build entire streaming data pipelines and promote them through your environments as a single unit of work. Unit testing is also easy with the REST API. You can even use an exported Postman collection as your deployment mechanism via Newman, a command-line Collection Runner for Postman: it runs and tests a Postman collection directly from the command line and is designed for integration into your CI/CD pipeline.
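For example, assuming Newman is installed from npm, a run against the collection used later in this post might look like this (--env-var supplies values for the collection's {{decodable-account}} and {{token}} variables at runtime):

npm install -g newman
newman run decodable-apis.postman_collection.json \
  --env-var "decodable-account=<your-account>" \
  --env-var "token=<your-access-token>"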

Onboarding Customers

Sometimes when you onboard a customer, they may need a dedicated data pipeline to consume your data. You can build a complete streaming data pipeline as a single unit of work, with the data and scale required to serve that customer.

Step 1: Token

First you’ll need a Decodable token to authenticate to the Decodable service. Start by installing the Decodable CLI. We will be using the OAuth2 token that the CLI uses for testing the REST APIs. The commands below first install yq, a YAML parser, and then parse the auth file in ~/.decodable that the Decodable CLI created. Installing yq is optional, but if you would like to capture this information from a script, yq can help.
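A sketch of those commands, assuming Homebrew is available and assuming the token lives under a tokens entry in the auth file (inspect the file to confirm the exact key path for your CLI version):

# Install yq, a command-line YAML parser (optional)
brew install yq

# Inspect the auth file the Decodable CLI created, then extract the token.
# The key path below is an assumption; adjust it to match your file.
cat ~/.decodable/auth
yq '.tokens.default.access_token' ~/.decodable/auth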

Make sure you’re able to log in using the Decodable CLI and can execute at least one command before retrieving the access token.
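For example:

decodable login
decodable stream list

(If your CLI version names these commands differently, any read-only command that hits the API will confirm your session works.)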

Save this token to populate the Postman variables later. You will also need to save the Decodable account name you created when you set up the CLI.

For production applications, you must retrieve an OAuth2 token directly via the standard APIs. This will allow you to request the appropriate scopes as well as receive a refresh token so you can renew your auth token on an ongoing basis. The OAuth2 URLs can be requested at support@decodable.co. For testing purposes, we’ll use the token in the auth file.

Step 2: GitHub

Next, clone the git repository using the command below. The repository contains an exported collection of Decodable REST API calls that we will execute to build a full streaming data pipeline.
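The clone step, with placeholders standing in for the repository link in this post:

git clone <demo-repository-url>   # placeholder: use the repository linked in this post
cd <repository-directory>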

Find the file decodable-apis.postman_collection.json. You will import this collection file into Postman.

Step 3: Import the Collection Into Postman

Open Postman and create a new workspace.

Then import the collection by clicking Import and selecting the JSON file decodable-apis.postman_collection.json.

You will see decodable-apis appear in the navigation pane. Expand the folder to inspect the REST calls.

Step 4: Update Postman Variables

There are two variables you will need to provide: the token and the Decodable account name you obtained in step 1. Click on decodable-apis to view the settings for this collection. Under Authorization, notice that the Type is preset to OAuth 2.0.

Next, click on the Variables tab. There are two variables to set: decodable-account and token. Under CURRENT VALUE, set the values to your Decodable account name and the token you obtained in step 1, respectively. Then click Save.

Step 5: Create & List Streams

Click on list streams to view the request. Notice the {{decodable-account}} variable in the URL; Postman will populate it with the value defined in the collection before submitting the request. Click Send to submit the request and view the response.
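For reference, the equivalent request outside Postman looks roughly like the curl call below. The base URL and the v1alpha2 version segment are assumptions here; copy the exact URL from the collection's request.

TOKEN=<your-access-token>
ACCOUNT=<your-decodable-account>

# List the streams in the account
curl -H "Authorization: Bearer $TOKEN" \
  "https://$ACCOUNT.api.decodable.co/v1alpha2/streams"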

Step 6: Assemble Pipeline

Execute all of the Create requests to create each component of the pipeline. You will need to save the stream IDs for demo_day_stream and demo_day_stream_initcap and paste them into the stream_id field in the connection creation requests.

The other variables listed are for the REST API source connection and S3 sink connection.

connection-id

Set this variable to the ID that Decodable assigns to the connection. In the response of the create REST API source connection call, the value is located under properties.endpoint. For example, the connection-id might be 7c1d39af.
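As a rough sketch of where to look in the response (fields abbreviated, and the exact endpoint format is an assumption; trust the actual response body):

{
  "properties": {
    "endpoint": "https://<account>.api.decodable.co/v1alpha2/connections/7c1d39af/events"
  }
}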

s3-*

All the variables with the prefix s3- configure the S3 connections. For details, see Decodable's documentation for the S3 connection.

Step 7: Post a Message

In the Post Messages request, set the {{connection-id}} variable to the connection ID from step 6 to post messages to the REST API source connection. Messages sent to the REST API can be batched together in a single request.
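Outside Postman, a batched post might look roughly like this; the endpoint comes from properties.endpoint in step 6, while the events wrapper and the sample field names are assumptions for illustration:

CONNECTION_ID=<connection-id-from-step-6>

# Post a batch of two messages in a single request
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  "https://$ACCOUNT.api.decodable.co/v1alpha2/connections/$CONNECTION_ID/events" \
  -d '{"events": [{"text": "hello"}, {"text": "world"}]}'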

Click Send and monitor your S3 bucket for messages to arrive.

Summary

Postman is an easy way to build automation around Decodable’s REST APIs. For help with projects or if you intend to build an application that uses these APIs, please contact support@decodable.co.

Watch the video of this demo here.


You can get started with Decodable for free: our developer account includes enough for you to build a useful pipeline and, unlike a trial, it never expires.

Learn more:

Join the community Slack


Tags
Demo
Webinar

Start using Decodable today.