External Data Provider - Share Workflow Data with External Systems

In modern trading and data analysis workflows, you often need to share processed data with external systems, trading bots, or third-party applications. The External Data Provider worker enables you to expose your workflow outputs through secure REST API endpoints, making your data accessible via simple HTTP GET requests.

This guide covers the basics of setting up data sharing endpoints and connecting your workflows to external consumers.

How External Data Provider Works

Data Provision Pipeline

  • Data Storage: Stores workflow output data in memory with unique provider IDs
  • Endpoint Generation: Creates REST API endpoints automatically
  • Access Control: Provides secure URLs for data retrieval
  • Response Formatting: Supports JSON and plain text responses
  • Metadata Inclusion: Adds timestamps and context information
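
The pipeline above can be sketched as a minimal in-memory store. The class and method names here are illustrative only, not the worker's actual implementation:

```python
import secrets
import time

class DataProviderStore:
    """Minimal sketch of an in-memory provider store (hypothetical names)."""

    def __init__(self):
        self._store = {}

    def provide(self, data, response_type="JSON"):
        # Data Storage: keep the payload in memory under a unique provider ID
        provider_id = str(secrets.randbelow(10**16))
        self._store[provider_id] = {
            "data": data,
            "response_type": response_type,
            "provided_at": time.time(),  # Metadata Inclusion
        }
        # Endpoint Generation: the ID doubles as the URL path segment
        return provider_id

    def fetch(self, provider_id):
        # Access Control: unknown IDs are rejected
        if provider_id not in self._store:
            raise KeyError("unknown provider_id")
        return self._store[provider_id]
```

The actual worker layers a REST endpoint over this kind of store; the sketch only shows why a valid provider ID is both the storage key and the access credential.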

Key Features

  • Automatic Endpoint Creation: No manual API development required
  • Unique Provider IDs: Each data share gets a secure, unique identifier
  • Flexible Response Types: JSON for structured data, text for simple values
  • Temporary Storage: Data persists for the workflow session
  • Secure Access: Endpoints require valid provider IDs

Basic Setup and Usage

Step 1: Add External Data Provider to Your Workflow

Canvas Integration

  • Drag the External Data Provider worker onto your workflow canvas
  • Connect it to any worker that produces data you want to share
  • Configure the response type (JSON or text) based on your data format

Configuration Options

  • response_type: Choose "JSON" for structured data or "text" for plain text
  • data: Connect the output from your data-producing worker
  • endpoint_id: Automatically generated unique identifier
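
Taken together, a configured worker might look like the following fragment. The field names mirror the options above, but the connected-input syntax ({{...}}) and the worker name are illustrative, not the tool's exact configuration format:

```json
{
  "worker": "external_data_provider",
  "response_type": "JSON",
  "data": "{{price_fetcher.output}}",
  "endpoint_id": "(auto-generated)"
}
```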

Step 2: Generate Your API Endpoint

Worker Execution

  • Run your workflow to generate the data
  • The External Data Provider creates a unique provider_id
  • Receive the endpoint URL in the worker response

Example Worker Response

{
  "endpoint": "https://api.apudflow.io/api/w/2511351432345546/3167507321031740",
  "provider_id": "3167507321031740",
  "response_type": "JSON",
  "status": "success",
  "message": "Data provided successfully. External systems can GET from: https://api.apudflow.io/api/w/2511351432345546/3167507321031740",
  "provided_at": 1766781515.75263
}
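
Once the worker returns, the endpoint URL can be pulled out of the response programmatically. This sketch parses the example response above with the standard library:

```python
import json

# Example worker response, as shown above (message shortened)
worker_response = '''{
  "endpoint": "https://api.apudflow.io/api/w/2511351432345546/3167507321031740",
  "provider_id": "3167507321031740",
  "response_type": "JSON",
  "status": "success",
  "message": "Data provided successfully.",
  "provided_at": 1766781515.75263
}'''

info = json.loads(worker_response)
if info["status"] == "success":
    endpoint_url = info["endpoint"]  # hand this URL to external consumers
```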

Step 3: Access Your Data Externally

API Endpoint Usage

  • Use the provided endpoint URL for GET requests
  • No additional authentication is required; the unique provider ID embedded in the URL acts as the access credential
  • Endpoints return data in the configured format

Example API Response

{
  "data": {
    "symbol": "AAPL",
    "price": 195.50,
    "volume": 45230000,
    "timestamp": "2025-12-26T10:30:00Z"
  },
  "metadata": {
    "provided_at": 1766781515.75263,
    "endpoint_id": "3167507321031740",
    "workflow_id": "2511351432345546",
    "type": "provider",
    "response_type": "JSON"
  },
  "status": "success"
}
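
The provided_at field in the metadata is a Unix epoch timestamp. Converting it to a readable UTC time takes one standard-library call:

```python
from datetime import datetime, timezone

provided_at = 1766781515.75263  # from the metadata block above
ts = datetime.fromtimestamp(provided_at, tz=timezone.utc)
print(ts.isoformat())
```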

Simple Trading Integration Example

Price Alert System

Workflow Setup

  1. Add a price fetching worker (Twelve Data, Yahoo Finance, etc.)
  2. Connect to External Data Provider
  3. Configure for JSON response

External System Integration

import requests

# Get the endpoint from your workflow
endpoint_url = "https://api.apudflow.io/api/w/2511351432345546/3167507321031740"

# Fetch data from your workflow
response = requests.get(endpoint_url)
response.raise_for_status()  # fail fast on HTTP errors
data = response.json()

current_price = data['data']['price']
if current_price > 200:
    print(f"Price alert: {current_price}")
    # Send notification, execute trade, etc.

This basic setup allows external trading systems to monitor your workflow data, enabling automated trading decisions based on your analysis.
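
A minimal monitoring loop for the alert above might look like this. The threshold and polling interval are placeholders, check_alert is a hypothetical helper (kept separate so the decision logic stays testable), and only the standard library is used so the sketch runs without extra dependencies:

```python
import json
import time
from urllib.request import urlopen

ENDPOINT = "https://api.apudflow.io/api/w/2511351432345546/3167507321031740"

def check_alert(payload, threshold=200):
    """Return the price when it crosses the threshold, else None."""
    price = payload["data"]["price"]
    return price if price > threshold else None

def poll(interval=60):
    """Fetch the endpoint on a fixed interval and report alerts."""
    while True:
        with urlopen(ENDPOINT, timeout=10) as resp:
            payload = json.loads(resp.read())
        price = check_alert(payload)
        if price is not None:
            print(f"Price alert: {price}")
            # Send notification, execute trade, etc.
        time.sleep(interval)
```

Remember that the shared data only changes when the workflow re-runs, so very short polling intervals add load without fresher data.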

Best Practices

  • Use Appropriate Response Types: JSON for complex data structures, text for simple values
  • Handle Errors Gracefully: Check response status and handle missing data
  • Monitor Endpoint Usage: Track how often your data is accessed
  • Plan Data Updates: Re-run workflows to refresh shared data
  • Secure Sensitive Data: Avoid sharing confidential information through public endpoints
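
Following the error-handling advice above, here is a hedged sketch of defensive extraction. The field names match the example responses earlier in this guide, but every key is treated as potentially absent:

```python
def extract_price(payload):
    """Return the price from an endpoint payload, or None if anything is missing."""
    if not isinstance(payload, dict) or payload.get("status") != "success":
        return None
    data = payload.get("data") or {}
    price = data.get("price")
    return price if isinstance(price, (int, float)) else None
```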

Next Steps

Once you understand the basics, explore advanced integrations like webhook connections and multi-system data sharing in our advanced guides.