Streaming External Events into Microsoft Fabric Eventhouse Using Custom Endpoints

A practical guide to ingesting IoT and application events into Microsoft Fabric using Eventstream and Eventhouse

Real-time analytics has become a foundational capability for modern data platforms. Whether you are processing IoT telemetry, application events, or operational metrics, the ability to ingest and query data with low latency directly impacts business responsiveness.

Microsoft Fabric addresses this requirement by combining Eventstream and Eventhouse, enabling scalable, near–real-time analytics without the operational complexity of traditional streaming architectures.

In this article, we walk through a practical, end-to-end pattern for streaming external events into Microsoft Fabric Eventhouse using Eventstream Custom Endpoints — a common scenario when working with proprietary systems or custom applications that do not natively integrate with Azure Event Hubs.

Why Use Eventstream Custom Endpoints?

Eventstream Custom Endpoints are ideal when:

  • You control the event producer (custom apps, services, IoT gateways)
  • Native Azure Event Hubs integration is not available
  • You want a lightweight SDK-based ingestion option
  • You need near–real-time analytics using KQL

This pattern cleanly decouples event producers from analytics consumers while preserving scalability and low latency.

Scenario Overview

In this example, we simulate a simple IoT ingestion scenario:

  • A workstation gathers telemetry from multiple IoT devices
  • Events are sent to Microsoft Fabric via an Eventstream Custom Endpoint
  • Eventstream forwards events to Eventhouse
  • Data becomes queryable within seconds using KQL

Although the data is synthetic, the architecture closely mirrors real-world production deployments.

High-Level Architecture

The ingestion pipeline follows these steps:

  1. An external application produces JSON-formatted events
  2. Events are sent to an Eventstream Custom Endpoint
  3. Eventstream validates and optionally transforms the data
  4. Events are routed to Eventhouse
  5. Data is queried in near real time using KQL

Setting Up Microsoft Fabric Resources

1. Create a Workspace

Create a workspace in Microsoft Fabric (for example, wsIOT) to logically group our resources.

2. Create an Eventhouse

Inside the workspace, create an Eventhouse.
Fabric automatically provisions a KQL database with the same name (for example, ehIOTDemo).
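
Once the Eventhouse exists, you can open the KQL database and run a quick sanity check. A minimal sketch (the database is empty until a destination table is created):

// List the tables in the newly provisioned KQL database
.show tables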

3. Create an Eventstream

Create an Eventstream and configure it (in Edit mode):

  • Source: Custom Endpoint (the entry point external producers will send to)
  • Destination: Your Eventhouse
  • Input data format: JSON

This Eventstream acts as the ingestion gateway for external producers.

Designing the Event Payload

Each event represents a single IoT measurement. Events are sent as JSON, which works naturally with Eventhouse’s schema-on-ingest capabilities.

Example Event Payload

{
  "batch_id": "06970e42-d8db-78d5-8000-f8c3ff46a7c7",
  "city_code": "PAR",
  "location_id": 17,
  "device_id": "DEV-17-47",
  "utc_time": "2026-01-21T14:35:25.553844+00:00",
  "measured_value": 98.51
}
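
When you wire up the Eventstream destination, Fabric can create the destination table and derive its schema from sample events; alternatively, you can pre-create the table in the KQL database yourself. A sketch of a matching table definition, using the hypothetical table name SensorEvents:

// Hypothetical destination table matching the payload above
.create table SensorEvents (
    batch_id: string,
    city_code: string,
    location_id: int,
    device_id: string,
    utc_time: datetime,
    measured_value: real
)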

Simulating Event Producers with Python

To simulate the end-to-end process, we use a Python script. The script consists of two main functions:


1. Reading sensor data (read_from_sensors): Generates hypothetical IoT sensor data to simulate real device measurements.

2. Sending data to Fabric (send_event_batch): Sends event data to the Fabric Eventstream using the azure-eventhub package, which can be installed with pip install azure-eventhub.

To connect to Microsoft Fabric, we use a connection string obtained from the SAS Key Authentication panel in the Eventstream properties. Copy the Connection string value from that panel into the CONNECTION_STR constant in the script below.

import asyncio
import json
import random
from datetime import datetime, timezone

from azure.eventhub import EventData
from azure.eventhub.aio import EventHubProducerClient
from uuid_extensions import uuid7str  # third-party uuid7 package

cities = ["PAR", "LON", "BER"]
CONNECTION_STR = "YOUR CONNECTION STRING HERE"
BATCH_SIZE = 100


# Simulate reading data from IoT sensors
def read_from_sensors(batch_size=100, batch_id=""):
    batch = []
    for _ in range(batch_size):
        location_id = random.randint(1, 20)
        data_point = {
            "batch_id": batch_id,
            "city_code": random.choice(cities),
            "location_id": location_id,
            "device_id": f"DEV-{location_id}-{random.randint(1, 50)}",
            "utc_time": datetime.now(timezone.utc).isoformat(),
            "measured_value": round(random.uniform(70, 100), 2),
        }
        batch.append(data_point)
    return batch


# Send a batch of events to the Eventstream Custom Endpoint
async def send_event_batch(event_batch):
    # A new producer per batch keeps the sample simple; in production you
    # would typically create one client and reuse it across batches.
    producer = EventHubProducerClient.from_connection_string(CONNECTION_STR)
    async with producer:
        event_data_batch = await producer.create_batch()
        for data in event_batch:
            event_data_batch.add(EventData(json.dumps(data)))
        await producer.send_batch(event_data_batch)
    print(f"Sent batch of {len(event_batch)} events.")


# Main processing loop
async def process():
    batch_id = uuid7str()  # One time-ordered ID shared by every batch in this run
    print("Starting reading data. Ctrl+C to stop.")
    # Run a limited number of batches to avoid cost for the sample
    for _ in range(1000):
        data_batch = read_from_sensors(BATCH_SIZE, batch_id)
        await send_event_batch(data_batch)


if __name__ == "__main__":
    print("Sending events to Fabric Eventstream...")
    try:
        asyncio.run(process())
    except KeyboardInterrupt:
        print("\nStopped.")
    print("Done!")

Monitoring and Querying Data

Monitoring in Eventstream

While the script is running, incoming events can be observed directly in the Eventstream UI.

Screenshot: monitoring incoming data in Eventstream.
Screenshot: monitoring incoming data in Eventhouse.

Querying in Eventhouse

Once ingested, data becomes available for querying within seconds using KQL. Below is a basic KQL query for inspecting the ingested data.
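
A minimal sketch, assuming the destination table was named SensorEvents when the Eventhouse destination was configured (the table name is your choice, not something Fabric fixes):

// Inspect the most recent events, newest first
SensorEvents
| where utc_time > ago(10m)
| order by utc_time desc
| take 20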

From here, you can:

  • Aggregate metrics
  • Filter by device or location
  • Build real-time dashboards
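
As a sketch of the first two ideas, again assuming the hypothetical SensorEvents table: the query below averages measured values per city in five-minute windows, which is the shape of result a real-time dashboard tile would consume.

// Average measured value per city in 5-minute bins
SensorEvents
| where utc_time > ago(1h)
| summarize avg_value = avg(measured_value), events = count() by city_code, bin(utc_time, 5m)
| order by utc_time asc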

If you are building real-time analytics solutions on Microsoft Fabric, Eventstream Custom Endpoints offer a clean and scalable ingestion pattern — especially for custom or non-Azure-native producers.

If you found this useful, feel free to clap, share, or comment. I plan to publish follow-up articles on schema mapping, performance tuning, and production-grade security patterns in Fabric.

About The Author

Ugur Paca

Senior Software Developer
