From 4337dd2393115e0b8a5ce00c309171703182aa14 Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:28:22 +0200
Subject: [PATCH 1/7] Convert inbound integrations to MDX
---
.../inbound/kafka-connector.textile | 110 -----------------
content/integrations/inbound/webhooks.textile | 89 --------------
.../integrations/inbound/kafka-connector.mdx | 115 ++++++++++++++++++
.../integrations/inbound/webhooks.mdx | 96 +++++++++++++++
4 files changed, 211 insertions(+), 199 deletions(-)
delete mode 100644 content/integrations/inbound/kafka-connector.textile
delete mode 100644 content/integrations/inbound/webhooks.textile
create mode 100644 src/pages/docs/platform/integrations/inbound/kafka-connector.mdx
create mode 100644 src/pages/docs/platform/integrations/inbound/webhooks.mdx
diff --git a/content/integrations/inbound/kafka-connector.textile b/content/integrations/inbound/kafka-connector.textile
deleted file mode 100644
index c55895d1e1..0000000000
--- a/content/integrations/inbound/kafka-connector.textile
+++ /dev/null
@@ -1,110 +0,0 @@
----
-title: Ably Kafka Connector
-meta_description: "The Ably Kafka Connector sends data from Kafka to an Ably channel in realtime."
-meta_keywords: "Kafka, Kafka Connector, channel"
-languages:
- - none
-redirect_from:
- - /docs/general/kafka-connector
----
-
-The Ably Kafka Connector integrates "Kafka":https://kafka.apache.org/ with Ably to enable realtime event distribution from Kafka to web, mobile, and IoT clients via Ably's channels.
-
-The connector is Confluent "Gold":https://www.confluent.io/hub/ably/kafka-connect-ably verified, ensuring compliance with Confluent's "Verified Integrations Program":https://www.confluent.io/partners/connect/. You can use it to send data from one or more Kafka topics to one or more Ably channels.
-
-
-
-
-
-h2(#install). Install
-
-The Ably Kafka Connector is a sink connector built on top of "Kafka Connect":https://docs.confluent.io/platform/current/connect/index.html#how-kafka-connect-works.
-
-Install the Ably Kafka Connector from:
-
-* "GitHub":https://github.com/ably/kafka-connect-ably to run within your own infrastructure.
-* "Confluent Hub":https://www.confluent.io/hub/ably/kafka-connect-ably to run on the Confluent Platform.
-
-Once installed, configure it with your "Ably API key":/docs/auth#api-keys to enable data from Kafka topics to be published into Ably channels.
-
-h2(#mapping). Mapping
-
-The Ably Kafka Connector supports two mapping methods:
-
-# "Static":#static to assign messages to a fixed Ably channel.
-* "Pattern-based":#pattern to dynamically assign messages based on interpolation of topic and record keys.
-
-h3(#static). Static mapping
-
-Static mapping assigns one or more Kafka topics to a single Ably channel. The channel name stays the same, regardless of the Kafka record.
-
-For example, a sports website streaming live updates can set @channel = basketball@, ensuring all Kafka records, regardless of their Kafka topic, publish to the basketball channel.
-
-The following example maps all Kafka topics to the basketball channel:
-
-```[javascript]
-channel = basketball
-message.name = news_update
-```
-
-h3(#pattern). Pattern-based mapping
-
-Pattern-based mapping dynamically maps multiple Kafka topics to different Ably channels. It provides the ability to adjust configuration by interpolating across record key and record topic values. Each Kafka record determines the target channel. Additionally each message is published to the Ably channel corresponding to its Kafka topic. For example, setting @channel = channel_#{topic}@ routes Kafka messages to a channel matching their topic name.
-
-The following example maps Kafka topics to Ably channels based on the topic name:
-
-```[javascript]
-channel = channel_#{topic}
-message.name = message_#{key}
-```
-
-h3(#mixed). Static and pattern-based mapping
-
-You can use static and pattern-based mapping in conjunction. For example, you can dynamically map the channel while keeping the message name static, ensuring messages are routed to topic-specific channels while maintaining a consistent message name. The following configuration maps Kafka topics to Ably channels based on the topic name, while keeping the message name static:
-
-```[javascript]
-channel = channel_#{topic}
-message.name = single_message
-```
-
-h2(#publish). Publish messages with a schema
-
-The Ably Kafka Connector supports messages that include schema information. It converts these messages to JSON before publishing them to Ably using the "Kafka Connect Schema Registry":https://docs.confluent.io/platform/current/schema-registry/connect.html and supported converters.
-
-For example, if messages on the Kafka topic are serialized using Avro, and schemas are registered in a Schema Registry, configure the connector to convert Avro to JSON.
-
-Set the following properties in your Kafka Connect configuration:
-
-```[text]
-value.converter=io.confluent.connect.avro.AvroConverter
-value.converter.schema.registry.url=https://
-```
-
-This configuration ensures Kafka messages are correctly deserialized and transformed before reaching Ably.
-
-h2(#configure). Configure the Kafka connector
-
-The Ably Kafka Connector sends Kafka messages to Ably channels in realtime. Configuration depends on the installation method used:
-
-|_. Installation method |_. Configuration steps |
-| Docker | Create a @docker-compose-connector.properties@ file inside the @/config@ directory. \n - An example file is already available in the repository. |
-| Single connect worker | - Provide a configuration file as a command-line argument when running the worker. |
-| Distributed connect workers | - Use the Confluent REST API @/connectors@ endpoint. \n - Pass the configuration as JSON. |
-
-h3. Connector configuration properties
-
-You must configure these core properties to get the connector working.
-
-|_. Property |_. Description |
-| @channel@ | The Ably channel to which messages are published. Supports "Dynamic Channel Configuration:":#dynamic-channel-configuration |
-| @client.key@ | An Ably API key used for authentication. Must have *publish* capability for the specified channel. |
-| @client.id@ | The Ably client ID the connector uses. Defaults to @"kafka-connect-ably-example"@. |
-| @name@ | A globally unique name for the connector. Defaults to @"ably-channel-sink"@. |
-| @topics@ | A comma-separated list of Kafka topics to publish from. |
-| @tasks.max@ | The maximum number of tasks the connector should run. Defaults to @1@. |
-| @connector.class@ | The class name for the connector. Must be a subclass of @org.apache.kafka.connect.connector@. Defaults to @io.ably.kafka.connect.ChannelSinkConnector@. |
-
-
diff --git a/content/integrations/inbound/webhooks.textile b/content/integrations/inbound/webhooks.textile
deleted file mode 100644
index a954e2830e..0000000000
--- a/content/integrations/inbound/webhooks.textile
+++ /dev/null
@@ -1,89 +0,0 @@
----
-title: Inbound webhooks
-meta_description: “Incoming webhooks let you integrate external web services with Ably.”
-meta_keywords: “Ably, incoming, inbound, webhooks, webhook configuration, web services, realtime.”
-languages:
- - nodejs
-redirect_from:
- - /docs/general/incoming-webhooks
----
-
-External services can publish messages to Ably channels using the "REST API":/docs/api/rest-api, however, a simpler alternative is to use "incoming webhooks":#configure.
-
-Many web services generate webhooks to communicate with applications. These webhooks are triggered based on interactions with their APIs or infrastructure. When a webhook request is received by Ably, its payload is published to a channel as an "unenveloped":/docs/api/rest-api#unenveloped message.
-
-Ably also supports "outbound webhooks":/docs/integrations/webhooks, which send data from Ably to other external services such as AWS, Google Cloud Platform or Zapier.
-
-
-
-h2(#configure). Configure an incoming webhook
-
-Set up incoming webhooks in the Integrations tab of the "Ably dashboard":https://ably.com/accounts/any/app/any/integrations:
-
-1. Click *Register a new webhook endpoint*.
-2. *Name* your webhook.
-3. *Select an Ably channel* to receive webhook messages.
-4. Click *Generate a URL*.
-5. Copy the generated URL and configure your external service with it.
-
-h3(#test). Test incoming webhook
-
-Run the following Curl command to simulate an incoming webhook request:
-
-```[sh]
-curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key={{API_KEY_NAME}}:{{API_KEY_SECRET}}&enveloped=false' \
- -H 'content-type: application/json' --data '{"some":"json"}'
-```
-
-Incoming webhooks function as REST publishes, meaning they follow the same behavior and functionality as the "REST publish API":/docs/api/rest-api#publish.
-
-Ably responds with the @channel@ and @messageId@:
-
-```[json]
-{
- "channel": "webhook-test",
- "messageId": "20xxxxxxx"
-}
-```
-
-A successful request returns a @201@ status. Failures return with an "@ErrorInfo@":/docs/api/rest-sdk/types#error-info response.
-
-h2(#receive). Receive webhook messages
-
-Incoming webhooks publish messages to an Ably channel. You can "subscribe":/docs/pub-sub#subscribe to these messages using an Ably SDK:
-
-```[javascript]
-const Ably = require("ably");
-
-const ably = new Ably.Realtime('{{API_KEY}}');
-
-const channel = ably.channels.get('webhook-test');
-
-channel.subscribe((message) => {
- console.log(`webhook received: ${JSON.stringify(message.data)}`);
-});
-```
-
-h2(#headers). Optional headers
-
-The request body of incoming webhooks is treated as a message to be published. If the external service allows, you can customize webhook requests by including optional headers and parameters.
-
-The following example demonstrates how to set a message @name@ using the @X-Ably-Name@ header:
-
-```[sh]
-curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key=key:secret&enveloped=false' \
- -H 'content-type: application/json' --data '{"some":"json"}' \
- -H 'X-Ably-Name: webhook-message'
-```
-
-Then, filter messages by name:
-
-```[javascript]
-channel.subscribe('webhook-message', (message) => {
- console.log("webhook: " + JSON.stringify(message.data));
-});
-```
-
-To ensure that publishes are "idempotent":/docs/pub-sub/advanced#idempotency, add a unique @X-Ably-MessageId@ header.
diff --git a/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx b/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx
new file mode 100644
index 0000000000..4b89842ff9
--- /dev/null
+++ b/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx
@@ -0,0 +1,115 @@
+---
+title: Ably Kafka Connector
+meta_description: "The Ably Kafka Connector sends data from Kafka to an Ably channel in realtime."
+meta_keywords: "Kafka, Kafka Connector, channel"
+redirect_from:
+ - /docs/general/kafka-connector
+---
+
+The Ably Kafka Connector integrates [Kafka](https://kafka.apache.org/) with Ably to enable realtime event distribution from Kafka to web, mobile, and IoT clients via Ably's channels.
+
+The connector is Confluent [Gold](https://www.confluent.io/hub/ably/kafka-connect-ably) verified, ensuring compliance with Confluent's [Verified Integrations Program](https://www.confluent.io/partners/connect/). You can use it to send data from one or more Kafka topics to one or more Ably channels.
+
+
+
+## Install
+
+The Ably Kafka Connector is a sink connector built on top of [Kafka Connect](https://docs.confluent.io/platform/current/connect/index.html#how-kafka-connect-works).
+
+Install the Ably Kafka Connector from:
+
+* [GitHub](https://github.com/ably/kafka-connect-ably) to run within your own infrastructure.
+* [Confluent Hub](https://www.confluent.io/hub/ably/kafka-connect-ably) to run on the Confluent Platform.
+
+Once installed, configure it with your [Ably API key](/docs/auth#api-keys) to enable data from Kafka topics to be published into Ably channels.
+
+## Mapping
+
+The Ably Kafka Connector supports two mapping methods:
+
+* [Static](#static) to assign messages to a fixed Ably channel.
+* [Pattern-based](#pattern) to dynamically assign messages based on interpolation of topic and record keys.
+
+### Static mapping
+
+Static mapping assigns one or more Kafka topics to a single Ably channel. The channel name stays the same, regardless of the Kafka record.
+
+For example, a sports website streaming live updates can set `channel = basketball`, ensuring all Kafka records, regardless of their Kafka topic, publish to the basketball channel.
+
+The following example maps all Kafka topics to the basketball channel:
+
+
+```javascript
+channel = basketball
+message.name = news_update
+```
+
+
+### Pattern-based mapping
+
+Pattern-based mapping dynamically routes records from multiple Kafka topics to different Ably channels. It interpolates each record's topic and key into the configured channel and message names, so each Kafka record determines its own target channel. For example, setting `channel = channel_#{topic}` publishes each message to the Ably channel matching its Kafka topic name.
+
+The following example maps Kafka topics to Ably channels based on the topic name:
+
+
+```javascript
+channel = channel_#{topic}
+message.name = message_#{key}
+```
+
+
+### Static and pattern-based mapping
+
+You can use static and pattern-based mapping in conjunction. For example, you can dynamically map the channel while keeping the message name static, ensuring messages are routed to topic-specific channels while maintaining a consistent message name. The following configuration maps Kafka topics to Ably channels based on the topic name, while keeping the message name static:
+
+
+```javascript
+channel = channel_#{topic}
+message.name = single_message
+```
+
+
+## Publish messages with a schema
+
+The Ably Kafka Connector supports messages that include schema information. It converts these messages to JSON before publishing them to Ably using the [Kafka Connect Schema Registry](https://docs.confluent.io/platform/current/schema-registry/connect.html) and supported converters.
+
+For example, if messages on a Kafka topic are serialized using Avro and their schemas are registered in a Schema Registry, configure the connector to convert Avro to JSON.
+
+Set the following properties in your Kafka Connect configuration:
+
+
+```text
+value.converter=io.confluent.connect.avro.AvroConverter
+value.converter.schema.registry.url=https://
+```
+
+
+This configuration ensures Kafka messages are correctly deserialized and transformed before reaching Ably.
+
+## Configure the Kafka connector
+
+The Ably Kafka Connector sends Kafka messages to Ably channels in realtime. Configuration depends on the installation method used:
+
+| Installation method | Configuration steps |
+| ------------------- | ------------------- |
+| Docker | Create a `docker-compose-connector.properties` file inside the `/config` directory. An example file is already available in the repository. |
+| Single connect worker | Provide a configuration file as a command-line argument when running the worker. |
+| Distributed connect workers | Use the Confluent REST API `/connectors` endpoint, passing the configuration as JSON. |
+
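+For distributed workers, the configuration is submitted to the Kafka Connect REST API as JSON. The following is a sketch, assuming a Connect worker listening on `localhost:8083`; the API key, topic, and channel values are placeholders:
+
+```shell
+curl -X POST http://localhost:8083/connectors \
+  -H 'Content-Type: application/json' \
+  --data '{
+    "name": "ably-channel-sink",
+    "config": {
+      "connector.class": "io.ably.kafka.connect.ChannelSinkConnector",
+      "topics": "basketball",
+      "client.key": "<ABLY_API_KEY>",
+      "channel": "basketball"
+    }
+  }'
+```
+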
+### Connector configuration properties
+
+You must configure these core properties to get the connector working.
+
+| Property | Description |
+| -------- | ----------- |
+| `channel` | The Ably channel to which messages are published. Supports [pattern-based mapping](#pattern). |
+| `client.key` | An Ably API key used for authentication. Must have **publish** capability for the specified channel. |
+| `client.id` | The Ably client ID the connector uses. Defaults to `"kafka-connect-ably-example"`. |
+| `name` | A globally unique name for the connector. Defaults to `"ably-channel-sink"`. |
+| `topics` | A comma-separated list of Kafka topics to publish from. |
+| `tasks.max` | The maximum number of tasks the connector should run. Defaults to `1`. |
+| `connector.class` | The class name for the connector. Must be a subclass of `org.apache.kafka.connect.connector.Connector`. Defaults to `io.ably.kafka.connect.ChannelSinkConnector`. |
+
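+Putting the table together, a complete sink configuration might look like the following sketch. The API key is a placeholder, and the topic and channel names are illustrative:
+
+```text
+name=ably-channel-sink
+connector.class=io.ably.kafka.connect.ChannelSinkConnector
+tasks.max=1
+topics=basketball
+client.key=<ABLY_API_KEY>
+client.id=kafka-connect-ably-example
+channel=basketball
+message.name=news_update
+```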
+
diff --git a/src/pages/docs/platform/integrations/inbound/webhooks.mdx b/src/pages/docs/platform/integrations/inbound/webhooks.mdx
new file mode 100644
index 0000000000..37f3245492
--- /dev/null
+++ b/src/pages/docs/platform/integrations/inbound/webhooks.mdx
@@ -0,0 +1,96 @@
+---
+title: Inbound webhooks
+meta_description: "Incoming webhooks let you integrate external web services with Ably."
+meta_keywords: "Ably, incoming, inbound, webhooks, webhook configuration, web services, realtime."
+redirect_from:
+ - /docs/general/incoming-webhooks
+---
+
+External services can publish messages to Ably channels using the [REST API](/docs/api/rest-api). A simpler alternative, however, is to use [incoming webhooks](#configure).
+
+Many web services generate webhooks to communicate with applications. These webhooks are triggered based on interactions with their APIs or infrastructure. When a webhook request is received by Ably, its payload is published to a channel as an [unenveloped](/docs/api/rest-api#unenveloped) message.
+
+Ably also supports [outbound webhooks](/docs/integrations/webhooks), which send data from Ably to other external services such as AWS, Google Cloud Platform or Zapier.
+
+
+
+## Configure an incoming webhook
+
+Set up incoming webhooks in the Integrations tab of the [Ably dashboard](https://ably.com/accounts/any/app/any/integrations):
+
+1. Click **Register a new webhook endpoint**.
+2. **Name** your webhook.
+3. **Select an Ably channel** to receive webhook messages.
+4. Click **Generate a URL**.
+5. Copy the generated URL and configure your external service with it.
+
+### Test an incoming webhook
+
+Run the following curl command to simulate an incoming webhook request:
+
+
+```shell
+curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key={{API_KEY_NAME}}:{{API_KEY_SECRET}}&enveloped=false' \
+ -H 'content-type: application/json' --data '{"some":"json"}'
+```
+
+
+Incoming webhooks function as REST publishes, meaning they follow the same behavior and functionality as the [REST publish API](/docs/api/rest-api#publish).
+
+Ably responds with the `channel` and `messageId`:
+
+
+```json
+{
+ "channel": "webhook-test",
+ "messageId": "20xxxxxxx"
+}
+```
+
+
+A successful request returns a `201` status. Failures return an [`ErrorInfo`](/docs/api/rest-sdk/types#error-info) response.
+
+## Receive webhook messages
+
+Incoming webhooks publish messages to an Ably channel. You can [subscribe](/docs/pub-sub#subscribe) to these messages using an Ably SDK:
+
+
+```javascript
+const Ably = require("ably");
+
+const ably = new Ably.Realtime('{{API_KEY}}');
+const channel = ably.channels.get('webhook-test');
+
+channel.subscribe((message) => {
+ console.log(`webhook received: ${JSON.stringify(message.data)}`);
+});
+```
+
+
+## Optional headers
+
+The request body of incoming webhooks is treated as a message to be published. If the external service allows, you can customize webhook requests by including optional headers and parameters.
+
+The following example demonstrates how to set a message `name` using the `X-Ably-Name` header:
+
+
+```shell
+curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key=key:secret&enveloped=false' \
+ -H 'content-type: application/json' --data '{"some":"json"}' \
+ -H 'X-Ably-Name: webhook-message'
+```
+
+
+Then, filter messages by name:
+
+
+```javascript
+channel.subscribe('webhook-message', (message) => {
+ console.log("webhook: " + JSON.stringify(message.data));
+});
+```
+
+
+To ensure that publishes are [idempotent](/docs/pub-sub/advanced#idempotency), add a unique `X-Ably-MessageId` header.
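+
+For example, the earlier test request can be made idempotent by supplying a stable, unique ID that stays the same across retries. The header value here is an arbitrary placeholder:
+
+```shell
+curl -X POST 'https://rest.ably.io/channels/webhook-test/messages?key=key:secret&enveloped=false' \
+  -H 'content-type: application/json' --data '{"some":"json"}' \
+  -H 'X-Ably-MessageId: order-1234-created'
+```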
From e413527f0b6d565fa8a462a6f5031b0cfa565453 Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:28:49 +0200
Subject: [PATCH 2/7] Convert outbound streaming integrations to MDX
---
content/integrations/streaming/amqp.textile | 42 ----
.../integrations/streaming/datadog.textile | 90 --------
content/integrations/streaming/index.textile | 193 -----------------
content/integrations/streaming/kafka.textile | 51 -----
.../integrations/streaming/kinesis.textile | 111 ----------
content/integrations/streaming/pulsar.textile | 44 ----
content/integrations/streaming/sqs.textile | 111 ----------
.../platform/integrations/streaming/amqp.mdx | 41 ++++
.../integrations/streaming/datadog.mdx | 91 ++++++++
.../platform/integrations/streaming/index.mdx | 194 ++++++++++++++++++
.../platform/integrations/streaming/kafka.mdx | 52 +++++
.../integrations/streaming/kinesis.mdx | 111 ++++++++++
.../integrations/streaming/pulsar.mdx | 43 ++++
.../platform/integrations/streaming/sqs.mdx | 112 ++++++++++
14 files changed, 644 insertions(+), 642 deletions(-)
delete mode 100644 content/integrations/streaming/amqp.textile
delete mode 100644 content/integrations/streaming/datadog.textile
delete mode 100644 content/integrations/streaming/index.textile
delete mode 100644 content/integrations/streaming/kafka.textile
delete mode 100644 content/integrations/streaming/kinesis.textile
delete mode 100644 content/integrations/streaming/pulsar.textile
delete mode 100644 content/integrations/streaming/sqs.textile
create mode 100644 src/pages/docs/platform/integrations/streaming/amqp.mdx
create mode 100644 src/pages/docs/platform/integrations/streaming/datadog.mdx
create mode 100644 src/pages/docs/platform/integrations/streaming/index.mdx
create mode 100644 src/pages/docs/platform/integrations/streaming/kafka.mdx
create mode 100644 src/pages/docs/platform/integrations/streaming/kinesis.mdx
create mode 100644 src/pages/docs/platform/integrations/streaming/pulsar.mdx
create mode 100644 src/pages/docs/platform/integrations/streaming/sqs.mdx
diff --git a/content/integrations/streaming/amqp.textile b/content/integrations/streaming/amqp.textile
deleted file mode 100644
index d313aab215..0000000000
--- a/content/integrations/streaming/amqp.textile
+++ /dev/null
@@ -1,42 +0,0 @@
----
-title: AMQP integration
-meta_description: "Send data to AMQP based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "AMQP, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/firehose/amqp-rule
----
-
-"AMQP":https://www.amqp.org integrations enable you to automatically forward events that occur in Ably to AMQP-compatible brokers.
-
-h2(#create). Create an AMQP integration
-
-To create an AMQP integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Login and select the application you wish to integrate with AMQP.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose Firehose.
-5. Choose AMQP.
-6. Configure the AMQP "settings":#settings.
-7. Click *Create*.
-
-You can also create an AMQP integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating an AMQP integration:
-
-|_. Setting |_. Description |
-| URL | Specifies the AMQP connection URL in the format `amqps://username:password@host.name/vhost`. |
-| Headers | Allows the inclusion of additional information in key-value format. |
-| "Source":/docs/integrations/streaming#sources | Specifies the event types being sent to AMQP. |
-| "Channel filter":/docs/integrations/streaming#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/streaming#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
-| Routing key | Specifies the "routing key":/messages#routing used by the AMQP exchange to route messages to a physical queue. Supports interpolation. |
-| Exchange | An optional RabbitMQ exchange. Supports interpolation. |
-| Route mandatory | Messages are rejected if the route does not exist when set to @true@. Fails silently otherwise. |
-| Route persistent | Messages are marked as persistent, instructing the broker to write them to disk if the queue is durable. |
-| Optional TTL (minutes) | Override the default queue time to live (TTL), in minutes, for messages to be persisted. |
diff --git a/content/integrations/streaming/datadog.textile b/content/integrations/streaming/datadog.textile
deleted file mode 100644
index 536bf055c9..0000000000
--- a/content/integrations/streaming/datadog.textile
+++ /dev/null
@@ -1,90 +0,0 @@
----
-title: Datadog integration
-meta_description: "Connect Ably and Datadog to monitor messages, channels, and connections in realtime, integrating your Ably statistics with your existing Datadog setup."
-meta_keywords: "Datadog, integrations, statistics, metrics, monitoring, analytics, enterprise"
----
-
-The Ably "Datadog":https://docs.datadoghq.com/integrations/ably/ integration enables you to monitor your application's statistics. Every 60 seconds, Ably streams a comprehensive set of "statistics":/docs/metadata-stats/stats#metrics to the Datadog API.
-
-
-
-h2(#setup). Setup the Datadog integration
-
-To connect Ably with Datadog, authorize the integration through Datadog's "OAuth":https://docs.datadoghq.com/developers/integrations/oauth_for_integrations/ flow. This process requires the @api_keys_write@ scope, allowing Ably to push data to your Datadog account.
-
-Once the integration is active, Datadog provides a specific Ably "dashboard":https://docs.datadoghq.com/integrations/ably/, enabling you to monitor key metrics without extra setup.
-
-The following steps setup the Datadog integration:
-
-# In Datadog, go to *Integrations*, find the *Ably* tile, and click *Install Integration*.
-# Click *Connect Accounts* to start the authorization process. Datadog redirects you to Ably.
-# Log in to your *Ably* account.
-# Select your application from the *Your Apps* page.
-# Navigate to *Integrations*, and select *Connect to Datadog*.
-# Datadog authorization page, authorize Ably using *OAuth* to grant access. The required authorization scope is: @api_keys_write@.
-# After completing authorization, you will be redirected to the *Ably dashboard*, and the process is complete.
-
-h2(#remove). Remove access
-
-Removing access disconnects Ably from Datadog, stopping data transmission and revoking authorization. Follow the steps remove the Ably and Datadog integration using either platform.
-
-h3(#in-ably). Remove access using Ably
-
-* Open your application's integration settings.
-* Click *Remove* next to the Datadog integration.
-* Ably revokes OAuth credentials and stops metric transmission.
-
-h3(#in-datadog). Remove access using Datadog
-
-* Remove associated Ably API keys via *Integrations* or *API Keys*.
-* Adjust scopes or entirely revoke OAuth tokens if necessary.
-
-h2(#lite). Datadog lite
-
-Datadog Lite is a lightweight version of the full Datadog integration that sends a reduced set of "statistics":/docs/metadata-stats/stats#metrics to the Datadog API. This integration is designed for use cases where full statistics are not required, such as when you only need to monitor a limited number channels or connections.
-
-DataDog Lite is available to "Pro":/docs/platform/pricing/pro packages. A 30-day trial is also available to "Standard":/docs/platform/pricing/standard packages, which can be enabled by contacting "Ably support":https://ably.com/support.
-
-The following statistics are streamed from Ably to Datadog using the Lite integration:
-
-|_. Metric Name |_. Description |
-|@messages.all.all.count@|Total number of messages that were successfully sent, summed over all message types and transports.|
-|@messages.all.all.billableCount@|Total number of billable messages that were successfully sent, summed over all message types and transports.|
-|@messages.all.all.data@|Total message size of all messages that were successfully sent, summed over all message types and transports.|
-|@messages.all.all.uncompressedData@|Total uncompressed message size, excluding delta compression.|
-|@messages.all.all.failed@|Total number of messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as they were rejected by an external integration target, or a service issue on Ably's side.|
-|@messages.all.all.refused@|Total number of messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
-|@messages.all.messages.count@|Total message count, excluding presence and state messages.|
-|@messages.all.messages.billableCount@|Total billable message count, excluding presence and state messages.|
-|@messages.all.messages.data@|Total message size, excluding presence and state messages.|
-|@messages.all.messages.uncompressedData@|Total number of messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as they were rejected by an external integration target, or a service issue on Ably's side.|
-|@messages.all.messages.failed@|Total number of messages excluding presence and state messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as they were rejected by an external integration target, or a service issue on Ably’s side.|
-|@messages.all.messages.refused@|Total number of messages excluding presence and state messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
-|@messages.all.presence.count@|Total presence message count.|
-|@messages.all.presence.billableCount@|Total billable presence message count.|
-|@messages.all.presence.data@|Total presence message size.|
-|@messages.all.presence.uncompressedData@|Total uncompressed presence message size, excluding delta compression.|
-|@messages.all.messages.failed@|Total number of presence messages excluding presence and state messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as they were rejected by an external integration target, or a service issue on Ably's side.|
-|@messages.all.messages.refused@|Total number of presence messages excluding presence and state messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
-|@messages.inbound.all.all.count@|Total inbound message count, received by Ably from clients.|
-|@messages.inbound.all.all.data@|Total inbound message size, received by Ably from clients.|
-|@messages.inbound.all.all.uncompressedData@|Total uncompressed inbound message size, excluding delta compression, received by Ably from clients.|
-|@messages.inbound.all.all.failed@|Total number of inbound messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as a service issue on Ably's side.|
-|@messages.inbound.all.all.refused@|Total number of inbound messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
-|@messages.outbound.all.all.count@|Total outbound message count, sent from Ably to clients.|
-|@messages.outbound.all.all.billableCount@|Total billable outbound message count, sent from Ably to clients.|
-|@messages.outbound.all.all.data@|Total outbound message size, sent from Ably to clients.|
-|@messages.outbound.all.all.uncompressedData@|Total uncompressed outbound message size, excluding delta compression, sent from Ably to clients.|
-|@messages.outbound.all.all.failed@|Total number of outbound messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as rejection by an external integration target, or a service issue on Ably's side.|
-|@messages.outbound.all.all.refused@|Total number of outbound messages that were refused by Ably. This is generally due to rate limiting.|
-|@connections.all.peak@ |Peak connection count.|
-|@channels.peak@ |Peak active channel count.|
-|@push.channelMessages@ |Total number of channel messages published over Ably that contained a push payload. Each of these may have triggered notifications to be sent to a device with a matching registered push subscription.|
-|@messages.persisted.messages.count@ |Total count of persisted messages, excluding presence and state messages.|
-|@messages.persisted.messages.data@ |Total size of persisted messages, excluding presence and state messages.|
-|@messages.persisted.messages.uncompressedData@ |Total uncompressed persisted message size, excluding delta compression, and presence and state messages.|
-|@messages.persisted.presence.count@ |Total count of persisted presence messages.|
-|@messages.persisted.presence.data@ |Total size of persisted presence messages.|
-|@messages.persisted.presence.uncompressedData@ |Total uncompressed persisted presence message size, excluding delta compression.|
diff --git a/content/integrations/streaming/index.textile b/content/integrations/streaming/index.textile
deleted file mode 100644
index 7040a79fd5..0000000000
--- a/content/integrations/streaming/index.textile
+++ /dev/null
@@ -1,193 +0,0 @@
----
-title: Outbound streaming overview
-meta_description: "Outbound streaming integrations enable you to stream data from Ably to an external service for realtime processing."
-meta_keywords: "realtime streaming, stream processing"
-languages:
- - javascript
- - nodejs
- - php
- - python
- - ruby
- - java
- - swift
- - objc
- - csharp
- - go
-redirect_from:
- - /docs/general/versions/v1.1/firehose
- - /docs/general/versions/v1.0/firehose
- - /docs/general/firehose
----
-
-Ably's streaming integrations enable you to stream data that's published in the Ably platform to an external streaming or queueing service.
-
-For example, any device that publishes messages on a channel can immediately stream those messages to "Amazon Kinesis":/docs/integrations/streaming/kinesis, so that you can process the data in realtime.
-
-Each message is delivered once to your streaming or queueing server, making the system well-suited for asynchronous processing of realtime data published by Ably. For example, workers consuming data from your stream or queue can persist each message of a live chat to your own database, start publishing updates when a channel becomes active, or trigger an event if a device submits a location indicating it has reached its destination.
-
-
-
-
-
-There are two ways to create an outbound streaming integration:
-
-* Using the "Ably dashboard":#dashboard.
-* Using the "Control API":#api.
-
-h2(#dashboard). Create an outbound streaming integration using the dashboard
-
-You can create outbound streaming integrations in your "dashboard:":https://ably.com/dashboard/any for the following services:
-
-* "Amazon Kinesis":/docs/integrations/streaming/kinesis
-* "Amazon SQS":/docs/integrations/streaming/sqs
-* "AMQP":/docs/integrations/streaming/amqp
-* "Apache Kafka":/docs/integrations/streaming/kafka
-* "Apache Pulsar":/docs/integrations/streaming/pulsar
-
-h2(#api). Create an outbound streaming integration using the Control API
-
-To create a new "integration":/docs/platform/account/control-api#examples-rules for an Ably application, send a POST request to @/apps/{app_id}/rules@ with the integration configuration in the request body.
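As a sketch, the request can be issued with any HTTP client. The rule body below is a hypothetical example; the exact fields depend on the target service, so consult the Control API reference for the full schema:

```[javascript]
// Sketch: build a Control API request to create an integration rule.
// The rule body fields here are illustrative only; see the Control API
// reference for the exact schema of your target service.
function buildCreateRuleRequest(appId, accessToken, rule) {
  return {
    url: `https://control.ably.net/v1/apps/${appId}/rules`,
    options: {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${accessToken}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(rule),
    },
  };
}

// Hypothetical example rule: stream channel messages to a target service.
const request = buildCreateRuleRequest('EXAMPLE_APP_ID', 'EXAMPLE_TOKEN', {
  ruleType: 'amqp',
  source: { channelFilter: '^public.*', type: 'channel.message' },
});
console.log(request.url);
```

The access token is a Control API token created in your account settings, not an Ably API key.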
-
-h2(#sources). Event types
-
-You can configure integrations to listen for the following event types:
-
-|_. Event type |_. Description |
-| @channel.lifecycle@ | Triggered when a channel is created or discarded. |
-| @channel.message@ | Triggered when "messages":/docs/messages are published. |
-| @channel.occupancy@ | Triggered when the number of users in a channel "changes":/docs/channels/options#occupancy. |
-| @channel.presence@ | Triggered when users enter, leave, or update their "presence":/docs/presence-occupancy/presence. |
-
-h2(#filter). Channel filter
-
-Set a filter to restrict which channels an integration applies to using a regular expression.
-
-The following examples demonstrate channel names that you can match against using regular expressions to control which channels a webhook rule applies to:
-
-```[text]
-mychannel:public
-public
-public:events
-public:events:conferences
-public:news:americas
-public:news:europe
-```
-
-* @^public.*@ — Matches any channel that starts with @public@. This includes @public@, both @public:events@ channels, and both @public:news@ channels.
-* @^public$@ — Matches only channels named exactly @public@.
-* @:public$@ — Matches channels that end with @:public@. This includes only @mychannel:public@.
-* @^public:events$@ — Matches channels named exactly @public:events@. This does not include @public:events:conferences@.
-* @^public.*europe$@ — Matches channels that start with @public@ and end with @europe@. This includes only @public:news:europe@.
-* @news@ — Matches any channel name that includes the word @news@. This includes @public:news:americas@ and @public:news:europe@.
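The patterns above can be sanity-checked with standard regular expressions, for example in JavaScript:

```[javascript]
// Check channel filter patterns against the example channel names above.
const channels = [
  'mychannel:public',
  'public',
  'public:events',
  'public:events:conferences',
  'public:news:americas',
  'public:news:europe',
];

// Return the channel names a given filter pattern matches.
const match = (pattern) => channels.filter((name) => new RegExp(pattern).test(name));

console.log(match('^public.*'));        // every channel except 'mychannel:public'
console.log(match('^public$'));         // only 'public'
console.log(match(':public$'));         // only 'mychannel:public'
console.log(match('^public.*europe$')); // only 'public:news:europe'
```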
-
-h2(#enveloped). Enveloped messages
-
-Message enveloping adds structured metadata such as the publisher's @clientId@ and the originating channel name, alongside the message payload.
-
-This metadata is useful when processing events dynamically or when additional context about the message source is required. Enveloped messages are recommended for most use cases, as they provide a consistent format for all events.
-
-Enveloped messages include the following headers:
-
-|_. Header |_. Description |
-| @x-ably-version@ | The version of the integration. |
-| @content-type@ | Indicates the payload format, which can be either @application/json@ or @application/x-msgpack@ for enveloped messages. |
-
-Each enveloped message contains the following fields:
-
-|_. Field |_. Description |
-| @source@ | The "event type":#sources. Possible values are @channel.message@, @channel.presence@, @channel.lifecycle@ or @channel.occupancy@. |
-| @appId@ | The app that generated the event. |
-| @channel@ | The channel where the event occurred. |
-| @site@ | The datacenter that sent the message. |
-| @timestamp@ | A timestamp in milliseconds since the epoch representing when the event occurred. |
-
-In addition, each envelope contains a field holding the actual messages, named according to the message type. For a @channel.message@ event this is the @messages@ array.
-
-The following is an example of an enveloped @message@ payload:
-
-```[json]
-{
- "source": "channel.message",
- "appId": "aBCdEf",
- "channel": "channel-name",
- "site": "eu-central-1-A",
- "ruleId": "1-a2Bc",
- "messages": [{
- "id": "ABcDefgHIj:1:0",
- "connectionId": "ABcDefgHIj",
- "timestamp": 1123145678900,
- "data": "some message data",
- "name": "my message name"
- }]
-}
-```
-
-h3(#decode). Decode enveloped messages
-
-Ably SDKs automatically decode messages into @Message@ objects. Messages sent via an integration need to be decoded manually.
-
-There are two methods available for decoding messages into @Message@ objects:
-
-* @Message.fromEncodedArray()@ for an array of messages.
-* @Message.fromEncoded()@ for single messages.
-
-There are also equivalent methods for decoding presence messages into @PresenceMessage@ objects:
-
-* @PresenceMessage.fromEncodedArray()@ for an array of presence messages.
-* @PresenceMessage.fromEncoded()@ for single presence messages.
-
-Decoding is essential because it reconstructs the original data payload using the encoding field, ensuring the correct data type is restored, whether it's a string, binary, or structured object. If the message was encrypted, passing your encryption key to the method allows the SDK to decrypt data automatically.
-
-Ably strongly recommends decoding all messages received over integrations before processing them to avoid issues with unexpected data formats.
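Outside an Ably SDK, the following is a minimal sketch of what decoding involves, handling a hypothetical @json@/@base64@ encoding chain. In practice, prefer the SDK methods above, which also cover encryption and binary payloads:

```[javascript]
// Minimal sketch of payload decoding without an SDK.
// Handles an encoding chain such as 'json/base64'; real SDK decoding
// also covers encryption steps (e.g. 'cipher+aes-256-cbc').
function decodePayload(data, encoding) {
  const steps = encoding ? encoding.split('/') : [];
  // Apply the encoding steps in reverse order.
  for (const step of steps.reverse()) {
    if (step === 'base64') {
      data = Buffer.from(data, 'base64').toString('utf8');
    } else if (step === 'json') {
      data = JSON.parse(data);
    } else {
      throw new Error(`Unhandled encoding step: ${step}`);
    }
  }
  return data;
}

const encoded = Buffer.from('{"temp":21}').toString('base64');
console.log(decodePayload(encoded, 'json/base64')); // { temp: 21 }
```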
-
-h2(#non-enveloped). Non-enveloped messages
-
-You can turn off enveloping if you only need the raw message payload or the external service follows a strict data structure. This results in smaller payloads and eliminates the need to parse additional metadata. However, it requires you to handle raw payload decoding manually.
-
-Non-enveloped messages use a more straightforward format, delivering events directly without additional wrapping. Instead of a structured envelope, key properties are exposed through HTTP headers alongside the raw payload.
-
-For example, if you publish a JSON message to the channel @my_channel@ using the following cURL request:
-
-```[curl]
-curl -X POST https://rest.ably.io/channels/my_channel/messages \
- -u "{{API_KEY}}" \
- -H "Content-Type: application/json" \
- --data '{ "name": "publish", "data": "example" }'
-```
-
-The received message will include:
-
-* @x-ably-message-name: publish@
-* @Payload: example@
-
-The following example demonstrates a non-enveloped presence event:
-
-```[javascript]
-const realtime = new Ably.Realtime({
-  key: '{{API_KEY}}',
-  clientId: 'bob'
-});
-const channel = realtime.channels.get('some_channel');
-await channel.presence.enter('some data');
-```
-
-The received presence message will include:
-
-* @x-ably-message-action: enter@
-* @Payload: "some data"@
-
-h3(#non-structure). Non-enveloped structure
-
-Non-enveloped messages include headers that provide essential context about the payload, such as its source, format, and metadata. The following headers are included in all non-enveloped messages:
-
-|_. Header |_. Description |
-| @content-type@ | Defines the payload type: @application/json@ for JSON, @text/plain@ for text, or @application/octet-stream@ for binary data. |
-| @x-ably-version@ | The version of the integration. |
-| @x-ably-envelope-appid@ | The @appID@ from which the message originated. |
-| @x-ably-envelope-channel@ | Name of the Ably channel that sent the message. |
-| @x-ably-envelope-rule-id@ | The ID of the integration that triggered the event. |
-| @x-ably-envelope-site@ | The datacenter that processed the event. |
-| @x-ably-envelope-source@ | "Event source":#sources, indicating the type of event: @channel.message@, @channel.presence@, @channel.lifecycle@, or @channel.occupancy@. |
-| @x-ably-message-client-id@ | @ClientID@ of the connection that sent the event. |
-| @x-ably-message-connection-id@ | @ConnectionID@ that initiated the event. |
-| @x-ably-message-id@ | Unique @messageID@ for tracking. |
-| @x-ably-message-timestamp@ | Timestamp of when the message was originally sent. |
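As a sketch, a receiving endpoint can reconstruct basic message context from these headers. This assumes a Node-style headers object with lowercased header names:

```[javascript]
// Sketch: extract message context from non-enveloped delivery headers.
// Assumes a Node-style headers object with lowercased header names.
function messageContext(headers) {
  return {
    source: headers['x-ably-envelope-source'],
    channel: headers['x-ably-envelope-channel'],
    appId: headers['x-ably-envelope-appid'],
    messageId: headers['x-ably-message-id'],
    clientId: headers['x-ably-message-client-id'],
    timestamp: Number(headers['x-ably-message-timestamp']),
  };
}

const ctx = messageContext({
  'x-ably-envelope-source': 'channel.message',
  'x-ably-envelope-channel': 'my_channel',
  'x-ably-envelope-appid': 'aBCdEf',
  'x-ably-message-id': 'ABcDefgHIj:1:0',
  'x-ably-message-client-id': 'bob',
  'x-ably-message-timestamp': '1123145678900',
});
console.log(ctx.channel); // 'my_channel'
```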
diff --git a/content/integrations/streaming/kafka.textile b/content/integrations/streaming/kafka.textile
deleted file mode 100644
index c60a2c60d4..0000000000
--- a/content/integrations/streaming/kafka.textile
+++ /dev/null
@@ -1,51 +0,0 @@
----
-title: Apache Kafka integration
-meta_description: "Send data to Kafka based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Kafka, integrations, events, serverless"
-redirect_from:
- - /docs/general/firehose/kafka-rule
----
-
-"Kafka":https://kafka.apache.org/ integrations enable you to automatically forward events that occur in Ably to Kafka topics.
-
-
-
-h2(#create). Create a Kafka integration
-
-To create a Kafka integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with Kafka.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose Firehose.
-5. Choose Kafka.
-6. Configure the Kafka "settings":#settings.
-7. Click *Create*.
-
-You can also create a Kafka integration using the "Control API":/docs/platform/account/control-api.
-
-
-h3(#settings). Settings
-
-The following settings are available when creating a Kafka integration:
-
-|_. Setting |_. Description |
-| "Source":/docs/integrations/streaming#sources | Specifies the event types being sent to Kafka. |
-| "Channel filter":/docs/integrations/streaming#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/streaming#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
-| Routing key | Specifies the "routing key":/messages#routing used to route messages to Kafka topics. |
-| Mechanism | The "SASL/SCRAM mechanism":#mechanism used for Kafka connection. |
-| Username | The username to connect to Kafka with. |
-| Password | The password to connect to Kafka with. |
-| Brokers | List of Kafka broker endpoints in the format @host:port@. |
-
-h3(#mechanism). Authentication mechanism
-
-The available authentication mechanisms for Kafka are:
-
-* "SASL/PLAIN":https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_plain.html#kafka-sasl-auth-plain
-* "SASL/SCRAM-SHA-256":https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_scram.html#kafka-sasl-auth-scram
-* "SASL/SCRAM-SHA-512":https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_scram.html
diff --git a/content/integrations/streaming/kinesis.textile b/content/integrations/streaming/kinesis.textile
deleted file mode 100644
index 36858eaf61..0000000000
--- a/content/integrations/streaming/kinesis.textile
+++ /dev/null
@@ -1,111 +0,0 @@
----
-title: AWS Kinesis integration
-meta_description: "Send data to Kinesis based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Kinesis, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/firehose/kinesis-rule
----
-
-"Kinesis":https://aws.amazon.com/kinesis/ integrations enable you to automatically forward events that occur in Ably to AWS Kinesis streams.
-
-h2(#create). Create a Kinesis integration
-
-To create a Kinesis integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with Kinesis.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose Firehose.
-5. Choose AWS Kinesis.
-6. Configure the Kinesis "settings":#settings.
-7. Click *Create*.
-
-You can also create a Kinesis integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating a Kinesis integration:
-
-|_. Setting |_. Description |
-| AWS Region | Specifies the AWS region of your Kinesis Stream. |
-| Stream Name | Defines the name of the Kinesis Stream to connect to. |
-| "AWS authentication scheme":#auth | Choose the authentication method. Either *AWS credentials* or *ARN of an assumable role*. |
-| AWS Credentials | If using AWS credentials, enter the values in @key:value@ format. |
-| ARN of an assumable role | If using ARN of an assumable role, enter the ARN of the role that Ably can assume to access your Kinesis stream. |
-| "Source":/docs/integrations/streaming#sources | Specifies the event types being sent to Kinesis. |
-| "Channel filter":/docs/integrations/streaming#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/streaming#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
-| Partition key | Specifies the "partition key":/messages#routing used to route messages to a Kinesis stream shard. |
-
-h2(#auth). AWS authentication
-
-Delegate access to your AWS resources by creating an "IAM role":https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html that the Ably AWS account can assume.
-
-This approach follows AWS best practices, as it avoids sharing access keys directly. Specify the role's ARN to grant Ably the necessary permissions in a secure manner.
-
-h3(#kinesis). Create a Kinesis policy
-
-The following steps show you how to create a policy for AWS Kinesis.
-
-1. In the IAM console sidebar select *Policies*.
-2. Click *Create Policy*.
-3. Click the JSON tab and enter the following JSON to configure the policy:
-
-```[json]
-{
- "Version": "2012-10-17",
- "Statement": [
- {
- "Sid": "ReadWriteToSingleStream",
- "Effect": "Allow",
- "Action": [
- "kinesis:DescribeLimits",
- "kinesis:DescribeStream",
- "kinesis:GetShardIterator",
- "kinesis:GetRecords",
- "kinesis:ListTagsForStream",
- "kinesis:MergeShards",
- "kinesis:PutRecord",
- "kinesis:PutRecords",
- "kinesis:UpdateShardCount"
- ],
- "Resource": [
- "arn:aws:kinesis:<region>:<account-id>:stream/<stream-name>"
- ]
- }
- ]
-}
-```
-
-
-
-4. Click *Next: Tags*. You don't need to add any tags.
-5. Click *Next: Review*.
-6. Enter a suitable name for your policy.
-7. Click *Create Policy*.
-
-You have created a policy that grants the permissions required to use a Kinesis stream. You must now attach it to the role that you'll specify in your Ably integration rule.
-
-h3(#role). Create a role
-
-Create an IAM role as follows:
-
-1. In the AWS IAM console, click *Roles* in the sidebar and then click *Create Role*.
-2. For type of trusted entity select *Another AWS account*.
-3. For Account ID specify 203461409171. This is the Ably AWS account.
-4. Click the *Require external ID checkbox* and then enter an external ID of "@<your Ably account ID>.<your app ID>@":/docs/platform/account/control-api#ids.
-5. Click *Next: Permissions*.
-6. Select the policy you created earlier to attach to this role. You can type the name of your policy into the *Filter policies* search box, then ensure the checkbox for the policy is selected.
-7. Click *Next: Tags*.
-8. You don't need to add tags so click *Next: Review*.
-9. Enter a suitable name for your role.
-10. Click *Create Role*.
diff --git a/content/integrations/streaming/pulsar.textile b/content/integrations/streaming/pulsar.textile
deleted file mode 100644
index 88222aae88..0000000000
--- a/content/integrations/streaming/pulsar.textile
+++ /dev/null
@@ -1,44 +0,0 @@
----
-title: Apache Pulsar integration
-meta_description: "Send data to Pulsar based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Pulsar, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/firehose/pulsar-rule
----
-
-"Pulsar":https://pulsar.apache.org integrations enable you to automatically forward events that occur in Ably to Pulsar topics.
-
-
-
-h2(#create). Create a Pulsar integration
-
-To create a rule in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with Pulsar.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose Firehose.
-5. Choose Pulsar.
-6. Configure the Pulsar "settings":#settings.
-7. Click *Create*.
-
-You can also create a Pulsar integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating a Pulsar integration:
-
-|_. Setting |_. Description |
-| "Source":/docs/integrations/streaming#sources | Specifies the event types being sent to Pulsar. |
-| "Channel filter":/docs/integrations/streaming#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/streaming#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
-| Routing key | Specifies the "routing key":/messages#routing used to route messages to Pulsar topics. |
-| Topic | Defines the Pulsar topic to publish messages to. Must be in the format @tenant/namespace/topic_name@. |
-| Service URL | Specifies the Pulsar cluster URL in the format @pulsar://host:port@ or @pulsar+ssl://host:port@. |
-| JWT Token | JWT to use for authentication. |
-| TLS trust certificates | Specify a list of trusted CA certificates to verify TLS certificates presented by Pulsar. |
diff --git a/content/integrations/streaming/sqs.textile b/content/integrations/streaming/sqs.textile
deleted file mode 100644
index ff398ce157..0000000000
--- a/content/integrations/streaming/sqs.textile
+++ /dev/null
@@ -1,111 +0,0 @@
----
-title: AWS SQS integration
-meta_description: "Send data to SQS based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "SQS, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/firehose/sqs-rule
----
-
-"SQS":https://aws.amazon.com/sqs integrations enable you to automatically forward events that occur in Ably to AWS SQS queues.
-
-h2(#create). Create an SQS integration
-
-To create an SQS integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with SQS.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose Firehose.
-5. Choose AWS SQS.
-6. Configure the SQS "settings":#settings.
-7. Click *Create*.
-
-You can also create an SQS integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating an SQS integration:
-
-|_. Setting |_. Description |
-| URL | Specifies the URL for the SQS queue, including credentials, region, and queue name. Only HTTPS is supported. |
-| AWS Region | Specifies the AWS region of your SQS queue. |
-| "AWS authentication scheme":#auth | Choose the authentication method. Either *AWS credentials* or *ARN of an assumable role*. |
-| AWS Credentials | If using AWS credentials, enter the values in @key:value@ format. |
-| ARN of an assumable role | If using ARN of an assumable role, enter the ARN of the role that Ably can assume to access your SQS queue. |
-| "Source":/docs/integrations/streaming#sources | Specifies the event types being sent to SQS. |
-| "Channel filter":/docs/integrations/streaming#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/streaming#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
-
-h2(#auth). AWS authentication
-
-Delegate access to your AWS resources by creating an "IAM role":https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html that the Ably AWS account can assume.
-
-This approach follows AWS best practices, as it avoids sharing access keys directly. Specify the role's ARN to grant Ably the necessary permissions in a secure manner.
-
-h3(#sqs). Create an SQS policy
-
-The following steps show you how to create a policy for AWS SQS.
-
-1. In the IAM console sidebar select *Policies*.
-2. Click *Create Policy*.
-3. Click the JSON tab and enter the following JSON to configure the policy:
-
-```[json]
-{
- "Version": "2012-10-17",
- "Statement": [
- {
- "Sid": "AllowReadWriteSQS",
- "Effect": "Allow",
- "Action": [
- "sqs:DeleteMessage",
- "sqs:TagQueue",
- "sqs:GetQueueUrl",
- "sqs:ChangeMessageVisibility",
- "sqs:DeleteMessageBatch",
- "sqs:SendMessageBatch",
- "sqs:UntagQueue",
- "sqs:ReceiveMessage",
- "sqs:SendMessage",
- "sqs:ListQueueTags",
- "sqs:ChangeMessageVisibilityBatch"
- ],
- "Resource": [
- "arn:aws:sqs:<region>:<account-id>:<queue-name>"
- ]
- }
- ]
-}
-```
-
-
-
-4. Click *Next: Tags*. You don't need to add any tags.
-5. Click *Next: Review*.
-6. Enter a suitable name for your policy.
-7. Click *Create Policy*.
-
-You have created a policy that grants the permissions required to use an SQS queue.
-
-h3(#role). Create a role
-
-Create an IAM role as follows:
-
-1. In the AWS IAM console, click *Roles* in the sidebar and then click *Create Role*.
-2. For type of trusted entity select *Another AWS account*.
-3. For Account ID specify 203461409171. This is the Ably AWS account.
-4. Click the *Require external ID checkbox* and then enter an external ID of "@<your Ably account ID>.<your app ID>@":/docs/platform/account/control-api#ids.
-5. Click *Next: Permissions*.
-6. Select the policy you created earlier to attach to this role. You can type the name of your policy into the *Filter policies* search box, then ensure the checkbox for the policy is selected.
-7. Click *Next: Tags*.
-8. You don't need to add tags so click *Next: Review*.
-9. Enter a suitable name for your role.
-10. Click *Create Role*.
diff --git a/src/pages/docs/platform/integrations/streaming/amqp.mdx b/src/pages/docs/platform/integrations/streaming/amqp.mdx
new file mode 100644
index 0000000000..b967478ef8
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/amqp.mdx
@@ -0,0 +1,41 @@
+---
+title: AMQP integration
+meta_description: "Send data to AMQP based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "AMQP, integrations, events, serverless"
+redirect_from:
+ - /docs/general/firehose/amqp-rule
+---
+
+[AMQP](https://www.amqp.org) integrations enable you to automatically forward events that occur in Ably to AMQP-compatible brokers.
+
+## Create an AMQP integration
+
+To create an AMQP integration in your [dashboard:](https://ably.com/dashboard/any)
+
+1. Log in and select the application you wish to integrate with AMQP.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose Firehose.
+5. Choose AMQP.
+6. Configure the AMQP [settings](#settings).
+7. Click **Create**.
+
+You can also create an AMQP integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating an AMQP integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| URL | Specifies the AMQP connection URL in the format `amqps://username:password@host.name/vhost`. |
+| Headers | Allows the inclusion of additional information in key-value format. |
+| [Source](/docs/integrations/streaming#sources) | Specifies the event types being sent to AMQP. |
+| [Channel filter](/docs/integrations/streaming#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/streaming#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
+| Routing key | Specifies the [routing key](/messages#routing) used by the AMQP exchange to route messages to a physical queue. Supports interpolation. |
+| Exchange | An optional RabbitMQ exchange. Supports interpolation. |
+| Route mandatory | When set to `true`, messages are rejected if the route does not exist; otherwise they fail silently. |
+| Route persistent | Messages are marked as persistent, instructing the broker to write them to disk if the queue is durable. |
+| Optional TTL (minutes) | Override the default queue time to live (TTL), in minutes, for messages to be persisted. |
diff --git a/src/pages/docs/platform/integrations/streaming/datadog.mdx b/src/pages/docs/platform/integrations/streaming/datadog.mdx
new file mode 100644
index 0000000000..7107298105
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/datadog.mdx
@@ -0,0 +1,91 @@
+---
+title: Datadog integration
+meta_description: "Connect Ably and Datadog to monitor messages, channels, and connections in realtime, integrating your Ably statistics with your existing Datadog setup."
+meta_keywords: "Datadog, integrations, statistics, metrics, monitoring, analytics, enterprise"
+---
+
+The Ably [Datadog](https://docs.datadoghq.com/integrations/ably/) integration enables you to monitor your application's statistics. Every 60 seconds, Ably streams a comprehensive set of [statistics](/docs/metadata-stats/stats#metrics) to the Datadog API.
+
+
+
+## Set up the Datadog integration
+
+To connect Ably with Datadog, authorize the integration through Datadog's [OAuth](https://docs.datadoghq.com/developers/integrations/oauth_for_integrations/) flow. This process requires the `api_keys_write` scope, allowing Ably to push data to your Datadog account.
+
+Once the integration is active, Datadog provides a dedicated Ably [dashboard](https://docs.datadoghq.com/integrations/ably/), enabling you to monitor key metrics without extra setup.
+
+The following steps set up the Datadog integration:
+
+1. In Datadog, go to **Integrations**, find the **Ably** tile, and click **Install Integration**.
+2. Click **Connect Accounts** to start the authorization process. Datadog redirects you to Ably.
+3. Log in to your **Ably** account.
+4. Select your application from the **Your Apps** page.
+5. Navigate to **Integrations**, and select **Connect to Datadog**.
+6. On the Datadog authorization page, authorize Ably using **OAuth** to grant access. The required authorization scope is `api_keys_write`.
+7. After completing authorization, you will be redirected to the **Ably dashboard**, and the process is complete.
+
+## Remove access
+
+Removing access disconnects Ably from Datadog, stopping data transmission and revoking authorization. Follow these steps to remove the Ably and Datadog integration using either platform.
+
+### Remove access using Ably
+
+* Open your application's integration settings.
+* Click **Remove** next to the Datadog integration.
+* Ably revokes OAuth credentials and stops metric transmission.
+
+### Remove access using Datadog
+
+* Remove associated Ably API keys via **Integrations** or **API Keys**.
+* Adjust scopes or entirely revoke OAuth tokens if necessary.
+
+## Datadog lite
+
+Datadog Lite is a lightweight version of the full Datadog integration that sends a reduced set of [statistics](/docs/metadata-stats/stats#metrics) to the Datadog API. This integration is designed for use cases where full statistics are not required, such as when you only need to monitor a limited number of channels or connections.
+
+Datadog Lite is available to [Pro](/docs/platform/pricing/pro) packages. A 30-day trial is also available to [Standard](/docs/platform/pricing/standard) packages, which can be enabled by contacting [Ably support](https://ably.com/support).
+
+The following statistics are streamed from Ably to Datadog using the Lite integration:
+
+| Metric Name | Description |
+| ----------- | ----------- |
+|`messages.all.all.count`|Total number of messages that were successfully sent, summed over all message types and transports.|
+|`messages.all.all.billableCount`|Total number of billable messages that were successfully sent, summed over all message types and transports.|
+|`messages.all.all.data`|Total message size of all messages that were successfully sent, summed over all message types and transports.|
+|`messages.all.all.uncompressedData`|Total uncompressed message size, excluding delta compression.|
+|`messages.all.all.failed`|Total number of messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as they were rejected by an external integration target, or a service issue on Ably's side.|
+|`messages.all.all.refused`|Total number of messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
+|`messages.all.messages.count`|Total message count, excluding presence and state messages.|
+|`messages.all.messages.billableCount`|Total billable message count, excluding presence and state messages.|
+|`messages.all.messages.data`|Total message size, excluding presence and state messages.|
+|`messages.all.messages.uncompressedData`|Total uncompressed message size, excluding delta compression, and presence and state messages.|
+|`messages.all.messages.failed`|Total number of failed messages, excluding presence and state messages. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as rejection by an external integration target, or a service issue on Ably's side.|
+|`messages.all.messages.refused`|Total number of messages, excluding presence and state messages, that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
+|`messages.all.presence.count`|Total presence message count.|
+|`messages.all.presence.billableCount`|Total billable presence message count.|
+|`messages.all.presence.data`|Total presence message size.|
+|`messages.all.presence.uncompressedData`|Total uncompressed presence message size, excluding delta compression.|
+|`messages.all.presence.failed`|Total number of presence messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as rejection by an external integration target, or a service issue on Ably's side.|
+|`messages.all.presence.refused`|Total number of presence messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
+|`messages.inbound.all.all.count`|Total inbound message count, received by Ably from clients.|
+|`messages.inbound.all.all.data`|Total inbound message size, received by Ably from clients.|
+|`messages.inbound.all.all.uncompressedData`|Total uncompressed inbound message size, excluding delta compression, received by Ably from clients.|
+|`messages.inbound.all.all.failed`|Total number of inbound messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as a service issue on Ably's side.|
+|`messages.inbound.all.all.refused`|Total number of inbound messages that were refused by Ably. For example, due to rate limiting, malformed messages, or incorrect client permissions.|
+|`messages.outbound.all.all.count`|Total outbound message count, sent from Ably to clients.|
+|`messages.outbound.all.all.billableCount`|Total billable outbound message count, sent from Ably to clients.|
+|`messages.outbound.all.all.data`|Total outbound message size, sent from Ably to clients.|
+|`messages.outbound.all.all.uncompressedData`|Total uncompressed outbound message size, excluding delta compression, sent from Ably to clients.|
+|`messages.outbound.all.all.failed`|Total number of outbound messages that failed. These are messages which did not succeed for some reason other than Ably explicitly refusing them, such as rejection by an external integration target, or a service issue on Ably's side.|
+|`messages.outbound.all.all.refused`|Total number of outbound messages that were refused by Ably. This is generally due to rate limiting.|
+|`connections.all.peak` |Peak connection count.|
+|`channels.peak` |Peak active channel count.|
+|`push.channelMessages` |Total number of channel messages published over Ably that contained a push payload. Each of these may have triggered notifications to be sent to a device with a matching registered push subscription.|
+|`messages.persisted.messages.count` |Total count of persisted messages, excluding presence and state messages.|
+|`messages.persisted.messages.data` |Total size of persisted messages, excluding presence and state messages.|
+|`messages.persisted.messages.uncompressedData` |Total uncompressed persisted message size, excluding delta compression, and presence and state messages.|
+|`messages.persisted.presence.count` |Total count of persisted presence messages.|
+|`messages.persisted.presence.data` |Total size of persisted presence messages.|
+|`messages.persisted.presence.uncompressedData` |Total uncompressed persisted presence message size, excluding delta compression.|
diff --git a/src/pages/docs/platform/integrations/streaming/index.mdx b/src/pages/docs/platform/integrations/streaming/index.mdx
new file mode 100644
index 0000000000..c756f43f76
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/index.mdx
@@ -0,0 +1,194 @@
+---
+title: Outbound streaming overview
+meta_description: "Outbound streaming integrations enable you to stream data from Ably to an external service for realtime processing."
+meta_keywords: "realtime streaming, stream processing"
+redirect_from:
+ - /docs/general/versions/v1.1/firehose
+ - /docs/general/versions/v1.0/firehose
+ - /docs/general/firehose
+---
+
+Ably's streaming integrations enable you to stream data that's published in the Ably platform to an external streaming or queueing service.
+
+For example, any device that publishes messages on a channel can immediately stream those messages to [Amazon Kinesis](/docs/integrations/streaming/kinesis), so that you can process the data in realtime.
+
+Each message is delivered once to your streaming or queueing server, making the system well-suited for asynchronous processing of realtime data published by Ably. For example, workers consuming data from your stream or queue can persist each message of a live chat to your own database, start publishing updates when a channel becomes active, or trigger an event if a device submits a location indicating it has reached its destination.
+
+There are two ways to create an outbound streaming integration:
+
+* Using the [Ably dashboard](#dashboard).
+* Using the [Control API](#api).
+
+## Create an outbound streaming integration using the dashboard
+
+You can create outbound streaming integrations in your [dashboard](https://ably.com/dashboard/any) for the following services:
+
+* [Amazon Kinesis](/docs/integrations/streaming/kinesis)
+* [Amazon SQS](/docs/integrations/streaming/sqs)
+* [AMQP](/docs/integrations/streaming/amqp)
+* [Apache Kafka](/docs/integrations/streaming/kafka)
+* [Apache Pulsar](/docs/integrations/streaming/pulsar)
+
+## Create an outbound streaming integration using the Control API
+
+To create a new [integration](/docs/platform/account/control-api#examples-rules) for an Ably application, send a POST request to `/apps/{app_id}/rules` with the integration configuration in the request body.
+
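+The request can be sketched as follows. The Control API base URL and bearer-token authentication are as documented for the Control API; the `ruleType`, `source`, and `target` field values shown here are illustrative placeholders, so check the rules endpoint reference for the exact schema of your target service:
+
+```javascript
+// Build (but don't send) a Control API request that creates a streaming rule.
+// All identifiers and rule field values below are example values.
+function buildCreateRuleRequest(appId, accessToken, rule) {
+  return {
+    url: `https://control.ably.net/v1/apps/${appId}/rules`,
+    options: {
+      method: 'POST',
+      headers: {
+        Authorization: `Bearer ${accessToken}`,
+        'Content-Type': 'application/json',
+      },
+      body: JSON.stringify(rule),
+    },
+  };
+}
+
+const rule = {
+  ruleType: 'aws/kinesis', // illustrative; depends on the target service
+  requestMode: 'single',
+  source: { channelFilter: '^public.*', type: 'channel.message' },
+  target: { region: 'us-east-1', streamName: 'example-stream' },
+};
+
+const req = buildCreateRuleRequest('exampleAppId', 'exampleToken', rule);
+// To send it: await fetch(req.url, req.options);
+```
+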
+## Event types
+
+You can configure outbound streaming integrations to listen for the following event types:
+
+| Event type | Description |
+| ---------- | ----------- |
+| `channel.lifecycle` | Triggered when a channel is created or discarded. |
+| `channel.message` | Triggered when [messages](/docs/messages) are published. |
+| `channel.occupancy` | Triggered when the number of users in a channel [changes](/docs/channels/options#occupancy). |
+| `channel.presence` | Triggered when users enter, leave, or update their [presence](/docs/presence-occupancy/presence). |
+
+## Channel filter
+
+Set a filter to restrict which channels an integration applies to using a regular expression.
+
+The following examples demonstrate channel names that you can match against using regular expressions to control which channels a webhook rule applies to:
+
+
+```text
+mychannel:public
+public
+public:events
+public:events:conferences
+public:news:americas
+public:news:europe
+```
+
+
+| RegEx | Channels |
+| ----- | -------- |
+| `^public.*` | Matches any channel that starts with `public`. This includes `public`, both `public:events` channels, and both `public:news` channels. |
+| `^public$` | Matches only channels named exactly `public`. |
+| `:public$` | Matches channels that end with `:public`. This includes only `mychannel:public`. |
+| `^public:events$` | Matches channels named exactly `public:events`. This does not include `public:events:conferences`. |
+| `^public.*europe$` | Matches channels that start with `public` and end with `europe`. This includes only `public:news:europe`. |
+| `news` | Matches any channel name that includes the word `news`. This includes `public:news:americas` and `public:news:europe`. |
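+
+These filters can be checked against channel names with any standard regular expression engine. A quick sketch in JavaScript, using the channel names listed above:
+
+```javascript
+// Evaluate a channel filter the same way a rule does: test the regular
+// expression against each channel name.
+const channels = [
+  'mychannel:public',
+  'public',
+  'public:events',
+  'public:events:conferences',
+  'public:news:americas',
+  'public:news:europe',
+];
+
+const matches = (pattern) => channels.filter((name) => pattern.test(name));
+
+console.log(matches(/^public.*/).length); // 5 (everything except 'mychannel:public')
+console.log(matches(/^public$/)); // ['public']
+console.log(matches(/:public$/)); // ['mychannel:public']
+console.log(matches(/^public.*europe$/)); // ['public:news:europe']
+```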
+
+## Enveloped messages
+
+Message enveloping adds structured metadata such as the publisher's `clientId` and the originating channel name, alongside the message payload.
+
+This metadata is useful when processing events dynamically or when additional context about the message source is required. Enveloped messages are recommended for most use cases, as they provide a consistent format for all events.
+
+Enveloped messages include the following headers:
+
+| Header | Description |
+| ------ | ----------- |
+| `x-ably-version` | Specifies the version of the envelope format. |
+| `content-type` | Indicates the payload format, which can be either `application/json` or `application/x-msgpack` for enveloped messages. |
+
+Each enveloped message contains the following fields:
+
+| Field | Description |
+| ----- | ----------- |
+| `source` | The [event type](#sources). Possible values are `channel.message`, `channel.presence`, `channel.lifecycle`, or `channel.occupancy`. |
+| `appId` | The app that generated the event. |
+| `channel` | The channel where the event occurred. |
+| `site` | The datacenter that sent the message. |
+| `timestamp` | A timestamp in milliseconds since the epoch representing when the event occurred. |
+
+Each envelope also contains a field holding the actual message contents, named according to the message type. For example, a `channel.message` event contains a `messages` field.
+
+The following is an example of an enveloped `message` payload:
+
+
+```json
+{
+ "source": "channel.message",
+ "appId": "aBCdEf",
+ "channel": "channel-name",
+ "site": "eu-central-1-A",
+ "ruleId": "1-a2Bc",
+ "messages": [{
+ "id": "ABcDefgHIj:1:0",
+ "connectionId": "ABcDefgHIj",
+ "timestamp": 1123145678900,
+ "data": "some message data",
+ "name": "my message name"
+ }]
+}
+```
+
+
+### Decode enveloped messages
+
+Ably SDKs automatically decode messages into `Message` objects. Messages sent via an integration need to be decoded manually.
+
+There are two methods available for decoding messages into `Message` objects:
+
+* `Message.fromEncodedArray()` for an array of messages.
+* `Message.fromEncoded()` for single messages.
+
+There are also equivalent methods for decoding presence messages into `PresenceMessage` objects:
+
+* `PresenceMessage.fromEncodedArray()` for an array of presence messages.
+* `PresenceMessage.fromEncoded()` for single presence messages.
+
+Decoding is essential because it reconstructs the original data payload using the encoding field, ensuring the correct data type is restored, whether it's a string, binary, or structured object. If the message was encrypted, passing your encryption key to the method allows the SDK to decrypt data automatically.
+
+Ably strongly recommends decoding all messages received over integrations before processing them to avoid issues with unexpected data formats.
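+
+If you are handling envelopes without an Ably SDK available, the decoding step can be approximated as below. This is a simplified sketch of what the `fromEncoded` helpers do for the common `json` and `base64` encodings; the real helpers also handle chained encodings and decryption, so prefer them wherever an SDK is available:
+
+```javascript
+// Restore a message's original payload from its `encoding` field.
+// Simplified sketch: only single 'json' or 'base64' encodings are handled.
+function decodeMessage(message) {
+  const { encoding, data } = message;
+  if (encoding === 'base64') {
+    return { ...message, data: Buffer.from(data, 'base64'), encoding: null };
+  }
+  if (encoding === 'json') {
+    return { ...message, data: JSON.parse(data), encoding: null };
+  }
+  return message; // plain string payloads carry no encoding
+}
+
+const envelope = {
+  source: 'channel.message',
+  channel: 'channel-name',
+  messages: [
+    { id: 'ABcDefgHIj:1:0', data: '{"temp": 21}', encoding: 'json' },
+    { id: 'ABcDefgHIj:1:1', data: 'some message data' },
+  ],
+};
+
+const decoded = envelope.messages.map(decodeMessage);
+console.log(decoded[0].data.temp); // 21
+```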
+
+## Non-enveloped messages
+
+You can turn off enveloping if you only need the raw message payload or the external service follows a strict data structure. This results in smaller payloads and eliminates the need to parse additional metadata. However, it requires you to handle raw payload decoding manually.
+
+Non-enveloped messages use a more straightforward format, delivering events directly without additional wrapping. Instead of a structured envelope, key properties are exposed through HTTP headers alongside the raw payload.
+
+For example, if you publish a JSON message to the channel `my_channel` using the following cURL request:
+
+
+```shell
+curl -X POST https://rest.ably.io/channels/my_channel/messages \
+ -u "{{API_KEY}}" \
+ -H "Content-Type: application/json" \
+ --data '{ "name": "publish", "data": "example" }'
+```
+
+
+The received message will include:
+
+* `x-ably-message-name: publish`
+* `Payload: example`
+
+The following example demonstrates a non-enveloped presence event:
+
+
+```javascript
+realtime = new Ably.Realtime({
+ key: '{{API_KEY}}',
+ clientId: 'bob'
+});
+channel = realtime.channels.get('some_channel');
+await channel.presence.enter('some data');
+```
+
+
+The received presence message will include:
+
+* `x-ably-message-action: enter`
+* `Payload: "some data"`
+
+### Non-enveloped structure
+
+Non-enveloped messages include headers that provide essential context about the payload, such as its source, format, and metadata. The following headers are included in all non-enveloped messages:
+
+| Header | Description |
+| ------ | ----------- |
+| `content-type` | Defines the payload type: `application/json` for JSON, `text/plain` for text, or `application/octet-stream` for binary data. |
+| `x-ably-version` | The version of the integration. |
+| `x-ably-envelope-appid` | The `appID` from which the message originated. |
+| `x-ably-envelope-channel` | Name of the Ably channel that sent the message. |
+| `x-ably-envelope-rule-id` | The ID of the integration that triggered the event. |
+| `x-ably-envelope-site` | The datacenter that processed the event. |
+| `x-ably-envelope-source` | [Event source](#sources), indicating the type of event: `channel.message`, `channel.presence`, `channel.lifecycle`, or `channel.occupancy`. |
+| `x-ably-message-client-id` | `ClientID` of the connection that sent the event. |
+| `x-ably-message-connection-id` | `ConnectionID` that initiated the event. |
+| `x-ably-message-id` | Unique `messageID` for tracking. |
+| `x-ably-message-timestamp` | Timestamp of when the message was originally sent. |
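+
+As a sketch, the headers above can be folded back into a metadata object before you process the raw payload. The helper below assumes your HTTP framework exposes request headers as a lower-cased key-value object:
+
+```javascript
+// Rebuild message metadata from the x-ably-* headers of a non-enveloped
+// delivery. Header names are as listed in the table above.
+function metadataFromHeaders(headers) {
+  return {
+    source: headers['x-ably-envelope-source'],
+    appId: headers['x-ably-envelope-appid'],
+    channel: headers['x-ably-envelope-channel'],
+    ruleId: headers['x-ably-envelope-rule-id'],
+    messageId: headers['x-ably-message-id'],
+    clientId: headers['x-ably-message-client-id'],
+    timestamp: Number(headers['x-ably-message-timestamp']),
+  };
+}
+
+const example = metadataFromHeaders({
+  'x-ably-envelope-source': 'channel.message',
+  'x-ably-envelope-appid': 'aBCdEf',
+  'x-ably-envelope-channel': 'my_channel',
+  'x-ably-message-id': 'ABcDefgHIj:1:0',
+  'x-ably-message-client-id': 'bob',
+  'x-ably-message-timestamp': '1123145678900',
+});
+console.log(example.channel); // my_channel
+```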
diff --git a/src/pages/docs/platform/integrations/streaming/kafka.mdx b/src/pages/docs/platform/integrations/streaming/kafka.mdx
new file mode 100644
index 0000000000..d38b3a5cfd
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/kafka.mdx
@@ -0,0 +1,52 @@
+---
+title: Apache Kafka integration
+meta_description: "Send data to Kafka based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Kafka, integrations, events, serverless"
+redirect_from:
+ - /docs/general/firehose/kafka-rule
+---
+
+[Kafka](https://kafka.apache.org/) integrations enable you to automatically forward events that occur in Ably to Kafka topics.
+
+## Create a Kafka integration
+
+To create a Kafka integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with Kafka.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Firehose**.
+5. Choose **Kafka**.
+6. Configure the Kafka [settings](#settings).
+7. Click **Create**.
+
+You can also create a Kafka integration using the [Control API](/docs/platform/account/control-api).
+
+
+### Settings
+
+The following settings are available when creating a Kafka integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| [Source](/docs/integrations/streaming#sources) | Specifies the event types being sent to Kafka. |
+| [Channel filter](/docs/integrations/streaming#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/platform/integrations/streaming#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
+| Routing key | Specifies the [routing key](/docs/messages#routing) used to route messages to Kafka topics. |
+| Mechanism | The [SASL/SCRAM mechanism](#mechanism) used for Kafka connection. |
+| Username | The username to connect to Kafka with. |
+| Password | The password to connect to Kafka with. |
+| Brokers | List of Kafka broker endpoints in the format `host:port`. |
+
+### Authentication mechanism
+
+The available authentication mechanisms for Kafka are:
+
+* [SASL/PLAIN](https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_plain.html#kafka-sasl-auth-plain)
+* [SASL/SCRAM-SHA-256](https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_scram.html#kafka-sasl-auth-scram)
+* [SASL/SCRAM-SHA-512](https://docs.confluent.io/platform/current/kafka/authentication_sasl/authentication_sasl_scram.html)
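+
+On the consuming side, your Kafka client's SASL settings should mirror the rule's Mechanism, Username, Password, and Brokers fields. The object below is a sketch in the shape used by the kafkajs library; the broker address and credentials are placeholders, and other Kafka clients take equivalent settings in their own shapes:
+
+```javascript
+// Client configuration mirroring the rule's connection settings.
+// Placeholder values throughout; pass it to e.g. `new Kafka(kafkaConfig)`
+// with kafkajs, then consume from the target topic.
+const kafkaConfig = {
+  clientId: 'ably-consumer',
+  brokers: ['broker1.example.com:9092'], // matches the rule's Brokers list
+  ssl: true,
+  sasl: {
+    mechanism: 'scram-sha-256', // or 'plain' / 'scram-sha-512'
+    username: 'example-user',
+    password: 'example-password',
+  },
+};
+console.log(kafkaConfig.sasl.mechanism); // scram-sha-256
+```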
diff --git a/src/pages/docs/platform/integrations/streaming/kinesis.mdx b/src/pages/docs/platform/integrations/streaming/kinesis.mdx
new file mode 100644
index 0000000000..e0f6acb8f7
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/kinesis.mdx
@@ -0,0 +1,111 @@
+---
+title: AWS Kinesis integration
+meta_description: "Send data to Kinesis based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Kinesis, integrations, events, serverless"
+redirect_from:
+ - /docs/general/firehose/kinesis-rule
+---
+
+[Kinesis](https://aws.amazon.com/kinesis/) integrations enable you to automatically forward events that occur in Ably to AWS Kinesis streams.
+
+## Create a Kinesis integration
+
+To create a Kinesis integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with Kinesis.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Firehose**.
+5. Choose **AWS Kinesis**.
+6. Configure the Kinesis [settings](#settings).
+7. Click **Create**.
+
+You can also create a Kinesis integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating a Kinesis integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| AWS Region | Specifies the AWS region of your Kinesis Stream. |
+| Stream Name | Defines the name of the Kinesis Stream to connect to. |
+| [AWS authentication scheme](#auth) | Choose the authentication method. Either **AWS credentials** or **ARN of an assumable role**. |
+| AWS Credentials | If using AWS credentials, enter the values in `key:value` format. |
+| ARN of an assumable role | If using ARN of an assumable role, enter the ARN of the role that Ably can assume to access your Kinesis stream. |
+| [Source](/docs/integrations/streaming#sources) | Specifies the event types being sent to Kinesis. |
+| [Channel filter](/docs/integrations/streaming#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/streaming#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
+| Partition key | Specifies the [partition key](/docs/messages#routing) used to route messages to a Kinesis stream shard. |
+
+## AWS authentication
+
+Delegate access to your AWS resources by creating an [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that the Ably AWS account can assume.
+
+This approach follows AWS best practices, as it avoids sharing access keys directly. Specify the role's ARN to grant Ably the necessary permissions in a secure manner.
+
+### Create a Kinesis policy
+
+The following steps show you how to create a policy for AWS Kinesis.
+
+1. In the IAM console sidebar select **Policies**.
+2. Click **Create Policy**.
+3. Click the JSON tab and enter the following JSON to configure the policy:
+
+
+```json
+{
+ "Version": "2012-10-17",
+ "Statement": [
+ {
+ "Sid": "ReadWriteToSingleStream",
+ "Effect": "Allow",
+ "Action": [
+ "kinesis:DescribeLimits",
+ "kinesis:DescribeStream",
+ "kinesis:GetShardIterator",
+ "kinesis:GetRecords",
+ "kinesis:ListTagsForStream",
+ "kinesis:MergeShards",
+ "kinesis:PutRecord",
+ "kinesis:PutRecords",
+ "kinesis:UpdateShardCount"
+ ],
+ "Resource": [
+        "arn:aws:kinesis:<REGION>:<ACCOUNT_ID>:stream/<STREAM_NAME>"
+ ]
+ }
+ ]
+}
+```
+
+4. Click **Next: Tags**. You don't need to add any tags.
+5. Click **Next: Review**.
+6. Enter a suitable name for your policy.
+7. Click **Create Policy**.
+
+You have created a policy that grants the permissions required to use a Kinesis stream. You must now attach it to the role that you'll specify in your Ably integration rule.
+
+### Create a role
+
+Create an IAM role as follows:
+
+1. In the AWS IAM console, click **Roles** in the sidebar and then click **Create Role**.
+2. For type of trusted entity select **Another AWS account**.
+3. For Account ID specify 203461409171. This is the Ably AWS account.
+4. Click the **Require external ID** checkbox and then enter an external ID in the format `account_id.app_id`, using your [account ID and app ID](/docs/platform/account/control-api#ids).
+5. Click **Next: Permissions**.
+6. Now select the policy you created earlier to attach to this role. You can type the name of your policy into the **Filter policies** search box. Ensure the checkbox for the policy is selected.
+
+7. Click **Next: Tags**.
+8. You don't need to add tags so click **Next: Review**.
+9. Enter a suitable name for your role.
+10. Click **Create Role**.
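+
+The resulting role's trust policy should look broadly like the following, with the Ably AWS account from step 3 as the trusted principal and your external ID in the `sts:ExternalId` condition. The AWS console generates this for you when you follow the steps above, so this is shown for reference only; the external ID value is a placeholder:
+
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Principal": { "AWS": "arn:aws:iam::203461409171:root" },
+      "Action": "sts:AssumeRole",
+      "Condition": {
+        "StringEquals": { "sts:ExternalId": "<YOUR_EXTERNAL_ID>" }
+      }
+    }
+  ]
+}
+```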
diff --git a/src/pages/docs/platform/integrations/streaming/pulsar.mdx b/src/pages/docs/platform/integrations/streaming/pulsar.mdx
new file mode 100644
index 0000000000..f03f0b7cb8
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/pulsar.mdx
@@ -0,0 +1,43 @@
+---
+title: Apache Pulsar integration
+meta_description: "Send data to Pulsar based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Pulsar, integrations, events, serverless"
+redirect_from:
+ - /docs/general/firehose/pulsar-rule
+---
+
+[Pulsar](https://pulsar.apache.org) integrations enable you to automatically forward events that occur in Ably to Pulsar topics.
+
+## Create a Pulsar integration
+
+To create a Pulsar integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with Pulsar.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Firehose**.
+5. Choose **Pulsar**.
+6. Configure the Pulsar [settings](#settings).
+7. Click **Create**.
+
+You can also create a Pulsar integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating a Pulsar integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| [Source](/docs/integrations/streaming#sources) | Specifies the event types being sent to Pulsar. |
+| [Channel filter](/docs/integrations/streaming#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/platform/integrations/streaming#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
+| Routing key | Specifies the [routing key](/docs/messages#routing) used to route messages to Pulsar topics. |
+| Topic | Defines the Pulsar topic to publish messages to. Must be in the format `tenant/namespace/topic_name`. |
+| Service URL | Specifies the Pulsar cluster URL in the format `pulsar://host:port` or `pulsar+ssl://host:port`. |
+| JWT Token | JWT to use for authentication. |
+| TLS trust certificates | Specify a list of trusted CA certificates to verify TLS certificates presented by Pulsar. |
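+
+The Topic setting must follow the `tenant/namespace/topic_name` format exactly. As a small sketch, you could validate that shape before saving a rule:
+
+```javascript
+// Check that a Pulsar topic string has exactly three non-empty,
+// slash-separated segments: tenant/namespace/topic_name.
+function isValidPulsarTopic(topic) {
+  return /^[^\/]+\/[^\/]+\/[^\/]+$/.test(topic);
+}
+
+console.log(isValidPulsarTopic('my-tenant/my-namespace/events')); // true
+console.log(isValidPulsarTopic('events')); // false
+```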
diff --git a/src/pages/docs/platform/integrations/streaming/sqs.mdx b/src/pages/docs/platform/integrations/streaming/sqs.mdx
new file mode 100644
index 0000000000..1ce4282fc2
--- /dev/null
+++ b/src/pages/docs/platform/integrations/streaming/sqs.mdx
@@ -0,0 +1,112 @@
+---
+title: AWS SQS integration
+meta_description: "Send data to SQS based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "SQS, integrations, events, serverless"
+redirect_from:
+ - /docs/general/firehose/sqs-rule
+---
+
+[SQS](https://aws.amazon.com/sqs) integrations enable you to automatically forward events that occur in Ably to AWS SQS queues.
+
+## Create an SQS integration
+
+To create an SQS integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with SQS.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Firehose**.
+5. Choose **AWS SQS**.
+6. Configure the SQS [settings](#settings).
+7. Click **Create**.
+
+You can also create an SQS integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating an SQS integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| URL | Specifies the URL for the SQS queue, including credentials, region, and queue name. Only HTTPS is supported. |
+| AWS Region | Specifies the AWS region of your SQS queue. |
+| [AWS authentication scheme](#auth) | Choose the authentication method. Either **AWS credentials** or **ARN of an assumable role**. |
+| AWS Credentials | If using AWS credentials, enter the values in `key:value` format. |
+| ARN of an assumable role | If using ARN of an assumable role, enter the ARN of the role that Ably can assume to access your SQS queue. |
+| [Source](/docs/integrations/streaming#sources) | Specifies the event types being sent to SQS. |
+| [Channel filter](/docs/integrations/streaming#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/streaming#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
+
+## AWS authentication
+
+Delegate access to your AWS resources by creating an [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that the Ably AWS account can assume.
+
+This approach follows AWS best practices, as it avoids sharing access keys directly. Specify the role's ARN to grant Ably the necessary permissions in a secure manner.
+
+### Create an SQS policy
+
+The following steps show you how to create a policy for AWS SQS.
+
+1. In the IAM console sidebar select **Policies**.
+2. Click **Create Policy**.
+3. Click the JSON tab and enter the following JSON to configure the policy:
+
+
+```json
+{
+ "Version": "2012-10-17",
+ "Statement": [
+ {
+ "Sid": "AllowReadWriteSQS",
+ "Effect": "Allow",
+ "Action": [
+ "sqs:DeleteMessage",
+ "sqs:TagQueue",
+ "sqs:GetQueueUrl",
+ "sqs:ChangeMessageVisibility",
+ "sqs:DeleteMessageBatch",
+ "sqs:SendMessageBatch",
+ "sqs:UntagQueue",
+ "sqs:ReceiveMessage",
+ "sqs:SendMessage",
+ "sqs:ListQueueTags",
+ "sqs:ChangeMessageVisibilityBatch"
+ ],
+ "Resource": [
+        "arn:aws:sqs:<REGION>:<ACCOUNT_ID>:<QUEUE_NAME>"
+ ]
+ }
+ ]
+}
+```
+
+4. Click **Next: Tags**. You don't need to add any tags.
+5. Click **Next: Review**.
+6. Enter a suitable name for your policy.
+7. Click **Create Policy**.
+
+You have created a policy that grants the permissions required to use an SQS queue.
+
+### Create a role
+
+Create an IAM role as follows:
+
+1. In the AWS IAM console, click **Roles** in the sidebar and then click **Create Role**.
+2. For type of trusted entity select **Another AWS account**.
+3. For Account ID specify 203461409171. This is the Ably AWS account.
+4. Click the **Require external ID** checkbox and then enter an external ID in the format `account_id.app_id`, using your [account ID and app ID](/docs/platform/account/control-api#ids).
+5. Click **Next: Permissions**.
+6. Now select the policy you created earlier to attach to this role. You can type the name of your policy into the **Filter policies** search box. Ensure the checkbox for the policy is selected.
+
+7. Click **Next: Tags**.
+8. You don't need to add tags so click **Next: Review**.
+9. Enter a suitable name for your role.
+10. Click **Create Role**.
From 4a3c9a3b7213b942de771cab359c36c2cf28702c Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:29:12 +0200
Subject: [PATCH 3/7] Convert webhook integrations to MDX
---
content/integrations/webhooks/azure.textile | 41 --
.../integrations/webhooks/cloudflare.textile | 39 --
.../webhooks/gcp-function.textile | 42 --
content/integrations/webhooks/ifttt.textile | 82 ---
content/integrations/webhooks/index.textile | 475 -----------------
content/integrations/webhooks/lambda.textile | 106 ----
content/integrations/webhooks/zapier.textile | 41 --
.../platform/integrations/webhooks/azure.mdx | 40 ++
.../integrations/webhooks/cloudflare.mdx | 38 ++
.../integrations/webhooks/gcp-function.mdx | 41 ++
.../platform/integrations/webhooks/ifttt.mdx | 87 +++
.../platform/integrations/webhooks/index.mdx | 501 ++++++++++++++++++
.../platform/integrations/webhooks/lambda.mdx | 106 ++++
.../platform/integrations/webhooks/zapier.mdx | 40 ++
14 files changed, 853 insertions(+), 826 deletions(-)
delete mode 100644 content/integrations/webhooks/azure.textile
delete mode 100644 content/integrations/webhooks/cloudflare.textile
delete mode 100644 content/integrations/webhooks/gcp-function.textile
delete mode 100644 content/integrations/webhooks/ifttt.textile
delete mode 100644 content/integrations/webhooks/index.textile
delete mode 100644 content/integrations/webhooks/lambda.textile
delete mode 100644 content/integrations/webhooks/zapier.textile
create mode 100644 src/pages/docs/platform/integrations/webhooks/azure.mdx
create mode 100644 src/pages/docs/platform/integrations/webhooks/cloudflare.mdx
create mode 100644 src/pages/docs/platform/integrations/webhooks/gcp-function.mdx
create mode 100644 src/pages/docs/platform/integrations/webhooks/ifttt.mdx
create mode 100644 src/pages/docs/platform/integrations/webhooks/index.mdx
create mode 100644 src/pages/docs/platform/integrations/webhooks/lambda.mdx
create mode 100644 src/pages/docs/platform/integrations/webhooks/zapier.mdx
diff --git a/content/integrations/webhooks/azure.textile b/content/integrations/webhooks/azure.textile
deleted file mode 100644
index 31b3242ef9..0000000000
--- a/content/integrations/webhooks/azure.textile
+++ /dev/null
@@ -1,41 +0,0 @@
----
-title: Azure Functions integration
-meta_description: "Trigger Microsoft Azure functions based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Microsoft Azure, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/events/azure
- - /docs/general/webhooks/azure
----
-
-"Azure Function":https://azure.microsoft.com/en-gb/services/functions/ integrations enable you to trigger Microsoft's event-driven serverless compute functions when an event occurs in Ably.
-
-h2(#create). Create an Azure Function integration
-
-To create an Azure Function integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Login and select the application you wish to integrate with an Azure Function.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose *Webhook*.
-5. Choose *Azure Functions*.
-6. Configure the Azure Functions "settings":#settings.
-7. Click *Create*.
-
-You can also create an Azure Function integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating an Azure Function integration:
-
-|_. Setting |_. Description |
-| Azure App ID | The ID of your Azure App. |
-| Function Name | The name of your Azure Function. |
-| Headers | Allows the inclusion of additional information in key-value format. |
-| Request Mode | Choose between *Single Request* or *Batch Request*. |
-| "Source":/docs/integrations/webhooks#sources | Specifies the event types being sent to Azure Functions. |
-| "Channel filter":/docs/integrations/webhooks#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/webhooks#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. Only available when @Request Mode@ is set to @Single@. |
-| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when @Request Mode@ is set to @Batched@. |
diff --git a/content/integrations/webhooks/cloudflare.textile b/content/integrations/webhooks/cloudflare.textile
deleted file mode 100644
index 0befffa7c8..0000000000
--- a/content/integrations/webhooks/cloudflare.textile
+++ /dev/null
@@ -1,39 +0,0 @@
----
-title: Cloudflare Worker integration
-meta_description: "Trigger Cloudflare Workers based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Cloudflare Workers, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/events/cloudflare
- - /docs/general/webhooks/cloudflare
----
-
-"Cloudflare Worker":https://workers.cloudflare.com integrations enable Cloudflare's Edge Network to distribute your JavaScript-based functions when an event occurs in Ably.
-
-h2(#create). Create a Cloudflare Worker integration
-
-To create a Cloudflare Worker integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with a Cloudflare Worker.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose *Webhook*.
-5. Choose *Cloudflare Workers*.
-6. Configure the Cloudflare Worker "settings":#settings.
-7. Click *Create*.
-
-You can also create a Cloudflare Worker integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating a Cloudflare Worker integration:
-
-|_. Setting |_. Description |
-| URL | The URL of the Cloudflare Worker to POST a summary of events to. |
-| Headers | Allows the inclusion of additional information in key-value format. |
-| Request Mode | Choose between *Single Request* or *Batch Request*. |
-| "Source":/docs/integrations/webhooks#sources | Specifies the event types being sent to Cloudflare Workers. |
-| "Channel filter":/docs/integrations/webhooks#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when @Request Mode@ is set to @Batched@. |
diff --git a/content/integrations/webhooks/gcp-function.textile b/content/integrations/webhooks/gcp-function.textile
deleted file mode 100644
index e4bb70c82b..0000000000
--- a/content/integrations/webhooks/gcp-function.textile
+++ /dev/null
@@ -1,42 +0,0 @@
----
-title: Google Function integration
-meta_description: "Trigger Google Functions based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Google Functions, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/events/google-functions
- - /docs/general/webhooks/google-functions
----
-
-"Google Function":https://cloud.google.com/functions integrations enable you to trigger event-driven serverless compute functions when an event occurs in Ably.
-
-h2(#create). Create a Google Function integration
-
-To create a Google Function integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with a Google Function.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose *Webhook*.
-5. Choose *Google Functions*.
-6. Configure the Google Function "settings":#settings.
-7. Click *Create*.
-
-You can also create a Google Function integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating a Google Function integration:
-
-|_. Setting |_. Description |
-| Region | The region in which your Google Function is hosted. |
-| Project ID | The ID of your Google Cloud Project. |
-| Function | The name of your Google Function. |
-| Headers | Allows the inclusion of additional information in key-value format. |
-| Request Mode | Choose between *Single Request* or *Batch Request*. |
-| "Source":/docs/integrations/webhooks#sources | Specifies the event types being sent to your Google Function. |
-| "Channel filter":/docs/integrations/webhooks#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/webhooks#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. Only available when @Request Mode@ is set to @Single@. |
-| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when @Request Mode@ is set to @Batched@. |
diff --git a/content/integrations/webhooks/ifttt.textile b/content/integrations/webhooks/ifttt.textile
deleted file mode 100644
index 44f6738ef3..0000000000
--- a/content/integrations/webhooks/ifttt.textile
+++ /dev/null
@@ -1,82 +0,0 @@
----
-title: IFTTT integration
-meta_description: "Trigger IFTTT based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "IFTTT, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/events/ifttt
- - /docs/general/webhooks/ifttt
----
-
-"IFTTT":https://ifttt.com/maker_webhooks (If This Then That) integrations enable you to trigger conditional chains, and help to combine various services together when an event occurs in Ably.
-
-h2(#create). Create an IFTTT integration
-
-To create an IFTTT integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with IFTTT.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose *Webhook*.
-5. Choose *IFTTT*.
-6. Configure the IFTTT "settings":#settings.
-7. Click *Create*.
-
-You can also create an IFTTT integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating an IFTTT integration:
-
-|_. Setting |_. Description |
-| IFTTT Webhook key | The webhook key for your IFTTT account. |
-| Event name | The name used to identify the IFTTT applet. |
-| "Source":/docs/integrations/webhooks#sources | Specifies the event types being sent to IFTTT. |
-| "Channel filter":/docs/integrations/webhooks#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-
-h2(#restrictions). Restrictions
-
-IFTTT has limitations on the data it can process. All payloads must be @JSON@ and use only the keys @value1@, @value2@, or @value3@. Any other format or additional keys will not be processed.
-
-As a result, "enveloping":/docs/integrations/webhooks#enveloped and "batching":/docs/integrations/webhooks#batching are not supported. Additionally, protocols that require decoding, such as "MQTT":/docs/protocols/mqtt, are not supported with IFTTT.
-
-To ensure data is processed by IFTTT, it must match the following required structure:
-
-```[json]
-{
- "value1": "data I want to send 1",
- "value2": "data I want to send 2",
- "value3": "data I want to send 3"
-}
-```
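Because payloads outside this shape are dropped, it can help to validate data before publishing it to a channel that feeds an IFTTT rule. The following helper is an illustrative sketch only; the function name is not part of any Ably or IFTTT API:

```[javascript]
// Hypothetical helper: returns true only if a payload uses the keys
// IFTTT's webhook trigger accepts (value1, value2, value3).
function isIftttCompatible(payload) {
  const allowed = new Set(['value1', 'value2', 'value3']);
  return (
    payload !== null &&
    typeof payload === 'object' &&
    !Array.isArray(payload) &&
    Object.keys(payload).every((key) => allowed.has(key))
  );
}
```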
-
-For a "message data":/docs/api/realtime-sdk/messages#data or "presence message data":/docs/api/realtime-sdk/presence#presence-message of @{ "value1": "My first message", "value2": "My second message" }@, the following would be sent to your IFTTT endpoint:
-
-Headers:
-
-```[text]
-host: https://maker.ifttt.com/trigger/{YOUR_EVENT}/with/key/{YOUR_IFTTT_KEY}
-content-type: application/json
-x-ably-envelope-appid: {YOUR_APP_ID}
-x-ably-envelope-channel: {YOUR_CHANNEL}
-x-ably-envelope-rule-id: {YOUR_RULE_ID}
-x-ably-envelope-site: {ably-server-location}
-x-ably-envelope-source: channel.message
-x-ably-message-encoding: json
-x-ably-message-id: {UNIQUE_ABLY_MESSAGE_ID}
-x-ably-message-timestamp: {TIMESTAMP_ORIGINAL_MESSAGE_WAS_SENT}
-x-ably-version: 1.2
-content-length: 18
-connection: keep-alive
-```
-
-Payload:
-
-```[json]
-{
- "value1": "My first message",
- "value2": "My second message"
-}
-```
diff --git a/content/integrations/webhooks/index.textile b/content/integrations/webhooks/index.textile
deleted file mode 100644
index c528fdd61e..0000000000
--- a/content/integrations/webhooks/index.textile
+++ /dev/null
@@ -1,475 +0,0 @@
----
-title: Outbound webhooks overview
-meta_description: "A guide on webhook payloads, including batched, enveloped, and non-enveloped event payloads, with decoding examples and sources."
-meta_keywords: "webhooks, Ably, payloads, batched events, enveloped events, non-enveloped events, message decoding, presence events, channel lifecycle, data processing"
-languages:
- - javascript
- - nodejs
- - php
- - python
- - ruby
- - java
- - swift
- - objc
- - csharp
- - go
-redirect_from:
- - /docs/general/functions
- - /docs/general/events
- - /docs/general/webhooks
----
-
-Outbound webhook integrations enable you to trigger serverless functions and notify HTTP endpoints when events occur in Ably.
-
-Events include when messages are "published":/docs/pub-sub#publish, when "presence":/docs/presence-occupancy/presence#trigger-events events occur, changes in channel occupancy and when channels are created or discarded. Data can be delivered individually or in batches to external services.
-
-
-
-
-
-There are two ways to create an outbound webhook integration:
-
-* Using the "Ably dashboard":#dashboard.
-* Using the "Control API":#api.
-
-h2(#filter). Channel filter
-
-Set a filter to restrict which channels an integration applies to using a regular expression.
-
-The following examples demonstrate channel names that you can match against using regular expressions to control which channels a webhook rule applies to:
-
-```[text]
-mychannel:public
-public
-public:events
-public:events:conferences
-public:news:americas
-public:news:europe
-```
-
-* @^public.*@ — Matches any channel that starts with @public@. This includes @public@, both @public:events@ channels, and both @public:news@ channels.
-* @^public$@ — Matches only channels named exactly @public@.
-* @:public$@ — Matches channels that end with @:public@. This includes only @mychannel:public@.
-* @^public:events$@ — Matches channels named exactly @public:events@. This does not include @public:events:conferences@.
-* @^public.*europe$@ — Matches channels that start with @public@ and end with @europe@. This includes only @public:news:europe@.
-* @news@ — Matches any channel name that includes the word @news@. This includes @public:news:americas@ and @public:news:europe@.
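As a sketch of how these filters behave, the same expressions can be tested locally with any regular-expression engine. The example below uses JavaScript's @RegExp@ against the channel names listed above; it is illustrative only, since Ably applies the matching server-side:

```[javascript]
// Illustrative only: testing the example channel names against the
// '^public.*' filter expression using JavaScript's RegExp.
const channels = [
  'mychannel:public',
  'public',
  'public:events',
  'public:events:conferences',
  'public:news:americas',
  'public:news:europe'
];

const filter = /^public.*/;
const matched = channels.filter((name) => filter.test(name));
console.log(matched); // every channel except 'mychannel:public'
```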
-
-h2(#sources). Event types
-
-You can configure webhooks to listen for the following event types:
-
-|_. Event type |_. Description |
-| @channel.lifecycle@ | Triggered when a channel is created or discarded. |
-| @channel.message@ | Triggered when "messages":/docs/messages are published. |
-| @channel.occupancy@ | Triggered when the number of users in a channel "changes":/docs/channels/options#occupancy. |
-| @channel.presence@ | Triggered when users enter, leave, or update their "presence":/docs/presence-occupancy/presence. |
-
-
-
-h2(#single). Single requests
-
-In single request mode, a @POST@ request is made to your endpoint each time an event occurs.
-
-This is useful where an endpoint can only process one message per request; however, it can lead to the endpoint being overloaded in high-throughput scenarios. Single requests are best suited to cases where there is a 1:1 relationship between the messages being sent and the events being triggered.
-
-Multiple requests can be in flight at once; however, be aware that there is a "limit on concurrency":/docs/platform/pricing/limits#integrations. If the limit is exceeded, new messages are placed in a short queue of 10 messages. If that queue is also exceeded, further messages are rejected.
-
-Ably will retry failed @5XX@ requests. If the response times out, Ably will retry twice, first after 4 seconds and then again after 20 seconds.
-
-h2(#batched). Batched requests
-
-Batched requests are useful for endpoints that have the potential to be overloaded by requests, or have no requirement to process messages one-by-one.
-
-Batched requests are published at most once per second, but this may vary by integration. Once a batched request is triggered, all other events will be queued so that they can be delivered in a batch in the next request. The next request will be issued within one second with the following caveats:
-
-* Only a limited number of HTTP requests are in-flight at one time for each configured integration. Therefore, if you want to be notified quickly, you should accept requests quickly and defer any work to be done asynchronously.
-* If there are more than 1,000 events queued for a payload, the oldest 1,000 events will be bundled into this payload and the remaining events will be delivered in the subsequent payload. Therefore, if your sustained rate of events is expected to be more than 1,000 per second or your servers are slow to respond, then it is possible a backlog will build up and you will not receive all events.
-
-If a batched request fails, Ably will retry the request using an exponential backoff strategy.
-
-The backoff delay follows the formula: @delay = delay * sqrt(2)@ where the initial delay is 1 second. For example, if a webhook request fails repeatedly, the retry delays will be:
-
-* Initial request := 1.4s wait → 1st retry.
-* 1st retry := 2s wait → 2nd retry.
-* 2nd retry := 2.8s wait → 3rd retry.
-* 3rd retry := 4s wait → 4th retry.
-* 4th retry := 5.6s wait → successful request.
-
-The backoff for consecutively failing requests will increase until it reaches 60s. All subsequent retries for failed requests will then be made at 60s intervals until a request is successful. The queue of events is retained for 5 minutes. If an event cannot be delivered within that time, it is discarded to prevent the queue from growing indefinitely.
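As an illustrative sketch (not part of the Ably SDK), the backoff schedule above can be reproduced by multiplying the delay by @sqrt(2)@ on each failure and capping it at 60 seconds:

```[javascript]
// Sketch of the documented retry backoff: delay = delay * sqrt(2),
// starting from an initial delay of 1 second and capped at 60 seconds.
// Values are truncated to one decimal place for display.
function backoffDelays(retries) {
  const delays = [];
  let delay = 1;
  for (let i = 0; i < retries; i++) {
    delay = Math.min(delay * Math.SQRT2, 60);
    delays.push(Math.floor(delay * 10) / 10);
  }
  return delays;
}

console.log(backoffDelays(5)); // [ 1.4, 2, 2.8, 4, 5.6 ]
```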
-
-h3(#batched-events). Batched event payloads
-
-Given the various potential combinations of enveloped, batched, and message sources, it's helpful to understand what to expect in different scenarios.
-
-Batched events will have the following headers:
-
-|_. Header |_. Description |
-| @content-type@ | The type of the payload. This will be @application/json@ or @application/x-msgpack@. |
-| @x-ably-version@ | The version of the Webhook. Currently, this is @1.2@. |
-
-Each batched message will have the following fields:
-
-|_. Field |_. Description |
-| @name@ | The event type, for example, @presence.message@, @channel.message@, or @channel.closed@. |
-| @webhookId@ | A unique internal ID for the configured webhook. |
-| @source@ | The source of the webhook, which will be one of @channel.message@, @channel.presence@, @channel.lifecycle@, or @channel.occupancy@. |
-| @timestamp@ | A timestamp in milliseconds since the epoch for the event. |
-| @data@ | An object containing the event data, defined below in "JSONPath format":https://goessner.net/articles/JsonPath/. |
-
-h4(#batched-message). Batched message events
-
-For @message@ events, the @data@ field will contain the following:
-
-|_. Field |_. Description |
-| @data.channelId@ | The name of the channel that the message event belongs to. |
-| @data.site@ | An internal site identifier indicating the primary datacenter that processed the event. |
-| @data.messages@ | An @Array@ of raw messages. |
-
-The following example is a batched @message@ payload:
-
-```[json]
-{
- "items": [{
- "webhookId": "ABcDEf",
- "source": "channel.message",
- "serial": "a7bcdEFghIjklm123456789:4",
- "timestamp": 1562124922426,
- "name": "channel.message",
- "data": {
- "channelId": "chat-channel-4",
- "site": "eu-west-1-A",
- "messages": [{
- "id": "ABcDefgHIj:1:0",
- "clientId": "user-3",
- "connectionId": "ABcDefgHIj",
- "timestamp": 1123145678900,
- "data": "the message data",
- "name": "a message name"
- }]
- }
- }]
-}
-```
-
-h4(#decode-messages). Decode batched messages
-
-The Ably SDK automatically decodes messages sent over the Realtime service into @Message@ objects. However, batched, enveloped webhook payloads require explicit decoding using:
-
- "@Message.fromEncodedArray@":/docs/api/realtime-sdk/messages#message-from-encoded-array := For an array of messages.
- "@Message.fromEncoded@":/docs/api/realtime-sdk/messages#message-from-encoded := For a single message.
-
-The benefits of decoding include fully restoring @data@ to its original datatype using the @encoding@ field. Additionally, it supports automatic decryption when an "encryption":/docs/channels/options/encryption key is provided. It's recommended to decode all messages received via webhooks to ensure proper data handling.
-
-The following example demonstrates how to decode an array of messages received via a webhook:
-
-```[javascript]
-webhookMessage.items.forEach((item) => {
- const messages = Ably.Realtime.Message.fromEncodedArray(item.data.messages);
- messages.forEach((message) => {
- console.log(message.toString());
- })
-})
-```
-
-h3(#batched-structure). Batched presence structure
-
-Batched presence events group multiple presence messages into a single payload. The @data@ field for @presence@ events contains:
-
-- @data.channelId@ := The name of the channel the presence event belongs to.
-- @data.site@ := An internal site identifier, indicating the primary datacenter the member is present in.
-- @data.presence@ := An @Array@ of raw presence messages.
-
-The following is an example of a batched @presence@ payload:
-
-```[json]
-{
- "items": [{
- "webhookId": "ABcDEf",
- "source": "channel.presence",
- "serial": "a7bcdEFghIjklm123456789:4",
- "timestamp": 1562124922426,
- "name": "presence.message",
- "data": {
- "channelId": "education-channel",
- "site": "eu-west-1-A",
- "presence": [{
- "id": "ABcDefgHIj:1:0",
- "clientId": "bob",
- "connectionId": "ABcDefgHIj",
- "timestamp": 1123145678900,
- "data": "the message data",
- "action": 4
- }]
- }
- }]
-}
-```
-
-h4(#decode-presence). Decode batched presence
-
-Presence messages sent "over the realtime service":/docs/channels are automatically decoded into "@PresenceMessage@":/docs/api/realtime-sdk/types#presence-message objects by the Ably client library. For webhooks, you need to do this manually using "@PresenceMessage.fromEncodedArray@":/docs/api/realtime-sdk/presence#presence-from-encoded-array on the @data.presence@ array, or "@PresenceMessage.fromEncoded@":/docs/api/realtime-sdk/presence#presence-from-encoded on an individual entry. These methods convert the encoded values into "@PresenceMessage@":/docs/api/realtime-sdk/types#presence-message objects—either as an array or a single instance.
-
-This allows you to decode the numerical action into a "@Presence action@":/docs/api/realtime-sdk/presence#presence-action string (such as "@enter@", "@update@", or "@leave@"), fully decode the @data@ (using the @encoding@) back into its original datatype or an equivalent in the client library, and, if you’re using "encryption":/docs/channels/options/encryption, pass your encryption key to decrypt the @data@.
-
-The following example demonstrates how to decode an array of presence messages received via a webhook:
-
-```[javascript]
-webhookMessage.items.forEach((item) => {
- const messages = Ably.Realtime.PresenceMessage.fromEncodedArray(item.data.presence);
- messages.forEach((message) => {
- console.log(message.toString());
- })
-})
-```
-
-h3(#batched-lifecycle). Batched channel lifecycle structure
-
-Ably includes the following fields in the @data@ object for batched @channel.lifecycle@ events:
-
-- @data.channelId@ := The name of the channel where the lifecycle event occurred.
- @data.status@ := A "@ChannelStatus@":/docs/api/realtime-sdk/channel-metadata#channel-details object that describes the channel's current state.
-
-The @name@ of a @channel.lifecycle@ event will be @channel.opened@ or @channel.closed@.
-
-The following is an example of a batched @channel.lifecycle@ payload:
-
-```[json]
-{
- "items": [{
- "webhookId": "ABcDEf",
- "source": "channel.lifecycle",
- "timestamp": 1562124922426,
- "serial": "a7bcdEFghIjklm123456789:4",
- "name": "channel.opened",
- "data": {
- "channelId": "chat-channel-5",
- "name": "chat-channel-5",
- "status": {
- "isActive": true,
- "occupancy": {
- "metrics": {
- "connections": 1,
- "publishers": 1,
- "subscribers": 1,
- "presenceConnections": 1,
- "presenceMembers": 0,
- "presenceSubscribers": 1,
- "objectPublishers": 1,
- "objectSubscribers": 1
- }
- }
- }
- }
- }]
-}
-```
-
-h2(#enveloped). Enveloped events
-
-Enveloping events adds structured metadata such as the publisher's @clientId@ and the originating channel name, alongside the payload.
-
-This metadata is useful when processing events dynamically or when additional context about the source is required. Enveloped messages are recommended for most use cases, as they provide a consistent format for all events.
-
-Enveloped events include the following headers:
-
-|_. Header |_. Description |
-| @x-ably-version@ | The version of the webhook. Currently, this is @1.2@. |
-| @content-type@ | Indicates the payload format, which can be either @application/json@ or @application/x-msgpack@ for enveloped messages. |
-
-Each enveloped message contains the following fields:
-
-|_. Field |_. Description |
-| @source@ | The origin of the webhook event. Possible values are: @channel.message@, @channel.presence@, @channel.lifecycle@, @channel.occupancy@ |
-| @appId@ | The Ably app that generated the event. |
-| @channel@ | The Ably channel where the event occurred. |
-| @site@ | The Ably datacenter that sent the message. |
-| @timestamp@ | A timestamp in milliseconds since the epoch representing when the presence event occurred. |
-
-In addition, the envelope contains a field holding the actual message data, named according to the message type (for example, @messages@ or @presence@).
-
-h3(#enveloped-message-events). Enveloped message events
-
-For @message@ events, the @messages@ array contains a raw message.
-
-The following is an example of an enveloped @message@ payload:
-
-```[json]
-{
- "source": "channel.message",
- "appId": "aBCdEf",
- "channel": "channel-name",
- "site": "eu-central-1-A",
- "ruleId": "1-a2Bc",
- "messages": [{
- "id": "ABcDefgHIj:1:0",
- "connectionId": "ABcDefgHIj",
- "timestamp": 1123145678900,
- "data": "some message data",
- "name": "my message name"
- }]
-}
-```
-
-h4(#decode). Decode enveloped messages
-
-Ably SDKs automatically decode messages into @Message@ objects. Messages sent via an integration need to be decoded manually.
-
-There are two methods available for decoding messages into @Message@ objects:
-
-* @Message.fromEncodedArray()@ for an array of messages.
-* @Message.fromEncoded()@ for single messages.
-
-There are also equivalent methods for decoding presence messages into @PresenceMessage@ objects:
-
-* @PresenceMessage.fromEncodedArray()@ for an array of presence messages.
-* @PresenceMessage.fromEncoded()@ for single messages.
-
-Decoding is essential because it reconstructs the original data payload using the encoding field, ensuring the correct data type is restored, whether it's a string, binary, or structured object. If the message was encrypted, passing your encryption key to the method allows the SDK to decrypt data automatically.
-
-Ably strongly recommends decoding all messages received over integrations before processing them to avoid issues with unexpected data formats.
-
-The following example demonstrates how to decode an array of messages received via a webhook:
-
-```[javascript]
-const messages = Ably.Realtime.Message.fromEncodedArray(item.messages);
-
-messages.forEach((message) => {
- console.log(message.toString());
-});
-```
-
-h3(#presence). Enveloped presence events
-
-Webhook "presence":/docs/presence-occupancy/presence events contain raw @presence@ data in the @presence@ array.
-
-The following is an example of an enveloped @presence@ payload:
-
-```[json]
-{
- "source": "channel.presence",
- "appId": "aBCdEf",
- "channel": "channel-name",
- "site": "eu-central-1-A",
- "ruleId": "1-a2Bc",
- "presence": [{
- "id": "abCdEFgHIJ:1:0",
- "clientId": "bob",
- "connectionId": "Ab1CDE2FGh",
- "timestamp": 1582270137276,
- "data": "some data in the presence object",
- "action": 4
- }]
-}
-```
-
-h4(#presence-decode). Decode enveloped presence messages
-
-Presence messages sent over Realtime are automatically decoded into @PresenceMessage@ objects by the Ably SDK. However, webhook presence messages require explicit decoding.
-
-To decode presence messages received via webhooks, use the appropriate method:
-* For multiple messages, use @PresenceMessage.fromEncodedArray()@ on the presence array.
-* For a single message, use @PresenceMessage.fromEncoded()@ on an individual presence entry.
-
-Both methods convert encoded presence messages into @PresenceMessage@ objects, restoring the original format.
-
-Decoding presence messages provides several advantages:
-* It converts numerical presence action values into readable strings such as *enter*, *update*, or *leave*.
-* It reconstructs the original data field, ensuring it matches the format it was sent in.
-* If encryption is enabled, passing your encryption key will automatically decrypt the data field.
-
-Ably strongly recommends decoding all presence messages received via webhooks to ensure proper data handling.
-
-The following example demonstrates decoding an array of presence messages using the Ably JavaScript SDK:
-
-```[javascript]
-const messages = Ably.Realtime.PresenceMessage.fromEncodedArray(item.presence);
-messages.forEach((message) => {
- console.log(message.toString());
-})
-```
-
-h2(#non-env-message). Non-enveloped events
-
-You can turn off enveloping if your endpoint only needs the raw message payload or follows a strict data structure. This results in smaller payloads and eliminates the need to parse additional metadata. However, it requires you to handle raw payload decoding manually.
-
-Non-enveloped webhook messages include headers that provide essential context about the payload, such as its source, format, and metadata. The following headers are included in all non-enveloped messages:
-
-|_. Header |_. Description |
-| @content-type@ | Defines the payload type: @application/json@ for JSON, @text/plain@ for text, or @application/octet-stream@ for binary data. |
-| @x-ably-version@ | Webhook version, currently 1.2. |
-| @x-ably-envelope-appid@ | Ably @appID@ from which the message originated. |
-| @x-ably-envelope-channel@ | Name of the Ably channel that sent the message. |
-| @x-ably-envelope-rule-id@ | @RuleID@ that triggered the webhook event. |
-| @x-ably-envelope-site@ | Ably datacenter that processed the event. |
-| @x-ably-envelope-source@ | Event source, indicating the type of event: @channel.message@, @channel.presence@, @channel.lifecycle@, or @channel.occupancy@. |
-| @x-ably-message-client-id@ | @ClientID@ of the connection that sent the event. |
-| @x-ably-message-connection-id@ | @ConnectionID@ that initiated the event. |
-| @x-ably-message-id@ | Unique @messageID@ for tracking. |
-| @x-ably-message-timestamp@ | Timestamp of when the message was originally sent. |
-
-h3(#non-env-message-events). Non-enveloped message events
-
-For @message@ events, there is the following additional header:
-
-|_. Header |_. Description |
-| @x-ably-message-name@ | The name of the message. |
-
-The payload will contain the data of the message.
-
-For example, if you publish a message to the channel @my_channel@ using the following cURL request:
-
-```[curl]
-curl -X POST https://rest.ably.io/channels/my_channel/messages \
- -u "{{API_KEY}}" \
- -H "Content-Type: application/json" \
- --data '{ "name": "publish", "data": "example" }'
-```
-
-The @x-ably-message-name@ header would be @publish@ and the payload would be @example@.
-
-
-h3(#non-env-presence). Non-enveloped presence messages
-
-For @presence@ events, there is the following additional header:
-
-|_. Header |_. Description |
-| @x-ably-message-name@ | The action performed by the event (@update@, @enter@, @leave@). |
-
-The payload will contain the "data":/docs/api/realtime-sdk/presence#presence-message of the @Presence@ message.
-
-The following example demonstrates a non-enveloped "enter":/docs/api/realtime-sdk/presence#enter presence event:
-
-```[javascript]
-realtime = new Ably.Realtime({
- key: '{{API_KEY}}',
- clientId: 'bob'
-});
-channel = realtime.channels.get('some_channel');
-await channel.presence.enter('some data');
-```
-
-The @x-ably-message-name@ header would be @enter@ and the payload would be @some data@.
-
-h2(#security). Webhook security
-
-Ably advises you to use a secure HTTPS URL when you configure webhooks. This way, you ensure that all communication with your servers is encrypted with TLS and cannot be intercepted.
-
-In addition, Ably optionally supports signing webhook requests so you can verify their authenticity. This applies to both "single":#single and "batched":#batched webhook requests, as well as any streaming integrations that also rely on HTTP-based callbacks. Ably sends the signature in the @X-Ably-Signature@ header for batched requests and references the connected key in the @X-Ably-Key@ header.
-
-The following steps are required to verify the signature:
-
-1. Start with the webhook request body. This is a JSON string encoded with content-encoding @utf-8@.
-2. Identify the key based on the @keyId@ indicated in the @X-Ably-Key@ header.
-3. Calculate the HMAC of that request body using the SHA-256 algorithm and the corresponding @keyValue@ (the secret part of the key after the "@:@").
-4. Encode the resulting HMAC using RFC 3548 base64.
-5. Compare that result with the signature value in the @X-Ably-Signature@ header.
-
-h3(#sign). Sign webhook requests
-
-If you choose to sign your webhook requests, it is recommended that you try the following:
-
-1. Set up a free "@RequestBin@":https://requestbin.com/ HTTP endpoint test URL.
-2. "Configure":#configure a webhook with the URL set to the @RequestBin@ endpoint. Make sure you choose "batched":#batched messages and use a key to sign each webhook request.
-3. Trigger an event using the Dev Console in your app "dashboard":https://ably.com/dashboard/any. This will generate a webhook. Then confirm that RequestBin received the webhook.
diff --git a/content/integrations/webhooks/lambda.textile b/content/integrations/webhooks/lambda.textile
deleted file mode 100644
index cbe9059424..0000000000
--- a/content/integrations/webhooks/lambda.textile
+++ /dev/null
@@ -1,106 +0,0 @@
----
-title: AWS Lambda integration
-meta_description: "Trigger AWS Lambda functions based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "AWS Lambda, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/events/aws-lambda
- - /docs/general/webhooks/aws-lambda
----
-
-"AWS Lambda":https://aws.amazon.com/lambda/ integrations enable you to trigger event-driven serverless compute functions when an event occurs in Ably. They are useful for integrating into various AWS services.
-
-h2(#create). Create an AWS Lambda integration
-
-To create an AWS Lambda integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Log in and select the application you wish to integrate with AWS Lambda.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose *Webhook*.
-5. Choose *AWS Lambda*.
-6. Configure the AWS Lambda "settings":#settings.
-7. Click *Create*.
-
-You can also create an AWS Lambda integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating an AWS Lambda integration:
-
-|_. Setting |_. Description |
-| AWS Region | Specifies the region of your AWS Lambda. |
-| Function Name | The name of your AWS Lambda function. |
-| "AWS authentication scheme":#auth | Choose the authentication method. Either *AWS credentials* or *ARN of an assumable role*. |
-| AWS Credentials | If using AWS credentials, enter the values in @key:value@ format. |
-| ARN of an assumable role | If using ARN of an assumable role, enter the ARN of the role that Ably can assume to access your AWS Lambda function. |
-| Qualifier | The qualifier of your Lambda function, if set. |
-| "Source":/docs/integrations/webhooks#sources | Specifies the event types being sent to your AWS Lambda function. |
-| "Channel filter":/docs/integrations/webhooks#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/webhooks#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
-
-h2(#auth). AWS authentication
-
-Delegate access to your AWS resources by creating an "IAM role":https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html that the Ably AWS account can assume.
-
-This approach follows AWS best practices, as it avoids sharing access keys directly. Specify the role's ARN to grant Ably the necessary permissions in a secure manner.
-
-h3(#policy). Create a Lambda policy
-
-The following steps show you how to create a policy for an AWS Lambda.
-
-1. In the IAM console sidebar select *Policies*.
-2. Click *Create Policy*.
-3. Click the JSON tab and enter the following JSON to configure the policy:
-
-```[json]
-{
- "Version": "2012-10-17",
- "Statement": [
- {
- "Sid": "AllowInvokeLambdaFunction",
- "Effect": "Allow",
- "Action": [
- "lambda:InvokeAsync",
- "lambda:InvokeFunction"
- ],
- "Resource": [
- "arn:aws:lambda:::function:"
- ]
- }
- ]
-}
-```
-
-
-
-4. Click *Next: Tags*. You don't need to add any tags.
-5. Click *Next: Review*.
-6. Enter a suitable name for your policy.
-7. Click *Create Policy*.
-
-You have created a policy that grants the permissions required to invoke your Lambda function. You must now attach it to the role that you'll specify in your Ably integration rule.
-
-h3(#role). Create a role
-
-Create an IAM role as follows:
-
-1. In the AWS IAM console, click *Roles* in the sidebar and then click *Create Role*.
-2. For type of trusted entity select *Another AWS account*.
-3. For Account ID specify 203461409171. This is the Ably AWS account.
-4. Click the *Require external ID checkbox* and then enter an external ID of "@.@":/docs/platform/account/control-api#ids.
-5. Click *Next: Permissions*.
-6. Now select the policy you created earlier to attach to this role. You can type the name of your policy into the *Filter policies* search box.
-
-Then ensure the checkbox for the policy is selected.
-
-1. Click *Next: Tags*.
-2. You don't need to add tags so click *Next: Review*.
-3. Enter a suitable name for your role.
-4. Click *Create Role*.
diff --git a/content/integrations/webhooks/zapier.textile b/content/integrations/webhooks/zapier.textile
deleted file mode 100644
index 279750de42..0000000000
--- a/content/integrations/webhooks/zapier.textile
+++ /dev/null
@@ -1,41 +0,0 @@
----
-title: Zapier integration
-meta_description: "Trigger Zapier based on message, channel lifecycle, channel occupancy, and presence events."
-meta_keywords: "Zapier, integrations, events, serverless"
-languages:
- - none
-redirect_from:
- - /docs/general/events/zapier
- - /docs/content//webhooks/zapier
- - /docs/general/webhooks/zapier
----
-
-"Zapier":https://zapier.com/page/webhooks integrations enable you to trigger Zapier Zaps when an event occurs in Ably. They are useful for integrating with thousands of other services using Zapier's webhooks feature.
-
-h2(#create). Create a Zapier integration
-
-To create a Zapier integration in your "dashboard:":https://ably.com/dashboard/any
-
-1. Login and select the application you wish to integrate with Zapier.
-2. Click the *Integrations* tab.
-3. Click the *New Integration Rule* button.
-4. Choose *Webhook*.
-5. Choose *Zapier*.
-6. Configure the Zapier "settings":#settings.
-7. Click *Create*.
-
-You can also create a Zapier integration using the "Control API":/docs/platform/account/control-api.
-
-h3(#settings). Settings
-
-The following settings are available when creating a Zapier integration:
-
-|_. Setting |_. Description |
-| URL | The Zapier webhook URL to POST a summary of events to. |
-| Headers | Allows the inclusion of additional information in key-value format. |
-| Request Mode | Choose between *Single Request* or *Batch Request*. |
-| "Source":/docs/integrations/webhooks#sources | Specifies the event types being sent to Zapier. |
-| "Channel filter":/docs/integrations/webhooks#filter | Filters the source channels based on a regular expression. |
-| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
-| "Enveloped":/docs/integrations/webhooks#enveloped | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. Only available when @Request Mode@ is set to @Single@. |
-| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when @Request Mode@ is set to @Batched@. |
diff --git a/src/pages/docs/platform/integrations/webhooks/azure.mdx b/src/pages/docs/platform/integrations/webhooks/azure.mdx
new file mode 100644
index 0000000000..1a0f723939
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/azure.mdx
@@ -0,0 +1,40 @@
+---
+title: Azure Functions integration
+meta_description: "Trigger Microsoft Azure functions based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Microsoft Azure, integrations, events, serverless"
+redirect_from:
+ - /docs/general/events/azure
+ - /docs/general/webhooks/azure
+---
+
+[Azure Function](https://azure.microsoft.com/en-gb/services/functions/) integrations enable you to trigger Microsoft's event-driven serverless compute functions when an event occurs in Ably.
+
+## Create an Azure Function integration
+
+To create an Azure Function integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with an Azure Function.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Webhook**.
+5. Choose **Azure Functions**.
+6. Configure the Azure Functions [settings](#settings).
+7. Click **Create**.
+
+You can also create an Azure Function integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating an Azure Function integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| Azure App ID | The ID of your Azure App. |
+| Function Name | The name of your Azure Function. |
+| Headers | Allows the inclusion of additional information in key-value format. |
+| Request Mode | Choose between **Single Request** or **Batch Request**. |
+| [Source](/docs/integrations/webhooks#sources) | Specifies the event types being sent to Azure Functions. |
+| [Channel filter](/docs/integrations/webhooks#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/webhooks#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. Only available when `Request Mode` is set to `Single`. |
+| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when `Request Mode` is set to `Batched`. |
diff --git a/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx b/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx
new file mode 100644
index 0000000000..28793d8910
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx
@@ -0,0 +1,38 @@
+---
+title: Cloudflare Worker integration
+meta_description: "Trigger Cloudflare Workers based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Cloudflare Workers, integrations, events, serverless"
+redirect_from:
+ - /docs/general/events/cloudflare
+ - /docs/general/webhooks/cloudflare
+---
+
+[Cloudflare Worker](https://workers.cloudflare.com) integrations enable you to run JavaScript-based functions on Cloudflare's Edge Network when an event occurs in Ably.
+
+## Create a Cloudflare Worker integration
+
+To create a Cloudflare Worker integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with a Cloudflare Worker.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Webhook**.
+5. Choose **Cloudflare Workers**.
+6. Configure the Cloudflare Worker [settings](#settings).
+7. Click **Create**.
+
+You can also create a Cloudflare Worker integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating a Cloudflare Worker integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| URL | The URL of the Cloudflare Worker to POST a summary of events to. |
+| Headers | Allows the inclusion of additional information in key-value format. |
+| Request Mode | Choose between **Single Request** or **Batch Request**. |
+| [Source](/docs/integrations/webhooks#sources) | Specifies the event types being sent to Cloudflare Workers. |
+| [Channel filter](/docs/integrations/webhooks#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when `Request Mode` is set to `Batched`. |
diff --git a/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx b/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx
new file mode 100644
index 0000000000..230ac650c2
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx
@@ -0,0 +1,41 @@
+---
+title: Google Function integration
+meta_description: "Trigger Google Functions based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Google Functions, integrations, events, serverless"
+redirect_from:
+ - /docs/general/events/google-functions
+ - /docs/general/webhooks/google-functions
+---
+
+[Google Function](https://cloud.google.com/functions) integrations enable you to trigger event-driven serverless compute functions when an event occurs in Ably.
+
+## Create a Google Function integration
+
+To create a Google Function integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with a Google Function.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Webhook**.
+5. Choose **Google Functions**.
+6. Configure the Google Function [settings](#settings).
+7. Click **Create**.
+
+You can also create a Google Function integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating a Google Function integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| Region | The region in which your Google Function is hosted. |
+| Project ID | The ID of your Google Cloud Project. |
+| Function | The name of your Google Function. |
+| Headers | Allows the inclusion of additional information in key-value format. |
+| Request Mode | Choose between **Single Request** or **Batch Request**. |
+| [Source](/docs/integrations/webhooks#sources) | Specifies the event types being sent to your Google Function. |
+| [Channel filter](/docs/integrations/webhooks#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/webhooks#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. Only available when `Request Mode` is set to `Single`. |
+| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when `Request Mode` is set to `Batched`. |
diff --git a/src/pages/docs/platform/integrations/webhooks/ifttt.mdx b/src/pages/docs/platform/integrations/webhooks/ifttt.mdx
new file mode 100644
index 0000000000..83d04e8ae5
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/ifttt.mdx
@@ -0,0 +1,87 @@
+---
+title: IFTTT integration
+meta_description: "Trigger IFTTT based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "IFTTT, integrations, events, serverless"
+redirect_from:
+ - /docs/general/events/ifttt
+ - /docs/general/webhooks/ifttt
+---
+
+[IFTTT](https://ifttt.com/maker_webhooks) (If This Then That) integrations enable you to trigger conditional chains and combine various services when an event occurs in Ably.
+
+## Create an IFTTT integration
+
+To create an IFTTT integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with IFTTT.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Webhook**.
+5. Choose **IFTTT**.
+6. Configure the IFTTT [settings](#settings).
+7. Click **Create**.
+
+You can also create an IFTTT integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating an IFTTT integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| IFTTT Webhook key | The webhook key for your IFTTT account. |
+| Event name | The name used to identify the IFTTT applet. |
+| [Source](/docs/integrations/webhooks#sources) | Specifies the event types being sent to IFTTT. |
+| [Channel filter](/docs/integrations/webhooks#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+
+## Restrictions
+
+IFTTT has limitations on the data it can process. All payloads must be `JSON` and use only the keys `value1`, `value2`, or `value3`. Any other format or additional keys will not be processed.
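+
+Because non-conforming payloads are dropped silently, it can be worth validating message data against these restrictions before publishing. The following is an illustrative client-side check, not something Ably or IFTTT provides:
+
+```javascript
+// Check that a payload satisfies IFTTT's webhook restrictions:
+// a JSON object using only the keys value1, value2, and value3.
+const ALLOWED_KEYS = new Set(['value1', 'value2', 'value3']);
+
+function isValidIftttPayload(payload) {
+  if (typeof payload !== 'object' || payload === null || Array.isArray(payload)) {
+    return false;
+  }
+  return Object.keys(payload).every((key) => ALLOWED_KEYS.has(key));
+}
+
+console.log(isValidIftttPayload({ value1: 'My first message' })); // true
+console.log(isValidIftttPayload({ other: 'key' }));               // false
+```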
+
+As a result, [enveloping](/docs/integrations/webhooks#enveloped) and [batching](/docs/integrations/webhooks#batching) are not supported. Additionally, protocols that require decoding, such as [MQTT](/docs/protocols/mqtt), are not supported with IFTTT.
+
+To ensure data is processed by IFTTT, it must match the required structure:
+
+
+```json
+{
+  "value1": "data I want to send 1",
+  "value2": "data I want to send 2",
+  "value3": "data I want to send 3"
+}
+```
+
+
+For a [message data](/docs/api/realtime-sdk/messages#data) or [presence message data](/docs/api/realtime-sdk/presence#presence-message) of `{ "value1": "My first message", "value2": "My second message" }`, the following would be sent to your IFTTT endpoint:
+
+Headers:
+
+
+```text
+host: https://maker.ifttt.com/trigger/{YOUR_EVENT}/with/key/{YOUR_IFTTT_KEY}
+content-type: application/json
+x-ably-envelope-appid: {YOUR_APP_ID}
+x-ably-envelope-channel: {YOUR_CHANNEL}
+x-ably-envelope-rule-id: {YOUR_RULE_ID}
+x-ably-envelope-site: {ably-server-location}
+x-ably-envelope-source: channel.message
+x-ably-message-encoding: json
+x-ably-message-id: {UNIQUE_ABLY_MESSAGE_ID}
+x-ably-message-timestamp: {TIMESTAMP_ORIGINAL_MESSAGE_WAS_SENT}
+x-ably-version: 1.2
+content-length: 18
+connection: keep-alive
+```
+
+
+Payload:
+
+
+```json
+{
+ "value1": "My first message",
+ "value2": "My second message"
+}
+```
+
diff --git a/src/pages/docs/platform/integrations/webhooks/index.mdx b/src/pages/docs/platform/integrations/webhooks/index.mdx
new file mode 100644
index 0000000000..3b4d000d37
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/index.mdx
@@ -0,0 +1,501 @@
+---
+title: Outbound webhooks overview
+meta_description: "A guide on webhook payloads, including batched, enveloped, and non-enveloped event payloads, with decoding examples and sources."
+meta_keywords: "webhooks, Ably, payloads, batched events, enveloped events, non-enveloped events, message decoding, presence events, channel lifecycle, data processing"
+redirect_from:
+ - /docs/general/functions
+ - /docs/general/events
+ - /docs/general/webhooks
+---
+
+Outbound webhook integrations enable you to trigger serverless functions and notify HTTP endpoints when events occur in Ably.
+
+Events include messages being [published](/docs/pub-sub#publish), [presence](/docs/presence-occupancy/presence#trigger-events) events occurring, changes in channel occupancy, and channels being created or discarded. Data can be delivered individually or in batches to external services.
+
+
+
+There are two ways to create an outbound webhook integration:
+
+* Using the [Ably dashboard](#dashboard).
+* Using the [Control API](#api).
+
+## Channel filter
+
+Set a filter to restrict which channels an integration applies to using a regular expression.
+
+The following examples demonstrate channel names that you can match against using regular expressions to control which channels a webhook rule applies to:
+
+
+```text
+mychannel:public
+public
+public:events
+public:events:conferences
+public:news:americas
+public:news:europe
+```
+
+
+| RegEx | Channels |
+| ----- | -------- |
+| `^public.*` | Matches any channel that starts with `public`. This includes `public`, both `public:events` channels, and both `public:news` channels. |
+| `^public$` | Matches only channels named exactly `public`. |
+| `:public$` | Matches channels that end with `:public`. This includes only `mychannel:public`. |
+| `^public:events$` | Matches channels named exactly `public:events`. This does not include `public:events:conferences`. |
+| `^public.*europe$` | Matches channels that start with `public` and end with `europe`. This includes only `public:news:europe`. |
+| `news` | Matches any channel name that includes the word `news`. This includes `public:news:americas` and `public:news:europe`. |
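+
+A filter can be sanity-checked locally before saving a rule. The following sketch simply applies each expression with JavaScript's `RegExp`; the actual matching is performed server-side by Ably:
+
+```javascript
+// Test candidate channel filters against the example channel names above.
+const channels = [
+  'mychannel:public',
+  'public',
+  'public:events',
+  'public:events:conferences',
+  'public:news:americas',
+  'public:news:europe',
+];
+
+// Return the channels a given filter expression would match.
+const matches = (pattern) => channels.filter((name) => new RegExp(pattern).test(name));
+
+console.log(matches('^public$'));         // ['public']
+console.log(matches('^public.*europe$')); // ['public:news:europe']
+```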
+
+## Event types
+
+You can configure webhooks to listen for the following event types:
+
+| Event type | Description |
+| ---------- | ----------- |
+| `channel.lifecycle` | Triggered when a channel is created or discarded. |
+| `channel.message` | Triggered when [messages](/docs/messages) are published. |
+| `channel.occupancy` | Triggered when the occupancy of a channel [changes](/docs/channels/options#occupancy). |
+| `channel.presence` | Triggered when users enter, leave, or update their [presence](/docs/presence-occupancy/presence). |
+
+
+
+## Single requests
+
+In single request mode, a `POST` request is made to your endpoint each time an event occurs.
+
+This is useful where an endpoint can only process one message per request; however, it can lead to the endpoint being overloaded in high-throughput scenarios. Single requests are best suited to cases where there is a 1:1 relationship between the messages being sent and the events being called.
+
+Multiple requests can be in flight at once; however, be aware that there is a [limit on concurrency](/docs/platform/pricing/limits#integrations). If this limit is exceeded, new messages are placed in a short queue of 10 messages. If that queue also overflows, further messages are rejected.
+
+Ably retries requests that fail with a `5XX` response. If the response times out, Ably retries twice: first after 4 seconds, and again after 20 seconds.
+
+## Batched requests
+
+Batched requests are useful for endpoints that have the potential to be overloaded by requests, or have no requirement to process messages one-by-one.
+
+Batched requests are published at most once per second, but this may vary by integration. Once a batched request is triggered, all other events will be queued so that they can be delivered in a batch in the next request. The next request will be issued within one second with the following caveats:
+
+* Only a limited number of HTTP requests are in-flight at one time for each configured integration. Therefore, if you want to be notified quickly, you should accept requests quickly and defer any work to be done asynchronously.
+* If there are more than 1,000 events queued for a payload, the oldest 1,000 events will be bundled into this payload and the remaining events will be delivered in the subsequent payload. Therefore, if your sustained rate of events is expected to be more than 1,000 per second or your servers are slow to respond, then it is possible a backlog will build up and you will not receive all events.
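+
+One way to accept requests quickly and defer the work is to acknowledge each batch before processing it. The following Node.js sketch is illustrative only; `processBatch` is a placeholder for your own processing logic:
+
+```javascript
+// Minimal webhook receiver that acknowledges immediately and
+// processes the batch asynchronously.
+const http = require('http');
+
+const processed = [];
+
+function processBatch(batch) {
+  // Real work (database writes, fan-out, etc.) goes here.
+  processed.push(Array.isArray(batch.items) ? batch.items.length : 0);
+}
+
+const server = http.createServer((req, res) => {
+  let body = '';
+  req.on('data', (chunk) => { body += chunk; });
+  req.on('end', () => {
+    // Respond before doing any heavy work so the in-flight
+    // request completes quickly and the next batch can be sent.
+    res.writeHead(200);
+    res.end();
+    setImmediate(() => processBatch(JSON.parse(body)));
+  });
+});
+
+server.listen(0);
+```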
+
+If a batched request fails, Ably will retry the request using an exponential backoff strategy.
+
+The backoff delay follows the formula: `delay = delay * sqrt(2)` where the initial delay is 1 second. For example, if a webhook request fails repeatedly, the retry delays will be:
+
+* Initial request: 1.4s wait → 1st retry.
+* 1st retry: 2s wait → 2nd retry.
+* 2nd retry: 2.8s wait → 3rd retry.
+* 3rd retry: 4s wait → 4th retry.
+* 4th retry: 5.6s wait → successful request.
+
+The backoff for consecutively failing requests increases until it reaches 60s. All subsequent retries for failed requests are then made at 60s intervals until a request succeeds. The queue of events is retained for 5 minutes. If an event cannot be delivered within that time, it is discarded to prevent the queue from growing indefinitely.
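+
+The schedule above can be reproduced from the formula. A small sketch (rounded to one decimal place, so 5.66s shows as 5.7s rather than the truncated 5.6s in the list above):
+
+```javascript
+// Compute the retry backoff schedule: delay = delay * sqrt(2),
+// starting from an initial delay of 1 second and capped at 60 seconds.
+function backoffDelays(attempts, initial = 1, cap = 60) {
+  const delays = [];
+  let delay = initial;
+  for (let i = 0; i < attempts; i += 1) {
+    delay = Math.min(delay * Math.SQRT2, cap);
+    delays.push(Number(delay.toFixed(1)));
+  }
+  return delays;
+}
+
+console.log(backoffDelays(5)); // [ 1.4, 2, 2.8, 4, 5.7 ]
+```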
+
+### Batched event payloads
+
+Given the various potential combinations of enveloped, batched, and message sources, it's helpful to understand what to expect in different scenarios.
+
+Batched events will have the following headers:
+
+| Header | Description |
+| ------ | ----------- |
+| `content-type` | The type of the payload. This will be `application/json` or `application/x-msgpack`. |
+| `x-ably-version` | The version of the Webhook. Currently, this is `1.2`. |
+
+Each batched message will have the following fields:
+
+| Field | Description |
+| ----- | ----------- |
+| `name` | The event type, for example, `presence.message`, `channel.message`, or `channel.closed`. |
+| `webhookId` | A unique internal ID for the configured webhook. |
+| `source` | The source of the webhook, which will be one of `channel.message`, `channel.presence`, `channel.lifecycle`, or `channel.occupancy`. |
+| `timestamp` | A timestamp in milliseconds since the epoch for the event. |
+| `data` | An object containing the event data, defined below in [JSONPath format](https://goessner.net/articles/JsonPath/). |
+
+#### Batched message events
+
+For `message` events, the `data` field will contain the following:
+
+| Field | Description |
+| ----- | ----------- |
+| `data.channelId` | The name of the channel that the message event belongs to. |
+| `data.site` | An internal site identifier indicating which primary datacenter the member is present in. |
+| `data.messages` | An `Array` of raw messages. |
+
+The following example is a batched `message` payload:
+
+
+```json
+{
+ "items": [{
+ "webhookId": "ABcDEf",
+ "source": "channel.message",
+ "serial": "a7bcdEFghIjklm123456789:4",
+ "timestamp": 1562124922426,
+ "name": "channel.message",
+ "data": {
+ "channelId": "chat-channel-4",
+ "site": "eu-west-1-A",
+ "messages": [{
+ "id": "ABcDefgHIj:1:0",
+ "clientId": "user-3",
+ "connectionId": "ABcDefgHIj",
+ "timestamp": 1123145678900,
+ "data": "the message data",
+ "name": "a message name"
+ }]
+ }
+ }]
+}
+```
+
+
+#### Decode batched messages
+
+The Ably SDK automatically decodes messages sent over the Realtime service into [`message`](/docs/api/realtime-sdk/types#message) objects. However, batched, enveloped webhook payloads require explicit decoding using:
+
+* [`Message.fromEncodedArray`](/docs/api/realtime-sdk/messages#message-from-encoded-array) for an array of messages.
+* [`Message.fromEncoded`](/docs/api/realtime-sdk/messages#message-from-encoded) for a single message.
+
+The benefits of decoding include fully restoring `data` to its original datatype using the `encoding` field. Additionally, decoding supports automatic decryption when an [encryption](/docs/channels/options/encryption) key is provided. It's recommended to decode all messages received via webhooks to ensure proper data handling.
+
+The following example demonstrates how to decode an array of messages received via a webhook:
+
+
+```javascript
+webhookMessage.items.forEach((item) => {
+ const messages = Ably.Realtime.Message.fromEncodedArray(item.data.messages);
+ messages.forEach((message) => {
+ console.log(message.toString());
+ })
+})
+```
+
+
+### Batched presence structure
+
+Batched presence events group multiple presence messages in a single payload. For `presence` events, the `data` field contains:
+
+| Property | Description |
+| -------- | ----------- |
+| `data.channelId` | The name of the channel the presence event belongs to. |
+| `data.site` | An internal site identifier, indicating the primary datacenter the member is present in. |
+| `data.presence` | An `Array` of raw presence messages. |
+
+The following is an example of a batched `presence` payload:
+
+
+```json
+{
+ "items": [{
+ "webhookId": "ABcDEf",
+ "source": "channel.presence",
+ "serial": "a7bcdEFghIjklm123456789:4",
+ "timestamp": 1562124922426,
+ "name": "presence.message",
+ "data": {
+ "channelId": "education-channel",
+ "site": "eu-west-1-A",
+ "presence": [{
+ "id": "ABcDefgHIj:1:0",
+ "clientId": "bob",
+ "connectionId": "ABcDefgHIj",
+ "timestamp": 1123145678900,
+ "data": "the message data",
+ "action": 4
+ }]
+ }
+ }]
+}
+```
+
+
+#### Decode batched presence
+
+Presence messages sent [over the realtime service](/docs/channels) are automatically decoded into [`PresenceMessage`](/docs/api/realtime-sdk/types#presence-message) objects by the Ably client library. For webhooks, you need to do this manually using [`PresenceMessage.fromEncodedArray`](/docs/api/realtime-sdk/presence#presence-from-encoded-array) on the `data.presence` array, or [`PresenceMessage.fromEncoded`](/docs/api/realtime-sdk/presence#presence-from-encoded) on an individual entry. These methods convert the encoded values into [`PresenceMessage`](/docs/api/realtime-sdk/types#presence-message) objects—either as an array or a single instance.
+
+This allows you to decode the numerical action into a [`Presence action`](/docs/api/realtime-sdk/presence#presence-action) string (such as `enter`, `update`, or `leave`), fully decode the `data` (using the `encoding`) back into its original datatype or an equivalent in the client library, and, if you're using [encryption](/docs/channels/options/encryption), pass your encryption key to decrypt the `data`.
+
+The following example demonstrates how to decode an array of presence messages received via a webhook:
+
+
+```javascript
+webhookMessage.items.forEach((item) => {
+  const messages = Ably.Realtime.PresenceMessage.fromEncodedArray(item.data.presence);
+ messages.forEach((message) => {
+ console.log(message.toString());
+ })
+})
+```
+
+
+### Batched channel lifecycle structure
+
+Ably includes the following fields in the `data` object for batched `channel.lifecycle` events:
+
+| Property | Description |
+| -------- | ----------- |
+| `data.channelId` | The name of the channel where the lifecycle event occurred. |
+| `data.status` | A [`ChannelStatus`](/docs/api/realtime-sdk/channel-metadata#channel-details) object that describes the channel's current state. |
+
+The `name` of a `channel.lifecycle` event will be `channel.opened` or `channel.closed`.
+
+The following is an example of a batched `channel.lifecycle` payload:
+
+
+```json
+{
+ "items": [{
+ "webhookId": "ABcDEf",
+ "source": "channel.lifecycle",
+ "timestamp": 1562124922426,
+ "serial": "a7bcdEFghIjklm123456789:4",
+ "name": "channel.opened",
+ "data": {
+ "channelId": "chat-channel-5",
+ "name": "chat-channel-5",
+ "status": {
+ "isActive": true,
+ "occupancy": {
+ "metrics": {
+ "connections": 1,
+ "publishers": 1,
+ "subscribers": 1,
+ "presenceConnections": 1,
+ "presenceMembers": 0,
+ "presenceSubscribers": 1,
+ "objectPublishers": 1,
+ "objectSubscribers": 1
+ }
+ }
+ }
+ }
+ }]
+}
+```
+
+
+## Enveloped events
+
+Enveloping events adds structured metadata such as the publisher's `clientId` and the originating channel name, alongside the payload.
+
+This metadata is useful when processing events dynamically or when additional context about the source is required. Enveloped messages are recommended for most use cases, as they provide a consistent format for all events.
+
+Enveloped events include the following headers:
+
+| Header | Description |
+| ------ | ----------- |
+| `x-ably-version` | Specifies the Webhook version. Currently, this should be set to `1.2`. |
+| `content-type` | Indicates the payload format, which can be either `application/json` or `application/x-msgpack` for enveloped messages. |
+
+Each enveloped message contains the following fields:
+
+| Field | Description |
+| ----- | ----------- |
+| `source` | The origin of the webhook event. Possible values are: `channel.message`, `channel.presence`, `channel.lifecycle`, `channel.occupancy` |
+| `appId` | The Ably app that generated the event. |
+| `channel` | The Ably channel where the event occurred. |
+| `site` | The Ably datacenter that sent the message. |
+| `timestamp` | A timestamp in milliseconds since the epoch representing when the presence event occurred. |
+
+In addition, the envelope contains one further field, named according to the message type, that holds the actual event data.
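+
+Because the name of that final field varies, a handler can select it from the `source` value. A minimal sketch, assuming only the `messages` and `presence` field names used by message and presence envelopes:
+
+```javascript
+// Extract the event payload from an enveloped webhook body.
+// channel.message envelopes carry 'messages'; channel.presence carry 'presence'.
+const payloadFieldBySource = {
+  'channel.message': 'messages',
+  'channel.presence': 'presence',
+};
+
+function extractPayload(envelope) {
+  const field = payloadFieldBySource[envelope.source];
+  if (!field) {
+    throw new Error(`Unhandled envelope source: ${envelope.source}`);
+  }
+  return envelope[field];
+}
+
+const envelope = {
+  source: 'channel.message',
+  appId: 'aBCdEf',
+  channel: 'channel-name',
+  messages: [{ id: 'ABcDefgHIj:1:0', data: 'some message data' }],
+};
+
+console.log(extractPayload(envelope).length); // 1
+```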
+
+### Enveloped message events
+
+For `message` events, the envelope's `messages` array contains the raw messages.
+
+The following is an example of an enveloped `message` payload:
+
+
+```json
+{
+ "source": "channel.message",
+ "appId": "aBCdEf",
+ "channel": "channel-name",
+ "site": "eu-central-1-A",
+ "ruleId": "1-a2Bc",
+ "messages": [{
+ "id": "ABcDefgHIj:1:0",
+ "connectionId": "ABcDefgHIj",
+ "timestamp": 1123145678900,
+ "data": "some message data",
+ "name": "my message name"
+ }]
+}
+```
+
+
+#### Decode enveloped messages
+
+Ably SDKs automatically decode messages into `Message` objects. Messages sent via an integration need to be decoded manually.
+
+There are two methods available for decoding messages into `Message` objects:
+
+* `Message.fromEncodedArray()` for an array of messages.
+* `Message.fromEncoded()` for single messages.
+
+There are also equivalent methods for decoding presence messages into `PresenceMessage` objects:
+
+* `PresenceMessage.fromEncodedArray()` for an array of presence messages.
+* `PresenceMessage.fromEncoded()` for single messages.
+
+Decoding is essential because it reconstructs the original data payload using the encoding field, ensuring the correct data type is restored, whether it's a string, binary, or structured object. If the message was encrypted, passing your encryption key to the method allows the SDK to decrypt data automatically.
+
+Ably strongly recommends decoding all messages received over integrations before processing them to avoid issues with unexpected data formats.
+
+The following example demonstrates how to decode an array of messages received via a webhook:
+
+
+```javascript
+const messages = Ably.Realtime.Message.fromEncodedArray(item.messages);
+
+messages.forEach((message) => {
+ console.log(message.toString());
+});
+```
+
+
+### Enveloped presence events
+
+Webhook [presence](/docs/presence-occupancy/presence) events contain raw `presence` data in the `presence` array.
+
+The following example is an enveloped `presence` payload:
+
+
+```json
+{
+  "source": "channel.presence",
+ "appId": "aBCdEf",
+ "channel": "channel-name",
+ "site": "eu-central-1-A",
+ "ruleId": "1-a2Bc",
+ "presence": [{
+ "id": "abCdEFgHIJ:1:0",
+ "clientId": "bob",
+ "connectionId": "Ab1CDE2FGh",
+ "timestamp": 1582270137276,
+ "data": "some data in the presence object",
+ "action": 4
+ }]
+}
+```
+
+
+#### Decode enveloped presence messages
+
+Presence messages sent over Realtime are automatically decoded into `PresenceMessage` objects by the Ably SDK. However, webhook presence messages require explicit decoding.
+
+To decode presence messages received via webhooks, use the appropriate method:
+* For multiple messages, use `PresenceMessage.fromEncodedArray()` on the presence array.
+* For a single message, use `PresenceMessage.fromEncoded()` on an individual presence entry.
+
+Both methods convert encoded presence messages into `PresenceMessage` objects, restoring the original format.
+
+Decoding presence messages provides several advantages:
+* It converts numerical presence action values into readable strings such as **enter**, **update**, or **leave**.
+* It reconstructs the original data field, ensuring it matches the format it was sent in.
+* If encryption is enabled, passing your encryption key will automatically decrypt the data field.
+
+Ably strongly recommends decoding all presence messages received via webhooks to ensure proper data handling.
+
+The following example demonstrates decoding an array of presence messages using the Ably JavaScript SDK:
+
+
+```javascript
+const messages = Ably.Realtime.PresenceMessage.fromEncodedArray(item.presence);
+messages.forEach((message) => {
+  console.log(message.toString());
+});
+```
+
+
+## Non-enveloped events
+
+You can turn off enveloping if your endpoint only needs the raw message payload or follows a strict data structure. This results in smaller payloads and eliminates the need to parse additional metadata. However, it requires you to handle raw payload decoding manually.
+
+Non-enveloped webhook messages include headers that provide essential context about the payload, such as its source, format, and metadata. The following headers are included in all non-enveloped messages:
+
+| Header | Description |
+| ------ | ----------- |
+| `content-type` | Defines the payload type: `application/json` for JSON, `text/plain` for text, or `application/octet-stream` for binary data. |
+| `x-ably-version` | Webhook version, currently 1.2. |
+| `x-ably-envelope-appid` | Ably `appID` from which the message originated. |
+| `x-ably-envelope-channel` | Name of the Ably channel that sent the message. |
+| `x-ably-envelope-rule-id` | `RuleID` that triggered the webhook event. |
+| `x-ably-envelope-site` | Ably datacenter that processed the event. |
+| `x-ably-envelope-source` | Event source, indicating the type of event: `channel.message`, `channel.presence`, `channel.lifecycle`, or `channel.occupancy`. |
+| `x-ably-message-client-id` | `ClientID` of the connection that sent the event. |
+| `x-ably-message-connection-id` | `ConnectionID` that initiated the event. |
+| `x-ably-message-id` | Unique `messageID` for tracking. |
+| `x-ably-message-timestamp` | Timestamp of when the message was originally sent. |
+
+### Non-enveloped message events
+
+For `message` events, there is one additional header:
+
+| Header | Description |
+| ------ | ----------- |
+| `x-ably-message-name` | The name of the message. |
+
+The payload will contain the data of the message.
+
+For example, if you publish a message to the channel `my_channel` using the following cURL request:
+
+
+```shell
+curl -X POST https://rest.ably.io/channels/my_channel/messages \
+ -u "{{API_KEY}}" \
+ -H "Content-Type: application/json" \
+ --data '{ "name": "publish", "data": "example" }'
+```
+
+
+The `x-ably-message-name` header would be `publish` and the payload would be `example`.
+
+
+### Non-enveloped presence messages
+
+For presence events, there is one additional header:
+
+| Header | Description |
+| ------ | ----------- |
+| `x-ably-message-action` | The action performed by the event (`update`, `enter`, `leave`). |
+
+The payload will contain the [data](/docs/api/realtime-sdk/presence#presence-message) of the presence message.
+
+The following example demonstrates a non-enveloped [enter](/docs/api/realtime-sdk/presence#enter) presence event:
+
+
+```javascript
+const realtime = new Ably.Realtime({
+  key: '{{API_KEY}}',
+  clientId: 'bob'
+});
+const channel = realtime.channels.get('some_channel');
+await channel.presence.enter('some data');
+```
+
+
+The `x-ably-message-action` header would be `enter` and the payload would be `some data`.
+
+## Webhook security
+
+Ably advises you to use a secure HTTPS URL when you configure webhooks. This way, you ensure that all communication with your servers is encrypted with TLS and cannot be intercepted.
+
+In addition, Ably optionally supports signing webhook requests so you can verify their authenticity. This applies to both [single](#single) and [batched](#batched) webhook requests, as well as any streaming integrations that also rely on HTTP-based callbacks. Ably sends the signature in the `X-Ably-Signature` header for batched requests and references the connected key in the `X-Ably-Key` header.
+
+The following steps are required to verify the signature:
+
+1. Start with the webhook request body. This is a JSON string encoded with content-encoding `utf-8`.
+2. Identify the key based on the `keyId` indicated in the `X-Ably-Key` header.
+3. Calculate the HMAC of that request body using the SHA-256 algorithm and the corresponding `keyValue` (the secret part of the key after the "`:`").
+4. Encode the resulting HMAC using RFC 3548 base64.
+5. Compare that result with the signature value in the `X-Ably-Signature` header.
+
+### Sign webhook requests
+
+To verify that signed webhook requests work as expected, try the following:
+
+1. Set up a free [`RequestBin`](https://requestbin.com/) HTTP endpoint test URL.
+2. [Configure](#configure) a webhook with the URL set to the `RequestBin` endpoint. Make sure you choose to [batch](#batched) messages and use a key to sign each webhook request.
+3. Trigger an event using the Dev Console in your app [dashboard](https://ably.com/dashboard/any) to generate a webhook, then confirm that RequestBin received it.
diff --git a/src/pages/docs/platform/integrations/webhooks/lambda.mdx b/src/pages/docs/platform/integrations/webhooks/lambda.mdx
new file mode 100644
index 0000000000..ca142ac3b1
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/lambda.mdx
@@ -0,0 +1,106 @@
+---
+title: AWS Lambda integration
+meta_description: "Trigger AWS Lambda functions based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "AWS Lambda, integrations, events, serverless"
+redirect_from:
+ - /docs/general/events/aws-lambda
+ - /docs/general/webhooks/aws-lambda
+---
+
+[AWS Lambda](https://aws.amazon.com/lambda/) integrations enable you to trigger event-driven serverless compute functions when an event occurs in Ably. They are useful for integrating with various AWS services.
+
+## Create an AWS Lambda integration
+
+To create an AWS Lambda integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with AWS Lambda.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Webhook**.
+5. Choose **AWS Lambda**.
+6. Configure the AWS Lambda [settings](#settings).
+7. Click **Create**.
+
+You can also create an AWS Lambda integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating an AWS Lambda integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| AWS Region | Specifies the region of your AWS Lambda. |
+| Function Name | The name of your AWS Lambda function. |
+| [AWS authentication scheme](#auth) | Choose the authentication method. Either **AWS credentials** or **ARN of an assumable role**. |
+| AWS Credentials | If using AWS credentials, enter the values in `key:value` format. |
+| ARN of an assumable role | If using ARN of an assumable role, enter the ARN of the role that Ably can assume to access your AWS Lambda function. |
+| Qualifier | The qualifier of your Lambda function, if set. |
+| [Source](/docs/integrations/webhooks#sources) | Specifies the event types being sent to your AWS Lambda function. |
+| [Channel filter](/docs/integrations/webhooks#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/webhooks#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. |
+
+## AWS authentication
+
+Delegate access to your AWS resources by creating an [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) that the Ably AWS account can assume.
+
+This approach follows AWS best practices, as it avoids sharing access keys directly. Specify the role's ARN to grant Ably the necessary permissions in a secure manner.
+
+### Create a Lambda policy
+
+The following steps show you how to create a policy for an AWS Lambda.
+
+1. In the IAM console sidebar select **Policies**.
+2. Click **Create Policy**.
+3. Click the JSON tab and enter the following JSON to configure the policy:
+
+
+```json
+{
+ "Version": "2012-10-17",
+ "Statement": [
+ {
+ "Sid": "AllowInvokeLambdaFunction",
+ "Effect": "Allow",
+ "Action": [
+ "lambda:InvokeAsync",
+ "lambda:InvokeFunction"
+ ],
+ "Resource": [
+ "arn:aws:lambda:<YOUR_AWS_REGION>:<YOUR_AWS_ACCOUNT_ID>:function:<YOUR_FUNCTION_NAME>"
+ ]
+ }
+ ]
+}
+```
+
+
+
+
+4. Click **Next: Tags**. You don't need to add any tags.
+5. Click **Next: Review**.
+6. Enter a suitable name for your policy.
+7. Click **Create Policy**.
+
+You have created a policy that grants the permissions required to invoke your Lambda function. You must now attach it to the role that you'll specify in your Ably integration rule.
+
+### Create a role
+
+Create an IAM role as follows:
+
+1. In the AWS IAM console, click **Roles** in the sidebar and then click **Create Role**.
+2. For type of trusted entity select **Another AWS account**.
+3. For Account ID specify 203461409171. This is the Ably AWS account.
+4. Click the **Require external ID** checkbox and then enter an external ID in the form `ACCOUNT_ID.APP_ID`, using your Ably [account ID and app ID](/docs/platform/account/control-api#ids).
+5. Click **Next: Permissions**.
+6. Select the policy you created earlier to attach to this role. You can type the name of your policy into the **Filter policies** search box. Ensure the checkbox for the policy is selected.
+7. Click **Next: Tags**.
+8. You don't need to add tags, so click **Next: Review**.
+9. Enter a suitable name for your role.
+10. Click **Create Role**.
diff --git a/src/pages/docs/platform/integrations/webhooks/zapier.mdx b/src/pages/docs/platform/integrations/webhooks/zapier.mdx
new file mode 100644
index 0000000000..3d048b1513
--- /dev/null
+++ b/src/pages/docs/platform/integrations/webhooks/zapier.mdx
@@ -0,0 +1,40 @@
+---
+title: Zapier integration
+meta_description: "Trigger Zapier based on message, channel lifecycle, channel occupancy, and presence events."
+meta_keywords: "Zapier, integrations, events, serverless"
+redirect_from:
+ - /docs/general/events/zapier
+ - /docs/content//webhooks/zapier
+ - /docs/general/webhooks/zapier
+---
+
+[Zapier](https://zapier.com/page/webhooks) integrations enable you to trigger Zapier Zaps when an event occurs in Ably. They are useful for integrating with thousands of other services using Zapier's webhooks feature.
+
+## Create a Zapier integration
+
+To create a Zapier integration in your [dashboard](https://ably.com/dashboard/any):
+
+1. Log in and select the application you wish to integrate with Zapier.
+2. Click the **Integrations** tab.
+3. Click the **New Integration Rule** button.
+4. Choose **Webhook**.
+5. Choose **Zapier**.
+6. Configure the Zapier [settings](#settings).
+7. Click **Create**.
+
+You can also create a Zapier integration using the [Control API](/docs/platform/account/control-api).
+
+### Settings
+
+The following settings are available when creating a Zapier integration:
+
+| Setting | Description |
+| ------- | ----------- |
+| URL | The Zapier webhook URL to POST a summary of events to. |
+| Headers | Allows the inclusion of additional information in key-value format. |
+| Request Mode | Choose between **Single Request** or **Batch Request**. |
+| [Source](/docs/integrations/webhooks#sources) | Specifies the event types being sent to Zapier. |
+| [Channel filter](/docs/integrations/webhooks#filter) | Filters the source channels based on a regular expression. |
+| Encoding | Specifies the encoding format of messages. Either JSON or MsgPack. |
+| [Enveloped](/docs/integrations/webhooks#enveloped) | Checkbox to set whether messages should be enveloped or not. Enveloped is the default. Only available when **Request Mode** is set to **Single Request**. |
+| Sign with key | Payloads will be signed with an API key so they can be validated by your servers. Only available when **Request Mode** is set to **Batch Request**. |
From a78f7b37aa34391518f176a141c055915a2e28b3 Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:29:39 +0200
Subject: [PATCH 4/7] Convert integrations overview and queues to MDX
---
content/integrations/index.textile | 52 ----
.../docs/platform/integrations/index.mdx | 52 ++++
.../docs/platform/integrations/queues.mdx | 245 ++++++++++--------
.../integrations/skip-integrations.mdx | 69 +++--
4 files changed, 217 insertions(+), 201 deletions(-)
delete mode 100644 content/integrations/index.textile
create mode 100644 src/pages/docs/platform/integrations/index.mdx
rename content/integrations/queues.textile => src/pages/docs/platform/integrations/queues.mdx (67%)
rename content/integrations/skip-integrations.textile => src/pages/docs/platform/integrations/skip-integrations.mdx (88%)
diff --git a/content/integrations/index.textile b/content/integrations/index.textile
deleted file mode 100644
index 78cadd28d7..0000000000
--- a/content/integrations/index.textile
+++ /dev/null
@@ -1,52 +0,0 @@
----
-title: Integrations overview
-meta_description: "Integrations enable external services to send data to Ably channels, and for Ably events to send their data to external services."
-meta_keywords: "integrations, integration, integrate, stream, external service, webhook, webhooks, functions"
-redirect_from:
- - /docs/general/integrations
- - /docs/integrations/aws-authentication
----
-
-Ably integrations enable you to send your data from Ably to an external service or push data into Ably from an external service.
-
-h2(#inbound). Inbound integrations
-
-Inbound integrations are where one of your external systems is sending data into Ably.
-
-* "Inbound webhooks":/docs/integrations/inbound/webhooks enable you to configure an endpoint for generated requests to be picked up by Ably and published to a channel.
-* The "Kafka connector":/docs/integrations/inbound/kafka-connector enables you to send data from one or more Kafka topics into Ably channels.
-
-h2(#outbound). Outbound webhooks
-
-"Outbound webhooks":/docs/integrations/webhooks enable you to push data to an external system from within Ably. These events that happen within Ably include messages being published to a channel, presence events being emitted, and changes in the channel occupancy and activity.
-
-The following pre-built webhooks can be configured:
-
-* "AWS Lambda functions":/docs/integrations/webhooks/lambda
-* "Google Cloud functions":/docs/integrations/webhooks/gcp-function
-* "Zapier":/docs/integrations/webhooks/zapier
-* "Cloudflare Workers":/docs/integrations/webhooks/cloudflare
-* "IFTTT":/docs/integrations/webhooks/ifttt
-* "Datadog":/docs/integrations/webhooks/datadog
-
-h2(#continuous). Outbound streaming
-
-"Outbound streaming":/docs/integrations/streaming involves streaming a constant flow of data from Ably to other streaming or queuing services. This is useful for integrating Ably with large-scale, event-driven architectures or data pipelines.
-
-The following pre-built services can be configured:
-
-* "Kafka":/docs/integrations/streaming/kafka
-* "AWS Kinesis":/docs/integrations/streaming/kinesis
-* "AMQP":/docs/integrations/streaming/amqp
-* "AWS SQS":/docs/integrations/streaming/sqs
-* "Apache Pulsar":/docs/integrations/streaming/pulsar
-
-h2(#queues). Message queues
-
-"Message queues":/docs/integrations/queues enable asynchronous communication between a queueing pattern. Producers (Ably channels) publish messages to a queue, and consumers retrieve them in a first-in, first-out order.
-
-Whilst pub-sub channels broadcast messages to all subscribers, queues distribute work among consumers. Both patterns serve different use cases. For example, pub/sub is ideal for many users to receive realtime updates, while queues handle tasks like triggering emails efficiently.
-
-h2(#skip). Skip integrations
-
-Privileged users can "skip integrations":docs/integrations/skip-integrations on a per-message basis, providing greater control and flexibility when publishing messages to a channel.
diff --git a/src/pages/docs/platform/integrations/index.mdx b/src/pages/docs/platform/integrations/index.mdx
new file mode 100644
index 0000000000..548ff36f42
--- /dev/null
+++ b/src/pages/docs/platform/integrations/index.mdx
@@ -0,0 +1,52 @@
+---
+title: Integrations overview
+meta_description: "Integrations enable external services to send data to Ably channels, and for Ably events to send their data to external services."
+meta_keywords: "integrations, integration, integrate, stream, external service, webhook, webhooks, functions"
+redirect_from:
+ - /docs/general/integrations
+ - /docs/integrations/aws-authentication
+---
+
+Ably integrations enable you to send your data from Ably to an external service or push data into Ably from an external service.
+
+## Inbound integrations
+
+Inbound integrations enable your external systems to send data into Ably.
+
+* [Inbound webhooks](/docs/integrations/inbound/webhooks) enable you to configure an endpoint for generated requests to be picked up by Ably and published to a channel.
+* The [Kafka connector](/docs/integrations/inbound/kafka-connector) enables you to send data from one or more Kafka topics into Ably channels.
+
+## Outbound webhooks
+
+[Outbound webhooks](/docs/integrations/webhooks) enable you to push data to an external system from within Ably. Events that can trigger a webhook include messages being published to a channel, presence events being emitted, and changes in channel occupancy and activity.
+
+The following pre-built webhooks can be configured:
+
+* [AWS Lambda functions](/docs/platform/integrations/webhooks/lambda)
+* [Google Cloud functions](/docs/platform/integrations/webhooks/gcp-function)
+* [Zapier](/docs/platform/integrations/webhooks/zapier)
+* [Cloudflare Workers](/docs/platform/integrations/webhooks/cloudflare)
+* [IFTTT](/docs/platform/integrations/webhooks/ifttt)
+* [Datadog](/docs/platform/integrations/streaming/datadog)
+
+## Outbound streaming
+
+[Outbound streaming](/docs/integrations/streaming) involves streaming a constant flow of data from Ably to other streaming or queuing services. This is useful for integrating Ably with large-scale, event-driven architectures or data pipelines.
+
+The following pre-built services can be configured:
+
+* [Kafka](/docs/integrations/streaming/kafka)
+* [AWS Kinesis](/docs/integrations/streaming/kinesis)
+* [AMQP](/docs/integrations/streaming/amqp)
+* [AWS SQS](/docs/integrations/streaming/sqs)
+* [Apache Pulsar](/docs/integrations/streaming/pulsar)
+
+## Message queues
+
+[Message queues](/docs/integrations/queues) enable asynchronous communication using a queueing pattern. Producers (Ably channels) publish messages to a queue, and consumers retrieve them in first-in, first-out order.
+
+Whilst pub/sub channels broadcast messages to all subscribers, queues distribute work among consumers. The two patterns serve different use cases: pub/sub is ideal for delivering realtime updates to many users, while queues are suited to tasks such as triggering emails.
+
+## Skip integrations
+
+Privileged users can [skip integrations](/docs/platform/integrations/skip-integrations) on a per-message basis, providing greater control and flexibility when publishing messages to a channel.
diff --git a/content/integrations/queues.textile b/src/pages/docs/platform/integrations/queues.mdx
similarity index 67%
rename from content/integrations/queues.textile
rename to src/pages/docs/platform/integrations/queues.mdx
index cf7a4108a7..2659cedcc3 100644
--- a/content/integrations/queues.textile
+++ b/src/pages/docs/platform/integrations/queues.mdx
@@ -2,8 +2,6 @@
title: Ably Queues
meta_description: "Ably queues provide a queueing mechanism to integrate Ably with your external service."
meta_keywords: "Ably queues, queueing, integrations"
-languages:
- - nodejs
redirect_from:
- /docs/general/versions/v1.1/queues
- /docs/general/versions/v1.0/queues
@@ -13,12 +11,12 @@ redirect_from:
Ably Queues are traditional message queues that provide a mechanism for you to consume, process, store or reroute data from Ably to your servers. Queues provide an asynchronous machine-to-machine protocol, with each machine assuming one, or both, roles:
-- Producers := Publish messages to a queue
-- Consumers := Retrieve messages from the queue.
+* Producers: Publish messages to a queue.
+* Consumers: Retrieve messages from the queue.
Ably Queues guarantee at least once delivery using a message acknowledgement protocol. Ably also provides reliable ordering of messages by channel. For example, if messages published in a single channel are republished to a queue, and there is only one consumer for that queue, then the consumer will receive the messages in the order they were published. However, if you have more than one consumer, reliable ordering is not possible. Equally, if you have messages from multiple channels, reliable ordering is only supported per channel not across all channels.
-h2(#lifecycle). Queues lifecycle
+## Queues lifecycle
Messages enter a queue through the following process:
@@ -35,9 +33,7 @@ Messages published to a queue go through the following lifecycle:
The following diagram illustrates the lifecycle of a message in an Ably Queue:
-
-
-
+
Queues decouple producers and consumers in your system:
@@ -45,7 +41,7 @@ Queues decouple producers and consumers in your system:
* You can increase throughput capacity by adding more consumers, as each one pulls messages independently.
* Ably stores messages until a consumer acknowledges successful processing, so if a consumer crashes or disconnects, no data is lost.
-h2(#use-case). Use cases
+## Use cases
The following are some common use cases for Ably Queues:
@@ -56,55 +52,56 @@ The following are some common use cases for Ably Queues:
* You want a backlog of messages if your consumers fall behind or go offline, ensuring no data is lost.
* You need to provision the necessary queues ahead of time, for example, one queue for chat messages and another for analytics events.
-h2(#provision). Provision a Queue
+## Provision a Queue
To get started you need to provision a physical queue and set up a queue rule to move the published messages from a channel into the queue.
You can provision one or more queues for your app, however by default each new app is provisioned without any queues.
-
+
The following steps explain how to provision an Ably Queue:
-1. Go to the *Queues* tab of an app in your Ably "dashboard.":https://ably.com/accounts/any/apps/any/queues
-2. Click *Provision a new Queue*.
+1. Go to the **Queues** tab of an app in your Ably [dashboard](https://ably.com/accounts/any/apps/any/queues).
+2. Click **Provision a new Queue**.
3. In the New Queue section, fill in the following fields:
-** *Name* - Enter a unique name for your queue. This name, combined with your app ID, forms the full queue identifier.
-** *Region* - Select the physical region where the queue will be hosted, for example, US East (Virginia). All queues are replicated across two datacenters in that region for high availability.
-** *TTL (minutes)* - Set the time-to-live for messages. The default and maximum is 60 minutes. Messages that are not consumed within the TTL will be moved to the "Dead Letter Queue":#deadletter.
-** *Max length* - Define the maximum number of messages the queue can retain. The default and maximum is 10,000. When the queue is full, the oldest message is moved to the Dead Letter Queue to make space.
-4. Click *Create* to finish provisioning the queue.
+ * **Name** - Enter a unique name for your queue. This name, combined with your app ID, forms the full queue identifier.
+ * **Region** - Select the physical region where the queue will be hosted, for example, US East (Virginia). All queues are replicated across two datacenters in that region for high availability.
+ * **TTL (minutes)** - Set the time-to-live for messages. The default and maximum is 60 minutes. Messages that are not consumed within the TTL will be moved to the [Dead Letter Queue](#deadletter).
+ * **Max length** - Define the maximum number of messages the queue can retain. The default and maximum is 10,000. When the queue is full, the oldest message is moved to the Dead Letter Queue to make space.
+4. Click **Create** to finish provisioning the queue.
-
+
-h3(#config). Configure a Queue rule
+### Configure a Queue rule
After you provision a Queue, create one or more Queue rules to republish messages, presence events, or channel events from channels into that queue.
The following steps explain how to set up a queue rule:
-1. Go to the *Integrations* tab of an app in your Ably "dashboard.":https://ably.com/accounts/any/apps/any/integrations
-2. Click *New Integration Rule*.
-3. In the *Select your rule type* section, choose Queue.
-4. In the *Choose queue* dropdown, select the queue you want to send data to.
+1. Go to the **Integrations** tab of an app in your Ably [dashboard](https://ably.com/accounts/any/apps/any/integrations).
+2. Click **New Integration Rule**.
+3. In the **Select your rule type** section, choose Queue.
+4. In the **Choose queue** dropdown, select the queue you want to send data to.
5. (Optional) Add custom Headers as key:value pairs to include metadata.
-** Click *Another header* to add additional headers.
-6. In the "*Source*":/docs/integrations/webhooks#sources dropdown, select which type of Ably event should trigger the rule.
-7. (Optional) Add a "*Channel Filter*":/docs/integrations/webhooks#channel-filter using a regular expression to apply the rule to specific channels. Leave empty to apply to all.
-8. Choose an "*Encoding*":/docs/integrations/webhooks#encoding format.
-9. Click *Create* to finish setting up the rule.
+   * Click **Another header** to add additional headers.
+6. In the [**Source**](/docs/integrations/webhooks#sources) dropdown, select which type of Ably event should trigger the rule.
+7. (Optional) Add a [**Channel Filter**](/docs/integrations/webhooks#channel-filter) using a regular expression to apply the rule to specific channels. Leave empty to apply to all.
+8. Choose an [**Encoding**](/docs/integrations/webhooks#encoding) format.
+9. Click **Create** to finish setting up the rule.
-h3(#stats). Queue stats
+### Queue stats
-Provisioned queues are visible in your app "dashboard":https://ably.com/accounts/any/apps/any/queues and provide realtime stats for the current state of each queue.
+Provisioned queues are visible in your app [dashboard](https://ably.com/accounts/any/apps/any/queues) and provide realtime stats for the current state of each queue.
The following table describes the sections of the queue stats:
-|_. Section |_. Purpose |
+| Section | Description |
+| ------- | ----------- |
| Name | Displays the unique identifier for the queue. |
| State | Indicates the current operational state of the queue. |
| Region | Specifies the physical data center location of the queue. |
@@ -124,7 +121,8 @@ The queue stats show the current state of your queue. Your app and account dashb
The following table describes the sections of the stats:
-|_. Section |_.Purpose |
+| Section | Description |
+| ------- | ----------- |
| Messages total | Shows the total number of messages processed during the specified time period. |
| Messages published (REST & Realtime) | Displays the total number of messages sent to Ably. |
| Messages received (Realtime) | Shows the number of messages delivered to subscribers through Realtime channels. |
@@ -140,29 +138,34 @@ The following table describes the sections of the stats:
| Data transferred | Displays the amount of data transferred through the Ably system during the specified period. |
| Avg. message size | Indicates the average size of messages processed during the specified time frame. |
-h2(#cli). Use a CLI to consume messages
+## Use a CLI to consume messages
-You can verify that messages are reaching your queue by consuming them from the command line using the "AMQP protocol":https://www.npmjs.com/package/amqp-consume-cli.
+You can verify that messages are reaching your queue by consuming them from the command line using the [AMQP protocol](https://www.npmjs.com/package/amqp-consume-cli).
-The following installs the @amqp-consume-cli@ package globally:
+The following installs the `amqp-consume-cli` package globally:
-```[sh]
+
+```shell
npm install amqp-consume-cli -g
```
+
-Once installed, go to your Ably "dashboard":https://ably.com/accounts/any/apps/any/app+keys and retrieve an API key that has the capability to subscribe to queues.
+Once installed, go to your Ably [dashboard](https://ably.com/accounts/any/apps/any/app+keys) and retrieve an API key that has the capability to subscribe to queues.
The following example shows how to consume from a queue using the CLI:
-```[sh]
+
+```shell
amqp-consume --queue-name [Name] \
--host [Server endpoint host] --port [Server endpoint port] \
--ssl --vhost shared --creds [your API key]
```
+
-The following is an example of the @amqp-consume@ output you will see when a message is published to the queue:
+The following is an example of the `amqp-consume` output you will see when a message is published to the queue:
-```[sh]
+
+```text
Message received
Attributes: { contentType: 'application/json',
headers: {},
@@ -185,9 +188,9 @@ Data: {
]
}
```
+
-h2(#amqp). Consume messages using AMQP
-
+## Consume messages using AMQP
The AMQP protocol provides a rich set of functionality to amongst other things bind to exchanges, provision queues and configure routing. This functionality exists so that queues can be dynamically provisioned by clients and messages can be routed to these queues as required.
@@ -195,19 +198,22 @@ Unlike Ably channels, queues are pre-provisioned and all routing is handled by q
It is possible to consume multiple queues from a single connection, and also consume more than one message at a time. Refer to your AMQP library's documentation to enable these capabilities.
-The following steps explain how to consume messages from an Ably Queue using the AMQP protocol:
+You need the following to consume messages from an Ably Queue using the AMQP protocol:
-- @Queue name@ := For example: UATwBQ:example-queue, formed from your app ID and the name you gave the queue.
-- @Host@ := For example: @us-east-1-a-queue.ably.io@
-- @Port@ := @5671@, the TLS-only port supported for secure AMQP consumption.
-- @Vhost@ := Always shared
-- @Username@ := The part before the colon in your API key. For example, @APPID.KEYID@ from @APPID.KEYID:SECRET@.
-- @Password@ := The part after the colon in your API key. For example, @SECRET@ from @APPID.KEYID:SECRET@.
-- @AMQP Client@ := Connect using any AMQP 0.9.1 compatible client that supports TLS.
+| Value | Description |
+| ----- | ----------- |
+| `Queue name` | For example: `UATwBQ:example-queue`, formed from your app ID and the name you gave the queue. |
+| `Host` | For example: `us-east-1-a-queue.ably.io` |
+| `Port` | `5671`, the TLS-only port supported for secure AMQP consumption. |
+| `Vhost` | Always `shared` |
+| `Username` | The part before the colon in your API key. For example, `APPID.KEYID` from `APPID.KEYID:SECRET`. |
+| `Password` | The part after the colon in your API key. For example, `SECRET` from `APPID.KEYID:SECRET`. |
+| `AMQP Client` | Connect using any AMQP 0.9.1 compatible client that supports TLS. |
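The username and password are simply the two halves of an Ably API key. As a sketch (the helper name is an assumption, and the host is the example value from the table above):

```javascript
// Build an AMQPS connection URL from an Ably API key (sketch).
// An API key has the form APPID.KEYID:SECRET - the part before the ":" is
// the AMQP username and the part after it is the password.
function amqpUrl(apiKey, host) {
  const [username, password] = apiKey.split(':');
  // The vhost is always "shared"; TLS (amqps) is required.
  return `amqps://${username}:${password}@${host}/shared`;
}

const url = amqpUrl('APPID.KEYID:SECRET', 'us-east-1-a-queue.ably.io');
// url is "amqps://APPID.KEYID:SECRET@us-east-1-a-queue.ably.io/shared"
```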
The following example shows how to consume from a queue in Node.js:
-```[nodejs]
+
+```nodejs
const url = 'amqps://APPID.KEYID:SECRET@us-east-1-a-queue.ably.io/shared'
amqp.connect(url, (err, conn) => {
if (err) { return handleError(err) }
@@ -236,34 +242,38 @@ amqp.connect(url, (err, conn) => {
})
})
```
+
The Node.js example above shows how to consume messages from a queue using AMQP. Take note of the following:
* The queue rule wraps each message in an envelope (default behavior). The first step is to parse the envelope JSON.
-* The @Message.fromEncodedArray@ method decodes the payload into standard "@Message@":/docs/api/realtime-sdk/types#message objects.
-* While messages is always an @Array@, each envelope currently contains only one message.
+* The `Message.fromEncodedArray` method decodes the payload into standard [`Message`](/docs/api/realtime-sdk/types#message) objects.
+* While `messages` is always an `Array`, each envelope currently contains only one message.
-
+
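The envelope-handling steps above can be sketched without the Ably SDK. The `sampleEnvelope` below is an illustrative payload; in practice the envelope arrives as the raw queue message body, and `Message.fromEncodedArray` then decodes the extracted payload:

```javascript
// Sketch: parse an enveloped delivery and pull out the single message.
// sampleEnvelope is illustrative; a real envelope is the raw queue
// message body, and Message.fromEncodedArray decodes the payload.
const sampleEnvelope = JSON.stringify({
  source: 'channel.message',
  appId: 'ael724',
  channel: 'foo',
  messages: [{ id: 'vjzxPR-XK3:3:0', name: 'event', data: 'payload' }],
});

const envelope = JSON.parse(sampleEnvelope); // step 1: parse the envelope JSON
const [message] = envelope.messages;         // each envelope holds one message
console.log(message.name);                   // payload then goes to Message.fromEncodedArray
```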
-h2(#stomp). Consume messages using STOMP
+## Consume messages using STOMP
The STOMP protocol is a simple text-based protocol designed for working with message-oriented middleware. It provides an interoperable wire format that allows STOMP clients to talk with any message broker that supports the STOMP protocol, making it a good fit for use with Ably Queues.
-The following steps explain the details you will need to use the STOMP protocol:
+You need the following to consume messages from an Ably Queue using the STOMP protocol:
-- @Queue name@ := For @UATwBQ:example-queue@, formed from your app ID and the name you gave the queue.
-- @Host@ := For example: @us-east-1-a-queue.ably.io@
-- @Port@ := 61614, the TLS-only STOMP port (different from AMQP's port).
-- @Vhost@ := Always shared
-- @Username@ := The part before the colon in your API key. For example, @APPID.KEYID@ from @APPID.KEYID:SECRET@.
-- @Password@ := The part after the colon in your API key. For example, @SECRET@ from A@PPID.KEYID:SECRET@.
-- @STOMP Client@ := Connect using any STOMP client that supports TLS.
+| Value | Description |
+| ----- | ----------- |
+| `Queue name` | For example: `UATwBQ:example-queue`, formed from your app ID and the name you gave the queue. |
+| `Host` | For example: `us-east-1-a-queue.ably.io` |
+| `Port` | `61614`, the TLS-only STOMP port (different from AMQP's port). |
+| `Vhost` | Always `shared` |
+| `Username` | The part before the colon in your API key. For example, `APPID.KEYID` from `APPID.KEYID:SECRET`. |
+| `Password` | The part after the colon in your API key. For example, `SECRET` from `APPID.KEYID:SECRET`. |
+| `STOMP Client` | Connect using any STOMP client that supports TLS. |
-The following is a simple Node.js example using the @stomp/stompjs@ SDK:
+The following is a simple Node.js example using the `stompit` SDK:
-```[nodejs]
+
+```nodejs
const connectOptions = {
'host': 'us-east-1-a-queue.ably.io',
'port': 61614, /* STOMP TLS port */
@@ -307,26 +317,28 @@ Stompit.connect(connectOptions, (error, client) => {
})
})
```
+
The Node.js example above shows how to consume messages from a queue using STOMP. Take note of the following:
* The queue rule wraps each message in an envelope (default setting). The first step is to parse the envelope JSON.
-* Use the @Message.fromEncodedArray@ method to decode the message into standard Message objects.
-* While messages is always an @Array@, each envelope currently contains only one message.
+* Use the `Message.fromEncodedArray` method to decode the message into standard Message objects.
+* While `messages` is always an `Array`, each envelope currently contains only one message.
-h2(#enveloped). Enveloped and non-enveloped messages
+## Enveloped and non-enveloped messages
-When you configure a queue rule, Ably gives you the option to envelope messages. This option is enabled by default. In most cases, using envelopes adds flexibility. Ably includes additional metadata in a portable format, such as the @clientId@ of the publisher and the @channel@ the message came from.
+When you configure a queue rule, Ably gives you the option to envelope messages. This option is enabled by default. In most cases, using envelopes adds flexibility. Ably includes additional metadata in a portable format, such as the `clientId` of the publisher and the `channel` the message came from.
If performance is your priority, you can disable envelopes. In that case, Ably publishes only the message payload. This removes one layer of parsing but requires you to decode the raw payload yourself.
-By default, Ably encodes messages sent to queues in JSON. However, you can choose to use "MsgPack":https://msgpack.org/, a binary format, when configuring your queue rules.
+By default, Ably encodes messages sent to queues in JSON. However, you can choose to use [MsgPack](https://msgpack.org/), a binary format, when configuring your queue rules.
-h3(#envelope-message). Enveloped message
+### Enveloped message
The following example shows the data sent to the queue when a message is published with an envelope (no additional headers are required):
-```[json]
+
+```json
{
"source": "channel.message",
"appId":"ael724",
@@ -344,23 +356,27 @@ The following example shows the data sent to the queue when a message is publish
]
}
```
+
-h3(#no-envelope-message). Non-enveloped message
+### Non-enveloped message
The following shows the headers sent to the queue when a message is published without an envelope:
-- @X-ABLY-ENVELOPE-SOURCE@ := @channel.message@
-- @X-ABLY-ENVELOPE-APPID@ := @ael724@
-- @X-ABLY-ENVELOPE-CHANNEL@ := @foo@
-- @X-ABLY-ENVELOPE-SITE@ := @eu-west-1-A@
-- @X-ABLY-ENVELOPE-RULE-ID@ := @wYge7g@
-- @X-ABLY-MESSAGE-ID@ := @vjzxPR-XK3:3:0@
-- @X-ABLY-MESSAGE-TIMESTAMP@ := @1485914937909@
-- @X-ABLY-MESSAGE-CONNECTION-ID@ := @vjzxPR-XK3@
+| Header | Example value |
+| ------ | ------------- |
+| `X-ABLY-ENVELOPE-SOURCE` | `channel.message` |
+| `X-ABLY-ENVELOPE-APPID` | `ael724` |
+| `X-ABLY-ENVELOPE-CHANNEL` | `foo` |
+| `X-ABLY-ENVELOPE-SITE` | `eu-west-1-A` |
+| `X-ABLY-ENVELOPE-RULE-ID` | `wYge7g` |
+| `X-ABLY-MESSAGE-ID` | `vjzxPR-XK3:3:0` |
+| `X-ABLY-MESSAGE-TIMESTAMP` | `1485914937909` |
+| `X-ABLY-MESSAGE-CONNECTION-ID` | `vjzxPR-XK3` |
The following example shows the data sent to the queue when a message is published without an envelope:
-```[json]
+
+```json
{
"source": "channel.message",
"appId":"ael724",
@@ -378,12 +394,14 @@ The following example shows the data sent to the queue when a message is publish
]
}
```
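Without an envelope, the metadata above arrives only in the delivery headers, so you reassemble it yourself. The `extractMessage` helper below is an illustrative sketch, not part of any Ably SDK:

```javascript
// Sketch: reassemble message metadata from a non-enveloped delivery.
// extractMessage is an illustrative helper; the headers mirror the
// X-ABLY-* values shown in the table above.
function extractMessage(headers, rawPayload) {
  return {
    channel: headers['X-ABLY-ENVELOPE-CHANNEL'],
    id: headers['X-ABLY-MESSAGE-ID'],
    timestamp: Number(headers['X-ABLY-MESSAGE-TIMESTAMP']),
    data: rawPayload.toString(), // raw payload: decoding is up to you
  };
}

const message = extractMessage(
  {
    'X-ABLY-ENVELOPE-CHANNEL': 'foo',
    'X-ABLY-MESSAGE-ID': 'vjzxPR-XK3:3:0',
    'X-ABLY-MESSAGE-TIMESTAMP': '1485914937909',
  },
  Buffer.from('{"text":"hello"}')
);
console.log(message.channel); // "foo"
```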
+
-h3(#envelope-presence). Enveloped presence message
+### Enveloped presence message
The following example shows the data sent to the queue when a presence message is published with an envelope (no additional headers are required):
-```[json]
+
+```json
{
"source": "channel.presence",
"appId":"ael724",
@@ -402,29 +420,33 @@ The following example shows the data sent to the queue when a presence message i
]
}
```
+
-
+
-h3(#no-envelope-presence). Non-enveloped presence message
+### Non-enveloped presence message
The following shows the headers sent to the queue when a presence message is published without an envelope:
-- @X-ABLY-ENVELOPE-SOURCE@ := @channel.presence@
-- @X-ABLY-ENVELOPE-APPID@ := @ael724@
-- @X-ABLY-ENVELOPE-CHANNEL@ := @foo@
-- @X-ABLY-ENVELOPE-SITE@ := @eu-west-1-A@
-- @X-ABLY-ENVELOPE-RULE-ID@ := @wYge7g@
-- @X-ABLY-MESSAGE-ID@ := @vjzxPR-XK3:5:0@
-- @X-ABLY-MESSAGE-TIMESTAMP@ := @1485914937909@
-- @X-ABLY-MESSAGE-CONNECTION-ID@ := @vjzxPR-XK3@
-- @X-ABLY-MESSAGE-CLIENT-ID@ := @bob@
-- @X-ABLY-MESSAGE-ACTION@ := @enter@
+| Header | Example value |
+| ------ | ------------- |
+| `X-ABLY-ENVELOPE-SOURCE` | `channel.presence` |
+| `X-ABLY-ENVELOPE-APPID` | `ael724` |
+| `X-ABLY-ENVELOPE-CHANNEL` | `foo` |
+| `X-ABLY-ENVELOPE-SITE` | `eu-west-1-A` |
+| `X-ABLY-ENVELOPE-RULE-ID` | `wYge7g` |
+| `X-ABLY-MESSAGE-ID` | `vjzxPR-XK3:5:0` |
+| `X-ABLY-MESSAGE-TIMESTAMP` | `1485914937909` |
+| `X-ABLY-MESSAGE-CONNECTION-ID` | `vjzxPR-XK3` |
+| `X-ABLY-MESSAGE-CLIENT-ID` | `bob` |
+| `X-ABLY-MESSAGE-ACTION` | `enter` |
The following example shows the data sent to the queue when a presence message is published without an envelope:
-```[json]
+
+```json
{
"source": "channel.presence",
"appId":"ael724",
@@ -443,8 +465,9 @@ The following example shows the data sent to the queue when a message is publish
]
}
```
+
-h2(#deadletter). Dead Letter Queue
+## Dead Letter Queue
When you provision a Queue, Ably also provisions a Dead Letter Queue (DLQ) automatically. This special queue stores messages that fail processing or expire before being consumed.
@@ -452,15 +475,15 @@ It is recommended that you consume messages from the Dead Letter Queue so you ca
Ably moves messages into the Dead Letter Queue in the following cases:
-* The message is rejected; @basic.reject@ or @basic.nack@, with @requeue=false@.
+* The message is rejected with `basic.reject` or `basic.nack` and `requeue=false`.
* The message exceeds its TTL and expires.
* The Queue reaches its maximum length and a new message is published. In this case, Ably removes the oldest message from the queue and places it in the Dead Letter Queue to make room.
Ably deletes any message in the Dead Letter Queue if it later meets one of these conditions. For example, if it expires due to TTL. These messages are not recoverable.
-Ably names the Dead Letter Queue using the reserved format @APPID:deadletter@, where @APPID@ is your app's ID. Each app with one or more queues has exactly one Dead Letter Queue, and you'll see it listed in the Queues dashboard. You can subscribe to it like any other queue.
+Ably names the Dead Letter Queue using the reserved format `APPID:deadletter`, where `APPID` is your app's ID. Each app with one or more queues has exactly one Dead Letter Queue, and you'll see it listed in the Queues dashboard. You can subscribe to it like any other queue.
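As a minimal sketch, the DLQ name can be derived from an API key as described above. The `deadLetterQueueName` helper is illustrative only:

```javascript
// Sketch: derive the Dead Letter Queue name for an app.
// deadLetterQueueName is an illustrative helper; APPID is the part of
// your API key before the first ".".
function deadLetterQueueName(apiKey) {
  const appId = apiKey.split('.')[0];
  return `${appId}:deadletter`;
}

console.log(deadLetterQueueName('UATwBQ.KEYID:SECRET')); // "UATwBQ:deadletter"
```

You can then subscribe to that queue name over AMQP or STOMP exactly as shown in the consumption examples above.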
-h2(#scalability). Queue scalability
+## Queue scalability
Ably Queues are offered in two flavors: multi-tenanted and dedicated.
@@ -469,4 +492,4 @@ The multi-tenanted service is provided as part of the core platform to all custo
For customers with more demanding requirements of up to millions of messages per second, Ably has two solutions:
* Dedicated queue clusters that scale to millions of messages, for enterprise customers only.
-* An "outbound streaming integration":/docs/integrations/streaming to stream your realtime data directly into your own streaming or queueing service.
+* An [outbound streaming integration](/docs/integrations/streaming) to stream your realtime data directly into your own streaming or queueing service.
diff --git a/content/integrations/skip-integrations.textile b/src/pages/docs/platform/integrations/skip-integrations.mdx
similarity index 88%
rename from content/integrations/skip-integrations.textile
rename to src/pages/docs/platform/integrations/skip-integrations.mdx
index 190be76402..16d4ae76a4 100644
--- a/content/integrations/skip-integrations.textile
+++ b/src/pages/docs/platform/integrations/skip-integrations.mdx
@@ -2,34 +2,24 @@
title: Skip integrations
meta_description: "Learn how to skip integrations on a per-message basis, including examples for skipping all or specific integration rules."
meta_keywords: "Ably, skip integrations, skipRule, message extras, privileged headers, integration rules, Control API, channel messaging"
-languages:
- - javascript
- - nodejs
- - php
- - python
- - ruby
- - java
- - swift
- - objc
- - csharp
- - go
---
Privileged users can skip integrations on a per-message basis, providing greater control and flexibility when publishing messages to a channel. Skipping integrations helps avoid infinite loops, for example, when an integration republishes a message to the same channel, potentially triggering itself again.
A strong use case for skipping integrations is in chat applications. For example, a moderation service publishes a command telling clients to edit or delete a message. That command should not trigger further moderation events by itself.
-
+
-h2. Skip all integrations
+## Skip all integrations
-To skip all integration rules for a specific message, set the @skipRule@ field to @'*'@ in the @privileged@ section of the message "@extras@":/docs/api/rest-sdk/messages#extras.
+To skip all integration rules for a specific message, set the `skipRule` field to `'*'` in the `privileged` section of the message [`extras`](/docs/api/rest-sdk/messages#extras).
The following example shows how to skip all integration rules when publishing a message to a channel:
-```[javascript]
+
+```javascript
const rest = new Ably.Rest('{{API_KEY}}');
const channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}');
await channel.publish({
@@ -43,7 +33,7 @@ await channel.publish({
});
```
-```[nodejs]
+```nodejs
const rest = new Ably.Rest('{{API_KEY}}');
const channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}');
await channel.publish({
@@ -57,7 +47,7 @@ await channel.publish({
});
```
-```[ruby]
+```ruby
rest = Ably::Rest.new('{{API_KEY}}')
channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}')
while true
@@ -65,7 +55,7 @@ await channel.publish({
end
```
-```[python]
+```python
rest = AblyRest('{{API_KEY}}')
channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}')
extras = {
@@ -77,7 +67,7 @@ await channel.publish({
await channel.publish(Message(name='message', data="abc", extras=extras))
```
-```[php]
+```php
$rest = new Ably\AblyRest('{{API_KEY}}');
$channel = $rest->channels->get('{{RANDOM_CHANNEL_NAME}}');
$channel->publish(
@@ -92,7 +82,7 @@ await channel.publish({
);
```
-```[java]
+```java
AblyRest rest = new AblyRest("{{API_KEY}}");
Channel channel = rest.channels.get("{{RANDOM_CHANNEL_NAME}}");
@@ -108,7 +98,7 @@ await channel.publish({
);
```
-```[csharp]
+```csharp
AblyRest rest = new AblyRest("{{API_KEY}}");
var channel = rest.Channels.Get("{{RANDOM_CHANNEL_NAME}}");
@@ -119,7 +109,7 @@ await channel.publish({
channel.Publish(message);
```
-```[objc]
+```objc
ARTRest *rest = [[ARTRest alloc] initWithKey:@"{{API_KEY}}"];
ARTRestChannel *channel = [rest.channels get:@"{{RANDOM_CHANNEL_NAME}}"];
ARTJsonObject *extras = @{
@@ -128,14 +118,14 @@ await channel.publish({
[channel publish:@"event" data:@"data" extras:extras];
```
-```[swift]
+```swift
let rest = ARTRest(key: "{{API_KEY}}")
let channel = rest.channels.get("{{RANDOM_CHANNEL_NAME}}")
let extras: NSDictionary = ["privileged": ["skipRule": "*"]]
channel.publish("event", data: "data", extras: extras as ARTJsonCompatible)
```
-```[go]
+```go
rest, err := ably.NewREST(ably.WithKey("{{API_KEY}}"))
channel := rest.Channels.Get("{{RANDOM_CHANNEL_NAME}}")
privileged := make(map[string]string)
@@ -147,15 +137,17 @@ await channel.publish({
})
```
+
-h2. Skip specific integration rules
+## Skip specific integration rules
-You can also skip specific integration rules by including their ruleId in an array passed to skipRule. Rule IDs can be found in the Integrations tab of your Ably "dashboard":https://ably.com/dashboard/any, via the Control API, or in the message envelope.
+You can also skip specific integration rules by including their `ruleId` in an array passed to `skipRule`. Rule IDs can be found in the Integrations tab of your Ably [dashboard](https://ably.com/dashboard/any), via the Control API, or in the message envelope.
The following example shows how to skip specific integration rules when publishing a message to a channel:
-```[javascript]
+
+```javascript
const rest = new Ably.Rest('{{API_KEY}}');
const channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}');
await channel.publish({
@@ -167,7 +159,7 @@ await channel.publish({
})
```
-```[nodejs]
+```nodejs
const rest = new Ably.Rest('{{API_KEY}}');
const channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}');
await channel.publish({
@@ -179,7 +171,7 @@ await channel.publish({
})
```
-```[ruby]
+```ruby
rest = Ably::Rest.new('{{API_KEY}}')
channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}')
while true
@@ -187,7 +179,7 @@ await channel.publish({
end
```
-```[python]
+```python
rest = AblyRest('{{API_KEY}}')
channel = rest.channels.get('{{RANDOM_CHANNEL_NAME}}')
extras = {
@@ -199,7 +191,7 @@ await channel.publish({
await channel.publish(Message(name='message', data="abc", extras=extras))
```
-```[php]
+```php
$rest = new Ably\AblyRest('{{API_KEY}}');
$channel = $rest->channels->get('{{RANDOM_CHANNEL_NAME}}');
$channel->publish(
@@ -214,7 +206,7 @@ await channel.publish({
);
```
-```[java]
+```java
AblyRest rest = new AblyRest("{{API_KEY}}");
Channel channel = rest.channels.get("{{RANDOM_CHANNEL_NAME}}");
@@ -230,7 +222,7 @@ await channel.publish({
);
```
-```[csharp]
+```csharp
AblyRest rest = new AblyRest("{{API_KEY}}");
var channel = rest.Channels.Get("{{RANDOM_CHANNEL_NAME}}");
@@ -241,7 +233,7 @@ await channel.publish({
channel.Publish(message);
```
-```[objc]
+```objc
ARTRest *rest = [[ARTRest alloc] initWithKey:@"{{API_KEY}}"];
ARTRestChannel *channel = [rest.channels get:@"{{RANDOM_CHANNEL_NAME}}"];
ARTJsonObject *extras = @{
@@ -250,14 +242,14 @@ await channel.publish({
[channel publish:@"event" data:@"data" extras:extras];
```
-```[swift]
+```swift
let rest = ARTRest(key: "{{API_KEY}}")
let channel = rest.channels.get("{{RANDOM_CHANNEL_NAME}}")
let extras: NSDictionary = ["privileged": ["skipRule": ["rule_id_1"]]]
channel.publish("event", data: "data", extras: extras as ARTJsonCompatible)
```
-```[go]
+```go
rest, err := ably.NewREST(ably.WithKey("{{API_KEY}}"))
channel := rest.Channels.Get("{{RANDOM_CHANNEL_NAME}}")
privileged := make(map[string][]string)
@@ -269,3 +261,4 @@ await channel.publish({
})
```
+
From 81dd294bc588998916354dcc23e4c6d7e66567eb Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:34:58 +0200
Subject: [PATCH 5/7] Add redirects and update navigation with new /platform
URL
---
src/data/nav/platform.ts | 38 +++++++++----------
.../integrations/inbound/kafka-connector.mdx | 1 +
.../integrations/inbound/webhooks.mdx | 1 +
.../docs/platform/integrations/index.mdx | 1 +
.../docs/platform/integrations/queues.mdx | 1 +
.../integrations/skip-integrations.mdx | 2 +
.../platform/integrations/streaming/amqp.mdx | 1 +
.../integrations/streaming/datadog.mdx | 2 +
.../platform/integrations/streaming/index.mdx | 1 +
.../platform/integrations/streaming/kafka.mdx | 1 +
.../integrations/streaming/kinesis.mdx | 1 +
.../integrations/streaming/pulsar.mdx | 1 +
.../platform/integrations/streaming/sqs.mdx | 1 +
.../platform/integrations/webhooks/azure.mdx | 1 +
.../integrations/webhooks/cloudflare.mdx | 1 +
.../integrations/webhooks/gcp-function.mdx | 1 +
.../platform/integrations/webhooks/ifttt.mdx | 1 +
.../platform/integrations/webhooks/index.mdx | 1 +
.../platform/integrations/webhooks/lambda.mdx | 1 +
.../platform/integrations/webhooks/zapier.mdx | 1 +
20 files changed, 40 insertions(+), 19 deletions(-)
diff --git a/src/data/nav/platform.ts b/src/data/nav/platform.ts
index 81e69613d8..99e86ae10a 100644
--- a/src/data/nav/platform.ts
+++ b/src/data/nav/platform.ts
@@ -140,7 +140,7 @@ export default {
pages: [
{
name: 'Overview',
- link: '/docs/integrations',
+ link: '/docs/platform/integrations',
index: true,
},
{
@@ -148,11 +148,11 @@ export default {
pages: [
{
name: 'Inbound webhooks',
- link: '/docs/integrations/inbound/webhooks',
+ link: '/docs/platform/integrations/inbound/webhooks',
},
{
name: 'Kafka Connector',
- link: '/docs/integrations/inbound/kafka-connector',
+ link: '/docs/platform/integrations/inbound/kafka-connector',
},
],
},
@@ -161,32 +161,32 @@ export default {
pages: [
{
name: 'Overview',
- link: '/docs/integrations/webhooks',
+ link: '/docs/platform/integrations/webhooks',
index: true,
},
{
name: 'Lambda Functions',
- link: '/docs/integrations/webhooks/lambda',
+ link: '/docs/platform/integrations/webhooks/lambda',
},
{
name: 'Azure Functions',
- link: '/docs/integrations/webhooks/azure',
+ link: '/docs/platform/integrations/webhooks/azure',
},
{
name: 'Google Functions',
- link: '/docs/integrations/webhooks/gcp-function',
+ link: '/docs/platform/integrations/webhooks/gcp-function',
},
{
name: 'Zapier',
- link: '/docs/integrations/webhooks/zapier',
+ link: '/docs/platform/integrations/webhooks/zapier',
},
{
name: 'Cloudflare Workers',
- link: '/docs/integrations/webhooks/cloudflare',
+ link: '/docs/platform/integrations/webhooks/cloudflare',
},
{
name: 'IFTTT',
- link: '/docs/integrations/webhooks/ifttt',
+ link: '/docs/platform/integrations/webhooks/ifttt',
},
],
},
@@ -195,42 +195,42 @@ export default {
pages: [
{
name: 'Overview',
- link: '/docs/integrations/streaming',
+ link: '/docs/platform/integrations/streaming',
index: true,
},
{
name: 'Kafka',
- link: '/docs/integrations/streaming/kafka',
+ link: '/docs/platform/integrations/streaming/kafka',
},
{
name: 'Kinesis',
- link: '/docs/integrations/streaming/kinesis',
+ link: '/docs/platform/integrations/streaming/kinesis',
},
{
name: 'AMQP',
- link: '/docs/integrations/streaming/amqp',
+ link: '/docs/platform/integrations/streaming/amqp',
},
{
name: 'SQS',
- link: '/docs/integrations/streaming/sqs',
+ link: '/docs/platform/integrations/streaming/sqs',
},
{
name: 'Pulsar',
- link: '/docs/integrations/streaming/pulsar',
+ link: '/docs/platform/integrations/streaming/pulsar',
},
{
name: 'DataDog',
- link: '/docs/integrations/streaming/datadog',
+ link: '/docs/platform/integrations/streaming/datadog',
},
],
},
{
name: 'Message Queues',
- link: '/docs/integrations/queues',
+ link: '/docs/platform/integrations/queues',
},
{
name: 'Skip integrations',
- link: '/docs/integrations/skip-integrations',
+ link: '/docs/platform/integrations/skip-integrations',
},
],
},
diff --git a/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx b/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx
index 4b89842ff9..e659321385 100644
--- a/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx
+++ b/src/pages/docs/platform/integrations/inbound/kafka-connector.mdx
@@ -4,6 +4,7 @@ meta_description: "The Ably Kafka Connector sends data from Kafka to an Ably cha
meta_keywords: "Kafka, Kafka Connector, channel"
redirect_from:
- /docs/general/kafka-connector
+ - /docs/integrations/kafka-connector
---
The Ably Kafka Connector integrates [Kafka](https://kafka.apache.org/) with Ably to enable realtime event distribution from Kafka to web, mobile, and IoT clients via Ably's channels.
diff --git a/src/pages/docs/platform/integrations/inbound/webhooks.mdx b/src/pages/docs/platform/integrations/inbound/webhooks.mdx
index 37f3245492..e08dfd5be9 100644
--- a/src/pages/docs/platform/integrations/inbound/webhooks.mdx
+++ b/src/pages/docs/platform/integrations/inbound/webhooks.mdx
@@ -4,6 +4,7 @@ meta_description: "Incoming webhooks let you integrate external web services wit
meta_keywords: "Ably, incoming, inbound, webhooks, webhook configuration, web services, realtime."
redirect_from:
- /docs/general/incoming-webhooks
+ - /docs/integrations/inbound
---
External services can publish messages to Ably channels using the [REST API](/docs/api/rest-api), however, a simpler alternative is to use [incoming webhooks](#configure).
diff --git a/src/pages/docs/platform/integrations/index.mdx b/src/pages/docs/platform/integrations/index.mdx
index 548ff36f42..e7af80cc86 100644
--- a/src/pages/docs/platform/integrations/index.mdx
+++ b/src/pages/docs/platform/integrations/index.mdx
@@ -5,6 +5,7 @@ meta_keywords: "integrations, integration, integrate, stream, external service,
redirect_from:
- /docs/general/integrations
- /docs/integrations/aws-authentication
+ - /docs/integrations
---
Ably integrations enable you to send your data from Ably to an external service or push data into Ably from an external service.
diff --git a/src/pages/docs/platform/integrations/queues.mdx b/src/pages/docs/platform/integrations/queues.mdx
index 2659cedcc3..06c2502198 100644
--- a/src/pages/docs/platform/integrations/queues.mdx
+++ b/src/pages/docs/platform/integrations/queues.mdx
@@ -7,6 +7,7 @@ redirect_from:
- /docs/general/versions/v1.0/queues
- /docs/general/versions/v0.8/queues
- /docs/general/queues
+ - /docs/integrations/queues
---
Ably Queues are traditional message queues that provide a mechanism for you to consume, process, store or reroute data from Ably to your servers. Queues provide an asynchronous machine-to-machine protocol, with each machine assuming one, or both, roles:
diff --git a/src/pages/docs/platform/integrations/skip-integrations.mdx b/src/pages/docs/platform/integrations/skip-integrations.mdx
index 16d4ae76a4..8fe5f0bc44 100644
--- a/src/pages/docs/platform/integrations/skip-integrations.mdx
+++ b/src/pages/docs/platform/integrations/skip-integrations.mdx
@@ -2,6 +2,8 @@
title: Skip integrations
meta_description: "Learn how to skip integrations on a per-message basis, including examples for skipping all or specific integration rules."
meta_keywords: "Ably, skip integrations, skipRule, message extras, privileged headers, integration rules, Control API, channel messaging"
+redirect_from:
+ - /docs/integrations/skip-integrations
---
Privileged users can skip integrations on a per-message basis, providing greater control and flexibility when publishing messages to a channel. Skipping integration helps avoid infinite loops, for example, when an integration republishes a message to the same channel, potentially triggering itself again.
diff --git a/src/pages/docs/platform/integrations/streaming/amqp.mdx b/src/pages/docs/platform/integrations/streaming/amqp.mdx
index b967478ef8..e9b5f38ade 100644
--- a/src/pages/docs/platform/integrations/streaming/amqp.mdx
+++ b/src/pages/docs/platform/integrations/streaming/amqp.mdx
@@ -4,6 +4,7 @@ meta_description: "Send data to AMQP based on message, channel lifecycle, channe
meta_keywords: "AMQP, integrations, events, serverless"
redirect_from:
- /docs/general/firehose/amqp-rule
+ - /docs/integrations/streaming/amqp
---
[AMQP](https://www.amqp.org) integrations enable you to automatically forward events that occur in Ably to AMQP-compatible brokers.
diff --git a/src/pages/docs/platform/integrations/streaming/datadog.mdx b/src/pages/docs/platform/integrations/streaming/datadog.mdx
index 7107298105..9f4578449d 100644
--- a/src/pages/docs/platform/integrations/streaming/datadog.mdx
+++ b/src/pages/docs/platform/integrations/streaming/datadog.mdx
@@ -2,6 +2,8 @@
title: Datadog integration
meta_description: "Connect Ably and Datadog to monitor messages, channels, and connections in realtime, integrating your Ably statistics with your existing Datadog setup."
meta_keywords: "Datadog, integrations, statistics, metrics, monitoring, analytics, enterprise"
+redirect_from:
+ - /docs/integrations/streaming/datadog
---
The Ably [Datadog](https://docs.datadoghq.com/integrations/ably/) integration enables you to monitor your application's statistics. Every 60 seconds, Ably streams a comprehensive set of [statistics](/docs/metadata-stats/stats#metrics) to the Datadog API.
diff --git a/src/pages/docs/platform/integrations/streaming/index.mdx b/src/pages/docs/platform/integrations/streaming/index.mdx
index c756f43f76..86e3601866 100644
--- a/src/pages/docs/platform/integrations/streaming/index.mdx
+++ b/src/pages/docs/platform/integrations/streaming/index.mdx
@@ -6,6 +6,7 @@ redirect_from:
- /docs/general/versions/v1.1/firehose
- /docs/general/versions/v1.0/firehose
- /docs/general/firehose
+ - /docs/integrations/streaming
---
Ably's streaming integrations enable you to stream data that's published in the Ably platform to an external streaming or queueing service.
diff --git a/src/pages/docs/platform/integrations/streaming/kafka.mdx b/src/pages/docs/platform/integrations/streaming/kafka.mdx
index d38b3a5cfd..cfea382446 100644
--- a/src/pages/docs/platform/integrations/streaming/kafka.mdx
+++ b/src/pages/docs/platform/integrations/streaming/kafka.mdx
@@ -4,6 +4,7 @@ meta_description: "Send data to Kafka based on message, channel lifecycle, chann
meta_keywords: "Kafka, integrations, events, serverless"
redirect_from:
- /docs/general/firehose/kafka-rule
+ - /docs/integrations/streaming/kafka
---
[Kafka](https://kafka.apache.org/) integrations enable you to automatically forward events that occur in Ably to Kafka topics.
diff --git a/src/pages/docs/platform/integrations/streaming/kinesis.mdx b/src/pages/docs/platform/integrations/streaming/kinesis.mdx
index e0f6acb8f7..4ac64047a8 100644
--- a/src/pages/docs/platform/integrations/streaming/kinesis.mdx
+++ b/src/pages/docs/platform/integrations/streaming/kinesis.mdx
@@ -4,6 +4,7 @@ meta_description: "Send data to Kinesis based on message, channel lifecycle, cha
meta_keywords: "Kinesis, integrations, events, serverless"
redirect_from:
- /docs/general/firehose/kinesis-rule
+ - /docs/integrations/streaming/kinesis
---
[Kinesis](https://aws.amazon.com/kinesis/) integrations enable you to automatically forward events that occur in Ably to AWS Kinesis streams.
diff --git a/src/pages/docs/platform/integrations/streaming/pulsar.mdx b/src/pages/docs/platform/integrations/streaming/pulsar.mdx
index f03f0b7cb8..8b345c56ed 100644
--- a/src/pages/docs/platform/integrations/streaming/pulsar.mdx
+++ b/src/pages/docs/platform/integrations/streaming/pulsar.mdx
@@ -4,6 +4,7 @@ meta_description: "Send data to Pulsar based on message, channel lifecycle, chan
meta_keywords: "Pulsar, integrations, events, serverless"
redirect_from:
- /docs/general/firehose/pulsar-rule
+ - /docs/integrations/streaming/pulsar
---
[Pulsar](https://pulsar.apache.org) integrations enable you to automatically forward events that occur in Ably to Pulsar topics.
diff --git a/src/pages/docs/platform/integrations/streaming/sqs.mdx b/src/pages/docs/platform/integrations/streaming/sqs.mdx
index 1ce4282fc2..75d738e6fc 100644
--- a/src/pages/docs/platform/integrations/streaming/sqs.mdx
+++ b/src/pages/docs/platform/integrations/streaming/sqs.mdx
@@ -4,6 +4,7 @@ meta_description: "Send data to SQS based on message, channel lifecycle, channel
meta_keywords: "SQS, integrations, events, serverless"
redirect_from:
- /docs/general/firehose/sqs-rule
+ - /docs/integrations/streaming/sqs
---
[SQS](https://aws.amazon.com/sqs) integrations enable you to automatically forward events that occur in Ably to AWS SQS queues.
diff --git a/src/pages/docs/platform/integrations/webhooks/azure.mdx b/src/pages/docs/platform/integrations/webhooks/azure.mdx
index 1a0f723939..558573c696 100644
--- a/src/pages/docs/platform/integrations/webhooks/azure.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/azure.mdx
@@ -5,6 +5,7 @@ meta_keywords: "Microsoft Azure, integrations, events, serverless"
redirect_from:
- /docs/general/events/azure
- /docs/general/webhooks/azure
+ - /docs/integrations/webhooks/azure
---
[Azure Function](https://azure.microsoft.com/en-gb/services/functions/) integrations enable you to trigger Microsoft's event-driven serverless compute functions when an event occurs in Ably.
diff --git a/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx b/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx
index 28793d8910..6a4299e23b 100644
--- a/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/cloudflare.mdx
@@ -5,6 +5,7 @@ meta_keywords: "Cloudflare Workers, integrations, events, serverless"
redirect_from:
- /docs/general/events/cloudflare
- /docs/general/webhooks/cloudflare
+ - /docs/integrations/webhooks/cloudflare
---
[Cloudflare Worker](https://workers.cloudflare.com) integrations enable Cloudflare's Edge Network to distribute your JavaScript-based functions when an event occurs in Ably.
diff --git a/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx b/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx
index 230ac650c2..4a350a3814 100644
--- a/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/gcp-function.mdx
@@ -5,6 +5,7 @@ meta_keywords: "Google Functions, integrations, events, serverless"
redirect_from:
- /docs/general/events/google-functions
- /docs/general/webhooks/google-functions
+ - /docs/integrations/webhooks/gcp-function
---
[Google Function](https://cloud.google.com/functions) integrations enable you to trigger event-driven serverless compute functions when an event occurs in Ably.
diff --git a/src/pages/docs/platform/integrations/webhooks/ifttt.mdx b/src/pages/docs/platform/integrations/webhooks/ifttt.mdx
index 83d04e8ae5..3c1ea76161 100644
--- a/src/pages/docs/platform/integrations/webhooks/ifttt.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/ifttt.mdx
@@ -5,6 +5,7 @@ meta_keywords: "IFTTT, integrations, events, serverless"
redirect_from:
- /docs/general/events/ifttt
- /docs/general/webhooks/ifttt
+ - /docs/integrations/webhooks/ifttt
---
[IFTTT](https://ifttt.com/maker_webhooks) (If This Then That) integrations enable you to trigger conditional chains, and help to combine various services together when an event occurs in Ably.
diff --git a/src/pages/docs/platform/integrations/webhooks/index.mdx b/src/pages/docs/platform/integrations/webhooks/index.mdx
index 3b4d000d37..6609175c30 100644
--- a/src/pages/docs/platform/integrations/webhooks/index.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/index.mdx
@@ -6,6 +6,7 @@ redirect_from:
- /docs/general/functions
- /docs/general/events
- /docs/general/webhooks
+ - /docs/integrations/webhooks
---
Outbound webhook integrations enable you to trigger serverless functions and notify HTTP endpoints when events occur in Ably.
diff --git a/src/pages/docs/platform/integrations/webhooks/lambda.mdx b/src/pages/docs/platform/integrations/webhooks/lambda.mdx
index ca142ac3b1..ef405ae4f4 100644
--- a/src/pages/docs/platform/integrations/webhooks/lambda.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/lambda.mdx
@@ -5,6 +5,7 @@ meta_keywords: "AWS Lambda, integrations, events, serverless"
redirect_from:
- /docs/general/events/aws-lambda
- /docs/general/webhooks/aws-lambda
+ - /docs/integrations/webhooks/lambda
---
[AWS Lambda](https://aws.amazon.com/lambda/) integrations enable you to trigger event-driven serverless compute functions when an event occurs in Ably. They are useful for integrating into various AWS services.
diff --git a/src/pages/docs/platform/integrations/webhooks/zapier.mdx b/src/pages/docs/platform/integrations/webhooks/zapier.mdx
index 3d048b1513..7fbd6ab9f1 100644
--- a/src/pages/docs/platform/integrations/webhooks/zapier.mdx
+++ b/src/pages/docs/platform/integrations/webhooks/zapier.mdx
@@ -6,6 +6,7 @@ redirect_from:
- /docs/general/events/zapier
- /docs/content//webhooks/zapier
- /docs/general/webhooks/zapier
+ - /docs/integrations/webhooks/zapier
---
[Zapier](https://zapier.com/page/webhooks) integrations enable you to trigger Zapier Zaps when an event occurs in Ably. They are useful for integrating with thousands of other services using Zapier's webhooks feature.
From 3797cff90e96df5d454febf1056291173e0f244b Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:35:41 +0200
Subject: [PATCH 6/7] Change firehose image to png
---
.../{Reactor-Firehose.gif => Reactor-Firehose.png} | Bin
1 file changed, 0 insertions(+), 0 deletions(-)
rename src/images/content/diagrams/{Reactor-Firehose.gif => Reactor-Firehose.png} (100%)
diff --git a/src/images/content/diagrams/Reactor-Firehose.gif b/src/images/content/diagrams/Reactor-Firehose.png
similarity index 100%
rename from src/images/content/diagrams/Reactor-Firehose.gif
rename to src/images/content/diagrams/Reactor-Firehose.png
From 9bc0da1e8758a6ad0aa514ff4fda45fb645c9e30 Mon Sep 17 00:00:00 2001
From: Mark Hulbert <39801222+m-hulbert@users.noreply.github.com>
Date: Fri, 22 Aug 2025 10:40:49 +0200
Subject: [PATCH 7/7] Update links to include /platform/ in integration links
---
content/partials/types/_message.textile | 2 +-
.../partials/types/_presence_message.textile | 2 +-
src/pages/docs/asset-tracking/index.mdx | 2 +-
src/pages/docs/auth/capabilities.mdx | 4 ++--
src/pages/docs/channels/index.mdx | 2 +-
src/pages/docs/channels/options/deltas.mdx | 2 +-
.../docs/chat/moderation/custom/index.mdx | 2 +-
src/pages/docs/chat/rooms/index.mdx | 2 +-
.../docs/guides/chat/build-livestream.mdx | 2 +-
src/pages/docs/liveobjects/storage.mdx | 2 +-
src/pages/docs/messages/index.mdx | 4 ++--
.../docs/metadata-stats/metadata/index.mdx | 2 +-
src/pages/docs/platform/account/app/index.mdx | 2 +-
.../docs/platform/account/app/queues.mdx | 4 ++--
.../docs/platform/account/control-api.mdx | 10 ++++-----
.../docs/platform/architecture/index.mdx | 2 +-
src/pages/docs/platform/errors/codes.mdx | 2 +-
src/pages/docs/platform/index.mdx | 2 +-
.../integrations/inbound/webhooks.mdx | 2 +-
.../docs/platform/integrations/index.mdx | 22 +++++++++----------
.../docs/platform/integrations/queues.mdx | 8 +++----
.../platform/integrations/streaming/amqp.mdx | 8 +++----
.../platform/integrations/streaming/index.mdx | 12 +++++-----
.../platform/integrations/streaming/kafka.mdx | 4 ++--
.../integrations/streaming/kinesis.mdx | 8 +++----
.../integrations/streaming/pulsar.mdx | 4 ++--
.../platform/integrations/streaming/sqs.mdx | 6 ++---
.../platform/integrations/webhooks/azure.mdx | 6 ++---
.../integrations/webhooks/cloudflare.mdx | 4 ++--
.../integrations/webhooks/gcp-function.mdx | 6 ++---
.../platform/integrations/webhooks/ifttt.mdx | 6 ++---
.../platform/integrations/webhooks/lambda.mdx | 6 ++---
.../platform/integrations/webhooks/zapier.mdx | 6 ++---
.../docs/platform/pricing/enterprise.mdx | 2 +-
src/pages/docs/platform/pricing/limits.mdx | 2 +-
src/pages/docs/platform/pricing/pro.mdx | 2 +-
src/pages/docs/platform/pricing/standard.mdx | 2 +-
.../docs/presence-occupancy/occupancy.mdx | 2 +-
.../docs/presence-occupancy/presence.mdx | 2 +-
src/pages/docs/protocols/index.mdx | 4 ++--
src/pages/docs/pub-sub/advanced.mdx | 4 ++--
src/pages/docs/storage-history/storage.mdx | 6 ++---
42 files changed, 92 insertions(+), 92 deletions(-)
diff --git a/content/partials/types/_message.textile b/content/partials/types/_message.textile
index 13ebbed3b2..b9532413f4 100644
--- a/content/partials/types/_message.textile
+++ b/content/partials/types/_message.textile
@@ -16,7 +16,7 @@ h6(#extras).
default: extras
csharp: Extras
-Metadata and/or ancillary payloads, if provided. Valid payloads include "@push@":/docs/push/publish#payload, "@headers@" (a map of strings to strings for arbitrary customer-supplied metadata), "@ephemeral@":/docs/pub-sub/advanced#ephemeral, and "@privileged@":/docs/integrations/webhooks#skipping objects. __Type: @JSONObject@, @JSONArray@plain C# object that can be converted to JSON@JSON Object@@Hash@, @Array@@Dict@, @List@@Dictionary@, @Array@@NSDictionary *@, @NSArray *@@Associative Array@, @Array@__
+Metadata and/or ancillary payloads, if provided. Valid payloads include "@push@":/docs/push/publish#payload, "@headers@" (a map of strings to strings for arbitrary customer-supplied metadata), "@ephemeral@":/docs/pub-sub/advanced#ephemeral, and "@privileged@":/docs/platform/integrations/webhooks#skipping objects. __Type: @JSONObject@, @JSONArray@plain C# object that can be converted to JSON@JSON Object@@Hash@, @Array@@Dict@, @List@@Dictionary@, @Array@@NSDictionary *@, @NSArray *@@Associative Array@, @Array@__
h6(#id).
default: id
diff --git a/content/partials/types/_presence_message.textile b/content/partials/types/_presence_message.textile
index 30e129ae96..8139a7882d 100644
--- a/content/partials/types/_presence_message.textile
+++ b/content/partials/types/_presence_message.textile
@@ -16,7 +16,7 @@ h4.
- dataData := The presence update payload, if provided __@String@, @ByteArray@, @JSONObject@, @JSONArray@@String@, @byte[]@, plain C# object that can be converted to Json@String@, @StringBuffer@, @JSON Object@@String@, @[]byte@@String@, @Binary@ (ASCII-8BIT String), @Hash@, @Array@@String@, @Bytearray@, @Dict@, @List@@String@, @NSData@, @Dictionary@, @Array@@NSString *@, @NSData *@, @NSDictionary *@, @NSArray *@@String@, @Binary String@, @Associative Array@, @Array@__
-- extrasExtras := Metadata and/or ancillary payloads, if provided. The only currently valid payloads for extras are the "@push@":/docs/push/publish#sub-channels, "@ref@":/docs/channels/messages#interactions and "@privileged@":/docs/integrations/webhooks#skipping objects. __Type: @JSONObject@, @JSONArray@plain C# object that can be converted to Json@String@, @[]byte@@JSON Object@@Hash@, @Array@@Dict@, @List@@Dictionary@, @Array@@NSDictionary *@, @NSArray *@@Associative Array@, @Array@@Map@, @List@__
+- extrasExtras := Metadata and/or ancillary payloads, if provided. The only currently valid payloads for extras are the "@push@":/docs/push/publish#sub-channels, "@ref@":/docs/channels/messages#interactions and "@privileged@":/docs/platform/integrations/webhooks#skipping objects. __Type: @JSONObject@, @JSONArray@plain C# object that can be converted to Json@String@, @[]byte@@JSON Object@@Hash@, @Array@@Dict@, @List@@Dictionary@, @Array@@NSDictionary *@, @NSArray *@@Associative Array@, @Array@@Map@, @List@__
- idId := Unique ID assigned by Ably to this presence update __Type: @String@__
diff --git a/src/pages/docs/asset-tracking/index.mdx b/src/pages/docs/asset-tracking/index.mdx
index 23144138b5..b617dd9cb1 100644
--- a/src/pages/docs/asset-tracking/index.mdx
+++ b/src/pages/docs/asset-tracking/index.mdx
@@ -13,7 +13,7 @@ The Ably Asset Tracking solution provides two [SDKs](/docs/asset-tracking/using-
* **Publishing SDK** (Android, iOS) - for embedding in apps on the asset to be tracked.
* **Subscribing SDK** (Android, iOS, JavaScript) - for embedding in apps that want to observe the asset being tracked using a realtime subscription.
-As Ably is used as the underlying transport, you have direct access to your data and can use [Ably Integrations](/docs/integrations) for a wide range of applications, in addition to direct realtime subscriptions. Examples include:
+As Ably is used as the underlying transport, you have direct access to your data and can use [Ably Integrations](/docs/platform/integrations) for a wide range of applications, in addition to direct realtime subscriptions. Examples include:
* Passing data to another service for realtime processing or tracking.
* Persistence of data to a database for later retrieval.
diff --git a/src/pages/docs/auth/capabilities.mdx b/src/pages/docs/auth/capabilities.mdx
index b17a204d39..fd25089dc0 100644
--- a/src/pages/docs/auth/capabilities.mdx
+++ b/src/pages/docs/auth/capabilities.mdx
@@ -59,9 +59,9 @@ Channel mode flags offer the ability for clients to use different capabilities f
## API key capabilities
-An [Ably API key](/docs/auth#api-key) can have a single set of permissions, applied to any number of [channels](/docs/channels) or [queues](/docs/integrations/queues).
+An [Ably API key](/docs/auth#api-key) can have a single set of permissions, applied to any number of [channels](/docs/channels) or [queues](/docs/platform/integrations/queues).
-You can also choose whether to restrict the API key to only channels, only [queues](/docs/integrations/queues), or to match a set of channel or queue names. If you've chosen to restrict the API key to *selected channels and queues*, you can use a comma separated list of resources the API key can access, making use of wildcards to provide access to areas of your app. It is worth noting an API key will provide the same permissions to all resources it has access to.
+You can also choose whether to restrict the API key to only channels, only [queues](/docs/platform/integrations/queues), or to match a set of channel or queue names. If you've chosen to restrict the API key to *selected channels and queues*, you can use a comma separated list of resources the API key can access, making use of wildcards to provide access to areas of your app. It is worth noting an API key will provide the same permissions to all resources it has access to.
To view the capabilities for an existing API key:
diff --git a/src/pages/docs/channels/index.mdx b/src/pages/docs/channels/index.mdx
index 00f3fb1a3e..d326c89279 100644
--- a/src/pages/docs/channels/index.mdx
+++ b/src/pages/docs/channels/index.mdx
@@ -151,7 +151,7 @@ A namespace is the first part of a channel name up to the first colon (`:`). If
Channel namespaces have the same restrictions as those listed for channels. Additionally they cannot contain the wildcard character `*`.
-Use channel namespaces to apply operations to all channels within that group, such as [capabilities](/docs/auth/capabilities), [channel rules](#rules) and [integrations](/docs/integrations).
+Use channel namespaces to apply operations to all channels within that group, such as [capabilities](/docs/auth/capabilities), [channel rules](#rules) and [integrations](/docs/platform/integrations).