diff --git a/docs/reuse/apps/opentelemetry/logs-advance-option-otel.md b/docs/reuse/apps/opentelemetry/logs-advance-option-otel.md
deleted file mode 100644
index b0e689e798..0000000000
--- a/docs/reuse/apps/opentelemetry/logs-advance-option-otel.md
+++ /dev/null
@@ -1,5 +0,0 @@
-**Advance options** for log collection can be used as follows:
- * **Timestamp Format**. By default, Sumo Logic will automatically detect the timestamp format of your logs. However, you can manually specify a timestamp format for a source by configuring the following:
- - **Timestamp locator**. Use a [Go regular expression](https://github.com/google/re2/wiki/Syntax) to match the timestamp in your logs. Ensure the regular expression includes a named capture group called `timestamp_field`.
- - **Layout**. Specify the exact layout of the timestamp to be parsed. For example, `- %Y-%m-%dT%H:%M:%S.%LZ`. To learn more about the formatting rules, refer to [this guide](https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/internal/coreinternal/timeutils/internal/ctimefmt/ctimefmt.go#L68).
- - **Location (Time zone)**. Define the geographic location (timezone) to use when parsing a timestamp that does not include a timezone. The available locations depend on the local IANA Time Zone database. For example, `America/New_York`. See more examples [here](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones).
diff --git a/docs/reuse/apps/opentelemetry/timestamp-parsing.md b/docs/reuse/apps/opentelemetry/timestamp-parsing.md
new file mode 100644
index 0000000000..633369060c
--- /dev/null
+++ b/docs/reuse/apps/opentelemetry/timestamp-parsing.md
@@ -0,0 +1 @@
+**Timestamp Parsing**. You can configure timestamp parsing for logs ingested using this source template. For more information, see [Timestamps, Time Zones, Time Ranges, and Date Formats for the OpenTelemetry Collector](/docs/send-data/opentelemetry-collector/remote-management/source-templates/otrm-time-reference).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/apache/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/apache/index.md
index 76ad098ba7..9fe7afb829 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/apache/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/apache/index.md
@@ -77,9 +77,9 @@ In this step, you will configure the yaml required for Apache Collection. Below
- **Error file log path**. Enter the path to the error log file for your Apache instance.
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/apache`. You need to provide the value for `webengine.cluster.name`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/docker/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/docker/index.md
index 289b99a274..ab63c74519 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/docker/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/docker/index.md
@@ -77,9 +77,9 @@ In this step, you will configure the yaml required for Docker Collection. Below
- **Excluded Image List**. A list of strings, [regexes](https://golang.org/pkg/regexp/), or [globs](https://github.com/gobwas/glob) matching container image names to exclude when querying containers for metric scraping. Learn more about [*excluded_images*](https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/receiver/dockerstatsreceiver/README.md#configuration).
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/docker`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/elasticsearch/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/elasticsearch/index.md
index 6cca8d5b4e..85a4e2b0cb 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/elasticsearch/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/elasticsearch/index.md
@@ -71,16 +71,16 @@ import CollectorInstallation from '../../../../../reuse/apps/opentelemetry/colle
In this step, you will configure the yaml required for Elasticsearch collection. Below are the inputs required for configuration:
- **Name**. Name of the source template.
-- **Description**. Description for the source template.
+- **Description**. Description for the source template.
- **Log Filepath**. Location where the Elasticsearch logs are logged. Please refer to your elasticsearch.conf file.
- **Endpoint**. Enter the URL of the server you want to monitor. (default: `localhost:9200`).
- **Username**. Enter the Elasticsearch username.
- **Password Environment Variable Name**. Enter the Elasticsearch password environment variable name.
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/elasticsearch`. You need to provide the value for `db.cluster.name`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
@@ -92,4 +92,4 @@ import DataConfiguration from '../../../../../reuse/apps/opentelemetry/data-conf
:::info
Refer to the [changelog](changelog.md) for information on periodic updates to this source template.
-:::
\ No newline at end of file
+:::
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/index.md
index 24d14d6e31..4700be505d 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/index.md
@@ -86,4 +86,15 @@ In this section, we'll show you how to set up source templates for the following
})
Windows
Learn how to configure our OTel Windows source template.
+
+
+
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/kafka/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/kafka/index.md
index 866963a5b4..3895753186 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/kafka/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/kafka/index.md
@@ -68,9 +68,9 @@ In this step, you will configure the yaml required for Kafka collection. Below a
- **Endpoint**. The URL of the broker endpoint (default: `localhost:9092`).
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/kafka`. You need to provide the value for `webengine.cluster.name`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/localfile/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/localfile/index.md
index e5eeff21dc..2be2ae2439 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/localfile/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/localfile/index.md
@@ -55,9 +55,9 @@ In this step, you will configure the yaml required for Local File collection. Be
- **Collection should begin from**. Defines where the collection of logs starts from. Possible values are "End of File" and "Beginning of File".
- **Detect messages spanning multiple lines**. You can enable this option when dealing with logs that span multiple lines. When you enable this option, you need to specify **Boundary regex location**, where you indicate whether the expression defines the end or the start of a log line, and **Expression to match message boundary**, where you define the expression.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add processing rules for logs collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/mysql/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/mysql/index.md
index 6452676a6a..18ff50dced 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/mysql/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/mysql/index.md
@@ -80,7 +80,7 @@ import CollectorInstallation from '../../../../../reuse/apps/opentelemetry/colle
In this step, you will configure the yaml required for MySQL collection. Below are the inputs required for configuration:
- **Name**. Name of the source template.
-- **Description**. Description for the source template.
+- **Description**. Description for the source template.
- **Error log path**. Location where the SQL Errors are logged. Please refer to your my.cnf file.
- **Slow Transaction log file path (optional)**. Location where the Slow SQL transactions are logged. Please refer to your my.cnf file.
- **Endpoint**. The URL of the MySQL endpoint (default: `localhost:3306`).
@@ -88,9 +88,9 @@ In this step, you will configure the yaml required for MySQL collection. Below a
- **Password Environment Variable Name**. Enter the MySQL password environment variable name.
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/mysql`. You need to provide the value for `db.cluster.name`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
@@ -102,4 +102,4 @@ import DataConfiguration from '../../../../../reuse/apps/opentelemetry/data-conf
:::info
Refer to the [changelog](changelog.md) for information on periodic updates to this source template.
-:::
\ No newline at end of file
+:::
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/nginx/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/nginx/index.md
index 06fb335cff..93797feb74 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/nginx/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/nginx/index.md
@@ -76,9 +76,9 @@ In this step, you will configure the yaml required for Nginx collection. Below a
- **Path to Nginx error Log file**. Enter the path to the error log file for your Nginx instance.
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/nginx`. You need to provide the value for `webengine.cluster.name`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/otrm-time-reference.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/otrm-time-reference.md
new file mode 100644
index 0000000000..823610d6c6
--- /dev/null
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/otrm-time-reference.md
@@ -0,0 +1,174 @@
+---
+id: otrm-time-reference
+title: Timestamps, Time Zones, Time Ranges, and Date Formats for OpenTelemetry Remote Management
+description: Learn how Sumo Logic manages timestamps, time zones, and dates, and the configuration options that are available with OTRM source templates.
+keywords:
+ - time
+ - time reference
+ - timezone
+ - time zone
+---
+
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+import useBaseUrl from '@docusaurus/useBaseUrl';
+
+We support several options for handling timestamps, time zones, and date formats in logs ingested through our OpenTelemetry Remote Management (OTRM) source templates.
+
+This guide covers timestamp parsing behavior, configuration, and troubleshooting specific to OTRM. If you're using traditional Sumo Logic sources (not OTRM), refer to the general [Time Reference documentation](/docs/send-data/reference-information/time-reference/).
+
+When collecting log data, the timestamp attached to messages is critical for data integrity and accurate search results. Sumo Logic indexes the timestamp of each message to ensure results fall within the query’s time range, allowing you to reconstruct event timelines reliably.
+
+## Timestamps
+
+The timestamp is the part of a log message that marks the time that an event occurred. During ingestion, we can detect the message timestamp, convert it to Unix epoch time (the number of milliseconds since midnight, January 1, 1970 UTC), and index it. The timestamp is parsed either using the default timestamp parsing settings or a custom format that you specify, including the time zone.
+
+When configuring a source template, you can specify a custom format to parse timestamps in your log messages.
+
+:::note
+Currently, only `strptime` timestamps are supported in the source templates.
+:::
+
+### Timestamp considerations
+
+By default, we can automatically detect timestamps in your log messages. Automatic detection identifies timestamps in common formats and prefers timestamps that appear early in the message.
+
+If your log messages from a source contain multiple timestamps, timestamps in unusual formats, or a mix of distinct timestamp formats, you have two options:
+
+* Configure a Source template for each log format.
+* Configure a custom timestamp format for your Source template.
+
+## Specifying a custom timestamp format and time zone
+
+The OpenTelemetry Collector can automatically parse most timestamps. However, if you see timestamp parsing issues, you can manually specify the timestamp format in the Sumo Logic UI when configuring a new Source template or editing the timestamp settings for an existing one.
+
+1. Perform one of the following steps:
+   * If you're configuring a new Source template, proceed to Step 2.
+   * To edit the timestamp settings for an existing Source template, navigate to the source template, click **Edit** to the right of its name, and then go to Step 2.
+1. Navigate to the **Timestamp Parsing** section, select **Specify the format**, and enter the following details (an illustrative example follows this list):
+ 1. **Select Timezone**. Define the geographic location (time zone) to use while parsing a timestamp that does not include a time zone. The available locations depend on the local IANA Time Zone database. For example, `America/New_York`. For more examples, refer to the [List of tz database time zones](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones).
+   1. **Format**. Specify the exact layout of the timestamp to be parsed. For example, `%Y-%m-%dT%H:%M:%S.%LZ`. To learn more about the formatting rules, refer to [this guide](https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/internal/coreinternal/timeutils/internal/ctimefmt/ctimefmt.go#L68).
+ 1. **Timestamp locator**. Use a [Go regular expression](https://github.com/google/re2/wiki/Syntax) to match the timestamp in your logs. Ensure the regular expression includes a named capture group called `timestamp_field`.
+
+
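+For example, suppose a source emits log lines such as `2024-03-15 09:21:07,455 INFO Starting scheduled sync` (a hypothetical log line used here only for illustration). The sketch below shows one possible set of values for the three fields above; the keys simply mirror the UI labels and are not part of any configuration file.
+
+```yaml
+# Illustrative values only. The keys mirror the Timestamp Parsing UI fields;
+# the log line they target is the hypothetical example above.
+Select Timezone: America/New_York
+Format: '%Y-%m-%d %H:%M:%S,%f'
+# Go (RE2) regular expression with the required named capture group `timestamp_field`
+Timestamp locator: '(?P<timestamp_field>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})'
+```
+
+With these settings, the locator extracts the leading timestamp from each line, the layout describes how to interpret it, and the selected time zone supplies the offset that the timestamp itself lacks.
+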
+### Using _format for troubleshooting
+
+You can use `_format` to see how the timestamp is parsed from the log file. Assign `_format` an alias to return it in your search results, for example:
+
+```sql
+| _format as timestampFormat
+```
+
+The fields returned in the search results of `_format` are:
+
+```sql
+t:<value1>,o:<value2>,l:<value3>,p:<value4>
+```
+
+where `<value1>` can take the following values:
+
+* `fail`. Failed to locate timestamp.
+* `cache`. Success, cached format.
+* `def`. Success, default (user-specified) format.
+* `full`. Success, from "full" parsing against library of patterns.
+* `none`. Local/receipt time because timestamp parsing is not enabled for this source.
+* `ac1`. Auto-corrected by the "window-based" heuristic (what we call "auto-correction" today). Sumo Logic assumes that all log messages coming from a particular Source will have timestamps that are close together. If a message comes through that appears to be more than one day earlier or later than recent messages from that source, it will be auto-corrected to match the current time. You can stop this auto-correction by explicitly configuring a custom timestamp format on your Source. For example, assume the Collector parses the timestamp "Dec **2**, 2021 2:39:58 AM". If the previously received message from that Source has a timestamp prior to "Dec **1**, 2021 2:39:58 AM" or after "Dec **3**, 2021 2:39:58 AM", the Collector will auto-correct the timestamp to the current time.
+* `ac2`. Auto-corrected by the -1y, +2d heuristic. Sumo Logic assumes that all log messages coming from a particular Source will have timestamps that are within a window of -1 year through +2 days compared to the current time. Any log message with a parsed timestamp outside of that window is automatically re-stamped with the current time. For example, assume the Collector parses the timestamp "Dec 2, **2021** 2:39:58 AM". If the previously received message from that Source is prior to "Dec 1, **2020** 2:39:58 AM" or after "Dec 4, **2021** 2:39:58 AM", the Collector will auto-correct the timestamp to the current time.
+
+#### Example
+
+When you’re troubleshooting issues related to timestamps, you can run a query similar to this to see how the timestamp is parsed:
+
+```sql
+_sourceCategory=PaloAltoNetworks
+| _format as timestampformat
+```
+
+The result returns a `timestampformat` field for each message, showing how its timestamp was parsed.
+
+### Timestamp format examples
+
+The following are some examples of `strptime` formats supported by the OpenTelemetry Collector (an illustrative collector configuration sketch follows the table):
+
+| `strptime` Format | Example |
+|-------------------|---------|
+| `%Y-%m-%d'T'%H:%M:%S*%f%z` | 2023-08-20'T'13:20:10*633+0000 |
+| `%Y %b %d %H:%M:%S.%f %Z` | 2024 Mar 03 05:12:41.211 PDT |
+| `%b %d %H:%M:%S %z %Y` | Jan 21 18:20:11 +0000 2023 |
+| `%d/%b/%Y:%H:%M:%S %z` | 19/Apr/2023:06:36:15 -0700 |
+| `%b %d, %Y %l:%M:%S %p` | Dec 2, 2023 2:39:58 AM |
+| `%b %d %Y %H:%M:%S` | Jun 09 2023 15:28:14 |
+| `%b %d %H:%M:%S %Y` | Apr 20 00:00:35 2010 |
+| `%b %d %H:%M:%S %z` | Sep 28 19:00:00 +0000 |
+| `%b %d %H:%M:%S` | Mar 16 8:12:04 |
+| `%Y-%m-%dT%H:%M:%S%z` | 2023-10-14T22:11:20+0000 |
+| `%Y-%m-%d %H:%M:%S %z` | 2023-08-19 12:17:55 -0400 |
+| `%Y-%m-%d %H:%M:%S%z` | 2023-08-19 12:17:55-0400 |
+| `%Y %b %d %H:%M:%S.%f*%Z` | 2023 Apr 13 22:08:13.211*PDT |
+| `%Y %b %d %l:%M:%S` | 2023 Mar 10 1:44:20 |
+| `%Y-%m-%d %H:%M:%S,%f%z` | 2023-03-10 14:30:12,655+0000 |
+| `%Y-%m-%d %H:%M:%S` | 2023-02-27 15:35:20 |
+| `%Y-%m-%d %H:%M:%S.%f%z` | 2023-03-12 13:11:34.222-0700 |
+| `%Y-%m-%d'T'%H:%M:%S.%f` | 2023-07-22'T'16:28:55.444 |
+| `%Y-%m-%d'T'%H:%M:%S` | 2023-09-08'T'03:13:10 |
+| `%Y-%m-%d'T'%H:%M:%S'%z` | 2023-03-12'T'17:56:22'-0700' |
+| `%Y-%m-%dT%H:%M:%S.%f%z` | 2023-11-22T10:10:15.455+0000 |
+| `%Y-%m-%d'T'%H:%M:%S` | 2023-02-11'T'18:31:44 |
+| `%Y-%m-%d*%H:%M:%S:%f` | 2023-10-30*02:47:33:899 |
+| `%Y-%m-%d*%H:%M:%S` | 2023-07-04*13:23:55 |
+| `%y-%m-%d %H:%M:%S,%f %z` | 23-02-11 16:47:35,985 +0000 |
+| `%y-%m-%d %H:%M:%S,%f` | 23-06-26 02:31:29,573 |
+| `%y-%m-%d %H:%M:%S` | 23-04-19 12:00:17 |
+| `%m/%d/%y %l:%M:%S` | 06/01/23 4:11:05 |
+| `%y%m%d %H:%M:%S` | 220423 11:42:35 |
+| `%Y%m%d %H:%M:%S.%f` | 20230423 11:42:35.173 |
+| `%m/%d/%y*%H:%M:%S` | 08/10/23*13:33:56 |
+| `%m/%d/%Y*%H:%M:%S` | 11/23/2023*05:13:11 |
+| `%m/%d/%y %H:%M:%S %z` | 04/23/23 04:34:22 +0000 |
+| `%m/%d/%Y %H:%M:%S %z` | 10/03/2023 07:29:46 -0700 |
+| `%H:%M:%S` | 11:42:35 |
+| `%H:%M:%S,%f` | 11:42:35,173 |
+| `%d/%b %H:%M:%S,%f` | 23/Apr 11:42:35,173 |
+| `%d/%b/%Y:%H:%M:%S` | 23/Apr/2023:11:42:35 |
+| `%d/%b%Y %H:%M:%S` | 23/Apr/2023 11:42:35 |
+| `%d-%b-%Y %H:%M:%S` | 23-Apr-2023 11:42:35 |
+| `%d %b %Y %H:%M:%S` | 23 Apr 2023 11:42:35 |
+| `%d %b %Y %H:%M:%S*%f` | 23 Apr 2023 10:32:35*311 |
+| `%m%d_%H:%M:%S` | 0423_11:42:35 |
+| `%m%d_%H:%M:%S.%f` | 0423_11:42:35.883 |
+| `%q/%g/%Y %l:%M:%S %p:%f` | 8/5/2023 3:31:18 AM:234 |
+| `%q/%d/%Y %I:%M:%S %p` | 9/28/2023 2:23:15 PM |
+
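+If you manage an OpenTelemetry Collector configuration yourself (outside remote management), these `strptime` layouts are consumed by the timestamp parser of filelog receiver operators. The following is a minimal sketch only, assuming a hypothetical log path and one of the layouts from the table above; it is not the YAML generated by a source template.
+
+```yaml
+receivers:
+  filelog:
+    include:
+      - /var/log/myapp/access.log   # hypothetical path used for illustration
+    operators:
+      # Extract the timestamp and the rest of the message from each line
+      - type: regex_parser
+        regex: '(?P<timestamp_field>\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2}) (?P<message>.*)'
+        timestamp:
+          parse_from: attributes.timestamp_field
+          layout_type: strptime
+          layout: '%d/%b/%Y:%H:%M:%S'
+          location: America/New_York   # fills in the time zone the log line omits
+```
+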
+### Time zone considerations
+
+We highly recommend setting the time zone explicitly on any source template whose logs do not include a time zone. Sumo Logic always attempts to determine the time zone for the source; if that is not possible, the time zone reverts to UTC. In these cases, the time zone may be incorrect, which could significantly affect forensic analysis and reporting.
+
+### Default time zone
+
+By default, we use the time zone from your web browser (set by the operating system) to display hours and minutes everywhere in our user interface. You can change the default time zone that the user interface displays by adjusting the **Default time zone** setting on the **Preferences** page. This option overrides the time zone from your web browser and changes how hours and minutes are displayed in the UI. It is a personal setting, and does not change the time zone for anyone else in your organization.
+
+UI elements that are affected by this setting include:
+
+- **Time Range** field in the **Search** page
+- **Time** column of the **Messages** pane
+- Dashboards
+- Anomaly Detection
+
+Changing the **Default time zone** setting affects how the UI displays messages, but not the actual timestamp in the log message.
+
+For example, the following screenshot shows the time zone set to **PST** in the UI, in the **Time** column. The logs were collected from a system that was also configured to use the **PST** time zone, which is displayed in the timestamp of the **Message** column. The timestamps in both columns match as they are set to the same time zone.
+
+
+
+The next screenshot shows the same search result after changing the **Default time zone** setting to **UTC**. Now the **Time** column is displayed in UTC, while the **Message** column retains the original timestamp in PST.
+
+
+
+In another example, if your time zone is set to **UTC** and you share a dashboard with another user whose time zone is set to **PST**, what will they see?
+
+They will see the same data, just displayed using their own time zone. For example, if you have a panel that uses a time series, the timeline on the X axis of your chart is displayed in your time zone, **UTC**. The other user will see the timeline on the X axis displayed in their time zone, **PST**. But the data displayed in the chart is exactly the same.
+
+
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/postgresql/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/postgresql/index.md
index bb302c87a4..a9f215fe02 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/postgresql/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/postgresql/index.md
@@ -99,9 +99,9 @@ In this step, you will configure the yaml required for PostgreSQL collection. Be
- **Password Environment Variable Name**. Enter the PostgreSQL password environment variable name.
- **Fields/Metadata**. You can provide any custom fields to be tagged with the data collected. By default, Sumo Logic tags `_sourceCategory` with the value `otel/postgresql`. You need to provide the value for `db.cluster.name`.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add **processing rules** for logs/metrics collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
@@ -113,4 +113,4 @@ import DataConfiguration from '../../../../../reuse/apps/opentelemetry/data-conf
:::info
Refer to the [changelog](changelog.md) for information on periodic updates to this source template.
-:::
\ No newline at end of file
+:::
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/rabbitmq/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/rabbitmq/index.md
index bc9eee2e4b..338ea27c44 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/rabbitmq/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/rabbitmq/index.md
@@ -67,9 +67,9 @@ In this step, you will configure the yaml required for Local File Collection. Be
- **Username**. Required. Enter the RabbitMQ username.
- **Password Environment Variable Name**. Required. Enter the RabbitMQ password environment variable name.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add processing rules for logs collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/opentelemetry-collector/remote-management/source-templates/redis/index.md b/docs/send-data/opentelemetry-collector/remote-management/source-templates/redis/index.md
index f32725d8d7..e6d982c52e 100644
--- a/docs/send-data/opentelemetry-collector/remote-management/source-templates/redis/index.md
+++ b/docs/send-data/opentelemetry-collector/remote-management/source-templates/redis/index.md
@@ -58,9 +58,9 @@ separated by a colon.
- **Username** (Optional). Enter the Redis username in case you are using a specific user for monitoring.
- **Password Environment Variable Name** (Required). Enter the Redis password environment variable name.
-import OtelLogAdvanceOption from '../../../../../reuse/apps/opentelemetry/logs-advance-option-otel.md';
+import TimestampParsing from '../../../../../reuse/apps/opentelemetry/timestamp-parsing.md';
-<OtelLogAdvanceOption/>
+<TimestampParsing/>
**Processing Rules**. You can add processing rules for logs collected. To learn more, refer to [Processing Rules](../../processing-rules/index.md).
diff --git a/docs/send-data/reference-information/time-reference.md b/docs/send-data/reference-information/time-reference.md
index 82fe7a0b58..aec2c5bf06 100644
--- a/docs/send-data/reference-information/time-reference.md
+++ b/docs/send-data/reference-information/time-reference.md
@@ -23,6 +23,10 @@ The timestamp is the part of a log message that marks the time that an event occ
When configuring a Source, you can choose to use the default timestamp parsing settings, or you can specify a custom format for us to parse timestamps in your log messages. The **Enable Timestamp Parsing** option is selected by default. If it's deselected, no timestamp information is parsed at all. Instead, we stamp logs with the time at which the messages are processed.
+:::note Using OpenTelemetry Remote Management (OTRM)?
+This page covers timestamp parsing for standard Sumo Logic sources. If you're using OTRM source templates, refer to [Timestamps, time zones, time ranges, and date formats for OTRM](/docs/send-data/opentelemetry-collector/remote-management/source-templates/otrm-time-reference) instead.
+:::
+
### Timestamp considerations
By default, we can automatically detect timestamps in your log messages. Automatic detection identifies timestamps in common formats and prefers timestamps that appear early in the message.
diff --git a/sidebars.ts b/sidebars.ts
index e12694f62f..dcb4096a99 100644
--- a/sidebars.ts
+++ b/sidebars.ts
@@ -274,6 +274,7 @@ module.exports = {
]
},
'send-data/opentelemetry-collector/remote-management/source-templates/st-with-secrets',
+ 'send-data/opentelemetry-collector/remote-management/source-templates/otrm-time-reference',
],
},
{
diff --git a/static/img/send-data/source-template-edit.png b/static/img/send-data/source-template-edit.png
new file mode 100644
index 0000000000..c57475ba95
Binary files /dev/null and b/static/img/send-data/source-template-edit.png differ
diff --git a/static/img/send-data/st-timestamp-parsing.png b/static/img/send-data/st-timestamp-parsing.png
new file mode 100644
index 0000000000..9194950bfe
Binary files /dev/null and b/static/img/send-data/st-timestamp-parsing.png differ