
Input API: Summary

If you have admin rights, you can create and delete multiple write API keys.
Write API keys are used by the input APIs to send data to Logmatic.io.

 

Authentication

All input APIs are authenticated with a Write API key.
It must be specified either:

  • as a query parameter: api_key
  • as a header: x-api-key

How to create a write API key?

To do so, click on the configure menu and then on +add Write API keys:

List of all your API keys


You can create multiple write API keys

This gives you finer control over who sends or reads the data on your account.

Editing an Input API key

Changing name, disabling a key and daily usage

Editing a write API key


When editing a key, you are able to provide a name, to enable/disable it and to see the daily usage.

Assigning a daily quota

As a more advanced usage, you can also apply a daily quota to a key. When the quota is reached, the key is automatically disabled until the end of the daily cycle (midnight UTC).

This is really useful when you are unsure about the volume sent by some of your systems and want to make sure you won't reach the limit of your plan.

Enable daily quota and assign a type


There are two kinds of quotas:

  • Percentage of global limit: you assign a share of your plan limit, so if you upgrade your plan later the quota stays relative to the new total.
  • Volume in MB: you simply assign the number of megabytes allowed per day for this key.
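To make the two quota kinds concrete, here is a small sketch with hypothetical values; PLAN_LIMIT_MB and daily_cap_mb are our own names, not part of Logmatic.io:

```python
# Hypothetical illustration of the two quota kinds.
PLAN_LIMIT_MB = 1000  # assumed daily plan limit, in MB

def daily_cap_mb(quota_type, value, plan_limit_mb=PLAN_LIMIT_MB):
    """Return the daily cap (in MB) a key would get from its quota."""
    if quota_type == "percentage":
        # A share of the global plan limit: scales if the plan changes.
        return plan_limit_mb * value / 100
    if quota_type == "volume":
        # A fixed number of megabytes per day for this key.
        return float(value)
    raise ValueError(f"unknown quota type: {quota_type}")

print(daily_cap_mb("percentage", 10))  # 10% of a 1000 MB plan -> 100.0
print(daily_cap_mb("volume", 250))     # flat cap -> 250.0
```

Note how the percentage kind keeps tracking the plan: doubling PLAN_LIMIT_MB doubles the cap without touching the key.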

Deleting a key

Be aware that there is no going back: if you delete a key by mistake, you'll have to update the configuration of every log forwarder that uses it.

Input API: HTTP/HTTPS

The HTTP/S API is used to send events to Logmatic.io.
It can be used manually with curl, as we demonstrate in this chapter.

Find the complete documentation here

Input API: TCP, TCP/S and UDP

The TCP protocol, and even more so the UDP protocol, are more efficient and faster than HTTP calls. In this chapter, we explain how to send events directly through these channels.

The TCP protocol guarantees that events are delivered, and TCP/S that they are secured, on their way to Logmatic.io. Neither feature is available over UDP, so the only reasons to prefer UDP over TCP are performance, or a log forwarder process that only supports UDP.

Find the complete documentation here


Input API TCP, TCP/S & UDP

 

Sending logs over TCP

To send logs over TCP, you simply prefix your events with your API key.

The target server is api.logmatic.io and the port is 10514.

Single line

If you want to try it, use telnet as follows:

> telnet api.logmatic.io 10514
  Trying 23.101.57.133...
  Connected to api.logmatic.io.
  Escape character is '^]'.
> <your_write_api_key> "This is my first line!"

Don't forget to provide your API key

Replace <your_write_api_key> with your own Write API key.

Here we just sent "This is my first line!"; check in the main view that it worked for you.

Multi-line

You can send multiple events in a single call by splitting with end-of-line characters \n.

> telnet api.logmatic.io 10514
Trying 23.101.57.133...
Connected to api.logmatic.io.
Escape character is '^]'.
> <your_write_api_key> "This is my first line!\nThis is my second line!"
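The key-prefix-plus-newlines format above can be scripted instead of typed into telnet. A minimal sketch, assuming actual newline characters separate the events; build_payload is our own helper, and the network call is commented out:

```python
import socket

API_KEY = "<your_write_api_key>"  # replace with your own Write API key

def build_payload(api_key, *lines):
    # The write API key comes first, then the events separated by
    # end-of-line characters.
    return api_key + " " + "\n".join(lines)

payload = build_payload(API_KEY, "This is my first line!",
                        "This is my second line!")

# To actually send it (disabled so the sketch stays self-contained):
# with socket.create_connection(("api.logmatic.io", 10514)) as sock:
#     sock.sendall(payload.encode("utf-8") + b"\n")
```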

Recognized log content

You can provide JSON events or syslog events directly in the TCP connection.

> telnet api.logmatic.io 10514
Trying 23.101.57.133...
Connected to api.logmatic.io.
Escape character is '^]'.
> <your_write_api_key> {"message":"This is my first line!","appname":"app1"}

Please refer to the Overview section if you want more details about recognized formats.

Sending logs securely with TCP SSL

To secure your calls:

  • Replace the port 10514 with 10515.
  • Set up your client with the right certificate, which you can download by clicking here.

Here is an example with an OpenSSL client:

> openssl s_client -connect api.logmatic.io:10515
  ...
>  <your_write_api_key> "This is my first line!"

Sending logs over UDP

The UDP endpoint follows the same rules as the TCP endpoints:

  • The API key must be the first token of the sentence
  • You can provide multiple lines by inserting end-of-line characters
  • Some types are automatically parsed

The target server is still api.logmatic.io but the port becomes 514.

If you want to test it, install Netcat; we are going to use it as a UDP client.

To send your first line over the UDP protocol, proceed as follows:

> nc -vu api.logmatic.io 514
> <your_api_key> This is my first line!
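The same first-token rule can be checked programmatically. A hedged sketch of building a UDP datagram; the sendto call is commented out so nothing is sent:

```python
import socket

API_KEY = "<your_api_key>"  # replace with your own Write API key

# The API key must be the first token of the datagram.
message = f"{API_KEY} This is my first line!"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(message.encode("utf-8"), ("api.logmatic.io", 514))  # uncomment to send
sock.close()
```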

Input API: HTTP / HTTPS

Even though the following URLs start with HTTP, Logmatic.io also provides HTTPS, a secure communication protocol that guarantees your logs won't be intercepted by undesired third parties.

 
POST https://api.logmatic.io/v1/input/your_write_api_key
curl -H "content-type:text/plain" --data 'This is my first line of log!' https://api.logmatic.io/v1/input/<your_write_api_key>
     200 OK
{
  "message":"This is my first line of log!"
}

Path Params

your_write_api_key
string
required

One of your write API keys needs to be provided so the API can route the data to your platform.

 

Sending plain text

Send your logs to the single POST endpoint provided by Logmatic.io, filling in your API key.
You can either test it directly from this section or use a single curl command:

curl -H "content-type:text/plain" --data 'This is my first line of log!' https://api.logmatic.io/v1/input/<your_api_key>

As long as the endpoint returns a 200 status code it means that your message was properly taken into account. You should then be able to see it in the user interface.

You can also send multiple lines in a single call by inserting end-of-line characters \n or \r\n:

curl -H "content-type:text/plain" --data 'This is my first line!\nThis is my second line of log!' https://api.logmatic.io/v1/input/<your_api_key>

Metadata query parameters

Metadata parameters help you enrich the log events you provide. For instance, if you have multiple applications, hosts or users you may want to provide this information along the content.

The POST endpoint then becomes:

https://api.logmatic.io/v1/input/<your_api_key>?hostname=<hostname>&appname=<appname>&username=<username>
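If you build this URL programmatically, the standard library can assemble the query string for you. A sketch, where the key and metadata values are placeholders:

```python
from urllib.parse import urlencode

API_KEY = "your_api_key"  # placeholder for your Write API key
metadata = {"hostname": "host1", "appname": "app1", "username": "user1"}

# The metadata query parameters are appended after the key in the path.
endpoint = f"https://api.logmatic.io/v1/input/{API_KEY}?" + urlencode(metadata)
print(endpoint)
```

urlencode also takes care of escaping values that contain spaces or special characters.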

Sending some JSON

Some people prefer to work directly in JSON, as their events are already structured. To send structured JSON, change the content-type to application/json.

Let's say you want to push 2 JSON objects. You can send them like this:

curl -H "content-type:application/json" --data '[{"username":"user1","appname":"app1","message":"This is my first line!"},{"username":"user2","appname":"app2","message":"This is my second line!"}]' https://api.logmatic.io/v1/input/<your_api_key>?hostname=host1

You'll notice that we kept hostname=host1 as a metadata query parameter. When working with JSON objects, these additional fields are merged into each of them.
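Our reading of this merge rule, sketched locally; the server performs the actual merge, this only illustrates the expected result:

```python
# Each JSON event in the batch receives the metadata query parameters
# as additional top-level fields (our reading of the merge rule).
events = [
    {"username": "user1", "appname": "app1", "message": "This is my first line!"},
    {"username": "user2", "appname": "app2", "message": "This is my second line!"},
]
metadata = {"hostname": "host1"}

merged = [{**event, **metadata} for event in events]
```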

Retrieve client's IP address and user-agent

The API is able to inject the IP address or the user-agent of the calling client in the incoming log event. This can be very useful for Mobile and Web analytics usages.

To do so, you simply have to add one or both of the following headers into your HTTP calls:

  • X-Logmatic-Add-IP=<desired ip attribute>: copies the IP of the calling client to the <desired ip attribute> of the provided log events
  • X-Logmatic-Add-UserAgent=<desired ua attribute>: copies the User-Agent of the calling client to the <desired ua attribute> of the provided log events

Output API: Summary

If you have admin rights, you can create and delete multiple read API keys.
Read API keys are used for output APIs, to extract data from Logmatic.io.

 

Authentication

All output APIs are authenticated with a Read API key.
It must be specified either:

  • as a query parameter: api_key
  • as a header: x-api-key
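Both authentication styles can be prepared with the standard library. A sketch; the key value is a placeholder and the requests are built but not sent:

```python
import urllib.request

API_KEY = "<your_read_api_key>"  # placeholder
URL = "https://app.logmatic.io/beta/discover/groups"

# 1. Read API key as a query parameter
req_query = urllib.request.Request(URL + "?api_key=" + API_KEY)

# 2. Read API key as a header
req_header = urllib.request.Request(URL, headers={"x-api-key": API_KEY})
```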

Create a Read API key

To create an output API key, click on the configure menu and then on add Read API key:

List of all your API keys


You can create multiple keys in order to have better control over who gets access to your account's data. Furthermore you can apply security filters on a per key basis.

You can delete a key in order to revoke it.

Response format

The response is always provided as application/json.
We use appropriate HTTP status codes to better qualify successes and errors.
The result is always bundled in the result field of the response.

{
    "result": {
        //result object here
    }
}

When the API fails, it will include some insight as to what failed. The format is always the same, the status code and a message are provided in an error object.

{
    "error": {
        "status": 403,
        "message": "Forbidden. You do not have access to this resource."
    }
}
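Since every response bundles its payload under either result or error, a caller can branch on whichever key is present. A small sketch; parse_response is our own helper:

```python
import json

def parse_response(body):
    """Return (result, error): exactly one of the two is expected to be set."""
    doc = json.loads(body)
    return doc.get("result"), doc.get("error")

ok, err = parse_response('{"result": {"groups": []}}')
bad_ok, bad_err = parse_response(
    '{"error": {"status": 403, "message": "Forbidden."}}')
```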

Rate limiting


This limit applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs: you are limited to 50 million events when the time range exceeds 24 hours. Under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

Security filters

Just like users, API keys can have a security filter applied to limit the data they can retrieve. To learn more about this feature, refer to Manage users & security filters.

Assign a name or a security filter to an API key


Available Discovery APIs

List groups defined on your platform

List Fields and Metrics defined on your platform

Available Output APIs

  • Saved Analyses API: returns the data used to build saved analyses in the platform
  • Fetch API: fetches a single event from the platform
  • List events API: lists a filtered set of events
  • Aggregation API: aggregates a filtered set of data by grouping events by fields and aggregating them by metrics


List Groups API

The List Groups API is a Discovery API which lets you get the list of all the (Fields & Metrics) groups available in your platform.

 

Authentication

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/discover/groups
{
  "result": {
    "groups": [
      {
        "id": "syslog",
        "name": "Syslog",
        "description": "Syslog Fields"
      }, 
      {
        "id": "apache",
        "name": "xxx",
        "description": "xxx"
      }
    ] 
  }
}
 

Generate a read API key

Before starting, you'll need to generate a valid Read API Key. Please refer to the Output API (beta) : Summary if you don't have one already or don't know how to properly use it.

API description

The List Groups API is a Discovery API which lets you get the list of all the (Fields & Metrics) groups available in your platform.


List Fields and Metrics API

List Fields and Metrics defined on your platform

 

Authentication

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/discover/fields_metrics
{
  //Mandatory parameters
    "fields": true,
    "metrics": true,
  
  //Filtering parameter
    "groups": ["xxx", "yyy"]
}
{
  "result": {
    "fields": [
      {
        "id": "xxx",
        "name": "xxx",
        "description":  "xxx"
      }
    ],
    "metrics": [
      {
        "id": "xxx",
        "name": "xxx",
        "description":  "xxx"
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
        //In case of traditional field
        "type": "attribute", 
        "attribute":  "custom.xxx"
      }
    ],
    "metrics": [
      {
        (...)
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
        //In case of tagset
        "type": "tagset",
        "tags": [
          { 
            "id": "xxx",
            "name": "xxx", 
            "color": "xxx"
          }
        ]
      }
    ],

    "metrics": [
      {
				(...)
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
				(...)
      }
    ],
    "metrics": [
      {
        //In case of traditional metric
        "type": "attribute",
        "attribute":  "custom.xxx",
        "aggregation": "sum"
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
				(...)
      }
    ],
    "metrics": [
      {
        //In case of formula
        "type": "formula",
        "formula": "xxx"
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
				(...)
      }
    ],
    "metrics": [
      {
        //In case of count
        "type": "core" 
      }
    ] 
  }
}

Body Params

fields
boolean

If false, the fields will be omitted.

metrics
boolean

If false, the metrics will be omitted.

groups
array of strings

The subset of groups you want to retrieve the Fields & Metrics from. Everything is returned by default.

 

Generate a read API key

Before starting, you'll need to generate a valid Read API Key. Please refer to the Output API (beta) : Summary if you don't have one already or don't know how to properly use it.

API description

The "List Fields and Metrics API" is a Discovery API which lets you get the list of all the fields and metrics defined on your platform.


List events API

Lists a filtered set of events

 

Authentication

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/events/list
	
{
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "event": true,
    "afterEvent": "AAAAAVOaRF9dwR6J_EFWT2FRMWF6endJVU1iYVBpeV9w",
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "filters": [
        {
            "search": {
                "query": "*"
            }
        },
        {
            "field": {
                "id": "syslog_hostname",
                "values": [
                    "prod-data1",
                    "prod-data2"
                ],
                "exclude": false
            }
        }
    ],
    "event": true,
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
	
{
    "columns": [
        {
            "field": "syslog_hostname"
        },
        {
            "attribute": "syslog.appname"
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "result": {
        "events": [
            {
                "id": "AAAAAVOaRF9dwR6J_EFWT2FRMWF6endJ1iYVBpeV9w",
                "date": "2016-03-21T17:41:25.469Z"
            },
            {
                "id": "AAAAAVOaRF9dwR6J-0FWT2FRMWF6endJ1iYVBpeV9v",
                "date": "2016-03-21T17:41:25.469Z"
            },
            {
                "id": "AAAAAVOaRF9dwR6J-kFWT2FRMWF6endJ1iYVBpeV9u",
                "date": "2016-03-21T17:41:25.469Z"
            },
          ...
        ]
    }
}
{
  "result": {
    "events": [
      {
        "id": "AAAAAVOEYMdYps92oUFWT0VZT294Y3E0V1BKR21F",
        "date": "2015-03-17T11:40:48.344Z",
        "event": {
          "custom": {
            //User data
          },
          "syslog": {
            //Syslog data
          },
          "message": "..."
        }
      },
      {
        "id": "AAAAAVOEYMdK7Gq9tkFWTWVrV294Y3E0V1BKR2wt",
        "date": "2015-03-17T11:40:48.330Z",
        "event": {
          "custom": {
            //User data
          },
          "syslog": {
            //Syslog data
          },
          "message": "..."
        }
      },
     ...
}
{
  "result": {
    "events": [
      {
        "id": "AAAAAVOEXr_3ps7puUFWT0V1V294Y3E0V1BJeEZW",
        "date": "2015-03-17T11:38:35.383Z",
        "columns": [
          "prod-4",
          "haproxy"
        ]
      },
      {
        "id": "AAAAAVOEXr_l7Go3-kFVYc0JCV294Y3E0V1BJeEZY",
        "date": "2015-03-17T11:38:35.365Z",
        "columns": [
          "prod-5",
          "apache"
        ]
      },
     ...
}

Body Params

time
object
 
time.to
mixed type
required

Higher bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.from
mixed type
required

Lower bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.offset
int32

Timezone offset

time.timezone
string

Timezone

filters
array

Search query or equality filters as an array. Example: [{ "search": { "query": "*" } }, { "field": { "id": "field_id", "values": ["value_1", "value_2"], "exclude": false } }]

event
boolean

Whether the full event should be included in the response or not.

columns
array

Fields, metrics or attributes you want to specifically see in the result. Example: [{ "field": "syslog_hostname" }, { "metric": "custom_responsetime" }, { "attribute": "syslog.appname" }]

limit
int32

The number of events returned at once. Max: 1000

afterEvent
string

An event id to start with. Used to paginate over the results.
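The afterEvent parameter supports a simple pagination loop: request a page, then pass the last event's id as afterEvent for the next request. A sketch with a stubbed page fetcher, where fetch_page stands in for the real HTTP call:

```python
def list_all_events(fetch_page):
    """Drain a paginated listing.

    fetch_page(after_id) returns a list of events (dicts with an 'id'),
    or an empty list once the listing is exhausted.
    """
    events, after = [], None
    while True:
        page = fetch_page(after)
        if not page:
            return events
        events.extend(page)
        after = page[-1]["id"]  # becomes the next afterEvent

# Stub standing in for the real List events API call:
pages = [[{"id": "a"}, {"id": "b"}], [{"id": "c"}], []]
all_events = list_all_events(lambda after: pages.pop(0))
```

Remember the limit parameter caps each page at 1000 events, so a real fetch_page would pass both limit and afterEvent in the request body.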

 

Rate limiting

This limit applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs: you are limited to 50 million events when the time range exceeds 24 hours. Under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

Generate a read API key

Before starting, you'll need to generate a valid Read API Key. Please refer to the Output API (beta) : Summary if you don't have one already or don't know how to properly use it.

API description

Lists a filtered set of events.

By default, the events are returned as objects that contain only the id and the date. However, you can ask for specific columns or even the full event objects. See the examples above to understand the different options.

Query parameters description

We are only going to describe the time and filters parameters.

time parameter

The time parameter lets you define the main time range and also your time zone or offset.

The time.from and time.to parameters define the lower and higher bounds of your query. They can take the following values:

  • an ISO-8601 string (e.g. 2017-09-04T15:27:01+02:00)
  • a unix timestamp (the number of milliseconds elapsed since epoch)
  • a date math string aligned with the Elasticsearch documentation (e.g. now, now - 3h, now - 7d, etc.)
{
    "time": {
        "to": "now",
        "from": "now - 3h",
        "timezone": "Europe/Paris",
        //"offset": 3600
    }
}
  • time.to (string): higher bound of the time range
  • time.from (string): lower bound of the time range
  • time.timezone (string, default UTC): the timezone of your time range, and of the results provided where it applies. Here is a comprehensive list.
  • time.offset (integer): if the timezone is not defined, the offset in seconds you want to apply.

filters parameter

The filters parameter is used to apply the query to a subset of the whole data. It is defined as an array of objects; you can define either field or search filters.

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,

	"filters": [
      {
          //Example with the syslog_hostname field
          "field": {
              "id": "syslog_hostname",
              "values": [
                  "prod-data1",
                  "prod-data2"
              ],
              "exclude": false
          }
      },
      {
          //A query string
          "search": "a search query"
      }
  ],

  //"groupBy": [...],
  //"compute": [...]
}
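Request bodies like the one above can be assembled with small helpers. A sketch; field_filter and search_filter are our own names, and the search object follows the {"query": ...} form used in the examples earlier in this article:

```python
def field_filter(field_id, values, exclude=False):
    """An equality filter on a field, as one element of "filters"."""
    return {"field": {"id": field_id, "values": list(values), "exclude": exclude}}

def search_filter(query):
    """A search-query filter, as one element of "filters"."""
    return {"search": {"query": query}}

body = {
    "filters": [
        search_filter("*"),
        field_filter("syslog_hostname", ["prod-data1", "prod-data2"]),
    ],
    "time": {"from": "now - 3h", "to": "now"},
}
```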

field filter

Filter the query by defining a list of values to include or exclude.

  • filters.field.id (string): the field ID
  • filters.field.values (list of strings): the list of values you want to include or exclude
  • filters.field.exclude (boolean, default false): true if you want to exclude the defined list of values

search filter

Filter by a search query.

  • filters.search (string): a search query as defined in the Using the search bar article


Fetch API

Fetches a single event from the platform

 

Authentication

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/events/fetch
{
  //Mandatory parameters	
  "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
  "event": true
}
{
  //Mandatory parameters	
  "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
  "columns": [
      {
          "field": "syslog_hostname"
      },
      {
          "attribute": "syslog.appname"
      }
  ]
}
     200 OK (full event / with columns)
{
    "result": {
        "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
        "date": "2016-03-22T13:42:28.996Z",
        "event": {
            "custom": {
                "app": {
                    "instance": 0
                },
                "level": "error",
                "message": "middlewareError",
          //...
      }
}
{
    "result": {
        "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
        "date": "2016-03-22T13:42:28.996Z",
        "columns": [
            "analytics-1",
            "app"
        ]
    }
}

Body Params

id
string
required

The id of the event, retrieved from the "List events API"

event
boolean

Whether the full event should be included in the response or not.

columns
array

Fields, metrics or attributes you want to specifically see in the result. Example: [{ "field": "syslog_hostname" }, { "metric": "custom_responsetime" }, { "attribute": "syslog.appname" }]

 

Generate a read API key

Before starting, you'll need to generate a valid Read API Key. Please refer to the Output API (beta) : Summary if you don't have one already or don't know how to properly use it.

API Description

With the Fetch API you can get a specific raw event using only its ID, as retrieved from the List events API.

You can retrieve the entire raw event, or extract specific pieces of information from it.

If you don't request the entire event, the response contains the values of the fields, metrics and attributes specified in the columns query parameter.

This lets you extract meaningful values from a specific event without fetching the whole raw event.


Saved Analyses API

Returns the data used to build saved analyses in the platform

 

Authentication

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute
{
	"time": {
		"from": "now-3h",
		"to": "now"
  }
}
{
    "result": {
        "values": [
            {
                "time": 1458187200000,
                "fields": {},
                "metrics": {
                    "m0": 2125
                }
            },
            {
                "time": 1458189000000,
                "fields": {},
                "metrics": {
                    "m0": 1345
                }
            },
          ...
        ]
    }
}
{
    "result": {
        "events": [
            {
                "logmatic": {
                    "date": "2016-03-17T09:52:48.102Z"
                },
                "custom": {
                    "app": {
                        "instance": 0
                    },
                    "res": {
                        "statusCode": 200
                    },
                    "level": "info",
                    "responseTime": 25,
                    "message": "HTTP GET /modules/components/workbench/controller.js",
                    "req": {
                        "headers": {
                            "user-agent": "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36"
                        },
                        "method": "GET",
                        "url": "/modules/components/workbench/controller.js"
                    }
                },
                "syslog": {
                    "severity": 6,
                    "hostname": "analytics-1",
                    "appname": "app",
                    "prival": 134,
                    "facility": 16,
                    "version": 0,
                    "timestamp": "2016-03-17T09:52:48.102Z"
                }
            },
            {
                "logmatic": {
                    "date": "2016-03-17T09:52:45.402Z"
                },
                "custom": {
                    "app": {
                        "instance": 0
                    },
                    "res": {
                        "statusCode": 200
                    },
                    "level": "info",
                    "responseTime": 14,
                    "message": "HTTP GET /modules/components/admin/entries.js",
                    "req": {
                        "headers": {
                            "user-agent": "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36"
                        },
                        "method": "GET",
                        "url": "/modules/components/admin/entries.js"
                    }
                },
                "syslog": {
                    "severity": 6,
                    "hostname": "analytics-1",
                    "appname": "app",
                    "prival": 134,
                    "facility": 16,
                    "version": 0,
                    "timestamp": "2016-03-17T09:52:45.402Z"
                }
            },
          ...
        ]
    }
}

Path Params

savedAnalysesId
string
required

The identifier found in the URL when you load a bookmark (analysis) in your platform

Body Params

time
object
 
time.to
mixed type

Higher bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.from
mixed type

Lower bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.offset
int32

Timezone offset in seconds

time.timezone
string

Timezone, e.g. UTC+01:00 or Europe/Paris

limit
int32

Maximum number of rows (Only against paginable views)

field
object
 
field.from
int32

Used to paginate (Only on list views or list of events)

format
boolean

Provides results with display names and colors

 

Rate limiting

This limit applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs: you are limited to 50 million events when the time range exceeds 24 hours. Under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

Generate a read API key

Before starting, you'll need to generate a valid Read API Key. Please refer to the Output API (beta) : Summary if you don't have one already or don't know how to properly use it.

API description

For now, you can extract data from Logmatic.io only through a valid Saved Analysis identifier. The base API path to extract any data out of Logmatic.io is https://app.logmatic.io/beta/saved_analyses/<bookmarkId>/execute, as a POST or a GET.

Query parameters description

time parameter

The time parameter lets you define the main time range and also your time zone or offset.

The time.from and time.to parameters define the lower and higher bounds of your query. They can take the following values:

  • an ISO-8601 string (e.g. 2017-09-04T15:27:01+02:00)
  • a unix timestamp (the number of milliseconds elapsed since epoch)
  • a date math string aligned with the Elasticsearch documentation (e.g. now, now - 3h, now - 7d, etc.)
{
    "time": {
        "to": "now",
        "from": "now - 3h",
        "timezone": "Europe/Paris",
        //"offset": 3600
    }
}
  • time.to (string): higher bound of the time range
  • time.from (string): lower bound of the time range
  • time.timezone (string, default UTC): the timezone of your time range, and of the results provided where it applies. Here is a comprehensive list.
  • time.offset (integer): if the timezone is not defined, the offset in seconds you want to apply.

Two cases are possible:

  • If the saved analysis is a list view, the returned data are the raw log events.
  • If it is an analytical view (time series, leaderboards, etc.), the returned data are aggregated results.

Aggregation API

Aggregate a filtered set of data by grouping events by fields and aggregating them by metrics

 

Authentication

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/events/aggregate
{
    "compute": [
        {
            "metric": {
                "id": "count",
                "output": "cnt"
            }
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "compute": [
        {
            "metric": {
                "id": "count",
                "output": "cnt"
            }
        }
    ],
    "groupBy": [
        {
            "field": {
                "id": "syslog_hostname",
                "total": "_TOTAL_HOSTNAME_",
                "output": "host"
            }
        }
    ],
    "format": true,
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "compute": [
        {
            "metric": {
                "id": "count",
                "output": "cnt"
            }
        }
    ],
    "groupBy": [
        {
            "field": {
                "id": "syslog_hostname",
                "total": "_TOTAL_HOSTNAME_",
                "output": "host"
            }
        },
        {
            "time": {
                "interval": "1m",
                "output": "t"
            }
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "compute": [
        {
            "metric": {
                "id": "count",
                "output": "cnt"
            }
        }
    ],
    "filters": [
        {
            "field": {
                "id": "syslog_hostname",
                "values": [
                    "prod-data1",
                    "prod-data2"
                ],
                "exclude": false
            }
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "compute": [
        {
            "metric": {
                "id": "count",
                "output": "cnt"
            }
        }
    ],
    "groupBy": [
        {
            "field": {
                "id": "syslog_hostname",
                "total": "_TOTAL_HOSTNAME_",
                "output": "host",
                "limit": 5,
              	//Natural order on first groupBy is mandatory to enable pagination
                "order": {
                    "@natural": "asc"
                }
            }
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    },
  	//Paginate until host "prod-data4"
  	"afterBucket": {
    	 "host": "prod-data4"
    }
}
{
    "compute": [
        {
            "metric": {
                "id": "count",
                "output": "cnt"
            }
        }
    ],
    "groupBy": [
        {
            "field": {
                "id": "syslog_hostname",
                "total": "_TOTAL_HOSTNAME_",
                "output": "host"
            }
        },
        {
            "field": {
                "id": "syslog_appname",
                "output": "app"
            }
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {},
                "values": {
                    "cnt": 128930
                }
            }
        ]
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": {
                        "id": "prod-data3",
                        "name": "prod-data3",
                        "color": "#ff7f00"
                    }
                },
                "values": {
                    "cnt": {
                        "value": 33041,
                        "formatted": "33.04k"
                    }
                }
            },
            {
                "bucket": {
                    "host": {
                        "id": "prod-data2",
                        "name": "prod-data2",
                        "color": "#ffff33"
                    }
                },
                "values": {
                    "cnt": {
                        "value": 23206,
                        "formatted": "23.21k"
                    }
                }
            },
          //...
        ]
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": "prod-data3",
                    "t": "2016-03-21T14:59:00.000Z"
                },
                "values": {
                    "cnt": 45
                }
            },
            {
                "bucket": {
                    "host": "prod-data3",
                    "t": "2016-03-21T15:00:00.000Z"
                },
                "values": {
                    "cnt": 313
                }
            },
          //...
        ]
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": "prod-data5"
                },
                "values": {
                    "cnt": 20575
                }
            },
            {
                "bucket": {
                    "host": "prod-es1"
                },
                "values": {
                    "cnt": 4296
                }
            }
          //...
        ],
      	//Used as an anchor to call the "afterBucket" pagination parameter
        "lastBucket": {
            "host": "prod-mongo1"
        }
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": "analytics-1",
                    "app": "app"
                },
                "values": {
                    "cnt": 12290
                }
            },
            {
                "bucket": {
                    "host": "analytics-1",
                    "app": "sshd"
                },
                "values": {
                    "cnt": 3584
                }
            },
            {
                "bucket": {
                    "host": "analytics-1",
                    "app": "CRON"
                },
                "values": {
                    "cnt": 9
                }
            },
          //...
        ]
    }
}

Body Params

  • time (object): Time range of the query
  • time.to (mixed, required): Higher bound of the time range (allowed formats: ISO-8601, epoch in ms, or date math such as "now - 3h")
  • time.from (mixed, required): Lower bound of the time range (allowed formats: ISO-8601, epoch in ms, or date math such as "now - 3h")
  • time.offset (int32): Timezone offset
  • time.timezone (string): Timezone
  • compute (array, required): Compute up to 5 aggregated metrics. See details below. Example: [{ "metric": { "id": "count", "output": "cnt" } }]
  • groupBy (array): Group by up to 3 fields or time. See details below. Example: [{ "field": { "id": "syslog_hostname", "total": "_TOTAL_HOSTNAME_", "output": "host" } }, { "time": { "interval": "1m", "output": "t" } }]
  • filters (array): Search query or equality filters as an array. Example: [{ "search": { "query": "*" } }, { "field": { "id": "field_id", "values": ["value_1", "value_2"], "exclude": false } }]
  • format (boolean): If true, the response contains formatted strings and colors aligned with what your application would do.
  • afterBucket (object): Used to paginate when you exceed the maximum number of buckets returned. Available only if the first groupBy is sorted in natural order. It must be an object mapping the groupBy output name to the last bucket value, as follows: {"host": "prod-data4"}
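The afterBucket request parameter and the lastBucket response field together support a simple pagination loop. Below is a minimal Python sketch; run_query is a hypothetical callable standing in for whatever HTTP client you use to POST the body to the Aggregation API and parse the JSON response.

```python
# Pagination sketch: "run_query" is a hypothetical callable that sends the
# query body to the Aggregation API and returns the parsed JSON response.

def paginate(run_query, body):
    """Collect aggregates across pages until no 'lastBucket' is returned."""
    aggregates = []
    after = None
    while True:
        page = dict(body)
        if after is not None:
            page["afterBucket"] = after    # e.g. {"host": "prod-data4"}
        result = run_query(page)["result"]
        aggregates.extend(result.get("aggregates", []))
        after = result.get("lastBucket")   # anchor for the next page
        if after is None:                  # no anchor: last page reached
            break
    return aggregates
```

Remember that pagination is only available when the first groupBy is sorted in natural order.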

 
 

Rate limiting

APIs that access raw or aggregated data (Saved Analyses, the List events API, or the Aggregation API) are rate limited: you can scan at most 50 million events when the time range exceeds 24 hours. For time ranges under 24 hours, there is no limit on the number of events you can go through.
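If you need to scan more than 50 million events, one workaround is to split a long time range into windows of at most 24 hours and issue one query per window. A minimal sketch (split_time_range is an illustrative helper name, not part of the API):

```python
from datetime import datetime, timedelta

def split_time_range(start, end, max_span=timedelta(hours=24)):
    """Split [start, end] into consecutive windows no longer than max_span,
    so each individual query stays under the 24-hour threshold."""
    windows = []
    cursor = start
    while cursor < end:
        upper = min(cursor + max_span, end)
        windows.append((cursor, upper))
        cursor = upper
    return windows
```

Each (from, to) pair can then be used as the time range of a separate query.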

If you experience any problem with this API, please contact the support team.

Generate a read API key

Before starting, you'll need to generate a valid Read API Key. Please refer to the Output API (beta) Summary if you don't have one already or don't know how to properly use it.

API description

The aggregation API returns the result of up to 5 aggregated metrics and 3 group-bys. It gives access to the aggregation capabilities of your Logmatic.io platform.

Query parameters description

Here we only detail the query parameters with a higher level of complexity: time, filters, groupBy and compute.

time parameter

The time parameter lets you define the main time range and also your time zone or offset.

The time.from and time.to parameters define the lower and higher bounds of your query. They can take the following values:

  • an ISO-8601 string (e.g. 2017-09-04T15:27:01+02:00)
  • a Unix timestamp (the number of milliseconds elapsed since epoch)
  • a date math string aligned with the ElasticSearch documentation (e.g. now, now - 3h, now - 7d, etc.)
{
		"time": {
        "to": "now",
        "from": "now - 3h",
      	"timezone": "Europe/Paris",
      	//"offset": 3600
    }
}
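As an illustration, the three accepted formats for time.from and time.to can be produced in Python like this (a sketch, not an official client):

```python
from datetime import datetime, timezone

now = datetime(2017, 9, 4, 15, 27, 1, tzinfo=timezone.utc)

iso = now.isoformat()                   # ISO-8601, e.g. "2017-09-04T15:27:01+00:00"
epoch_ms = int(now.timestamp() * 1000)  # Unix timestamp in milliseconds
date_math = "now - 3h"                  # ElasticSearch-style date math

# Any of the three values is accepted for "from" and "to":
body_time = {"time": {"from": epoch_ms, "to": "now", "timezone": "Europe/Paris"}}
```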
  • time.to (string): Higher bound of the time range
  • time.from (string): Lower bound of the time range
  • time.timezone (string, default: UTC): The timezone of your time range and of the results, if it applies. Here is a comprehensive list.
  • time.offset (integer): If the timezone is not defined, you can provide the offset to apply, in seconds.

filters parameter

The filters parameter applies the query to a subset of the whole data set. It is defined as an array of objects; each object is either a field or a search filter.

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,

	"filters": [
      {
          //Example with the syslog_hostname field
          "field": {
              "id": "syslog_hostname",
              "values": [
                  "prod-data1",
                  "prod-data2"
              ],
              "exclude": false
          }
      },
      {
          //A query string
          "search": {"query":"a search query"}
      }
  ],

  //"groupBy": [...],
  //"compute": [...]
}
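The two filter shapes can be built with small helpers; field_filter and search_filter below are hypothetical names used for illustration:

```python
# Hypothetical helpers building the two shapes accepted by "filters".

def field_filter(field_id, values, exclude=False):
    """Equality filter: include (or exclude) events whose field matches."""
    return {"field": {"id": field_id, "values": list(values), "exclude": exclude}}

def search_filter(query):
    """Free-text search query filter."""
    return {"search": {"query": query}}

filters = [
    field_filter("syslog_hostname", ["prod-data1", "prod-data2"]),
    search_filter("a search query"),
]
```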

field filter

Filter the query by defining a list of values to include or exclude.

  • filters.field.id (string): Field ID
  • filters.field.values (list of strings): The list of values you want to include or exclude
  • filters.field.exclude (boolean, default: false): True if you want to exclude the defined list of values

search filter

Filter by a search query.

  • filters.search (string): A search query as defined in the Using the search bar article.

groupBy parameter

The groupBy parameter is used to split the aggregation over any field defined on the platform, or over time. It is defined as a list of objects: either field or time.

Limits:

  • You can ask for up to 3 groupBys at a time
  • You can ask for up to 100 values aggregated over each groupBy
  • You can ask for up to 1000 values combined over all the groupBys at a time

Pagination:
Pagination over the first groupBy can be enabled by requesting the natural order ("@natural": "asc" or "desc").

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,

  "groupBy": [
    {
      "field": {
        "id": "field_id",
        "limit": xx,
        "output": "out_attr_name_f",
        "total": "_total_",
        "order": {
          "metric_id": "asc/desc"
          //or "@natural": "asc/desc" if you want to enable pagination
        } 
      }
    }, 
    { 
      "time": {
        "interval": "1m",
        "output": xxx,
        "total": "_TOTAL_"
      } 
    }
  ],
  //"compute": [{(...)}] 
}
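The limits above can be verified client-side before sending a query. A sketch (check_group_by is an illustrative helper, not part of the API):

```python
# Pre-flight check mirroring the documented groupBy limits: at most
# 3 groupBys, each field limit at most 100, product of limits at most 1000.

def check_group_by(group_by):
    if len(group_by) > 3:
        raise ValueError("at most 3 groupBys per query")
    product = 1
    for entry in group_by:
        limit = entry.get("field", {}).get("limit", 1)  # time groupBys count as 1
        if limit > 100:
            raise ValueError("a field limit cannot exceed 100")
        product *= limit
    if product > 1000:
        raise ValueError("the product of limits cannot exceed 1000")
    return True
```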

Group by a field

  • groupBy.field.id (string): Field ID
  • groupBy.field.limit (integer): Limits the number of values for that field. There is no limit for some pre-defined fields such as tagsets; otherwise the maximum is 100, and the product of all limits cannot exceed 1000.
  • groupBy.field.output (string): The display name of your field in the output result.
  • groupBy.field.total (string, default: null): If you want subtotals to be returned, provide a display name that'll be returned in the response.
  • groupBy.field.order (object, default: descending over the first defined metric): Order ascending or descending according to a metric (e.g. {"count": "desc"}) or to the natural order (e.g. {"@natural": "desc"}). When the natural order is used, pagination is enabled over the first groupBy.

Group by time

  • groupBy.time.interval (string, default: 1m): Time step for the aggregation
  • groupBy.time.output (string): Display name of the time attribute in the result
  • groupBy.time.total (string, default: null): Returns the total over time. Provide a display name that'll be returned in the response.

compute parameter

The compute parameter is where you define the metrics to aggregate into all the resulting buckets of your groupBys.

You can define up to 5 metrics per query.

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,
  //"groupBy": [{(...)}],

  "compute": [
    {
      "metric": {
        "id": "metric_id",
        "output": "out_attr_name_m"
      }
    }, 
    {
      "timeseries": {
        "metricId": "metric_id",
        "output": "output_attr_name_m2",
        "interval": "1m"
      }
    }
  ]
}
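Assembling the compute array can likewise be sketched in Python (build_compute is an illustrative helper enforcing the 5-metric cap, not part of the API):

```python
# Illustrative helper: collects compute entries and enforces the 5-metric cap.

def build_compute(*entries):
    if len(entries) > 5:
        raise ValueError("up to 5 metrics per query")
    return list(entries)

compute = build_compute(
    {"metric": {"id": "count", "output": "cnt"}},
    {"timeseries": {"metricId": "count", "output": "cnt_ts", "interval": "1m"}},
)
```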

Compute metric

  • compute.metric.id (string): Metric ID
  • compute.metric.output (string): The display name of your metric in the output result.

Compute timeseries

  • compute.timeseries.metricId (string): Metric ID
  • compute.timeseries.output (string): The display name of the timeseries in the output result.
  • compute.timeseries.interval (string): The interval: 1s for "1 second", 1m for "1 minute", 1h for "1 hour", etc.