
Summary

If you have admin rights, you can create and delete multiple write API keys.
Write API keys are used by the input APIs to send data to Logmatic.io.

 

Authentication

All input APIs are authenticated using a Write API Key.
It must be specified:

  • Either as a query parameter: api_key
  • Or as a header: x-api-key

How to create a write API key?

The API key is your identifier when sending data to us. It is required whether you send data manually or with any log shipper.

To find it, click on the Configure button in the side menu, then on the API keys sub-menu:

API Key overview

When your platform is provisioned, a key will be generated automatically. But if need be, you can generate new ones.

You can create multiple write API keys

This gives you better control over who sends or gets data on your account.
Find more here.

This key will be referred to as <your_api_key> in all further steps.

3 Available endpoints

As we mentioned, there are many ways to send data to Logmatic.io. We'll help you decide which one is best suited to your usage. If you want to understand what is behind this, you need to know that there are only 3 exposed input API endpoints:

All further examples and use cases found in this documentation will always rely on one of these.

Each endpoint has advantages and drawbacks. They are all summarized in the following table:

Protocol   Secure   Guaranteed delivery   Endpoint
TCP        NO       YES                   api.logmatic.io PORT: 10514
TCP/S      YES      YES                   api.logmatic.io PORT: 10515
UDP        NO       NO                    api.logmatic.io PORT: 514

Best practices to send logs in realtime

Many open source projects can help you send machine data in real time to a remote machine. To make things easier, we selected a few solutions that are good standards to ensure great performance, functionality, security, and reliability.

This table aims to present an overview of these options:

Name        OS                File fwd   Enrichment features   Endpoint used
Rsyslog     Linux             YES        NO                    TCP/S, UDP
Nxlog       Windows / Linux   YES        NO                    TCP/S
Logstash    Windows / Linux   YES        YES                   HTTP/S, TCP/S, UDP
Syslog-NG   Windows / Linux   YES        YES                   HTTP/S, TCP/S, UDP
Fluentd     Windows / Linux   YES        YES                   HTTP/S, TCP/S, UDP
Agentless   Windows / Linux   NO         NO                    HTTP/S, TCP/S, UDP

Let's explain a few things here:

  • File fwd means that the log shippers can watch a file and push any appended content to a remote process
  • Enrichment features means that the log shippers can enrich incoming messages on-the-fly before sending them.

Agentless regroups all the solutions where you call one of the endpoints directly from within your applications with:

And many other sources.

Reserved fields

the DATE field

By default Logmatic.io generates a timestamp that corresponds to the reception date. This field is used to display your data over the timeline.

Date reserved fields

timestamp, date, _timestamp, Timestamp, eventTime and published_date

Date recognized formats

The recognized date formats are: ISO8601, UNIX (the milliseconds EPOCH format) and RFC3164.
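For illustration, here is a minimal sketch that sends a JSON event carrying its own ISO8601 timestamp through the HTTP input endpoint described later in this documentation (the message content is purely illustrative):

curl -H "content-type:application/json" --data '{"timestamp":"2014-09-01T13:05:06.048413+00:00","message":"authentication succeeded"}' https://api.logmatic.io/v1/input/<your_api_key>

Since timestamp is one of the reserved date fields, the event should be placed on the timeline at the provided date rather than at its reception date.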

the MESSAGE field

By default, Logmatic.io considers the message field as the content to display in priority.

MESSAGE is indexed as text

MESSAGE is also the only attribute indexed as text, so you can search over any tokenized word without mentioning the path to the attribute.
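As a quick sketch (hypothetical content), an event sent as follows can later be found by simply typing failure in the search bar, with no attribute path required:

curl -H "content-type:application/json" --data '{"message":"authentication failure for user root"}' https://api.logmatic.io/v1/input/<your_api_key>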

Recognized log content & parsers

When plain text is sent to Logmatic.io, the system checks whether it matches a standard Syslog format.
If you need to extract more value from your logs (Apache, key-value, JSON, etc.), you can also enable the available parsing extensions or create new ones.
Check the chapter about input parsers here.

Recognized Syslogs

Syslog is the standard for computer message logging. Syslog can be used for computer system management and security auditing, as well as generalized informational, analytical, and debugging messages. It is supported by a wide variety of devices (eg. printers and routers) and receivers across multiple platforms. Because of this, Syslog can be used to integrate log data from many different types of systems into a central repository.

You can provide properly formatted Syslog logs as long as they respect the RFC-5424 standard.

Once the message is extracted, Logmatic.io tries to extract other patterns like JSON or Key-Value pairs. This is very useful since Syslog can become your main format to provide any type of log event.

Let's look at an example. The line below is a valid RFC-5424 Syslog:

<85>0 2014-09-01T13:05:06.048413+00:00 client-logs sshd - - -  pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=222.186.34.120  user=root

It is automatically recognized as a syslog with key-value pairs in it and turned into:

{
  "message": "pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=222.186.34.120  user=root",
  "custom": {
    "uid": 0,
    "euid": 0,
    "tty": "ssh",
    "rhost": "222.186.34.120",
    "user": "root",
  },
  "syslog":  {
    "prival": 85,
    "severity": 5,
    "facility": 10,
    "version": 0,
    "appname": "sshd",
    "hostname": "client-logs",
    "timestamp": "2014-09-01T13:07:57.922742+00:00"
  }
}

Editing an Input API key

Changing name, disabling a key and daily usage

Editing a write API key

When editing a key, you can give it a name, enable or disable it, and see its daily usage.

Assigning a daily quota

For advanced usage, you can also apply a daily quota to a key. When the quota is reached, the key is automatically disabled until the end of the daily cycle (midnight UTC).

This is really useful when you are not sure about the volume sent by some of your systems and you still want to make sure you won't reach the limit of your plan.

Enable daily quota and assign a type

There are 2 kinds of quotas:

  • Percentage of global limit: you assign a share of your plan limit, so if you increase your plan in the future the quota scales with the total plan.
  • Volume in MB: you simply assign the number of megabytes allowed per day for this key.
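For example, with hypothetical numbers: on a 10 GB/day plan, a 20% "percentage of global limit" quota caps the key at 2 GB per day and would grow to 4 GB per day if the plan were doubled, whereas a 500 MB "volume" quota stays at 500 MB per day regardless of plan changes.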

Deleting a key

Be aware that there is no going back, and you'll have to change the configuration of all your log forwarders if you delete a key by mistake.

Input API: HTTP/HTTPS

The HTTP/S API is used to send events to Logmatic.io.
It can be used manually by doing a curl, as we will show in this chapter.

Find the complete documentation here

Input API: TCP, TCP/S and UDP

The TCP protocol, and even the UDP protocol, are more efficient and faster than HTTP calls. In this chapter, we explain how you can send events directly through these channels.

The TCP protocol guarantees that events are delivered to Logmatic.io, and TCP/S secures the connection. Neither feature is available over UDP, so the only reasons to prefer UDP over TCP are performance, or a log forwarder process that only supports UDP.

Find the complete documentation here


TCP, TCP/S & UDP

 

Sending logs over TCP

To send logs over TCP, you simply have to prefix your events with your API key.

The target server is api.logmatic.io and the port is 10514.

Single line

If you want to try it, use telnet as follows:

> telnet api.logmatic.io 10514
  Trying 23.101.57.133...
  Connected to api.logmatic.io.
  Escape character is '^]'.
> <your_write_api_key> "This is my first line!"

Don't forget to provide your API key

Replace <your_write_api_key> with your Write API key.

Here we just sent "This is my first line!"; please check in the main view that it shows up for you.

Multi-line

You can send multiple events in a single call by separating them with end-of-line characters \n.

> telnet api.logmatic.io 10514
Trying 23.101.57.133...
Connected to api.logmatic.io.
Escape character is '^]'.
> <your_write_api_key> "This is my first line!\nThis is my second line!"

Recognized log content

You can provide JSON events or Syslog events directly in the TCP connection.

> telnet api.logmatic.io 10514
Trying 23.101.57.133...
Connected to api.logmatic.io.
Escape character is '^]'.
> <your_write_api_key> {"message":"This is my first line!","appname":"app1"}

Please refer to the Overview section if you want more details about recognized formats.

Sending logs securely with TCP SSL

To secure your calls:

  • Replace the port 10514 with 10515.
  • Set up your client with the right certificate, which you can download by clicking here.

Here is an example with an OpenSSL client:

> openssl s_client -connect api.logmatic.io:10515
  ...
>  <your_write_api_key> "This is my first line!"

Sending logs over UDP

The UDP endpoint follows the same rules as the TCP endpoints:

  • The API key must be the first token of the sentence
  • You can provide multiple lines by inserting end-of-line characters
  • Some types are automatically parsed

The target server is still api.logmatic.io but the port becomes 514.

If you want to test it, install Netcat; we are going to use it as a UDP client.

To send your first line with the UDP protocol proceed as follows:

> nc -vu api.logmatic.io 514
> <your_api_key> This is my first line!

HTTP / HTTPS

Even if the following URLs start with http, Logmatic.io provides an HTTP/S secure communication protocol to guarantee that your logs won't be intercepted by undesired third parties.

 
POST https://api.logmatic.io/v1/input/your_write_api_key
curl --request POST \
  --url https://api.logmatic.io/v1/input/your_write_api_key
var request = require("request");

var options = { method: 'POST',
  url: 'https://api.logmatic.io/v1/input/your_write_api_key' };

request(options, function (error, response, body) {
  if (error) throw new Error(error);

  console.log(body);
});
require 'uri'
require 'net/http'

url = URI("https://api.logmatic.io/v1/input/your_write_api_key")

http = Net::HTTP.new(url.host, url.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE

request = Net::HTTP::Post.new(url)

response = http.request(request)
puts response.read_body
var settings = {
  "async": true,
  "crossDomain": true,
  "url": "https://api.logmatic.io/v1/input/your_write_api_key",
  "method": "POST",
  "headers": {},
  "processData": false
}

$.ajax(settings).done(function (response) {
  console.log(response);
});
import requests

url = "https://api.logmatic.io/v1/input/your_write_api_key"

response = requests.request("POST", url)

print(response.text)
{"ok":True}

Path Params

your_write_api_key
string
required

One of your write API keys needs to be provided so the API routes the data to your platform.

Body Params

attribute_string
string
attribute_int
int32
 

Sending plain text

Logmatic.io provides a single POST endpoint; fill in your API key.
You can either test it directly from this section or use a single curl command:

curl -H "content-type:text/plain" --data 'This is my first line of log!' https://api.logmatic.io/v1/input/<your_api_key>

As long as the endpoint returns a 200 status code it means that your message was properly taken into account. You should then be able to see it in the user interface.

You can also send multiple lines in a single call by inserting end-of-line characters \n or \r\n:

curl -H "content-type:text/plain" --data 'This is my first line!\nThis is my second line of log!' https://api.logmatic.io/v1/input/<your_api_key>

Metadata query parameters

Metadata parameters help you enrich the log events you provide. For instance, if you have multiple applications, hosts or users, you may want to provide this information along with the content.

The POST endpoint then becomes:

https://api.logmatic.io/v1/input/<your_api_key>?hostname=<hostname>&appname=<appname>&username=<username>
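For example, here is a sketch that reuses the plain-text call above with hypothetical hostname, appname and username values (quote the URL so that your shell does not interpret the & characters):

curl -H "content-type:text/plain" --data 'This is my first line!' 'https://api.logmatic.io/v1/input/<your_api_key>?hostname=host1&appname=app1&username=user1'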

Sending some JSON

Some people prefer to work directly in JSON as their events are already structured. To send structured JSON logs, the content-type has to be changed to application/json.

Let's say you want to push 2 JSON objects. You can send them like this:

curl -H "content-type:application/json" - -data '[{"username":"user1","appname":"app1","message":"This is my first line!"},{"username":"user2","appname":"app2","message":"This is my second line!"}]' https://api.logmatic.io/v1/input/<your_api_key>?hostname=host1

You'll notice that we kept hostname=host1 as a metadata query parameter. When working with JSON objects, these additional fields are merged into each JSON object.

Retrieve client's IP address and user-agent

The API is able to inject the IP address or the user agent of the calling client in the incoming log event. This can be very useful for Mobile and Web analytics usages.

To do so, you simply have to add one or both of the following headers to your HTTP calls, as shown in the sketch after this list:

  • X-Logmatic-Add-IP=<desired ip attribute>: copies the IP of the calling client to the <desired ip attribute> of the provided log events
  • X-Logmatic-Add-UserAgent=<desired ua attribute>: copies the User-Agent of the calling client to the <desired ua attribute> of the provided log events
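Here is a minimal sketch, assuming you want the client IP and User-Agent copied into hypothetical client_ip and client_ua attributes:

curl -H "content-type:application/json" -H "X-Logmatic-Add-IP: client_ip" -H "X-Logmatic-Add-UserAgent: client_ua" --data '{"message":"page viewed"}' https://api.logmatic.io/v1/input/<your_api_key>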

Summary

If you have admin rights, you can create and delete multiple read API keys.
Read API keys are used by the output APIs to extract data from Logmatic.io.

 

Authentication

All output APIs are authenticated using a Read API Key.
It must be specified:

  • Either as a query parameter: api_key
  • Or as a header: x-api-key

Create a Read API key

To create an output API key, click on the configure menu and then on add Read API key:

List of all your API keys

You can create multiple keys in order to have better control over who gets access to your account's data. Furthermore, you can apply security filters on a per-key basis.

You can delete a key in order to revoke it.

Response format

The response is always provided as application/json.
We use an appropriate HTTP status code to better qualify success or errors.
The result is always bundled in the result field of the response.

{
	"result": {
      //result object here
	} 
}

When the API fails, it will include insights into what failed. The format is always the same, the status code and a message are provided in an error object.

{
  "error": {
    "status": 403,
    "message": "Forbidden. You do not have access to this resource."
  }
}

Rate limiting

What is rate limiting?

Rate limiting applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs. You are limited to 50 million events when the time range exceeds 24 hours; under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

Security filters

Just like for users, you can apply a security filter to your API keys to limit the data they can retrieve. To learn more about this feature, refer to Manage users & security filters.

Assign a name or a security filter to an API key

Available Discovery APIs

List groups defined on your platform

List Fields and Metrics defined on your platform

Available Output APIs

Output API Name      Description
Saved Analyses API   Returns the data used to build saved analyses in the platform
Fetch API            Fetches a single event from the platform
List events API      Lists a filtered set of events
Aggregation API      Aggregates a filtered set of data by grouping events by fields and aggregating them by metrics


List Group API

The List Group API is a Discovery API which lets you get the list of all the (Fields & Metrics) groups available in your platform.

 

Header Auth

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/discover/groups
curl --request POST \
  --url https://app.logmatic.io/beta/discover/groups \
  --header 'content-type: application/json' \
  --header 'x-api-key: X-API-Key'
var request = require("request");

var options = { method: 'POST',
  url: 'https://app.logmatic.io/beta/discover/groups',
  headers: 
   { 'x-api-key': 'X-API-Key',
     'content-type': 'application/json' } };

request(options, function (error, response, body) {
  if (error) throw new Error(error);

  console.log(body);
});
require 'uri'
require 'net/http'

url = URI("https://app.logmatic.io/beta/discover/groups")

http = Net::HTTP.new(url.host, url.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE

request = Net::HTTP::Post.new(url)
request["content-type"] = 'application/json'
request["x-api-key"] = 'X-API-Key'

response = http.request(request)
puts response.read_body
var settings = {
  "async": true,
  "crossDomain": true,
  "url": "https://app.logmatic.io/beta/discover/groups",
  "method": "POST",
  "headers": {
    "content-type": "application/json",
    "x-api-key": "X-API-Key"
  },
  "processData": false
}

$.ajax(settings).done(function (response) {
  console.log(response);
});
import requests

url = "https://app.logmatic.io/beta/discover/groups"

headers = {
    'content-type': "application/json",
    'x-api-key': "X-API-Key"
    }

response = requests.request("POST", url, headers=headers)

print(response.text)
{
  "result": {
    "groups": [
      {
        "id": "syslog",
        "name": "Syslog",
        "description": "Syslog Fields"
      }, 
      {
        "id": "http_web_access",
        "name": "http_web_access",
        "description": "xxx"
      }
    ] 
  }
}

Headers

Content-Type
string
required

Default content type; any other content type won't work.

 

API description

The List Group API is a Discovery API which lets you get the list of all the (Fields & Metrics) groups available in your platform.


List Fields and Metrics API

List Fields and Metrics defined on your platform

 

Header Auth

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/discover/fields_metrics
{
  //Mandatory parameters
    "fields": true,
    "metrics": true,
  
  //Filtering parameter
    "groups": ["xxx", "yyy"]
}
{
  "result": {
    "fields": [
      {
        "id": "xxx",
        "name": "xxx",
        "description":  "xxx"
      }
    ],
    "metrics": [
      {
        "id": "xxx",
        "name": "xxx",
        "description":  "xxx"
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
        //In case of traditional field
        "type": "attribute", 
        "attribute":  "custom.xxx"
      }
    ],
    "metrics": [
      {
        (...)
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
        //In case of tagset
        "type": "tagset",
        "tags": [
          { 
            "id": "xxx",
            "name": "xxx", 
            "color": "xxx"
          }
        ]
      }
    ],

    "metrics": [
      {
				(...)
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
				(...)
      }
    ],
    "metrics": [
      {
        //In case of traditional metric
        "type": "attribute",
        "attribute":  "custom.xxx",
        "aggregation": "sum"
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
				(...)
      }
    ],
    "metrics": [
      {
        //In case of formula
        "type": "formula",
        "formula": "xxx"
      }
    ] 
  }
}
{
  "result": {
    "fields": [
      {
				(...)
      }
    ],
    "metrics": [
      {
        //In case of count
        "type": "core" 
      }
    ] 
  }
}

Body Params

fields
boolean

If false, the fields will be omitted.

metrics
boolean

If false, the metrics will be omitted.

groups
array of strings

The subset of groups you want to retrieve the Fields & Metrics from. Everything is returned by default.

 

API description

The "List Fields and Metrics API" is a Discovery API which let you get a list of all the fields and metrics defined on your platform.


List events API

Lists a filtered set of events

 

Header Auth

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/events/list
{
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "event": true,
    "afterEvent": "AAAAAVOaRF9dwR6J_EFWT2FRMWF6endJVU1iYVBpeV9w",
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "filters": [
        {
            "search": {
                "query": "*"
            }
        },
        {
            "field": {
                "id": "syslog_hostname",
                "values": [
                    "prod-data1",
                    "prod-data2"
                ],
                "exclude": false
            }
        }
    ],
    "event": true,
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "columns": [
        {
            "field": "syslog_hostname"
        },
        {
            "attribute": "syslog.appname"
        }
    ],
    "time": {
        "to": "now",
        "from": "now - 3h"
    }
}
{
    "result": {
        "events": [
            {
                "id": "AAAAAVOaRF9dwR6J_EFWT2FRMWF6endJ1iYVBpeV9w",
                "date": "2016-03-21T17:41:25.469Z"
            },
            {
                "id": "AAAAAVOaRF9dwR6J-0FWT2FRMWF6endJ1iYVBpeV9v",
                "date": "2016-03-21T17:41:25.469Z"
            },
            {
                "id": "AAAAAVOaRF9dwR6J-kFWT2FRMWF6endJ1iYVBpeV9u",
                "date": "2016-03-21T17:41:25.469Z"
            },
          ...
        ]
    }
}
{
  "result": {
    "events": [
      {
        "id": "AAAAAVOEYMdYps92oUFWT0VZT294Y3E0V1BKR21F",
        "date": "2015-03-17T11:40:48.344Z",
        "event": {
          "custom": {
            //User data
          },
          "syslog": {
            //Syslog data
          },
          "message": "..."
        }
      },
      {
        "id": "AAAAAVOEYMdK7Gq9tkFWTWVrV294Y3E0V1BKR2wt",
        "date": "2015-03-17T11:40:48.330Z",
        "event": {
          "custom": {
            //User data
          },
          "syslog": {
            //Syslog data
          },
          "message": "...""
        }
      },
     ...
}
{
  "result": {
    "events": [
      {
        "id": "AAAAAVOEXr_3ps7puUFWT0V1V294Y3E0V1BJeEZW",
        "date": "2015-03-17T11:38:35.383Z",
        "columns": [
          "prod-4",
          "haproxy"
        ]
      },
      {
        "id": "AAAAAVOEXr_l7Go3-kFVYc0JCV294Y3E0V1BJeEZY",
        "date": "2015-03-17T11:38:35.365Z",
        "columns": [
          "prod-5",
          "apache"
        ]
      },
     ...
}

Body Params

time
object
 
time.to
mixed type
required

Higher bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.from
mixed type
required

Lower bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.offset
int32

Timezone offset

time.timezone
string

Timezone

filters
array

Search query or equality filters as an array. Example: [{ "search": { "query": "*" } }, { "field": { "id": "field_id", "values": ["value_1", "value_2"], "exclude": false } }]

event
boolean

Whether the full event should be included in the response or not.

columns
array

Fields, metrics or attributes you want to specifically see in the result. Example: [{ "field": "syslog_hostname" }, { "metric": "custom_responsetime" }, { "attribute": "syslog.appname" }]

limit
int32

The number of events returned at once. Max: 1000

afterEvent
string

An event id to start with. Used to paginate over the results.

 

What is rate limiting?

Rate limiting applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs. You are limited to 50 million events when the time range exceeds 24 hours; under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

API description

Lists a filtered set of events.

By default, the events are returned as objects that contain only the id and the date. However, you can request the full event or specific columns. See the examples above to understand the different options.
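As a hedged example, the following curl call combines the request bodies shown above into a single query over the last 3 hours (the key is a placeholder):

curl --request POST \
  --url https://app.logmatic.io/beta/events/list \
  --header 'content-type: application/json' \
  --header 'x-api-key: <your_read_api_key>' \
  --data '{"time": {"to": "now", "from": "now - 3h"}, "event": true, "limit": 100}'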

Query parameters description

We are only going to describe the time and filters parameters.

time parameter

The time parameter lets you define the main time range and also your time zone or offset.

The time.from and time.to parameters define the lower and higher bounds of your query. They can take the following values:

  • an ISO-8601 string (eg 2017-09-04T15:27:01+02:00)
  • a unix timestamp (number representing the elapsed millisec since epoch)
  • a date math string aligned with ElasticSearch documentation (eg now, now - 3h, now - 7d, etc...)
{
		"time": {
        "to": "now",
        "from": "now - 3h",
      	"timezone": "Europe/Paris",
      	//"offset": 3600
    }
}
  • time.to (string): Higher bound of the time range.
  • time.from (string): Lower bound of the time range.
  • time.timezone (string, default UTC): The time zone of your time range and of the results provided, if it applies. Here is a comprehensive list.
  • time.offset (integer): If the time zone is not defined, you can provide the offset you want to apply, in seconds.

filters parameter

The filters parameter is used to apply the query to a subset of the whole data. It is defined as an array of objects. You can define either field or search filters.

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,

	"filters": [
      {
          //Example with the syslog_hostname field
          "field": {
              "id": "syslog_hostname",
              "values": [
                  "prod-data1",
                  "prod-data2"
              ],
              "exclude": false
          }
      },
      {
          //A query string
          "search": "a search query"
      }
  ],

  //"groupBy": [...],
  //"compute": [...]
}

field filter

Filter the query by defining a list of values to include or exclude.

  • filters.field.id (string): Field ID.
  • filters.field.values (list of strings): The list of values you want to include or exclude.
  • filters.field.exclude (boolean, default false): True if you want to exclude the defined list of values.

search filter

Filter by a search query.

  • filters.search (string): A search query as defined in the Using the search bar article.


Fetch API

Fetches a single event from the platform

 

Header Auth

 Authentication is required for this endpoint.
POST https://app.logmatic.io/beta/events/fetch
{
  //Mandatory parameters	
  "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
  "event": true
}
{
  //Mandatory parameters	
  "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
  "columns": [
      {
          "field": "syslog_hostname"
      },
      {
          "attribute": "syslog.appname"
      }
  ]
}
{
    "result": {
        "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
        "date": "2016-03-22T13:42:28.996Z",
        "event": {
            "custom": {
                "app": {
                    "instance": 0
                },
                "level": "error",
                "message": "middlewareError",
          //...
      }
}
{
    "result": {
        "id": "AAAAAVOej_mE1i_TLEFWT2VqdHZ5endJVU1iYVByZ0Vf",
        "date": "2016-03-22T13:42:28.996Z",
        "columns": [
            "analytics-1",
            "app"
        ]
    }
}

Body Params

id
string
required

The id of the event retrieved from the "List events API"

event
boolean

Whether the full event should be included in the response or not.

columns
array

Fields, metrics or attributes you want to specifically see in the result. Example: [{ "field": "syslog_hostname" }, { "metric": "custom_responsetime" }, { "attribute": "syslog.appname" }]

 

API Description

With the Fetch API you can get a specific raw event using only its ID, as retrieved from the List events API.

You can ask to extract specific information from the raw event.
You can decide whether or not to get the entire raw event.

If not, the response contains the values of the fields, metrics and attributes specified in the columns query parameter.

This lets you extract meaningful values from a specific event without retrieving the whole raw event.
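A minimal curl sketch, assuming an event id previously returned by the List events API:

curl --request POST \
  --url https://app.logmatic.io/beta/events/fetch \
  --header 'content-type: application/json' \
  --header 'x-api-key: <your_read_api_key>' \
  --data '{"id": "<event_id>", "event": true}'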


Saved Analyses API

Returns the data used to build saved analyses in the platform

 
POST https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute
curl --request POST \
  --url https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute \
  --header 'content-type: application/json' \
  --header 'x-api-key: X-API-Key'
var request = require("request");

var options = { method: 'POST',
  url: 'https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute',
  headers: 
   { 'x-api-key': 'X-API-Key',
     'content-type': 'application/json' } };

request(options, function (error, response, body) {
  if (error) throw new Error(error);

  console.log(body);
});
require 'uri'
require 'net/http'

url = URI("https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute")

http = Net::HTTP.new(url.host, url.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE

request = Net::HTTP::Post.new(url)
request["content-type"] = 'application/json'
request["x-api-key"] = 'X-API-Key'

response = http.request(request)
puts response.read_body
var settings = {
  "async": true,
  "crossDomain": true,
  "url": "https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute",
  "method": "POST",
  "headers": {
    "content-type": "application/json",
    "x-api-key": "X-API-Key"
  },
  "processData": false
}

$.ajax(settings).done(function (response) {
  console.log(response);
});
import requests

url = "https://app.logmatic.io/beta/saved_analyses/savedAnalysesId/execute"

headers = {
    'content-type': "application/json",
    'x-api-key': "X-API-Key"
    }

response = requests.request("POST", url, headers=headers)

print(response.text)
{
    "result": {
        "values": [
            {
                "time": 1458187200000,
                "fields": {},
                "metrics": {
                    "m0": 2125
                }
            },
            {
                "time": 1458189000000,
                "fields": {},
                "metrics": {
                    "m0": 1345
                }
            },
          ...
        ]
    }
}
{
    "result": {
        "events": [
            {
                "logmatic": {
                    "date": "2016-03-17T09:52:48.102Z"
                },
                "custom": {
                    "app": {
                        "instance": 0
                    },
                    "res": {
                        "statusCode": 200
                    },
                    "level": "info",
                    "responseTime": 25,
                    "message": "HTTP GET /modules/components/workbench/controller.js",
                    "req": {
                        "headers": {
                            "user-agent": "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36"
                        },
                        "method": "GET",
                        "url": "/modules/components/workbench/controller.js"
                    }
                },
                "syslog": {
                    "severity": 6,
                    "hostname": "analytics-1",
                    "appname": "app",
                    "prival": 134,
                    "facility": 16,
                    "version": 0,
                    "timestamp": "2016-03-17T09:52:48.102Z"
                }
            },
            {
                "logmatic": {
                    "date": "2016-03-17T09:52:45.402Z"
                },
                "custom": {
                    "app": {
                        "instance": 0
                    },
                    "res": {
                        "statusCode": 200
                    },
                    "level": "info",
                    "responseTime": 14,
                    "message": "HTTP GET /modules/components/admin/entries.js",
                    "req": {
                        "headers": {
                            "user-agent": "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36"
                        },
                        "method": "GET",
                        "url": "/modules/components/admin/entries.js"
                    }
                },
                "syslog": {
                    "severity": 6,
                    "hostname": "analytics-1",
                    "appname": "app",
                    "prival": 134,
                    "facility": 16,
                    "version": 0,
                    "timestamp": "2016-03-17T09:52:45.402Z"
                }
            },
          ...
        ]
    }
}

Path Params

savedAnalysesId
string
required

The identifier found in the URL when you load a bookmark (analysis) in your platform

Body Params

time
object
 
time.to
mixed type

Higher bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.from
mixed type

Lower bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.offset
int32

Timezone offset in seconds

time.timezone
string

Timezone eg: UTC+01:00 or Europe/Paris

limit
int32

Maximum number of rows (Only against paginable views)

field
object
 
field.from
int32

Used to paginate (Only on list views or list of events)

format
boolean

Provides results with display names and colors

Headers

Content-Type
string
required
X-API-Key
string
required
 

What is rate limiting?

Rate limiting applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs. You are limited to 50 million events when the time range exceeds 24 hours; under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

API description

For now, you can extract data from Logmatic.io only through a valid Saved Analysis identifier. The base API path to extract any data out of Logmatic.io is https://app.logmatic.io/beta/saved_analyses/<bookmarkId>/execute, called as a POST or a GET.
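For instance, here is a hedged sketch that executes a saved analysis over the last 3 hours; the identifier and the key are placeholders:

curl --request POST \
  --url https://app.logmatic.io/beta/saved_analyses/<savedAnalysesId>/execute \
  --header 'content-type: application/json' \
  --header 'x-api-key: <your_read_api_key>' \
  --data '{"time": {"to": "now", "from": "now - 3h"}, "format": true}'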

Query parameters description

time parameter

The time parameter lets you define the main time range and also your time zone or offset.

The time.from and time.to parameters define the lower and higher bounds of your query. They can take the following values:

  • an ISO-8601 string (eg 2017-09-04T15:27:01+02:00)
  • a unix timestamp (number representing the elapsed millisec since epoch)
  • a date math string aligned with ElasticSearch documentation (eg now, now - 3h, now - 7d, etc...)
{
		"time": {
        "to": "now",
        "from": "now - 3h",
      	"timezone": "Europe/Paris",
      	//"offset": 3600
    }
}
  • time.to (string): Higher bound of the time range.
  • time.from (string): Lower bound of the time range.
  • time.timezone (string, default UTC): The time zone of your time range and of the results provided, if it applies. Here is a comprehensive list.
  • time.offset (integer): If the time zone is not defined, you can provide the offset you want to apply, in seconds.

2 possible cases are implemented:

  • If this is a list view then the returned data will be the raw log events
  • If this is an analytical view (time series, leaderboards, etc...) then the returned data will be aggregated results

Aggregation API

Aggregate a filtered set of data by grouping events by fields and aggregating them by metrics

 
POST https://app.logmatic.io/beta/events/aggregate
curl --request POST \
  --url https://app.logmatic.io/beta/events/aggregate \
  --header 'content-type: application/json' \
  --header 'x-api-key: X-API-Key'
var request = require("request");

var options = { method: 'POST',
  url: 'https://app.logmatic.io/beta/events/aggregate',
  headers: 
   { 'x-api-key': 'X-API-Key',
     'content-type': 'application/json' } };

request(options, function (error, response, body) {
  if (error) throw new Error(error);

  console.log(body);
});
require 'uri'
require 'net/http'

url = URI("https://app.logmatic.io/beta/events/aggregate")

http = Net::HTTP.new(url.host, url.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE

request = Net::HTTP::Post.new(url)
request["content-type"] = 'application/json'
request["x-api-key"] = 'X-API-Key'

response = http.request(request)
puts response.read_body
var settings = {
  "async": true,
  "crossDomain": true,
  "url": "https://app.logmatic.io/beta/events/aggregate",
  "method": "POST",
  "headers": {
    "content-type": "application/json",
    "x-api-key": "X-API-Key"
  },
  "processData": false
}

$.ajax(settings).done(function (response) {
  console.log(response);
});
import requests

url = "https://app.logmatic.io/beta/events/aggregate"

headers = {
    'content-type': "application/json",
    'x-api-key': "X-API-Key"
    }

response = requests.request("POST", url, headers=headers)

print(response.text)
{
    "result": {
        "aggregates": [
            {
                "bucket": {},
                "values": {
                    "cnt": 128930
                }
            }
        ]
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": {
                        "id": "prod-data3",
                        "name": "prod-data3",
                        "color": "#ff7f00"
                    }
                },
                "values": {
                    "cnt": {
                        "value": 33041,
                        "formatted": "33.04k"
                    }
                }
            },
            {
                "bucket": {
                    "host": {
                        "id": "prod-data2",
                        "name": "prod-data2",
                        "color": "#ffff33"
                    }
                },
                "values": {
                    "cnt": {
                        "value": 23206,
                        "formatted": "23.21k"
                    }
                }
            },
          ...
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": "prod-data3",
                    "t": "2016-03-21T14:59:00.000Z"
                },
                "values": {
                    "cnt": 45
                }
            },
            {
                "bucket": {
                    "host": "prod-data3",
                    "t": "2016-03-21T15:00:00.000Z"
                },
                "values": {
                    "cnt": 313
                }
            },
          ...
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": "prod-data5"
                },
                "values": {
                    "cnt": 20575
                }
            },
            {
                "bucket": {
                    "host": "prod-es1"
                },
                "values": {
                    "cnt": 4296
                }
            }
          //...
        ],
      	//Used as an anchor to call the "afterBucket" pagination parameter
        "lastBucket": {
            "host": "prod-mongo1"
        }
    }
}
{
    "result": {
        "aggregates": [
            {
                "bucket": {
                    "host": "analytics-1",
                    "app": "app"
                },
                "values": {
                    "cnt": 12290
                }
            },
            {
                "bucket": {
                    "host": "analytics-1",
                    "app": "sshd"
                },
                "values": {
                    "cnt": 3584
                }
            },
            {
                "bucket": {
                    "host": "analytics-1",
                    "app": "CRON"
                },
                "values": {
                    "cnt": 9
                }
            },
          //...
        ]
    }
}

Body Params

time
object
 
time.to
mixed type
required

Higher bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.from
mixed type
required

Lower bound of the timerange (allowed formats: ISO-8601, EPOCH in ms or date math such as "now - 3h" )

time.offset
int32

Timezone offset

time.timezone
string

Timezone

compute
array
required

Compute up to 5 aggregated metrics. See details below. Example: [{ "metric": { "id": "count", "output": "cnt" } }]

groupBy
array

Group by up to 3 fields or time. See details below. Example: [{ "field": { "id": "syslog_hostname", "total": "_TOTAL_HOSTNAME_", "output": "host" } }, { "time": { "interval": "1m", "output": "t" } }]

filters
array

Search query or equality filters as an array. Example: [{ "search": { "query": "*" } }, { "field": { "id": "field_id", "values": ["value_1", "value_2"], "exclude": false } }]

format
boolean

If true, the response contains formatted strings and colors aligned with what your application would do.

afterBucket
object

Used to paginate if you exceed the max size of buckets returned. Available only if the first groupBy is sorted in natural order. It must be defined as an object giving the groupBy name and the bucket, as follows: {"host": "prod-data4"}

 

Headers

Content-Type
string
required

Default content type; any other content type won't work.

 

Rate limiting

Rate limiting applies when an API accesses raw or aggregated data, such as the Saved Analyses, List events or Aggregation APIs. You are limited to 50 million events when the time range exceeds 24 hours; under a time range of 24 hours, there is no limit on the number of events you can go through.

If you experience any problem with this API, please contact the support team.

API description

The Aggregation API returns the result of up to 5 aggregated metrics and 3 group bys. It offers access to the aggregation capabilities of your Logmatic.io platform.
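As an illustrative sketch built from the parameter examples below, the following call counts events per syslog_hostname over the last 3 hours (the key is a placeholder):

curl --request POST \
  --url https://app.logmatic.io/beta/events/aggregate \
  --header 'content-type: application/json' \
  --header 'x-api-key: <your_read_api_key>' \
  --data '{"time": {"to": "now", "from": "now - 3h"}, "compute": [{"metric": {"id": "count", "output": "cnt"}}], "groupBy": [{"field": {"id": "syslog_hostname", "output": "host"}}]}'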

Query parameters description

We only detail the query parameters that have a higher level of complexity here: time, filters, groupBy and compute.

Parameter: time

The time parameter lets you define the main time range and also your time zone or offset.

The time.from and time.to parameters define the lower and higher bounds of your query. They can take the following values:

  • an ISO-8601 string (eg 2017-09-04T15:27:01+02:00)
  • a unix timestamp (number representing the elapsed millisec since epoch)
  • a date math string aligned with ElasticSearch documentation (eg now, now - 3h, now - 7d, etc...)
{
		"time": {
        "to": "now",
        "from": "now - 3h",
      	"timezone": "Europe/Paris",
      	//"offset": 3600
    }
}
  • time.to (string): Higher bound of the time range.
  • time.from (string): Lower bound of the time range.
  • time.timezone (string, default UTC): The time zone of your time range and of the results provided, if it applies. Here is a comprehensive list.
  • time.offset (integer): If the time zone is not defined, you can provide the offset you want to apply, in seconds.

Parameter: filters

The filters parameter is used to apply the query to a subset of the whole data. It is defined as an array of objects. You can define either field or search filters.

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,

	"filters": [
      {
          //Example with the syslog_hostname field
          "field": {
              "id": "syslog_hostname",
              "values": [
                  "prod-data1",
                  "prod-data2"
              ],
              "exclude": false
          }
      },
      {
          //A query string
          "search": {"query":"a search query"}
      }
  ],

  //"groupBy": [...],
  //"compute": [...]
}

Filter: field

Filter the query by defining a list of values to include or exclude.

  • filters.field.id (string): Field ID.
  • filters.field.values (list of strings): The list of values you want to include or exclude.
  • filters.field.exclude (boolean, default false): True if you want to exclude the defined list of values.

Filter: search

Filter by a search query.

  • filters.search (string): A search query as defined in the Using the search bar article.

Parameter: groupBy

The groupBy parameter is used to split the aggregation over any field defined on the platform, or over time. It is defined as a list of objects: either field or time.

Limits:

  • You can ask up to 3 groupBys at a time
  • You can ask up to 100 values aggregated over each groupBy
  • You can ask up to 1000 values combined over all the groupBys at a time

Pagination:
Pagination over the first groupBy can be enabled by asking a natural order ("@natural": "asc/desc").

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,

  "groupBy": [
    {
      "field": {
        "id": "field_id",
        "limit": xx,
        "output": "out_attr_name_f",
        "total": "_total_",
        "order": {
          "metric_id": "asc/desc"
          //or "@natural": "asc/desc" if you want to enable pagination
        } 
      }
    }, 
    { 
      "time": {
        "interval": "1m",
        "output": xxx,
        "total": "_TOTAL_"
      } 
    }
  ],
  //"compute": [{(...)}] 
}

Group by: field

  • groupBy.field.id (string): Field ID.
  • groupBy.field.limit (integer): Limits the number of values for that field. There is no limit for some pre-defined fields like tagsets; the max is 100 and the product of limits cannot exceed 1000.
  • groupBy.field.output (string): The display name of your field in the output result.
  • groupBy.field.total (string, default null): If you want subtotals to be returned, provide a display name that'll be returned in the response.
  • groupBy.field.order (object, default: descending over the first defined metric): You can ask to order ascending or descending according to a metric (eg {"count": "desc"}) or to the natural order (eg {"@natural": "desc"}). When ordered in natural order, pagination is enabled over the first groupBy.

Group by: time

  • groupBy.time.interval (string, default 1m): Time step for the aggregation.
  • groupBy.time.output (integer): Display name of the time attribute in the result.
  • groupBy.time.total (string, default null): Returns the total over time; provide a display name that'll be returned in the response.

Parameter: compute

The compute parameter is where you define which metrics you want to get aggregated into all the resulting buckets of your groupBys.

You can define up to 5 metrics per query.

{
  //"search": {(...)},
  //"time": {(...)},
  //"field": {(...)},
  //"format": true/false,
  //"groupBy": [{(...)}],

  "compute": [
    {
      "metric": {
        "id": "metric_id",
        "output": "out_attr_name_m"
      }
    }, 
    {
      "timeseries": {
        "metricId": "metric_id",
        "output": "output_attr_name_m2",
        "interval": "1m"
      }
    }
  ]
}

Compute: metric

  • compute.metric.id (string): Metric ID.
  • compute.metric.output (string): The display name of your metric in the output result.

Compute: timeseries

  • compute.timeseries.metricId (string): Metric ID.
  • compute.timeseries.output (string): The display name of the time in the output result.
  • compute.timeseries.interval (string): The interval: 1s for "1 sec", 1m for "1 minute", 1h for "1 hour", etc.