How to upload custom logs to Splunk + Fortinet logs

  • Tutorial
How much data do we generate with information systems every day? A huge amount! But do we know all the ways to work with that data? Definitely not! In this article we will cover what types of data can be uploaded into Splunk for further operational analysis, and show how to set up loading of Fortinet logs and of logs with a non-standard structure that have to be split into fields manually. Splunk can index data from a variety of sources: logs stored locally on the same machine as the Splunk indexer, as well as logs on remote devices. To collect data from remote machines, a special agent, the Splunk Universal Forwarder, is installed on them and sends the data to the indexer.





Splunk offers many ready-made applications and add-ons with pre-configured parameters for loading a certain type of data; for example, there are add-ons for data generated by Windows, Linux, Cisco, CheckPoint, etc. In total, more than 800 add-ons have been created, and they can be found on the SplunkBase website.

Types of Data Sources


All incoming data can be divided into several groups according to their sources:

Files and directories

Most data comes to Splunk directly from files and directories. You just need to specify the path to the directory you want to collect data from; after that Splunk will monitor it constantly, and as new data appears it will be loaded immediately. Later in this article we will show how this is implemented.
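For reference, the same kind of file monitoring can also be described directly in an inputs.conf file instead of the web interface. A minimal sketch, assuming a made-up path, index, and sourcetype:

# Monitor every file under /var/log/myapp (hypothetical path)
[monitor:///var/log/myapp]
index = main
sourcetype = myapp_log
disabled = 0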

Network events

Splunk can also index data arriving on any network port, for example remote syslog data or data from other applications that transmit over a TCP or UDP port. We will look at this type of data source using Fortinet as an example.
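As an illustration, such a network input can also be defined in inputs.conf. A minimal sketch for receiving syslog over UDP port 514 (the sourcetype and index here are just examples):

# Listen for syslog messages on UDP port 514
[udp://514]
sourcetype = syslog
index = main
connection_host = ip
disabled = 0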

Windows sources

Splunk lets you configure the loading of many different kinds of Windows data, such as the event log, the registry, WMI, Active Directory, and performance monitoring data. We wrote more about loading data from Windows into Splunk in a previous article. (link)
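To give an idea of what such a configuration looks like, here is a minimal inputs.conf sketch for collecting the Windows Security event log; it is normally deployed to a Universal Forwarder on the Windows host, and the index name below is only an example:

# Collect the Windows Security event log
[WinEventLog://Security]
index = wineventlog
disabled = 0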

Other data sources

  • Metrics
  • Scripts
  • Custom data loading modules
  • HTTP Event Collector

Many tools already exist that can load almost any data you have, but even if none of them suits you, you can create your own scripts or modules, which we will cover in one of the following articles.
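For example, the HTTP Event Collector is also configured through inputs.conf: Splunk then accepts events as HTTP POST requests on port 8088, authorized by a token. A minimal sketch, where the input name, token value, index, and sourcetype are placeholders:

# Enable the HTTP Event Collector and define a token
[http]
disabled = 0

[http://my_hec_input]
token = 00000000-0000-0000-0000-000000000000
index = main
sourcetype = hec_json
disabled = 0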

Fortinet


In this section we will look at how to set up loading of Fortinet logs.

1. First, download the Fortinet add-on from the SplunkBase website using this link.

2. Next, install it on your Splunk indexer (Apps - Manage Apps - Install app from file).

3. Then configure data reception on a UDP port. To do this, go to Settings - Data Inputs - UDP - New and specify the port; by default it is port 514.



Select the sourcetype fgt_log, and also select the required index or create a new one.



4. Configure Fortinet itself to send logs over UDP, specifying the same port as in Splunk (a rough CLI sketch is shown after this list).
5. Receive the data and build analytics on it.
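On the FortiGate side, syslog forwarding can be set up from the CLI. The exact commands depend on the FortiOS version, so treat the following only as a rough sketch; the server address is a placeholder for your Splunk indexer:

config log syslogd setting
    set status enable
    set server "192.0.2.10"
    set mode udp
    set port 514
end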





Custom log


By a custom (non-standard) log we mean a log whose source type is unknown to Splunk and which therefore has no predefined rules for parsing it into fields. To get the field values, we will first have to perform a few simple manipulations.

Using such a log as an example, we will show, in addition to parsing, how to load data from directories. There are two scenarios, depending on where the data is located: on the local indexer machine or on a remote machine.

Local machine


If your data is stored on the local Splunk machine, loading it is very easy:
Settings - Add data - Monitor - Files & Directories



Select the necessary directory; if needed, you can specify a whitelist or blacklist.
Then select an index or create a new one, and leave the rest at the defaults.
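The same monitor, including whitelist and blacklist filters, can be expressed in inputs.conf. A minimal sketch with made-up paths and patterns:

# Monitor a directory, but only index files ending in .log and skip .gz archives
[monitor:///opt/myapp/logs]
whitelist = \.log$
blacklist = \.gz$
index = main
disabled = 0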




Remote machine


If the directory with the necessary data is located on a remote machine, the procedure is somewhat different. To collect the data we need an agent (Splunk Universal Forwarder) on the target machine, managed through Forwarder Management on the Splunk indexer, a sendtoindexer application, and an application that tells the agent which directories to monitor.

We explained in detail how to install the agent and configure data collection from a remote machine in a previous article, so we won't repeat it here and will assume you already have that set up.
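For completeness, the sendtoindexer application from that article essentially contains an outputs.conf that points the forwarder at the indexer. A minimal sketch, where the host name is an example and 9997 is the usual receiving port:

# Send all collected data to the Splunk indexer
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = splunk-indexer.example.local:9997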

We will create a special application that is responsible for making the agent send data from the specified directories.



The application is automatically saved in the ..splunk/etc/apps folder; you need to move it to the ..splunk/etc/deployment-apps folder.

In the ..monitor/local folder, you need to place an inputs.conf configuration file in which we specify which folders to forward. In our case we want to monitor the test folder in the root directory (you can read more about inputs.conf files on the official website):

[monitor:///test]
index = test
disabled = 0

Then add the application to the Server Class related to the target machine; how and why to do this was covered in the previous article. As a reminder, this is done via Settings - Forwarder Management.








For the data to be loaded, the index specified in inputs.conf must exist; if it does not, create a new one.
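If you prefer configuration files to the web interface, a new index can also be described in indexes.conf. A minimal sketch for the test index, using the usual default paths:

# Definition of the "test" index
[test]
homePath   = $SPLUNK_DB/test/db
coldPath   = $SPLUNK_DB/test/colddb
thawedPath = $SPLUNK_DB/test/thaweddb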

Data parsing


After loading, we received data that has not been divided into fields. To extract fields, go to the Extract Fields menu (All Fields - Extract New Fields).

You can parse events into fields using the built-in tool, which selects the fields you point it at based on regular expressions, or you can write a regular expression yourself if the result of the automatic extraction does not suit you.

Step 1. Select the field.



Step 2. Choose the extraction method.
We will use regular expressions.



Step 3. Select the values that belong to one field and give it a name.



Step 4. Check whether the field was extracted correctly in other events; if not, add that event to the selected samples.



Step 5. Select the field in all events that have a different structure.



Step 6. Check whether anything extra was extracted, for example in events that do not have such a field.



Then we save the field extraction; from now on, when data with the same sourcetype is loaded, this rule for extracting the field value will be applied to it.

Next, we create all the fields we need, and the data is ready for further analysis.
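Under the hood, the field extractor saves such rules as EXTRACT entries in props.conf, so the same result can be written by hand. A minimal sketch, assuming a made-up sourcetype my_custom_log and events containing fragments like user=alice and status=200:

# Extract "user" and "status" fields from events of sourcetype my_custom_log
[my_custom_log]
EXTRACT-user = user=(?<user>\S+)
EXTRACT-status = status=(?<status>\d+)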



Conclusion


In this article we covered which sources you can load data into Splunk from, showed how to configure receiving data from network ports, and demonstrated how to load and parse a custom log.

We hope you find this information useful.

We are happy to answer your questions and comments on this topic. If you are interested in something specific in this area, or in machine data analysis in general, we are ready to adapt existing solutions to your particular task. You can write about it in the comments or send us a request through the form on our website.

