An Introduction to SIEM and Its Implementation Tools

Table Of Contents

1. Types of SIEM Tools & Their Implementation

2. Types of SIEM Tools

Types of SIEM Tools & Their Implementation

SIEM tools are commonly used by security administrators and security incident response professionals. Security Information and Event Management (SIEM) manages the complexity of enterprise IT infrastructure through real-time collection and analysis of security alerts generated from the log data of various security technologies, such as firewalls, critical applications, antivirus systems, wireless access points, servers, routers, and IDS/IPS systems, which together generate enormous numbers of security alerts every day.

All collected information is passed into the management console, which provides an interface for configuring and monitoring the system. There it is reviewed by a data analyst, whose feedback helps train the SIEM's machine-learning models and increases the system's familiarity with its surrounding environment.

SIEM is provided by vendors as software, appliances, or managed services. It takes log and event data from various security systems, networks, and computers and turns it into actionable security insights.
SIEM technology can detect threats that individual security systems cannot see on their own. It also supports the investigation of past security incidents, enabling efficient incident response.

A SIEM system not only identifies that an attack has happened; it can also help explain how and why the attack happened, and where vulnerabilities exist that future attacks might exploit, by generating insight into past attacks and events from the log data every user leaves behind on the network. The system applies a statistical model to analyze log entries, and different types of SIEM tools exist to do this.
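To make the idea of a statistical model over log entries concrete, here is a minimal sketch: count events per host and flag hosts whose event volume deviates from the mean by more than one standard deviation. The hostnames, log format, and one-sigma threshold are invented purely for illustration; real SIEM analytics are far more sophisticated.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical "host action" log lines (illustrative data only).
logs = (["web01 login_ok"] * 3
        + ["web02 login_ok"]
        + ["web03 login_fail"] * 6)

# Count events per host, then flag hosts whose event volume deviates
# from the mean by more than one standard deviation.
counts = Counter(line.split()[0] for line in logs)
mu, sigma = mean(counts.values()), stdev(counts.values())
outliers = [host for host, c in counts.items() if abs(c - mu) > sigma]
print(outliers)  # web03 produced far more events than its peers
```

Even this toy model surfaces web03's burst of failed logins without any per-host rule being written, which is the core appeal of statistical detection.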

SIEM is a standard method for complying with a variety of industry cybersecurity regulations.
SIEM is also a standard method for auditing IT networks and systems, providing transparency over logs and clear insight into activity.

Not all SIEM systems are built the same: a SIEM system that is right for one company may not be right for another.

There are, however, some core features every SIEM system needs:

• Log management
• Compliance reporting
• Threat intelligence
• Fine-tuning of alert conditions
• Dashboards

Types of SIEM Tools

Some of the best SIEM tools offering these core features are as follows:

SPLUNK

Splunk is one of the best-rated SIEM tools. It puts analytics at the heart of its SIEM, aggregating, parsing, and analyzing log data in a distributed system.

It is a distributed system that ingests, processes, and indexes log data, and it has three main components:

Splunk Forwarder

An agent that collects log data and forwards it for indexing and further processing.
The forwarder can be of two types:

Universal forwarder – forwards the raw data without any prior treatment. This is faster and requires fewer resources on the host, but it results in huge quantities of data being sent to the indexer.
Heavy forwarder – performs parsing and indexing at the source, on the host machine, and sends only the parsed events to the indexer.
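The trade-off between the two forwarder types can be sketched in Python. The key=value log format, field names, and parsing logic below are assumptions for illustration, not Splunk's actual wire protocol.

```python
import json

raw_log = ("2024-01-01T00:00:00 host=web01 action=login status=ok\n"
           "2024-01-01T00:00:01 host=web01 action=login status=fail\n")

# Universal-forwarder style: ship the raw bytes untouched (cheap on the
# host, but all parsing work lands on the indexer).
universal_payload = raw_log.encode()

# Heavy-forwarder style: parse key=value pairs on the host and ship only
# structured events (more CPU at the source, less work downstream).
def parse(line):
    timestamp, *pairs = line.split()
    event = dict(pair.split("=", 1) for pair in pairs)
    event["_time"] = timestamp
    return event

events = [parse(line) for line in raw_log.strip().splitlines()]
heavy_payload = json.dumps(events).encode()
print(events[1]["status"])  # fields already extracted at the source
```

The heavy-forwarder path spends CPU at the source so the indexer receives structured events; the universal path defers all of that work downstream.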

Splunk Indexer – used for parsing and indexing the data.

Search Head – a graphical user interface used for searching, analyzing, and reporting.

Splunk has three data-processing stages:

1. Data Input stage
In the first stage, Splunk software consumes the raw data stream from its sources via the Splunk forwarder, breaks it into blocks, and annotates each block with metadata keys, including the hostname, source, and source type of the data.
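The input stage described above can be sketched as follows: chunk a raw stream into blocks and tag each block with hostname, source, and sourcetype metadata. The block size, field names, and sample log lines are illustrative assumptions, not Splunk internals.

```python
# A minimal sketch of the input stage: fixed-size blocks, each annotated
# with the metadata keys the text mentions (host, source, sourcetype).
def ingest(stream, host, source, sourcetype, block_size=64):
    blocks = []
    for i in range(0, len(stream), block_size):
        blocks.append({
            "data": stream[i:i + block_size],
            "host": host,
            "source": source,
            "sourcetype": sourcetype,
        })
    return blocks

stream = "GET /index.html 200\nGET /login 401\n" * 4
blocks = ingest(stream, "web01", "/var/log/access.log", "access_combined")
print(len(blocks), blocks[0]["sourcetype"])
```

Tagging metadata at ingest time means every later stage knows where each block came from without re-reading the source.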

2. Data storage stage
The data storage stage comprises a parsing phase and an indexing phase.
In the parsing phase, Splunk software analyzes the data and extracts the relevant information. This is also known as event processing, and it is during this phase that Splunk breaks the data into individual events.

The parsing phase has several sub-phases:

o Breaking the stream of data into individual lines
o Identifying, parsing, and setting timestamps
o Annotating individual events with metadata copied from source-wide keys
o Transforming event data and metadata according to regex transform rules
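The sub-phases above can be sketched in a few lines of Python: break the stream into lines, extract a timestamp from each, and apply a regex transform rule. The log format and the IP-masking rule are invented for illustration; Splunk's actual transforms are configured, not hard-coded.

```python
import re

stream = ("2024-01-01T10:00:01 10.0.0.5 login ok\n"
          "2024-01-01T10:00:02 10.0.0.9 login fail\n")

TS = re.compile(r"^(\S+)\s+(.*)$")                      # timestamp + rest
MASK_IP = (re.compile(r"\d+\.\d+\.\d+\.\d+"), "x.x.x.x")  # a sample rule

events = []
for line in stream.strip().splitlines():        # break stream into lines
    ts, rest = TS.match(line).groups()          # identify the timestamp
    pattern, repl = MASK_IP
    events.append({"_time": ts,                 # regex transform applied
                   "raw": pattern.sub(repl, rest)})

print(events[1])
```

Each raw line becomes one timestamped event with its transform rules already applied, which is exactly the shape the indexing phase expects.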

The Splunk indexer then indexes the parsed events, storing both the compressed raw data and the corresponding index files.
The benefit of indexing is that the data can be accessed quickly during searching.

3. Data searching stage
This stage controls how the user accesses, views, and uses the indexed data.
As part of the search function, Splunk software stores user-created knowledge objects such as reports, event types, dashboards, alerts, and field extractions.
The search function also manages the search process itself.
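Why indexing makes the search stage fast can be shown with a toy inverted index: map each term to the set of events containing it, then answer a keyword query by set intersection instead of scanning every raw event. The event strings and the `search` helper are invented for illustration.

```python
from collections import defaultdict

events = [
    "user=alice action=login status=fail",
    "user=bob action=login status=ok",
    "user=alice action=logout status=ok",
]

# Build the inverted index once, at indexing time.
index = defaultdict(set)
for i, event in enumerate(events):
    for term in event.split():
        index[term].add(i)

def search(*terms):
    # Intersect the posting sets; only matching events are ever touched.
    hits = set.intersection(*(index[t] for t in terms))
    return [events[i] for i in sorted(hits)]

print(search("user=alice", "status=ok"))
```

With an index, query cost scales with the number of matching events rather than the total volume of ingested data.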

Role of Management Console in Splunk

In Splunk's architecture, the management console acts as a centralized configuration manager: it facilitates updates to deployment clients (forwarders, indexers, and search heads) and distributes configurations, app updates, and the like.

QRADAR

QRadar is one of the most efficient SIEM tools, with very capable event processors. With QRadar, any number of event processors can be run under one console. The console keeps indexes of everything that is going on, so the data can be retrieved quickly. Alongside the event processor there is a counterpart flow processor, which can collect over 3 million flows per minute. It collects and normalizes the data, runs rules against it, and creates indexes so that the data is available quickly.

At the core of the distributed architecture, event processing and flow processing are combined: all events go to the event processor and all network flows (NetFlow data) go to the flow processor.

The event processor and flow processor can be geographically distributed, although keeping them close together is preferable.

In QRadar, log events can be sent to an event collector, which forwards them to the event and flow processors, facilitating a distributed environment.
QRadar is reported to have improved the quality of threat detection for its users by 75%.
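The collector-to-processor split described above can be sketched as simple routing logic: log events go to the event processor, network flows to the flow processor. The record shapes, field names, and `Processor` class are invented for illustration, not QRadar's API.

```python
class Processor:
    """A stand-in for an event or flow processor that just records input."""
    def __init__(self):
        self.received = []

    def process(self, record):
        self.received.append(record)

event_proc, flow_proc = Processor(), Processor()

def collect(record):
    # The collector routes by record type: flows carry network fields,
    # events carry log payloads.
    target = flow_proc if record.get("type") == "flow" else event_proc
    target.process(record)

for record in [
    {"type": "event", "msg": "failed login on web01"},
    {"type": "flow", "src": "10.0.0.5", "dst": "10.0.0.9", "bytes": 4096},
    {"type": "event", "msg": "firewall deny"},
]:
    collect(record)

print(len(event_proc.received), len(flow_proc.received))
```

Routing at the collector is what lets each processor type scale independently and sit in a different location when needed.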

