This article walks through building a two-node Elasticsearch cluster with a Kibana dashboard on Linux, using Winlogbeat to ingest live Windows event logs, combined with a PowerShell script that uploads exported Windows event logs (.evtx files) into the ELK stack for easier analysis.
Note: In this guide I deployed every instance on a separate virtual machine, but this can also be done on a single host.
Steps:
Prerequisites:
- Elasticsearch, Kibana, WinLogBeat and the Burnham Forensics evtx upload PowerShell script. These can be downloaded from elastic.co and burnhamforensics.com. For this guide I used version 7.9.3 of Elasticsearch, Kibana and WinLogBeat.
- One or more Linux servers to run the Elasticsearch nodes and the Kibana dashboard on. I used Ubuntu 20.04 LTS for the different nodes.
- At least one Windows computer that functions as the .evtx uploader.
- Access to the Windows computers from which you wish to continuously send live event logs to the ELK system.
Elasticsearch master node setup:
- Open a terminal on the server you want to run the master elasticsearch node on, and run the following command.
tvq@v-elk-master:~/elasticsearch$ ./bin/elasticsearch-certutil cert
- Enter the location where you wish to store the .p12 certificate. I chose config/v-elk.p12
- And lastly enter a password or passphrase to protect this .p12 file
- Inside the config directory, edit the elasticsearch.yml file and add or change the following lines to match your setup:
cluster.name: v-elk
node.name: v-elk-elastic-master
network.host: 192.168.1.105
http.port: 9200
discovery.seed_hosts: ["192.168.1.105","192.168.1.110"]
cluster.initial_master_nodes: ["v-elk-elastic-master"]
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: v-elk.p12
xpack.security.transport.ssl.truststore.path: v-elk.p12
- Notice that I didn’t specify that .p12 password/passphrase in this yml config file. You should avoid storing passwords in plaintext in config files where possible.
- To allow elasticsearch to function, add the password/passphrase for the p12 certificate in the keystore. Do this by running the following commands and entering the correct password each time.
tvq@v-elk-master:~/elasticsearch$ ./bin/elasticsearch-keystore add xpack.security.transport.ssl.keystore.secure_password
tvq@v-elk-master:~/elasticsearch$ ./bin/elasticsearch-keystore add xpack.security.transport.ssl.truststore.secure_password
- We are now ready to start elasticsearch for the first time by running
tvq@v-elk-master:~/elasticsearch$ ./bin/elasticsearch
- It is possible that you run into the same issue I did in my homelab environment: Elasticsearch would not start and logged the following message: "[1]: max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]". This can easily be fixed by running the following command:
tvq@v-elk-master:~/elasticsearch$ sudo sysctl -w vm.max_map_count=262144
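Note that `sysctl -w` only changes the running kernel; the setting is lost on reboot. A minimal way to persist it is to drop the value into a sysctl config file (the file name here is my choice; any name under /etc/sysctl.d works):

```shell
# Persist the setting across reboots (file name is arbitrary):
echo 'vm.max_map_count=262144' | sudo tee /etc/sysctl.d/99-elasticsearch.conf

# Reload all sysctl configuration files and verify the active value:
sudo sysctl --system
sysctl vm.max_map_count
```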
- Once elasticsearch is running, open a new terminal window so we can automatically generate passwords for the built-in users. Do this by running the command
tvq@v-elk-master:~/elasticsearch$ ./bin/elasticsearch-setup-passwords auto
- Save the users and their passwords in a safe place like a password manager. (Note: these passwords can be reset and modified using Kibana later on)
- Congratulations, the master node has been set up and is running.
Elasticsearch additional node setup
The hard work is already done in the master node setup; only a few tweaks are needed.
- Copy the complete config directory from the master node to the additional node’s config directory.
- Edit the config/elasticsearch.yml file, leaving everything the same except for the following lines:
node.name: v-elk-elastic-node1
network.host: 192.168.1.110
- Now start the additional node by running the command
tvq@v-elk-master:~/elasticsearch$ ./bin/elasticsearch
- You’ll see the node join the cluster in the terminal window from the master node.
- Congratulations, the additional node is set up and running. Adding more nodes to the cluster can be done using the same steps.
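To confirm that both nodes really formed one cluster, you can query the cluster health API; the elastic password comes from the setup-passwords step, and the host and port are the values used in this guide (adjust them for your environment):

```shell
# Query cluster health; curl prompts for the elastic user's password.
# A healthy two-node cluster reports "status" : "green" and
# "number_of_nodes" : 2 in the JSON response.
curl -s -u elastic http://192.168.1.105:9200/_cluster/health?pretty \
  | grep -E '"(status|number_of_nodes)"'
```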
Kibana setup
- In the Kibana directory, locate and edit the config/kibana.yml file and change the following lines:
server.port: 5601
server.host: "192.168.1.100"
server.name: "v-elk-kibana"
elasticsearch.hosts: ["http://192.168.1.105:9200","http://192.168.1.110:9200"]
- Create a Kibana keystore and add the credentials for Kibana to connect to Elasticsearch, as well as three encryption keys, which I generated using KeePass.
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore create
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add elasticsearch.username
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add elasticsearch.password
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add xpack.security.encryptionKey
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add xpack.reporting.encryptionKey
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add xpack.encryptedSavedObjects.encryptionKey
- It is recommended to add a TLS certificate to Kibana. Either generate a self-signed certificate or upload one to the server running Kibana.
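If you do not have a certificate at hand, a self-signed one can be generated with OpenSSL. This is a sketch: the CN and the SAN IP below match this guide's Kibana host and should be replaced with your own values.

```shell
# Generate a 4096-bit key and a self-signed certificate valid for one year.
# -nodes leaves the private key unencrypted; either protect it with file
# permissions or drop -nodes and store the passphrase in the Kibana keystore.
openssl req -x509 -newkey rsa:4096 -nodes \
  -keyout kibana-server.key -out kibana-server.crt \
  -days 365 -subj "/CN=v-elk-kibana" \
  -addext "subjectAltName=IP:192.168.1.100"
```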
- If you use a crt and key file in PEM format, add the following lines to the config/kibana.yml file with the correct paths:
server.ssl.certificate: "/path/to/kibana-server.crt"
server.ssl.key: "/path/to/kibana-server.key"
- If the private key is encrypted, add the passphrase to the keystore by running:
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add server.ssl.keyPassphrase
- If your certificate is contained in a PKCS#12 file, add the following line to the config/kibana.yml file with the correct path:
server.ssl.keystore.path: "/path/to/kibana-server.p12"
- If the PKCS#12 file is encrypted, add the passphrase to the keystore by running:
tvq@v-elk-kibana:~/kibana$ ./bin/kibana-keystore add server.ssl.keystore.password
- Lastly, enable TLS for incoming connections by adding the following line to config/kibana.yml:
server.ssl.enabled: true
- Now run kibana by using the command
tvq@v-elk-kibana:~/kibana$ ./bin/kibana
- The Kibana dashboard can now be reached at http://192.168.1.100:5601 or, if you enabled TLS, https://192.168.1.100:5601
- To log in, use the username elastic, as this is the superuser, and its generated password.
WinLogBeat for real-time centralized collection of Windows event logs
In this section, I set up both the continuous log forwarding and the evtx file upload on the same host. You do not need to install the WinLogBeat service and can skip to the section "WinLogBeat uploading exported evtx files" if you are only planning on uploading exported evtx files to ELK for analysis.
- Download and extract the WinLogBeat zip on a Windows computer. Open this folder and edit the winlogbeat.yml file.
setup.kibana:
  host: "192.168.1.100:5601"
output.elasticsearch:
  hosts: ["192.168.1.105:9200","192.168.1.110:9200"]
  username: "${ES_USER}"
  password: "${ES_PWD}"
- Open a PowerShell prompt and run the following commands from the WinLogBeat folder to create a secure keystore and add the username and password used to authenticate to Elasticsearch. This user must be authorized to set up WinLogBeat. I temporarily used the elastic superuser purely for the setup. Do not forget to change this later on in this guide.
PS C:\ELK\winlogbeat> .\winlogbeat.exe keystore create
PS C:\ELK\winlogbeat> .\winlogbeat.exe keystore add ES_USER
PS C:\ELK\winlogbeat> .\winlogbeat.exe keystore add ES_PWD
- Test the winlogbeat.yml configuration file by running:
PS C:\ELK\winlogbeat> .\winlogbeat.exe test config -c .\winlogbeat.yml -e
- You can proceed to the next step if the final line of the output states: "Config OK"
- Setup Kibana with the preconfigured dashboards using the following command:
PS C:\ELK\winlogbeat> .\winlogbeat.exe setup -e
- The last line of the output should be: "Loaded dashboards"
- You are now ready to install WinLogBeat as a Windows service by executing the included PowerShell script "install-service-winlogbeat.ps1". You may need to set the ExecutionPolicy to Unrestricted to be able to run .ps1 scripts.
PS C:\ELK\winlogbeat> .\install-service-winlogbeat.ps1
or
PS C:\ELK\winlogbeat> PowerShell.exe -ExecutionPolicy Unrestricted -File .\install-service-winlogbeat.ps1
- Now edit the keystore to hold the credentials of a user with permission to publish events to Elasticsearch. This can be done using the following commands (--force overrides the previously entered credentials):
PS C:\ELK\winlogbeat> .\winlogbeat.exe keystore add ES_USER --force
PS C:\ELK\winlogbeat> .\winlogbeat.exe keystore add ES_PWD --force
- Before starting the service, if you are using a keystore (and you should!), move the keystore file from C:\ELK\winlogbeat\data to C:\ELK\winlogbeat. Otherwise the service will NOT start, as it looks for the keystore in the root folder and not in the data folder. This may be solved in future releases of WinLogBeat.
- Start the service:
PS C:\ELK\winlogbeat> Start-Service winlogbeat
- You should now be able to see the Windows events for this configured host inside the Kibana dashboards.
- The standard yml config provided inside the WinLogBeat zip file already sends the following event logs to the ELK stack:
- Application events
- System events
- Security events
- Microsoft-Windows-Sysmon/Operational
- Windows PowerShell
- Microsoft-Windows-PowerShell/Operational
- The PowerShell events even allow an analyst to see which PS commands have been run on the host configured with WinLogBeat.
- The zip also contains a reference yml config file. This file highlights the most common options you can set for the WinLogBeat application.
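As an illustration, changing which channels are collected is a matter of editing the winlogbeat.event_logs list in winlogbeat.yml. This is a sketch based on the default config; `ignore_older` is one of the common options documented in the reference file:

```yaml
winlogbeat.event_logs:
  - name: Application
    ignore_older: 72h          # skip events older than 72 hours on first run
  - name: Security
  - name: Microsoft-Windows-Sysmon/Operational
```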
WinLogBeat uploading exported evtx files
This next section mostly uses the work of Burnham Forensics and modifies some parts. It requires you to have gone through the previous section from the beginning up until the installation of the service. The service itself is not necessary for this section, as it uses the WinLogBeat executable directly to ship the event logs to ELK. For a detailed explanation of how this all works, I strongly suggest you visit his blog and read through it.
- In the folder of WinLogBeat, add a file called winlogbeat-evtx.yml with the following contents:
# EVTX ELK Winlogbeat Configuration File | v1.0
# Zachary Burnham 2019 | @zmbf0r3ns1cs
# Tjebbe Van Quickenborne | Updated 2020

# Winlogbeat Shipper Settings
winlogbeat.event_logs:
  - name: ${EVTX_FILE}
    no_more_events: stop
winlogbeat.shutdown_timeout: 60s
winlogbeat.registry_file: evtx-registry.yml

# Allow ELK to see active connections
monitoring.enabled: true

# Add/Drop fields for searching within Kibana
processors:
  - add_fields:
      target: ''
      fields:
        client: ${CLIENT}
        case_number: ${CASE}
        identifier: ${ID}
        log_file: ${FILE}
  - drop_fields:
      fields: ["event.kind", "event.code", "agent.ephemeral_id", "ecs.version"]

# Client Index Creation
setup.ilm.enabled: false
output.elasticsearch.index: '${CASE}-${ELK_CLIENT}-evtx'
setup.template.name: "winlogbeat-7.9.3"
setup.template.pattern: "${CASE}-${ELK_CLIENT}*"

# Output data to Elasticsearch
output.elasticsearch:
  hosts: ["192.168.1.105:9200","192.168.1.110:9200"]
  username: "${ES_USER}"
  password: "${ES_PWD}"
  # pipeline: geoip-evtx
- OPTIONAL: If you want to enrich imported events with geolocation information derived from IP addresses, uncomment the `pipeline: geoip-evtx` line at the bottom of the above file and add the following pipeline to Elasticsearch using the Dev Tools in Kibana.
PUT _ingest/pipeline/geoip-evtx
{
  "processors": [
    {
      "geoip": {
        "if": "ctx.winlog.event_data?.IpAddress != null && (ctx.winlog.event_data.IpAddress.contains('.') || ctx.winlog.event_data.IpAddress.contains(':'))",
        "field": "winlog.event_data.IpAddress",
        "target_field": "winlog.event_data.geo",
        "ignore_missing": true
      }
    }
  ]
}
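Before relying on the pipeline, you can dry-run it against a sample document with the `_simulate` API, also from Dev Tools (the IP address below is just test input):

```
POST _ingest/pipeline/geoip-evtx/_simulate
{
  "docs": [
    {
      "_source": {
        "winlog": {
          "event_data": { "IpAddress": "8.8.8.8" }
        }
      }
    }
  ]
}
```

If the pipeline works, the simulated document in the response should contain a `winlog.event_data.geo` object with the resolved location.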

- In the parent directory of WinLogBeat add the following PowerShell script as a .ps1.
# Welcome Banner
Write-Host "ELK EVTX UPLOAD SCRIPT | v1.0"
Write-Host "Zachary Burnham 2019 | @zmbf0r3ns1cs"
Write-Host "[!] If multiple systems, please ensure all logs are local and grouped by system within nested folders." "`n"

# Backend Maintenance
# Check to see if Winlogbeat Registry File from prior uploads exists
$regFile = Test-Path $pwd\winlogbeat\data\evtx-registry.yml
# If it does exist, remove Winlogbeat Registry File
if($regFile -eq $true){Remove-Item -Path $pwd\winlogbeat\data\evtx-registry.yml -Force}
# Get current date for logging
$date = Get-Date -UFormat "%m-%d"

# Ask for path containing desired logs
Do {
    Write-Host "Enter target directory path containing EVTX logs or folders grouping them by system (i.e. C:\Users\zburnham\EVTX-Logs)."
    $tempPath = Read-Host "Path"
    # Check to see if input path exists
    $checkPath = Test-Path $tempPath
    if($checkPath -eq $false){Write-Host "[!] Directory Path not found. Please check your input and try again." "`n"}
} Until ($checkPath -eq $true)
Write-Host ""

# Adjust target directory path to have proper syntax for Winlogbeat, if needed
$userPath = $tempPath -replace '/','\'

# Check for nested folders
Write-Host "Do you have nested folders labeled by system within this directory? (Default is NO)"
$nested = Read-Host "(y/n)"
Switch ($nested){
    Y {Write-Host ""}
    N {Write-Host ""}
    Default {
        Write-Host ""
        Write-Host "[*] Defaulting to no nested folders..." "`n"
    }
}

# Perform directory check if nested folders exist
if($nested -eq "y"){
    Do {
        # Filter for all folders
        $folders = Get-ChildItem -Path $userPath -Directory
        # Verify Info
        Write-Host "The following folders (systems) were detected:"
        Write-Host ""
        Write-Host $folders
        Write-Host ""
        Write-Host "Is this the data you wish to upload? (Default is NO)"
        $answer = Read-Host "(y/n)"
        Switch ($answer){
            Y {Write-Host ""}
            N {Write-Host ""}
            Default {
                Write-Host ""
                Write-Host "[*] Defaulting to NO..." "`n"
            }
        }
    } Until ($answer -eq "y")
}

# Ask for Client Name
$tempClient = Read-Host "Enter Client Name (i.e. Burnham_Forensics)"
# Replace Spaces, if any, in name for ELK Index Name
$client = $tempClient -replace '\s','_'
# Convert to Lowercase for ELK Index Name
$elkClient = $client.ToLower()
# Ask for Case Number
$case = Read-Host "Enter Case # (i.e. 20-0101)"
if($nested -eq "n"){
    # Ask for Identifier for easier searching in Kibana
    Write-Host ""
    Write-Host "Enter a searchable identifier or note for this evidence upload (i.e. BURNHAM-W10)"
    $ID = Read-Host "Identifier"
}

# Informative Message regarding Index Creation
Write-Host ""
Write-Host "ELK Index: $case-$elkClient-evtx"
Write-Host "[!] If new client, don't forget to add this index for viewing under 'Index Patterns' within Kibana settings." "`n"
Write-Host "[*] Logs for this upload can be found in 'elk-logging' within the root 'ELK-Tools' folder." "`n"

# Nested Folders Code
if($nested -eq "y"){
    # Filter for all folders
    $folders = Get-ChildItem -Path $userPath -Directory
    # Create for loop to cycle through all folders
    foreach($folder in $folders){
        # Define loop vars
        $i = 1
        $ID = $folder
        $foldersPath = $userPath + "\" + $folder
        # Filter for just the .evtx files within selected folder
        $dirs = Get-ChildItem -Path $foldersPath -filter *.evtx
        $dirsCount = $dirs.Count
        # Create for loop to grab all .evtx files within selected folder
        foreach($file in $dirs){
            # Add shiny progress bar
            $percentComplete = ($i / $dirsCount) * 100
            Write-Progress -Activity "$i of $dirsCount EVTX files found within $foldersPath sent to ELK" -Status "Uploading $file..." -PercentComplete $percentComplete
            $filePath = $foldersPath + "\" + $file
            # Execute Winlogbeat w/custom vars
            .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml -E EVTX_FILE="$filePath" -E CLIENT="$client" -E ELK_CLIENT="$elkClient" -E CASE="$case" -E ID="$ID" -E FILE="$file" 2>&1 >> $pwd\elk-logging\winlogbeat_log_${date}.txt
            Sleep 3
            $i++
        }
    }
}

# Single Folder Code
if($nested -eq "n"){
    $i = 1
    # Filter by EVTX extension
    $dirs = Get-ChildItem -Path $userPath -filter *.evtx
    $dirsCount = $dirs.Count
    # Create for loop to grab all .evtx files within selected folder
    foreach($file in $dirs){
        # Add shiny progress bar
        $percentComplete = ($i / $dirsCount) * 100
        Write-Progress -Activity "$i of $dirsCount EVTX files found within $userPath sent to ELK" -Status "Uploading $file..." -PercentComplete $percentComplete
        $filePath = $userPath + "\" + $file
        # Execute Winlogbeat w/custom vars
        .\winlogbeat\winlogbeat.exe -e -c .\winlogbeat\winlogbeat-evtx.yml -E EVTX_FILE="$filePath" -E CLIENT="$client" -E ELK_CLIENT="$elkClient" -E CASE="$case" -E ID="$ID" -E FILE="$file" 2>&1 >> $pwd\elk-logging\winlogbeat_log_${date}.txt
        Sleep 3
        $i++
    }
}

# Show message confirming successful upload
Write-Host "[*] EVTX Upload completed. Use the 'Discover' tab in Kibana to view."
- To make the Burnham Forensics PowerShell script work without having to modify it, use the following folder structure:
- A directory containing the Burnham Forensics ps1 script, a folder called elk-logging and a folder called winlogbeat where the configuration files and executables reside.
- Open PowerShell ISE as Administrator and open the Burnham Forensics PowerShell script.
- Run the PowerShell script and provide the following details:
- The full path of the folder where the evtx file(s) are located
- Does the folder contain nested folders labeled by system? yes or no
- Enter the client name
- Enter the case number
- Enter an identifier to this upload
- This will create a new index called: <case number>-<client name>-evtx
- Add an index pattern like *-<client name>-evtx to be able to view the events of this specific client.
- Congratulations, you can now upload exported .evtx files to ELK.
I do suggest that you take a look at the article by Burnham Forensics for a more detailed look at and breakdown of his work.
Congratulations, you’ve made it to the end of this guide. You should now have a fully functional ELK stack capable of analyzing exported evtx files as part of an incident response and/or collecting real-time centralized Windows event logs for easy correlation and analysis. Play around with Kibana, use the built-in dashboards or create dashboards that meet your specific needs to improve your workflow and help you analyze Windows event logs more efficiently.
If you run into any issues while trying to create a setup like this, or have any questions or comments, don’t hesitate to comment on this article and I’ll try my best to get back to you as soon as I can.
Keep Learning – T.
Sources:
- https://www.elastic.co/guide/en/elasticsearch/reference/current/getting-started.html
- https://www.elastic.co/guide/en/kibana/current/get-started.html
- https://www.elastic.co/guide/en/beats/winlogbeat/current/winlogbeat-installation-configuration.html
- https://burnhamforensics.com/2019/11/19/manually-upload-evtx-log-files-to-elk-with-winlogbeat-and-powershell/