Table of Contents
- Introduction to Offline PCAP Analysis
- Setup and Prerequisites
- Basic Offline Analysis Commands
- Working with Unit42 PCAPs
- Analyzing Specific Malware Families
- Advanced Analysis Techniques
- Batch Processing Multiple PCAPs
- Interpreting and Exporting Results
- Best Practices for Offline Analysis
- Troubleshooting Common Issues
- Conclusion
Introduction to Offline PCAP Analysis
Offline PCAP analysis with Suricata allows you to:
- Forensic Investigation: Analyze historical network traffic after an incident
- Malware Analysis: Study malware behavior in a controlled environment
- Rule Testing: Validate detection rules against known malicious traffic
- Training: Learn threat detection without live network access
- Incident Response: Quickly analyze captured traffic from compromised systems
Why Use Offline Mode?
| Use Case | Benefit |
| --- | --- |
| Forensics | Analyze past incidents without time pressure |
| Training | Practice with real malware traffic safely |
| Rule Development | Test and tune detection rules |
| Performance Testing | Benchmark Suricata with controlled data |
| Research | Study attack patterns and techniques |
Setup and Prerequisites
Required Software
# Install Suricata (if not already installed)
sudo apt install suricata    # Ubuntu/Debian
sudo dnf install suricata    # RHEL/Alma Linux

# Install supporting tools
sudo apt install jq tcpdump tshark unzip wget

# Verify Suricata installation
suricata --build-info
Download Unit42 PCAP Repository
Unit42 provides excellent PCAP samples for training and analysis:
# Clone the repository
git clone https://github.com/PaloAltoNetworks/Unit42-Wireshark-tutorials.git
cd Unit42-Wireshark-tutorials

# List available PCAPs
ls -lh *.zip
Important: All ZIP archives in this repository are password-protected with the password: infected
# Example: Extract Emotet infection PCAP
unzip -P infected emotet-infection-traffic.zip

# Example: Extract multiple archives
for file in *.zip; do
    unzip -P infected "$file"
done

# Create organized directory structure
mkdir -p ~/pcap-analysis/{emotet,trickbot,ursnif,hancitor,redline}
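Before running Suricata against them, it is worth confirming that the extracted files really are readable captures. A quick sketch, assuming capinfos from the Wireshark CLI tools is installed alongside tshark:

# Sanity-check each extracted capture: packet count and last-packet timestamp
for pcap in *.pcap; do
    echo "== $pcap =="
    capinfos -c -e "$pcap"
done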
Basic Offline Analysis Commands
Simple PCAP Analysis
The most basic way to run Suricata in offline mode:
# Basic syntax
suricata -r /path/to/capture.pcap

# Example with a specific config
suricata -c /etc/suricata/suricata.yaml -r sample.pcap

# Specify an output directory (it must already exist)
suricata -r sample.pcap -l /var/log/suricata/offline/
Command Line Options for Offline Mode
# Run with verbose output
suricata -r sample.pcap -v

# Run with a specific rule file only (rule files from the config are not loaded)
suricata -r sample.pcap -S /path/to/custom.rules

# Run with a specific log directory
suricata -r sample.pcap -l /tmp/suricata-output/

# Run without the detection engine (protocol and flow logging only)
suricata -r sample.pcap --disable-detection

# Set a specific HOME_NET
suricata -r sample.pcap --set vars.address-groups.HOME_NET="[192.168.1.0/24]"
Monitoring Progress
# Run with statistics output
suricata -r large-capture.pcap --set stats.enabled=yes -v

# Print rule/engine analysis reports (does not process the traffic)
suricata -r sample.pcap --engine-analysis

# Time the analysis
time suricata -r sample.pcap -l /tmp/output/
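A finished offline run is easiest to verify by looking at what it wrote. With a typical default configuration the log directory contains eve.json (all events), fast.log (one-line alerts), stats.log (engine counters), and suricata.log (engine messages); exact names depend on your suricata.yaml:

# Inspect the output of the timed run above
ls -lh /tmp/output/
tail -n 5 /tmp/output/suricata.log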
Working with Unit42 PCAPs
Example 1: Emotet Infection Analysis
Emotet is a sophisticated malware loader. Let’s analyze a typical infection:
# Navigate to downloaded PCAPs
cd ~/Unit42-Wireshark-tutorials

# Extract Emotet PCAP
unzip -P infected emotet-infection-traffic.zip

# Run Suricata analysis (the log directory must exist first)
mkdir -p /tmp/emotet-analysis/
suricata -r Example-1-2021-01-06-Emotet-infection.pcap \
    -l /tmp/emotet-analysis/ \
    -v

# Quick check of generated alerts
cat /tmp/emotet-analysis/eve.json | jq 'select(.event_type=="alert")' | head -20
Expected Traffic Patterns
Emotet typically generates:
- HTTP POST requests to compromised WordPress sites
- Command and control (C2) traffic over HTTP/HTTPS
- Encoded data in POST request bodies
- Follow-up malware downloads (Trickbot, Qakbot, etc.)
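The follow-up downloads in particular are often easier to spot in Suricata's fileinfo events than in alerts. A minimal sketch against the output directory from the run above (the magic field is only populated when libmagic support is enabled):

# List files transferred over HTTP with their size and detected type
jq -c 'select(.event_type=="fileinfo") |
    {host: .http.hostname, filename: .fileinfo.filename,
     size: .fileinfo.size, magic: .fileinfo.magic}' \
    /tmp/emotet-analysis/eve.json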
Analyzing Results
# Count alerts by signature
cat /tmp/emotet-analysis/eve.json | \
    jq -r 'select(.event_type=="alert") | .alert.signature' | \
    sort | uniq -c | sort -rn

# Extract contacted HTTP hostnames (candidate C2 domains)
cat /tmp/emotet-analysis/eve.json | \
    jq -r 'select(.event_type=="http") | .http.hostname' | \
    sort -u

# Find suspicious HTTP methods
cat /tmp/emotet-analysis/eve.json | \
    jq 'select(.event_type=="http" and .http.http_method=="POST")'
Example 2: Trickbot Infection
Trickbot is banking malware often distributed by Emotet:
# Extract and analyze Trickbot PCAP
unzip -P infected trickbot-infection.zip

suricata -r 2019-09-12-Trickbot-gtag-del111-infection.pcap \
    -l /tmp/trickbot-analysis/ \
    -c /etc/suricata/suricata.yaml

# Analyze Trickbot-specific indicators
cat /tmp/trickbot-analysis/eve.json | \
    jq 'select(.event_type=="alert" and (.alert.signature | contains("Trickbot")))'
Trickbot Traffic Characteristics
# Look for malspam delivery
jq 'select(.event_type=="http" and (.http.hostname // "" | contains("invoice")))' \
    /tmp/trickbot-analysis/eve.json

# Find C2 traffic (typically HTTP GET requests)
jq 'select(.event_type=="http" and .http.http_method=="GET")' \
    /tmp/trickbot-analysis/eve.json | \
    jq -r '.http.url' | head -20

# Identify .png URLs (Trickbot often uses a .png extension for EXE downloads)
jq 'select(.event_type=="http" and (.http.url | contains(".png")))' \
    /tmp/trickbot-analysis/eve.json
Example 3: Ursnif/Gozi Banking Malware
# Extract Ursnif PCAP
unzip -P infected ursnif-infection-traffic.zip

# Run analysis with HOME_NET matching the capture
suricata -r Ursnif-traffic-example-1.pcap \
    -l /tmp/ursnif-analysis/ \
    --set vars.address-groups.HOME_NET="[10.0.0.0/8]"

# Analyze banking malware indicators
cat /tmp/ursnif-analysis/eve.json | \
    jq 'select(.event_type=="alert" and
        (.alert.signature | test("Banking|Ursnif|Gozi")))'
Ursnif Traffic Analysis
# Check for HTTPS C2 communication
jq 'select(.event_type=="tls")' /tmp/ursnif-analysis/eve.json | \
    jq -r '.tls.sni' | sort -u

# Look for suspicious HTTP patterns
jq 'select(.event_type=="http" and .http.status==200)' \
    /tmp/ursnif-analysis/eve.json | \
    jq '{hostname: .http.hostname, url: .http.url, method: .http.http_method}'

# Extract file hashes from fileinfo events
jq 'select(.event_type=="fileinfo")' /tmp/ursnif-analysis/eve.json | \
    jq '{filename: .fileinfo.filename,
         md5: .fileinfo.md5,
         sha256: .fileinfo.sha256}'
Example 4: Hancitor Malware
Hancitor (also known as Chanitor) is a downloader that delivers additional malware:
# Extract and analyze Hancitor traffic
unzip -P infected hancitor-infection.zip

suricata -r Example-1-2021-02-10-Hancitor-infection.pcap \
    -l /tmp/hancitor-analysis/

# Find the initial C2 check-in (first HTTP POST)
jq -c 'select(.event_type=="http" and .http.http_method=="POST")' \
    /tmp/hancitor-analysis/eve.json | head -1

# Identify follow-up malware downloads
jq 'select(.event_type=="http" and
    (.http.http_content_type // "" | contains("application")))' \
    /tmp/hancitor-analysis/eve.json

# Look for Cobalt Strike beacons (common Hancitor payload)
jq 'select(.event_type=="alert" and
    (.alert.signature | contains("Cobalt Strike")))' \
    /tmp/hancitor-analysis/eve.json
Example 5: RedLine Stealer
RedLine Stealer is information-stealing malware:
# Extract RedLine Stealer PCAP
unzip -P infected 2023-07-redline-stealer.zip

# Analyze with focus on data exfiltration
suricata -r 2023-07-redline-stealer.pcap \
    -l /tmp/redline-analysis/ \
    -v

# Look for stolen data transmission
jq 'select(.event_type=="http" and .http.http_method=="POST")' \
    /tmp/redline-analysis/eve.json

# Check for suspicious TLS connections
jq 'select(.event_type=="tls")' /tmp/redline-analysis/eve.json | \
    jq '{ja3: .tls.ja3.hash, issuer: .tls.issuerdn}'

# Find DNS queries to suspicious domains
jq 'select(.event_type=="dns")' /tmp/redline-analysis/eve.json | \
    jq -r '.dns.rrname' | sort -u
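Since a stealer's defining behavior is pushing data out, another useful view is how many bytes the infected host sent to each destination. A rough sketch over Suricata's flow events (flow logging is part of the default eve.json output):

# Rank destinations by bytes sent from the client side of each flow
jq -r 'select(.event_type=="flow") |
    [.dest_ip, .flow.bytes_toserver] | @tsv' \
    /tmp/redline-analysis/eve.json | \
    awk -F'\t' '{sum[$1] += $2} END {for (ip in sum) print sum[ip], ip}' | \
    sort -rn | head -10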
Advanced Analysis Techniques
Using Custom Rule Sets
Create a custom rule file for specific analysis:
# Create custom rules file (each rule on a single line)
cat > /tmp/custom-malware.rules << 'EOF'
alert http any any -> any any (msg:"Suspicious User-Agent"; http.user_agent; content:"MSIE 6.0"; sid:1000001; rev:1;)
alert http any any -> any any (msg:"POST to suspicious path"; http.method; content:"POST"; http.uri; content:"/gate.php"; sid:1000002; rev:1;)
alert tls any any -> any any (msg:"Self-signed certificate detected"; tls.cert_issuer; pcre:"/CN=localhost/i"; sid:1000003; rev:1;)
alert dns any any -> any any (msg:"Suspicious TLD query"; dns.query; content:".tk"; sid:1000004; rev:1;)
EOF

# Run Suricata with custom rules
suricata -r sample.pcap \
    -S /tmp/custom-malware.rules \
    -l /tmp/custom-analysis/
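Before pointing the custom rules at a capture, confirm that they actually parse. A quick sketch using Suricata's test mode:

# Load only the custom rules in test mode; exits non-zero if a rule fails to parse
suricata -T -c /etc/suricata/suricata.yaml -S /tmp/custom-malware.rules -v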
Multi-Pass Analysis
Perform multiple analysis passes with different configurations:
#!/bin/bash
# multi-pass-analysis.sh
PCAP_FILE="$1"
OUTPUT_BASE="/tmp/analysis-$(date +%Y%m%d-%H%M%S)"

# Suricata requires the log directories to exist
mkdir -p "${OUTPUT_BASE}"/{pass1-full,pass2-web,pass3-files}

# Pass 1: Full rule set
echo "Pass 1: Full detection..."
suricata -r "$PCAP_FILE" \
    -l "${OUTPUT_BASE}/pass1-full/" \
    -v

# Pass 2: HTTP/HTTPS focus
echo "Pass 2: Web traffic focus..."
suricata -r "$PCAP_FILE" \
    -l "${OUTPUT_BASE}/pass2-web/" \
    --set app-layer.protocols.http.enabled=yes \
    --set app-layer.protocols.tls.enabled=yes

# Pass 3: File extraction enabled
echo "Pass 3: File extraction..."
suricata -r "$PCAP_FILE" \
    -l "${OUTPUT_BASE}/pass3-files/" \
    --set file-store.enabled=yes

# Generate summary report
echo "Generating summary..."
cat > "${OUTPUT_BASE}/summary.txt" << EOF
Analysis Summary for: $PCAP_FILE
Generated: $(date)

Pass 1 - Full Detection:
  Total Alerts: $(grep -c '"event_type":"alert"' "${OUTPUT_BASE}/pass1-full/eve.json")
  Unique Signatures: $(jq -r 'select(.event_type=="alert") | .alert.signature' \
      "${OUTPUT_BASE}/pass1-full/eve.json" | sort -u | wc -l)

Pass 2 - Web Traffic:
  HTTP Requests: $(grep -c '"event_type":"http"' "${OUTPUT_BASE}/pass2-web/eve.json")
  TLS Sessions: $(grep -c '"event_type":"tls"' "${OUTPUT_BASE}/pass2-web/eve.json")

Pass 3 - File Extraction:
  Files Extracted: $(ls -1 "${OUTPUT_BASE}/pass3-files/files/" 2>/dev/null | wc -l)
EOF

cat "${OUTPUT_BASE}/summary.txt"
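Assuming the script is saved as multi-pass-analysis.sh, a typical run against the Emotet capture extracted earlier looks like this:

chmod +x multi-pass-analysis.sh
./multi-pass-analysis.sh ~/Unit42-Wireshark-tutorials/Example-1-2021-01-06-Emotet-infection.pcap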
Deep Protocol Analysis
# Analyze specific protocols in depth

# DNS analysis
jq 'select(.event_type=="dns")' eve.json | \
    jq '{query: .dns.rrname,
         type: .dns.rrtype,
         answer: .dns.answers,
         rcode: .dns.rcode}' > dns-analysis.json

# HTTP User-Agent analysis
jq 'select(.event_type=="http")' eve.json | \
    jq -r '.http.http_user_agent' | \
    sort | uniq -c | sort -rn > user-agents.txt

# TLS certificate analysis
jq 'select(.event_type=="tls")' eve.json | \
    jq '{subject: .tls.subject,
         issuer: .tls.issuerdn,
         ja3: .tls.ja3.hash,
         sni: .tls.sni}' > tls-certs.json

# File hash extraction
jq 'select(.event_type=="fileinfo")' eve.json | \
    jq '{filename: .fileinfo.filename,
         md5: .fileinfo.md5,
         sha256: .fileinfo.sha256,
         size: .fileinfo.size}' > file-hashes.json
Timeline Analysis
Create a timeline of malicious activity:
#!/bin/bash
# create-timeline.sh
EVE_JSON="$1"

echo "Creating timeline from $EVE_JSON"

jq -r 'select(.event_type=="alert") |
    [.timestamp, .alert.signature, .src_ip, .dest_ip] |
    @tsv' "$EVE_JSON" | \
    sort | \
    column -t -s $'\t' > timeline-alerts.txt

jq -r 'select(.event_type=="http") |
    [.timestamp, .http.http_method, .http.hostname, .http.url] |
    @tsv' "$EVE_JSON" | \
    sort | \
    column -t -s $'\t' > timeline-http.txt

echo "Timeline files created:"
echo "  - timeline-alerts.txt"
echo "  - timeline-http.txt"
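Run the script against any eve.json from a previous analysis. Because every line starts with an ISO 8601 timestamp, the two timelines can also be merged into a single chronological view:

./create-timeline.sh /tmp/emotet-analysis/eve.json

# Merge alert and HTTP timelines into one chronological file
sort timeline-alerts.txt timeline-http.txt > timeline-combined.txt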
Batch Processing Multiple PCAPs
Process All PCAPs in a Directory
#!/bin/bash
# batch-analyze-pcaps.sh
PCAP_DIR="$1"
OUTPUT_BASE="$2"

if [ -z "$PCAP_DIR" ] || [ -z "$OUTPUT_BASE" ]; then
    echo "Usage: $0 <pcap-directory> <output-directory>"
    exit 1
fi

mkdir -p "$OUTPUT_BASE"

# Process each PCAP file
for pcap in "$PCAP_DIR"/*.pcap; do
    filename=$(basename "$pcap" .pcap)
    output_dir="$OUTPUT_BASE/$filename"

    echo "Processing: $pcap"
    echo "Output: $output_dir"

    mkdir -p "$output_dir"

    # Run Suricata
    suricata -r "$pcap" -l "$output_dir" -v

    # Generate quick summary
    alert_count=$(grep -c '"event_type":"alert"' "$output_dir/eve.json" 2>/dev/null)
    echo "$filename: ${alert_count:-0} alerts" >> "$OUTPUT_BASE/summary.txt"
done

echo "Batch processing complete!"
cat "$OUTPUT_BASE/summary.txt"
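For example, to sweep the extracted Unit42 samples in one go (adjust the paths to your own layout):

chmod +x batch-analyze-pcaps.sh
./batch-analyze-pcaps.sh ~/Unit42-Wireshark-tutorials /tmp/batch-results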
Parallel Processing for Large Datasets
#!/bin/bash
# parallel-pcap-analysis.sh
PCAP_DIR="$1"
OUTPUT_BASE="$2"
THREADS="${3:-4}"    # Default to 4 parallel jobs

mkdir -p "$OUTPUT_BASE"

# Function to process a single PCAP
process_pcap() {
    pcap="$1"
    output_base="$2"
    filename=$(basename "$pcap" .pcap)
    output_dir="$output_base/$filename"

    mkdir -p "$output_dir"

    suricata -r "$pcap" -l "$output_dir" 2>&1 | \
        tee "$output_dir/analysis.log"

    # Extract key metrics
    alerts=$(grep -c '"event_type":"alert"' "$output_dir/eve.json" 2>/dev/null)
    echo "$filename,${alerts:-0}" >> "$output_base/results.csv"
}
export -f process_pcap

# Initialize results file
echo "Filename,Alerts" > "$OUTPUT_BASE/results.csv"

# Process in parallel
find "$PCAP_DIR" -name "*.pcap" | \
    xargs -P "$THREADS" -I {} bash -c 'process_pcap "$@"' _ {} "$OUTPUT_BASE"

echo "Parallel processing complete!"
column -t -s, "$OUTPUT_BASE/results.csv"
Automated Report Generation
#!/bin/bash
# generate-analysis-report.sh
EVE_JSON="$1"
REPORT_FILE="${2:-analysis-report.html}"

cat > "$REPORT_FILE" << 'EOF'
<!DOCTYPE html>
<html>
<head>
<title>Suricata PCAP Analysis Report</title>
<style>
    body { font-family: Arial, sans-serif; margin: 20px; }
    table { border-collapse: collapse; width: 100%; margin: 20px 0; }
    th, td { border: 1px solid #ddd; padding: 8px; text-align: left; }
    th { background-color: #4CAF50; color: white; }
    .section { margin: 30px 0; }
    .metric { background: #f0f0f0; padding: 10px; margin: 10px 0; }
</style>
</head>
<body>
<h1>Suricata PCAP Analysis Report</h1>
<div class="section">
<h2>Summary</h2>
EOF

# Add metrics
echo "<div class='metric'>" >> "$REPORT_FILE"
echo "<strong>Total Alerts:</strong> $(grep -c '"event_type":"alert"' "$EVE_JSON")" >> "$REPORT_FILE"
echo "</div>" >> "$REPORT_FILE"

echo "<div class='metric'>" >> "$REPORT_FILE"
echo "<strong>Unique Signatures:</strong> $(jq -r 'select(.event_type=="alert") | .alert.signature' "$EVE_JSON" | sort -u | wc -l)" >> "$REPORT_FILE"
echo "</div>" >> "$REPORT_FILE"

# Add top alerts table
cat >> "$REPORT_FILE" << 'EOF'
</div>
<div class="section">
<h2>Top Alerts</h2>
<table>
<tr><th>Count</th><th>Signature</th></tr>
EOF

jq -r 'select(.event_type=="alert") | .alert.signature' "$EVE_JSON" | \
    sort | uniq -c | sort -rn | head -10 | \
    awk '{print "<tr><td>" $1 "</td><td>" substr($0, index($0,$2)) "</td></tr>"}' >> "$REPORT_FILE"

cat >> "$REPORT_FILE" << 'EOF'
</table>
</div>
</body>
</html>
EOF

echo "Report generated: $REPORT_FILE"
Interpreting and Exporting Results
Understanding Alert Severity
# Group alerts by severity
jq 'select(.event_type=="alert")' eve.json | \
    jq -r '[.alert.severity, .alert.signature] | @tsv' | \
    awk '{count[$1]++} END {for (severity in count)
        print severity "\t" count[severity]}' | \
    sort -n

# High severity alerts only (severity 1 is the most severe)
jq 'select(.event_type=="alert" and .alert.severity <= 2)' eve.json | \
    jq '{signature: .alert.signature,
         category: .alert.category,
         src: .src_ip,
         dst: .dest_ip}'
Creating IOC Lists
# Extract all alerted destination IPs
jq -r 'select(.event_type=="alert") | .dest_ip' eve.json | \
    sort -u > malicious-ips.txt

# Extract contacted domains
jq -r 'select(.event_type=="dns" or .event_type=="http") |
    .dns.rrname // .http.hostname // empty' eve.json | \
    sort -u > malicious-domains.txt

# Extract file hashes
jq -r 'select(.event_type=="fileinfo") | .fileinfo.sha256 // empty' eve.json | \
    sort -u > file-hashes.txt

# Create combined IOC file
cat > iocs.json << EOF
{
  "analysis_date": "$(date -u +%Y-%m-%dT%H:%M:%SZ)",
  "malicious_ips": $(cat malicious-ips.txt | jq -R . | jq -s .),
  "malicious_domains": $(cat malicious-domains.txt | jq -R . | jq -s .),
  "file_hashes": $(cat file-hashes.txt | jq -R . | jq -s .)
}
EOF
# Convert to Splunk-friendly format
jq -c 'select(.event_type=="alert") |
    {
      time: .timestamp,
      source: "suricata",
      sourcetype: "suricata:alert",
      event: .
    }' eve.json > splunk-export.json
# Bulk import format for Elasticsearch
jq -c 'select(.event_type=="alert") |
    {index: {_index: "suricata-alerts", _type: "_doc"}}, .' \
    eve.json > elastic-bulk.json
CSV Export
# Export alerts to CSV
echo "Timestamp,Signature,Severity,Source IP,Dest IP,Category" > alerts.csv
jq -r 'select(.event_type=="alert") |
    [.timestamp, .alert.signature, .alert.severity,
     .src_ip, .dest_ip, .alert.category] |
    @csv' eve.json >> alerts.csv
Statistical Analysis
#!/bin/bash
# statistical-analysis.sh
EVE_JSON="$1"

echo "=== Suricata Analysis Statistics ==="
echo ""

echo "Event Type Distribution:"
jq -r '.event_type' "$EVE_JSON" | sort | uniq -c | sort -rn
echo ""

echo "Alert Category Distribution:"
jq -r 'select(.event_type=="alert") | .alert.category' "$EVE_JSON" | \
    sort | uniq -c | sort -rn
echo ""

echo "Top 10 Source IPs:"
jq -r 'select(.event_type=="alert") | .src_ip' "$EVE_JSON" | \
    sort | uniq -c | sort -rn | head -10
echo ""

echo "Top 10 Destination IPs:"
jq -r 'select(.event_type=="alert") | .dest_ip' "$EVE_JSON" | \
    sort | uniq -c | sort -rn | head -10
echo ""

echo "Protocol Distribution:"
jq -r '.proto' "$EVE_JSON" | sort | uniq -c | sort -rn
Best Practices for Offline Analysis
1. Organize Your Workflow
# Create standard directory structure (brace expansion must stay on one line)
mkdir -p ~/pcap-analysis/{raw-pcaps,results/{alerts,flows,files},reports,scripts}

# Use descriptive naming
# Good: 2024-01-15-emotet-infection-client-192.168.1.100.pcap
# Bad:  capture1.pcap
2. Document Your Analysis
# Create analysis log
cat > analysis-log.md << EOF
# PCAP Analysis Log

## File Information
- Filename: $(basename "$PCAP_FILE")
- Size: $(du -h "$PCAP_FILE" | cut -f1)
- Analysis Date: $(date)

## Analysis Parameters
- Suricata Version: $(suricata -V)
- Rule Version: $(grep "version" /var/lib/suricata/rules/suricata.rules | head -1)
- HOME_NET: [192.168.0.0/16, 10.0.0.0/8]

## Initial Findings
- Total Alerts: [TO BE FILLED]
- High Severity: [TO BE FILLED]
- Malware Family: [TO BE FILLED]

## IOCs Identified
[TO BE FILLED]

## Recommendations
[TO BE FILLED]
EOF
3. Version Control for Rules
# Save rule configuration with the analysis
cp /etc/suricata/suricata.yaml ./analysis-config-$(date +%Y%m%d).yaml
cp /var/lib/suricata/rules/suricata.rules ./rules-$(date +%Y%m%d).rules
4. Validate Results
# Cross-check with other tools
tcpdump -r sample.pcap 'tcp and port 80' | wc -l
tshark -r sample.pcap -Y "http" -T fields -e http.host | sort -u

# Compare with Wireshark statistics
tshark -r sample.pcap -q -z conv,tcp
tshark -r sample.pcap -q -z http,tree
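One concrete cross-check is to compare the HTTP hostnames Suricata logged against what tshark extracts from the same capture; any difference deserves a closer look. A small sketch using the same placeholder file names:

# Hostnames according to Suricata
jq -r 'select(.event_type=="http") | .http.hostname' eve.json | sort -u > suricata-hosts.txt

# Hostnames according to tshark
tshark -r sample.pcap -Y "http.request" -T fields -e http.host | sort -u > tshark-hosts.txt

diff suricata-hosts.txt tshark-hosts.txt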
Troubleshooting Common Issues
Issue 1: No Alerts Generated
# Check that the configuration and rules load cleanly (test mode)
suricata -T -c /etc/suricata/suricata.yaml -v

# Verify HOME_NET matches the traffic
tcpdump -r sample.pcap -n | head -20

# Try with verbose logging
suricata -r sample.pcap -v --set outputs.1.eve-log.enabled=yes
Issue 2: Memory Issues with Large PCAPs
# Split the capture into ~100 MB chunks (chunk, chunk1, chunk2, ...)
tcpdump -r large.pcap -w chunk -C 100

# Process each chunk
for chunk in chunk*; do
    mkdir -p "output-$chunk"
    suricata -r "$chunk" -l "output-$chunk/"
done

# Merge results
cat output-*/eve.json > combined-eve.json
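Keep in mind that splitting a capture cuts flows at the chunk boundaries, so detections that depend on full stream reassembly may be missed near the cuts. If the Wireshark CLI tools are installed, editcap offers a packet-count-based alternative:

# Split into files of 500,000 packets each (editcap ships with Wireshark)
editcap -c 500000 large.pcap chunk.pcap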
Issue 3: Missing Protocol Detection
# Enable specific app-layer protocols
suricata -r sample.pcap \
    --set app-layer.protocols.http.enabled=yes \
    --set app-layer.protocols.dns.enabled=yes \
    --set app-layer.protocols.tls.enabled=yes
Conclusion
Offline PCAP analysis with Suricata is a powerful technique for:
- Incident Response: Quickly analyze captured traffic
- Malware Research: Study attack patterns safely
- Training: Build skills with real-world examples
- Rule Development: Test and validate detection logic
The Unit42 PCAP repository provides excellent training material covering various malware families and attack techniques. Practice with these samples to build your network security analysis skills.
Additional Resources
- Unit42 GitHub: https://github.com/PaloAltoNetworks/Unit42-Wireshark-tutorials
- Suricata Documentation: https://suricata.readthedocs.io/
- Malware Traffic Analysis: https://www.malware-traffic-analysis.net/
- Community Forum: https://forum.suricata.io/
Security Note: Always analyze malicious PCAPs in an isolated environment. These captures contain real malware traffic and should be handled with appropriate precautions.