Splunk Cheat Sheet
Administration
Paths
- All config specs:
ls /opt/splunk/etc/system/README
- Default conf (never edit these files)
ls /opt/splunk/etc/system/default
- Local conf
ls /opt/splunk/etc/system/local
- Merged and running config
var/run/merged/server.conf
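- To see which of these files actually supplies a given setting, btool (covered below) can help; a minimal sketch:
/opt/splunk/bin/splunk btool server list general --debug | grep serverName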
Splunk Configuration
server.conf
- Allow remote login when using the free license
[general]
allowRemoteLogin = always
- Do not show the update information
[applicationsManagement]
allowInternetAccess = false
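- Both settings combined in a local server.conf, a minimal sketch of /opt/splunk/etc/system/local/server.conf (changes to server.conf typically require a restart):
[general]
allowRemoteLogin = always

[applicationsManagement]
allowInternetAccess = false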
inputs.conf
- Set the sourcetype on the forwarder machines; this applies to the universal forwarder
[monitor://path\log\file.txt*]
sourcetype = FileXYflightapi
disabled = 0
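- If the monitored file should land in a dedicated index, the same stanza can carry an index setting; a sketch (the index name flightapi is an assumption):
[monitor://path\log\file.txt*]
sourcetype = FileXYflightapi
# assumption: a pre-existing index named flightapi
index = flightapi
disabled = 0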
indexes.conf
- /opt/splunk/etc/apps/search/local
[security]
homePath = $SPLUNK_DB/security/db
coldPath = $SPLUNK_DB/security/colddb
thawedPath = $SPLUNK_DB/security/thaweddb
maxTotalDataSizeMB = 1024
enableDataIntegrityControl = 0
enableTsidxReduction = 0
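- To add time-based retention on top of the size cap, frozenTimePeriodInSecs can be added to the same stanza; a sketch (30 days is an arbitrary assumption):
[security]
# assumption: roll events to frozen (deleted by default) after 30 days
frozenTimePeriodInSecs = 2592000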
inputs.conf
- Server
[splunktcp://9997]
queueSize = 2MB
disabled = 0
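- The forwarder side points at this port via outputs.conf; a minimal sketch (the group name and indexer host are assumptions):
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997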
btool
- Check Syntax
./splunk btool check
- List server.conf / general
./splunk btool server list general
- or server.conf / sslConfig
./splunk btool server list sslConfig
- See where changes come from
./splunk btool server list general --debug
- Show the script stanza from inputs.conf
./splunk btool inputs list script
- and see where the change comes from
./splunk btool inputs list script --debug
- Or monitor
./splunk btool inputs list monitor
Server Commands
- Show running server.conf
./splunk show config server
- or inputs.conf
./splunk show config inputs
- Set/Show the server name
./splunk set servername splunk##
./splunk show servername
- Set/Show the default host name
./splunk set default-hostname splunk##
./splunk show default-hostname
- Add a test index to the search app
./splunk add index test -app search
- Add a receiving port to the search app
./splunk enable listen 9997 -app search
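- On a forwarder, the matching send target can be added via the CLI; a sketch (the indexer host is a placeholder):
./splunk add forward-server idx1.example.com:9997
./splunk list forward-server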
- Force reload
https://domain.com:8000/debug/refresh
Config Tracker (Splunk9+)
index=_configtracker
index=_configtracker server.conf serverName
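- A rough sketch to see where config changes were recorded (uses only the default source/host fields):
index=_configtracker | stats count by source, host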
Diag
- Diag selections
These switches select which categories of information should be collected. The current components available are: index_files, index_listing, dispatch, etc, log, searchpeers, consensus, conf_replication_summary, suppression_listing, rest, kvstore, file_validate, profiler
- Sample
./splunk diag --collect=index_files,etc
btool
- List all configurations incl. the location
btool check --debug
- List all input stanzas
splunk btool inputs list
/opt/splunk/bin/splunk btool outputs list --debug
/opt/splunk/bin/splunk btool inputs list --debug
/opt/splunk/bin/splunk btool server list --debug
/opt/splunk/bin/splunk btool props list --debug
/opt/splunk/bin/splunk btool indexes list --debug
- Database Dir: /opt/splunk/var/lib/splunk/
Cluster
- Cluster Status, only available from the cluster master
./bin/splunk show cluster-status
./bin/splunk show cluster-status -auth admin:$(</mnt/splunk-secrets/password)
- Including indexes
./bin/splunk show cluster-status --verbose
./bin/splunk list cluster-config
./bin/splunk show cluster-bundle-status
- Maintenance
bin/splunk show maintenance-mode -auth admin:$(</mnt/splunk-secrets/password)
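- Maintenance mode is toggled on the cluster manager; a minimal sketch:
bin/splunk enable maintenance-mode
bin/splunk disable maintenance-mode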
SH-Cluster
- SH Cluster Status
bin/splunk show shcluster-status -auth admin:$(</mnt/splunk-secrets/password)
splunk show shcluster-status --verbose
bin/splunk list shcluster-member-info
- Restart the search head cluster
splunk rolling-restart shcluster-members
- Force App Update
splunk apply shcluster-bundle -target <URI>:<management_port> -auth <username>:<password>
Note: To obtain a URI you may use splunk show shcluster-status
KV Store
- Status
bin/splunk show kvstore-status -auth admin:$(</mnt/splunk-secrets/password)
- Clean
bin/splunk stop && bin/splunk clean kvstore --local -f ; bin/splunk start
- Resync
splunk resync kvstore
SH Bundles
- Put bundles on the deployer, under etc/shcluster
- Sample
[splunk@dep1 splunk]$ mkdir -p etc/shcluster/apps/base-app-demo1/default
[splunk@dep1 splunk]$ echo "#test" > etc/shcluster/apps/base-app-demo1/default/server.conf
- Apply in stages
[splunk@dep1 splunk]$ bin/splunk apply shcluster-bundle -target https://sh1:8089 -action stage
[splunk@dep1 splunk]$ bin/splunk apply shcluster-bundle -target https://sh1:8089 -action send
- Status
[splunk@dep1 splunk]$ bin/splunk list shcluster-bundle -member_uri https://sh3:8089
[splunk@dep1 splunk]$ bin/splunk show bundle-replication-status
[splunk@sh1 splunk]$ bin/splunk show bundle-replication-status -auth admin:$(</mnt/splunk-secrets/password)
Smart Store
- Check Filesystem:
bin/splunk cmd splunkd rfs -- ls --starts-with volume:remote_store
- Check Logs
cat var/log/splunk/splunkd.log | grep S3
Rolling restart
splunk rolling-restart cluster-peers
splunk rolling-restart shcluster-members
HEC
- Sending a test message
curl "https://localhost:8088/services/collector" \ -H "Authorization: Splunk <Auth Token>" \ -d '{"event": "Hello, world!", "sourcetype": "manual"}' -k
Debugging Searches
General
- Search Splunkd Log
index=_internal sourcetype=splunkd
- Status / LogLevel
index=_internal sourcetype=splunkd status_code=*
index=_internal sourcetype=splunkd log_level=ERROR
Apps
- ConfDeployment
index=_internal component=ConfDeployment data.task=*Apps
index=_internal sourcetype=splunkd_conf | top data.task
- Checking SHC Bundle Deployment Status
index=_internal component=ConfDeployment data.task=*Apps | table host data.source data.target_label data.task data.status
- Filter for SRC
index=_internal sourcetype=splunkd_conf data.task=createDeployableApps | rex "\"src\":\"(?<src>[^\"]+)\"" | top _time,src
- Find missing baseline
index=_internal sourcetype=splunkd_conf STOP_ON_MISSING_LOCAL_BASELINE | timechart count by host
- Overall configuration behaviour
index=_internal sourcetype=splunkd_conf pullFrom data.to_repo!=*skipping* | timechart count by data.to_repo
- Evidence of Captain Switching
index=_internal sourcetype=splunkd_conf pullFrom data.from_repo!=*skipping* | timechart count by data.from_repo
- Find the destructive resynch events
index=_internal sourcetype=splunkd_conf installSnapshot | timechart count by host
- App Creation
index=_internal sourcetype=splunkd "Detected app creation"
Mongod startup
index="_internal" MongoDB starting | top host index="_internal" "MongoDB starting" source="/opt/splunk/var/log/splunk/mongod.log"
S3
- SmartStore
index=_internal sourcetype=splunkd S3Client
Searching
Timechart
M=CB PCC=SYDXXX | timechart max(DTM) as CRSMessages span=30s
Sparkline
M=CB | stats sparkline max(DTM) as Messages by PCC
Lookups
- List Lookups by Rest
| rest /services/data/lookup-table-files
Lookups are used to normalize data. To view the contents of a defined lookup:
| inputlookup name
- Sample Lookup Query, Show the top bookings and show the carrier name
M=BOI earliest=-1d latest=now | stats count(AIR) as Amount by AIR | sort 20 -Amount | lookup airlines Code as AIR OUTPUT Hint | rename Hint as Carrier | fields Carrier, Amount
- Sample Lookup Query, Show the top PCCs and show the customer name
M=FAPI CMD=GetFares | top PCC showperc=f | lookup pcc PCC as PCC OUTPUT Owner,CRSName | rename Owner as Customer | fields Customer, count
Extract Json
- Sample
info 2023-05-12 01:14:01: MQTT publish: topic 'zigbee2mqtt/0xa4c138acf7922221', payload '{"battery":100,"illuminance":11,"keep_time":"10","linkquality":14,"occupancy":false,"sensitivity":"high"}'
- Search
sourcetype=zigbee2mqtt zigbee2mqtt/0xa4c1381e3ed015b7 | rex field=_raw "(?<json_field>\{.*\})" | spath input=json_field | table _time occupancy battery tamper
- Sort by time and output the most recent record
sourcetype=zigbee2mqtt zigbee2mqtt/0xa4c1381e3ed015b7 | rex field=_raw "(?<json_field>\{.*\})" | spath input=json_field | sort - _time | head 1 | table _time occupancy battery tamper
Advanced Search Samples
Regex Samples
String to search:
Feb 13 14:07:02 10.0.3.30 Feb 13 14:07:02 mail mimedefang.pl[10780]: MDLOG,s1DD71da017590,mail_in,,,<support@domain.com>,<support@domain.com>,Warning Message
Regex to extract the message id:
explorer mimedefang.pl | rex field=_raw "MDLOG\,(?<MSGID>.*),mail*" | top 100 MSGID,_time | fields _time, MSGID
String to search:
Feb 13 13:59:57 10.0.3.6 Feb 13 13:59:57 neptun vsftpd[8973]: [keytravel] FTP response: Client "194.74.154.185", "226 Transfer complete."
Regex to extract the login:
host="10.0.3.6" ": [*]" FTP | rex field=_raw "(?<Login>\s{1}\[.*\])" | top Login
String to search:
Mar 5 15:07:10 10.0.3.30 Mar 5 15:07:10 mail sm-mta[15042]: s25E727n015042: Milter add: header: X-Spam-Status: Yes, score=21.8 required=5.0 tests=BAYES_99,GEO_MAIL_SEARCH,\n \tHELO_DYNAMIC_IPADDR,HTML_MESSAGE,MIME_HTML_ONLY,RCVD_IN_BL_SPAMCOP_NET,\n\tRCVD_IN_BRBL_LASTEXT,RCVD_IN_PBL,RCVD_IN_PSBL,RCVD_IN_RP_RNBL, \n\tRCVD_IN_SORBS_DUL,RCVD_IN_XBL,RDNS_DYNAMIC,SPF_NEUTRAL,URIBL_DBL_SPAM,\n\tURIBL_WS_SURBL
Regex to extract the message id:
host="10.0.3.30" "X-Spam-Status: Yes" | rex field=_raw "]: (?<MSGID>.*): Milter" | top MSGID
String to search:
M=FEEDEDF OAD=142 TOTFLIGHTFILES=71 TOTALOMAFILES=71 TOTNBRFLIGHTS=4406 TOTNBRALOMAS=6066 TOTKEYS=10614 SIZETOT=8839080 DURATION=13 TTL=432000 INFO=0 Host=VM-XC01 Job=hhh_edf_NL_2018-11-19-1349-1-90-RT.csv Code=HHH-FR-01
Regex to extract the date range (1-90):
M=FEEDEDF | rex field=_raw "Job=hhh_edf_\w+-\d+-\d+-\d+-(?<STR>.*\d*-\d*)-RT" | top STR
Regex to expand date to day, month and year, sample:
DATE=2020-01-01 ....
Regex
rex field=DATE "(?<Year>[^\-]+)\-(?<Month>[^\-]+)\-(?<Day>[^\-]+)"
Then aggregate by
stats sum(...) as Something by Month Year
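Put together, a sketch (the base search and the Amount field are assumptions):
M=BOI | rex field=DATE "(?<Year>[^\-]+)\-(?<Month>[^\-]+)\-(?<Day>[^\-]+)" | stats sum(Amount) as Total by Month Year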
Sample:
Oct 31 12:14:39 192.168.100.1 %ASA-4-106023: Deny tcp src outside:185.176.27.178/46086 dst inside:192.168.100.237/12834 by access-group "static_outside" [0x0, 0x0]
Regex:
host="192.168.100.1" | rex field=_raw "Deny tcp src outside:(?<SRC>[^\/]+).*dst inside:(?<DST>[^\/]+)\/(?<PORT>[^\s+]+)" | top SRC,DST,PORT
Sample:
Jun 3 15:29:32 192.168.100.1 %ASA-6-302013: Built inbound TCP connection 2154199512 for outside:212.19.51.190/64499 (212.19.51.190/64499) to inside:192.168.100.240/443 (146.0.228.21/443)
Regex to get a table of SRC,DST and Port
host="192.168.100.1" Built inbound TCP connection * | rex field=_raw "for outside:(?<SRC>[^\/]+)" | rex field=_raw "to inside:(?<DST>[^\/]+)\/(?<PORT>[^\s+]+)" | top 500 SRC,DST,PORT
Regex to get date
2023-12-17 18:00:00 * | rex "^(?<Year>[^\-]+)\-(?<Month>[^\-]+)\-(?<Day>[^ ]+)\s+(?<Hour>[^\:]+):(?<Minute>[^\:]+):(?<Second>[^ ]+)" | eval dateref=Year + "-" + Month + "-" + Day + " " + Hour + ":" + Minute + ":" + Second | sort - dateref
Lookahead Sample
Record(s) to look ahead and group
Mar 5 15:34:20 10.0.3.30 Mar 5 15:34:20 spamd child[6707]: GSCORE=0 COU=ES ASN=AS12357 IP=77.230.132.146 MFROM=ibe@elegancejewelrydesigns.com MTO=ibe@hitchhiker.com MSGID=s25EYHtn016074 HELO=static-146-132-230-77.ipcom.comunitel.net IPN=1306952850 LAT=40.0000 LON=-4.0000 CTY=0
Mar 5 15:34:24 10.0.3.30 Mar 5 15:34:24 mail sm-mta[16074]: s25EYHtn016074: Milter add: header: X-Spam-Status: Yes, score=23.4 required=3.0 tests=BAYES_99,CK_HELO_GENERIC,\n\tGEO_MAIL_SEARCH,HELO_DYNAMIC_IPADDR,HTML_MESSAGE,MIME_HTML_ONLY,\n\tRAZOR2_CF_RANGE_51_100,RAZOR2_CF_RANGE_E8_51_100,RAZOR2_CHECK,\n\tRCVD_IN_BL_SPAMCOP_NET,RCVD_IN_BRBL_LASTEXT,RCVD_IN_PSBL,RCVD_IN_SORBS_WEB,\n\tRCVD_IN_XBL,SPF_NEUTRAL,URIBL_BLACK,URIBL_DBL_SPAM,URIBL_JP_SURBL,\n\tURIBL_WS_SURBL autolearn=spam version=3.3.1
Query:
earliest=-1d latest=now host="10.0.3.30" "X-Spam-Status: Yes" OR GSCORE | rex field=_raw "]: (?<MSGID>.*): Milter" | transaction MSGID | search "X-Spam-Status: Yes" | top MFROM
Transaction
Transaction sample
TIME>20 | transaction TID | rename TIME as ResponseTime | table _time,TID,host,ResponseTime,DEP,ARR,M,SC,SS,PCC | search PCC=XXX
Append two searches
- Use appendcols
M=FAPI FT=1 STAT=0 USR=USR.PROD | stats count(CMD) as XXX | appendcols [search M=FAPI FT=1 STAT=0 USR NOT XXXX.PROD | stats count(CMD) as OTHER]
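- To compare the two columns, an eval can follow the appendcols; a sketch:
M=FAPI FT=1 STAT=0 USR=USR.PROD | stats count(CMD) as XXX | appendcols [search M=FAPI FT=1 STAT=0 USR NOT XXXX.PROD | stats count(CMD) as OTHER] | eval Share=round(XXX/(XXX+OTHER)*100,2)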
Stats
User Stats
- Source
index="_internal" sourcetype=splunk_web_access
- total number of users:
index="_internal" sourcetype=splunk_web_access | timechart span=1d count(user) as total_users
- distinct number of users:
index="_internal" sourcetype=splunk_web_access | timechart span=1d dc(user) as distinct_users
- count per user:
index="_internal" sourcetype=splunk_web_access | timechart span=1d count as count_user by user
Message Stats
- Messages per day
index=* earliest=-24h@h latest=now | stats count by index
- Volume per Index
index=* | eval size=len(_raw) | eval GB=(size/1024/1024/1024) | stats sum(GB) by index
- Volume per Day
index=* | eval size=len(_raw) | eval GB=(size/1024/1024/1024) | timechart sum(GB) span=1d
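- Alternative based on the license usage log rather than len(_raw); a sketch (the b and idx fields come from license_usage.log on the license manager):
index=_internal source=*license_usage.log type=Usage | eval GB=b/1024/1024/1024 | timechart span=1d sum(GB) as GB by idx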
Searches
- Amount of searches per day
index=_audit action="search" search="*" NOT user="splunk-system-user" savedsearch_name="" NOT search="\'|history*" NOT search="\'typeahead*" | timechart count
Links
- KV Store Renew cert
- http://wiki.intern/index.php/Renew_internal_Splunk_License