gpburdell

gpburdell is a pretty cool user. Here are a bunch of searches created by gpburdell

Authentication Anomalies via "area" algorithm

`authentication` 
| search ( action="success" ) 
| eval citycountry=src_city+", "+src_country 
| stats values(citycountry) as CityCountry, dc(citycountry) as loccount, max(src_lat) as maxlat, min(src_lat) as minlat, max(src_long) as maxlong, min(src_long) as minlong by user 
| eval delta_lat = abs(maxlat-minlat) 
| eval delta_long=abs(maxlong-minlong) 
| eval area= delta_lat * delta_long * loccount 
| where area > 1000

purpose:

Use 'area' to identify whether a given person could travel the distance between login events.

requirements:

ES app (or something with a matching macro)

comments:
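
A quick worked example of the heuristic, with made-up coordinates: logins from San Francisco (37.77, -122.42) and London (51.51, -0.13) give delta_lat = 13.74, delta_long = 122.29, and loccount = 2, so area = 13.74 * 122.29 * 2 ≈ 3360, well over the 1000 threshold. Two logins from the same metro area produce deltas near zero, so area stays tiny.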

More than a day between events

<search>
| sort _time
| streamstats current=f global=f window=1 last(_time) as last_ts
| eval time_since_last = _time - last_ts
| fieldformat time_since_last = tostring(time_since_last, "duration")
| where time_since_last > 60*60*24

purpose:

find situations where there is more than a day between two events

requirements:

any events. the only field dependency is _time

comments:

Splunk Server's Time

* | head 1 | eval tnow = now() | fieldformat tnow=strftime(tnow, "%c %Z") | table tnow

purpose:

shows the time according to the splunk server

requirements:

comments:

Time between events

<search>
| sort _time 
| streamstats current=f global=f window=1 last(_time) as last_ts 
| eval time_since_last = _time - last_ts 
| fieldformat time_since_last = tostring(time_since_last, "duration")

purpose:

add a field to each event which is the duration between that event and the previous one

requirements:

any data. the only field requirement in this search is _time

comments:
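
Note that the first event in the result set has no previous event, so time_since_last is null there. If you want a zero instead, a one-line addition (the same null handling the speed searches further down use):

| eval time_since_last=if(isnull(time_since_last), 0, time_since_last)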

Find Rare Processes (windows)

sourcetype=winregistry | rare process_image

purpose:

find rarely seen windows processes. might indicate custom malware.

requirements:

winregistry data

comments:

Detect Account Sharing

… | stats dc(src_ip) as ip_count by user

purpose:

Detect users who log in from multiple IPs / user account sharing

requirements:

Login logs with Username + Source IP field extractions

comments:

  • … - first search for login activity (keywords like logon/login etc.) and review whether the proper login logs are there and the field extractions are working
  • Then do stats to show the distinct count of different source IPs used per user: | stats dc(src_ip) as ip_count by user (an assembled sketch follows below)
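
Putting the two steps together, a minimal end-to-end sketch; the index name, keywords, and the threshold of 5 are assumptions to adapt to your login data:

index=auth (logon OR login) action=success
| stats dc(src_ip) as ip_count values(src_ip) as src_ips by user
| where ip_count > 5
| sort -ip_count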

Detect Clock Skew

| rest /services/server/info
| eval updated_t=round(strptime(updated, "%Y-%m-%dT%H:%M:%S%z"))
| eval delta_t=now()-updated_t
| eval delta=tostring(abs(delta_t), "duration")
| table serverName, updated, updated_t, delta, delta_t

purpose:

Check for server clock skew

requirements:

comments:

If delta is anything other than about 00:00:01 (a small lag is easy to account for when polling a lot of indexers), you may have clock skew.
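
To turn this into an alert, a hedged variant keeps only servers whose clocks are off by more than a small tolerance (the 5-second tolerance is an assumption):

| rest /services/server/info
| eval updated_t=round(strptime(updated, "%Y-%m-%dT%H:%M:%S%z"))
| eval delta_t=now()-updated_t
| where abs(delta_t) > 5
| table serverName, updated, delta_t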

Simple Top 5 Attackers

sourcetype = "juniper:idp" attack* | top limit=5 src_ip

purpose:

Find the top 5 ip addresses that are attempting to attack us.

requirements:

juniper:idp data

comments:

Detect Machines with High Threatscore

index=<replace> | stats count by src_ip dst_ip dst_port protocol | lookup threatscore clientip as dst_ip | sort -threatscore | where threatscore>0

purpose:

Detect machines/applications that are potentially infected and have malware actively running on them. You can even use it to detect fraud, e.g. shopping-site orders coming from bad IPs

requirements:

machine data with external IP's + IP Reputation App

comments:

  • Search the logs: index=
  • Make sure the fields are extracted fine – you can even let this run in realtime – looks cool: | stats count by src_ip dst_ip dst_port protocol
  • Now we enrich the data with | lookup threatscore clientip as dst_ip
  • Now that there is a new field (threatscore), we want to show the IPs with the highest threatscore first by sorting: | sort -threatscore
  • And now we only want to see malicious connections instead of the good ones: | where threatscore>0 (a fraud-oriented variant is sketched below)
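
For the fraud angle mentioned in the purpose (orders coming from bad IPs), the same enrichment can be pointed at the client side instead. A sketch, assuming web access data with a clientip field (index, sourcetype, and uri are placeholders):

index=web sourcetype=access_combined uri="/checkout*"
| stats count by clientip
| lookup threatscore clientip
| where threatscore>0
| sort -threatscore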

Simple Outlier Search

error | stats count by host | eventstats avg(count) as avg stdevp(count) as stdevp | eval ub=avg+2*stdevp, lb=avg-2*stdevp, is_outlier=if(count<lb, 1, if(count>ub, 1, 0)) | where is_outlier=1

purpose:

Find outliers - hosts that have an error count which is greater than two standard deviations away from the mean.

requirements:

hosts with errors. alternatively, you can alter the search (before pipe) to source just about anything else that you'd like to analyze.

comments:
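
A worked example with made-up numbers: if the average error count across hosts is avg=100 with stdevp=20, then ub=140 and lb=60, so a host with 150 errors (or one with only 10) gets flagged as an outlier.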

json spath w/ date

... | spath input=message | where strptime('updated_at', "%Y-%m-%d %H:%M:%S %z") > strptime("2013-08-07 00:00:00", "%Y-%m-%d %H:%M:%S")

purpose:

searches for events which contain a field called "message". That field contains a JSON payload and is expanded via a call to spath. Then a value from the resulting expansion is used to find events that contain a date meeting certain criteria.

requirements:

comments:

geo-location w/ user home base lookup

index=geod
# get some location information
| iplocation clientip
# lookup user details from a lookup table
#  including their home location
| lookup user_home_lu user as user
# calculate the distance between the login location
#  and the user's home location
#  using the haversine app (http://apps.splunk.com/app/936/)
| haversine originField=home_latlon units=mi inputFieldLat=lat inputFieldLon=lon
# limit the list to those where the distance is greater
#  than 500 miles
| where distance > 500
# clean up for reporting purposes
| strcat City ", " Region cs
# report the results
| fields user, cs, distance

purpose:

find users that are logging in from a location which is greater than 500 miles away from the registered home office

requirements:

haversine app; clientip; a lookup table mapping user > home_latlon

comments:

Speed / Distance Login Anomaly

index=geod
| iplocation clientip 
| sort _time 
| strcat lat "," lon latlon 
| streamstats current=f global=f window=1 last(latlon) as last_latlon
| eval last_latlon=if(isnull(last_latlon), latlon, last_latlon)
| streamstats current=f global=f window=1 last(_time) as last_ts
| eval time_since_last = _time - last_ts
| eval time_since_last=if(isnull(time_since_last), 0, time_since_last)
| haversine originField=last_latlon outputField=distance units=mi latlon
| eval speed=if(time_since_last==0, 0, (distance/(time_since_last/60/60)))
| where speed > 500
| strcat speed " MPH" speed
| table user, distance, _time, time_since_last, speed, _raw

purpose:

Find those tuples of events where the speed needed to cover the distance in the time between events is greater than 500 MPH

requirements:

haversine app; clientip field

comments:
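
A quick sanity check of the speed formula: two logins 30 minutes apart (time_since_last = 1800 seconds) from points 600 miles apart give speed = 600 / (1800/60/60) = 600 / 0.5 = 1200 MPH, which trips the 500 MPH threshold.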

XML with spath

index=demo1 sourcetype=xml-log-data | spath input=message | where strptime('message.updated_at', "%Y-%m-%d %H:%M:%S %z") > strptime("2013-08-07 00:00:00", "%Y-%m-%d %H:%M:%S")

purpose:

searches for events which contain a field called "message". That composite field is expanded via a call to spath. Then a value from the resulting expansion is used to find events that contain a date meeting certain criteria.

requirements:

comments:

Auth anomaly basic with haversine

index=geod 
| iplocation clientip 
| sort _time 
| strcat lat "," lon latlon 
| streamstats current=f global=f window=1 last(latlon) as last_latlon
| eval last_latlon=if(isnull(last_latlon), latlon, last_latlon)
| streamstats current=f global=f window=1 last(_time) as last_ts
| eval time_since_last = _time - last_ts
| eval time_since_last=if(isnull(time_since_last), 0, time_since_last)
| haversine originField=last_latlon outputField=distance units=mi latlon
| eval speed=if(time_since_last==0, 0, (distance/(time_since_last/60/60)))
| strcat speed " MPH" speed
| table user, distance, _time, time_since_last, speed, _raw

purpose:

Find the speed needed to cover the distance between the ip-location specified in two different login events

requirements:

haversine app; clientip as the IP address field

comments:

Extract SQL Insert Params

sourcetype=stream:mysql* query="insert into*" | rex "insert into \S* \((?<aaa>[^)]+)\) values \((?<bbb>[^)]+)\)" | rex mode=sed field=bbb "s/\\\\\"//g" | makemv aaa delim="," | makemv bbb delim="," | eval a_kvfield = mvzip(aaa, bbb) | extract jam_kv_extract | timechart span=1s per_second(m_value) by m_name

purpose:

extracts fields from a SQL Insert statement so that the values inserted into the database can be manipulated via splunk searches. In this case, it is used in conjunction with splunk stream & mysql, but should work with any source / database technology.

requirements:

comments:
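
To see what the rex/makemv/mvzip steps produce, here is a self-contained sketch against a single made-up insert statement (the makeresults event is fabricated purely for illustration):

| makeresults
| eval _raw="insert into users (name, age) values (\"bob\", 42)"
| rex field=_raw "insert into \S* \((?<aaa>[^)]+)\) values \((?<bbb>[^)]+)\)"
| makemv aaa delim=","
| makemv bbb delim=","
| eval a_kvfield = mvzip(aaa, bbb)
| table aaa, bbb, a_kvfield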

Unauthorized Foreign Activity

layout=edit | geoip clientip as clientip | table _time clientip client_country | search NOT client_country IN ("Germany", "Austria", "Switzerland")

purpose:

Detect unauthorized admin activity via foreign country

requirements:

Logs with external source IP's

comments:

  • Search for admin activity – for example, in a CMS system on my webpage, search for "layout=edit"
  • Display all the IPs with | table clientip _time
  • Enrich them with the geoip lookup (| geoip clientip)
  • Display all changes with geo information:
    • layout=edit | geoip clientip as clientip | table _time clientip client_country
  • Review them and create a simple whitelist: | search NOT client_country IN ("Germany", "Austria", "Switzerland") (a built-in iplocation variant is sketched below)
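
If the legacy geoip app is not installed, the built-in iplocation command gives the same shape; note that the field names differ (Country instead of client_country):

layout=edit
| iplocation clientip
| table _time clientip Country
| search NOT Country IN ("Germany", "Austria", "Switzerland")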

song puzzle answer

index=music-puzzle sourcetype=test3 | rename song.parts{}.id as a__pid, song.parts{}.part as a__ppt, song.parts{}.seq as a__pseq | eval tuples = mvzip(mvzip(a__pid, a__ppt, "~~"),a__pseq, "~~") | fields - a__* | mvexpand tuples | rex field=tuples "(?<s_p_id>[^~]+)~~(?<s_p_text>[^~]+)~~(?<s_p_seq>[^~]+)" | sort song.name, s_p_seq | eval s_p_text = urldecode(s_p_text) | stats list(s_p_text) by song.name

purpose:

requirements:

comments:

Search Golf - Episode 1

# source the events in chron order (so "start" is before "end")
index=cst sourcetype=mav-golf | reverse 
# add a line number / temp id to the events
| eval lc=1 | accum lc 
# extract a field to make it easier to deal with action
#  not really necessary in this example - could just search for "start" / "end"
| rex field=_raw "ID=\S\s(?<action>\S+)\s" | stats list(action) as action by ID, lc 
# find action=start for each identifier and join that back into each row
| join ID type=left [search index=cst sourcetype=mav-golf | reverse | eval lc=1 | accum lc | rex field=_raw "ID=\S\s(?<action>\S+)\s"  | search action=start | stats first(lc) as open by ID] 
# find action=end for each identifier and join that back into each row
| join ID type=left [search index=cst sourcetype=mav-golf | reverse | eval lc=1 | accum lc | rex field=_raw "ID=\S\s(?<action>\S+)\s"  | search action=end | stats last(lc) as close by ID] 
# lastly, test each event to see if it's own id is between the start and end.
#  if so - count it.
| eval sc = if(lc>open, if(lc<close, 1, 0), 0) 
# And then sum up those events which should be counted.
| stats sum(sc) as num_events by ID

purpose:

Find the number of events within a sequence of events based on a shared identifier. Keywords ("start" and "end") mark the beginning and end of the sequence. The search cannot use the transaction command.

requirements:

Data like the following:

01/01/2014 01:01:00.003 ID=a start blah blah
01/01/2014 01:01:01.003 ID=d more blah blah
01/01/2014 01:01:02.003 ID=a end blah blah
01/01/2014 01:01:03.003 ID=b start blah blah
01/01/2014 01:01:04.003 ID=c start blah blah
01/01/2014 01:01:05.003 ID=y more blah blah
01/01/2014 01:01:05.006 ID=c more blah blah
01/01/2014 01:01:05.033 ID=c more blah blah
01/01/2014 01:01:06.003 ID=c end blah blah
01/01/2014 01:01:06.033 ID=b more blah blah
01/01/2014 01:01:07.003 ID=b end blah blah
01/01/2014 01:01:08.004 ID=c more blah blah
01/01/2014 01:01:09.005 ID=b more blah blah

comments:
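
An alternative sketch that avoids the two joins by letting eventstats compute the open/close line numbers per ID in a single pass (same data assumptions as above):

index=cst sourcetype=mav-golf | reverse
| eval lc=1 | accum lc
| rex field=_raw "ID=\S\s(?<action>\S+)\s"
| eventstats min(eval(if(action=="start", lc, null()))) as open, max(eval(if(action=="end", lc, null()))) as close by ID
| where lc>open AND lc<close
| stats count as num_events by ID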

json mv extraction

...
# clean up some field names for ease of typing later
| rename events{}.code AS a_c, events{}.message AS a_m, events{}.timestamp AS a_ts, events{}.priority as a_p
# combine mv fields together using mvzip (to get tuples as comma-delim'd strings)
| eval b_combined = mvzip(mvzip(mvzip(a_c, a_m), a_ts), a_p)
# get rid of the a_* fields, simply b/c we don't need them clogging up the ui
| fields - a_*
# expand out the combined fields
| mvexpand b_combined
# extract nicely named fields from the results (using the comma from mvzip as the delimiter)
| rex field=b_combined "(?<e_c>[^,]*),(?<e_m>[^,]*),(?<e_ts>[^,]*),(?<e_p>[^,]*)"
# get rid of the combined field b/c we don't need it
| fields - b_*
# urldecode the field that you care about
| eval e_m = urldecode(e_m)

purpose:

requirements:

some JSON data with a pretty specific structure

comments:

Machines with Multiple Services

index=firewalltraffic | stats count by src_ip dst_ip dst_port protocol | stats dc(dst_port) as "Different Ports" by dst_ip

purpose:

Detect machines offering multiple services

requirements:

Firewall Traffic and extracted source/destination IP + SRC_Port/DST_Port

comments:

  • Search the firewall logs: index=
  • Make sure the fields are extracted fine – you can even let this run in realtime – looks cool: | stats count by src_ip dst_ip dst_port protocol
    • You might also use this one to drill down – e.g. filter only on FTP traffic (port 21), SSH traffic, web, SMTP, or filter to show what active directory domain controllers are doing by SRC/DST IP etc.
  • Now we only want to see which IPs offer services on how many different ports: | stats dc(dst_port) as "Different Ports" by dst_ip
  • You can also swap dst_ip for src_ip so you see which host is consuming the most different services
  • You can also filter it down with an additional | where 'Different Ports' > 5 (the assembled search is sketched below)
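
Putting it together with the filter from the last bullet; note the single quotes, which where needs in order to reference a field name containing spaces (the threshold of 5 is an assumption):

index=firewalltraffic
| stats count by src_ip dst_ip dst_port protocol
| stats dc(dst_port) as "Different Ports" by dst_ip
| where 'Different Ports' > 5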

Search to end all errors

index=_internal sourcetype="splunkd" log_level="ERROR" 
| stats sparkline count dc(host) as hosts last(_raw) as last_raw_msg values(sourcetype) as sourcetype last(_time) as last_msg_time first(_time) as first_msg_time values(index) as index by punct 
| eval delta=round((first_msg_time-last_msg_time),2) 
| eval msg_per_sec=round((count/delta),2) 
| convert ctime(last_msg_time) ctime(first_msg_time) 
| table last_raw_msg count hosts sparkline msg_per_sec sourcetype index first_msg_time last_msg_time delta  | sort -count

purpose:

identifies frequently occurring errors in your splunk instance. Long story short: knocking out the top 10 on this list will make your splunk instance very happy

requirements:

comments:

Chart HTTP Status Category % by URL

index=* sourcetype=access* status=* | rex field=bc_uri "/(?<route>[^/]*)/" | rangemap field=status code_100=100-199 code_200=200-299 code_300=300-399 code_400=400-499 code_500=500-599 | rename range as stcat | stats count as sct by route, stcat | eventstats sum(sct) as ttl by route | eval pct = round((sct/ttl)*100, 2)."%" | xyseries route stcat pct

purpose:

creates a table where the rows are URL values, the columns are HTTP status categories and the cells are the percentage for that status / url combination

requirements:

comments:

Chart HTTP Status Category % by URL (using join)

index=* sourcetype=access* status=* | rex field=bc_uri "/(?<route>[^/]*)/" | stats count as scount by route, status_type | join route [search index=* sourcetype=access* status=* | rex field=bc_uri "/(?<route>[^/]*)/" | stats count as ttl by route] | eval pct = round((scount / ttl)*100, 2)."%" | xyseries route status_type pct

purpose:

requirements:

comments:

