So if you are saving data in Elasticsearch and that data contains IP addresses, you might want to consider enriching it with GeoIP. I'll take my router logging data (see part 2) and enrich it with GeoIP data.
First of all, let's adjust the Index Template and add the GeoIP Longitude, Latitude and Location fields.
I'll push the index template using the dev tools in Kibana.
PUT _template/mikrotik-log
{
"index_patterns" : [
"mikrotik-log-*"
],
"settings" : {
"index" : {
"codec" : "best_compression",
"refresh_interval" : "5s",
"number_of_shards" : "1",
"number_of_replicas" : "1"
}
},
"mappings" : {
"numeric_detection" : true,
"dynamic_templates" : [
{
"string_fields" : {
"mapping" : {
"type" : "keyword"
},
"match_mapping_type" : "string",
"match" : "*"
}
}
],
"properties" : {
"date" : {
"type" : "keyword"
},
"ap_ssid" : {
"type" : "keyword"
},
"in_interface" : {
"type" : "keyword"
},
"geoip_dst" : {
"dynamic" : true,
"properties" : {
"ip" : {
"type" : "ip"
},
"latitude" : {
"type" : "float"
},
"location" : {
"type" : "geo_point"
},
"longitude" : {
"type" : "float"
}
}
},
"remote_address" : {
"type" : "ip"
},
"interface" : {
"type" : "keyword"
},
"protocol" : {
"type" : "keyword"
},
"mac_address" : {
"type" : "keyword"
},
"released_ip" : {
"type" : "ip"
},
"@version" : {
"type" : "keyword"
},
"host" : {
"type" : "keyword"
},
"action" : {
"type" : "text"
},
"acquired_ip" : {
"type" : "ip"
},
"signal_strength" : {
"type" : "byte"
},
"disconnect_reason" : {
"type" : "text"
},
"out_interface" : {
"type" : "keyword"
},
"topic1" : {
"type" : "keyword"
},
"topic2" : {
"type" : "keyword"
},
"item" : {
"type" : "text"
},
"chain" : {
"type" : "keyword"
},
"method" : {
"type" : "keyword"
},
"wifi_state" : {
"type" : "keyword"
},
"topic3" : {
"type" : "keyword"
},
"length" : {
"type" : "short"
},
"geoip_src" : {
"dynamic" : true,
"properties" : {
"ip" : {
"type" : "ip"
},
"latitude" : {
"type" : "float"
},
"location" : {
"type" : "geo_point"
},
"longitude" : {
"type" : "float"
}
}
},
"src_port" : {
"type" : "keyword"
},
"@timestamp" : {
"type" : "date"
},
"local_address" : {
"type" : "ip"
},
"address_pool" : {
"type" : "keyword"
},
"dst_port" : {
"type" : "keyword"
},
"local_address_lookup" : {
"type" : "keyword"
},
"time" : {
"type" : "keyword"
},
"link_state" : {
"type" : "keyword"
},
"user" : {
"type" : "keyword"
}
}
},
"aliases" : { }
}
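If you want to double-check that the template was stored as expected, you can fetch it back in the dev tools (same template name as in the PUT above):
GET _template/mikrotik-log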
Next up, adjusting the Logstash pipeline to add the geoip filter:
[archy@elk01 ~]$ sudo vim /etc/logstash/conf.d/mikrotik-log.conf
...
geoip {
source => "dst_address"
target => "geoip_dst"
database => "/etc/logstash/GeoLite2-City.mmdb"
}
geoip {
source => "src_address"
target => "geoip_src"
database => "/etc/logstash/GeoLite2-City.mmdb"
}
...
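Before restarting, it can't hurt to let Logstash validate the configuration first. A quick syntax check looks like this (paths are the defaults for a package install and may differ on your system):
[archy@elk01 ~]$ sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t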
This will take the src_address and dst_address fields, look them up in the GeoIP database and enrich the document with the resulting GeoIP data. To make sure the changes are applied, restart the Logstash service:
[archy@elk01 ~]$ sudo systemctl restart logstash.service
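If the service does not come back up cleanly, the systemd status output and the Logstash log (default log location for a package install) are the first places to look:
[archy@elk01 ~]$ sudo systemctl status logstash.service
[archy@elk01 ~]$ sudo tail -f /var/log/logstash/logstash-plain.log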
Once the pipelines have started up again, you will have to refresh the index patterns in Kibana if they already existed, or create a new one if they did not. Refreshing can be done in Kibana by navigating to 'Management' --> 'Index Patterns' --> 'Index_pattern_name' --> 'Refresh field list'.
Once you are done with all of that, your documents should now contain GeoIP data fields.
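To confirm the enrichment is actually landing in the index, you can run a quick query in the dev tools that only matches documents with a GeoIP location (field names as defined in the mapping above):
GET mikrotik-log-*/_search
{
"size" : 1,
"query" : {
"exists" : { "field" : "geoip_dst.location" }
}
}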