diff --git a/README.md b/README.md
index 614187b..8a94980 100644
--- a/README.md
+++ b/README.md
@@ -3,14 +3,15 @@
 ![Alt text](https://github.com/ratibor78/geostat/blob/master/geostat.png?raw=true "Grafana dashboard example")
 
-GeoStat is a Python script for parsing Nginx logs files and getting GEO data from incoming IP's in it. This script convert parsed data in to Json format and send it to InfluxDB database so you can use it to build some nice Grafana dashboards for example. It runs as service by SystemD and parse log in "tailf" style.
+GeoStat is a Python script for parsing Nginx log files and getting GEO data from the incoming IPs in them. The script converts the parsed data into JSON format and sends it to an InfluxDB database, so you can use it, for example, to build some nice Grafana dashboards. It runs as a SystemD service and parses the log in tailf (tail -f) style. It can also be run as a Docker container for an easy start.
 
 # Main Features:
 
-  - Parsing incoming ip's from web server log and convert them in to GEO metrics for the InfluxDB.
-  - Used standard python libs for the maximum compatibility.
+  - Parsing incoming IPs from the web server log and converting them into GEO metrics for InfluxDB.
+  - Using standard Python libs for maximum compatibility.
   - Having an external **settings.ini** for comfortable changing parameters.
+  - Providing a Dockerfile for quickly building a Docker image.
 
-Json format that script send to InfluxDB looks like:
+The JSON format that the script sends to InfluxDB looks like:
 ```
 [
     {
@@ -26,7 +27,7 @@ Json format that script send to InfluxDB looks like:
     }
 ]
 ```
-As you can see there is three tags fields, so you can build dashboards using geohash (with a point on the map) or country code, or build dashboards with variables based on host name tag. A count for any metric equal 1. This script don't parse log file from the begining but parse it line by line after runing. So you can build dashboards using **count** of geohashes or country codes after some time will pass.
+As you can see, there are three tag fields, so you can build dashboards using the geohash (with a point on the map) or the country code, or build dashboards with variables based on the host name tag. The count for any metric equals 1. The script doesn't parse the log file from the beginning; it parses it line by line while running. So you can build dashboards using the **count** of geohashes or country codes after some time has passed.
 
 You can find the example Grafana dashboard in **geomap.json** file or from grafana.com:
 https://grafana.com/dashboards/8342
@@ -41,10 +42,10 @@ GeoStat uses a number of open source libs to work properly:
 Using install.sh script:
 1) Clone the repository.
 2) CD into dir and run **install.sh**, it will ask you to set a properly settings.ini parameters, like Nginx **access.log** path, and InfluxDB settings.
-3) After script will finished you only need to start SystemD service with **systemctl start geostat.service**.
+3) After the script finishes you only need to start the SystemD service with **systemctl start geostat.service**.
 Manually:
-1) Clone the repository, create environment and install requirements
+1) Clone the repository, create a virtual environment and install the requirements
 ```sh
 $ cd geostat
 $ virtualenv venv && source venv/bin/activate
@@ -69,8 +70,18 @@ $ cp ./GeoLite2-City_some-date/GeoLite2-City.mmdb ./
 $ systemctl enable geostat.service
 $ systemctl start geostat.service
 ```
+Using the Docker image:
+1) Build the Docker image from the Dockerfile; inside the geostat repository directory run:
+```
+$ docker build -t some-name/geostat .
+```
+2) After the Docker image is built you can run it using a properly edited **settings.ini** file; you also
+need to forward the Nginx/Apache log file into the container:
+```
+docker run -d --name geostat -v /opt/geostat/settings.ini:/settings.ini -v /var/log/nginx_access.log:/var/log/nginx_access.log some-name/geostat
+```
-After first metrics will go to the InfluxDB you can create nice Grafana dashboards.
+After the first metrics arrive in InfluxDB you can create nice Grafana dashboards.
 
 Have fun !
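
For readers new to the GEO part of the pipeline, here is a minimal sketch of the lookup step the README describes (incoming IP → country code and geohash tags). It is an illustration only, not GeoStat's actual code: it assumes the `geoip2` and `pygeohash` Python packages and the GeoLite2-City.mmdb database downloaded in the manual install steps above.

```python
# Illustrative sketch only -- not GeoStat's actual implementation.
# Assumes the geoip2 and pygeohash packages and a local GeoLite2-City.mmdb.
import geoip2.database
import pygeohash

reader = geoip2.database.Reader('GeoLite2-City.mmdb')

def ip_to_geo_tags(ip):
    """Resolve an IP address to country code and geohash tag values."""
    response = reader.city(ip)
    # location.latitude/longitude can be None for unlocatable IPs;
    # a real script would skip those lines.
    return {
        'country_code': response.country.iso_code,
        'geohash': pygeohash.encode(response.location.latitude,
                                    response.location.longitude),
    }

print(ip_to_geo_tags('8.8.8.8'))  # e.g. {'country_code': 'US', 'geohash': '9q9...'}
```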
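
Similarly, a hedged sketch of how a point in the JSON format shown above can be sent to InfluxDB with the `influxdb` Python client. The tag and field names follow the README's description (geohash, country code, host name, count = 1), but the measurement name and connection settings are placeholders, not values taken from the project:

```python
# Illustrative sketch only -- measurement/tag names and connection
# settings are placeholders, not GeoStat's actual values.
from influxdb import InfluxDBClient

client = InfluxDBClient(host='localhost', port=8086,
                        username='user', password='pass',
                        database='geostat')

points = [
    {
        'measurement': 'geodata',        # placeholder measurement name
        'tags': {
            'geohash': '9q9hvu',         # point on the map
            'country_code': 'US',
            'host': 'example.com',       # host name tag
        },
        'fields': {
            'count': 1,                  # every metric counts as 1
        },
    }
]

client.write_points(points)
```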