# cat /etc/fail2ban/filter.d/gitlab.conf
# fail2ban filter configuration for gitlab
# Author: Pawel Chmielinski

[Init]
maxlines = 6

[Definition]

# The relevant log file is in /var/log/gitlab/gitlab-rails/production.log
# Note that a single failure can appear in the logs up to 3 times with just one login attempt. Adjust your maxretry accordingly.

## Example fail - login via GUI
#Started GET "/" for 10.0.0.91 at 2016-10-25 00:01:24 +0200
#Processing by RootController#index as HTML
#Completed 401 Unauthorized in 69ms (ActiveRecord: 23.7ms)

## Example fail - clone repo via https
#Started GET "//chmielu/test.git/info/refs?service=git-upload-pack" for 10.0.0.91 at 2016-10-25 00:01:09 +0200
#Processing by Projects::GitHttpController#info_refs as */*
# Parameters: {"service"=>"git-upload-pack", "namespace_id"=>"chmielu", "project_id"=>"test.git"}
#Filter chain halted as :authenticate_user rendered or redirected
#Completed 401 Unauthorized in 50ms (Views: 0.8ms | ActiveRecord: 8.1ms)

failregex = ^Started .* for <HOST> at .*<SKIPLINES>Completed 401 Unauthorized

ignoreregex =
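A minimal matching jail entry could look like the sketch below; the maxretry and bantime values are illustrative assumptions, and since one failed attempt can produce up to 3 log lines, a maxretry of 9 corresponds to roughly 3 real attempts:

# /etc/fail2ban/jail.d/gitlab.conf (sketch, adjust values to your setup)
[gitlab]
enabled  = true
filter   = gitlab
logpath  = /var/log/gitlab/gitlab-rails/production.log
maxretry = 9
bantime  = 3600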
For Gitlab 13 I now use the production_json.log file with the following failregex:
failregex = ^{"method":"POST","path":"\/users\/sign_in",[a-zA-Z:,"]+,"status":0.*"remote_ip":"<HOST>",
failregex = ^{"method":"POST","path":"\/users\/sign_in",[a-zA-Z:,"]+,"status":0.*"remote_ip":"<HOST>",
For me, the regex did not match all relevant lines in production_json.log. The regex
failregex = ^{"method":"POST","path":"\/users\/sign_in".*,"status":0.*,"remote_ip":"<HOST>"
is more general and finds all failed Standard logins for me. If you have a central LDAP login, the regex
failregex = ^{"method":"POST","path":("\/users\/sign_in"|"\/users\/auth\/ldapmain\/callback").*,("status":0|"action":"failure").*,"remote_ip":"<HOST>"
will also catch failed LDAP login attempts.
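Either variant can be checked against the live log with fail2ban-regex before wiring it into a jail, for example (paths as used elsewhere in this thread):

fail2ban-regex /var/log/gitlab/gitlab-rails/production_json.log '^{"method":"POST","path":("\/users\/sign_in"|"\/users\/auth\/ldapmain\/callback").*,("status":0|"action":"failure").*,"remote_ip":"<HOST>"'

The summary at the end shows how many lines matched, which makes it easy to compare the narrow and the more general regex.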
Hm, this does not seem to work for me. I get a lot of "Your GitLab account has been locked due to an excessive number of unsuccessful sign in attempts" emails, but in the production_json.log file I don't see anything matching those regexes.
cat /var/log/gitlab/gitlab-rails/production_json.log | grep "sign_in" | jq '.status' | grep -v 302 | sort | uniq -c
58 200
So, nothing but successful sign-ins: apart from the 302 redirects, only 200 statuses, no failures.
I also do not see any POSTs with sign_in:
cat /var/log/gitlab/gitlab-rails/production_json.log | grep "POST" | grep "sign_in"
I do see this though:
{"method":"POST","path":"/oauth/token","format":"html","controller":"Oauth::TokensController","action":"create","status":400,"time":"2023-02-12T10:36:38.039Z","params":[{"key":"grant_type","value":"password"},{"key":"username","value":"[email protected]"},{"key":"password","value":"[FILTERED]"},{"key":"token","value":{"grant_type":"password","username":"[email protected]","password":"[FILTERED]"}}]
But there is no host value to extract here.
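A quick way to confirm that none of these OAuth entries carry a usable IP is to pull the remote_ip field with jq, in the same style as the pipeline above (the grep pattern here is just an illustration based on the line shown):

grep '"path":"/oauth/token"' /var/log/gitlab/gitlab-rails/production_json.log | jq '.remote_ip'

If the field is absent from the JSON, jq prints null for every matching entry.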
@slhck I have exactly the same issue today ("Your GitLab account has been locked due to an excessive number of unsuccessful sign in attempts" emails), due to OAuth login attempts... API auth errors are in /var/log/gitlab/gitlab-rails/production.log for me WITH the HOST value (I'm currently on Gitlab 15.3.1).
So, I suggest creating the following filter in /etc/fail2ban/filter.d/gitlab-oauth.conf:
# Fail2Ban filter for Gitlab
# Detecting unauthorized access to the Gitlab OAuth API
# typically logged in /var/log/gitlab/gitlab-rails/production.log
[Definition]
failregex = ^Started POST "\/oauth\/token" for <HOST> at
datepattern = ^%%Y-%%m-%%d %%H:%%M:%%S %%z
Of course, don't forget to create a new file in /etc/fail2ban/jail.d, like:
[gitlab-oauth]
enabled = true
filter = gitlab-oauth
logpath = /var/log/gitlab/gitlab-rails/production.log
maxretry = 3
bantime = 3600
And restart fail2ban: service fail2ban restart
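Before relying on the new jail, it can be worth checking that the filter matches at all by running fail2ban-regex against the same log (a sketch, assuming the filter file was saved under the path suggested above):

fail2ban-regex /var/log/gitlab/gitlab-rails/production.log /etc/fail2ban/filter.d/gitlab-oauth.conf

Once the jail is enabled, fail2ban-client status gitlab-oauth shows its current and total failure and ban counts.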
Thanks! I am not sure if this really worked, as the count is zero:
$ fail2ban-client status gitlab
Status for the jail: gitlab
|- Filter
| |- Currently failed: 0
| |- Total failed: 0
| `- File list: /var/log/gitlab/gitlab-rails/production.log
`- Actions
|- Currently banned: 0
|- Total banned: 0
`- Banned IP list
But then again, the attacks seemed to have stopped around 2023-02-12 13:37 (CET).
I've tested so many lines. Unfortunately, there were never any matches... but it only needs to catch lines like these:
- Authentication failure
- invalid_credentials
- Failed Login
How can we solve this? Gitlab 16.10 CE
Strange, as soon as you post something, you find the solution...
For GitLab 11, there is a repository: https://gitlab.com/MiGoller/gitlab-fail2ban-filter