@xuio
Forked from pbock/buergerbot.rb
Last active October 9, 2024 06:12
Bürgerbot: Refreshes the Berlin Bürgeramt page until an appointment becomes available, then notifies you via Telegram
#!/usr/bin/env ruby
require 'watir-webdriver'
require 'telegram/bot'

# Configure the telegram-bot gem with the token from @BotFather and fetch
# any pending updates as a quick sanity check that the token works.
Telegram.bots_config = {
  default: '<Telegram Bot token>',
}
Telegram.bot.get_updates

# A constant rather than a top-level local, so it is visible inside the
# method definitions below (a top-level local would raise a NameError there).
CHAT_ID = '<Telegram Chat ID>'

# Console loggers with a one-character status prefix.
def log(message)     puts "  #{message}" end
def success(message) puts "+ #{message}" end
def fail(message)    puts "- #{message}" end # shadows Kernel#fail, fine here

# Prints the message and also shows a macOS notification via osascript
# (the message becomes the notification title). Notification failures are
# non-fatal, so they are silently ignored.
def notify(message)
  success message.upcase
  system 'osascript -e \'display notification "Bürgerbot" with title "%s"\'' % message
rescue StandardError
  # best-effort only
end

# Loads the appointment calendar and looks for a bookable day. Returns true
# when the user decides to stop searching, false to keep polling.
def appointment_available?(b)
  url = 'https://service.berlin.de/terminvereinbarung/termin/tag.php?termin=1&dienstleister=122282&anliegen[]=120703&herkunft=1'
  puts '-' * 80
  log 'Trying again'
  b.goto url
  log 'Page loaded'
  # Bookable days are rendered as links inside "buchbar" calendar cells.
  link = b.element css: '.calendar-month-table:first-child td.buchbar a'
  if link.exists?
    link.click
    notify 'An appointment is available.'
    # CHAT_ID must be the constant: a local from the top level would not be
    # in scope here, and the resulting NameError would be swallowed by the
    # rescue below, silently skipping the Telegram message.
    Telegram.bot.send_message(chat_id: CHAT_ID, text: 'Found an appointment! Book here: https://service.berlin.de/terminvereinbarung/termin/tag.php?termin=1&dienstleister=122282&anliegen[]=120703&herkunft=1')
    log 'Enter y to keep searching or anything else to quit.'
    gets.chomp.downcase != 'y'
  else
    fail 'No luck this time.'
    false
  end
rescue StandardError => e
  fail 'Error encountered.'
  puts e.inspect
  false
end

# Open a browser and poll once a minute until the user stops the search.
b = Watir::Browser.new
Telegram.bot.send_message(chat_id: CHAT_ID, text: 'Starting bot...')
until appointment_available? b
  log 'Sleeping.'
  sleep 60
end
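
For reference, the chat ID can be discovered with the same gem: send your bot any message in Telegram, then read the chat out of getUpdates. A minimal sketch, assuming get_updates returns the parsed Bot API response as a hash ('ok'/'result'); the helper file name is hypothetical:

#!/usr/bin/env ruby
# find_chat_id.rb (hypothetical helper): prints every chat ID currently
# visible to the bot via getUpdates.
require 'telegram/bot'

Telegram.bots_config = { default: '<Telegram Bot token>' }

Telegram.bot.get_updates['result'].each do |update|
  chat = update.dig('message', 'chat')
  puts "chat_id: #{chat['id']}" if chat
end

Note that the watir-webdriver gem is deprecated; on current setups the watir gem (require 'watir') is the drop-in replacement.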

Ruperr commented Apr 2, 2022

Is there any way to get rid of the captchas when a new appointment appears?


nicbou commented Apr 2, 2022

They are there for a reason. Have a look at their robots.txt.
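
For anyone who wants to check, the policy is one line of Ruby away (a throwaway check, not part of the bot):

require 'net/http'
puts Net::HTTP.get(URI('https://service.berlin.de/robots.txt'))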
