SQUID Proxy Automation
This post does not claim to be anything new; it is just an example from real life. I am not an expert in Bash or PHP, so this how-to can most likely be simplified and improved further.
At work I needed to automate updating the blacklist databases and make it easier to edit the SquidGuard ban files. Given the time available and my level of knowledge, this is how I implemented it...
1) Write a script that automatically updates the blacklist databases (update_squidGuard.sh). The script backs up the current blacklist, downloads a fresh copy of the lists, unpacks it, rebuilds the databases and reconfigures Squid. It will run once a week.
#!/bin/sh
echo '__________Creating a backup copy of the blacklist__________'
tar zcf old_blacklists.tgz /etc/squid/blacklists/
echo '============================'
echo 'Success!'
echo '============================'
echo '__________Downloading the fresh database and putting it on top of the old one__________'
/usr/bin/wget -q --cache=off 'http://www.shallalist.de/Downloads/shallalist.tar.gz' -O /etc/squid/updatedb/shallalist.tar.gz
tar zxf /etc/squid/updatedb/shallalist.tar.gz -C /etc/squid/updatedb/
cp -R -f /etc/squid/updatedb/BL/* /etc/squid/blacklists/
rm -R /etc/squid/updatedb/BL/
echo '============================'
echo 'Success!'
echo '============================'
echo '__________Rebuilding the databases and reconfiguring SQUID__________'
/etc/squid/updatedb/rebuild_base.sh
squid -k reconfigure
echo '============================'
echo 'All done!'
echo '============================'
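One easy-to-miss detail: cron will only run the scripts if they are executable, so set the permission bit once. The paths below assume the scripts live in /etc/squid/updatedb/, as in the rest of this article:

chmod +x /etc/squid/updatedb/update_squidGuard.sh /etc/squid/updatedb/rebuild_base.sh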
2) Create a script that rebuilds the SquidGuard databases every 20 minutes (rebuild_base.sh):
#!/bin/sh
# squidGuard runs as the squid user, so it must own the blacklists while rebuilding
chown -R squid:squid /etc/squid/blacklists
# apply the accumulated *.diff files to the squidGuard databases
/usr/local/bin/squidGuard -u /etc/squid/blacklists/*/*.diff
# hand the *.diff files back to apache so the web editor (see below) can keep writing to them
chown -R apache:apache /etc/squid/blacklists/*/*.diff
/usr/sbin/squid -k reconfigure
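For reference, the *.diff files that squidGuard -u consumes are, as far as I know, plain text with one entry per line, prefixed with + to add the entry to the list or - to remove it. A made-up domains.diff might look like this (the domain names are only placeholders):

+ads.example.com
+tracker.example.net
-falsepositive.example.org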
3) Add the scripts to cron:
tux# crontab -u squid -e
0,20,40 * * * * /etc/squid/updatedb/rebuild_base.sh
00 21 * * 7 /etc/squid/updatedb/update_squidGuard.sh
4) Naturally, there are plenty of full-featured systems for remote management and monitoring of Squid. Historically we use SAMS to collect statistics, but for reasons unknown to this day nobody wants to use its block lists. So I wrote my own little file editor in 30 minutes. Since the ban files are processed every 20 minutes, I can easily change the block lists through it.
First, create symlinks to the *.diff files for each of the blocked groups in /var/www/html/:
Example for the list groups bad, good, pron:
domains-bad.diff
domains-good.diff
domains-pron.diff
…
urls-bad.diff
urls-good.diff
urls-pron.diff
…
This is certainly a tedious and thankless task, but once it is done you never have to think about it again (a small loop like the sketch below can create the links in one go).
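Because the symlinks follow a simple naming pattern, a short shell loop can create them all at once. This is only a sketch: it assumes that each group directory under /etc/squid/blacklists contains (or should contain) domains.diff and urls.diff files; adjust the group names to match your setup.

#!/bin/sh
# create /var/www/html/<type>-<group>.diff symlinks for the web editor
for group in bad good pron; do
    for type in domains urls; do
        # make sure the target .diff file exists, then (re)create the symlink
        touch /etc/squid/blacklists/$group/$type.diff
        ln -sf /etc/squid/blacklists/$group/$type.diff /var/www/html/$type-$group.diff
    done
done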
4.1) The simplest way to restrict access to this part of the site is an .htaccess file:
Order deny,allow
Deny from all
Allow from 192.168.0.1
Allow from 192.168.0.2
Allow from 192.168.0.3
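Keep in mind that the .htaccess file only takes effect if Apache allows overrides for that directory. A minimal sketch, assuming the default /var/www/html document root and the Apache 2.2-style syntax used above:

<Directory "/var/www/html">
    AllowOverride Limit
</Directory>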
4.2) An example index.html: the page is simply titled "Editor" and has one row per block list, for example "DOMAINS-BAD block list" and "URLS-BAD block list", each with a link for editing that list.
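A minimal sketch of such a page; it assumes the edit links open lists.php (section 4.3) with an action parameter that names the *.diff symlink to edit:

<html>
<head><title>Editor</title></head>
<body>
<!-- one row per block list; each action value must match a symlink name from above -->
DOMAINS-BAD block list: <a href="lists.php?action=domains-bad">edit</a><br>
URLS-BAD block list: <a href="lists.php?action=urls-bad">edit</a><br>
</body>
</html>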
4.3) The file that displays a block list for editing, lists.php:
<?php
header('Content-Type: text/html; charset=UTF-8');
$var = "domains";                    // which list to show; "domains" by default
if (isset($_GET['action']))
{
    $var = $_GET['action'];          // e.g. lists.php?action=urls-bad
}
?>
<b>BL EDITOR:</b><br>
Block list (<?php echo $var; ?>):<br>
<!-- the form sends the edited list to update.php; the field names must match what update.php reads -->
<form action="update.php" method="get">
<input type="hidden" name="action" value="<?php echo $var; ?>">
<textarea name="<?php echo $var; ?>" rows="25" cols="60"><?php echo file_get_contents("$var.diff"); ?></textarea><br>
<input type="submit" value="Save">
</form>
<a href="index.html">Home</a>
4.4) When a block list has been edited, update.php is called; it writes the changes back to the corresponding .diff file.
<?php
header('Content-Type: text/html; charset=UTF-8');
$var = "domains";                    // which list to save; "domains" by default
if (isset($_GET['action']))
{
    $var = $_GET['action'];
}
// Write the new data to the list file,
// e.g. the domain list domains.diff
$upd = $_GET[$var];                  // the edited list contents submitted by lists.php
$upd = str_replace("\r", '', $upd);  // drop CR characters from Windows line endings
$fd = "$var.diff";
$fdomain = fopen($fd, "w+");
fwrite($fdomain, $upd);
fclose($fdomain);
echo "<b>All OK!</b><br>";
echo '<a href="index.html">Home</a>';
?>
That's all; now you can test the result. I hope this experience, or parts of it, will be useful to someone. Good luck.
PS: Criticism is welcome.