[Repost] Build Squid with SSL Bump and ICAP Client

Original document: http://docs.diladele.com/administrator_guide_3_4/installation_and_removal/filtering_https.html

The address above has been updated; use this one instead: https://docs.diladele.com/howtos/build_squid_ubuntu16/index.html

Squid's configuration feels like a mess: parameters and configuration methods differ from version to version. To test the HTTPS + ICAP functionality I followed this guide for the 3.3.8 release, although it is still missing a few parameters. If you do not set up any ACLs, my configuration should work for you. On top of the original configuration I added:

ssl_bump server-first all
sslproxy_cert_error allow all
sslproxy_flags DONT_VERIFY_PEER

These are the three directives I added.
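
For readers unfamiliar with these directives, here is a brief commented sketch of what each one does (the annotations are mine, not part of the original configuration; note that ssl_bump only takes effect when the http_port line also carries the ssl-bump option):

# Decrypt all CONNECT tunnels, contacting the origin server first
ssl_bump server-first all
# Do not reject connections whose upstream certificate fails validation
sslproxy_cert_error allow all
# Skip verification of upstream server certificates altogether
sslproxy_flags DONT_VERIFY_PEER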

My configuration file: https://files.cnblogs.com/files/bonjov1/Ubuntusquid3.3.8.rar

This article will tell you how to compile, set up, and configure a Squid proxy capable of filtering encrypted HTTPS connections using the Diladele Web Safety ICAP content filtering server. Being able to look into HTTPS contents greatly increases your ability to control what is allowed and accepted within your network while keeping inappropriate content away.

Why Do We Need to Filter HTTPS

The HTTPS protocol was designed to provide a secure means of communication between an internet browser and remote web servers. To achieve this goal, HTTPS encrypts the data passing through an established connection so that it cannot be decrypted in a reasonable amount of time, preventing anyone from sniffing the contents exchanged over that connection. The protocol was primarily invented to enable safe and secure communication between users and financial sites or government institutions over an insecure medium such as the Internet.

Recently, more and more web sites have started to use HTTPS-encrypted communications to increase the online privacy of users. Google, which was the first to enable HTTPS by default for all its searches, probably initiated this trend. Although there is no doubt that HTTPS encryption is a good thing for safety on the wire, we must take into account that it also creates several problems for controlled networks typically found at home or in offices. The main problem here is the essence of the HTTPS protocol itself: no one except the browser and the web server is able to see, and thus filter, the transferred data. This may not always be desired. Content that is usually blocked suddenly becomes accessible to anyone. As an example, imagine a school network where minors can see questionable content just by mistyping a search term in Google. Moreover, the law often forces administrators in educational institutions to block access to such content (e.g. CIPA for educational environments), and encrypted access to web sites makes it nearly impossible to fulfill such an obligation.

To overcome these limitations, it is advised to set up HTTPS filtering of web content with the help of the SSL Bump feature of the Squid proxy server and the Diladele Web Safety web filter.

How It Works

In order to filter web requests, the user's browser needs to be explicitly directed to use the proxy deployed in the same network. It is also possible to set up a transparent proxy, but we are not going to explain how in this tutorial because the steps involved are quite different from an explicit proxy setup.

When a user tries to navigate to a web site, the browser sends the request to the proxy server, asking it to get the requested page on the user's behalf. The proxy establishes a new connection to the remote site and returns the response to the browser. If plain HTTP is used, the proxy is able to see the original contents of the response and filter them. With HTTPS the flow of data is a little different. The browser asks the proxy to establish a virtual tunnel between itself and the remote server and then sends encrypted data through the proxy. The domain name to which a virtual tunnel is being established is usually known, so the proxy is able to block the tunnel when it finds out that the domain belongs to a prohibited category. Unfortunately, this is not a complete solution, as there are a lot of sites on the Internet that are general in nature (like Google or YouTube) but let you easily navigate to something undesired.
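
As an illustration of this tunnel-level blocking, even a non-bumping Squid can deny CONNECT requests by destination domain. A minimal sketch with made-up domain names (the CONNECT acl is already defined in the default squid.conf):

# Deny HTTPS tunnels to the listed domains, based on the domain name only
acl blocked_https dstdomain .blocked.example .gambling.example
acl CONNECT method CONNECT
http_access deny CONNECT blocked_https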

To improve the quality of web filtering and get access to the contents of encrypted connections, browsers in the network may be set up to trust the proxy to act on their behalf when establishing HTTPS connections, filtering them, and passing the allowed data to clients while blocking everything that is not allowed. Although this assumption is too strict to be implemented in public networks, it is easily doable in controlled home, educational or corporate environments where administrators act as sole owners of the network devices and may enforce any trust rules. Once trust is established, the browser is able to ask the proxy to connect to a remote site securely over HTTPS, and the proxy is able to decrypt the traffic, filter it, encrypt it again, and pass it to the browser. Because the browser trusts the proxy, it continues working with the filtered HTTPS without any errors or warnings.
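
In practice this trust is established by giving the proxy its own CA certificate, which every browser in the network must import as a trusted authority. One common way to generate such a certificate with OpenSSL (the file names are my own choice; the squid.conf sketch shown later references the resulting PEM file):

# Create a self-signed CA certificate and private key in a single PEM file
# (OpenSSL will prompt for the certificate subject fields)
openssl req -new -newkey rsa:2048 -sha256 -days 1095 -nodes -x509 \
  -keyout myCA.pem -out myCA.pem
# Export just the certificate in DER form for importing into browsers
openssl x509 -in myCA.pem -outform DER -out myCA.der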

Unfortunately, the Squid builds included in most common Linux/FreeBSD distributions do not contain the compile switches necessary for successful HTTPS filtering. The proxy administrator needs to recompile Squid, then reinstall and reconfigure it with an additional list of options. Although this process is not very complex, it is involved enough that having all the necessary steps described exactly is helpful. We will provide such exact instructions for the latest version of one of the most popular Linux distributions: Ubuntu Server 13.10.

Build Squid with SSL Bump and ICAP Client

Before compiling, it is considered good practice to bring the operating system up to date. This can be done by running the following commands in the terminal.

sudo apt-get update && sudo apt-get upgrade && sudo reboot

In order to build Squid from source we need to install some build tools and fetch the sources of Squid and various dependent packages from the Ubuntu repository. This does not need to take place on the production server; it is possible to build Squid on one machine and install the resulting binaries on others.

sudo apt-get install devscripts build-essential fakeroot libssl-dev
sudo apt-get source squid3
sudo apt-get build-dep squid3

Running the following command unpacks the Squid source package together with all the system integration scripts and patches provided by the Ubuntu developers.

dpkg-source -x squid3_3.3.8-1ubuntu3.dsc

The sources are unpacked into the squid3-3.3.8 folder. We need to modify the configure options in debian/rules to include the compiler switches (--enable-ssl and --enable-ssl-crtd) necessary for HTTPS filtering.

patch squid3-3.3.8/debian/rules < rules.patch

The rules.patch file should look like this.

--- rules 2013-11-15 11:49:59.052362467 +0100
+++ rules.new 2013-11-15 11:49:35.412362836 +0100
@@ -19,6 +19,8 @@
 DEB_CONFIGURE_EXTRA_FLAGS := --datadir=/usr/share/squid3 \
 --sysconfdir=/etc/squid3 \
 --mandir=/usr/share/man \
+ --enable-ssl \
+ --enable-ssl-crtd \
 --enable-inline \
 --enable-async-io=8 \
 --enable-storeio="ufs,aufs,diskd,rock" \

One file in the Squid source code needs to be adjusted too (src/ssl/gadgets.cc). This change is needed to prevent the Firefox error sec_error_inadequate_key_usage, which usually occurs when doing HTTPS filtering with recent Firefox browsers. If you only use Google Chrome, Microsoft Internet Explorer or Apple Safari, this step is not required.

Download patches here

patch squid3-3.3.8/src/ssl/gadgets.cc < gadgets.cc.patch

Where gadgets.cc.patch looks like this.

--- gadgets.cc 2013-07-13 09:25:14.000000000 -0400
+++ gadgets.cc.new 2013-11-26 03:25:25.461794704 -0500
@@ -257,7 +257,7 @@
 mimicExtensions(Ssl::X509_Pointer & cert, Ssl::X509_Pointer const & mimicCert)
 {
 static int extensions[]= {
- NID_key_usage,
+ //NID_key_usage,
 NID_ext_key_usage,
 NID_basic_constraints,
 0

Then we build the package using the following command. After a while, this command produces all the required *.deb packages.

cd squid3-3.3.8 && dpkg-buildpackage -rfakeroot -b

Install Diladele Web Safety

The SSL Bump feature alone is not enough to block questionable web content. We also need a filtering server that can be paired with Squid. We will use Diladele Web Safety (DDWS), formerly known as QuintoLabs Content Security, for the filtering and blocking part. It is an ICAP daemon capable of integrating with an existing Squid proxy and providing rich content filtering functionality out of the box. It may be used to block illegal or potentially malicious file downloads, remove annoying advertisements, prevent access to various categories of web sites, and block resources with explicit content.

We will use version 3.0 of qlproxy, which is in release-candidate state and will probably be released this month. It was designed specifically with HTTPS filtering in mind and contains a rich web administration console for performing routine tasks right from the browser.

By default, DDWS comes with four policies preinstalled. The Strict policy sets the web filter to its maximum level and is meant to protect minors and K12 students from inappropriate content on the Internet. The Relaxed policy blocks only excessive advertisements and is intended for network administrators, teachers, and all those who do not need filtered access to the web but would like to avoid most ads. The third policy is tailored to whitelist-only browsing, and the last group contains less restrictive web filtering settings suitable for normal web browsing without explicitly adult content being shown.

In order to install Diladele Web Safety for the Squid proxy, download the package for Ubuntu 13.10 from the Diladele B.V. web site at http://www.quintolabs.com using a browser, or just run the following command in a terminal.

wget http://updates.diladele.com/qlproxy/binaries/3.0.0.3E4A/amd64/release/ubuntu12/qlproxy-3.0.0.3E4A_amd64.deb

The administration console of Diladele Web Safety is built with the Python Django framework and is served by the Apache web server. To install the packages required for correct functioning of the web UI, run the following commands in the terminal.

sudo apt-get install python-pip
sudo pip install django==1.5
sudo apt-get install apache2 libapache2-mod-wsgi

Install the DEB package and integrate it with Apache by running the following commands.

sudo dpkg --install qlproxy-3.0.0.3E4A_amd64.deb
sudo a2dissite 000-default
sudo a2ensite qlproxy
sudo service apache2 restart

Please note that you may need to uncomment the #Require all granted line in /etc/apache2/sites-available/qlproxy.conf if you get Access Denied errors when trying to access Diladele Web Safety's web UI. This is because the Apache configuration syntax changed between Ubuntu 12 and Ubuntu 13. Luckily, this has to be done only once.
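
For clarity, the relevant section of qlproxy.conf looks roughly like the following (the directory path is an assumption for illustration; only the Require line matters here):

<Directory /opt/qlproxy/www>
    # Apache 2.2 style (Ubuntu 12.04):
    # Order allow,deny
    # Allow from all
    # Apache 2.4 style (Ubuntu 13.10), uncomment on newer releases:
    Require all granted
</Directory>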

Configure Squid for ICAP Filtering and HTTPS Bumping

The Squid packages we compiled previously now need to be installed on the system. To perform the installation, run the following commands.

sudo apt-get install ssl-cert
sudo apt-get install squid-langpack
sudo dpkg --install squid3-common_3.3.8-1ubuntu3_all.deb
sudo dpkg --install squid3_3.3.8-1ubuntu3_amd64.deb
sudo dpkg --install squidclient_3.3.8-1ubuntu3_amd64.deb

In order to recreate the original SSL certificates of remote web sites during HTTPS filtering, Squid uses a separate helper process named ssl_crtd, which needs to be set up like this (the matching squid.conf directives are sketched after the commands).

sudo ln -s /usr/lib/squid3/ssl_crtd /bin/ssl_crtd
sudo /bin/ssl_crtd -c -s /var/spool/squid3_ssldb
sudo chown -R proxy:proxy /var/spool/squid3_ssldb
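
The squid.conf patch applied in the next step points Squid at this helper and its database; the relevant directives look roughly like this (the cache size and number of helper children are typical values, not necessarily those used by the patch):

# Location of the ssl_crtd helper and the certificate database created above
sslcrtd_program /bin/ssl_crtd -s /var/spool/squid3_ssldb -M 4MB
sslcrtd_children 5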

Finally, modify the Squid configuration file /etc/squid3/squid.conf to integrate it with Diladele Web Safety as an ICAP server. Due to its size, the text of the patch file is not included in the article directly, but it is part of the download archive; a rough sketch of the kind of directives it adds follows the commands below.

sudo cp /etc/squid3/squid.conf /etc/squid3/squid.conf.default
sudo patch /etc/squid3/squid.conf < squid.conf.patch
sudo /usr/sbin/squid3 -k parse
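
Since the patch text is not reproduced in this article, the following is only a rough sketch of the kind of directives it adds; the listening port, certificate path and ICAP service URLs are assumptions based on a typical Squid plus ICAP setup, not the actual contents of squid.conf.patch:

# Explicit proxy port with SSL bumping; cert= points at the proxy's CA certificate
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid3/myCA.pem
ssl_bump server-first all
sslproxy_cert_error allow all
sslproxy_flags DONT_VERIFY_PEER

# Hand requests and responses to the ICAP server (assumed to listen on port 1344)
icap_enable on
icap_preview_enable on
icap_preview_size 4096
icap_service qlproxy1 reqmod_precache bypass=0 icap://127.0.0.1:1344/reqmod
icap_service qlproxy2 respmod_precache bypass=0 icap://127.0.0.1:1344/respmod
adaptation_access qlproxy1 allow all
adaptation_access qlproxy2 allow all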

From now on, Squid is capable of HTTPS filtering, and we can continue adjusting the filtering from the web UI of Diladele Web Safety.

Navigate to http://<YOUR PROXY IP ADDRESS>/ and log in with the default name root and password P@ssw0rd. Select Settings / HTTPS Filtering / Filtering Mode. Diladele Web Safety may either filter specific HTTPS sites only or filter all of them with exclusions. Total filtering is better suited to providing very safe network environments.

Select the desired mode, click Save Settings, add target domains or exclusions as you like, and then restart the ICAP server by clicking the green button in the top right corner, as indicated on the following screenshots.
