
Lab 12.2: Restrict access to http://www.foxnews.com/ using Squid

donaldsebleung Posts: 21
edited August 2020 in LFS211 Class Forum

I am attempting the aforementioned lab on CentOS Linux 7. As instructed in the lab, I added the following two lines to /etc/squid/squid.conf, above the lines I added in Lab 12.1:

acl blockedsite url_regex ^https?://.*\.foxnews\.com/.*$
http_access deny blockedsite

(Note that I tuned the regex to cover HTTPS, since I noticed Fox News automatically redirects to HTTPS and I initially thought that was the problem.)

I then executed

# squid -k reconfigure

which produced no output and exited with code 0. I then visited http://www.foxnews.com/ again using Firefox on a MacBook Air running macOS on my local network, but the website loaded as normal (the same happens if I restart the Squid daemon with systemctl restart squid). Note that I completed Lab 12.1 successfully and confirmed the Squid error page with an invalid URL, so the correct proxy is definitely being used. Has anyone else encountered a similar issue?

Note also that in Firefox on my macOS client, I ticked the box to use the proxy for FTP and HTTPS as well.
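
For what it's worth, the same test can be run from the command line with curl pointed at the proxy (the address and port below are placeholders; substitute the Squid server's IP and the port configured in Lab 12.1, 3128 by default):

# Placeholder proxy address; replace with your Squid server's IP and port
curl -x http://192.168.1.10:3128 -I http://www.foxnews.com/
curl -x http://192.168.1.10:3128 -I https://www.foxnews.com/

If the deny rule matched, both requests should be answered with a 403 from the proxy rather than a response from the site itself.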

Comments

  • Hi @donaldsebleung,

    You are right, I was able to reproduce the issue. It's related to blocking HTTPS, and it's not as easy as adding the 's' to the url_regex rule.
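
    As far as I understand, the underlying reason is that for HTTPS the browser never sends the full URL to the proxy; it only asks Squid to open a tunnel with a CONNECT request, so a url_regex rule has no scheme or path to match against:

    CONNECT www.foxnews.com:443 HTTP/1.1

    Only the hostname and port are visible to the proxy, which is why matching on the destination domain (dstdomain) works where the URL regex cannot.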

    Lee will probably provide an answer later; for now, I can offer this workaround:

    1.- Add the following lines to the beginning of the /etc/squid/squid.conf file:

    acl CONNECT method CONNECT
    acl blockedsite dstdomain .foxnews.com
    http_access deny blockedsite
    acl examplenetwork src 192.168.1.0/24

    Note: replace the network in the last line with your own local network.

    2.- Add the following line later in the file (where you can find the rest of the http_access rules):

    http_access allow examplenetwork

    3.- Parse the configuration file:

    squid -k parse

    4.- If everything looks good, restart the service (as root or using sudo):

    systemctl restart squid

    5.- Test it again and let us know. (A consolidated sketch of the resulting configuration follows below.)
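
    Squid evaluates http_access rules from top to bottom and applies the first match, so the deny must come before the allow. Put together, the relevant parts of the file might look like this (a sketch; the CONNECT acl is already present in the stock CentOS 7 file, and 192.168.1.0/24 is only an example network):

    # Near the top of /etc/squid/squid.conf:
    acl CONNECT method CONNECT
    acl blockedsite dstdomain .foxnews.com
    http_access deny blockedsite
    acl examplenetwork src 192.168.1.0/24

    # Further down, alongside the other http_access rules:
    http_access allow examplenetwork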

    Regards,
    Luis.

  • lee42x Posts: 380

    Ah, yes. This exercise depends on an "http" (not "https") site. It looks like the site we were using has been updated. You can use any site that is still plain HTTP and it will work. Blocking the whole domain is an option, but we were trying to be more selective. Something like http://www.cbc.com should work.
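
    For instance, the lab's rule adapted to that site might look like this (a sketch, assuming the pattern from the lab PDF with the host swapped in; the backslashes escape the literal dots):

    acl blockedsite url_regex ^http://.*\.cbc\.com/.*$
    http_access deny blockedsite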

    It will get updated, thanks!

  • It's a pleasure, Lee!

    Regards,
    Luis.

  • Thank you both for your replies. Regarding @luisviveropena's proposed workaround, the default Squid configuration file in CentOS 7 already contains some of the suggested lines, so in my case I only needed to add the following two lines to the start of the configuration file:

    acl blockedsite dstdomain .foxnews.com
    http_access deny blockedsite
    

    With those two lines added, I did not see the "Access Denied" Squid page I had anticipated, but it did prevent the CSS and dynamic content from loading properly, resulting in a bare HTML page. (Initially it seemed to block the HTML as well, leading to a blank page, but after a few refreshes the HTML went through.)
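
    In case it helps anyone else, one way to see exactly what the proxy is denying is to watch the access log (the default path on CentOS 7) while reloading the page; denied requests are tagged TCP_DENIED:

    tail -f /var/log/squid/access.log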

    Apologies for the late reply; I only revisited the forums today, after redoing the exercise in preparation for the LFCE exam, which I'm planning to attempt before the end of October.

  • lee42x Posts: 380

    Please see the entry from Aug 26th: the foxnews.com site no longer uses plain HTTP, so it will not work. The site http://www.cbc.com does use HTTP and will work for this lab.

    Lee

  • @lee42x said:
    Please see the entry from Aug 26th: the foxnews.com site no longer uses plain HTTP, so it will not work. The site http://www.cbc.com does use HTTP and will work for this lab.

    Lee

    Thanks, I tried this as well, and the regex solution proposed in the lab PDF works as expected.
