The Saurian Spider – Making It A Little Harder for Your ISP to Sell Your Web History

In 2016, the FCC ruled that internet service providers had to get your permission before selling your raw browsing data.

While complying with that wasn’t hard for ISPs to do, the Trump-led GOP is trying to remove even that tiny bit of privacy.

While there’s little substitute for tools such as HTTPS Everywhere, a VPN, and pointing your DNS at servers other than your ISP’s, I worked up a Bash script to pollute your web browsing history.

The idea is simple – by adding in random requests, your actual web browsing history is hidden among the noise and chaff.

I’m calling it the Saurian Spider (because, dinosaurs?), and you can find it at:

https://github.com/uriel1998/saurian-web-history-pollution

The script maintains a list of URLs – creating one at $HOME/.config/saurianspider.conf if needed – and retrieves them in random order at random-ish (1-30 second) intervals. Any new links it finds on those pages, it adds to the list. It also semi-randomly switches the user agent between Firefox, Chrome, Opera, Opera Mini, Edge, and Internet Exploder, making it harder to filter these requests out from your legit ones.
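To give you a feel for how that works, here's an illustrative sketch of the core fetch loop – this is not the actual script (the function name is mine, and the user-agent strings are abbreviated stand-ins for the real rotation):

```shell
#!/usr/bin/env bash
# Sketch of the core loop: random URL, random user agent, random 1-30s wait.

URL_LIST="${1:-$HOME/.config/saurianspider.conf}"

# Abbreviated user-agent strings; the real script rotates among more browsers.
AGENTS=(
  "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0"
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 Chrome/110.0 Safari/537.36"
  "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Trident/6.0)"
)

fetch_one() {
  local url ua
  url="$(shuf -n 1 "$URL_LIST")"            # pick a random URL from the list
  ua="${AGENTS[RANDOM % ${#AGENTS[@]}]}"    # pick a random user agent
  curl -s -A "$ua" -o /dev/null -- "$url"   # fetch the page, discard the body
}

# Main loop would be something like:
# while true; do fetch_one; sleep "$(shuf -i 1-30 -n 1)"; done
```

The random sleep matters as much as the random URL – a request every N seconds on the dot would be trivial for an ISP to filter out.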
 
The URL list is seeded with the current events page at Wikipedia and
the “Random” page on Wikipedia; that said, it doesn’t ADD links from
Wikipedia or Wikimedia, as that could get really obvious, really
quickly.
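The link-harvesting step might look something like the sketch below – again, illustrative rather than lifted from the script (the function name and exact grep/sed patterns are mine): pull absolute links out of a fetched page, drop anything on a Wikipedia/Wikimedia host, and append only URLs not already in the list.

```shell
#!/usr/bin/env bash
# Sketch: harvest links from a saved page into the URL list,
# skipping Wikipedia/Wikimedia and anything already listed.

harvest_links() {
  local page="$1" list="$2"
  grep -oE 'href="https?://[^"]+"' "$page" \
    | sed -e 's/^href="//' -e 's/"$//' \
    | grep -vEi '(wikipedia|wikimedia)\.org' \
    | while read -r link; do
        # append only if the exact URL isn't already in the list
        grep -qxF "$link" "$list" || printf '%s\n' "$link" >> "$list"
      done
}
```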

If you want to use your own list of URLs in a different location, pass the file’s location as the first (and only) argument.
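Handling that argument (plus seeding the default file) could be done roughly like this – a sketch under my own naming, not the script's actual code; the seed URLs shown are Wikipedia's real current-events and random-page addresses:

```shell
#!/usr/bin/env bash
# Sketch: use $1 as the URL list if given, else the default,
# creating and seeding the file if it doesn't exist yet.

init_url_list() {
  local list="${1:-$HOME/.config/saurianspider.conf}"
  if [ ! -f "$list" ]; then
    mkdir -p "$(dirname "$list")"
    printf '%s\n' \
      "https://en.wikipedia.org/wiki/Portal:Current_events" \
      "https://en.wikipedia.org/wiki/Special:Random" > "$list"
  fi
  printf '%s\n' "$list"   # echo the path back for the caller
}
```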

Depends upon/uses (most of these are GNU coreutils):
curl
awk
sed
head & tail
grep
mktemp
wc
shuf
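If you want to verify those are all on your system before running the script, a quick check along these lines works (this helper is mine, not part of the repo):

```shell
#!/usr/bin/env bash
# Report any commands from the dependency list that aren't on $PATH.

check_deps() {
  local cmd missing=0
  for cmd in "$@"; do
    command -v "$cmd" >/dev/null 2>&1 || { printf 'missing: %s\n' "$cmd" >&2; missing=1; }
  done
  return "$missing"
}

# Usage: check_deps curl awk sed head tail grep mktemp wc shuf || exit 1
```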

