Scripts and Utilities to Make Newsbeuter a Console Replacement For Google Reader

tl;dr:  I created some utilities for Newsbeuter, a console-based RSS reader, so that it gives me the social options I want and need in a light newsreader.  You might find them useful too.

Note:  If you use Firefox, Sage is a damn good and damn quick RSS reader… but you have to deal with a full browser’s toolkit loading… and staying loaded.  That also leads to distraction (for me) rather than reading my RSS feeds.  Further, it doesn’t do offline reading.  So I wanted a simple, offline-capable RSS reader that kept me focused.

Newsbeuter seems to be one of the quicker and more flexible
console-based newsreaders out there.  Yes, console-based.  TERMINAL APPLICATION.  Why?  I don’t get distracted by widgets as much and can actually go quickly.

With Google Reader going away, I had to think hard about my
RSS reading.  It falls into three categories:

  1. Stuff to read at length later.
  2. Stuff to quickly read and share.
  3. LOLCats.

The first two are fairly easily dealt with through Newsbeuter
itself and its bookmarking command. Hit Ctrl-B, then type twitter,
pocket, and so on into the description field. The bash script will
shorten titles (and URLs) when needed.
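My bookmark script isn't reproduced here, but the dispatch-on-description idea can be sketched in bash. The function name, the 140-character tweet budget, and the service cases below are my own assumptions; Newsbeuter hands the bookmark command the URL, title, and description as arguments.

```shell
#!/bin/bash
# Hypothetical sketch of a newsbeuter bookmark-cmd handler.
# Newsbeuter invokes it as: bookmark.sh <url> <title> <description>
bookmark() {
  local url="$1" title="$2" dest="$3"

  # Shorten long titles so "title url" fits an assumed 140-char budget.
  local max=$((140 - ${#url} - 1))
  if [ "${#title}" -gt "$max" ]; then
    title="${title:0:max-3}..."
  fi

  case "$dest" in
    twitter) echo "$title $url" ;;  # handed off to ttytter elsewhere
    pocket)  echo "$url"        ;;  # Pocket only needs the URL
    *)       echo "$title $url" ;;
  esac
}

if [ "$#" -gt 0 ]; then
  bookmark "$@"
fi
```

The description you type after Ctrl-B becomes the third argument, so adding a new service is just another case branch.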

If you want to use it elsewhere, modify the script to output only a URL.

For Twitter, I use ttytter, a nice Perl script that does a great job of sending simple text updates.

For the other services, I simply use Mutt to send e-mail to Pocket
and Buffer. Works great, no need for heavy OAuth APIs for something this
simple. Best of all, if you’ve got a decent Linux machine, you can DO
THIS OFFLINE and the mail just sits in the mail queue until you get a
chance to get back on.
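The e-mail half can be sketched as a pair of tiny wrappers around mutt. Pocket's add-by-mail address was add@getpocket.com when I set this up; Buffer's is a per-account secret, so the one below is a placeholder, and the MAILER indirection is only there so the sketch can be dry-run.

```shell
#!/bin/sh
# Hypothetical sketch: queue a URL to a read-later service by mail.
# MAILER is overridable so the sketch can be exercised without mutt.
MAILER="${MAILER:-mutt}"

to_pocket() {
  # Pocket picks the URL out of the subject line; the body can be empty.
  echo "" | "$MAILER" -s "$1" add@getpocket.com
}

to_buffer() {
  # PLACEHOLDER address -- Buffer issues each account its own secret one.
  echo "$1" | "$MAILER" -s "$1" your-secret@to.bufferapp.com
}
```

Because everything goes through the local MTA, a dropped connection just means the message waits in the queue until you're back online.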

The third… that’s where images are vital, and well, none of the Linux readers works well for saving images. I save
LOLCats to my hard drive long enough to share them with my girlfriend (hi
honey!). Liferea and RSSOwl are great for browsing, not so much for saving to disk.  So I created a macro that dumps those posts' links to a text file.

    macro o set external-url-viewer "dehtml >> ~/imagetemp/imglinks.txt" ; show-urls ; next-unread ; set external-url-viewer "urlview"

Dehtml is available here:

Then I use the script in question to strip out the image links (anything with a jpg/jpeg/gif/png extension) and wget to download them. BDOW.
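The extraction script itself isn't shown here, but the step reduces to filtering the dumped list for image extensions and handing the matches to wget; the function name and file paths below are my own illustrations.

```shell
#!/bin/sh
# Hypothetical sketch of the image-grab step; the dehtml macro above
# has already appended each post's links to ~/imagetemp/imglinks.txt.
extract_images() {
  # keep only URLs ending in an image extension, case-insensitively
  grep -Ei '\.(jpe?g|gif|png)$'
}

# Usage: filter the dump, then fetch each hit; -nc skips anything
# already on disk.  (Commented out so the sketch runs standalone.)
# extract_images < ~/imagetemp/imglinks.txt | wget -nc -i -
```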

You can check out the git repository here: .  Again, this isn’t meant to be a drop-in replacement; it’s intended to get you going and to show you what’s going on behind the scenes.

