One to read: Walking the DWeb walk

I was recently doing some research for my appearance on TechSNAP, and could have done with finding this link before recording the show. It’s a great summary of how to put your website on IPFS. Definitely worth a look!

This was automatically posted from my RSS Reader, and may be edited later to add commentary.

Podcast Summary – “TechSNAP Episode 384: Interplanetary Peers”

Last night I was a guest host on TechSNAP, a systems, network and administration podcast.

The episode I recorded was: TechSNAP Episode 384: Interplanetary Peers

In this episode, I helped cover the news items, mostly talking about the breach over at Newegg by the Magecart group and a (now fixed) vulnerability in Alpine Linux, and then did a bit of a dive into IPFS.

It’s a good listen, but the audio right at the end was quite noisy as a storm settled in just as I was recording my outro.

One to read: A Beginner’s Guide to IPFS

Ever wondered about IPFS (the “InterPlanetary File System”), a new way to share and store content? It doesn’t rely on a central server (e.g. Facebook, Google, DigitalOcean, or your home NAS) but instead uses a BitTorrent-like system combined with published records to keep the content in the network.

If your host (where the original content is stored) goes down, the content is still cached on other nodes that have visited your site.

These caches are cleared over time, so they’re suitable for short outages; alternatively, you can have other nodes “pin” your content (and pinning can be offered as a paid service, which helps fund hosts).
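This caching and pinning works because IPFS is content-addressed: a file’s address is derived from its bytes, so any node holding those bytes can serve them, and anyone can verify they received the right content. As a toy illustration of the idea (a plain SHA-256 digest, not a real IPFS CID, which adds multihash and base encoding on top):

```shell
# Toy content addressing: derive an "address" from the bytes themselves.
# (Real IPFS CIDs wrap the hash in multihash + multibase encoding.)
printf 'hello world\n' > page.html
addr=$(sha256sum page.html | cut -d' ' -f1)
echo "$addr"

# Anyone holding a copy can recompute the hash and prove it matches,
# which is why a cached or pinned copy is as good as the original.
echo "$addr  page.html" | sha256sum --check - && echo "content verified"
```

The key property: change a single byte of `page.html` and the address changes too, so a cache can never silently serve you the wrong content.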

IPFS is great at hosting static content, but how do you deal with dynamic content? That’s where PubSub comes into play (it isn’t covered in this article). There’s a database service called OrbitDB which sits on top of IPFS and uses PubSub to sync data across the network.

It’s looking interesting, especially in light of the announcement from Cloudflare about the introduction of their IPFS gateway.

It’s looking good for IPFS!

This was automatically posted from my RSS Reader, and may be edited later to add commentary.

One to read: Overview of TLS v1.3

Wondering what TLS v1.3 means for your web browsing? OWASP break down the differences between TLS 1.2 and TLS 1.3. It’s a really good set of slides, and would be great if you need to show someone some of the moving pieces without reading the RFC (RFC 8446). It’s good :)

This was automatically posted from my RSS Reader, and may be edited later to add commentary.

“You can’t run multiple commands in sudo” – and how to work around this

At work, we share tips and tricks, and one of my colleagues recently called me out on the following stanza I posted:

I like this [ansible] one for Debian-based systems:
  - name: "Apt update, Full-upgrade, autoremove, autoclean"
    become: yes
    apt:
      upgrade: full
      update_cache: yes
      autoremove: yes
      autoclean: yes

And if you’re trying to figure out how to do that in Shell:
apt-get update && apt-get full-upgrade -y && apt-get autoremove -y && apt-get autoclean -y

His response was “Surely you’re not logging into bash as root”. I said “I normally sudo -i as soon as I’ve logged in. I can’t recall offhand how one does a sudo for a string of command && command statements”

Well, as a result of this, I looked into it. Here’s one comment from the first Stack Overflow page I found:

You can’t run multiple commands from sudo – you always need to trick it into executing a shell which may accept multiple commands to run as parameters
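The reason behind that quote: your shell parses `;` and `&&` before sudo ever runs, so only the first command is wrapped by sudo and the rest run as you. You can demonstrate the same scoping without needing root by using `env` as a harmless stand-in wrapper (my own illustration, not from the linked answers):

```shell
# 'env VAR=x cmd' runs cmd with VAR set, much as 'sudo cmd' runs cmd
# as root. The shell splits the line on ';' *before* the wrapper runs,
# so only the first command is inside the wrapper:
env GREETING=hello sh -c 'echo "inside: $GREETING"'
sh -c 'echo "outside: $GREETING"'   # GREETING was never set out here
```

Swap `env GREETING=hello` for `sudo` and you get exactly the trap the quote describes: the second command silently runs unprivileged.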

So here are a few options on how to do that:

  1. sudo -s whoami \; whoami (link to answer)
  2. sudo sh -c "whoami ; whoami" (link to answer)
  3. But, my favourite is from this answer:

    An alternative using eval so avoiding use of a subshell: sudo -s eval 'whoami; whoami'

Why do I prefer the last one? Well, I already use eval for other purposes – mostly for starting my ssh-agent over SSH, like this: eval `ssh-agent` ; ssh-add
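For anyone wondering why `eval` is needed in that ssh-agent line: `ssh-agent` doesn’t set any variables itself, it *prints* shell code (export statements) to stdout, and `eval` executes that printed code in the current shell rather than in a throwaway child process. A minimal sketch, using a fake agent function as a stand-in (the `DEMO_SOCK` name is made up for illustration):

```shell
# ssh-agent prints something like:
#   SSH_AUTH_SOCK=/tmp/ssh-XXXX/agent.123; export SSH_AUTH_SOCK;
# Simulate that with a fake agent that just echoes shell code:
fake_agent() { echo 'DEMO_SOCK=/tmp/agent.sock; export DEMO_SOCK'; }

# eval executes the emitted code in the *current* shell, so the
# variable survives for later commands (like ssh-add):
eval "$(fake_agent)"
echo "$DEMO_SOCK"    # /tmp/agent.sock
```

Without the `eval`, the output would just be printed (or run in a subshell) and the variable would be lost before `ssh-add` ever saw it.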

One to listen to: “Jason Scott is Breaking Out in Archives”

This is a fascinating episode. Jason Scott works for the Internet Archive, personally hosts a large archive of historic BBS text files, and is an engaging interviewee (plus, he talks a lot :) )

If you’re interested in how the Internet Archive works (including how they choose disks for their storage, how content is processed and managed, whether they keep spam, how many hours of “sermons” have been stored, etc.), or what happens when you get sued for USD 2,000,000,000, it’s well worth a listen.