What does HackerNews think of single-file-cli?

Language: JavaScript

This is amazing; it looks like there's a CLI version too: https://github.com/gildas-lormeau/single-file-cli

Presumably you can roll your own Pocket now.

Check out https://gwern.net/archiving.

Since your bookmarks are already tagged, perhaps you don't need to tag the files? Tagging the files as well may be convenient in some ways, but it comes at the cost of duplicating the information. As long as you can map a bookmarked URL to a file path or paths, you can find archived copies through your bookmarks.
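
For example, the lookup can be a simple glob over the directory the URL maps to. A minimal sketch, assuming the dest-dir helper defined below and a link-archive root directory:

  proc archived-copies link {
    # Find every archived copy of a bookmarked URL purely by mapping the
    # URL to its directory; the files themselves carry no tags.
    glob -nocomplain -directory [file join link-archive [dest-dir $link]] *
  }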

Here is what I do for external URLs on my personal website. It is inspired by Gwern's approach. A major difference is that he doesn't nest directories; he uses ${domain}/${url-checksum}.ext.
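
For contrast, a rough sketch of that flat layout (illustrative only; Gwern's exact checksum and extension handling differ):

  proc flat-dest link {
    # One directory per domain; the file is named by a checksum of the URL
    # itself, so no nested per-path directories are created.
    set domain [lindex [file split [regsub {^[a-z]+://} $link {}]] 0]
    set sum [lindex [exec b2sum -l 32 << $link] 0]
    return $domain/$sum
  }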

I translate the URL to a file path in my link-archive directory by applying the function dest-dir from the Tcl code below. In that directory, I save whatever is at the URL under a name based on the checksum of the saved content (b2sum -l 32), so I can keep multiple archived copies of the same URL. I use https://github.com/gildas-lormeau/single-file-cli to save the URL and determine the destination file extension from the MIME type. A sketch of how these pieces fit together follows the code.

This gives you paths like link-archive/365tomorrows.com/2005/10/23/postcard/e5445dff.html for https://365tomorrows.com/2005/10/23/postcard/.

  # Lowercase a path component and collapse every run of characters
  # outside the URL-safe set into a single "-", trimming stray "-"
  # from both ends.
  proc slug s {
    set s [string tolower $s]
    regsub -all {[^A-Za-z0-9\.\_\~\-]+} $s - s
    string trim $s -
  }

  # Map a URL to a relative directory path: strip the #fragment (unless
  # it is a "#!" hashbang fragment, which can carry path information),
  # slug each component, and cap each component at 128 characters.
  proc dest-dir link {
    set slugs [lmap part [file split [regsub {#[^!].*$} $link {}]] {
      # file split prefixes parts starting with "~" with "./"; remove
      # that prefix before slugging, then restore it so a leading "~"
      # is not subject to tilde expansion later.
      set x [string range [slug [regsub {^./} $part {}]] 0 127]
      regsub {^~} $x {./~}
    }]
    # Drop the protocol ("https:"), which file split leaves as the
    # first element.
    file join {*}[lrange $slugs 1 end]
  }
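
And here is a minimal sketch of the glue around dest-dir, assuming single-file (from single-file-cli) is on PATH and takes a URL and an output path as shown in its README; the MIME handling is reduced to a toy mapping, and archive-url itself is an illustrative name, not the exact script:

  proc archive-url link {
    set dir [file join link-archive [dest-dir $link]]
    file mkdir $dir
    # Save the page with single-file-cli under a temporary name first;
    # stderr is passed through so progress output doesn't trip exec.
    set tmp [file join $dir tmp]
    exec single-file $link $tmp 2>@ stderr
    # Pick the destination extension from the MIME type (toy mapping).
    set mime [exec file --brief --mime-type $tmp]
    set ext [expr {$mime eq "text/html" ? ".html" : ".bin"}]
    # Rename the copy after the 32-bit BLAKE2 checksum of its contents,
    # so several snapshots of the same URL can coexist.
    set sum [lindex [exec b2sum -l 32 $tmp] 0]
    file rename -force $tmp [file join $dir $sum$ext]
  }

The e5445dff name in the example path above is exactly what this checksum step produces for that snapshot's content.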