The goal here is to hit 2 books a month, so 24 for the year. I’m on track as of now. Perilous.

I’ve (mostly) finished moving my blog off of Jekyll and over to IkiWiki. The source for my posts is still on github. My Jekyll stuff is still
in thcipriani/blog_src.
My IkiWiki stuff can be found in thcipriani/wiki.
I’ve also made a couple of plugins for IkiWiki—one to use Pygments for syntax highlighting,
another to use s3cmd to do my
publishing. I’ll get around to publishing those plugins Soon™. For
whatever reason the amazon_s3 plugin for
IkiWiki failed in strange ways for me.
Content Ownership

This move was prompted mostly by content ownership and licensing issues. On the previous iteration of this site, I used the Disqus service (which is really good) for comments—but I really wanted to own the comments on my blog! I wanted to be certain that the comments made here were owned by the folks that made them and licensed in a Free way. To that end, I exported all comments from Disqus (which is an awesome feature) and imported them into a service that I host myself, a tiny platform called Isso. Isso seems like good software, and everything worked out of the box.
Copyleft

So after a lot of yak shaving, I was running on Jekyll and Isso, which are both MIT-licensed projects. For my personal blog, I really wanted to use something with a Copyleft license. The criteria of a statically-generated blog and comment system with a Copyleft license were surprisingly limiting. I looked at using Pelican and IkiWiki. Ultimately, IkiWiki just seemed more flexible and extensible to me. Plus, I’m writing this via the CGI and it will be built statically and uploaded to s3, which is kind of neat.
Administravia

I’ve made some effort to ensure that RSS feeds don’t break. This effort consists of a single 301 redirect. I’ve also written a new comment policy for this site. Sorry if I’ve broken things. Let me know if I’ve broken things badly.
Flickr

I like to post my photos on the internet. I used to post all of my photos on Flickr, but that site has been getting worse and worse and worse. More recently, I’ve been using a static photo gallery generator I hacked together that I (perhaps unfortunately) named hiraeth.

Hiraeth is, to put it mildly, missing some features. There are a few reasons that I opted to create hiraeth rather than use something that was already built:
- My ~/Pictures directory is organized—I can find stuff—and I don’t want to mess all that up to generate a crappy website out of my photos.
- I manage my photos with git-annex inside ~/Pictures, which creates…unique challenges :)

Hiraeth is invoked like: publish [edited-photo-dir] [output-dir]. Hiraeth looks for a file named _metadata.yaml inside the directory of edited photos and uses that to map photo files to photo descriptions and add titles and whatnot to the page. It makes a few different sized thumbnails of each photo, grabs the exif info, and generates some html.

Hiraeth was designed to look and behave like a static version of Flickr circa 2007. There are still features to add, but there is a base that works in place at least.

Pictures

I manage my
~/Pictures
directory using git-annex (which I’ve
wanted to write something about for a long time). This is mostly amazing
and great. Git-annex has a lot of cool features. For instance, in
git-annex once you’ve copied files to a remote, it will allow you to "drop" a file locally to save space. You can still get the file back from the remote any time you rootin’ tootin’ feel like, so nbd. Occasionally, when I’m running out of space on one machine or another, I’ll drop a bunch of photos.

The ability to drop a bunch of photos means that hiraeth needs to be able to get photo metadata from a picture without having the file actually be on disk.

Gerrit

We use gerrit at work and I genuinely like it. HOWEVER,
<rant>
The web-UI is one of the worst interfaces
I’ve ever used. The web interface is an unfortunate mix of late-90s,
designed-by-engineers, impossibly-option-filled interface mashed
together in an unholy union with a fancy-schmancy new-fangled
javascripty single-page application. It’s basically a mix of two
interface paradigms I hate, yet rarely see in concert: back-button
breakage + no design aesthetic whatsoever.
</rant>

The workflow gerrit enforces, the git features it uses, and the beautiful repository history that results make gerrit a really nice code review system.

Gerrit is the first system I’ve seen use git-notes; it keeps all of the patch review in git-notes:
tyler@taskmaster:mediawiki-core$ git fetch origin refs/notes/*:refs/notes/*
remote: Counting objects: 176401, done
remote: Finding sources: 100% (147886/147886)
remote: Getting sizes: 100% (1723/1723)
remote: Compressing objects: 100% (116810/116810)
remote: Total 147886 (delta 120436), reused 147854 (delta 120434)
Receiving objects: 100% (147886/147886), 14.91 MiB | 3.01 MiB/s, done.
Resolving deltas: 100% (120449/120449), done.
From ssh://gerrit.wikimedia.org:29418/mediawiki/core
* [new ref] refs/notes/commits -> refs/notes/commits
* [new ref] refs/notes/review -> refs/notes/review
tyler@taskmaster:mediawiki-core$ ls -l .git/refs/notes
total 8
-rw-r--r-- 1 tyler tyler 41 Aug 28 16:44 commits
-rw-r--r-- 1 tyler tyler 41 Aug 28 16:44 review
tyler@taskmaster:mediawiki-core$ git log --show-notes=review --author='Tyler Cipriani'
commit ab131d4be475bf87b0f0a86fa356a2b1a188a673
Author: Tyler Cipriani <tcipriani@wikimedia.org>
Date: Tue Mar 22 09:08:52 2016 -0700
Revert "Add link to anon's user page; remove "Not logged in""
This reverts change I049d0671a7050.
This change was reverted in the wmf/1.27.0-wmf.17. Since there is no
clear consensus, revert in master before branching wmf/1.27.0-wmf.18.
Bug: T121793
Change-Id: I2dc0f2562c908d4e419d34e80a64065843778f3d
Notes (review):
Verified+2: jenkins-bot
Code-Review+2: Legoktm <legoktm.wikipedia@gmail.com>
Submitted-by: jenkins-bot
Submitted-at: Tue, 22 Mar 2016 18:08:27 +0000
Reviewed-on: https://gerrit.wikimedia.org/r/278923
Project: mediawiki/core
Branch: refs/heads/master
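One possible refinement (my assumption, not something the post does): add the notes refspec to the remote’s configuration so a plain git fetch keeps the review notes current. A sketch, using a throwaway repo and the gerrit remote URL from the transcript:

```shell
# Make plain `git fetch` also update refs/notes/* automatically.
# The throwaway repo here is only for illustration.
cd "$(mktemp -d)"
git init -q
git remote add origin ssh://gerrit.wikimedia.org:29418/mediawiki/core
git config --add remote.origin.fetch '+refs/notes/*:refs/notes/*'
git config --get-all remote.origin.fetch
```

With that in place, every fetch refreshes the notes refs alongside the branches.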
This is super cool. You can have, effectively, an offline backup of lots of information you’d usually have to brave the gerrit web-ui to find. Plus, you don’t have to have this information in your local repo taking up space; it’s only there if you fetch it down.

There is another project from Google that uses git-notes for review called git-appraise.

This is the stated use of git-notes in the docs: store extra information about a commit, without changing the SHA1 of the commit by modifying its contents. It is, however, noteworthy that you can store a note that points to any object in your repository and not just commit objects.

EXIF data without pictures

After some minor testing it seems that I can store all the EXIF info I need about my images in git-notes without actually having those images on disk; i.e., I can have git-annex drop the actual files and just have broken symlinks that point to where the files live in annex. I wrote a small bash script to play with some of these ideas.
tyler@taskmaster:Pictures$ git photo show fish.jpg
+ git notes --ref=pictures show d4a9c57715ce63a228577900d1abc027
error: No note found for object d4a9c57715ce63a228577900d1abc0273396e8ef.
tyler@taskmaster:Pictures$ git photo add fish.jpg
+ git notes --ref=pictures add -m 'FileName: fish.jpg
FileTypeExtension: jpg
Make: NIKON CORPORATION
Model: NIKON D610
LensID: AF-S Zoom-Nikkor 24-70mm f/2.8G ED
FocalLength: 62.0 mm
FNumber: 2.8
ISO: 3200' d4a9c57715ce63a228577900d1abc027
+ set +x
tyler@taskmaster:Pictures$ git photo show fish.jpg
+ git notes --ref=pictures show d4a9c57715ce63a228577900d1abc027
FileName: fish.jpg
FileTypeExtension: jpg
Make: NIKON CORPORATION
Model: NIKON D610
LensID: AF-S Zoom-Nikkor 24-70mm f/2.8G ED
FocalLength: 62.0 mm
FNumber: 2.8
ISO: 3200
+ set +x
Now it seems like it should be possible to git push origin refs/notes/pictures, fetch them on the other side, and modify hiraeth to read EXIF from notes when the symlink target doesn’t exist. We’ll see how any of that goes in practice :)
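Attaching a note to a blob rather than a commit can be sketched with stock git; everything below (file name, note content) is invented for illustration:

```shell
cd "$(mktemp -d)"
git init -q
git config user.name demo
git config user.email demo@example.com
echo 'pretend jpeg bytes' > fish.jpg
git add fish.jpg
git commit -qm 'add fish.jpg'
# rev-parse gives the id of the blob object holding the file's contents
blob=$(git rev-parse HEAD:fish.jpg)
# notes can point at any object, not just commits
git notes --ref=pictures add -m 'ISO: 3200' "$blob"
git notes --ref=pictures show "$blob"   # prints: ISO: 3200
```

The note survives even after the working-tree file is gone, which is exactly what a git-annex-dropped photo needs.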
Links

Basics

Most Debian-installed bash completions live under /usr/share/bash-completion/completions.
# the first argument ($1) is the name of the command whose arguments are being completed
# the second argument ($2) is the word being completed,
# and the third argument ($3) is the word preceding the word being completed on the current command line
# In the context of this function the following variables are defined:
# COMP_LINE, COMP_POINT, COMP_KEY, and COMP_TYPE
# as well as COMP_WORDS and COMP_CWORD
# It must put the possible completions in the COMPREPLY array
# See bash(1) '^ Programmable Completion' for more information
_some_function() {
    local cur cmd
    cur=${COMP_WORDS[$COMP_CWORD]}
    cmd=( "${COMP_WORDS[@]}" )
}
complete -F _some_function command
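To make the skeleton concrete, here is a tiny self-contained example (the command name greet and its word list are made up) that you can source into a bash session and exercise by hand:

```shell
# Complete the made-up command `greet` from a fixed word list.
_greet() {
    local cur=${COMP_WORDS[COMP_CWORD]}
    # compgen filters the word list down to matches for $cur
    COMPREPLY=( $(compgen -W 'hello howdy hiya' -- "$cur") )
}
complete -F _greet greet

# Exercise the function directly, without pressing TAB:
COMP_WORDS=(greet ho)
COMP_CWORD=1
_greet
echo "${COMPREPLY[@]}"   # prints: howdy
```

Setting COMP_WORDS and COMP_CWORD by hand like this is a handy way to unit-test completion functions.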
Some complete options you want:

- -o bashdefault: if the completion function finds no matches, fall back to bash’s default completions
- -o default: fall back to readline’s default (filename) completion if the complete function and bash defaults both come up empty
- -o nospace: don’t append a space at the end of matches (useful if you’re doing directory stuffs)
- -P / -S: a prefix (-P) or suffix (-S) added to each completion generated by the function passed to complete -F
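compgen accepts the same generation options as complete, so -P and -S are easy to see in isolation:

```shell
# -P prepends and -S appends to every generated match.
compgen -P '--' -W 'force dry-run' -- ''   # lists --force and --dry-run
compgen -W 'src tests' -S '/' -- ''        # lists src/ and tests/
```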
Example
# Many of the ideas presented in this script were stolen
# in-part or wholesale from
# <https://github.com/git/git/blob/master/contrib/completion/git-completion.bash>
__scap_subcommands=
__scap_get_subcommands() {
    if [ -n "$__scap_subcommands" ]; then
        return
    fi
    __scap_subcommands=$(scap --_autocomplete)
}

_scap() {
    local cur cmd=() sub rep
    cur=${COMP_WORDS[$COMP_CWORD]}
    cmd=( "${COMP_WORDS[@]}" )

    if (( COMP_CWORD == 1 )); then
        __scap_get_subcommands
        rep=$( compgen -W "$__scap_subcommands" -- "$cur" )
        COMPREPLY=( $rep )
        return
    fi

    # limit the command to autocomplete to the first 3 words
    if (( COMP_CWORD >= 2 )); then
        # don't complete any sub-subcommands, only options
        if [[ -n "$cur" && "${cur:0:1}" != '-' ]]; then
            COMPREPLY=()
            return
        fi
        cmd=( "${COMP_WORDS[@]:0:3}" )
    fi

    # replace the last word in the command with '--_autocomplete'
    cmd[ $(( ${#cmd[@]} - 1 )) ]='--_autocomplete'
    rep=$( compgen -W "$( "${cmd[@]}" )" -- "$cur" )
    COMPREPLY=( $rep )
}
# -o nospace stops readline appending a space by default; -S' ' adds one
# back to the completions generated by _scap
complete -S' ' -o bashdefault -o default -o nospace -F _scap scap
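The last-word swap near the end of _scap can be exercised on its own; this is just the bash array mechanics, with a made-up command line:

```shell
# Rewrite the last element of a command array, as _scap does before
# asking the program itself for candidate completions.
cmd=(scap deploy --verb)
cmd[ ${#cmd[@]} - 1 ]='--_autocomplete'
echo "${cmd[@]}"   # prints: scap deploy --_autocomplete
```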